What are the tech terms?

Technology terms are as follows:

1. Virtual Reality

Virtual reality (VR) is a simulated environment created by a computer system. Generally speaking, it uses technology to immerse people in this environment and let them interact with it. The technology mainly involves environment simulation, perception, natural interaction, and sensors.

In addition to the visual perception generated by computer graphics, it also covers auditory, tactile, force, and motion perception, and even smell and taste. At present, virtual reality technology is applied in medicine, the military and aerospace, interior design, industrial simulation, games, entertainment, and many other industries.

2. Artificial Intelligence

Artificial Intelligence (AI) is a branch of computer science that attempts to understand the nature of intelligence and produce a new kind of intelligent machine that can respond in a manner similar to human intelligence. Research in this field includes robotics, speech recognition, image recognition, natural language processing, and expert systems.

Since the birth of artificial intelligence, its theory and techniques have grown increasingly mature, and its fields of application keep expanding. It is conceivable that future AI products will serve as "containers" of human intelligence. Artificial intelligence can simulate the information processes of human consciousness and thinking. It is not human intelligence, but it can think in human-like ways and may one day surpass human intelligence.

3. Cognitive Computing

Cognitive computing takes its name from IBM's artificial-intelligence supercomputer "Watson" and now denotes a new way of analyzing big data. As information accumulates over time, computers can learn and interact, gradually improving their cognitive analysis as the data grows, much as the human brain does naturally. "Cognitive computing" is the "marriage" of artificial intelligence and big data.

4. Quantum Computing

Quantum computing is currently a popular research field. Quantum computers, which are built on quantum mechanics, offer parallel computation and storage capacity far beyond that of ordinary computers: a system of equations with billions of variables that would reportedly take a conventional computer 100 years to solve could be solved by a sufficiently powerful quantum computer in 0.01 seconds. With the application of quantum computers, scientific problems such as code-breaking and gene sequencing would become far easier to tackle.
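The source of that parallelism is superposition: an n-qubit register is described by 2**n complex amplitudes at once. A minimal NumPy sketch (a classical simulation for illustration, not real quantum hardware) of a Hadamard gate putting a single qubit into an equal superposition:

```python
import numpy as np

# A qubit is a 2-component complex state vector; start in the |0> state.
state = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate maps |0> to an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ state

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(state) ** 2
print(probs)    # -> [0.5 0.5]

# An n-qubit register needs 2**n amplitudes -- the root of the parallelism.
print(2 ** 20)  # -> 1048576 amplitudes for just 20 qubits
```

Simulating each additional qubit doubles the classical memory required, which is exactly why quantum hardware, rather than simulation, is the goal.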

5. Deep Learning

The concept of deep learning originated from the study of artificial neural networks. It is a newer field within machine learning research, motivated by building neural networks that analyze and learn the way the human brain does, mimicking the mechanisms by which the brain interprets data such as images, sounds, and text.

Since 2006, the field of machine learning has seen breakthrough after breakthrough, and passing the Turing test no longer seems quite so far-fetched. In terms of technical means, this depends not only on the cloud's ability to process big data in parallel but also on algorithms. That algorithm is deep learning, and with the help of deep learning algorithms, mankind has finally found a way to tackle the age-old problem of handling abstract concepts.
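As a toy illustration of the "learning" described above, here is a minimal sketch of a two-layer neural network trained by gradient descent on the classic XOR problem (pure NumPy; the layer size, learning rate, and iteration count are arbitrary choices for the sketch, not from the original text):

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: the textbook example a single linear layer cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer with 8 units, small random initial weights.
W1, b1 = rng.normal(0, 1, (2, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 1, (8, 1)), np.zeros(1)

def forward(X):
    h = sigmoid(X @ W1 + b1)
    return h, sigmoid(h @ W2 + b2)

_, out = forward(X)
loss_before = np.mean((out - y) ** 2)

lr = 0.5
for _ in range(5000):
    h, out = forward(X)
    # Backpropagation: gradient of the squared error through both layers.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

_, out = forward(X)
loss_after = np.mean((out - y) ** 2)
print(loss_before, "->", loss_after)  # the loss shrinks as the network learns
```

Deep learning stacks many such layers, so that early layers learn simple features and later layers learn the abstract concepts the paragraph above refers to.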

6. The DT Era

DT stands for data technology. Although the term was coined long ago, it did not catch on in China until Ma Yun (Jack Ma) spoke at the IT Summit in March 2015. Ma said the difference between the two eras is that the IT era is centered on "I," while the DT era is centered on "others": making others stronger, being more open, and taking on more responsibility.

7. Computer Vision

Computer vision is the science of how to make machines "see." More specifically, it refers to using cameras and computers in place of human eyes to recognize, track, and measure targets, and to further process the resulting images so that they become better suited to human viewing or to transmission to instruments for inspection.
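The "seeing" described above ultimately operates on arrays of pixel intensities. A minimal sketch of one of the simplest vision operations, edge detection on a tiny synthetic image, using NumPy (no camera input or vision library assumed):

```python
import numpy as np

# A tiny synthetic grayscale "image": a dark left half and a bright right half.
img = np.zeros((6, 6))
img[:, 3:] = 1.0

# A crude edge detector: the absolute horizontal difference between
# neighboring pixels spikes wherever the brightness changes.
edges = np.abs(np.diff(img, axis=1))

# The edge appears as a column of 1.0 values at the dark/bright boundary.
print(edges[:, 2])  # -> [1. 1. 1. 1. 1. 1.]
print(edges.sum())  # -> 6.0
```

Real systems replace this difference filter with learned convolutional filters, but the principle of turning raw pixels into structure is the same.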

Figuratively speaking, this means installing eyes (cameras) and brains (algorithms) on computers so that machines can perceive the environment and objects. The Chinese idiom "seeing is believing" and the Western saying "a picture is worth a thousand words" both express the importance of vision to human beings. It is not hard to see that the future of machines with vision is immeasurable: intelligent robots, intelligent video surveillance, new human-computer interfaces, and so on.