AI technology is applied mainly in the following areas:
Natural language processing (including speech and semantic recognition and automatic translation), computer vision (image recognition), knowledge representation, automated reasoning (including planning and decision-making), machine learning, and robotics. By technology category, these break down into perceptual input on the one hand and learning and training on the other. A computer acquires perceptual inputs such as audio and video through speech recognition, image recognition, knowledge-base reading, human-computer interaction, and physical sensing, and then learns from big data to develop a "brain" with decision-making and creative capabilities.
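The perceive-then-learn loop described above can be sketched in miniature: perceptual input arrives as feature vectors, and a learning algorithm adjusts itself from labeled examples until it can make decisions on new inputs. The sketch below uses a perceptron, one of the earliest machine-learning models, on made-up toy data; it is an illustration of the general idea, not any specific system mentioned in the text.

```python
def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Learn weights w and bias b so that sign(w.x + b) matches labels (+1/-1)."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
            if pred != y:  # misclassified: nudge weights toward this example
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def predict(w, b, x):
    """Decision step: classify a new perceptual input."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

# Toy "perceptual input": 2-D feature vectors, linearly separable by class.
X = [[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]]
y = [1, 1, -1, -1]
w, b = train_perceptron(X, y)
print(predict(w, b, [1.8, 1.2]))    # a new input near the positive class -> 1
print(predict(w, b, [-1.5, -1.0]))  # a new input near the negative class -> -1
```

Real systems replace the hand-built feature vectors with outputs of speech or image recognition front ends and the perceptron with deep neural networks, but the structure (perceive, learn from examples, then decide) is the same.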
Moving from the PC era of the 1980s and 1990s into the Internet era brought us an explosion of information and the decentralization of information carriers. As network access shifted from PCs to mobile devices, the interconnection of everything became a trend, but technical limitations made it difficult for the mobile Internet to give birth to many new applications and business models. Now, artificial intelligence has become the most exciting and anticipated technology of this era, and it will be the focus of IT industry development for the next 10 years and beyond.
The concept of AI was actually first popularized in the 1980s, but technical limitations in both hardware and software kept it dormant for a long time. Now, the development of four major catalysts (massively parallel computing, big data, deep learning algorithms, and human-brain-inspired chips), along with falling computing costs, has driven a surge in AI technology.