Ten future trends of artificial intelligence

With the maturing of deep learning technology, AI is moving from the cutting edge into the mainstream. The match between AlphaGo and human players was not a video game in the traditional sense, where the machine's skill is fixed once and for all; AlphaGo has the most critical capability of artificial intelligence: deep learning. AlphaGo uses two deep neural networks, Value Networks and Policy Networks: the value networks evaluate the board position, and the policy networks choose where to place the stones. These networks were trained by a new method that combines supervised learning from games played by human experts in tournaments with reinforcement learning from games it plays against itself (self-play). In other words, artificial intelligence enables AlphaGo's level of Go play to rise as it learns.
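To make the two-network division of labor concrete, here is a minimal PyTorch sketch of a policy network and a value network operating on a flattened 19x19 board. The layer sizes, the flat board encoding, and all names here are illustrative assumptions for this article, not AlphaGo's actual architecture or training method.

```python
import torch
import torch.nn as nn

BOARD = 19 * 19  # a Go board flattened into one vector (an illustrative encoding)

# Policy network: maps a board position to a score (logit) for each possible move.
policy_net = nn.Sequential(
    nn.Linear(BOARD, 256),
    nn.ReLU(),
    nn.Linear(256, BOARD),   # one logit per board point
)

# Value network: maps the same board position to a single scalar evaluation.
value_net = nn.Sequential(
    nn.Linear(BOARD, 256),
    nn.ReLU(),
    nn.Linear(256, 1),
    nn.Tanh(),               # squashed into [-1, 1], from losing to winning
)

position = torch.zeros(1, BOARD)      # a batch containing one (empty) board
move_logits = policy_net(position)    # policy: where to place the next stone
win_estimate = value_net(position)    # value: how favorable the position is
print(move_logits.shape, win_estimate.shape)  # torch.Size([1, 361]) torch.Size([1, 1])
```

The point of the sketch is only the division of labor the article describes: the policy network proposes moves while the value network judges positions; the real system trained such networks on expert games and self-play and combined them with tree search.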

The main technological applications of AI fall into the following areas:

Natural language processing (including speech recognition, semantic understanding, and machine translation), computer vision (image recognition), knowledge representation, automated reasoning (including planning and decision-making), machine learning, and robotics. By technology category, these can be divided into perceptual input on one side and learning and training on the other: a computer acquires perceptual input as audio and video through speech recognition, image recognition, knowledge-base reading, human-computer interaction, physical sensing, and so on, and then learns from big data to build a "brain" with decision-making and creative capabilities, as the sketch below illustrates.
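As a minimal concrete instance of this "perceive, then learn, then decide" pipeline, the sketch below trains a classifier on scikit-learn's bundled 8x8 handwritten-digit images (perceptual input) and then predicts labels for unseen images (decision-making). The dataset, the choice of logistic regression, and the parameters are illustrative assumptions, not a reference to any specific system mentioned above.

```python
# Illustrative sketch: learn from perceptual data, then decide on new input.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

digits = load_digits()  # 8x8 grayscale digit images: the "perceptual input"
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, random_state=0
)

model = LogisticRegression(max_iter=1000)  # the "learning and training" phase
model.fit(X_train, y_train)

# "Decision-making": classify images the model has never seen.
print("test accuracy:", model.score(X_test, y_test))
print("predicted digit:", model.predict(X_test[:1])[0])
```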

From the PC era of the 1980s and 1990s through the Internet era, what we gained was an explosion of information and the decentralization of its carriers. Then, as access to online information shifted from the PC to mobile devices, the interconnection of everything became a trend, yet technical limitations made it hard for the mobile Internet to give birth to many new applications and business models. Now artificial intelligence has become the most exciting and anticipated technology of this era, and it will be the focus of IT industry development for the next ten years and beyond.

The concept of AI was already being hyped up in the 1980s, but technical limitations in both hardware and software kept it dormant for a long time. Now, the development of four major catalysts (massively parallel computing, big data, deep learning algorithms, and brain-inspired chips), together with falling computing costs, has produced a surge in AI technology.