Since October, successive waves of autonomous driving technology have landed, pushing public enthusiasm for autonomous driving to a new peak. This year is in fact a milestone for the field: counting from the launch of Google's autonomous driving project in 2010, it marks the tenth year of the commercialization of autonomous driving technology.
Over the past decade, autonomous driving, as the artificial intelligence application closest to everyday life, has been highly anticipated. Today, with breakthroughs in big data, AI, 5G and other technologies, more and more autonomous driving systems from companies such as Baidu, Uber, Didi and WeRide have moved from the laboratory onto the road.
For ADAS and AVs, there may never be a single most effective way to implement sensing. If there is a magic number, it may be six: every automaker weighs roughly six basic considerations in its own way, which leads each to its own unique method of integrating sensors into future vehicles.
Some auto parts companies with strong overall capabilities, at home and abroad, have laid out multiple product lines across autonomous vehicle sensors, allowing them to offer downstream customers comprehensive autonomous driving solutions and build strong competitiveness. They include foreign companies such as Bosch, Continental, Valeo, Hella, Delphi, Fujitsu and Autoliv, and domestic companies such as Desay SV, Huayu Automotive and Baolong Technology.
More and more sensors are being deployed across the vehicle to proactively address safety. How many sensors does a car carry today, and how many more are needed to push autonomy further? Counting the ADAS sensors alone — ultrasonic sensors, radar, sensing cameras, viewing cameras and lidar — a vehicle is estimated to carry 10 to 20 sensors, depending on the model.
Sensors will be key to achieving high levels of automation, and both the number and the variety of sensors are expected to grow.
The environment-sensing devices for autonomous driving are mainly cameras and radars. Cameras use image recognition technology to realize ranging and target recognition. Radars use the time difference and phase difference between the transmitted wave and the reflected wave to obtain a target's position and velocity; depending on the type of wave used, radars fall into three categories: millimeter-wave radar, lidar and ultrasonic radar.
Cameras: the eyes of autonomous driving
By mounting position, vehicle cameras divide into forward-looking cameras, surround-view cameras (side-looking plus rear-looking) and in-cabin cameras. The forward-looking camera is the most critical, enabling functions such as lane departure warning (LDW), forward collision warning (FCW) and pedestrian collision warning (PCW). Forward-looking solutions include monocular, binocular and even multi-camera setups. Although binocular or multi-camera systems offer higher ranging accuracy and a wider field of view, their high cost and demanding requirements for assembly precision and computing chips have kept them from mass production, so Mobileye's monocular camera solution remains the market mainstream.
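As a rough illustration of why binocular ranging demands such precision: stereo depth follows z = f·B/d, so at long range a single pixel of disparity error shifts the estimate by meters. The focal length and baseline below are made-up values for illustration, not figures from any production camera.

```python
# Stereo (binocular) depth-from-disparity sketch.
# All numbers are hypothetical, chosen only to show the sensitivity.

def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth in meters from stereo disparity: z = f * B / d."""
    return focal_px * baseline_m / disparity_px

f_px = 1000.0  # focal length in pixels (assumed)
base = 0.30    # baseline between the two cameras in meters (assumed)

z = stereo_depth(f_px, base, disparity_px=6.0)      # ~50 m
# A single pixel of disparity error at this range moves the estimate ~10 m:
z_off = stereo_depth(f_px, base, disparity_px=5.0)  # ~60 m
```

This is why binocular rigs need rigid mounting and careful calibration: the disparity signal that carries the depth information is only a handful of pixels at highway distances.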
The vehicle-mounted camera industry chain involves three main links: upstream materials, midstream components and downstream products. Upstream, optical lenses, filters and protective films go into lens groups, and wafers are used to make CMOS chips and DSP signal processors. Midstream, the lens group, CMOS chip and bonding materials are assembled into modules and packaged together with the DSP into a camera product. At this point in the chain, suppliers can already deliver complete camera products to downstream vehicle makers or Tier 1 customers.
Within this chain, the camera and its software algorithms together constitute a vehicle camera solution, which is then applied to self-driving cars. The chain is long, with many upstream and downstream links, each involving numerous domestic and foreign manufacturers.
Compared with cameras in consumer electronics, automotive-grade cameras face stricter requirements for shock resistance, stability, continuous focusing, thermal compensation, and immunity to stray light and glare, so their module assembly process is complex and the technical barrier is high. In the global supply market, foreign companies such as Panasonic, Valeo, Fujitsu, Continental and Magna currently hold a large share; the top five manufacturers together account for about 59% of the market, a relatively high concentration.
Radar: the brain of autonomous driving
Radars fall into three main categories: 1. Millimeter-wave radar: between microwave and infrared, with a frequency range of roughly 10GHz to 200GHz and a wavelength on the millimeter scale; 2. Lidar: between infrared and visible light, with a frequency around 100,000GHz and a wavelength on the nanometer scale; 3. Ultrasonic radar: frequency above 20,000Hz. By the formula wave speed = wavelength × frequency, the higher the frequency, the shorter the wavelength; the shorter the wavelength, the higher the resolution; and higher resolution means more accurate measurement of distance, speed and angle.
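The wavelengths above can be checked with the cited formula. One nuance the paragraph glosses over: ultrasonic "radar" uses sound waves, so the relevant propagation speed is that of sound in air, not the speed of light. The 77GHz and 40kHz figures below are typical values used for illustration; the lidar frequency corresponds to a common ~905nm near-infrared emitter.

```python
# Wavelength = speed / frequency, applied to the three sensor types.
C_LIGHT = 3.0e8   # m/s, electromagnetic waves (mmWave radar, lidar)
C_SOUND = 340.0   # m/s, sound in air (ultrasonic sensors)

def wavelength(speed_m_s, freq_hz):
    return speed_m_s / freq_hz

lam_mmwave = wavelength(C_LIGHT, 77e9)    # ~3.9e-3 m: millimeter scale
lam_lidar  = wavelength(C_LIGHT, 3.3e14)  # ~9.1e-7 m: ~905 nm, nanometer scale
lam_ultra  = wavelength(C_SOUND, 40e3)    # ~8.5e-3 m: also millimeter scale
```

Note that ultrasonic wavelengths land in the millimeter range only because sound travels so much more slowly; by frequency alone they sit far below the radio bands.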
The reversing radar we commonly use is ultrasonic: it emits sound waves, which travel only at the speed of sound. Ultrasonic radar is small and cheap, but its detection accuracy is poor and its range short, high-speed movement degrades it badly, and it is therefore not widely used in autonomous driving.
Millimeter-wave radar is widely used. It emits electromagnetic waves, which propagate at the speed of light. The main automotive frequencies are 24GHz and 77GHz. 24GHz has a lower frequency and narrower bandwidth, so its accuracy is relatively low; it is mainly used for blind-spot monitoring and automatic parking. 77GHz is far more accurate, detects distance more precisely, and is little affected by weather; integrated with cameras, it can handle environmental perception well.
However, while millimeter-wave radar can sense distance, it has no precise way to sense an object's specific shape or to separate two close targets ahead, and it picks up a lot of noise. On an open road, for example, surface undulations or particles can produce reflections that interfere with the radar's judgment.
Lidar solves these problems well, with accuracy down to the centimeter level. Each laser generator on a lidar corresponds to one scan line; common mechanical rotating lidars come in 16-line, 64-line, 128-line and other configurations. Lidar is essentially a radar operating in the optical band (a special band), and its advantages are pronounced.
First, extremely high resolution: lidar works in the optical band, at frequencies two to three orders of magnitude higher than microwave, so compared with microwave radar it offers far higher range, angular and velocity resolution.
Second, strong anti-jamming ability: because the laser wavelength is short, the beam can be emitted with a very small divergence angle (on the order of microradians), and multipath effects are small (unlike microwave or millimeter wave, it does not form directional emissions that produce multipath), so it can even detect low-altitude and ultra-low-altitude targets.
Third, rich information: the target's distance, angle, reflection intensity, velocity and more can be obtained directly, generating a multi-dimensional image of the target. Fourth, all-day operation: active laser detection does not depend on external lighting or the target's own radiation; the sensor emits its own laser beam and extracts target information from the returned echo.
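The "multi-dimensional image" above is the familiar point cloud, built by converting each return's range and beam angles into a 3D point. A minimal sketch of that conversion follows; angle conventions vary by sensor, so the one used here (azimuth in the horizontal plane, elevation from it) is just one common choice.

```python
import math

# One lidar return (range + azimuth + elevation) -> one Cartesian point.
def lidar_point(range_m, azimuth_deg, elevation_deg):
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)  # forward
    y = range_m * math.cos(el) * math.sin(az)  # left
    z = range_m * math.sin(el)                 # up
    return (x, y, z)

p_ahead = lidar_point(10.0, 0.0, 0.0)   # target dead ahead: (10, 0, 0)
p_left  = lidar_point(10.0, 90.0, 0.0)  # target to the left: (~0, 10, 0)
```

A full spinning lidar simply repeats this per laser line and per azimuth step, millions of times per second, which is where the data-volume pressure on the compute platform comes from.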
However, limited by price and size, lidar is still rarely fitted to production vehicles. Musk has criticized lidar on many occasions as "heavy", "ugly" and "completely unnecessary". This points to a real drawback: at this stage lidar is hard to shrink, and it sits conspicuously on the roof, which directly hampers mass production — which is why we have not yet seen lidar systems on production cars.
Finally, ultrasonic radar has already become a common automotive component, supporting driver assistance functions such as automatic parking, and it will also contribute to fully autonomous driving in the future. It measures obstacles in the 0.2–5m range with an accuracy of 1–3cm, acting as "the eyes of the car" at close quarters. Ultrasonic radars come in analog, four-wire digital, two-wire digital and three-wire active digital varieties; their interference resistance improves in that order, as, broadly, do their technical difficulty and price.
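The working principle is plain time-of-flight: distance is half the echo's round trip at the speed of sound. A quick sketch shows the 0.2–5m range above corresponds to round-trip times of roughly 1.2 to 29 milliseconds; note that the speed of sound drifts with air temperature, one reason accuracy suffers with temperature as discussed below.

```python
# Ultrasonic time-of-flight sketch.
C_SOUND = 340.0  # m/s in air at roughly room temperature (varies with temp)

def distance_m(echo_time_s):
    """Obstacle distance: half the round trip at the speed of sound."""
    return C_SOUND * echo_time_s / 2.0

def round_trip_s(dist_m):
    return 2.0 * dist_m / C_SOUND

t_min = round_trip_s(0.2)  # ~1.2 ms for the nearest detectable obstacle
t_max = round_trip_s(5.0)  # ~29 ms for the far end of the range
```

These millisecond-scale echoes also explain why many same-band sensors must take turns (time-division multiplexing): overlapping pings would be indistinguishable.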
Tesla's Autopilot has relied heavily on ultrasonic radar since its launch, consistently using a 4+4+4 ultrasonic layout. In early versions, Tesla used the eight front and rear radars for parking assistance and all twelve for assisted driving. Tesla says that unlike the cameras, which watch lane markings, the ultrasonic radars monitor the areas around the car and cover blind spots created by vehicles or other objects.
Tesla's "preference" for ultrasonic radar has its reasons. As noted above, lidar, good as it is, costs too much to be fitted at scale to production vehicles for now, which also limits the rollout of higher-level autonomous driving.
Ultrasonic radar is cheap. A single unit currently costs only tens of yuan; the radar hardware for a reversing-radar system comes in under 200 yuan, and that for an automatic parking system around 500 yuan. By contrast, millimeter-wave radar still costs on the order of a thousand yuan, and lidar as much as several hundred thousand yuan. The relatively low price binds carmakers tightly to ultrasonic radar and has fueled a thriving vehicle-mounted ultrasonic radar market.
According to P&S Intelligence data, the global vehicle-mounted ultrasonic radar market was worth $3.46 billion (about RMB 24.39 billion) in 2019; the agency predicts a compound annual growth rate of 5.1% from 2020 to 2030, reaching $6.1 billion (about RMB 42.98 billion) in 2030.
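As a sanity check, the two figures are consistent with the stated growth rate: compounding the 2019 base at 5.1% for the eleven years through 2030 lands right around the $6 billion mark.

```python
# Compound-growth check of the P&S Intelligence figures cited above.
base_2019 = 3.46e9  # USD, 2019 market size
cagr = 0.051        # 5.1% compound annual growth rate
years = 11          # 2019 -> 2030

projected_2030 = base_2019 * (1 + cagr) ** years  # ~6.0e9 USD
```

The small gap to the quoted $6.1 billion is within rounding of the reported rate.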
However, ultrasonic radar is no breakthrough for autonomous driving; it is boxed in by its physical characteristics. Its detection range is only a few meters, and it cannot precisely describe an obstacle's position. Moreover, when many radars share a frequency band, they must use time-division multiplexing to keep echoes from clashing, which slows information acquisition; detection accuracy is easily affected by vehicle speed, vibration, temperature and humidity; and interference resistance and calibration remain challenging. In short, ultrasonic radar is a supporting ingredient rather than a "staple food": only together with millimeter-wave radar, cameras and even lidar can it support higher-level driver assistance functions.
Integration is the future of sensors
Clearly, sensors are key to higher levels of automation, and their number and variety are expected to grow. But more sensors are just the tip of the iceberg: sensors generate enormous amounts of data, and systems are severely constrained by their processing power.
So, the more sensors the better? Some may think so, but for reasons of cost and integration, the number of sensors in a car will not grow without limit. The sensor count is expected to plateau at some point, with the real differentiation shifting to software and to a company's ability to process large volumes of data effectively. Some OEMs, such as Tesla, still forgo lidar altogether, betting instead on a combination of sensors and AI computing to reach high levels of automation.
Like human senses, sensors must be strategically positioned to continuously feed back information about the car's surroundings. But placement has technical limits. Condensation inside a headlight may stop a lidar from working; in snowy or cold weather, frost can cause sensor failure; and infrared sensors cannot see through glass, so they cannot be placed behind the windshield.
At present there are three mainstream approaches to autonomous driving. The first is vision-led, using GPS maps and AI. Tesla's approach is the main example: it collects environmental data through the cameras on every Tesla, and combines image processing with machine learning to drive without relying on pre-recorded maps. Teslas gather data and learn as they drive, share what is learned across the whole fleet, view the terrain much as a human eye does, and let AI analyze it and guide the vehicle's decisions.
The second is lidar-led with vision assistance, using high-definition maps and AI. This is the approach of mainstream traditional OEMs such as General Motors, Mercedes-Benz and Ford, and of many autonomous driving companies including Google's Waymo. These vehicles rely on pre-recorded 3D high-resolution maps of the surroundings, previously captured and drawn by survey vehicles equipped with lidar. A vehicle then uses the map, localizes itself with its own lidar, determines whether the environment has changed, and controls itself while cruising within the mapped area.
The third is AI-based autonomous driving built on the Internet of Vehicles and multi-sensor fusion. Connecting vehicles requires huge infrastructure investment, and every autonomous vehicle on the road needs to operate on a shared platform. Compared with the first two strategies, this is a much broader ecosystem: investing in smarter roads can reduce the complexity and uncertainty each vehicle must handle on its own. It requires automakers, V2X suppliers and municipal authorities to jointly create the infrastructure and standards that let vehicles navigate smoothly and lower the margin for error.
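The "multi-sensor fusion" common to these approaches can be sketched very simply: when several sensors independently estimate the same quantity, weighting each estimate by the inverse of its noise variance gives the statistically best combination. This is a simplified stand-in for the Kalman-style fusion real stacks use, and all the measurement values and variances below are invented for illustration.

```python
# Inverse-variance fusion of independent distance estimates.
def fuse(measurements):
    """measurements: list of (value, variance) pairs -> fused value."""
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    return sum(w * v for w, (v, _) in zip(weights, measurements)) / total

fused = fuse([
    (25.8, 4.0),    # camera estimate: usable, but noisier at range
    (25.2, 1.0),    # millimeter-wave radar: good range accuracy
    (25.05, 0.04),  # lidar: centimeter-level, so it dominates the weighting
])
# fused lands very close to the lidar reading (~25.06 m)
```

The pattern makes the division of labor concrete: cheap, noisy sensors still contribute, but the most precise sensor available naturally dominates the result.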
Clearly, the first two solutions are built around today's roads, vehicles and regulations. Although only one company has taken the vision-first route, Tesla's volume in the electric vehicle market is substantial, and it is hard to say that the vision-based approach is necessarily worse than the lidar-based one.
One thing, however, seems certain: the third, V2X-based scheme is the necessary path for the future development of autonomous driving. Under a connected-vehicle regime, large numbers of sensors are inevitable, cooperating with one another and with the vehicle itself to form a complete autonomous driving system. The outlook for sensors, then, can fairly be described as smooth sailing.
Baixing Pingche
In 2019, global production of self-driving cars was only in the low thousands; it is expected to grow to 400,000 vehicles per year by 2032, with cumulative production reaching about 1 million vehicles. Total revenue related to self-driving car production will reach $60 billion, of which 40% will come from the vehicles themselves, 28% from sensing hardware, 28% from computing hardware, and the remaining 4% from integration. This means that over the next 15 years, a complete industrial ecosystem will be built around self-driving technology.
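Applying those percentages to the $60 billion projection gives the dollar breakdown directly:

```python
# Breakdown of the projected self-driving production revenue cited above.
total_revenue = 60e9  # USD
shares = {
    "vehicles": 0.40,
    "sensing_hardware": 0.28,
    "computing_hardware": 0.28,
    "integration": 0.04,
}
breakdown = {k: total_revenue * s for k, s in shares.items()}
# vehicles $24B, sensing $16.8B, computing $16.8B, integration $2.4B
```

Notably, sensing and computing hardware together ($33.6 billion) would exceed the value of the vehicles themselves.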
On this point, Yole Développement analysts predict that in 2024 sensor revenue will reach $400 million for lidar, $60 million for radar, $160 million for cameras, $230 million for IMUs and $200 million for GNSS equipment, though the mix across sensor types may shift again over the following 15 years. Total sensor hardware revenue will reach $1.7 billion in 2032, and computing hardware revenue should be of the same order of magnitude. Either way, this is a huge market, and no one wants to walk away from this cake voluntarily.
This article comes from the Autohome Chejiahao author Baixing Pingche and does not represent Autohome's position.