Cars are becoming moving sensor platforms

With the advent of Advanced Driver Assistance Systems (ADAS) and the move toward autonomous vehicles, cars need to be aware of their surrounding environment. A human driver can perceive the environment, make judgments, and react quickly in different situations. However, no one is perfect: we get tired and distracted, and we make mistakes. To improve safety, automakers are equipping passenger cars with ADAS. These systems rely on a variety of sensors to understand the surroundings in many different situations. The sensor data is then fed to high-performance processors such as TI's TDA2x for functions such as automatic emergency braking (AEB), lane departure warning (LDW) and blind spot monitoring.

Automotive sensors

Several types of sensors are commonly used for environmental sensing.

Passive sensors - sense radiation that is reflected or emitted by an object.

Visible-light image sensor - any imager operating in the visible spectrum.

Infrared image sensor - operates outside the visible spectrum, in either the near infrared or the thermal infrared (far infrared).

Passive sensors are affected by the environment - time of day, weather, and so on. For example, a visible-light sensor is affected by the amount of light available at different times of day.

Active sensors - emit a signal and measure the reflected response. Their advantage is that measurements can be obtained at any time, regardless of the time of day or season.

Radar - transmits radio waves and determines the distance, direction, and velocity of an object from the waves reflected back from it.

Ultrasound - transmits ultrasonic waves and determines the distance of an object from the waves reflected back from it.

Lidar - scans a laser beam across the scene and determines the distance of an object from the reflected light.

Time of flight - a camera that measures the time it takes for an emitted infrared beam to bounce off an object and return to the sensor, from which the distance of the object is determined.

Structured light - a known light pattern, typically produced by TI's Digital Light Processing (DLP) device, is projected onto an object. The deformation of the pattern is captured by a camera and analyzed to determine the distance of the object.
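The distance measurement underlying the time-of-flight and ultrasonic sensors above is a simple round-trip calculation: the signal travels to the object and back, so distance is half the propagation speed times the echo time. A minimal sketch (all numeric values are illustrative, not taken from any particular device):

```python
# Round-trip distance calculation shared by time-of-flight and ultrasonic
# sensors. The propagation speed depends on the signal: light for an
# infrared ToF camera, sound in air for ultrasound.

SPEED_OF_LIGHT_M_S = 299_792_458.0   # infrared time-of-flight cameras
SPEED_OF_SOUND_M_S = 343.0           # ultrasound in air at roughly 20 degrees C

def round_trip_distance_m(echo_time_s: float, speed_m_s: float) -> float:
    """Distance to an object given the measured round-trip echo time."""
    return speed_m_s * echo_time_s / 2.0

# An infrared pulse returning after 20 ns corresponds to an object ~3 m away.
tof_distance = round_trip_distance_m(20e-9, SPEED_OF_LIGHT_M_S)

# An ultrasonic ping returning after 10 ms corresponds to roughly 1.7 m.
ultrasound_distance = round_trip_distance_m(10e-3, SPEED_OF_SOUND_M_S)

print(f"ToF: {tof_distance:.2f} m, ultrasound: {ultrasound_distance:.2f} m")
```

The division by two is the point sensors most often get wrong in naive implementations: the measured time covers the trip out and back.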

To improve accuracy, reliability and robustness across a wide range of situations, more than one sensor should observe the same scene. Every sensor technology has inherent strengths and limitations. Combining different sensor technologies and fusing their data for the same scene - known as sensor fusion - provides a more stable and robust solution. One example is the combination of visible-light sensors and radar.

The advantages of visible-light sensors include high resolution and the ability to identify and classify objects, providing rich scene information. However, their performance is affected by the amount of available light and by the weather (such as fog, rain and snow). Other factors such as overheating can also degrade image quality by increasing sensor noise. Sophisticated image processing on TI processors can mitigate some of these effects.

Radar, on the other hand, can see through rain and snow and can measure distance quickly and efficiently. Doppler radar has the added advantage of detecting the motion of an object. However, radar has low resolution, so objects cannot easily be identified or classified. Fusing visible-light and radar data therefore provides a more robust solution across many different situations.
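The complementary roles described above can be sketched as a simple rule-based fusion step: the camera contributes object classification, while the radar contributes range and closing speed. All names and thresholds below are hypothetical, for illustration only, and are not from any TI API:

```python
# Illustrative rule-based camera/radar fusion: warn only when the camera
# sees a relevant object AND the radar confirms it is close and approaching.
from dataclasses import dataclass

@dataclass
class CameraDetection:
    label: str          # e.g. "pedestrian", "vehicle"
    confidence: float   # classifier confidence in [0, 1]

@dataclass
class RadarDetection:
    range_m: float      # distance to the object
    velocity_mps: float # radial velocity; negative means approaching

def should_warn(cam: CameraDetection, radar: RadarDetection,
                min_confidence: float = 0.5,
                warn_range_m: float = 30.0) -> bool:
    """Fuse the two detections: both sensors must agree on a nearby threat."""
    relevant = (cam.label in {"pedestrian", "vehicle"}
                and cam.confidence >= min_confidence)
    closing = radar.range_m <= warn_range_m and radar.velocity_mps < 0.0
    return relevant and closing

# A confident pedestrian detection 12 m away and approaching triggers a warning.
print(should_warn(CameraDetection("pedestrian", 0.9),
                  RadarDetection(12.0, -1.5)))   # True
```

Requiring agreement from both sensors is what makes the fused system more robust than either sensor alone: a camera false positive in fog, or a radar return from a harmless stationary object, does not trigger a warning by itself.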

In addition, cost varies between sensors, which also affects the best choice for a particular application. For example, lidar provides very accurate distance measurement but is much more expensive than passive image sensors. As development continues, costs will keep falling, and cars will eventually be able to watch in every direction with the help of a variety of sensors.

The TDA processor family is a highly integrated, programmable platform developed to meet the intensive processing needs of ADAS-equipped vehicles. Data from the different sensors observing a scene can be fed to the TDA2x and fused into a more complete picture to support fast, intelligent decision making. For example, a visible-light sensor might capture a mailbox on a dark roadside that resembles a human shape. Based on the object's proportions, sophisticated pedestrian detection running on the TI processor might flag it as a possible pedestrian. Data from a thermal sensor, however, would show that the object's temperature is too low for a living being, so it is unlikely to be a pedestrian. Sensors with different operating characteristics can therefore together provide a higher level of safety.
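The mailbox-versus-pedestrian cross-check above can be sketched as a shape hypothesis vetoed by a thermal plausibility test. This is a hypothetical illustration of the reasoning, not an actual TDA2x pipeline; the temperature thresholds are assumptions:

```python
# Hypothetical vision/thermal cross-check: a shape-based "possible
# pedestrian" hypothesis is rejected when the object is far below
# human surface temperature. Threshold values are illustrative only.

HUMAN_SKIN_TEMP_C = 33.0   # approximate surface temperature of a person
THERMAL_MARGIN_C = 10.0    # how far below skin temperature still counts

def confirm_pedestrian(shape_is_humanlike: bool, thermal_temp_c: float) -> bool:
    """Combine a shape-based hypothesis with a thermal plausibility check."""
    if not shape_is_humanlike:
        return False
    # A cold object (e.g. a mailbox at ambient temperature) is rejected.
    return thermal_temp_c >= HUMAN_SKIN_TEMP_C - THERMAL_MARGIN_C

print(confirm_pedestrian(True, 31.0))  # warm, human-shaped object -> True
print(confirm_pedestrian(True, 8.0))   # cold, mailbox-like object -> False
```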

The ultimate goal is a fully autonomous car, which could eventually lead to a world without traffic accidents. TI is actively engaged in research and development of the sensing and processing technologies that help customers build driverless vehicles. Many problems still need to be solved along the way, but I firmly believe that for driverless vehicles the question is not whether they can be achieved, but when.
