Result analysis in line with expected development

As previously communicated, Terranet carried out an initial system integration in a vehicle outdoors during the second quarter. The next step has been to expand this testing and then report the results back to the market. The analysis of those results now confirms that the technology is, so far in the process, developing according to plan.

The second quarter delivered several technical advances, including integration with the vehicle's communication platform, evaluation of the optimal placement of the hardware components, and improvements to the AI model and to motion compensation.

To confirm the progress made in the second quarter, extended tests were carried out during the summer. BlincVision has now been successfully integrated with the NVIDIA Orin computer platform, which is a significant achievement and of great importance to our customers in the automotive industry. However, to maximize performance, further optimizations will be carried out.
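For readers who want a concrete picture of what running a detection model on the Orin platform involves, the sketch below shows a minimal inference loop using NVIDIA's TensorRT runtime, the standard way to deploy neural networks on Orin. It is a generic illustration only, not Terranet's software: the engine file name, tensor layout and pre-/post-processing are hypothetical placeholders, and the example assumes the TensorRT 8.x Python bindings.

```python
# Minimal sketch of running a detection model on NVIDIA Orin via TensorRT 8.x.
# File name and tensor shapes are hypothetical; this is not BlincVision's software stack.
import numpy as np
import tensorrt as trt
import pycuda.driver as cuda
import pycuda.autoinit  # creates a CUDA context on the Orin GPU

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def load_engine(path):
    # Deserialize a TensorRT engine built offline for the Orin GPU.
    with open(path, "rb") as f, trt.Runtime(TRT_LOGGER) as runtime:
        return runtime.deserialize_cuda_engine(f.read())

engine = load_engine("detector.engine")          # hypothetical engine file
context = engine.create_execution_context()

# Allocate host and device buffers for every input/output binding of the engine.
bindings, buffers = [], []
for i in range(engine.num_bindings):
    shape = engine.get_binding_shape(i)
    dtype = trt.nptype(engine.get_binding_dtype(i))
    host = np.zeros(trt.volume(shape), dtype=dtype)
    dev = cuda.mem_alloc(host.nbytes)
    buffers.append((host, dev, engine.binding_is_input(i)))
    bindings.append(int(dev))

def infer(frame: np.ndarray) -> list[np.ndarray]:
    # Copy an already pre-processed input frame in, run the network, copy results out.
    outputs = []
    for host, dev, is_input in buffers:
        if is_input:
            np.copyto(host, frame.ravel())
            cuda.memcpy_htod(dev, host)
    context.execute_v2(bindings)
    for host, dev, is_input in buffers:
        if not is_input:
            cuda.memcpy_dtoh(host, dev)
            outputs.append(host.copy())
    return outputs
```

On this platform, further optimization typically means steps such as reduced-precision (FP16/INT8) inference or offloading layers to Orin's dedicated deep-learning accelerators.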

Different placements of the event cameras have been tested, including moving them from a roof mount to a position behind the windshield for improved weather protection and simpler vehicle integration. This adjustment has not affected detection capability, but the quality of the glass is an important parameter. For maximum performance, the area of glass in front of the sensor needs to be adapted to improve transmission of the laser's infrared wavelength. This is a well-known adjustment for sensors of this kind and is typically done together with the vehicle manufacturer when the system is integrated.
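The impact of the glass can be illustrated with a simple range-budget calculation: the laser light passes through the windshield twice (out and back), so the returned signal scales roughly with the square of the glass transmittance, and the maximum detection range for a fixed sensor sensitivity scales roughly linearly with it. The sketch below works through this with assumed transmittance figures for ordinary glass versus an IR-optimized sensor window; the numbers are illustrative only and do not describe any specific windshield or BlincVision's actual performance.

```python
# Illustrative range-budget sketch: how windshield transmittance at the laser's
# near-infrared wavelength scales the maximum detection range. The transmittance
# values below are assumptions for illustration, not measured data.

def max_range_factor(transmittance: float) -> float:
    """Under a simple model for an extended diffuse target, received power scales
    as T_out * T_in / R^2 = T^2 / R^2 for a two-way pass through the glass, so for
    a fixed detection threshold the maximum range scales linearly with T."""
    return transmittance  # sqrt(T**2) == T

for glass, T in [("standard windshield glass (assumed)", 0.45),
                 ("IR-optimized sensor window (assumed)", 0.92)]:
    print(f"{glass}: T = {T:.2f} -> max range reduced to {max_range_factor(T):.0%}")
```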

During the second quarter, the initial integration of the system was carried out in a vehicle outdoors, which means it is now being tested in a dynamic, moving environment. Motion compensation is crucial within ADAS (Advanced Driver Assistance Systems) to ensure that sensors and cameras interpret the surroundings correctly despite the vehicle's own movements (a simplified sketch of how such compensation can be implemented is shown below). The system has been tested at various driving speeds in accordance with the Euro NCAP scenarios CPNA-25 and CPNCO-50, in which a pedestrian crosses the lane from the right side or appears from behind a stationary object.

BlincVision's unique combination of event cameras and an AI model requires the collection of large amounts of training data from different traffic environments and scenarios. This data collection is a key component in developing and fine-tuning the system so that it can handle the widest possible variation of traffic conditions. The tests show that the model can be further optimized through extensive training on different scenarios with multiple objects.
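Motion compensation for an event camera is commonly implemented by warping each event to a common reference time using the vehicle's measured ego-motion, so that the platform's own movement does not smear static structure across the image. The sketch below shows one widely used simplification of this idea, a rotation-only warp based on gyroscope data and known camera intrinsics; it is a generic illustration of the technique and makes no claim to reflect Terranet's implementation.

```python
# Generic sketch of rotation-only motion compensation for event-camera data.
# Assumes a calibrated camera (intrinsics K) and a constant angular velocity
# over the short event window, e.g. from the vehicle's gyroscope.
import numpy as np

def so3_exp(w):
    """Rodrigues' formula: rotation matrix for a rotation vector w (radians)."""
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3)
    k = w / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def motion_compensate(events, t_ref, omega, K):
    """Warp events (x, y, t) to reference time t_ref under a pure-rotation
    ego-motion model with constant angular velocity omega (rad/s)."""
    x, y, t = events[:, 0], events[:, 1], events[:, 2]
    K_inv = np.linalg.inv(K)
    rays = K_inv @ np.vstack([x, y, np.ones_like(x)])   # back-project pixels to rays
    warped = np.empty_like(rays)
    for i in range(rays.shape[1]):
        R = so3_exp(omega * (t_ref - t[i]))              # rotation over the time gap
        warped[:, i] = R @ rays[:, i]
    pix = K @ warped
    pix /= pix[2]                                        # re-project to the image plane
    return np.stack([pix[0], pix[1]], axis=1)            # compensated pixel coordinates

# Example with synthetic events, a 500 px focal length and a small yaw rate.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
events = np.column_stack([np.random.uniform(0, 640, 10),   # x (pixels)
                          np.random.uniform(0, 480, 10),   # y (pixels)
                          np.random.uniform(0.0, 0.01, 10)])  # t (seconds)
warped_xy = motion_compensate(events, t_ref=0.01, omega=np.array([0.0, 0.2, 0.0]), K=K)
```

In a full system, translational motion, scene depth and accurate per-event timestamps would also need to be taken into account.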