Even state-of-the-art computer vision systems (e.g., CNNs, Vision Transformers) achieve high accuracy only when trained on vast and diverse datasets.
Once environmental conditions deviate from training data — such as load changes or different camera models — AI systems are prone to failure.
A few centimeters of vertical camera shift caused by a change in vehicle load can significantly degrade the detection accuracy of existing AI systems.
Humans can adapt and drive accurately in any posture.
Existing AI systems require recalibration whenever the camera is changed: models trained on one camera type often fail or degrade on a different one.
Humans can see and drive with any kind of glasses — no retraining required.
Requires comprehensive coverage of all variations during training.
Focuses on pattern recognition rather than true comprehension of physical or contextual relationships.
Remains highly vulnerable to unseen conditions outside the training distribution.
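The camera-shift failure mode described above can be illustrated with a minimal toy sketch (all names and parameters here are hypothetical, not part of any production system): a fixed correlation-based "detector" tuned to one camera height scores well on scenes from that height, but its score collapses when a load change shifts the camera by a few rows of pixels.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_scene(obj_row, size=32):
    """Synthetic road scene: a bright horizontal 'vehicle' bar at obj_row, plus sensor noise."""
    img = rng.normal(0.0, 0.05, (size, size))
    img[obj_row:obj_row + 3, 8:24] += 1.0
    return img

# "Training": a template captured with the camera at its nominal height.
template = make_scene(obj_row=14)

def detection_score(img):
    # Normalized cross-correlation at a fixed location -- no search over shifts,
    # mimicking a model that never saw vertically shifted data.
    return float(np.sum(img * template) /
                 (np.linalg.norm(img) * np.linalg.norm(template)))

nominal = detection_score(make_scene(obj_row=14))  # same camera height
shifted = detection_score(make_scene(obj_row=18))  # load lowered the camera ~4 rows

print(f"nominal score: {nominal:.2f}, shifted score: {shifted:.2f}")
```

The nominal-height score stays high while the shifted score drops sharply, even though the scene content is identical; a human driver would not notice the difference. A shift-invariant or physically grounded system would search over (or reason about) the viewpoint change rather than memorize one fixed geometry.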
Our technology goes beyond dataset diversity by incorporating physical reasoning, scene understanding, and contextual awareness, enabling AI systems to generalize reliably across varying environments and devices, as humans do.
Humans adapt using multi-modal perception even under extreme conditions, while current AI systems often collapse under minor distribution shifts.
Our technology is designed to bridge this critical gap.
Incident Description: On May 7, 2016, a Tesla Model S operating under Autopilot collided with a tractor-trailer in Williston, Florida. The vehicle failed to recognize the white side of the trailer against a brightly lit sky, resulting in a fatal crash.
Key Findings: The National Transportation Safety Board (NTSB) concluded that the probable cause was the truck driver's failure to yield, combined with the Tesla driver's inattention due to overreliance on vehicle automation. The system's operational design permitted prolonged disengagement from the driving task, contributing to the driver's overreliance.
Incident Description: Studies have shown that Mobileye's Advanced Driver-Assistance Systems (ADAS), particularly the Lane Departure Warning System (LDWS), experience significant performance degradation under heavy rainfall. When precipitation exceeds 20 mm, the system's ability to detect lane markings diminishes, potentially leading to system failure.
Key Findings:
Research indicates that at rainfall levels above 20 mm, the ADAS sensors' visibility range decreases substantially, impairing functionality regardless of vehicle speed.
The Mobileye 630 system, widely used for LDWS, was tested and found to have reduced performance in adverse weather conditions.