Outsight, a newly formed company, is hoping its 3D semantic camera, which uses a low-powered laser to detect the chemical composition of objects, will appeal to self-driving vehicle developers. The concept should also find fans among safety advocates who fear current sensing packages don't go far enough.
Outsight's founding team consists of alumni of Dibotics, a firm that pioneered solutions for processing 3D data.
“Our 3D Semantic Camera is not only able to tackle current driving safety problems, but bring unique value to markets like infrastructure management,” says Raul Bravo, the company’s president and cofounder. “With being able to unveil the full reality of the world by providing information that was previously invisible, we at Outsight are convinced that a whole new world of applications will be unleashed. This is just the beginning.”
Semantic cameras, which can differentiate the material makeup of different objects, could play an important role in future robotic applications, including Level 4 and Level 5 self-driving cars. In the near term, likely applications include human-operated machines such as construction and mining equipment, as well as helicopters.
The sensing system can detect the chemical composition of many objects via a low-powered, long-range, eye-safe broadband laser that enables hyperspectral analysis in real time. Outsight's camera also includes 3D SLAM on Chip® capability to provide actionable information and object classification.
One of the big advantages of semantic cameras in self-driving applications is that they can provide important information about road conditions, identifying hazards such as black ice. The system can also quickly identify pedestrians and bicyclists through its material-identification capabilities.
Outsight says it is partnering on development programs with OEMs and Tier 1 suppliers in the automotive, aeronautics, and security and surveillance markets.
Outsight launched earlier this year at AutoSens.