OTSL Releases 3D Autonomous Driving Simulators: Infrared Simulator with World’s First Dynamic Real-time Simulation, Camera Simulator, and Ultrasonic Simulator
OTSL Completes 3D Real-time Sensor Simulator Product Lineup Aimed at Realizing Zero-Blind-Spot Autonomous Driving Simulation
Date: Oct. 17, 2018
Source: OTSL Inc.
– Worldwide Simultaneous Release of Three Products: a New Infrared Simulator Enabling the World’s First Dynamic Real-time Simulation, a Camera Simulator, and an Ultrasonic Simulator –
OTSL Inc., a developer and distributor of short-distance wireless systems and embedded systems, has enhanced COSMOsim, its 3D real-time sensor simulator product lineup for autonomous driving. On October 17 it released three new types of simulators worldwide simultaneously, for infrared/uBolometer (*), camera, and ultrasonic sensors, aimed at automotive manufacturers, system component suppliers (vehicle sensor manufacturers), and semiconductor manufacturers.
Last year, OTSL released the Advanced Millimeter Wave Radar Simulator (AMMWR Simulator) and the Advanced Laser Radar/LIDAR Simulator (ALR Simulator), entering the business of real-time simulators for autonomous driving. OTSL now adds the Advanced Infrared/uBolometer (*) Simulator (AIRB Simulator), the Advanced Camera Image Sensor Simulator (ACIS Simulator), and the Advanced Ultrasonic Simulator (AUS Simulator) to its lineup, offering a total of five types of simulators that cover all sensor systems used for autonomous driving.
“Expectations for autonomous driving are growing worldwide, creating higher demand for high-performance simulators that can verify safety and accuracy in environments closely matching real driving conditions,” said Shoji Hatano, CEO, OTSL. “OTSL’s simulator product lineup for autonomous driving provides an integrated platform that lets you operate five types of simulators simultaneously in real time on a single screen. OTSL aims to eliminate almost all blind spots in autonomous driving, which are hard to identify in real driving tests, by combining simulations from multiple sensors with different characteristics.”
AIRB Simulator superimposes far-infrared (temperature) data for all road objects on a 3D map and visualizes them in real time while the objects move freely in any direction and at any speed. Because infrared/uBolometer (*) sensors can recognize objects at night and are barely affected by bad weather such as rain and snow, they are expected to see use mainly in Europe and America, where lighting infrastructure on expressways is often underdeveloped. AIRB Simulator analyzes the material data of objects on the 3D map from their molecular structures and performs sophisticated real-time calculations to bring quantities such as absorption spectra, thermal radiation based on the blackbody radiation model, and temperature rises due to the radiant energy of sunlight as close to real far-infrared (temperature) data as possible. It is the world’s first infrared sensor simulator for autonomous driving capable of dynamic real-time simulation.
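The physics the paragraph above refers to can be illustrated with a minimal sketch. This is not OTSL’s implementation; it only shows the textbook relationships a far-infrared simulator must evaluate per surface: thermal emission via the blackbody radiation model (Stefan-Boltzmann law) and the equilibrium temperature rise under solar irradiance. The material coefficients are illustrative assumptions.

```python
# Minimal sketch (not AIRB Simulator's internals) of blackbody-model
# thermal radiation and sunlight-driven temperature rise.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiant_exitance(temp_k: float, emissivity: float = 1.0) -> float:
    """Total thermal radiation (W/m^2) via the Stefan-Boltzmann law."""
    return emissivity * SIGMA * temp_k ** 4

def equilibrium_temp(solar_irradiance: float, absorptivity: float,
                     emissivity: float) -> float:
    """Surface temperature (K) at which absorbed sunlight balances emission."""
    return (absorptivity * solar_irradiance / (emissivity * SIGMA)) ** 0.25

# Illustrative dark, asphalt-like surface under 1000 W/m^2 of sunlight;
# at equilibrium the emitted power equals the absorbed 900 W/m^2.
t = equilibrium_temp(1000.0, absorptivity=0.9, emissivity=0.95)
print(round(t, 1), round(radiant_exitance(t, 0.95), 1))
```

A real simulator would additionally resolve the spectrum (absorption bands per material), but the energy balance above is the core of the temperature data being rendered.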
ACIS Simulator enables autonomous-driving simulation not only of the camera itself but also of its lens configuration. Because camera-based autonomous driving technology can be implemented at low cost, it is an effective solution in Japan and other regions where road conditions are relatively good thanks to widespread night lighting infrastructure. However, common camera simulators for autonomous driving do not let you configure the detailed optical characteristics that differ from lens to lens, such as ghosting, where light reflected at the lens surface appears in the image, or distortion, where the image is deformed by light refraction inside the lens. As a result, simulated images remain different from real-world images. ACIS Simulator lets you configure per-lens optical characteristics such as aberration and depth of field, as well as lens settings such as enabling or disabling the anti-reflection coating, allowing a more accurate real-time simulation.
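As a concrete example of a per-lens optical characteristic, the sketch below applies the widely used Brown-Conrady radial distortion polynomial to an ideal pinhole image point. This is a hypothetical illustration, not ACIS Simulator’s actual model; the coefficients k1 and k2 are illustrative values that would differ for each lens.

```python
# Hypothetical sketch of per-lens radial distortion (Brown-Conrady model),
# not ACIS Simulator's implementation.

def distort(x: float, y: float, k1: float, k2: float) -> tuple:
    """Map an ideal normalized image point to its distorted position.

    Negative k1 gives barrel distortion (points pulled toward the center);
    positive k1 gives pincushion distortion.
    """
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# A point near the image edge moves inward under barrel distortion:
print(distort(0.8, 0.6, k1=-0.2, k2=0.05))
```

Simulating such models per lens is what closes the gap between rendered and real-world camera images that the paragraph describes.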
AUS Simulator is designed for the ultrasonic sensors mainly used in autonomous parking and parking-assist technology. The ability to freely set the number of sensors, their mounting height and angle, the detection capability (which depends on the material of the mounting location and the mounting method), and the ultrasonic emission angle and intensity enables real-time simulation of various situations, covering not only parking but autonomous driving as well.
OTSL’s 3D real-time sensor simulator product lineup for autonomous driving, COSMOsim, is the world’s only platform that lets you operate five types of simulators, for millimeter-wave radar, LIDAR, camera, infrared/uBolometer (*), and ultrasonic sensors, simultaneously in real time on a single screen. Automotive manufacturers can simulate driving situations through sensor-based modeling, check the recognition and control behavior of autonomous driving, and efficiently verify sensor mounting positions on vehicles, eliminating the need for test driving with real vehicles. System component suppliers (vehicle sensor manufacturers) can review design parameters and check detection range and sensing area more efficiently by visualizing the behavior of vehicle sensors. Semiconductor manufacturers developing sensor devices can model and simulate a device under development and verify it at high speed.