Learn more about our research areas – the future of automotive safety & user experience.
The next decade will bring game-changing innovations to the automotive industry. Through our research, we anticipate market needs ahead of time.
RESEARCH AREAS – OVERVIEW
DMS & OMS FOR AUTONOMOUS VEHICLES
From Level 3 onwards, the way we use vehicles will change, and their systems for safety and human interaction will have to change with them. Our research will provide Tier-1s and OEMs with in-cabin analysis software that meets future needs.
Our research in this area focuses on:
- Advanced driver availability monitoring systems for Level 3
- Whole cabin safety & UX during autonomous driving
- Safety in automated shuttles
The system targeted within this project will be based on a realistic and highly complex task from the automotive industry – the interior analysis of vehicles (interior monitoring) – and will enable emotion3D to investigate important system aspects of camera networks (e.g. dependencies between individual processing stages, or effects of hardware configurations on overall performance). The envisaged end result is a system concept optimised in terms of quality, robustness and adaptability for the analysis of 3D scenes using camera networks. The system will be scalable and used for monitoring the interiors of conventional as well as autonomous vehicles.
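To illustrate the kind of stage dependency mentioned above, the sketch below (all names and values hypothetical, not emotion3D's actual software) chains per-camera processing stages into a fused result, showing how one camera's hardware configuration (here, resolution) propagates to the overall network output:

```python
# Hypothetical sketch of a multi-camera in-cabin processing pipeline.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Frame:
    camera_id: str
    resolution: Tuple[int, int]  # (width, height): a hardware configuration aspect

def detect_occupants(frame: Frame) -> List[dict]:
    # Placeholder per-camera stage: detection quality scales with resolution.
    confidence = min(1.0, frame.resolution[0] / 1920)
    return [{"camera": frame.camera_id, "confidence": confidence}]

def fuse(detections_per_camera: List[List[dict]]) -> dict:
    # Downstream stages depend on every upstream camera stage: one weak
    # camera lowers the fused confidence of the whole network.
    flat = [d for dets in detections_per_camera for d in dets]
    return {"occupant_confidence": min(d["confidence"] for d in flat)}

frames = [Frame("cabin_front", (1920, 1080)), Frame("cabin_rear", (1280, 800))]
result = fuse([detect_occupants(f) for f in frames])
print(result["occupant_confidence"])
```

The point of the sketch is only the dependency structure: swapping the rear camera's sensor changes the fused output without any software change.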
Read more about the project on the official site of FFG (Austrian Research Promotion Agency) by clicking here.
Funding agency: This project is funded by the FFG within the Early Stage 2019 program.
The i3DOC project researches an embedded vision platform for new applications based on stereo and 360° vision in industrial and safety-relevant environments. Automotive-grade solutions are currently available only to a limited extent: they either address individual aspects only, or are based on PC architectures and are therefore largely unsuitable for mobile or cost-sensitive applications. The aim of this project is to design an embedded architecture for safety-relevant 360° 3D vision applications and to validate the selected architectural approaches.
Partners: Mission Embedded | emotion3D
Funding agency: This project is funded by the Wirtschaftsagentur Wien as part of the funding initiative Co-Create 2017 within the FORSCHUNG program.
PERSONALIZED PASSIVE SAFETY SYSTEMS
Passive safety systems (e.g. airbags) have greatly enhanced driving safety. However, these systems follow a "few-sizes-fit-all" approach, which today leads to significantly higher risks, especially for women, children, the elderly and people who deviate from the average. A seatbelt-wearing female occupant is 73% more likely to suffer serious injuries than a seatbelt-wearing male occupant (Univ. Virginia).
Occupant monitoring enables individualized deployment of passive safety systems according to the occupants’ individual characteristics. This optimizes the protective function while simultaneously mitigating the risks of unnecessary harm.
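A minimal sketch of the idea, with entirely hypothetical thresholds and parameter names (real deployment logic is far more complex and safety-certified): occupant-monitoring estimates are mapped to restraint deployment parameters instead of one fixed setting.

```python
# Hypothetical mapping from occupant characteristics to restraint parameters.
def restraint_parameters(stature_cm: float, seated_forward: bool) -> dict:
    """Choose deployment parameters from occupant-monitoring estimates."""
    if stature_cm < 150 or seated_forward:
        # Smaller or out-of-position occupants: reduce inflation aggressiveness
        # to mitigate the risk of airbag-induced injury.
        return {"airbag_stage": "low_power", "pretensioner": "soft"}
    return {"airbag_stage": "full_power", "pretensioner": "standard"}

print(restraint_parameters(145, False))  # smaller occupant
print(restraint_parameters(178, False))  # average adult
```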
Details about this exciting project will follow soon.
Funding agency: This project is funded by the European Commission within the Horizon 2020 FTI program.
COMBINING EXTERIOR & IN-CABIN SENSING
In-cabin analysis and exterior analysis are a powerful combination for providing innovative safety and driving-experience features. In the future, it will be important for ADAS such as automatic emergency braking systems to know the status of the driver and the passengers.
Within this field, we are working with large Tier-1 partners on innovative approaches to provide occupant-aware advanced driver assistance systems.
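As an illustration of occupant-aware assistance (a hypothetical policy, not a partner's actual system): an emergency braking function can intervene earlier when driver monitoring reports the driver as inattentive, since a warning phase would be wasted.

```python
# Hypothetical occupant-aware emergency braking policy.
def brake_decision(time_to_collision_s: float, driver_attentive: bool) -> str:
    if driver_attentive:
        # An attentive driver gets a warning first and a late hard intervention.
        if time_to_collision_s < 1.0:
            return "full_brake"
        if time_to_collision_s < 2.0:
            return "warn_driver"
    else:
        # An inattentive driver will not react in time: intervene earlier
        # and skip the warning phase.
        if time_to_collision_s < 2.0:
            return "full_brake"
    return "monitor"

print(brake_decision(1.5, True))   # attentive driver at 1.5 s
print(brake_decision(1.5, False))  # distracted driver at 1.5 s
```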
The integrated, intelligent vision system to be developed delivers outcomes and findings on the following automation steps:
– Aimed at strongly improved image and therefore environment perception, the headlamps provide tailored adaptive scenery illumination to the assigned lamp-cameras.
– The stereoscopic, high-resolution camera system provides significantly improved information for glare-free high-beam control, such as object classification, position (horizontal, vertical), distance, direction of movement and lane.
– The stereoscopic camera system enables highly improved processing of data relevant for autonomous driving, such as detected and labelled lanes, topology, free space, obstacles, traffic participants, pedestrians, etc.
Read more about the project on the official site of TU Vienna by clicking here.
Partners: ZKW | TU Vienna | emotion3D
Funding agency: This project is funded by the FFG within the IKT der Zukunft program.
The safe detection of vulnerable road users (pedestrians, cyclists) is an important social goal as well as a demanding technical challenge, which is currently insufficiently solved, especially in poor visibility conditions (night, fog), at long distances (>100m) and when approaching from the side (crossing situations). The SmartProtect project addresses this problem by means of a holistic, integrative approach that includes software, a combination of imaging and distance-detecting sensor technology and component design with interactive modular architecture. The main characteristics of the novel automotive perception system are fused sensor data with prediction function and automated, problem-specific control of headlights and infrared lighting, supported by reactive camera and LIDAR systems, which are installed in the same component housing and cover different detection angles. The envisaged multimodal systems will be evaluated in real driving conditions and in representative, reproducible test scenarios.
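The fusion step described above can be sketched as follows (a toy example with hypothetical data layouts, not the SmartProtect implementation): camera detections are matched to LIDAR range measurements by viewing angle, so a pedestrian confirmed by both modalities gets a distance estimate and a higher fused confidence.

```python
# Hypothetical camera + LIDAR fusion for vulnerable road user detection.
def fuse_detections(camera_dets, lidar_dets, max_angle_gap_deg=2.0):
    fused = []
    for cam in camera_dets:
        # Match a LIDAR return whose bearing agrees with the camera detection.
        match = next(
            (l for l in lidar_dets
             if abs(l["angle_deg"] - cam["angle_deg"]) <= max_angle_gap_deg),
            None,
        )
        if match:
            fused.append({"label": cam["label"],
                          "distance_m": match["distance_m"],
                          "confidence": min(1.0, cam["confidence"] + 0.3)})
        else:
            # Camera-only detection: keep it, but without a range estimate.
            fused.append({**cam, "distance_m": None})
    return fused

cams = [{"label": "pedestrian", "angle_deg": 10.0, "confidence": 0.6}]
lidar = [{"angle_deg": 10.5, "distance_m": 42.0}]
fused = fuse_detections(cams, lidar)
print(fused)
```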
Partners: ZKW | TU Vienna | emotion3D
Funding agency: This project is funded by the FFG within the MdZ-2019 program.
SIMULATION-BASED AI DEVELOPMENT
For training, testing, algorithm improvement and validation, a large amount of high-quality data is required. Collecting this data in real life is highly labor-intensive and raises data-privacy questions. We are therefore intensively investigating alternative ways to create data, such as computer-graphics simulations.
The aim of the project is to provide a cost- and time-efficient simulation workflow for the development of environment analysis applications for vehicle interiors. Our innovative approach simulates complex dynamic vehicle interior scenarios (e.g. generation and animation of 3D person models, materials and surfaces, matching sensor configurations, etc.) and in particular enables the production of synthetic data for training, validation and testing purposes. This avoids the need for costly data acquisition and manual annotation and enables the replication of a variety of scenarios, environmental conditions and sensor modalities. In contrast to real (i.e. non-simulated) data sets, the planned simulation workflow can also capture rare or potentially dangerous scenarios (e.g. situations during a collision or microsleep). In addition to applications in the automotive sector, we also see operator safety in the transport sector (e.g. bus, train) and commercial vehicle sector (e.g. trucks, construction machinery) as promising areas of application for the proposed simulation workflow.
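The scenario variation described above can be sketched as simple domain randomization (all parameter names and ranges hypothetical, standing in for the actual renderer): each sample draws occupant, environment and sensor parameters, and the ground-truth label is known exactly from the scene specification, so no manual annotation is needed.

```python
# Hypothetical domain randomization for synthetic in-cabin training data.
import random

def sample_scene(rng: random.Random) -> dict:
    scene = {
        "occupant_height_cm": rng.uniform(100, 200),
        "seat_position": rng.choice(["driver", "front_passenger", "rear"]),
        "illumination_lux": rng.uniform(5, 10_000),   # night to daylight
        "camera_noise_sigma": rng.uniform(0.0, 0.05),
    }
    # In a real workflow, a renderer produces the image from `scene`;
    # the label comes for free from the scene specification.
    scene["label_occupied"] = True
    return scene

rng = random.Random(42)  # seeded, so every scenario is exactly reproducible
dataset = [sample_scene(rng) for _ in range(3)]
print(len(dataset))
```

Reproducibility is one of the key advantages over real recordings: rare or dangerous scenarios can be regenerated at will by reusing the seed.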
Partners: BECOM Systems | Rechenraum | TU Vienna | emotion3D
Funding agency: This project is funded by the FFG within the MdZ-2020 program.