DUBLIN, Feb. 13, 2024 /PRNewswire/ -- The "China Automotive Multimodal Interaction Development Research Report, 2023" report has been added to ResearchAndMarkets.com's offering.
China Automotive Multimodal Interaction Development Research Report, 2023 examines the interaction modes of mainstream cockpits, the application of those modes in key vehicle models launched in 2023, suppliers' cockpit interaction solutions, and multimodal interaction fusion trends.
A review of the interaction modes and functions of models launched over the past year shows that active, anthropomorphic, and natural interaction has become the main trend. In single-modal interaction, the control scope of mainstream modes such as touch and voice has expanded from inside the car to outside it, and in-car applications of novel modes such as fingerprint and electromyography recognition are beginning to increase. In multimodal fusion interaction, combinations such as voice + head posture/face/lip language and face + emotion/smell are becoming available in cars, aiming to create more active and natural human-vehicle interaction.
Large AI models are evolving from single-modal processing toward multimodal and multi-task fusion. Compared with single-modal models, which can process only one type of data (such as text, images, or speech), multimodal models can process and understand multiple data types, including vision, hearing, and language, and can therefore better understand and generate complex information.
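To make the distinction concrete, the minimal Python/PyTorch sketch below shows one common way multimodal fusion can work: embeddings from two modalities, such as a speech utterance and a cabin-camera frame, are projected into a shared space, concatenated, and classified into a cockpit intent. The architecture, dimensions, and intent count are illustrative assumptions, not details disclosed in the report.

import torch
import torch.nn as nn

class LateFusionIntentClassifier(nn.Module):
    """Fuse per-modality embeddings (e.g., speech and vision) and classify a cockpit intent.

    All sizes below are hypothetical; they are not taken from the report.
    """
    def __init__(self, speech_dim=256, vision_dim=512, hidden_dim=128, num_intents=10):
        super().__init__()
        # Project each modality into a shared hidden space before fusing.
        self.speech_proj = nn.Linear(speech_dim, hidden_dim)
        self.vision_proj = nn.Linear(vision_dim, hidden_dim)
        # Simple late fusion: concatenate projected embeddings, then classify.
        self.classifier = nn.Sequential(
            nn.ReLU(),
            nn.Linear(2 * hidden_dim, num_intents),
        )

    def forward(self, speech_emb, vision_emb):
        fused = torch.cat(
            [self.speech_proj(speech_emb), self.vision_proj(vision_emb)], dim=-1
        )
        return self.classifier(fused)

# Example: a batch of 4 utterance embeddings paired with 4 camera-frame embeddings.
model = LateFusionIntentClassifier()
logits = model(torch.randn(4, 256), torch.randn(4, 512))
print(logits.shape)  # torch.Size([4, 10])

Production systems often replace this simple concatenation with more sophisticated fusion, such as cross-attention between modalities, but the principle of combining complementary signals is the same.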
As multimodal foundation models continue to develop, their capabilities will improve significantly. This gives AI agents stronger perception and environment understanding, enabling more intelligent and autonomous decisions and actions, and it opens new possibilities for automotive applications, offering a broader prospect for future intelligent development.
The Spark Cockpit OS, developed by iFlytek on the basis of the Spark Model, supports multiple interaction modes such as voice, gesture, eye tracking, and DMS/OMS. The Spark Car Assistant enables multi-intent recognition through deep understanding of context, providing more natural human-machine interaction. The iFlytek Spark Model, first installed in the EXEED Sterra ES, will bring five new experiences: Vehicle Function Tutor, Empathy Partner, Knowledge Encyclopedia, Travel Planning Expert, and Physical Health Consultant.
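The report does not detail how the Spark Car Assistant parses several intents from a single utterance; the sketch below shows the basic idea in a deliberately simplified, keyword-based form. All intent names and keyword patterns are hypothetical, and a real assistant would rely on a trained NLU model rather than string matching.

# Hypothetical intent patterns; a production assistant would use an NLU model instead.
INTENT_PATTERNS = {
    "climate.set_temperature": ["temperature", "ac", "air conditioning", "warmer", "cooler"],
    "window.open_sunroof": ["sunroof"],
    "navigation.set_destination": ["navigate", "take me", "directions"],
    "media.play_music": ["play", "music", "song"],
}

def recognize_intents(utterance: str) -> list[str]:
    """Split a compound utterance on conjunctions and match each clause to an intent."""
    text = utterance.lower().replace(" then ", " and ")
    clauses = [c.strip() for c in text.split(" and ") if c.strip()]
    intents = []
    for clause in clauses:
        for intent, keywords in INTENT_PATTERNS.items():
            if any(k in clause for k in keywords):
                intents.append(intent)
                break
    return intents

print(recognize_intents("Turn the AC cooler and open the sunroof, then play some music"))
# ['climate.set_temperature', 'window.open_sunroof', 'media.play_music']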
The AITO M9, launched in December 2023, comes with the HarmonyOS 4 IVI system built in. Xiaoyi, the intelligent assistant in HarmonyOS 4, is connected to Huawei's Pangu Model, which comprises natural language, visual, and multimodal models. The combination of HarmonyOS 4, Xiaoyi, and the Pangu Model further enhances ecosystem capabilities such as device cooperation and AI scenarios, and uses multimodal interaction technology to provide diverse interaction modes, including voice recognition, gesture control, and touch control.
Key Topics Covered:
1 Overview of Multimodal Interaction
1.1 Definition of Multimodal Interaction
1.2 Multimodal Interaction Industry Chain
1.3 Multimodal Fusion Algorithms
1.4 Multimodal Interaction Policy Environment
2 Human-Computer Interaction Based on Touch
2.1 Haptic Interaction Development Route
2.2 Highlights of Haptic Interaction of OEMs
2.3 Cockpit Display Trends
2.4 Development Trends of Smart Surface Materials
2.5 Haptic Feedback Mechanism
3 Human-Computer Interaction Based on Hearing
3.1 Voice Function Development Route
3.2 Summary on Voice Functions of OEMs
3.3 Summary on OTA Updates on Voice Functions of OEMs
3.4 Development Trends of Voice Interaction Images
3.5 Application of Voiceprint Recognition in Car Models
3.6 Customization Trends of Voice Functions
3.7 Major Suppliers of Voice Functions
3.8 Voice Function Development Models of OEMs
4 Human-Computer Interaction Based on Vision
4.1 Face Recognition
4.2 Gesture Recognition
4.3 Lip Movement Recognition
4.4 Other Visual Interaction
5 Human-Computer Interaction Based on Smell
5.1 Olfactory Interaction Function Development Route
5.2 Principle of Intelligent Fragrance System
5.3 Fragrance System Technology
5.4 Application of Olfactory Interaction in Car Models
5.5 Summary on Fragrance System Technologies of OEMs
5.6 Olfactory Interaction Design Trends
5.7 Summary on Olfactory Interaction Suppliers
6 Human-Computer Interaction Based on Biometrics
6.1 Fingerprint Recognition
6.2 Iris Recognition
6.3 Myoelectric Recognition
6.4 Vein Recognition
6.5 Heart Rate Recognition
7 Multimodal Interaction Application by OEMs
7.1 Emerging Carmakers
7.1.1 Multimodal Interaction in Xpeng G6
7.1.2 Multimodal Interaction in Li L7
7.1.3 Multimodal Interaction in NIO EC7
7.1.4 Multimodal Interaction in Neta GT
7.1.5 Multimodal Interaction in HiPhi Y
7.1.6 Multimodal Interaction in Hycan A06
7.1.7 Multimodal Interaction in Hycan V09
7.1.8 Multimodal Interaction in New AITO M7
7.1.9 Multimodal Interaction in AITO M9
7.2 Conventional Chinese Independent Automakers
7.2.1 Multimodal Interaction in Chery Cowin Kunlun
7.2.2 Multimodal Interaction in WEY Blue Mountain DHT PHEV
7.2.3 Multimodal Interaction in Hyper GT
7.2.4 Multimodal Interaction in Trumpchi E9
7.2.5 Multimodal Interaction in Voyah Passion
7.2.6 Multimodal Interaction in Denza N7
7.2.7 Multimodal Interaction in Frigate 07
7.2.8 Multimodal Interaction in Changan Nevo A07
7.2.9 Multimodal Interaction in Jiyue 01
7.2.10 Multimodal Interaction in ARCFOX Kaola
7.2.11 Multimodal Interaction in Deepal S7
7.2.12 Multimodal Interaction in Galaxy L6
7.2.13 Multimodal Interaction in Lynk & Co 08
7.2.14 Multimodal Interaction in LIVAN 7
7.2.15 Multimodal Interaction in ZEEKR X
7.2.16 Multimodal Interaction in ZEEKR 009
7.2.17 Multimodal Interaction in IM LS7
7.2.18 Multimodal Interaction in GEOME G6
7.3 Conventional Joint Venture Automakers
7.3.1 Multimodal Interaction in Mercedes-Benz EQS AMG
7.3.2 Multimodal Interaction in GAC Toyota bZ 4X
7.3.3 Multimodal Interaction in FAW Toyota bZ 3
7.3.4 Multimodal Interaction in Buick Electra E5
7.3.5 Multimodal Interaction in 11th Generation GAC Honda Accord
7.3.6 Multimodal Interaction in FAW Audi e-tron GT
7.3.7 Multimodal Interaction in BMW XM
7.4 Concept Cars
7.4.1 Multimodal Interaction in Audi A6 Avant e-tron
7.4.2 Multimodal Interaction in BMW i Vision Dee
7.4.3 Multimodal Interaction in RAM 1500 Revolution
7.4.4 Multimodal Interaction in Peugeot Inception
7.4.5 Multimodal Interaction in Yanfeng XiM23s
8 Multimodal Interaction Solutions of Suppliers
8.1 Aptiv
8.2 Cipia Vision
8.3 Cerence
8.4 Continental
8.5 iFlytek
8.6 SenseTime
8.7 ADAYO
8.8 Desay SV
8.9 ArcSoft Technology
8.10 AISpeech
8.11 Horizon Robotics
8.12 ThunderSoft
8.13 PATEO
8.14 Joyson Electronics
8.15 Huawei
8.16 Baidu
8.17 Tencent
8.18 Banma Network
8.19 MINIEYE
8.20 Hikvision
9 Multimodal Interaction Summary and Trends
9.1 Multimodal Interaction Fusion Trends
9.2 Cockpit Computing Power Required by Multimodal Interaction
9.3 Large AI Models Required by Multimodal Interaction
9.4 Integration of Multimodal Interaction and Cockpit Hardware
9.5 Summary on Multimodal Interaction Features in Typical Car Models
For more information about this report visit https://www.researchandmarkets.com/r/jva2r1
About ResearchAndMarkets.com
ResearchAndMarkets.com is the world's leading source for international market research reports and market data. We provide you with the latest data on international and regional markets, key industries, the top companies, new products and the latest trends.
Media Contact:
Research and Markets
Laura Wood, Senior Manager
[email protected]
For E.S.T Office Hours Call +1-917-300-0470
For U.S./CAN Toll Free Call +1-800-526-8630
For GMT Office Hours Call +353-1-416-8900
U.S. Fax: 646-607-1904
Fax (outside U.S.): +353-1-481-1716
Logo: https://mma.prnewswire.com/media/539438/Research_and_Markets_Logo.jpg
SOURCE Research and Markets