Novel Divergent Thinking: Critical Solutions for the Future

Co-created by the Catalyzer Think Tank's divergent-thinking process and the Gemini Deep Research tool.

 

1. Executive Summary

Japanese automotive and micromobility leaders, including Honda, Yamaha, Toyota, Suzuki, and Shimano, are actively engaged in research and development (R&D) focused on decentralized edge Artificial Intelligence (AI). This push is driven by the need for real-time responsiveness, enhanced safety, increased personalization, and the enablement of next-generation vehicle capabilities. Edge AI, which involves processing data locally on the vehicle or device rather than solely relying on the cloud, is proving crucial for applications demanding low latency and high reliability, such as fully active suspension systems, complex multi-motor control strategies, and sophisticated human-machine interfaces (HMIs).

Key developments indicate a significant focus on AI-driven active and adaptive suspension systems, with patents, prototypes, and initial product releases emerging from Honda, Yamaha, Suzuki, and Shimano. These systems leverage sensors like Inertial Measurement Units (IMUs) and, in some cases, machine learning algorithms trained with rider feedback, to adjust suspension characteristics in real-time for improved comfort, handling, and stability. Concurrently, R&D into AI control for multi-power or multi-motor vehicles, particularly two-wheelers featuring all-wheel drive (AWD) or two-wheel drive (2WD) configurations, appears less mature, though conceptual patents from Yamaha suggest future possibilities in dynamic torque vectoring.

A substantial area of innovation lies in human-centric AI. Companies are exploring ways for AI to interpret complex human signals – encompassing observable actions, inferred cognitive states, and emotional responses – to create safer, more intuitive, and more engaging mobility experiences. Toyota’s Yui agent and “Driving Sensei” concept, Honda’s AI Driver Model derived from brain research, and Yamaha’s MOTOROiD pursuing the Jin-Ki Kanno ideal of human-machine unity exemplify this trend. These initiatives aim to move beyond simple assistance towards systems that can potentially adapt to driver/rider skill levels, reduce stress, enhance focus, and even contribute to user development goals related to cognitive and affective states, often termed “thinking/feeling development goals.” Shimano is also contributing through component-level AI that learns rider preferences.

Architecturally, approaches vary. Honda is pursuing powerful centralized System-on-Chip (SoC) solutions within a software-defined vehicle (SDV) framework, evolving towards a centralized E/E architecture. Yamaha’s MOTOROiD and Shimano’s intelligent components suggest more distributed or integrated edge processing capabilities. Toyota employs high-performance platforms like Nvidia DRIVE, likely blending edge processing with cloud connectivity for its AI agents. Suzuki’s public initiatives currently appear more focused on leveraging cloud-based AI for connected services and internal operations.

Despite significant progress, substantial challenges remain. Ensuring the reliability, robustness, and security of edge AI systems, particularly in safety-critical applications operating in harsh automotive environments, is paramount. Furthermore, the collection and processing of sensitive human data for personalization and state monitoring raise critical ethical and data privacy concerns that require robust governance frameworks, transparency, and user control. The trajectory points towards increasingly integrated, adaptive, and personalized vehicle AI, shifting the paradigm from simple automation towards deeper human-machine collaboration and augmentation, though the full realization of AI fostering human developmental goals through mobility remains a forward-looking vision.

2. The Landscape of Decentralized Edge AI in Japanese Mobility

The integration of Artificial Intelligence (AI) at the network edge represents a pivotal technological shift in the automotive and micromobility sectors. Japanese manufacturers are actively exploring and implementing edge AI solutions to enable real-time control, enhance safety, and deliver personalized user experiences.

2.1 Defining Edge AI in the Automotive/Micromobility Context

Edge AI, often referred to as AI at the edge, involves performing AI computations directly on the device or vehicle, close to the source of data generation (e.g., sensors), rather than transmitting vast amounts of raw data to centralized cloud servers for processing.1 This localized processing architecture is fundamentally different from traditional cloud-based AI.

The primary drivers for adopting edge AI in mobility are compelling. Firstly, it drastically reduces latency, the delay between data acquisition and decision-making.1 For safety-critical applications like active suspension adjustments, collision avoidance systems (part of Advanced Driver Assistance Systems or ADAS), and real-time stability control, millisecond response times are essential, making cloud round-trips often impractical.4 Secondly, edge processing enhances data privacy and security by minimizing the transmission of potentially sensitive data, such as driver biometrics or precise location information, to external servers.1 Thirdly, it improves reliability and robustness, as the system can continue to function even with intermittent or lost network connectivity, a common occurrence in real-world driving scenarios.6 Finally, processing data locally conserves network bandwidth and reduces the associated costs of transmitting large volumes of sensor data.3 These benefits collectively make edge AI a foundational technology for the advanced vehicle systems being pursued.
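
To make the latency argument concrete, the following is a minimal, hypothetical sketch of an edge control loop that must return an actuator command within a fixed millisecond budget. All function names, the 5 ms budget, and the trivial linear "model" are illustrative assumptions, not any manufacturer's implementation.

```python
import time

LATENCY_BUDGET_S = 0.005  # illustrative 5 ms budget for a safety-relevant control loop


def read_imu():
    """Placeholder for a local IMU read; returns (roll_rate, pitch_rate, accel_z)."""
    return 0.01, -0.02, 9.79


def tiny_onboard_model(features):
    """Stand-in for a small model running on a vehicle ECU (edge inference)."""
    roll_rate, pitch_rate, accel_z = features
    # A trivial linear policy as a placeholder for a learned controller.
    return 0.6 * roll_rate - 0.3 * pitch_rate + 0.01 * (accel_z - 9.81)


def control_step():
    start = time.perf_counter()
    features = read_imu()                 # sensor data stays on the vehicle
    command = tiny_onboard_model(features)
    elapsed = time.perf_counter() - start
    if elapsed > LATENCY_BUDGET_S:
        # A cloud round-trip (tens to hundreds of ms) would miss this budget entirely,
        # which is why such loops are computed locally at the edge.
        command = 0.0                     # fall back to a neutral, safe command
    return command, elapsed


if __name__ == "__main__":
    cmd, dt = control_step()
    print(f"command={cmd:.4f}, loop time={dt * 1e6:.0f} us")
```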

However, deploying AI at the edge introduces significant challenges. Edge devices, such as in-vehicle Electronic Control Units (ECUs) or sensor modules, typically operate under resource constraints, including limited processing power, memory, storage capacity, and stringent power consumption limits.6 Vehicles also present harsh environmental factors, such as extreme temperatures, vibrations, and dust, which can affect hardware performance and reliability.6 Furthermore, edge systems must cope with data variability, including noisy sensor inputs, missing data due to sensor malfunction or occlusion, and the need to fuse information from diverse sensor types.3 The distributed nature of edge devices also increases the attack surface, raising security vulnerabilities related to physical tampering, reverse engineering, and network intrusions.6 Ensuring the overall robustness and reliability of these complex systems in dynamic, unpredictable environments remains a critical hurdle.6
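
One widely used way to fit models within such resource constraints is weight quantization. The snippet below is a generic, self-contained illustration of symmetric int8 quantization using NumPy; it is not drawn from any of the companies discussed, and the layer size is arbitrary.

```python
import numpy as np


def quantize_int8(weights: np.ndarray):
    """Symmetric per-tensor int8 quantization: returns quantized weights and a scale."""
    max_abs = np.max(np.abs(weights))
    scale = max_abs / 127.0 if max_abs > 0 else 1.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale


def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale


if __name__ == "__main__":
    w = np.random.randn(256, 64).astype(np.float32)   # a small layer's weights
    q, s = quantize_int8(w)
    err = np.mean(np.abs(w - dequantize(q, s)))
    # float32 -> int8 cuts storage by 4x at the cost of a small reconstruction error.
    print(f"storage: {w.nbytes} B -> {q.nbytes} B, mean abs error {err:.5f}")
```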

2.2 Architectural Approaches by Key Players

Japanese companies are adopting diverse architectural strategies for implementing edge AI, reflecting varying technological priorities, legacy systems, and strategic partnerships.

  • Honda: Honda is developing its “Asimo OS” as the core software for its upcoming software-defined vehicles (SDVs), particularly the Honda 0 Series EVs.9 This OS is designed to control onboard computers for AD/ADAS functions and enable personalization by learning driver tendencies.9 Recognizing the massive computational demands (projected 500x increase by 2030), Honda is collaborating with Renesas Electronics on a high-performance, centralized System-on-Chip (SoC) intended for the central ECU.9 Their electrical and electronic (E&E) architecture is planned to evolve from a domain-centric structure (integrating functions into three main domain ECUs) to a more centralized architecture that integrates various functions like personalization and emotion/intention estimation.11 This strategy suggests a powerful central compute node performing significant AI processing at the edge, likely aggregating data from various sensors. Honda is also exploring distributed edge computing concepts for vehicle-to-everything (V2X) communication systems.13
  • Yamaha: Yamaha’s approach, particularly evident in the MOTOROiD concept motorcycle, appears more integrated and potentially distributed at the component level.15 MOTOROiD features a central control unit that integrates data and commands for its image recognition AI, the Active Mass Center Control System (AMCES) for self-balancing, the haptic HMI, IMU data, and powertrain control.16 The AMCES system itself relies on localized IMU sensing and rapid actuation to maintain balance in real-time.16 This suggests significant onboard processing capabilities distributed within the vehicle’s core systems. The futuristic Y/AI concept further hints at AI integration for functions like on-the-fly suspension adjustment.18 Collaboration with Final Aim on design processes using generative AI also points towards advanced digital integration strategies.19 While MOTOROiD clearly employs edge processing, the specific degree of decentralization in its architecture isn’t explicitly detailed.15
  • Toyota: Toyota utilizes high-performance computing platforms like Nvidia DRIVE within its vehicles, enabling real-time decision-making for autonomous navigation, indicative of significant edge processing capabilities.20 Their Concept-i and LQ vehicles feature the “Yui” AI agent, designed to learn driver emotions and preferences using sensors like cameras and microphones, adapting the HMI (lighting, sound, haptics, fragrance) and providing personalized support.21 This likely involves a hybrid architecture: edge processing for immediate sensor interpretation and HMI responses, potentially combined with cloud connectivity for deeper learning, model updates, and accessing broader contextual information.22 Specifics on the decentralization aspects of Yui’s architecture are limited in the available material.21
  • Suzuki: Suzuki’s prominent AI initiative involves adopting Microsoft’s Azure OpenAI Service, primarily for internal process optimization (e.g., document search, meeting summaries) and planned future integration into connected vehicle services, such as AI-powered navigation assistants.24 They established a Data Analysis Promotion Group to drive AI adoption company-wide.24 However, based on the provided information, Suzuki’s current public focus appears oriented more towards cloud-based generative AI applications than towards advanced, decentralized edge AI for real-time vehicle dynamics control or complex human state monitoring.5
  • Shimano: As a leading component manufacturer, Shimano integrates intelligence directly at the edge within its products.27 Systems like Di2 electronic shifting, AUTO SHIFT, and FREE SHIFT utilize onboard sensors (tracking cadence, torque, speed) and proprietary intelligent algorithms to perform real-time gear adjustments and adaptation based on riding conditions.29 Furthermore, Shimano has patented machine learning-based automatic suspension control systems that learn from rider feedback and extensive sensor data (including accelerometers, position sensors, camera input) processed locally to optimize performance.32 This component-level edge AI strategy allows bicycle manufacturers to incorporate advanced features readily. However, the wireless nature of some systems (like Di2) introduces potential security vulnerabilities.34

Table 1: Comparative Analysis of Edge AI Architectures

 

| Company | Key AI System/Concept | Stated Architecture | Key Hardware | Edge Processing Focus | Key Snippets |
| --- | --- | --- | --- | --- | --- |
| Honda | Asimo OS, Honda SENSING, AI Driver Model | Domain -> Centralized (SoC-based) | Central SoC (Renesas collab), Domain ECUs, Cameras, Sensors | ADAS/AD, Personalization, Cognitive State, HMI | 9 |
| Yamaha | MOTOROiD (AI, AMCES, HMI), AMSAS, Y/AI | Integrated/Distributed (Implied) | Control Unit, IMU, Actuators, Cameras, Haptics | Stability Control, HMI, Rider Interaction, Vision | 15 |
| Toyota | Yui Agent (Concept-i/LQ), Driving Sensei | Hybrid (Edge + Cloud likely) | Nvidia DRIVE Platform, Cameras, Microphones, Haptics | ADAS/AD, Emotion/Preference, HMI, Skill Coaching | 20 |
| Suzuki | Azure OpenAI (Connected Services), S.A.E.S. | Cloud-centric (primarily), Component | Cloud APIs, Standard ECUs, Suspension Sensors | Infotainment/Services, Electronic Suspension Ctrl | 24 |
| Shimano | Di2, AUTO SHIFT, FREE SHIFT, ML Suspension | Component-level Edge | Microcontrollers, Sensors (Speed, Torque, Cadence, Accel.), Actuators | Drivetrain Control, Suspension Control, Rider Pref. | 27 |

2.3 Analysis of Architectural Trends

The diverse architectural approaches adopted by these Japanese mobility players reveal several underlying factors. Honda’s move towards a powerful, centralized SoC aligns with the broader automotive trend towards consolidating ECU functions in SDVs, aiming for scalability, easier updates, and potentially cost savings in the long run.10 This architecture facilitates complex sensor fusion and sophisticated AI required for higher levels of automated driving.

Conversely, Yamaha’s MOTOROiD, with its tightly integrated systems like AMCES, suggests a focus on optimizing specific functionalities (like self-balancing) through dedicated, localized processing, prioritizing immediate responsiveness for unique vehicle dynamics.16 Shimano’s component-level strategy is inherent to its business model, driving innovation that can be adopted modularly by various bicycle manufacturers.29

Toyota’s hybrid approach, leveraging powerful edge platforms like Nvidia DRIVE while potentially utilizing cloud connectivity for its Yui agent, reflects a balance between real-time performance needs for driving tasks and the data-intensive learning required for sophisticated personalization and emotional understanding.20 Suzuki’s current emphasis on cloud-based AI might reflect a strategy focused on connected services and leveraging existing platforms like Microsoft Azure for rapid deployment, possibly with plans to integrate more edge capabilities later.24

Crucially, regardless of the specific architecture, edge AI’s core benefits—particularly low latency and real-time processing—are indispensable for enabling the advanced vehicle dynamics control (active suspension, multi-motor coordination) and nuanced human-centric interactions (interpreting emotion, providing skill feedback) that are central to the future visions outlined by these companies.3 The evolution towards SDVs further necessitates robust edge computing platforms capable of handling complex AI models and receiving over-the-air updates.9

3. AI-Driven Advancements in Vehicle Dynamics

Edge AI is a critical enabler for enhancing vehicle dynamics, allowing for real-time adaptation and control beyond the capabilities of traditional mechanical or passive electronic systems. Key areas of focus include active suspension and the control of multi-power/multi-motor systems, particularly in two-wheeled vehicles.

3.1 Active Suspension Systems: Real-time AI Control and Adaptation

Active suspension systems aim to improve ride comfort, handling stability, and safety by dynamically adjusting suspension characteristics in response to road conditions, vehicle motion, and potentially driver inputs or preferences. AI, particularly edge AI, is instrumental in processing sensor data and executing control commands with the necessary speed and intelligence.

  • Honda: Honda has incorporated electronically controlled suspension on models like the NT1100 touring motorcycle, where an IMU informs the system’s adjustments, suggesting sophisticated control algorithms are already in use.39 Patent filings describe steering actuators linked to suspension apparatus, potentially controlled based on detected roll angular velocity, hinting at integrated chassis control research.40 Research collaborations have also explored semi-active steering dampers using magnetorheological fluids (MRF), which could be dynamically controlled by an AI system based on real-time conditions.41 While explicit confirmation of “AI-driven” adaptive learning in current production suspension is lacking in the provided materials, the use of IMUs points towards advanced, real-time processing.39
  • Yamaha: Yamaha’s conceptual work, like the Y/AI motorcycle, explicitly envisions AI adjusting suspension on the fly.18 Their research into AMCES 17 and the Advanced Motorcycle Stabilization Assist System (AMSAS) 36 demonstrates practical application of AI and sensor-driven control (using IMUs and actuators) for chassis stabilization, particularly at low speeds. Although not full active suspension in the traditional sense, these systems represent steps towards AI-managed vehicle dynamics, focusing on stability and rider confidence.36
  • Suzuki: Suzuki recently introduced its first electronic suspension system, the Suzuki Advanced Electronic Suspension (S.A.E.S.), on the 2024 GSX-S1000GX+ model.38 This system allows riders to electronically select damping settings (soft, medium, hard, or a user-customized profile) and is part of the broader Suzuki Intelligent Ride System (SIRS).38 While electronically controlled, the available descriptions do not explicitly confirm that S.A.E.S. employs predictive AI or adaptive learning based on real-time road/rider analysis; it appears to be a pre-set, selectable system.38
  • Shimano: Shimano has a history with electronic suspension, dating back to the NEXAVE C910 system in 2001, which adjusted settings based on road conditions and speed.27 More significantly, recent patents (US 11866114 B2) detail a sophisticated automatic suspension control system for mountain bikes utilizing machine learning.32 This system integrates data from a wide array of sensors (speed, cadence, torque, accelerometers on suspension and frame, tire pressure, brake usage, yaw, roll, pitch, camera) and incorporates direct rider feedback (e.g., “Like/Dislike” responses to system adjustments) as training data.32 The goal is to enable the system to learn rider preferences and automatically optimize suspension behavior (including potentially seat post height/position) for specific terrains and riding styles, processed at the edge.32 Another patent describes electronically adjusting suspension stroke length and sag based on recorded data.33
  • General Research: Broader research explores the application of AI, machine learning, and even quantum computing techniques for optimizing suspension system parameters and control strategies.45 Systematic reviews highlight the ongoing development of various active safety systems for motorcycles, including stability control.46

The development trajectory clearly shows a move from passive or manually adjustable suspension towards electronically controlled systems, with AI and machine learning poised to enable fully active and adaptive capabilities. Shimano’s patented approach, incorporating machine learning trained by rider feedback, represents a significant step towards truly personalized and intelligent suspension operating at the edge.32
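
To make the idea of feedback-driven adaptation concrete at a purely conceptual level, the toy sketch below nudges a single normalized damping setting based on terrain roughness and binary rider like/dislike responses. It is a hypothetical illustration only, not Shimano's patented algorithm; the class name, gains, and single-parameter scope are assumptions.

```python
import random


class FeedbackTunedDamping:
    """Toy preference learner: nudges a damping setting based on rider like/dislike."""

    def __init__(self, damping=0.5, step=0.05):
        self.damping = damping      # normalized 0 (soft) .. 1 (firm)
        self.step = step
        self.last_change = 0.0

    def propose(self, roughness: float) -> float:
        """Try slightly firmer damping on rough terrain, softer on smooth terrain."""
        direction = 1.0 if roughness > 0.5 else -1.0
        new_damping = min(1.0, max(0.0, self.damping + direction * self.step))
        self.last_change = new_damping - self.damping
        self.damping = new_damping
        return self.damping

    def feedback(self, liked: bool):
        """If the rider disliked the adjustment, revert it and shrink future steps."""
        if not liked:
            self.damping -= self.last_change
            self.step *= 0.8


if __name__ == "__main__":
    tuner = FeedbackTunedDamping()
    for _ in range(5):
        tuner.propose(roughness=random.random())
        tuner.feedback(liked=random.random() > 0.3)   # simulated rider responses
    print(f"learned damping setting: {tuner.damping:.2f}")
```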

3.2 Multi-Power and Multi-Motor Control Strategies: AI for AWD/2WD Motorcycles and Beyond

Controlling vehicles with multiple power sources (hybrid) or multiple driven wheels/motors (2WD, AWD) introduces complexities requiring sophisticated control strategies, particularly for maintaining stability and optimizing performance in dynamic conditions. AI offers the potential to manage these systems intelligently.

  • Yamaha: Yamaha has explored multi-wheel concepts, most notably with the leaning three-wheeled Niken. A recent patent application related to this platform describes the potential use of independent electric motors in each of the two front wheels.47 The patent suggests using sensor data (bank angle, vehicle speed, angular speed) to automatically actuate these motors, potentially independently, to provide anti-tipping characteristics during cornering.47 While not explicitly detailed as “AI control” in the snippets, managing the torque distribution between these independent motors based on real-time sensor data for enhanced traction, stability, and rider feel would necessitate advanced algorithms, likely AI-driven and processed at the edge, aligning with Yamaha’s Jin-Ki Kanno philosophy of rider-machine unity.47 Their MOTOROiD concept, however, utilizes a single rear hub motor combined with the AMCES for balance.15
  • Honda, Suzuki, Toyota: The provided research material lacks specific examples of Honda, Suzuki, or Toyota developing AI control systems for multi-motor motorcycles or advanced AWD/2WD strategies for two-wheelers. Honda’s focus includes developing efficient e-Axles for its electric cars, leveraging expertise from hybrid vehicle development 48, and holds numerous general automotive patents.49 Suzuki is concentrating on single-motor electric scooters like the e-ACCESS and e-Address for market entry.50 Toyota’s R&D in this area appears centered on four-wheeled vehicles.53
  • General Research & Patents: Academic research investigates the dynamics and control challenges of AWD motorcycles.55 Patents exist for automotive torque vectoring systems 56 and AI-based energy management strategies for hybrid vehicles, often employing techniques like reinforcement learning.58 These concepts, while often demonstrated in automotive contexts, could theoretically be adapted for multi-power or multi-motor two-wheeled vehicles, managing power distribution between combustion engines and electric motors, or coordinating multiple electric motors for optimal traction and efficiency. Research into multi-agent reinforcement learning (MARL) for vehicle control also exists, potentially applicable to coordinating multiple motors or actuators.60

While AI-controlled active suspension is seeing tangible development and initial deployment, AI specifically for managing multi-motor configurations (like AWD or independent 2WD) on motorcycles appears to be a more nascent field, primarily explored conceptually in patents like Yamaha’s Niken evolution.47 The complex dynamics of motorcycles, especially during leaning maneuvers, present significant challenges for coordinating multiple drive motors effectively and reliably.55 Implementing such systems would require sophisticated edge AI capable of real-time sensor fusion and predictive control to ensure stability and enhance, rather than hinder, the riding experience.
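
As an illustration of the control problem only (not Yamaha's patented method), the sketch below splits a torque request between two independently driven front wheels using bank angle and per-wheel slip estimates. The derating and slip-shift gains, signal names, and bounds are assumptions chosen for readability, not tuned values.

```python
def split_front_torque(torque_request: float, bank_angle_rad: float,
                       slip_left: float, slip_right: float):
    """Toy torque-vectoring policy for two independently driven front wheels.

    - Scales total torque down as lean angle grows (simple stability margin).
    - Shifts torque away from whichever wheel is slipping more.
    """
    LEAN_DERATE = 0.8   # how strongly lean reduces total torque (assumed gain)
    SLIP_SHIFT = 0.5    # how strongly slip imbalance shifts the split (assumed gain)

    # Reduce total torque with lean angle, floored so some drive is always available.
    derate = max(0.3, 1.0 - LEAN_DERATE * abs(bank_angle_rad))
    total = torque_request * derate

    # Shift the 50/50 split away from the higher-slip wheel, bounded to [0.2, 0.8].
    imbalance = slip_left - slip_right            # > 0 means the left wheel slips more
    left_share = min(0.8, max(0.2, 0.5 - SLIP_SHIFT * imbalance))
    return total * left_share, total * (1.0 - left_share)


if __name__ == "__main__":
    tl, tr = split_front_torque(torque_request=40.0, bank_angle_rad=0.35,
                                slip_left=0.08, slip_right=0.02)
    print(f"left={tl:.1f} Nm, right={tr:.1f} Nm")
```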

3.3 Analysis of Vehicle Dynamics Advancements

The progress in AI-driven vehicle dynamics highlights a difference in maturity between suspension control and multi-motor control for two-wheelers. Active and adaptive suspension technologies are benefiting from advancements in sensors (IMUs), actuators, and control algorithms, with companies like Shimano pushing the boundary with explicit machine learning integration at the component level.32 This component-focused innovation potentially accelerates the adoption of edge AI features in the broader bicycle and micromobility market, as manufacturers can integrate these intelligent components more easily.

In contrast, AI control for multi-motor motorcycles remains largely conceptual or in early research phases, based on the available information. Yamaha’s Niken patent 47 offers a glimpse into the potential, but the practical implementation of AI for dynamic torque vectoring across multiple wheels on a leaning vehicle presents considerable hurdles in control theory, sensor fusion, and ensuring rider acceptance and safety.55 This area represents a significant frontier for future R&D, demanding robust edge AI solutions to manage the intricate real-time calculations required for stable and intuitive multi-motor propulsion on two or three wheels.

4. Human-Centric AI: Bridging Machine Control and Human Development

A significant thrust in the R&D efforts of Japanese mobility companies involves developing AI systems that interact with humans on a deeper level, moving beyond simple automation or assistance towards enhancing the user’s state, capabilities, and overall experience. This aligns with the user query’s focus on AI that “autocorrects machine towards human’s matter, molecular and observable signals towards their thinking/feeling development goals.”

4.1 Decoding “Thinking/Feeling Development Goals”: Framing the User’s Concept

The phrase “thinking/feeling development goals” suggests an ambition for AI in mobility that transcends basic functionality. It implies systems designed not just to transport or assist, but to actively contribute to the user’s cognitive and affective well-being and growth within the context of driving or riding. This can be interpreted as encompassing several interconnected objectives:

  1. Cognitive Enhancement/Support: AI assisting with or improving mental processes like attention, focus, hazard perception, situational awareness, and decision-making skills.35
  2. Affective State Regulation: AI sensing and positively influencing emotional states, such as reducing stress, mitigating frustration, increasing engagement, promoting calmness, or enhancing enjoyment.21
  3. Skill Development and Mastery: AI acting as a coach or providing adaptive assistance that helps users improve their driving or riding proficiency and confidence.37
  4. Human-Machine Relationship/Symbiosis: Fostering a deeper sense of connection, trust, partnership, or even unity between the human and the machine, as exemplified by Yamaha’s Jin-Ki Kanno philosophy.15

This vision aligns with established research fields including Affective Computing (AI understanding and responding to emotions) 66, Cognitive Augmentation 72, Intelligent Tutoring Systems / AI Scaffolding (AI supporting learning) 68, and Human-Machine Teaming / Shared Control / Symbiotic Systems (frameworks for collaboration and integration).70

4.2 Sensing the Human: Modalities and Technologies

Realizing human-centric AI hinges on the ability to accurately perceive the user’s state and intentions through various sensing modalities, processed primarily at the edge for real-time interaction.

  • Vision Systems: Cameras are widely used for monitoring driver gaze direction, head position, eye blinking patterns (for fatigue/distraction detection), and facial expressions (for emotion inference).21 Yamaha’s MOTOROiD also uses vision for recognizing the owner’s face and gestures.15
  • Audio Systems: Microphones capture voice commands and analyze tone of voice for emotional content, as seen in Toyota’s Yui system.21
  • Haptic Interfaces: These allow for bidirectional communication through touch. Yamaha’s MOTOROiD uses haptic elements in the seat/hip area (and the LEAF structure in MOTOROiD 2) to sense rider movements and provide feedback, fostering non-verbal communication.15 Honda also researches tactile feedback via seatbelts.77
  • Vehicle Dynamics Sensors: Data from IMUs, steering sensors, pedals, and wheel speeds are not only used for vehicle control but can also be analyzed by AI to infer driver intent, skill level, or state (e.g., abrupt inputs indicating stress or panic).32
  • Biometric Sensors (Primarily Research): While not explicitly confirmed in current mass-market systems by the snippets, research actively explores more direct physiological measures. Toyota collaborates with USC on sensor fusion using Electrodermal Activity (EDA/GSR), Electrocardiogram (ECG), Photoplethysmography (PPG), and respiration sensors to detect driver affective states.78 Honda’s research references Galvanic Skin Response (GSR) and Heart Rate (HR) indices for trust modeling.79 Studies also investigate breath analysis 80 and sweat analysis 81 for driver state monitoring. Wearable patches tracking skin temperature, humidity, heart rate, and blood oxygen are also being developed for emotion detection.82 These advanced sensors offer potential for more accurate state assessment but face challenges in non-intrusive integration and reliability in vehicles.
  • Radar: Emerging research explores using radar for non-contact monitoring of vital signs like breathing and heart rate, potentially overcoming limitations of cameras (e.g., working through blankets) and privacy concerns, though challenges in filtering noise remain.83

4.3 AI for Interpretation: From Signals to Understanding

Raw sensor data requires sophisticated AI algorithms, often running at the edge, to interpret its meaning and infer the user’s underlying state.

  • Emotion Recognition: AI models, often using deep learning, analyze facial expressions, voice tone, physiological signals (if available), and driving behavior to estimate emotional states like stress, fatigue, drowsiness, frustration, or engagement.21 Toyota’s Yui explicitly aims to do this.21
  • Preference Learning: AI systems learn individual user preferences over time by observing choices, driving patterns, and interactions with vehicle systems (e.g., climate, music). Honda’s Asimo OS aims to learn driver tendencies 9, and Toyota’s Yui learns preferences from conversation history and behavior.21 Shimano’s suspension learns preferences from rider feedback.32
  • Cognitive State Assessment: AI analyzes gaze patterns, driving inputs, and potentially physiological signals to assess alertness, attention levels, distraction, and cognitive load.35 Honda’s AI Driver Model compares driver behavior to a normative model based on brain research to estimate risk associated with cognitive state.35 A simplified illustration of fusing such signals into a single alertness estimate is sketched after this list.
  • Intention Prediction: By analyzing current actions, past behavior, and contextual information (e.g., navigation route, traffic), AI attempts to predict the driver’s or rider’s immediate intentions (e.g., lane change, turn, braking).77 Yamaha’s MOTOROiD HMI aims to sense rider intentions through subtle movements.15
  • Skill Level Assessment: AI can infer driver/rider skill by analyzing smoothness of control inputs, consistency, hazard response, and efficiency compared to expert models or population data.37 Honda’s Asimo OS intends to understand skill level 9, and Toyota’s Driving Sensei concept is built around assessing and improving skill.37
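
To show how such interpretation might look in its simplest possible form, the sketch below fuses three observable signals (blink rate, steering reversal rate, heart rate) into a single alertness score. The weights, normalization constants, and logistic squashing are illustrative assumptions, not any manufacturer's model; a production system would use learned, personalized models and far richer inputs.

```python
import math


def alertness_score(blink_rate_hz: float, steering_reversals_per_min: float,
                    heart_rate_bpm: float) -> float:
    """Map three observable signals to a 0..1 alertness estimate (1 = alert)."""
    # Normalize each signal roughly into 0..1 "drowsiness evidence".
    blink_evidence = min(1.0, blink_rate_hz / 0.8)                        # frequent blinking
    steering_evidence = min(1.0, steering_reversals_per_min / 40.0)       # erratic corrections
    hr_evidence = min(1.0, max(0.0, (70.0 - heart_rate_bpm) / 30.0))      # low heart rate

    drowsiness = 0.5 * blink_evidence + 0.3 * steering_evidence + 0.2 * hr_evidence
    # Squash through a logistic so mid-range evidence changes the score most.
    return 1.0 - 1.0 / (1.0 + math.exp(-8.0 * (drowsiness - 0.5)))


if __name__ == "__main__":
    print(f"alertness ~ {alertness_score(0.6, 25.0, 62.0):.2f}")
```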

4.4 Company Initiatives Towards Human Augmentation & Development Goals

Each company is pursuing distinct approaches to human-centric AI, reflecting their unique philosophies and technological strengths.

  • Honda: Honda’s efforts are deeply rooted in understanding human cognition and behavior for safety.
  • The Asimo OS platform for SDVs explicitly aims to learn driver characteristics, skill level, and preferences to personalize ADAS behavior and suggest optimal drive modes, adapting the vehicle to the individual.9
  • Honda SENSING systems use AI for real-time hazard detection and intervention (braking, steering assist, automatic lane changes), framed as an “expansion of human capabilities” by providing a safety net and reducing driving burden.86 These systems operate at the edge for immediate response.86
  • Their AI Driver Model, informed by fMRI brain research and gaze tracking, compares driver behavior to a “normative” safe driving model. It provides assistance (warnings, subtle control inputs) to mitigate risks associated with low spatial recognition or delayed reactions, aiming to make drivers feel their skills have improved, suggesting an implicit cognitive development goal.35
  • Research into Cooperative Intelligence explores AI systems that understand human intention and collaborate towards shared mobility goals.77
  • Honda Research Institute is also exploring the use of Large Language Models (LLMs) within Advanced Rider Assistance Systems (ARAS) for motorcycles, potentially enhancing safety and decision-making through contextual intelligence.88
  • Toyota: Toyota, particularly through Toyota Research Institute (TRI), focuses on AI for both safety and creating deeper emotional connections and skill development.
  • The Concept-i/LQ vehicles with the Yui AI agent represent a strong push towards affective computing. Yui uses AI (including deep learning) to sense driver emotion (via expressions, voice, actions) and preferences, adapting the HMI (lighting, sound, seat adjustments, fragrance) to reduce stress, enhance alertness, and personalize the driving experience through conversation and route suggestions.21 This directly targets “feeling” development goals like improved well-being.
  • The Driving Sensei concept is an explicit attempt at AI-driven skill development.37 It uses an AI coach providing real-time, natural language instruction (based on driver actions) and AI-powered support (demonstrating expert maneuvers like drifting) to help drivers master skills and become safer, while remaining engaged.37 This directly targets “thinking” development goals related to driving proficiency.
  • TRI’s broader research emphasizes human-focused learning, building predictive models of driver behavior and intent, and exploring shared autonomy where human and AI collaborate.37 They also research emotional well-being in human-robot interaction 92 and use advanced biometric sensors (ECG, EDA etc.) in driving studies.78
  • Yamaha: Yamaha’s approach is heavily influenced by its Jin-Ki Kanno philosophy, aiming for a profound sense of unity and partnership between rider and machine.
  • MOTOROiD and MOTOROiD 2 are physical embodiments of this vision.15 Using AI for image recognition (owner face/gestures), AMCES for autonomous self-balancing, and advanced haptic HMIs (sensing rider movement/intention via hip contact or the LEAF structure), MOTOROiD aims to communicate non-verbally, respond intuitively, and provide subtle, unnoticed support (like steering assist).15 The goal is to transform the machine into a “partner” or “lifetime companion,” enhancing the fun and feeling of control, directly addressing the “feeling” aspect of development goals through rider-machine unity.15
  • The AMSAS system provides low-speed stability assistance, aiming to give riders “peace of mind” and allow them to focus more on operating the bike, indirectly supporting skill development and confidence.36 This falls under the Jin-Ki Kanno x Jin-Ki Anzen safety vision, which includes a “Skills” pillar focused on improving user riding skills, though specific AI-driven initiatives for this pillar are not detailed.36
  • Yamaha’s research into AI musicians (Muens) that synchronize with human performers, analyzing expression and predicting timing, demonstrates their interest in AI understanding and collaborating with human nuance and “Kansei” (sensibility/feeling).93
  • Shimano: Shimano focuses on component-level intelligence to enhance the cycling experience.
  • Their adaptive suspension patent using machine learning and rider feedback explicitly aims to personalize performance according to rider preference and terrain.32 By optimizing the bike’s setup automatically, it can enhance rider confidence and allow them to focus more on the trail, potentially aiding skill progression.
  • AUTO SHIFT technology automatically selects the optimal gear based on real-time analysis of cadence, torque, speed, and terrain, reducing cognitive load and ensuring efficient power delivery, which can make riding feel easier and more intuitive.29

Table 2: Matrix of Human-Centric AI Approaches

 

| Company | Key System/Concept | Sensing Modalities Used | AI Interpretation Focus | Augmentation Strategy | Connection to “Thinking/Feeling Goals” | Key Snippets |
| --- | --- | --- | --- | --- | --- | --- |
| Honda | Asimo OS, SENSING, AI Driver Model | Vision (Gaze, Face), Vehicle Sensors, Brain Activity (fMRI Res.), LLMs (ARAS Res.) | Cognition (Risk, Attention), Skill, Preference, Intent | Assistance, Adaptation, Collaboration | Enhance safety awareness (Thinking), Improve perceived skill (Feeling), Reduce cognitive load (Thinking) | 9 |
| Toyota | Yui Agent, Driving Sensei | Vision (Face, Expr.), Audio (Voice), Haptics (Seat), Vehicle Sensors, Biometrics (Res.) | Emotion, Preference, Skill, Alertness | Adaptation, Coaching, Personalization | Reduce stress, Improve alertness (Feeling), Develop driving skills (Thinking), Emotional bond (Feeling) | 21 |
| Yamaha | MOTOROiD/2 (AMCES, HMI), AMSAS | Vision (Face, Gesture), Haptics (Hip/LEAF), IMU, Vehicle Sensors | Intent, Rider Movement, Presence | Symbiosis, Assistance, Stability | Foster rider-machine unity (Jin-Ki Kanno) (Feeling), Enhance control/fun (Feeling), Peace of mind (Feeling) | 15 |
| Shimano | ML Suspension, AUTO SHIFT | Vehicle Sensors (Speed, Torque, Cadence, Accel.), Rider Feedback, Camera (Patent) | Rider Preference, Terrain, Efficiency | Optimization, Automation, Adaptation | Enhance confidence (Feeling), Reduce cognitive load (Thinking), Improve efficiency/focus (Thinking) | 29 |

4.5 Relevant Theoretical Frameworks

The approaches being pursued by these companies map onto several theoretical frameworks for human-AI interaction:

  • AI Scaffolding / Intelligent Tutoring Systems: This involves AI providing temporary, adaptive support to facilitate learning. Toyota’s Driving Sensei, with its AI coach offering real-time instructions and tips, clearly aligns with this concept, aiming to build driver skill.37 Honda’s AI Driver Model, guiding drivers towards a normative model, also contains elements of scaffolding.35
  • Shared Control / Human-Autonomy Teaming: This framework describes scenarios where human and AI continuously share control authority over a task. Many ADAS functions, like Honda’s SENSING providing steering assistance or Yamaha’s AMSAS subtly adjusting balance, fit this model.36 The AI collaborates with the human, rather than simply taking over or being supervised. A minimal arbitration sketch of this idea follows this list.
  • Symbiotic Driving / Human-Machine Symbiosis: This represents a deeper level of integration, where the capabilities of human and machine merge, potentially creating a co-dependent system that performs beyond the sum of its parts.70 Yamaha’s vision for MOTOROiD and Jin-Ki Kanno, aiming for intuitive, non-verbal communication and a sense of partnership, strongly resonates with this concept.15 Honda’s Cooperative Intelligence research also points in this direction.87
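
A minimal way to picture shared control is an arbitration law that blends human and machine commands, giving the machine more authority as estimated risk rises while capping it so the human always retains agency. The sketch below illustrates that idea under those assumptions; the cap value and risk signal are hypothetical, and this is not any manufacturer's arbitration logic.

```python
def blended_steering(human_cmd: float, ai_cmd: float, risk: float,
                     max_authority: float = 0.7) -> float:
    """Blend human and AI steering commands.

    risk is an externally estimated value in [0, 1]; the AI's authority grows
    with risk but is capped so the human always retains some control.
    """
    authority = max(0.0, min(max_authority, risk))
    return (1.0 - authority) * human_cmd + authority * ai_cmd


if __name__ == "__main__":
    # Low risk: the blended command stays close to the human input.
    print(blended_steering(human_cmd=0.10, ai_cmd=-0.05, risk=0.1))
    # High risk (e.g., predicted lane departure): the command moves toward the AI.
    print(blended_steering(human_cmd=0.10, ai_cmd=-0.05, risk=0.9))
```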

4.6 Analysis of Human-Centric AI Developments

The exploration of human-centric AI reveals distinct philosophical and technological paths among the key players. Honda leverages deep research into human cognition (even brain activity) to build AI assistants focused primarily on enhancing safety and driver capability within established driving paradigms.35 Toyota pursues both explicit skill development (Driving Sensei) and affective computing (Yui) to create vehicles that are not only safer but also emotionally resonant and act as learning partners.22 Yamaha’s focus on Jin-Ki Kanno drives its efforts towards creating an almost organic sense of unity and intuitive interaction, particularly evident in the MOTOROiD concepts, prioritizing the experiential “feeling” of riding.15 Shimano, operating at the component level, provides tools for performance optimization and adaptation that enhance rider confidence and reduce cognitive load.29

While significant progress is being made in AI systems that assist with cognitive tasks (e.g., hazard perception via ADAS) and offer skill-based feedback (e.g., Driving Sensei concept), the ability of AI to genuinely foster deeper emotional well-being or achieve the more profound aspects of “feeling” development goals remains more conceptual and challenging to implement and measure.15 Yamaha’s pursuit of Jin-Ki Kanno through MOTOROiD is ambitious but its full realization is likely still some way off.15

The advancement of these human-centric AI systems is heavily dependent on progress in sensor technology.84 Accurately capturing subtle human physiological and behavioral signals non-intrusively within a complex vehicle environment is crucial.78 While cameras and microphones are standard, the integration and reliable interpretation of advanced biometrics or haptic feedback systems remain key areas for development that will pace the realization of these ambitious human-augmentation goals.15

5. Navigating the Challenges: Ethics, Reliability, and Security

The development and deployment of advanced edge AI systems in vehicles, especially those interacting closely with human users and controlling safety-critical functions, present significant challenges related to data privacy, security, system reliability, and ethical considerations.

5.1 Data Privacy in Learning Systems

AI systems designed to learn driver/rider behavior, skills, preferences, and even emotional states inherently rely on collecting and processing vast amounts of potentially sensitive personal data.21 This includes driving patterns, location history, biometric signals (in research contexts), facial features, voice recordings, and interaction logs.21

  • The Challenge: Balancing the data requirements for effective AI personalization and adaptation with stringent privacy regulations (like CCPA, GDPR) and user expectations is a major hurdle.97 Users need transparency and control over what data is collected, how it’s used, and who it’s shared with.98
  • Company Approaches:
  • Honda: Has faced regulatory scrutiny regarding data practices.99 Their official Vehicle Data Privacy Notice outlines commitments to transparency, choice (opt-outs for certain data collection), data minimization, security, and respect for context when handling “Covered Information,” which includes “Driver Behavior Information”.98 However, the notice lacks specific details on how data inferred by advanced AI about cognitive or emotional states would be treated differently or how it relates to “development goals”.98 They commit to de-identification where appropriate.98
  • Yamaha: Their AI Usage Policy (focused on sound/music) includes commitments to protecting privacy, establishing appropriate data handling systems, and respecting stakeholder rights.100 Their general website privacy policy covers data collection.102 The specific application of these principles to the data collected by MOTOROiD (facial recognition, gestures, haptic inputs) remains unclear from the provided documents.16
  • Toyota: The Yui system’s function relies on measuring emotion and preferences.21 While the goal is to improve quality of life and build a relationship, the ethical handling and privacy implications of continuously monitoring and mapping emotions require careful consideration.97 Specific privacy policies for Yui are not detailed in the snippets.23
  • Edge AI’s Role: Processing data locally via edge AI can mitigate some privacy risks by reducing the need to transmit raw sensitive data to the cloud.1 Techniques like federated learning, where models are trained on decentralized data without sharing the raw data itself, offer potential solutions.4 However, securing the data stored and processed on the edge device remains critical.
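
To illustrate the federated-learning idea mentioned above, the sketch below performs a FedAvg-style weighted average of locally trained model weights from several vehicles, so only model parameters (never raw driving data) leave each vehicle. It is a generic toy, not a description of any company's pipeline; the model size and sample counts are arbitrary.

```python
import numpy as np


def federated_average(local_weights: list, sample_counts: list) -> np.ndarray:
    """Weighted average of per-vehicle model weights (FedAvg-style).

    Only the weight vectors leave each vehicle; raw driving data never does.
    """
    total = float(sum(sample_counts))
    stacked = np.stack(local_weights)                         # (n_vehicles, n_params)
    coeffs = np.array(sample_counts, dtype=np.float64) / total
    return (coeffs[:, None] * stacked).sum(axis=0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Three vehicles each fine-tune the same 4-parameter model on local data.
    vehicle_models = [rng.normal(loc=0.5, scale=0.1, size=4) for _ in range(3)]
    samples = [1200, 800, 400]                                # local data set sizes
    print("global model:", federated_average(vehicle_models, samples))
```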

5.2 Security Vulnerabilities of Edge AI in Vehicles

Deploying AI capabilities across distributed edge devices (sensors, ECUs, actuators) within a vehicle creates new security challenges.

  • The Challenge: Edge components can be physically accessed, making them vulnerable to tampering, reverse engineering of algorithms (IP theft), and manipulation.103 Wireless communication links, used for updates or inter-component communication (e.g., Shimano Di2), can be susceptible to jamming, spoofing, or replay attacks.34 Ensuring the security and integrity of data and models across a complex, distributed edge architecture is difficult.5 Malicious actors could potentially compromise safety-critical functions or manipulate systems that influence driver behavior.103
  • Mitigation Strategies: Robust security measures are essential, including secure boot processes, hardware roots of trust (RoT), data encryption (at rest and in transit), access controls, and anomaly detection systems designed to identify malicious activity targeting AI models or edge devices.5 Centralizing some cybersecurity functions onto dedicated edge AI ECUs might simplify integration and maintenance.5 A holistic approach considering both traditional cybersecurity threats and AI-specific attacks (like model poisoning or evasion) is necessary.103 A toy anomaly-detection sketch illustrating this idea follows this list.
  • Current Status: While companies like Yamaha 100 and Honda 98 state commitments to security in their policies, detailed technical information on the specific security architectures protecting their edge AI systems is generally not publicly available in the provided research. The documented vulnerabilities in systems like Shimano’s Di2 highlight the real-world risks.34
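
As a deliberately simplified example of the anomaly-detection idea above, the sketch below flags message rates that deviate strongly from a rolling baseline, as might be applied to in-vehicle network traffic. The z-score threshold, window size, and traffic figures are illustrative assumptions, not a production intrusion-detection design.

```python
import statistics


class RateAnomalyDetector:
    """Flag message rates that deviate strongly from a rolling baseline (toy example)."""

    def __init__(self, z_threshold: float = 4.0, window: int = 50):
        self.z_threshold = z_threshold
        self.window = window
        self.history = []

    def observe(self, msgs_per_second: float) -> bool:
        """Return True if the current rate looks anomalous."""
        anomalous = False
        if len(self.history) >= 10:
            mean = statistics.fmean(self.history)
            stdev = statistics.pstdev(self.history) or 1e-6
            anomalous = abs(msgs_per_second - mean) / stdev > self.z_threshold
        if not anomalous:                       # only learn from traffic judged normal
            self.history.append(msgs_per_second)
            self.history = self.history[-self.window:]
        return anomalous


if __name__ == "__main__":
    det = RateAnomalyDetector()
    for rate in (100 + i % 5 for i in range(30)):   # normal traffic around 100 msg/s
        det.observe(rate)
    print("flood flagged:", det.observe(900.0))     # injected message flood
```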

5.3 Reliability Considerations for Safety-Critical AI Systems

Edge AI systems controlling vehicle dynamics or interacting with drivers must function reliably under demanding conditions.

  • The Challenge: Automotive environments involve vibrations, temperature fluctuations, and electrical noise, which can degrade sensor performance and hardware reliability.6 Edge AI algorithms must be robust to noisy, incomplete, or variable sensor data.6 Hardware failures in sensors or processors must be tolerated without causing catastrophic system failure.6 Ensuring the AI makes consistently correct and safe decisions, especially in unforeseen circumstances or edge cases, is a major validation challenge.4 Algorithmic bias, where AI performs poorly for certain demographics or conditions due to skewed training data, is also a reliability concern.97
  • Solutions and Approaches: Designing systems with redundancy (multiple sensors, backup processors), fault tolerance, and fail-safe mechanisms is crucial.8 Employing robust control strategies that can handle uncertainty is key.17 Extensive testing and validation using simulations (like Toyota using CARLA 104) and real-world driving are necessary.105 Developing lightweight, efficient AI models optimized for resource-constrained edge hardware is also important.7 A simple redundancy-voting sketch follows this list.
  • Company Efforts: Honda emphasizes rigorous testing 105 and collaborates on high-performance SoCs potentially incorporating reliability features.10 Yamaha conducted extensive testing for AMSAS development.36 Suzuki highlights the reliability of components like the battery in its e-Address scooter.51 Shimano emphasizes the durability of its e-bike components.106
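
One classic reliability pattern referenced above is sensor redundancy with voting. The sketch below takes the median of redundant channels and flags any channel that disagrees with it by more than a tolerance; the tolerance and example readings are assumptions, and real systems would add temporal filtering and diagnostics.

```python
def vote_redundant_sensors(readings: list, max_spread: float = 0.5):
    """Median-vote across redundant sensor channels and report suspect channels.

    Returns (fused_value, suspect_indices). A channel is suspect if it lies more
    than max_spread away from the median of all channels.
    """
    ordered = sorted(readings)
    mid = len(ordered) // 2
    median = ordered[mid] if len(ordered) % 2 else 0.5 * (ordered[mid - 1] + ordered[mid])
    suspects = [i for i, r in enumerate(readings) if abs(r - median) > max_spread]
    return median, suspects


if __name__ == "__main__":
    # Three redundant accelerometer channels; channel 2 has drifted badly.
    fused, suspects = vote_redundant_sensors([9.79, 9.81, 14.2])
    print(f"fused={fused:.2f} m/s^2, suspect channels={suspects}")
```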

5.4 Ethical Dimensions of Affective Computing and Cognitive Influence

Using AI to sense, interpret, and potentially influence human emotions and cognitive states introduces complex ethical considerations.

  • The Challenge: Systems like Toyota’s Yui 21 or Honda’s AI Driver Model 35 raise questions about user autonomy (is the AI subtly manipulating choices or feelings?), potential for manipulation, transparency (does the user understand how the AI works and influences them?), fairness (does the AI work equally well for everyone?), and psychological impact (could over-reliance erode skills, or could constant monitoring induce anxiety?).66 Defining the appropriate boundaries for AI intervention and ensuring alignment with human values and well-being is critical.87 The “black box” nature of some AI models makes it difficult to understand their decision-making processes, hindering trust and accountability.97
  • Frameworks and Principles: Addressing these challenges requires adopting principles of Responsible AI (RAI), emphasizing human-centered values, fairness, transparency, accountability, reliability, safety, and privacy.108 Developing Explainable AI (XAI) techniques can help demystify AI decision-making.97 Maintaining a human-in-the-loop approach, where the human retains meaningful control and agency, is often advocated.72
  • Company Stances: Honda explicitly promotes a human-centered approach through its Cooperative Intelligence concept, aiming to respect human values like competence and self-esteem, and has established a responsible AI council.65 Yamaha’s AI Usage Policy emphasizes “AI & Humans Working in Harmony,” respecting rights, ensuring transparency, and establishing AI governance.100 Toyota’s development of Yui aims to build an “emotional bond” 22 and TRI focuses on “amplifying people” 37, suggesting positive intent, but formal ethical frameworks specifically for their affective computing initiatives are less explicit in the provided material.23

5.5 Analysis of Challenges and Governance

The push towards personalized, adaptive, and human-centric AI in vehicles creates a fundamental tension, often referred to as the privacy paradox. The very data needed to make these systems effective (detailed behavioral, cognitive, and potentially emotional information) is highly sensitive, demanding robust privacy protections and user control that can sometimes limit functionality.21 Edge processing offers a partial technical solution by reducing data transmission, but the core challenge of responsible data governance for these learning systems remains.1

Furthermore, while edge AI enables the low latency required for safety-critical functions, ensuring the reliability of these complex systems in the unpredictable real world is arguably their Achilles’ heel.4 Failures in edge AI controlling steering, braking, suspension, or intervening based on inferred driver state could have severe consequences. This necessitates significant investment in robust design, redundancy, and rigorous validation methodologies that may still be evolving.8

Finally, the ethical landscape surrounding AI that interacts with human cognition and emotion is still developing.71 While companies are establishing high-level AI ethics principles 65, translating these into concrete design guidelines, operational safeguards, and governance mechanisms for systems that actively monitor and potentially shape user “thinking and feeling” lags behind the rapid pace of technological exploration.100 Striking the right balance between innovation, safety, user benefit, autonomy, and ethical responsibility will be crucial for the successful adoption of these advanced human-centric AI systems.

6. Conclusion and Future Outlook

6.1 Synthesis of Findings: Current State vs. Future Vision

The R&D landscape among leading Japanese mobility companies reveals significant investment in decentralized edge AI as an enabling technology for future vehicles and micromobility solutions. Tangible progress is evident in areas like electronically controlled suspension (Honda, Suzuki, Yamaha concepts) and component-level AI for optimization (Shimano). Advanced driver assistance systems (Honda SENSING, Toyota Safety Sense) increasingly leverage AI for perception and basic intervention. Architecturally, strategies range from Honda’s pursuit of powerful centralized edge compute nodes to Yamaha’s and Shimano’s more distributed or component-integrated intelligence.

However, the realization of the more ambitious visions presented – fully active, AI-adaptive suspension across product lines, widespread AI control for multi-motor two-wheelers, and particularly AI systems capable of fostering deep human cognitive and emotional development – remains largely in the conceptual, prototype, or early research stages. Concepts like Yamaha’s MOTOROiD aiming for Jin-Ki Kanno and Toyota’s Yui agent and Driving Sensei coach represent compelling future directions but are not yet fully realized commercial products.

6.2 Addressing the User Query: Progress Towards AI-Driven Human Development via Vehicles

Evaluating the progress towards vehicles that “autocorrect” towards human “thinking/feeling development goals,” the analysis indicates that foundational elements are being actively developed. Companies are building systems capable of:

  • Sensing: Capturing increasingly diverse human signals (gaze, expression, voice, movement, potentially biometrics).
  • Interpreting: Using AI to infer aspects of the user’s state (fatigue, stress, attention, basic preferences, rudimentary skill level).
  • Assisting/Adapting: Providing real-time support (warnings, control interventions, HMI adjustments) based on this interpretation.

However, the leap towards AI actively and demonstrably fostering development – systematically improving cognitive skills, enhancing emotional regulation capabilities, or cultivating a profound sense of human-machine unity – is substantial and largely aspirational at this point. Current systems primarily focus on safety, comfort, and immediate task assistance or personalization. While Honda’s AI Driver Model aims to make drivers feel more skilled 35, Toyota’s Driving Sensei concept explicitly targets skill coaching 37, and Yamaha’s MOTOROiD seeks emotional unity 15, these represent the cutting edge of research and concept development rather than widespread, validated capabilities for achieving deep-seated “thinking/feeling development goals.”

6.3 Key Innovators and Differentiators among Japanese Players

  • Honda: Stands out for its foundational research linking brain activity to driving behavior 35, its development of the Asimo OS and centralized SoC architecture for SDVs 9, and its Cooperative Intelligence concepts.87
  • Yamaha: Differentiates itself through the Jin-Ki Kanno philosophy and its embodiment in the MOTOROiD concepts, pushing boundaries in HMI, non-verbal communication, and symbiotic human-machine interaction.15 Their stabilization technologies (AMCES, AMSAS) are also notable.36
  • Toyota: Leads in exploring affective computing in vehicles with the Yui agent 21 and is explicitly researching AI for driver skill development through the Driving Sensei concept via TRI.37
  • Shimano: Innovates uniquely at the component level, integrating sophisticated edge AI and machine learning directly into bicycle drivetrains and suspension systems, potentially driving faster adoption in micromobility.29
  • Suzuki: Based on the available information, appears focused on leveraging existing AI platforms (like Azure OpenAI) for connected services and operational efficiency, with less public emphasis currently on cutting-edge edge AI for vehicle dynamics or deep human-centric systems.24

6.4 Recommendations for Future R&D and Industry Focus

To realize the potential of decentralized edge AI in mobility, particularly for advanced vehicle dynamics and human-centric applications, continued focus is needed in several areas:

  1. Edge AI Platforms: Develop more powerful, energy-efficient, and robust edge computing hardware (SoCs, NPUs) and software frameworks specifically designed to handle the complex AI models and real-time demands of automotive and micromobility applications under harsh environmental conditions.
  2. Reliability and Safety Validation: Establish standardized methodologies and tools for rigorously testing, validating, and ensuring the safety and reliability of complex, adaptive AI systems operating at the edge, especially those controlling safety-critical functions or influencing human state.
  3. Human Sensing Technology: Advance the development and integration of non-intrusive, accurate, and reliable sensors capable of capturing a wider range of human physiological and behavioral signals (including nuanced cognitive and emotional indicators) within the vehicle environment.
  4. Human State Modeling: Improve AI models for interpreting complex human signals to achieve a deeper understanding of user state, intent, cognitive load, emotional nuances, and skill progression.
  5. Ethical Governance and Transparency: Develop and implement clear, specific ethical guidelines and robust governance frameworks for the design and deployment of AI systems involved in affective computing and cognitive influence within vehicles. Prioritize transparency, user control, data privacy, fairness, and the preservation of human autonomy and well-being. Methods for Explainable AI (XAI) are crucial.
  6. Multi-Motor Control: Intensify research into AI-driven control strategies for multi-motor electric two-wheelers (2WD/AWD), focusing on achieving enhanced stability, traction, and intuitive rider feel across diverse conditions.
  7. Co-Adaptive Learning Systems: Explore AI architectures where both the human user and the AI system can learn and adapt from each other over time, fostering genuine skill development and a more synergistic human-machine partnership.

In conclusion, Japanese companies are making significant strides in leveraging decentralized edge AI to create more intelligent, responsive, and personalized vehicles. While advancements in AI-controlled suspension and basic human-state monitoring are becoming tangible, the vision of vehicles as active partners in human cognitive and emotional development remains a compelling but long-term goal, contingent on overcoming substantial technical, ethical, and validation challenges.

Works cited

  1. What is edge AI? | Micron Technology Inc., accessed May 7, 2025, https://www.micron.com/about/micron-glossary/edge-ai
  2. The Future of Industrial Automation with Edge AI – Cyient, accessed May 7, 2025, https://www.cyient.com/whitepaper/edge-ai-revolutionizing-industrial-automation-in-manufacturing-automotive-and-plant-operations
  3. www.arxiv.org, accessed May 7, 2025, http://www.arxiv.org/pdf/2503.09638
  4. Revisiting Edge AI: Opportunities and Challenges – Homepages of UvA/FNWI staff, accessed May 7, 2025, https://staff.fnwi.uva.nl/a.d.pimentel/artemis/InternetComputing24.pdf
  5. Driving Intelligence: How Edge AI Is Transforming Vehicle Threat Detection – VicOne, accessed May 7, 2025, https://vicone.com/blog/driving-intelligence-how-edge-ai-is-transforming-vehicle-threat-detection
  6. Robustness and Reliability in Edge AI Systems – XenonStack, accessed May 7, 2025, https://www.xenonstack.com/blog/edge-ai-systems
  7. Chapter 10: Edge AI Challenges and Real-World Mitigations – Wevolver, accessed May 7, 2025, https://www.wevolver.com/article/2024-state-of-edge-ai-report/edge-ai-challenges-and-real-world-mitigations
  8. Mobility Tech Forum 2025: Exploring the Future of Autonomous Driving – EE Times Europe, accessed May 7, 2025, https://www.eetimes.eu/mobility-tech-forum-2025-exploring-the-future-of-autonomous-driving/
  9. Honda delivers details on progress of software-defined vehicle …, accessed May 7, 2025, https://www.repairerdrivennews.com/2025/03/13/honda-delivers-details-on-progress-of-software-defined-vehicle-development/
  10. Chiplet technology and advanced SoCs are shaping the future of software-defined vehicles, accessed May 7, 2025, https://www.rdworldonline.com/chiplet-technology-and-advanced-socs-are-shaping-the-future-of-software-defined-vehicles/
  11. New Possibilities of Mobility Brought by Honda 0 Series | Honda …, accessed May 7, 2025, https://global.honda/en/stories/154-2501-ev-ces2025.html
  12. 25-1-9 KHI by Pakistan Today – Issuu, accessed May 7, 2025, https://issuu.com/pakistantoday-paperazzi/docs/25-1-9_khi
  13. Vehicle-to-Everything-Car Edge Cloud Management with Development, Security, and Operations Automation Framework – MDPI, accessed May 7, 2025, https://www.mdpi.com/2079-9292/14/3/478
  14. Distributed Edge Computing System for Vehicle Communication (Conference Paper), accessed May 7, 2025, https://par.nsf.gov/biblio/10494088-distributed-edge-computing-system-vehicle-communication
  15. MOTOROiD – Yamaha Motor Design, accessed May 7, 2025, https://global.yamaha-motor.com/design_technology/design/concept/motoroid/
  16. Examining MOTOROiD | Yamaha Motor Co., Ltd., accessed May 7, 2025, https://global.yamaha-motor.com/design_technology/technology/electronic/011/
  17. Robust Control Strategy for Robotic Motorcycle Without Falling Down at Low-Speed Driving, accessed May 7, 2025, https://www.researchgate.net/publication/365630290_Robust_Control_Strategy_for_Robotic_Motorcycle_Without_Falling_Down_at_Low-Speed_Driving
  18. Yamaha Y/AI: A Glimpse into the AI-Powered Future of Motorcycles – Captain Electro, accessed May 7, 2025, https://www.captainelectro.com/motorcycles/yamaha-yai-is-a-motorcycle-concept-so-wild-it-makes-a-ufo-look-like-a-horse-drawn-cart
  19. How Generative AI Helped Final Aim to Design a Unique Yamaha EV for Farming Terrain, accessed May 7, 2025, https://app.gnosis-mfe.autodesk.com/autodesk-university/de/class/How-Generative-AI-Helped-Final-Aim-to-Design-a-Unique-Yamaha-EV-for-Farming-Terrain-2024
  20. The Role of Edge AI and AI Agents in the Automotive Industry, accessed May 7, 2025, https://www.xenonstack.com/blog/edge-ai-agents-automotive-industry
  21. Toyota Defines Future of Mobility with Concept Car “TOYOTA …, accessed May 7, 2025, https://global.toyota/en/detail/19129483
  22. Toyota’s New “LQ” Wants to Build an Emotional Bond with Its Driver, accessed May 7, 2025, https://media.toyota.ca/en/releases/2019/toyotas-new-lq-wants-to-build-an-emotional-bond-with-its-driver.html
  23. Toyota Concept-i Makes the Future of Mobility Human, accessed May 7, 2025, https://pressroom.toyota.com/toyota-concept-i-future-of-mobility-human-ces-2017/
  24. Suzuki embraces Azure OpenAI Service, supercharges AI use with five apps, and sparks ideas for AI in business | Microsoft Customer Stories, accessed May 7, 2025, https://www.microsoft.com/en/customers/story/1752392067274688790-suzuki-motor-corporation-azure-automotive-en-japan
  25. Mild Hybrid Vehicles Market Size & Share | Industry Growth [2032], accessed May 7, 2025, https://www.skyquestt.com/report/mild-hybrid-vehicles-market
  26. Suzuki turns to AI for service – GoAutoNews Premium, accessed May 7, 2025, https://premium.goauto.com.au/suzuki-turns-to-ai-for-service/
  27. Di2 | SHIMANO BIKE COMPONENT, accessed May 7, 2025, https://bike.shimano.com/en-AU/technologies/details/di2.html
  28. E-Bike Cycling | Components, Drive units, Footwear | Shimano, accessed May 7, 2025, https://bike.shimano.com/category/e-bike.html
  29. New Intelligent Shifting Technologies to Elevate Your Ride, accessed May 7, 2025, https://bike.shimano.com/stories/article/new-intelligent-shifting-technologies-to-elevate-your-ride.html
  30. E-Bike Systems | Shimano Lifestyle, accessed May 7, 2025, https://lifestylebike.shimano.com/US/products/e-bike/
  31. New Intelligent Shifting Technologies to Elevate Your Ride – shimano bike, accessed May 7, 2025, https://bike.shimano.com/en-NA/stories/article/new-intelligent-shifting-technologies-to-elevate-your-ride.html
  32. Shimano Files Patent Detailing Machine Learning for Automatic …, accessed May 7, 2025, https://www.pinkbike.com/news/shimano-files-patent-detailing-machine-learning-for-automatic-suspension-control.html
  33. Patent Patrol: Shimano electronic suspension control automatically adjusts stroke, accessed May 7, 2025, https://bikerumor.com/patent-patrol-shimano-suspension-control-adjusts-stroke/
  34. MakeShift: Security Analysis of Shimano Di2 Wireless Gear Shifting in Bicycles | USENIX, accessed May 7, 2025, https://www.usenix.org/conference/woot24/presentation/motallebighomi
  35. Intelligent driver-assistive technology | Honda Technology | Honda, accessed May 7, 2025, https://global.honda/en/tech/Intelligent_driver_assistance/
  36. Newsletter: Developing the Advanced Motorcycle Stabilization …, accessed May 7, 2025, https://global.yamaha-motor.com/news/2023/0327/newsletter.html
  37. Toyota Research Institute Showcases Latest AI-Assisted Driving …, accessed May 7, 2025, https://pressroom.toyota.com/toyota-research-institute-showcases-latest-ai-assisted-driving-technology/
  38. 2024 Suzuki GSX-S1000GX+: Suzuki’s New Electronic Suspension – VikingBags, accessed May 7, 2025, https://www.vikingbags.com/blogs/news/the-allnew-2024-suzuki-gsxs1000gx-first-suzuki-bike-to-receive-suzukis-new-electronic-suspension
  39. 2025 Honda NT1100 First Ride Review – All the changes tested – YouTube, accessed May 7, 2025, https://www.youtube.com/watch?v=i2ChVoAYSgI
  40. Does Honda’s Motorcycle Patent Describe an Anti-Fall Device? – RideApart.com, accessed May 7, 2025, https://www.rideapart.com/news/748871/honda-motorcycle-patent-steering/
  41. Design of a Motorcycle Steering Damper for a Safer Ride – MDPI, accessed May 7, 2025, https://www.mdpi.com/2075-1702/8/2/24
  42. Robust Control Strategy for Robotic Motorcycle Without Falling Down at Low-Speed Driving – Yamaha Motor Global, accessed May 7, 2025, https://global.yamaha-motor.com/design_technology/technical_review/pdf/browse/58ts02.pdf
  43. Yamaha Developing Advanced Motorcycle Stabilization Assist System – Roadracing World, accessed May 7, 2025, https://www.roadracingworld.com/news/yamaha-developing-advanced-motorcycle-stabilization-assist-system/
  44. Robot learns motorcycle stunts: AI on two wheels, accessed May 7, 2025, https://motorcycles.news/en/robot-learns-motorcycle-stunts-ai-on-two-wheels/
  45. Multi-Objective Optimization of Independent Automotive Suspension by AI and Quantum Approaches: A Systematic Review – MDPI, accessed May 7, 2025, https://www.mdpi.com/2075-1702/13/3/204
  46. Full article: Active safety systems for powered two-wheelers: A systematic review, accessed May 7, 2025, https://www.tandfonline.com/doi/full/10.1080/15389588.2019.1700408
  47. Is Yamaha Tricking Out Its Wild 3-Wheeler With a Bunch Of New Tech?, accessed May 7, 2025, https://www.rideapart.com/news/755023/yamaha-niken-new-patent-tech/
  48. Honda Introduces Next-generation Technologies for Honda 0 Series Models at Honda 0 Tech Meeting 2024 | Honda Global Corporate Website, accessed May 7, 2025, https://global.honda/en/newsroom/news/2024/c241009eng.html
  49. Honda U.S. Patents, Patent Applications and Patent Search – Justia Patents Search, accessed May 7, 2025, https://patents.justia.com/company/honda
  50. Suzuki Unveils Three New Motorcycle Models Including an EV Scooter at the Bharat Mobility Global Expo 2025, accessed May 7, 2025, https://www.globalsuzuki.com/globalnews/2025/0117a.html
  51. e-Address | PRODUCTS | SUZUKI MOTORCYCLE GLOBAL SALON, accessed May 7, 2025, https://www.globalsuzuki.com/motorcycle/smgs/products/2025e-address/
  52. Suzuki is the latest to go electric with its e-Address scooter – New Atlas, accessed May 7, 2025, https://newatlas.com/motorcycles/suzuki-e-address-scooter/
  53. GenAI Is Changing Everything at Toyota — AI Chief Brian Kursar Explains How, accessed May 7, 2025, https://www.cdomagazine.tech/opinion-analysis/genai-is-changing-everything-at-toyota-ai-chief-brian-kursar-explains-how
  54. Toyota Research & Development: A Movement of Movement – Toyota USA Newsroom, accessed May 7, 2025, https://pressroom.toyota.com/toyota-research-development-a-movement-of-movement/
  55. Trajectory Preview Tracking Control for Self-Balancing Intelligent Motorcycle Utilizing Front-Wheel Steering – MDPI, accessed May 7, 2025, https://www.mdpi.com/2571-5577/7/6/115
  56. US11098795B2 – Torque vectoring apparatus – Google Patents, accessed May 7, 2025, https://patents.google.com/patent/US11098795B2/en
  57. WO2006068607A1 – All -wheel drive torque vectoring system – Google Patents, accessed May 7, 2025, https://patents.google.com/patent/WO2006068607A1/en
  58. US8337357B2 – Hybrid vehicle auxiliary equipment energy management – Google Patents, accessed May 7, 2025, https://patents.google.com/patent/US8337357B2
  59. Hybrid power energy management method and system for vehicle fuel cell – Google Patents, accessed May 7, 2025, https://patents.google.com/patent/CN112757922A/en
  60. (PDF) Multi-agent reinforcement learning for autonomous vehicles: a survey – ResearchGate, accessed May 7, 2025, https://www.researchgate.net/publication/365441752_Multi-agent_reinforcement_learning_for_autonomous_vehicles_a_survey
  61. US20150301532A1 – Networked multi-role robotic vehicle – Google Patents, accessed May 7, 2025, https://patents.google.com/patent/US20150301532A1/en
  62. Behaviorally-Aware Multi-Agent RL With Dynamic Optimization for Autonomous Driving | Request PDF – ResearchGate, accessed May 7, 2025, https://www.researchgate.net/publication/387851784_Behaviorally-aware_Multi-Agent_RL_with_Dynamic_Optimization_for_Autonomous_Driving
  63. An Adaptive Energy Orchestrator for Cyberphysical Systems Using Multiagent Reinforcement Learning – MDPI, accessed May 7, 2025, https://www.mdpi.com/2624-6511/7/6/125
  64. Design of Reward Function on Reinforcement Learning for Automated Driving – arXiv, accessed May 7, 2025, https://arxiv.org/html/2503.16559v1
  65. American Honda steers AI efforts with data governance, quality focus | CIO Dive, accessed May 7, 2025, https://www.ciodive.com/news/American-Honda-AI-governance-data-quality-adoption/746794/
  66. Empathetic AI Chatbots: The Transformative App for AI | SAP, accessed May 7, 2025, https://www.sap.com/japan/blogs/empathy-affective-computing-ai
  67. (PDF) Emotion-aware Design in Automobiles: Embracing Technology Advancements to Enhance Human-vehicle Interaction – ResearchGate, accessed May 7, 2025, https://www.researchgate.net/publication/391239854_Emotion-aware_Design_in_Automobiles_Embracing_Technology_Advancements_to_Enhance_Human-vehicle_Interaction
  68. Computational Teaching for Driving via Multi-Task Imitation Learning – ResearchGate, accessed May 7, 2025, https://www.researchgate.net/publication/384599156_Computational_Teaching_for_Driving_via_Multi-Task_Imitation_Learning
  69. Jin-Ki Kanno, Yamaha’s MOTOROiD AI Robot Self-Balancing Motorcycle Concept, accessed May 7, 2025, https://resident.com/vehicles-and-transportation/2025/01/17/jin-ki-kanno-yamahas-motoroid-ai-robot-self-balancing-motorcycle-concept
  70. Are Rider-Horse or Centaurs intelligent Human Systems Integration? First Sketch of reversible and non-reversible human technology/machine/AI Symbiosis – ResearchGate, accessed May 7, 2025, https://www.researchgate.net/publication/360025721_Are_Rider-Horse_or_Centaurs_intelligent_Human_Systems_IntegrationFirst_Sketch_of_reversible_and_non-reversible_human_technologymachineAI_Symbiosis
  71. Affective computing in the modern workplace – ResearchGate, accessed May 7, 2025, https://www.researchgate.net/publication/342074413_Affective_computing_in_the_modern_workplace
  72. Symbiotic AI: Augmenting Human Cognition from PCs to Cars – arXiv, accessed May 7, 2025, https://www.arxiv.org/pdf/2504.03105
  73. Computational Teaching for Driving via Multi-Task Imitation Learning – arXiv, accessed May 7, 2025, https://arxiv.org/html/2410.01608v1
  74. A review of shared control in automated vehicles: System evaluation – ResearchGate, accessed May 7, 2025, https://www.researchgate.net/publication/368254178_A_review_of_shared_control_in_automated_vehicles_System_evaluation?_tp=eyJjb250ZXh0Ijp7InBhZ2UiOiJzY2llbnRpZmljQ29udHJpYnV0aW9ucyIsInByZXZpb3VzUGFnZSI6bnVsbCwic3ViUGFnZSI6bnVsbH19
  75. Inside Toyota’s Advanced Driver Monitoring Technology – DCH Toyota of Oxnard, accessed May 7, 2025, https://www.toyotaofoxnard.com/blog/2025/april/2/inside-toyotas-advanced-driver-monitoring-technology.htm
  76. Newsletter : MOTOROiD’s Evolutionary Path to Explore New Human …, accessed May 7, 2025, https://global.yamaha-motor.com/news/2024/0524/newsletter.html
  77. Honda R&D|Innovative Research Excellence, accessed May 7, 2025, https://global.honda/en/RandD/hgrx/
  78. Sensor Fusion in Driving | Biosignal Sensing & Processing, accessed May 7, 2025, https://sail.usc.edu/~biosp/projects/Toyota/
  79. Real-Time Trust Prediction in Conditionally Automated Driving Using Physiological Measures, accessed May 7, 2025, https://par.nsf.gov/servlets/purl/10500437
  80. Toshihisa Sato National Institute of Advanced Industrial Science and Technology · Human-Centered Mobility Research Center – ResearchGate, accessed May 7, 2025, https://www.researchgate.net/profile/Toshihisa-Sato
  81. The utility of gaze entropy measures in estimating visual scanning, accessed May 7, 2025, https://researchbank.swinburne.edu.au/items/57f6ea35-db30-4322-b29a-9b98911fe376/1/Brook%20Shiferaw%20Thesis.pdf?.vi=save
  82. High-tech sticker can identify real human emotions | Penn State University, accessed May 7, 2025, https://www.psu.edu/news/research/story/high-tech-sticker-can-identify-real-human-emotions
  83. AI giving a boost to efforts to monitor health via radar – Japan Today, accessed May 7, 2025, https://japantoday.com/category/features/health/ai-is-giving-a-boost-to-efforts-to-monitor-health-via-radar
  84. (PDF) Review and Perspectives on Human Emotion for Connected Automated Vehicles, accessed May 7, 2025, https://www.researchgate.net/publication/377823567_Review_and_Perspectives_on_Human_Emotion_for_Connected_Automated_Vehicles
  85. Revolutionizing Road Safety: AI-Powered Vehicle Monitoring Systems Unveiled at CES 2025 in Las Vegas – Farmonaut, accessed May 7, 2025, https://farmonaut.com/news/revolutionizing-road-safety-ai-powered-vehicle-monitoring-systems-unveiled-at-ces-2025-in-las-vegas/
  86. Further advancement of Honda SENSING – Safety and driver …, accessed May 7, 2025, https://global.honda/en/stories/049.html
  87. Cooperative Intelligence – A Humane Perspective – Honda Research Institute Europe, accessed May 7, 2025, https://www.honda-ri.de/pubs/pdf/4274.pdf
  88. LLM Safety Framework Development for Motorcycle Advanced Rider Assistance Systems (ARAS) – Honda Research Institute USA, accessed May 7, 2025, https://usa.honda-ri.com/-/llm-safety-framework-development-for-motorcycle-advanced-rider-assistance-systems-aras-
  89. Toyota’s New “LQ” Wants to Build an Emotional Bond with Its Driver – Congress.gov, accessed May 7, 2025, https://www.congress.gov/116/meeting/house/110513/documents/HHRG-116-IF17-20200211-SD037.pdf
  90. Artificial Intelligence Agent “Yui” delivers personalised driving experience – Torque Toyota, accessed May 7, 2025, https://www.torquetoyota.com.au/2019/10/11/artificial-intelligence-agent-yui-delivers-personalised-driving-experience/
  91. Toyota Research Institute emphasizes safety of AI-assisted driving – The Robot Report, accessed May 7, 2025, https://www.therobotreport.com/toyota-research-institute-emphasizes-safety-of-ai-assisted-driving/
  92. Well-Being from the Perspective of Responsible Research and Technology Development | Frontier Research | Mobility | Toyota Motor Corporation Official Global Website, accessed May 7, 2025, https://global.toyota/en/mobility/frontier-research/42097278.html
  93. Research and Development – AI Music Ensemble Technology – Yamaha Corporation, accessed May 7, 2025, https://www.yamaha.com/en/tech-design/research/technologies/muens/
  94. Research and Development – Yamaha Corporation, accessed May 7, 2025, https://www.yamaha.com/en/tech-design/research/
  95. Driver simulators to test shared controls, limited autonomy vehicle systems – Federal Highway Administration, accessed May 7, 2025, https://highways.dot.gov/media/4086
  96. From the Concept of Being “the Boss” to the Idea of Being “a Team”: The Adaptive Co-Pilot as the Enabler for a New Cooperative Framework – MDPI, accessed May 7, 2025, https://www.mdpi.com/2076-3417/11/15/6950
  97. The Ethics of AI in Transportation: Balancing Safety, Privacy, and Fairness – Numalis, accessed May 7, 2025, https://numalis.com/ethics-of-ai-in-transportation-safety-privacy-fairness/
  98. Honda/Acura Vehicle Data Privacy Notice, accessed May 7, 2025, https://www.honda.com/privacy/connected-product-privacy-notice
  99. Honda Settles CPPA Allegations Regarding California Consumer Privacy Act Violations, accessed May 7, 2025, https://www.insideprivacy.com/ccpa/honda-settles-cppa-allegations-regarding-california-consumer-privacy-act-violations/
  100. Yamaha Group AI Usage Policy for Sound and Music – Sustainability …, accessed May 7, 2025, https://www.yamaha.com/en/sustainability/related-information/policy-type/ai-usage-policy/
  101. VEHICLE DATA PRIVACY PRACTICES – MyGarage, accessed May 7, 2025, https://mygarage.honda.com/resource/AutoLinkLegalTerms/connected-product-privacy-policy.pdf
  102. Privacy Policy – Yamaha Motor, accessed May 7, 2025, https://yamaha-motor.com/privacy-policy
  103. Edge AI: The forgotten challenge of AI security – PACE Anti-Piracy, accessed May 7, 2025, https://paceap.com/edge-ai-the-forgotten-challenge-of-ai-security/
  104. Toyota Research Institute Supports Development of Open-Source Automated Driving Simulator | Corporate | Global Newsroom, accessed May 7, 2025, https://global.toyota/en/newsroom/corporate/23017368.html
  105. Autonomous Driving: The Future Is Now – Mile High Honda, accessed May 7, 2025, https://www.milehighhonda.com/blogs/6116/autonomous-driving-the-future-is-now
  106. Shimano Auto Shift – How Auto Shift Works In Reality – YouTube, accessed May 7, 2025, https://m.youtube.com/watch?v=hZ2MS1GTpag&pp=ygUWI-2DgOydtOuwjey7qO2KuOuhpOufrA%3D%3D
  107. Motorcycle Artificial Intelligence: The Future of Motorcycles: Exploring the Role of Artificial Intelligence – FasterCapital, accessed May 7, 2025, https://fastercapital.com/content/Motorcycle-Artificial-Intelligence–The-Future-of-Motorcycles–Exploring-the-Role-of-Artificial-Intelligence.html
  108. Responsible AI Pattern Catalogue: A Collection of Best Practices for AI Governance and Engineering – OPUS at UTS, accessed May 7, 2025, https://opus.lib.uts.edu.au/rest/bitstreams/85926613-b66a-49b8-8b6d-00b744a17faa/retrieve
  109. The Combination of Artificial Intelligence and Extended Reality: A Systematic Review, accessed May 7, 2025, https://www.frontiersin.org/journals/virtual-reality/articles/10.3389/frvir.2021.721933/full
  110. Toyota’s system for automated driving and building an “emotional bond between driver and car” is named Yui : r/evangelion – Reddit, accessed May 7, 2025, https://www.reddit.com/r/evangelion/comments/10nq2nn/toyotas_system_for_automated_driving_and_building/