OEMs and Suppliers' Embodied Artificial Intelligence (and AI Robot) Layout Trend Report, 2024-2025

  • Report (PDF)
  • 350 Pages
  • February 2025
  • Region: China, Global
  • Research In China
  • ID: 6051187

From Automobiles to Embodied Artificial Intelligence (EAI): Differentiated Layout Strategy amid Industrial Correlation

1. EAI Layout: Multiple Paths of OEMs and Suppliers

In the wave of Embodied Artificial Intelligence (EAI), OEMs and suppliers across the automotive industry chain have entered the field and are actively exploring development strategies that suit them. At present, they deploy EAI through independent R&D, cooperative R&D, investment, and joint exploration of applications. Each of these layout models has its own advantages and disadvantages, and companies need to choose among them according to their own resources, technical strength and market goals.

Simply put, EAI is an agent that perceives and acts through a physical body. It is like giving AI a 'body' that can interact with the environment, so that it can learn and perform tasks by observing, moving and speaking much as humans do. The concept has brought unprecedented opportunities and challenges to the automotive industry.

(1) With closely related technologies, OEMs are accelerating the deployment of humanoid robots

OEMs' EAI deployment focuses on technology reuse, supply chain collaboration, and tapping incremental markets. Intelligent driving technology (such as perception algorithms and decision-making models) is highly homologous to EAI, so OEMs can apply the AI capabilities accumulated in autonomous driving to robot development and thereby reduce R&D costs. In the future, as hardware costs decline and foundation model capabilities improve, EAI will gradually penetrate from industrial scenarios into the consumer sector, becoming a key pillar of OEMs' intelligent transformation.

On November 6, 2024, Xpeng unveiled its next-generation humanoid robot 'Iron' at XPENG AI Day. Iron has already entered the Guangzhou factory, where it participates in production and training for the Xpeng P7+; going forward, it will focus on scenarios such as factories and offline stores.

Iron has a humanoid structure, standing 178 cm tall and weighing 70 kg, with hands designed at a 1:1 human scale. Its hands offer 15 degrees of freedom and support tactile feedback; the robot has more than 60 joints and can mimic a variety of human actions, such as standing, lying and sitting.

Equipped with the Xpeng AI Eagle Eye Vision System, Iron achieves blind-spot-free 720° environmental perception, and it also uses end-to-end foundation models and reinforcement learning algorithms.

Fitted with the same Xpeng AIOS used in Xpeng's vehicles, Iron can converse fluently and freely, retain memory, and reason.

Iron uses Xpeng's Turing AI chip, which is custom-designed for foundation models and features a 40-core processor with a computing power of 3,000T. It is the first chip that can be used in AI cars, AI robots and flying cars alike.

(2) Suppliers are going all out to build deep positions across the EAI industry chain

Automotive suppliers rely on existing hardware technologies (such as sensors, chips and motors) and supply chain resources to extend into the field of EAI. Their core logic lies in technology reuse and collaborative cost reduction. For example, the LiDAR vendor RoboSense has adapted its automotive perception solutions for robots, migrating its environment modeling capabilities; motor companies have reused automotive powertrain technology to develop high-density joint drive modules. This approach leverages mature manufacturing experience and existing customer networks, but it must also meet the stringent flexibility requirements that robotic scenarios place on hardware.

On January 3, 2025, RoboSense hosted its first AI Robotics Global Online Launch Event of 2025, titled “Hello Robot”, comprehensively presenting RoboSense's robotics technology platform strategy, and releasing a variety of digital LiDAR sensors, robot vision products and incremental parts and solutions for robots.

At the press conference, RoboSense launched the second-generation Papert 2.0 dexterous hand, which offers 20 degrees of freedom, a maximum load of 5 kg, and 14 force sensors on the fingertips and palm. Together with robotic arms and a control system, it can flexibly replicate fine human hand movements, such as delicately picking up eggs and screws.

In addition, RoboSense demonstrated for the first time its self-developed humanoid robot, which it positions as a universal development platform for incremental robot components and solutions. Building on complete machines, RoboSense will focus on three types of incremental parts, covering robot vision, touch and joints, to empower the robot industry. RoboSense also announced products such as the FS-3D force sensor for end-point motion control of legged robots, the LA-8000 high-power-density linear motor, and the DC-G1 highly integrated, compact, high-computing-power, low-power robot domain controller.

(3) Technical talent from the autonomous driving field is moving into EAI

Similar technical paths and industry dividends are drawing autonomous driving practitioners into EAI. After L4 autonomous driving ran into obstacles in commercialization, capital and talent shifted toward EAI. Autonomous driving and EAI overlap heavily in perception algorithms and decision models (end-to-end reinforcement learning), so algorithms can be shared. Autonomous driving practitioners excel at AI algorithms and rapid iteration, but they need to learn hardware-interaction knowledge such as mechanical control (e.g., force feedback and motion planning), and they face challenges such as uncertain technical routes and unproven commercial viability. They aim to gain an early lead in EAI technology through collaborative software-hardware innovation.

In November 2023, WeRide's former COO Zhang Li announced that he would join LimX Dynamics as chief operating officer (COO). Zhang Li has extensive experience in the field of autonomous driving and has led WeRide to perform remarkably in the commercialization and technology R&D of robotaxis. His joining marks LimX Dynamics’ further efforts in the field of EAI, and also reflects the in-depth integration trend of autonomous driving and EAI.

In October 2024, LimX Dynamics launched TRON 1, a 'three-in-one' modular polymorphic bipedal robot designed specifically for mobility and manipulation in complex environments. Advanced motion control algorithms and high-precision sensor fusion technology enable TRON 1 to walk stably on unstructured terrain (such as stairs and gravel roads) and to perform difficult actions such as climbing slopes and overcoming obstacles. The modular design allows users to flexibly configure functional modules according to task requirements, such as installing robotic arms for grasping or integrating visual sensors for environmental perception and navigation. The core advantage of TRON 1 lies in its highly flexible movement and strong environmental adaptability, which gives it wide application potential in industrial inspection, emergency rescue, logistics and distribution, and other fields.

2. Robotics and Automotive Industries: Collaborative Development with Similarities and Differences

(1) Industrial commonality: collaborative foundation of technology and supply chain

Hardware: The robotics and automotive industries involve highly similar hardware components. Motors, sensors, deceleration/conversion mechanisms, batteries, bearings, structural parts, cooling systems, controllers, chips and other hardware are widely used in both industries. For example, the design of Tesla Optimus borrows heavily from automotive hardware technology.

Tesla Optimus has a total of 14 rotary actuators, each of which uses 2 angular contact ball bearings and a cross roller bearing; each of its 14 linear actuators has a four-point contact bearing and a ball bearing. These four types of bearings have been widely used in the automotive industry.

Sensors: LiDAR, cameras, radar, etc. play key roles in both autonomous driving and robotic navigation. Vehicles use LiDAR to accurately detect obstacles ahead and recognize road boundaries for autonomous driving, while AI robots scan the surrounding environment with LiDAR to flexibly avoid obstacles and navigate accurately. In addition, automotive ECUs and robot motion controllers share the same underlying logic: both receive sensor signals, compute quickly according to preset algorithms, and issue instructions to actuators.
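
That shared controller cycle can be made concrete with a short sketch. The Python below is purely illustrative and not taken from the report or from any vendor's software; the sensor names, control laws and thresholds are all hypothetical. It only shows the common 'read sensors, compute with a preset algorithm, command actuators' loop described above.

```python
# Hypothetical sketch of the shared sense-compute-act cycle behind both
# an automotive ECU and a robot motion controller. All names and numbers
# are made up for illustration.
from typing import Callable, Dict

Signals = Dict[str, float]
Commands = Dict[str, float]

def control_cycle(read_sensors: Callable[[], Signals],
                  algorithm: Callable[[Signals], Commands],
                  actuate: Callable[[Commands], None]) -> None:
    """One iteration: receive sensor signals, run the preset algorithm,
    and issue instructions to the actuators."""
    signals = read_sensors()       # e.g. wheel speed / joint encoder angle
    commands = algorithm(signals)  # preset control law or planner
    actuate(commands)              # e.g. brake valve / joint motor driver

# The loop body is identical for both domains; only the bindings differ.
control_cycle(
    read_sensors=lambda: {"speed_kph": 62.0},
    algorithm=lambda s: {"brake": 0.2 if s["speed_kph"] > 60.0 else 0.0},
    actuate=lambda cmd: print("ECU command:", cmd),
)
control_cycle(
    read_sensors=lambda: {"elbow_rad": 0.8},
    algorithm=lambda s: {"elbow_torque": 1.5 * (1.0 - s["elbow_rad"])},
    actuate=lambda cmd: print("Joint driver command:", cmd),
)
```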

The Xpeng AI Eagle Eye Vision System mounted on Xpeng Iron relies on Xpeng's intelligent driving system, which combines end-to-end foundation models and reinforcement learning algorithms. This vision system allows Iron to observe the surrounding environment within a 720° range without blind spots.

Software technology: The algorithms OEMs have accumulated in autonomous driving provide valuable experience for the development of EAI. In autonomous task processing, humanoid robots and autonomous vehicles both follow the 'perception - decision-making - execution' process and overlap to some extent at the model level. Intelligent driving algorithms, such as key algorithms for path planning and motion trajectory prediction, can therefore be reused on humanoid robots.
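
As a toy illustration of that reuse, the sketch below puts a single, deliberately trivial path planner in the shared decision-making slot and drives it from two different embodiments. It is a hypothetical example under assumed interfaces, not an algorithm described in the report.

```python
# Hypothetical perception -> decision-making -> execution pipeline in which
# one planning routine is reused by a vehicle and a humanoid robot.
from typing import List, Tuple

Waypoint = Tuple[float, float]

def plan_path(start: Waypoint, goal: Waypoint, step: float = 1.0) -> List[Waypoint]:
    """Shared decision-making stage: a toy grid planner standing in for a
    reused intelligent-driving path-planning / trajectory-prediction model."""
    x, y = start
    path: List[Waypoint] = []
    while (x, y) != goal and len(path) < 1000:
        x += step if x < goal[0] else 0.0
        y += step if y < goal[1] else 0.0
        path.append((x, y))
    return path

# Perception inputs and execution outputs differ per embodiment;
# the planner in the middle does not.
def vehicle_pipeline() -> None:
    goal = (5.0, 0.0)                      # perception: cameras / LiDAR
    for wp in plan_path((0.0, 0.0), goal):
        print("steer toward", wp)          # execution: steering / throttle

def humanoid_pipeline() -> None:
    goal = (2.0, 3.0)                      # perception: onboard sensors
    for wp in plan_path((0.0, 0.0), goal):
        print("step toward", wp)           # execution: gait controller

vehicle_pipeline()
humanoid_pipeline()
```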

For example, Tesla applies the vision, navigation and AI algorithms in FSD to Optimus, which can thereby perceive the environment and make autonomous decisions. Benefiting from Tesla's increasingly powerful AI training capabilities and autonomous driving scenario simulation systems such as Dojo, Tesla’s robots will have the ability to recognize environmental paths and surrounding objects and plan paths before leaving the factory.

Supply chain: The mature experience of the automotive supply chain provides strong support for the development of the robotics industry. The robotics and automotive supply chains share technical origins in some parts and components, such as batteries, motors and bearings. After long-term development, the automotive supply chain has accumulated experience in large-scale automated production and can help achieve mass production of robots at lower cost.

(2) Industrial differences: divergence in market demand and technical focus

Supply chain - custom parts:

Automobiles: Thanks to highly standardized automotive production, many auto parts are universal and can be mass-produced to reduce costs.

Robots: Because application scenarios are diverse, EAI products vary in form and function, so their parts are mostly customized. For example, surgical robots and household cleaning robots have very different functional requirements, so components such as joint structures and sensors vary greatly between them and are difficult to produce in a large-scale, standardized way.

Supply chain - response speed:

Automobiles: The automotive production cycle is long, and it may take several years from design to mass production. The relatively stable supply chain does not need to respond on very short notice. OEMs usually make production plans in advance and sign long-term contracts with suppliers to ensure a stable supply of parts.

Robots: Rapid changes in market demand and frequent technological updates and iterations require the supply chain to respond faster. For example, consumer-grade smart robot manufacturers have to quickly adjust product designs once new functional requirements or technological breakthroughs emerge in the market.

Hardware technology - integration and complexity:

Automobiles: An automobile is a highly integrated and complex system with many hardware components working closely together. Engine/motor systems, electric drive systems, battery systems, transmissions, chassis, electrical systems and so on all involve complex designs and functions, and they must be matched to one another.

Robots: Although multi-hardware collaboration is also involved, the integration method and focus are different from automobiles. EAI products may focus more on the integration of specific functional modules, such as robot joint modules integrating motors, sensors and controllers.

Hardware technology - energy storage:

Automobiles: New energy vehicles use power batteries, such as lithium-ion batteries, as their main energy storage devices. To achieve a long cruising range, usually several hundred kilometers, they need large battery capacity, typically quoted in kWh (a rough illustrative calculation follows this comparison).

Robots: Diverse and flexible energy storage hardware can be selected according to product functions and application scenarios. Compared with new energy vehicles, the energy storage hardware of EAI products should feature miniaturization, lightweight and adaptability to different scenarios, rather than a long cruising range.
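
To make the contrast concrete, here is a back-of-the-envelope calculation. Every figure in it is an assumed round number chosen only for illustration, not data from the report.

```python
# Assumed, illustrative figures only: contrast an EV traction battery
# (sized for range in km) with a small humanoid robot pack (sized for runtime).
ev_battery_kwh = 60.0             # assumed mid-size EV pack
ev_consumption_kwh_per_km = 0.15  # assumed ~15 kWh per 100 km
ev_range_km = ev_battery_kwh / ev_consumption_kwh_per_km
print(f"EV range: {ev_range_km:.0f} km")          # -> 400 km

robot_battery_kwh = 0.8           # assumed small humanoid pack
robot_avg_power_kw = 0.3          # assumed average draw while working
robot_runtime_h = robot_battery_kwh / robot_avg_power_kw
print(f"Robot runtime: {robot_runtime_h:.1f} h")  # -> 2.7 h
```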

Software technology - algorithms and data processing:

Automobiles: In the field of autonomous driving, algorithms center on environmental perception, decision-making and planning. For example, object detection and recognition algorithms identify vehicles and pedestrians on the road, and path planning algorithms plan safe driving paths based on sensor data. Data processing mainly involves the massive but relatively uniform road information collected by automotive sensors.

Robots: Algorithms cover a wider range of fields. In addition to environmental perception and motion planning, they also include interactive algorithms for natural language processing and emotion recognition. Data comes from various sources, including not only sensor data, but also user interaction data, cloud data, etc.

3. Layout: Key Considerations for OEMs and Suppliers

Technology R&D and innovation:

They should invest heavily in the core technologies of EAI, such as foundation models, sensor fusion, motion control and human-computer interaction. For example, in the development of Optimus, Tesla has continuously refined the migration of FSD technology to enhance the robot's perception and decision-making capabilities, while Huawei keeps iterating the Pangu Models to strengthen task planning and multi-scenario generalization for EAI.

OEMs and suppliers should avoid duplicated R&D, as the automotive and robotics industries have significant commonalities in many respects. In terms of supply chain, the raw materials and parts channels required by the two often overlap, and integrating supply chain resources can significantly reduce procurement costs and management complexity. At the software level, AI foundation models and deep learning algorithms are the core driving forces behind EAI, so sharing these technical achievements can accelerate R&D. At the hardware level, environment-perceiving sensors and power-transmission components such as motors, gears and bearings are highly similar across the two industries, so unifying technical standards and R&D plans can avoid repeated investment.

Market demand and application scenario expansion:

In the emerging EAI field, market demand and application scenarios are still being explored and expanded. Enterprises should conduct thorough market research, gain an in-depth understanding of the needs of different industries and users, and develop targeted products and solutions. They should actively cooperate with potential customers, carry out pilot projects and application demonstrations, accumulate market experience, and improve the market fit and competitiveness of their products.

In addition to common scenarios such as industrial manufacturing, logistics warehousing, and home services, they should vigorously explore the application potential of EAI in sectors such as medical care, education and agriculture. For example, they should develop assistive robots for medical rehabilitation, intelligent robots for education, picking and farming robots for agricultural production, etc. By expanding application scenarios, they can seize more market share and depend less on a single market.

Talent training and introduction:

EAI involves multiple disciplines, so OEMs and suppliers should recruit and train talent with interdisciplinary knowledge and skills. For example, Geely, BYD, Huawei and other companies set up dedicated EAI research teams at the end of 2024, and their job postings explicitly require candidates with knowledge spanning machinery, automation, mechanics, computer science, mathematics and electronic information. Companies can also improve the interdisciplinary capabilities of existing employees and build a professional talent pool through internal training, cooperation with universities, and similar measures.

Table of Contents

1 Overview of EAI
  • Terminology
  • Policy
  • With Strong Policy Support, Regions Have Established Innovation Centers and Laid out AI Robot Industry Clusters
  • National Humanoid Robot Policies/Plans (2021-2024)
  • National Top-level Design Supports the Development of the Humanoid Robot Industry
  • Overview of EAI
  • Basic Concept
  • Development and Evolution
  • Status Quo of Industrial Development
  • Application Scenario Evolution
  • With Mature Technology, the Humanoid Robot Industry at Home and Abroad Has Been Developing Fast
  • Overview of AI Humanoid Robots
  • Definition
  • Composition
  • AI Foundation Models Empower Humanoid Robot Technology Routes
  • Intelligent Grading
  • Other Applications of AI Foundation Models - LLMs Empower Robotic Arm Planning and Control of AI Humanoid Robots
  • Application Scenarios
  • Industry Chain
  • Commonalities and Differences between Automotive and Robotics Industries
  • Commonalities between Smart Cars and Robots in Hardware Components
  • Commonalities between Smart Cars and Robots in Software Technology
  • Commonalities between Smart Cars and Robots in Supply Chain and Production
  • Commonalities between Xpeng/Tesla's Humanoid Robots and Smart Cars in Technology
  • Differences between Smart Cars and EAI
  • EAI Team Building
  • EAI Team Building of Domestic OEMs
2 Overview of EAI Technology
  • Technology System
  • Three Core Links in Realization of EAI
  • Multimodal Foundation Models Drive Embodied Perception to Embodied Cognition
  • Embodied Imagination: Imagine How the Perceived Objects and Tasks Are Performed in Simulation Engines
  • Embodied Execution: Transform Solutions Generated in Embodied Imagination into Practice
  • EAI Can Be Viewed as an “AI Agent” That Interacts with the Environment
  • EAI Is Quite Different from Traditional Intelligence in Learning Methods
  • Four Core Elements of EAI
  • EAI Technology System
  • EAI Technology System - Perception Module
  • EAI Technology System - Decision-making Module
  • EAI Technology System - Action Module
  • EAI Technology System - Feedback Module
  • Brain of EAI
  • Composition
  • Technology Core: Foundation Models
  • VLMs (Hierarchical Models)
  • Summary
  • Pi Zero
  • PaLM-E: Embodied Multimodal Language Model
  • Figure AI Collaborates with OpenAI to Launch a Three-tier Decision-making Solution
  • Noematrix Brain
  • Galbot’s Universal Three-tier Foundation Model System
  • VLA (End-to-end)
  • Summary
  • Concept and Technology Stack
  • NAVILA: Vision-Language-Action Model for Legged Robots for Navigation
  • OpenVLA: Open Source Vision-Language-Action Model
  • OpenVLA: End-to-end Training of VLM
  • Vision-Language-Action (VLA) Model - Robotic Transformer2 (RT-2)
  • Uni-NaVid Proposes a VLA Model That Unifies Multiple Embodied Navigation Tasks
  • QUAR-VLA: Vision-Language-Action Model for Quadruped Robots
  • RoboMamba: End-to-end VLA Model
  • VLN (Vision-Language Navigation)
  • Summary
  • Basic Concept
  • Main Implementation Methods
  • Comparison between VLA and VLN Models
  • LH-VLN: Vision-Language Navigation with a Long-term Development Vision: Platforms, Benchmarks and Methods
  • Safe-VLN: Collision Avoidance for Autonomous Robot VLN in Continuous Environments
  • MC-GPT: Augment VLN through Memory Graphs and Reasoning Chains
  • World Models
  • Summary
  • Basic Architecture
  • Definition and Application
  • AgiBot and Shanghai AI Lab Propose the Embodied 4D World Model - EnerVerse
  • 3D-VLA: A 3D Vision-Language-Action Generative World Model
  • RoboDreamer: Learning Compositional World Models for Robot Imagination
  • IRASim - A World Model in Robotics
  • Robotic World Model: A Neural Network Simulator for Robust Policy Optimization in Robotics
  • EAI Ontology
  • EAI Actively Interacts with the Physical Environment and Covers a Wide Range of Embodied Forms
  • Humanoid Robot Composition
  • Three Core Sensors of Robots - Force/Torque Sensor
  • Three Core Sensors of Robots - Vision Sensor/Tactile Sensor
  • Actuator: Key Component of Robot Hardware System
  • Robot Actuator Classification
  • Key Part of Linear Drive - Ball Screw
  • High-value Robot Reducers
  • Robot’s Core Driver - Motor
  • Servo Motor: Responsible for Precise Robot Control
  • Key Component of Dexterous Hand - Coreless Motor
  • Tesla Optimus Uses Frameless Torque Motors in 28 Actuators
  • Dexterous Hand Composition
  • Dexterous Hand Classification by Transmission Method
  • Dexterous Hand Perception System
  • Simulation Data
  • Summary of EAI Data Generalization Models
  • UnrealZoo: A Rich Collection of Photo-realistic 3D Virtual Worlds Built on Unreal Engine
  • RoboTwin: A Dual-Arm Robot Benchmark with Generative Digital Twins
  • RoboGSim: A Real2Sim2Real Data Synthesizer and Closed-loop Simulator
  • Arbitrary Point Trajectory Model: ATM Framework
3 Transformation of Intelligent Driving Practitioners to EAI
  • Practitioners Transitioning from Intelligent Driving to Other Arenas
  • Galaxea AI Is Committed to Full-stack Independent R&D of Embodied Robots, and Raised over RMB200 million in Pre-Series A Financing
  • The First EAI Company to Implement an End-to-end Simulation Learning Algorithm in a Closed Loop
  • Novel Visual Simulation Learning Algorithm: 3D Diffusion Strategy (DP3)
  • Launch of R1 Full-size Dual-arm Humanoid Robot
  • LimX Dynamics Focuses on the R&D and Manufacturing of Sports Intelligence and Legged Robots
  • Leverage the Understanding Capabilities of LLMs to Promote the Integration of Spatial and Motion Intelligence on Humanoid Robots
  • Release of CL-2
  • TRON 1: 'Three-in-one' Modular Polymorphic Bipedal Robot
  • Galbot Has Developed Tens of Millions of Scenario Data and Billions of Captured Data
  • DexGraspNet 2.0
  • Open6DOR Spatial Intelligent Foundation Model System
  • End-to-end Navigation Foundation Model - NaVid
  • Release of Galbot (G1)
  • The Former Head of Autonomous Driving at Alibaba Damo Academy Founded Udeer·AI
  • LPLM-10B EAI Model
  • Master2000 Plug-and-play Universal Embodied Brain
  • AI 130 Commercial Patrol Robot
  • CenoBots
  • Four Core Technologies of Intelligent Cleaning Robots
  • SP50 Intelligent Cleaning Robot
  • L4 AI-powered Washer Dryer Robot
  • AI² Robotics Solves the Core Problem of EAI: Extend AGI to the Physical World
  • Universal EAI System: AI2R Brain
  • Alpha Bot 1S Debuted at the 2024 World Robot Conference
  • Wanren AI - Foundation Model Application Covering the Entire Disease Process
  • Dong Feng Model: “New Engine” Driving the Comprehensive Upgrade of the Medical Industry
  • Memex Technology System
  • Jacobi.ai Is Committed to the Commercialization of Open Scenarios
  • J-Box Technology
  • J-Mind Technology
  • PIA AUTOMATION Deploys Humanoid Robots in Industrial Scenarios
  • Most Core Chips of Humanoid Robots Have Been Replaced by Domestic Counterparts
  • JARVIS Humanoid Robot
  • Strategic Cooperation with Beijing Embodied AI Robot Innovation Center
4 Layout of OEMs in EAI
  • Layout of Global OEMs in EAI Robots
  • Robot Deployment Timeline of OEMs
  • Parameters of OEMs’ Latest Robots
  • Tesla
  • Development History of Optimus
  • Evolution of Optimus
  • Latest Features of Optimus
  • Introduction to Optimus Gen 3
  • Three Major Updates of Optimus Gen 3
  • Degrees of Freedom of Optimus Gen 3
  • Comparison between Three Generations of Optimus
  • Hardware Parameters and Cost Breakdown of Optimus
  • Optimus - Environmental Perception
  • AI Humanoid Robot BOM Cost Estimation
  • AI Humanoid Robot Hardware Disassembly - Design Based on Automotive Technology
  • AI Humanoid Robot Hardware Disassembly - Rotary Actuator
  • AI Humanoid Robot Hardware Disassembly - Linear Actuator
  • AI Humanoid Robot Hardware Disassembly - Hand Control
  • AI Humanoid Robot Hardware Disassembly - Knee Joint
  • AI Humanoid Robot Hardware Disassembly - Brain
  • AI Humanoid Robot Hardware Disassembly - Brain Chip
  • AI Humanoid Robot Hardware Disassembly - Brain Training Module
  • AI Humanoid Robot Hardware Disassembly - Brain Training Cluster
  • AI Humanoid Robot Software Algorithm - Introduction to FSD - Migration of FSD Technology to Optimus
  • AI Humanoid Robot Software Algorithm - Introduction to FSD - FSD Is Connected with OPTIMUS Underlying Module
  • AI Humanoid Robot Software Algorithm - Introduction to FSD - FSD Neural Network Training
  • AI Humanoid Robot Software Algorithm - Introduction to FSD - FSD Neural Network Training Progress
  • AI Humanoid Robot Software Algorithm - Introduction to FSD - FSD Massive Database Supports Robot AI Training
  • AI Humanoid Robot Software Algorithm - Perception Algorithm
  • AI Humanoid Robot Software Algorithm - Motion Planning Algorithm
  • Xpeng
  • Comparison between Two Generations of Humanoid Robots
  • Basic Parameters of the Fourth-generation Humanoid Robot 'Iron'
  • “Iron” Eagle Eye Vision System and AIOS
  • Turing AI Chip
  • Application Prospects of “Iron”
  • Introduction to AI Humanoid Robots
  • AI Humanoid Robot Hardware Disassembly - Joint
  • AI Humanoid Robot Hardware Disassembly - Dexterous Hand
  • AI Humanoid Robot Hardware Disassembly - Robotic Arm
  • AI Humanoid Robot Software Algorithm - Perception and Interaction Are Connected with Automotive Underlying Capabilities
  • AI Humanoid Robot Software Algorithm - Automotive Technology - XNGP Foundation Model
  • AI Humanoid Robot Software Algorithm - Automotive Technology - Underlying XNGP Technology: XBrain
  • AI Humanoid Robot Software Algorithm - Perception Architecture - XNet 2.0
  • AI Humanoid Robot Software Algorithm - Perception Architecture - XNet Data Annotation
  • AI Humanoid Robot Software Algorithm - Planning & Control Architecture - XPlanner
  • Xiaomi
  • Robot Layout
  • AI Humanoid Robot Planning
  • Quadruped Robot: CyberDog 2
  • CyberOne Hardware Architecture
  • CyberOne Joint Module
  • CyberOne Power System
  • CyberOne Communication System
  • CyberOne Control System
  • CyberOne Software Algorithm
  • GAC
  • Comparison between the Second- and Third-generation Robots in Parameters and Features
  • The Second-generation EAI Robot
  • Basic Parameters of the Third-generation EAI Robot - GoMate
  • GoMate Drive and Control Integrated Joint Module
  • GoMate Micro Low Voltage Servo Driver
  • GoMate Dexterous Hand
  • GoMate Software and System
  • Application of Autonomous Driving Technology on Humanoid Robots
  • BYD
  • EAI Layout
  • Hyundai
  • Acquisition of 80% Shares in Boston Dynamics
  • X-ble Shoulder Exoskeleton Robot
  • X-ble MEX Wearable Medical Robot
  • Two Exoskeleton Wearable Devices Were Launched in September 2019
  • DAL-e Intelligent Service Robot
  • Chery
  • New Bipedal Robot: Mornine
  • Mornine Application Planning
  • Toyota
  • Humanoid Robot Development History
  • TRI Partners with Boston Dynamics to Develop Humanoid Robots
  • “Busboy” Home Service Robot
  • Welwalk Wearable Robot
  • Huawei
  • Robot Layout
  • Official Entry into EAI Industry with Further Strengthened Trends
  • Pangu EAI Model
5 EAI Layout of Automotive Industry Chain Suppliers
  • Sensor
  • Speed Reducer
  • Dexterous Hand, Motor
  • Complete Machine
  • Hesai Technology
  • Introduction
  • Robot LiDAR
  • Mini High-performance 3D LiDAR
  • LiDAR Suitable for Robot Scenarios - QT128
  • LiDAR Suitable for Robot Scenarios - XT32
  • RoboSense
  • Introduction
  • Papert 2.0
  • Active Camera
  • Digital LiDAR Empowers Automotive and Robot Perception Capabilities
  • Incremental Robot Parts Development
  • Tuopu Group
  • Introduction
  • Plan to Invest RMB5 Billion in Building a Production Base for Core Robot Components
  • Build the Competitiveness of the Robot Actuator Business with Multiple Core Advantages
  • Sanhua Intelligent Controls
  • Introduction
  • Robot Project Layout
  • Strategic Cooperation with Leaderdrive
  • Inovance
  • Introduction
  • SCARA Robot - IR-S4
  • Six-joint Robot - IR-R300
  • Supporting Robot Software
  • IRCB500 Industrial Robot Control Cabinet
  • Industrial Robot Precision Machinery - Ball Screw
  • Industrial Robot Precision Machinery - Linear Guide
  • Zhaowei Machinery & Electronics
  • Introduction
  • High-performance Micromotor Technology Helps the Application of Humanoid Robots
  • A Press Conference Was Held at the China Hi-Tech Fair and Highly Reliable Dexterous Hands Integrated with Finger Drive Were Officially Launched
  • Machine Joint Brushless Servo
  • Micro Drive System for Walking Actuator of Lawn Mowing Robot
  • Shuanghuan Company
  • Introduction
  • RV Reducer Layout
  • High-precision Harmonic Transmission Reducer
  • High-precision Robot Cycloid Planetary Reducer - SHPR-C
  • HCFA
  • Introduction
  • Robot Research Project
  • “YOLO01” Universal Humanoid Robot
  • Self-developed Parts of “YOLO01” Humanoid Robot
  • Thundersoft
  • Introduction
  • The Next Generation of Multi-modal Intelligent Robots Will Be Empowered by Arm Technology
  • Thundercomm Officially Launched RUBIK Pi 3 (Development Kit)
  • Rubik Robot Foundation Model
  • KNEWBOTS Unveiled Many New Intelligent Robots at CeMAT ASIA on November 5, 2024
  • KNEWBOTS Robot Software Platform (RSP)
  • CATL
  • Introduction
  • Robot Layout: Independent Robotic R&D and External Cooperation in Parallel
  • Launch of “CharGo” Mobile Storage, Charging and Inspection Integrated Robot
  • ZongMu Technology
  • Introduction
  • FlashBot
  • FlashBot Business Model - Empower the Integration of Light, Storage and Charging
  • SenseTime
  • Introduction
  • EAI Layout
  • SenseNova 5.5 Empowers Humanoid Robots with 'Eyesight' and 'Brain Power'
  • Large Devices Create a New Generation of AI Infrastructure for EAI
  • Haomo.AI
  • Introduction
  • A Variety of Robots Have Landed in the Wisdom Valley, Working together to Create New Productivity
  • DriveGPT Empowers Little Magic Camel
  • Horizon Robotics
  • Profile
  • Evolution of Sweet Potato Robot
  • Release of Full Link Sweet Potato Robot at the 2024 Developer Day
  • RDK One-stop Intelligent Robot Developer Kit
  • Rising Sun 5 Intelligent Computing Chip
  • Sweet Potato Robot Algorithm Center (NodeHub)/RDK-Copilot Development Assistant
6 Development Trends of EAI
  • Technology Trends
  • EAI Promotes the Development of Robotics Technology from Simple Perception to Multi-modal Perception
  • Coordinated Development of Software and Hardware Promotes Positive Industry Cycle
  • The Vision + Touch Solution Has Better Technical Indicators and Is Expected to Become the Mainstream Solution in the Future
  • Technology Trends
  • Product Trends
  • Various Robot Carrier Forms Develop Together
  • EAI and Various Robot Carrier Forms Develop Together - Robot Dog
  • EAI and Various Robot Carrier Forms Develop Together - Special Robots
  • Product Trends
  • Industrial Market Trends
  • Global and Chinese Humanoid Robot Markets Continue to Grow
  • Market Demand Trend: Humanoid Robots Are Expected to Alleviate the Labor Shortage in the Market
  • An Investment Boom in EAI in 2024 with Incremental Components and Embodied Models Favored by Investors
  • Industry Trends

Companies Mentioned

  • Galaxea AI
  • LimX Dynamics
  • Galbot
  • Udeer·AI
  • CenoBots
  • AI² Robotics
  • Wanren AI
  • Jacobi.ai
  • PIA AUTOMATION
  • Tesla
  • Xpeng
  • Xiaomi
  • GAC
  • BYD
  • Hyundai
  • Chery
  • Toyota
  • Huawei
  • Hesai Technology
  • RoboSense
  • Tuopu Group
  • Sanhua Intelligent Controls
  • Inovance
  • Zhaowei Machinery & Electronics
  • Shuanghuan Company
  • HCFA
  • Thundersoft
  • CATL
  • ZongMu Technology
  • SenseTime
  • Haomo.AI
  • Horizon Robotics
