
Arm-based Quadruped and Humanoid Robotics for Physical AI use-cases


Description

Why is this important?

Arm provides a foundational compute platform for Physical AI

Low-cost Arm-powered robotic platforms — both quadruped and humanoid — are rapidly lowering the barrier to entry for Physical AI: systems that perceive, reason, and act in the real world. Quadruped platforms such as PuppyPi, Waveshare SpotMicro-style robots, OpenDog derivatives, and Raspberry Pi / RK3588-based kits have already made dynamic locomotion accessible. In parallel, compact humanoid and biped platforms are available with Arm-based compute, servo-actuated joints, and open software stacks.

These robots typically integrate:

  • Arm CPUs (Cortex-A class, often paired in heterogeneous SoCs with NPUs or GPUs)
  • Cameras, IMUs, joint encoders, and force sensors
  • Real-time motor control running alongside Linux-based AI stacks

This combination makes them ideal testbeds for exploring physical AI under real-world constraints: latency, power, noisy sensors, contact dynamics, and safety-critical control — all running fully on-device, without reliance on cloud inference.

By leveraging efficient Arm-native ML frameworks (e.g. PyTorch + ExecuTorch, LiteRT, ONNX Runtime, or accelerated ROS 2 pipelines), developers can study how modern AI models behave when tightly coupled to physical embodiment, whether quadrupedal or humanoid. Arm-based systems enable tight perception–action loops, reducing the latency from sensing a photon or audio wave to taking a meaningful physical action.
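One practical consequence of supporting several runtimes is keeping control code independent of the inference framework. The sketch below shows one way to do that with a minimal backend interface; all names here are illustrative and are not part of any listed framework's API:

```python
from abc import ABC, abstractmethod
from typing import Sequence


class InferenceBackend(ABC):
    """Minimal interface so a policy can be swapped between runtimes
    (e.g. ExecuTorch, LiteRT, ONNX Runtime) without touching control code."""

    @abstractmethod
    def infer(self, obs: Sequence[float]) -> Sequence[float]: ...


class StubBackend(InferenceBackend):
    """Placeholder backend: a hand-written linear policy standing in
    for a real on-device model."""

    def __init__(self, gains: Sequence[float]):
        self.gains = gains

    def infer(self, obs: Sequence[float]) -> Sequence[float]:
        return [g * o for g, o in zip(self.gains, obs)]


def act(backend: InferenceBackend, obs: Sequence[float]) -> Sequence[float]:
    # The control layer depends only on the interface, not the runtime,
    # so the same loop can run against any backend implementation.
    return backend.infer(obs)
```

With this split, swapping a LiteRT model for an ONNX Runtime session only requires a new `InferenceBackend` subclass, leaving the perception–action loop untouched.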

Example Platforms

Quadruped

Humanoid / Biped

Boards

Participants may use off-the-shelf mechanical platforms or open 3D-printable designs. Mechanical design is not a focus of this project.

Project Summary

Design and implement an original software application for a low-cost Arm-based quadruped or humanoid robot that demonstrates intelligent interaction with the physical world.

This project is not about assembling a robot kit, configuring firmware, or running stock demos. Robot platforms are treated as execution targets, not solutions. Each submission must demonstrate original software design and clearly articulated system architecture.

Your goal is to architect how sensing, perception, inference, decision-making, and control software operate together on Arm hardware under real-world physical constraints.

Each project must:

  • Implement a custom software application written by the participant
  • Define a clear software architecture (modules, data flow, control loops)
  • Run all AI inference on-device
  • Integrate inference into a closed-loop control system
  • Demonstrate measurable impact on physical robot behaviour
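As a minimal sketch of what the closed-loop requirement implies, the skeleton below wires stubbed sensing, inference, and actuation modules into a fixed-rate loop. The sensor values, the trivial proportional "policy", and the function names are hypothetical placeholders, not a prescribed design:

```python
import time
from dataclasses import dataclass


@dataclass
class Observation:
    pitch: float        # rad, e.g. from IMU fusion
    joint_pos: list     # rad, from joint encoders


def read_sensors() -> Observation:
    # Hypothetical sensor read; on hardware this would query the IMU
    # and encoder bus.
    return Observation(pitch=0.02, joint_pos=[0.0, 0.1])


def infer(obs: Observation) -> list:
    # Stand-in for an on-device model: a trivial proportional policy
    # that leans joints against measured pitch.
    return [-2.0 * obs.pitch for _ in obs.joint_pos]


def apply_commands(cmds: list) -> None:
    # Hypothetical actuator write (servo bus / PWM on real hardware).
    pass


def control_loop(steps: int, period_s: float = 0.02) -> int:
    """Fixed-rate sense -> infer -> act loop (50 Hz by default)."""
    executed = 0
    for _ in range(steps):
        t0 = time.perf_counter()
        obs = read_sensors()
        cmds = infer(obs)
        apply_commands(cmds)
        executed += 1
        # Sleep off the remainder of the period to hold the loop rate.
        elapsed = time.perf_counter() - t0
        if elapsed < period_s:
            time.sleep(period_s - elapsed)
    return executed
```

The point of the skeleton is the data flow (sensing feeds inference, inference feeds actuation, all inside one timed loop), which is the structure a submission's architecture document should make explicit.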

Projects that only run vendor demo code will not be considered sufficient.

Example Project Areas

Physical AI applications

  • Vision-based locomotion: terrain classification, obstacle detection, or foothold selection using cameras
  • Multi-sensor fusion for balance and navigation: fusing camera, IMU, and joint feedback for state estimation, slip detection, or recovery behaviours
  • Learning-based control: neural policies (MLPs or lightweight transformers) for gait adaptation, balance, or stepping
  • Human–robot interaction: gesture recognition, visual tracking, or local natural-language command processing
  • Embodied exploration: curiosity-driven navigation or semantic mapping in indoor environments
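As one concrete example from the multi-sensor fusion area, a complementary filter is a common lightweight way to fuse gyroscope and accelerometer readings into a pitch estimate. The sketch below is generic and not tied to any particular IMU:

```python
import math


def accel_to_pitch(ax: float, az: float) -> float:
    """Pitch estimate from the gravity direction seen by the
    accelerometer (valid only when the body is near-static)."""
    return math.atan2(ax, az)


def complementary_pitch(gyro_rate: float, accel_pitch: float,
                        prev_pitch: float, dt: float,
                        alpha: float = 0.98) -> float:
    """One filter step: trust the integrated gyro at high frequency,
    the accelerometer-derived pitch at low frequency."""
    return alpha * (prev_pitch + gyro_rate * dt) + (1.0 - alpha) * accel_pitch
```

Run once per control tick, the estimate tracks fast motion via the gyro term while the small accelerometer weight (1 − alpha) slowly corrects gyro drift.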

Software architecture and optimisation

  • Designing low-latency perception–decision–action pipelines
  • Integrating ML inference into real-time control loops
  • Optimising models for latency, power, and memory footprint
  • Evaluating different ROS 2 or non-ROS architectural patterns
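For the latency-optimisation work, it helps to instrument each pipeline stage before trying to optimise it. A minimal sketch follows; the stage names and `time.sleep` stand-ins are illustrative only:

```python
import time
from collections import defaultdict
from contextlib import contextmanager

# Stage name -> list of measured durations in seconds.
stage_times = defaultdict(list)


@contextmanager
def timed(stage: str):
    """Record the wall-clock time of one pipeline stage."""
    t0 = time.perf_counter()
    try:
        yield
    finally:
        stage_times[stage].append(time.perf_counter() - t0)


def worst_case_ms(stage: str) -> float:
    """Worst observed latency for a stage, in milliseconds."""
    return 1000.0 * max(stage_times[stage])


# Example: wrap each stage of one perception-decision-action tick.
with timed("perception"):
    time.sleep(0.001)   # stand-in for camera preprocessing
with timed("inference"):
    time.sleep(0.002)   # stand-in for the model forward pass
```

Comparing `worst_case_ms` per stage against the control-loop period makes it clear which stage to optimise first, and whether a model actually fits the loop's latency budget.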

Utilise your creativity and try out unusual applications or techniques. Explore different sensor options, frameworks, and robot behaviours. You are also welcome to submit other variants of robotics (e.g. robotic arms, autonomous drones, and other vehicles), provided the project demonstrates the same closed-loop, on-device physical AI principles.

Resources from Arm and our partners

Support Level

This project is designed to be self-serve but comes with the opportunity for community support from Arm Ambassadors, who are part of the Arm Developer Program. If you are not already part of our program, click here to join.

You are also welcome to contact Arm-Developer-Labs@arm.com to enquire about further support.

Benefits

Standout project contributions to the community will earn digital badges, which can support CV or résumé building and demonstrate earned recognition.

To receive the benefits, you must show us your project through our online form. Please do not include any confidential information in your contribution. Additionally, if you are affiliated with an academic institution, please ensure you have the right to share your material.