
Nema-Medbot-MikyRobot

  • February 24, 2026


AXELERA CHALLENGE – PROJECT UPDATE

We are progressing on NEMA–MedBot–Miky, a modular robotic platform for assistance, human–robot interaction, and embodied edge-AI experimentation.

NEMA is our proprietary artificial intelligence, developed in-house, designed to orchestrate perception, reasoning, and action across our robotic systems.

Recent progress:

  • Hardware integration of Axelera Metis2 on Orange Pi 5 Plus via PCIe
  • Successful device enumeration on the PCIe bus
  • Voyager SDK installation in progress to enable accelerated on-device inference
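Successful enumeration can be confirmed from userspace by scanning `lspci` output for the accelerator. A minimal sketch of such a check; the sample output below is hypothetical (bus addresses and device strings are placeholders, not captured from our board):

```python
import re

def find_accelerator(lspci_output: str, keyword: str = "axelera"):
    """Return the lspci lines that mention the given vendor keyword."""
    return [line for line in lspci_output.splitlines()
            if re.search(keyword, line, re.IGNORECASE)]

# Hypothetical lspci output for illustration only:
sample = """\
00:00.0 PCI bridge: Rockchip Electronics Co., Ltd RK3588 (rev 01)
01:00.0 Processing accelerators: Axelera AI Metis AIPU
"""

matches = find_accelerator(sample)
print(matches)
```

In practice the input would come from `subprocess.run(["lspci"], capture_output=True)` rather than a literal string.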

The architecture we are building is edge-first:

  • Orange Pi handles ROS2, sensors, actuators, and control logic
  • Metis2 is dedicated to neural inference (vision, perception, multimodal models)
  • NEMA coordinates AI pipelines and decision-making
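The division of labor above can be sketched as a thin coordination layer that injects both the inference backend and the control callback, keeping host logic independent of the accelerator runtime. Everything here is a stub for illustration; the real system would wrap the Voyager runtime and a ROS2 publisher:

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Orchestrator:
    """Minimal sketch of the coordination layer: perception results flow
    from the inference backend (Metis2 in the real system) to a control
    callback (ROS2 on the Orange Pi)."""
    infer: Callable[[Any], Any]   # runs a model on the accelerator
    act: Callable[[Any], None]    # forwards the decision to control

    def step(self, frame: Any) -> Any:
        result = self.infer(frame)  # offloaded inference
        self.act(result)            # decision/actuation on the host
        return result

# Stubs standing in for the accelerator runtime and the control stack:
log = []
orch = Orchestrator(infer=lambda f: {"label": "person", "frame": f},
                    act=log.append)
out = orch.step("frame-0")
print(out["label"], len(log))
```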

This approach enables low latency, efficient resource usage, and a scalable, industrial-grade architecture.

NEMA–MedBot–Miky is not a maker project.
We are a real startup building a platform with commercial and industrial objectives.
Part of the codebase will remain proprietary. We are open to strategic and commercial partnerships.

Next steps:

  • First accelerated inference on Metis2
  • Vision → AI → ROS2 → motor pipeline
  • Head movement and interaction tests
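The planned vision → AI → ROS2 → motor chain can be sketched as simple function composition. Each stage below is a placeholder lambda; the real stages would wrap the camera driver, Metis2 inference, a ROS2 publisher, and the motor controller:

```python
from functools import reduce
from typing import Any, Callable, Sequence

def pipeline(stages: Sequence[Callable[[Any], Any]]) -> Callable[[Any], Any]:
    """Compose the stages into one callable, left to right."""
    def run(x: Any) -> Any:
        return reduce(lambda acc, stage: stage(acc), stages, x)
    return run

# Placeholder stages (names and payloads are illustrative):
capture = lambda _: {"frame": 0}
detect  = lambda d: {**d, "target_deg": 15.0}
to_ros  = lambda d: {**d, "topic": "/head/cmd"}
drive   = lambda d: f"pan to {d['target_deg']} deg via {d['topic']}"

run = pipeline([capture, detect, to_ros, drive])
print(run(None))
```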

We thank Axelera for supporting the edge AI ecosystem and for the opportunity to participate in the challenge.

MINI TECHNICAL REPORT

NEMA–MedBot–Miky – Technical Progress Update

Abstract

NEMA–MedBot–Miky is a modular edge-AI robotic platform for assistance, human–robot interaction, and embodied intelligence research. The system pairs an Orange Pi 5 Plus (ROS2 and control) with the Axelera Metis2 accelerator for neural inference. A proprietary AI engine (NEMA) orchestrates perception, reasoning, and action. The architecture is designed for low latency, modularity, and scalability, targeting real-world industrial and assistive applications.

Hardware Status

  • Orange Pi 5 Plus operational
  • Axelera Metis2 installed via PCIe
  • Metis2 device enumerated on PCIe bus
  • USB Bluetooth dongle and external speaker integrated

Software Status

  • Armbian Linux running on Orange Pi
  • ROS2 workspace initialized
  • Neural TTS (Piper) operational
  • Automatic audio routing and Bluetooth reconnection implemented
  • Voyager SDK installation in progress
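The automatic Bluetooth reconnection mentioned above amounts to a supervised retry loop around the audio attach step. A minimal sketch, assuming a `connect()` callable standing in for the real attach command (the backoff constants are illustrative):

```python
import time

def reconnect(connect, attempts: int = 3, base_delay: float = 0.0) -> bool:
    """Retry a flaky connect() with linear backoff; True on success."""
    for i in range(attempts):
        if connect():
            return True
        time.sleep(base_delay * (i + 1))  # back off between attempts
    return False

# Stub that fails twice, then succeeds:
calls = {"n": 0}
def flaky() -> bool:
    calls["n"] += 1
    return calls["n"] >= 3

ok = reconnect(flaky, attempts=5)
print(ok, calls["n"])
```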

System Architecture

  • Orange Pi → ROS2, sensors, actuators, control
  • Metis2 → accelerated neural inference
  • NEMA → AI orchestration and decision layer

Current Capabilities

  • Voice output with neural TTS
  • Automatic startup greeting
  • Modular software structure

Next Steps (24–72h)

  • Complete Voyager SDK installation
  • Run first inference demos on Metis2
  • Integrate vision output with ROS2
  • Enable head motor control
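For the head motor step, one safety detail worth handling from the start is clamping requested angles to the mechanical range before generating a servo command. A sketch with illustrative constants (the limits and pulse-width mapping are assumptions, not measured values from the actual head assembly):

```python
def head_pan_command(angle_deg: float, limit_deg: float = 45.0,
                     pulse_center_us: int = 1500,
                     us_per_deg: float = 10.0) -> int:
    """Map a requested pan angle to a servo pulse width (microseconds),
    clamped to a safe mechanical range."""
    clamped = max(-limit_deg, min(limit_deg, angle_deg))
    return int(pulse_center_us + clamped * us_per_deg)

print(head_pan_command(30.0))  # within range -> 1800
print(head_pan_command(90.0))  # clamped to +45 deg -> 1950
```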

Notes on IP and Commercial Strategy

  • Real startup project
  • Partial proprietary codebase
  • Focus on commercial and industrial partnerships