
Feature Request Pipeline


VitoCadet

Nema AI - Medbot - Miky robot

NEMA MedBot – Edge Identity Workflow with Metis 2 Acceleration Roadmap

We are presenting the current prototype of NEMA MedBot / Miky Robot, developed by AtomC. This stage validates a fully on-device identity recognition pipeline running on Orange Pi hardware. The system performs real-time acquisition, local inference, and dashboard updates without any cloud dependency.

The architecture has been designed to integrate Axelera Metis® 2 as the dedicated hardware acceleration layer for the inference stages. The current prototype confirms end-to-end workflow stability and a modular separation of the perception and orchestration layers.

Next steps include progressively offloading supported inference blocks to Metis 2 via the Voyager toolchain, enabling reduced latency, deterministic performance, and scalable multi-model deployment.

The project is part of an ongoing commercial R&D initiative targeting healthcare and posture-analysis environments, with clinical validation planned in collaboration with a medical partner in the province of Brescia.

We appreciate the opportunity provided by Axelera AI and look forward to further technical collaboration.

https://github.com/VITOTRAVE/nema-medbot-edge-demo
https://youtu.be/AuaEN8rGLGA
www.aromc135.com
www.ciaochili.it
https://www.linkedin.com/posts/dott-prof-vito-traversa-66a8682b0_axeleraai-metis2-edgeai-activity-7433985494352543744-iY85?utm_source=share&utm_medium=member_android&rcm=ACoAAErgv7IB2M9dvSY-dPNXxuXRuNsIX-LsMVU
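The acquisition → local inference → dashboard loop described in the post can be sketched as follows. This is a minimal illustration, not the project's code: the function names (`capture_frame`, `run_identity_model`, `update_dashboard`) are hypothetical placeholders, and the inference step is a stub where the planned Metis 2 offload would sit.

```python
from dataclasses import dataclass

@dataclass
class Identity:
    """Result of one identity-recognition pass (illustrative)."""
    label: str
    confidence: float

def capture_frame() -> bytes:
    # Placeholder for real-time camera acquisition on the Orange Pi.
    return bytes(640 * 480)

def run_identity_model(frame: bytes) -> Identity:
    # Placeholder for local inference; this is the stage the roadmap
    # plans to offload to Metis 2 via the Voyager toolchain.
    return Identity(label="unknown", confidence=0.0)

def update_dashboard(result: Identity) -> dict:
    # Placeholder for the local dashboard update (no cloud dependency).
    return {"label": result.label, "confidence": result.confidence}

def pipeline_step() -> dict:
    # One end-to-end iteration: acquire, infer locally, update dashboard.
    return update_dashboard(run_identity_model(capture_frame()))
```

Keeping the three stages behind separate functions mirrors the modular perception/orchestration split the post describes, so an accelerated backend could replace `run_identity_model` without touching the rest of the loop.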

VitoCadet

Nema-Medbot-MikyRobot

AXELERA CHALLENGE – PROJECT UPDATE

We are progressing on the development of NEMA–MedBot–Miky, a modular robotic platform for assistance, human–robot interaction, and embodied edge-AI experimentation. NEMA is our proprietary artificial intelligence, developed in-house and designed to orchestrate perception, reasoning, and action across our robotic systems.

Recent progress:
- Hardware integration of Axelera Metis2 on Orange Pi 5 Plus via PCIe
- Successful device enumeration on the PCIe bus
- Voyager SDK installation in progress to enable accelerated on-device inference

The architecture we are building is edge-first:
- Orange Pi handles ROS2, sensors, actuators, and control logic
- Metis2 is dedicated to neural inference (vision, perception, multimodal models)
- NEMA coordinates AI pipelines and decision-making

This approach enables low latency, efficient resource usage, and a scalable, industrial-grade architecture.

NEMA–MedBot–Miky is not a maker project. We are a real startup building a platform with commercial and industrial objectives. Part of the codebase will remain proprietary, and we are open to strategic and commercial partnerships.

Next steps:
- First accelerated inference on Metis2
- Vision → AI → ROS → motor pipeline
- Head movement and interaction tests

We thank Axelera for supporting the edge-AI ecosystem and for the opportunity to participate in the challenge.

MINI TECHNICAL REPORT
NEMA–MedBot–Miky – Technical Progress Update

Abstract
NEMA–MedBot–Miky is a modular edge-AI robotic platform for assistance, human–robot interaction, and embodied intelligence research. The system combines an Orange Pi 5 Plus for control and ROS2 with the Axelera Metis2 accelerator for neural inference. A proprietary AI engine (NEMA) orchestrates perception, reasoning, and action. The architecture is designed for low latency, modularity, and scalability, targeting real-world industrial and assistive applications.

Hardware Status
- Orange Pi 5 Plus operational
- Axelera Metis2 installed via PCIe
- Metis2 device enumerated on the PCIe bus
- USB Bluetooth dongle and external speaker integrated

Software Status
- Armbian Linux running on the Orange Pi
- ROS2 workspace initialized
- Neural TTS (Piper) operational
- Automatic audio routing and Bluetooth reconnection implemented
- Voyager SDK installation in progress

System Architecture
- Orange Pi → ROS2, sensors, actuators, control
- Metis2 → accelerated neural inference
- NEMA → AI orchestration and decision layer

Current Capabilities
- Voice output with neural TTS
- Automatic startup greeting
- Modular software structure

Next Steps (24–72h)
- Complete Voyager SDK installation
- Run first inference demos on Metis2
- Integrate vision output with ROS2
- Enable head motor control

Notes on IP and Commercial Strategy
- Real startup project
- Partial proprietary codebase
- Focus on commercial and industrial partnerships
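The three-layer split in the report (Orange Pi for control, Metis2 for inference, NEMA as the orchestration and decision layer) can be modelled as a small dispatcher. This is a hedged sketch only: the `Orchestrator` class, the backend names, and the registration API are invented for illustration and are not NEMA's real interface.

```python
from typing import Callable, Dict

# An inference backend: takes raw input bytes, returns a result string.
InferenceFn = Callable[[bytes], str]

class Orchestrator:
    """Toy orchestration layer: routes inference requests to a named
    backend, in the spirit of the NEMA -> Metis2 / CPU split above."""

    def __init__(self) -> None:
        self._backends: Dict[str, InferenceFn] = {}

    def register(self, name: str, fn: InferenceFn) -> None:
        # e.g. "metis2" for accelerated vision, "cpu" as a fallback.
        self._backends[name] = fn

    def infer(self, backend: str, data: bytes) -> str:
        if backend not in self._backends:
            raise KeyError(f"no backend registered for {backend!r}")
        return self._backends[backend](data)

# Stub backends standing in for real inference engines.
orch = Orchestrator()
orch.register("cpu", lambda data: f"cpu:{len(data)}")
orch.register("metis2", lambda data: f"metis2:{len(data)}")  # would call accelerated models
```

Routing by backend name is one way to get the "progressive offloading" the posts describe: a model can move from the CPU fallback to the accelerator by re-registering it under the same pipeline, with no change to the callers.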

VitoCadet

Project Update: NEMA AI Integration on MedBot and Early Miky Robot Prototyping

Project Update – MedBot / Miky Robot

We are currently testing our proprietary artificial intelligence, NEMA, and adapting some of its capabilities to the MedBot project. In particular, we are integrating NEMA on the Orange Pi together with the Metis accelerator board by Axelera, with the goal of equipping MedBot with a new layer of intelligence dedicated to assistance, interaction, and user support. The same NEMA AI is also being integrated into Miky Robot, conceived as a humanoid interface for MedBot. At this early stage, we are working on the first hardware components and their integration with the artificial intelligence.

Current Status of Hardware Development
- We are 3D printing the first parts of the robot, especially the head and some structural components.
- We are performing printing iterations and mechanical tests.
- In parallel, we are testing the integration between NEMA and the Axelera hardware platform.
- We have also started developing the case for the monitor that will host the Axelera mainboard, but currently only some parts of the case are in the prototyping phase.

Realistic Goals by the End of February
By the end of February, we expect to have:
- an initial working integration of NEMA on MedBot;
- a partial prototype of Miky Robot, likely including the head with AI capabilities and sensors, some structural components, and, if possible, an early preliminary version of the arms and hands;
- a first partial version of the mainboard case, not yet complete.

This is therefore an early prototyping phase, focused on validating the integration between AI and hardware rather than delivering a complete system. We can share some work-in-progress images of the 3D printing phases and of the hardware components currently under development, to show the real status of the project. We will continue sharing updates as the system evolves from an initial prototype toward a more integrated solution.

Best regards,
Vito Traversa

Problem with axdevice --refresh causing NVMe drives to be stopped on a running system (Implemented)

Putting this up for folks who may hit a similar issue.

If you have a machine with a single root PCIe device, where the output of `lspci -tv` looks something like this:

```
alsutton@svr204:~$ lspci -tv
-[0000:00]-+-00.0  Intel Corporation 8th Gen Core Processor Host Bridge/DRAM Registers
           +-02.0  Intel Corporation CoffeeLake-S GT2 [UHD Graphics 630]
           +-08.0  Intel Corporation Xeon E3-1200 v5/v6 / E3-1500 v5 / 6th/7th/8th Gen Core Processor Gaussian Mixture Model
           +-14.0  Intel Corporation Cannon Lake PCH USB 3.1 xHCI Host Controller
           +-14.2  Intel Corporation Cannon Lake PCH Shared SRAM
           +-16.0  Intel Corporation Cannon Lake PCH HECI Controller
           +-1b.0-[01]----00.0  Micron/Crucial Technology P2 [Nick P2] / P3 / P3 Plus NVMe PCIe SSD (DRAM-less)
           +-1f.0  Intel Corporation B360 Chipset LPC/eSPI Controller
           +-1f.3  Intel Corporation Cannon Lake PCH cAVS
           +-1f.4  Intel Corporation Cannon Lake PCH SMBus Controller
           +-1f.5  Intel Corporation Cannon Lake PCH SPI Controller
           \-1f.6  Intel Corporation Ethernet Connection (7) I219-V
```

then you need to be very careful running `axdevice --refresh`.

I’ve got two machines that present the PCI device tree this way by default, and running `axdevice --refresh` caused the NVMe drive to be disconnected from the system, resulting in filesystem corruption, which put the whole system into read-only mode.

It would be awesome if someone could update the axdevice script to only affect Axelera devices. And yes, this is a machine with the card installed; it just hasn’t been detected.
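One way such a refresh could be narrowed is sketched below: filter the `lspci` listing down to Axelera devices before touching anything under `/sys/bus/pci`. This is an illustration, not a patch to the actual `axdevice` script, and matching on the literal string "Axelera" is an assumption; a vendor-ID filter would be more robust if the ID is known.

```python
import re
from typing import List

def axelera_slots(lspci_output: str) -> List[str]:
    """Return the PCI bus addresses of Axelera devices found in plain
    `lspci` output, so a refresh can skip NVMe drives and other devices.

    Matching on the name string "Axelera" is an assumption made for this
    sketch; adjust to your card's actual lspci description or vendor ID.
    """
    slots = []
    for line in lspci_output.splitlines():
        if "Axelera" in line:
            m = re.match(r"([0-9a-f]{2}:[0-9a-f]{2}\.[0-9a-f])", line)
            if m:
                slots.append(m.group(1))
    return slots

# Real usage (root required) would remove and rescan only the matched
# slots via the kernel's PCI sysfs interface, e.g.:
#   for slot in axelera_slots(subprocess.check_output(["lspci"], text=True)):
#       Path(f"/sys/bus/pci/devices/0000:{slot}/remove").write_text("1")
#   Path("/sys/bus/pci/rescan").write_text("1")
```

Because only the matched slots are removed before the rescan, a system whose NVMe drive shares the root port (as in the tree above) would be left untouched.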

LattePanda Sigma works with Metis M.2!

Hi all! I wanted to contribute a little bit about my experience bringing up the Metis M.2 in my LattePanda Sigma. I’m a self-taught EE; I have worked for Harley-Davidson, Span.io, and Enel X NA, and I’ve done several consulting projects on synthesizers and drum machines. My interest in AI is hobby- and curiosity-based, not professional; software engineering is a new skill I’m building.

I first tried to run Metis on the Windows 11 operating system that came with the LattePanda Sigma, and following the instructions on the Git repo was extremely confusing. I have ADD and some other learning differences, but that wasn’t the reason installing was so hard on Windows. It’s because there isn’t a single page with coherent instructions telling the user how to bring up the hardware in a simple, step-by-step format. Having the user click between Git pages on firmware, driver installation, WSL, putting Windows into test mode, using multiple programming environments, etc., is painful and makes installing the hardware on Windows a miserable, confusing, and difficult experience. I’ll probably make my own step-by-step guide for Windows at some point, since the instructions on the Git repo are confusing.

I then tried the install on an older version of Ubuntu (support 24.04, please!) and had much less trouble. I HIGHLY RECOMMEND using Ubuntu 22.04 to run the hardware and software over Windows. The instructions to install the Voyager SDK on Ubuntu 22.04 actually worked pretty well; I was surprised.

I ran into an issue at one point after installing the SDK: I couldn’t detect the Metis M.2 card. I couldn’t really figure out how to install the driver from the Ubuntu installation instructions, so I had to download the .deb driver and install it similarly to the “instructions for installing the Voyager SDK using Docker” to get the driver installed.

Once the driver is installed, and the clunky SDK is installed and running, the YOLO demos worked great, with the Metis M.2 card barely breaking a sweat doing inference.
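For anyone repeating the .deb workaround above, a quick post-install sanity check is to confirm the driver package shows up as installed in `dpkg -l`. The helper below parses that output; the package name `metis-dkms` used in the usage comment is a guess for illustration, so check the actual name of the .deb you downloaded.

```python
def package_installed(dpkg_list_output: str, package: str) -> bool:
    """Return True if `package` appears with the installed state ("ii")
    in the output of `dpkg -l` (columns: state, name, version, ...)."""
    for line in dpkg_list_output.splitlines():
        fields = line.split()
        if len(fields) >= 2 and fields[0] == "ii" and fields[1] == package:
            return True
    return False

# Real usage (package name is an assumption; substitute your driver's):
#   out = subprocess.check_output(["dpkg", "-l"], text=True)
#   print(package_installed(out, "metis-dkms"))
```

If the package is installed but the card still does not enumerate, checking `lspci` for the device (as in the PCIe post above) is the natural next step.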