Get involved in the Axelera community. Ask questions, get inspired, share projects and engage with the AI world
Hello,

I can't retrieve the `axelera-runtime.tar.gz` file. The `install.sh` script prints "Runtime Installed OK" but doesn't download or extract anything. No URL appears in the logs, and `libaxruntime.so` is missing from my system. Could you provide a direct download link so I can install the runtime manually?

Regards,
Hi,

I'm having trouble installing the metis-dkms driver. Can you help me?

The device is a NanoPC-T6 from Friendly Elec, and the operating system is Ubuntu 22.04, also provided by Friendly Elec. I'm leaving the error message and OS information below. If there is any more information I should provide, please let me know. Thank you.

The error:

$ sudo dpkg -i metis-dkms_0.07.16_all.deb
(Reading database ... 155441 files and directories currently installed.)
Preparing to unpack metis-dkms_0.07.16_all.deb ...
Deleting module metis-0.07.16 completely from the DKMS tree.
Unpacking metis-dkms (0.07.16) over (0.07.16) ...
Setting up metis-dkms (0.07.16) ...
Loading new metis-0.07.16 DKMS files...
Building for 6.1.99
Building for architecture aarch64
Building initial module for 6.1.99
ERROR (dkms apport): kernel package linux-headers-6.1.99 is not supported
Error! Bad return status for module build on kernel: 6.1.99 (aarch64)
Consult /var/lib/dkms/metis/0.07.16/build/make.log for more information.
dpkg: error pro
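A few generic checks can narrow this down (nothing Axelera-specific here; the package version and log path are taken from the error above). The DKMS message usually means no matching kernel headers are installed for the vendor's 6.1.99 kernel:

```shell
# Show the running kernel and any installed header packages
uname -r
dpkg -l 2>/dev/null | grep linux-headers || echo "no linux-headers packages found"

# The DKMS build log named in the error usually pinpoints the exact failure
LOG=/var/lib/dkms/metis/0.07.16/build/make.log
if [ -f "$LOG" ]; then tail -n 40 "$LOG"; fi
```

On vendor boards the headers often ship outside apt (for example as a kernel source tree copied into /usr/src), so this is a starting point for diagnosis rather than a confirmed fix.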
Hi, I'm trying to deploy and run inference with a yolov8 segmentation model. I tried to follow the tutorial, but I just get:

INFO: deploying model yolov8sseg-coco for 4 cores. This may take a while…

How long should this run? I left it on for 3.5 hours and nothing happened (except the loading bar). This seems to be a model included in your library: yolov8sseg-coco. Any idea if I can turn on more extensive logging to see what is happening? Thank you in advance!
Dear all,

I am trying to set up the Metis PCIe accelerator on an Intel workstation. Later we will move to a full RISC-V embedded setup. In our lab workstation we are forced to use Singularity instead of Docker, so I am trying to translate the Docker container into a Singularity one. I think I am very close to making everything work. However, I am now stuck with a problem I can't figure out how to solve. In particular, I get a weird error when I try to run a simple ./inference.py with one of the pre-trained models. Below is the full description of how I create and run the container, and how I run the example from the Voyager SDK.

My `voyager-sdk-1.2.5.def` file:

Bootstrap: docker
From: ubuntu:22.04

%labels
    Author You
    Version voyager-sdk-1.2.5

%environment
    export DISPLAY=$DISPLAY

%post
    apt-get update && apt-get install -y \
        sudo \
        git \
        pciutils \
        lsb-release \
        x11-utils \
        curl \
        wget \
        gnupg2 \
        lsb-release \
        software-prop
Hello, I'm trying to use a YOLO model on 8 camera streams in parallel on the Metis PCIe card. 20-30 fps per camera stream is all I need. With the 548 fps end-to-end stated for YOLOv8s in https://axelera.ai/metis-aipu-benchmarks, it should be possible to reach ~68 fps per camera. As a small test I wrote a Python script which creates an inference stream with 8 videos as input:

from axelera.app import config, display, inf_tracers
from axelera.app.stream import create_inference_stream

def run(window, stream):
    for frame_result in stream:
        window.show(frame_result.image, frame_result.meta, frame_result.stream_id)
        fps = stream.get_all_metrics()['end_to_end_fps']
        print(fps.value)

def main():
    tracers = inf_tracers.create_tracers('core_temp', 'end_to_end_fps', 'cpu_usage')
    stream = create_inference_stream(
        network="yolov5s-v7-coco",
        sources=[
            str(config.env.framework / "media/traffic1_1080p.mp4"),
            str(config.env.framework / "media/traffic1_
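As a sanity check on the arithmetic in the post above: the per-camera budget from an aggregate end-to-end figure is just the total divided by the number of streams, assuming the streams share the device evenly (an idealisation; real pipelines lose some throughput to host-side scheduling):

```python
def per_stream_fps(total_fps: float, n_streams: int) -> float:
    """Idealised per-stream throughput when n_streams share one device evenly."""
    if n_streams <= 0:
        raise ValueError("need at least one stream")
    return total_fps / n_streams

# 548 fps end-to-end across 8 cameras leaves ~68 fps per camera,
# comfortably above the 20-30 fps target.
print(per_stream_fps(548, 8))  # 68.5
```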
Hi all,

I am trying to enable the usage of the Metis PCIe rev02 on RISC-V hosts, in particular the SiFive P550. Currently I could successfully (at least I think so) install the drivers, and the device gets correctly identified and mapped in the system. I attach the logs just to be sure:

dmesg (after rescan):
[  858.500785] pci 0000:01:00.0: [1f9d:1100] type 00 class 0x120000
[  858.500843] pci 0000:01:00.0: reg 0x10: [mem 0x04380000-0x04380fff 64bit]
[  858.500865] pci 0000:01:00.0: reg 0x18: [mem 0x08000000-0x09ffffff]
[  858.500922] pci 0000:01:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref]
[  858.500943] pci 0000:01:00.0: Max Payload Size set to 512 (was 128, max 512)
[  858.501115] pci 0000:01:00.0: supports D1
[  858.501121] pci 0000:01:00.0: PME# supported from D0 D1 D3hot
[  858.516166] pci_bus 0000:01: busn_res: [bus 01] end is updated to 01
[  858.516197] pcieport 0000:00:00.0: BAR 14: assigned [mem 0x42000000-0x44ffffff]
[  858.516209] pci 0000:01:00.0: BAR 2: assigned [mem 0x
Hi folks,

After I resolved some of the issues I had with my Metis PCIe device, I stopped working with it since it was so loud that I got a headache after 30 min of runtime. Now I realized that there is a new FW which silences the fan, so I upgraded the SDK and the FW:

(venv) root@holodeck7:/voyager-sdk# axdevice -v
INFO: Found PCI device: 01:00.0 Processing accelerators: Axelera AI Metis AIPU (rev 02)
INFO: Found AIPU driver: metis 90112 0
INFO: Firmware version matches: v1.3.1
INFO: Using device metis-0:1:0
Device 0: metis-0:1:0 4GiB pcie flver=1.2.0-rc2 bcver=1.0 clock=800MHz(0-3:800MHz) mvm=0-3:100%
  device_runtime_firmware=v1.3.1
  board_controller_board_type=matterhorn
  sw_throttling: 200°C, hysteresis 5°C, throttle rate: 12%
  hw_throttling: 105°C, hysteresis 10°C
  pvt_warning_threshold: 95°C

BUT the fan keeps spinning at max although the Metis is not being used at all. I tried to find a means to read out the actual temperature, which does not seem to work:

(venv) root@holodeck7:
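For reading temperatures on the host side, a generic Linux fallback is to walk /sys/class/hwmon. This is not Axelera-specific, and whether the Metis board or its driver exposes a hwmon node at all is an assumption to verify on your system:

```python
from pathlib import Path

def read_hwmon_temps(root="/sys/class/hwmon"):
    """Collect temperature readings (in degrees C) from every hwmon device."""
    temps = {}
    for hw in Path(root).glob("hwmon*"):
        name_file = hw / "name"
        name = name_file.read_text().strip() if name_file.exists() else hw.name
        for t in hw.glob("temp*_input"):
            # hwmon reports millidegrees Celsius
            temps[f"{name}/{t.stem}"] = int(t.read_text()) / 1000.0
    return temps

if __name__ == "__main__":
    for label, celsius in read_hwmon_temps().items():
        print(f"{label}: {celsius:.1f} C")
```

If nothing Metis-related shows up here, the temperature is probably only available through the SDK's own tooling (e.g. the core_temp tracer mentioned elsewhere in this forum).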
I'm interested in hearing from everyone here about which models you're successfully using with your Metis device and Voyager SDK, and which ones you'd like to see officially supported. This would be really useful for prioritising which models to add to the model zoo.
Have you seen our fruit demo? If not, @David explains it amazingly here. Basically, it runs 3 different yolov8 models on a single chip to process 4K camera input and segment fruit, while only using 640x640 models!

I love this demo and use it to evaluate any system that comes across my desk! So, obviously, when @Victor Labian posted the setup for the Orange Pi here, I just had to try it out! And it works perfectly straight after following the instructions! I shaded the fruit to look different from the original to show the accuracy of the segmentation.

If you want to run this demo for yourself, you can! It's available in the examples folder of your SDK, called fruit_demo.py. Let me know what you think! Or, as @Spanner suggested:

A Fruitful and A-peel-ing Orange Pi Experiment that Didn't Go Pear-Shaped

Here's what's current on the Axelera grapevine - a core functionality of the Orange Pi with Metis is detecting fruit in real time! Super smoothie AI pipe-lime with bananas and apples being
Hi! I'm testing the Metis M.2 on a Raspberry Pi 5 using the Ubuntu 22.04 container. I can successfully get results using the inference test suggested by the article:

./inference.py yolov8s-coco-onnx ./media/traffic1_480p.mp4

Unfortunately, I get around 4 FPS. Is this expected? If I profile the device and the host separately (using --show-host-fps and --show-device-fps), I see that the device is running at ~800 FPS while the host is the one bottlenecking at ~4 FPS. I tried enabling GLES processing using

export AXELERA_OPENGL_BACKEND=gles,3,1

but, unfortunately, it doesn't make much of a difference. I also tried setting the PCIe link to gen3 by adding the following to /boot/firmware/config.txt, without any luck:

dtparam=pciex1_gen=3

As a reference, these are the instructions I'm following:
- https://support.axelera.ai/hc/en-us/articles/26362016484114-Bring-up-Voyager-SDK-in-Raspberry-Pi-5
- https://support.axelera.ai/hc/en-us/articles/25953148201362-Install-Voyager-SDK-in-a-Docker-Container

Is this perfo
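One way to confirm whether the gen3 setting actually took effect is to compare the card's advertised and negotiated link speeds with lspci. The 1f9d vendor ID is the one seen in the dmesg output elsewhere in this thread; the exact capability lines can vary with your pciutils version, and you may need root for the -vv detail:

```shell
# LnkCap = what the endpoint supports, LnkSta = what was actually negotiated.
# 2.5GT/s = gen1, 5GT/s = gen2, 8GT/s = gen3.
lspci -d 1f9d: -vv 2>/dev/null | grep -E 'LnkCap:|LnkSta:' || true
```

If LnkSta still shows 2.5GT/s after the config.txt change, the dtparam either didn't apply or the link retrained at gen1, which is worth ruling out before blaming the host-side pipeline.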
Hi Axelera Community,

Has anyone here successfully run the Axelera Voyager SDK with ROS 2? I'm currently working on integrating the video_infer node with ROS 2 and would really appreciate hearing from anyone who has tried this setup.

My setup:
- Hardware: Aetina board
- SDK version: voyager-sdk v1.2.5
- ROS 2: Humble (on Ubuntu 22.04)
- Python: 3.10 (using a virtual environment)
- Model used: yolov5s-v7-coco
- Inference server: python3 inference_server.py yolov5s-v7-coco none --port 50051 --no-display
- ROS 2 node: ros2 run axelera_infer video_infer

What's working:
- The inference server starts successfully and shows:
  Server started, listening on 50051
  INFO : Using device metis-0:1:0
- The ROS 2 node launches and shows:
  [INFO] [video_infer_node]: Starting video inference node...
  [INFO] [video_infer_node]: Sending inference request...
  [INFO] [video_infer_node]: StreamInfer started successfully
- No errors are thrown on either side.

Problem: the stream only seems to run a few frames:
Stream Playing
Hi again @Spanner! 😊

I need to infer an image obtained from byte data stored in a memory space (https://docs.python.org/3/library/mmap.html); after converting the bytes I get an image. I've seen the sources available for create_inference_stream and I know that I can run inference on images (https://github.com/axelera-ai-hub/voyager-sdk/blob/release/v1.2.5/docs/tutorials/application.md), but I have to specify the /path/to/file. In this case I won't have that information.

In the code I'll have these lines where I read the bytes from the memory space:

frame_size = image_width * image_height * 3  # RGB
frame_bytes = mm.read(frame_size)

And then I get the new image:

frame = np.frombuffer(frame_bytes, dtype=np.uint8).reshape((image_height, image_width, 3))

Is there any way to run inference on "frame"? Thank you!
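The byte-to-array half of this can be made a small self-contained helper. Note this only covers producing the numpy frame; whether create_inference_stream can consume an in-memory array directly is something I can't confirm from the post, so treat that half as the open question. The demo uses an anonymous mmap as a stand-in for the real shared memory region:

```python
import mmap

import numpy as np

def frame_from_mmap(mm, image_width: int, image_height: int) -> np.ndarray:
    """Read one RGB frame's worth of bytes from a memory map into an HxWx3 array."""
    frame_size = image_width * image_height * 3  # 3 bytes per pixel (RGB)
    frame_bytes = mm.read(frame_size)
    if len(frame_bytes) < frame_size:
        raise EOFError("memory map did not contain a full frame")
    return np.frombuffer(frame_bytes, dtype=np.uint8).reshape((image_height, image_width, 3))

if __name__ == "__main__":
    # Anonymous mmap standing in for the real shared memory region
    w, h = 4, 2
    mm = mmap.mmap(-1, w * h * 3)
    mm.write(bytes(range(w * h * 3)))
    mm.seek(0)
    frame = frame_from_mmap(mm, w, h)
    print(frame.shape)  # (2, 4, 3)
```

One caveat: np.frombuffer returns a read-only view over the bytes, so if the downstream consumer mutates the frame you'll want a .copy() first.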