Johson Posted December 4, 2025 (edited)

I've already benchmarked a compute-intensive task (image processing) on the Orange Pi 5 Plus's CPU, and it wrapped up in just 40 minutes, which is impressively fast. Now I'd like to try the NPU. Does anyone know how to tap into it for AI workloads?

Edited December 4, 2025 by Johson
robertoj Posted December 4, 2025 (edited)

I would like to know too. The Orange Pi Zero 3 has a GPU that's available for SIMD acceleration through the latest OpenGL ES library, but I haven't had time to try that: https://ai.google.dev/edge/mediapipe/framework/getting_started/gpu_support

Try it on your opi5+, then run the MediaPipe Python examples. Then look for other neural-network tasks that use the same NN engine: TensorFlow Lite (TensorFlow and PyTorch will need a different method).

Other examples:
https://forum.armbian.com/topic/28895-efforts-to-develop-firmware-for-h96-max-v56-rk3566-8g64g/#comment-167001
https://opencv.org/blog/working-with-neural-processing-units-npus-using-opencv/

Edited December 4, 2025 by robertoj
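For TFLite specifically, accelerator backends like the `libteflon.so` mentioned later in this thread are hooked in through the external-delegate mechanism. As a rough, hypothetical sketch (the model, image, and delegate paths are placeholders for illustration, not anything confirmed for the opi5+):

```python
# Hypothetical sketch: running a quantized classifier through an external
# TFLite delegate. All paths are placeholders; with delegate_path=None the
# model simply runs on the CPU.
import numpy as np

def top_k(scores, labels, k=3):
    """Return the k highest-scoring (label, score) pairs of a 1-D score vector."""
    order = np.argsort(scores)[::-1][:k]
    return [(labels[i], float(scores[i])) for i in order]

def classify(model_path, image, labels, delegate_path=None):
    # Requires the tflite-runtime package and, for NPU use, a working delegate.
    from tflite_runtime.interpreter import Interpreter, load_delegate
    delegates = [load_delegate(delegate_path)] if delegate_path else []
    interp = Interpreter(model_path=model_path, experimental_delegates=delegates)
    interp.allocate_tensors()
    interp.set_tensor(interp.get_input_details()[0]["index"], image)
    interp.invoke()
    scores = interp.get_tensor(interp.get_output_details()[0]["index"])[0]
    return top_k(scores, labels)
```

On an NPU-enabled board you would pass something like `delegate_path="/usr/lib64/libteflon.so"`; everything else in the call stays identical to the CPU path, which is what makes delegates convenient for A/B benchmarking.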
usual user Posted December 5, 2025

FWIW, on my rk3588 devices the NPUs are working with recent mainline releases:

```
[    5.967316] [drm] Initialized rocket 0.0.0 for rknn on minor 0
[    5.975499] rocket fdab0000.npu: Rockchip NPU core 0 version: 1179210309
[    5.978652] rocket fdac0000.npu: Rockchip NPU core 1 version: 1179210309
[    5.985602] rocket fdad0000.npu: Rockchip NPU core 2 version: 1179210309
```

This script runs the Mesa example with the latest available working versions:

```bash
#!/bin/bash

IMAGE="grace_hopper.bmp"
WORKBENCH="."
ENVIRONMENT="${WORKBENCH}/python/3.11"

[ "${1}" == "setup" ] || [ ! -f ${ENVIRONMENT}/bin/activate ] && BOOTSTRAP="true"
[ -v BOOTSTRAP ] && python3.11 -m venv ${ENVIRONMENT}
source ${ENVIRONMENT}/bin/activate
[ -v BOOTSTRAP ] && pip install numpy==1.26.4
[ -v BOOTSTRAP ] && pip install pillow==12.0.0
[ -v BOOTSTRAP ] && pip install tflite-runtime==2.14.0

TEFLON_DEBUG=verbose ETNA_MESA_DEBUG=ml_dbgs python ${WORKBENCH}/classification-tflite.py \
  -i ${WORKBENCH}/${IMAGE} \
  -m ${WORKBENCH}/mobilenet_v1_1_224_quant.tflite \
  -l ${WORKBENCH}/labels_mobilenet_quant_v1_224.txt \
  -e /usr/lib64/libteflon.so

deactivate
```

And with a small adjustment, the Mesa example also runs with TFLite's successor, LiteRT:

```bash
#!/bin/bash

IMAGE="grace_hopper.bmp"
WORKBENCH="."
ENVIRONMENT="${WORKBENCH}/python/3.13"

[ "${1}" == "setup" ] || [ ! -f ${ENVIRONMENT}/bin/activate ] && BOOTSTRAP="true"
[ -v BOOTSTRAP ] && python3.13 -m venv ${ENVIRONMENT}
source ${ENVIRONMENT}/bin/activate
[ -v BOOTSTRAP ] && pip install pillow
[ -v BOOTSTRAP ] && pip install ai-edge-litert-nightly

TEFLON_DEBUG=verbose ETNA_MESA_DEBUG=ml_dbgs python ${WORKBENCH}/classification-litert.py \
  -i ${WORKBENCH}/${IMAGE} \
  -m ${WORKBENCH}/mobilenet_v1_1_224_quant.tflite \
  -l ${WORKBENCH}/labels_mobilenet_quant_v1_224.txt \
  -e /usr/lib64/libteflon.so

deactivate
```

A MediaPipe sample can also be set up easily:

```bash
#!/bin/bash

WORKBENCH="."
ENVIRONMENT="${WORKBENCH}/python/3.12"

[ "${1}" == "setup" ] || [ ! -f ${ENVIRONMENT}/bin/activate ] && BOOTSTRAP="true"
[ -v BOOTSTRAP ] && python3.12 -m venv ${ENVIRONMENT}
source ${ENVIRONMENT}/bin/activate
[ -v BOOTSTRAP ] && pip install mediapipe
[ -v BOOTSTRAP ] && pip install pillow
[ -v BOOTSTRAP ] && pip install ai-edge-litert-nightly

python ${WORKBENCH}/detect.py --model efficientdet_lite0.tflite
```

But unfortunately, the MediaPipe framework does not support the extended delegate functionality of LiteRT (TFLite), and therefore offers no NPU support.

Attachments: classification-3.11-tflite.log, classification-3.13-litert.log, object_detection-3.12-litert.log
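The TFLite-to-LiteRT "small adjustment" presumably comes down to where the `Interpreter` is imported from, since the `ai-edge-litert` package keeps the same interpreter API. The runtime-picking logic below is an illustrative sketch of that idea, not the actual classification-litert.py:

```python
# Illustrative sketch of the TFLite -> LiteRT switch: the Interpreter and
# load_delegate API stays the same, only the providing package changes.
import importlib
import importlib.util

def pick_runtime():
    """Return the first installed interpreter module name, or None if neither exists."""
    for package, module in (("ai_edge_litert", "ai_edge_litert.interpreter"),
                            ("tflite_runtime", "tflite_runtime.interpreter")):
        if importlib.util.find_spec(package) is not None:
            return module
    return None

def make_interpreter(model_path, delegate_path=None):
    """Build an interpreter from whichever runtime is installed."""
    module_name = pick_runtime()
    if module_name is None:
        raise RuntimeError("install ai-edge-litert or tflite-runtime first")
    mod = importlib.import_module(module_name)
    delegates = [mod.load_delegate(delegate_path)] if delegate_path else []
    interp = mod.Interpreter(model_path=model_path, experimental_delegates=delegates)
    interp.allocate_tensors()
    return interp
```

With a helper like this, the same classification script can serve both the Python 3.11/tflite-runtime and the Python 3.13/LiteRT virtual environments above.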
flappyjet Posted 2 hours ago

Do you have the NPU working with the mainline rocket kernel driver and Mesa now, or with the rknn kernel driver?

I'm a hobbyist working on a script to detect the NPU environment on ARM64 edge devices. Since I only have limited hardware, I'm looking for volunteers to run a quick test and report what the environment looks like on different NPU-equipped boards. This is a personal experimental project with no guarantees, just a survey to see what works and what doesn't.

GitHub Repository: [npu-toolbox](https://github.com/flappyjet/npu-toolbox/)
Test Script: `curl -sSL https://github.com/flappyjet/npu-toolbox/raw/refs/heads/main/scripts/npu_probe.sh | bash`
Report Results: Please drop a comment in this survey issue: [Call for Testing: NPU Detection Script](https://github.com/flappyjet/npu-toolbox/issues/1)

Here's my A311D device info:

```
# npu-toolbox probe
[platform]
compatible="onethingcloud,oes,amlogic,a311d,amlogic,g12b,"
model="OneThing Cloud OES,"
family="Amlogic Meson"
soc_id="G12B (A311D)"
arch="aarch64"
kernel="6.12.59-ophub"

[modules]
galcore=""
etnaviv="/sys/module/etnaviv"

[devices]
npu_device="/dev/dri/renderD128"  # Driver: etnaviv
devices_not_support=[
    "/dev/dri/renderD129",  # Driver: panfrost
]

[libraries]
delegate="/usr/local/lib/aarch64-linux-gnu/libteflon.so"
```

```
# npu-toolbox benchmark
INFO: Created TensorFlow Lite XNNPACK delegate for CPU.
================================================================================
Model: bird
Image      | Backend | Time (ms) | Result
-----------+---------+-----------+---------------------------------------------
bird.jpg   | CPU     |     47.40 | ID:671 | Passer domesticus (House Sparrow)
bird.jpg   | NPU     |     13.04 | ID:671 | Passer domesticus (House Sparrow)
people.jpg | CPU     |     47.32 | ID:964 | background
people.jpg | NPU     |      7.79 | ID:964 | background
street.jpg | CPU     |     47.28 | ID:964 | background
street.jpg | NPU     |      7.82 | ID:964 | background
================================================================================
Model: mobilenet
Image      | Backend | Time (ms) | Result
-----------+---------+-----------+---------------------------------------------
bird.jpg   | CPU     |     47.39 | ID:11  | brambling, Fringilla montifringilla
bird.jpg   | NPU     |     13.53 | ID:11  | brambling, Fringilla montifringilla
people.jpg | CPU     |     47.38 | ID:646 | maypole
people.jpg | NPU     |      8.00 | ID:646 | maypole
street.jpg | CPU     |     47.56 | ID:921 | traffic light, traffic signal, stoplight
street.jpg | NPU     |      8.02 | ID:921 | traffic light, traffic signal, stoplight
================================================================================
Model: ssdlite
Image      | Backend | Time (ms) | Result
-----------+---------+-----------+---------------------------------------------
bird.jpg   | CPU     |    208.39 | Detected 1
bird.jpg   | NPU     |     31.77 | Detected 1
people.jpg | CPU     |    208.49 | Detected 10
people.jpg | NPU     |     21.96 | Detected 10
street.jpg | CPU     |    208.35 | Detected 10
street.jpg | NPU     |     21.87 | Detected 10
```
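For anyone curious what such a probe has to look at, here is a stripped-down, read-only sketch of the same idea. The module and driver names are the ones that appear in this thread (rknpu, rocket, etnaviv, galcore); the actual npu_probe.sh in the repository likely checks considerably more:

```shell
#!/bin/sh
# Minimal sketch of an NPU environment probe: list known NPU/GPU kernel
# modules and the driver bound to each DRM render node. Reads sysfs only,
# so it is safe to run anywhere; boards without /dev/dri print nothing
# beyond the completion marker.
probe_npu() {
    for mod in rknpu rocket etnaviv galcore; do
        [ -d "/sys/module/$mod" ] && echo "module loaded: $mod"
    done
    for node in /dev/dri/renderD*; do
        [ -e "$node" ] || continue
        name=$(basename "$node")
        # The driver symlink in sysfs points at the bound kernel driver.
        drv=$(basename "$(readlink -f "/sys/class/drm/$name/device/driver")")
        echo "$node driver: ${drv:-unknown}"
    done
    echo "probe done"
}

probe_npu
```

On the A311D box above this approach would report `etnaviv` on renderD128 and `panfrost` on renderD129, matching the probe output.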