
Recommended Posts

Posted (edited)

I've already benchmarked a compute-intensive task (image processing) on the Orange Pi 5 Plus's processor, and it wrapped up in just 40 minutes, which is impressively fast. So now I'd like to try the NPU. Does anyone know how to tap into it for AI workloads?

Edited by Johson
  • Johson changed the title to How to use OrangePi 5 Plus's NPU for Image Generation?
Posted (edited)

I would like to know too.

 

The Orange Pi Zero 3 has a GPU that's available for SIMD acceleration through the latest OpenGL ES library, but I haven't had time to try that:

 

https://ai.google.dev/edge/mediapipe/framework/getting_started/gpu_support

 

Try it on your opi5+, then run the MediaPipe Python examples.

 

Then look for other neural network tasks that use the same NN engine: TensorFlow Lite (full TensorFlow and PyTorch will need a different method).

 

Other examples:

https://forum.armbian.com/topic/28895-efforts-to-develop-firmware-for-h96-max-v56-rk3566-8g64g/#comment-167001

https://opencv.org/blog/working-with-neural-processing-units-npus-using-opencv/

Edited by robertoj
Posted

FWIW, on my rk3588 devices the NPUs are working with recent mainline releases:

[    5.967316] [drm] Initialized rocket 0.0.0 for rknn on minor 0
[    5.975499] rocket fdab0000.npu: Rockchip NPU core 0 version: 1179210309
[    5.978652] rocket fdac0000.npu: Rockchip NPU core 1 version: 1179210309
[    5.985602] rocket fdad0000.npu: Rockchip NPU core 2 version: 1179210309

This script runs the Mesa example with the latest available working versions:

#!/bin/bash
IMAGE="grace_hopper.bmp"
WORKBENCH="."
ENVIRONMENT="${WORKBENCH}/python/3.11"
[ "${1}" == "setup" ] || [ ! -f ${ENVIRONMENT}/bin/activate ] && BOOTSTRAP="true"
[ -v BOOTSTRAP ] && python3.11 -m venv ${ENVIRONMENT}
source ${ENVIRONMENT}/bin/activate
[ -v BOOTSTRAP ] && pip install numpy==1.26.4
[ -v BOOTSTRAP ] && pip install pillow==12.0.0
[ -v BOOTSTRAP ] && pip install tflite-runtime==2.14.0
TEFLON_DEBUG=verbose ETNA_MESA_DEBUG=ml_dbgs python ${WORKBENCH}/classification-tflite.py \
          -i ${WORKBENCH}/${IMAGE} \
          -m ${WORKBENCH}/mobilenet_v1_1_224_quant.tflite \
          -l ${WORKBENCH}/labels_mobilenet_quant_v1_224.txt \
          -e /usr/lib64/libteflon.so
deactivate
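The classification-tflite.py driven by the wrapper above is Mesa's example and isn't reproduced in this post, but a script along these lines works with that setup. This is a hedged sketch, not the original: it assumes the standard tflite-runtime API (`Interpreter`, `load_delegate`) and the MobileNet v1 quantized model's 224x224 uint8 RGB input; the argument names mirror the wrapper's flags.

```python
#!/usr/bin/env python3
"""Hypothetical sketch of classification-tflite.py: classify one image with
tflite-runtime, optionally offloading to an external delegate such as
Mesa's libteflon.so (which targets the rocket NPU driver)."""
import argparse
import sys

def top_k(scores, labels, k=3):
    """Pure-Python helper: return the k (label, score) pairs with the
    highest scores, best first."""
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return [(labels[i], scores[i]) for i in order[:k]]

def main():
    ap = argparse.ArgumentParser()
    ap.add_argument("-i", "--image", required=True)
    ap.add_argument("-m", "--model", required=True)
    ap.add_argument("-l", "--labels", required=True)
    ap.add_argument("-e", "--ext-delegate", default=None)
    args = ap.parse_args()

    # Heavy imports kept inside main() so top_k() above stays usable
    # even where numpy/Pillow/tflite-runtime are not installed.
    import numpy as np
    from PIL import Image
    import tflite_runtime.interpreter as tflite

    delegates = []
    if args.ext_delegate:
        # Without -e, inference falls back to the CPU kernels.
        delegates.append(tflite.load_delegate(args.ext_delegate))
    interp = tflite.Interpreter(model_path=args.model,
                                experimental_delegates=delegates)
    interp.allocate_tensors()
    inp = interp.get_input_details()[0]
    out = interp.get_output_details()[0]

    # MobileNet v1 quant expects a uint8 RGB tensor, e.g. 1x224x224x3.
    h, w = inp["shape"][1], inp["shape"][2]
    img = Image.open(args.image).convert("RGB").resize((w, h))
    interp.set_tensor(inp["index"], np.expand_dims(np.asarray(img), 0))
    interp.invoke()
    scores = interp.get_tensor(out["index"])[0]

    with open(args.labels) as f:
        labels = [line.strip() for line in f]
    for label, score in top_k(list(scores), labels):
        print(f"{score}: {label}")

if __name__ == "__main__" and len(sys.argv) > 1:  # skip when imported bare
    main()
```

With `TEFLON_DEBUG=verbose` set as in the wrapper, the delegate prints which operations it takes over, which is the quickest way to confirm the NPU is actually being used.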

 

And with this script, the Mesa example also runs, after a small adjustment, on the TFLite successor LiteRT:

#!/bin/bash
IMAGE="grace_hopper.bmp"
WORKBENCH="."
ENVIRONMENT="${WORKBENCH}/python/3.13"
[ "${1}" == "setup" ] || [ ! -f ${ENVIRONMENT}/bin/activate ] && BOOTSTRAP="true"
[ -v BOOTSTRAP ] && python3.13 -m venv ${ENVIRONMENT}
source ${ENVIRONMENT}/bin/activate
[ -v BOOTSTRAP ] && pip install pillow
[ -v BOOTSTRAP ] && pip install ai-edge-litert-nightly
TEFLON_DEBUG=verbose ETNA_MESA_DEBUG=ml_dbgs python ${WORKBENCH}/classification-litert.py \
          -i ${WORKBENCH}/${IMAGE} \
          -m ${WORKBENCH}/mobilenet_v1_1_224_quant.tflite \
          -l ${WORKBENCH}/labels_mobilenet_quant_v1_224.txt \
          -e /usr/lib64/libteflon.so
deactivate

 

A MediaPipe sample can also be set up easily:

#!/bin/bash
WORKBENCH="."
ENVIRONMENT="${WORKBENCH}/python/3.12"
[ "${1}" == "setup" ] || [ ! -f ${ENVIRONMENT}/bin/activate ] && BOOTSTRAP="true"
[ -v BOOTSTRAP ] && python3.12 -m venv ${ENVIRONMENT}
source ${ENVIRONMENT}/bin/activate
[ -v BOOTSTRAP ] && pip install mediapipe
[ -v BOOTSTRAP ] && pip install pillow
[ -v BOOTSTRAP ] && pip install ai-edge-litert-nightly
python ${WORKBENCH}/detect.py  --model efficientdet_lite0.tflite
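The detect.py invoked above follows MediaPipe's object-detection example; the original isn't shown here, so this is a hedged sketch using the MediaPipe Tasks Python API with the same efficientdet_lite0.tflite model (the image path is a placeholder):

```python
#!/usr/bin/env python3
"""Hypothetical sketch of detect.py: object detection with the
MediaPipe Tasks API and an EfficientDet-Lite0 TFLite model."""
import sys

def summarize(detections):
    """Pure-Python helper: turn (label, score) pairs into report lines."""
    return [f"{label}: {score:.2f}" for label, score in detections]

def main(model_path="efficientdet_lite0.tflite", image_path="image.jpg"):
    # MediaPipe imports kept inside main() so summarize() above works
    # even where mediapipe is not installed.
    import mediapipe as mp
    from mediapipe.tasks import python as mp_python
    from mediapipe.tasks.python import vision

    options = vision.ObjectDetectorOptions(
        base_options=mp_python.BaseOptions(model_asset_path=model_path),
        score_threshold=0.5)
    detector = vision.ObjectDetector.create_from_options(options)
    image = mp.Image.create_from_file(image_path)
    result = detector.detect(image)

    pairs = [(d.categories[0].category_name, d.categories[0].score)
             for d in result.detections]
    for line in summarize(pairs):
        print(line)

if __name__ == "__main__" and len(sys.argv) > 1:  # skip when imported bare
    main(model_path=sys.argv[-1])
```

Note that this runs on whatever backend MediaPipe picks itself; as observed below, there is no way to hand it the Teflon delegate.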

 

But unfortunately, the MediaPipe framework does not support the extended delegate functionality of LiteRT (TFLite), and therefore offers no NPU support.

classification-3.11-tflite.log
classification-3.13-litert.log
object_detection-3.12-litert.log
