
Keko

Members · 2 posts


  1. Yes, I've checked some examples from RKNN Toolkit 2 and they are working as expected. It's a two-step process: 1. use RKNN Toolkit2 to convert models (ONNX, TensorFlow, TFLite, Caffe, others) for the RK3576 NPU; 2. use RKNN Toolkit Lite 2 to perform the inference on the board. I can provide some instructions on how to do it, if anyone is interested. Everything is covered here. At the moment I'm trying to find a fast style transfer model able to run inference (ideally around 10 fps on 1920x1080 frames) on the H96 Max M9S.
  2. Hello, I just compiled and installed Armbian 24.11 (6.1.75 vendor kernel for rk35xx) on this H96 Max TV box (M9S version) following the instructions in this post. Thanks to @Hqnicolas, @cmuki, @hzdm et al. Everything is running smoothly (except BT, as usual with these TV boxes). I had to recompile the vendor kernel with this patch applied to solve an issue reported here (applications crashing, system not responding) related to the maximum number of threads allowed by the kernel. I will try to run some RKNN stuff to check the NPU and see it in action. My use case requires running some computer vision and style transfer models at a reasonable frame rate.
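For anyone interested, the two-step flow from the first post can be sketched roughly as below. This is a hedged sketch, not the exact commands from the linked guide: the model paths are placeholders, quantization is disabled for simplicity, and the `preprocess` helper (plain NumPy nearest-neighbour resize) is my own illustration of getting a 1920x1080 frame into the NHWC layout the runtime typically expects.

```python
import numpy as np

def preprocess(frame, size=(224, 224)):
    """Nearest-neighbour resize to batched NHWC uint8 -- illustration only."""
    h, w = frame.shape[:2]
    ys = np.arange(size[0]) * h // size[0]   # source row index per output row
    xs = np.arange(size[1]) * w // size[1]   # source col index per output col
    return frame[ys][:, xs][np.newaxis, ...]

def convert_to_rknn(onnx_path, rknn_path, platform="rk3576"):
    """Step 1 (host PC): convert an ONNX model with RKNN Toolkit2."""
    from rknn.api import RKNN
    rknn = RKNN()
    rknn.config(target_platform=platform)
    rknn.load_onnx(model=onnx_path)
    rknn.build(do_quantization=False)  # quantization setup is model-specific
    rknn.export_rknn(rknn_path)
    rknn.release()

def run_on_board(rknn_path, frame):
    """Step 2 (on the board): run inference with RKNN Toolkit Lite 2."""
    from rknnlite.api import RKNNLite
    lite = RKNNLite()
    lite.load_rknn(rknn_path)
    lite.init_runtime()
    outputs = lite.inference(inputs=[preprocess(frame)])
    lite.release()
    return outputs
```

The toolkit imports are deferred inside the functions because step 1 runs on a host PC (rknn-toolkit2 installed) while step 2 runs on the board (rknn-toolkit-lite2 installed); neither machine needs both packages.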
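The thread-limit symptom from the second post can be checked from userspace before deciding to patch and rebuild the kernel. A minimal sketch (the 64000 value below is just a placeholder, not the value used by the patch):

```shell
# Global kernel thread limit (the knob the vendor-kernel patch adjusts);
# readable without root via procfs.
cat /proc/sys/kernel/threads-max

# Per-user process/thread limit, often the one applications hit first:
ulimit -u

# Raising the global limit at runtime (needs root; placeholder value):
# sysctl -w kernel.threads-max=64000
```

If the reported value is unusually low, apps that spawn many threads will fail with "Resource temporarily unavailable" style errors, matching the crashes described above.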
