Does the Orange Pi 6 Plus NPU work natively with Ollama, or do models need conversion? Also, experiences running larger LLMs with its high RAM?



Hey everyone, I'm eyeing the Orange Pi 6 Plus for some edge AI projects, given its 12-core CIX SoC, NPU rated at up to 28.8 TOPS, and generous RAM options (16/32/64 GB LPDDR5). Has anyone gotten Ollama running on it with NPU acceleration? Does Ollama pick up the NPU out of the box, or do models need converting first (e.g., to INT4/INT8 quantized formats) with vendor-specific tools, the way rkllama does for Rockchip NPUs?
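
For a baseline, the first thing I'd check is whether a stock Ollama install responds at all, which is CPU-only and says nothing about the NPU. A minimal sketch in Python, assuming Ollama's default REST endpoint on localhost:11434:

    # List whatever models the local Ollama server has pulled.
    # Assumes a stock install listening on its default port (11434).
    import requests

    r = requests.get("http://localhost:11434/api/tags", timeout=5)
    r.raise_for_status()
    for m in r.json().get("models", []):
        print(f'{m["name"]}: {m["size"] / 1e9:.1f} GB on disk')

If that lists models and generation works, I'd assume everything is running on the CPU cores unless someone can confirm a CIX NPU backend actually exists.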

Additionally, with the higher RAM options, can it handle bigger LLMs (13B+ parameters) more smoothly than lower-spec SBCs? Any benchmarks or setup tips (e.g., Ubuntu/Debian images, frameworks like MLC-LLM)? I'd love to hear real-world experiences. Thanks!
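
P.S. If anyone does post numbers, here's roughly how I'd measure decode throughput so results are comparable. A sketch against Ollama's /api/generate endpoint; "llama2:13b" is just an example tag, substitute whatever quantized model you pulled:

    # Rough tokens/sec measurement via Ollama's REST API.
    # eval_count and eval_duration come back in the response;
    # eval_duration is reported in nanoseconds.
    import requests

    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "llama2:13b",  # example tag, not a recommendation
            "prompt": "Explain what an NPU does in one paragraph.",
            "stream": False,
        },
        timeout=600,
    ).json()

    tps = resp["eval_count"] / (resp["eval_duration"] / 1e9)
    print(f"{resp['eval_count']} tokens at {tps:.1f} tok/s")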
