
The VPU driver


Myy

Recommended Posts

Done, and loaded.  Now to see what the fallout is.  ;-)

 

tony@tinkerboard:~/Desktop$ mpv --hwdec=auto *.mp4
Playing: Avengers_ Infinity War.mp4
[ffmpeg/demuxer] mov,mp4,m4a,3gp,3g2,mj2: stream 0, timescale not set
 (+) Video --vid=1 (*) (h264 1280x720 29.970fps)
     Video --vid=2 [P] (png)
 (+) Audio --aid=1 (*) (aac 2ch 44100Hz)
File tags:
 Comment: Invader. Annihilator. So-called savior. As Thanos moves ever closer to omnipotence, the fate of the universe rests with the Avengers.
 Date: 2018
 Title: Avengers: Infinity War
libGL error: unable to load driver: rockchip_dri.so
libGL error: driver pointer missing
libGL error: failed to load driver: rockchip
libGL error: unable to load driver: rockchip_dri.so
libGL error: driver pointer missing
libGL error: failed to load driver: rockchip
[vo/gpu/opengl] Suspected software renderer or indirect context.
Failed to open VDPAU backend libvdpau_rockchip.so: cannot open shared object file: No such file or directory
mpi: mpp version: 598cae3 author: Jacob Chen DEBIAN: update rules for release_20171218-2
hal_h264d_api: hal_h264d_init mpp_buffer_group_get_internal used ion In
mpp_rt: NOT found ion allocator
mpp_rt: found drm allocator
mpp_drm: os_allocator_drm_alloc handle_to_fd failed ret -1
mpp_buffer: mpp_buffer_create failed to create buffer with size 4040
mpp_hal: mpp_hal_init hal h264d_rkdec init failed ret -1
mpp_hal: mpp_hal_init could not found coding type 7
mpp_dec: mpp_dec_init could not init hal
mpp: error found on mpp initialization
mpp: WARNING: setup buffer group before decoder init
mpp: command 310002 param 0x2b89ea0 ret -1
[ffmpeg/video] h264_rkmpp: Failed to assign buffer group (code = -1)
[ffmpeg/video] h264_rkmpp: Failed to initialize RKMPP decoder.
Could not open codec.
VO: [gpu] 1280x720 yuv420p
AO: [pulse] 44100Hz stereo 2ch float
AV: 00:00:56 / 02:29:49 (0%) A-V:  0.000 Dropped: 13

Audio/Video desynchronisation detected! Possible reasons include too slow
hardware, temporary CPU spikes, broken drivers, and broken files. Audio
position will not match to the video (see A-V status field).

@JMCC  I'll stop clogging your media script thread with development noise; we can put it all here. This is after doing your recommended HW decode setup. It isn't v4l2, it is rkmpp; I had some wires crossed.


I guess this might need either a newer version of RKMPP, which can use the MPP-Service thing, or some setup that I have no idea about...

 

Ugh... Couldn't they use V4L2 from the beginning... I'll provide a recently compiled version of MPP (the library) tomorrow.


On 1/26/2019 at 10:04 PM, Myy said:

I guess this might need either a newer version of RKMPP, which can use the MPP-Service thing, or some setup that I have no idea about...

 

Ugh... Couldn't they use V4L2 from the beginning... I'll provide a recently compiled version of MPP (the library) tomorrow.

 

About that, phh said:

 

Quote

Rockchip considers that the v4l2 API can't do what they need it to. Rockchip's issue, that there is no proper V4L API for what they want/need, is a real problem, and there is no short-term hope of fixing this. The first try was two or three years ago now, by/with the Chromium guys, and was very partial (only h264 and vp8 decoding support, nothing else). I know the v4l2 request API is supposed to address part of it, but it doesn't seem to be enough anyway.

 

So rather than improving v4l2 to fulfill their requirements, Rockchip made its own piece of crap. Thanks to that choice, we now don't have acceleration in Chromium, which only supports v4l2 (partially, as I was told).


Wasted roughly 1 hour recompiling ffmpeg, only to get "relocation R_ARM_THM_MOVW_ABS_NC against 'ff_vector_clip_int32_neon' can not be used when making a shared object; recompile with -fPIC"....

 

Great...

I'll recompile with "-fPIC" tomorrow...


After fighting to get FFMPEG compiled, then MPV compiled (with Debian putting the ffmpeg includes inside /usr/include/arm-what-ever-arch-hf/, and /usr/include/arm-what-ever-arch-hf/ having priority over /usr/include!), and then patching MPV to get the OpenGL display working correctly (stop using eglGetDisplay when you can use eglGetPlatformDisplayEXT!), I got the same error as you had...

 

And then I realized that the error states that the buffer allocation failed...

 

So I tried as root and it led to RKMPP failing to get the right frames in time. And then it crashed... badly.

 

I'll retry with Gstreamer, since Rockchip seems to love gstreamer. And if that doesn't work, then the VPU is still in a shit state... But at least, they can communicate with it so I guess it's something...


The crash happened at mpp_iommu_detach, so there was some driver communication... Maybe I could use that to dump the registers when playing an I-frame-only H264 file, and see how it enables and feeds the hardware...


MPP/RKMPP is the RocKchip Media Process Platform.

A set of libraries, made by Rockchip, to communicate with their VPU driver. The thing is done in such a way that the "driver" basically only handles a few things, like memory management.

The actual registers of the hardware are known by MPP and are set up by this library, then sent to the driver, which almost blindly writes the register values into the hardware, or reads them back and sends them to MPP.

Which means that, even if you have the sources of the Rockchip VPU driver, you need the sources of MPP to understand how the hardware is actually programmed, based on the format you want to decode/encode.

This is the kind of setup which makes you wonder: who's the real "driver"?

http://opensource.rock-chips.com/wiki_Mpp

 

FFMPEG is one of the most famous multimedia processing libraries and tools. It can combine audio/video from different sources and combine/convert them into a LOT of formats.

It comes as a library AND as a binary, and is one of the swiss-army knives of audio/video processing.

https://ffmpeg.org/

 

MPV is a media player, a fork of mplayer2, which uses FFMPEG as a backend. It currently has an RKMPP backend to decode video frames using the RKMPP libraries.

https://mpv.io/

 

H264 is a video format.

https://en.wikipedia.org/wiki/H.264/MPEG-4_AVC

 

The I-frames in H264 are reference (key) frames, from which the other kinds of frames (B/P frames) are generated. An I-frame is basically a full picture, while B/P frames are basically "patches" applied to I-frames to get the new picture.

The "patches" being generally smaller than the I-frame, you get one way to "compress" the video (among various others used simultaneously).

https://en.wikipedia.org/wiki/Inter_frame


13 hours ago, Myy said:

MPP/RKMPP is the RocKchip Media Process Platform. [... full post quoted above, trimmed ...]

 

That confirms that I will never be able to join this sector; it's clearly too complex for me.

 

WOW, BIG NEWS (I think?): it seems that Randy Li has made an adapter for v4l2<->mpp. Will this mean that any app that follows the standards will now have video acceleration??? https://patchwork.kernel.org/patch/10789627/


Just to clarify, I never got MPV+RKMPP to work with EGL display, only with GBM (which the script does through the "mpv-gbm" wrapper, using the command line I posted above in this thread).

@Myy If you want to see the mpv build options I used, just uncompress the script and look in "packages/mpv/sources.txt"


Well, I was able to use the EGL output after patching the display initialization method. I guess I should format the patch and send it to the MPV devs for review.

 

https://gist.githubusercontent.com/Miouyouyou/b9273ee3d949db3e1eb12f6bf99c1101/raw/cbb7d31b5ed131b3e53086c97bf0870bc3e6b3e7/0001-Use-eglGetPlatformDisplay-when-possible.patch

 

From 93a400edcabee9de0d6b464e081aa9c562085559 Mon Sep 17 00:00:00 2001
From: Myy Miouyouyou <myy@miouyouyou.fr>
Date: Fri, 1 Feb 2019 17:13:57 +0000
Subject: [PATCH] Use eglGetPlatformDisplay when possible

And then fallback on eglGetDisplay if the initialization fails...
That said, currently the code only handle eglGetPlatformDisplay
with GBMm in order to initialize displays with DRM/KMS backends.

Signed-off-by: Myy Miouyouyou <myy@miouyouyou.fr>
---
 video/out/opengl/context_drm_egl.c | 11 ++++++++++-
 1 file changed, 10 insertions(+), 1 deletion(-)

diff --git a/video/out/opengl/context_drm_egl.c b/video/out/opengl/context_drm_egl.c
index 6aa3d95..de118a5 100644
--- a/video/out/opengl/context_drm_egl.c
+++ b/video/out/opengl/context_drm_egl.c
@@ -158,9 +158,18 @@ static bool init_egl(struct ra_ctx *ctx)
 {
     struct priv *p = ctx->priv;
     MP_VERBOSE(ctx, "Initializing EGL\n");
-    p->egl.display = eglGetDisplay(p->gbm.device);
+    PFNEGLGETPLATFORMDISPLAYEXTPROC get_platform_display = NULL;
+    get_platform_display = (void *) eglGetProcAddress("eglGetPlatformDisplayEXT");
+    if (get_platform_display)
+        p->egl.display = get_platform_display(EGL_PLATFORM_GBM_KHR, p->gbm.device, NULL);
+    else {
+       MP_ERR(ctx, "WHAT !?");
+        p->egl.display = eglGetDisplay(p->gbm.device);
+    }
+
     if (p->egl.display == EGL_NO_DISPLAY) {
         MP_ERR(ctx, "Failed to get EGL display.\n");
+        MP_ERR(ctx, "Error : %d\n", eglGetError());
         return false;
     }
     if (!eglInitialize(p->egl.display, NULL, NULL)) {
-- 
2.7.4

 


On 2/1/2019 at 6:22 PM, Myy said:

Well, I was able to use the EGL output after patching the display initialization method. [... patch quoted in the post above, trimmed ...]

 

I like the "MP_ERR(ctx, "WHAT !?");" part xD


You should like my Wayland example, then :3

 

Anyway, I see that Ezequiel Garcia is currently pushing patches to adapt Ayaka's V4L2 patches into something that works with V4L2 (and a few modifications) without the MPP layer in between.

He pushed MPEG-2 decoding support... I'll see if he pushes H264 support this week.

If not, I'll try to adapt Ayaka's patches.


18 hours ago, Myy said:

You should like my Wayland example, then :3 [... full post quoted above, trimmed ...]

That would be perfect, skipping shitty vendor middleware and following the standards.

 

Quote

Could not make the current window current

xD

 

Another question: I thought codecs were "agnostic" and that the problem was more on the v4l2 side, but you say he is making codecs available one by one. What part must be fixed/implemented for each codec?


Well, while this is called a "Video Processing Unit", the thing is: there are a LOT of video file formats out there. Which means a lot of different parameters and decoding/decompression methods, depending on the format used. (I mean, there are different formats, and they are "different" for a reason...)

All the VPUs I know are specialized in decoding a few formats at most: H264, H265, VP8, VP9, ...

For each format, the VPU must be configured to access external data like: the current frame, the configuration of the current stream (width, height, bytes per pixel, color format, ...), the various decoding tables if any (e.g. CABAC tables for H26x), ...

The amount of external data and configuration varies from format to format, knowing that some formats can also have "sub-formats" (H264 is a good example of this madness) which require more or fewer parameters.

So, yeah, VPUs are dedicated to a few formats, and for each format the setup can be completely different. That can be due to configuration registers being mapped at different addresses depending on the decoded format, or to the same registers having completely different meanings depending on the format decoded.

 

Note that, in this case, the VPU decodes one frame at a time.

You cannot just "send the MKV to the VPU and get a video stream on the other end". It *clearly* doesn't have enough memory for that.

Very roughly, the procedure goes like this:

First, the user application must:

  • Get the first frame of the video stream
  • Send it to the VPU driver

Then the VPU driver must:

  • Set up the VPU to decode the frame
  • Launch the VPU decoding process
  • Wait for the decoded result
  • Send the result back to the user application.

Then the user application:

  • Retrieves and shows the result,
  • Rinses and repeats for every frame of the video.

So, yeah, VPUs are not CODEC-agnostic; they are CODEC-specialized. So the driver is being set up slowly, but surely, to decode each format correctly.


On 2/7/2019 at 3:45 PM, Myy said:

Well, while this is called a "Video Processing Unit"... [... full post quoted above, trimmed ...]

I will add another question on top of the last one: when streaming services talk about "hardware secure codecs", what exactly are they talking about? Are those completely different codecs?

Edited by Tido
added spoiler, shortened quote - please add a spoiler next time yourself, thx

From what I understand, it seems to be related to HTML5 EME (Encrypted Media Extensions), and it's mainly used to provide DRM support.

Basically, the media (movie) is encrypted, decryption keys are provided by a DRM server (requiring authentication, and so on), and the decryption is done by the hardware itself.
The only implementation of this I've seen is included in the "chromium" engine.

It seems to be mainly intended to deter copying the movie by intercepting the stream.

Anyway, I took a look today at the two VPU implementations sent on the linux-media and linux-rockchip mailing lists and...

Notably, there are patches providing v4l2_m2m_buf_copy_data. It has been suggested multiple times, with patches like this: https://patchwork.kernel.org/patch/10680123/
Maybe it has been mainlined and renamed since then. I'll have to take a look around and see if I can just add the functions from the provided patches, or if adding these helpers is slightly more complex... or see how I can modify Ayaka's patches to avoid these extensions.

I could also try to add MPEG-2 support for the RK3288 by mimicking how it's done for the RK3399 in Ezequiel's patches, but... this might take way more time than needed.


Without checking, IIRC, the register locations are different. This could be resolved through a few macros... But that raises another question: how do I test the whole thing? I'll have to look at the bootlin repositories to understand how to use it all.


Right. Not a clue from over here, just thinking in type...

I'm patching up a USB3 Ethernet adapter at the moment, plus having a discussion with some Arch guys trying to mainline wifi on the Tinker; they have uhs_ddr50 enabled, so now I'm curious. And I need to try out the Bluetooth serdev setup. And figure out MIPI DSI, etc. ad infinitum.



For MPEG-2 on rk3288/rk3328 there are some other patches needed; see my rockchip-5.x-vpu branch for working rk3288/rk3328 MPEG-2 decoding on v5.0-rc6. The clk, drm and dts patches will be sent upstream any day now.

Also check out https://github.com/mpv-player/mpv/pull/6461 and the linked ffmpeg hwaccel if you want to use mpv or kodi-gbm for testing. I recently pushed dynamic selection of the media/video device to the hwaccel, so it should work without forcing the decoder to /dev/video0.

I pushed two LibreELEC test images to http://kwiboo.libreelec.tv/test/ for the Tinker Board and the Rock64 if you want to test the MPEG-2 decoder; they include patches from my rockchip-5.x-rebase and rockchip-5.x-vpu linux branches.


Interesting! I'll try to include these patches and use them with the provided tools this week.

Did you port the MPEG-2 code using the RK3399 as a template, using the old chromium code provided by Tomasz Figa, or maybe another way?


I was familiar with the vpu2 hw regs that needed to be set, from creating an experimental MPEG-2 hwaccel for the rk vcodec kernel driver some time ago.

After collaborating with @jernej to create a v4l2 request API hwaccel and getting it working with the Allwinner cedrus driver, I learned enough v4l2 to get the rockchip vpu MPEG-2 decoder working on my Rock64.

Ezequiel Garcia was then very helpful: he got my initial work (the decoder boilerplate was copied from the encoder and Chromium OS) ready for upstream, and submitted the MPEG-2 decoder for rk3399 on top of his decoder boilerplate work.

RK3288 and RK3328 were left out, as they require clk and drm changes to work properly; patches are being prepared for submission.

The rockchip-vpu-regtool was created to help set the correct hw regs for both vpu1 and vpu2. mpeg2.txt was created based on the mpp hal code; some imx-vpu-hantro code, along with some docs, was also useful to get more insight into the rockchip vpu.

