Getting video stream from RAW Bayer image sensor

Following suggestions on this forum, I could successfully set up video streaming from an image sensor with RAW Bayer output (OnSemi AR1337) on the DragonBoard 410c running Debian 18.01.
However, the frame rate is very low because the debayering is done in software by GStreamer (using the “bad” plugins) on the stream from the RDI interface, since the PIX interface only supports sensors with YUV output such as the OV5645.
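For reference, the software path is roughly the pipeline below; the device node, the Bayer order (gbrg for SGBRG8) and the sink element are just what my setup needs, so they may differ on yours:

gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-bayer,format=gbrg,width=1920,height=1080' ! bayer2rgb ! videoconvert ! autovideosink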
I tried to overcome this problem by doing some hacking on the camss_vfe driver, and I think I correctly set up the DEMUX for the SGBRG8_1X8 format. To test this I used the test pattern generator in the CSID, configuring the CSID source pad and the downstream components up to the VFE sink pad for this Bayer format. I could then see the test pattern streaming through /dev/video3. Is this approach correct?
When I connect the camera instead, I get “VFE0 pix0 overflow” errors from the ISPIF, while no error occurs when I select RDI.
Any suggestions?

Thank you,

Marco

I don’t understand this point, could you please elaborate? Where is the debayering done?

I assume this problem happens when a new frame arrives at the PIX interface before the previous one has been processed.

I can successfully do the debayering in software with GStreamer (from /dev/video0), but at a very low rate.
When I try to do it in hardware (from the PIX interface, /dev/video3) I get no frames, only overflow errors.
If I use the CSID test pattern generator through the same path (the PIX interface), setting Bayer formats in the VFE input chain, I can see the frames correctly.

Yes, this is likely, but why do the test frames get through while the camera frames don’t?
The resolution is 1080p in both cases, so the amount of data should be the same.
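For reference, before streaming I check that every pad in the chain really carries the format I set, with:

media-ctl -d /dev/media1 -p

which prints the topology together with the links and the format configured on each pad.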

Thanks,
Marco

Like you said, the PIX interface does not support RAW Bayer images, only YUV. So you would need to debayer before the PIX interface (in the CSID?). I’m not sure that’s possible, but maybe @todortomov can confirm.

You could potentially use the GPU for debayering (e.g. https://github.com/rasmus25/debayer-rpi).

Interesting link. It could be an alternative if I find no way to do the debayering in the VFE.
It will just take a while to port this code into a GStreamer plugin…

Thanks,
Marco

Hello,

The hardware block that is capable of doing the demosaic is the VFE. However, the configuration for that is more complex than configuring the Demux module. This configuration is not public and is not supported in the CAMSS driver.

Hi Todor,

thanks for your answer. I understand Qualcomm’s need to protect their technology, but such a secretive attitude only makes things harder for ordinary developers like me, while it doesn’t completely prevent reverse engineering by really motivated people…

Apart from this, I have a few questions about the test pattern generator. When I enable it and configure the CSID source pad as, for example, SGBRG8_1X8/1920x1080, does it really stream in that format to the connected entities? What is different from streaming from a real image sensor? Is the frame rate determined by the client connected to the final video device, or by something else?
(There must actually be some difference, since I could see the test pattern streaming through /dev/video3.)

Thanks,
Marco

No, it streams the pattern that you have set with the V4L2_CID_TEST_PATTERN control. The format and size are used to calculate the bytes per line and the lines per frame, i.e. the frame size.
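A quick sketch of that calculation for SGBRG8_1X8 (8 bits per pixel) at 1920x1080:

echo $(( 1920 * 1 ))      # bytes per line: 1920
echo $(( 1920 * 1080 ))   # bytes per frame: 2073600, roughly 2 MB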

It is determined by the test generator. This is the same as with a camera sensor. If the client cannot handle (dequeue/queue) the buffers fast enough, then the write interface will skip (discard) frames. The test generator has horizontal and vertical blanking values; you can try to play with them, but I think they are set to the maximum, so you can probably only increase the frame rate.
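For intuition, the usual relation is fps = pixel_clock / ((width + hblank) * (height + vblank)); a quick sketch with made-up numbers (these are not the real TG register values):

echo $(( 200000000 / ( (1920 + 300) * (1080 + 30) ) ))   # about 81 fps with an assumed 200 MHz clock and assumed blanking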

The problem is that the VFE is not configured for Bayer input but for UYVY. Even if you get something on the output, this does not mean that Bayer data sent to the input will be demosaiced. As I said, this configuration is proprietary. I’m sorry to discourage you, but this is how it is.

Best regards,
Todor Tomov

To be clearer, these are the commands I used:

yavta --no-query -w '0x009f0903 1' /dev/v4l-subdev2
media-ctl -d /dev/media1 -l '"msm_csid0":1->"msm_ispif0":0[1],"msm_ispif0":1->"msm_vfe0_pix":0[1]'
media-ctl -d /dev/media1 -V '"msm_csid0":1[fmt:SGBRG8_1X8/1920x1080 field:none],"msm_ispif0":0[fmt:SGBRG8_1X8/1920x1080 field:none],"msm_vfe0_pix":0[fmt:SGBRG8_1X8/1920x1080 field:none]'
media-ctl -d /dev/media1 -V '"msm_vfe0_pix":1[fmt:UYVY8_1_5X8/1920x1080 field:none]'

I know I can choose which test pattern to generate via the argument passed to yavta, which issues the command to set V4L2_CID_TEST_PATTERN. Do you mean that the CSID outputs the frame size set in the commands (1920x1080) but that the frame encoding is not SGBRG8_1X8?
About the clock rate: when there is no sensor at the beginning of the chain, how does the CSID figure out the frame rate at which to output data?
I have seen a comment in the code saying “if sensor pixel clock is not available set highest possible CSID clock rate”. Does this apply here?

Thanks,
Marco

Hi,
I think you can dump it into memory via RDI and validate it with a tool like 7YUV or similar to find out the CSID TG frame encoding format.
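A minimal sketch of that, assuming the CSID TG is routed to an RDI channel exposed as /dev/video0 (check with media-ctl -p), using v4l2-ctl to grab a few frames:

v4l2-ctl -d /dev/video0 --set-fmt-video=width=1920,height=1080,pixelformat=GBRG
v4l2-ctl -d /dev/video0 --stream-mmap --stream-count=4 --stream-to=/tmp/csid-tg.raw

The dump can then be opened in 7YUV as 8-bit Bayer GBRG at 1920x1080 to check what the generator actually produces.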

Best Regards,
Jox