I’m using a design similar to the DragonBoard 410c, with two cameras: an OV13855 on CSI0 and an OV5640 on CSI1. I’m working from the 4.9 kernel release, with a Yocto-built userspace image.
I’ve backported the OV5640 driver that landed in mainline and modified another driver to support the OV13855. After setting up the media links, the OV5640 works with a GStreamer pipeline, but the OV13855 does not work yet, not least because I lack a working pipeline for testing it. The driver sets
MEDIA_BUS_FMT_SGRBG10_1X10 as the format code, which I believe matches the output format described in the datasheet (“10-bit RGB RAW”).
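To double-check what the driver actually advertises, the pad format can be inspected and set from userspace with media-ctl. A sketch, assuming typical qcom-camss entity names as printed by `media-ctl -p` (the entity name, I2C address, and 4224x3136 resolution are assumptions for my board; substitute what your topology dump shows):

```shell
# Dump the media graph to find the sensor subdev entity name:
media-ctl -d /dev/media0 -p

# Read back the format currently set on the sensor's source pad:
media-ctl -d /dev/media0 --get-v4l2 '"ov13855 0-0010":0'

# Explicitly set the 10-bit GRBG Bayer format on that pad:
media-ctl -d /dev/media0 --set-v4l2 '"ov13855 0-0010":0 [fmt:SGRBG10_1X10/4224x3136]'
```

If `--get-v4l2` reports a different code than the driver claims to set, that mismatch would be the first thing to fix.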
I’d like to know how to convert the hardware formats (
MEDIA_BUS_FMT_SGRBG10_1X10 for the OV13855,
MEDIA_BUS_FMT_UYVY8_2X8 for the OV5640) into something the application stack can use. Eventually I need to stream the video into a web application running in a QtWebEngine/Chromium context, and if I read the code correctly, this is where the supported formats are set up:
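For testing outside Chromium, one option I’m considering is debayering in userspace with GStreamer’s bayer2rgb element (from gst-plugins-bad). A sketch, with the caveat that bayer2rgb only accepts 8-bit Bayer, so the 10-bit SGRBG10 stream would first need to be truncated/unpacked to 8 bits somewhere; the device node and caps below are assumptions:

```shell
# Hypothetical pipeline: capture 8-bit GRBG Bayer, demosaic in software,
# convert, and display. Adjust device node, width/height, and framerate.
gst-launch-1.0 v4l2src device=/dev/video3 ! \
    'video/x-bayer,format=grbg,width=4224,height=3136,framerate=30/1' ! \
    bayer2rgb ! videoconvert ! autovideosink
```

Whether this is viable depends on whether the capture pipeline can deliver 8-bit Bayer at all, or whether an ISP stage is required for the 10-bit raw.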
So my question is: how do I set up the media pad links to bring the native hardware formats into something the Chromium code is willing to accept?
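For reference, this is the kind of link/format setup I have in mind, sketched with media-ctl. The entity names and pad numbers are assumptions based on a typical qcom-camss topology on msm8916 (csiphy → csid → ispif → vfe); please correct me if the chain should look different:

```shell
# Enable the links along one CSI pipeline ([1] = enabled):
media-ctl -d /dev/media0 -l '"msm_csiphy0":1->"msm_csid0":0[1]'
media-ctl -d /dev/media0 -l '"msm_csid0":1->"msm_ispif0":0[1]'
media-ctl -d /dev/media0 -l '"msm_ispif0":1->"msm_vfe0_rdi0":0[1]'

# Propagate the sensor format along every pad in the chain
# (resolution is an assumption for the OV13855):
media-ctl -d /dev/media0 -V '"ov13855 0-0010":0 [fmt:SGRBG10_1X10/4224x3136]'
media-ctl -d /dev/media0 -V '"msm_csiphy0":0 [fmt:SGRBG10_1X10/4224x3136]'
media-ctl -d /dev/media0 -V '"msm_csid0":0 [fmt:SGRBG10_1X10/4224x3136]'
```

What I don’t yet see is where in this chain the Bayer data could be turned into a format Chromium accepts, or whether that has to happen entirely in userspace.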
Thanks for any pointer!