Broken pipe error when trying to develop a driver for OV4689 MIPI sensor for APQ8016 using CAMSS & CCI

Hi Loic,

Yet the camss driver code supports 8-bit/10-bit/12-bit RAW RGB formats (see camss-vfe.c). It also supports RDI sinks, so I don’t think it would be a pixel format problem.

Yes, you should at least be able to retrieve a stream via the RAW interface. Maybe @todortomov has some info about this kind of error.

I assume you configured the CSI port correctly in your device tree (with the four CSI lanes).

Indeed, the clock lane is configured at index 1, while the data lanes are at indexes 0, 2, 3 and 4.

The enable and reset GPIOs are set correctly, and the sensor register is also set. Overall, the device tree seems to be OK, since I can retrieve the values in my code.

I’m looking forward to more suggestions from 96boards users! I appreciate your help.

EDIT: while debugging the code, it seems that the s_stream callback function isn’t called. Also, I have set the MCLK to 24 MHz (not 23.88 MHz), but I don’t know whether that makes much of a difference.

Hello,

If this is the first time you are trying to run your driver (or system), I’d recommend trying a simple V4L2 application first and getting that running. It removes some of the complexity and can help you understand whether the problem is in any of the drivers, in the GStreamer configuration, or somewhere else.
You can try yavta: http://git.ideasonboard.org/yavta.git
Again, configure your media pipeline and formats, then try streaming with something like this (check and adjust the parameters for your exact case):
yavta -B capture-mplane -c10 -I -n 5 --requeue-last -f SBGGR10P -s 1280x960 /dev/video0

Hi Todor,

Thank you for your answer (and also, thanks for writing sensor drivers!)

I found out what was going on: I was getting a “Broken pipe” because the pixel format specified through GStreamer and the pixel format I was setting on my pipeline were different. My version of GStreamer only handled the 8-bit RAW Bayer format, while I can only stream in the 10-bit RAW Bayer format.

So I modified some GStreamer code and rebuilt it using the patch described here, in order to handle 10-bit RAW instead of 8-bit RAW: http://developer.ridgerun.com/wiki/index.php?title=Compile_gstreamer_on_tegra_X1_and_X2 (section “Steps to patch gstreamer to support RAW10”)

Right now the s_stream callback is called correctly and the camera seems to be in streaming mode, but when I try to write a frame to a file, the file is 0 bytes. I will continue debugging to find out why.

Regards,
Mathieu

Hi Mathieu/Todor,
I am also facing the same kind of issue while trying to dump frames from the IMX230.

RAW Bayer dump IMX230

Regards,
Jox

Debugging further, it seems that the MIPI clock doesn’t pulse at all. The master clock (MCLK) from the SoC to the camera pulses correctly (at 23.88 MHz rather than 24 MHz, as with your ov5645 driver), but the MIPI clock doesn’t. Yet I have configured both PLLs of the sensor with the correct multipliers and dividers.

Could it come from the pixel clock stated in the driver, or from something else?

If you have any idea, I will give it a try!

EDIT: which data-lanes and clock-lanes indexes should I put in the device tree? Are they based on pins 1 to X of the ROHS schematics? For the ov5645 it seems so, since clock-lanes is <1> and data-lanes is <0 2>.

Hi everyone,

I am still stuck in my development, and I still don’t understand what is going on. Some help would be much appreciated! :slight_smile:

Here is the current state:

  • I am developing a driver for the OV4689 sensor on Debian, compliant with the Qualcomm CCI & CAMSS drivers. I am largely taking inspiration from the existing OV5645 driver in the kernel.
  • The sensor is configured in the device tree with clock-frequency = <23880000>, clock-lanes = <1> and data-lanes = <0 2 3 4>
  • The I2C address used to communicate with the sensor is, as specified in the datasheet, address 0x42
  • I manage to communicate with the sensor (power up, power down, read/write registers), but once the camera is put in streaming mode, nothing happens.
  • I have checked the master clock with an oscilloscope and it pulses at the correct frequency (23.88 MHz), but the MIPI clock doesn’t pulse and the MIPI data lanes don’t send any data.
  • The software I used to launch the video and grab frames is GStreamer and yavta; both put the sensor in streaming mode and then stay in a loop (since, it seems, no data is sent)
  • I am using the Raw Data Input (RDI) pipeline to get the data, since the OV4689 sends 10-bit RAW RGB data over four MIPI lanes. The PLL of the sensor is set according to the datasheet for a 24 MHz master clock
  • There are no specific warning/error messages, so it is really hard to debug: all I see is that the camera is in streaming mode and the software seems to wait for data, but since the MIPI clock doesn’t pulse and the MIPI data lanes don’t send data, it just waits.
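For completeness, the sensor endpoint in my device tree looks roughly like this (a sketch: label names such as ov4689_ep and csiphy0_ep are placeholders, but the lane values are the real ones listed above):

```
port {
	ov4689_ep: endpoint {
		clock-lanes = <1>;
		data-lanes = <0 2 3 4>;
		remote-endpoint = <&csiphy0_ep>;
	};
};
```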

I know this development is rather specific, but since I am following a similar approach to the OV5645 driver, I don’t understand why it does not work. That is why I am looking for help here. :smiley:

Thank you in advance for your help,
Regards,
Mathieu

I would suggest adding some debug output to the camss driver (mainly in camss-csiphy.c and camss-csiphy-2ph-1-0.c); start by checking that the csiphy_lanes_enable function is called when you open your stream.

Hi Loic,

I can’t find this function in camss-csiphy.c; I am using a version of the driver for kernel 4.9.56.

Where could I find a recent version of the driver ?

The current DB410C kernel is 4.14. You can find kernel build instructions in the latest release note: https://releases.linaro.org/96boards/dragonboard410c/linaro/debian/latest/ or in the 96Boards documentation: https://www.96boards.org/documentation/consumer/dragonboard410c/build/kernel.md.html

Even in the latest kernel I can’t find this function.

Today I checked the MIPI clock and data lanes with the oscilloscope, and it seems that the sensor sends data and the clock pulses correctly. But I still can’t figure out why it doesn’t work.

Maybe it comes from the camss driver? I have already added some debug messages, and the video_start_streaming function of the video node is called. Is Qualcomm’s venus driver involved?

Also, I am using the V4L2_PIX_FMT_SBGGR10P pixel format with a resolution of 2688x1520 (the sensor’s highest resolution); I am not sure whether this is the source of the problem (could you tell us more, @todortomov?)

Regards,
Mathieu

Hi Mathieu,
As you are on the 410c branch, the “lane_enable” function is not present there; it is present in the 820c 4.14 branch. For 410c you can check the “static int csiphy_stream_on(struct csiphy_device *csiphy)” function, which contains everything that “lane_enable” does.

Give it a try by changing the “settle count”.

Best Regards,
Jox

https://git.linaro.org/landing-teams/working/qualcomm/kernel.git/tree/drivers/media/platform/qcom/camss/camss-csiphy-2ph-1-0.c?h=release/qcomlt-4.14#n93

No, venus is a different component (video encoder/decoder).

Relatively good news! Is it because you changed the kernel version?

Now I think the next thing to check is the ISPIF/VFE interrupt, which is triggered for several reasons, including start of frame: camss-vfe-4-1.c « camss « qcom « platform « media « drivers - working/qualcomm/kernel.git - Qualcomm Landing Team kernel

Could you please give the command you are running? Why use csid1? It should be something like this:

media-ctl -d /dev/media0 -l \
'"msm_csiphy0":1->"msm_csid0":0[1],"msm_csid0":1->"msm_ispif0":0[1],"msm_ispif0":1->"msm_vfe0_pix":0[1]';

Hi Mathieu,

Please check the points by Loic above and share info on them.

I haven’t changed the kernel version yet (I tried to flash the board but it didn’t work; I will try again later today).

sudo media-ctl -d /dev/media0 -l '"msm_csiphy0":1->"msm_csid0":0[1],"msm_csid0":1->"msm_ispif0":0[1],"msm_ispif0":1->"msm_vfe0_rdi0":0[1]'

sudo media-ctl -d /dev/media0 -V '"ov4689 4-0042":0[fmt:SBGGR10_1X10/2688x1520 field:none],"msm_csiphy0":0[fmt:SBGGR10_1X10/2688x1520 field:none],"msm_csid0":0[fmt:SBGGR10_1X10/2688x1520 field:none],"msm_ispif0":0[fmt:SBGGR10_1X10/2688x1520 field:none],"msm_vfe0_rdi0":0[fmt:SBGGR10_1X10/2688x1520 field:none]'

As you can see, I’m using rdi0 since the sensor sends 10-bit RAW data (with the V4L2_PIX_FMT_SBGGR10P pixel format).

Also, here is the output using yavta:

yavta -B capture-mplane -c1 -I -n 1 -f SBGGR10P -s 2688x1520 /dev/video0 

[  342.583887] ov4689 4-0042: powering up OV4689 sensor
[  342.633095] qcom-camss 1b0ac00.camss: timer_clk_rate value: 100000000
[  342.641714] qcom-camss 1b0ac00.camss: vfe_isr interrupt handler is called
Device /dev/video0 opened.
Device `Qualcomm Camera Subsystem' on `platform:1b0ac00.camss' (driver 'qcom-camss') supports video, capture, with mplanes.
Video format set: S[  342.650834] qcom-camss 1b0ac00.camss: video_buf_init
BGGR10P (41414270) 2688x1520 field none, 1 planes: 
 * Stride 3360, buffer size 5107200
Video format: SBGGR10P (41414270) 2688x1520 field none, 1 planes: 
 * Stride 3360, buffer size 5107200
1 buffers requested.
length: 1 offset: 4227546768 timestamp type/source: mono/EoF
Buffer 0/0 mapped at address 0xffff9fa9b000.
[  342.674166] qcom-camss 1b0ac00.camss: video_buf_prepare
[  342.702218] qcom-camss 1b0ac00.camss: csiphy_settle_cnt_calc: ui value: 3190
[  342.702241] qcom-camss 1b0ac00.camss: csiphy_settle_cnt_calc: timer_clk_rate value: 100000000
[  342.708366] qcom-camss 1b0ac00.camss: csiphy_settle_cnt_calc: settle_cnt value: 14
[  342.716762] qcom-camss 1b0ac00.camss: csiphy_stream_on: cfg->csid_id value: 0
[  342.724252] ov4689 4-0042: [ov4689] ov4689_s_stream with status: 1
[  342.733820] ov4689 4-0042: sensor streaming mode (reg 0x0100) : 0x01

As you can see in the log, the vfe_isr interrupt handler does not seem to be called once streaming mode starts.

EDIT:

I receive the CSID interrupt only right after the sensor is powered up and right before the video node is opened, and only at that moment. Could this mean there is a conflict between the settle count calculated by the driver and the one set in the registers?

EDIT2: does the sensor driver need to be loaded before the qcom-camss driver? I insmod my driver manually after boot.

It doesn’t matter that you insert it after boot.
Your commands seem correct.

The first CSID interrupt happens when CSID reset is done.

The settle count value could be your problem. The settle time can be calculated, or measured with an oscilloscope; the settle count can then be calculated from it. Alternatively, you can start with the value calculated by the driver, try different values (lower or higher), and see what happens. Although we have adjusted the settle count calculation several times, there are still some inconsistencies when using different sensors. Try to play with this value and see if you start to receive the video data.

May I ask what this value is used for? (Not in general, I mean in the driver.) It seems that this value is used to write something… What is the difference between the effective settle time of the CSI-2 transmitter and the “settle count”?

Regards,
Mathieu

I’m not sure that I understand the question. In general, the settle time is the time during which the CSI-2 receiver will wait in HS-0 mode before it starts looking for the Leader-Sequence. The driver has to configure the CSI-2 receiver for that, so it writes the settle count value to the registers. The settle count is the parameter that the hardware understands; it is calculated from the settle time. You can check the calculation in the latest version: camss-csiphy-2ph-1-0.c « camss « qcom « platform « media « drivers - working/qualcomm/kernel.git - Qualcomm Landing Team kernel

I don’t understand what the settle count is. I know that the settle time is the time the data lanes need to prepare for sending data (which consists of the hs_prepare time plus some hs_zero time, I guess), and that this value is expressed in nanoseconds. But I can’t figure out what the settle count is; I can’t find a definition.