MIPI CSI interface support for interfacing Image Sensor

Hello liuhui,

settle-cnt sets the Ths-settle parameter as per the MIPI D-PHY spec (the time the receiver waits after the LP-00 state before it starts looking for the Leader-Sequence).
settle-cnt is calculated by the formula:
settle-cnt = Ths-settle [ns] / csiphy0_timer_clk [GHz]

The csiphy0_timer_clk is set to 200MHz (0.2GHz).
Ths-settle you have to determine based on your sensor settings.
Then calculate the settle-cnt.

u8 vc = 0; - this is the Virtual Channel Identifier, as per the MIPI CSI-2 spec, that the sensor uses for the frame data. Sensors usually use virtual channel 0 by default, and/or can be configured to use a specific virtual channel. Set this value to the same virtual channel that your sensor uses.
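
To illustrate where the virtual channel sits in the protocol, here is a small sketch of the CSI-2 packet header layout from the spec. This is not code from the camss driver; the helper name is made up for illustration only.

#include <stdint.h>

/* CSI-2 packet header Data Identifier (DI) byte:
 * bits [7:6] = Virtual Channel (VC), bits [5:0] = Data Type (DT).
 * A sensor sending RAW10 on virtual channel 0 uses DT 0x2B and VC 0,
 * and the receiver has to be programmed to match that VC.
 */
static uint8_t csi2_data_identifier(uint8_t vc, uint8_t dt)
{
        return (uint8_t)(((vc & 0x3) << 6) | (dt & 0x3f));
}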


Hello:
In D-PHY, the range of Ths-settle is 85 ns + 6 UI to 145 ns + 10 UI. According to your calculation method, the minimum would be 85 / 0.2 = 425,
so how do you get the “qcom,settle-cnt = <0xe>;” for the OV5645?

When I test my sensor it times out every time. I see that only 1 interrupt has happened at the ispif and 2 at the vfe. Roughly 100000 interrupts have happened at the csiphy, and 1 interrupt at the csid. My sensor’s format is RAW10, 60 fps, 1280x960, 4 lanes, and the data rate is 297 Mbps. Could you give me some suggestions on how to debug this?

Hello liuhui,

Yes, there is a mistake in the above formula about settle-cnt. The correct one is as follows:

settle-cnt = Ths-settle [ns] * csiphy0_timer_clk [GHz]

The csiphy0_timer_clk is set to 200MHz (0.2GHz).
Ths-settle you have to determine based on your sensor settings.
Then calculate the settle-cnt.

So for the OV5645, Ths-settle = 70 ns and settle-cnt = 70 * 0.2 = 14 = 0xe. Ths-settle can be specific to the settings of the transmitter, which is why it is lower than 85 ns here.
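
To make the unit handling explicit, here is a minimal sketch of the calculation (the helper and its names are hypothetical, only for illustration; csiphy0_timer_clk is assumed to be 200 MHz as stated above):

#include <stdio.h>

/* settle-cnt = Ths-settle [ns] * csiphy0_timer_clk [GHz]
 * Computed with integer arithmetic in ns and Hz:
 * settle_cnt = ths_settle_ns * timer_clk_hz / 1e9
 */
static unsigned int settle_cnt(unsigned int ths_settle_ns,
                               unsigned long timer_clk_hz)
{
        return (unsigned int)((unsigned long long)ths_settle_ns *
                              timer_clk_hz / 1000000000ULL);
}

int main(void)
{
        /* OV5645 example from above: 70 ns at 200 MHz -> 14 = 0xe */
        printf("settle-cnt = 0x%x\n", settle_cnt(70, 200000000UL));
        return 0;
}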

If you don’t receive interrupts on the CSID, it is possible that the CSIPHY does not detect the Leader-Sequence, so the settle-cnt value can really be your issue.


Hello,
Recently I tried to connect a different camera sensor to my board and discovered that the I2C address is hardcoded in the CCI code.
The code is: drivers/media/platform/msm/cci/msm_cci.c
Lines 1176 and 1209:
cci_ctrl.cci_info->sid = 0x78 >> 1;

Is there any problem with using the address registered in the DTS instead?
Thank you, Leonid.

Hello Leonid,

This is a temporary version of the CCI driver, intended only to enable basic usage of it. This will be fixed. You can change the address according to your needs.
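
For reference, a sketch of the kind of change meant here, based on the line quoted above (0x6c is just a placeholder for your sensor’s 8-bit write address, not a value from the driver):

/* drivers/media/platform/msm/cci/msm_cci.c, lines 1176 and 1209:
 * 0x78 >> 1 gives the 7-bit slave address 0x3c (presumably the OV5645's).
 * Replace 0x78 with your own sensor's 8-bit write address, for example:
 */
cci_ctrl.cci_info->sid = 0x6c >> 1;     /* 7-bit I2C slave id */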

Hi Todor,

I am just wondering which registers suggest that Ths-settle is 70 ns? Thanks,

Regards,
Xiang

Hi ,

Does anyone have an idea how to set a raw image sensor format? I used the following command but got "unable to setup format: Invalid argument (22)"; it seems like the format cannot be set for msm_ispif and msm_vfe.

media-ctl -d /dev/media1 -V ‘“ov7251 1-0060”:0[fmt:SRGGB10/640x480],“msm_csiphy0”:0[fmt:SRGGB10/640x480],“msm_csid0”:0[fmt:SRGGB10/640x480],”msm_ispif”:0[fmt:SRGGB10/640x480],”msm_vfe”:0[fmt:SRGGB10/640x480]’

Regards,
Xiang

I am sorry, there were some typos in the last command. Now it works; it should be like this:

media-ctl -d /dev/media1 -V '"ov7251 1-0060":0[fmt:SRGGB10/640x480],"msm_csiphy0":0[fmt:SRGGB10/640x480],"msm_csid0":0[fmt:SRGGB10/640x480],"msm_ispif":0[fmt:SRGGB10/640x480],"msm_vfe":0[fmt:SRGGB10/640x480]'

Now the media entities:

Media device information

driver qcom-camss
model QC MSM CAMSS
serial
bus info
hw revision 0x0
driver version 0.1.0

Device topology

  • entity 1: msm_csiphy0 (2 pads, 3 links)
    type V4L2 subdev subtype Unknown flags 0
    device node name /dev/v4l-subdev0
    pad0: Sink
    [fmt:SRGGB10/640x480 field:none]
    <- “ov7251 1-0060”:0 [ENABLED,IMMUTABLE]
    pad1: Source
    [fmt:SRGGB10/640x480 field:none]
    -> “msm_csid0”:0 [ENABLED]
    -> “msm_csid1”:0 []

  • entity 2: msm_csiphy1 (2 pads, 2 links)
    type V4L2 subdev subtype Unknown flags 0
    device node name /dev/v4l-subdev1
    pad0: Sink
    [fmt:UYVY2X8/1920x1080 field:none]
    pad1: Source
    [fmt:UYVY2X8/1920x1080 field:none]
    -> “msm_csid0”:0 []
    -> “msm_csid1”:0 []

  • entity 3: msm_csid0 (2 pads, 3 links)
    type V4L2 subdev subtype Unknown flags 0
    device node name /dev/v4l-subdev2
    pad0: Sink
    [fmt:SRGGB10/640x480 field:none]
    <- “msm_csiphy0”:1 [ENABLED]
    <- “msm_csiphy1”:1 []
    pad1: Source
    [fmt:SRGGB10/640x480 field:none]
    -> “msm_ispif”:0 [ENABLED]

  • entity 4: msm_csid1 (2 pads, 3 links)
    type V4L2 subdev subtype Unknown flags 0
    device node name /dev/v4l-subdev3
    pad0: Sink
    [fmt:UYVY2X8/1920x1080 field:none]
    <- “msm_csiphy0”:1 []
    <- “msm_csiphy1”:1 []
    pad1: Source
    [fmt:UYVY2X8/1920x1080 field:none]
    -> “msm_ispif”:0 []

  • entity 5: msm_ispif (2 pads, 3 links)
    type V4L2 subdev subtype Unknown flags 0
    device node name /dev/v4l-subdev4
    pad0: Sink
    [fmt:SRGGB10/640x480 field:none]
    <- “msm_csid0”:1 [ENABLED]
    <- “msm_csid1”:1 []
    pad1: Source
    [fmt:SRGGB10/640x480 field:none]
    -> “msm_vfe”:0 [ENABLED,IMMUTABLE]

  • entity 6: msm_vfe (2 pads, 2 links)
    type V4L2 subdev subtype Unknown flags 0
    device node name /dev/v4l-subdev5
    pad0: Sink
    [fmt:SRGGB10/640x480 field:none]
    <- “msm_ispif”:1 [ENABLED,IMMUTABLE]
    pad1: Source
    [fmt:SRGGB10/640x480 field:none]
    -> “msm_vfe_video”:0 [ENABLED,IMMUTABLE]

  • entity 7: msm_vfe_video (1 pad, 1 link)
    type Node subtype V4L flags 0
    device node name /dev/video0
    pad0: Sink
    <- “msm_vfe”:1 [ENABLED,IMMUTABLE]

  • entity 8: ov7251 1-0060 (1 pad, 1 link)
    type V4L2 subdev subtype Unknown flags 0
    device node name /dev/v4l-subdev6
    pad0: Source
    [fmt:UYVY2X8/640x480 field:none
    crop:(0,0)/640x480]
    -> “msm_csiphy0”:0 [ENABLED,IMMUTABLE]

But I still got some errors:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Got context from element 'glimagesink0': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"(GstGLDisplayX11)\ gldisplayx11-0";
Setting pipeline to PLAYING ...
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Device '/dev/video0' cannot capture in the specified format
Additional debug info:
gstv4l2object.c(2858): gst_v4l2_object_set_format (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
Tried to capture in YU12, but device returned format pRAA

Any idea?

Regards,
Xiang

Hello Xiang,

About Ths-settle - I’m not sure that I understand the question.
The Ths-settle value is calculated based on the camera sensor that you have and the sensor settings that are applied to it, so it is specific to the sensor you use. The Ths-settle can then be converted to the settle-cnt value and programmed into the CSIPHY (the register is named CAMSS_CSI_PHY_LNn_CFG3 in the code).

If you set the SRGGB10 media bus format on the subdevs, you have to set V4L2_PIX_FMT_SRGGB10P on the video device node (you have to check the exact naming in gstreamer).
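
A minimal sketch of setting that packed RAW10 format directly on the video node with plain V4L2, using the device path and frame size from the example above (error handling kept short):

#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/videodev2.h>

int main(void)
{
        struct v4l2_format fmt;
        int fd = open("/dev/video0", O_RDWR);

        if (fd < 0) {
                perror("open");
                return 1;
        }

        memset(&fmt, 0, sizeof(fmt));
        fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        fmt.fmt.pix.width = 640;
        fmt.fmt.pix.height = 480;
        /* packed 10-bit Bayer; fourcc 'pRAA', matching the error message above */
        fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_SRGGB10P;
        fmt.fmt.pix.field = V4L2_FIELD_NONE;

        if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0)
                perror("VIDIOC_S_FMT");
        else
                printf("format set: %ux%u\n", fmt.fmt.pix.width, fmt.fmt.pix.height);

        close(fd);
        return 0;
}

Whether gstreamer’s v4l2src can negotiate this bayer format depends on the gstreamer version, which is why the exact naming has to be checked there.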

Thanks Todor, now I understand how settle-cnt is calculated. I saw your post:
settle-cnt = Ths-settle [ns] * csiphy0_timer_clk [GHz]
Ths-settle comes from the sensor settings, but I didn’t find where csiphy0_timer_clk is set (to 200 MHz).

Regarding the raw image sensor format: so I can use media-ctl to set the media bus format, but how should I set the format on the application side (i.e. gstreamer; sorry, I’m new to gstreamer…)?

Regards,
Xiang

Can someone please summarize the state of MIPI CSI camera support?

I’ve read through several threads and it seems support was targeted for summer 2016. There has been at least some test code circulating? Is support limited to specific sensors? Is it possible for the community to write drivers for others now? Can both interfaces be used? Simultaneously? Video capture works? I also see there is an app note for design of the electrical interface, and Intrinsyc and Inforce both seem to have some level of support.

I’m looking at the 410E for a new design because the Dragonboard and this community appear to be maturing, while historically I would never consider QCM (they would not consider my small customers either), because they are so closed with documentation. Unfortunately, it seems difficult to know if a commercial design can be successful yet.

Thanks in advance.

Matt

Hello Matt,

I suppose your questions concern the camera support in Linux (Debian).

The first place to look for information is the release notes of the latest release:
http://builds.96boards.org/releases/dragonboard410c/linaro/debian/latest/

I’ll add here some more information about your questions.
The camera support is in active development. Some features are already supported and available in the release, and some are planned to be added.

What is currently supported in the Camera Subsystem driver (the MIPI CSI receiver driver) is a direct dump to memory (no processing) of UYVY or RAW frame formats. Two camera sensors can be used concurrently and independently. The Linaro release also includes a driver for the OV5645 camera sensor, which was used to develop and test this whole functionality. Using other MIPI CSI-2 camera sensors is also possible. Some difficulties may arise since there is no public camera documentation from Qualcomm and debugging can be tricky; however, users can always ask in the forum and we will try to give relevant advice.

Video capture is currently not supported; work on this is ongoing.

Best regards,
Todor

Thanks for the clarification Todor.

I did mean Debian, but did you mean that cameras have better (or worse) support in another environment, such as Android? I assumed the most functionality was in Debian.

The last bit is unfortunate, with no video capture. What generally is gating video capture development? Info missing from Qualcomm? Developer bandwidth?

We have a small team that could possibly contribute to the vid capture effort, but I need to understand what the obstacles are.

Also, the release notes mention dual camera support, but they also mention not having support for both interfaces on the high speed connector; are the I/Os not brought out to it? What’s the issue there?

Regards,
Matt

hi,

I am assuming you are referring to the “fully optimized video capture” use case, using a CSI camera and the hw encoder available on the SoC.

There is no obstacle for video capture.

the video capture use case requires several pieces:

  1. support for CSI camera
  2. support for color conversion in the camera subsystem (most cameras provide YUV data in a format different from what the hw encoder expects, e.g. NV12)
  3. potentially scaling/cropping of the camera picture
  4. hw video encoder, using the dedicated IP
  5. ability to share video buffers between the various subsystems without any CPU copy.

As of today, #1, #2 and #4 are available; the video encoder driver is available along with the video decoder, using the in-kernel V4L2 M2M API. #5 should also be supported (not 100% sure).

What that means is that we do not have the full integration with GStreamer yet. But we have a standalone test app that does video encode, and you should be able to use the camera driver and the video encoder simultaneously; we just haven’t finished the full integration.

For #3, the work is starting, but it is likely not essential right now.

Regarding dual camera support, the limitation has been fixed, and you can now use both sensors simultaneously. It was a limitation in the 16.06 release; we forgot to update this item in the 16.09 release notes.

Okay, I just got my hands on a board.

Assuming I do not need #3 for now, if I wanted to set up a simple test to try #1, #2 and #4, can I test them all together with gstreamer, or will I need to write a test application of some sort?

Hi Todor,

We are porting some image sensor drivers to the DragonBoard 410c, and we know the sensor is streaming 1080p@30fps in RAW12, but we don’t see any camss interrupts happening in /proc/interrupts. Do you have any idea why?

We tried the OV5645 module, and we can see the interrupt counts for camss, ispif and vfe increasing while video is streaming.

Regards,
Xiang

Hi Xiang,

Have you verified with an oscilloscope that the sensor is streaming?
You might want to check that the external sensor clock which you use is supported by the sensor driver.
Another thing to check is the settle-cnt value, which we discussed above.

Thanks Todor, we have verified the sensor MIPI signals with an oscilloscope; it’s 2 data lanes with a data rate of 800 Mbps.

And we assume csiphy0_timer_clk is still 200 MHz for the settle-cnt calculation (we are based on release 16.09). What else could be wrong?

Regards,
Xiang

Hi Todor,

I am wondering what the conditions are to trigger the camss, ispif and vfe interrupts. Do you have any idea?

Regards,
Xiang