I2S0 Bugs for Hikey960

Hi,

I’m using the I2S driver from the 4.9 kernel, ported to 4.19, but this driver has several issues:

1 - It only supports 2 channels, a 48000 Hz sample rate, and 16-bit samples.
2 - It can only be master on the bit clock and frame clock; slave mode produces a lot of data corruption on the I2S bus.
3 - Wrong bit clock: it should be 2 x 16 x 48000 = 1536000 Hz, but it is 3072000 Hz.

I can see this driver was sent to mainline: https://lkml.org/lkml/2019/2/28/541
But Pengcheng Li never responded to the review comments on this driver.

Does anyone know what’s happening with the support for this board?

Thanks
Lucas

It’s starting to have that abandoned feel. I haven’t seen much from HiSilicon about this in quite a while.

For (1), there is some discussion here about mono, but I haven’t had a chance to actually try it out since its priority has fallen significantly for me (I’ve implemented software workarounds):
https://bugs.96boards.org/show_bug.cgi?id=674

Other sample rates are possible with a driver update. I’ve done so here;

Your (3) is invalid. Your calculation only gives the MINIMUM PERMISSIBLE bit clock; it is allowed to be much higher. In fact, 3 MHz is what you will normally see in use for hardware that supports 32-bit sample sizes. The transmitter may fill in only the 16 most significant bits, and the receiver will only pay attention to those 16 bits; the rest will be zeros and discarded. More significantly, even if the two sides are operating at different sample sizes (which often happens when one or both sides are non-programmable), the data still makes sense.

The actual key to the transmission is the WORD CLOCK, which defines where valid data begins. The receiver takes the WC transition as the trigger to begin reading bits from the data line, counted by the bit clock. When it finishes reading the sample, it waits any number of bit clocks until the next WC transition.

3 - Well, the problem is that the codec, as slave, needs to know the bit clock frequency, and the driver reports 1536000 while actually outputting 3072000. So what should be done in the driver to report the correct bit clock?

I think you’re going to have to start off by explaining what codec you are trying to use, what driver you are trying to use, and how you are trying to configure it, since that doesn’t make any sense. You should NOT have to do ANYTHING to configure the slave codec to the proper bit clock.