I am using an X5-210M in conjunction with the 8-channel DDC core. You said previously that, as stated in the DDC datasheet, I and Q are encoded on 16 bits. I have made the following measurement:
1- A 14.50 MHz sine on ADC0; clock 50 MHz; decimation rate 128 (no modification of the test.ini file); DDC tuning frequency 14.48 MHz. The sine on ADC0 has a power of +10 dBm, which means 1 Vpp (verified on an oscilloscope with a 50 Ω input), so that 1 Vpp covers the full range of the ADC.
When I look at the data recorded with Matlab, the I (or Q) channel at the output of the DDC covers only the range −1074 to +1072. That makes around 11 bits of encoding, not the 16 stated in the datasheet. It is even less than the 14-bit input of the ADC. Consequently the minimum input level on ADC0 is around −56 dBm, which is consistent with the maximum being encoded on 11 bits: 11 bits × 6 dB = 66 dB, so +10 dBm − 66 dB = −56 dBm.
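The bit-count estimate above can be checked numerically. A small sketch (the code ranges are taken from my measurement; the ~6 dB-per-bit rule is the usual approximation):

```python
import math

# Observed I/Q extremes at the DDC output (from the measurement above)
code_max, code_min = 1072, -1074
code_span = code_max - code_min + 1   # total number of output codes exercised

# Effective number of bits actually used by a full-scale input
effective_bits = math.log2(code_span)
print(round(effective_bits, 2))       # -> ~11.07, i.e. about 11 bits

# Dynamic range at ~6.02 dB per bit, and the implied minimum input level
dyn_range_db = effective_bits * 6.02
min_input_dbm = 10 - dyn_range_db     # +10 dBm was the full-scale input
print(round(min_input_dbm, 1))        # -> ~ -56.6 dBm
```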
Could you explain why I (or Q) is not encoded on 16 bits?
Could you confirm that the power meter does not influence the scaling of the data, and that it is for information only?
And what is the range of the decimation? In the test.ini file I can see the CIC decimation rate (currently 128). When the CIC rate is 128, what is the total decimation rate? (I think I have seen somewhere in the documentation that the two FIR filters also decimate.) Also, what is the range of the CIC decimation, and the step between two values?
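For reference, here is the arithmetic I am assuming for the total decimation. This is only a sketch: the factor-of-2 decimation in each FIR stage is my assumption (it is typical of GC4016-class DDC cores), not something I have confirmed for this core:

```python
# Hedged sketch: total decimation and complex output rate, ASSUMING the
# CFIR and PFIR stages each decimate by 2 (to be verified against the
# DDC core's datasheet).
clock_hz = 50e6          # ADC clock used in the measurement above
cic_decimation = 128     # CIC rate set in test.ini
cfir_decimation = 2      # assumption
pfir_decimation = 2      # assumption

total_decimation = cic_decimation * cfir_decimation * pfir_decimation
output_rate_hz = clock_hz / total_decimation
print(total_decimation)  # -> 512
print(output_rate_hz)    # -> 97656.25 (Hz)
```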
The X5-210M has 14-bit A/Ds.
The ini file shows the basic setup for the configuration of the DDC. The output of the DDC is 16 bits maximum; the width actually used, however, depends on the gain selected at each stage in the DDC. The DDC has CIC, CFIR and PFIR stages.
The gain at the CIC can be up to 2 bits (12 dB). The gain at the CFIR and PFIR can each be up to 6 bits (36 dB each, 72 dB combined).
The reason for this user-defined gain setting is the flexibility of adjusting the receiver's sensitivity to the application. You can select optimum values for the CIC, CFIR and PFIR gain bits to get a full-scale 16-bit output.
This can be done by changing the test.ini file. For example, if you select Bstart = 1 instead of 2 in the [CIC] section, you gain 6 dB in the output. Care should be taken when choosing these values to avoid overflow in the DDC output.
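A quick way to sanity-check a candidate gain setting is to add up the bits. The sketch below is illustrative only: the gain split across stages is hypothetical, and the actual parameter names (Bstart, etc.) live in the [CIC] and FIR sections of test.ini:

```python
# Sketch: check whether a chosen set of gain-bit settings fills the
# 16-bit DDC output without overflow. Gain values here are hypothetical.
OUTPUT_BITS = 16   # maximum DDC output width

def output_bits_used(signal_bits, cic_gain_bits, cfir_gain_bits, pfir_gain_bits):
    """Bits occupied at the DDC output for a full-scale input,
    treating each gain bit as a left shift (6 dB)."""
    return signal_bits + cic_gain_bits + cfir_gain_bits + pfir_gain_bits

# The measurement above showed ~11 bits at the output with the default
# settings, so roughly 5 more bits (30 dB) of gain would reach full scale.
used = output_bits_used(11, 2, 2, 1)   # hypothetical 2+2+1 bit split
print(used)                            # -> 16, i.e. full scale
assert used <= OUTPUT_BITS, "gain too high: DDC output would overflow"
```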