Hi all,

While working on some code that uses the VADC module, I ran into the following problem. The reference manual gives the following formula for calculating conversion timing:

[Attached image: conv_time_formula.JPG — conversion time formula from the reference manual]

Suppose I have DIVA set to 3 in my code, which gives an analog internal clock of f_ADCI = f_ADC / (DIVA + 1) = 120 MHz / 4 = 30 MHz.

Now suppose we run 12-bit conversions with STC set to 0. Doing the math gives an expected conversion time of approximately 483 ns. In reality, however, the conversion time is close to 675 ns. To measure it, I configured a GPIO pin to be set at the moment the trigger is sent to the ADC to start a conversion and cleared when the conversion finishes.
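For reference, here is the arithmetic behind my 483 ns figure as a quick sketch. I am assuming the manual's formula reduces to (14 + STC) cycles of the analog internal clock plus 2 cycles of the module clock for a 12-bit conversion; that reading of the formula image is mine, so please correct it if I have it wrong.

```python
# Back-of-the-envelope check of the expected VADC conversion time.
# Assumption (my reading of the manual's formula for 12-bit mode):
#   t_conv = (14 + STC) / f_ADCI + 2 / f_ADC
f_ADC = 120e6                  # module clock, Hz
DIVA = 3
f_ADCI = f_ADC / (DIVA + 1)    # analog internal clock: 30 MHz
STC = 0                        # extra sample-time cycles

t_expected = (14 + STC) / f_ADCI + 2 / f_ADC
t_measured = 675e-9            # measured via the GPIO toggle

print(f"expected:  {t_expected * 1e9:.0f} ns")                      # ~483 ns
print(f"gap:       {(t_measured - t_expected) * 1e9:.0f} ns")       # ~192 ns
```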

Another thing to note: the difference between the expected and measured time is consistent with respect to the STC value for all conversions with the same number of channels, but it grows as I add channels to the queue source. For one channel the discrepancy is about 200 ns, whereas for two channels it increases to 440-470 ns.

Could someone please suggest what could be the cause of such behaviour? Thank you for taking the time to read my question.

Best regards,