VADC Sample and Conversion time longer than defined?


User13818
Level 1
Hello,

I am trying to use the ADC of the XMC4800 as fast as possible.
So I took the DAVE APP "ADC_MEASUREMENT" and set the sample time to 55.6 nsec, which results in a total conversion time of 458.333 nsec.
I enabled "Start conversion after initialization" so that I get a first conversion, and I set up an "End of measurement" interrupt in which I call "ADC_MEASUREMENT_StartConversion" at the end so that the next conversion starts.

What I now see on my oscilloscope is a conversion time of around 2 microsec instead of 458 nsec:

void ADC_INT()
{
    DEBUG_ON;   // set Digital IO

    DEBUG_OFF;  // reset Digital IO
    ADC_MEASUREMENT_StartConversion(&ADC_MEASUREMENT_0);
}


The interrupt routine itself takes about 1 microsec (measured on the oscilloscope via the digital IO).
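(DEBUG_ON / DEBUG_OFF are just fast debug-pin macros. The exact pin and implementation below are only an example based on the XMCLib GPIO calls, so the toggling itself should cost only a few CPU cycles.)

/* Example only: debug-pin macros as single-register XMCLib writes.
   P1.0 is a placeholder pin, not necessarily the one actually used. */
#include "xmc_gpio.h"

#define DEBUG_PIN_PORT    XMC_GPIO_PORT1
#define DEBUG_PIN_NUMBER  0U

#define DEBUG_ON   XMC_GPIO_SetOutputHigh(DEBUG_PIN_PORT, DEBUG_PIN_NUMBER)
#define DEBUG_OFF  XMC_GPIO_SetOutputLow(DEBUG_PIN_PORT, DEBUG_PIN_NUMBER)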

Does anyone know why it takes so long? Did I miss some configuration (that is my assumption)?
Should I use other APPs?
Should I use a DMA transfer so that I can save the time spent entering the interrupt routine? Nevertheless, I somehow have to read and process the ADC data. Actually I wanted to use "ADC_MEASUREMENT_GetResult()" in the interrupt to read the values, but that consumes even more interrupt-handling and conversion time...
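To illustrate what I mean, the interrupt would then look roughly like this (just a sketch; the channel handle name ADC_MEASUREMENT_Channel_A is only a placeholder for whatever handle the APP actually generates):

/* Sketch only: read the result and re-trigger the next conversion.
   ADC_MEASUREMENT_Channel_A is a placeholder for the generated channel handle. */
volatile uint16_t adc_value;

void ADC_INT(void)
{
    DEBUG_ON;                                                 // set Digital IO
    adc_value = ADC_MEASUREMENT_GetResult(&ADC_MEASUREMENT_Channel_A);
    DEBUG_OFF;                                                // reset Digital IO

    ADC_MEASUREMENT_StartConversion(&ADC_MEASUREMENT_0);     // start the next conversion
}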

Hopefully there is someone who had similar problems.

Appreciate your help!
User13818
Level 1
No ideas at all?

We are planning to replace some old analog parts with the XMC4800, but for that I need a solution to this problem so that I fully understand the behaviour.

Hopefully anyone can help me, thanks!