Dec 04, 2020
01:11 AM
Hello,
I would like to build an application that outputs DSD results through the internal DAC.
However, the DAC input stages only accept 12-bit values, while the DSD produces 16-bit results, so the 4 MSBs would be cut off.
Is there any way around this that does not involve manually bit-shifting each result?
The DAC has SCALE and MULDIV inputs, but these also only take 12-bit values ...
Any ideas?
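For illustration, the per-sample conversion I am hoping to avoid would look something like this: a minimal sketch in C, assuming unsigned 16-bit DSD output (the helper name `to_dac12` is hypothetical, not from any vendor API):

```c
#include <stdint.h>

/* Reduce a 16-bit result to the DAC's 12-bit input range by
 * shifting out the 4 LSBs, so the 4 MSBs are preserved instead
 * of being truncated on write. */
static inline uint16_t to_dac12(uint16_t sample16)
{
    return sample16 >> 4;  /* 0x0000..0xFFFF -> 0x0000..0x0FFF */
}
```

Doing this shift for every sample in software is exactly the per-result overhead I would like the hardware to handle for me.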
Regards,
Funky Luke
0 Replies