FatFs returning error when writing large files.

User13086
Level 1
Hi, I am working on a bootloader (XMC4700/DAVE432) that flashes the internal flash from an SD card.

With small .hex images everything works fine, but once the image gets bigger (>600 kB),
the FatFs function f_write() returns error code 1 (= FR_DISK_ERR).

It seems that the "FATFS" or "SDMMC_BLOCK" APP cannot handle large files.

Do you have any idea what might be causing this problem? Do you know of workarounds?

best regards
Hans
9 Replies
User13086
With a "commercial" micro SD-Cards (tested with SanDisk Ultra 16 GB) there are no problems,
also large files can be read and written.


But with "industrial" micro SD-Cards (tested with Cactus 1 GB) the problem is not solved.

Best regards
Hans
User13086
I have tested the two SD cards with a program that writes
40,000 × 100 bytes (4 MB) to a file.


With the 16 GB SD card (type=SDMMC_BLOCK_CARD_TYPE_HIGH_CAPACITY) there are no problems.

But with the 1 GB SD card (type=SDMMC_BLOCK_CARD_TYPE_STANDARD_CAPACITY_V2) the
error SDMMC_BLOCK_MODE_STATUS_SECTOR_OUT_OF_BOUND occurs.


It seems that the "SDMMC_BLOCK" APP has problems with low-capacity SD cards.

Is it possible to fix this issue? (Industrial-grade 16 GB SD cards are very expensive.)




Below is the output of the test program for both SD cards.



-----------------------------------------------------------------------------------------------------
Test with 1GB SD-Card:


----- SD_CARD_TEST start ! -----
check_SD_CARD()
SDMMC_BLOCK_CARD_TYPE_STANDARD_CAPACITY_V2
Ok mount SD-card
open ok
SDMMC_BLOCK_SD_lCheckSectorBound()
SDMMC_BLOCK_MODE_STATUS_SECTOR_OUT_OF_BOUND, sector_num=3015, local_sector_count=2999
i=5
Error write data(100 byte)
ErrCode=117 is_size=12
close
SDMMC_BLOCK_SD_lCheckSectorBound()
SDMMC_BLOCK_MODE_STATUS_SECTOR_OUT_OF_BOUND, sector_num=3015, local_sector_count=2999
--- unmount ---


----- SD_CARD_TEST end ! -----


-----------------------------------------------------------------------------------------

Test with 16GB SD-Card:

----- SD_CARD_TEST start ! -----
check_SD_CARD()
SDMMC_BLOCK_CARD_TYPE_HIGH_CAPACITY
Ok mount SD-card
open ok
write file successful finished
close
--- unmount ---


----- SD_CARD_TEST end ! -----

Best regards,
Hans
Not applicable
Hi Hans,

This *might* fall into the category of ruling something out, but your second post made me think it could be worth a pre-test of your various SD cards, 'commercial' and 'industrial', using the PC-based 'h2testw' tool, available for example at https://www.portablefreeware.com/?id=1519.

I know from the web and from experience that some cheaper SD cards have less capacity than they report and are sold as, and this includes SD cards badged as being from reputable manufacturers that are in fact clones. This is not apparent until an attempt is made to use the card in excess of its actual capacity: for example, when writing a large number of files, some go AWOL; with a single large file, it reads back corrupt, or not at all.

The tool does take a while to run, but it writes 1 MB files with various values to all reported available locations and, when done, reads them back and checks that the values were as written. It's non-destructive unless the card has issues, in which case data can be lost, so back up first. You'll of course need an SD slot on your PC or e.g. a USB-to-SD adaptor.

Best regards,

David King
User13086
Hi David,

Thanks for your tip.
I have tested my 1 GB SD cards with 'h2testw'.

Everything was OK, so I think the quality of the SD cards is not the problem.

Best regards,

Hans
User13086
In the file "sdmmc_block_private_sd.c" the function SDMMC_BLOCK_SD_GetSectorCount()
computes the wrong sectorCount for standard SD (and MMC) cards if csd_v1.read_blk_len is zero.

For all the 1 GB SD cards I have tested, the SDMMC functions get a csd_v1.read_blk_len value of zero.
(This may be a separate problem.)

mult was computed as:

mult = (uint32_t)(((uint32_t)temp_csd_v1.dev_size_mult + (uint32_t)temp_csd_v1.read_blk_len) -
(uint32_t)7U);

I changed the computation to the formula given in the original comment (read_blk_len is not needed):

/* Left shift evaluates 1 * 2 ^ (TmpMmcCsd.DeviceSizeMult + 2) */
mult = (uint32_t)temp_csd_v1.dev_size_mult + (uint32_t)2UL;

With this change everything works fine and I can write large files.


Best regards,

Hans
jferreira
Employee
Hi,

Thanks for sharing the fix. The APP will be updated in the next release, expected by the end of the month.

Regards,
Jesus
Not applicable
Could you please explain the technical reason why mult is calculated this way? What and where is the reference for this calculation?
User13086
Level 1
Level 1
shenj wrote:
Could you please explain the technical reason why mult is calculated this way? What and where is the reference for this calculation?


Hi shenj,

mult is computed in the original code as:

mult = dev_size_mult + read_blk_len - 7

The main problem is that "read_blk_len" is read back as 0!

If read_blk_len is not available (= 0), a good default value is 9 (2^9 = 512 bytes).
(Standard SD cards up to 1 GB use a block length of 512 bytes.)

This solution is not perfect, but better than the old one (which used read_blk_len == 0).
Some information I found here: http://www.hjreggel.net/cardspeed/index.html#special-sd.html

so we have:

mult = dev_size_mult + 9 - 7
mult = dev_size_mult + 2

Sector_Count = (Device_Size + 1) << mult


Best regards
Hans
Not applicable
Hans, many thanks for the wonderful explanation.