Hi, can anyone please help me solve a FatFs read problem?
I am using a Cortex-M3 Luminary LM3S6965 controller. My application writes 100-150 KB of data (1024 bytes at a time, in a for loop) to a single .csv file on a 4 GB micro SD card using FAT16, and this works perfectly.
But when I try to read the data back from the SD card (1024 bytes at a time), it reads up to 65535 bytes and then hangs.
How can I read the complete 100-150 KB of data from the SD card?
Could anybody help? Thanks in advance...
Taking a wild guess at what's happening, it feels a little like the implementation of the absolute sector value is only 16 bits, e.g. 0-65535.
Is the data on the SD card correct? E.g. has the 16-bit value rolled over and restarted writing at the start of the file?
I am aware that certain examples from ST didn't do multi-sector SD card reads/writes well because they used a 16-bit value for the absolute sector counter rather than a 32-bit one.
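To illustrate the kind of bug being guessed at here: if any layer of the disk driver stores the absolute sector number in a 16-bit variable, the value silently wraps past sector 65535 and later reads/writes land back near the start of the card. This is a hypothetical sketch (the function names are made up, and 512-byte sectors are assumed), not the poster's actual driver:

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical buggy driver: keeps the absolute sector in 16 bits,
 * so the value wraps (truncates) for anything past sector 65535. */
static uint16_t sector16(uint32_t byte_offset)
{
    return (uint16_t)(byte_offset / 512);  /* silently wraps above 65535 */
}

/* Correct driver: 32-bit sector number covers the whole card. */
static uint32_t sector32(uint32_t byte_offset)
{
    return byte_offset / 512;
}
```

For sector 70000, the 16-bit version wraps to 70000 mod 65536 = 4464, so the access lands ~32 MB before where it should.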
I've never worked with a Luminary device, so I haven't seen the code.
Oops, didn't read closely enough - you're reading, not writing.
I use FatFs and can read 300 KB plus with no issues.
Check the code that works out the absolute sector to read from on the SD card, and make sure it isn't adding a 16-bit offset to a 32-bit value.
I would check/instrument the DISKIO.C abstraction code: make sure there aren't any issues there with memory regions, DMA transfer limits, etc. Also evaluate whether larger transfers passed down need to be decomposed further, and pin down where the issue is coming from.
My guess would be that it's not FatFs, so focus on validating your SD card code, perhaps independently of FatFs.
ST used 32-bit BYTE OFFSET addressing, which has obvious limits when dealing with 4GB+ BLOCK addressed devices.
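The arithmetic behind that limit, as a small worked example (assuming the usual 512-byte sectors): a 32-bit byte offset tops out at 2^32 bytes = 4 GiB, so a 4 GB card is right at (or past) the edge of the address space, while a 32-bit block (LBA) address reaches 2^32 sectors * 512 bytes = 2 TiB.

```c
#include <assert.h>
#include <stdint.h>

/* 32-bit BYTE addressing: highest reachable capacity is 2^32 bytes (4 GiB). */
static uint64_t max_bytes_byte_addressed(void)
{
    return (uint64_t)UINT32_MAX + 1;
}

/* 32-bit BLOCK (LBA) addressing with 512-byte sectors:
 * 2^32 sectors * 512 bytes = 2 TiB. */
static uint64_t max_bytes_block_addressed(void)
{
    return ((uint64_t)UINT32_MAX + 1) * 512;
}
```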
I have tried reading the SD card in different ways, but it's not working... After reading 64 KB of data, the "f_read" function causes the processor to hang.
I am using "FatFs - FAT file system module R0.04b (C)ChaN, 2007".
I have already saved a ".csv" file to the SD card, and I am just trying to read this .csv file back. My sample code is given below:
f_mount(0, &l_stFatFs);
if(f_open(&lstRead_File, (const uint8_t *)&gbRD_SDFile_Name[0], FA_READ) == FR_OK)
{
    ldLen = f_size(&lstRead_File);
    f_close(&lstRead_File);
    ldLen /= 1023;
}
for(gdIndex = 0; gdIndex < ldLen; gdIndex++)
{
    stSDC.FileName = (uint8_t *)&gbRD_SDFile_Name[0];
    stSDC.Buffer   = (char *)&gbFTP_Buffer[0];
    stSDC.Length   = 1023;
    stSDC.Offset   = gdData_Index;
    __disable_irq();
    if(!(MT_SDC_Read(&stSDC)))
        gdData_Index += 1023;
    __enable_irq();
}
This is not working. So could anyone give me sample code to read from and write to an SD card using FatFs...
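For comparison, a minimal sketch of the usual FatFs read loop, letting FatFs maintain the file position instead of tracking a byte offset by hand. This is untested against the poster's board; it follows the documented f_open/f_read/f_close API (function names per the FatFs documentation; the single-drive f_mount style shown matches older releases such as R0.04b, and exact parameter types may differ between versions). process_block() is a hypothetical consumer of the data.

```c
#include "ff.h"   /* FatFs */

static FATFS fs;
static BYTE  buf[1024];   /* power-of-two chunk size */

FRESULT read_whole_file(const char *path)
{
    FIL     fil;
    UINT    br;   /* bytes actually read by f_read */
    FRESULT res;

    f_mount(0, &fs);
    res = f_open(&fil, path, FA_READ);
    if (res != FR_OK)
        return res;

    for (;;) {
        res = f_read(&fil, buf, sizeof buf, &br);
        if (res != FR_OK || br == 0)      /* error, or end of file */
            break;
        process_block(buf, br);           /* hypothetical consumer */
    }

    f_close(&fil);
    return res;
}
```

Note that f_read advances the file pointer itself, so there is no manually maintained 16- or 32-bit offset to overflow, the file is opened once rather than per chunk, and interrupts are never disabled around the transfer.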
You haven't shown what data type your "gdData_Index" has.
And this is probably the first time I have seen anyone walking through a large file in 1023-byte steps. Why not use a block size of 2^n - 256, 512, 1024, etc.? It's quite common for the driver layer to have buffer schemes that work on 2^n-sized blocks, and memory cards, hard drives, etc. normally have sectors that are 2^n bytes in size.
Maybe you hit an issue somewhere in the driver layer when a read spans the 64 kB boundary. If you use a 1024-byte stride, none of your reads will span the 64 kB boundary - one read will end just before it and the next read will start exactly at offset 65536 = 2^16.
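The boundary arithmetic can be checked directly. With a 1023-byte stride, chunk 64 starts at 64 * 1023 = 65472 and ends at 66495, straddling offset 65536; with a 1024-byte stride, no chunk ever straddles it. A small sketch (the helper function is made up for illustration):

```c
#include <assert.h>
#include <stdint.h>

/* Return the index of the first chunk whose [start, start+stride)
 * range straddles the given boundary, or -1 if none does. */
static int32_t first_chunk_spanning(uint32_t stride, uint32_t boundary,
                                    uint32_t nchunks)
{
    for (uint32_t i = 0; i < nchunks; i++) {
        uint32_t start = i * stride;
        if (start < boundary && start + stride > boundary)
            return (int32_t)i;
    }
    return -1;   /* a 2^n stride never straddles a 2^n boundary */
}
```

This matches the failure point the original poster reports: with 1023-byte reads, the first read that crosses the 64 kB boundary is number 64, right around 65 KB into the file.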
Well, where's the code doing the actual reading? And why do you keep opening/closing the file? Why are interrupts disabled? Are you polling in your SD read code?
My problem is still not resolved: I cannot read from the SD card when the file size is more than 64 KB.
I have tried one more experiment: if my file size is >64 KB and I read only the first 64 KB of data continuously, that code works fine.
So what is the real problem? Please help... Can anyone give me some sample code?