Hi, can anyone please help me solve a FatFs read problem?
I am using a Cortex-M3 Luminary LM3S6965 controller. My application writes 100-150 KB of data (1024 bytes at a time, in a for loop) to a single .csv file on a 4 GB micro SD card using FAT16, and that works perfectly.
But when I try to read the data back from the SD card (1024 bytes at a time), it reads up to 65535 bytes and then hangs.
How can I read the complete 100-150 KB of data from the SD card?
Could anybody help? Thanks in advance...
Taking a wild guess at what's happening: it feels like the implementation of the absolute sector value is only 16 bits, e.g. 0-65535.
Is the data on the SD card actually correct? E.g. has the 16-bit value rolled over and restarted writing at the start of the file?
I am aware that in certain examples given by ST, the SD card part didn't do multi-sector reads/writes well, because they used a 16-bit value for the absolute sector counter rather than a 32-bit one.
I've never worked with a Luminary device, so I haven't seen the code.
Oops, didn't read closely enough - you're reading, not writing.
I use FatFs and can read 300 KB plus with no issues.
Check the code that works out the absolute sector to read from on the SD card, and make sure it isn't adding a 16-bit offset to a 32-bit value.
I would check/instrument the DISKIO.C abstraction code; make sure there aren't any issues there with memory regions, DMA transfer limits, etc. Also evaluate whether larger transfers passed down need to be decomposed further, or pin down where the issue is actually coming from.
My guess would be it's not FatFs, so focus on validating your SD card code, perhaps outside of FatFs.
ST used 32-bit BYTE OFFSET addressing, which has obvious limits when dealing with 4GB+ BLOCK addressed devices.
I have tried reading the SD card in different ways, but it's not working... After reading 64 KB of data, the "f_read" function causes the processor to hang.
I am using "FatFs - FAT file system module R0.04b (C)ChaN, 2007".
I have already saved the .csv file onto the SD card, and I am just trying to read it back. My sample code is given below:
f_mount(0, &l_stFatFs);

if(f_open(&lstRead_File, (const uint8_t *)&gbRD_SDFile_Name[0], FA_READ) == FR_OK)
{
    ldLen = f_size(&lstRead_File);
    f_close(&lstRead_File);
    ldLen /= 1023;
}

for(gdIndex = 0; gdIndex < ldLen; gdIndex++)
{
    stSDC.FileName = (uint8_t *)&gbRD_SDFile_Name[0];
    stSDC.Buffer   = (char *)&gbFTP_Buffer[0];
    stSDC.Length   = 1023;
    stSDC.Offset   = gdData_Index;

    __disable_irq();
    if(!(MT_SDC_Read(&stSDC)))
        gdData_Index += 1023;
    __enable_irq();
}
This is not working, so could anyone give me sample code to read from and write to an SD card using FatFs?
You haven't shown what data type your "gdData_Index" has.
And this is probably the first time I have seen anyone walking through a large file in 1023-byte steps. Why not use a block size of 2^n - 256, 512, 1024, etc.? It's quite common that the driver layer has buffer schemes that work on 2^n-sized blocks. And memory cards, hard drives, etc. normally have sectors that are 2^n bytes large.
Maybe you get an issue somewhere in the driver layer when a read spans the 64 KB boundary. If you use 1024-byte strides, none of your reads will span that boundary - one read will fit exactly before it, and the next read will start at offset 65536 = 2^16.
Well, where's the code doing the actual reading? And why do you keep opening/closing the file? Interrupts disabled? Are you polling in your SD read code?
My application logs analog/digital I/O data to local as well as remote storage for data monitoring. I write to the SD card every 100 ms, and every hour I read the data back from the SD card and upload it to a remote server.
When reading the SD card I have to find the file size, so I open the file, get the size with "f_size", and then close the file.
* The data type of "gdData_Index" is uint32_t.
* If I try to read data in 2^n-byte chunks, i.e. 1024 bytes, then the "f_read" function returns 0 bytes of data.
* As the SD card works over SPI, I have to disable all interrupts before accessing SPI to avoid a system crash; once the SPI work is done, I enable all interrupts again.
My problem of reading a file larger than 64 KB from the SD card is still not resolved.
I have tried one more experiment: if the file size is >64 KB and I read only the first 64 KB of data continuously, that code works fine.
So what is the real problem? Please help. Could you give me some sample code?
"So what is the real problem?"
That you don't seem to want to do any debugging yourself.
Haven't you taken a closer look at the low-level SPI code to see if it does what it should?
I'm not too impressed with an implementation that requires interrupts to be disabled while you read 1023 bytes - that's a significant amount of time, which means serial ports, timers, etc. will not be properly serviced during it. Maybe you don't use other time-critical peripheral hardware, but real-world applications normally do. Correct code should only need to turn off interrupts around a few processor instructions, when something needs to be performed atomically.
By the way - if 1024 is too large a block size, then I would have tried 512 instead of selecting 1023. It's very often faster to use a block size of 2^n bytes when there is a lower layer that may contain buffering, or when there is hardware that is block-based.