This discussion has been locked.
You can no longer post new replies to this discussion. If you have a question, you can start a new discussion.

Data from serial line (or virtual COM port) in simulation.

Hi,

I am using a serial port in my project.
I want to run my application in the uVision IDE simulator on a Windows PC.
The input to the application should come from the UART.
I know that I can
1. Create simulation scripts.
2. Use the following:

MODE COM2 9600, 0, 8, 1
ASSIGN COM2 <S0IN >S0OUT
S0TIME = 1


to get the data from COM2.

COM2 is a virtual COM port paired with another virtual COM port (COM5).
An application (Python, in fact) supplies data to COM5.

This "almost" works: the data goes through. Unfortunately, it is not reliable.
Some bytes are lost.

I have checked thoroughly that the data is correct on COM5.

I am simulating an LPC2378. The UART is handled in polling mode.
The LPC application does not do anything else at this stage.

I know that an interrupt-based application would be better (that is next on my to-do list),
but this should also work.

Experimenting in the simulator, I discovered that it only works well if there are programmed delays between characters.
I wonder if this points to a solution: intercept the data from the COM port in the simulation and supply it character by character with programmed delays. Is it possible to read the data from the serial port in a script function?
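On the sender side, one way to get the delays described above is to pace the Python application's writes to COM5 so the simulator sees a gap between characters. This is only a sketch: the port name, baud rate, and delay value are assumptions to be tuned, and pyserial is assumed for the COM port, while the pacing helper itself is plain Python.

```python
import time

def pace_bytes(data, write_fn, delay_s=0.01):
    """Write data one byte at a time, sleeping between bytes.

    write_fn is any callable taking a bytes object, e.g. the write
    method of a pyserial port. Returns the number of bytes written.
    """
    count = 0
    for b in data:
        write_fn(bytes([b]))   # one character per write
        count += 1
        time.sleep(delay_s)    # programmed delay between characters
    return count

# Usage with pyserial (port name 'COM5' and 9600 baud are assumptions):
#   import serial
#   ser = serial.Serial('COM5', 9600)
#   pace_bytes(b'hello uart', ser.write, delay_s=0.02)
```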

Regards
Hubert

Parents
  • Hallo!

    I have written several applications that run on the f/w side and exchange data through their UARTs with s/w monitors on the PC side. The PC uses a built-in UART (RS-232) chip. Only the TX/RX lines are in use (no additional control lines at all); that is specific to my system :-(
    I am using Windows XP to run the s/w counterparts. Yes, it is easy to lose bytes!

    One solution that works pretty well for me is to send information in portions that are a multiple of 16 bytes (the typical FIFO size of a PC UART), AND to place the delays not between individual bytes but between these 16-byte packets. Then even a Windows-based application catches the packets satisfactorily at 115200 bit/s.

    Also, I found HyperTerminal to be pretty sluggish, since it can lose bytes even when you send them at 9600.

    Interrupts by themselves will not solve the problem, since they are just a convenient mechanism for catching the packets. Only h/w flow control can help drastically. However, if you can accept lower performance, you may try s/w flow control per byte or per group of bytes (like an acknowledge) to give feedback to the sender. This leads to significant overhead.

    Regards,
    Nikolay
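The packetizing scheme described above can be sketched as follows. The 16-byte packet size matches the classic 16550 UART FIFO depth; the delay value is an assumption to be tuned for the actual link.

```python
import time

FIFO_SIZE = 16  # typical depth of a PC UART (16550) receive FIFO

def send_in_packets(data, write_fn, packet_delay_s=0.005):
    """Send data in FIFO_SIZE-byte packets with a delay between packets.

    The last packet may be shorter than FIFO_SIZE. write_fn takes a
    bytes object (e.g. the write method of a pyserial port).
    Returns the list of packets sent.
    """
    packets = [data[i:i + FIFO_SIZE] for i in range(0, len(data), FIFO_SIZE)]
    for packet in packets:
        write_fn(packet)            # one burst of up to 16 bytes
        time.sleep(packet_delay_s)  # delay between packets, not bytes
    return packets
```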

Children
  • Nikolay,

    I do not experience this problem using HyperTerminal.
    At this stage I am fairly sure the issue lies in the simulation thread that runs in the background of the uVision simulation. I was able to reproduce the case using only a simulation function (signal) to supply data to the UART; it behaves the same way: if I do not introduce delays in the simulation, only the last byte (out of 16 sent) is received. There is a thread in this forum:
    µVISION DEBUGGER: AUTOMATED SERIAL INPUT SCRIPT
    which shows that there must be a delay between bytes.
    That delay is missing when the mapping is done with the ASSIGN command (IMHO).
    I agree with you that an interrupt-driven approach will not help in this case.
    I think I need to change my approach and create a DLL that receives the data (e.g. over a TCP/IP socket) and supplies it to the UART at the specified rate, i.e. no faster than that rate.

    Regards
    Hubert
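The rate-limited forwarder idea above (receive over a TCP/IP socket, then feed the bytes onward no faster than a given rate) can be sketched in Python. This is only a sketch of the pacing logic, not the DLL itself: the byte rate is an assumption, and `feed_fn` stands in for whatever call would push one byte into the simulated UART.

```python
import socket
import time

def paced_receiver(conn, feed_fn, byte_rate=960):
    """Read bytes from a connected socket and hand them on one at a
    time, no faster than byte_rate bytes per second.

    960 B/s roughly corresponds to 9600 baud with 8N1 framing.
    feed_fn takes a single-byte bytes object, e.g. a call that writes
    one byte to the simulated UART.
    """
    interval = 1.0 / byte_rate
    while True:
        chunk = conn.recv(4096)
        if not chunk:
            break                 # sender closed the connection
        for b in chunk:
            feed_fn(bytes([b]))   # deliver exactly one byte
            time.sleep(interval)  # enforce the maximum byte rate
```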