
realtime timer simulation

What's the trick to make the C51 simulator increment timers in realtime?

I've tried setting the XTAL value appropriately on the Target panel.

(I just need the simulated blinking LED to have the same period as on the real target.)

Peter

Parents
  • I originally chose the blink LED example

    That wouldn't have been so bad had you not said that you "just" wanted to get simulated Blinky to blink in 1-second real-time intervals. Using the word "just" implied that the LED was the only thing you're interested in, which is nonsensical. Simulating Blinky any slower than the PC can manage is generally a waste of time.

    I assume Windows timers implement swatch() in the debug process

    No, they don't. What part of "the simulator doesn't even try to establish any relation between simulated time and real-world time" didn't you understand? swatch() operates in simulated time.

Children
  • Come on :-) there HAS to be a way to do this.

    We are not simulating nuclear explosions here in clock time: we are blinking an LED, and running a timeout on a 30-year-old communications protocol with a serial polling routine, at 9600 bps, on a $2, 20-year-old commodity part running at 8 MHz.

    There has to be some trick. As I suspected, swatch() does not do it (given it's FAR too easy to find on the Keil website, and could so easily have been referenced by the discussion group!)

    Perhaps we can choose a device with 2 serial ports, knowing the simulator's programming allows us to link the 8051 serial port to a real COM port acting in realtime/clocktime/baudtime.

    i.e. at 9600 baud, the binding of PC COM2 to the second virtual serial port can introduce a timebase with a 102 us base period into the simulator, by interrupting the chip for each char delivered to the Windows COM driver. So each serial interrupt is essentially 11 (bits) * 102 us (per bit).

    Rather than have swatch() raise the IRQ on ext0, can we let a serial-port mapping onto a standard baud rate raise the serial interrupt instead?

    Then I just have a tiny Windows program pump out chars at a regular rate, based on Win32 timer support, acting as a simple clock source.

    The 24 Hz signal may be badly jittered by the Windows scheduler, etc. But it's only 24 Hz I'm looking for, on a 2 GHz Pentium core with a 400 MHz front-side bus! A 24/12 division even lets me blink an LED in the extension module (the one simulating my front panel), naturally.

    Keep the tone fun. It's programmable electronics, not religion!
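    A quick back-of-the-envelope check of the proposed timebase, assuming 9600 baud, an 11-bit character frame, and the 24 Hz tick rate mentioned above (note that 1/9600 s works out closer to 104 us than 102 us):

    ```c
    /* Timing arithmetic behind the char-pump idea: at 9600 baud, how long
     * is one bit, one 11-bit character frame, and one 24 Hz tick period? */
    #include <stdio.h>

    int main(void)
    {
        double bit_time_us    = 1e6 / 9600.0;                /* one bit on the wire */
        double char_time_ms   = 11.0 * bit_time_us / 1000.0; /* 11-bit frame        */
        double tick_period_ms = 1000.0 / 24.0;               /* desired 24 Hz tick  */

        printf("bit time    : %.1f us\n", bit_time_us);      /* ~104.2 us  */
        printf("char time   : %.2f ms\n", char_time_ms);     /* ~1.15 ms   */
        printf("tick period : %.2f ms\n", tick_period_ms);   /* ~41.67 ms  */
        return 0;
    }
    ```

    So the pump would only need to emit one character roughly every 42 ms to produce a 24 Hz tick, well under 1% utilization of a 9600 baud line.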

  • there HAS to be a way to do this.

    Says who?

    allows us to link the 8051 serial port to a real COM port acting in realtime/clocktime/baudtime.

    Whether this link is anything like "real time" remains to be seen. The documentation of uV2 is scarily thin in this area; if there were some reliable timing involved, I would guess Keil would have proudly boasted about it in the documentation.

    Actually, trying to do pretty much anything in hard real time under any modern Windows is a lost cause. It's simply not what these platforms are meant for. 24 Hz is a harder requirement than keeping up with the usual MS-DOS timer tick (18.2 Hz), and Windows is already somewhat flaky in its support of that.

    at 9600, the binding of PC COM2 to the second virtual serial port can introduce a timebase with 102uS base period into the simulator

    No, it can't. 104 us is the bit time of a 9600 baud line. That's only relevant inside the PC's UART. The time between events leaving that UART is the byte transmission time, which is about 1 ms. And that's without considering the fact that they all have FIFOs these days, which would cut the event rate by another order of magnitude.

    I told you several posts upstream: you'll almost certainly have to consider writing a problem-specific debugger/simulator extension DLL to get this done, or forget about doing this in the simulator. The simulator is unfit by design for this kind of work.
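    To make the byte-time point concrete, here is the same arithmetic, assuming 9600 baud, a 10-bit frame (start + 8 data + stop), and a typical 16-byte UART FIFO:

    ```c
    /* Why the COM-port link cannot deliver a ~104 us timebase: the host
     * only sees events at character granularity, not bit granularity, and
     * a 16-byte UART FIFO coarsens the event rate by another factor of 16. */
    #include <stdio.h>

    int main(void)
    {
        double bit_time_us  = 1e6 / 9600.0;                /* ~104 us: UART-internal only */
        double char_time_ms = 10.0 * bit_time_us / 1000.0; /* ~1.04 ms per byte           */
        double fifo_time_ms = 16.0 * char_time_ms;         /* ~16.7 ms per FIFO burst     */

        printf("bit time   : %.0f us\n", bit_time_us);
        printf("char time  : %.2f ms\n", char_time_ms);
        printf("FIFO burst : %.1f ms\n", fifo_time_ms);
        return 0;
    }
    ```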

  • One problem here is that there are two distinct time domains:

    1. Simulation Time
    2. Real Time
    And...the two time domains are not related in a simple way. For instance, Simulation Time is not proportional to Real Time. In some situations, for some devices, on some machines, simulation time is faster than real time. But, in others, simulation time is slower than real time. In fact, the more windows you open in uVision, the slower simulation becomes.

    In these situations, it is not possible to synchronize simulation speed with real time (unless we could slow down real time--if we could do that we would probably stop working on the compiler and development tools).

    We have an option for "run no faster than real time" on our list of features to add. However, few customers want us to slow the tools down so it is a low priority at this time.

    Jon
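    For what it's worth, a "run no faster than real time" throttle of the kind Jon describes can be sketched in ordinary C: after each batch of simulated cycles, compare simulated elapsed time with wall-clock time and sleep off any surplus. This is purely illustrative (POSIX timing calls, invented names, and a hypothetical 8 MHz, 12-clocks-per-cycle 8051 are assumed); it is not uVision code:

    ```c
    /* Illustrative "no faster than real time" throttle. All names here
     * (wall_seconds, sim_elapsed_s, the batch size) are invented for the
     * sketch; they are not part of any Keil API. */
    #define _POSIX_C_SOURCE 199309L
    #include <stdio.h>
    #include <time.h>

    static double wall_seconds(void)
    {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return ts.tv_sec + ts.tv_nsec / 1e9;
    }

    int main(void)
    {
        const double cycle_s = 12.0 / 8e6;  /* one machine cycle: 12 clocks at 8 MHz */
        double sim_elapsed_s = 0.0;
        double start = wall_seconds();

        for (int batch = 0; batch < 10; batch++) {
            /* ... execute, say, 8000 simulated machine cycles here ... */
            sim_elapsed_s += 8000 * cycle_s;   /* 12 ms of simulated time */

            double ahead = sim_elapsed_s - (wall_seconds() - start);
            if (ahead > 0) {                   /* simulation outran real time: wait */
                struct timespec delay = { 0, (long)(ahead * 1e9) };
                nanosleep(&delay, NULL);
            }
        }
        printf("simulated %.3f s in %.3f s of wall time\n",
               sim_elapsed_s, wall_seconds() - start);
        return 0;
    }
    ```

    When the host is slower than real time (Jon's other case), `ahead` never goes positive and the loop simply free-runs, which is exactly why such an option can only cap the speed, never guarantee it.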

  • I think it comes down to "product concept", not academics.

    If you think of the simulator as a showcase for the microcode emulation in the core, then one simulates the "performance" of the core and shows that one can nicely emulate the 8051 instruction set... Presumably, professionals in this space compete on the accuracy of the core timing and the instruction decoding, which is no doubt full of theoretical issues to war over, for fun.

    If the tool is for software types to build demos, as a rapid development tool ahead of board design/manufacturing, it needs a prototyping emphasis - and concept work for THAT audience.

    The fun part of the simulator was always the extension modules: seeing the I2C device, visually, was fun. Seeing the LED visualization was fun. Getting blinky() to work, naturally, could have been fun... acting as a sanity check.

    I was almost tempted to write my own extension for fun... now I'm not. The product concept is evidently flawed - for my purposes. I have no interest whatsoever in the accuracy of the simulation's core timing. It's irrelevant to making a mockup of a product that will use a commodity uP on commodity boards.

    So there you have the result:

    I cannot show my customer a blinking LED on the simulated front panel (or, rather, it blinks with about a 20 ms period, clock time, on my CPU)

    I cannot send packets through the redirected COM port to the simulator process, and have the IP stack behave normally re protocol exceptions, based on timers.

    This seems a shame, as both features are almost there, and the tool is almost really valuable for rapid application development.

    While I could probably now introduce a 50 ms period external signal into the simulator as a "trick", I don't think I'll bother. I'd be fighting the product concept, and that's rarely a good idea.

    Peter.

  • May I add a comment just to establish some perspective? I remember writing in asm51 (no C compiler yet), using my own DOS batch files as my make utility, and debugging the code with port pins and an oscilloscope. The first time I used PK51 I was so happy I couldn't believe how much this tool allowed me to do. It was like being a caveman who discovers fire, the wheel, and refrigeration all at once. When a slick new 51 variant comes out, sure, I'd like to have the new features supported in simulation right away, but that never holds me back; the Keil tools are great.

  • I, too, have had a requirement to run the simulator "nearly" in real time, for example when debugging my communication stacks and profiling my code. I did some work on my own add-on DLL to regulate the simulation's run time, but I no longer have a functional DLL. I'm ready to cooperate.
    Roman

  • a bit of dictionary definition makes me wonder

    how can "simulate" be "real"

    I just wonder

    Erik

  • I think the implication/desire is that if simulation speed >= real-time then simulation speed could be constrained or synchronized with real-time.

    Unfortunately, simulation speed may be <,=,> real-time. It's still useful, but it isn't real-time.

    Personally, I want the simulator to run as fast as possible (and, hopefully, much faster than real time).

    Jon