What's the trick to make the C51 simulator increment timers in real time? I've tried setting the XTAL value appropriately on the Target panel. (I just need the simulation's blinking LED to have the same period as on the real target.) Peter
What's the trick to make the C51 simulator increment timers in realtime?
There is none, and there shouldn't be. The simulator has its own simulated time --- synchronizing that to wall-clock time would be impossible on slow machines, and wasteful on fast ones.
(I just need the simulation's blink led freq to have the same period as the real target)
And what good is that supposed to do you?
I'm (politely) amazed at the two responses. If I paraphrase them: the original requirement was nonsensical; use a real board and a JTAG debugger, Peter.

Let me be more precise about my working scenario, which I simplified and recast in the original post to use only a few words. I have (borrowed) an IP stack on an 8051. It has a 24 Hz signal derived from a timer, which acts as a timeout for a polling loop on the serial port. The same signal is a timebase for other exception handlers in the TCP stack.

In the simulator, I have successfully redirected the IP listener's abstract serial port to the PC's COM1. Another instance of the simulator is acting as the IP client, initiating a TCP connection, connected to COM2. A null modem and suitable handshake are emulated between the two COM ports, using some fancy Windows driver software. At some point, later on, a real client board will talk to a simulated server.

Things work fine, except that the 24 Hz timebase is not in real time. So far I cannot test that reality in the simulator, which seemed a really obvious thing to want to do!

Now to my incredulity. If the simulator handles core timing so that the peripherals are accurate in real time (e.g. the baud rates, the smartcard barrel shifter in ETU, etc.), why not the core 8051 timers!? If it's an expense-and-waste issue, don't worry! My machine is idle 12 hours a day as it is. I can happily waste a Pentium's GHz of core bandwidth emulating an 8 MHz/12 8051 timer.
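For concreteness, the 24 Hz timebase in the stack is derived roughly as in the sketch below. This is my minimal reconstruction, not the actual stack code: the names tick_24hz and TICK_RELOAD are mine, and I'm assuming an 8 MHz crystal with the standard divide-by-12 machine cycle, in Keil C51 syntax.

    #include <reg51.h>

    /* 8 MHz / 12 = 666667 machine cycles/s; 666667 / 24 Hz = ~27778 cycles
       per tick, so reload = 65536 - 27778 = 0x937E (latency drift ignored). */
    #define TICK_RELOAD 0x937E

    volatile unsigned char tick_24hz;   /* bumped 24 times per second */

    void timer0_isr (void) interrupt 1
    {
        TH0 = TICK_RELOAD >> 8;         /* mode 1 has no auto-reload */
        TL0 = TICK_RELOAD & 0xFF;
        tick_24hz++;                    /* timebase for the serial polling
                                           loop and the TCP timeouts */
    }

    void timebase_init (void)
    {
        TMOD = (TMOD & 0xF0) | 0x01;    /* Timer 0, 16-bit mode 1 */
        TH0  = TICK_RELOAD >> 8;
        TL0  = TICK_RELOAD & 0xFF;
        ET0  = 1;                       /* enable Timer 0 interrupt */
        EA   = 1;                       /* global interrupt enable */
        TR0  = 1;                       /* start the timer */
    }

In the simulator this ticks at a perfect 24 Hz of simulated time; my whole problem is that simulated time has no relation to the clock on my wall.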
I can happily waste a Pentium's GHz of core bandwidth emulating an 8 MHz/12 8051 timer
A couple of comments:
1) it may be possible to real-time simulate an 8 MHz/12 '51 chip, but would you not want the same for a 100 MHz/1 part?
2) the simulator simulates; it does not emulate. If you need to emulate, buy an ICE or switch to "chips with built-in ICE" (SiLabs, some late derivatives from Philips and ST, maybe more).
Erik
Peter,
At least some of those peripherals aren't really handled by Keil's simulator at all. For instance, the PC is responsible for handling the timing on the serial ports, etc.; Keil doesn't have to provide a timer to do this. I'm not familiar with a barrel shifter, but that may be a similar scenario. Also, trying to get good timing resolution within Windows generally means straying into the realm of API calls that work differently (or sometimes not at all) on different machines, so I'm not surprised they don't support it. That being said, the comments about getting an ICE might not be a bad idea. There are some EXTREMELY inexpensive ICEs out there that will do this sort of thing. Of course, if your target board doesn't even exist yet, that could be a bit of a wrinkle. :)
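To make that timing-resolution point concrete, here is a rough Win32 sketch that just measures how long a nominal 1/24-second sleep really takes (timeBeginPeriod, QueryPerformanceCounter, and Sleep are standard Win32/winmm calls; the numbers you get back vary considerably from machine to machine, which is exactly the problem):

    #include <windows.h>
    #include <stdio.h>

    #pragma comment(lib, "winmm.lib")   /* for timeBeginPeriod/timeEndPeriod */

    int main (void)
    {
        LARGE_INTEGER freq, t0, t1;

        timeBeginPeriod (1);                /* request ~1 ms scheduler granularity */
        QueryPerformanceFrequency (&freq);  /* counter ticks per second */

        QueryPerformanceCounter (&t0);
        Sleep (42);                         /* nominally 1/24 s */
        QueryPerformanceCounter (&t1);

        printf ("slept %.3f ms\n",
                1000.0 * (double)(t1.QuadPart - t0.QuadPart)
                       / (double)freq.QuadPart);

        timeEndPeriod (1);
        return 0;
    }

On a stock install the measured figure can overshoot by several milliseconds, and that is the jitter any 24 Hz wall-clock timebase would inherit.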
That being said, the comments about getting an ICE might not be a bad idea. There are some EXTREMELY inexpensive ICEs out there that will do this sort of thing. Of course, if your target board doesn't even exist yet, that could be a bit of a wrinkle.
Not necessarily. My Ceibo ICEs for the XA, the standard '51, and the x2 '51 (but not for the LPC) work beautifully without a "target board".
Erik
Erik,
True enough... I think most ICEs will work in a targetless mode of some sort. Visualizing a blinking LED with them, as Peter had originally asked, however, might be somewhat difficult. :)
the original requirement was nonsensical.
As posted, yes, it was. You didn't mention any connection between the simulator and the real world other than your visual comparison of the simulated LED blink speed and a wall clock. And requiring those to be in sync is nonsense.

use a real board and a JTAG debugger
Nobody had mentioned JTAG up to that point in the thread --- only emulators. In the 8051 world, even though some JTAG-enabled variants now exist, "emulator" usually still refers to the old-fashioned "unwieldy drop-in replacement for the real CPU" kind of device.

If the simulator handles core timing so that the peripherals are accurate in real time [...] why not the core 8051 timers!?
Because it doesn't do the former, there's no reason it should attempt the latter. You're mixing two possible meanings of "realtime" here. The simulator internally is realtime-correct, i.e. everything that happens within the simulated CPU has exactly correct timing behaviour relative to everything else in that category. What the simulator does not even try to do, while you insist it should, is establish any kind of relation between the simulated time (as displayed in the "Regs" pane) and real-world time (as displayed on your wristwatch), or any other time base you might think of.

If you want to lock the simulated times of two instances of uV2 running in parallel, you'll at least have to write your own simulator extension DLLs to do it. I wouldn't bet on it being possible at all. The built-in simulator is a very powerful tool for some jobs --- but it's not a panacea.
Not really; the Ceibos have a connector for each port and a row of LEDs with an identical connector. You plug in a little cable, and a port has LEDs on all pins.
Erik
Nobody had mentioned JTAG up to that point in the thread --- only emulators. In the 8051 world, even though some JTAG-enabled variants now exist, "emulator" usually still refers to the old-fashioned "unwieldy drop-in replacement for the real CPU" kind of device.
While you are 100% correct in this statement, it has been common to refer to the JTAG etc. debuggers as "built-in ICE", since the PC screen is basically the same.
Erik
I originally chose the blink LED example - rather than my actual need for a 24 Hz timebase - as it clearly linked the notion of realtime to clock time, not F.osc clock pulses, or internal core states, or F.cpu. It seemed clearer.

Anyway, I've given up on the (non-obvious!) literal simulation. I think I can just create an #ifdef for the SIMULATOR build and use an external source as the timebase. I assume Windows timers implement swatch() in the debug process, and thus I can introduce clock time into the simulation using swatch(1/24).

    signal void int0_signal (void) {
      while (1) {
        PORT3 |= 0x04;    /* pull INT0 (P3.2) high */
        PORT3 &= ~0x04;   /* pull INT0 (P3.2) low and generate an interrupt */
        PORT3 |= 0x04;    /* pull INT0 (P3.2) high again */
        swatch (0.5);     /* wait 0.5 seconds */
      }
    }
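On the target side, the conditional build I have in mind is roughly the following (a sketch only: SIMULATOR is a macro I would define myself for the simulator build, tick_24hz is my placeholder name, and the edge-triggered INT0 setup matches the signal function above):

    #include <reg51.h>

    volatile unsigned char tick_24hz;

    #ifdef SIMULATOR
    /* Simulator build: the tick arrives from outside, as falling edges
       on INT0 (P3.2) generated by the debugger signal function above. */
    void ext0_isr (void) interrupt 0
    {
        tick_24hz++;
    }

    void timebase_init (void)
    {
        IT0 = 1;    /* INT0 triggers on falling edge */
        EX0 = 1;    /* enable external interrupt 0 */
        EA  = 1;    /* global interrupt enable */
    }
    #else
    /* Hardware build: derive the tick from Timer 0, as in the
       timer sketch earlier in the thread. */
    #endif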
I originally chose the blink LED example
That wouldn't have been so bad had you not said that you "just" wanted to get simulated Blinky to blink in 1-second real-time intervals. Using the word "just" implied that the LED was the only thing you were interested in, which is nonsensical. Simulating Blinky any slower than the PC can manage is generally a waste of time.

I assume Windows timers implement swatch() in the debug process
No, they don't. What part of "the simulator doesn't even try to establish any relation between simulated time and real-world time" didn't you understand? swatch() operates in simulated time.
Come on :-) there HAS to be a way to do this. We are not simulating nuclear explosions here in clock time: we are blinking an LED, and running a timeout on a 30-year-old communications protocol with a serial polling routine, at 9600 bps, on a $2, 20-year-old commodity part running at 8 MHz. There has to be some trick.

As I suspected, swatch() does not do it (given it's FAR too easy to find on the Keil website, and could so easily have been referenced by the discussion group!).

Perhaps we can choose a device with 2 serial ports, knowing the simulator programming allows us to link the 8051 serial port to a real COM port acting in realtime/clocktime/baudtime. I.e. at 9600 baud, the binding of PC COM2 to the second virtual serial port can introduce a timebase with a 102 us base period into the simulator, by interrupting the chip for each char delivered to the Windows COM driver. So each serial interrupt is essentially 11 (bits) * 102 us (per bit).

Rather than have swatch() cause the IRQ on ext0, can we let a serial-port mapping at a standard baud rate raise the serial interrupt instead? Then I just have a tiny Windows program pump out chars at a regular rate, based on Win32 timer support, acting as a simple clock source.

The 24 Hz signal may be badly jittered by the Windows scheduler, etc. But it's only 24 Hz I'm looking for, on a 2 GHz Pentium core with a 400 MHz front-side bus! Dividing the 24 Hz by 12 even lets me blink an LED in the extension module (the one simulating my front panel), naturally.

Keep the tone fun. It's programmable electronics, not religion!
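The clock-source pump would be a trivially small Windows program, something like the sketch below (COM2 and the 42 ms period are illustrative, and I've omitted the SetCommState/DCB port setup for brevity):

    #include <windows.h>
    #include <stdio.h>

    /* Push one byte out COM2 roughly 24 times a second; each character
       arriving at the simulated UART raises a serial interrupt. */
    int main (void)
    {
        HANDLE h = CreateFileA ("COM2", GENERIC_WRITE, 0, NULL,
                                OPEN_EXISTING, 0, NULL);
        DWORD written;
        unsigned char tick = 0x55;  /* payload is irrelevant; only timing matters */

        if (h == INVALID_HANDLE_VALUE) {
            fprintf (stderr, "cannot open COM2\n");
            return 1;
        }
        for (;;) {
            WriteFile (h, &tick, 1, &written, NULL);
            Sleep (42);             /* ~1/24 s, plus scheduler jitter */
        }
    }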
there HAS to be a way to do this.
Says who?

allows us to link the 8051 serial port to a real COM port acting in realtime/clocktime/baudtime.
Whether this link is anything like "real time" remains to be seen. The documentation of uV2 is scarily thin in this area --- if there were some reliable timing involved, I would guess Keil would have proudly boasted about it in the documentation. Actually, trying to do pretty much anything in hard real time under any modern Windows is a lost cause. It's simply not what these platforms are meant for. 24 Hz is a harder requirement than keeping the usual MS-DOS timer tick (18 Hz), and Windows is already somewhat flaky in its support of that.

at 9600 baud, the binding of PC COM2 to the second virtual serial port can introduce a timebase with a 102 us base period into the simulator
No, it can't. 104 us is the bit time of a 9600 baud line. That's only relevant inside the PC's UART. The time between events leaving that UART is the byte transmission time: 10 bits (start + 8 data + stop) at 104 us each, i.e. about 1 ms. And that's without considering the fact that they all have FIFOs these days, which would cut the event rate by another order of magnitude.

I told you several posts upstream: you'll almost certainly have to consider writing a problem-specific debugger/simulator extension DLL to get this done, or forget about doing this in the simulator. The simulator is unfit by design for this kind of work.
One problem here is that there are two distinct time domains: the simulated time inside uV2, and the wall-clock time of the host PC. The simulator guarantees consistency only within the first, and makes no promise at all about the relation between the two.
I think it comes down to "product concept", not academics.

If you think of the simulator as a showcase for the microcode emulation in the core, then one simulates "performance" of the core, and shows that one can nicely emulate the 8051 instruction set... Presumably, professionals in this space compete on the accuracy of the core timing and the instruction decoding, which is no doubt full of theoretical issues to war over, for fun.

If the tool is for software types to build demos, as a rapid development tool, pre board design/manufacture, it needs a prototyping emphasis - and concept work for THAT audience. The fun part of the simulator was always the extension modules: seeing the I2C device, visually, was fun. Seeing the LED visualization was fun. Getting blinky() to work, naturally, could have been fun... acting as a sanity check. I was almost tempted to write my own extension for fun... now I'm not. The product concept is evidently flawed - for my purposes. I have no interest whatsoever in the accuracy of simulation re core timing. It's irrelevant to making a mockup of a product that will use a commodity uP on commodity boards.

So there you have the result: I cannot show my customer a blinking LED on the simulated front panel (or rather, it blinks at about a 20 ms rate, clock time, on my CPU). I cannot send packets through the redirected COM port to the simulator process and have the IP stack behave normally re protocol exceptions, based on timers. This seems a shame, as both features are almost there, and the tool is almost really valuable for rapid application development.

While I could probably now introduce a 50 ms period external signal into the simulator as a "trick", I don't think I'll bother. I'd be fighting the product concept, and that's rarely a good idea.

Peter.
May I add a comment, just to establish some perspective? I remember writing in ASM51 (no C compiler yet), using my own DOS batch files as my make utility, and debugging the code with port pins and an oscilloscope. The first time I used PK51 I was so happy I couldn't believe how much this tool allowed me to do. It was like being a caveman who discovers fire, the wheel, and refrigeration all at once. When a slick new '51 variant comes out, sure, I'd like to have the new features supported in simulation right away, but that never holds me back; the Keil tools are great.