Hi there,
I ran the code below in uVision3 V3.30 to see the effect of the AUXR1 register when switching between DPTR0 and DPTR1. I chose the AT89S51 as the target device, since it provides the dual-DPTR feature. But when I single-step the code in uVision, the Address window shows that both pointer values, 4000h and 8000h, are written to the same addresses, 82h-83h (DPL/DPH). In other words, the DPS bit in AUXR1 does not switch the data pointer, even though single-stepping shows that the instruction 'XRL DPTRSW, #DPS' definitely toggles DPS. Why does this happen?
DPTRSW  DATA 0A2H          ; AUXR1 SFR on the AT89S51 (holds the DPS bit)
DPS     EQU  00000001B     ; bit 0 of AUXR1 selects DPTR0/DPTR1

        ORG  00H
        MOV  R7, #4        ; copy four bytes
        MOV  DPTR, #4000h  ; load DPTR0 with the source address
        XRL  DPTRSW, #DPS  ; toggle DPS -> select DPTR1
        MOV  DPTR, #8000h  ; load DPTR1 with the destination address

LOOP:   XRL  DPTRSW, #DPS  ; toggle DPS -> back to DPTR0
        CLR  A
        MOVC A, @A+DPTR    ; read a byte from code memory
        INC  DPTR
        XRL  DPTRSW, #DPS  ; toggle DPS -> DPTR1
        MOVX @DPTR, A      ; write the byte to external data memory
        INC  DPTR
        DJNZ R7, LOOP

        END
Erik Malund said: I know no "skilled and experienced engineer" who uses a simulator. All the "skilled and experienced engineers" I know use an emulator.
An emulator is a nice tool. We have a few emulators stuck in obscure corners of the engineering department's steel storage cabinets, one for each CPU family we used at some point in the past. Back then an emulator really was a required tool, because you could not test firmware, place breakpoints, or analyze trace buffers without one. The software debuggers sucked big time, especially for cross-development. The good and extremely expensive emulators were usually coupled with a sizable logic analyser (a real one) that helped you deglitch your board. The good stuff was based on Unix workstations, of course, not those feeble PCs.
Then, some five Moore's-law cycles later, massive computing power became available on every desk. My development PC today runs VHDL simulations in a few minutes on RTL files that would have taken five hours on an HP workstation a few years ago. Linear Tech's SPICE engine runs a dozen times faster than the first UCB SPICE I used on a Sun workstation, and the core code is very similar. That allows me to fully characterize complex ADC and SMPS circuits in one week, before the hardware is even laid out. The result is that I now usually need just one PCB iteration for these designs, and the prototype behaves pretty much exactly like the SPICE models.
The same applies to firmware simulation. I have simulators on my machine that run full-blown instrumented firmware simulations faster than real time. A well-planned firmware deployment MUST account for a well-designed simulation environment. There are many situations in which an emulator or on-chip debugger cannot substitute for a good simulator. When hunting for hard bugs, a simulator is more productive than an emulator because you have a more controlled environment, especially in hard real-time systems, where the on-chip debug resources may interfere with the system's computing load.
In other words, simulation is ESSENTIAL to a professional design flow. So much so that virtually every EDA tool vendor throws real money at developing good simulation tools. For example, some high-reliability contracts require you to provide proof of verification for your entire software, by means of testbench simulation vectors and results for every function in your code, something really expensive to do with a hardware emulator.
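As a rough illustration of what "testbench vectors and results for every function" can look like, here is a minimal, hypothetical C harness. The function under test (scale_adc) and the vector values are made up for this sketch; it is shown as plain host-compilable C for brevity (on a C51 target, printf would additionally need the usual serial I/O setup), but the same pattern can be run inside an instruction-set simulator.

#include <stdio.h>

/* Hypothetical function under test: scales a raw 10-bit ADC reading
   to millivolts against a 3300 mV reference. */
static unsigned int scale_adc(unsigned int raw)
{
    return (unsigned int)(((unsigned long)raw * 3300UL) / 1023UL);
}

/* One test vector: stimulus plus the expected result. */
struct vector {
    unsigned int raw;
    unsigned int expected_mv;
};

static const struct vector vectors[] = {
    {    0,    0 },
    {  511, 1648 },
    { 1023, 3300 },
};

int main(void)
{
    unsigned int i, failures = 0;

    for (i = 0; i < sizeof(vectors) / sizeof(vectors[0]); i++) {
        unsigned int got = scale_adc(vectors[i].raw);
        if (got != vectors[i].expected_mv) {
            printf("FAIL: raw=%u expected=%u got=%u\n",
                   vectors[i].raw, vectors[i].expected_mv, got);
            failures++;
        }
    }
    printf("%s (%u failure(s))\n", failures ? "FAILED" : "PASSED", failures);
    return failures ? 1 : 0;
}

The point is that the vectors and the pass/fail log are artifacts you can archive as verification evidence, which is far cheaper to produce in a simulator than by driving real hardware through every case.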
I am not dismissing the importance of on-chip debugging and the integrated trace hardware on current CPUs. The embedded trace macrocell in the ARM cores really does facilitate system-level verification. But even then, the simulator is essential in the workflow.
I see it as a much-neglected part of the design flow, almost as neglected as comprehensive firmware testing and verification methods.
Per Westermark said: "The availability of a simulator is often a deciding factor when choosing which compiler suite to buy for a project, since it can allow the majority of the time after reception of a hw prototype to be spent validating the hardware, instead of first having to work out which errors are hardware and which are software.
A simulator can also allow testing of some worst-case scenarios that may be _very_ hard to generate on a real hardware platform."
To which I fully agree. The Keil simulator, for example, has some really powerful and nice features. The signal functions the simulator environment provides for generating signals and complex hardware responses are functionality you simply cannot duplicate on a real board. To test failsafe interface code, you often have to simulate behaviour that is very difficult to provoke on real hardware but is easily achievable with a signal function or a testbench.
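As a sketch of the kind of signal function meant here, written in the C-like debug-function syntax of the uVision debugger: the function name (glitchy_input), the cycle counts, and the use of the PORT1 VTREG are assumptions for illustration, and the available VTREG names depend on the selected device.

/* Hypothetical stimulus: wiggles bit 0 of the PORT1 VTREG so the
   failsafe code under test sees an input pattern that would be
   awkward to generate on a real board. */
signal void glitchy_input (void)  {
  int i;
  i = 0;
  while (i < 1000)  {
    PORT1 = PORT1 | 0x01;     /* drive P1.0 high */
    twatch (1000);            /* wait 1000 CPU states */
    PORT1 = PORT1 & 0xFE;     /* drive P1.0 low */
    twatch (3000);            /* hold low longer than the firmware expects */
    i = i + 1;
  }
}

In uVision such a function would typically be defined in a debugger include/ini file and started from the Command window (e.g. glitchy_input()), running alongside the simulated firmware.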
What I don't agree with in Erik's statements is that one should be 'grateful' for the toolset functionality and simply ignore the bugs.
TOTALLY MISQUOTED
I stated something like "bugs should always be fixed, features are up to the provider".
I can appreciate your point, but I think you are missing one thing here that I want to make clearer: nobody is demanding full simulation for every one of the many 8051 derivatives on the market. Then why are you "demanding full simulation for" this particular derivative?
Re the posts about "the importance of simulation": it may be that some people work well with such a method, and good luck to them. I never would, simply because of the issues I have seen that no simulation could ever 'predict'. I can see simulation as a possibly useful tool for computation-dominant projects, but for an I/O-dominant project I stand by my position that simulation is worthless or, at best, not very useful.
Erik