Hi,
Does anybody know if it is possible to run a simulation, with all its functionality, but without the IDE being launched? (No GUI)
The aim here is to debug the Windows application that communicates with this simulated device, and not the device code itself.
If possible, can it be done using the demo version, or do I need to purchase a full software license for it?
Many thanks in advance! Shahar.
Since I'm not interested in time-accurate simulation of the peripherals (in the end I'm interested in QAing my Win application only...), I think this should not be an issue.
Generally speaking, since all peripheral mechanisms are interrupt-driven in the MCU implementation, I don't see why there should be a problem. (The DLL implements the relevant callbacks - as the AGSI API suggests.)
SPI communication, for instance, will be carried out exactly as fast as the simulation goes.
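Roughly the shape of what the DLL does for, say, the SPI data register (a minimal, self-contained sketch - every name here, including define_write_watch and the register address, is a placeholder I've made up, not the actual AGSI entry points, which are declared in Keil's AGSI headers):

#include <stdio.h>

typedef void (*write_cb_t)(unsigned addr, unsigned value);

static write_cb_t spi_write_cb;   /* callback the "simulator" invokes */

/* Hypothetical stand-in for the simulator's registration call. */
static void define_write_watch(unsigned addr, write_cb_t cb)
{
    (void)addr;            /* a real simulator would key on the address */
    spi_write_cb = cb;
}

/* Our DLL-side handler: runs whenever the simulated firmware writes
 * the SPI data register. No wall-clock time is consumed, so the
 * transfer proceeds exactly as fast as the simulation itself. */
static void on_spi_write(unsigned addr, unsigned value)
{
    printf("SPI write @0x%04X: 0x%02X -> forward to the Windows app\n",
           addr, value);
}

int main(void)
{
    define_write_watch(0xFFE0, on_spi_write);   /* 0xFFE0: made-up address */
    spi_write_cb(0xFFE0, 0x5A);                 /* simulator fires the callback */
    return 0;
}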
The only issue is the mapped UART, as I understood from you.
Can you please explain to me how (and why) the inclusion/exclusion of a mapped UART impacts the simulation timing?
Thanks.
In that case, might it not be simpler to just compile your embedded code into some kind of PC format for the purpose of testing the Win App?
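One common way to do that (nothing vendor-specific - the file layout, register address, and function names below are all illustrative) is to hide the hardware accesses behind a thin abstraction and select the implementation at build time:

/* hal_uart.h -- hypothetical hardware-abstraction layer that lets the
 * same protocol code build either for the target or as a native
 * Windows test process. */
void uart_init(void);
void uart_send_byte(unsigned char b);

#ifdef BUILD_FOR_PC
/* PC build: back the UART with stdout (or a pipe/socket to the
 * Windows application under test). */
#include <stdio.h>
void uart_init(void)                 { /* nothing to set up on the PC */ }
void uart_send_byte(unsigned char b) { putchar(b); }
#else
/* Target build: touch the real register (address made up here). */
#define UART_DATA (*(volatile unsigned char *)0xFFE0)
void uart_init(void)                 { /* configure baud rate, etc. */ }
void uart_send_byte(unsigned char b) { UART_DATA = b; }
#endif

The application code then calls only uart_init()/uart_send_byte(), so only this thin layer needs a PC replacement and the rest of the code base compiles unchanged into an ordinary Windows test process.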
It doesn't impact the simulation timing.
But it represents a scaling error in relation to the Windows application, which is counting timeouts and transfer times based on a different time scale.
If your protocol specifies that there should be a five-character pause between message and response, then the Windows side will compute 5 x 1 ms = 5 ms of wall time. A simulator that runs at 20% of real time will also compute a 5 ms delay, but that delay will be scaled and look like 25 ms to the Windows application. Conversely, if the Windows application makes a 5 ms delay, it will be scaled and look like 1 ms on the simulated machine.
Seen another way: a 9600 baud UART can transfer roughly 1000 characters/second. If the simulated machine runs at 20% of real time, the UART may either run with clock-cycle-based transfer times, in which case it will only manage 200 characters/second, scaling the virtual baud rate down accordingly; or the simulated UART doesn't compute time at all and allows a byte to be received instantly, i.e. as soon as the Windows application sends out a byte, the receive flag gets set in the simulated machine.
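To put numbers on it, here is a small stand-alone C example using the figures above (~1 ms per character, simulator at 20% of real time):

#include <stdio.h>

int main(void)
{
    const double ms_per_char = 1.0;   /* ~1 ms per character at 9600 baud */
    const double sim_speed   = 0.20;  /* simulator runs at 20% of real time */
    const int    pause_chars = 5;     /* protocol: 5-character pause */

    double delay_ms = pause_chars * ms_per_char;           /* 5 ms nominal */

    /* A 5 ms pause inside the simulator stretches by 1/0.20 = 5x ... */
    printf("Simulated 5 ms pause looks like %.0f ms of wall time\n",
           delay_ms / sim_speed);                          /* 25 ms */

    /* ... while a 5 ms wall-time delay by the Windows app shrinks
     * to 5 * 0.20 = 1 ms of simulated time. */
    printf("A 5 ms Windows delay looks like %.0f ms to the target\n",
           delay_ms * sim_speed);                          /* 1 ms */

    /* Throughput view: clock-cycle-accurate UART timing scales the
     * effective baud rate by the same factor. */
    printf("Effective throughput: %.0f chars/s instead of 1000\n",
           1000.0 * sim_speed);                            /* 200 chars/s */
    return 0;
}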
When using the UART for sending out debug information, it is good to have the UART take zero time, effectively giving it an infinite baud rate.
I have actually considered what you proposed.
The thing is, I have ~150K of embedded C code that naturally contains many target-specific elements.
Migrating this code base so it can run as some "Intel/Windows process" sounds to me like a lot of work.
If you are aware of any tools / techniques that can help me in this direction, please let me know.
Thanks!
Actually, building an AGSI DLL and all the simulation scripts sounds like a lot of work to me...!