<?xml version="1.0" encoding="UTF-8" ?>
<?xml-stylesheet type="text/xsl" href="https://community.arm.com/utility/feedstylesheets/rss.xsl" media="screen"?><rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:slash="http://purl.org/rss/1.0/modules/slash/" xmlns:wfw="http://wellformedweb.org/CommentAPI/"><channel><title>Running a simulation without the debugger and IDE</title><link>https://community.arm.com/developer/tools-software/tools/f/keil-forum/26208/running-a-simulation-without-the-debugger-and-ide</link><description> 
Hi, 

 
Does anybody know if it is possible to run a simulation, with all
its functionality, but without the IDE being launched? (No
GUI) 

 
The aim here is to debug the windows application that communicates
with this simulated device, and not the</description><dc:language>en-US</dc:language><generator>Telligent Community 10</generator><item><title>RE: Running a simulation without the debugger and IDE</title><link>https://community.arm.com/thread/151345?ContentTypeID=1</link><pubDate>Thu, 21 Jan 2010 09:57:42 GMT</pubDate><guid isPermaLink="false">dd9e70c8-6d3c-4c71-b136-2456382a7b5c:801d9067-8e44-4af9-8bc7-d552a1c69b06</guid><dc:creator>Andy Neil</dc:creator><description>&lt;p&gt;&lt;p&gt;
Actually, building an AGSI DLL and all the simulation scripts
sounds like a lot of work to me...!&lt;/p&gt;
&lt;/p&gt;&lt;div style="clear:both;"&gt;&lt;/div&gt;</description></item><item><title>RE: Running a simulation without the debugger and IDE</title><link>https://community.arm.com/thread/146490?ContentTypeID=1</link><pubDate>Thu, 21 Jan 2010 07:52:56 GMT</pubDate><guid isPermaLink="false">dd9e70c8-6d3c-4c71-b136-2456382a7b5c:f3faa55e-5dda-494a-a8a2-16397dd5f280</guid><dc:creator>Shahar Behagen</dc:creator><description>&lt;p&gt;&lt;p&gt;
I have actually considered what you proposed.&lt;/p&gt;

&lt;p&gt;
The thing is, I have ~150K of Embedded C code that naturally
contains many target-specific elements.&lt;/p&gt;

&lt;p&gt;
Migrating this code base so it can run as some &amp;quot;Intel-Windows
process&amp;quot; sounds to me like &lt;b&gt;a lot of work.&lt;/b&gt;&lt;/p&gt;

&lt;p&gt;
If you are aware of any tools or techniques that can help me in this
direction, please let me know.&lt;/p&gt;

&lt;p&gt;
Thanks!&lt;/p&gt;
&lt;/p&gt;&lt;div style="clear:both;"&gt;&lt;/div&gt;</description></item><item><title>RE: Running a simulation without the debugger and IDE</title><link>https://community.arm.com/thread/144806?ContentTypeID=1</link><pubDate>Wed, 20 Jan 2010 09:30:41 GMT</pubDate><guid isPermaLink="false">dd9e70c8-6d3c-4c71-b136-2456382a7b5c:25b0db84-5770-4837-a7ad-be8bfef4f940</guid><dc:creator>ImPer Westermark</dc:creator><description>&lt;p&gt;&lt;p&gt;
It doesn&amp;#39;t impact the simulation timing.&lt;/p&gt;

&lt;p&gt;
But it represents a scaling error in relation to the Windows
application that is counting timeouts and transfer times based on a
different time scale.&lt;/p&gt;

&lt;p&gt;
If your protocol specifies that there should be a five-character
pause between message and response, then the Windows side will
compute 5 x 1 ms = 5 ms of wall time. A simulator that runs at 20% of
real time will also compute a 5 ms delay, but that delay will be
scaled and look like 25 ms to the Windows application. And if the
Windows application makes a 5 ms delay, it will be scaled and look
like 1 ms on the simulated machine.&lt;/p&gt;

&lt;p&gt;
Seen another way: a 9600 baud UART can transfer roughly 1000
characters/second. If the simulated machine runs at 20% of real time,
the UART may either use clock-cycle-based transfer times - in which
case it will only manage 200 characters/second, because the virtual
baud rate is scaled down too - or the simulated UART doesn&amp;#39;t model
transfer time at all and lets a byte be received instantly, i.e. as
soon as the Windows application sends out a byte, the receive flag
gets set in the simulated machine.&lt;/p&gt;
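The scaling described in this post can be written out as a short sketch (Python; the 20% figure and all helper names are illustrative assumptions, not part of any Keil API):

```python
# Sketch of the time scaling described above. SIM_SPEED is the assumed
# fraction of real time at which the simulator runs (20% here).
SIM_SPEED = 0.2
CHAR_TIME_MS = 1.0  # about 1 ms/character at 9600 baud (10 bits/char)

def wall_time_of_sim_delay(sim_delay_ms):
    """A delay computed inside the simulation, as seen by the Windows app."""
    return sim_delay_ms / SIM_SPEED

def sim_time_of_wall_delay(wall_delay_ms):
    """A delay made by the Windows app, as seen by the simulated firmware."""
    return wall_delay_ms * SIM_SPEED

print(wall_time_of_sim_delay(5 * CHAR_TIME_MS))  # 5 ms simulated -> longer wall time
print(sim_time_of_wall_delay(5.0))               # 5 ms wall -> shorter simulated time
print(1000 * SIM_SPEED)                          # 1000 chars/s scaled down
```

The same factor applies in both directions, which is why neither side sees the timing the other one computed.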

&lt;p&gt;
When using the UART for sending out debug information, it is good
to have the UART take zero time, basically getting an infinite
baudrate.&lt;/p&gt;
&lt;/p&gt;&lt;div style="clear:both;"&gt;&lt;/div&gt;</description></item><item><title>RE: Running a simulation without the debugger and IDE</title><link>https://community.arm.com/thread/144805?ContentTypeID=1</link><pubDate>Wed, 20 Jan 2010 09:15:16 GMT</pubDate><guid isPermaLink="false">dd9e70c8-6d3c-4c71-b136-2456382a7b5c:f4d6d99f-7ae2-4542-8669-7d45d2e07362</guid><dc:creator>Andy Neil</dc:creator><description>&lt;p&gt;&lt;p&gt;
In that case, might it not be simpler to just compile your
embedded code into some kind of PC format for the purpose of testing
the Win App?&lt;/p&gt;
&lt;/p&gt;&lt;div style="clear:both;"&gt;&lt;/div&gt;</description></item><item><title>RE: Running a simulation without the debugger and IDE</title><link>https://community.arm.com/thread/142544?ContentTypeID=1</link><pubDate>Wed, 20 Jan 2010 09:06:53 GMT</pubDate><guid isPermaLink="false">dd9e70c8-6d3c-4c71-b136-2456382a7b5c:741e03af-d81f-4216-8530-72f698c4239d</guid><dc:creator>Shahar Behagen</dc:creator><description>&lt;p&gt;&lt;p&gt;
Since I&amp;#39;m not interested in a time-accurate simulation of the
peripherals (in the end I&amp;#39;m only interested in QAing my Windows
application...), I think this should not be an issue.&lt;/p&gt;

&lt;p&gt;
Generally speaking, since all peripheral mechanisms are
interrupt-driven in the MCU implementation, I don&amp;#39;t see where there
should be a problem. (The DLL implements the relevant callbacks, as
the AGSI API suggests.)&lt;/p&gt;

&lt;p&gt;
SPI communication for instance, will be carried out exactly as
fast as the simulation goes.&lt;/p&gt;

&lt;p&gt;
The only issue, as I understood from you, is the mapped
UART.&lt;/p&gt;

&lt;p&gt;
Can you please explain to me how (and why) the inclusion/exclusion
of a mapped UART impacts the simulation timing?&lt;/p&gt;

&lt;p&gt;
Thanks.&lt;/p&gt;
&lt;/p&gt;&lt;div style="clear:both;"&gt;&lt;/div&gt;</description></item><item><title>RE: Running a simulation without the debugger and IDE</title><link>https://community.arm.com/thread/139467?ContentTypeID=1</link><pubDate>Wed, 20 Jan 2010 07:54:27 GMT</pubDate><guid isPermaLink="false">dd9e70c8-6d3c-4c71-b136-2456382a7b5c:77b1fc01-10f0-463b-91d5-12c5342a16b2</guid><dc:creator>ImPer Westermark</dc:creator><description>&lt;p&gt;&lt;p&gt;
The scripts that are run by the debugger can make delays based on
either a fixed number of clock cycles, or a fixed amount of time. But
since the scripts are run by the simulator, delays for a specific
amount of time will not count wall-time in your room, but will count
time based on the simulation.&lt;/p&gt;

&lt;p&gt;
So if the processor has a 48 MHz clock, a delay of 1
second in the script will not take one second of your time, but will
wait for the simulator to step the simulated instruction clock
counter 48 million times. And if you simulate an external SPI device,
a delay of 480 clock cycles in the script will correspond to a
10 us delay on real hardware.&lt;/p&gt;
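The cycle arithmetic above can be checked with a minimal sketch (Python; the 48 MHz clock and 480-cycle delay come from the example, the helper names are assumed):

```python
# Converting script delays between clock cycles and wall time on real
# hardware, for the 48 MHz example above.
F_CLK_HZ = 48_000_000

def cycles_to_seconds(cycles):
    """Real-hardware duration of a cycle-count delay."""
    return cycles / F_CLK_HZ

def seconds_to_cycles(seconds):
    """Cycle count the simulator must step for a time-based delay."""
    return round(seconds * F_CLK_HZ)

print(seconds_to_cycles(1.0))        # a 1 s script delay steps 48 million cycles
print(cycles_to_seconds(480) * 1e6)  # 480 cycles expressed in microseconds
```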

&lt;p&gt;
So all scripts scale their delay times with how fast your PC is
able to simulate the processor. But programs using a mapped serial
port will not know how fast the simulation is, so they will not be
able to scale their timing and adjust timeouts correspondingly. If
you run a serial port at 9600 baud and fill the FIFO, you expect the
UART to send out the characters at about 1 ms/character
in your time. But a simulator running at 20% of real time will
have its UART running correspondingly slower. So it will take 5 ms
of PC time for each character sent into - or received out from - the
virtual machine.&lt;/p&gt;

&lt;p&gt;
How is your AGSI DLL taking care of the scaling of time from a
simulation that isn&amp;#39;t running 1:1 with the wall clock?&lt;/p&gt;
&lt;/p&gt;&lt;div style="clear:both;"&gt;&lt;/div&gt;</description></item><item><title>RE: Running a simulation without the debugger and IDE</title><link>https://community.arm.com/thread/130590?ContentTypeID=1</link><pubDate>Wed, 20 Jan 2010 06:36:32 GMT</pubDate><guid isPermaLink="false">dd9e70c8-6d3c-4c71-b136-2456382a7b5c:1fe19c8c-693a-4078-be8f-4273036b05cc</guid><dc:creator>Shahar Behagen</dc:creator><description>&lt;p&gt;&lt;p&gt;
I am aware of the time accuracy issue you have mentioned.&lt;/p&gt;

&lt;p&gt;
Regarding the serial port mapping, I didn&amp;#39;t experience actual
communication faults, but only slightly delayed replies from the
virtual system, compared to the hardware target.&lt;/p&gt;

&lt;p&gt;
&lt;i&gt;&amp;quot;When the simulation is only tested against debugger test
scripts, then they will have their actions scaled
accordingly&amp;quot;&lt;/i&gt;&lt;/p&gt;

&lt;p&gt;
Can you please elaborate on that?&lt;br /&gt;
(If it matters, please note that I don&amp;#39;t use those C-scripts that are
run from the uVision command line, or from anywhere else.)&lt;/p&gt;

&lt;p&gt;
Thanks.&lt;/p&gt;
&lt;/p&gt;&lt;div style="clear:both;"&gt;&lt;/div&gt;</description></item><item><title>RE: Running a simulation without the debugger and IDE</title><link>https://community.arm.com/thread/130591?ContentTypeID=1</link><pubDate>Wed, 20 Jan 2010 06:22:58 GMT</pubDate><guid isPermaLink="false">dd9e70c8-6d3c-4c71-b136-2456382a7b5c:0cc8207e-175b-415d-99d1-f41ea5224f44</guid><dc:creator>Shahar Behagen</dc:creator><description>&lt;p&gt;&lt;p&gt;
The reason for not using the real target hardware is that I have
also written a small DLL for uVision (using their &amp;quot;AGSI&amp;quot; API) that
controls the (simulated) behavior of my on-board peripherals (e.g.
external memory, SPI-based devices...)&lt;/p&gt;

&lt;p&gt;
This approach should (hopefully) allow me to automatically run
all kinds of scenarios - which is great, mainly for my QA
purposes...&lt;/p&gt;

&lt;p&gt;
I hope I made that point clear...&lt;/p&gt;

&lt;p&gt;
Got your last answer. Thanks for your effort and insights! I
appreciate it.&lt;/p&gt;

&lt;p&gt;
Shahar.&lt;/p&gt;
&lt;/p&gt;&lt;div style="clear:both;"&gt;&lt;/div&gt;</description></item><item><title>RE: Running a simulation without the debugger and IDE</title><link>https://community.arm.com/thread/125953?ContentTypeID=1</link><pubDate>Wed, 20 Jan 2010 05:20:22 GMT</pubDate><guid isPermaLink="false">dd9e70c8-6d3c-4c71-b136-2456382a7b5c:0c20b99c-ca8f-454b-bbe4-7b2d1eed1507</guid><dc:creator>Andy Neil</dc:creator><description>&lt;p&gt;&lt;p&gt;
If the PC can&amp;#39;t hack it, why not just use the real target
hardware?&lt;/p&gt;

&lt;p&gt;
AFAIK, the simulator and debugger are integral parts of uVision -
they are not available separately.&lt;/p&gt;
&lt;/p&gt;&lt;div style="clear:both;"&gt;&lt;/div&gt;</description></item><item><title>RE: Running a simulation without the debugger and IDE</title><link>https://community.arm.com/thread/125952?ContentTypeID=1</link><pubDate>Wed, 20 Jan 2010 05:15:59 GMT</pubDate><guid isPermaLink="false">dd9e70c8-6d3c-4c71-b136-2456382a7b5c:8fe3d1bd-898a-4004-8f36-1c4e5dc3feed</guid><dc:creator>ImPer Westermark</dc:creator><description>&lt;p&gt;&lt;p&gt;
The simulator will process cycle-for-cycle. But that isn&amp;#39;t the
same as running in real time. If your PC is only fast enough to run
the simulation at half real time, then that is what will happen. The
simulator output will then count 1 second of real-time execution
every 2 seconds wall-time for you.&lt;/p&gt;

&lt;p&gt;
When the simulation is only tested against debugger test scripts,
then they will have their actions scaled accordingly. But when mapping a
PC serial port into your simulated processor, you may get into
trouble because of the processing speed of the virtual system in
relation to the expectations of the Windows application that
communicates with the virtual machine.&lt;/p&gt;
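As a minimal sketch (assumed names, not a Keil API), the speed ratio described above is simply simulated elapsed time divided by wall-clock elapsed time:

```python
# Quantifying "half real time": compare simulated elapsed time
# (cycle counter / clock frequency) with wall-clock elapsed time.
def sim_speed(sim_cycles, f_clk_hz, wall_seconds):
    """Simulated seconds per wall-clock second (1.0 = real time)."""
    return (sim_cycles / f_clk_hz) / wall_seconds

# A 48 MHz target that advances 48 million cycles in 2 s of wall time
# is running at half real time:
print(sim_speed(48_000_000, 48_000_000, 2.0))  # 0.5
```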
&lt;/p&gt;&lt;div style="clear:both;"&gt;&lt;/div&gt;</description></item><item><title>RE: Running a simulation without the debugger and IDE</title><link>https://community.arm.com/thread/115496?ContentTypeID=1</link><pubDate>Wed, 20 Jan 2010 04:16:53 GMT</pubDate><guid isPermaLink="false">dd9e70c8-6d3c-4c71-b136-2456382a7b5c:efa2ed5c-c233-4120-a712-126693a8772a</guid><dc:creator>Shahar Behagen</dc:creator><description>&lt;p&gt;&lt;p&gt;
Regarding your comment about my PC - I must agree, the situation
is indeed serious...&lt;/p&gt;

&lt;p&gt;
But the thing is I want to run &lt;b&gt;several instances&lt;/b&gt; of uVision
simulation on a single machine. (My Win application talks to several
devices).&lt;/p&gt;

&lt;p&gt;
Just out of curiosity: in my case the core simulation simulates a
C8051F133 (~100 MHz clock) CPU, and is claimed to be doing a
cycle-accurate simulation. There are no idle CPU cycles.&lt;br /&gt;
How can it NOT induce a serious CPU penalty?&lt;/p&gt;

&lt;p&gt;
Anyhow, I don&amp;#39;t know what CPU overheads are caused directly by the
IDE/debugger itself, which is what I wanted to try and find out in
the first place.&lt;/p&gt;

&lt;p&gt;
Regarding the other issue I mentioned before, do you have any idea
about Keil&amp;#39;s licensing fees when it comes to &amp;quot;just running a
simulation&amp;quot;? (i.e. compiling / debugging is not needed)&lt;/p&gt;

&lt;p&gt;
Many thanks!&lt;/p&gt;
&lt;/p&gt;&lt;div style="clear:both;"&gt;&lt;/div&gt;</description></item><item><title>RE: Running a simulation without the debugger and IDE</title><link>https://community.arm.com/thread/103728?ContentTypeID=1</link><pubDate>Wed, 20 Jan 2010 00:41:35 GMT</pubDate><guid isPermaLink="false">dd9e70c8-6d3c-4c71-b136-2456382a7b5c:fb435ec6-3928-46dc-bdfb-aa54e88cfa9b</guid><dc:creator>Andy Neil</dc:creator><description>&lt;p&gt;&lt;p&gt;
If that&amp;#39;s an issue, then you must have a seriously under-specified
PC!&lt;/p&gt;

&lt;p&gt;
I&amp;#39;ve never had problems running other apps due to the resources
used by uVision!&lt;/p&gt;

&lt;p&gt;
You can, of course, run uVision on a different PC...&lt;/p&gt;
&lt;/p&gt;&lt;div style="clear:both;"&gt;&lt;/div&gt;</description></item><item><title>RE: Running a simulation without the debugger and IDE</title><link>https://community.arm.com/thread/78048?ContentTypeID=1</link><pubDate>Wed, 20 Jan 2010 00:05:23 GMT</pubDate><guid isPermaLink="false">dd9e70c8-6d3c-4c71-b136-2456382a7b5c:4bf3428c-e8de-4e55-9873-58a1791c837d</guid><dc:creator>Shahar Behagen</dc:creator><description>&lt;p&gt;&lt;p&gt;
You are absolutely right.&lt;/p&gt;

&lt;p&gt;
I am already launching the simulation from the command line - and
it is indeed easy to configure it to automatically start
executing.&lt;/p&gt;

&lt;p&gt;
As you suspected, the Windows application communicates with the
simulated device via UART, and I&amp;#39;m using a null-modem cable.&lt;/p&gt;

&lt;p&gt;
The only issue is that I want to reduce the CPU and memory
overheads induced by the IDE, which I don&amp;#39;t need.&lt;/p&gt;

&lt;p&gt;
Thanks!&lt;/p&gt;
&lt;/p&gt;&lt;div style="clear:both;"&gt;&lt;/div&gt;</description></item><item><title>RE: Running a simulation without the debugger and IDE</title><link>https://community.arm.com/thread/57603?ContentTypeID=1</link><pubDate>Tue, 19 Jan 2010 02:58:18 GMT</pubDate><guid isPermaLink="false">dd9e70c8-6d3c-4c71-b136-2456382a7b5c:1a76bd0b-01f1-46d7-8404-34a15f2f93aa</guid><dc:creator>Andy Neil</dc:creator><description>&lt;p&gt;&lt;p&gt;
I think not.&lt;/p&gt;

&lt;p&gt;
&lt;i&gt;&amp;quot;The aim here is to debug the windows application that
communicates with this simulated device, and not the device code
itself.&amp;quot;&lt;/i&gt;&lt;/p&gt;

&lt;p&gt;
But why does that preclude use of the GUI?&lt;/p&gt;

&lt;p&gt;
The uVision &lt;b&gt;Manual&lt;/b&gt; tells you how you can start a debugging
session from the command line - so you don&amp;#39;t have to do it all
manually.&lt;/p&gt;
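As an illustration of that command-line route: the sketch below is an assumption-laden example, not a recipe from this thread. The UV4.exe install path is assumed, and the -d (open project and start debug session) switch follows the uVision command-line syntax; consult the manual for the exact options of your version.

```python
# Illustrative sketch only: start uVision debug sessions from a script
# instead of the GUI. The UV4.exe path is an assumption; the "-d"
# (open project and start debug session) switch follows the uVision
# command-line syntax - consult the manual for your version.
import os
import subprocess

UV4 = r"C:\Keil\UV4\UV4.exe"  # assumed install path

def debug_command(project_path):
    """Command line that opens a project and starts its debug session."""
    return [UV4, "-d", project_path]

# One simulated device per project (several instances on one PC):
projects = [r"C:\work\dev1.uvproj", r"C:\work\dev2.uvproj"]
if os.path.exists(UV4):  # only on a machine where uVision is installed
    procs = [subprocess.Popen(debug_command(p)) for p in projects]
    for p in procs:
        p.wait()
```

A launcher like this also fits the goal mentioned earlier in the thread of running several simulated devices on one machine for QA.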

&lt;p&gt;
How does the device communicate with the Windows app?&lt;br /&gt;
If it&amp;#39;s a UART link, then just direct the simulated UART to a COM
port, and use a null-modem cable to link that to the Windows app&amp;#39;s
COM port...&lt;/p&gt;
&lt;/p&gt;&lt;div style="clear:both;"&gt;&lt;/div&gt;</description></item></channel></rss>