Hi, when I execute the first putchar statement in the code below, nothing is printed to the serial window; when I execute the second putchar statement, the value I expected to print first (0xFF) appears then. In the for loop I also print the value of i, but likewise i only appears after executing the second putchar statement in that loop. Any ideas why this is happening?

Also, I reset the value of counter1 to 2, yet as I step through the putchar statements, counter1 keeps increasing. I can't see how this can happen, as the only place the counter is incremented is in the ADC interrupt routine, and the program doesn't jump to that routine in the meantime.

If anyone can help, it would be great!

Lisa
if(finished)
{
    rec1[buf].cnt = counter1;       // store counter1 value in 1st element of array
    buf ^= 1;                       // set alternative buffer up
    counter1 = 2;                   // reset counter1 to original value

    // print out whole structure
    putchar(rec1[!buf].cnt);
    putchar((char) (rec1[!buf].base_value >> 8));
    putchar((char) (rec1[!buf].base_value));

    for(i = 0; i < MAX_COUNT-2; i++)
    {
        putchar(i);
        putchar(rec1[!buf].x[i]);
    }

    finished = 0;
}
"Any ideas why this is happening?" Yes, but I don't know whether they are correct... I've noticed this in the simulator. I wonder whether it tries to do serial output in 'real time', eg one character per millisecond at 9600 baud. Maybe the time the simulator takes to get from the first putchar to the next breakpoint is less than 1ms, hence the character doesn't appear. I also notice that the last character transmitted often doesn't appear in the serial window if the program terminates shortly after stuffing it into SBUF. In general I find the simulator a waste of time as it introduces so many 'variables' that don't exist on the target hardware. I find myself spending too much time wondering why the program doesn't work as expected in the simulator when it does work on the target.
The Keil simulator accurately simulates the 8051.

"I wonder whether it tries to do serial output in 'real time', e.g. one character per millisecond at 9600 baud. Maybe the time the simulator takes to get from the first putchar to the next breakpoint is less than 1 ms, hence the character doesn't appear."

That's pretty much it. Note, however, that the effect described in this thread is NOT a bug in the simulator. It is a common misunderstanding about how the 8051 serial port works. When a program writes a character into SBUF for transmission, that character is not instantly sent out the serial port (either on real hardware or in the simulator). It takes 10 bit times for the character to be shifted out of the serial port. At 9600 baud, this is approximately 1,000 instruction cycles. If you look carefully at the source for putchar (I've included the short version):
char putchar (char c)
{
    while (!TI);            /* wait until the previous character has finished transmitting */
    TI = 0;                 /* clear the transmit-complete flag                            */
    return (SBUF = c);      /* load the new character into SBUF to start sending it       */
}
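The consequence when single-stepping is that each putchar only waits for the previous character to finish, and a character becomes visible in the serial window only once its own transmission time has elapsed -- which is typically while you are stepping over the next putchar. Below is a minimal, self-contained sketch of the effect; it is an illustration rather than Lisa's program, and it assumes the usual mode-1, 9600-baud setup with an 11.0592 MHz crystal and TI preset to 1 (the test bytes 0xFF and 0x55 are arbitrary):

    #include <reg51.h>
    #include <stdio.h>

    void main (void)
    {
        SCON  = 0x50;       /* UART mode 1, receiver enabled                      */
        TMOD |= 0x20;       /* timer 1 in 8-bit auto-reload mode                  */
        TH1   = 0xFD;       /* 9600 baud, assuming an 11.0592 MHz crystal         */
        TR1   = 1;          /* start the baud-rate timer                          */
        TI    = 1;          /* lets the first putchar fall straight through       */

        putchar(0xFF);      /* step here: while(!TI) falls through, 0xFF is loaded
                               into SBUF and has only just started shifting out,
                               so the serial window still shows nothing           */
        putchar(0x55);      /* step here: while(!TI) waits ~10 bit times (about
                               1 ms at 9600 baud) for 0xFF to finish; only now
                               does 0xFF appear, then 0x55 is loaded into SBUF    */
        while (1);
    }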
Thanks Stefan, I'm blaming the simulator too!
Hi Jon, that makes sense, thanks. However, why would the variable counter1, in the snippet of code in the first message, keep incrementing while these putchar statements are executing? The program has not jumped to my ADC interrupt routine, where this value is used. When it does jump to the ADC ISR during putchar(), it is also incremented (as expected).

Regards,
Lisa
"Why would the variable counter1 keep incrementing while these putchar statements are executing?"

I don't know. I haven't seen the interrupt routine or the A/D initialization code. And I prefer not to guess. I could come up with 100 things that could be wrong but aren't.

Jon
Here's a thought: have you enabled trace in the simulator and set a breakpoint on accesses to counter1? Then you can review the trace history to see what happened.

Jon
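For what it's worth, a write-access breakpoint is probably the quickest way to catch this. In the debugger's Command window, something like the line below should do it (BS is the uVision breakpoint-set command; the exact syntax here is from memory, so check the debugger help):

    BS WRITE counter1

With trace recording enabled as well, each time the breakpoint fires you can look back through the trace history and see whether the write really came from the ADC interrupt routine.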
"Note, however, that the effect described in this thread is NOT a bug in the simulator." Sure, I didn't mean to imply that it was a bug, just a feature. Once a character has been placed in SBUF it is clocked out of the TX pin irrespective of program flow and TI is set whether or not putchar() or anything else is waiting for it. Simulating the simulator: SBUF='A'; while(1); //Breakpoint My 'A' will still appear on the real hardware. I would expect it to also appear in the simulator serial window. You'll argue that if I run my simulator simulator in the simulator I'll get what I expect, but that's not the point - I see a breakpoint as a software device that halts program flow, not something that stops the UART peripheral clock. "In general I find the simulator a waste of time as it introduces so many 'variables' that don't exist on the target hardware. Really? Have you reported any of these to us? Currently, I'm unaware of very many things in the simulator that don't exist on real target hardware." I'm talking about the differences in expected behaviour between debugging on the simulator and debugging on the target. The SBUF issue above was a good example, here's another: void Fn(void) { int a=4; int b; b=a; printf("%d",b); } If I break on the printf() line and try and watch a and b I can sometimes see the correct value of a and sometimes not. I assume this is down to 'a' no longer existing due to optimisation. However, if I were debugging this on the target hardware I'd stick in a printf("a=%d",a); and get the expected result.
There are a few points to note when working with the simulator:

1. Unlike real hardware, the simulator lets you single-step even in places where you are watching the timing of peripherals.

2. For 'infinite' printf speed, you may set the SxTIME variable to 0. In that case the simulator no longer simulates the timing of the serial interface and presents each character in the serial window immediately (see the example below). This also works for the printf-style debugging that you are trying to perform.
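For example, for the first serial port you would type something like the line below into the debugger's Command window. The exact VTREG name (S0TIME here, following the SxTIME naming above) is my assumption, so check the simulator's list of peripheral VTREGs for your device:

    S0TIME = 0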
"Simulating the simulator:

    SBUF = 'A';
    while (1);      // breakpoint here

My 'A' will still appear on the real hardware. I would expect it to also appear in the simulator serial window."

Why? In the simulator, when you stop program execution, you effectively stop time. All timers, oscillators, interrupts -- all of it -- actually stops. You typically can't do that with real hardware. Or, if you do, you may create a dangerous situation that damages or destroys hardware.

"I'm talking about the differences in expected behaviour between debugging on the simulator and debugging on the target. The SBUF issue above was a good example; here's another:

    void Fn(void)
    {
        int a = 4;
        int b;

        b = a;
        printf("%d", b);
    }

If I break on the printf() line and try to watch a and b, I can sometimes see the correct value of a and sometimes not."

This may have to do with name overloading. You see, the object named A can be the accumulator, your variable a, or another global variable named A. So, which A are you looking at? The same applies to B (since there is a B register on the 8051). But this issue applies to both simulators and to emulators. There is no magical hardware solution where the emulator 'knows' where to find the value of a. That information comes from the object file and, since a can be optimized out, it may very well be.

So, I'm not sure how this makes simulators a waste of time. Using this as a justification, you could further generalize that debuggers are a waste of time.

Jon
"Why? In the simulator, when you stop program execution, you effectively stop time. All timers, oscillators, interrupts -- all of it -- actually stops. You typically can't do that with real hardware." Yes, I realise this. The problem is that I want a simulator which gives me the same result I would get using my usual 'printf() and oscilloscope' debugging technique. I don't want a simulator which simulates an unrealistic situation. Out of interest, what happens if you break half way through the transmission of a character? Do you get one corrupt character in the serial window on interruption then possibly another character when execution resumes, or does the 'correct' character magically appear on resumption? "This may have to do with name overloading. You see, the object named A can be the accumulator, your variable A, or another global variable A. So, which A are you looking at. The same applies to B (since there is a B register on the 8051)." And again, the simulator has left me confused. I don't know what I'm looking at. "But, this issue applies to both simulators and to emulators." It may not surprise you to hear that I avoid emulators wherever possible. "There is no magical hardware solution where the emulator "knows" where to find the value of A. This information comes from the object file and, since A can be optimized out, it may very well be." Quite. "So, I'm not sure how this makes simulators a waste of time." Because I am never sure whether what I see happening in the simulator is what I will see happening on the target. "Using this as a justification, you could further generalize that debuggers are a waste of time." In large (usually not embedded) 'C' programs debuggers can be quite useful, especially when trying to track down stack corruption. On the 8051, however, I don't really see the need.
"Out of interest, what happens if you break halfway through the transmission of a character? Do you get one corrupt character in the serial window on interruption and then possibly another character when execution resumes, or does the 'correct' character magically appear on resumption?"

The character is not transmitted until it's transmitted (rather profound, huh). The correct character is transmitted when the transmission time has elapsed. So, the correct character appears in the serial window.

Jon