In computer programming, the word Timer is used to describe a hardware or software component that raises an event (“ticks”) at a specified interval. So, for example, if you want a notification once per second, you would instantiate a timer and set its period to one second. The timer API often has you specify the period in milliseconds, so you’d say 1,000 milliseconds.
In a C# program, for example, you can create a timer that ticks every second, like this:
private System.Threading.Timer _timer;
private int _tickCount;

public void Test()
{
    _timer = new System.Threading.Timer(MyTimerHandler, null, 1000, 1000);
    Console.ReadLine();
}

private void MyTimerHandler(object state)
{
    ++_tickCount;
    Console.WriteLine("{0:N0} tick", _tickCount);
}
It’s important to note that ticks don’t come at exact 1-second intervals. The timer is reasonably accurate, but it’s not perfect. How imperfect is it? Let’s find out. Below I’ve modified the little program to start a Stopwatch and output the actual elapsed time with each tick.
private System.Threading.Timer _timer;
private Stopwatch _stopwatch;
private int _tickCount;

public void Test()
{
    _stopwatch = Stopwatch.StartNew();
    _timer = new System.Threading.Timer(MyTimerHandler, null, 1000, 1000);
    Console.ReadLine();
}

private void MyTimerHandler(object state)
{
    ++_tickCount;
    Console.WriteLine("{0:N0} - {1:N0}", _tickCount, _stopwatch.ElapsedMilliseconds);
}
On my system, that program shows the timer to be off by about 200 milliseconds every three minutes. That’s on a system that isn’t doing anything else of note: no video playing, no downloads, no streaming, no heavy-duty background jobs. 200 milliseconds every three minutes doesn’t sound like much, but that’s one second every fifteen minutes, or 96 seconds every day. My granddad’s wind-up Timex kept better time than that.
Granted, the problem could be with the Stopwatch class. But it isn’t. I created a simple program that starts a Stopwatch and then outputs the elapsed time whenever I press the Enter key. I let that program run for days, and every time I hit Enter, it agreed very closely with what my wristwatch said. There was some variation, of course, because no two clocks ever agree perfectly, but the difference between Stopwatch and my wristwatch was a lot smaller than the difference between Stopwatch and the Timer. I have since run that experiment on multiple computers and multiple clocks, and every time the Stopwatch has been quite reliable.
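The checker program I describe above is simple enough to sketch here. This is my own reconstruction (the class and variable names are mine, not from the original program): it starts a Stopwatch and prints the elapsed time each time Enter is pressed, so you can compare the output against a trusted clock.

```csharp
using System;
using System.Diagnostics;

class StopwatchCheck
{
    static void Main()
    {
        // Start timing immediately; compare the printed values against
        // a wristwatch or other trusted clock over hours or days.
        var stopwatch = Stopwatch.StartNew();
        while (Console.ReadLine() != null)
        {
            Console.WriteLine(stopwatch.Elapsed);
        }
    }
}
```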
Things get worse. On a heavily loaded system with lots of processing going on, timer ticks can be delayed by an indeterminate time. I’ve seen ticks delayed for five seconds or more, and then I get a flood of ticks. I’ll see, for example, a tick at 5,000 milliseconds, then one at 9,000 milliseconds followed very quickly by ticks at 9,010, 9,015, and 9,030 milliseconds. What happened is that the system was busy and couldn’t schedule the timer thread at 6,000 milliseconds, so the system buffered the ticks. When it finally got around to dispatching, three other ticks had come in and it fired all four of them as quickly as it could.
This can be a huge problem on heavily loaded systems because it’s possible that multiple timer ticks can be processing concurrently.
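One way to keep overlapping ticks from doing damage is to make the handler skip a tick when the previous callback is still running. This is only a sketch of one possible mitigation (the class name and the choice to drop, rather than queue, the overlapping tick are my own); it uses Interlocked.CompareExchange as a lock-free reentrancy guard:

```csharp
using System;
using System.Threading;

class GuardedTimer
{
    private static Timer _timer;
    private static int _inHandler; // 0 = idle, 1 = a callback is running

    static void Main()
    {
        _timer = new Timer(MyTimerHandler, null, 1000, 1000);
        Console.ReadLine();
        _timer.Dispose();
    }

    private static void MyTimerHandler(object state)
    {
        // If another callback is still executing, drop this tick rather
        // than let two handlers run concurrently.
        if (Interlocked.CompareExchange(ref _inHandler, 1, 0) != 0)
            return;
        try
        {
            // ... do the periodic work here ...
        }
        finally
        {
            Interlocked.Exchange(ref _inHandler, 0);
        }
    }
}
```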
The only thing a timer is good for is to give you notifications on a periodic basis, approximately at the frequency you requested. A timer pointedly is not for keeping time.
Given that, you can probably explain what’s wrong with this code, which is a very simplified example of something I saw recently. The original code did some processing and periodically checked the _elapsedSeconds variable to see how much time had elapsed. I simplified it here to illustrate the folly of that approach.
private Timer _timer;
private int _elapsedSeconds;
private ManualResetEvent _event;

private void Test()
{
    // Create the event before the timer so a tick can't see a null event.
    _event = new ManualResetEvent(false);
    _timer = new Timer(MyTimerHandler, null, 1000, 1000);
    _event.WaitOne();
    Console.WriteLine("One minute!");
    _timer.Dispose();
}

private void MyTimerHandler(object state)
{
    ++_elapsedSeconds;
    if (_elapsedSeconds == 60)
    {
        _event.Set();
    }
}
Here, the programmer is using the timer like the second hand of a clock: each tick represents one second. There are at least two problems in that code. First, we’ve already seen that the timer won’t tick at exactly one-second intervals. Second and more importantly, it’s possible on a busy system for multiple timer events to occur concurrently. Imagine what happens in this case:

- _elapsedSeconds is equal to 59.
- A tick occurs and Thread A is dispatched to handle it.
- Thread A increments the _elapsedSeconds value to 60.
- Another tick occurs and Thread B is dispatched to handle it.
- At the same time, Thread A is swapped out in favor of some higher priority thread.
- Thread B increments the _elapsedSeconds value to 61.

At this point, the _elapsedSeconds value has been incremented beyond 60, so the conditional will never be true and the event will never be set. (Well, it might: 4 billion seconds from now, when _elapsedSeconds rolls over. That’s only about 125 years.)
You could change the conditional to _elapsedSeconds >= 60, but that’s missing the point. We’ve already shown that the timer isn’t going to be particularly accurate. If you were trying to time a whole day, you could be off by a minute and a half.
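A better approach is to let the timer provide only the periodic wake-ups and let a Stopwatch keep the time. Here is one way the flawed example might be restructured along those lines (a sketch of the idea, not the original author’s fix): late or bunched ticks no longer matter, because each tick merely asks the Stopwatch whether a minute has actually passed.

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class StopwatchBasedWait
{
    private static Timer _timer;
    private static Stopwatch _stopwatch;
    private static ManualResetEvent _event;

    static void Main()
    {
        _event = new ManualResetEvent(false);
        _stopwatch = Stopwatch.StartNew();
        _timer = new Timer(MyTimerHandler, null, 1000, 1000);
        _event.WaitOne();
        Console.WriteLine("One minute!");
        _timer.Dispose();
    }

    private static void MyTimerHandler(object state)
    {
        // The timer only wakes us up; the Stopwatch decides whether a
        // minute has really elapsed. Missed or duplicated ticks are harmless.
        if (_stopwatch.Elapsed >= TimeSpan.FromMinutes(1))
        {
            _event.Set();
        }
    }
}
```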
The problem of concurrent execution of timer handlers is another topic entirely, one that I probably should cover in a separate post. It’s important to understand that it can happen, though, and you have to keep it in mind.
In programming, timers don’t keep time. Don’t try to make them. They can’t count up reliably, and they can’t count down reliably. If you want to keep track of elapsed time, start a Stopwatch and then check its Elapsed or ElapsedMilliseconds properties when necessary.
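That pattern is small enough to show in full. A minimal sketch (the Thread.Sleep call stands in for whatever real work you're timing):

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class ElapsedDemo
{
    static void Main()
    {
        var stopwatch = Stopwatch.StartNew();
        Thread.Sleep(1500); // stand-in for the real work being timed
        stopwatch.Stop();
        // Elapsed is a TimeSpan; ElapsedMilliseconds is a long.
        Console.WriteLine(stopwatch.Elapsed);
        Console.WriteLine(stopwatch.ElapsedMilliseconds);
    }
}
```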
On a related note, do not use the time-of-day clock (DateTime.Now, for example) to measure elapsed time, either.