Thread: RTC accuracy
#8, June 2nd 09, 09:11 PM, posted to alt.comp.hardware
pawihte


"Paul" wrote in message
...
MCheu wrote:
pawihte wrote:
My computer RTC is quite accurate as computer clocks go,
being off by no more than a few seconds over a period of
several days without syncing, whereas I've noticed some
computers to be off by minutes in 24 hours even with a
healthy battery.

I've heard that the accuracy of a computer clock not only
depends on the oscillator in the RTC hardware, but is also
influenced by interrupt calls (or something like that) in the OS
and app environment. If this is true, do I just happen to have
good RTC hardware, or does my OS and software installation also
have something to do with it?
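Just to put rough numbers on those two cases (a quick back-of-envelope sketch; the 5 s / 3 days and 2 min / day figures are hypothetical, picked to match "a few seconds over several days" and "minutes in 24 hours"):

```python
def drift_ppm(seconds_off, elapsed_days):
    """Convert an observed clock error into parts per million."""
    return seconds_off / (elapsed_days * 86400) * 1e6

# A few seconds over a few days (say 5 s in 3 days) is ~19 ppm:
print(round(drift_ppm(5, 3), 1))    # 19.3

# Whereas minutes per day (say 2 min in 24 h) is well over 1000 ppm:
print(round(drift_ppm(120, 1)))     # 1389
```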


In my experience, the RTC varies a bit depending on the
quality and frequency response of the oscillator they choose
to use. It also sometimes varies depending on the quality of
power supplied to the system. I had thought that since it
draws from a battery it wouldn't be an issue, but the only
system I've ever had with a timing issue seemed to be fixed
with a PSU swap.


To see how it is powered, have a look at the 25281202.pdf reference
schematic here. It contains a relatively modern motherboard schematic
(some bits are not realistic enough for my tastes - they should have
used a real commercial device for the Super I/O).

http://www.intel.com/design/chipsets...ics/252812.htm

PDF page 82 has the diode-OR of CMOS battery and onboard regulator output.
PDF page 85 has the +5VSB to V_3P3_STBY onboard regulator (three-terminal).
PDF page 79 shows V_5P0_STBY is the +5VSB coming from the power supply.


The diode-OR switchover circuit is very basic and indicates that
they don't consider stringent regulation necessary at that point.

The voltage regulator on page 85 should reduce the impact a PSU
would have on timekeeping. But if +5VSB drops low enough, I suppose
there could be a less stable voltage feeding the RTC (which lives
inside the Southbridge). The voltage really shouldn't step radically
out of bounds.


And of course, if the +5V rail fell below 4.3V, leaving less than
the 1V dropout headroom of the onboard regulator IC, that would
probably affect other circuits more vital than the RTC.
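That headroom check is just arithmetic; a sketch, assuming the 1V dropout figure and 3.32V setpoint mentioned in this thread:

```python
V_OUT = 3.32       # regulator setpoint (from the schematic discussion)
V_DROPOUT = 1.0    # assumed worst-case dropout of the three-terminal regulator

def in_regulation(v_in):
    """True if the input rail leaves enough dropout headroom above the output."""
    return v_in - V_DROPOUT >= V_OUT

print(in_regulation(5.0))   # True  - nominal +5VSB
print(in_regulation(4.2))   # False - sagging rail, output starts to droop
```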


On page 85, the circuit is a dual footprint, allowing two different
devices to be used. If you look at Figure 12 here, you can see the
equation for the adjustable regulator. They're trying to set the
thing to around 3.32 volts. The CMOS battery is probably closer
to ~3.0 or so.


I wonder why they don't just use the fixed 3.3V version of the
MC33269. It would provide a tighter (1%) tolerance on the output
voltage than would be guaranteed by using two 1% resistors
(possible 2% error) plus the tolerance of the bandgap on-chip
reference.
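To illustrate that tolerance stack (a sketch only: the 1.25V reference is typical for adjustable LDOs in this family, and the resistor values are my own picks to hit ~3.32V, not read off the schematic):

```python
VREF = 1.25   # typical adjustable-LDO reference voltage (check the datasheet)

def vout(r1, r2):
    """Adjustable-regulator output: Vout = Vref * (1 + R2/R1)."""
    return VREF * (1 + r2 / r1)

# Hypothetical values giving ~3.32 V (R2/R1 = 1.656):
print(round(vout(1000, 1656), 2))                 # 3.32

# Worst-case ratio error with two 1% resistors (~2% on the ratio),
# before even adding the on-chip reference tolerance:
print(round(vout(1000 * 0.99, 1656 * 1.01), 2))   # 3.36
```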


http://www.onsemi.com/pub_link/Collateral/MC33269-D.PDF

If the power supply is ON, then one leg of the diode OR gets
3.32 volts. If the power supply is OFF at the back, the CMOS
battery delivers around 3.0 volts. The operating frequency
of the RTC 32768Hz crystal might not be exactly the same,
for those two cases. If the power supply is weak, the voltage
could be anywhere between 3.32 and 3.0 volts. If the regulator
output drops below 3.0 volts, then the CMOS battery takes
over. Based on that example, I don't see a mechanism for the
timekeeping to be grossly affected.
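For scale, a crystal's frequency offset translates directly into seconds per day; a sketch (the 20 ppm figure is a typical 32.768 kHz tuning-fork crystal spec, my assumption rather than anything from the schematic):

```python
def error_seconds_per_day(ppm):
    """Clock error accumulated in one day for a given frequency offset."""
    return ppm * 86400 / 1e6

print(error_seconds_per_day(20))   # 1.728 s/day at 20 ppm
```

The small frequency shift between running at 3.32 V and 3.0 V would be a fraction of that, which fits the observation that the timekeeping isn't grossly affected.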


The in-circuit voltage of my CMOS battery is currently about 3.0V
vs. about 3.3V when new. I normally turn off power at the plug
for 8-10 hours a day, sometimes up to 20 hrs, but my clock is
still accurate to within a few seconds in between weekly online
synchronisations.


*******

The timekeeping while the OS is running is, AFAIK, done with clock
tick interrupt counting. That is traceable to a different crystal
than the one used by the RTC. The RTC is there mainly for timekeeping
when the OS is not running. The characteristics of the OS-maintained
time will be different than the RTC's, both due to the different
crystal used (the one on the clockgen), and due to the possibility
of problems with the clock tick interrupt servicing.
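The effect of missed tick interrupts is easy to model; a toy sketch (the tick rate and loss rate are made-up numbers, just to show how fast the error grows):

```python
TICK_HZ = 1000   # assumed OS tick rate; real systems have used 18.2, 100, 1000 Hz

def os_clock_loss(elapsed_s, lost_ticks_per_s):
    """Seconds a tick-counting clock falls behind if interrupts are missed."""
    return elapsed_s * lost_ticks_per_s / TICK_HZ

# Losing just 2 ticks per second at 1000 Hz loses ~3 minutes a day:
print(round(os_clock_loss(86400, 2), 1))   # 172.8
```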


This is where my limited knowledge of computer technology lets me
down. If OS timekeeping depends on the base CPU clock, won't
factors like spectrum spreading also cause inaccuracies in the
RTC?


http://www.maxim-ic.com/appnotes.cfm/an_pk/632

"Each PC contains two clocks. Although they are known by several
different names, we will call them the "hardware clock" and the
"software clock." The software clock runs when the PC is turned on
and stops when the PC is turned off. The hardware clock uses a
backup battery and continues to run even when the PC is turned off."

The CK409 clockgen on the Intel reference schematic is on PDF page 21.
The quartz crystal used is 14.318MHz (4 x color burst 3.579545MHz).
The OS-maintained clock will be traceable to the properties of that
14.318MHz quartz crystal. The RTC time, on the other hand, is
traceable to its 32768Hz quartz crystal.
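The classic PC timer chain hanging off that 14.318MHz crystal (well-documented PC architecture, not something read off this particular schematic) is a quick calculation:

```python
XTAL = 14.31818e6            # clockgen crystal: 4 x NTSC color burst (3.579545 MHz)
PIT_CLK = XTAL / 12          # 8254 PIT input clock, ~1.193182 MHz
DOS_TICK = PIT_CLK / 65536   # maximum divisor gives the old ~18.2 Hz tick

print(round(PIT_CLK))        # 1193182
print(round(DOS_TICK, 4))    # 18.2065
```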

I don't think the computer designers care too much about refining
either of these. My digital watch, for example, has a trimmer
capacitor inside, to adjust + or - on the frequency. Computers
don't have that.


That's probably the root of it all.