Is Using a Shorter Timer Period Less Accurate Than Using a Longer One?

Just as the title says, you may sometimes wonder about this question. Someone has tested it and gathered some results you may find interesting.

He is using a dsPIC33FJ128GP804 and trying to record data at 200 Hz as accurately as possible. The device also has a GPS, and he noticed that the timer drifts relative to the GPS by about 333 ms/hour. He thinks it should be possible to do better than that. His clock source is a 40 MHz crystal with a tolerance of 30 ppm, stability of 50 ppm, and aging of 1 ppm/year.
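
As a rough sanity check (the numbers come straight from the post, and the 81 ppm budget is just the quoted tolerance, stability, and aging figures added together in the worst case), 333 ms/hour works out to roughly 92 ppm, which is already more than the crystal alone should account for:

#include <stdio.h>

/* Back-of-envelope check: convert the observed drift into parts per million
 * and compare it with the crystal's worst-case error budget. */
int main(void)
{
    const double drift_s     = 0.333;        /* observed drift per hour (s)   */
    const double interval_s  = 3600.0;       /* one hour (s)                  */
    const double crystal_ppm = 30 + 50 + 1;  /* tolerance + stability + aging */

    double observed_ppm = drift_s / interval_s * 1e6;   /* about 92.5 ppm */

    printf("observed drift : %.1f ppm\n", observed_ppm);
    printf("crystal budget : %.1f ppm\n", crystal_ppm);
    /* 92.5 ppm > 81 ppm, so the crystal spec alone cannot explain the drift;
     * something systematic (e.g. a rounded timer period) is likely adding to it. */
    return 0;
}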

The PIC's oscillator runs at 72 MHz. He cannot change this; it is required to run the UARTs at 3 Mbit/s.
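
As a hedged aside on why the clock is pinned there: on the dsPIC33F family the instruction clock is Fcy = Fosc/2, and with the UART in high-speed mode (BRGH = 1) the baud rate is Fcy / (4 * (UxBRG + 1)), so a 72 MHz Fosc gives Fcy = 36 MHz and an exact 3 Mbit/s. The BRG value below is illustrative, not the poster's actual register setting:

#include <stdio.h>

/* Sketch of the baud-rate constraint, assuming dsPIC33F conventions:
 *   Fcy  = Fosc / 2
 *   baud = Fcy / (4 * (UxBRG + 1))   when BRGH = 1 (high-speed mode)
 * The BRG value of 2 is hypothetical, chosen to hit 3 Mbit/s exactly. */
int main(void)
{
    const unsigned long fosc = 72000000UL;   /* 72 MHz oscillator          */
    const unsigned long fcy  = fosc / 2;     /* 36 MHz instruction clock   */
    const unsigned long brg  = 2;            /* hypothetical UxBRG setting */

    unsigned long baud = fcy / (4UL * (brg + 1));
    printf("baud = %lu bit/s\n", baud);      /* prints 3000000 exactly     */
    return 0;
}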

While testing to figure out why there is so much drift, he ran an experiment with two timers, one at 200 Hz and one at 5 Hz. In the interrupt handlers, the 200 Hz timer increments a long int by 1 and the 5 Hz timer increments a long int by 40, so the two long ints should stay equal. However, after 45 minutes the values differ by 6 counts (30 ms).
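
Since both timers run from the same oscillator, crystal error cancels out of this comparison, so whatever produces the gap is systematic. A quick back-of-envelope using the numbers above (the rounding explanation in the comments is a guess, not a diagnosis):

#include <stdio.h>

/* Two timers clocked from the same source should never diverge, so any gap
 * between them is a configuration or software effect, not crystal error.
 * This converts the observed 6-count gap into a relative rate error. */
int main(void)
{
    const double fast_hz = 200.0;            /* fast timer rate            */
    const double run_s   = 45.0 * 60.0;      /* 45 minutes in seconds      */
    const double gap     = 6.0;              /* counts of disagreement     */

    double total_ticks = fast_hz * run_s;    /* 540000 expected ticks      */
    double rel_ppm     = gap / total_ticks * 1e6;

    printf("relative error between timers: %.1f ppm\n", rel_ppm);  /* ~11 ppm */

    /* One common cause of a steady gap like this is a period register that
     * had to be rounded because timer_clock / frequency is not an integer;
     * checking that the division is exact for both timers (with their actual
     * prescalers) is a quick way to rule this in or out. */
    return 0;
}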

Why is there a difference between the two values? Is there anything that can be done to increase timer accuracy other than changing the crystal? Does a longer period increase accuracy, and if so, could a slower timer be used to periodically reset the faster one? Would clocking the timer from the external oscillator instead of the instruction clock make a difference?
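
On the idea of letting the slower timer reset the faster one: a longer period does not make the oscillator itself any more accurate, but if the slower timer's period register divides the timer clock exactly, it can serve as the bookkeeping reference so that rounding in the fast timer cannot accumulate. Below is a minimal sketch of that idea, assuming Timer2/Timer3 and XC16-style ISRs (none of which is known to match the poster's code); it does nothing about the crystal's own ppm error relative to GPS.

#include <xc.h>

/* Sketch only: assumes Timer2 generates the 200 Hz tick and Timer3 the 5 Hz
 * tick, both from the same clock, and that the 5 Hz period register divides
 * the timer clock exactly.  Names follow XC16 conventions for dsPIC33F but
 * are illustrative, not the poster's configuration. */

volatile long fast_ticks = 0;   /* incremented at 200 Hz */

void __attribute__((__interrupt__, no_auto_psv)) _T2Interrupt(void)
{
    fast_ticks++;               /* free-running 200 Hz count            */
    IFS0bits.T2IF = 0;          /* clear Timer2 interrupt flag          */
}

void __attribute__((__interrupt__, no_auto_psv)) _T3Interrupt(void)
{
    static long ref_ticks = 0;  /* what fast_ticks *should* be          */
    ref_ticks += 40;            /* 200 Hz / 5 Hz = 40 ticks per period  */

    /* Re-anchor the fast count to the slower reference so any rounding
     * error in the 200 Hz period register cannot accumulate.  This only
     * helps if the 5 Hz timer itself has no rounding error, and it does
     * not remove the crystal's own ppm error. */
    fast_ticks = ref_ticks;

    IFS0bits.T3IF = 0;          /* clear Timer3 interrupt flag          */
}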

He does use the GPS time to correct the timing in post-processing, but GPS isn't always available, so he doesn't want to have to rely on it.
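
For completeness, the usual post-processing correction is a linear rescale of the local tick count between two GPS time anchors. A minimal sketch with made-up names and numbers:

#include <stdio.h>

/* Linearly rescale a local timer tick count onto GPS time using two anchor
 * points where both the tick count and the GPS time were captured.  Names
 * and numbers are illustrative, not from the poster's system. */
static double ticks_to_gps_time(long ticks,
                                long tick_a, double gps_a,   /* first anchor  */
                                long tick_b, double gps_b)   /* second anchor */
{
    double rate = (gps_b - gps_a) / (double)(tick_b - tick_a); /* s per tick */
    return gps_a + (ticks - tick_a) * rate;
}

int main(void)
{
    /* Example: nominal 200 Hz ticks that actually drifted ~92 ppm over an hour. */
    long   tick_a = 0,   tick_b = 720000;       /* one nominal hour of ticks */
    double gps_a  = 0.0, gps_b  = 3600.333;     /* GPS says it took longer   */

    printf("t = %.4f s\n", ticks_to_gps_time(360000, tick_a, gps_a, tick_b, gps_b));
    /* prints ~1800.17 s: the midpoint timestamp corrected for the measured drift */
    return 0;
}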

If you have more questions, feel free to email wu@songjicn.com.