Q 2.26: It is claimed that two caesium clocks, if allowed to run for 100 years, free from any disturbance, may differ by only about 0.02 s. What does this imply for the accuracy of the standard caesium clock in measuring a time interval of 1 s?

Answers (1)

In terms of seconds, 100\ \text{years} = 100 \times 365 \times 24 \times 60 \times 60\ \text{s} \approx 3.15 \times 10^{9}\ \text{s}

Given: the difference between the two clocks after 100 years = 0.02 s

\therefore The time difference accumulated in 1 s = \frac{0.02}{3.15\times10^{9}}\ \text{s} = 6.35 \times 10^{-12}\ \text{s}

Since the error accumulated per second is about 6.35 \times 10^{-12}\ \text{s}, the accuracy is the reciprocal of this fractional error:

\therefore Accuracy in measuring a time interval of 1 s = \frac{1}{6.35 \times 10^{-12}} = 1.57 \times 10^{11} \approx 10^{11}

Hence the standard caesium clock is accurate to about 1 part in 10^{11}\ to\ 10^{12}.
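As a quick cross-check of the arithmetic, here is a short Python sketch (variable names are only illustrative) that reproduces the numbers above; small differences in the last digit are just rounding from the hand calculation, which used 3.15 \times 10^{9}\ \text{s}.

```python
# Cross-check of the caesium-clock accuracy estimate.

seconds_per_year = 365 * 24 * 60 * 60     # 365-day year, as used in the solution
total_seconds = 100 * seconds_per_year    # 100 years in seconds, ~3.15e9 s
drift = 0.02                              # maximum difference between the clocks, in s

error_per_second = drift / total_seconds  # error accumulated over a 1 s interval
accuracy = 1 / error_per_second           # the "1 part in ..." figure

print(f"100 years          = {total_seconds:.3e} s")
print(f"error per second   = {error_per_second:.3e} s")
print(f"accuracy ~ 1 part in {accuracy:.2e}")
```

Running this prints an error of about 6.3 \times 10^{-12}\ \text{s} per second and an accuracy of roughly 1.6 \times 10^{11}, consistent with the answer of 1 part in 10^{11}\ to\ 10^{12}.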
