I have no idea what has to be done for this problem. Any help would be greatly appreciated.
A stone is dropped into a deep well and is heard to strike the bottom 10.2 s after release. The speed of
sound in air is 343 m/s.
a) How deep is the well?
b) What would be the percentage error in the depth if the time required for sound to reach the rim
of the well were ignored?
The time for the stone to fall to the bottom of the well, t1, plus the time for the sound to travel back to the top of the well, t2, must add up to 10.2 s:

t1 + t2 = 10.2 s
The distance the stone falls to the bottom of the well is given by:

d = (1/2) g t1^2 = (1/2)(9.8 m/s^2) t1^2
While the distance for the sound to return from the bottom of the well is:

d = (343 m/s) t2
Since these distances are the same:

(1/2)(9.8) t1^2 = 343 t2
Therefore, substituting t2 = 10.2 - t1:

(1/2)(9.8) t1^2 = 343 (10.2 - t1)
Solving the quadratic 4.9 t1^2 + 343 t1 - 3498.6 = 0 for t1 (taking the positive root):
t1 = 9.035 s, the time for the stone to reach the bottom of the well!
Solving for t2:
t2 = 10.2 - t1 = 1.165 s, the time for the sound to return to the top of the well!
The depth of the well (part a) is then:

d = (343 m/s)(1.165 s) ≈ 400 m

If you ignore the time for the sound to return, you would treat the whole 10.2 s as free-fall time:

d' = (1/2)(9.8)(10.2)^2 ≈ 510 m

so the percentage error (part b) would be:

(510 - 400)/400 × 100% ≈ 27%
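If you want to sanity-check the arithmetic, here is a short Python sketch that solves the same quadratic (assuming g = 9.8 m/s^2; the variable names are my own):

```python
import math

g = 9.8      # gravitational acceleration, m/s^2 (assumed)
v = 343.0    # speed of sound, m/s
T = 10.2     # total time from release to hearing the splash, s

# 0.5*g*t1^2 = v*(T - t1)  rearranges to  0.5*g*t1^2 + v*t1 - v*T = 0
a, b, c = 0.5 * g, v, -v * T
t1 = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)  # positive root of the quadratic
t2 = T - t1                                          # sound's travel time
depth = v * t2                                       # depth of the well, m

depth_naive = 0.5 * g * T**2                         # pretend all 10.2 s was free fall
error = (depth_naive - depth) / depth * 100          # percentage error

print(f"t1 = {t1:.3f} s, t2 = {t2:.3f} s")
print(f"depth = {depth:.1f} m, naive depth = {depth_naive:.1f} m")
print(f"percentage error = {error:.1f}%")
```

Running this reproduces the numbers above: t1 ≈ 9.03 s, depth ≈ 400 m, and an error of roughly 27% if the sound's travel time is ignored.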