I am really confused about this question. I don't understand what needs to be done. Thanks for any help.
A light source contains two wavelengths of light, λ1 = 450 nm and λ2 = 650 nm. It is incident on a diffraction grating with 1000 slits/mm, and the interference pattern is observed on a screen 1.5 m away from the slits. What is the separation between the 1st order bright fringes of the two wavelengths of light?
For a diffraction grating the locations of the bright fringes (antinodes) can be predicted from n*lambda = d*sin(theta), where n is the order of the antinode, lambda is the wavelength of the light, d is the distance between adjacent slits in the diffraction grating, and theta is the angle between the central antinode and the nth-order antinode.
For the 450 nm light these values will be:
n = 1, the first-order antinode (given); lambda1 = 450 nm = 4.50×10^-7 m; d = (1 mm)/1000 = 1.0×10^-6 m.
Solving for the angle theta:
sin(theta1) = n*lambda1/d = (4.50×10^-7 m)/(1.0×10^-6 m) = 0.45
And the angle theta will be:
theta1 = arcsin(0.45) ≈ 26.7°
Since the distance to the screen is given to be L = 1.5 m, the distance to the first antinode on the projection screen X1 will be:
X1 = L*tan(theta1)
Solving for X1:
X1 = (1.5 m)*tan(26.7°) ≈ 0.75 m
Repeating all of this for the second wavelength, lambda2 = 650 nm:
sin(theta2) = lambda2/d = 0.65, so theta2 = arcsin(0.65) ≈ 40.5°, and X2 = (1.5 m)*tan(40.5°) ≈ 1.28 m.
Therefore, the distance between these two antinodes on the screen will be:
dX = X2 - X1 = 1.28 m - 0.75 m = 0.53 m, or 53 cm.
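As a quick numerical check, the steps above can be sketched in Python (variable names are my own; the physical values are from the problem statement):

```python
import math

d = 1e-3 / 1000          # slit spacing: 1 mm / 1000 slits = 1.0e-6 m
L = 1.5                  # distance from grating to screen, in m
n = 1                    # first-order bright fringe

positions = []
for lam in (450e-9, 650e-9):         # the two wavelengths, in m
    theta = math.asin(n * lam / d)   # grating equation: sin(theta) = n*lambda/d
    positions.append(L * math.tan(theta))  # position on the screen: X = L*tan(theta)

dX = positions[1] - positions[0]
print(f"X1 = {positions[0]:.2f} m, X2 = {positions[1]:.2f} m, dX = {dX:.2f} m")
```

This reproduces the separation of about 0.53 m; note that keeping full precision gives X1 ≈ 0.756 m, which the hand calculation rounded to 0.75 m.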