
I am really confused about this question. I don't understand what needs to be done. Thanks for any help.

A light source contains two wavelengths of light, λ1 = 450 nm and λ2 = 650 nm. It is incident on a diffraction grating with 1000 slits/mm, and the interference pattern is observed on a screen 1.5 m away from the grating. What is the separation between the first-order bright fringes of the two wavelengths of light?

For a diffraction grating, the locations of the antinodes (bright fringes) can be predicted from n*lambda = d*sin(theta), where n is the order of the antinode, lambda is the wavelength of the light, d is the distance between adjacent slits in the grating, and theta is the angle between the central antinode and the nth-order antinode.

For the 450nm light these values will be:

n = 1, the first-order antinode (given)

lambda=450nm=4.5x10^-7 m

d = 1 mm / 1000 = 1x10^-3 m / 10^3 = 1x10^-6 m
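For example, the slit spacing follows directly from the line density, as a quick check in Python (variable names are mine):

```python
# 1000 slits per mm -> spacing is 1 mm (1e-3 m) divided by 1000
d = 1e-3 / 1000
print(d)  # 1e-06 (metres)
```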

Solving for the angle theta:

sin(theta)=n*lambda/d=1*4.5x10^-7/1x10^-6=0.45

And the angle theta will be:

theta=sin^-1(0.45)=26.7deg

Since the distance to the screen is given to be L=1.5m, the distance to the first antinode on the projection screen X1 will be:

tan(theta)=X1/L

Solving for X1:

X1=L*tan(theta)=1.5*tan(26.7)=0.754m
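The two steps above (solving for the angle, then projecting onto the screen) can be verified numerically; here is a short Python sketch (variable names are mine, and the result differs from 0.754 m only because the angle is not rounded):

```python
import math

n = 1            # first-order antinode
lam1 = 450e-9    # wavelength, m
d = 1e-6         # slit spacing, m
L = 1.5          # distance to the screen, m

theta1 = math.asin(n * lam1 / d)   # angle to the first-order antinode
X1 = L * math.tan(theta1)          # position on the screen

print(math.degrees(theta1))  # ~26.7 degrees
print(X1)                    # ~0.756 m
```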

Repeating all of this for the second wavelength lambda2:

sin(theta2)=n*lambda2/d=1*6.5x10^-7/1x10^-6=0.65

theta2=sin^-1(0.65)=40.5deg

X2=L*tan(theta2)=1.5*tan(40.5)=1.28m

Therefore, the distance between these two antinodes on the screen will be:

dX=1.28-0.75=0.53m or 53cm
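The whole calculation for both wavelengths can be put together in a few lines of Python (a sketch with my own function name; the exact values differ slightly from the rounded intermediate values above, but the separation still comes out to 0.53 m):

```python
import math

def fringe_position(lam, n=1, d=1e-6, L=1.5):
    """Screen position of the nth-order antinode for wavelength lam (metres)."""
    theta = math.asin(n * lam / d)
    return L * math.tan(theta)

X1 = fringe_position(450e-9)   # ~0.756 m
X2 = fringe_position(650e-9)   # ~1.284 m
dX = X2 - X1
print(round(dX, 2))            # 0.53 (metres), i.e. 53 cm
```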

Physics

Answers by Expert:

I am teaching or have taught AP physics B and C [calculus based mechanics & electricity and magnetism] as well as Lab Physics for college bound students. I have a BS in Physics from the University of Pittsburgh and a Master of Arts in Teaching from same. I have been teaching physics for 34 years. I am constantly updating my skills and have a particular interest in modern physics topics.