Astronomy/measurement of cosmic microwave background radiation
When cosmic microwave background radiation is measured, I think the radiometer should also simultaneously detect radiation from stars. Does radiation from stars cause an error in the CMB radiation measurement? Is it necessary to "subtract out" the radiation from stars, and if so, how do you do it?
ANSWER: Hi Robert,
This is a great question (and sorry for the delay in replying; I've been travelling). Experiments that measure the CMB at microwave frequencies (Planck, WMAP, COBE, etc.) do indeed also detect emission from stars in the Milky Way. Most of that stellar emission appears on the maps as a band across the middle.
An example of this can be seen here:
The stellar emission from the Milky Way can be seen as the dark red band across the middle of the 2nd (middle) image.
Since these probes observe at multiple frequencies, they can subtract out the stellar contamination: each contaminating physical source has its own characteristic frequency dependence (spectrum). It is similar to forensics: if you have two sets of fingerprints or DNA, and you know one set belongs to a specific person, you can remove that known set and study the one set you don't know.
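The multi-frequency idea can be sketched numerically. This is a toy illustration, not any experiment's actual pipeline: assume (hypothetically) that in suitable units the CMB signal is the same at every frequency, while one foreground component scales by a known factor at each frequency. Maps at two frequencies then give a solvable 2x2 linear system per pixel:

```python
import numpy as np

# Toy two-frequency component separation (hypothetical numbers).
# Per pixel:  map1 = cmb + f1 * fg,   map2 = cmb + f2 * fg,
# where f1, f2 are the (assumed known) foreground frequency scalings.

rng = np.random.default_rng(0)
npix = 1000

cmb_true = rng.normal(0.0, 1.0, npix)   # simulated CMB fluctuations
fg_true = rng.normal(0.0, 5.0, npix)    # simulated foreground signal

f1, f2 = 3.0, 1.0                        # assumed foreground scalings at the two bands
map1 = cmb_true + f1 * fg_true
map2 = cmb_true + f2 * fg_true

# Differencing the maps cancels the (frequency-flat) CMB, recovering the
# foreground; subtracting the scaled foreground then recovers the CMB.
fg_est = (map1 - map2) / (f1 - f2)
cmb_est = map1 - f1 * fg_est

print(np.allclose(cmb_est, cmb_true))   # → True (noiseless sketch)
```

Real analyses observe in many more bands and must handle noise and several foreground components at once, but the underlying idea is this kind of linear separation using known spectral shapes.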
You can also measure the CMB away from the plane of the Milky Way (the band in the middle), where the field of stars is far less dense and there is therefore less contamination.
In short: yes, the stars do contaminate the measurement, but we can subtract them out. I hope this helps.
---------- FOLLOW-UP ----------
QUESTION: Brad, Thanks,
I am a retired microwave engineer and I am interested in a few details.
Presumably the stars emit as blackbodies but also emit a few spectral lines?
The radiation and lines get redshifted too? (Someone once told me that the night-time sky would be bright from starlight but for the redshift due to expansion.)
You determine temperature from radiation intensity (watts/Hz/steradian)?
How do you determine just how much signal to subtract out at a given frequency? Just until the (redshifted?) lines disappear? I guess the radiation from the stars would have a different spectrum because they are so much hotter.
Thanks for your patience.
No worries at all.
Stars do emit both as a black body and in individual spectral lines. In the microwave regime, though, the experiments making CMB measurements usually cannot resolve the individual spectral lines, so they are only measuring the black-body spectrum. We understand the various sources of stellar black-body contamination from measuring the temperatures of stars in the optical. This is measured as a flux density, in erg/s/cm^2/Hz (yes, we use weird units in astronomy, haha). As you say, it really does come down to the temperature difference between stars and the CMB: even the cooler stars are still thousands of kelvin, while the CMB is ~2.7 K. This means stars have a very distinguishable black-body when compared to the CMB black-body.
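A quick numerical illustration of that temperature contrast (a sketch, not any experiment's calibration): evaluating the Planck black-body function at a typical CMB observing frequency shows how much brighter a cool star's spectrum is, per unit solid angle, than the CMB's.

```python
import math

# Planck spectral radiance B_nu(T), SI units: W / m^2 / Hz / sr.
h = 6.626e-34   # Planck constant, J s
k = 1.381e-23   # Boltzmann constant, J / K
c = 2.998e8     # speed of light, m / s

def planck(nu, T):
    """Black-body spectral radiance at frequency nu (Hz) and temperature T (K)."""
    return (2.0 * h * nu**3 / c**2) / math.expm1(h * nu / (k * T))

nu = 100e9                       # 100 GHz, a typical CMB observing band
b_star = planck(nu, 3000.0)      # a cool star: thousands of kelvin
b_cmb = planck(nu, 2.725)        # the CMB monopole temperature

print(b_star / b_cmb)            # stellar black-body is orders of magnitude brighter
```

Because the two black-body curves differ so strongly in both amplitude and shape across the observing bands, the stellar contribution is straightforward to model and remove.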
Since the stellar black-body is well understood, we can then subtract it out. If you want to look at an example, you can look at the design paper of the WMAP satellite:
specifically, Page 8, Figure 5, top image.
While light from distant galaxies does get redshifted, here we are looking at contamination from within the Milky Way itself, so there is no redshift correction we need to make.
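A one-line estimate shows why Galactic redshift is negligible (round-number assumption: stars in the Milky Way move at a few hundred km/s relative to us, so their Doppler shift is roughly v/c):

```python
# Rough Doppler estimate for Galactic stars (non-relativistic, v << c).
c = 2.998e5          # speed of light, km/s
v_star = 300.0       # typical Galactic orbital speed, km/s (assumed round number)

z_galactic = v_star / c
print(z_galactic)    # ~1e-3: tiny next to the CMB's cosmological z ~ 1100
```

So shifts within the Galaxy are of order one part in a thousand, far too small to matter for separating stellar emission from the CMB.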