Wireless Communications: Microwave Power Transmission

Question
My question is a little weird, but please answer it anyway. I'm doing the math myself, but I would like to know if I am on the right path, thank you. Suppose we transmit 75 kilowatts of microwave energy from 20 miles (105,600 feet) above the Earth's surface using a transmitting parabolic antenna with a diameter of 3 feet, ignoring other variables like energy loss (though you may include them if you like). What area in square feet would you need to receive 25 kilowatts on the ground? If it helps, you may also change the transmitting antenna to a smaller or larger size. The idea of the question is for me to get a sense of how to solve this on my own. Please provide an example.

Answer
Hello Emmanuel,

This is an interesting question, one that has been posed for decades. For example, in the 1970s NASA studied launching a large array of solar panels into synchronous orbit and beaming the collected energy down via microwave transmission. The transmitter was to be a large array of magnetron tubes, each emitting about 1,000 watts; with tens of thousands of elements in the array, the radio beam would be well focused. The receiving array, an antenna covering tens of square kilometers, would have captured most of the transmitted energy and provided megawatts on the ground. That system was never launched.

Basically, to capture 1/3 of the transmitted energy you need to intercept 1/3 of the beam. The first step is to calculate the antenna beamwidth, and to do that you need to know the operating frequency, the antenna area, and the antenna efficiency. For your example of a 3-foot parabolic dish, estimate the illumination efficiency (typically about 60-70%); that gives you the beamwidth and the effective transmitted power.

The beam spreads with distance, so you calculate the size of the "spot" at 20 miles and intercept at least 1/3 of that area with your receiving antenna. The "at least" accounts for receiver antenna efficiency, which includes losses in the array as well as losses in converting the microwave energy into AC or DC power.
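
In case it helps to see that recipe as something you can plug numbers into, here is a minimal sketch in Python (my own illustration, not part of the answer above). It assumes the common rule of thumb that a parabolic dish's half-power beamwidth is roughly 70 x wavelength / diameter in degrees; the function names are placeholders.

import math

def half_power_beamwidth_deg(freq_hz, dish_diameter_m):
    # Rule-of-thumb HPBW for a parabolic dish: ~70 * wavelength / diameter, in degrees.
    wavelength_m = 3.0e8 / freq_hz
    return 70.0 * wavelength_m / dish_diameter_m

def half_power_spot_diameter_m(freq_hz, dish_diameter_m, range_m):
    # Diameter of the half-power "spot" the beam paints at the given range.
    hpbw_rad = math.radians(half_power_beamwidth_deg(freq_hz, dish_diameter_m))
    return 2.0 * range_m * math.tan(hpbw_rad / 2.0)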

Here's a quick worked example. Assume the operating frequency is 2.4 GHz and the antenna diameter is 1 meter (about 3 feet). With an illumination efficiency of 65%, the antenna gain on boresight is about 26 dB and the half-power (full) beamwidth is about 8 degrees. At 20 miles that beamwidth covers a circle with a radius of about 1.4 miles (20 x sin(4 degrees) is about 1.4), which means half of the transmitted power passes through a circle about 2.8 miles across at that distance. If your receiving antenna is a parabolic dish 2.8 miles in diameter with an illumination efficiency of 65%, about 1/3 of the transmitted power will be focused at its feed point.
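
As a rough numerical check of those figures, here is a sketch under the same assumptions (1 m dish, 2.4 GHz, 65% illumination efficiency, 20-mile path), again using the 70 x wavelength / diameter rule of thumb rather than anything stated in the answer itself.

import math

freq_hz  = 2.4e9                    # 2.4 GHz
dish_d_m = 1.0                      # ~3 ft transmitting dish
eff      = 0.65                     # assumed illumination efficiency
range_m  = 20 * 1609.34             # 20 miles in meters

wavelength_m = 3.0e8 / freq_hz               # ~0.125 m
gain = eff * (math.pi * dish_d_m / wavelength_m) ** 2
gain_db = 10 * math.log10(gain)              # ~26 dB on boresight

hpbw_deg = 70.0 * wavelength_m / dish_d_m    # ~8.7 degrees full beamwidth
spot_radius_miles = range_m * math.tan(math.radians(hpbw_deg / 2)) / 1609.34
# ~1.5 miles, close to the ~1.4 mile estimate above

# Half the radiated power falls inside that circle; a receiving dish that size
# with ~65% efficiency delivers about 0.5 * 0.65, i.e. roughly 1/3 of the power.
captured_fraction = 0.5 * eff
print(round(gain_db, 1), round(hpbw_deg, 1), round(spot_radius_miles, 2), captured_fraction)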

Clearly that is a large antenna. If you increase the size of the transmitting antenna (NASA's proposed orbiting solar transmitter array was miles across), you can reduce the size of the receiving antenna. Likewise, you can increase the operating frequency, since the beamwidth shrinks in proportion to wavelength divided by antenna diameter.
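
To get a feel for that trade-off, here is a small sketch with purely illustrative transmitter sizes and frequencies, showing how the half-power spot diameter at 20 miles shrinks as the dish grows or the frequency rises.

import math

def spot_diameter_miles(freq_hz, tx_dish_m, range_miles=20.0):
    # Half-power spot diameter at the given range, using HPBW ~ 70 * wavelength / diameter.
    wavelength_m = 3.0e8 / freq_hz
    hpbw_deg = 70.0 * wavelength_m / tx_dish_m
    range_m = range_miles * 1609.34
    return 2.0 * range_m * math.tan(math.radians(hpbw_deg / 2.0)) / 1609.34

for tx_dish_m in (1, 3, 10):
    for freq_ghz in (2.4, 5.8, 24.0):
        d = spot_diameter_miles(freq_ghz * 1e9, tx_dish_m)
        print(f"{tx_dish_m} m dish at {freq_ghz} GHz -> spot about {d:.2f} miles across")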

Regards,
Jerry  

Jerry Hinshaw

Expertise

RF and microwave engineering, hardware and systems applications. Avionics, satellite, and terrestrial communications. Space communications and navigation, geo-location services via satellite and ground-based wireless systems. Systems integration of RF and wireless services. Electric Utility RF Smart Meter (SmartMeter) and Smart Grid wireless devices and networks. Utility Distribution Automation.

Experience

30 years experience in microwave and RF systems engineering, radio hardware design and development.

Publications
About 20 articles and 2 patents.

Education/Credentials
BSEE MSEE Communications Systems
