Laser communications are more efficient than microwave links
1. We assume that the number of bits received is proportional to the number of photons
received, independent of wavelength. That is, whatever wavelength is used, the system
sensitivity can be optimised to reach the quantum shot-noise limit.
2. Photon energy is given by E = hc/lambda, so the number of photons carried per joule
of transmitted energy is proportional to the wavelength lambda.
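A quick numerical check of the photon-energy scaling (the two wavelengths are the microwave and laser values used later in the note; the constants are standard):

```python
# Photon energy E = h*c / wavelength.
h = 6.626e-34  # Planck constant, J*s
c = 2.998e8    # speed of light in vacuum, m/s

E_laser = h * c / 1e-6      # 1 micron (optical)
E_microwave = h * c / 1e-2  # 1 cm (microwave)

print(f"laser photon:     {E_laser:.3e} J")
print(f"microwave photon: {E_microwave:.3e} J")
# An optical photon carries 10^4 times the energy, so per joule you get
# 10^4 times fewer photons -- but the beam is far better collimated (step 3).
```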
3. At the receiver, the power distribution from a diffraction-limited transmitter
is described by Fraunhofer diffraction.
The first minimum in the diffraction pattern occurs at an angular radius of about
1.22 lambda / D_t (D_t being the transmitter aperture diameter), i.e. at a radius
r = 1.22 lambda L / D_t at a receiver distance L,
so for diffraction-limited transmission, the energy density at the receiver is
proportional to A_t / (lambda^2 L^2), where A_t is the transmitter area.
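To get a feel for the beam sizes involved, here is the first-minimum radius for both wavelengths (the 1 m aperture and Earth-Moon distance are illustrative values, not from the text):

```python
# Radius of the first diffraction minimum at the receiver: r = 1.22 * wavelength * L / D_t
D_t = 1.0    # transmitter aperture diameter, m (illustrative)
L = 3.84e8   # transmission distance, m (roughly Earth-Moon, illustrative)

r_laser = 1.22 * 1e-6 * L / D_t      # 1 micron
r_microwave = 1.22 * 1e-2 * L / D_t  # 1 cm

print(f"laser beam radius:     {r_laser:.0f} m")       # hundreds of metres
print(f"microwave beam radius: {r_microwave:.2e} m")   # thousands of kilometres
```

The laser beam is 10^4 times narrower in radius, so its energy density at the receiver is 10^8 times higher for the same transmitted power and aperture.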
4. Making the following assumptions:
- Efficiency is wavelength-independent (i.e. we're considering the long-term future,
where either technology is developed to the theoretical limit)
- Cost per bit relates only to the power required for transmission (i.e. cost of
system construction is neglected - clearly an important issue, but various technologies
could narrow the present gap between the costs of microwave and optical telescopes)
...then the above considerations can be combined to give a cost per bit proportional
to lambda L^2 / (A_t A_r), where A_t and A_r are the transmitter and receiver areas
and L is the transmission distance.
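The combined scaling can be checked numerically. This sketch multiplies the per-photon energy cost (step 2) by the inverse of the received power fraction (step 3); all constants of proportionality are dropped, and apertures and distance cancel in the ratio:

```python
def cost_per_bit(wavelength, L, A_t, A_r):
    """Relative cost per bit, in arbitrary units (constants dropped).

    Energy per photon scales as 1/wavelength; the fraction of transmitted
    power collected scales as A_t * A_r / (wavelength^2 * L^2).
    """
    energy_per_photon = 1.0 / wavelength
    received_fraction = A_t * A_r / (wavelength**2 * L**2)
    return energy_per_photon / received_fraction  # = wavelength * L^2 / (A_t * A_r)

# Same apertures and distance; only the wavelength changes.
ratio = cost_per_bit(1e-2, 1.0, 1.0, 1.0) / cost_per_bit(1e-6, 1.0, 1.0, 1.0)
print(ratio)  # microwave costs 10^4 times more per bit
```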
Things to note about this relation:
- Switching from microwave transmission (wavelength near 1 cm) to laser transmission
(wavelength near 1 micron) offers a potential cost saving of 4 orders of magnitude.
- Cost goes up as the square of distance
- Cost goes down in proportion to the area of the receiver
- Cost goes down in proportion to the area of the transmitter
- Cost goes up in proportion to the wavelength