Laser communications are more efficient than microwave links
1. We assume that the number of bits received is proportional to the number of
photons received, independent of wavelength. This amounts to assuming that,
whatever wavelength is used, the system sensitivity can be optimised to reach
the quantum shot-noise limit.
2. Photon energy is given by

   E = hf = hc/λ,

so the energy carried by each photon is inversely proportional to the wavelength.
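As a quick numerical check of the photon-energy relation (a sketch; the 1 micron and 1 cm wavelengths are the representative values used later in this note):

```python
# Photon energy E = h*c/lambda: shorter wavelengths mean more energy per photon.
h = 6.626e-34  # Planck constant, J*s
c = 2.998e8    # speed of light, m/s

def photon_energy(wavelength_m):
    """Energy of a single photon of the given wavelength, in joules."""
    return h * c / wavelength_m

laser = photon_energy(1e-6)      # ~1 micron (optical)
microwave = photon_energy(1e-2)  # ~1 cm (microwave)
print(f"optical photon:   {laser:.3e} J")
print(f"microwave photon: {microwave:.3e} J")
# Each optical photon carries ~10^4 times the energy of a microwave photon,
# which by itself would favour microwaves; the diffraction argument below
# more than reverses this.
print(f"ratio: {laser / microwave:.0f}")
```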
3. At the receiver, the power distribution from a diffraction-limited
transmitter is described by Fraunhofer diffraction.
The first minimum of the diffraction pattern occurs at a radius of

   r = 1.22 λL / D_t,

where L is the transmission distance and D_t is the transmitter aperture
diameter, so for diffraction-limited transmission the energy density at the
receiver is proportional to

   D_t^2 / (λ^2 L^2).
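The spot-size scaling can be sketched numerically. The 10 m aperture and 4-light-year range below are illustrative assumptions of mine, not values from the note:

```python
def spot_radius(wavelength_m, distance_m, aperture_m):
    """Radius of the first Airy minimum at the receiver: r = 1.22*lambda*L/D."""
    return 1.22 * wavelength_m * distance_m / aperture_m

L = 4.0 * 9.461e15   # ~4 light years in metres (illustrative assumption)
D = 10.0             # 10 m transmitter aperture (illustrative assumption)

r_laser = spot_radius(1e-6, L, D)  # 1 micron
r_micro = spot_radius(1e-2, L, D)  # 1 cm
print(f"laser spot radius:     {r_laser:.3e} m")
print(f"microwave spot radius: {r_micro:.3e} m")
# Energy density scales as 1/r^2, i.e. as D^2/(lambda^2 * L^2), so the
# tighter laser beam delivers a far higher flux at the receiver.
print(f"energy-density advantage of laser: {(r_micro / r_laser)**2:.0f}")
```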
4. Making the following assumptions:
- Efficiency is wavelength-independent (i.e. we're considering the long-term
future, where either technology is developed to the theoretical limit)
- Cost per bit relates only to the power required for transmission (i.e. the
cost of system construction is neglected; clearly an important issue, but
various technologies could narrow the present gap between the costs of
microwave and optical telescopes)
...then the above considerations can be combined to give a cost per bit
proportional to

   λ L^2 / (A_t A_r),

where A_t and A_r are the areas of the transmitter and receiver apertures.
Things to note about this relation:
- Cost goes up as the square of the distance
- Cost goes down in proportion to the area of the receiver
- Cost goes down in proportion to the area of the transmitter
- Cost goes up in proportion to the wavelength
Switching from microwave transmission (wavelength near 1 cm) to laser
transmission (wavelength near 1 micron) therefore offers a potential cost
saving of four orders of magnitude.
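The final comparison can be sketched as code. With equal distances and equal aperture areas for both systems, everything but the wavelength cancels; the function name and signature are my own illustration, not from the note:

```python
def relative_cost_per_bit(wavelength_m, distance_m, area_tx_m2, area_rx_m2):
    """Cost per bit up to a constant factor: lambda * L^2 / (A_t * A_r)."""
    return wavelength_m * distance_m**2 / (area_tx_m2 * area_rx_m2)

# Same distance and same aperture areas for both systems: only lambda differs.
L, A_t, A_r = 1.0, 1.0, 1.0
saving = (relative_cost_per_bit(1e-2, L, A_t, A_r)
          / relative_cost_per_bit(1e-6, L, A_t, A_r))
print(f"microwave/laser cost ratio: {saving:.0e}")  # 1e+04 -> four orders of magnitude
```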
