Antenna factor

In electromagnetics, the antenna factor is defined as the ratio of the incident electric field strength E to the voltage V (units: V or µV) induced across the terminals of an antenna. The voltage measured at the output terminals of an antenna is not the actual field intensity, because of the antenna's gain, aperture characteristics, and loading effects.[1]

For an electric field antenna, the field strength E is in units of V/m or µV/m and the resulting antenna factor AF is in units of 1/m:

    AF = E/V

If all quantities are expressed logarithmically in decibels instead of SI units, the above equation becomes

    AF (dB/m) = E (dBµV/m) - V (dBµV)
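As a numeric sketch of the two forms above (the field strength and terminal voltage here are illustrative assumptions, not values from the text), the linear and logarithmic antenna factors are consistent:

```python
import math

# Illustrative values (assumptions, not from the text):
E = 500e-6   # incident field strength in V/m (500 µV/m)
V = 100e-6   # induced terminal voltage in V (100 µV)

# Linear antenna factor, units 1/m
af_linear = E / V                      # 5.0 1/m

# Logarithmic form: AF (dB/m) = E (dBµV/m) - V (dBµV)
E_dbuv = 20 * math.log10(E / 1e-6)     # field strength in dBµV/m
V_dbuv = 20 * math.log10(V / 1e-6)     # voltage in dBµV
af_db = E_dbuv - V_dbuv                # antenna factor in dB/m

# The two forms agree: AF (dB/m) = 20*log10(AF)
assert abs(af_db - 20 * math.log10(af_linear)) < 1e-9
```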

For a magnetic field antenna, the field strength H is in units of A/m and the resulting antenna factor AF = H/V is in units of A/(V·m). For the relationship between the electric and magnetic fields, see the impedance of free space.

For a 50 Ω load, knowing that PD Ae = Pr = V²/R and E² = η₀ PD ≈ 377 PD (E and V noted here are the RMS values averaged over time), the antenna factor is developed as:

    AF = E/V = √(η₀ / (R·Ae)) = √(4π·η₀ / (R·G)) / λ ≈ 9.73 / (λ√G)

Where

  • Ae = λ²G/(4π) is the antenna effective aperture
  • PD is the power density in watts per unit area
  • Pr is the power delivered into the load resistance presented by the receiver (normally 50 Ω)
  • G is the antenna gain (linear, not in dB)
  • μ₀ is the magnetic constant
  • ε₀ is the electric constant, with η₀ = √(μ₀/ε₀) ≈ 377 Ω the impedance of free space
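The derivation above can be sketched numerically; in particular, for R = 50 Ω the coefficient √(4π·η₀/R) works out to the familiar shorthand AF ≈ 9.73/(λ√G). The function name and example frequency below are illustrative assumptions:

```python
import math

R = 50.0            # receiver load resistance, ohms
eta0 = 376.73       # impedance of free space, sqrt(mu0/eps0), ohms

def antenna_factor(freq_hz, gain_linear):
    """AF = sqrt(4*pi*eta0 / (R*G)) / lambda, in 1/m (sketch of the derivation above)."""
    lam = 3e8 / freq_hz                 # wavelength in metres
    return math.sqrt(4 * math.pi * eta0 / (R * gain_linear)) / lam

# Coefficient of the 50-ohm shorthand AF ≈ 9.73 / (lambda * sqrt(G)):
coeff = math.sqrt(4 * math.pi * eta0 / R)
print(round(coeff, 2))   # ≈ 9.73
```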

For antennas that are not defined by a physical area, such as monopoles and dipoles consisting of thin rod conductors, the effective length is used instead to relate E and V.
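As a sketch of the effective-length relation: the open-circuit terminal voltage is V = E·l_eff, and for a thin half-wave dipole the standard effective length is λ/π. The frequency and field strength below are illustrative assumptions, not values from the text:

```python
import math

freq = 100e6                 # operating frequency, 100 MHz (assumed for illustration)
lam = 3e8 / freq             # wavelength: 3 m
l_eff = lam / math.pi        # effective length of a thin half-wave dipole, ~0.955 m

E = 1e-3                     # incident field strength, V/m (assumed)
V_oc = E * l_eff             # open-circuit terminal voltage: V = E * l_eff

# The E-to-V ratio for the unloaded antenna is then 1 / l_eff, in 1/m
ratio = E / V_oc
assert abs(ratio - 1 / l_eff) < 1e-12
```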

Notes

  1. Electronic Warfare and Radar Systems - Engineering Handbook (4th ed.). US Naval Air Warfare Center Weapons Division. 2013. p. 192.
