Magnitude

Magnitude is an inverse logarithmic scale denoting the brightness of a celestial object. There are several types of magnitude scale in astronomy; the most commonly used are apparent magnitude, absolute magnitude and absolute bolometric magnitude.

Apparent Magnitude

Apparent magnitude is a measure of the brightness of an object in the night sky, as it would appear to an observer on Earth, under ideal viewing conditions, i.e. corrected for any attenuation by the Earth’s atmosphere.

Apparent magnitude is an inverse scale, i.e. bright objects have lower magnitudes than dim objects, with the brightest stars having negative magnitudes. For example, Sirius (the brightest star in the night sky) has an apparent magnitude of around -1.46, and the unaided eye can detect stars down to a magnitude of about +6, under near-perfect viewing conditions.

List of the 20 brightest stars

The list below shows the 20 brightest stars in the night sky (i.e. not including the Sun), with their constellations shown in brackets, followed by their visual magnitudes.

  1. Sirius (in Canis Major): −1.46 
  2. Canopus (in Carina): −0.74
  3. Rigil Kentaurus & Toliman (in Centaurus): −0.27
  4. Arcturus (in Boötes): −0.05
  5. Vega (in Lyra): 0.03 (variable)
  6. Capella (in Auriga): 0.08 (variable)
  7. Rigel (in Orion): 0.13 (variable)
  8. Procyon (in Canis Minor): 0.34
  9. Achernar (in Eridanus): 0.46 (variable)
  10. Betelgeuse (in Orion): 0.50 (variable)
  11. Hadar (in Centaurus): 0.61
  12. Altair (in Aquila): 0.76
  13. Acrux (in Crux): 0.76
  14. Aldebaran (in Taurus): 0.86 (variable)
  15. Antares (in Scorpius): 0.96 (variable)
  16. Spica (in Virgo): 0.97 (variable)
  17. Pollux (in Gemini): 1.14
  18. Fomalhaut (in Piscis Austrinus): 1.16
  19. Deneb (in Cygnus): 1.25 (variable)
  20. Mimosa (in Crux): 1.25 (variable)

Historically, the brightness of stars was measured by eye, and hence any formal mathematical definition had to fit roughly with the historically accepted values for the magnitudes of stars, which is why the scale might seem a little strange. The commonly used system, popularized by Ptolemy’s Almagest in the second century and believed to have originated with Hipparchus, divided the stars into six “magnitudes”, with the brightest stars in the sky being of the first magnitude and the dimmest visible stars of the sixth.

This corresponds to a logarithmic scale, since the human eye has a roughly logarithmic response to differences in brightness: each step of one magnitude corresponded, very roughly, to a factor-of-two change in apparent brightness, with the brighter stars having the lower magnitudes.

The definition of magnitude was formalised by Norman Robert Pogson in 1856, with first-magnitude stars defined as stars that are 100 times as bright as sixth-magnitude stars. The brightness of the star Vega, in the constellation of Lyra, marks the zero point of the scale, i.e. by definition, Vega has a magnitude of 0. (Although note that Vega is a variable star, so its brightness changes slightly over time). The definition can also be used to compare the brightness of objects other than stars, such as planets or galaxies.

This definition means that, if two stars differ in apparent magnitude by exactly one, the intensity of light reaching the observer on Earth from the brighter of the two stars is 100^(1/5) (approximately 2.512) times that of the fainter star; in other words, the intensity of the light increases by a factor of around 2.512 for each decrease of one magnitude.

Mathematically, the difference in apparent magnitude between two stars is, therefore, defined as:

Mag(star 1) − Mag(star 2) = −2.5 log₁₀(Brightness(star 1) / Brightness(star 2))
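
As a concrete illustration of this definition, the short Python sketch below (the function names are purely for the example) converts between a brightness ratio and a magnitude difference.

```python
import math

def magnitude_difference(brightness_1, brightness_2):
    """Apparent-magnitude difference (m1 - m2) for two measured brightnesses."""
    return -2.5 * math.log10(brightness_1 / brightness_2)

def brightness_ratio(mag_1, mag_2):
    """Brightness ratio (B1 / B2) implied by two apparent magnitudes."""
    return 10 ** (-0.4 * (mag_1 - mag_2))

# A difference of 5 magnitudes is, by definition, a factor of exactly 100 in brightness.
print(magnitude_difference(100.0, 1.0))  # -5.0: the 100x brighter star is 5 magnitudes lower
print(brightness_ratio(6.0, 1.0))        # 0.01: a 6th-magnitude star is 100x dimmer than a 1st

# One magnitude corresponds to a factor of 100**(1/5), approximately 2.512.
print(brightness_ratio(0.0, 1.0))        # ~2.512
```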

Note that apparent magnitude depends on the distance from the source to the observer and is, therefore, not a measure of the intrinsic brightness of an object. For example, the intensity of the light reaching the observer on Earth from a star that is 100 light-years (See Astronomical Distance Measurements) away will be only a quarter of the intensity of light from a similar star, with the same intrinsic brightness, that is only 50 light-years away. This is because light spreads out as it travels through space, so its intensity falls off as the inverse square of the distance from the source.
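
To make the arithmetic in this example concrete, the following sketch (with all quantities in arbitrary units) applies the inverse-square law and then the magnitude formula above to the two stars at 50 and 100 light-years.

```python
import math

def apparent_brightness(luminosity, distance):
    """Inverse-square law: intensity received at a given distance (arbitrary units)."""
    return luminosity / (4 * math.pi * distance ** 2)

# Two stars with the same intrinsic brightness, at 50 and 100 light-years.
near = apparent_brightness(1.0, 50.0)
far = apparent_brightness(1.0, 100.0)

print(far / near)                      # 0.25 -> a quarter of the intensity
print(-2.5 * math.log10(far / near))   # ~1.5 -> the more distant star appears about 1.5 magnitudes fainter
```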

Measurement

The magnitude of a celestial object is usually measured with an instrument called a bolometer. This measures the power of incident light, or of other wavelengths of electromagnetic radiation, usually using a temperature-dependent electrical resistor.

For the measurement to be meaningful, it is therefore necessary to specify over what wavelength range of the electromagnetic spectrum the bolometer was measuring.

In order for the measurement to correspond to visual magnitude, the light must first be passed through a filter allowing only light that can be perceived by the human eye to pass through.

The standard, widely used filters are those of the UBV system. The U-band filter is centred at around 350 nanometres (nm), in the near ultraviolet; the B-band filter is centred at around 435 nm, in the blue part of the visible spectrum; and the V-band filter is centred at around 555 nm, in the middle of the human visual range in daylight.

It is usually the V-band magnitude that is quoted, since this corresponds most closely to the visual magnitudes perceived by the human eye.
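
Tying this back to the definition above, in which Vega fixes the zero point of the scale, an apparent V magnitude can be obtained by comparing a measured V-band flux with the flux measured for Vega through the same filter. The sketch below is a minimal illustration; the flux values and function name are invented for the example.

```python
import math

def v_magnitude(flux, flux_vega):
    """Apparent V-band magnitude, given the object's measured V-band flux and the
    flux measured for Vega through the same filter (Vega defines magnitude 0)."""
    return -2.5 * math.log10(flux / flux_vega)

# Purely illustrative values: an object delivering 1% of Vega's V-band flux.
print(v_magnitude(0.01, 1.0))   # 5.0 -> five magnitudes fainter than Vega
```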

Absolute Magnitude

Absolute magnitude is defined as the apparent magnitude that a star, or other self-luminous object, would have if it were at a distance of 10 parsecs (See Astronomical Distance Measurements) from the Earth. This corresponds to a distance of about 32.6 light-years, or roughly 190 trillion miles.

This definition can, therefore, be used to compare the intrinsic brightness of stars or galaxies, since it does not depend on the object’s distance from Earth, unlike apparent magnitude.
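
Because the definition is simply the apparent magnitude the object would have at 10 parsecs, it can be combined with the inverse-square law above to convert between apparent and absolute magnitude. The Python sketch below uses the standard relation M = m − 5 log₁₀(d / 10 pc), which follows from that combination; the Sirius figures are approximate.

```python
import math

def absolute_magnitude(apparent_mag, distance_parsecs):
    """Absolute magnitude: the apparent magnitude the object would have at 10 parsecs.
    M = m - 5 * log10(d / 10), with d in parsecs (follows from the definition above
    together with the inverse-square law)."""
    return apparent_mag - 5 * math.log10(distance_parsecs / 10.0)

# Sirius: apparent magnitude -1.46 at roughly 2.64 parsecs (about 8.6 light-years).
print(absolute_magnitude(-1.46, 2.64))   # ~ +1.4 -> intrinsically far less striking than it appears
```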

For planets, or other objects in our Solar System that shine by reflecting light from the Sun, the absolute magnitude is instead defined as the apparent magnitude that the object would have if it were 1 astronomical unit (See Astronomical Distance Measurements) from both the Sun and the Earth.

Absolute Bolometric Magnitude

Many of the brightest stars output much of their radiation in the ultraviolet part of the spectrum. If only the output of the star in the visual wavelength range of the electromagnetic spectrum is taken into account, this might not give a fair indication of the star’s true luminosity.

The absolute bolometric magnitude is a measure of absolute magnitude over all wavelengths of electromagnetic radiation, and therefore gives a more accurate representation of the total energy output of the star.

The most luminous stars have an absolute bolometric magnitude of less than -12. For example, one of the most luminous known stars, R136a1 in the Large Magellanic Cloud, has an absolute bolometric magnitude of −12.58, corresponding to a luminosity of around 8.7 million times that of the Sun. However, at a distance of around 163,000 light-years from Earth, R136a1 has an apparent magnitude of only +12.23.
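
As a rough check of figures like these, an absolute bolometric magnitude can be converted into a luminosity relative to the Sun by applying the same magnitude definition, assuming the standard solar absolute bolometric magnitude of about +4.74. The sketch below is illustrative only; published values for R136a1 vary slightly.

```python
def luminosity_in_suns(absolute_bolometric_mag, sun_bolometric_mag=4.74):
    """Luminosity relative to the Sun implied by an absolute bolometric magnitude,
    assuming the standard solar value of about +4.74."""
    return 10 ** (-0.4 * (absolute_bolometric_mag - sun_bolometric_mag))

# R136a1, absolute bolometric magnitude about -12.58:
print(luminosity_in_suns(-12.58))   # ~8.5 million; the quoted ~8.7 million reflects
                                    # slightly different adopted input values
```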
