The Astrogators' Guide to

Magnitude and Luminosity

Back to Contents

    Anyone who has seen the unobscured night sky knows that the dark of the cosmic abyss sparkles with lights of differing brightness. The Greek astronomer Hipparchos (ca. 190 BC – ca. 120 BC) began the process of classifying the brightnesses of the stars, ranking them on a six-point scale, from the brightest (first magnitude) to the almost invisible (sixth magnitude). Claudius Ptolemaeus refined the system in his masterwork, The Almagest. Of necessity, Hipparchos gave us a purely subjective system, but not an inaccurate one, as the following discussion indicates.

    In 1665 Christiaan Huygens (1629 Apr 14 – 1695 Jul 08) made the first measurement indicating the scale of interstellar distances. He measured the distance from Earth to Sirius, the brightest star in the European sky, and did so by comparing the star's brightness to the brightness of the sun. Viewing the sun through a small hole, he moved away from the hole until its brightness matched his memory of how bright Sirius appeared in the night sky. At that point, he knew, the ratio of the size of the hole to the distance between the hole and his eye equaled the ratio of the size of the sun on the sky to the distance at which the sun would have to lie from Earth to appear as bright as the hole, a distance he calculated as 22,664 AU. Had Huygens known that Sirius has 26.1 times the luminosity of the sun, instead of assuming that Sirius glows just as brightly as the sun does, he would have multiplied that figure by the square root of 26.1, because observed brightness falls off as the square of distance, and obtained roughly 116,000 AU. That still falls about a factor of five short of the modern figure of 543,900 AU obtained from parallax measurement.
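
    The inverse-square correction is easy to verify numerically. The short Python sketch below takes Huygens' 22,664 AU result and the 26.1-solar-luminosity figure quoted above; the variable names are only illustrative.

import math

huygens_distance_au = 22664.0        # Huygens' result, assuming Sirius matches the sun
sirius_luminosity_solar = 26.1       # Sirius' luminosity in solar units, as quoted above

# Observed brightness falls off as the square of distance, so a star 26.1 times
# more luminous must lie sqrt(26.1) times farther away to appear equally bright.
corrected_distance_au = huygens_distance_au * math.sqrt(sirius_luminosity_solar)
print(round(corrected_distance_au))  # roughly 116,000 AU, versus 543,900 AU from parallax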

    By the middle of the Nineteenth Century astronomers were measuring the amounts of light that they were receiving from stars and combining those data with the distances determined by parallax to calculate the stars' luminosities, the rates at which the stars emit energy. In 1856 the English astronomer Norman Robert Pogson (1829 Mar 23 – 1891 Jun 23) noticed that the five-magnitude range of the classical scale, from first to sixth magnitude, corresponds roughly to a brightness ratio of 100:1. Thus each increase of one magnitude corresponds to a decrease in brightness (and, when we compare absolute magnitudes, in luminosity) by a factor equal to the fifth root of one hundred, 2.511886, which astronomers call Pogson's ratio. In Pogson's system a star's absolute magnitude acts as a logarithm, to base 2.511886, of the star's luminosity, so for any two stars we have

M1 - M2 = -2.5 log10(L1/L2)        (Eq'n 1)

the minus sign encoding the fact that as magnitude increases, luminosity decreases.
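
    To make the arithmetic concrete, here is a minimal Python sketch of Pogson's ratio; the function name and the sample magnitudes are illustrative, not part of any standard library.

POGSON_RATIO = 100 ** 0.2            # the fifth root of 100, about 2.511886

def luminosity_ratio(magnitude_1, magnitude_2):
    # Eq'n 1 rearranged: L1/L2 = 2.511886^(M2 - M1)
    return POGSON_RATIO ** (magnitude_2 - magnitude_1)

print(luminosity_ratio(1.0, 6.0))    # a first-magnitude star outshines a sixth-magnitude star 100-fold
print(luminosity_ratio(2.0, 1.0))    # one magnitude fainter: about 0.398, i.e. 1/2.511886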

    We want to express the luminosity of a star as a multiple of the sun's luminosity rather than in actual kilowatts radiated into space, so we need only refer Equation 1 to the sun's absolute magnitude of +4.71 and its luminosity, Lsol. For a star of absolute magnitude M we have, then,

L/Lsol = 2.511886^(4.71 - M)        (Eq'n 2)

Alternatively we have

M = 4.71 - 2.5 log10(L/Lsol)        (Eq'n 3)
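
    The two conversions in Equations 2 and 3 are easy to check against each other. The Python sketch below assumes the +4.71 solar absolute magnitude quoted above; the example star is hypothetical.

import math

SUN_ABSOLUTE_MAGNITUDE = 4.71        # the value used in the text
POGSON_RATIO = 100 ** 0.2            # 2.511886...

def luminosity_in_solar_units(absolute_magnitude):
    # Eq'n 2: L/Lsol = 2.511886^(4.71 - M)
    return POGSON_RATIO ** (SUN_ABSOLUTE_MAGNITUDE - absolute_magnitude)

def absolute_magnitude_from_luminosity(luminosity_solar):
    # Eq'n 3: M = 4.71 - 2.5 log10(L/Lsol)
    return SUN_ABSOLUTE_MAGNITUDE - 2.5 * math.log10(luminosity_solar)

print(luminosity_in_solar_units(2.21))            # about 10 Lsol (2.5 magnitudes brighter than the sun)
print(absolute_magnitude_from_luminosity(10.0))   # about +2.21, recovering the input above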

    If we know the luminosity of a star and the temperature of its photosphere, we can use the Stefan-Boltzmann law to calculate the star's radius. We can take as a premise, to a very good approximation, that stars radiate as ideal blackbodies. Thus the rate of energy emission per square kilometer of photospheric surface stands in proportion to the fourth power of the photosphere's absolute temperature, and we calculate the star's luminosity by multiplying that rate by the star's surface area, which stands in square proportion to the star's radius. If we compare that calculation with the same calculation for the luminosity of the sun, the proportionality constants cancel each other and we get

L/Lsol = (R/Rsol)^2 (T/Tsol)^4        (Eq'n 4)

In that equation we measure both temperatures in kelvins and both radii in the same unit, so the ratios are pure numbers. Given a star's luminosity and temperature, we can then solve that equation for the star's radius and get

R/Rsol = (Tsol/T)^2 (L/Lsol)^(1/2)        (Eq'n 5)

In this way we can gain information about stars without actually measuring the relevant quantities directly.
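
    As a worked example of Equations 4 and 5, the Python sketch below assumes a solar effective temperature of about 5,778 kelvins, a figure the text does not quote, and applies the equations to a hypothetical star twice as hot as the sun.

import math

SUN_TEMPERATURE_K = 5778.0   # assumed solar effective temperature; not quoted in the text

def luminosity_from_radius_and_temperature(radius_solar, temperature_k):
    # Eq'n 4: L/Lsol = (R/Rsol)^2 (T/Tsol)^4
    return radius_solar ** 2 * (temperature_k / SUN_TEMPERATURE_K) ** 4

def radius_from_luminosity_and_temperature(luminosity_solar, temperature_k):
    # Eq'n 5: R/Rsol = (Tsol/T)^2 (L/Lsol)^(1/2)
    return (SUN_TEMPERATURE_K / temperature_k) ** 2 * math.sqrt(luminosity_solar)

# A hypothetical star twice as hot as the sun and 16 times as luminous
# turns out to have exactly one solar radius:
print(radius_from_luminosity_and_temperature(16.0, 2.0 * SUN_TEMPERATURE_K))   # 1.0
print(luminosity_from_radius_and_temperature(1.0, 2.0 * SUN_TEMPERATURE_K))    # 16.0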

Back to Contents