When Hipparchus first invented his magnitude scale, he intended each grade of magnitude to be about twice the brightness of the following grade. In other words, a first magnitude star was twice as bright as a second magnitude star. A star with apparent magnitude +3 was 8 (2 x 2 x 2) times as bright as a star with apparent magnitude +6.
In 1856, an astronomer named Norman Robert Pogson formalized the system by defining a typical first magnitude star as a star that is 100 times as bright as a typical sixth magnitude star. In other words, it would take 100 stars of magnitude +6 to provide as much light energy as we receive from a single star of magnitude +1. So in the modern system, a magnitude difference of 1 corresponds to a factor of 2.512 in brightness, because
2.512 x 2.512 x 2.512 x 2.512 x 2.512 = (2.512)^5 = 100
A fourth magnitude star is 2.512 times as bright as a fifth magnitude star, and a second magnitude star is (2.512)^4 = 39.82 times as bright as a sixth magnitude star.
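The arithmetic above can be checked with a short script (a Python sketch, purely illustrative):

```python
# One magnitude step corresponds to a brightness factor of 100 ** (1/5).
factor = 2.512

print(round(100 ** (1 / 5), 3))  # the per-magnitude factor, 2.512
print(round(factor ** 4, 2))     # 4-magnitude difference: 39.82
print(round(factor ** 5, 1))     # 5-magnitude difference recovers ~100
```

Note that powers of the rounded factor 2.512 drift very slightly from the exact powers of 100^(1/5); the text's value 39.82 comes from the rounded factor.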
The following table shows how the difference in apparent magnitude between two stars (m2 - m1) corresponds to the ratio of their apparent brightnesses (b1/b2):

Magnitude difference (m2 - m1)    Brightness ratio (b1/b2)
            1                             2.512
            2                             6.31
            3                             15.85
            4                             39.82
            5                             100
            10                            10,000
This relationship can also be shown by the equation:
(m2 - m1) = 2.5 log10(b1/b2)
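The equation can be sketched as a small function (Python; the function name is my own, not from the original):

```python
import math

def magnitude_difference(b1, b2):
    """Magnitude difference m2 - m1 given apparent brightnesses b1 and b2,
    where b1 is the brighter star's brightness."""
    return 2.5 * math.log10(b1 / b2)

# A brightness ratio of 100 spans exactly 5 magnitudes, per Pogson's definition.
print(round(magnitude_difference(100, 1), 1))  # 5.0
```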
1. Put these galaxies in order of magnitude from brightest to faintest:
NGC 4085: m = 12.94
M101: m = 8.30
M87: m = 9.60
IC1410: m = 15.94
NGC 5248: m = 10.97
2. How much brighter is a magnitude +2 star than a magnitude +4 star?
3. A variable star periodically triples its light output. By how much does the apparent magnitude change?
Answers:
1. M101, M87, NGC 5248, NGC 4085, IC1410
2. A magnitude +2 star is (2.512)^2 = 6.31 times as bright as a magnitude +4 star.
3. (m2 - m1) = 2.5 log10(3) = 1.19, so the star's apparent magnitude varies by 1.19 magnitudes.
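The worked answers can be double-checked numerically (a Python sketch; the variable names are mine):

```python
import math

# Question 2: brightness ratio for a 2-magnitude difference (+2 vs +4)
ratio = 100 ** (2 / 5)
print(round(ratio, 2))  # 6.31

# Question 3: magnitude change when the light output triples
delta_m = 2.5 * math.log10(3)
print(round(delta_m, 2))  # 1.19
```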