3.9.2.1
Brightness & Apparent Magnitude
Astrophysics | AQA A-Level Physics
Key Definition
Apparent magnitude ($m$): the perceived brightness of a star as seen from Earth. It is a dimensionless number with no unit.
The Hipparchus scale
- The original brightness classification system, devised by the ancient Greek astronomer Hipparchus, ranked visible stars on a scale from 1 (brightest) to 6 (faintest visible to the naked eye).
- As astronomy progressed, this was refined into a precise logarithmic scale, such that a magnitude 1 star is exactly 100 times brighter than a magnitude 6 star.
- This means that each step of 1 in magnitude corresponds to a brightness change by a factor of $100^{1/5} \approx 2.51$.
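This per-magnitude factor can be checked directly; a minimal sketch:

```python
# Each magnitude step multiplies brightness by the fifth root of 100,
# so five steps (magnitude 1 to magnitude 6) give exactly a factor of 100.
step = 100 ** (1 / 5)
print(round(step, 3))   # 2.512
print(round(step ** 5))  # 100
```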
Comparing brightness using apparent magnitude
- The key point is that the scale runs backwards: the more negative the apparent magnitude, the brighter the object. For example, the Sun has $m = -26$ while Pluto has $m = +15$.
- To compare the brightness of two objects A and B, use the intensity ratio:
$\dfrac{I_A}{I_B} = 2.51^{(m_B - m_A)}$
- Where $I_A$ and $I_B$ are the intensities (W m$^{-2}$) and $m_A$, $m_B$ are the apparent magnitudes of objects A and B respectively.
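The ratio equation translates into a one-line helper; the function name and the Sun/Pluto values used below are illustrative, taken from the example earlier in this section:

```python
def intensity_ratio(m_a: float, m_b: float) -> float:
    """Return I_A / I_B for two objects with apparent magnitudes m_A and m_B."""
    return 2.51 ** (m_b - m_a)

# Sanity check: a 5-magnitude difference should give a factor of ~100.
print(round(intensity_ratio(1, 6)))  # ~100

# The Sun (m = -26) compared with Pluto (m = +15):
print(f"{intensity_ratio(-26, 15):.1e}")
```

Note that a 5-step difference gives approximately 100 (not exactly, because 2.51 is itself a rounded value of $100^{1/5}$).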
- The naked eye can detect objects as faint as magnitude $+6$. The Hubble Space Telescope can detect objects as faint as magnitude $+31$, which is $2.51^{(31-6)} \approx 10^{10}$ times dimmer than the faintest naked-eye object.
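The naked-eye versus Hubble comparison above can be reproduced with the same ratio formula:

```python
# 25 magnitude steps between the naked-eye limit (+6) and Hubble's limit (+31).
ratio = 2.51 ** (31 - 6)
print(f"{ratio:.1e}")  # on the order of 10^10
```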
Common Mistake
The intensity ratio equation is not given on your data sheet, so you must remember it. Students also often get confused by the direction of the scale: a "bigger" magnitude means a dimmer star, not a brighter one. Always say "brighter" or "dimmer" rather than "larger" or "smaller" magnitude.