
In astronomy, magnitude is a unitless measure of the brightness of an object in a defined passband, often in the visible or infrared spectrum, but sometimes across all wavelengths. An imprecise but systematic determination of the magnitude of objects was introduced in ancient times by Hipparchus.

The scale is logarithmic and defined such that a magnitude 1 star is exactly 100 times brighter than a magnitude 6 star. Each step of one magnitude therefore corresponds to a brightness factor of the fifth root of 100, i.e. 100^(1/5) ≈ 2.512, so a star is about 2.512 times brighter than one of magnitude 1 higher. The brighter an object appears, the lower the value of its magnitude, with the brightest objects reaching negative values.
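As a rough illustration of this arithmetic, here is a minimal Python sketch; the function name `brightness_ratio` is just for illustration and not from any standard library:

```python
def brightness_ratio(m_fainter, m_brighter):
    """Brightness (flux) ratio implied by a magnitude difference.

    Each magnitude step corresponds to a factor of 100**(1/5) ~ 2.512,
    so a difference of delta_m magnitudes means a ratio of 100**(delta_m / 5).
    """
    delta_m = m_fainter - m_brighter
    return 100 ** (delta_m / 5)

# A magnitude 1 star compared with a magnitude 6 star: exactly 100x brighter.
print(brightness_ratio(6, 1))   # 100.0

# A single magnitude step: ~2.512x brighter.
print(brightness_ratio(2, 1))   # ~2.5119
```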

Astronomers use two different definitions of magnitude: apparent magnitude and absolute magnitude. The apparent magnitude is the brightness of an object as it appears in the night sky from Earth. Apparent magnitude depends on an object's intrinsic luminosity, its distance, and the extinction reducing its brightness. The absolute magnitude describes the intrinsic luminosity emitted by an object and is defined as the apparent magnitude the object would have if it were placed at a standard distance, 10 parsecs for stars. A different definition of absolute magnitude is used for planets and small Solar System bodies, based on their brightness at one astronomical unit from both the observer and the Sun.
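For stars, the two definitions are linked by the standard distance-modulus relation M = m − 5·log10(d / 10 pc). A short sketch, ignoring extinction; the distance used for Sirius (about 2.64 parsecs) is an assumed value, not stated in the text above:

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Absolute magnitude from apparent magnitude and distance in parsecs.

    Uses M = m - 5 * log10(d / 10 pc), ignoring extinction.
    """
    return apparent_mag - 5 * math.log10(distance_pc / 10)

# Sirius: apparent magnitude -1.46 at roughly 2.64 parsecs (assumed distance).
print(absolute_magnitude(-1.46, 2.64))   # ~ +1.43
```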

The Sun has an apparent magnitude of about −27, and Sirius, the brightest visible star in the night sky, −1.46. Venus at its brightest is about −5. The International Space Station sometimes reaches a magnitude of −6.
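Plugging the rounded figures above into the `brightness_ratio` sketch from earlier gives a sense of the scale:

```python
# Sun (about -27) versus Sirius (-1.46): a difference of ~25.5 magnitudes,
# so the Sun appears very roughly 1.6e10 times brighter from Earth.
print(brightness_ratio(-1.46, -27))   # ~1.6e10
```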
