Apparent magnitude
Apparent magnitude is a measure of the brightness of a celestial object as seen from Earth, without regard to its distance. The scale is logarithmic and runs in reverse: a decrease of 1 magnitude corresponds to an increase in brightness by a factor of about 2.512. The system has its roots in the work of the ancient Greek astronomer Hipparchus, who classified stars into six magnitudes, with the first magnitude being the brightest and the sixth the faintest visible to the naked eye.
History
The concept of magnitude dates back to at least the 2nd century BC, with Hipparchus's introduction of a scale for classifying the brightness of stars. The modern system of apparent magnitude was developed in the 19th century, with the advent of photometric measurements. The scale was made more precise and extended beyond the limits of human vision, incorporating both brighter and fainter objects.
Scale and Measurement
The apparent magnitude scale is logarithmic: a difference of 5 magnitudes corresponds to a brightness factor of exactly 100. A difference of 1 magnitude therefore corresponds to a brightness factor of the fifth root of 100, approximately 2.512, known as Pogson's ratio after the 19th-century astronomer Norman Pogson, who defined it in 1856. An object of magnitude 1 is about 2.512 times brighter than an object of magnitude 2, and exactly 100 times brighter than an object of magnitude 6.
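For example, a difference of 3 magnitudes corresponds to a brightness factor of
\[100^{3/5} = 2.512^{3} \approx 15.85\]
so a magnitude 1 star is roughly 16 times brighter than a magnitude 4 star.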
Mathematical Definition
The apparent magnitude \(m\) of an object can be defined in terms of its flux \(F\) as follows:
\[m = -2.5 \log_{10}(F) + C\]
where \(C\) is a constant that sets the zero point of the scale. Traditionally, this constant is defined so that the star Vega has an apparent magnitude of about 0 in every standard wavelength band.
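Because the constant \(C\) cancels when one magnitude is subtracted from another, the magnitude difference between two objects depends only on the ratio of their fluxes:
\[m_1 - m_2 = -2.5 \log_{10}\!\left(\frac{F_1}{F_2}\right)\]
Inverting this relation gives the flux ratio \(F_1 / F_2 = 100^{(m_2 - m_1)/5}\), consistent with the factor-of-100 definition above.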
Zero Point
The zero point of the apparent magnitude scale is somewhat arbitrary and has changed as the scale has been extended and refined. In the traditional system, it is based on a defined flux for the star Vega, making Vega's magnitude approximately 0 in every standard photometric band.
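Schematically, this choice of zero point amounts to measuring flux relative to Vega's flux \(F_{\mathrm{Vega}}\) in the same band (an idealized form of the Vega system, not an exact modern calibration):
\[m = -2.5 \log_{10}\!\left(\frac{F}{F_{\mathrm{Vega}}}\right)\]
so that an object with the same flux as Vega has \(m = 0\).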
Applications and Limitations
Apparent magnitude is used extensively in astronomy to compare the brightness of celestial objects. However, it does not account for the intrinsic brightness of objects, which is measured instead by absolute magnitude. Absolute magnitude is how bright an object would appear if it were placed at a standard distance of 10 parsecs, allowing for a direct comparison of the intrinsic brightness of objects.
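The two quantities are related through the distance modulus: for apparent magnitude \(m\), absolute magnitude \(M\), and distance \(d\) in parsecs,
\[m - M = 5 \log_{10}\!\left(\frac{d}{10\,\mathrm{pc}}\right)\]
For example, the Sun has an apparent magnitude of about \(-26.7\); placed at 10 parsecs, it would appear as an unremarkable star of magnitude about \(+4.8\).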