magnitude in astronomy, measure of the brightness of a star or other celestial object. The stars cataloged by Ptolemy (2d cent. AD), all visible with the unaided eye, were ranked on a brightness scale such that the brightest stars were of 1st magnitude and the dimmest were of 6th magnitude. The modern magnitude scale was placed on a precise basis by N. R. Pogson (1856). Photometric measurements showed that stars of the 1st magnitude were about 100 times as bright as stars of the 6th magnitude, i.e., a difference of 5 magnitudes. Pogson therefore defined a difference of 5 magnitudes to be exactly a hundredfold change in brightness, so that stars differing by 1 magnitude differ in brightness by a factor of 2.512 (the 5th root of 100). The scale permits a precise expression of a star's relative brightness and extends to both extremely bright and very dim objects. Thus, an object 2.512 times as bright as a 1st-magnitude star is of 0 magnitude; still brighter objects have negative magnitudes. The sun's magnitude, for example, is -26.8. On the other hand, a faint star of 16th magnitude is only 1/10,000 as bright as a 6th-magnitude star, the dimmest that can be seen with the naked eye. Magnitudes determined from an object's relative brightness as seen from the earth are known as apparent magnitudes. Astronomers also assign a star an absolute magnitude, which is the magnitude that the star would have if it were located at a standard distance of 10 parsecs ...
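The arithmetic of the scale can be made explicit. The following is a minimal sketch in Python; the function names and the Sun's distance in parsecs are illustrative assumptions, not part of the entry.

    import math

    def brightness_ratio(m1, m2):
        # Ratio of object 1's brightness to object 2's, from their magnitudes.
        # Each magnitude step corresponds to a factor of 100**(1/5), about 2.512,
        # with smaller (or negative) magnitudes meaning brighter objects.
        return 100 ** ((m2 - m1) / 5)

    def absolute_magnitude(apparent_mag, distance_pc):
        # Magnitude the object would have at the standard distance of 10 parsecs.
        return apparent_mag - 5 * math.log10(distance_pc / 10)

    print(brightness_ratio(1, 6))    # 100.0   -- 1st-magnitude star vs. 6th
    print(brightness_ratio(16, 6))   # 0.0001  -- 16th-magnitude star vs. 6th
    # Illustrative check (assumed figures): the Sun, apparent magnitude -26.8,
    # lies about 4.85e-6 parsecs away, giving an absolute magnitude near +4.8.
    print(absolute_magnitude(-26.8, 4.85e-6))

The hundredfold ratio between 1st and 6th magnitude and the 1/10,000 figure for a 16th-magnitude star both follow directly from Pogson's definition.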