Magnitude is a term astronomers use to describe the brightness of an object. The magnitude scale for stars was invented by the ancient Greeks, possibly by Hipparchus around 150 B.C. The Greeks grouped the stars they could see into six brightness categories. The brightest stars were called magnitude 1 stars, while the dimmest were placed in the magnitude 6 group. So, on the magnitude scale, lower numbers correspond to brighter stars.
Modern astronomers, using instruments to measure stellar brightness, have refined the system devised by the Greeks. They defined five steps in magnitude to correspond to a brightness ratio of exactly 100; a magnitude 1 star is thus 100 times brighter than a magnitude 6 star. A difference of one magnitude therefore corresponds to a brightness factor equal to the fifth root of 100, about 2.512. A magnitude 3 star is about 2.5 times brighter than a magnitude 4 star, while a magnitude 4 star is about 2.5 times brighter than a magnitude 5 star.
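The relationship above can be sketched in a few lines of Python. The function name here is just for illustration; the formula itself (a ratio of 100 raised to one fifth of the magnitude difference) follows directly from the definition:

```python
def brightness_ratio(m_dim, m_bright):
    """How many times brighter a star of magnitude m_bright is
    than a star of magnitude m_dim.

    Each step of one magnitude is a factor of 100**(1/5) ~ 2.512,
    so five steps give exactly a factor of 100.
    """
    return 100 ** ((m_dim - m_bright) / 5)

# A magnitude 1 star vs. a magnitude 6 star: exactly a factor of 100.
print(brightness_ratio(6, 1))             # 100.0

# One magnitude step: about 2.512.
print(round(brightness_ratio(4, 3), 3))   # 2.512
```

Note that because the scale is logarithmic, magnitude differences multiply: two steps of one magnitude give a factor of 2.512 × 2.512 ≈ 6.3, not 5.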
Exceptionally bright objects, like the Sun, and very dim objects, such as faint stars that can only be seen with telescopes, have driven astronomers to extend the magnitude scale beyond the values of 1 through 6 used by the Greeks. We now recognize magnitudes less than 1 (including negative numbers) and greater than 6. Sirius, the brightest star in our nighttime sky, has a magnitude of -1.4. The human eye, without the aid of a telescope, can see stars as faint as 6th or 7th magnitude, while the largest modern ground-based telescopes can spot stars as dim as magnitude 25 or fainter. See the Light Pollution Interactive for a demonstration of the stars you can see at different magnitudes.
Source: Windows to the Universe © Copyright 2007 University Corporation for Atmospheric Research
Read more about magnitude of stars at Windows to the Universe.