• Question: how bright are the stars

    Asked by codytyler002 to Jen on 15 Jun 2011.
    • Photo: Jen Gupta

      Jen Gupta answered on 15 Jun 2011:


      Great question! Astronomers measure the brightness of stars in terms of magnitude. The apparent magnitude is how bright a star looks to us in the night sky. We then define the absolute magnitude as how bright the star would be at a fixed distance from us (we use a distance of 10 parsecs, which is about 32.6 lightyears). Because stars are all at different distances from us, their apparent magnitude isn’t actually that useful because it doesn’t tell us how bright the star really is. Imagine you have a torch really close to you and a floodlight far away. The torch would look brighter than the floodlight, even though if you put them side by side the floodlight would be brighter.
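      The link between apparent and absolute magnitude is the standard distance-modulus formula, m − M = 5 log₁₀(d / 10 parsecs). Here is a minimal Python sketch of that relation (the function name and the approximate Sirius figures are my own additions, not from the answer above):

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Absolute magnitude M from apparent magnitude m and distance in parsecs,
    using the distance modulus: m - M = 5 * log10(d / 10 pc)."""
    return apparent_mag - 5 * math.log10(distance_pc / 10.0)

# Sirius: apparent magnitude ~ -1.46 at a distance of ~2.64 parsecs
print(round(absolute_magnitude(-1.46, 2.64), 1))  # → 1.4
```

      So Sirius, which looks like the brightest star in our sky, is actually a fairly ordinary star once you account for how close it is.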

      The numbers we use for magnitudes might seem a bit silly to you (they do to me!) because the magnitude scale was invented a long time ago, based on the stars people could see with their eyes. At the time it was invented, I think the scale went from 1 to 6, where the faintest stars we can see with our eyes are apparent magnitude 6 and apparent magnitude 1 stars are 100 times brighter than that. So large numbers mean that a star is faint and low (even negative) numbers mean that a star is bright. These days the magnitude system is defined so that the bright star Vega has an apparent magnitude of 0.
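      Because a gap of 5 magnitudes is defined as a factor of 100 in brightness, each single magnitude step corresponds to a factor of 100^(1/5) ≈ 2.512. A short sketch of that relation (the function name is mine):

```python
def brightness_ratio(mag_bright, mag_faint):
    """How many times brighter the lower-magnitude star is.
    A difference of 5 magnitudes is defined as a factor of 100 in brightness."""
    return 100 ** ((mag_faint - mag_bright) / 5)

# Magnitude 1 vs magnitude 6: a 5-magnitude gap, so a factor of 100
print(brightness_ratio(1, 6))  # → 100.0
```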

      The brightest star in the night sky is Sirius, which has an apparent magnitude of -1.4. The faintest stars that we can see with our eyes are apparent magnitude 6, but the Hubble Space Telescope has seen stars that are about apparent magnitude 26!
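      Using the factor-of-100-per-5-magnitudes rule, the gap between the faintest naked-eye stars (magnitude 6) and Hubble’s faintest (about magnitude 26) is easy to work out (a quick back-of-the-envelope check, not a figure from the answer above):

```python
# 26 - 6 = 20 magnitudes, and every 5 magnitudes is a factor of 100 in brightness
ratio = 100 ** ((26 - 6) / 5)
print(f"{ratio:.0e}")  # → 1e+08, i.e. roughly 100 million times fainter
```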
