Astrophysicists like to call the rough, simple one-line approximations used to get the order of magnitude of an astronomical event "back-of-the-envelope calculations," since they are often performed on a piece of scrap paper.
That's a Hubble image of the galaxy NGC 2403, 11 million light-years (ly) distant. The arrow points to a supernova in that galaxy, SN 2004dj.
The bright star at top center is in our own galaxy, and so is ~1,000 ly distant.
You'll notice that the apparent brightness of the supernova is comparable (within an order of magnitude, at least) to that of the foreground star.
Apparent brightness is proportional to intrinsic brightness divided by the distance squared.
So, the supernova is intrinsically brighter than the foreground star by a factor of approximately (11,000,000/1,000)^2, or about 121 million.
A supernova is brighter than a normal star by a factor in the hundreds of millions!
According to the associated article, the factor is, in fact, 200 million.
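Here's the same arithmetic as a couple of lines of Python; the distances are just the round numbers quoted above, and the variable names are mine:

```python
# Back-of-the-envelope check: if the supernova and the foreground star look
# roughly equally bright, the ratio of their intrinsic brightnesses is just
# the square of the ratio of their distances (inverse-square law).

d_supernova_ly = 11_000_000  # distance to NGC 2403 / SN 2004dj, in light-years
d_star_ly = 1_000            # rough distance to the foreground star, in light-years

brightness_ratio = (d_supernova_ly / d_star_ly) ** 2
print(f"{brightness_ratio:.3g}")  # -> 1.21e+08, i.e. about 121 million
```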
Now, I'm actually working backwards here... the reason we know that galaxy is 11 million ly distant is because we've observed supernovae in it before. We've observed enough supernovae to develop a theory that all supernovae of a given class have (approximately) the same intrinsic brightness, and we've seen some supernovae close enough to know their distances by other means, so we have that brightness calibrated. So, when we see a supernova in a distant galaxy, we have a standard candle that lets us calculate the distance to it!
Pretty cool, huh?
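If you want to see the standard-candle logic spelled out, here's a minimal sketch in Python with made-up numbers; the function and the calibration values are purely illustrative, not real supernova data:

```python
import math

# Toy version of the standard-candle logic: if every supernova of a given
# class has (roughly) the same intrinsic brightness L, then the apparent
# brightness f we measure pins down the distance via f = L / (4 * pi * d**2).

def distance_from_standard_candle(L, f):
    """Invert the inverse-square law: d = sqrt(L / (4 * pi * f))."""
    return math.sqrt(L / (4 * math.pi * f))

# "Calibrate" on a nearby supernova whose distance we know by other means:
d_nearby = 1.0                                    # in some convenient distance unit
f_nearby = 1.0                                    # its measured apparent brightness
L_calibrated = f_nearby * 4 * math.pi * d_nearby ** 2

# A supernova of the same class that appears 100x fainter must be 10x farther away:
f_faint = f_nearby / 100
print(distance_from_standard_candle(L_calibrated, f_faint))  # -> ~10.0
```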