Monday, July 23, 2007

Guide To LEDs

Until now, I've really only understood 3 things about LEDs:
  1. They don't use much power
  2. You need to plug them in the right direction
  3. You can blow them if you apply too much voltage (or was it current?)
I think I've got it figured out now. I haven't seen it explained this way anywhere else, which could mean I'm just wrong or it could be I'm about to be nominated for the Nobel Prize in Explaining LEDs on a Blog.

Here's the deal: An LED requires a certain minimum voltage (its forward voltage). If it gets at least that much, it turns on; otherwise it doesn't. Either way, it never drops more than that voltage. The exact number varies by color (and other parameters?), but it's in the region of 1.5-3.5V. So if you had a 1.5V LED and a 1.5V battery, you could just hook them up, right? No, because you'd get too much current. The LED drops 1.5V but has practically no resistance, so nothing is left to limit the current. (Ohm's Law is really more of a guideline.) A reasonable first guess for how much current an LED can take is 20mA.
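
If code reads more clearly than prose, here's that simplified model as a few lines of Python (the function name and the example numbers are just mine for illustration):

    def led_current(v_supply, v_led, r_ohms):
        """Current through an LED in series with a resistor, using the
        simplified model above: the LED drops a fixed v_led volts and the
        resistor gets whatever voltage is left over."""
        leftover = v_supply - v_led
        if leftover <= 0:
            return 0.0             # not enough voltage: the LED just stays off
        return leftover / r_ohms   # Ohm's Law, applied to the resistor only

    # With essentially no resistance in the circuit, the current blows up:
    print(led_current(9.0, 2.1, 1))    # ~6.9 A -- goodbye LED
    print(led_current(9.0, 2.1, 470))  # ~0.015 A = 15 mA -- fine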

So how do we put this all together? Perhaps the clearest way is to do what I did: figure out what voltage every LED in my drawer drops and label it. First, I built the very simple circuit on the right.

What value to use for R? We don't want more than about 20mA running through the LED, and R = V/I, so 9V/.02A = 450Ω. A standard 470Ω resistor should be fine. So far, so obvious. But how does this tell us the voltage the LED drops?
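
In code form, that back-of-the-envelope resistor pick might look like this (the E12 value list and the round-up-to-be-safe rule are my own assumptions):

    # Worst-case ballast resistor for the test circuit: pretend the LED
    # drops nothing, so the resistor has to eat the whole supply voltage
    # at the target current.
    V_SUPPLY = 9.0   # volts, a nominal "9V" battery
    I_MAX = 0.020    # amps, the 20 mA rule of thumb

    r_min = V_SUPPLY / I_MAX   # 450 ohms

    # Round up to the next standard value so we err on the low-current side.
    E12 = [330, 390, 470, 560, 680, 820, 1000]
    r_pick = min(r for r in E12 if r >= r_min)

    print(r_min, r_pick)   # 450.0 470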

I tricked you. You actually build that circuit but put an ammeter in there too. Measure the source voltage carefully (well, semi-carefully--I just mean don't assume a "9V" battery is really 9V). Measure the actual resistance of R. You are going to get some measured current Im. Im will not be equal to V/R! That's because Ohm's Law only applies to the resistor, and the resistor isn't dropping the full 9V. It drops only whatever voltage the LED didn't use. The current is then that leftover voltage divided by the resistor value. In equation form: Im = (V - voltage dropped by LED)/R. Rearranging: voltage dropped by LED = V - Im * R.
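
That rearranged equation is trivial, but here it is as a tiny Python sketch anyway (the variable names and the meter readings are made up, just to show the arithmetic):

    def led_forward_voltage(v_source, r_measured, i_measured):
        """Back out the LED's drop from the test circuit: the resistor drops
        i_measured * r_measured, and whatever is left of the source voltage
        is what the LED is dropping."""
        return v_source - i_measured * r_measured

    # Plausible meter readings for a red LED behind a "470 ohm" resistor:
    print(led_forward_voltage(v_source=9.06, r_measured=467.0, i_measured=0.0149))
    # -> about 2.1 V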

For instance, a typical red LED in my drawer is 2.1V. So when I put it into the circuit, the voltage across the resistor is 9 - 2.1 = 6.9V, and 6.9V/470Ω ~= 15mA.

Is this just arcane nonsense? Yes, when the source voltage is much higher than the LED voltage and you're only putting one LED in the circuit. But I went through and did this with all my LEDs. Now I know how to put a bunch of LEDs together and can save power (and resistors, and circuit space) by putting them in series with the right size of resistor. I could fit four 2.1V LEDs in a 9V circuit because 4 * 2.1 = 8.4V. That leaves .6V. To limit the current to exactly 20mA, I'd want a 30Ω (.6V/.02A) resistor in there.
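
Here's the series-stack math as a short Python sketch (the function and its names are mine; the 20mA default is the same rule of thumb as before):

    def series_resistor(v_supply, v_led, n_leds, i_target=0.020):
        """Resistor needed to run n_leds identical LEDs in series at i_target.
        Returns None if the supply can't cover the combined LED drop."""
        leftover = v_supply - n_leds * v_led
        if leftover <= 0:
            return None
        return leftover / i_target

    print(series_resistor(9.0, 2.1, 4))  # ~30 ohms (0.6 V / 0.02 A)
    print(series_resistor(9.0, 2.1, 5))  # None -- five won't fit on 9 V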

One caveat: with such a tight margin, a varying supply voltage can blow something. I was running my circuit off a weak 9V battery that was down to about 8.5V. If I really did build the circuit with 4 LEDs, I'd have .1V left over, which would call for a mere 5Ω resistor. But then say I put in a fresh battery and it's 9.3V now. Suddenly there's .9V across the same 5Ω resistor, which gives me 180mA. Goodbye LEDs! So it would be good to either work with a regulated voltage OR leave yourself enough extra voltage that normal variation isn't a huge change percentage-wise.
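
A quick sketch makes the sensitivity obvious (the three-LED comparison at the end is my own example, just to show how much gentler things get with some headroom):

    def led_current(v_supply, v_led_total, r_ohms):
        """Same simplified model: the LEDs drop a fixed total voltage and the
        resistor takes whatever is left."""
        return max(0.0, v_supply - v_led_total) / r_ohms

    # Four 2.1V LEDs (8.4V total) sized against a tired 8.5V battery:
    print(led_current(8.5, 8.4, 5))    # 0.020 A -- exactly on target
    print(led_current(9.3, 8.4, 5))    # 0.180 A -- fresh battery, fried LEDs

    # The same battery swing with three LEDs and more resistor headroom:
    print(led_current(8.5, 6.3, 110))  # 0.020 A
    print(led_current(9.3, 6.3, 110))  # ~0.027 A -- warmer, but survivable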
