I thought diodes were "one way" switches and you put resistors in the circuit?
Isn't it even?
Banjo,
I think what he's saying is the diodes are there to reduce the
voltage a little (resistors reduce
current). A bog-standard silicon diode, in series with the battery, will drop the voltage by around 0.6 to 0.7 volts. In other words, the diode needs at least 0.6 volts across it before it starts to conduct. So his two series diodes are dropping around 1.2 - 1.4 volts of battery juice before it reaches the LED. And don't forget... Since LEDs are diodes too, they also need a certain amount of juice in order to conduct. A standard red, yellow or green LED will need at least 2 volts. The white and blue jobs need roughly 3 volts or more.
But if he's using two series-connected 6-volt batteries, with two series diodes, it means his 10.6 volts (12v minus 1.4v) is still waaaay too much juice to shove up any LED - especially without connecting its appropriate current limiting resistor in series first.
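For anyone who wants to sanity-check that arithmetic, here's a throwaway Python sketch (the 12 V supply and the ~0.7 V per-diode figure are just the assumed values from above):

```python
# Back-of-envelope check: what's left after two series silicon diodes?
# Assumed figures: 12 V supply (two 6 V batteries), ~0.7 V drop per diode.
supply_v = 12.0
diode_drop_v = 0.7
num_diodes = 2

remaining_v = supply_v - num_diodes * diode_drop_v
print(remaining_v)  # roughly 10.6 V - still far too much to put straight across an LED
```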
Rich Def,
In case you need a bit of help, I'd advise you to scrap the series diodes and use a single resistor instead. All you need to ascertain the required value is........
1) First of all, deduct your LED voltage from your battery voltage. For instance, if you're using a standard 5mm red LED connected to your 12v supply, you simply deduct 2 from 12.
2) Now you need to work out how much current you want to pass through the LED. More current means the LED glows brighter. But the pay-off is that more current also means more of a drain on your batteries. Most LEDs will need approx' 10mA to 50mA in order to glow (that's 0.01 Amp to 0.05 Amp). You'll probably find that 20mA is a good compromise between brightness and current drain. So let's assume 20mA is to be your required current draw.
3) Now that you're armed with two variables, it's easy to calculate the third. The first var' being the voltage (10 volts), and the second var' being the current draw (20mA). So in this case, the third variable - the unknown variable - is the required resistance value. You simply divide 10 by 0.02. The answer = 500. That means a resistance of 500 Ohms. The nearest practical values in terms of real-life resistors would be 470 Ohms or 560 Ohms. Common practice is usually to use the next highest value.
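The three steps above boil down to one line of Ohm's law. A minimal Python sketch, using the example figures (12 V supply, 2 V red LED, 20 mA):

```python
# Ohm's law resistor calculation for a single LED:
# R = (supply voltage - LED forward voltage) / desired current.
supply_v = 12.0    # battery voltage
led_v = 2.0        # forward voltage of a standard red LED
current_a = 0.020  # 20 mA target current

resistance_ohms = (supply_v - led_v) / current_a
print(resistance_ohms)  # about 500 ohms - round up to the next standard value, 560
```

Swap in your own LED voltage and current and the same one-liner does the job.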
After that comes another variable.... Watts.
Watts is calculated by multiplying the voltage across the resistor by the current it carries. (Multiplying the full battery voltage by the current works too - it just overestimates a little, which errs on the safe side.) Shan't bog you down with any more techn' blurb, other than to say that, as a rule o' thumb, you can use one-quarter Watt (or even the smaller one-eighth Watt) resistors for parallel-connected LEDs. But a resistor feeding more current means you'll need a beefier wattage. Generally, a half-watt job would cover most needs. If it gets warm, use a 1-Watt.
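And the wattage check, same assumed figures as before. Note this sketch uses the voltage dropped across the resistor itself; using the full battery voltage instead just gives a safe overestimate:

```python
# Power dissipated in the current-limiting resistor: P = V x I,
# where V is the voltage across the resistor (battery minus LED).
resistor_v = 12.0 - 2.0  # 10 V dropped across the resistor
current_a = 0.020        # 20 mA through it

power_w = resistor_v * current_a
print(power_w)  # about 0.2 W - a quarter-watt part is marginal, a half-watt has headroom
```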
You read it here first.