I'm using a 5.5 V supply with a 3 V, 20 mA LED bulb. Which resistor value should I use: R = U / I (5.5 / 0.02 = 275 Ω), or R = (supply voltage − LED voltage) / current = (5.5 − 3) / 0.020 = 125 Ω, or something else? Please give me a value and a method I can use for this kind of problem in the future.


Your second formula is the right one: 125 Ω. Calculate it that way in the future — the series resistor only has to drop the voltage left over after the LED's forward drop, so R = (V_supply − V_LED) / I.
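The rule above can be sketched as a small helper function; the function name and values here are just illustrative:

```python
def led_series_resistor(v_supply, v_led, i_led):
    """Series resistor for one LED.

    The resistor drops only the voltage left over after the LED's
    forward drop, so R = (V_supply - V_led) / I.
    """
    return (v_supply - v_led) / i_led

# 5.5 V supply, 3 V / 20 mA LED -> 125 ohms
print(led_series_resistor(5.5, 3.0, 0.020))  # 125.0
```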



I now have 10 white LED bulbs, each rated 3 V, 20 mA. I want to connect them in series across 220 V. How much resistance do I need to add, and how is it calculated?
A follow-up: if instead there are five 3 V, 20 mA LEDs and five 2 V, 10 mA LEDs, how is the resistance calculated?


According to U = IR, the resistor must drop whatever the supply provides beyond the total LED forward drop:
R = (220 − 10 × 3) / 0.02 = 9.5 kΩ
(For the mixed case: LEDs rated for different currents — 20 mA and 10 mA — cannot share one series string, because the same current flows through every component in series. They would need separate strings, each sized with the same formula.)
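The same calculation for a string of identical LEDs, as a sketch (function name is illustrative); the second print is a reminder that the resistor's power rating matters at these voltages:

```python
def string_series_resistor(v_supply, v_led, n_leds, i_led):
    """Series resistor for n identical LEDs in one string."""
    return (v_supply - n_leds * v_led) / i_led

# Ten 3 V, 20 mA LEDs in series on 220 V -> 9500 ohms (9.5 kOhm)
r = string_series_resistor(220, 3.0, 10, 0.020)
print(r)  # 9500.0

# The resistor dissipates P = I^2 * R, so check its power rating too
print(0.020**2 * r)  # 3.8 W
```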



I want to power a 1 W LED bulb for lighting, once from a 220 V supply and once from a 12 V supply. What resistor value do I need in each case?


If the voltage after conversion is still 220 V, and assuming the LED's working voltage drop is 1.5 V, then the current through the LED is I = 1 W / 1.5 V ≈ 0.67 A, and the required series resistance is R = (220 − 1.5) / 0.67 ≈ 326 Ω.
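The same arithmetic, carried through for both supplies the question asks about. This keeps the answer's assumed 1.5 V forward drop (real 1 W LEDs often drop more, around 3 V, so treat the numbers as an illustration of the method, not a design):

```python
v_drop = 1.5                 # assumed LED forward drop, from the answer above
i = 0.67                     # 1 W / 1.5 V, rounded as in the answer
r_220 = (220 - v_drop) / i   # about 326 ohms for the 220 V supply
r_12 = (12 - v_drop) / i     # about 15.7 ohms for the 12 V supply
print(round(r_220), round(r_12, 1))  # 326 15.7
```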