LED guys chime in

PJ Diaz

Well-Known Member
No, you don't need more AC power, you're just having less loss at 230V. We're using the same power, you just have less loss, so you think you have more power, but you don't have more power, you just have less loss.
You're basing your math on everything running at 100%. In that case, you'd be right that there would be no cost difference. The difference would be that you would have increased light output for the same amount of power consumption. Either way it's the same horse, whether you look at it from the head or the tail. You have a better power consumption value running at 230v vs 115v, if that's how you want to look at it. My math was based on equal wattage at the LED board level, which is really what we should be considering. If you want to look at the horse from a different perspective, go for it. Either way, it's a better value to run at 230v if you can. It's not a massive difference, but with a lot of lights over time, it's definitely worth it. In a tiny 4x4 home grow it's not going to be noticeable, however. A 600 sq ft grow op will save enough by running 230v to put a new roof on the building every 10 years. It's worth it at that level.
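To put rough numbers on that last claim, here's a quick back-of-the-envelope sketch in Python. Every input is an assumption for illustration (90% vs 94% driver efficiency, ~35 W of board power per sq ft, 12 hours a day, $0.15/kWh), not a measured figure from anyone's grow.

```python
# Back-of-the-envelope 230v vs 115v savings. Every number is an
# assumption for illustration, not a measurement.

CANOPY_SQFT = 600        # assumed grow-op canopy area
WATTS_PER_SQFT = 35      # assumed LED board (DC) watts per sq ft
EFF_115V = 0.90          # assumed driver efficiency on 115v
EFF_230V = 0.94          # assumed driver efficiency on 230v
HOURS_PER_DAY = 12       # 12/12 flower schedule
RATE_PER_KWH = 0.15      # assumed utility rate, $/kWh

board_watts = CANOPY_SQFT * WATTS_PER_SQFT    # 21,000 W of DC to the boards
wall_115 = board_watts / EFF_115V             # wall draw on 115v
wall_230 = board_watts / EFF_230V             # wall draw on 230v

kwh_saved_per_year = (wall_115 - wall_230) / 1000 * HOURS_PER_DAY * 365
dollars_per_year = kwh_saved_per_year * RATE_PER_KWH

print(f"Wall draw: {wall_115:,.0f} W at 115v vs {wall_230:,.0f} W at 230v")
print(f"Savings: {kwh_saved_per_year:,.0f} kWh/yr, about ${dollars_per_year:,.0f}/yr")
```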
 

rootforme

Well-Known Member
You're basing your math on everything running at 100%. In that case, you'd be right that there would be no cost difference. The difference would be that you would have increased light output for the same amount of power consumption. Either way it's the same horse, whether you look at it from the head or the tail. You have a better power consumption value running at 230v vs 115v, if that's how you want to look at it. My math was based on equal wattage at the LED board level, which is really what we should be considering. If you want to look at the horse from a different perspective, go for it. Either way, it's a better value to run at 230v if you can. It's not a massive difference, but with a lot of lights over time, it's definitely worth it. In a tiny 4x4 home grow it's not going to be noticeable, however. A 600 sq ft grow op will save enough by running 230v to put a new roof on the building every 10 years. It's worth it at that level.
I agree you have slightly better efficiency. Now, if you want to save on your utility bill, you have to be able to calculate what that efficiency is, and then you can turn your light off at an earlier time after matching DLI, and then you would be able to have a cost savings. But as long as you're staying on 12/12 and still pulling the same 800 W from the wall, whether it be 230 or 115v, your bill won't change.
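Here's what "match DLI, then run the light fewer hours" looks like worked out; the 800 µmol PPFD and the ~4% bump are assumed numbers just to show the mechanics.

```python
# DLI (mol/m^2/day) = PPFD (umol/m^2/s) x photoperiod (s) / 1e6
# Sketch: hold DLI constant and trim the photoperiod when the fixture
# puts out slightly more light for the same wall draw.

BASE_PPFD = 800          # assumed PPFD on 115v, umol/m^2/s
EFFICIENCY_GAIN = 1.04   # assumed ~4% more light per wall watt on 230v
BASE_HOURS = 12.0        # original 12/12 photoperiod

target_dli = BASE_PPFD * BASE_HOURS * 3600 / 1e6        # ~34.6 mol/m^2/day
boosted_ppfd = BASE_PPFD * EFFICIENCY_GAIN              # ~832 umol/m^2/s
new_hours = target_dli * 1e6 / (boosted_ppfd * 3600)    # hours for the same DLI

print(f"Target DLI: {target_dli:.1f} mol/m^2/day")
print(f"Same DLI at {boosted_ppfd:.0f} umol: run {new_hours:.2f} h instead of {BASE_HOURS:.0f} h")
```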
 

PJ Diaz

Well-Known Member
I agree you have slightly better efficiency. Now, if you want to save on your utility bill, you have to be able to calculate what that efficiency is, and then you can turn your light off at an earlier time after matching DLI, and then you would be able to have a cost savings. But as long as you're staying on 12/12 and still pulling the same 800 W from the wall, whether it be 230 or 115v, your bill won't change.
You don't get it.
 

rootforme

Well-Known Member
What if it does, though?
I guess you guys haven't noticed that most equipment at 110 V will be twice the amperage of similar equipment at 220 V? That's because it's the same power draw either way. Volts times amps equals watts. Now go convert that between your 220 and 110 and you tell me how the wattage is any different?
 

PJ Diaz

Well-Known Member
I guess you guys haven't noticed that most equipment at 110 V will be twice the amperage of similar equipment at 220 V? That's because it's the same power draw either way. Volts times amps equals watts. Now go convert that between your 220 and 110 and you tell me how the wattage is any different?
Because you can dim your LED fixture down 4% when running on 230v vs 115v and get the same light output. Boom, savings!
 

rootforme

Well-Known Member
Because you can dim your LED fixture down 4% when running on 230v vs 115v and get the same light output. Boom, savings!
Just like I said, if you want to turn off your light to match the efficiency, then you will get a cost savings, which is what dimming your light does. Dimming your light actually just cycles power off and on quicker than the human eye can see, and this makes it look dimmer.
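For anyone curious, here's a minimal sketch of that duty-cycle style of dimming (some drivers dim with PWM like this, others just reduce the drive current); the wattage and frequency are placeholder numbers.

```python
# PWM-style dimming: the LEDs are switched on and off faster than the
# eye can see, and average power scales with the duty cycle.

PEAK_WATTS = 880        # assumed full-power draw of one fixture
DUTY_CYCLE = 0.96       # 96% on-time, i.e. "dimmed down 4%"
PWM_FREQ_HZ = 1000      # switching far above the flicker threshold

average_watts = PEAK_WATTS * DUTY_CYCLE
cycle_ms = 1000 / PWM_FREQ_HZ
on_time_ms = DUTY_CYCLE * cycle_ms

print(f"Average draw at {DUTY_CYCLE:.0%} duty: {average_watts:.0f} W")
print(f"Each {cycle_ms:.1f} ms cycle is on for {on_time_ms:.2f} ms")
```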
 

PJ Diaz

Well-Known Member
Just like I said, if you want to turn off your light to match the efficiency, then you will get a cost savings, which is what dimming your light does.
Glad to hear that you are finally understanding the reality of cost savings with 230v vs 115v.
 

Samwell Seed Well

Well-Known Member
I guess you guys haven't noticed that most equipment at 110 V will be twice the amperage of similar equipment at 220 V? That's because it's the same power draw either way. Volts times amps equals watts. Now go convert that between your 220 and 110 and you tell me how the wattage is any different?

In your own glass tube scenario... less amps needed at higher volts means less kilowatts, and add the transformer phase into that... get out... I'm an ex welder... we turn watts/amps into $$ signs all day.. hell I've been a ground once or twice

The power people charge you at multiple rates based on the entity that is being charged, time of day, and load over time and its phase from the transformer

Forest for the trees, Mr. Objective Truth that exists in a bubble... and this ain't the wonder dome
 

rootforme

Well-Known Member
In your own glass tube scenario... less amps needed at higher volts means less kilowatts
Huh? Let's say you're using 150 800w LED lights. Let's say you use those lights 12/12 for 30 days.

110v 8 amps
220v 4 amps

110v x 8 amps = 880 watts x 12h x 30d = 316,800 Wh x 150 lights = 47,520,000 Wh (47,520 kWh)
220v x 4 amps = 880 watts x 12h x 30d = 316,800 Wh x 150 lights = 47,520,000 Wh (47,520 kWh)
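Same arithmetic as a snippet, just with the energy written in kWh instead of watts, since that's what the meter actually bills:

```python
# Identical wall wattage at either voltage means identical energy billed.

LIGHTS = 150
HOURS = 12
DAYS = 30

for volts, amps in [(110, 8), (220, 4)]:
    watts = volts * amps                          # 880 W either way
    kwh = watts * HOURS * DAYS * LIGHTS / 1000    # energy the meter records
    print(f"{volts}V x {amps}A = {watts} W -> {kwh:,.0f} kWh over {DAYS} days")
```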
 

Samwell Seed Well

Well-Known Member
Huh? Let's say you're using 150 800w LED lights. Let's say you use those lights 12/12 for 30 days.

110v 8 amps
220v 4 amps

110v x 8 amps = 880 watts x 12h x 30d = 316,800 Wh
220v x 4 amps = 880 watts x 12h x 30d = 316,800 Wh

Yes this is basic.


Power companies.. don't use 110 and 220 though, do they

Cost per kWh is an added variable based on location per hour. That's why I run gas lanterns at home to trick the power company /s

We both know they sell us power on load, not consumption.. ffs

California's fucked, Washington's fine.. power is cheap here, we believe in the atom.. the gravity water drop... the bong
 

rootforme

Well-Known Member
Yes this is basic.


Power companies.. don't use 110 and 220 though, do they

Cost per kWh is an added variable based on location per hour. That's why I run gas lanterns at home to trick the power company /s

We both know they sell us power on load, not consumption.. ffs

California's fucked, Washington's fine.. power is cheap here, we believe in the atom.. the gravity water drop... the bong
This is true. My only point was, regardless of 110V or 220V, you're still gonna pay for the same watts.
 

PJ Diaz

Well-Known Member
Huh? Let's say you're using 150 800w LED lights. Let's say you use those lights 12/12 for 30 days.

110v 8 amps
220v 4 amps

110v x 8 amps = 880 watts x 12h x 30d = 316,800 Wh x 150 lights = 47,520,000 Wh (47,520 kWh)
220v x 4 amps = 880 watts x 12h x 30d = 316,800 Wh x 150 lights = 47,520,000 Wh (47,520 kWh)
You would only need 144 fixtures run at 230v vs the 150 fixtures you'd need at 115v to get the same amount of light, hence a savings in both power consumption and also the number of fixtures purchased to cover the space. Your math doesn't consider all variables.
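Rough sketch of where 144 vs 150 comes from, assuming roughly 4% more usable light per wall watt at 230v (the same 4% figure used earlier in the thread):

```python
# Fixture count for the same total light, assuming ~4% more light per
# wall watt at 230v. The 4% figure is an assumption from this thread.

FIXTURES_115V = 150
GAIN_230V = 1.04

fixtures_230v = round(FIXTURES_115V / GAIN_230V)   # ~144 fixtures
print(f"Fixtures needed: {fixtures_230v} at 230v vs {FIXTURES_115V} at 115v")
```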
 

Samwell Seed Well

Well-Known Member
This is true. My only point was, regardless of 110V or 220V, you're still gonna pay for the same watts.

Facts, yes. An electron is an electron. Half these minbsys are running voltage readers to masturbate to their efficiencies in light production.

It was never about cost... or reality, only side glances in the mirror satisfaction...

But seriously... you will spend less to produce more... it's more of a metric of local power rates, house efficiency and their cost... ok, it's about cost

But def lower amps is always better and more efficient for a circuit, any circuit.. less HEAT... always leave the real answer obscured in the end

Better for you, your ballast, your diodes...
 

Shaded420

Well-Known Member
The funniest part about all of this is that he has already agreed with what @bk78 and @PJ Diaz and @DoubleAtotheRON are saying, but he's so hell-bent on proving a point that he's out here arguing against himself.

But you're not using less current because you're using twice the volts. You guys really don't understand how electricity works. The only reason your light gets more efficient is because with lower amperage you get lower heat which makes it more efficient. 220v @ 5 amps is the same draw as 110v @ 10 amps.
Here. Right here. Zoom in. Focus.
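On the "lower amperage means lower heat" part of that quote, here's a minimal sketch of the I²R loss in the supply wiring; the 0.2 Ω wire resistance is a made-up round number, only the scaling matters.

```python
# I^2 x R loss in the branch-circuit wiring for the same 880 W fixture.
# WIRE_RESISTANCE_OHMS is an assumed round number; only the scaling
# matters: halving the current cuts the copper loss to a quarter.

WIRE_RESISTANCE_OHMS = 0.2   # assumed round-trip resistance of the run

for volts, amps in [(110, 8.0), (220, 4.0)]:
    loss = amps ** 2 * WIRE_RESISTANCE_OHMS
    print(f"{volts}V @ {amps:.0f}A -> {loss:.1f} W lost as heat in the wiring")
```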
 

MidnightSun72

Well-Known Member
No, you don't get it, you get billed for watts. I don't care what light you have or how efficient it is. What power is the light receiving from the plug? That's what you get billed for, not DC power converted.
Read your first sentence in this quote.
".....you get billed for watts"

The LED driver uses fewer watts to produce DC power at 250V vs 120V.

A 600W LED driver that is 90% efficient on 110V: 600W divided by 0.9 = 666W, so it takes an extra 66 watts to give the LEDs 600W of DC power.

Now let's say we switch that same driver over to 250V, so now it's running 94% efficient.
So the same 600W driver at 250V would be 600W divided by 0.94 = 638W.

So on 110V the light uses 666W

and on 250V the light uses 638W,

so the driver uses fewer watts (28 watts less) on 250V and subsequently your bill will be less.
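Same math as a snippet, keeping the 90% and 94% efficiencies as the assumed inputs:

```python
# Wall watts needed to deliver a fixed 600 W of DC to the boards,
# using the assumed 90% (110V) and 94% (250V) driver efficiencies.

DC_WATTS = 600

for label, efficiency in [("110V", 0.90), ("250V", 0.94)]:
    wall = DC_WATTS / efficiency
    loss = wall - DC_WATTS
    print(f"{label}: {wall:.0f} W from the wall ({loss:.0f} W lost in the driver)")

# difference: 600/0.90 - 600/0.94 is roughly 28 W per fixture
```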
 

rootforme

Well-Known Member
Facts, yes. An electron is an electron. Half these minbsys are running voltage readers to masturbate to their efficiencies in light production.

It was never about cost... or reality, only side glances in the mirror satisfaction...

But seriously... you will spend less to produce more... it's more of a metric of local power rates, house efficiency and their cost... ok, it's about cost

But def lower amps is always better and more
Read your first sentence in this quote.
".....you get billed for watts"

The LED driver uses fewer watts to produce DC power at 250V vs 120V.

A 600W LED driver that is 90% efficient on 110V: 600W divided by 0.9 = 666W, so it takes an extra 66 watts to give the LEDs 600W of DC power.

Now let's say we switch that same driver over to 250V, so now it's running 94% efficient.
So the same 600W driver at 250V would be 600W divided by 0.94 = 638W.

So on 110V the light uses 666W

and on 250V the light uses 638W,

so the driver uses fewer watts (28 watts less) on 250V and subsequently your bill will be less.
That's an intelligent theory, although that's not the way it actually works. If you have a 600 W light, you are not going to get more than 600 W out of that light because you're more efficient; you're going to utilize more of the original 600 W than another light at 110 V, but you're still using the same watts either way.

Your 250 V lamp might actually use 596 W of the 600 W, whereas the 110 V lamp, which is less efficient, might only get to utilize 570 of the original 600 W. But in the end you're both using 600 W and your bills will be identical.
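The whole disagreement comes down to what's held fixed, so here's a minimal sketch of both readings side by side, using assumed 90%/94% efficiencies rather than the 596 W / 570 W example above:

```python
# Two readings of "a 600 W light", with assumed 90%/94% driver efficiencies.

EFFICIENCY = {"115v": 0.90, "230v": 0.94}

# Reading A: 600 W is what the driver pulls from the wall. Wall draw is
# fixed, so the bill is identical; only the DC reaching the boards changes.
for v, eff in EFFICIENCY.items():
    print(f"A {v}: 600 W from the wall -> {600 * eff:.0f} W reaches the boards")

# Reading B: 600 W is what the boards receive as DC. Board watts are
# fixed, so the wall draw (and the bill) changes with driver efficiency.
for v, eff in EFFICIENCY.items():
    print(f"B {v}: 600 W to the boards -> {600 / eff:.0f} W pulled from the wall")
```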
 