LED Help - Please don't hate me ;-)

PurpleBuz

Well-Known Member
:confused:

That is a great tip...lol I never would've thought of that! Thanks!



I clearly don't know what I'm talking about so...

How many WPSF would the 700mA driver give with 6 COBs?
How many WPSF would the 1050mA driver give with 6 COBs?

What is the calculation for this? Thanks so much for your patience with me... I'm humbled yet again...
It's better to fill the HLGs to within 90% of their capacity, for driver efficiency. The calc is to sum the Vf of the COBs and multiply by the current, staying below the driver's max voltage.
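That sizing rule can be sketched as a quick check. The ~36V forward voltage per COB and the ~250W / 238V ratings for an HLG-240H-C1050 are assumptions for illustration, not figures from this thread:

```python
# Driver-loading check: sum the COB forward voltages (Vf), multiply by
# the drive current, and compare against the driver's ratings.
# The ~36 V Vf per COB and the driver ratings below are assumptions.

def driver_load(n_cobs, vf, current_a, driver_max_w, driver_max_v):
    total_v = n_cobs * vf               # series string voltage
    total_w = total_v * current_a       # power delivered to the COBs
    pct = total_w / driver_max_w * 100  # how "full" the driver is
    fits = total_v <= driver_max_v
    return total_v, total_w, pct, fits

# 6 COBs at ~36 Vf on an HLG-240H-C1050 (assumed ~250 W, 238 V max)
v, w, pct, fits = driver_load(6, 36.0, 1.05, 250, 238)
print(f"{v:.0f} V string, {w:.1f} W, {pct:.0f}% of capacity, fits: {fits}")
```

That lands right around the 90% mark mentioned above.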
 

giantsfan24

Well-Known Member
It's better to fill the HLGs to within 90% of their capacity, for driver efficiency. The calc is to sum the Vf of the COBs and multiply by the current, staying below the driver's max voltage.
That's what I meant... it's better to pair the COBs up to the drivers and not "lag" them, so to speak? "Vf" is forward voltage, correct?
 

giantsfan24

Well-Known Member
6x CXB3070 @ 0.7A gets you ~650 PPFD in a 2.5'x2.5' space at 141 watts

6x CXB3070 @ 1.05A gets you ~930 PPFD in a 2.5'x2.5' space at 218 watts

Ok, so the 1.05A is the 1050mA driver? One driver would run 6 or 7 COBs? Dude, thanks so much for walking me through this. You've been extremely helpful. :-)
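For the WPSF question from earlier, the quoted setups work out directly; a quick sketch:

```python
# Watts per square foot for the two quoted setups in a 2.5' x 2.5' space.
def watts_per_sqft(watts, side_ft=2.5):
    return watts / (side_ft * side_ft)  # 2.5 x 2.5 = 6.25 sq ft

for watts in (141, 218):
    print(f"{watts} W -> {watts_per_sqft(watts):.1f} W/sq ft")
```

So roughly 22.6 W/sq ft at 0.7A and 34.9 W/sq ft at 1.05A.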
 

giantsfan24

Well-Known Member
6

@bassman999 with CXB3590 vs CXB3070, you have to think of missed opportunity costs as well. For the OP's 6 COBs, it's $102 more for 3590s over 3070s. For that money you could buy an extra 3070, which more than makes up the light output difference, and you're still left with an extra $70 saved. Put that towards electricity and you can run that extra COB @ 1.4A for free for about 18 months @ 18/6. By then we're probably all upgrading to CXB6599s with their 30,000 lumens or whatever crazy shit they have out by then.
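The "free runtime" estimate above can be reproduced as a rough sketch. The ~36V forward voltage and the $0.14/kWh electricity rate are assumptions here, since the post doesn't state a rate:

```python
# Dollars saved buy kWh; those kWh fund the extra COB's wall draw.
# The 36 V Vf and $0.14/kWh rate are assumptions for illustration.

def free_runtime_months(savings_usd, rate_per_kwh, cob_watts, hours_per_day=18):
    kwh = savings_usd / rate_per_kwh    # energy the savings can buy
    hours = kwh * 1000 / cob_watts      # runtime at that draw
    return hours / hours_per_day / 30.4 # ~30.4 days per month

# Extra CXB3070 at 1.4 A and an assumed ~36 Vf is ~50.4 W at the COB
months = free_runtime_months(70, 0.14, 36 * 1.4)
print(f"~{months:.0f} months of free runtime")
```

With those assumptions it comes out in the ballpark of the 18 months quoted.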
This is exactly what I was thinking but could not quantify. Thank you again!!
 

Atulip

Well-Known Member
The Mean Well HLG-240H-C1050 will run 6 CXB3070s and have some voltage left over to run fans too if needed.
 

Rahz

Well-Known Member
Wow! Thanks for that calculation. I just had a sense but you've clearly spelled it out.

As this is a personal grow and my usage is so light, it just doesn't make sense to go deep and buy the highest-end stuff; better to get what's adequate and will do the job.

Where did you see the 3070s at $32? I've seen $44.
Comparing the same drive current doesn't tell the whole story. The 3590 CD runs at about the same efficiency at 2.1 amps as the 3070 BB does at 1.4 amps. The 3590 at the higher drive current still runs at roughly 50% efficiency but provides 50% more light. It's no coincidence that it costs about 50% more. This also holds true as they are underdriven by their respective percentages of nominal, slightly favoring the 3590.
 

Atulip

Well-Known Member
Comparing the same drive current doesn't tell the whole story. The 3590 CD runs at about the same efficiency at 2.1 amps as the 3070 BB does at 1.4 amps. The 3590 at the higher drive current still runs at roughly 50% efficiency but provides 50% more light. It's no coincidence that it costs about 50% more. This also holds true as they are underdriven by their respective percentages of nominal, slightly favoring the 3590.

We're comparing the same drive currents because we're comparing watt for watt. Yeah, you can run 3590s harder and get more light at a higher current. The same could be said for 3070s.

CXB3070 BB: 36V @ 2.1A, 11,185 lumens
CXB3590 CD: 36V @ 2.1A, 12,166 lumens

Watt for watt, the 3590 is 8.8% more light output at 2.1A. That doesn't quite justify a 53% increase in price, at Cutter's prices.

You could apply the savings to the additional "waste" wattage costs. Even at 2.1A, the CXB3070 BB has 4.0W of additional waste. Then add the ~10% increase in overall wattage needed to surpass the light output of the 3590: 7.8W more + 4W waste = 11.8W.

$17 saved @ $0.12/kWh is ~141 kWh. At 11.8W, that's 11,948 hours, or 1.8 years @ 18/6.
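The arithmetic in that last line, spelled out with the same figures as the post:

```python
# Payback of the $17 price difference at $0.12/kWh against 11.8 W of
# extra wall draw, on an 18/6 light schedule.
savings_usd = 17.0
rate_per_kwh = 0.12
extra_watts = 11.8   # 7.8 W extra + 4.0 W waste, from the post above

kwh = savings_usd / rate_per_kwh   # energy the savings can buy
hours = kwh * 1000 / extra_watts   # runtime at that extra draw
years = hours / 18 / 365           # 18 h/day
print(f"{kwh:.0f} kWh -> {hours:.0f} h -> {years:.1f} years")
```

It lands a touch over the quoted 11,948 hours (the post rounded to 141 kWh first), but the ~1.8 years holds either way.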
 
Last edited:

Rahz

Well-Known Member
Yeah, you can run 3590s harder and get more light at a higher current. The same could be said for 3070s.
You're missing the point I am making. The cost per PAR watt and the efficiency at each chip's nominal current are virtually identical. There is no secret sauce that makes the 3590 or the 3070 stand out from the other. They are based on the same technology. The 3070 has a cost advantage at 1.4 amps because it is driven at a higher percentage of its nominal current than the 3590 at 1.4 amps: 72% and 58% respectively. You can't underdrive the 3590 more and then say it's more expensive.
 

Atulip

Well-Known Member
But we're all underdriving both 3590s and 3070s. The OP is talking about running at 1050mA. It makes zero financial sense for him to choose the 3590 over the 3070.

Yes, because the 3590 is larger, with more individual LEDs, so the maximum drive current is higher, and by running it at equal watts we get more efficiency. The same thing can be achieved by running more 3070s to match the efficiency.

At max drive current: 1.9A for the 3070, 2.4A for the 3590.

CXB3070 BB: 9,500 min lumens @ 85°C - 68.4W - 138.9 lm/W
CXB3590 CD: 12,000 min lumens @ 85°C - 86.4W - 138.9 lm/W

26.3% increase in capacity for a 53% increase in cost.
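Those efficacy figures check out from the quoted datasheet minimums. The ~36V forward voltage at max current is an assumption for both chips:

```python
# Efficacy at max drive current from the quoted minimum-flux figures.
# The 36 V forward voltage is an assumption for both chips.
chips = {
    "CXB3070 BB": (9500, 36.0, 1.9),   # (min lumens @ 85C, Vf, max amps)
    "CXB3590 CD": (12000, 36.0, 2.4),
}
for name, (lumens, vf, amps) in chips.items():
    watts = vf * amps
    print(f"{name}: {watts:.1f} W, {lumens / watts:.1f} lm/W")
```

Identical lm/W at max drive, which is the point: same technology, different sizes.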
 

Rahz

Well-Known Member
But we're not running them at 85°C, and we are underdriving in the case of 1.4 amps for the 3070 and 2.1 amps for the 3590. For both solutions the efficiency is right at 50%: the 3590 is putting out just over 12,000 lumens while the 3070 is putting out just over 8,000 lumens. About 50% more light for 50% more cost.

The picture is similar if we drive the 3070 at 1.05 amps (6,400 lumens at 54% efficiency) and the 3590 at 1.4 amps (8,900 lumens at 56% efficiency): again close to 50% more light for 50% more cost. At 0.7 amps for the 3590 (4,750 lumens at 64% efficiency) and 0.5 amps for the 3070 (3,200 lumens at 60% efficiency), the 3590 provides 50% more lumens for 50% more cost.

A fairer comparison would be the 3070 BB at 1.05 amps (6,400 lumens, 54% efficiency) and the 3590 CD at 1.75 amps (10,500 lumens, 53% efficiency).
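That pairing can be checked as light-vs-cost ratios. The $32 / $49 prices are taken from other posts in this thread, an assumption here rather than Rahz's own figures:

```python
# Light and cost ratios for 3070 BB @ 1.05 A vs 3590 CD @ 1.75 A.
# Prices are the $32 / $49 Cutter figures quoted elsewhere in the thread.
lm_3070, lm_3590 = 6400, 10500
price_3070, price_3590 = 32.0, 49.0

light_ratio = lm_3590 / lm_3070
cost_ratio = price_3590 / price_3070
print(f"light: +{(light_ratio - 1) * 100:.0f}%  cost: +{(cost_ratio - 1) * 100:.0f}%")
```

At those operating points the extra light slightly outruns the extra cost, consistent with the "slightly favoring the 3590" remark later in the post.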
 
Last edited:

PurpleBuz

Well-Known Member
...and we are underdriving in the case of 1.4 amps for the 3070 and 2.1 amps for the 3590. For both solutions the efficiency is right at 50% and the 3590 is putting out just over 12,000 lumens while the 3070 is putting out just over 8,000 lumens. About 50% more light for 50% more cost.

You're both right :)

Rahz, the only thing you're missing in your analysis is that, depending on exactly what pricing is available at the time of purchase, it may be cheaper to use more 3070s at a lower current than the 3590s. The 3590s carry a significant pricing premium. Thanks to Cutter, Jerry and PLC, the premium is much lower, but it's still there.
 
Last edited:

Atulip

Well-Known Member
Look man I'm trying to explain as best I can.

You're trying to compare the light output of a 3070 at 73.7% of its maximum and a 3590 at 87.5% of its maximum. Of course the 75.6W 3590 puts out 50% more light than the 50.4W 3070; it also happens to draw 50% more wall watts too...

That still doesn't justify the $49 price vs $32.

Let's say I make a 4x CD-bin 3590 light and run it at max (2.4A): 13,237 min lumens each @ 25°C. $198 for the COBs. 52,948 total lumens, 345.6W, 153.2 lm/W.

Now let's get 5x BB-bin 3070s at their max (1.9A): 10,520 min lumens each @ 25°C. $158 for the COBs. 52,600 total lumens, 342W, 153.8 lm/W.

I just saved $40 and got the same light output, and actually a better spread too. (Yes, technically 348 lumens less, but also 3.6W less.)


Numbers are from the Cree datasheets and prices are from Cutter. Now, if you can get 3590s for ~$40, that's about where they start becoming a better value over 3070s.
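Tallying both build-outs from the quoted numbers (the ~36V forward voltage is assumed):

```python
# 4x CXB3590 CD at 2.4 A vs 5x CXB3070 BB at 1.9 A, per the post above.
# Per-COB watts assume ~36 V forward voltage times max drive current.
builds = {
    "4x CXB3590 CD @ 2.4A, $198": (4, 13237, 36.0 * 2.4),
    "5x CXB3070 BB @ 1.9A, $158": (5, 10520, 36.0 * 1.9),
}
for name, (n, lm_each, w_each) in builds.items():
    total_lm, total_w = n * lm_each, n * w_each
    print(f"{name}: {total_lm} lm, {total_w:.1f} W, {total_lm / total_w:.1f} lm/W")
```

Same totals as the post: near-identical light and wall watts, $40 apart.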
 

Fastslappy

Well-Known Member
Or with the 3590 you get more light in a smaller package, and that smaller package is a savings not reflected by Atulip, as more COBs mean more hardware to support them.
 

Atulip

Well-Known Member
A smaller package also means a more concentrated area of heat (less efficiency) and a more concentrated area of light (poorer spread).

Well, to match light output at equal wattages you need to use more 3070s at lower current each than 3590s. ~26% more COBs at ~26% less current gets you the same light output and wall watts for 3070s vs 3590s.

Basically, 5x 3070s = 4x 3590s at the same overall watts.

So I take my $40 saved per 5 COBs bought and I buy the extra holder needed, an optic, an individual Arctic CPU cooler, maybe a fuse to protect my extra COB. Every extra expense can be covered by that $40 savings by choosing 5x 3070s over 4x 3590s.

If you can get them for $40 per 3590, then 5x 3070s at $32 equals out.
 

Rahz

Well-Known Member
Look man I'm trying to explain as best I can.
I understand what you're saying, but the situation changes when running them at lower temperatures, and the percentage of nominal current is a secondary consideration to efficiency and PAR watts. At 1.75 amps and 50°C the 3590 provides 33 PAR watts. At 1.05 amps and 50°C the 3070 provides 20 PAR watts. The 3070 is underdriven further, but the important part is that at those currents they're both operating at the same efficiency. Thus, for any particular wattage, the amount of light will be similar. At nominal current the 3070 has a slight edge. They become almost even at around 1.75A vs 1.05A. At lower currents the 3590 has a slight edge.

I don't think package size is a de facto increase in heat, but I will agree that being able to use more COBs can be advantageous. Still, when we're talking about two versions of the same technology, it's not reasonable to expect one to provide a notable performance increase over the other.

4x 3590s at 250 watts will provide slightly more light than 5x 3070s at 250 watts. That $40 saved is going to show up as less yield, due to 1,500 lumens less light.
 
Last edited:

Fastslappy

Well-Known Member
On a small scale I agree with you, Atulip, but once you have to start buying more hardware across multiple fixtures, the 3590 gets more attractive to me.
 

Atulip

Well-Known Member
Even at 1,500 lumens less overall lowering your yield, that's easily made up by a more even spread or the ability to bring your lights closer.

Just going by flat numbers, I think a light needs to be 50% better to justify 50% more cost. At 1.4A, like most are running, it has 5.6% more efficiency and puts out maybe 10% more lumens.

If you're willing to pay 50% more for 10% better COBs, why not just spend that on more 3070s, run them at higher overall watts, and put the money saved towards electricity?
 