Solar Discussion

Open topic for questions/conversations/etc about solar!

My project is coming along, though it’s taking far longer than hoped…

Well, I’ll kick this one off with a question, though it’s one that might be difficult to get hard data on. Putting aside for a moment the cost and mechanical complexity of a tracking system for a solar array, what is the rough gain in daily production from an array on something like an equatorial mount that can track the sun throughout the day, as opposed to panels in a completely fixed orientation, like a roof mount?

The best answer to this is playing around on PVWatts.

You can select fixed mounts, one axis trackers, or two axis trackers and see what you gain.

Down in Melba, a 1kW system, south facing, 45 degrees, 10% losses, works out to 1.57MWh/yr.

Do that with two axis tracking, you get 2.2MWh/yr - so 40% more production, at some significant increase in cost.

Interesting - 40% is a lot more of an increase than I thought it would be. It could be an interesting cost analysis for a given system size: when is it more economical to build/buy a tracking setup vs. buying 40% more solar panels (and the subsequent increase in cost for larger wire, chargers, etc. to handle the increased peak current)?
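As a rough sketch of that break-even, here’s the arithmetic in a few lines. The yield numbers come from the PVWatts figures above; the panel price and the balance-of-system multiplier are illustrative assumptions, not quotes:

```python
# Rough break-even sketch: two-axis tracker vs. simply adding ~40% more fixed panels.
# Yields are the PVWatts numbers from above; prices are illustrative assumptions.

fixed_yield_kwh_per_kw = 1570    # fixed mount, Melba, 45 degrees, 10% losses
tracked_yield_kwh_per_kw = 2200  # same system with two-axis tracking

panel_cost_per_w = 0.50          # assumed panel price, $/W
bos_overhead = 1.3               # assumed multiplier for bigger wire, chargers, etc.

system_kw = 1.0
extra_kw = system_kw * (tracked_yield_kwh_per_kw / fixed_yield_kwh_per_kw - 1)
extra_panel_cost = extra_kw * 1000 * panel_cost_per_w * bos_overhead

print(f"Extra fixed capacity to match the tracker: {extra_kw:.2f} kW")
print(f"Cost of those extra panels (incl. BOS bump): ${extra_panel_cost:.0f}")
# A tracker only wins on pure $/kWh if it costs less than that figure.
```

At assumed prices like these, the tracker has very little room to be cheaper - which is the conclusion the rest of the thread reaches.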

I can offer my thoughts, based on the system I’m building, which is not tracker-based.

As the cost of panels has dropped (I got mine last year for $0.50/W, new-old-stock), the balance-of-system costs have increased as a share of the total. In the 1980s, yeah, trackers made sense, because the panels were $10/W or more - it was nuts - so they dominated system cost, and the balance of system was a rounding error compared to the panels.

My system design is mainly east- and west-facing panels run to separate MPPT strings on the inverters. It adds to the panel cost by some amount, but the wiring costs don’t change much - I’m able to run fairly cheap wire from the panels, because they’re 10A peak strings (so 12 gauge is perfectly fine), and even though my DC/AC ratio is fairly high at about 1.18, the inverters shouldn’t ever really be clipping. The east and west panels won’t be getting full illumination at the same time, and if I wanted to go with smaller inverters, I could. Reality is that I picked my inverters under the plan that I’ll probably change panels out in 15 years without bothering to tell anyone… so I might end up with a bit more on the DC side.
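A quick sanity check on that wire sizing. The 25A figure is a common table ampacity for 12 AWG copper at 75°C, and the 1.56 multiplier is the usual 125% × 125% PV current factor - treat both as assumptions and verify against your local code:

```python
# Sanity-check string wiring: does 12 AWG cover a 10 A peak string?
# Values are typical NEC-style numbers; check your local code before building.

i_string = 10.0              # string peak current, A (from above)
pv_factor = 1.25 * 1.25      # continuous-duty and irradiance-enhancement factors
required_ampacity = i_string * pv_factor

awg12_ampacity = 25.0        # 12 AWG copper at 75C, a common table value

print(f"Required ampacity: {required_ampacity:.2f} A")
print(f"12 AWG OK: {awg12_ampacity >= required_ampacity}")
```

So a 10A string needs roughly 15.6A of conductor ampacity, comfortably inside what 12 gauge carries.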

My system costs are coming in around $25k (give or take a bit), and less than half of that is panels. I looked at doing trackers, and they’re quite expensive, complicated, and failure prone compared to lazy lizard panels, just laying in the sun and not moving.

Play with the numbers, but I don’t think you’re going to make trackers make a huge amount of sense anymore unless you’re building them yourself, from scratch - which I consider an entirely reasonable thing for you to do, FWIW.

Late to the party, I know… but my input on the tracker thing.

A tracker can make sense, but not often economical sense anymore. I’ll explain…

A tracker is expensive (unless you build it yourself, which, hey, if you have time and effort to spare, go for it! They’re cool!) and the efficiency gain (although it looks massive on paper, like 40%) fails to eclipse just adding more panels as far as cost. As Syonyk said, you can always throw another array facing east and another facing west, along with separate charge controllers / inverters, and you break even with the cost of a commercial solar tracker system. And as an additional bonus, you now have some more redundancy built in.

The situation where trackers do make sense, though they might not be ‘economical’ as in ‘more power per $’:
You can’t fit more panels. Say you have a limited area to put panels - few or no obstructions, but you need a lot of energy per day out of only a little bit of land. In that case, a tracker can make sense. I say it doesn’t make economical sense because, in 99% of the areas where this situation would occur, the cheaper option would be to plug into the grid and just suck as hard and fast as you want. :stuck_out_tongue:

Depending on how your system is set up, you may not even need additional charge controllers/inverters.

For my house grid tied system, the Sunny Boy inverters already support multiple MPPT strings, so while I’ve got a hefty DC/AC ratio, I don’t actually need extra inverters. I could even have gone smaller with the inverters, but if you’re doing creative math to cap out your panel, may as well go for the limit (there was no significant cost delta by dropping an inverter size).

On a DC system, I’d be awfully inclined to put a good ideal diode at the end of each string and just merge the east/west strings together into a single MPPT controller.
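A toy model of that diode-OR arrangement, just to show the behavior (each string is crudely approximated as a constant-current source up to its open-circuit voltage; real panel IV curves are softer than this, and the numbers are made up):

```python
# Toy model of diode-OR'ing an east and a west string into one MPPT input.
# Approximation: each string sources a fixed current until the bus voltage
# reaches its Voc, at which point its ideal diode blocks it entirely.

def string_current(bus_v, i_light, v_oc):
    # Ideal diode conducts only while the string can hold the bus voltage up.
    return i_light if bus_v < v_oc else 0.0

def combined_current(bus_v, strings):
    # The MPPT input sees the sum of whatever each string can contribute.
    return sum(string_current(bus_v, i, v) for i, v in strings)

# Morning example: east string fully lit (10 A), west string in diffuse light (2 A).
strings = [(10.0, 40.0), (2.0, 38.0)]
print(f"Combined current at 35 V: {combined_current(35.0, strings)} A")
```

The point of the diodes is just that the brighter string can’t backfeed the dimmer one - the MPPT sees both contributions added together at whatever bus voltage it picks.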

The other benefit is cloudy days - the increased panel area means that on cloudy days, you get more production out of the system, because where panels are aimed doesn’t matter - only surface area does.

On the same note… thoughts on amorphous vs. monocrystalline? I’m running a few emergency amorphous panels and they still charge the emergency packs even on cloudy days. Not sure I’d deploy them somewhere with good insolation, of course.

I ran Kyocera KD135GX-LFBS “multi-crystalline” panels for 5 years as my main power source for a totally off-grid install which was my primary residence for the majority of that time. I felt that they were excellent as primary panels. @Syonyk probably has some anecdotal evidence on his panels too.

I really haven’t paid much attention to the various panel types. I know the amorphous stuff is supposedly better in low light conditions, but I think that’s partly that it’s just not as good in full sun - so it tracks ambient illumination better and doesn’t require direct solar photons for peak output.

I’m pretty sure everything I own is monocrystalline, though.

Yeah they don’t need the direct light so it’s great to put into a vehicle for passive trickle charging, etc. I really just keep enough solar for basic comms and lighting, and that’s about it.

I would assume the amorphous stuff would be fine for that. It generates power, that’s all that really matters!

I’ve considered some stick-on solar for my truck, but it mainly lives under a carport these days (on a battery tender, because I’m cheap and don’t feel like replacing the batteries yet).

Yeah, especially with the low use these days, all I need is a watt or two, and amorphous works even in shade or clouds. Heck, the ‘charge’ light will light up even inside my garage, and I have it piped in through the OBD-II 12V line in the truck, since the 12V lighter socket disconnects when the engine is off.

I like the OBD-II line trick - I’ve had live cigarette lighter ports on my various vehicles I’ve wanted to charge.

My truck just lives on a plug in battery tender, though… I’m not driving it enough to keep it charged, and I don’t want to replace the batteries just yet.

Anybody here notice any less power output with the terrific amount of smoke around? I figure it’s denser horizontally than vertically, but I’m curious if there’s any production loss.

Oh yeah. It massively impacts my production. I get no solar in the mornings, significantly reduced solar mid-day, and no solar in the evenings. This is actually the hardest time of year for me, energy-wise. I need quite a bit of energy (4-5kWh/day) for air conditioning on the hottest days, and I can’t always get it. At least the batteries are warm.

Your eyes don’t really do the dynamic range of the sky justice. They adapt very well, and you don’t notice huge changes in brightness over time. Solar… notices.

Sounds like an argument for at least some amorphous panels in your mix.

It can’t pull energy that isn’t there.

My understanding of amorphous is that while it doesn’t degrade in lower light conditions as much, that’s because it’s not generating as much in full brightness conditions anyway. What’s the watts per unit area peak production on your panels vs a good monosilicon panel?

You’ve had your panels long enough that you may be able to see some degradation - are you above or below the 1% they’re talking about here?

It may be hard to baseline if you don’t have “new” panels to compare to.

Report seems to be here: https://static1.squarespace.com/static/5b4e34d1f2e6b166c33dc4f1/t/60bac2d447c5540fbe510992/1622852311371/Solar+Risk+Assessment+2020.pdf

and the group seems to be a panel insurance company so they do have reasons to find the “true” amount to price correctly - but they may also be inclined to overstate it. It’ll be interesting in 15-20 years to see if people just leave underperforming solar around or if they begin to get removed/replaced.
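For a sense of how that compounds over a panel’s life - using the ~1%/yr figure under discussion, though real rates vary by panel and climate:

```python
# How much output remains after N years at a steady fractional degradation rate?
# The 1%/yr default is the figure discussed above; actual rates vary.

def remaining_output(years, annual_degradation=0.01):
    # Degradation compounds: each year keeps (1 - rate) of the prior year's output.
    return (1 - annual_degradation) ** years

for years in (5, 10, 20, 25):
    print(f"Year {years}: {remaining_output(years):.1%} of original output")
```

At 1%/yr you’re still at roughly 78% of nameplate after 25 years, which is about where typical warranty curves sit - so whether people bother replacing panels at that point is a fair question.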

I don’t know, and I’ve no good way to find out. The solar days are still getting longer, and I have no real basis of comparison.

The east array on my east frame seems to be missing 30-40W in the morning compared to the west array, even after cleaning, which implies I might have a bad diode - but it’s not present by noon, and it doesn’t represent any large loss in total system production. That’s a change from when the panels were brand new.

I’m mostly planning to let them run until they get really bad, and then perhaps replace panels and test - but until then, I won’t notice degradation that’s equal across the array. All I can do is compare strings to each other.

I could put Tigo optimizers on and monitor each panel (while optimizing production), but at $3000 for the array, I’d literally never make those worth it, and that’s a good chunk of change for idle curiosity. Plus I’d have to figure out battery interfaces for the secure power supply outlets. It’s just adding a ton of complexity for no real benefits.