Starlink: Satellite-Based Internet

I could, I just don’t care to bother. It’s fine as-is.

It’s getting marginally less bad, though it’s still prone to breaking connections. I should figure out how it’s doing that and see if I can work around it without something like the WireGuard tunnels.

I did the math, though. Our house typically draws around 30kWh/day lately without much heating/cooling, if we don’t take the car anywhere. Of that, 2.4kWh is Starlink.

Dishy is about an 8% increase in our spring/fall base load power consumption. o.O
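
For anyone who wants to double-check those figures, here’s the arithmetic as a quick Python sketch (both inputs come straight from the numbers above):

```python
# Sanity check on the Starlink power figures quoted above.
starlink_kwh_per_day = 2.4    # measured Dishy consumption
house_kwh_per_day = 30.0      # typical spring/fall daily load

avg_watts = starlink_kwh_per_day / 24 * 1000  # kWh/day -> average watts
share_pct = starlink_kwh_per_day / house_kwh_per_day * 100

print(f"Dishy average draw: {avg_watts:.0f} W")   # -> 100 W
print(f"Share of house load: {share_pct:.0f}%")   # -> 8%
```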

Yikes, that’s a lot of power for an internet connection! I hope they manage to drive that down substantially with improvements to the hardware and/or software! The kind of crazy thing, though, is that 20-30 years ago that was the typical amount of power a single decent incandescent light bulb would use if run constantly…

Weird - 2.4kWh/day sounds huge to me, but a 100 watt bulb sounds small, even though they’re the same thing.

The real story is that Syonyk uses about 1/3 to 1/2 the power I do, but that’s compared against coming off of a cold April. Time to break out the Kill A Watt and check some things.

Yeah. It really, really is. My WISP CPEs use a fraction of it. It’s substantially greater than the normal idle draw of my office, which is about 30W of inverter idle and 40-50W of various loads, including one of the WISP CPEs, the property-area-network omni access point, a router, sleeping computers, running Pis, and a lot of monitors sipping sleep power.

And if you left a 100W light bulb running constantly, you were likely to be read the riot act by your parents when they figured it out. :stuck_out_tongue: Or it was doing something important like keeping the engine warm so it would start. A 100W trouble lamp under the hood overnight made a big difference!

Probably not. I deliberately picked some low days for comparison. We use a lot of power. :confused: There’s a reason the solar arrays are huge. The house is pure electric, including our water (the well is… oh, I should check the water level, but it’s pumping from 150-200’), and we use a lot of power for transportation - if the Volt moves, that’s 10kWh, and some days that’s doubled now, if there are no scheduling conflicts.

We were stable around 18-19MWh/yr for everything, then it gradually dropped to around 15.5MWh for 2020. Then solar came online, which is pushing our average down a lot, and I don’t have as fine-grained data on our usage because it’s a behind-the-meter interconnect.

Also, those house numbers don’t count office energy use. My office consumes 2MWh/yr.

Hehe maybe I should CHARGE you to take that PC off your hands so you can save on energy :smiley:

I really need to work on doing some rewiring. I have a house that was at some point converted into a duplex and then converted back, so I have 2x 100A service, and since moving my office to the basement the “upper” meter has nearly no draw and everything’s on the lower one. The way we’re billed, it wouldn’t change a ton, but it’d be nice to have it balanced.

Once I get this gas dryer hooked up all our “heat” will be gas which will be nice.

This 300 watt LED I have in my office may have something to do with my power draw (but it REALLY does help; I should think about moving the office to the garage and installing some natural-light windows there).

It’s in my office. It’s opportunistic use only, and with the other stuff I’ve added, it gets used less. I can run all the heat generating computers in here at once, I just can’t also run the air conditioner with that. I’ve capped out my inverter.

Your 300W LED… o.O Are you trying to grow things in there?

It’s a corncob light (I think it’s this one: Feit Electric® 1000-Watt Equivalent COB Daylight LED Light Bulb at Menards® - only 175 watts, good!) and it’s my sole light in an office with no large windows; it seems to help keep me “daylighted”. I have the equipment to make it turn on automatically when the workday starts and turn off (with the normal yellowish bulbs turning on) when it’s time to get off work, but I’ve not set that up yet.
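
A minimal sketch of what that automation could look like, assuming a hypothetical set_lights() helper standing in for whatever actually switches the plugs/bulbs:

```python
import time
from datetime import datetime

WORK_START = "08:00"  # assumed workday start
WORK_END = "17:00"    # assumed workday end

def set_lights(daylight_on: bool) -> None:
    # Hypothetical stand-in: replace with whatever actually switches
    # your plugs/bulbs (Home Assistant call, Tasmota HTTP, etc.).
    print("daylight bulb:", "ON" if daylight_on else "OFF",
          "| warm bulbs:", "OFF" if daylight_on else "ON")

while True:
    # Zero-padded HH:MM strings compare correctly as plain text.
    now = datetime.now().strftime("%H:%M")
    set_lights(WORK_START <= now < WORK_END)
    time.sleep(60)  # re-apply the state once a minute (idempotent)
```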

I suppose I could try side-mounting some form of skylight for the summer months, but long term it’s going to be repositioning the office (maybe converting a current small basement window to a larger egress window for more light).

So many projects, so little time.

That’s a crazy amount of light! 20,000 lumens - about the same as having 25 normal LED bulbs!

Yeah, plenty of “turn the lights out!” comments through the ’90s. Think of a typical household, say a family of five: lights on for a few hours each night after dark and an hour in the mornings. Assume use of 3-4 rooms with, say, an average of 2 bulbs per room, and you quickly hit an equivalent or higher energy load just to keep the lights going for a few hours… Fluorescent and LED lights really have made a drastic change to energy use over the last 20-30 years!
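
Rough math for that, with the post’s own assumptions plugged in:

```python
# Rough version of the household-lighting math above: 3-4 rooms,
# ~2 bulbs each, lights on ~4 hours/day total (evening + morning).
rooms, bulbs_per_room, hours_per_day = 4, 2, 4

for name, watts in [("incandescent", 60), ("LED", 9)]:
    kwh_per_day = rooms * bulbs_per_room * watts * hours_per_day / 1000
    print(f"{name:>12}: {kwh_per_day:.2f} kWh/day")

# incandescent: 1.92 kWh/day -- right in Dishy territory
#          LED: 0.29 kWh/day
```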

It’s interesting to see what other pure-electric setups use in harsher conditions. We are also pure electric, but have the advantage of a far milder climate (northern NZ) and being within city limits (no water pumps, etc.). For us that works out to around 11-12MWh a year with heat-pump heating/cooling. A fair bit of that is unnecessary, though - 1-2MWh probably goes to powering the spa pool.

While the estimates I’ve seen range all over the map due to conflicting and non-standard counting methods - incorrect usage of ‘energy’ versus ‘electricity’, lumping appliances and other residential loads into the lighting category, etc. - current data suggests lighting is significantly less than 5% of global energy use. If the US is a good example, it’s about 1.4% of total energy use (I assume post-LED, since they’re quoting current values), according to Bright Green Lies, which also notes:

“With the introduction of solid-state lighting—such as LEDs … —it was thought (and hoped) that the transition to it from conventional lighting—like electrical filaments, gas, and plasma—would result in big energy savings. According to the latest research, however, the use of LEDs has resulted in a ‘rebound’ effect whereby many jurisdictions have opted to use even more light owing to the associated energy savings.”

So even if we made the world’s lighting completely inconsequential, we’d still not have a noticeable impact on energy use - and in fact, by making it more efficient, we run the risk of making the problem worse (Jevons paradox, again - we’re going to see this a lot due to human nature and looking at the wrong type of solutions).

Here’s a good example of this in action, using Canada’s energy study from 1990 to 2013: Energy Efficiency Trends in Canada 1990 to 2013

Efficiency increased, in many cases strongly, during this timeframe. Yet across the board, total energy use went up, not down. All we’re doing with the increased efficiency is using more.
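
A toy illustration of that rebound effect (every number here is hypothetical, purely to show the shape of the problem):

```python
# Jevons-style rebound, in miniature: lighting gets 6x more
# efficient, but demand for light grows 7x because light is now
# "cheap" -- so total energy use goes up, not down.
baseline_light = 100            # arbitrary units of light demanded
watts_per_unit_before = 1.0
watts_per_unit_after = 1.0 / 6  # 6x efficiency improvement

light_after = baseline_light * 7  # hypothetical demand growth

print("energy before:", baseline_light * watts_per_unit_before)  # 100.0
print("energy after: ", light_after * watts_per_unit_after)      # ~116.7
```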

To keep this on the topic of Starlink, though, if I read what @Syonyk is saying correctly, the heater is a major consumer of power and that could be corrected by not running it unless it was necessary, and then only for as long as needed to get the snow to slough off.

I don’t believe there’s an actual heater in there. Or if there is, it’s not used regularly.

The heat is a byproduct of the phased array system. :confused: It’s inherent to the dish, and I’m not sure it can still be “online” and use less.

Oof, I misunderstood then. But a phased-array system shouldn’t “inherently” consume loads of power. If I understand how they work, they’re basically an x-y grid of antennas (attenuated to form a circle or octagonal approximation, in many cases) whose outputs are combined with various delays to reinforce the beam from a given angle of incidence. Unless each antenna needs its own full amplifier (and those amplifiers are somehow horribly inefficient, to boot), I can’t see how they’re pulling more than a few tens of watts for the array (a mW or two per amplifier × a 100×100 array, for instance). Add a very powerful onboard processor and you’re still not adding more than single-digit watts. It’s baffling to me.
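
Running those numbers, with one added assumption that might explain the gap - transmit amplifier efficiency at these frequencies is poor (the 10% figure below is my guess, not anything from SpaceX):

```python
# Back-of-envelope on the assumptions in the post above, plus an
# assumed power-amplifier efficiency.
elements = 100 * 100        # assumed ~100x100 element array
rf_per_element_w = 0.002    # "a mW or two" per amplifier, as guessed
pa_efficiency = 0.10        # ASSUMPTION: Ku-band PAs are often 10-25%

rf_out_w = elements * rf_per_element_w  # 20 W actually radiated
dc_in_w = rf_out_w / pa_efficiency      # 200 W drawn to produce it
print(f"RF out: {rf_out_w:.0f} W -> "
      f"DC in at {pa_efficiency:.0%}: {dc_in_w:.0f} W")
```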

¯\_(ツ)_/¯

I didn’t design the thing. I just plugged it in and monitored power.

Hearing about the overheating problems that have started to crop up… I understand it’s tricky, but how did the engineers not get the word that people in the middle of the desert - likely with ambient air temperatures ABOVE the rated operating temperature - would use this? Hell, as people are finding out, simply placing it where sunlight/heat can reflect from a surface up onto the dish can cause it to overheat regularly.

With all the “this is a beta”, you’d think they’d also have warned people and given them better guidance as to placement to try and avoid overheating. Guess that’s just what comes from being a beta user. Find the problem/pain points. Still, at least some people are getting meaningful internet, even if imperfect, where they had little to none before.

It seems to me the “uses 100W continuously” and the “overheats in the desert” parts are probably related.
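
A rough thermal budget supports that - every figure below is an assumption except the ~100W draw measured upthread:

```python
# Rough heat budget for a sealed dish sitting in full sun.
electronics_w = 100     # continuous draw, per the measurements above
dish_area_m2 = 0.2      # assumed face area (~0.5 m dish)
solar_w_per_m2 = 1000   # typical full-sun irradiance
absorptivity = 0.3      # assumed for the white finish

solar_gain_w = dish_area_m2 * solar_w_per_m2 * absorptivity
total_w = electronics_w + solar_gain_w
print(f"Heat to shed: {total_w:.0f} W "
      f"({electronics_w} W electronics + {solar_gain_w:.0f} W solar)")
```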

I wonder how much of the computational part could actually be separated from the dish itself to reduce heat issues.

Probably not a lot. I imagine at least some of it could move to a secondary PCB, but I bet it has to remain very close to the antennas to allow for speed-of-light-level timing calibration and to keep the noise well below what you’d pick up running those signals over wires/cables. There are ways to mitigate that, of course, but I bet it’d have added a non-trivial cost.

As a beta/v0.9, probably solid. As a “rock solid hard to screw up”, needs improvement.

Yeah, that’s making the tech news circles, apparently.

“Doesn’t get that warm in Silicon Valley, so I don’t see why it should be a problem.”

That standard SV arrogance is on display all sorts of places. Remember how Tesla’s giant touchscreens were failing? They learned, the hard way, why “automotive grade” parts are tested to high temperatures. The “Cabin Overheat Protection” feature seems to be an “oh, hey, let’s not overheat the touchscreen” feature in practice.

Guidance? Communication? There’s none of that, but I also don’t see how you can place the dish anywhere but “in full sun.” It needs a huge swath of visible sky to work, and even small obstructions like a tree branch in that cone will lead to interruptions. The beam is laser tight in practice.

If there’s anything that casts useful shade on the dish, it will also be blocking the beam. Getting the dish up off the ground would be useful, but given that it ships with a ground tripod, that obviously wasn’t intended to be a requirement.

I checked the debug data on mine and all I see are booleans about thermal throttling and shutdown, no actual temperature readouts I can use to see how hot it is.
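
For anyone who wants to watch those booleans over time, here’s a hedged sketch using the community-documented grpcurl query against the dish’s local endpoint (field names vary by firmware, so check what your own debug dump calls them):

```python
import json
import subprocess

# Poll the dish's local gRPC status endpoint via grpcurl and print
# the thermal alert booleans. The field names here match what
# community tools report, but may differ on your firmware.
CMD = ["grpcurl", "-plaintext", "-d", '{"get_status":{}}',
       "192.168.100.1:9200", "SpaceX.API.Device.Device/Handle"]

status = json.loads(subprocess.check_output(CMD))
alerts = status.get("dishGetStatus", {}).get("alerts", {})
print("thermal throttle:", alerts.get("thermalThrottle", False))
print("thermal shutdown:", alerts.get("thermalShutdown", False))
```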

But there’s a very, very good reason I’ve not cancelled both of my existing ISPs. I did drop the backup one, since Starlink is a good bit quicker than that connection (and since Starlink is on the house, where I have the power to run this power hog). But I certainly wouldn’t rely on Starlink alone for connectivity in my situation, where internet isn’t really optional for work.

My impression is that most of the heat is from the phased array controllers, so “practically none.” You can’t move that away from the radiating surfaces. Plus the frequency is high enough you can’t really do much with remote links.

I’ll see how it is this summer.

Yup, indeed. I wonder if it’s really possible to adequately cool it, sealed the way it is, without some kind of water/radiator loop (yeah… not gonna happen) or active airflow vents - which they apparently don’t want, since they’d have to weather-seal that part of it as well. Which I sorta get, to a degree. But on bright, sunny, hot summer days, I’m sure they’ll start getting a lot more reports of overheat shutdowns. And annoyed customers. And elaborate DIY solutions that some of them might end up trying, lol.

Surely there are some tricks that could be used to reduce the heating or increase the cooling?
For example: adding sloped/conical white surfaces around the device at ground level to reduce the amount of radiant heat reflected from below; colouring the lower part of whatever side points away from the sun (these are north-facing?) to improve radiant heat transfer; changing the stand to add a fan that blows air up from below to assist cooling; or looking into Peltier cooling - if you’re already throwing that much power at the thing, why not add a bit more…

Seems like it’s simply something they didn’t think about…

Peltiers will make the problem worse at these power levels - they generate a lot of waste heat for the amount of cooling they do. I’ve fought with them for a while and they’re just a pain in the rear.
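
The arithmetic makes the problem obvious (the COP below is an assumed typical value for a thermoelectric cooler at a useful temperature delta):

```python
# A Peltier's hot side must reject the pumped heat PLUS the
# electrical input, and TEC coefficient of performance is poor.
heat_to_pump_w = 100   # roughly Dishy's own dissipation
cop = 0.5              # ASSUMED typical TEC COP at a useful delta-T

electrical_in_w = heat_to_pump_w / cop          # 200 W of extra draw
hot_side_w = heat_to_pump_w + electrical_in_w   # 300 W to shed outside
print(f"Extra draw: {electrical_in_w:.0f} W; "
      f"hot side rejects: {hot_side_w:.0f} W")
```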

Supposedly the newer gen hardware uses half the power and might run cooler as a result, but it does strike me as absurd that nobody tested the thing in high temperatures.
