From a comment on justbeast's journal:
Jun. 2nd, 2014 07:09 pm

Consider a box of circuitry that can be attached to any solar panel. The box draws a minuscule amount of power directly from the panel itself to run the logic inside. It's got three ports with standard interconnects - two with sockets that happen to match the shape of a USB port, so if the user has USB cables sitting around, they can use those to construct the network, or if they have a USB device that needs charging, they can plug that in to any node on the network and the box will charge it.
By stringing the boxes together and attaching them to solar panels of varying size, shape, make, model, etc, an end user can create a solar array of their own design. For outdoor use they'd use waterproof cables that match the USB socket standard.
They could set something up that works in their back garden, on their roof, on their porch. Hang it out the window of an apartment. Something that folds up. Something that hangs off a bicycle. Something they can deploy on a camping trip, to charge their crap and their car battery in an emergency, then roll up. Spend a hundred bucks on a panel, then next month or next year spend a few hundred more.
The key is, the boxes can be daisy-chained from one panel to the next in as long a sequence as needed, or in a tree pattern using the one upstream socket and the two downstream sockets. Attach a battery anywhere on a node and it will charge the battery when it can, then draw from it if it needs to. It's an ad-hoc power grid that any reasonably intelligent ten-year-old could assemble.
The trunk node snakes inside the house and goes to a box not unlike a UPS that spits out normal household 110. The big power spikes get handled by the normal power line, but everything is assisted by the battery-backed input of your trunk node. Give it an 802.11 chip and some clever firmware and you could monitor the whole tree from your phone.
So basically, a little at a time, I could casually construct my own household solar power solution, tailored to my budget and my terrain, and scale it from "let's charge my phone" all the way up to "let's dry my clothes and charge my electric car" without interruption.
Who is working on this?
no subject
Date: 2014-06-03 11:22 am (UTC)

First, let's talk about scale.
1W = not quite enough to charge a typical old-school flip phone, perhaps coming from a weak tiny wall wart, or a USB port limited to 200 mA
10W = almost the max of a USB cable's capability, typically used to charge a big gadget like an iPad; also enough power for a ~75W-equivalent LED bulb
100W = a little more than the power for a big high-end laptop charger brick, typically delivered at a higher voltage like 15V to 20V (would need 20A at USB's 5V)
1000W = a big beefy old-school PowerMac/Mac Pro at full throttle, or a current-gen Mac Pro plus a laptop, three displays, and a disk array, all running full throttle, all power delivered via 120V AC (would need 200A at USB's 5V)
Why so much current?
P = I * V
power equals current times voltage
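To make those tiers concrete, here's a quick back-of-the-envelope sketch in Python (nothing in it but the arithmetic above):

# Current required to deliver a given power at USB's nominal 5V.
# P = I * V, so I = P / V.
BUS_VOLTAGE = 5.0  # volts, USB nominal

for power_w in (1, 10, 100, 1000):
    current_a = power_w / BUS_VOLTAGE
    print(f"{power_w:>4} W at {BUS_VOLTAGE:.0f} V needs {current_a:g} A")

# 1 W needs 0.2 A, 10 W needs 2 A, 100 W needs 20 A, 1000 W needs 200 A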
So once you start wanting to pool together all of these USB devices, assuming the voltage isn't changing, all of the current in parallel adds up. Except each device would be generating 5V on its own, and each would be slightly off, so each would have its output voltage slightly above or below the others, causing their power supplies to fight each other. But that's another issue entirely (more on that later). Put simply, adding up multiple power supplies is a bit trickier than simply shorting all of their outputs together and plugging in some new load.
Also, USB cabling and interconnect, like any non-superconductor, is lossy. A typical 10W USB charger brick outputs up to 2.1A at 5V (10.5W) to deliver 10W (if you're lucky) to a device needing a charge. The 12W iPad charger brick, running at the slightly goosed 5.2V and delivering up to 2.4A (5.2V * 2.4A = 12.48W), is about the limit of what should be put down a USB cable, and especially through USB interconnect.
Why so much loss?
P = I² * R
power equals current squared times resistance
Double the voltage and halve the current, and the power remains the same. Less current means less I²R drop, less transmission loss. Note also that resistance is cumulative over the length of the wire, whether you're stringing cables across your yard or power lines across miles and miles. High voltages enable long distance power transmission. So really what you want is to keep the current as low as possible. That means getting the voltage as high as possible to deliver the same power and avoid the transmission losses from resistance.
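A toy calculation makes the point. Assume, say, 0.5 ohms of round-trip cable resistance (a made-up but plausible figure for several meters of thin wire) and the same 10W delivered:

# Transmission loss for the same 10 W delivered at different voltages.
# Loss = I^2 * R, and I = P / V, so doubling V quarters the loss.
CABLE_RESISTANCE = 0.5  # ohms round trip (an assumed figure)
POWER_W = 10.0

for volts in (5, 12, 24, 48):
    current = POWER_W / volts
    loss_w = current ** 2 * CABLE_RESISTANCE
    print(f"{volts:>2} V: {current:5.2f} A, {loss_w:6.3f} W lost in the cable")

At 5V you burn 2W (a fifth of your power!) in the cable; at 48V, about 0.02W.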
no subject
Date: 2014-06-03 11:23 am (UTC)

Histrionics and nerd rage aside, the classic story of Edison vs. Tesla, DC vs. AC, the grand War of the Currents is as much about voltage as it is about direct vs alternating current. Edison's safe low voltage DC power could get about a mile from the power station before the losses killed it and the light bulbs wouldn't glow bright enough to be useful. Increasing the transmission distance would necessitate higher voltage, which didn't work out for a variety of reasons. Tesla's dangerous animal-shocking high voltage AC power could span the state. And the trick wasn't just generating AC power, though the AC dynamo is a brilliant fever dream of an invention, rather it was getting the generated power up to super high voltages. But where did that high voltage power come from?
Back then, DC electricity was typically generated with batteries or motors. Want more volts? Add more cells, make bigger motors, or spin your motors faster. Once generated, stepping voltage up required using one motor to spin another at a higher speed, almost like gears on a bike. Early non-mechanical experiments in stepping up DC voltages in the 1910s and 1920s used highly inefficient voltage doubler rectifier circuits with nasty ripple on the outputs. The first semi-modern DC to DC converter was invented in the 1950s and used a simple relaxation oscillator to drive a transformer, step the voltage up, then rectify it back to DC. Again, not very efficient. The modern DC-DC boost converter wasn't invented until the 1970s, and even then was highly inefficient. Modern high-efficiency DC-DC boost converters are in the neighborhood of 90% efficient at optimal load, and only the really intelligent power supplies approach that efficiency at light loads.
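For reference, the ideal (lossless) boost converter relation is dead simple; the hard part, then and now, is approaching it efficiently in real hardware. A sketch of the ideal case:

# Ideal boost converter: Vout = Vin / (1 - D), where D is the switch duty
# cycle. Real converters give up efficiency to switching and conduction losses.
def ideal_boost_vout(vin, duty_cycle):
    assert 0.0 <= duty_cycle < 1.0
    return vin / (1.0 - duty_cycle)

for d in (0.0, 0.5, 0.75, 0.9):
    print(f"D = {d:4.2f}: 5 V in gives {ideal_boost_vout(5.0, d):5.1f} V out")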
High-voltage AC on the other hand was nearly trivial (in hindsight) in the late 19th century using technologies and manufacturing capabilities of the day. Once you've devised an efficient way to generate AC power at a reasonably constant frequency, and know how to wrap wires around iron, you're most of the way there. An AC generator connected to a transformer can output roughly the same power at nearly any voltage, depending on how you wind the wires in the generator and transformer. These days, a typical power station transformer is 95-99% efficient, and it's just a bunch of wire wrapped around a dumb hunk of metal. If well-designed and not overloaded, it can sit there stepping power up or down, efficiently, for a very long time. So if power is generated as AC, and stepped up to high voltage to reduce current and loss, driven down the line, stepped down to something safer, and delivered to its final destination still as AC, you only have the losses of some transformers and the resistance (and skin effect) of the wire.
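The "dumb hunk of metal" math really is just the turns ratio. A sketch of the ideal case (the numbers are purely illustrative):

# Ideal transformer: Vs / Vp = Ns / Np, and power in equals power out,
# so stepping voltage up by some ratio steps current down by the same ratio.
def transform(v_primary, i_primary, n_primary, n_secondary):
    ratio = n_secondary / n_primary
    return v_primary * ratio, i_primary / ratio

# Step a 2.4 kV generator output up 100:1 for transmission.
v, i = transform(2400.0, 100.0, 100, 10000)
print(f"{v / 1000:.0f} kV at {i:.1f} A")  # 240 kV at 1.0 A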
Today, the historical advantage of transformer-based step-up conversion isn't such a big deal any more. Generating high voltage of either AC or DC is well understood and there are many ways to do it, though generating high-power and high-voltage DC from AC is still more costly and complex. High voltage AC is common and there's lots of infrastructure, tools, knowledge, and history supporting it. Power transmission using high voltage DC, though not yet widely used, is now feasible in a way that it was not in the past. Once you have DC at a high voltage, transmitting it long distances has the advantage of not fighting the inductance of the wire (building up and tearing down a magnetic field with every cycle) and especially the capacitance between the cable and its surroundings in the case of underground and even underwater power lines.
no subject
Date: 2014-06-03 11:23 am (UTC)

So you want to create a tiny replica of the power grid in your back yard using lots of little individual solar cells and some sort of wire. Great.
DC, AC, who cares? You just want to move around a lot of power, so just keep the current as low as possible and hence the voltage as high as possible. Great.
Each solar panel thingy on its own may be able to power a small device, but now you want to put them together. Can you just plug them into some sort of USB hub, or hypothetical USB power grid hub? Not exactly. If only a power grid were so simple.
The key is, the boxes can be daisy-chained from one panel to the next in as long a sequence as needed, or in a tree pattern using the one upstream socket and the two downstream sockets.
At a very, very high level, that describes how the power grid works, but the devil is in the details.
First, let's isolate just the photovoltaic cells. As the sun streaks across the sky, each PV cell outputs a different voltage which varies over time. And they won't necessarily track each other perfectly since each will have its own efficiency, reflectivity, angle, construction quality, etc, etc. So at the very least you need to convert the PV cells' output (all DC) each to a constant and compatible voltage. High voltage transmission might be good for efficiency, but may not be safe around your yard. 12V is common in solar, 5V would match the USB idea you mentioned. Let's just pick one, 5V for example, and not worry so much about the losses for a moment.
So now you've got a variety of solar cells, each connected to its own power management circuit, each designed to regulate its output to 5V. Great. But that 5V is relative to what? Each may be internally self-consistent, and you could make fancy power supplies with precision trimmed regulators and/or reference voltage generators to get as close to a theoretical 5V as possible, but they all will vary slightly. Each power supply, being designed to output a fixed 5V, might not take kindly to being tied directly to another supply which is higher or lower. Their regulation mechanisms would try to push their output, and hence the outputs of all other supplies as well, higher or lower. This could oscillate out of control, or even blow up a supply. But let's assume for a moment that they don't fight for voltage dominance and blow each other up.
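To see how little it takes, model two supplies as ideal sources behind small output impedances and tie them to a common load. Every number here is assumed, purely for illustration:

# Two "5 V" supplies with slightly different setpoints, tied together.
# Each is modeled as an ideal source behind a small output impedance.
V1, V2 = 5.00, 5.05   # volts (a 1% calibration difference)
R1 = R2 = 0.01        # ohms of output impedance (assumed)
R_LOAD = 2.5          # ohms, roughly a 2 A load at 5 V

# Bus voltage by Millman's theorem.
v_bus = (V1 / R1 + V2 / R2) / (1 / R1 + 1 / R2 + 1 / R_LOAD)
i1 = (V1 - v_bus) / R1  # negative means current is forced INTO supply 1
i2 = (V2 - v_bus) / R2
print(f"bus = {v_bus:.3f} V, supply 1: {i1:+.2f} A, supply 2: {i2:+.2f} A")

A 50mV disagreement is enough to force an amp and a half backwards into the lower supply, which is exactly the kind of fight most fixed-output regulators are not designed to survive.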
Take some of these supplies and put them at opposite corners of your yard and connect them with wire. Let's say they're connected to something in the middle of your yard which will act like your power grid's substation. Well, the wire will be a variety of lengths and the wire and interconnect will be of a variety of resistances. So you can tie all of those 5V wires together and each one will want to be someplace different, all below 5V. The voltage drop from each panel, across the wire, to your busbar in the center will be different. You could even tie all of the ground wires together and not guarantee that ground is the same voltage at each point in the system. Currents through the ground wires would generate voltages across them which would in turn change the ground voltage at each point in the system. And that's just the example of a star pattern. Start daisy-chaining and the series resistance, cumulatively, means each hop could get lower and lower in voltage.
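Daisy-chaining compounds it, since the drops stack hop by hop. A simplified sketch (assumed numbers, and pretending all of the current flows in from the far end of the chain):

# Voltage at each hop of a daisy chain: 5 V injected at the far end,
# 2 A flowing toward the load through 0.1 ohm of cable per hop (assumed).
HOP_RESISTANCE = 0.1  # ohms per cable segment
CURRENT = 2.0         # amps flowing through the chain

volts = 5.0
for hop in range(1, 6):
    volts -= CURRENT * HOP_RESISTANCE
    print(f"after hop {hop}: {volts:.2f} V")

# after hop 5: 4.00 V, a full volt gone, 20% of the bus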
no subject
Date: 2014-06-03 11:24 am (UTC)

Maybe this comparison to a grid and substation system designed for super high power and based on old methods and technologies does not apply to this idea. Let's throw away the baggage of historical power grid design for a moment and take a lesson from computer design.
The 12V from a PC power supply goes into a local CPU power supply chip to generate the various voltages that the CPU will need. These all tend to be far below 12V, so a buck (DC step down) converter is employed to generate the lower voltage and higher current power. Let's say the CPU core voltage is 1 volt. So every watt of CPU power needs 1 amp of current. A big beefy 100W CPU would then need 100A. You'd better put that power supply very, very close to your CPU! (remember I²R drop?) That much current running through a buck is unrealistic, so power supply designers employ a neat trick: the multi-phase buck converter power supply. Rather than periodically pulsing 12V through one buck circuit into an output capacitor to generate that 1V, the 12V is pulsed sequentially through several different phases in parallel. This allows you to keep the current of each phase down to something reasonable.
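The arithmetic behind the trick, with an assumed per-phase path resistance:

# Multi-phase buck: the total output current is split across N interleaved
# phases, so each phase's switches and inductor see only 1/N of it. And
# since conduction loss goes as I^2 * R, N phases cut that loss by about N.
TOTAL_CURRENT = 100.0  # amps of CPU core current at about 1 V
R_PHASE = 0.005        # ohms per phase path (an assumed figure)

for phases in (1, 2, 4, 8):
    per_phase = TOTAL_CURRENT / phases
    loss_w = phases * per_phase ** 2 * R_PHASE
    print(f"{phases} phase(s): {per_phase:6.1f} A/phase, {loss_w:5.2f} W conduction loss")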
The same idea could also allow you to combine multiple sources into one output, though it would be tricky. Let's treat each solar panel like a phase of a CPU power supply, and pulse the load on each one, time-dividing between them to sequentially sip power into some big bulk capacitor to feed the centralized power supply. But you can't just slam a load into each one blindly. When you have one central PC power supply feeding all of the phases of the CPU buck, you know that each phase will see roughly the same load and pulse width. If each phase is now a different supply, you need to know its output capabilities as well. Let's say one is at the optimal angle under full sun, and another is in the shade and can only deliver a tiny pulse before it droops or overloads. So now you need a precise way of monitoring the power coming in from each place, or possibly even a communication protocol between every generator and every load. You need to sort all of the power output capabilities to adjust the load pulse width applied to each one to optimize the whole-system efficiency and not break anything. And speaking of whole-system efficiency, all of these intermediate stages take a bite out of it.
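In pseudo-Python, the coordinator's job might look something like this. Everything here, the names, the numbers, the reporting scheme, is hypothetical:

# Hypothetical coordinator sketch: each box reports the power its panel can
# currently source, and the controller hands each one a share of the
# switching period in proportion.
PERIOD_US = 100.0  # total switching period in microseconds (assumed)

def assign_pulse_widths(reported_watts):
    """Split the period among panels in proportion to available power."""
    total = sum(reported_watts.values())
    if total == 0:
        return {panel: 0.0 for panel in reported_watts}
    return {panel: PERIOD_US * watts / total
            for panel, watts in reported_watts.items()}

# One panel in full sun, one on a fence, one mostly shaded.
reports = {"south_roof": 18.0, "fence": 9.0, "shaded": 1.0}
for panel, width in assign_pulse_widths(reports).items():
    print(f"{panel:>10}: {width:5.1f} us of each {PERIOD_US:.0f} us period")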
This is getting complicated and we still don't have a way to store any of this power.
no subject
Date: 2014-06-03 11:24 am (UTC)

That sort of works. But batteries also need dedicated power management circuits which have their own set of constraints. Batteries need to be charged with a higher voltage than they will ever output. So your max cell voltage should be lower than your system voltage so you can charge directly off of that. Then you need to boost the battery's output back up to the system voltage for when you are draining rather than charging. Battery management is certainly a solved problem already, but it is yet another step with added cost and complexity, and the dreaded efficiency hit. It's one of those things that is significantly more efficient with one battery or large bank of batteries and with one controller than with a distributed array of varied batteries.
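Even with good converters, the detour through a battery costs real energy. A rough round-trip number, with every per-stage efficiency assumed:

# Round trip for parking energy in a battery:
# system bus -> charger -> cell -> boost converter -> system bus.
CHARGER_EFF = 0.90  # assumed charging-converter efficiency
BATTERY_EFF = 0.95  # assumed charge/discharge efficiency of the cell itself
BOOST_EFF = 0.90    # assumed boost-back-to-the-bus efficiency

round_trip = CHARGER_EFF * BATTERY_EFF * BOOST_EFF
print(f"round trip: {round_trip:.0%}")  # about 77%, nearly a quarter lost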
So now you've got a custom and complicated power supply on each and every solar panel and battery, some kind of network of distribution hubs to combine, regulate, split, monitor, and balance all of this power, and some sort of central processor to coordinate this complex dance. Each step, each conversion, each interconnect and cable, all have efficiency hits, transmission losses, and interconnect issues. This seems like a Herculean design task and my guesstimation for the whole-system efficiency is not so high.
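The reason the guesstimate comes out low is that the stages multiply. A crude stack-up with assumed (and fairly optimistic) per-stage numbers:

# Whole-chain efficiency is the product of every stage's efficiency.
stages = {"panel regulator": 0.90, "distribution hop": 0.97,
          "combiner": 0.90, "battery round trip": 0.77, "inverter": 0.90}

overall = 1.0
for eff in stages.values():
    overall *= eff
print(f"overall: {overall:.0%}")  # roughly 54% of what the panels produced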
So where does this leave you? Maybe with a small hypothetical collection of many and varied USB-targeted consumer gadget solar chargers. Each would have its own associated design and packaging costs, internal inefficiencies and losses, and low maximum output. Turning all that into a power grid would add an unholy mess of cost and complexity. All together, they are unlikely to add up to something significantly greater than the sum of their parts. You'd probably be far better off saving up for one larger panel (or set) with one dedicated controller.
That is, unless the portability and modularity is important enough to you to justify the cost, complexity, and inefficiency that comes from the many possible grid designs.
no subject
Date: 2014-06-03 04:57 pm (UTC)

Everyone is hoping to cover a roof with thin solar strips and suddenly beat out the power company. However, we don't have the battery technology for load balancing what we already get from solar, let alone what we want to get.
no subject
Date: 2014-06-03 09:29 pm (UTC)

This is assuming I toss my Radeon HD 5870 video card and use the new Mac Pro's internal dual FirePro, then buy a four-drive external array to contain the previously internal storage...
With all the rest of the hardware the same, I'm guessing I'd save something like 50 bucks a month, right?
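Back of the envelope, with every number assumed (the average draw difference and the electric rate both vary wildly):

# Rough monthly cost of a constant difference in electrical load.
WATTS_SAVED = 300          # assumed average reduction, old rig vs. new
HOURS_PER_MONTH = 24 * 30
RATE_PER_KWH = 0.15        # dollars per kWh (varies a lot by region)

monthly = WATTS_SAVED / 1000 * HOURS_PER_MONTH * RATE_PER_KWH
print(f"~${monthly:.0f}/month")  # about $32, and only if it runs 24/7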
Hmmm. This Knowledge Base article gives some benchmarks:
http://support.apple.com/kb/HT2836?viewlocale=en_US&locale=en_US
no subject
Date: 2014-06-03 09:37 pm (UTC)

Yes, I was picturing a system where the boxes communicated over some digital protocol that allowed them to report to each other what sort of power they were expecting from their panels, what they were seeing downstream, etc.
And yeah, I'm picturing mostly using those little consumer-grade panels, like those folding ones people use to charge laptops or the panels people use for car batteries, as a kind of stepping stone to a larger, more "official" solar installation. But it sounds like the efficiency loss would just about make it pointless ... and, taking the wider view, might move people's solar efforts in a less useful direction.
This is all coming to mind because I've got to re-do my roof in a few years, and every now and then I get obsessive about the idea of adding a bunch of solar panels up there.