Designing home DC wiring

Loss at voltage conversion is just a function of how good your converter is; expect 5% to 20% loss at each conversion.
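Those conversion losses compound: each stage multiplies the overall efficiency by its own. A quick sketch, with made-up stage efficiencies in that 5-20%-loss range (the chain below is illustrative, not from the text):

```python
# Cascaded conversion: overall efficiency is the product of the stages.
def chain_efficiency(stage_efficiencies):
    """Overall efficiency of a chain of converters."""
    overall = 1.0
    for e in stage_efficiencies:
        overall *= e
    return overall

# e.g. battery -> 48V bus (95%) -> 12V rail (90%) -> 5V USB (85%)
overall = chain_efficiency([0.95, 0.90, 0.85])
print(f"overall efficiency: {overall:.1%}")  # ~72.7%
```

Three fairly good converters in a row still throw away over a quarter of the power before distribution losses even enter the picture.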

Losses in distribution are the real killer. With 120/240 volt distribution, we don't really think about distribution losses until we are running many tens of metres of cable down to an outbuilding. If the cable is thick enough for the current, the volt drop is negligible, or at least a negligible fraction of the total supply voltage.

However, with low voltage distribution, you get hit by a double whammy: the currents are higher to transmit the same power, and any voltage drop is a larger fraction of the voltage you start with, so your power losses are proportionately larger.
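The double whammy is just I = P/V combined with P_loss = I²R: halve the voltage and you quadruple the cable loss. A quick sketch, assuming a 0.1 ohm cable run (illustrative numbers, not from the text):

```python
# For a fixed power and fixed cable resistance, loss scales as 1/V^2.
def line_loss_fraction(power_w, voltage, cable_resistance_ohm):
    """Fraction of the transmitted power lost as heat in the cable."""
    current = power_w / voltage                     # I = P / V
    loss_w = current ** 2 * cable_resistance_ohm    # P_loss = I^2 R
    return loss_w / power_w

# 100 W through an assumed 0.1 ohm cable run:
for v in (12, 48, 240):
    print(f"{v:>3} V: {line_loss_fraction(100, v, 0.1):.2%} lost")
```

Going from 240V to 12V is a factor of 20 in voltage, so the same cable loses 400 times as much power.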

There are three practical regimes to consider for DC distribution: 12V, 48V, and 'high' voltage.

12V is worth considering as there is so much stuff around that uses that voltage. However, you need stupidly thick cables to supply an area much bigger than a single room.

48V is around the highest voltage that's deemed to be 'touch safe'. As it's four times the voltage of 12V, the current for the same power is a quarter, so you only need 1/16th of the cable cross-section to supply the same power to the same area with the same losses. It's used extensively in the comms industry, so there is a significant amount of equipment available for dealing with it: inverters, converters, etc.
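That 1/16th figure drops out of P_loss = I²R: at four times the voltage, the current is a quarter, so the resistance (and hence the cross-section) can change by a factor of sixteen for the same loss. A quick sketch, assuming plain copper resistivity and an out-and-back loop (the power, run length, and loss budget below are illustrative, not from the text):

```python
RHO_COPPER = 1.72e-8  # ohm*m, resistivity of copper at ~20 C

def min_area_mm2(power_w, voltage, run_m, max_loss_frac):
    """Smallest conductor cross-section that keeps resistive loss
    below max_loss_frac of the transmitted power."""
    current = power_w / voltage
    max_r = max_loss_frac * power_w / current ** 2  # from P_loss = I^2 R
    loop_m = 2 * run_m                              # out and back
    return RHO_COPPER * loop_m / max_r * 1e6        # m^2 -> mm^2

# 200 W over a 20 m run, allowing 5% loss:
a12 = min_area_mm2(200, 12, 20, 0.05)
a48 = min_area_mm2(200, 48, 20, 0.05)
print(f"12V needs {a12:.1f} mm^2, 48V needs {a48:.2f} mm^2 "
      f"(ratio {a12 / a48:.0f})")
```

The required area scales as 1/V², so the ratio between 12V and 48V always comes out at exactly 16 regardless of the other numbers.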

'High' voltage, so 250V and up, is often used to connect solar panels, and in electric cars. Compared to AC, it's more difficult to handle: DC arcs don't go out as easily, so you need special fuses and switches. From a safety and ease-of-use point of view, I wouldn't really recommend it for distribution in a home.


FWIW, I played with a 12V distribution around a small flat (apartment). I had a big ATX power supply serving up 12V and set up a sort of ring of thick low-voltage cable. I then "tapped" off that ring to provide drops to the places that needed it.

Pros:

  • You can do what you like - it's entirely unregulated so no "code" to follow, and it's touch-safe so it's pretty easy to work on (even when running)
  • Most stuff is actually pretty low current, so volt drops along cables weren't much of a concern (and a lot of stuff has a power supply inside it to drop from 12V to even lower, so even with quite a bit of volt drop, it all still works)
  • You can avoid having any sort of power supply next to whatever you're powering

Cons:

  • It's a lot of work to put in, and quite expensive (although I did it with a lot of recycled cable)
  • It's non-standard, so if you move out, you may as well pull it all out because it's unlikely anyone else would take it on
  • Actually, power supplies for small devices are cheap, plentiful and mostly pretty small, so it's not hard to put one near whatever you're trying to power

I've concluded it's really not worth it, at least not on a building scale (even a small one). Where it does make sense is in (say) a media cabinet, or perhaps behind your TV. There, a single power supply can feed a load of USB devices, stuff that uses a 12V "wall wart" or brick power supply, and, if you have the outputs, 18V stuff too. Even there, though, you're not doing much that you couldn't do relatively easily by more traditional means, but it's quite fun.

Elsewhere, I'm told the datacentre industry is looking at low-voltage cabinets along the same lines. The idea is that pretty much all computer and network gear uses 5V and 12V, so a couple of big power supplies in the cabinet can actually supply the whole lot, saving dozens of mains power supplies. In a datacentre that helps because it probably means less cost and heat, at the possible expense of ease of deployment and choice of vendor.

Some Maths

Back to your actual question... You need to decide what voltage you're going to use and what voltage range you'll accept at the socket. Let's say you pick 12V.

To throw some maths onto this... this 42A cable (the biggest they sell) has a conductor cross-section of 4.5 mm². This calculator suggests that it will have a resistance of 3.733 milliohms per metre, so over 30m, that's 112 milliohms (0.112 ohms). I = P/V, so 100W at 12V is 8.33A. V = IR, so the volt drop at that current over 30m of that cable will be 8.33 × 0.112 ≈ 0.93V.
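That arithmetic is easy to check in a few lines; a quick sketch using the same figures (cable resistance per metre, run length, load):

```python
# Checking the worked example: 4.5 mm^2 cable at 3.733 mOhm/m,
# a 30 m run, and a 100 W load at a nominal 12 V.
R_PER_M = 3.733e-3   # ohm per metre
LENGTH_M = 30
LOAD_W = 100
V_NOMINAL = 12

r = R_PER_M * LENGTH_M   # ~0.112 ohm
i = LOAD_W / V_NOMINAL   # ~8.33 A
v_drop = i * r           # ~0.93 V

print(f"R = {r:.3f} ohm, I = {i:.2f} A, drop = {v_drop:.2f} V")
# Bumping the supply to 12.5 V to compensate for the drop:
print(f"at full load you'd see about {12.5 - v_drop:.2f} V at the socket")
```

(Strictly, the drop also depends on the return conductor, and the current rises slightly as the socket voltage sags, but at these margins the simple figure is close enough.)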

That means if you decide that the open-circuit socket voltage has to be 12V and no more, then at 100W you're only actually going to be getting about 11V (even less if you use cheaper, thinner cable). You may then want to compensate, perhaps by making the socket voltage 12.5V, so it only drops to about 11.5V at full load. That also means you have to make sure every device you might ever plug into any socket can handle 12.5V rather than 12V. You should also consider the heat generated by the cable: don't stuff it into tight, insulated spaces, or it will run hotter, its resistance will rise, and the voltage drop will be even higher.

Oh, and by the way, that one run of 30m to the socket will cost you £85 in cable. For that money, you could (almost) have an electrician come to your house and add in a new mains socket in that location and plug in a 12V wall wart for you ;-)


I've thought about this for years, and where I ended up is: why not just use Power over Ethernet, whether or not you use the network? Cat 5e cable is pretty cheap.