Neophyte question about AC vs. DC (especially for powering a home)

It's not impossible, it's just more complicated and expensive. Everything in your house is designed to run from AC. Many smaller products do take DC in, but they come with an AC adapter because AC mains is the only source of continuous, inexpensive power available nearly everywhere. The required voltage can differ for each device. The closest thing to a standard for DC power is probably USB's 5 V, which only offers enough current for small gadgets, not anything larger.

The way a solar powered house works is roughly: solar panel, to battery charger, to battery, to DC-AC inverter, to wall outlets, plus a grid-tie regulator and meter if you feed extra energy back to the grid (which isn't a requirement). One could power a house directly with unregulated DC from the battery if the appliances were designed to run from it, but most aren't. And if the battery voltage had to be regulated before distribution to the house, all you'd really be doing is swapping the DC-AC inverter for a DC-DC regulator: a different box with similar cost.
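If numbers help, here's a rough Python sketch of the losses along that chain. Every efficiency figure in it is an assumption I've picked as merely plausible, not a measurement from any real system:

    # Rough sketch: power surviving the solar -> charger -> battery -> inverter chain.
    # All efficiency figures are illustrative assumptions, not measured values.

    CHARGER_EFF = 0.95    # MPPT charge controller (assumed)
    BATTERY_EFF = 0.90    # battery round trip, chemistry dependent (assumed)
    INVERTER_EFF = 0.93   # DC-AC inverter at a typical load (assumed)

    def ac_watts_at_outlet(panel_watts: float) -> float:
        """Power left at the wall outlets after each conversion stage."""
        return panel_watts * CHARGER_EFF * BATTERY_EFF * INVERTER_EFF

    panels = 4000.0  # watts from the array (assumed)
    print(f"{panels:.0f} W from panels -> {ac_watts_at_outlet(panels):.0f} W of AC at the outlets")
    # Skipping the inverter and running DC appliances only removes the ~7% inverter
    # loss; a DC-DC regulator in its place would have a broadly similar loss.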

Due to the small size of the market for DC appliances (at the moment), they'd be harder to find and possibly more expensive than AC units. If a time comes when nearly every house has solar on the roof, they might be just as easy to purchase and maintain.

As to reusing wiring: a wire is just a strip of copper and doesn't care whether you put AC or DC on it, IF you stay within its ratings. If you had to push a lot more current through the wire because of a lower voltage, you might need thicker wires, different safety features in the wiring boxes, higher-rated fuses, and so on. You'd also want different plugs on the outlets so nobody made the mistake of plugging an AC device into an outlet providing DC.
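To make the "a lot more current" point concrete, a quick Python sketch; the 1500 W load, the 12 V alternative, and the wire resistance are all just illustrative assumptions:

    # Same appliance power at lower voltage means more current, and resistive
    # heating in the wire grows with the square of that current (P = I**2 * R).

    def current_amps(power_watts: float, volts: float) -> float:
        return power_watts / volts

    def wire_loss_watts(current: float, wire_ohms: float) -> float:
        return current ** 2 * wire_ohms

    RUN_RESISTANCE = 0.05  # ohms, round trip of a branch circuit (assumed)

    for volts in (120.0, 12.0):
        i = current_amps(1500.0, volts)  # a 1500 W space heater, say
        print(f"{volts:>5.0f} V: {i:6.1f} A, {wire_loss_watts(i, RUN_RESISTANCE):7.1f} W lost in the wire")
    # At 12 V the wire carries 10x the current and dissipates 100x the heat,
    # which is why low-voltage DC distribution needs much thicker copper.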

Overall, it's cheaper and simpler to put a DC-AC inverter at the battery than it is to gut the entire electrical system of the house and rebuild it, plus buy all new appliances, plus still need a small DC-AC inverter in each room for the devices that can't be repurchased to run from DC, which at the moment is nearly every gadget. You might think of the inverter as providing "backward compatibility" with the previous hundred years of electrical devices.


If a neophyte has read this far, it might help to define "SMPS": a Switch-Mode Power Supply. Just about every (99.99...%) desktop computer contains one, as does a UPS, an Uninterruptible Power Supply.

[P.S.: This being my first post to S.E., I must admit to getting carried away with history and peripherally-related topics. Guilty as charged?]

Inside, the SMPS uses a rectifier to convert the incoming AC to DC, which powers a high-frequency* inverter. (An inverter converts DC to AC.) That inverter's AC feeds a little transformer that's markedly smaller than a 60 Hz transformer of the same rating, maybe 10% as big, if that. The transformer's multiple secondary windings, via rectifiers, provide the needed DC voltages. In a sense, it's not wildly different from a drive belt in a car's engine that provides different speeds for the alternator, fan, and other accessories. (*At least 25 kHz, likely many times that.)
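For the curious, the size advantage falls out of the transformer EMF equation, V_rms ≈ 4.44·f·N·B·A. A small Python sketch; the voltage, turns, and flux density are assumed purely for illustration:

    # Why the high-frequency transformer can be so small: holding voltage,
    # turns, and peak flux density fixed, the required core cross-section
    # scales as 1/f. Numbers below are illustrative assumptions.

    def core_area_m2(v_rms: float, freq_hz: float, turns: int, b_max_tesla: float) -> float:
        return v_rms / (4.44 * freq_hz * turns * b_max_tesla)

    V, N, B = 170.0, 100, 0.3  # assumed primary voltage, turns, flux density
    for f in (60.0, 25_000.0, 100_000.0):
        a = core_area_m2(V, f, N, B)
        print(f"{f:>9.0f} Hz -> core area {a * 1e4:8.3f} cm^2")
    # Going from 60 Hz to 25 kHz cuts the required core area by a factor of ~400.
    # In practice, ferrite flux limits and winding losses eat into that, but the
    # size advantage remains dramatic.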

A safety note: The DC which feeds the inverter is roughly 300 V or so, and is made smooth by large capacitors that store energy for milliseconds while the incoming AC's instantaneous voltage is not at, or near, its peak. They might hold their charge after disconnecting the power cord, and they're a dangerous, possibly lethal shock hazard.
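To put rough numbers on that hazard; the capacitance, bus voltage, and load below are assumptions typical of a desktop SMPS, not figures from any particular unit:

    # Why those bulk capacitors deserve respect. E = 1/2 * C * V**2.

    CAP_FARADS = 470e-6   # bulk capacitor, e.g. 470 uF (assumed)
    BUS_VOLTS = 325.0     # rectified 230 V mains peak, roughly "300 V or so"

    energy_joules = 0.5 * CAP_FARADS * BUS_VOLTS ** 2
    print(f"Stored energy: {energy_joules:.1f} J")    # about 25 J

    # Hold-up: how long that charge can carry a 300 W load between AC peaks.
    LOAD_WATTS = 300.0
    print(f"Hold-up at {LOAD_WATTS:.0f} W: {energy_joules / LOAD_WATTS * 1000:.0f} ms")
    # With no load and no bleeder resistor, the charge can linger long after
    # the cord is pulled; hence the shock hazard.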

The inverter uses semiconductors, traditionally power transistors, to quickly switch the DC either fully on, or fully off at the high frequency. When on, those semiconductors are very efficient, losing just a bit of power as heat, and when off, even better. Transitions while switching are quick, but need good engineering. That's the "switch mode" part. (Yes, there's an oscillator to provide timing for the switches.)
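A back-of-envelope Python sketch of why fully-on/fully-off switching is so efficient; the transistor and timing numbers are assumptions for a generic power MOSFET, and real parts and gate drives vary widely:

    # On-state: a small I**2 * R loss. Off-state: negligible leakage.
    # The transitions briefly see high voltage AND high current, which is
    # why fast, well-engineered edges matter.

    R_ON = 0.05        # on-resistance in ohms (assumed)
    I_LOAD = 5.0       # switched current in amps (assumed)
    DUTY = 0.5         # fraction of time the switch is on

    conduction_w = I_LOAD ** 2 * R_ON * DUTY
    print(f"Conduction loss: {conduction_w:.2f} W")

    # Crude triangular-overlap estimate of transition loss, two edges per cycle:
    V_BUS, T_SWITCH, F_SW = 300.0, 50e-9, 100_000  # volts, seconds/edge, Hz (assumed)
    switching_w = 0.5 * V_BUS * I_LOAD * T_SWITCH * 2 * F_SW
    print(f"Switching loss:  {switching_w:.2f} W")
    # Off-state loss is negligible by comparison, which is the whole point.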

Inverters which are part of solar-power installations provide AC at the region's frequency, 50 Hz in much of the world and part of Japan, and 60 Hz for North America, the other part of Japan, and iirc most (if not all) Central and South American countries.

Some time back, there was a suggestion that future domestic and small-office power would be at two voltages: 320 V (quite likely DC, iirc) and something like 24 or 32 V, also DC as I recall. The higher voltage would be for devices needing lots of power.

Before the Rural Electrification Administration, 32 volt DC was commonplace, along with small wind turbines. Try Wincharger™ for a trade mark.

Long high-voltage AC power transmission lines do have significant losses, from reactive effects (line capacitance and inductance) and skin effect as well as plain resistance. High-voltage DC lines, however, have much lower losses. Although France had one pioneering HVDC link, with insulated generators and motors wired in series, it took decades to develop the conversion equipment, inverters in particular. Reliably converting megavolt DC at hundreds of megawatts to AC is not for amateurs!
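One simplified way to see part of the DC advantage in Python; the line resistance, power, voltage, and power factor below are all assumed, and skin effect is ignored (it would make the AC case worse still):

    # At the same voltage and delivered power, a DC line carries no reactive
    # current, so every ampere does useful work. Figures are illustrative
    # assumptions, not data for any real line.

    LINE_OHMS = 10.0    # total loop resistance of a long line (assumed)
    POWER_W = 500e6     # 500 MW delivered (assumed)
    VOLTS = 500e3       # 500 kV (assumed)

    # DC: all current is "real" current.
    i_dc = POWER_W / VOLTS
    loss_dc = i_dc ** 2 * LINE_OHMS

    # AC at power factor 0.95: more current for the same real power.
    PF = 0.95
    i_ac = POWER_W / (VOLTS * PF)
    loss_ac = i_ac ** 2 * LINE_OHMS

    print(f"DC loss: {loss_dc / 1e6:.1f} MW, AC loss: {loss_ac / 1e6:.1f} MW")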

Power Supply and related history

This is actually a misnomer: they're really power converters. Power is supplied by the utility grid's generators, which are rotated by turbines. Back in the early 1920s, all radio receivers were powered by batteries: "A" batteries (typically 6 V car batteries) and "B" batteries (non-rechargeable, 22½ V and multiples thereof, up to 135 V). "C" batteries did exist, but apparently lasted half of forever. Those car batteries long predated sealed/valve-regulated types, and dilute sulphuric acid was unkind to living-room floors and rugs; recharging was a nuisance. B batteries comprised many 1.5 V zinc-carbon cells, and their cost was not trivial.

Back then, household utility power was becoming quite common, and there was a real need to run radios from household power. At first, devices to replace batteries did the job, and afaik those were called "power supplies", also "battery eliminators". The term caught the fancy of radio engineers, and from then on, has remained in use for AC line/mains to DC converters.

Related notes:

Before 110 (120?) volts became standard for DC utility service in the USA, early utility DC ranged from 50 to 500 V.* The first widespread application for electric motors was rotary fans, typically tabletop. Drive belts were used for a few. Antique fan collectors preserve early electric-motor history. *An ad, reproduced online, by an early fan maker offered that range of voltages.

Utility DC power didn't quickly disappear. New York City had 110 V DC supplied to at least one hotel ballroom after 1960. (DC elevator drives might still exist, even today.) The Audio Engineering Society held its annual convention's exhibition in the early 1960s in The New Yorker Hotel. As exhibits were first being set up, devices that were plugged in and switched on seemed dead, but their power transformers and motors overheated; some might have been badly damaged. Feeding DC to an AC-only device apparently doesn't trip breakers or blow fuses.

You guessed it! The wall outlets were not marked as DC, and had the standard paired slots we all used before the third-wire safety ground.

Many decades ago, it was common to use testers to check whether power was AC or DC. Among such testers was polarity-test paper, which had been treated with an ionic salt; DC produced a color change at only one wire. The little neon-bulb testers with attached leads were, and still are, another; on DC, only the negative electrode glows.

Along with this, devices were advertised as OK to use on AC or DC. Notable were the noisy, high-speed motors in vacuum cleaners and corded electric drills, among many others. Those motors have carbon "brushes", commutators, and rotors wound with magnet wire; basically, they're DC motors with laminated field cores and a slightly wider air gap around the rotor. As well, pre-WW II radios, notably the ubiquitous five-tube sets, operated fine on DC; if one seemed "dead" on DC, reversing the power plug fixed it.

The earliest motors for trolley cars, all DC, used copper (alloy?) wire brushes to contact their commutators. Those just didn't work, so carbon blocks took their places. The original name stuck.

Apparently, many light switches were rotary. As you turned the knob, you'd wind a spring, and after a quarter turn, the mechanism would unlatch the contacts suddenly, to break the arc. (No blowout magnets?) Try "Ark-Les"™ for a trade mark. Perhaps this is why we say "turn on/turn off" a light, although desk and table lights with switch sockets sometimes have rotating knobs.

Older wall switches for room lights, the ubiquitous up/down lever type, made a distinctive snap when operated. That snap action simply must have been there to break DC arcs. My apt. has both kinds.

Massachusetts used to require that bathroom light switches be outside the room door. (My apt. building, built in 1957, has them there.) Apparently, people got electrocuted, perhaps because the removable covers for rotary switches weren't always faithfully replaced.

Indeed, the history of electrical shock protection has been one of continuing improvement. One quite-early electric fan had exposed connections and what looked like big, long fusible links on top, without covers.

Even today, arc-fault interrupters for home and small-office circuits are rare (and rather expensive). In industry and at the utilities, where lots of power is handled, arc flash is a serious hazard, and it is taken seriously.

Some time back, I came across an explanation for the holes at the prong ends of our ordinary Western Hemisphere power-cord plugs. Early wall outlets did not have ferrous-alloy springs, little doubt because of eventual corrosion. Non-ferrous spring alloys of the time apparently could and did lose their temper, and plugs were falling out! Dimples in the outlet contacts engaged the holes, at least coping with fallouts, if not maintaining good contact.

Really-early electric appliances had power cords ending in male screw threads, the same as those for our light bulbs.

If these diversions are bad manners, I apologize!


You can feed your house DC; however, the issue remains that while most devices rectify AC to DC internally, they are designed for an AC input. This is why you need an inverter: even at some loss, you feed your electronics what they were designed for.

Even then, the grid tie-in solar systems you speak of only help supplement the grid's power. You need quite a few solar panels and buffering (batteries) to make yourself completely separate from the grid, and even then your capacity is limited to your setup, as opposed to being able to dynamically pull from the grid when needed.

Getting more opinion-based: I wouldn't think it would be worth the trouble, and you'd miss a lot of the benefits. For example, say 50% of the population gets solar panels, not enough to fulfill their power needs individually. Together with grid tie-in setups and inverters, they can still reduce the load on the power generation company.

I also wonder about the safety of DC with current wiring standards. Maybe someone more experienced could chime in, but since AC isn't at peak voltage all the time (it returns to 0 V), it might give a bit of cooling headroom; DC, on the other hand, is constant.
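On the "cooling headroom" question: heating depends on RMS voltage, and the 120 V we quote for AC already is RMS, so a resistive load heats the same either way. A quick numerical check in Python (the load resistance is an arbitrary assumption):

    # Average heating of a resistor on 120 V RMS AC vs. 120 V DC. The sine's
    # peak (~170 V) is higher, but its dips toward 0 V exactly cancel that
    # over a full cycle.

    import math

    R = 10.0           # resistive load in ohms (assumed)
    V_RMS = 120.0
    V_PEAK = V_RMS * math.sqrt(2)

    # Average power of the sine, computed numerically over one cycle:
    N = 100_000
    p_ac = sum((V_PEAK * math.sin(2 * math.pi * k / N)) ** 2 / R for k in range(N)) / N
    p_dc = V_RMS ** 2 / R

    print(f"AC average: {p_ac:.1f} W, DC: {p_dc:.1f} W")  # both about 1440 W
    # The sharper DC concern is usually arc interruption: switches and
    # breakers rated for AC rely on the zero crossings to quench arcs, which
    # matches the DC-arc switch history in the earlier answer.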