Looks like I managed to inadvertently offend quite a few EEs with the previous posting. My apologies... as that was not the intention. The main culprits in the "big picture" of the national electric "grid", then as now, were and are: lack of adequate long-term planning, the jealous guarding of antiquated commercial ties (such as is epitomized in the Edison/Westinghouse feud), and most of all, lack of coordinated R&D across many competing corporations. These are problems of Capitalism, more so than any particular engineering failure. I am not anti-capitalist by any means, but we all must realize that it is the "best" system now only because the other choices are even more inefficient, and it would be a mistake to keep it the same forever just because it occasionally works well. Capitalism has a great deal of room for improvement, especially in the area of long-term coordinated planning across related industries - which of necessity temporarily requires normal competitors to become allies. The one industry to do a decent job of this has been semiconductors - and that is probably why we now have the small, efficient PS.
Now if our government had gotten into the picture early on, before Edison electrocuted all of those elephants, who knows what would have happened. Edison was not necessarily backing the wrong horse, as it turns out. It's just that the right horse had the wrong saddle, so to speak.

BTW, I doubt if there are any college engineering students on this forum, but if there are any - this advice: if undecided, go "electrical" first. That major will likely be the hardest in your school, but if you are going geek, they are all going to be difficult, and any of them will force you to give up valuable beer-drinking time; electrical, however, provides the best foundation for every other engineering discipline to build on, should you change your mind, IMHO.... And you can catch up on the beer-drinking after one of the many Cons which dominate the industry (like Con-Ed) hires you to do some petty job.

Mike Carrell writes:

> Horace wrote:
> > One of the reasons 60 Hz was chosen over higher frequencies is the
> > prevention of transmission line losses. One main problem with using high
> > frequency transformers in power supplies until fairly recently was
> > rectification. Diodes drop in efficiency with frequency. These days the
> > availability of high current low voltage FETs (with switching logic to
> > achieve the rectification) permits efficient rectification, but even FETs
> > still have frequency limitations, just much higher AFAIK.
>
> Horace has a point, for over long distances even widely spaced high voltage
> transmission lines have shunt capacitance. The frequency standards were set
> early on when a lot of the AC machines had cast iron magnetic circuits and
> eddy current losses were greater the higher the frequency. Laminated
> construction came later.

This is all true historically, but in hindsight we can ask - were there any missed opportunities along the way? When it comes to moving lots of power hundreds of miles through transmission lines, and given that historically "other" considerations have dominated over what is theoretically ideal, one major point needs to be made. In theory, Tesla notwithstanding - at any given voltage and everything else being equal (which it seldom is) - one is always better off with DC than AC. That is a fact that is often swept under the carpet with the down-conversion-loss broom.

Electric energy is transported across the countryside on high-voltage lines because the line losses are much smaller than with low-voltage lines. The choice between AC and DC is unrelated to this one fact. All wires in commercial use have resistance, but the development of high-temperature superconductors will probably change this soon, and when it does, AC will be a goner, mas o menos, even if it takes several centuries.

Let's call the total resistance of the transmission line leading from, say, a turbine-mounted generator in a dam or steam station to your local substation R. Let's also say the local community demands a power P = IV from that substation. This means the current drawn by the substation is I = P/V, and the higher the transmission line voltage, the smaller the current.
The line loss is given by P(loss) = I^2 R, or, substituting for I, P(loss) = P^2 R / V^2 (notice that AC/DC appears nowhere in this formula). Since P is fixed by community demand, and R is as small as aluminum permits (using big fat copper cable would help were it not for the cost), line loss decreases strongly, in power-law fashion, with increasing voltage, whether it be AC or DC. But for the same line, there is an *additional*, albeit small, AC heating loss at any given voltage. Kind of like friction. The reason is simply that you want the smallest amount of current that you can use to deliver the most power P; and when you figure in the "power factor", AC will always require a higher peak current than its average.

Another important note: the loss fraction from downshifting DC has been high historically, actually unacceptably high, but that does not mean that it "had to be" that way, then or now. We might well have overlooked a few things along the way (such as: is there really such a thing as a super-efficient cavity diode?). Would a national R&D effort a century ago have given us a nearly lossless HV DC down-conversion technology? Perhaps not, given that semiconductors were still 50 years off... but don't be too sure - if you believe what some experts in the field of advanced cavity-type diodes will tell you (yeah, I know, where's the beef?). Again, this deficiency in AC is inherent in the system and should not be glossed over (by the long-term planners) because of DC down-conversion issues - because power is proportional to current but line loss is proportional to current squared, and if your current fluctuates, its peak will always exceed its average.

Line loss can be quite large over long distances, up to 30% or so. By the way, line-loss power goes into heating the transmission line cable, which, per meter of length, isn't very much heat until you multiply by the total meters. It's a long way from Hoover Dam to LA.

Given that we want to reduce line loss by using high voltage, the choice between AC and DC becomes historically straightforward. It has been lossy and difficult to reduce a high DC voltage to a low AC or DC voltage without significant additional losses - whereas with AC, it is easy to reduce a high voltage to a low voltage using a step-down transformer. You see lots of these when you walk by a substation. A good transformer reduces V and increases I so that the power IV stays constant to within a percent or two, the loss being due to eddy current heating of the iron core. To compete with this, one would need HV DC downshifting with comparable losses - 1-2%. You can get pretty close now with semiconductors in a small computer PS. In the big picture, however, is it possible to even imagine downconverting HV DC "from the grid" to street-voltage AC? Of course it wasn't possible back in Edison's day, but he was still correct in theory, and therefore one is free to wonder what the result would have been, had we institutionalized a coordinated R&D effort back then...? Doubtful if things would be different. Ergo, we didn't really make an unforgivable error; and our EEs actually did all that their government allowed them to do.

Side note: a neighborhood substation typically reduces the incoming 3-phase HV, 5000 volts and up, to a reasonable value for street lines, say 330-500 V, and then a small transformer outside your house will center-tap this and reduce it to 110 V. Between your computer and the dam, there may be 4-6 parasitic transformers, up, down, and sideways, each taking its own small toll, but it is cumulative.
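To put rough numbers on the P^2 R / V^2 formula above, here is a minimal back-of-the-envelope sketch in Python. The demand, line resistance, voltage classes, and power factor are made-up illustrative figures, not data for any real line:

```python
# Back-of-the-envelope line-loss sketch (illustrative numbers only).
# P(loss) = I^2 * R = P^2 * R / V^2, so doubling the voltage cuts the loss
# by a factor of four for the same delivered power and the same wire.

P = 100e6   # assumed community demand: 100 MW
R = 5.0     # assumed total line resistance: 5 ohms of aluminum cable

for V in (69e3, 230e3, 500e3, 765e3):   # common transmission voltage classes
    I = P / V                           # current needed to deliver P at voltage V
    loss = I**2 * R                     # resistive heating of the line
    print(f"{V/1e3:5.0f} kV: I = {I:6.0f} A, loss = {loss/1e6:5.2f} MW ({100*loss/P:.2f}%)")

# The AC wrinkle: for a sine wave at unity power factor, the *average* I^2*R
# loss equals the DC loss at the same RMS current, even though the peak
# current is sqrt(2) times higher.  A power factor below 1, however, forces
# more current through the line for the same delivered power:
pf = 0.9                                # assumed power factor
extra = (1.0 / pf) ** 2                 # loss ratio vs. unity power factor / DC
print(f"At pf = {pf}, the I^2*R loss is {extra:.2f}x the DC value for the same P and V")
```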
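And the cumulative toll of that transformer chain is easy to quantify too - a quick sketch, again with assumed (not measured) per-stage losses of 1-2% across 4-6 transformers:

```python
# Cumulative loss of a chain of transformers between the dam and the wall
# outlet: each stage passes (1 - loss) of its input power, and the factors
# multiply, so several "small tolls" add up to a noticeable total.

def chain_efficiency(stages, loss_per_stage):
    """Fraction of power surviving a chain of transformers."""
    return (1.0 - loss_per_stage) ** stages

for stages in (4, 5, 6):
    for loss in (0.01, 0.02):
        eff = chain_efficiency(stages, loss)
        print(f"{stages} stages at {loss:.0%} each: {eff:.1%} delivered, {1 - eff:.1%} lost")
```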
This is an area that is subject to improvement, even without a switch to HV DC for the transmission line itself. There is some reason to suspect that the transformer itself, using optimized cores and frequencies, could become the first major ZPE product to hit the market. Quien sabe?

BTW, we call it two-phase by the time it gets to the toaster - but to be precise this is not exactly the case. Your house wiring is technically NOT even AC at all, from one perspective, but instead is pulsed DC, positive at the 60-cycle frequency. The AC-like component is supplied by the capacitance of all the gadgets in your house and the house wiring itself. If you don't believe this, look in your breaker box and you will see that only one line is actually "hot" and the other two are essentially grounded (but at different places). The end result looks like AC on a scope, but there can be some distortion, depending on the capacitance quirks of "this old house".

Here is the main point which I meant to make before. Although the low-loss technology for HV DC was not in place during the Westinghouse-Edison-Tesla days, one wonders if some kind of imposed national R&D effort would have changed the situation then. And we should give Edison, not Tesla, the credit for being correct in principle. It probably is too late now for HV DC - that is, until low-cost superconductivity comes along. BUT... are we now making adequate plans for that day in a nationally coordinated effort? Perhaps EPRI is doing so, and perhaps they can fit in the OU transformer as well... ;-)

Jones

BTW, You Might Be an EE if:
Dilbert is your hero
You have saved the power cord from a discarded old appliance
You have purchased an appliance "as-is" at a yard sale just to see if you can fix it (after which you saved the power cord)
Your spouse sends you an e-mail instead of calling you to dinner
You look forward to Christmas as a chance to put together the latest Hi-tek toys
You can quote from Monty Python or Firesign Theatre
Your idea of interpersonal communication means getting the decimal point in the right place
At Christmas, it goes without saying that you will be the one to find the burnt-out bulb in the string
You window-shop at Radio Shack or Sharper Image instead of Macy's
You are convinced you can build a phaser out of a camera's flash attachment....
You will find an egregious geek error in the spiel above, and embarrass the author once again....

