A recent ice storm took out power here for almost two days, during which I rented a 1000-watt generator (the only one I could find). It could only deliver about 7.5 amps, so I alternated between the sump pump and the freezer.
Now, checking generator specs, it seems they are mostly advertised by wattage rather than amperage capacity. Are there any good rules of thumb for figuring out what wattage/amperage rating to look for in a generator, based on which appliances I want to power at the same time? I'm guessing it is not as simple as adding up the amperage ratings of the devices I want to run. How does the wattage of these devices figure into this?
Thanks, wsf
Replies
very little help, but
watts = volts x amps x cos(theta)
Theta is the phase angle between the voltage and the current; in a purely resistive load cos(theta) is 1, so watts = volts x amps.
Bring in a motor and theta comes into play, along with the surge needed to start some motors.
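A minimal sketch of that formula in Python; the 0.8 power factor used below is an assumed typical value for a small motor, not a measured one:

```python
def real_power_watts(volts, amps, power_factor=1.0):
    """Real power delivered to the load, in watts.

    power_factor is cos(theta): 1.0 for a purely resistive load
    like a heater, roughly 0.8 for many small motors (assumption).
    """
    return volts * amps * power_factor

# Purely resistive load: watts equals volt-amps.
print(real_power_watts(120, 10))        # 1200.0 W

# Same 10 amps into a motor at PF 0.8 does less real work,
# but the generator still has to carry the full 10 amps.
print(real_power_watts(120, 10, 0.8))   # 960.0 W
```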
bobl Volo Non Voleo Joe's cheat sheet
Bob,
So then, in a household situation, can we consider volts a constant 120? A 10 amp device at 120 volts would require 1200 watts?
Except for motors, that is, and everything I want to run has a motor. How does a motor change this formula in a practical sense? Since I don't have my trig tables handy and I have no idea what the value of theta is, is there any usable rule of thumb that can be applied to motors vis-a-vis this formula?
Very simplistically, cos(theta) = power factor (PF).
PF is typically 0.8 (or even less at light loads) for most motors.
Volts x amps = volt-amps (VA), which is what most generators are really rated for.
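To put numbers on that: a motor drawing 10 amps at 120 volts presents 1200 VA to the generator, but at a PF of 0.8 only 960 W of that is real power. The generator still heats up as if it were delivering the full 1200 VA, because it is carrying the full 10 amps either way.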
Generators are limited by how hot they can get before the wire insulation is damaged.
Volts are what produce the core losses (magnetic iron hysteresis losses, plus some eddy-current losses).
Amps produce the losses in the wire.
Theta is how far volts and amps are out of phase, which affects how much power you can get to the motor shaft, but does not reduce the heating of the generator.
Bottom line (again simplistic): if you read your generator's wattage rating as volt-amps, then your 120 V times 10 amps figure will be on the safe side. This totally discounts the demands of STARTING the motor, a whole 'nother subject with small generators.
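To make that bottom line concrete, here's a rough Python sizing sketch. The appliance amp draws and the 3x starting multiplier below are illustrative guesses, not nameplate values; check your own equipment labels and the generator's surge rating:

```python
VOLTS = 120

# (name, running amps, is_motor) -- amp figures are assumptions
appliances = [
    ("sump pump", 7.0, True),
    ("freezer", 5.0, True),
]

# Continuous load: treat every device's volts x amps as VA.
running_va = sum(VOLTS * amps for _, amps, _ in appliances)

# Worst case: everything running while the largest motor starts.
# Induction motors can briefly draw several times their running
# current; 3x is a common rough figure, not a universal one.
START_MULTIPLIER = 3
largest_motor_va = max(VOLTS * amps for _, amps, motor in appliances if motor)
surge_va = running_va + (START_MULTIPLIER - 1) * largest_motor_va

print(f"Continuous load: {running_va:.0f} VA")
print(f"Peak while starting largest motor: {surge_va:.0f} VA")
# Pick a generator whose rated (continuous) watts, read as VA,
# cover running_va, and whose surge rating covers surge_va.
```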
PS: you probably don't care, and it doesn't affect your question, but due to the increasing use of electronic motor drives, etc., the term 'power factor' is gradually going out of technical usage, being replaced by 'displacement factor' and other terms that better describe harmonics and other aspects of power systems more complex than the old simple sine waves.
Here's a good link - http://www.mayberrys.com/honda/generator/html/requirements.htm
And, as Junkhound said, starting a motor is a whole different story.