Since around school age I've always thought of voltage as how "fast" the electrons are flowing, and current as how much energy is in the "bucket" of each electron. I asked my physics teacher about this and he said it's absolutely incorrect, but didn't really give much of an explanation. My reasoning is that an electronic device will only ever draw the current it needs, but usually must have a specific voltage: too high a voltage and the device will break; too low and it won't work, or the bulb will be dim, etc.
Can anyone comment on the above? I can give more examples where that analogy fits, which is why it makes sense to me.
:awesome:
http://en.wikipedia.org/wiki/Hydraulic_analogy
This is where I make my 16 years as a teacher pay off...
Voltage is like a difference in pressure: it's a measure of how hard the electrons will hit. They all move through the wire at about the same speed either way, so voltage is about the force of the flow rather than the speed of the flow.
Current is like flow rate: it's how much charge passes per unit of time. Volts will shock you (a big force for a short time can knock you across the floor, but you can live through it), but the amps are what kill you (a large amount sustained over time will cook your goose).
I'll leave discussions of where the analogy breaks down to the article referenced above.
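To put rough numbers on the "right voltage" point from the question: for a simple resistive load, Ohm's law says the current drawn is I = V/R, and the power dissipated is P = V×I. Here's a quick sketch in Python; the 144 Ω bulb resistance is just the nominal figure for a 120 V / 100 W incandescent bulb (R = V²/P), and the "burns out" / "dim" thresholds are made-up illustrative cutoffs:

```python
# Ohm's law sketch: a resistive device "draws" current set by V and its own R.
# The 144-ohm figure is illustrative (a nominal 120 V, 100 W incandescent bulb:
# R = V^2 / P = 120^2 / 100 = 144 ohms). Real filament resistance varies with
# temperature, so treat this as a back-of-envelope model only.

R_BULB = 144.0  # ohms, assumed constant for simplicity

def bulb_behaviour(volts: float) -> str:
    current = volts / R_BULB   # I = V / R
    power = volts * current    # P = V * I = V^2 / R
    if power > 150:            # well past the 100 W rating: filament fails
        state = "burns out (too much power)"
    elif power < 20:           # far below the rating: barely glows
        state = "dim (too little power)"
    else:
        state = "glows normally"
    return f"{volts:6.1f} V -> {current:.2f} A, {power:6.1f} W: {state}"

for v in (30, 120, 240):
    print(bulb_behaviour(v))
```

Double the rated voltage and the bulb tries to dissipate four times its rated power, which is why over-volting kills it; halve it and you get a quarter of the power, hence the dim bulb.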
Nice one, thanks!
:awesome:
Yep. That's why you can run higher volts on a small-gauge cable, but not higher amps.
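The underlying reason is that the heat generated in the cable itself goes as P = I²R, so it depends on the current through the wire, not on the circuit voltage. A rough sketch (the ohms-per-metre figures are typical approximate values for solid copper wire, used here purely for illustration):

```python
# Resistive heating in a wire goes as P = I^2 * R: it depends on the current,
# not on the circuit voltage, which is why a thin cable can carry high volts
# at low amps but overheats at high amps.
# Resistance-per-metre figures below are typical approximate values for solid
# copper wire; treat them as illustrative, not datasheet numbers.

WIRES = {
    "24 AWG (thin)":  0.084,   # ohms per metre, approx.
    "12 AWG (thick)": 0.0052,  # ohms per metre, approx.
}

LENGTH_M = 2.0  # a short 2 m run, chosen for illustration

for name, r_per_m in WIRES.items():
    resistance = r_per_m * LENGTH_M
    for amps in (0.1, 1.0, 15.0):
        heat = amps ** 2 * resistance  # watts dissipated in the wire itself
        print(f"{name}: {amps:5.1f} A -> {heat:8.3f} W of heat in the cable")
```

At 15 A the thin 24 AWG run is cooking off tens of watts along its own length, while raising the voltage alone adds nothing to that number (insulation voltage rating aside).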
Yeah, I always had it in my head that voltage = acceleration and amps = mass (even more tortured analogies).