Power loss

I just sat my HSC today, so this is hardly a big issue; it's just been bugging me for a while. We learn that power loss is given by P = I^2 * R, so electricity is sent at high voltages in order to minimise heat losses.

BUT, combining P = VI with Ohm's Law gives P = V^2 / R. So isn't increasing the voltage screwing you over anyway?
 

Fizzy_Cyst

The V in Ohm's Law is POTENTIAL DIFFERENCE, not POTENTIAL.

In that version of the power-loss formula, V also refers to a potential difference: the voltage drop across the wire itself, not the supply voltage. When the transmission voltage is increased, the voltage drop across the wire/cable decreases, hence the power loss decreases.
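To see the distinction numerically, here is a quick sketch with made-up figures (1 MW delivered over a line with 5 ohms of resistance; both numbers are hypothetical, chosen only to illustrate the point):

```python
# Hypothetical scenario: a fixed 1 MW of power delivered through a
# transmission line of 5 ohms resistance, at two different supply voltages.
P_supplied = 1_000_000  # W, fixed power being delivered
R_line = 5.0            # ohms, resistance of the line itself

for V_supply in (10_000, 100_000):   # supply potential, in volts
    I = P_supplied / V_supply        # current drawn for a fixed power (P = VI)
    V_drop = I * R_line              # Ohm's law: drop across the LINE only
    print(f"V_supply = {V_supply:>7} V -> I = {I:>5.0f} A, "
          f"line voltage drop = {V_drop:>5.0f} V")
```

Note that the voltage drop across the line (500 V at 10 kV supply, 50 V at 100 kV supply) falls as the supply voltage rises; the V in P = V^2 / R is that small drop, not the big supply voltage.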
 

anomalousdecay

Fizzy_Cyst said:
The V in Ohm's Law is POTENTIAL DIFFERENCE, not POTENTIAL. In that version of the power-loss formula, V also refers to the potential difference (voltage drop) across the wire itself. When the transmission voltage is increased, the voltage drop across the wire/cable decreases, hence the power loss decreases.
To clarify: when the supply voltage (potential) is increased, the current through the line decreases, because the power being delivered is fixed (the energy input is constant, by E = Pt). By Ohm's Law, a smaller current through the same line resistance means a smaller voltage drop (potential difference) across the line, and therefore a smaller power loss.

Remember that we are looking at the conservation of energy here, after all. So you start from the fixed power being supplied:

P_supplied = V * I, so I = P_supplied / V

P_loss = I^2 * R = (P_supplied / V)^2 * R = (P_supplied^2 * R) / V^2

Doubling the transmission voltage therefore quarters the loss in the line.
And remember: Twinkle twinkle little star, power loss equals I squared R.
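The derivation above can be sketched directly (same hypothetical numbers as before: 1 MW delivered, a 5-ohm line, figures chosen only for illustration):

```python
# Power lost in the line for a fixed delivered power, by P_loss = I^2 * R.
P_supplied = 1_000_000  # W, fixed power being delivered (hypothetical)
R_line = 5.0            # ohms, line resistance (hypothetical)

def line_loss(V_supply):
    """Power dissipated in the line at a given supply voltage."""
    I = P_supplied / V_supply  # current for fixed power: I = P / V
    return I ** 2 * R_line     # twinkle twinkle: loss = I squared R

for V in (10_000, 100_000):
    print(f"At {V} V supply: {line_loss(V):,.0f} W lost in the line")
```

Raising the supply voltage tenfold cuts the current tenfold and the I^2 R loss a hundredfold, which is exactly why grids transmit at high voltage.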
 
