120V AC motor question

Reading through the archives, I gather that reducing the voltage to the motor will reduce the torque and motor generated noise. I generally run my TT through an isolation transformer with multiple taps so adjusting the voltage is easy enough but I worry about damaging the motor. Is there a general rule of thumb for a safe reduction in voltage?

I’m not sure if any of this matters but just in case, the motor is a reversible 600 RPM PB 5.5W 60 Hz Hurst as supplied by VPI. It’s going to be spinning a HW-19 Mk 3 platter (older lead filled version) with a stock VPI bearing.

Thanks in advance for any advice you can offer.
Marty
 
Hi,

You can reduce the voltage safely until the motor simply stops working.

Synchronous motors draw the current they need, and running torque is not
simply set by drive voltage. The drive voltage sets the maximum (pull-out)
torque, but other than at start-up the maximum torque is largely irrelevant;
stall torque is not the same as maximum torque, and some slip needs
arranging for a fast start-up.

🙂/sreten.
 
The torque available from a synchronous motor is indeed dependent on the supply voltage. For any voltage above the back EMF the maximal torque is given by torque constant x (Esupply-Eback)/winding resistance.
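The relation quoted above can be sketched numerically. All values below are invented for illustration; they are not measured data for the Hurst motor:

```python
# Hedged sketch of the relation above:
#   T_max = k_t * (E_supply - E_back) / R_winding
# k_t, voltages and winding resistance here are made-up illustrative numbers.

def max_torque(k_t, e_supply, e_back, r_winding):
    """Maximum available torque for a supply voltage above the back EMF."""
    if e_supply <= e_back:
        return 0.0  # at or below back EMF, no accelerating torque remains
    return k_t * (e_supply - e_back) / r_winding

# Halving the voltage 'excess' over back EMF halves the available torque:
t_115 = max_torque(k_t=0.05, e_supply=115.0, e_back=75.0, r_winding=800.0)
t_95  = max_torque(k_t=0.05, e_supply=95.0,  e_back=75.0, r_winding=800.0)
```

The point of the sketch is only the linear scaling: available torque falls in proportion to the voltage margin above the back EMF.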

A synchronous motor locks onto the supply frequency, but under load the rotor always runs behind the stator field. The phase lag increases with the torque load (torque varies as the sine of the lag angle, so the lag is roughly proportional to torque at light loads) - it reaches 90 degrees at the drop-out torque and comes forward as the load is reduced.

Combining these two, we see that the degree of rotational variation for a given torque variation must increase as the drive voltage is reduced.
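A rough sketch of that combined effect, using the standard torque relation T = T_max · sin(delta) and invented numbers (not motor data): a fixed torque ripple produces a larger swing in lag angle when the drive voltage, and hence T_max, is reduced.

```python
import math

def phase_lag(t_load, t_max):
    """Lag angle (radians) behind the stator field, from T = T_max*sin(delta)."""
    return math.asin(t_load / t_max)

t_mean, t_ripple = 1.0, 0.2           # arbitrary units of torque
for t_max in (4.0, 2.0):              # full vs reduced drive voltage
    swing = (phase_lag(t_mean + t_ripple, t_max)
             - phase_lag(t_mean - t_ripple, t_max))
    print(f"T_max={t_max}: angle swing = {math.degrees(swing):.2f} deg")
```

Running this shows the angle swing for the same torque ripple roughly doubles when T_max is halved, which is the rotational-variation argument above in numbers.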
 
Mark Kelly said:

The torque available from a synchronous motor is indeed dependent on the supply voltage. For any voltage above the back EMF the maximal torque is given by torque constant x (Esupply-Eback)/winding resistance.

Hi, this sounds far too much like a DC motor equation; two-phase AC is more complicated. 🙂/sreten.
 
Mark Kelly said:
A synchronous motor locks onto the supply frequency, but under load the rotor always runs behind the stator field. The phase lag increases with the torque load (torque varies as the sine of the lag angle, so the lag is roughly proportional to torque at light loads) - it reaches 90 degrees at the drop-out torque and comes forward as the load is reduced.

Doesn't the motor get hot if it lags the supply frequency by too much? (i.e. if it goes too slow)
 
The moment of inertia of the rotor and the effective compliance of the rotor in the stator field create a resonant tank. I call this the "torque spring effect" and have done some calculations on its magnitude and frequency.

As far as I can see there is nothing to damp the motion except the bearing friction in the motor, so the Q of the system is very high. The calculations are contained in the series of posts I did on Audio Asylum on modelling turntable mechanics.
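The "torque spring" resonance described above can be estimated from the stiffness of the torque-vs-angle curve at the operating point. This is only a sketch with invented numbers, not measured VPI/Hurst figures:

```python
import math

# Stiffness is the slope of T = T_max*sin(delta) at the operating lag delta0:
#   k = T_max * cos(delta0)   [torque per radian]
# With rotor inertia J, the tank resonates at f = (1/2pi) * sqrt(k / J).
# t_max, t_load and j_rotor below are made-up illustrative values.

def torque_spring_freq(t_max, t_load, j_rotor):
    delta0 = math.asin(t_load / t_max)   # operating phase lag
    k = t_max * math.cos(delta0)         # effective rotational stiffness
    return math.sqrt(k / j_rotor) / (2 * math.pi)

f = torque_spring_freq(t_max=5e-3, t_load=1e-3, j_rotor=2e-6)  # Hz
```

One consequence worth noting: as the load approaches the drop-out torque, cos(delta0) falls, the spring softens, and the resonant frequency drops.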
 
Mark Kelly said:
The phase lags, not the speed.

Phase lag is the angle between the rotor and the stator field.
The speed stays synchronous until dropout.


oops - i had a blond moment. you're right.

just thinking now - a drill speed controller uses a light dimming type of circuit - the firing angle of a triac is adjusted to change the speed of the motor.

so if the motor locks onto the supply frequency, how does its speed change??

does it (at low speed) "skip" an AC wave and lock onto a bit of the next one?? -(if that makes ANY sense)
 
Dan2 said:

oops - i had a blond moment. you're right.

just thinking now - a drill speed controller uses a light dimming type of circuit - the firing angle of a triac is adjusted to change the speed of the motor.

so if the motor locks onto the supply frequency, how does its speed change??

does it (at low speed) "skip" an AC wave and lock onto a bit of the next one?? -(if that makes ANY sense)

Hi,

You're having another one. A power drill motor is AC, but it's a universal (brushed) motor, not synchronous - its speed does vary with voltage.

🙂/sreten.
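For what it's worth, the dimmer-style drill controller works by delaying the triac firing angle, which lowers the RMS voltage reaching the (universal, not synchronous) motor. A small sketch of the idealized resistive-load relation:

```python
import math

# RMS output fraction of an ideal phase-angle controller on a resistive load:
#   Vrms/Vfull = sqrt(1 - a/pi + sin(2a)/(2*pi)),  firing angle a in radians.
# This is the textbook idealization, not a model of any specific drill.

def rms_fraction(alpha):
    """Fraction of full RMS voltage passed for firing angle alpha (0..pi)."""
    return math.sqrt(1 - alpha / math.pi + math.sin(2 * alpha) / (2 * math.pi))
```

Note the supply *frequency* is unchanged by phase control, which is why this trick changes a universal motor's speed but would not change a synchronous motor's locked speed.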
 