Reading through the archives, I gather that reducing the voltage to the motor will reduce the torque and motor-generated noise. I generally run my TT through an isolation transformer with multiple taps, so adjusting the voltage is easy enough, but I worry about damaging the motor. Is there a general rule of thumb for a safe reduction in voltage?
I’m not sure if any of this matters but just in case, the motor is a reversible 600 RPM PB 5.5W 60 Hz Hurst as supplied by VPI. It’s going to be spinning a HW-19 Mk 3 platter (the older lead-filled version) with a stock VPI bearing.
Thanks in advance for any advice you can offer.
Marty
Hi,
You can reduce the voltage safely until it simply does not work.
Synchronous motors draw the current they need, and torque is not simply based on drive voltage - maximum torque perhaps is, but other than at start-up, maximum torque is irrelevant. Stall torque is not maximum torque; some slip needs arranging for fast start-up.
🙂/sreten.
Hi,
a synchronous motor "locks" onto the supply frequency.
But it can run both behind and ahead of the phase.
I suspect this hunting brought on by varying load will get worse as the supply voltage is dropped.
Any comment?
The torque available from a synchronous motor is indeed dependent on the supply voltage. For any voltage above the back EMF the maximal torque is given by torque constant x (Esupply-Eback)/winding resistance.
A synchronous motor locks onto the supply frequency but if loaded it always runs behind the phase. The actual phase lag is proportional to the torque load - it is 90 degrees behind at the drop-out torque and comes forward as the load is reduced.
Combining these two, we see that the degree of rotational variation for a given torque variation must increase as the drive voltage is reduced.
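As a rough numerical illustration of both effects, here is a minimal sketch in Python; the torque constant, winding resistance, back EMF and load torque are made-up values, not real figures for the Hurst motor:

```python
import math

# Hypothetical constants - illustrative only, not measured Hurst 600 RPM motor figures
K_T = 0.02       # torque constant, N*m per amp
R_WINDING = 250  # winding resistance, ohms
E_BACK = 60.0    # back EMF at synchronous speed, volts
T_LOAD = 0.002   # steady torque demanded by the belt and platter, N*m

def max_torque(e_supply):
    """Maximal (pull-out) torque = torque constant x (Esupply - Eback) / winding resistance."""
    return K_T * (e_supply - E_BACK) / R_WINDING

def phase_lag_deg(e_supply, t_load=T_LOAD):
    """Rotor lag angle for a given load: T_load = T_max * sin(lag), so 90 degrees at drop-out."""
    t_max = max_torque(e_supply)
    if t_load >= t_max:
        return None  # load exceeds pull-out torque: the motor falls out of synchronism
    return math.degrees(math.asin(t_load / t_max))

for volts in (120, 100, 90, 80, 70):
    t_max = max_torque(volts)
    lag = phase_lag_deg(volts)
    if lag is None:
        print(f"{volts:>3} V: Tmax = {t_max * 1000:.2f} mN*m, drops out of synchronism")
    else:
        print(f"{volts:>3} V: Tmax = {t_max * 1000:.2f} mN*m, rotor lags by {lag:.1f} deg")
```

With less voltage headroom the same load sits further up the sine curve, so a given torque wobble produces a larger swing in the lag angle.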
Mark Kelly said:
The torque available from a synchronous motor is indeed dependent on the supply voltage. For any voltage above the back EMF the maximal torque is given by torque constant x (Esupply-Eback)/winding resistance.
Hi, this sounds far too much like a DC motor; 2-phase AC is more complicated. 🙂/sreten.
Nope.
Here's professor Holtz on the subject:
http://www.ema.uni-wuppertal.de/paper/torque.pdf
Combining equations 1 and 2, given that the torque depends on the angle between the stator and rotor flux linkages, and that this angle is at its maximum (90 degrees) at the maximal torque, gives the equation I posted above.
I explicitly stated that my equation referred to maximal torque.
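Spelled out as a sketch (not Holtz's exact notation, and assuming the winding impedance is dominated by its resistance, which seems reasonable for a small 5.5 W motor):

$$ T = k\,\psi_s\,\psi_r \sin\delta \quad\Rightarrow\quad T_{max} = k\,\psi_s\,\psi_r \ \text{at}\ \delta = 90^\circ $$

$$ \psi_s \propto I_s \approx \frac{E_{supply} - E_{back}}{R_{winding}} \quad\Rightarrow\quad T_{max} \approx k_T\,\frac{E_{supply} - E_{back}}{R_{winding}} $$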
Mark Kelly said: Nope.
Hi, I prefer KISS, and staying actually relevant to your approach, 🙂/sreten.
Mark Kelly said: A synchronous motor locks onto the supply frequency but if loaded it always runs behind the phase. The actual phase lag is proportional to the torque load - it is 90 degrees behind at the drop-out torque and comes forward as the load is reduced.
Doesn't the motor get hot if it lags the voltage frequency by too much? (i.e. if it goes too slow)
The phase lags, not the speed.
Phase lag is the angle between the rotor and the stator field.
The speed stays synchronous until dropout.
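As a formula (a sketch; $p$ is the number of pole pairs, 6 for a 600 RPM motor on 60 Hz):

$$ \omega_{rotor} = \omega_{sync} = \frac{2\pi f}{p}, \qquad \theta_{rotor}(t) = \omega_{sync}\,t - \delta(T_{load}) $$

The lag angle $\delta$ depends on the load torque but is constant in time for a steady load, so the speed itself stays synchronous right up to drop-out.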
Mark,
how does oscillation of the phase angle get damped?
Is there a damping mechanism (electrical) inherent in the construction of the motor?
Or is it damped externally with a lossy belt drive?
The moment of inertia of the rotor and the effective compliance of the rotor in the stator field create a resonant tank. I call this the "torque spring effect" and have done some calculations on its magnitude and frequency.
As far as I can see there is nothing to damp the motion except the bearing friction in the motor so the Q of the system is very high. The calculations are contained in the series of posts I did on Audio Asylum modelling turntable mechanics.
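To put a rough number on that resonance, a minimal sketch (the pull-out torque and rotor inertia are made-up values, not measured Hurst figures):

```python
import math

# Hypothetical values - illustrative only
T_MAX = 0.005    # pull-out torque at the chosen supply voltage, N*m
J_ROTOR = 2e-6   # rotor moment of inertia, kg*m^2
POLE_PAIRS = 6   # 600 RPM on 60 Hz implies 6 pole pairs

# Near the operating point the restoring torque goes as T_MAX * sin(pole pairs * mechanical angle),
# so the torsional stiffness referred to the shaft is roughly T_MAX * POLE_PAIRS.
k_spring = T_MAX * POLE_PAIRS                        # N*m per mechanical radian
f_res = math.sqrt(k_spring / J_ROTOR) / (2 * math.pi)

print(f"torque-spring stiffness ~ {k_spring:.3f} N*m/rad")
print(f"undamped resonant frequency ~ {f_res:.1f} Hz")
```

Note that reducing the supply voltage reduces the pull-out torque, which lowers both the stiffness of this "torque spring" and its resonant frequency.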
Mark Kelly said: The phase lags, not the speed.
Phase lag is the angle between the rotor and the stator field.
The speed stays synchronous until dropout.
Oops - I had a blonde moment. You're right.
Just thinking now: a drill speed controller uses a light-dimming type of circuit - the firing angle of a triac is adjusted to change the speed of the motor.
So if the motor locks onto the supply frequency, how does its speed change?
Does it (at low speed) "skip" an AC wave and lock onto a bit of the next one? (If that makes ANY sense.)
Dan2 said:
Oops - I had a blonde moment. You're right.
Just thinking now: a drill speed controller uses a light-dimming type of circuit - the firing angle of a triac is adjusted to change the speed of the motor.
So if the motor locks onto the supply frequency, how does its speed change?
Does it (at low speed) "skip" an AC wave and lock onto a bit of the next one? (If that makes ANY sense.)
Hi,
You're having another one. A power drill motor is AC but not synchronous - it's a universal (series-wound) motor, so its speed really does vary with applied voltage, which is why a triac phase-angle controller works on it.
🙂/sreten.
Really? I thought all AC motors were synchronous!
I guess I will stick to studying light current.
Thanks for correcting me.