Originally Posted by Fourdan
I am not a great specialist in motor drivers.
See Wikipedia VFD
Maybe another guy could answer
My idea is that when the rpm is steady, the torque produced by the motor equals the load torque.
When there is acceleration or deceleration, the problem is more complicated.
When starting from a stop, the algorithms differ depending on whether you have position sensors or run sensorless.
I am aware that the motor draws current in proportion to the load torque. But I also notice that at low throttle it is very easy to hold the rotor still, while at higher throttle it can be very difficult to stop the rotor. A servo drive behaves differently: even when the motor runs at low speed, the torque needed to stall the rotor can be very large; in other words, it delivers roughly constant torque.
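A rough sketch of why that might be, assuming a simple brushed-DC-style model: an open-loop ESC at partial throttle only applies a fraction of the bus voltage, so at stall (zero back-EMF) the current, and therefore the torque, scales with throttle; a servo drive closes a current loop and will push up to its current limit at any speed. All constants below are made-up illustration values, not measurements from any real motor.

```python
# Illustration only: stall torque vs throttle for an open-loop ESC
# compared with a current-controlled servo drive.
# All constants are assumed example values.

KT = 0.05       # torque constant, N*m per A (assumed)
R = 0.1         # winding resistance, ohms (assumed)
V_BUS = 12.0    # supply voltage, volts (assumed)
I_LIMIT = 20.0  # servo drive current limit, A (assumed)

def esc_stall_torque(duty: float) -> float:
    """Open-loop ESC: average applied voltage is duty * V_BUS, so at
    stall (back-EMF = 0) current is duty * V_BUS / R and torque is
    KT times that current -- it scales with throttle."""
    return KT * (duty * V_BUS) / R

def servo_stall_torque() -> float:
    """Current-controlled servo: the loop drives current up to its
    limit regardless of speed, so stall torque is roughly constant."""
    return KT * I_LIMIT

for duty in (0.1, 0.5, 1.0):
    print(f"ESC at {duty:.0%} throttle: {esc_stall_torque(duty):.2f} N*m stall")
print(f"Servo (any speed): {servo_stall_torque():.2f} N*m stall")
```

With these numbers the ESC's stall torque goes from 0.6 N*m at 10% throttle to 6 N*m at full throttle, while the servo holds 1 N*m at any speed, which matches the "easy to hold at low throttle" observation.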
This may explain why a high-speed inrunner plus a reduction gear on the output is favoured. For example, an inrunner at 15000 rpm through a 5:1 gearbox, giving 3000 rpm at the output, may deliver more torque than a motor running directly at 3000 rpm at 50% throttle.
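The gearing arithmetic behind that example can be sketched as follows. A 5:1 reduction multiplies torque by 5 (minus gearbox losses) while dividing speed by 5; the motor torque figures below are invented for illustration, not data from any specific motor.

```python
# Sketch of the geared-inrunner comparison from the post.
# Torque numbers are assumed for illustration.

GEAR_RATIO = 5.0       # 5:1 reduction, as in the post
GEAR_EFFICIENCY = 0.9  # typical single-stage gearbox loss (assumed)

def geared_output_torque(motor_torque: float) -> float:
    """A reduction gear multiplies torque by the ratio (minus losses)
    while dividing the output speed by the same ratio."""
    return motor_torque * GEAR_RATIO * GEAR_EFFICIENCY

# Suppose the inrunner makes 0.2 N*m at 15000 rpm at full throttle,
# while a direct-drive motor at 50% throttle makes 0.4 N*m at 3000 rpm
# (both figures invented for the comparison).
geared = geared_output_torque(0.2)   # output torque at 3000 rpm
direct = 0.4
print(f"geared inrunner: {geared:.2f} N*m, direct drive: {direct:.2f} N*m")
```

Under these assumed numbers the geared inrunner wins (0.9 N*m vs 0.4 N*m at the same 3000 rpm), which is the intuition in the post; the real comparison of course depends on the actual motors.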
I am also not sure, just curious. Thanks for the input.