However, when the motor inertia is larger than the load inertia, the motor will require more power than is otherwise necessary for the application. This raises costs in two ways: you pay more for a motor that is bigger than necessary, and the increased power draw raises operating costs. The solution is to use a gearhead to match the inertia of the motor to the inertia of the load.
Recall that inertia is a measure of an object's resistance to change in its motion, and is a function of the object's mass and shape. The greater an object's inertia, the more torque is required to accelerate or decelerate it. This means that when the load inertia is much larger than the motor inertia, it can cause excessive overshoot or increase settling times. Both conditions can decrease production line throughput.
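As a rough illustration of that torque relationship, a short sketch (the inertia and acceleration figures below are hypothetical, not from the article):

```python
# Torque required to accelerate a rotating load: tau = J * alpha.
# The inertia and acceleration values below are illustrative only.
def accel_torque(inertia_kg_m2: float, accel_rad_s2: float) -> float:
    """Return the torque (N*m) needed for a given angular acceleration."""
    return inertia_kg_m2 * accel_rad_s2

small = accel_torque(0.001, 500.0)  # low-inertia load
large = accel_torque(0.010, 500.0)  # 10x the inertia needs 10x the torque
print(small, large)
```

Ten times the inertia demands ten times the torque for the same move profile, which is why a large mismatch forces the drive to work so much harder.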
Inertia matching: Today's servo motors produce more torque relative to frame size, thanks to dense copper windings, lightweight materials, and high-energy magnets. This creates greater inertial mismatches between servo motors and the loads they are trying to move. Using a gearhead to better match the inertia of the motor to the inertia of the load allows a smaller motor to be used and results in a more responsive system that is easier to tune. Again, this is achieved through the gearhead's ratio: the inertia of the load reflected back to the motor is reduced by a factor of 1/ratio^2.
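The 1/ratio^2 relationship can be sketched as follows. The motor and load inertia values are assumptions, chosen only to show a 100:1 mismatch collapsing to 1:1 through a 10:1 gearhead:

```python
# Reflected inertia: a gearhead divides the load inertia by ratio^2
# as seen from the motor shaft. Values below are assumed for illustration.
def reflected_inertia(load_inertia: float, ratio: float) -> float:
    """Load inertia seen by the motor through a gearhead of the given ratio."""
    return load_inertia / ratio ** 2

j_motor = 0.0002   # kg*m^2, assumed motor rotor inertia
j_load = 0.02      # kg*m^2, assumed load inertia

print(j_load / j_motor)                          # direct drive: 100:1 mismatch
print(reflected_inertia(j_load, 10) / j_motor)   # through 10:1 gearhead: 1:1
```

Because the ratio enters squared, even a modest gearhead ratio shrinks the reflected load inertia dramatically.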
As servo technology has evolved, with manufacturers producing smaller yet more powerful motors, gearheads have become increasingly essential companions in motion control. Finding the optimal pairing must take many engineering considerations into account.
So how does a gearhead go about providing the power required by today's more demanding applications? It all goes back to the basics of gears and their ability to change the magnitude or direction of an applied force.
The gears and the number of teeth on each gear create a ratio. If a motor can generate 20 in-lbs of torque, and a 10:1 ratio gearhead is attached to its output, the resulting torque will be close to 200 in-lbs (slightly less once gearhead efficiency losses are accounted for). With the ongoing emphasis on developing smaller footprints for motors and the equipment they drive, the ability to pair a smaller motor with a gearhead to achieve the desired torque output is invaluable.
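A minimal sketch of that torque multiplication; the 95% efficiency figure is an assumption for illustration (check the gearhead's datasheet for the real value):

```python
# Gearhead output torque = motor torque * ratio, reduced by efficiency losses.
def output_torque(motor_torque: float, ratio: float, efficiency: float = 1.0) -> float:
    """Torque at the gearhead output for a given motor torque and ratio."""
    return motor_torque * ratio * efficiency

print(output_torque(20, 10))        # ideal case from the article: 200 in-lbs
print(output_torque(20, 10, 0.95))  # with an assumed 95%-efficient gearhead
```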
A motor might be rated at 2,000 rpm, but your application may only require 50 rpm. Trying to run the motor at 50 rpm may not be optimal for the following reasons:
1. If you are operating at a very low speed, such as 50 rpm, and your motor feedback resolution is not high enough, the update rate of the electronic drive can cause a velocity ripple in the application. For example, with a motor feedback resolution of 1,000 counts/rev you have a measurable count every 0.36 degrees of shaft rotation. If the electronic drive controlling the motor has a velocity loop of 0.125 milliseconds, then at 50 rpm (300 deg/sec) it will look for a measurable count every 0.0375 degrees of shaft rotation. When it does not find that count, it speeds up the motor rotation to find it. By the time it finds the next measurable count, the rpm is too fast for the application, so the drive slows the motor back down to 50 rpm, and the whole process starts over. This continuous increase and decrease in rpm is what causes velocity ripple in an application.
2. A servo motor running at low rpm operates inefficiently. Eddy currents are loops of electrical current that are induced within the motor during operation. These eddy currents produce a drag force within the motor, and they have a greater negative impact on motor performance at lower rpm.
3. An off-the-shelf motor's parameters may not be well suited to running at low rpm. When an application runs such a motor at 50 rpm, it is essentially not using all of its available speed range. Because the voltage constant (V/krpm) of the motor is set for a higher rpm, the torque constant (Nm/A), which is directly related to it, is lower than it needs to be. As a result, the application draws more current than it would with a motor specifically designed for 50 rpm.
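The resolution arithmetic from reason 1 can be checked directly; all figures are taken from the example above:

```python
# Reason 1's numbers: 1,000 counts/rev feedback, 0.125 ms velocity loop, 50 rpm.
counts_per_rev = 1000
deg_per_count = 360 / counts_per_rev        # 0.36 deg between measurable counts

rpm = 50
deg_per_sec = rpm * 360 / 60                # 300 deg/sec at 50 rpm
loop_time_s = 0.125e-3                      # 0.125 ms velocity-loop update
deg_per_update = deg_per_sec * loop_time_s  # 0.0375 deg traveled per update

# Nearly ten loop updates pass between measurable counts, so the drive
# "hunts" for the next count, producing the velocity ripple described above.
print(deg_per_count, deg_per_update, deg_per_count / deg_per_update)
```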
A gearhead's ratio reduces the motor rpm, which is why gearheads are sometimes called gear reducers. With a 40:1 ratio gearhead, the motor rpm at the input of the gearhead will be 2,000 rpm and the rpm at the output will be 50 rpm. Operating the motor at the higher rpm lets you avoid the issues described in reasons 1 and 2. For reason 3, it allows the design to draw less torque and current from the motor, based on the mechanical advantage of the gearhead.
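The 40:1 example can be sketched by combining the speed reduction with the torque multiplication discussed earlier; the 20 in-lb motor torque is carried over from the earlier example and gearhead efficiency is ignored:

```python
# A gearhead divides speed and multiplies torque by the same ratio.
def gearhead_output(motor_rpm: float, motor_torque: float, ratio: float):
    """Return (output rpm, ideal output torque) for a given gearhead ratio."""
    return motor_rpm / ratio, motor_torque * ratio

out_rpm, out_torque = gearhead_output(2000, 20, 40)
print(out_rpm, out_torque)  # 50 rpm at the output, with 40x the motor torque
```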