Relationship of torque and shaft size

A rudimentary understanding of how shaft sizes are determined can be helpful to anyone who works with pumps, fans, elevators or any other motor-driven equipment. Engineers often design using an ample safety factor, but consider modifying a shaft only with good engineering support. The greater the consequence of failure, the more generous the safety factor should be.


Figure 1: A fire pump such as this does not run often, but it had better work when you need it! Courtesy: EASA

Have you ever wondered why various types of electric motors with the same horsepower/kilowatt ratings have different shaft diameters, or why some pump shafts are so much smaller than the shafts of the motors that drive them? And what about those hollow-shaft motors? A rudimentary understanding of how shaft sizes are determined can be helpful to anyone who works with pumps, fans, elevators or any other motor-driven equipment.

Bigger is better—or at least used to be

Owing partly to tradition, the shafts of electric motors are often larger than those of the equipment they drive. Engineers were very conservative a century ago when electric motors first came into widespread industrial use, so they typically designed in a sizable margin of error. Today’s engineers haven’t changed much in this respect. For example, standard NEMA frame dimensions, which have been revised only once since 1950, still specify much larger shaft sizes than commonly accepted principles of mechanical engineering would require.

Shaft design basics

Shaft size is dictated by torque, not horsepower. But changes in horsepower and speed (rpm) affect torque, as the following equation shows:

Torque (lb-ft) = (hp × 5,252) / rpm

Accordingly, an increase in horsepower would require more torque, as would a decrease in rpm. For example, a 100 hp motor designed for 900 rpm would require twice as much torque as a 100 hp motor designed for 1,800 rpm. Each shaft must be sized for the torsional load it is expected to carry.
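This relationship is easy to check with a few lines of code. The short Python sketch below (the function name and printout are just for illustration) works out the torque for both motors in that example:

```python
# Torque (lb-ft) = (hp x 5,252) / rpm
def full_load_torque_lb_ft(hp, rpm):
    """Full-load torque in lb-ft from horsepower and shaft speed."""
    return hp * 5_252 / rpm

# The example above: same 100 hp, half the speed, twice the torque.
t_900 = full_load_torque_lb_ft(100, 900)      # about 584 lb-ft
t_1800 = full_load_torque_lb_ft(100, 1_800)   # about 292 lb-ft
print(f"100 hp at 900 rpm:   {t_900:.1f} lb-ft")
print(f"100 hp at 1,800 rpm: {t_1800:.1f} lb-ft")
print(f"Ratio: {t_900 / t_1800:.1f}")
```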

Two basic approaches are used to determine the required minimum shaft size for motors, both of which yield conservative results. One method calls for making the shaft large enough (and therefore strong enough) to drive the specified load without breaking. Mechanical engineers define this as the ability to transmit the required torque without exceeding the maximum allowable torsional shearing stress of the shaft material. In practice, this usually means that a shaft of the minimum diameter can withstand at least twice the rated torque of the motor.

Figure 2: Effluent pumps application. Courtesy: EASA

Another way to design a shaft is to calculate the minimum diameter needed to control torsional deflection (twisting) during service. To engineers, this means the allowable twisting moment, or torque, is a function of the allowable torsional shearing stress (in psi or kPa) and the polar section modulus (a function of the cross-sectional area of the shaft).
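The handbook's deflection equations appear later in the article. Purely as a rough illustration of the idea, the elementary torsion relation θ = TL/(JG), not the handbook's formula, can estimate how much a solid steel shaft twists under a given torque; the shear modulus, shaft diameter and length below are assumed example values:

```python
import math

G_STEEL_PSI = 11.5e6  # assumed shear modulus for steel, psi

def twist_degrees(torque_lb_ft, length_in, dia_in):
    """Angle of twist (degrees) of a solid round shaft: theta = T*L / (J*G)."""
    torque_lb_in = torque_lb_ft * 12.0
    polar_moment_in4 = math.pi * dia_in ** 4 / 32.0  # J for a solid circular section
    theta_rad = torque_lb_in * length_in / (polar_moment_in4 * G_STEEL_PSI)
    return math.degrees(theta_rad)

# Assumed example: ~584 lb-ft (100 hp at 900 rpm) on a 2.375 in shaft over a 12 in span
print(f"Twist: {twist_degrees(584, 12, 2.375):.3f} degrees")
```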

Machinery’s Handbook provides the following equations for determining minimum shaft sizes using both design approaches: resistance to torsional deflection and transmission of torque. Both sets of equations are based on standard values for steel, with allowable stresses of 4,000 psi (2.86 kg/mm2) for power-transmitting shafts and 6,000 psi (4.29 kg/mm2) for line-shafts with sheaves (sometimes called pulleys). Some of the equations are also specific to keyed or non-keyed shafts, which is handy for pump users who must size both types.
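Before getting to those equations, here is a minimal sketch of the strength-based calculation using the generic solid-shaft relation d = [16T/(π × Ss)]^(1/3) and the allowable stresses quoted above. This is a textbook relation, not necessarily the handbook's exact Equations 3-5, and the example torque is assumed:

```python
import math

def min_shaft_dia_in(torque_lb_ft, allowable_shear_psi=4_000):
    """Smallest solid-shaft diameter (in) that keeps shear stress 16T/(pi*d^3) within the allowable."""
    torque_lb_in = torque_lb_ft * 12.0
    return (16.0 * torque_lb_in / (math.pi * allowable_shear_psi)) ** (1.0 / 3.0)

# 100 hp at 900 rpm (~584 lb-ft), using the 4,000 psi value for power-transmitting shafts
print(f"Minimum diameter at 4,000 psi: {min_shaft_dia_in(584):.2f} in")
# The higher 6,000 psi line-shaft allowable permits a smaller minimum diameter
print(f"Minimum diameter at 6,000 psi: {min_shaft_dia_in(584, 6_000):.2f} in")
```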

Transmission of torque approach

Most motor shafts are keyed, which increases the shear stress exerted on the shaft. Considering this, motor shaft designs typically use no more than 75% of the maximum recommended stress for a non-keyed shaft. This is another reason why the shafts of electric motors are often larger than the pump shafts they drive.
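Applying that 75% factor to the sketch above shows the effect of a keyway on the required diameter; again, this uses the generic torsion relation rather than the handbook's equations, and the example torque is assumed:

```python
import math

def min_keyed_shaft_dia_in(torque_lb_ft, allowable_shear_psi=4_000, keyed_factor=0.75):
    """Minimum diameter (in) for a keyed shaft, using 75% of the non-keyed allowable stress."""
    torque_lb_in = torque_lb_ft * 12.0
    derated_psi = allowable_shear_psi * keyed_factor
    return (16.0 * torque_lb_in / (math.pi * derated_psi)) ** (1.0 / 3.0)

# Same 100 hp, 900 rpm example (~584 lb-ft): the keyway raises the minimum diameter by about 10%
print(f"Keyed-shaft minimum: {min_keyed_shaft_dia_in(584):.2f} in")
```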

See the following pages for Equations 3-5 and Examples 1-4.
