The following is an extract from an article kindly provided by Acromag Inc, written by one of their engineers, Bruce Cyburt. The article introduces the concept of strain and strain measurement. The full article is available in PDF format.
Please visit Acromag’s website at www.acromag.com to obtain the latest information about their strain gauge related products.
Strain gauges are widely employed in sensors that detect and measure force and force-related parameters such as torque, acceleration, pressure, and vibration. The strain gauge is the building block of these sensors, which often employ multiple gauges in their construction. Under an applied force, a strain gauge undergoes a small mechanical deformation that produces a small change in gauge resistance proportional to the applied force.
Because this change in resistance with applied force is so small, strain gauges are commonly wired in a Wheatstone bridge. The resultant output voltage of the bridge is directly related to the imbalance between the resistances in the bridge legs and to the bridge excitation voltage. The output of the bridge is normally specified in millivolts of output voltage per volt of applied excitation (mV/V), and this is usually referred to as its rated output or sensitivity.
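To make the bridge relationship concrete, here is a minimal Python sketch; all resistor and excitation values are assumed for illustration and are not figures from the article:

    # Minimal sketch: Wheatstone bridge output from leg resistances and excitation.
    # All values below are illustrative assumptions.
    def bridge_output(r1, r2, r3, r4, v_exc):
        # r1/r2 form one voltage divider, r3/r4 the other.
        v_a = v_exc * r2 / (r1 + r2)
        v_b = v_exc * r4 / (r3 + r4)
        return v_a - v_b  # differential output in volts

    # Quarter-bridge: one 350 ohm gauge whose resistance rises by 0.35 ohm (0.1%).
    v_exc = 10.0  # excitation voltage in volts
    v_out = bridge_output(350.0, 350.35, 350.0, 350.0, v_exc)
    print(v_out * 1000.0, "mV")            # about 2.5 mV
    print(v_out / v_exc * 1000.0, "mV/V")  # about 0.25 mV/V

Note how a 0.1% resistance change yields only a few millivolts even at 10 V excitation, which is why the bridge configuration and a stable excitation source matter.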
The actual maximum or full-scale output of a strain gauge bridge at its full rated load is the product of the bridge's sensitivity (mV/V) and the applied excitation voltage. This is referred to as the output span at full rated load.

Strain is a measure of the deformation of a body subject to an applied force. Specifically, strain (e) is the fractional change in a dimension (length, width, or height) of a body subject to a force along that dimension. That is: e = dL / L. Note that strain can be either positive (tensile) or negative (compressive).
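As a quick worked example of both definitions (sensitivity and excitation values assumed, not taken from any particular gauge):

    # Sketch with assumed values: output span and strain from the definitions above.
    sensitivity = 2.0e-3   # rated output of 2 mV/V (assumed)
    v_exc = 10.0           # applied excitation in volts (assumed)
    output_span = sensitivity * v_exc
    print(output_span * 1000.0, "mV at full rated load")  # 20 mV

    # Strain e = dL / L: a 100 mm bar stretched by 0.05 mm
    e = 0.05 / 100.0
    print(e)               # 5e-4, a positive (tensile) strain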
Further, the magnitude of a strain measurement is typically very small and is often expressed as a whole-number multiple of 10^-6, or microstrain (µe). Strain measurements larger than a few millistrain (e * 10^-3), or about 3000 µe, are rarely encountered except in high-elongation applications.

When a body of material is subject to a force in one direction, a phenomenon referred to as Poisson's Strain causes the material to contract slightly in the transverse or perpendicular dimension. The magnitude of this contraction is a property of the material, indicated by its Poisson's Ratio: the negative ratio of the coincident compressive strain in the transverse direction (perpendicular to the applied force) to the strain in the axial direction (parallel to the applied force).
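The following short sketch illustrates the microstrain convention and the transverse contraction; the Poisson's Ratio of 0.3 is an assumption, roughly typical of steel:

    # Sketch: microstrain conversion and Poisson's Strain.
    # A Poisson's Ratio of 0.3 is assumed here (roughly typical of steel).
    axial_strain = 5.0e-4                    # 500 microstrain, tensile
    print(axial_strain * 1e6, "microstrain")

    poissons_ratio = 0.3
    transverse_strain = -poissons_ratio * axial_strain  # opposite sign: contraction
    print(transverse_strain * 1e6, "microstrain (compressive)")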
Strain gauges are devices that change resistance slightly in response to an applied strain. They typically consist of a very fine foil grid (or wire grid) that is bonded to a surface in the direction of the applied force. The cross-sectional area of the grid is minimized to reduce the unwanted effect of shear or Poisson's Strain. These devices are commonly referred to as bonded-metallic or bonded-resistance strain gauges. The foil grid is bonded to a thin backing material or carrier, which is attached directly to the test body. As a result, the strain experienced by the test body is transferred directly to the foil grid of the strain gauge, which responds with a linear (or nearly linear) change in electrical resistance. As you can surmise, correct mounting of strain gauges is critical to their performance: it ensures that the strain of the test material is accurately transferred through the adhesive and backing material to the foil itself.
Most strain gauges have nominal resistance values ranging from 30 to 3000 ohms, with 120, 350, and 1000 ohms being the most common. The relationship between the resultant fractional change of gauge resistance and the applied strain (fractional change of length) is called the Gauge Factor (GF), or sensitivity to strain. Specifically, the Gauge Factor is the ratio of the fractional change in resistance to the strain: GF = (dR / R) / (dL / L) = (dR / R) / e
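A brief sketch (gauge resistance, Gauge Factor, and strain values assumed) shows how small the resulting resistance change is, and how strain is recovered from a measured fractional resistance change:

    # Sketch: the Gauge Factor relation GF = (dR / R) / e, with assumed values.
    gf = 2.0           # typical metallic Gauge Factor
    r = 350.0          # nominal gauge resistance in ohms (assumed)
    e = 500.0e-6       # 500 microstrain applied (assumed)

    dr = gf * e * r    # resistance change produced by the strain
    print(dr, "ohms")  # 0.35 ohm -- only a 0.1% change

    strain = (dr / r) / gf  # inverting to recover the strain
    print(strain * 1e6, "microstrain")

This 0.1% change is the same figure used in the bridge sketch above, and it underscores why strain gauges are read with a Wheatstone bridge rather than a simple ohmmeter.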
The Gauge Factor for metallic strain gauges is typically around 2.0. However, this ratio varies slightly in most applications, so a strain measurement system must provide a means of accounting for its effective Gauge Factor (see Instrument Gauge Factor). Ideally, the resistance of a strain gauge would change only in response to the applied strain. Unfortunately, the strain gauge material, as well as the test material it is applied to, expands or contracts in response to changes in temperature. Strain gauge manufacturers attempt to minimize gauge sensitivity to temperature by design, selecting specific strain gauge materials for specific application materials.
Though minimized, the equivalent strain error due to the temperature coefficient of a material is still considerable, and additional temperature compensation is usually required. Because strain measurement requires the detection of very small mechanical deformations and correspondingly small resistance changes, the full-scale range of most strain measurements in stress analysis applications is commonly between 2000 and 10000 µe, while measured strains are rarely larger than about 3000 µe.