
Posts

Showing posts from December, 2020

Instrumentation Engineer’s Calibration Mistakes

 Error When Attempting to Reverse the Range. In some applications, particularly DP level, it is necessary to reverse the range such that LRV > URV. For instance, the range may have been set to -87 to -1 mbar initially, but the application may actually require the range to be -1 to -87 mbar. That is, the range has to be reversed: Before: LRV = -87 mbar & URV = -1 mbar. After change: LRV = -1 mbar & URV = -87 mbar. To make this range change, a technician would typically make the mistake of directly entering -1 mbar for the LRV, which would be rejected since both URV and LRV would then be -1 mbar, leaving a span of 0 mbar and violating the minimum span. Alternatively, the technician might attempt to directly enter -87 mbar for the URV, which would also be rejected, since both URV and LRV would then be -87 mbar, again violating the minimum span. The solution is to first enter an intermediate value. For instance, setting the ...
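The intermediate-value workaround can be sketched in a few lines of Python. The `MIN_SPAN` value and the `set_point` helper here are illustrative assumptions, not any particular transmitter's firmware:

```python
MIN_SPAN = 0.5  # mbar; illustrative only, the real minimum span is device-specific

def set_point(lrv, urv):
    """Validate a candidate range the way the transmitter firmware would."""
    if abs(urv - lrv) < MIN_SPAN:
        raise ValueError(f"span {abs(urv - lrv):.2f} mbar violates the minimum span")
    return lrv, urv

# Current range: LRV = -87, URV = -1.  Target: LRV = -1, URV = -87.
lrv, urv = -87.0, -1.0

# Entering the new LRV directly is rejected (LRV == URV, zero span):
try:
    set_point(-1.0, urv)
except ValueError as exc:
    print("rejected:", exc)

# The workaround: step through an intermediate LRV first
lrv, urv = set_point(0.0, urv)    # 1. LRV to an intermediate 0 mbar (span 1 mbar)
lrv, urv = set_point(lrv, -87.0)  # 2. URV to its final -87 mbar (span 87 mbar)
lrv, urv = set_point(-1.0, urv)   # 3. LRV to its final -1 mbar (span 86 mbar)
print(lrv, urv)  # -1.0 -87.0
```

Each intermediate step keeps the span above the minimum, which is why the three-step sequence succeeds where direct entry fails.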

Smart Transmitter Calibration Tutorial Part 3

 Transmitter Output Current Trim (Analog Trim). It is rare for the analog output current circuitry of a 4-20 mA transmitter to drift. However, should the analog output current be incorrect, use current trim to correct the analog output signal. For instance, if the analog output current is 4.13 mA when it should be 4.00 mA, current trim is used to adjust it to 4.00 mA. Current trim is also used to match the transmitter analog output current to the current input of the analog input (AI) card channel on the DCS. For instance, the transmitter may read 0.00% while the DCS shows 0.13% because of differences in current calibration. The DCS may not support current trim of channels in the AI and AO cards. If there is drift in the A/D conversion of the DCS input circuitry or the D/A conversion of the output circuitry, current trim has to be done in each device instead. Current trim is only applicable to transmitters with a 4-20 mA analog output. That is, to 4-20 mA/HART transmitters, not to FOUNDATION f...
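The effect of a two-point output trim can be illustrated with a small sketch. The `make_output_trim` helper and the measured values are hypothetical, and the sketch assumes the output stage's error is linear (a gain and an offset):

```python
def make_output_trim(meas_at_4, meas_at_20):
    """Build a corrected drive function from a two-point trim.

    meas_at_4 / meas_at_20: reference-meter readings taken while the
    transmitter is commanded to output exactly 4 mA and 20 mA.
    """
    gain = (meas_at_20 - meas_at_4) / 16.0   # actual mA per commanded mA
    offset = meas_at_4 - 4.0 * gain          # fixed offset of the output stage

    def corrected_drive(target_ma):
        # Invert the measured linear response so the loop sees target_ma
        return (target_ma - offset) / gain

    return corrected_drive

# Illustrative readings: 4.13 mA when 4.00 mA is commanded, 20.13 mA at 20.00 mA
drive = make_output_trim(4.13, 20.13)
print(round(drive(4.0), 2))  # 3.87: the drive value that yields a true 4.00 mA
```

After the trim, the output stage is driven at 3.87 mA internally so the wire carries the true 4.00 mA, correcting the 0.13 mA error from the example.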

Smart Transmitter Calibration Tutorial Part 2

 Range Setting (Re-range). Range setting (re-ranging) refers to setting the scale for the 4 mA and 20 mA points. This scale is usually referred to as the “calibrated range” or “calibration range”. That is, at what input shall the transmitter analog output be 4 mA (the Lower Range Value, LRV, often referred to as “zero”, meaning 0%), and at what input shall it be 20 mA (the Upper Range Value, URV, sometimes called “full scale”, meaning 100%). Note that the term “span” is not the same as URV: span is the magnitude of the difference between URV and LRV. For instance, if the LRV is 20 and the URV is 100, the span is 80. Since Fieldbus, PROFIBUS, and WirelessHART do not use 4-20 mA, range setting is not required for such devices in most applications. Note that calculating what the output current should be is done in firmware in the transmitter microprocessor; it is a mathematical function. Internally, the 4-20 mA/HART transmitter computes: Percentage = (PRIMARY_VARIABLE – LRV) / (URV – LRV) * 100  ...
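The firmware computation above can be written out directly. The function names are illustrative; the mapping of the percentage onto 4-20 mA follows from the 16 mA span above the 4 mA live zero:

```python
def percent_of_range(pv, lrv, urv):
    """Percentage = (PRIMARY_VARIABLE - LRV) / (URV - LRV) * 100, as in firmware."""
    return (pv - lrv) / (urv - lrv) * 100.0

def output_ma(pv, lrv, urv):
    """Map the percentage onto the 16 mA span above the 4 mA live zero."""
    return 4.0 + percent_of_range(pv, lrv, urv) / 100.0 * 16.0

# Example from the text: LRV = 20, URV = 100, so span = 80
print(output_ma(20, 20, 100))   # 4.0  mA at 0 %
print(output_ma(60, 20, 100))   # 12.0 mA at 50 %
print(output_ma(100, 20, 100))  # 20.0 mA at 100 %
```

Re-ranging simply changes the LRV and URV arguments; the transmitter's sensor trim is untouched.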

Smart Transmitter Calibration Tutorial Part 1

 Calibration can be carried out using a handheld communicator in the field, a laptop on the bench in the workshop, or from intelligent device management (IDM) software as part of an asset management system. Electronic Device Description Language (EDDL) is the technology used by device manufacturers to define how the system shall display the device information and functions to the technician. EDDL makes the calibration of smart transmitters and other intelligent devices easier. Figure: Smart Transmitter This tutorial explains the common principles of calibration, re-ranging, and trim as they apply to various kinds of transmitters. The detailed procedure varies slightly depending on the measurement performed, the sensing principle, and the manufacturer. Calibration  By definition, the term “calibrate” can mean several different things: set the range (scale); trim (correct) the sensor (transducer) reading or current output against a standard; or simply compare the sensor (transducer) reading or cu...

Differential Pressure Transmitter Calibration Procedure

 Differential Pressure Transmitter Calibration Procedure: Set up the differential pressure transmitter, HART communicator, power supply, hand pump, and multimeter as shown in the calibration setup diagram below. Make sure the equalizing valve manifold is closed. Apply a pressure to the transmitter equal to the lower range pressure (usually corresponding to 4 mA in the transmitter output). For example, with a 0 to 100 mbar calibrated range the lower range pressure is 0; with a -2 psig to 5 psig range, the lower range pressure is -2 psig. Read the pressure on the transmitter LCD (or in the HART communicator). Adjust (if necessary) through the HART communicator so that the output of the transmitter (on the LCD) is the same as the applied pressure. Read the mA output of the transmitter using a multimeter. Adjust (if necessary) through the HART communicator so that the output of the transmitter (on the multimeter) is 4 mA. Apply a pressure to the tra...
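Expected output currents for the example -2 to 5 psig range can be precomputed at the usual five test points before running the procedure. The helper name is illustrative:

```python
def expected_ma(pressure, lrv, urv):
    """Ideal 4-20 mA output for a given applied pressure and calibrated range."""
    return 4.0 + (pressure - lrv) / (urv - lrv) * 16.0

lrv, urv = -2.0, 5.0  # psig, the example range from the procedure above
for pct in (0, 25, 50, 75, 100):
    p = lrv + pct / 100.0 * (urv - lrv)
    print(f"{pct:3d}% -> {p:6.2f} psig -> {expected_ma(p, lrv, urv):5.2f} mA")
```

Comparing the multimeter reading against these ideal values at each point shows whether an adjustment through the HART communicator is actually needed.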

What is a Dead Weight Tester?

 A dead weight tester is an instrument that calibrates pressure instruments by applying a known force over a known area: pressure equals force divided by the area over which the force is applied. Dead Weight Tester Dead weight testers are usually used for pressure gauge calibration because of their high accuracy, so they can be used as a primary standard (as mentioned before). There are many types, depending on the application, and they are operated with oil (hydraulic) or with air (pneumatic). Dead weight testers are the basic primary standard for accurate measurement of pressure. They are used to measure the pressure exerted by a gas or liquid and can also generate a test pressure for the calibration of numerous pressure instruments. Why is it called a dead weight tester? The weights placed on the weight stand of the tester serve as the reference against which the instrument under test is calibrat...

How to Test a Pressure Switch ?

 Accurate calibration of pressure switches is a critical step in ensuring process quality and the safe operation of equipment. The setup is similar to pressure gauge calibration, except that a voltage or continuity across a set of switch contacts now needs to be read by either a digital multimeter (DMM) or the calibrator. The purpose of the calibration is to detect and correct errors in the set point and deadband of the pressure switch. Calibrators can save you time by reducing steps and the amount of equipment you have to bring to the job. With the right calibrator the entire process can be automated. To perform the test: Setup 1. Safely disconnect the device from the process it controls. 2. Connect the calibrator or DMM to the common and NO (normally open) output terminals of the switch. The DMM or calibrator will measure an “open circuit” if measuring continuity. If measuring V ac, be sure the tool is properly rated for the voltage being measured. 3. Connect the pressure swit...
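The two quantities the calibration verifies can be computed from the recorded trip and reset pressures. The function and the example pressures below are hypothetical:

```python
def switch_results(trip_rising, reset_falling):
    """Set point: pressure at which the contacts change state on rising pressure.
    Deadband: difference between the trip and reset pressures."""
    return {"set_point": trip_rising, "deadband": trip_rising - reset_falling}

# Hypothetical recorded values from the DMM/calibrator test, in psig
r = switch_results(trip_rising=50.0, reset_falling=46.5)
print(r)  # {'set_point': 50.0, 'deadband': 3.5}
```

Both values are then compared against the switch's specified set point and deadband tolerance to decide whether adjustment is required.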

Pressure Transmitter Calibration at the bench

 Technicians calibrate at the bench to ensure calibrations are effective and do not result in degraded performance. They ensure that all components are in good working order prior to installation, and can evaluate them when component failure is suspected. The bench provides a stable ambient environment for calibration, an opportunity to use the most accurate equipment, and protection from factory conditions during the commissioning, testing, and calibration of pressure transmitters. To perform the test: Connect the transmitter test hose from the calibrator to the transmitter. Connect the mA measurement jacks of the calibrator to the transmitter. Set the pressure/vacuum selection knob to the necessary function. Close the vent knob and supply metering valve. Apply pressure or vacuum from the pump by holding down the pump button, releasing it when the necessary pressure is reached. Correct the pressure with the fine pressure adjustment. Read the reference pressure and the current output o...
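As-found results from such a bench test are commonly expressed as a percentage of span. This sketch assumes a 4-20 mA transmitter; the range and readings are illustrative:

```python
def error_percent_of_span(applied, measured_ma, lrv, urv):
    """As-found error as percent of span for a 4-20 mA transmitter."""
    ideal_ma = 4.0 + (applied - lrv) / (urv - lrv) * 16.0
    return (measured_ma - ideal_ma) / 16.0 * 100.0

# Illustrative: 0-100 mbar range, 50 mbar applied, calibrator reads 12.08 mA
err = error_percent_of_span(50.0, 12.08, 0.0, 100.0)
print(round(err, 2), "% of span")  # 0.5 % of span
```

Dividing by the 16 mA span makes errors at different test points directly comparable against a tolerance such as ±0.25% of span.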

How to Test a Pressure Gauge

 Both analog and digital process gauges need to be verified to detect errors related to drift, environment, electrical supply, addition of components to the output loop, and other process changes. Pressure gauges may be verified in the field or at the bench. Field calibration may save time, and allows for troubleshooting in the process environment. Multifunction calibrators make it easier to do this with one tool, and documenting calibrators make it easier to follow procedures, capture data and document results. Bench calibration provides an environment where the gauge can be cleaned, inspected, tested, and recertified under reference conditions for the best possible accuracy. To perform the test:  Isolate the pressure gauge from the process using valves, or by removing the gauge from the process. Connect the gauge to the calibrator or reference gauge. For hydraulic pressure gauges it’s important to remove any gas that might be trapped in the fluid in the gauge, calibrator, an...

Difference between Calibration and Ranging

The concepts of calibration (trimming) and ranging are often difficult for engineers new to instrumentation to grasp. A simple analogy useful for understanding these topics is that of setting a digital alarm clock. Suppose you purchase a digital alarm clock to wake you at 7:00 AM so that you can get to school on time. It would be foolish to simply unpack your new clock from its box, power it up, and set the wake-up time to 7:00 AM expecting it to wake you at the correct time. Before trusting this alarm time of 7:00 AM, you would first have to synchronize your new clock to some standard time source (such as the standard time for your time zone) so that it accurately registers time for the zone in which you live. Otherwise, the wake-up setting of 7:00 AM would be hopelessly uncertain. Once your clock is synchronized against a trusted time source, however, the wake-up (alarm) time may be set at will. If your class schedule changed, allowing one more hour of s...

Shunt Calibration of a Strain Gauge Load Cell

 Shunt calibration is the known electrical unbalancing of a strain gauge bridge by means of a fixed resistor that is placed, or “shunted,” across one leg of the bridge. The Wheatstone bridges used in strain gauge load cells are typically field-calibrated using the shunt calibration technique. Shunt calibration is a method of periodically checking the gain or span of a signal conditioner used in conjunction with a strain gauge based transducer, without exposing the transducer to known, traceable, physical input values. If required, adjustments can then be made to the signal conditioner to ensure accurate measurement results. A strain gauge bridge is “in balance” when the host mechanical structure is unloaded and unstressed. As the host structure (diaphragm, bending beam, shear beam, column, etc.) is loaded or stressed, the Wheatstone bridge becomes unbalanced, resulting in an output signal that is proportional to the applied load. Fig : A Wheatstone Bridge circuit showing th...
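The equivalent strain simulated by a shunt resistor follows from the bridge arithmetic: shunting R_s across a gauge of resistance R_g changes that arm's resistance by R_g^2 / (R_g + R_s), which corresponds to a strain of R_g / (GF * (R_g + R_s)). A small sketch with typical (assumed) values:

```python
def shunt_strain(r_gauge, r_shunt, gauge_factor=2.0):
    """Equivalent strain simulated by shunting r_shunt across one bridge arm:
    epsilon = R_g / (GF * (R_g + R_s)). GF = 2.0 is a typical assumed value."""
    return r_gauge / (gauge_factor * (r_gauge + r_shunt))

# Classic example: a 120 ohm gauge shunted by 59,880 ohm simulates 1000 microstrain
eps = shunt_strain(120.0, 59880.0)
print(round(eps * 1e6, 1), "microstrain")  # 1000.0 microstrain
```

The signal conditioner's indicated value during the shunt check is compared against this computed equivalent strain to verify its gain or span.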

Portable Gas Detectors Calibration Procedure

  Portable Gas Detectors Calibration Procedure: According to AS/NZS 60079.29.2:2008 (Gas Detectors – Selection, installation, use and maintenance of detectors for flammable gases and oxygen), a portable gas monitor should be regularly calibrated by a competent person according to the manufacturer’s instructions. A gas monitor/detector loses sensitivity over time due to aging, exposure to high concentrations of gases, sensor poisoning, etc., so the accuracy of the gas monitor must be verified on a regular basis. This can be done by performing a calibration that ensures the gas monitor is operating correctly. Calibration Frequency: The frequency of span calibration is best determined by local standards, company policies, and industry best practices. The calibration frequency should be determined by the level of risk: a high-risk situation requires more frequent calibration than a low-risk situation. Best practice is to test the gas monitor with a known co...

What is Bump Testing ?

 The latest version of the International Safety Equipment Association (ISEA) statement applies to all types of direct-reading portable gas detectors, not just confined space instruments. The ISEA protocol has been widely adopted by the gas detection equipment manufacturing community, even by manufacturers who are not members of the Association. The ISEA protocol begins by clarifying the differences between a “bump test”, a “calibration check”, and a “full calibration”: A “bump test” (function check) is defined as a qualitative check in which the sensors are exposed to challenge gas for a time and at a concentration sufficient to activate all of the alarms to at least the lower alarm settings. It is important to understand what a qualitative test of this kind does not do. The test confirms that the gas is capable of reaching the sensors, that the sensors respond when exposed to gas, that the response time (time to alarm) after gas is applied is within normal limits, and that the alarms are ...