
Transmitter Calibration Procedure

In today’s environment of shrinking cost margins and increasing regulation, maintaining properly calibrated equipment becomes ever more important. Even a small inaccuracy in process measurements can cause a significant loss of revenue, especially when the process handles large volumes.

Electronic equipment can lose accuracy over time, leading to incorrect readings; this is where calibration comes in. Calibrating electronic equipment, in this case transmitters, brings the measured values of the device in line with a known value from an applied standard.

Instrument Loops

In order to understand transmitter calibration, it helps to first review instrument loops. An instrument loop is a configuration of equipment connected so as to deliver power to all devices and obtain readings from them. An instrument loop can be either electrical or pneumatic.

The figure below shows a simple instrument loop with a power supply, transmitter, and a meter to measure current in the loop.
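In an electrical loop like the one described above, the transmitter regulates the loop current between 4 and 20 mA in proportion to the measured variable. The sketch below shows the standard conversion between loop current and percent of calibrated span; the function names are illustrative, not from any particular device or library.

```python
def current_to_percent(current_ma: float) -> float:
    """Convert a 4-20 mA loop current to percent of calibrated span."""
    return (current_ma - 4.0) / 16.0 * 100.0

def percent_to_current(percent: float) -> float:
    """Convert percent of span to the expected loop current in mA."""
    return 4.0 + (percent / 100.0) * 16.0

print(current_to_percent(12.0))   # mid-scale current reads 50% of span
print(percent_to_current(25.0))   # 25% of span corresponds to 8 mA
```

Note that 4 mA, not 0 mA, represents 0% of span; the "live zero" lets a broken wire (0 mA) be distinguished from a legitimate zero reading.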

Smart Transmitters
Smart transmitters are microprocessor-based transmitters that are capable of being reprogrammed with various parameters without the need for board replacement.

Smart transmitters are more accurate than traditional analog transmitters, have a much greater turndown ratio, and are much easier to calibrate.
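Turndown ratio is the ratio of a transmitter's upper range limit to the smallest span it can be calibrated to. A quick sketch, using made-up but representative numbers:

```python
def turndown_ratio(upper_range_limit: float, minimum_span: float) -> float:
    """Turndown = upper range limit / minimum calibratable span."""
    return upper_range_limit / minimum_span

# A hypothetical smart DP transmitter with a 150 inH2O upper range limit
# and a 1.5 inH2O minimum span has a 100:1 turndown, so one device can be
# re-ranged in software to cover many different applications.
print(turndown_ratio(150.0, 1.5))  # 100.0
```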

Calibration Guidelines
All calibrations should follow some general guidelines, which are true of any type of transmitter, smart or traditional. Some of the guidelines are discussed here.

Primary vs. Secondary Standards
Primary standards produce a certified output that can be used to calibrate a piece of equipment. Deadweight testers are examples of primary standards. These devices use a precision mass to measure pressure.
Secondary standards typically measure a variable with a transducer or electronics, then display it electronically.
Gravitational Constants and Corrections
Since local gravitational acceleration varies slightly from place to place, it must be taken into account when using a primary mass-to-pressure device such as a deadweight tester.

When using a primary tester to calibrate an instrument, make sure the weights are stamped with the proper gravitational constant for the area in which the tester is being used.

Some commonly used constants are below:
  • “Standard/International” gravity is 980.665 cm/s²
  • US mean gravity is 980.000 cm/s²

All instrument calibrations should be made to 0.1% of calibrated span.
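Because a deadweight tester produces force as mass times gravity, the pressure it actually generates scales with the ratio of local gravity to the gravity for which its weights were certified. A minimal sketch of that correction, using the constants above (function and variable names are illustrative):

```python
STANDARD_GRAVITY = 980.665  # cm/s^2, "standard/international" gravity
US_MEAN_GRAVITY = 980.000   # cm/s^2

def corrected_pressure(nominal_pressure: float,
                       local_gravity: float,
                       stamped_gravity: float = STANDARD_GRAVITY) -> float:
    """Scale a deadweight tester's nominal pressure to local gravity.

    The weights produce force m*g, so the true applied pressure is the
    nominal (stamped) pressure multiplied by the ratio of local gravity
    to the gravity for which the weights were certified.
    """
    return nominal_pressure * (local_gravity / stamped_gravity)

# A nominal 100 psi load applied where local gravity is below standard
# produces slightly less than 100 psi:
print(corrected_pressure(100.0, US_MEAN_GRAVITY))
```

Given the 0.1%-of-span accuracy target above, the roughly 0.07% error between standard and US mean gravity is large enough to matter, which is why the stamped constant must match the location.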

General Calibration Routine
All calibrations have common elements.  The following is a general procedure for performing any calibration:

  1. Isolate the transmitter from the source. This means taking the transmitter out of service.
  2. Connect a primary device, such as a deadweight tester or a decade box, to the input of the transmitter. Make sure the primary device has been certified. The setup should be similar to the figure below. In this figure, a HART communicator has also been connected to aid in programming a smart transmitter.
  3. Record the readings on the appropriate form. A sample chart is shown below for a 0 to 150" differential pressure transmitter.
  4. Apply the desired input from the table to the transmitter and read the output.  Record it in the chart under the “As Found” column.
  5. If the “As Found” number is outside the minimum or maximum acceptable limits, the transmitter must be recalibrated. Even if the readings are inside the limits but still somewhat off, it is good practice to recalibrate the transmitter.
  6. Recalibrate the transmitter according to the procedure for that transmitter.  This usually involves putting the 0% value of input on the transmitter and zeroing the transmitter to 4 milliamps out. Then, put the 100% value on the transmitter and set the span to read 20 milliamps.
  7. Perform the calibration checks again as in step 4 and record the values in the “As Left” column.
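The check in steps 4 and 5 can be sketched as follows for a 0 to 150" differential pressure transmitter with a 4-20 mA output, using the 0.1%-of-span tolerance from the guidelines above. The five test points and the "as found" readings here are hypothetical, made up to show both a passing and a failing point:

```python
SPAN_IN_H2O = 150.0            # calibrated span, 0 to 150 inH2O
TOLERANCE_MA = 0.001 * 16.0    # 0.1% of the 16 mA output span

def expected_output_ma(input_in_h2o: float) -> float:
    """Ideal 4-20 mA output for a given applied input."""
    return 4.0 + (input_in_h2o / SPAN_IN_H2O) * 16.0

def within_tolerance(as_found_ma: float, input_in_h2o: float) -> bool:
    """True if the reading is within 0.1% of span of the ideal output."""
    return abs(as_found_ma - expected_output_ma(input_in_h2o)) <= TOLERANCE_MA

# Five-point check at 0, 25, 50, 75, and 100% of span (readings invented)
for inp, as_found in [(0.0, 4.002), (37.5, 8.020), (75.0, 12.001),
                      (112.5, 15.995), (150.0, 19.998)]:
    status = "OK" if within_tolerance(as_found, inp) else "RECALIBRATE"
    print(f"{inp:6.1f} inH2O  expected {expected_output_ma(inp):6.3f} mA  "
          f"as found {as_found:6.3f} mA  {status}")
```

In this made-up run the 25% point is off by 0.020 mA, more than the 0.016 mA tolerance, so the transmitter would be recalibrated (step 6) and the check repeated for the “As Left” column (step 7).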



