ICEweb has nearly 100 core pages on Control, Instrumentation, Fire & Gas, and Safety Instrumented Systems, and more than 300 pages in total. It really is Cool Engineering - by engineers, for engineers - and must be just about the world's first choice for technical information.
Whilst every effort is made to ensure technical accuracy of the information supplied on iceweb.com.au, Keyfleet Pty Ltd and its employees accept no liability for any loss or damage caused by error or omission from the data supplied. Users should make and rely on their own independent inquiries. By accessing the site users accept this condition. Should you note any error/omission or an article offends please do not ignore it, contact the webmaster and we will review, rectify and remove as necessary.
This is a cached page from INX.Inc. ICEweb cannot find the original pages on the web and felt that the information was too useful not to be made available. Should you know of a contact for the originator, please advise the webmaster by email.
· The Importance of Calibration
Properly operating instruments are critical to plant safety and product quality. To calibrate process instruments, it is essential that controlled conditions and measurement signals present in the actual process installation are simulated.
For an analog electronic pressure transmitter, a variable signal source provides a pressure input at the same values as used in the process, and a precision input standard measures the calibration input. A regulator is used to precisely control the input pressure. The range of the transmitter determines the input measurement standard. The output of the transmitter is connected in a series circuit that furnishes the 24 VDC transmitter power and measures the output signal from the transmitter. With an output signal range of 4-20 mA, the most appropriate output measurement standard is a milliammeter. The instrument should be mounted in the same position as it is installed in the process.
The input is connected so the precision gage indicates the pressure to the instrument under test. Locate the precision gage on a tee between the regulator and the transmitter. A bleed valve connected between the regulator and the transmitter releases pressure from the pneumatic input calibration circuit and allows the system to return to zero. The output of the transmitter is connected in series with the milliammeter and the power source.
Calculate the input test points for the upscale and downscale check and the expected output values. The recommended five test points are typically 10%, 30%, 50%, 70%, and 90% of input span. Because 0% and 100% are extremes of the transmitter range, they are not recommended as test points.
With the five point check complete, use the resulting values to determine if any errors are present and calculate the accuracy.
Accuracy (% of span) = (Deviation / Span) * 100
Deviation = Expected Value - Actual Value
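As a sketch of these calculations, the following assumes a 4-20 mA output transmitter; the test points are the five recommended above, but the measured meter readings are made up for illustration:

```python
# Hypothetical five-point check for a 4-20 mA pressure transmitter.
# The measured values below are illustrative, not from a real instrument.

SPAN_MA = 16.0  # output span: 20 mA - 4 mA

def expected_output_ma(input_pct):
    """Expected output current for a given percent-of-span input."""
    return 4.0 + SPAN_MA * input_pct / 100.0

def deviation(expected, actual):
    return expected - actual

def accuracy_pct(dev):
    """Deviation expressed as a percentage of output span."""
    return dev / SPAN_MA * 100.0

test_points = [10, 30, 50, 70, 90]               # percent of input span
measured_ma = [5.62, 8.78, 12.02, 15.22, 18.38]  # hypothetical readings

for pct, actual in zip(test_points, measured_ma):
    exp = expected_output_ma(pct)
    dev = deviation(exp, actual)
    print(f"{pct:3d}%  expected {exp:5.2f} mA  actual {actual:5.2f} mA  "
          f"error {accuracy_pct(dev):+.2f}% of span")
```

Each row's error can then be compared against the manufacturer's accuracy specification to decide whether adjustment is needed.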
The zero shift is corrected first, since adjusting the span requires an accurate base point, or zero. Zero is typically adjusted at 10% input until a 10% output is produced. Next, adjust the span with 90% of input pressure applied; with the input at 90%, adjust the span to provide 90% of the output. Because zero and span adjustments interact, continue correcting until both zero and span are correct.
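The interaction between the zero and span trims can be illustrated with a simple linear model of the transmitter (output% = offset + gain x input%); the starting miscalibration and the acceptance tolerance below are hypothetical:

```python
# Sketch of zero/span interaction on a linear transmitter model.
# Starting errors and the 0.05%-of-span tolerance are illustrative.

offset, gain = 2.0, 0.95   # hypothetical miscalibration
TOL = 0.05                 # acceptance tolerance, percent of span

def output(input_pct):
    return offset + gain * input_pct

passes = 0
while abs(output(10) - 10) > TOL or abs(output(90) - 90) > TOL:
    # Zero trim at 10% input: shift the offset until the output reads 10%.
    offset += 10 - output(10)
    # Span trim at 90% input: scale the gain until the output reads 90%.
    # This also shifts the reading back at 10%, hence the iteration.
    gain = (90 - offset) / 90
    passes += 1

print(f"within tolerance after {passes} passes: "
      f"offset={offset:.4f}, gain={gain:.4f}")
```

Each span trim disturbs the zero point slightly, but the disturbance shrinks on every pass, which is why repeated zero/span correction converges quickly in practice.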
· Smart Pressure Transmitter Calibration
Smart transmitters can communicate with a handheld configuration instrument. Smart transmitters are calibrated at the factory, but it is usually necessary to configure the transmitter to meet your process requirements.
Configuration enables the user to select the appropriate range for a specific process within the transmitter's wide span. Reranging the pressure transmitter may be necessary to meet measurement requirements when the application of the transmitter changes or when a smart pressure transmitter is replaced. At the beginning of the configuration procedure it is usually necessary to enter information about the instrument and its function in the process. Once the upper and lower range values are entered and verified they can be downloaded directly to the transmitter.
Enter the applicable measurement units by pressing the UNITS key on the interface device. Then press ENTER and PROCEED to confirm the units selected. Next, enter the upper and lower range values. Press the LRV key to select the lower range value, then key in the value and press ENTER. The interface device will display the lower range value, which is represented in the process by a 4 mA signal. The upper range value is set the same way, using the URV key. This selected value will now be represented in the process by a 20 mA signal.
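As an illustration of how the configured LRV and URV map process pressure to the output signal, the sketch below assumes a made-up 50-250 kPa range; the formula is the standard linear 4-20 mA scaling:

```python
# How a configured LRV/URV maps process pressure onto the 4-20 mA signal.
# The 50-250 kPa range is a hypothetical example, not from a real setup.

LRV, URV = 50.0, 250.0   # hypothetical lower/upper range values, kPa

def pressure_to_ma(p_kpa):
    """Linear 4-20 mA output for a pressure within the configured range."""
    return 4.0 + 16.0 * (p_kpa - LRV) / (URV - LRV)

print(pressure_to_ma(50.0))    # at LRV  -> 4.0 mA
print(pressure_to_ma(150.0))   # at mid-range -> 12.0 mA
print(pressure_to_ma(250.0))   # at URV  -> 20.0 mA
```

Reranging the transmitter simply changes LRV and URV; the 4 mA and 20 mA endpoints then correspond to the new range values.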
A differential pressure transmitter compares a high pressure value to a low pressure value.
The differential pressure transmitter can be calibrated with a low pressure calibrator. A low pressure calibrator contains a regulator to adjust pressure output values to the transmitter, and a digital precision gage that can measure the calibrator output to the instrument under test. The output circuit for the calibration includes a milliammeter to monitor the output signal and a 24 VDC power source.
Connect the pressure source from the calibrator to the high pressure port on the transmitter. The transmitter low pressure port is vented to atmosphere. This allows differential pressure to be read directly without any calculations. The transmitter output is connected in a series circuit.
Start the calibration with a five point upscale and downscale check. After you have collected the data, evaluate the readings to determine the instrument accuracy.
Correct the zero shift first by adjusting the input to 10% of the transmitter range. To correct the span error, first raise the input to 90% of the instrument's span. Then adjust the span until the appropriate milliamp output is indicated. Zero should be checked again after making a span adjustment because span and zero may interact. Continue rechecking and adjusting zero and span until the transmitter is calibrated within specifications.
Pressure changes applied to the gage cause the elastic element to expand and contract. The movement of the element is translated into movement of the pointer through links, levers, and gears. Calibrating a pressure gage includes adjustment of these components until the gage reading accurately represents the input.
The instrument under test (in this case, the gage) determines the calibration standards. First you need a source of pressure, which is best provided by a regulator. An input standard to measure the pressure applied is also needed; an appropriate input measurement standard for this calibration is a precision pressure gage.
Use a tee to connect the precision gage to the source of pressure and the gage under test. Be sure the gage under test is mounted in the same orientation as it is in the process.
· Five Point Check
Determine the five test points used for the upscale and downscale checks of the gage under test. With any link and lever instrument it is important that your entire upscale check be done in an upscale direction and your entire downscale check be done in a downscale direction.
The test results should be checked for accuracy within the manufacturer's specifications. If the results are outside the manufacturer's specifications, determine the type of errors present. On most motion balance instruments, try to adjust linearity first. Linearity is corrected at the midpoint of the range, so apply an input of 50% of span. Use a template to check the 90 Deg. angle, matching the linkage angle with the template. With linearity adjusted, position the pointer so the gage reads midscale. You may need to remove the pointer and reposition it on the shaft. Lower the input to 10%, and adjust the zero so the gage reading equals the 10% input value. Now correct the span error: increase the input pressure to 90% and adjust the span until the gage reads this same 90% value. Repeat the zero and span adjustments until the readings at 10% and 90% are accurate. Because zero and span interact, rechecking is required for best results.
Thermocouples measure temperature and are used quite often in process controls.
When wires of two different thermoelectrically homogeneous materials are joined at one end and placed in a temperature gradient, a thermoelectric voltage (EMF) is observed at the other end. The connection is called the measuring junction. Under the ANSI color code, the red thermocouple lead is negative, and the color of the other wire indicates the thermocouple type; for example, on a J-type thermocouple, the positive wire is white. Tables for each type of thermocouple list the voltages produced at various temperatures. Thermocouples should be checked whenever there are indications that the output is not accurate. It may also be necessary to check a thermocouple that will be used as a measurement standard.
A temperature bath provides controlled temperatures for testing a sensor. A well in the temperature bath is used to hold the sensor during the accuracy check. Another well holds a measurement standard thermometer, which is used to confirm the actual bath temperature. A second measurement standard thermometer is used to read the ambient temperature at the reference junction. The sensor output signal and range determine the thermocouple output measurement standard. Since the output is measured in millivolts, a millivolt meter is used to read the output.
Set up the temperature bath as a temperature input standard to the thermocouple. Select the output standard with the appropriate range for reading millivolts. Connect the red lead to the negative millivolt meter input and the white lead to the positive millivolt meter input.
Because no adjustments are possible, we can only check the calibration of a thermocouple sensor. This check is generally done at three test point input values: ambient temperature, mid-range temperature, and the upper value of the application range. Recall that in a thermocouple, it is the difference in temperature between the reference and the measuring junction that produces a millivoltage output. Before inserting the thermocouple into the bath, determine the ambient temperature, which represents the temperature at the reference junction of the thermocouple. When using look-up tables that are referenced to 0 Deg. C, you must compensate for ambient temperature: the millivolt value in the tables for the ambient temperature is added to the value from the sensor. This compensated millivolt value is then used to determine the correct temperature from the tables.
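A minimal sketch of this compensation follows, using a coarse type J look-up table. The millivolt entries and the ambient and meter readings are approximate/illustrative; use the full published reference tables in practice.

```python
# Cold-junction compensation against 0 deg C-referenced tables, as above.
# Coarse, approximate type J table (deg C -> mV); illustrative only.
J_TABLE = {0: 0.000, 25: 1.277, 50: 2.585, 100: 5.269, 150: 8.010}

def mv_at(temp_c):
    """Linear interpolation into the (coarse) table: temperature -> mV."""
    pts = sorted(J_TABLE.items())
    for (t0, v0), (t1, v1) in zip(pts, pts[1:]):
        if t0 <= temp_c <= t1:
            return v0 + (v1 - v0) * (temp_c - t0) / (t1 - t0)
    raise ValueError("temperature outside table range")

def temp_at(mv):
    """Inverse lookup: millivolts -> temperature."""
    pts = sorted(J_TABLE.items())
    for (t0, v0), (t1, v1) in zip(pts, pts[1:]):
        if v0 <= mv <= v1:
            return t0 + (t1 - t0) * (mv - v0) / (v1 - v0)
    raise ValueError("millivolts outside table range")

ambient_c = 25.0       # reference-junction (ambient) temperature, assumed
measured_mv = 3.992    # hypothetical reading from the millivolt meter

# Add the table value for ambient, then convert the compensated total.
compensated_mv = measured_mv + mv_at(ambient_c)
print(f"process temperature about {temp_at(compensated_mv):.1f} deg C")
```

The key step is the addition: the sensor only reports the difference between the measuring and reference junctions, so the table value for ambient restores the 0 Deg. C reference the tables assume.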
An accuracy check for a resistance temperature device (RTD) is often necessary to validate the accuracy of a new RTD.
There are three basic types of RTDs: 2-wire, 3-wire, and 4-wire units, all of which are wired in a bridge-configured circuit. An RTD is a metallic element whose resistance varies with temperature. By connecting it in one leg of a Wheatstone bridge, its resistance can be measured. 2-wire RTDs are susceptible to errors caused by changes in lead wire resistance. To compensate for these and other errors, 3-wire and 4-wire RTDs may be used when greater accuracy is required.
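The size of the 2-wire lead-resistance error can be sketched numerically. The example below assumes a PT100 element with the common 0.00385 temperature coefficient and a made-up lead resistance:

```python
# Why 2-wire RTDs are error-prone: lead resistance adds directly to the
# element resistance. PT100 with alpha = 0.00385 ohm/ohm/degC is assumed;
# the lead resistance figure is illustrative.

R0, ALPHA = 100.0, 0.00385   # PT100 ice-point resistance and coefficient

def rtd_resistance(temp_c):
    """Simplified (linear) PT100 resistance model."""
    return R0 * (1 + ALPHA * temp_c)

def rtd_temperature(resistance):
    """Inverse of the linear model: resistance -> temperature."""
    return (resistance / R0 - 1) / ALPHA

lead_r = 0.5                  # hypothetical resistance of EACH lead, ohms
true_temp = 100.0
measured_r = rtd_resistance(true_temp) + 2 * lead_r  # 2-wire: both leads add

error = rtd_temperature(measured_r) - true_temp
print(f"2-wire lead error at {true_temp} degC: +{error:.2f} degC")
```

Even a modest lead resistance produces an error of a couple of degrees, which is exactly what the 3-wire and 4-wire arrangements are designed to cancel.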
· Input and Output Measurement Standards
The application range of the RTD determines the measurement standards to be used to check this instrument. An appropriate input standard would be a temperature bath. A standard thermometer must be placed in one of the wells to confirm the accuracy of the bath. RTDs do not require temperature compensation. A volt-ohmmeter or decade resistance box can be used as the output measurement standard, because they measure resistance. To check an RTD with a volt-ohmmeter, look-up tables that relate RTD resistance to temperature are needed.
For a 3-wire RTD, the two red RTD leads are connected to the positive meter leads, and the black RTD lead is connected to the negative lead.
Test RTDs at ambient temperature, mid-range temperature, and the high end of the range. RTDs cannot be calibrated; when they are not within the manufacturer's specifications, the unit must be replaced.
Calibration of temperature transmitters should be checked on a periodic basis.
The temperature transmitter discussed here receives an input signal from a thermocouple. A millivolt input signal will be needed for calibration, so a millivolt source can be used for the input standard. A milliammeter can be used to measure the transmitter output. Use a standard thermometer to calculate the input signal compensation for the ambient temperature. Finally, a power supply for the transmitter is necessary.
To calibrate a temperature transmitter with a millivolt source as an input standard, you must compensate for any reference temperature other than 0 Deg. C (32 Deg. F).
To make the input connections, the location of the reference junction must first be determined. When thermocouple wires are used to connect the millivolt source to the transmitter, the reference junction is at the transmitter connection, so ambient temperature is measured in the transmitter housing. If copper wires are used, the reference junction is at the connection to the millivolt source, so measure ambient temperature at the millivolt source. Always observe the polarity of the leads. Connect the negative output from the millivolt source to the positive transmitter terminal. Connect the milliammeter in series with the transmitter and the power supply.
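When copper wires place the reference junction at the millivolt source, the source setting for a simulated process temperature is the table value for the target minus the table value for ambient. A sketch, using approximate type J table values and an assumed 25 Deg. C ambient:

```python
# Millivolt source settings for simulating a type J thermocouple when
# copper wires put the reference junction at the source (at ambient).
# Table entries are approximate type J values; ambient is assumed 25 degC.

J_MV = {0: 0.000, 25: 1.277, 100: 5.269, 200: 10.779}  # degC -> mV

ambient_c = 25
for target_c in (100, 200):
    # Subtract the ambient table value: the real thermocouple would only
    # generate the difference between measuring and reference junctions.
    setting = J_MV[target_c] - J_MV[ambient_c]
    print(f"simulate {target_c} degC: set source to {setting:.3f} mV")
```

If thermocouple wires are used instead, the reference junction moves to the transmitter housing, and the ambient value subtracted must be measured there.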
Adjust the millivolt source and the milliammeter to the required values, then turn on the equipment and begin the calibration.
Perform a five point check to determine if the transmitter is accurate according to specifications.
Adjust the zero shift first. It should be set with an input value of 10%; with the zero properly set, a 10% input results in a 10% output. Adjust the span using a 90% input. The zero and span may interact; check and readjust as required.
A multifunction temperature calibrator is used to provide all the necessary input values and output measurements. The calibrator provides the RTD simulation for the transmitter inputs and a milliammeter to measure the transmitter outputs. RTD tables are not necessary because temperature can be input directly. Connect the RTD transmitter input and output terminals according to the diagram provided with the calibrator. The calibrator simulates the RTD resistance for the transmitter and displays the resulting milliamp output. Perform a five point upscale and downscale test. Zero is corrected at 10% input and adjusted for 10% output as shown on the milliammeter. Correct the span at 90%. Because zero and span interact, recheck and adjust as required.