Calibration of Cranes and Lifting Equipment

Calibration services, sometimes called instrument calibration services, are an essential part of the upkeep of any measurement device or instrument. The purpose of a calibration service is to measure the precise output of a device, usually in terms of its power, and compare this reading with the manufacturer’s standards to find out whether the device is running both safely and efficiently.

Alignment and condition monitoring instruments require proper maintenance, care and calibration at specific intervals to achieve the highest possible measurement reliability and accuracy. Please refer to the individual operating manuals for the recommended recalibration interval. In most cases calibration accuracy checks are conducted every two years.

Poorly maintained alignment systems may lead to inaccurate measurements and incorrect results. All calibrations are performed in our high-tech test and measurement lab by our Service Team.

The standards and standard measurement equipment used for calibration are recalibrated at specific intervals according to the standards of the German PTB (Physikalisch-Technische Bundesanstalt) and other national standards.

PRUFTECHNIK Service Center helps you achieve an optimal, long-lasting alignment of your machines.

A major feature of PRUFTECHNIK products is their high quality, with virtually no need for repairs. Should any form of repair nevertheless become necessary, count on us to be a reliable and fair partner.

We guarantee an efficient process with all the advantages offered by our PRUFTECHNIK Service Center.

Apart from repairing your instrument, we will ensure that it is fit for future alignment jobs, preventing possible defects and failures. In addition, all necessary improvements and updates will be carried out.

Should a service of your instrument ever become necessary, we offer you a quick turnaround time, and if necessary a loaner system or component will be available to you.

A calibration label does not always mean your instruments are in tolerance and making accurate measurements. If your instrument is out-of-specification (out-of-tolerance), you can no longer trust your test results.

Are you 100% confident your calibration service partner provides an accurate calibration?

Leading Manufacturers

Custom Calibration Inc.

North Haven, CT  |  203-484-3707

Custom Calibration Inc. specializes in on-site and laboratory calibration which will keep costly equipment downtime to a minimum and maximize your overall productivity. We have over 30 years of experience providing calibration services for mechanical, dimensional, scale, torque, humidity, and many more applications. Our company’s mission is to achieve total customer satisfaction by providing prompt, precise, tailor-made calibration solutions to fit your specific needs.

Calibration Laboratory, LLC

Merrillville, IN  |  800-373-1759

ISO/IEC 17025:2017 & ANSI/NCSL Z540.3 Accredited Laboratory. Electronic, dimensional, physical and thermodynamic calibrations performed onsite and in our lab. Professional ASQ Certified Calibration Technicians. We support the manufacturing and service sectors, including aerospace, automotive, chemical, electronic equipment, energy, food, industrial, machinery, medical, metal, military, nuclear, pharmaceutical, plastics, and transportation. Free local pickup and delivery. In business since 1977.

Morehouse Instrument Company, Inc.

York, PA  |  717-843-0081

Morehouse is an experienced leader in force and torque measurement helping to create a safer world. We use our knowledge to provide solutions including accurate measurement data and data analysis software. The goal is to help customers make better measurements which can make the difference between success or failure of everyday technology. We offer ISO/IEC 17025 accredited calibrations accurate to 0.002 percent of applied force up to 120,000 lbf and 0.01 percent up to 2,250,000 lbf.

Thermalogic® Corporation

Hudson, MA  |  800-343-4492

Since 1971, Thermalogic has been a leading manufacturer of electronic temperature and humidity control and sensors. Here at Thermalogic we work with our clients in a partnership to build a lasting business relationship. All of our products have quick turnaround times, including custom designs, and are thoroughly tested prior to being sent out. At Thermalogic we pride ourselves on our high quality, industrial grade, reliable products. Contact us today to get started!

Strainsert Company

West Conshohocken, PA  |  610-825-3310

For over 40 years, Strainsert Company has been an industry leader in manufacturing calibration services. Our goal is to provide calibrating services that are thorough and accurate, and we have experience serving a variety of applications.

Sierra Instruments

Monterey, CA  |  831-373-0200

Sierra provides accurate calibration services for mass flow meters and controllers, insertion thermal flow meters, vortex, and ultrasonic flow meters. With more than 40 years of expertise in gas, air, or liquid flow calibration, you can count on our team to make sure your flow meter operates with efficiency and pinpoint accuracy. We believe in providing personalized and customized service, and doing everything we can to maintain credibility with our clients. To get started, contact Sierra today!

Assurance Technologies, Inc.

Bartlett, IL  |  630-550-5000

Since 1955, Assurance Technologies has been a leading calibration service provider for tools and gages. We perform instrument calibration, laser alignment, temperature calibration, electronic calibration, hand gage calibration, torque calibration, pressure calibration and much more. ISO/IEC 17025:2005 Accredited.

Control Automation Technologies Corporation

Winston Salem, NC  |  336-725-6020

A2LA Accredited to ISO/IEC 17025 and with offices in VA and NC, Control Automation Technologies Corporation has provided field instrumentation services since 1989, including calibration, installation, maintenance & start-up. CATLab, CATC’s Accredited Laboratory offers Test Equipment Calibrations for electrical, pressure, temperature, humidity devices & more.

Calibration Services Companies List

A calibration device uses electrical signals as a means of calibrating instruments or other devices. After a technician discovers the exact margin of error, they can adjust the calibrator to the manufacturer’s specifications for improved output.

Note: Calibration services are not to be confused with performance review calibration, which is when managers discuss as a group the performances of their employees and design a set of ratings by which to grade said performances.

Applications

Customers request calibration services in order to make sure that their standardized instruments are giving correct readings and, when applicable, are still able to properly detect anomalies, flaws, heat, defects, wear patterns, etc. In industrial and lab settings, any device that depends on pressure, temperature, or speed should be calibrated to ensure efficiency.

Examples of the many instruments that experts can calibrate include load cells, lab scales, data acquisition sensors, temperature sensors, laboratory chemical sensors, strain gauges, automotive sensors (light sensors, temperature sensors, mass airflow sensors, wheel speed sensors, fuel level sensors, airbag sensors, etc.), proximity sensors, optical gauges (fiber optic sensors, LED sensors, photon counters, scintillators, photo switches, etc.), temperature gauges (thermometers, calorimeters, thermocouples, bimetal strips, pyrometers, flame detectors, etc.), navigational instruments, Geiger counters, and more.

History

The history of calibration services begins with the standardization of weights and measurements of length and distance. Before standardization, measurements varied widely from place to place, and often changed from person to person. For example, one of the first measures of distance in the world was the cubit. People measured cubits as the distance from a person’s elbow to the tip of their middle finger. Since no two people are exactly the same size, no two measurements under this system could be exact.

While king of England between 1100 and 1135 AD, Henry I declared a new unit of measurement, the yard, equal to the distance between his outstretched thumb and the tip of his nose. Finally, in the late 1100s, people began standardizing measurements. First, in 1196, English lawmakers established the Assize of Measures, a code of length measurements. Then, in 1215, the writers of the Magna Carta included standard wine and beer measurements in that document.

During the Renaissance and Industrial Revolution, inventors created all sorts of measurement devices, such as Torricelli’s mercury barometer in 1643. The creation of these devices eventually gave rise to the need for a way to verify their accuracy. During the American Civil War, people first began using the word calibrate in reference to the caliber of a gun, that is, the measurements of the inside of its barrel and the outside diameter of its bullets. As a result, gun manufacturers began referring to calibration as the process of measuring gun barrels and bullets to make sure they were well-matched. Eventually, this term bled over into other accuracy and measurement applications.

Throughout the 20th century, calibration services proved themselves essential to a number of industries, especially the oil and gas industry. Two of the most powerful and important standards organizations, the International Electrotechnical Commission (IEC, founded in 1906) and the International Organization for Standardization (ISO, founded in 1947), grew in influence over this period. Those who carry out calibration services rely heavily on the literature put out by the IEC and ISO to keep them informed on up-to-date standards of alignment and metrology, or the science of measurement.

Today, in addition to using IEC and ISO standards, the United States regularly updates its own national measurement standards for performance and calibration, maintained by agencies like the National Institute of Standards and Technology (NIST). This helps to create standards that are applicable for safety and efficiency across the country. All quality calibration service providers can show evidence that their measurements are traceable to NIST and that they are accredited by recognized accreditation bodies. We highly recommend that you only work with accredited suppliers.

Electronic technology has served to dramatically improve the ability of calibration services. Most sensors and transducers can now provide more accurate measurements than ever before. These tools are also more versatile than they were in the past, and some are even capable of measuring different types of instruments in multiple locations for comparison. This high degree of precision was never possible when workers used manual calibration methods. Advances in electronic technology have made it possible to gauge exact numbers on high tech equipment to maintain high quality performance.

Service Procedure

Calibration service suppliers can check your instruments in their facilities, or they can send a field technician out to you upon request. Some suppliers, such as Fluke Calibration and its sister brands, will also allow you to rent test equipment.

Calibration servicers calibrate instruments and tools based on standardized measurements established by metrology. Using their instruments, technicians will measure and calibrate your instruments per your needs, such as unit of measurement type and required measurement accuracy per standards and regulations.

The procedure that technicians follow during the calibration process can be summed up in three steps: 1) definition of the unit to be measured, 2) realization of the measurement, and 3) traceability to a quantitative rating.

1. Definition of Unit to be Measured
Examples of units that technicians often define and measure include units of distance (anywhere from micrometers to light-years), volume, mass, time, strength, flow, and temperature.

2. Realization of Measurement
Once technicians define the unit of measurement, they transfer it to a quantifiable scale on which they create ratings. They then measure the tool or instrument’s level of output or allowance of this unit.

3. Traceability to a Quantitative Rating
After measuring, technicians apply a rating to the results. They then compare it to a baseline and record their results. By recording their results, they create traceability.

Once technicians have taken these steps, they can finally adjust the measured instrument until it matches the accepted measurement or reads accurately.
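The compare-and-record step at the heart of this procedure can be sketched in a few lines. The reference values, readings, and tolerance below are purely hypothetical:

```python
# Compare device-under-test readings to reference values and flag any point
# outside tolerance. All numbers here are hypothetical.
reference = [0.0, 25.0, 50.0, 75.0, 100.0]  # values applied by the standard
dut       = [0.1, 25.2, 49.8, 75.4, 100.3]  # what the instrument reads back
tolerance = 0.25                            # acceptable error, same units

for ref, reading in zip(reference, dut):
    error = reading - ref
    status = "PASS" if abs(error) <= tolerance else "FAIL"
    print(f"ref={ref:6.1f}  reading={reading:6.1f}  error={error:+.2f}  {status}")
```

Points that fail would then be adjusted and re-measured until they read within tolerance.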

Calibration service providers also periodically calibrate and test their own test equipment, on a schedule they call the calibration interval. By doing so, they maintain their integrity and the quality of their services.

Machinery Used

To carry out the many different types of calibration out there, calibration service providers use a wide variety of calibration tools and machines. These include those that are handheld, fixed, and portable.

Handheld calibration devices are usually small and manually operated.

Fixed calibration instruments remain stationary. They are used in factories and other workplaces during production as needed.

Portable devices may be large and mounted on wheels so that they can be moved around a facility as needed.

Variations and Similar Processes

Traceable calibration is a type of calibration that involves comparing established traceable standards to a series of an instrument’s measurements. Traceable calibration allows technicians to most accurately account for an instrument’s quirks, such as precision and bias. It also helps them pick up on fluke readings.

Pipette calibration is used in medical and laboratory settings. A pipette dispenses measured quantities of liquids for lab or medical use. Accuracy is critical in these environments, so the pipette must be calibrated correctly.

Torque wrench calibration is commonly used on the nuts and bolts of more basic applications. Torque, the product of force and distance, can be calibrated on a torque wrench for improved accuracy. This is important for the threading of nuts and bolts so that over or under-tightening does not occur.

Load cell calibration helps to yield accurate weight measurements in scales. Load cell calibrators are often handheld. To use them, technicians plug them into a transducer with a ready load cell. The pre-standardized calibrator measures the same weight that the load cell is measuring, and the results read out on the device, separate from the results on the load cell. Technicians then adjust the load cell until the calibrator and transducer produce the same reading.

Scale calibration is the process of calibrating weighing scales, such as laboratory scales or industrial scales. Often, this involves load cell calibration, since so many scales use load cells.

Multimeter calibration is the calibration of multimeter equipment. Multimeters are devices that measure electric voltage, current, and (usually) resistance.

Hardness tests measure the hardness of a given material. Hardness tests provide valuable information like the level of resistance to deformation.

Calibration laboratories are laboratories where professionals provide testing to smaller equipment and instruments. Calibration labs often work in conjunction with on-site calibration services. One of the most common tools that technicians use in a calibration laboratory is the pipette.

Pressure calibration is a calibration service that operators perform on instruments that measure or control pressure.

Temperature calibration is specifically for devices that measure or control temperatures, like detectors or thermostats.

Speedometer calibration is a service that helps a speedometer accurately measure speed in a vehicle or some other piece of machinery. It keeps the reading precise and ensures distance is properly accounted for. In automotive applications, this precision keeps drivers safe by giving them the information they need to drive within the speed limit.

Benefits

Calibration services offer customers a number of benefits. First, regular calibration is the best and only sustainable method for maintaining instrument accuracy. In fact, in many industries, leaders not only recommend regular calibration, but require it to meet safety and efficiency standards. In addition, by maintaining instrument accuracy, calibration services help hold up fair trade and honest business practices around the world. By eliminating a large portion of the margin of error for important measurements, like those related to pollutants and allergens, calibration services also help keep society safe and healthy.

On top of ensuring instrument accuracy, calibration services help boost the performance and efficiency of machines that wear down over time. Likewise, they keep the maintenance and repair costs of these machines low. This is because, when measuring devices are calibrated correctly, the machines and devices that they monitor are less likely to malfunction and less likely to need repair.

Note that, although calibration devices are powerful, they are not perfect. To be truly accurate, technicians must be aware of every factor that affects the calibration of an instrument. The possibility of error is always present, but electronic calibration methods have greatly reduced the margin of error.

Things to Consider

To make sure that your devices continue to function well and measure accurately for years to come, you need to connect with a respected calibration service provider. You can find such a provider by checking out those calibration service companies we have listed on this page. Each of those we have listed is reliable, experienced, and properly accredited.

To find the right one for you, start by writing out a list of your requirements. Make sure to include not only the details of your instrument or machine, but also requirements like your budget and whether you can deliver your instrument to the service provider, or you need the service provider to come to you (on-site calibration). Once you have jotted down your requirements, start browsing the calibration companies we have listed on this page. Pick out three or four service providers in which you are most interested, then reach out to their respective customer service lines. Discuss your application at length with each of them. Make sure to write down notes or request a written quote. Once you have done this, compare and contrast your conversations. Determine which service provider will best serve you and go with them.

Directional Tool Calibration

Tools used for drilling in the oil & gas industry, whether directional drilling, Measurement While Drilling (MWD), or ranging tools, all use magnetometers as a means to determine direction.

Because these tools rely on measurements of Earth’s magnetic field for directional drilling and MWD, they are calibrated regularly to reduce errors.

Calibration is performed at DC. This can be done by placing the sensor in a homogeneous and stable Earth field, which these days is increasingly difficult to find, or by placing the tools into a controlled field.

Our range of Helmholtz coils is perfectly suited for calibration, offering generation of a precise field in any orientation, which is much more practical than rotating the tool. Our larger diameter coils enable tools to be more easily accommodated within the coil’s homogeneous volume.
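The field at the center of an ideal Helmholtz pair follows a well-known closed form, B = (4/5)^(3/2) · μ0 · n · I / R. A quick sketch, using hypothetical coil parameters (not those of any specific product):

```python
import math

# Field at the midpoint of an ideal Helmholtz coil pair:
#   B = (4/5)^(3/2) * mu0 * n * I / R
# Coil parameters below are hypothetical, not those of any specific product.
mu0 = 4 * math.pi * 1e-7   # vacuum permeability, T*m/A
n = 100                    # turns per coil
I = 0.5                    # coil current, A
R = 0.5                    # coil radius, m

B = (4 / 5) ** 1.5 * mu0 * n * I / R   # center field in tesla
print(f"center field: {B * 1e6:.1f} uT")
```

With these example values the field is on the order of Earth’s own field (tens of microtesla), which is the regime relevant to magnetometer calibration.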

Use of an active compensation module does not remove the need for a clean environment in the immediate vicinity of the coils, but it does allow for DC measurements to be taken in a facility close to the storage and production facility.

The Fundamentals of Pressure Calibration

What is Pressure Calibration?

All measuring devices used in critical applications must be calibrated periodically to remain within tolerance of their manufacturer’s specifications. Calibration is the comparison of the pressure measurement of a device under test (DUT) to a reference standard. Calibration processes are defined in national and international standards for every measurement category and application. Pressure sensors can be found in instruments such as indicators and controllers and should all be calibrated in order to minimize the probability of erroneous readings. This page will serve as a guide for the basics of pressure calibration and will include how and why we calibrate, traceability and other topics vital to understanding the pressure calibration industry.

Pressure Calibration Terminology

Having a general knowledge of these pressure calibration terms and their definitions will help you understand the other concepts on this page. If you’re already familiar with measuring instrument specifications or calibration certificates, consider these a review or skip to the next section.

ACCURACY VS. UNCERTAINTY

Accuracy and uncertainty are two of the most common terms used to specify pressure measuring and controlling devices; however, they are often confused with each other. According to the International Vocabulary of Metrology (VIM), measurement uncertainty is defined as the “parameter associated with the result of a measurement that characterizes the dispersion of values that could reasonably be attributed to the measurand,” or a measure of the possible error in the estimated value as a result of the measurement. In day-to-day terms, it is the accumulation of all the systematic components that contribute toward the overall error in measurement. The typical components contributing toward an instrument’s measurement uncertainty are the defined uncertainty of the reference instrument, the effect of ambient conditions, the intrinsic uncertainty of the instrument itself, and the deviation recorded in measurement. Accuracy, on the other hand, is defined in the VIM as the “closeness of agreement between a measured quantity value and a true quantity value of a measurand.” Accuracy is a qualitative concept rather than a quantitative measurement. Manufacturers often use this term to represent the standard value of the maximum difference between measured and true values. So what does it really mean for pressure calibration?

With pressure as a measurand, the uncertainty of the instrument depends on the reference calibrator’s uncertainty; the linearity, hysteresis, and repeatability of measurements across measurement cycles; and the compensation for ambient conditions such as atmospheric pressure, temperature, and humidity. This is typically reported at a certain coverage factor. The coverage factor determines the probability density of the stated uncertainty using a numerical factor to derive the expanded uncertainty. The coverage factor is usually symbolized with the letter “k.” For example, k = 2 represents a 95% confidence level in reporting the expanded uncertainty, while k = 3 represents a 99% confidence level. It is typical in pressure calibration for expanded uncertainty to be reported with a confidence level of k = 2. Accuracy, being a qualitative concept, allows for more flexibility in interpretation and may lead to different definitions from different manufacturers. As accuracy is the overall representation of the closeness of values, it often encompasses the contributions of measurement uncertainty, long-term stability, and a guard band over an interval of time. The purpose of this term is to provide the user with an estimation of the overall worst-case specification of their instrument over the stated time interval. Learn more about how accuracy and uncertainty are used in measurement specifications.
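A rough sketch of how uncertainty components are combined in quadrature and then expanded with a coverage factor, in the style of the GUM. The component values below are invented for illustration:

```python
import math

# Combine standard uncertainty components in quadrature (root sum of squares),
# then expand with a coverage factor k. Component values are invented.
components = {
    "reference standard":  0.010,  # all values as standard uncertainties, %FS
    "ambient effects":     0.005,
    "device resolution":   0.002,
    "measurement scatter": 0.004,
}

combined = math.sqrt(sum(u ** 2 for u in components.values()))
k = 2                      # coverage factor, ~95% confidence level
expanded = k * combined

print(f"combined standard uncertainty: {combined:.4f} %FS")
print(f"expanded uncertainty (k={k}):  {expanded:.4f} %FS")
```

Note that quadrature combination assumes the components are independent; correlated components require the fuller treatment described in the GUM.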

PRECISION

Precision is defined by the VIM as “closeness of agreement between indications or measured quantity values obtained by replicate measurements on the same or similar objects under specified conditions.” Precision is a term that defines the nature of proximity an instrument’s measurement would have between the same measurement taken multiple times under the same conditions, such as ambient conditions, test setup and reference instrument used. 

 In pressure calibration, this is a term that plays a significant role when performing a measurement going upscale and downscale in pressure multiple times during the calibration. The error in the same measurement between these cycles determines precision. It is a specification that encompasses linearity, hysteresis, and repeatability of the measurement. Check out “Accuracy vs. Precision” for a simple, graphic representation of the difference between these two terms.

LINEARITY

In an ideal world, all measuring devices are linear, i.e. the deviation between true value and measured value throughout the range can be represented by a straight line. Unfortunately, this isn’t true. All measuring instruments have some level of nonlinearity associated with them. This implies that the deviation between the true value and measured value varies across the range. For pressure calibration, nonlinearity is measured by going upscale between various measuring points and comparing that to the true output.

Nonlinearity can be compensated in a few different ways, such as best fit straight line (BFSL), zero-based BFSL or multipoint linearization. Each method has its pros and cons. The best fit straight line (BFSL) method is defined by a straight line drawn to best represent the measuring points and their outputs across the range, minimizing the relative error between the actual curve and the line itself. This method is most commonly used in applications requiring low accuracy, where the nonlinearity bandwidth is relatively high. Zero-based BFSL is a derived form of the BFSL method in which the line passes through the zero or minimum point of the range, ensuring the offset of the zero point is mitigated. Multipoint linearization is the most thorough of the three. This method allows the line segments between multiple points in the range to be modified to come as close as possible to the actual calibration curve. This approach, although tedious, ensures the highest amount of correction toward nonlinearity.

Measuring points typically include the zero and span points, and then a multitude of different points can be selected within the range of the DUT. Find out if a one-point calibration would be sufficient for your needs.
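A minimal BFSL sketch, assuming hypothetical applied pressures and readings: fit an ordinary least-squares line, then report the worst deviation from that line as nonlinearity in percent of full scale:

```python
# Best-fit straight line (BFSL) nonlinearity estimate via ordinary least
# squares. Applied pressures and instrument outputs are hypothetical.
applied = [0.0, 25.0, 50.0, 75.0, 100.0]      # reference pressure applied
output  = [0.05, 25.30, 50.40, 75.25, 99.95]  # DUT indication

n = len(applied)
mean_x = sum(applied) / n
mean_y = sum(output) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(applied, output))
         / sum((x - mean_x) ** 2 for x in applied))
intercept = mean_y - slope * mean_x

# Nonlinearity: worst deviation of the data from the fitted line, in %FS.
full_scale = applied[-1]
nonlinearity = max(abs(y - (slope * x + intercept))
                   for x, y in zip(applied, output)) / full_scale * 100
print(f"BFSL fit: y = {slope:.4f}x + {intercept:.3f}")
print(f"nonlinearity: {nonlinearity:.2f} %FS")
```

A zero-based BFSL would instead constrain the intercept to zero before fitting the slope, and multipoint linearization would replace the single line with piecewise segments between the measuring points.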

HYSTERESIS

Hysteresis is the maximum difference in measurement at a specific point when the measurements are taken upscale to the same measurements taken downscale. For pressure calibration, hysteresis is measured at each pressure value being recorded by increasing the pressure to full scale and then releasing it down to minimum value. Different accreditation standards require different procedures to calculate the overall hysteresis. As an example, DKD-R-6 requires the upscale and downscale values to be recorded twice each and then an aggregate hysteresis value be derived for each pressure point. 
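The per-point comparison can be sketched as follows, with hypothetical upscale and downscale readings (accredited procedures such as the DKD-R-6 example above average multiple cycles rather than a single pass):

```python
# Hysteresis at each test point: |upscale reading - downscale reading|.
# All readings are hypothetical, from one up/down pressure cycle.
points    = [0, 25, 50, 75, 100]
upscale   = [0.00, 25.10, 50.20, 75.15, 100.05]
downscale = [0.05, 25.35, 50.45, 75.30, 100.05]

hysteresis = [abs(u - d) for u, d in zip(upscale, downscale)]
worst = max(hysteresis)
print(f"per-point hysteresis: {[round(h, 2) for h in hysteresis]}")
print(f"maximum hysteresis:   {worst:.2f}")
```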

REPEATABILITY

Measurement repeatability is the degree of closeness between repetitions of the same measurement taken with the same procedure, operators, system, and conditions over a short period of time. A typical example of repeatability is a comparison of a measurement output at one point in the range over a certain time interval while keeping all other conditions the same, including the approach and descent to the measuring point.
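One simple way to quantify this scatter is the sample standard deviation of repeated readings at a single point; the readings below are hypothetical:

```python
import statistics

# Repeatability at one measuring point: scatter of repeated readings taken
# under identical conditions. Readings are hypothetical.
readings = [50.02, 49.98, 50.01, 50.03, 49.99, 50.00]

repeatability = statistics.stdev(readings)   # sample standard deviation
spread = max(readings) - min(readings)       # simple range check
print(f"std dev: {repeatability:.4f}, range: {spread:.2f}")
```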

STABILITY VS. DRIFT

Stability is defined by the VIM as the “property of a measuring instrument, whereby its metrological properties remain constant in time.” It can be quantified as the duration of a time interval over which the stated property remains within a certain value. For calibration, stability is part of the overall accuracy definition of the instrument; it plays a crucial role in determining the calibration interval of the instrument. All pressure measuring devices drift over time from the day they were calibrated. Pressure calibration equipment manufacturers often specify stability as the drift of a specific measuring point, or of multiple points, in the range. For absolute pressure instruments, this is the zero point. Because a zero-point offset causes a vertical shift in the calibration curve, this point’s drift over time becomes the determining factor in maintaining the manufacturer’s specifications.

AS FOUND VS. AS LEFT DATA

These terms are usually found on calibration certificates when a device returns after being recalibrated. The simplest definition of as-found data is the data a calibration lab finds on a device it has just received, prior to making any adjustments or repairs. As-left data would be what the certificate shows once the calibration is complete and the device is ready to leave the lab.  For a complete description of as-found and as-left data and why it appears on your calibration certificates, read the blog “What Does As-Found and As-Left Data Mean in a Calibration?”

ADJUSTMENT

As the word suggests, adjustment describes performing some operation on the measuring system or measuring point so that it responds with a prescribed output to the corresponding measured value. In practice, adjustments are performed on specific measuring points for them to respond according to the stated manufacturer’s specifications. This is typically the minimum and maximum points in the range, i.e. zero adjustment and span adjustment. Adjustment is often carried out after an as-found calibration has highlighted the measuring points not meeting the desired specification. 

TAR VS. TUR

TAR stands for test accuracy ratio and TUR stands for test uncertainty ratio. Both represent the factor by which the DUT is worse in accuracy or uncertainty, respectively, than the reference standard used for its calibration. These ratios are regarded as the practical standard for selecting the optimal reference standard to calibrate the DUTs at hand.

Why Should You Calibrate?

The simple answer is that calibration ensures standardization and fosters safety and efficiency. If you need to know the pressure of a process or environmental condition, the sensor you are relying on for that information should be calibrated to ensure the pressure reading is correct, within the tolerance you deem acceptable. Otherwise, you cannot be certain the pressure reading accuracy is sufficient for your purpose.

A few examples might illustrate this better:

STANDARDIZATION IN PROCESSES

A petrochemical researcher has tested a process and determined the most desirable chemical reaction is highly dependent on the pressure of hydrogen gas during the reaction. Refineries use this accepted standard to make their product in the most efficient way. The hydrogen pressure is controlled within recommended limits using feedback from a calibrated pressure sensor. Refineries across the world use this recommended pressure in identical processes. Calibration ensures the pressure is accurate and the reaction conforms to standard practices.

STANDARDIZATION IN WEATHER FORECASTING AND CLIMATE STUDY

The barometric pressure is a key predictor of weather and a key data point in climate science. Pressure calibration and barometric pressure, standardized to mean sea level, ensures that the pressures recorded around the world are accurate and reliable for use in forecasting and in the analysis of weather systems and climate.

SAFETY

A vessel or pipe manufacturer provides standard working and burst pressures for their products. Exceeding these pressures in a process may cause damage or catastrophic failure. Calibrated pressure sensors are placed within these processes to ensure acceptable pressures are not exceeded. It is important to know these pressures sensors are accurate in order to ensure safety.

EFFICIENCY

Testing has proven a steam-electric generator is at its peak efficiency when the steam pressure at the outlet is at a specific level. Above or below this level, the efficiency drops dramatically. Efficiency, in this case, equates directly to bottom line profits. The tighter the pressure is held to the recommended pressure, the more efficient the generator runs and the most cost-effective output is assured. With a calibrated high accuracy pressure sensor, the pressure can be held within a tight tolerance to provide maximum efficiency and bottom-line revenue.

Discover even more ways calibrating your pressure instruments can help you by reading “10 Reasons to Calibrate Your Instruments.”

How Often Should You Calibrate?

The short answer is: as often as you think is necessary for the level of accuracy you need to maintain. All pressure sensors will eventually drift away from their calibrated output. Typically it is the zero point that drifts, which shifts the whole calibration curve up or down. There can also be a span drift component, which is a shift in the slope of the curve.

[Figure: zero drift (vertical shift of the curve) vs. span drift (change in slope)]

The amount of drift and how long it will take to drift outside of acceptable accuracy specification depends on the quality of the sensor. Most manufacturers of pressure measuring devices will give a calibration interval in their product datasheet. This tells the customer how long they can expect the calibration to remain within the accuracy specification. The calibration interval is usually stated in days or years and is typically anywhere from 90 to 365 days. This interval is determined through statistical analysis of products and usually represents a 95% confidence interval. This means that statistically, 95% of the units will meet their accuracy specification within the limits of the calibration interval. For example, Mensor’s calibration interval specification is given as 365 or 180 days, depending on the sensor chosen.

The customer can choose to shorten or lengthen the calibration interval once they take possession of the sensor and have calibration data that supports the decision. An as-found calibration performed at the end of the calibration interval will show whether the sensor is in or out of tolerance. If it is found to be in tolerance, it can be put back in service and checked again after another calibration interval. If it is out of tolerance, offsets can be applied to bring it back in tolerance; in this case, the next interval can be shortened to make sure it holds its accuracy. Successive as-found calibrations provide a history for each individual sensor, which can be used to adjust the calibration interval based on this data and the criticality of the application where the sensor is used.
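The interval-adjustment logic described above can be sketched as a simple rule. The shortening/lengthening factors and the cap below are illustrative assumptions, not a prescribed policy:

```python
def next_interval_days(current_days, as_found_in_tolerance,
                       shorten_factor=0.5, lengthen_factor=1.25, max_days=365):
    """Hypothetical rule: shorten the interval after an out-of-tolerance
    as-found result; cautiously lengthen it (up to a cap) after an
    in-tolerance result. Factors are assumptions for illustration."""
    if as_found_in_tolerance:
        return min(int(current_days * lengthen_factor), max_days)
    return max(int(current_days * shorten_factor), 1)

# e.g. a 180-day interval halves to 90 days after an out-of-tolerance result
print(next_interval_days(180, as_found_in_tolerance=False))
```

In practice the adjustment would also weigh the sensor's drift history and the criticality of the application, as the text notes.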

Read more about why and how often you should calibrate in “10 Reasons to Calibrate Your Instruments.”

Where is Pressure Calibration Performed?

Pressure calibrations can be performed in a laboratory environment, a test bench, or in the field. All that is needed to calibrate a pressure indicator, transmitter or transducer is a regulated pressure source, a pressure standard, a way to read the DUT, and the means to connect the DUT to the regulated pressure source and the pressure standard. Pressure rated tubing, fittings, and a manifold to isolate from the measured process pressure may be the only equipment necessary to perform the calibration. 

Calibration Lab Setup

Where to calibrate a pressure sensor is entirely up to the user of the device. If, however, there is a requirement for an accredited calibration to ISO/IEC 17025, General Requirements for the Competence of Testing and Calibration Laboratories, then either your organization must be accredited by an accreditation body or the calibration must be performed by an organization that is accredited. ISO/IEC 17025 accreditation ensures the organization conducting the calibration is deemed competent. The standard focuses on general, structural, resource, process, and management system requirements, ensuring results are based on accepted science and that the accredited organization is producing technically valid results. The Mensor calibration laboratory in San Marcos, Texas, is accredited by A2LA to the ISO/IEC 17025 standard. All devices manufactured here can be returned for calibration, and the lab is also capable of calibrating other pressure devices within our scope of accreditation. For more information about Mensor’s calibration lab and its capabilities, please see our Calibration Services brochure.

Instruments Used in Pressure Calibration

Deciding what instrument to use for calibrating pressure measuring devices depends on the accuracy of the DUT. For devices that aspire to the highest achievable accuracy, the reference standard used to calibrate them should also have the highest achievable accuracy.

The accuracy of DUTs can range widely, but for devices with accuracy specifications looser than 1–5%, calibration may not even be necessary. It is completely up to the application and the discretion of the user. Calibration may not be deemed necessary for devices that are used only as a visual “ballpark” indication and are not critical to any safety or process concern. These devices may be used as a visual estimate of the process pressures or limits being monitored. To calibrate or not is a decision left to the owner of the device.

More critical pressure measuring instruments may require periodic calibration because the application may require more precision in the process pressure being monitored or a tighter tolerance in a variable or a limit. In general, these process instruments might have an accuracy of 0.1 to 1.0% of full scale.

THE CALIBRATOR

Common sense says the device being used to calibrate another device should be more accurate than the device being calibrated. A long-standing rule of thumb in the calibration industry prescribes a 4 to 1 test uncertainty ratio (TUR) between the DUT accuracy and the reference standard accuracy. So, for instance, a 100 psi pressure transducer with an accuracy of 0.04% full scale (FS) would have to employ a reference standard with an accuracy of 0.01% FS for that range.
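The 4:1 rule of thumb translates directly into a minimum reference-standard accuracy; a trivial sketch (the function name and units are ours):

```python
def required_reference_accuracy(dut_accuracy_pct_fs, tur=4.0):
    """Minimum reference-standard accuracy (% FS) needed to maintain
    the given test uncertainty ratio (TUR) against the DUT."""
    return dut_accuracy_pct_fs / tur

# The example from the text: a 0.04% FS transducer needs a 0.01% FS reference
print(required_reference_accuracy(0.04))  # 0.01
```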

Knowing these basics will help determine the equipment that can deliver the accuracy necessary to achieve your calibration goals. There are several levels of calibration that may be encountered in a typical manufacturing or process facility, described below as laboratory, test bench, and field. In general, individual facility quality standards may define these differently.

LABORATORY

Laboratory primary standard devices have the highest level of accuracy and are the devices used to calibrate all other devices in your system. They could be deadweight testers, high-accuracy piston gauges, or pressure controllers/calibrators. The accuracy of these devices typically ranges from about 0.001% (10 ppm) of reading to 0.01% of full scale and should be traceable to the SI units. Their required accuracy is determined by what they are required to calibrate while maintaining a 4:1 TUR. Adherence to the 4:1 rule can be relaxed, but the deviation must be reported on the calibration certificate. These laboratory devices are typically used in a controlled environment subject to the requirements of ISO/IEC 17025, the guideline for general requirements for the competence of testing and calibration laboratories. Laboratory standards are typically the most expensive devices but are capable of calibrating a large range of lower accuracy devices.

TEST BENCH

Test bench devices are used outside of the laboratory or in an instrument shop, typically as a check or to calibrate pressure instruments taken from the field. They possess sufficient accuracy to calibrate lower accuracy field devices. These can be desktop units or panel-mount instruments such as controllers, indicators or even pressure transducers. These instruments are sometimes combined into a system that includes a vacuum and pressure source, an electrical measurement device and even a computer for indication and recording. The pressure transducers used in these instruments are periodically calibrated in the laboratory to certify their level of accuracy. To maintain an acceptable TUR with devices from the field, multiple ranges may be necessary, or devices with multiple and interchangeable internal transducer ranges. The accuracy of these devices is typically from 0.01% FS to 0.05% FS, and they are lower cost than the higher accuracy instruments used in the laboratory.

FIELD

Field instruments are designed for portable use and typically have limited internal pressure generation plus the capability to attach external higher pressure or vacuum sources. They may have multi-function capability for measuring pressure and electrical signals, data logging, built-in calibration procedures and programs to facilitate field calibration, plus certifications for use in hazardous areas. These multi-function instruments are designed to be self-contained, performing calibrations on site with minimal need for extraneous equipment. They typically have accuracy from 0.025% FS to 0.05% FS. Given their multi-function utility, these instruments are priced comparably to the instruments used on the bench and can also be utilized in a bench setting.

In general, what is used to calibrate your pressure instruments in your facility will be determined by your established quality and standard operating procedures. Starting from scratch will require an analysis of the cost given the range and accuracy of the pressure instruments that need to be calibrated.

How is Pressure Calibration Performed?

Understanding the process of performing a calibration can be intimidating even after you have all of the correct equipment to perform the calibration. The process can vary depending on calibration environment, device under test accuracy and the guideline followed to perform the calibration.

The calibration process consists of comparing the DUT reading to a standard’s reading and recording the error. Depending on specific pressure calibration requirements of the quality standards, one or more calibration points must be evaluated and an upscale and downscale process may be required. The test points can be at the zero and span or any combination of points in between. The standard must be more accurate than the DUT. The rule of thumb is that it should be four times more accurate but individual requirements may vary from this.

Depending on the choice of pressure standard, the process will involve manual, semi-automatic or fully automatic recording of pressure readings. The pressure is cycled upscale and/or downscale to the desired pressure points in the range, and the readings from both the pressure standard and the DUT are recorded. These recordings are then reported in a calibration certificate to note the deviation of the DUT from the standard.

As mentioned, different guidelines detail the process of calibration differently. Below are some of the standards that highlight such differences when calibrating pressure transducers or gauges:

IEC 61298-2 defines the process for “Process measurement and control devices.” The section on “Test procedures and precautions” defines the number of exercise cycles, the number of measurement cycles and the test points required.

DKD-R 6-1, “Calibration of Pressure Gauges,” defines different processes for different accuracy classes of devices. It also defines exercise cycles, the number of cycles and points, and minimum hold times before taking a measurement.

EURAMET Calibration Guide No. 17 has basic, standard and comprehensive calibration procedures depending on the uncertainty of the device being calibrated. It requires additional information such as the standard deviation of the device’s output at each pressure point.

Keep in mind specific industries may require their own calibration processes.
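The comparison step can be illustrated with a short sketch that computes DUT deviations in % of full scale for a set of readings. The readings, full scale and tolerance below are invented for the example:

```python
FULL_SCALE = 100.0        # psi, hypothetical 100 psi transducer
TOLERANCE_PCT_FS = 0.05   # acceptable error in % of full scale (assumed)

# (reference standard reading, DUT reading) pairs, upscale then downscale
readings = [(0.0, 0.01), (50.0, 50.03), (100.0, 100.04),
            (50.0, 49.98), (0.0, -0.02)]

def error_pct_fs(ref, dut, fs=FULL_SCALE):
    """Deviation of the DUT from the standard, expressed in % of full scale."""
    return (dut - ref) / fs * 100.0

errors = [error_pct_fs(ref, dut) for ref, dut in readings]
in_tolerance = all(abs(e) <= TOLERANCE_PCT_FS for e in errors)
print(errors, in_tolerance)
```

A calibration certificate would report these deviations point by point, as the text describes.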

Calibration Traceability and Pressure Standards

CALIBRATION TRACEABILITY

A traceable calibration is a calibration in which the measurement is traceable to the International System of Units (SI) through an unbroken chain of comparable measurements to a National Metrology Institute (NMI).  This type of calibration does not indicate or determine the level of competence of the staff and laboratory that performs the calibrations. It mainly identifies that the standard used in the calibration is traceable to an NMI. NMIs demonstrate the international equivalence of their measurement standards and the calibration and measurement certificates they issue through the framework of CIPM Mutual Recognition Arrangement (CIPM MRA).

PRIMARY VS. SECONDARY PRESSURE STANDARDS

There seems to be a good deal of confusion on the difference between primary and secondary standards, mainly because of a lack of technical distinction between the two. Pressure is a unit derived from fundamental SI units so any pressure device could never be a primary standard. The lowest uncertainty pressure devices available are considered fundamental pressure standards and are typically ultrasonic interference manometers and piston gauges. They are often referred to as primary standards even though technically, they are not. The term primary standard is also sometimes used when referring to the most accurate pressure standard within a facility. In most cases, these are traceable to the best fundamental pressure devices at NMIs. The instruments at these institutes are also called primary standards and are probably more deserving of the title because they are at the pinnacle of accuracy in the chain of traceability.   To further complicate the issue, calibration laboratories frequently call their lowest uncertainty pressure devices their primary standards. Secondary standards are devices either calibrated by, or traceable to, the aforementioned primary standards, or even other secondary standards. Find out more about traceability and the hierarchy of bureaus and institutes in the unbroken chain linking to SI.

Accredited Calibrations

A calibration laboratory is accredited when it is found to be in compliance with ISO/IEC 17025, which outlines the general requirements for the competence of testing and calibration laboratories. Accreditation is awarded through an accreditation body that is an ILAC-MRA signatory organization. These accreditation bodies audit the laboratory and its processes to determine that the laboratory is competent to perform calibrations and to issue its calibration results as accredited. Accreditation recognizes a lab’s competence in calibration and assures customers that calibrations performed under the scope of accreditation conform to international standards.

The laboratory is audited periodically to ensure continued compliance with the ISO/IEC 17025 standard.

Check out this article for a more detailed look at the differences between NIST traceable and ISO/IEC 17025 accredited calibrations, including a checklist for how to achieve them.

Factors Affecting Pressure Calibration and Corrections

There are several corrections, ranging from simple to complex, which may need to be applied during the calibration of a device under test (DUT).

HEAD HEIGHT

If the reference standard is a pressure controller, the only correction that may need to be applied is what is referred to as a head height correction. The head height correction can be calculated using the following formula:

(ρf – ρa)gh

Where ρf is the density of the pressure medium (kg/m3), ρa is the density of the ambient air (kg/m3), g is the local gravity (m/s2) and h is the difference in height (m). Typically, if the DUT is below the reference level, the value will be negative, and vice versa if the DUT is above the reference level. Depending on the accuracy and resolution of the DUT, a head height correction must be calculated regardless of the pressure medium. Mensor controllers allow the user to input a head height, and the instrument will calculate the head height correction.
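The head height correction is a one-line calculation; the densities and height below are hypothetical example values:

```python
def head_height_correction(rho_fluid, rho_air, g, h):
    """Head height correction (ρf − ρa)·g·h, in Pa when densities are
    in kg/m3, g in m/s2 and the height difference h in m."""
    return (rho_fluid - rho_air) * g * h

# hypothetical: oil-filled line (ρf ≈ 860 kg/m3), DUT 0.5 m from reference
corr = head_height_correction(rho_fluid=860.0, rho_air=1.2, g=9.80665, h=0.5)
print(corr)  # roughly 4.2 kPa
```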

SEA LEVEL

Another potentially confusing correction is what is referred to as a sea level correction. This is most important for absolute ranges, particularly barometric pressure ranges. Simply put, this correction provides a common barometric reference regardless of elevation. It makes it easier for meteorologists to monitor weather fronts, as all of the barometers are referenced to sea level. For an absolute sensor, as the sensor increases in altitude, its reading will approach absolute zero, as expected. This can become problematic for a barometric range sensor: when vented to atmosphere at elevation, the uncorrected reading will no longer be ~14.5 psi. In Denver, Colorado, for example, the uncorrected (station) pressure may read ~12.0 psi, yet the reported barometric pressure will be closer to ~14.5 psi. This is because a sea level correction is applied to the barometric sensor. The sea level pressure can be calculated using the following formula:

Station Pressure / e^(–elevation / (T × 29.263))

Where Station Pressure is the current, uncorrected barometric reading (in inHg@0˚C), elevation is the current elevation (meters) and T is the current temperature (Kelvin).
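A small sketch of the sea level correction, using Denver-like example values (elevation, temperature and station pressure here are assumptions for illustration):

```python
import math

def sea_level_pressure(station_pressure_inhg, elevation_m, temp_k):
    """Sea level pressure per the formula above:
    station pressure / e^(-elevation / (T * 29.263)),
    with station pressure in inHg@0C, elevation in m, T in K."""
    return station_pressure_inhg / math.exp(-elevation_m / (temp_k * 29.263))

# hypothetical Denver-like station: ~1609 m elevation, 288 K, 24.6 inHg
slp = sea_level_pressure(24.6, 1609.0, 288.0)
print(slp)  # close to the ~29.9 inHg (~14.5 psi) one expects at sea level
```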

For everyday users of pressure controllers or gauges, these may be the only corrections encountered. The following corrections apply mainly to piston gauges, and the necessity to perform them depends on the desired target specification and associated uncertainty.

TEMPERATURE

Another source of error in pressure calibrations is a change in temperature. While all Mensor sensors are compensated over a temperature range during manufacturing, this becomes particularly important for reference standards such as piston gauges, where the temperature must be monitored. Piston-cylinder systems, regardless of composition (steel, tungsten carbide, etc.), must be compensated for temperature during use, as all materials either expand or contract with changes in temperature. The thermal expansion correction can be calculated using the following formula:

1 + (αp + αc)(T – TREF)

Where αP is the thermal expansion coefficient of the piston (1/˚C) and αC is the thermal expansion coefficient of the cylinder (1/˚C), T is the current piston-cylinder temperature (˚C) and TREF is the reference temperature (typically 20˚C).

As the temperature of the piston cylinder increases, the piston-cylinder system expands, causing the area to increase, which causes the pressure generated to decrease. Conversely, as the temperature decreases, the piston-cylinder system contracts, causing the area to decrease, which causes the pressure generated to increase. This correction will be applied directly to the area of the piston and errors will exceed 0.01% of the indicated value if uncorrected. The thermal expansion coefficients for the piston and cylinder are typically provided by the manufacturer, but they can be experimentally determined.
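The expansion factor multiplies the effective piston area; a quick sketch with hypothetical tungsten carbide coefficients (values assumed for illustration):

```python
def thermal_expansion_factor(alpha_p, alpha_c, temp_c, t_ref=20.0):
    """1 + (αp + αc)(T − TREF): factor applied to the effective piston
    area; coefficients in 1/degC, temperatures in degC."""
    return 1.0 + (alpha_p + alpha_c) * (temp_c - t_ref)

# hypothetical coefficients ~4.5e-6 /degC for piston and cylinder, at 23 degC
factor = thermal_expansion_factor(4.5e-6, 4.5e-6, 23.0)
print(factor)  # area grows slightly, so generated pressure drops slightly
```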

DISTORTION

A similar correction that must be made to piston-cylinder systems is referred to as a distortion correction. As the pressure increases on the piston-cylinder system, it will cause the piston area to increase, causing it to effectively generate less pressure. The distortion correction can be calculated using the following formula:

 1 + λP

Where λ is the distortion coefficient (1/Pa) and P is the calculated, or target, pressure (Pa). With increasing pressure, the piston area increases, generating less pressure than expected. The distortion coefficient is typically provided by the manufacturer, but it can be experimentally determined.

SURFACE TENSION

A surface tension correction must also be made with oil-lubricated piston-cylinder systems as the surface tension of the fluid must be overcome to “free” the piston. Essentially, this causes an additional “phantom” mass load, depending on the diameter of the piston. The effect is larger on larger diameter pistons and smaller on smaller diameter pistons. The surface tension correction can be calculated using the following formula:

πDT

Where D is the diameter of the piston (meters) and T is the surface tension of the fluid (N/m). This correction is more important at lower pressures, as its relative effect becomes smaller with increasing pressure.

AIR BUOYANCY

One of the most important corrections that must be made to piston-cylinder systems is air buoyancy.

As introduced in the head height correction, the air surrounding us generates pressure; think of it as a column of air. At the same time, it also exerts an upward force on objects, much like a stone in water weighs less than it does on dry land. This is because the water exerts an upward force on the stone, causing it to weigh less. The air around us does exactly the same thing. If this correction is not applied, it can cause an error as high as 0.015% of the indicated value. Any mass, including the piston, needs what is referred to as an air buoyancy correction. The following formula can be used to calculate the air buoyancy correction:

1 – ρa/ρm

Where ρa is the density of the air (kg/m3) and ρm is the density of the masses (kg/m3). This correction is only necessary with gauge calibrations and absolute by atmosphere calibrations. It is negligible for absolute by vacuum calibrations as the ambient air is essentially removed.

LOCAL GRAVITY

The final correction and arguably the largest contributor to errors, especially in piston-gauge systems, is a correction for local gravity. Earth’s gravity varies across its entire surface, with the lowest acceleration due to gravity being approximately 9.7639 m/s2 and the highest acceleration due to gravity being approximately 9.8337 m/s2. During the pressure calculation for a piston gauge, the local gravity may be used and a gravity correction may not need to be applied. However, many industrial deadweight testers are calibrated to standard gravity (9.80665 m/s2) and must be corrected. Were an industrial deadweight tester calibrated at standard gravity and then taken to the location with the lowest acceleration due to gravity, an error greater than 0.4% of the indicated value would be experienced. The following formula can be used to calculate the correction due to gravity:

gl/gs

Where gl is the local gravity (m/s2) and gs is the standard gravity (m/s2).

The simple formula for pressure is as follows:

P = F / A = mg / A

This is likely the fundamental formula most people think of when they hear the word “pressure.” As we dive deeper into the world of precision pressure measurement, we learn that this formula simply isn’t thorough enough. The formula that incorporates all of these corrections (for gauge pressure) is as follows:

P = F / A = (mg(1 – ρa/ρm) + πDT) / (Ae(1 + (αp + αc)(T – TREF))(1 + λP)) + (ρf – ρa)gh
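Putting the corrections from the sections above together, a minimal sketch of the fully corrected gauge-pressure calculation (parameter names are ours; the distortion term uses the nominal/target pressure, a common practical simplification):

```python
import math

def corrected_gauge_pressure(m, g_local, area_eff, rho_air, rho_mass,
                             piston_d, surf_tension, alpha_p, alpha_c,
                             temp_c, lam, p_nominal, rho_fluid, head_h,
                             t_ref=20.0):
    """Gauge pressure with buoyancy, surface tension, thermal expansion,
    distortion and head-height corrections, per the combined formula."""
    # Force: buoyancy-corrected mass load plus the surface-tension term piDT
    force = (m * g_local * (1.0 - rho_air / rho_mass)
             + math.pi * piston_d * surf_tension)
    # Effective area with thermal expansion and distortion factors
    area = (area_eff * (1.0 + (alpha_p + alpha_c) * (temp_c - t_ref))
            * (1.0 + lam * p_nominal))
    # Head-height term (rho_f - rho_a)*g*h added last
    return force / area + (rho_fluid - rho_air) * g_local * head_h

# With every correction zeroed out, this reduces to the simple P = mg / A
p = corrected_gauge_pressure(1.0, 9.80665, 1e-4, 0.0, 8000.0, 0.0, 0.0,
                             0.0, 0.0, 20.0, 0.0, 0.0, 0.0, 0.0)
print(p)  # 98066.5 Pa for 1 kg on a 1 cm2 piston at standard gravity
```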


OSS Middle East provides all fields with quality services and management systems. To ask for a price list or a free consultation for any of the services we provide, contact us now!


OSS Middle East Company:

Aims to help organizations in all sectors in Egypt and the Middle East apply international standards in quality management systems in all fields.


OSS is accredited by:

Egyptian Organization for Standardization & Quality (EOS)

OSS is registered with many Egyptian organizations:

EGPC – Egyptian General Petroleum Corporation

Industrial Modernization Centre (IMC)
