This unit of the Metrology Fundamentals series was developed by the Mitutoyo Institute of Metrology, the educational department within Mitutoyo America Corporation. The Mitutoyo Institute of Metrology provides educational courses, on-demand training videos, and other resources across a wide variety of measurement-related topics including basic inspection techniques, principles of dimensional metrology, calibration methods, and GD&T. For more information on the educational opportunities available from Mitutoyo America Corporation, visit us at www.mitutoyo.com/education.
Let us start with an unfortunate yet common example of the costly confusion in calibration practice. Say you own a coordinate measuring machine (CMM) and hire somebody to “calibrate the CMM.” What do you think they will do? You select an accredited laboratory, and their calibration method follows the ISO standard for CMMs. Good so far. They show up, make some measurements, put a new calibration label on your CMM, and issue a calibration certificate. You feel good and file the certificate. Six months later, a quality auditor shows up and happens to review the CMM calibration certificate. They notice that there are no tolerances on the certificate and no statements of conformity. You start getting a bit nervous and go find the specifications of the CMM in some old records. As you compare the values on the certificate with the specifications of the CMM, you realize that the CMM is significantly out of tolerance and has been used for months to check critical parts. You have just exposed your organization to huge risks. Can you blame the person who calibrated your CMM? Did they do a proper calibration? Doesn’t calibration include comparison to tolerance? Doesn’t calibration include adjustment to within specifications? Unfortunately, the answers to these questions are not straightforward or simple.
What is Calibration?
Requirements for calibration appear in most national and international quality standards, and most organizations recognize that measuring equipment must be calibrated. However, there is much confusion regarding the definition of calibration, and this confusion increases quality-related risks to organizations. The purpose of this technical bulletin is to sort out the different concepts attached to the term calibration, including the official definition and those in common use.
The Official Definition
There is an official dictionary of measurement terms called the VIM, the International Vocabulary of Metrology. The VIM, also known as ISO/IEC Guide 99:2007 or JCGM 200:2012, contains terms and definitions that are accepted and utilized at a very high level across national and international standards organizations, government agencies, and regulation bodies. The VIM defines the term calibration as:
“operation that, under specified conditions, in a first step, establishes a relation between the quantity values with measurement uncertainties provided by measurement standards and corresponding indications with associated measurement uncertainties and, in a second step, uses this information to establish a relation for obtaining a measurement result from an indication”

This definition is a bit awkward, but we can see that there are two important “steps” in calibration to understand.
The first step is about relating measured values from your measuring equipment to those from calibrated measurement standards.
This is the generally understood critical connection between calibration and traceability. Calibration enables the units of measurement on your equipment, like the inch or meter, to be traced back to some official reference (like NIST in the U.S.).
The second step in the definition of calibration is what causes so much trouble, but it is just as important, as it clarifies that the information gained in the first step is only as good as how it is used. In other words, the way calibration information is put to use effectively defines the calibration, and therefore we need to explore the ways that calibration information is used.
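The two steps of the VIM definition can be made concrete with a small numerical sketch. The reference and indication values below are invented for illustration, and a simple linear fit stands in for whatever relation a real laboratory would establish; this is not a prescribed calibration procedure, only a way to see step one (establishing the relation between standard values and indications) feed into step two (using that relation to obtain a measurement result).

```python
# Step 1 (illustrative): relate the instrument's indications to the known
# values of calibrated measurement standards, e.g. gauge blocks.
# All values are hypothetical, in millimeters.
reference = [10.0000, 20.0000, 30.0000]    # values provided by the standards
indication = [10.0021, 20.0043, 30.0068]   # what the instrument displayed

# Fit a simple linear relation indication -> reference (ordinary least squares).
n = len(reference)
mean_x = sum(indication) / n
mean_y = sum(reference) / n
slope = (
    sum((x - mean_x) * (y - mean_y) for x, y in zip(indication, reference))
    / sum((x - mean_x) ** 2 for x in indication)
)
offset = mean_y - slope * mean_x

# Step 2: use the relation from step 1 to obtain a measurement result
# from a new indication on this instrument.
def corrected(x):
    """Convert a raw indication into a corrected measurement result."""
    return slope * x + offset
```

Note that if the relation from step one is filed away and never applied (or never compared against a tolerance), step two never happens, which is exactly the failure mode in the opening CMM story.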