Often there's no need to send indicators, micrometers, calipers, etc. to gage labs in order to have them calibrated. Gage blocks and standards, on the other hand, must be sent to a lab which specializes in this procedure.
Your gages, instead, can be calibrated in your own shop, and in fact should be. Ideally, every gage should be calibrated before every use; only then can you be sure your readings are accurate. Even to comply with various ISO requirements, all you need to do is label each tool with a serial number and then keep written records of when, where and how often you calibrate it.
In most cases, all you'll need is a set of gage blocks which have been certified. Even inexpensive gage blocks can be used for routine calibration.
Industry standards suggest that annual calibration is sufficient for compliance purposes. Consider the implications of this carefully: you may damage a gage after just one use and then spend the rest of the year using an out-of-calibration gage.
Calibration needs to be matched to how often the gage is used. A gage used once a month can easily be calibrated just once a year; a gage used hourly should probably be calibrated monthly.
Gages in harsh environments need more attention than those used in a clean-room.
Your quality and production team will have to make the call.
A good way to find the right interval is to start with an arbitrary calibration cycle. If everything passes, you can lengthen the cycle, but only up to the point where you start to see inaccuracies.
If you're not inclined to calibrate before every use (no one really is), then standard procedure is to calibrate every few months, depending on use and wear. If your gage is in constant use, choose a more frequent interval.
If you need to follow specific military or industry standards, then you must obtain those standards and, by all means, do as they say. Neither the military nor industry always uses common sense in these matters. (Sorry, we cannot provide those details. Please consult the National Institute of Standards and Technology [NIST] or an accredited calibration laboratory.)
In theory, if an optical flat is in perfect contact with a perfectly flat anvil, then no light bands will be visible. Make sure the optical flat and anvil are clean. Attempts to wring the optical flat will only scratch the glass. Get a good fit by gently squeezing the optical flat onto the surface. When finished, lift off without sliding. If the light bands (rainbows) you see are evenly spaced and in straight lines, then your surface is flat.
Ideally you'd use monochromatic light, and if you're doing calibration full time it's probably a good idea to invest in such a bulb. Regular room lighting works fine for us. When looking through the optical flat, look straight down; avoid looking at an angle. Check this for yourself and you'll see that the image changes dramatically as you increase the viewing angle.
If you see many light bands, then press the optical flat a little harder. You'll probably see fewer bands and that makes it easier to interpret the results.
The degree to which the light bands arch can be used to calculate the flatness. The ideal micrometer anvil is flat to .000012".
What this means is that, when you're looking at the arches, the top of one arch just touches the bottom of the next arch. See how the imaginary line indicates the bottom of the next arch in the image on the left?
one light band (left) vs. two light bands (right)
At this point you have a flatness error of one light band, or .000012". This is exactly what you want. If, on the other hand, the bottom of the light band touches the top of the second arch over, as in the image on the right, then you have a flatness error of 2 light bands, or .000024" and it's high time to have your anvils lapped.
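If you'd rather double-check the arithmetic than trust your eye, here's a minimal sketch in Python. The .000012" value per band is the rule of thumb from above; everything else is just an example.

```python
# Rough flatness estimate from an optical-flat reading.
# Rule of thumb from the text above: one light band is roughly 0.000012" of flatness error.
BAND_VALUE_IN = 0.000012

def flatness_error(band_count):
    """Approximate flatness error, in inches, for a given number of light bands."""
    return band_count * BAND_VALUE_IN

print(f'{flatness_error(1):.6f}"')  # 0.000012" -- one band, the ideal anvil
print(f'{flatness_error(2):.6f}"')  # 0.000024" -- two bands, time to have the anvils lapped
```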
Performing this test on each anvil determines the flatness of each one; but by using a flat which has parallel sides (an optical parallel), you can close the micrometer anvils onto it and also determine the degree of parallelism between them.
Mitutoyo suggests this procedure: wring the optical parallel to the micrometer anvil (the stationary part of the micrometer) so that only one interference fringe (light band) shows. Now close the micrometer spindle onto the parallel. This should occur exactly at .500" when using the optical parallels in our calibration kit. At this point count the number of fringes (light bands) on the spindle by looking through the optical parallel from the other side. Then apply this formula:
number of fringes on the spindle side × 0.32 µm = parallelism of the anvils with the spindle in that position
You may want to convert this metric result to inches using a scientific calculator or your own gray matter; a result of about 1 µm, for example, works out to roughly .00004". Since the optical parallels themselves are only parallel to .00002", you'll have to take this possible deviation into account: the result would be expressed as .00004" ± .00002".
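If you'd rather let the computer do the conversion, here's the same arithmetic as a quick sketch. The 0.32 µm-per-fringe figure and the .00002" parallel tolerance come from the text above; the fringe count is just an example.

```python
# Parallelism from a fringe count, per the formula above.
FRINGE_VALUE_UM = 0.32            # one fringe seen on the spindle side ~ 0.32 micrometers
UM_TO_INCH = 1 / 25400            # 1 micrometer = 1/25400 inch
PARALLEL_TOL_IN = 0.00002         # the optical parallels themselves are only parallel to this

fringes = 3                       # example count only -- use whatever you actually see

parallelism_um = fringes * FRINGE_VALUE_UM
parallelism_in = parallelism_um * UM_TO_INCH

print(f'{parallelism_um:.2f} um, about {parallelism_in:.5f}" +/- {PARALLEL_TOL_IN:.5f}"')
# 0.96 um, about 0.00004" +/- 0.00002"
```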
It's important to take note of the phrase with the spindle in that position. If you rotate the spindle a bit, the surfaces may no longer be parallel. For that reason the calibration set shown above includes two parallels. Perform the same test using the second parallel; now your reading will occur at .5125" instead of .500", which puts the spindle at 180° from the first reading. If the anvils are still parallel, then you're set to go. If the anvils are now out of parallel, then the spindle isn't running true and we're in trouble. A real stickler for details would even use four optical parallels to measure every 90 degrees, but for our purposes that may be going a bit too far. A qualified calibration lab can perform that procedure for you if the need arises.
A somewhat easier method for checking parallelism requires the use of a gage ball. Any diameter under 1" will do. Close the micrometer onto the gage ball and take the reading. Do this in 5 different places on the surface of the anvils. If the anvils are parallel, then the readings will all be the same. It proves parallelism but doesn't actually give you a numeric value. This method can also locate high spots or dips on the anvil surface, which should lead you to have them serviced and lapped.
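If you want a quick way to judge those five readings, a sketch like this will do; the readings and the pass/fail limit are made-up numbers, purely for illustration.

```python
# Spread of five gage-ball readings taken at different spots on the anvil faces.
readings = [0.50000, 0.50000, 0.50005, 0.50000, 0.49995]  # hypothetical readings, inches
spread = max(readings) - min(readings)
print(f'spread: {spread:.5f}"')  # 0.00010" -- anything beyond your tolerance means the anvils need lapping
```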
Using gage balls as described above will work fine. You can use a gage ball larger than 1" for this purpose, or you can use a gage ball in conjunction with a certified gage block, although this will be a tricky procedure if you're normally "all thumbs."
You can use optical parallels instead, that is, optical flats with parallel sides. The ones in your micrometer checking set are parallel to .00002", but they are only good for checking parallelism on 0-1" range micrometers. Larger ranges need larger and very expensive optical parallels (upwards of $1000), so it will be more cost effective to have a calibration lab check these for you. Without investing in more expensive equipment, you may have to resort to the gage ball technique mentioned above.
When the spindle is screwed closed on a 0-1" micrometer, the reading should be zero. Use the spindle ratchet, if your micrometer has one, to obtain a light, even pressure. If the zero is slightly off, it can be adjusted by turning the barrel into position; a special wrench is usually provided for this procedure. On micrometers with ranges above 1", insert a gage block or micrometer standard equal to the lower value of the micrometer's range and set the zero as above.
Blade micrometers can be calibrated with square or rectangular gage blocks when the blades are still flat and parallel. After much use, these blades will develop worn areas, particularly if you measure slots or grooves on cylindrical parts. In this case, use a pin gage to establish the correct readings on the good portion of the blade. The pin gage you use must be of a diameter which is smaller than the curvature of the worn area. Then take a measurement with the same pin gage in the worn area. (To find the worn areas, close the blades and hold them up to a light. You'll see the spots where the blades no longer meet.) The mathematical difference between the two readings will give you a correction factor which you add to any readings you now take in that worn area. Repeat this calibration often, because the correction factor will change with additional wear. When the damage is out of control, then you can return the blade mike to us for grinding and lapping. The blades will be returned to factory specs. If the blades chip or break, they can also be replaced.
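As a worked example of that correction factor, here's a short sketch; all of the numbers are hypothetical and only show the arithmetic.

```python
# Correction factor for a worn spot on a blade micrometer, using a single pin gage.
reading_good_area = 0.12500   # pin measured on the unworn part of the blades (hypothetical)
reading_worn_area = 0.12470   # same pin measured in the worn area (hypothetical)

correction = reading_good_area - reading_worn_area
print(f'correction factor: +{correction:.5f}"')   # add this to readings taken in the worn area

raw_reading = 0.25010         # a later measurement made in the worn area
print(f'corrected reading: {raw_reading + correction:.5f}"')  # 0.25040"
```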
Micrometers with one or two spherical anvils are designed for measuring thickness of tubular walls. When the anvils are closed, the micrometer must read zero. Zero setting is done just like any other outside micrometer. Similarly, gage blocks or micrometer standards are used to verify accuracy. Refer to the information outlined in the section for outside micrometers. Parallelism is not an issue with these micrometers.
If you have installed a "snap-on" spherical ball on your standard micrometer anvil, then you must take the diameter of this addition into consideration, not only when calibrating but also when measuring. It stands to reason that with an 8 mm diameter ball attached to the face of the anvil, the micrometer will read 8 mm when closed.
Calibration labels can be as fancy or as simple as you like; there's no great mystery about them. Each gage, whether it be a dial caliper, a test indicator or a micrometer measuring standard, needs to have a unique serial or ID number assigned to it. Most gages have these inscribed by the manufacturer. If you have a great many tools in your arsenal, you might want to create your own set of ID numbers to help you track them down. Scratch the numbers into the tool or use a permanent marker.
Avoid electro-engraving any tool with internal gears. The sparks from the process can cause tiny pinions to weld together. Don't electro-engrave anything with electronic components like digital calipers.
You'll want to set up a calibration sheet for each of these instruments. The sheet lists the gage's ID number for quick identification, along with its location in your shop. On it you'll write down when the gage was calibrated, by whom, and when the next calibration is due. Then you'll attach the calibration certificate and keep them all in a binder, in a safe place.
If you send the gages out for calibration, you'll receive a certificate. If you calibrate these yourself, then you'll create your own certificate. On the certificate you'll note the instrument's ID number, a brief description of the tool and its graduations and range. Then you'll write down the test data which you obtained during calibration. Make note of the equipment which you used to make this calibration. It might be a gage block set, for instance. Date and initial, and you're done.
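If you prefer keeping records in plain text rather than a binder, something along these lines captures the same information. The field names and values here are only an example, not a prescribed format.

```python
# One calibration record, mirroring the certificate described above (all values hypothetical).
record = {
    "id": "MIC-014",                                  # ID marked on the gage itself
    "description": '0-1" outside micrometer, .0001" graduations',
    "calibrated_on": "2024-03-01",
    "calibrated_by": "J.S.",
    "next_due": "2024-09-01",
    "reference_equipment": "certified gage block set",
    "readings": {"0.100": 0.1000, "0.500": 0.5001, "1.000": 1.0000},  # nominal -> observed
}
print(record["id"], "next due", record["next_due"])
```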
If you have many instruments to keep track of, then you'll probably want some software designed to automate this process, but we're more concerned with the small scale do-it-yourself approach here. We'll leave the software to someone else.
Now comes the label. This little sticker fits onto most tools and you'll simply write the tool's ID number on it, the date you had it calibrated and the date for the next calibration. Labels are often initialed by the person performing the calibration. If the sticker doesn't fit on the tool, or if the labels tend to peel off because of coolants or oils, then put the label on the tool's box. When you do that, make sure that an ID number on the instrument is linked to the ID number on the label and always put the gage back in its proper case, otherwise you'll mess things up.
And where do you get these labels? You could use pieces of masking tape, or an address label cut down to size and simply pen the information onto it. You could invest in fancy label printers that make really impressive calibration stickers with very, very tiny print.
Dial bore gage calibration poses unique problems. The gage is designed to be used as a comparator and not for direct reading. As a result, accuracy over its full range is immaterial. What is important is repeatability and accuracy over the short range, usually just a few graduations. The indicator reading will need to be accurate to ± one graduation per revolution.
The proper way to calibrate a bore gage is to use certified setting rings or a bore gage calibrator as sold by Mitutoyo, for example.
Set the bore gage to zero at the center of its range using a setting ring (for instance at 0.500"). Then check with a setting ring at either end of the range (for instance 0.490" and 0.510"), or possibly at some points in between. This is really the only way to verify the gage's accuracy: by using it the way it was designed to be used.
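Here's a minimal sketch of how you might tabulate that check. The ring sizes come from the example above; the graduation value and the observed readings are made up for illustration.

```python
# Checking a bore gage against setting rings after zeroing on the 0.500" ring.
ZERO_RING = 0.500
GRADUATION = 0.0001   # assumed dial graduation; yours may differ

# ring size -> deviation shown on the dial (hypothetical observations)
observed = {0.490: -0.0100, 0.500: 0.0000, 0.510: 0.0101}

for ring, reading in observed.items():
    error = reading - (ring - ZERO_RING)
    print(f'ring {ring:.3f}": error {error:+.4f}" ({error / GRADUATION:+.1f} graduations)')
```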
A setting master can be used to preset your bore gage but should not be used for calibration since other measuring variables come into play.
As a final and important test, check for repeatability against a ring gage, setting master or a micrometer. You can also use the micrometer to see if you'll still get ± one graduation per revolution, more or less. Since the micrometer is accurate to .0001" at best, you can only use this for rough calibration.
If you think about how the gage is used, you'll see that accuracy of travel over the full range has no effect on the accuracy of the comparison.
A Chinese manufacturer of indicators and micrometers by the name of Tresna (Guilin, Guangxi Province) has chosen to publish some of the contents of this page on their own website. Of course, they didn't ask for permission and apparently have no intention of respecting copyright law. We should be flattered, but they could at least have credited us as the authors.