
Calibration of gages

 


 

Often there's no need to send indicators, micrometers, calipers, etc. to gage labs in order to have them calibrated. Gage blocks and standards, on the other hand, must be sent to a lab which specializes in this procedure.

Your gages, instead, can be calibrated in your own shop, and in fact should be calibrated in your own shop. Ideally, every gage should be calibrated before every use; only in this way can you be sure that your readings are accurate. Even to comply with various ISO requirements, all you need to do is label each tool with a serial number and then keep written records of when, where and how often you calibrate it.

In most cases, all you'll need is a set of gage blocks which have been certified. Even inexpensive gage blocks can be used for routine calibration.

Industry standards imply that annual calibration is sufficient for compliance purposes. Consider the implications of this carefully: you may damage a gage after just one use and then spend the rest of the year measuring with an out-of-calibration gage.

Calibration needs to be matched to how frequently the gage is used. A gage used once a month can easily be calibrated just once a year; a gage used hourly should probably be on a one-month cycle.

Gages in harsh environments need more attention than those used in a clean-room.

Your quality and production team will have to make the call.

A good way to find the right interval is to start with an arbitrary calibration cycle. If everything passes, you can lengthen the cycle gradually, stopping just short of the point where you start to see inaccuracies.

If you're not inclined to calibrate before every use (no one really is), then standard procedure is to calibrate every few months, depending on use and wear. If your gage is in constant use, choose a more frequent interval.
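
To make the guidelines above concrete, here is a quick Python sketch. The usage thresholds and intervals are our rule-of-thumb figures from the preceding paragraphs, not any standard, so adjust them to suit your shop.

    # Rule-of-thumb calibration intervals based on frequency of use.
    # The thresholds are illustrative; your quality team makes the call.
    def calibration_interval_months(uses_per_month, harsh_environment=False):
        """Suggest a calibration interval in months."""
        if uses_per_month <= 1:        # used about once a month
            months = 12                # annual calibration is usually enough
        elif uses_per_month <= 20:     # used most days
            months = 3                 # calibrate every few months
        else:                          # hourly or constant use
            months = 1
        if harsh_environment:          # harsh environments need more attention
            months = max(1, months // 2)
        return months

    print(calibration_interval_months(1))           # 12
    print(calibration_interval_months(200, True))   # 1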

If you need to follow specific military or industry standards, then you must obtain those standards and, by all means, do as they say. Neither the military nor industry always uses common sense in these matters. (Sorry, we cannot provide those details. Please confer with the National Institute of Standards and Technology [NIST] or an accredited calibration laboratory.)

Herewith some general instructions and guidelines.

 

Test indicator calibration

 

The fast, economical and accurate way to calibrate a quantity of test indicators is to invest in a Dial Indicator Calibrator with the Test Indicator attachment. These mechanical devices are available in inch or metric models from several manufacturers. They are in effect a micrometer head with a large 3.5" diameter, .00005" accuracy and 0-1" range. The test indicator is positioned above the spindle using the test indicator attachment. The micrometer head is rotated and readings are compared. It will be necessary to have this unit regularly calibrated by a calibration lab to maintain traceability. Ideally, readings should be taken at every numeral printed on the test indicator dial, or as your quality manual requires.

If you need to calibrate large quantities of analog and/or digital indicators you may want to invest in the electronic i-Checker which is hooked up to a computer system and generates inspection certificates. E-mail us for information on this rather costly apparatus ($8900.00 without computer). If you only need to calibrate .001" or .0005" indicators, then you can consider the mechanical indicator calibrator shown on page 131.

Test indicators can also be calibrated on a surface plate using certified gage blocks. The indicator is securely fastened to a stand and the contact point is brought into contact with a gage block of a given size. For most manufacturers, the contact point must be parallel with the surface of the block; Interapid test indicators are an exception and should be held at approximately a 12-degree angle. The gage block can then be removed and replaced a number of times to check for repeatability. Be certain that discrepancies in repeatability are not due to poorly tightened clamps, flimsy stands or other factors. Usually one quarter of a graduation repeatability is allowable, but check the manufacturer's calibration specs for your particular model.

Errors in repeatability indicate a need for cleaning and, possibly, repair. Do not attempt this without experience.

Accuracy in travel is checked by replacing the gage block with a larger one, using very small increments; ideally you'd check the travel at every half revolution, or better. During this procedure be certain that the gage blocks are properly wrung to each other and to the surface. In general, accuracy should not vary more than one graduation per dial revolution on .0005" indicators. Calibration specifications for various manufacturers can be found on this site by referring to the Table of Contents.

If an incremental error occurs - one which increases regularly over the entire travel - then the contact point is of the wrong length or its angle relative to the surface is incorrect. Verify that the correct length point is being used. You can also make small adjustments by changing the contact point angle: make repeated calibration attempts with varying angles until you find one which gives correct results. Obviously, it will then be necessary to recreate this same angle when the indicator is used in actual test situations. Some indicators (Girodtast, for example) allow you to make small adjustments in length with a set screw. (See page 19 for details.)

One final method requires a certified height master. This takes the place of gage blocks. The one we use has an accuracy of .00002". The test indicator is firmly fastened to a test stand and the contact point is positioned (at the proper angle) over one of the height master's test surfaces. Comparison readings are now taken at half-revolution intervals - or better - in both directions.

About the cosine error (for test indicators other than Interapid models): if the contact point cannot be kept parallel to the work surface, then you will have to make a mathematical adjustment to the dial reading.

contact point angle     correction factor

10°                     reading x 0.98
15°                     reading x 0.97
20°                     reading x 0.94
30°                     reading x 0.87
40°                     reading x 0.77
50°                     reading x 0.64
60°                     reading x 0.50

 

From this chart you will notice that a contact point held at a 60-degree angle results in one-half the dial reading. Once you determine the angle, simply multiply the dial reading by the corresponding correction factor.

For example, an indicator reading of .0085" at an angle of 30-degrees is equivalent to
.0085" x .87 = .0074"
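
The correction factor is simply the cosine of the contact point angle (hence the name cosine error). A quick Python sketch reproduces the chart and the example above:

    import math

    def corrected_reading(dial_reading, point_angle_degrees):
        # true value = dial reading x cos(contact point angle)
        return dial_reading * math.cos(math.radians(point_angle_degrees))

    print(round(math.cos(math.radians(30)), 2))     # 0.87, matching the chart
    print(round(corrected_reading(0.0085, 30), 4))  # 0.0074, the example above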

Dial indicator calibration

 

The fast, economical and accurate way to calibrate a quantity of dial indicators is to invest in a Dial Indicator Calibrator. These mechanical devices are available in inch or metric models from several manufacturers. They are in effect a micrometer head with a large 3.5" diameter, .00005" accuracy and 0-1" range. The dial indicator is positioned in front of the spindle. The micrometer head is rotated and readings are compared. Ideally, readings should be taken at every numeral printed on the indicator dial, or as your quality manual requires. In practice, a reading taken every half revolution is sufficient. It will be necessary to have this unit regularly calibrated by a calibration lab to maintain traceability.

As with test indicators, large quantities of analog and/or digital indicators can be handled with the electronic i-Checker described in the previous section, or with the mechanical indicator calibrator shown on page 131 if you only need to calibrate .001" or .0005" indicators.

Gage blocks are an accurate but time-consuming way to check your dial indicators. Fasten the indicator in a stand on a granite plate. Lower the contact point to the surface and set the indicator at zero. Now it's a simple procedure to insert gage blocks under the contact point and take the readings. Be certain that the blocks are clean and wrung to the surface, that the indicator is perpendicular to the surface, and that it's securely fastened. Use blocks which will allow a reading at every half revolution, or as your calibration manual stipulates. Obviously, the gage blocks will need to have their accuracy certified on a regular basis (annually is the norm).

The indicator is deemed accurate if it does not deviate more than one graduation over the first 2-1/3 revolutions and not more than one additional graduation per revolution thereafter. Most manufacturers offer better accuracy than this, particularly on the short range indicators. Check with the manufacturer for specific details.
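
That rule of thumb translates into simple arithmetic. A short Python sketch, assuming extra revolutions beyond the first 2-1/3 are rounded up to whole graduations:

    import math

    FIRST_SPAN = 2 + 1/3   # revolutions covered by the first graduation of error

    def allowed_error_graduations(revolutions):
        """Allowable deviation, in graduations, after a given number of revolutions."""
        if revolutions <= FIRST_SPAN:
            return 1
        return 1 + math.ceil(revolutions - FIRST_SPAN)

    print(allowed_error_graduations(2))   # 1
    print(allowed_error_graduations(5))   # 4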

Repeatability should be less than half a graduation in all cases. Allow the contact point to come down several times onto the same gage block. Any variation in readings will indicate a problem with repeatability. If this occurs, check once again that your indicator is firmly attached to an indicator stand of some substance. Check that the screws on the indicator back are tightened (if you're holding the indicator by its lug back) and check that the contact point is tight as well. The indicator will need to be serviced if repeatability is unacceptable.

Gage blocks: investing in eight gage blocks (.020", .025", .050", .100", .250", .400", 1.00" and 2.00") will allow you to calibrate all your inch-reading dial indicators with ranges up to 2" travel. With these blocks you'll be able to check each indicator at the 2-1/2 revolution mark as well as the full range mark. This will serve as the absolute minimum requirement for indicator calibration. NIST certificates included. Calibration Grade 0.

  • Dial Indicator gage block set ... see page 164



 

A complete micrometer calibration set

 

  • Micrometer calibration set #10-616-1 (same as B&S 598-10-18)
  • Yikes! Made in China (probably the only Chinese item we're willing to sell on this site)
  • Includes 10 rectangular steel gage blocks, Grade 2. Sizes: .105", .210", .315", .420", .500", .605", .710", .810", .920", 1.000". Each is etched with the size and serial number, an ISO 9000 requirement. These combinations of blocks are sufficient for both 0-1" and 1-2" micrometers. NIST traceable certificates of calibration are included. (For larger micrometer ranges, see below.)
  • Includes 2 optical flats of 1" diameter with thickness of .5000" and .5125". They are parallel to .00002" and flat to .000004". This allows checking the anvil faces at half revolutions of the spindle for flatness (see instructions in the section below). Unlike the gage blocks in this set, the optical flats are not certified.
  • Wooden box included.
  • Gage blocks can also be used to calibrate many dial indicators up to 2" travel.
  • B&S List price $333.00
  • Your price $289.00 order now

[photo: micrometer calibration set]

A calibration certificate is included with the set (but the optical flats are not certified). It may have been calibrated at the factory some time ago. In theory, if the set is never used, there is no expiration to the certificate: the manufacturer cannot predict the amount of use the set will receive and thus cannot establish an expiration date for calibration. A calibration cycle must be established once the set is put into use. Generally this set needs to be calibrated once per year. If the set is rarely used, the cycle can be extended to as much as 3 years; if it receives constant use, you may want to set the cycle at 3 months. For calibration, send the entire set to a calibration laboratory in your area. For a partial directory of accredited calibration labs, see page 76.

Take note: this calibration set should probably not be used with digital micrometers having ±.00005" accuracy. For those micrometers you'll need at least Grade 1, if not Grade 0 gage blocks which can be ordered below. See page 164 for higher grade gage blocks.

If you need to calibrate larger calipers and micrometers, consider the measuring rods featured on page 58.


 

ASME Grade 0 micrometer calibration gage block set - inch reading


This is a high-precision (±5.5 µinch) steel gage block set designed for calibrating micrometers up to 2", with combination possibilities extending this to 3". The set contains .0625", .100", .125", .200", .250", .300", .500", 1" and 2" Grade 0 steel blocks, and an optical parallel is included.

  • Inch micrometer checking set 516-930-26 ... $360.00 purchase order price
  • Internet discount price ... $348.00 order now

 

Micrometer Grade 0 calibration gage block set - metric reading

[photo: Mitutoyo metric calibration set]

This high quality gage block set (shown above) with ASME Grade 0 rating (maximum error ±.14 µm ) contains the following ten certified steel blocks: 1.00, 1.25, 1.50, 2, 3, 5, 10, 15, 20, 25 mm. This set does not include optical parallels.

  • Metric micrometer checking set 516-103-26 ... $372.00 purchase order price
  • Internet discount price ... $358.00 order now

     

Outside micrometers

 

Testing for Parallelism and Flatness

In theory, if an optical flat is in perfect contact with a perfectly flat anvil, then no light bands will be visible. Make sure the optical flat and anvil are clean. Attempts to wring the optical flat will only scratch the glass. Get a good fit by gently squeezing the optical flat onto the surface. When finished, lift off without sliding. If the light bands (rainbows) you see are evenly spaced and in straight lines, then your surface is flat.

Ideally you'd use monochromatic light and if you're doing calibration full time, it's probably a good idea to invest in such a light bulb. Regular room lighting works fine for us. When looking through the optical flat, look straight down: avoid looking at an angle. Check this out for yourself and you'll see that the image changes dramatically as you increase the angle of vision.

If you see many light bands, then press the optical flat a little harder. You'll probably see fewer bands and that makes it easier to interpret the results.

The degree to which the light bands arch can be used to calculate the flatness. The ideal micrometer anvil is flat to .000012".

What this means is that, when you're looking at the arches, the top of one arch just touches the bottom of the next arch. See how the imaginary line indicates the bottom of the next arch in the image on the left?

[image: one light band (left) vs. two light bands (right)]

At this point you have a flatness error of one light band, or .000012". This is exactly what you want. If, on the other hand, the bottom of the light band touches the top of the second arch over, as in the image on the right, then you have a flatness error of 2 light bands, or .000024" and it's high time to have your anvils lapped.
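
Counting bands and multiplying is all the arithmetic there is. A short Python sketch using the .000012"-per-band figure quoted above:

    BAND_INCHES = 0.000012   # flatness represented by one light band

    def flatness_inches(light_bands):
        return light_bands * BAND_INCHES

    print(flatness_inches(1))   # 1.2e-05 -> the ideal anvil
    print(flatness_inches(2))   # 2.4e-05 -> time to have the anvils lapped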

Performing this test on each anvil determines its flatness; but by using a flat which has parallel sides, you can close the micrometer anvils on the flat and also determine the degree of parallelism.

Mitutoyo suggests this procedure: wring the optical parallel to the micrometer anvil (the stationary part of the micrometer) so that only one interference fringe (light band) shows. Now close the micrometer spindle onto the parallel. This should occur exactly at .500" when using the optical parallels in our calibration kit. At this point count the number of fringes (light bands) on the spindle by looking through the optical parallel from the other side. Then apply this formula:

Number of fringes on spindle side x 0.32µm = parallelism of the anvils with the spindle in that position

For example: 3 fringes x 0.32µm = 0.96µm which is the ideal parallelism of a 0-1" range micrometer's anvils.

You may want to convert this metric result to inches using a scientific calculator or your own gray matter (equivalent to about .00004"). Since the optical flats themselves are parallel to .00002", you'll have to take this possible deviation into account. Your result would be expressed as .00004" ± .00002"
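
A quick Python sketch of Mitutoyo's formula and the inch conversion, using the 0.32 µm-per-fringe figure given above:

    UM_PER_FRINGE = 0.32    # parallelism per fringe, from the formula above
    UM_PER_INCH = 25400.0   # micrometers in one inch

    def parallelism_um(fringes_on_spindle_side):
        return fringes_on_spindle_side * UM_PER_FRINGE

    um = parallelism_um(3)
    print(round(um, 2))                # 0.96 (µm), as in the example
    print(round(um / UM_PER_INCH, 5))  # 4e-05 -> about .00004"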

It's important to take note of the phrase: with the spindle in that position. If you rotate the spindle a bit, the surfaces may no longer be parallel. For that reason the calibration set shown above includes two parallels. Perform the same test using the second parallel. Now your reading will occur at .5125" instead of .500". This puts the spindle at 180° from the first reading. If the anvils are still parallel, then you're set to go. If the anvils are now out of parallel, then the spindle isn't running true and we're in trouble. A real stickler for details would even use 4 optical parallels to measure every 90 degrees, but for our purposes that may be going a bit too far. A qualified calibration lab can perform that procedure for you if the need arises.

A somewhat easier method for checking parallelism requires the use of a gage ball. Any diameter under 1" will do. Close the micrometer onto the gage ball and take the reading. Do this in 5 different places on the surface of the anvils. If the anvils are parallel, then the readings will all be the same. It proves parallelism but doesn't actually give you a numeric value. This method can also locate high spots or dips on the anvil surface, which should lead you to have them serviced and lapped.

Parallelism on larger micrometers

Using gage balls as described above will work fine. You can use a gage ball larger than 1" for this purpose, or you can use a gage ball in conjunction with a certified gage block, although this will be a tricky procedure if you're normally "all thumbs."

You can use optical parallels instead. You must have optical flats with parallel sides. The ones in your micrometer checking set are parallel to .00002" but these optical flats are only good for checking parallelism on the 0-1" range micrometers. Larger ranges need larger and very expensive optical parallels (upwards of $1000). It will be more cost effective to have a calibration lab check these for you. Without investing in more expensive equipment, you may have to resort to the gage ball technique mentioned above.

Checking the Zero Setting

When the spindle is screwed closed on a 0-1" micrometer, the reading should be zero. Use the spindle ratchet to obtain a light, even pressure, if your micrometer has one. If the zero is slightly off it can be adjusted by turning the barrel into position. A special wrench is usually provided for this procedure. On micrometers with ranges above 1" you will have to insert a gage block or micrometer standard equal to the lower value of the micrometer's range and set the zero as above.

Calibrating the Micrometer

Use micrometer standards or gage blocks for this procedure. Be certain that the blocks are properly wrung and take special care with carbide tipped anvils so that you don't damage the gage blocks. The micrometer is calibrated at several points throughout its range. Arbitrary readings are considered better than evenly spaced dimensions. The "lead" error will be the difference, plus or minus, between the actual and the observed readings. Lead errors should not exceed .0001" or possibly .0002". If errors are found, keep track of them and you can always add or subtract the lead error when you use the micrometer at that particular range.
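
A small Python sketch of that bookkeeping; the block sizes and readings here are made up for illustration:

    def lead_errors(test_points):
        """test_points: list of (actual, observed) readings in inches."""
        return [(actual, round(observed - actual, 5))
                for actual, observed in test_points]

    points = [(0.105, 0.1051), (0.420, 0.4199), (0.920, 0.9203)]
    for actual, error in lead_errors(points):
        flag = "" if abs(error) <= 0.0001 else '  <-- exceeds .0001" lead error'
        print(f'{actual:.3f}"  lead error {error:+.5f}"{flag}')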

Indicating (dial) micrometers pose other problems. Since the micrometer is not used to make direct measurements (it is a comparator) and since the anvil is movable, we cannot calibrate the spindle using gage blocks or micrometer standards. Of importance here is the repeatability of the indicating pointer, which should be less than one-half graduation. Close the spindle and lock it in place, then check the pointer's repeatability. The accuracy of the indicating mechanism is then verified by sequentially inserting gage blocks differing by .001" and confirming that the pointer registers the correct reading. Ultimately, flatness and parallelism are of paramount importance (see notes above).

Refer to specifications for Etalon indicating micrometers.

 

Blade micrometers

 

Blade micrometers can be calibrated with square or rectangular gage blocks when the blades are still flat and parallel. After much use, these blades will develop worn areas, particularly if you measure slots or grooves on cylindrical parts. In this case, use a pin gage to establish the correct readings on the good portion of the blade. The pin gage you use must be of a diameter which is smaller than the curvature of the worn area. Then take a measurement with the same pin gage in the worn area. (To find the worn areas, close the blades and hold them up to a light. You'll see the spots where the blades no longer meet.) The mathematical difference between the two readings will give you a correction factor which you add to any readings you now take in that worn area. Repeat this calibration often, because the correction factor will change with additional wear. When the damage is out of control, then you can return the blade mike to us for grinding and lapping. The blades will be returned to factory specs. If the blades chip or break, they can also be replaced.
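
A short Python sketch of that correction-factor arithmetic, with illustrative readings:

    def worn_area_correction(good_area_reading, worn_area_reading):
        # same pin gage measured on a good spot and in the worn area
        return good_area_reading - worn_area_reading

    correction = worn_area_correction(0.2500, 0.2496)   # blades read .0004" low here
    reading_in_worn_area = 0.3122
    print(round(reading_in_worn_area + correction, 4))  # 0.3126, the corrected size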

Spherical Anvil Micrometers

 

Micrometers with one or two spherical anvils are designed for measuring thickness of tubular walls. When the anvils are closed, the micrometer must read zero. Zero setting is done just like any other outside micrometer. Similarly, gage blocks or micrometer standards are used to verify accuracy. Refer to the information outlined in the section for outside micrometers. Parallelism is not an issue with these micrometers.

If you have installed a "snap-on" spherical ball to your standard micrometer anvil, then you must take the diameter of this addition into consideration, not only when calibrating, but also when measuring. It stands to reason that an 8 mm diameter ball attached to the surface of the anvil will mean that the spindle will read 8 mm when closed.

Dial Caliper Calibration

 

These instructions apply to mechanical as well as digital calipers.

Both the inside jaws and the outside jaws need to be calibrated, as well as the depth rod and the step measurement, if these are used. Calipers should be frequently checked for accuracy. They are more susceptible to damage than other tools.

To check for wear in the jaws, do this: clean them and close them. Then hold them up to the light and if they're worn you'll see light shining through the gaps. You can continue to use the calipers if you measure with the unworn surfaces. For total reliability, however, you'll have to send the calipers for servicing. The surfaces can be ground flat again.

For the outside jaws it's a simple matter of inserting a series of gage blocks between them and recording the caliper readings. They must not deviate by more than one graduation (.001") over the first 4" of range. From 4" to 8" the error may be .0015" (one and one-half graduation). From 8" to 12" the error can be .002". Accuracy may vary among different models and the manufacturer's specs should be consulted for this information. Take readings at 1-inch intervals. Three gage blocks (see below) of 1", 2" and 3" sizes will be all you need.
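
A quick Python sketch of those tolerance bands; the example readings are invented, and the manufacturer's specs take precedence if they differ:

    def allowed_caliper_error(length_inches):
        if length_inches <= 4:
            return 0.001     # one graduation
        if length_inches <= 8:
            return 0.0015    # one and one-half graduations
        return 0.002         # from 8" to 12"

    for block, reading in [(1.0, 1.0005), (2.0, 2.0012), (3.0, 2.9998)]:
        error = abs(reading - block)
        verdict = "pass" if error <= allowed_caliper_error(block) else "FAIL"
        print(f'{block:.0f}" block: error {error:.4f}"  {verdict}')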

To calibrate the inside jaws you may use a set of ring gages. Do not rely on very small ring gages because the inside jaws can not accurately measure small inside diameters. Suitable ring gages are shown on page 163. If you invest in just one ring, make it the 2" size. You may also set a calibrated .0001" micrometer to 1" (and higher, if possible) and then use the inside jaws to measure this distance. Since the micrometer has a discrimination ten times that of the calipers, you'll get an accurate reading.

Repeatability means the dial hand returns to the same position on different attempts to measure the same gage block. If this fails, then you'll have to have the calipers serviced.

 

Additional Gage Blocks for Calibration

 

Each rectangular steel gage block comes with a serial number engraved and a certificate of accuracy traceable to NIST. These have a tolerance grade ASME 0 and ASME 00. They can be wrung together to create a larger span, 6" for calibrating a caliper, for instance. For ISO purposes, keep track of these serial numbers and keep a copy of the certificate with your calibration records. When you calibrate your instrument make note of the instrument's serial number and the serial numbers of the gage blocks which you used to calibrate it. Additionally, have the gage blocks certified by a local calibration lab on an annual basis and then keep the current certificate of calibration with your records. To order gage blocks see page 164.

Gage Block Certification

 

You will periodically have to send your gage blocks for certification to an accredited laboratory. This might be on an annual, 2-year, 3-year or other cycle as determined by your quality manager. We are not equipped to perform this service. Calibration labs specializing in this procedure are located throughout the US. Mitutoyo runs an elaborate and sophisticated lab but there are many independent local calibration labs which can provide various services.

Depth Micrometer Calibration

 

Ideally you'd use a pair of matched gage blocks on a granite surface plate and set the depth micrometer base across these blocks. You will then be able to verify the micrometer against the blocks. Do this for the beginning and end of each rod's range. In other words, check the 1-2" rod at 1" and at 2".

More conveniently, use a square gage block with a center hole.

Just because you have calibrated the depth micrometer with one rod, does not mean that the other rods are automatically accurate. You will have to calibrate each rod separately before use.

Less reliably, you can use just one block for calibration. In this case make every effort to keep the depth micrometer rods perpendicular to the granite surface. If your micrometer has a ratchet thimble, by all means use it. Gage blocks are available on page 164.

 

Dial Thickness Gage Calibration

 

If the dial thickness gage has flat anvils, as most of them do, you will want to make sure they lie parallel when closed. Clean the anvils with alcohol as needed. The easy way to check this is to hold them up to a light and look for gaps. If there are no obvious gaps then set the dial to zero. At this point, insert a calibrated gage block between the anvils and check the dial reading several times. You should not be off by more than one graduation. If it's off, then make sure the block is lying flat and properly seated. Also make sure that the anvils—and the gage block—are very, very clean. It stands to reason that if the anvils are not parallel, you'll get a different reading on one side of the anvils. Use the gage block to test for this possibility.

If your thickness gage has a relatively short range, then one or two different size gage blocks should suffice for calibration. Choose one for the middle range and one for the far range. Of course, if you only use the thickness gage for some specific measurement, then choose a gage block with approximately the same dimension.

For more information on thickness gages see page 12. To order gage blocks see page 164.

 

Calibration Labels

 

These can be as fancy or as simple as you like. There's no great mystery about them. Each gage, whether it be a dial caliper, a test indicator or a micrometer measuring standard, needs to have a unique serial or ID number assigned to it. Most gages have these inscribed by the manufacturer. If you have a great many tools in your arsenal, then you might want to create your own set of ID numbers to help you track them down. Scratch the numbers into the tool or use a permanent marker.

Avoid electro-engraving any tool with internal gears. The sparks from the process can cause tiny pinions to weld together. Don't electro-engrave anything with electronic components like digital calipers.

You'll want to set up a calibration sheet for each of these instruments. This sheet features the ID number for quick identification and its location in your shop. On the sheet you'll write down when the gage was calibrated and by whom, and when the next calibration is due. Then you'll attach the calibration certificate and keep them all in a binder, in a safe place.

If you send the gages out for calibration, you'll receive a certificate. If you calibrate these yourself, then you'll create your own certificate. On the certificate you'll note the instrument's ID number, a brief description of the tool and its graduations and range. Then you'll write down the test data which you obtained during calibration. Make note of the equipment which you used to make this calibration. It might be a gage block set, for instance. Date and initial, and you're done.
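
If you'd rather keep these records in a spreadsheet than a binder, here's a minimal Python sketch that writes one such record as CSV. The field names are our own invention, not any standard:

    import csv, io

    record = {
        "tool_id": "TI-014",                  # hypothetical ID number
        "description": 'test indicator, .0005" graduations',
        "calibrated_on": "2016-04-13",
        "calibrated_by": "JW",
        "equipment_used": "gage block set, serial no. 4711",
        "next_due": "2017-04-13",
    }

    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=record.keys())
    writer.writeheader()
    writer.writerow(record)
    print(buffer.getvalue())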

If you have many instruments to keep track of, then you'll probably want some software designed to automate this process, but we're more concerned with the small scale do-it-yourself approach here. We'll leave the software to someone else.

Now comes the label. This little sticker fits onto most tools and you'll simply write the tool's ID number on it, the date you had it calibrated and the date for the next calibration. Labels are often initialed by the person performing the calibration. If the sticker doesn't fit on the tool, or if the labels tend to peel off because of coolants or oils, then put the label on the tool's box. When you do that, make sure that an ID number on the instrument is linked to the ID number on the label and always put the gage back in its proper case, otherwise you'll mess things up.

And where do you get these labels? You could use pieces of masking tape, or an address label cut down to size and simply pen the information onto it. You could invest in fancy label printers that make really impressive calibration stickers with very, very tiny print.

Otherwise, preprinted labels make it quick and easy and they'll give your tool room a uniform, professional look. Vinyl labels which are 1" by 1/2" will fit most any tool. They also have a clear plastic cover to protect the writing.

additional information and pricing

PS: Interapid indicators have angular bodies so you'll have to trim the label to fit, or put it on the box. On Last Word indicators you could wrap the label around the body or, again, put it on the box. All you'll need is a ball point pen, or better yet an Ultra Fine Point Sharpie®, and you're set. Hint: if you keep the unused labels in the same loose-leaf binder as your calibration certificates, then you won't have to hunt for them next time.

 

Dial bore gage calibration

 

Dial bore gage calibration poses unique problems. The gage is designed to be used as a comparator and not for direct reading. As a result, accuracy over its full range is immaterial. What is important is repeatability and accuracy over the short range, usually just a few graduations. The indicator reading will need to be accurate to ± one graduation per revolution.

The proper way to calibrate a bore gage is to use certified setting rings or a bore gage calibrator as sold by Mitutoyo, for example.

Set the bore gage to zero at the center of its range, using a setting ring (for instance at 0.500"). Then check with a setting ring at either end of the range (for instance 0.490" and 0.510") or, possibly at some points in between. It's really the only way that the gage accuracy can be verified, by using it the way it was designed to be used.
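
A short Python sketch of that check, assuming a tolerance of one graduation (.0001" here) and invented ring sizes and readings:

    GRADUATION = 0.0001   # tolerance: plus or minus one graduation

    zero_ring = 0.500     # ring used to zero the gage at mid-range
    checks = {0.490: -0.0100, 0.510: 0.0103}   # ring size -> dial reading

    for ring, reading in checks.items():
        expected = ring - zero_ring            # deviation the dial should show
        error = abs(reading - expected)
        verdict = "pass" if error <= GRADUATION else "FAIL"
        print(f'{ring:.3f}" ring: error {error:.4f}"  {verdict}')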

A setting master can be used to preset your bore gage but should not be used for calibration since other measuring variables come into play.

As a final and important test, check for repeatability against a ring gage, setting master or a micrometer. You can also use the micrometer to see if you'll still get ± one graduation per revolution, more or less. Since the micrometer is accurate to .0001" at best, you can only use this for rough calibration.

Thinking about how the gage is used, you'll see that accuracy of travel has no effect on the accuracy of comparison.

 

Measuring Rods

 

Measuring rods are the better solution for calibrating larger micrometers and calipers because you will avoid having to stack and wring gage blocks. See page 58 for information.



 

Flattery or plagiarism?

 

A Chinese manufacturer of indicators and micrometers by the name of Tresna (Guilin, Guangxi Province) has chosen to publish some of the contents of this page on their own website. Of course, they didn't ask for permission and apparently have no intention of respecting copyright laws. We should be flattered, but they could at least have given us credit as authors.
 



Buyer Beware! We strongly urge you to stick to brand name products such as Mitutoyo, Brown & Sharpe and Starrett. Avoid any off-brands which are likely to be inexpensive Chinese imports.


 

Books by René Urs Meyer

The Companion Reference Book on Dial and Test Indicators
Repair Manual for Swiss-made BesTest and TesaTast Indicators
Starrett 711 Last Word Indicator Repair Manual

Repair Manual for Interapid Test Indicators
 

www.longislandindicator.com

Long Island Indicator Service Inc
14 Sarah Drive — Hauppauge NY 11788 — USA



This page's most recent revision: 13 APRIL 2016
All Rights Reserved

Original photographs and content copyright 2016 by JWGrum


Interested in metrology? Join our group on Facebook.