I’ve received a fair number of complaints about the readout being inaccurate when the default counts-per-inch value was used. These problems can be caused by several different issues. First of all, the CPI for most of the capacitive scales isn’t officially provided by the scale manufacturers. The values for many of the commonly available scales have been found experimentally and might be off by one or two counts. In addition, there are manufacturing tolerances, rounding issues, etc., that can skew the numbers even more. Furthermore, many of the scales that come from China are metric. For instance, the glass scales that are advertised as having a resolution of 0.0002” actually have a resolution of 0.005 mm, and 5 microns don’t equal two ten-thousandths of an inch. This leads to an error in CPI: based on the 0.0002” resolution the CPI should be 5000, but in fact it’s 5080, which is almost 2% off.
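The 5080 figure follows directly from the metric resolution; here is a quick sanity check in plain Python (nothing TouchDRO-specific, just unit conversion):

```python
# A 0.005 mm (5 micron) scale produces one count every 0.005 mm.
MM_PER_INCH = 25.4

counts_per_mm = 1 / 0.005              # 200 counts per millimeter
counts_per_inch = counts_per_mm * MM_PER_INCH

print(counts_per_inch)                 # 5080.0, not the "expected" 5000
```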
To make sure that your setup is as accurate as it can be, you can’t trust the “default” value. Instead, the scales need to be calibrated in place following one of the methods described below. The calibration process is very simple, and you already have all of the necessary tools. The gist of the exercise is to move the scales by a known number of inches, record the number of counts the app receives, and use that to calculate the CPI. The formula is simply: CPI = number of counts received / number of inches traveled. There are many ways to do it, so let’s look at a couple that don’t require any extravagant tools or equipment.
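In code form the formula is a one-liner; the function name and the numbers below are purely illustrative:

```python
def counts_per_inch(counts_received, inches_traveled):
    """CPI is simply the raw count delta divided by the distance moved."""
    return counts_received / inches_traveled

# Example: a metric 5-micron glass scale moved exactly 10 inches
# reports 50800 counts:
print(counts_per_inch(50800, 10))  # 5080.0
```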
First, quite obviously, in order to calibrate the scales the TouchDRO application needs to be installed and talking to the scale interface controller. Moreover, the scales need to be mounted to the machine in their “permanent” positions; if you remove and reinstall the scales, the calibration might need to be repeated.
Second, the application settings need to be altered a bit to make the calibration easier:
- Set the display format to show four digits after the decimal point (in inch mode).
- Set the CPI for the axes you intend to calibrate to 10000.
- If the W axis is used, make sure its value is displayed separately, not summed with another axis.
Finally, the application needs to be switched to inches. With the CPI set to 10000 and four decimal places displayed, the readout is simply the raw count with a decimal point inserted, which makes the math below straightforward.
The simplest way is to use the graduated collars that your machine came with.
- Move the table to the end of its travel in one direction.
- Move it back a few turns to get rid of the backlash.
- Zero out the axis.
- Zero out the collar.
- Move the table to the other end of the travel by a whole number of inches (e.g. 5, 10, or 20), making sure that the collar reads zero at the end.
- Write down the displayed readout for the axis, ignoring the decimal point (i.e. pretend the decimal point isn’t there).
- Divide that number by the number of inches traveled.
- Navigate to the “Axis CPI” setting for the axis in question and enter the result of the previous step.
- Return to the DRO screen and verify that the readout displays the number of inches the table traveled.
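To make the steps above concrete, here is the arithmetic for a hypothetical 5-micron glass scale moved 10 inches (the numbers are made up for illustration):

```python
# With the CPI temporarily set to 10000, a 10-inch move on a scale whose
# true CPI is 5080 shows 5.0800 on the readout.
displayed = "5.0800"

# Ignore the decimal point to recover the raw count...
raw_counts = int(displayed.replace(".", ""))   # 50800

# ...and divide by the distance traveled to get the true CPI.
inches_traveled = 10
print(raw_counts / inches_traveled)            # 5080.0
```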
This method works sufficiently well when the lead screw is known to be accurate (i.e. isn’t worn out). The main advantage is the fact that it doesn’t require any additional tools, and the CPI can be averaged over a relatively long distance. The downside is that if the screw is worn at the spots where the table is stopped, it can throw off the calibration slightly. Similarly, miscounting the number of revolutions can throw the calibration off pretty drastically.
Another approach is to use a precision ground block of known length in conjunction with an edge finder, wiggler or a dial test indicator to accurately measure the traveled distance.
I will use a set of 1-2-3 blocks and a lever-type dial test indicator, but any block of a known size can be used.
- Clean the table and clamp one of the blocks square to the table.
- Place the other block against the first one so the long side is parallel to the axis of travel.
|Preload the DTI so it reads zero.|
- Mount a dial test indicator into the spindle using a drill chuck or a collet.
- Preload the indicator against the 1-2-3 block as shown in the picture.
- Zero out the axis.
- Zero out the indicator collar.
- Carefully remove the block.
- Move the table until the indicator reads zero against the clamped block.
|After removing the free-standing block, move the table so the DTI reads zero again.|
- Record the DRO readout, ignoring the decimal point.
- Divide the value by the length of the used block.
- Enter the result into the “Axis CPI” setting for the axis.
- Exit back to the DRO screen and verify that the readout displays the block length.
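The arithmetic is the same as with the collar method; for instance, indicating across the 3” side of a 1-2-3 block with the same hypothetical 5080-CPI scale:

```python
# Readout after indicating both ends of a 3-inch block, with the CPI
# temporarily set to 10000 (hypothetical scale whose true CPI is 5080):
displayed = "1.5240"

raw_counts = int(displayed.replace(".", ""))   # 15240
block_length_inches = 3

print(raw_counts / block_length_inches)        # 5080.0
```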
The same method can be adapted for the vertical axis using either a plunger-type test indicator, as shown in the picture below, or the same lever-type indicator with the probe bent 90 degrees or so. Finally, I used the DTI because for me it’s more repeatable than a wiggler or an edge finder, but with care either can be used.
|Vertical axis can be calibrated the same way, but requires only one block.|
Using this method, the value will still be averaged over a few inches and, as long as you take care to accurately indicate the edges of the block, will produce an accurate counts-per-inch value.
As you can see, the calibration process is very straightforward and can be done pretty quickly, and the results will be well worth it. I’ve used both methods with good success and don’t have a strong preference either way. In practice, as long as you can move the scales a known distance that is accurate to a few ten-thousandths, you will get pretty close to the limits of most DRO scales.
One thing I’d like to mention, though: you might be tempted to measure the distance by simply pressing a dial test indicator against a fixed object. I don’t recommend this method because it has two major drawbacks. First, if the indicator is mounted at a slight angle, it will not measure the distance accurately; instead, it will introduce its own cosine error. Second, most common dial test indicators offer only one inch of travel, so you will not be able to average the CPI over a longer distance. If you’re off by one or two counts per inch, this can amount to a few thousandths over several inches of travel. This might be OK for iGaging scales and the like, but glass scales can be much more accurate.
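To put “a few thousandths” in perspective, here is the rough arithmetic (the CPI values are just examples):

```python
# If the true CPI is 5080 but a calibration over only one inch of travel
# landed on 5078 (off by 2 counts), the relative error carries over the
# whole travel of the machine.
true_cpi = 5080
calibrated_cpi = 5078

travel_inches = 10
error_inches = travel_inches * (true_cpi - calibrated_cpi) / calibrated_cpi

print(round(error_inches, 4))  # about 0.0039" over 10 inches
```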