DRO Scale Calibration

Wednesday, February 19, 2014

I’ve received a fair number of complaints about the readout being inaccurate when the default counts-per-inch value was used. These problems can be caused by several different issues. First of all, the CPI for most of the capacitive scales isn’t officially provided by the scale manufacturers. The values for many of the commonly available scales have been found experimentally, and might be off by one or two counts. In addition, there are manufacturing tolerances, rounding issues, etc., that can skew the numbers even more. Furthermore, many of the scales that come from China are metric. For instance, the glass scales that are advertised to have a resolution of 0.0002” actually have a resolution of 0.005mm, and 5 microns don’t equal exactly two ten-thousandths of an inch. This leads to an error in the CPI: based on the 0.0002” resolution the CPI should be 5000, but in fact it’s 5080, which is almost 2% off.
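
To see where that 5080 figure comes from, here is the back-of-the-envelope arithmetic as a small Python sketch (the variable names are just for illustration; the numbers are from the example above):

    # CPI of a metric scale with 0.005 mm (5 micron) resolution.
    MM_PER_INCH = 25.4

    resolution_mm = 0.005                 # one count every 5 microns
    counts_per_mm = 1 / resolution_mm     # 200 counts per millimeter
    counts_per_inch = counts_per_mm * MM_PER_INCH

    print(counts_per_inch)                        # 5080.0
    print((counts_per_inch - 5000) / 5000 * 100)  # ~1.6 percent off of 5000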

To make sure that your setup is as accurate as it can be, you can’t trust the “default” value. Instead, the scales need to be calibrated in place following one of the methods described below. The calibration process is very simple, and you already have all of the necessary tools. The gist of the exercise is to move the scales by a known number of inches, record the number of counts the app receives, and use it to calculate the CPI. The formula for this is: received position in counts / number of inches traveled. There are many ways to do it, so let’s look at a couple that don’t require any extravagant tools or equipment.
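
In code form the formula is trivial (a minimal Python sketch; the function name and example numbers are mine):

    def counts_per_inch(counts_received, inches_traveled):
        """CPI = received position in counts / distance traveled in inches."""
        return counts_received / inches_traveled

    # Example: the app received 50800 counts over 10 inches of travel.
    print(counts_per_inch(50800, 10))   # 5080.0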

Prerequisites

First, quite obviously, in order to calibrate the scales the TouchDRO application needs to be installed and talking to the scale interface controller. Moreover, the scales need to be mounted to the machine in their “permanent” positions; if you remove and reinstall the scales, the calibration might need to be repeated.

Second, the application settings need to be altered a bit to make the calibration easier:

  • Set the display format to show four digits after the decimal point (in inch mode).
  • Set the CPI for the axes you intend to calibrate to 10000.
  • If the W axis is used, make sure its value is displayed separately, not summed with another axis.

Finally, the application needs to be switched to inches.
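
These settings aren’t arbitrary: with the CPI set to 10000 and four decimal places shown, the displayed value with the decimal point removed is simply the raw count the app received. A quick Python sketch to illustrate (the count is hypothetical):

    # With the CPI set to 10000 and four decimal places shown,
    # the display is just the raw count with a decimal point in it.
    raw_counts = 50800                          # hypothetical count from the scale
    displayed = raw_counts / 10000              # the DRO screen shows 5.0800
    print(f"{displayed:.4f}")                   # 5.0800
    print(f"{displayed:.4f}".replace(".", ""))  # 50800 -- the raw count again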

Method 1

The simplest way is to use the graduated collars that your machine came with.

  1. Move the table to the end of its travel in one direction.
  2. Move it back a few turns to get rid of the backlash.
  3. Zero out the axis.
  4. Zero out the collar.
  5. Move the table toward the other end of the travel by a whole number of inches (e.g. 5, 10, or 20), making sure that the collar reads zero at the end.
  6. Write down the displayed readout for the axis, ignoring the decimal point (i.e. pretend it isn’t there).
  7. Divide that number by the number of inches traveled.
  8. Navigate to the “Axis CPI” setting for the axis in question and enter the number from step 7.
  9. Return to the DRO screen and verify that the readout displays the number of inches the table traveled.

This method works sufficiently well when the lead screw is known to be accurate (i.e. isn’t worn out). The main advantage is that it doesn’t require any additional tools, and the CPI can be averaged over a relatively long distance. The downside is that if the screw is worn at the spots where the table stops, it can throw off the calibration slightly. Similarly, miscounting the number of revolutions can throw the calibration off pretty drastically.
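
To put concrete numbers on steps 6 through 8 (the values here are hypothetical): suppose the table was moved 10 inches and the readout shows 5.0800. A small Python sketch of the arithmetic:

    inches_traveled = 10      # whole inches moved, counted on the graduated collar
    displayed = 5.0800        # what the DRO screen shows with the CPI set to 10000
    raw_counts = round(displayed * 10000)   # 50800 -- decimal point ignored

    cpi = raw_counts / inches_traveled
    print(cpi)                # 5080.0 -> enter this as the "Axis CPI"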

Method 2

Another approach is to use a precision ground block of known length in conjunction with an edge finder, wiggler or a dial test indicator to accurately measure the traveled distance.

I will use a set of 1-2-3 blocks and a lever-type dial test indicator, but any block of a known size can be used.

  1. Clean the table and clamp one of the blocks square to the table.
  2. Place the other block against the first one so the long side is parallel to the axis of travel.
  3. Mount a dial test indicator into the spindle using a drill chuck or a collet.
  4. Preload the indicator against the 1-2-3 block as shown in the picture.
  5. Zero out the axis.
  6. Zero out the indicator collar.
  7. Carefully remove the block.
  8. Move the table toward the clamped block until the indicator again reads zero.
  9. Record the DRO readout, ignoring the decimal point.
  10. Divide the value by the length of the used block.
  11. Enter the result into the “Axis CPI” setting for the axis.
  12. Exit back to the DRO screen and verify that the readout displays the block length.

The same method can be adapted for the vertical axis using either a plunger-type test indicator, as shown in the picture below, or the same lever-type indicator with the probe bent 90 degrees or so. Finally, I used the DTI because for me it’s more repeatable than a wiggler or an edge finder, but with care either can be used.

The vertical axis can be calibrated the same way, but requires only one block.

Using this method the value will still be averaged over a few inches, and as long as you take care to accurately indicate the edges of the block, it will produce an accurate counts-per-inch value.

Conclusion

As you can see, the calibration process is very straightforward, can be done pretty quickly, and the results will be well worth it. I’ve used both methods with good success and don’t have a strong preference either way. In practice, as long as you can move the scales a known distance that is accurate to a few ten-thousandths, you will get pretty close to the limits of most DRO scales.

One thing I’d like to mention, though: you might be tempted to use a dial test indicator to measure the distance by simply pressing it against a fixed object. I don’t recommend this method because it has two major drawbacks. First, if the indicator is mounted at a slight angle, it will not measure the distance accurately; instead it will introduce its own cosine error. Second, most common dial test indicators offer only one inch of travel, so you will not be able to average the CPI. If you’re off by one or two counts per inch, this can amount to a few thousandths over several inches of travel. This might be OK for iGaging scales or the like, but glass scales can be much more accurate.
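
To put a number on that last point, here is a quick Python sketch with assumed values (a scale whose true CPI is 5080, calibrated two counts off):

    true_cpi = 5080           # the scale's actual counts per inch
    calibrated_cpi = 5082     # calibration that is off by two counts per inch
    travel = 10               # inches actually moved

    counts = travel * true_cpi              # counts the app receives
    displayed = counts / calibrated_cpi     # what the DRO would show
    print(f"{travel - displayed:.4f}")      # 0.0039 -- about four thousandths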

11 comments:

  1. Hi Yuriy
    I have found that if you select a reference start point, zero the axis, then move the axis a known distance, you can enter a trial value into the axis CPI, then back out to the operation mode, check the reading against the known value, and repeat the correction until the scale reads the exact value you moved the axis.
    I went into town yesterday and tried to download the beta version. I am registered as a tester and registered at the Google store, but every attempt to get the beta version got me to the version I have loaded. I just do not use these fancy new toys enough. Give me an IBM 360 and good old Fortran and I can make headway faster :)

  2. How do you do the same with a lathe?
    Where do you get the two required gauge blocks, more accurate than 0.005mm?

    Replies
    1. The lathe is no different from the mill. You might need to improvise a bit, but the idea is generally the same. If you don't have the 1-2-3 blocks, just make something and measure it with an accurate micrometer.
      On another note, I don't know why you would need better accuracy than 0.005 mm; honestly, I don't know why you would need 0.005 either. That is an insanely close tolerance that you probably won't be able to achieve in an average workshop anyway, unless you're using a heavy, uber-precise tool room lathe.

      Regards
      Yuriy

    2. Hi Yuri. Great software, thanks for making it.

      I am having problems with the radius and diameter functions on my lathe set up. After calibrating, when I move the cross slide 1 inch (checking with a dial indicator) the radius shows 1 inch but diameter only shows 0.500 when it should show 2 inches. Can you help with this?

  3. Perfect, perfect, perfect! I just bought myself a Samsung Galaxy S5, downloaded the DRO app, calibrated, and it's perfect. I would always get disconnected when using my Acer tablet, but not so with the Galaxy.

    Thank you so much Yuriy

  4. Hi Yuri,
    Great site. I have a question about using a 12-bit rotary encoder matched to a rack/pinion assembly. Does your software allow for calibrating such an assembly?
    Thanks
    Doug

  5. Hi!
    Is it possible to use metric numbers while doing the calibration or is it optimized for inches?

    Replies
    1. Hi,

      X_disp = the value shown on the display
      X_real = the real value of the travel
      CPI_start = the CPI value used for the calibration
      CPI_new = the calibrated CPI you want to know

      X_disp / X_real * CPI_start = CPI_new

      It doesn't matter what units you use (inch, metric, or Norwegian beard lengths).

    2. I wonder what I'm doing wrong; I can't get the right readings using this formula. Do I have to use abs or incr mode? What am I forgetting in the settings? I'm using metric numbers. Thanks, Gerard

    3. I am having the same issue with my setup as well. How did you get it fixed?

  6. Awesome work, Yuriy! I used an Arduino Nano and the smaller of the two Bluetooth modules, both mounted to the same perf board w/ three mini USB connectors soldered to the board.

    Re: method #1 above: Step 8 should use the number from step 7, not step 6. We want to enter the number from step 6 divided by the amount of inches traveled into step 8, yes?

    Thank you for creating this and putting it out there!
    -=Randy
