For those wondering, I’m still here, toiling away with my A350. I’ve finally gotten to the point where I’m reasonably happy with the flatness of my build platform; I ended up milling it down. There are a few pads that are still too low, but it’s “good enough” for now - prompted by someone wanting me to make something for them - groan/grin.
I’ve now gotten to the point where I want to calibrate the X/Y/Z-axis stepping (M92). The default setting is 400 steps per unit for each axis.
I’m doing this manually using a stack of 1-2-3 blocks that I can individually measure with a micrometer. For my purposes today, let’s assume they’re all exactly three inches tall; in all, I have a distance of nine inches to use as a measurement space. (As an aside, whilst a cheap eBay acquisition, they are remarkably uniform and to spec - within the tolerance of my 2-micron DTI.)
I’m using a digital multimeter set to continuity. I have a dowel pin in the spindle, one probe of the DMM connected to the spindle, and the other to a 1-2-3 block. When I hear a beep, I know the pin is touching the block.
Using that method I can derive a new steps-per-unit value: command a known move, measure the actual travel, and scale the current setting accordingly. Over several measurement cycles I’d expect these values to converge, but they don’t. I’ve tried several times.
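The correction I’m applying each cycle is the usual proportional one - sketched here in Python, with a made-up 228.2 mm measurement purely for illustration:

```python
def corrected_steps(current_steps: float, commanded: float, measured: float) -> float:
    """Scale the current M92 value by the ratio of commanded to measured travel."""
    return current_steps * commanded / measured

# Example: commanded a 9 in (228.6 mm) move, but measured only 228.2 mm of travel.
new_value = corrected_steps(400.0, 228.6, 228.2)
print(round(new_value, 2))  # -> 400.7
```

If the machine were perfectly repeatable, feeding each cycle’s result back in should converge after one or two iterations.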
I read that this is due to micro-stepping: apparently, the more decimal places you supply in the value, the less accurate it becomes.
I don’t know if this is true or not.
I don’t know if the SM firmware supports Marlin’s INCH_MODE_SUPPORT. The GitHub source appears to indicate support for G20, but I’d expect that, instead of a value of 400 steps per mm, I’d then see 10160 steps per inch. I have not (yet) tried this.
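My 10160 figure is nothing Snapmaker-specific, just the unit conversion - a quick sanity check:

```python
MM_PER_INCH = 25.4  # exact by definition

steps_per_mm = 400.0          # the A350 default M92 value
steps_per_inch = steps_per_mm * MM_PER_INCH
print(steps_per_inch)  # -> 10160.0
```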
Before going down this path, I want to ask whether using imperial stepping would increase the resolution without introducing an error.
A single step in metric is 0.25% of the unit (1/400), but a single step in imperial is less than 0.01% of the unit (1/10160).
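To spell out where those percentages come from - and the bit that’s nagging me, namely that the physical distance per step looks identical either way:

```python
MM_PER_INCH = 25.4

steps_per_mm = 400.0
steps_per_inch = steps_per_mm * MM_PER_INCH  # 10160.0

pct_metric = 100.0 / steps_per_mm      # one step as a percentage of 1 mm
pct_imperial = 100.0 / steps_per_inch  # one step as a percentage of 1 inch
print(pct_metric)    # -> 0.25
print(pct_imperial)  # ~0.0098

# The physical travel per step, expressed in mm, in each unit system:
print(1.0 / steps_per_mm)            # 0.0025 mm per step
print(MM_PER_INCH / steps_per_inch)  # also 0.0025 mm per step
```

The percentages differ only because the reference unit differs; both settings describe the same 2.5 µm per full step.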
Am I missing something here?