Recent discussion has given me ideas, but sadly, while I can do a lot with g-code, I’m no good at any web language. There’s been some discussion about using the built-in thickness measurement on the 10W laser to auto-adjust the Z height. While that’s not currently an option, Z movement to follow objects is (if you use my non-planar guide). So here’s my idea, and hopefully someone smarter than I am can implement it.
A simple webpage with a few options for input (a rough sketch of these follows the list):

- Snapmaker IP (the IP of your Snapmaker; maybe there could be an auto-scan like Luban does to auto-fill it)
- Origin location (the X/Y location of your intended origin)
- Object size (the size in mm of your planned project; maybe two boxes, one for X and one for Y, or a single box in X,Y format)
- Interval spacing (how fine you want the mesh; smaller spacing takes longer; maybe default 10 mm)
- Laser offset (during my testing, the measuring laser hits the object 73 mm in X- compared to the blue diode, so that could be the default, but you can test your own offset; I haven’t done Y offset tests yet)
- Jog speed (default 3000 or 6000 just to help speed things up; Snapmaker uses 1500 in Luban)
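To make it concrete, here’s roughly what those inputs could look like as a settings object. This is just a Python sketch (the real thing would be a webpage); the field names and defaults are mine, pulled from the list above.

```python
from dataclasses import dataclass

@dataclass
class ScanSettings:
    snapmaker_ip: str               # IP of the machine, e.g. "192.168.1.50" (or auto-scanned)
    origin_x: float                 # X of the intended origin (mm)
    origin_y: float                 # Y of the intended origin (mm)
    size_x: float                   # object size in X (mm)
    size_y: float                   # object size in Y (mm)
    interval: float = 10.0          # mesh spacing (mm); smaller = finer but slower
    laser_offset_x: float = 73.0    # add to commanded X so the red measuring laser lands on the point
    laser_offset_y: float = 0.0     # no Y offset tested yet
    jog_speed: int = 3000           # feed rate for moves (Luban uses 1500)
```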
The internal process would be: after a user inputs these options, some simple math is done in the background to take your origin location (say 0,0) and add the laser offset so the red measuring laser sits on the actual origin (making it 73,0). Those numbers are plugged into the web API to initiate the measurement cycle:
`http://{Snapmaker IP}:8080/api/request_Laser_Material_Thickness?x={X Origin}&y={Y Origin}&feedRate={Jog Speed}`
It then gets back a response similar to this: `{"status":true,"thickness":12.56268310546875}`. Truncate the thickness to 2 decimal places (i.e. 12.56) and save that height at the X/Y location. Then increase X by the interval spacing and measure again, repeating until it reaches the object size in X. Step Y up by the same interval to move to the next row, then walk X back toward the starting origin, and keep going until it reaches the object size in Y (a sketch of the loop follows).
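Here’s a minimal sketch of that measurement sweep in Python, assuming the endpoint works exactly as shown above with no extra handshake (Luban may do a connect step first that I haven’t accounted for). The function names are made up and `requests` is the only dependency.

```python
import math
import requests

def measure_point(ip, x, y, feed_rate):
    """Ask the 10W head to measure material thickness with the red laser at (x, y)."""
    url = f"http://{ip}:8080/api/request_Laser_Material_Thickness"
    resp = requests.get(url, params={"x": x, "y": y, "feedRate": feed_rate}, timeout=120)
    data = resp.json()  # e.g. {"status": true, "thickness": 12.56268310546875}
    if not data.get("status"):
        raise RuntimeError(f"measurement failed at ({x}, {y})")
    return math.floor(data["thickness"] * 100) / 100  # truncate to 2 decimal places

def scan_grid(ip, origin_x, origin_y, size_x, size_y,
              interval=10.0, offset_x=73.0, offset_y=0.0, feed_rate=3000):
    """Serpentine scan: sweep X across a row, step Y up, sweep X back, repeat."""
    xs = [origin_x + i * interval for i in range(int(size_x / interval) + 1)]
    ys = [origin_y + j * interval for j in range(int(size_y / interval) + 1)]
    heights = []  # heights[j][i] = thickness at (xs[i], ys[j])
    for j, y in enumerate(ys):
        row = {}
        order = xs if j % 2 == 0 else reversed(xs)  # back and forth to save travel
        for x in order:
            # Offset the commanded position so the red measuring laser, not the
            # blue diode, ends up over the point we actually care about.
            row[x] = measure_point(ip, x + offset_x, y + offset_y, feed_rate)
        heights.append([row[x] for x in xs])  # store each row in consistent X order
    return xs, ys, heights
```

With the defaults and an origin of 0,0, the first request goes to x=73, y=0, which matches the example above.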
Then take the saved positions and build an STL, kind of like a flat lithophane, to make a visualization of the scanned object. The values between each point could be averaged to smooth it out, but it would pretty much mimic the way the bed level visualizer plugin for OctoPrint works, just outputting an STL.
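And a rough sketch of that STL step: treat the measurements as a heightmap and write two triangles per grid cell as plain ASCII STL, so no libraries are needed. Normals are written as zeros since most CAD tools recompute them, and this only builds the top surface rather than a watertight solid. The smoothing mentioned above could be done by interpolating the grid to a finer spacing before writing it out.

```python
def write_heightmap_stl(path, xs, ys, heights):
    """xs, ys: grid coordinates from the scan; heights[j][i] is the thickness at (xs[i], ys[j])."""
    def facet(p1, p2, p3):
        lines = ["  facet normal 0 0 0", "    outer loop"]
        for px, py, pz in (p1, p2, p3):
            lines.append(f"      vertex {px:.3f} {py:.3f} {pz:.3f}")
        lines += ["    endloop", "  endfacet"]
        return "\n".join(lines)

    facets = []
    for j in range(len(ys) - 1):
        for i in range(len(xs) - 1):
            # Four corners of this grid cell, split into two triangles.
            a = (xs[i],     ys[j],     heights[j][i])
            b = (xs[i + 1], ys[j],     heights[j][i + 1])
            c = (xs[i + 1], ys[j + 1], heights[j + 1][i + 1])
            d = (xs[i],     ys[j + 1], heights[j + 1][i])
            facets.append(facet(a, b, c))
            facets.append(facet(a, c, d))

    with open(path, "w") as f:
        f.write("solid scan\n" + "\n".join(facets) + "\nendsolid scan\n")
```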
The resulting STL could then be used to build the project on in Fusion 360 or wherever, so the movements can be non-planar and follow the object.
Or this could just be rambling and not really viable; it’s early morning and I’m still tired.
EDIT: Judging by how the toolhead moves from measuring to setting the origin, there’s an X70, Y-8 difference.
Red measuring laser: (photo)
Blue diode laser: (photo)