Working camera capture with Lightburn

Hello @Slynold & @Mayco

Thank you for your super great implementation of the Snapmaker Camera with Lightburn under Windows. This takes a lot of us to a much higher level.

My question is, do you have a solution for Mac and Linux users?

I use macOS and Ubuntu Studio for my work, and LightBurn also runs on both. Thanks to @Skreelink's guide (Accurate, Repeatable Laser Guide) it now runs even better. Maybe I'm not knowledgeable enough in all distributions, but I think I can't run a .bat on Linux. Or can I? I already tried WINE, but it didn't really get me drunk :wink:
Do you have a hint for a solution for us windowless users?

Anyway, your guide is awesome
Jork

@Codeck42 Great, enjoy!

@jork It would be possible to port the application to Linux and macOS, but I would have to figure out what LightBurn uses in place of DirectShow (maybe you know, or @Slynold knows?). This may not work properly when using Wine, as it relies on Windows SDK functionality.

This is amazing. Thanks for working it all out.

Do you find its ability to translate the picture to coordinates is better? That was my only issue with Luban.

Do you mean the accuracy of a job set up using the camera? If so, yes, it’s much better than Luban once you’ve gone through the calibration procedure outlined in the OP.

Hey jork,

my first solution, before @Mayco implemented the LightBurn Host, was OBS Studio, which in its latest versions ships with a virtual webcam plugin. I used an older version because we need a DirectShow webcam in Windows, and an older plugin provided this.

Just take a look at version 4 of my first post (use the edit marker on the top right of it)

OBS is available for Windows, Linux, and macOS, so you can use this on your workstations!

Another possibility, which needs some programming, would be to use v4l2loopback (GitHub - umlaeute/v4l2loopback: v4l2-loopback device) as a virtual camera, and some Python code with pyvirtualcam as a feeder (pyvirtualcam · PyPI).

At the moment I do not have time to implement this, so I would recommend sticking to OBS Studio. In fact, it should only take a few basic lines of code, but right now finishing my toddler's room has priority.
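
For anyone who wants to try that route, here is a rough, untested sketch of what such a feeder could look like. The snapshot URL is only a placeholder (not a real endpoint), and the v4l2loopback module must already be loaded:

```python
# Untested sketch: poll still images from the toolhead camera and push them
# into a v4l2loopback virtual webcam via pyvirtualcam, so LightBurn on Linux
# can select it as a camera source.
import cv2            # pip install opencv-python
import numpy as np
import pyvirtualcam   # pip install pyvirtualcam (needs v4l2loopback on Linux)
import requests       # pip install requests

SNAPSHOT_URL = "http://<printer-ip>/camera/snapshot"  # placeholder, replace with your camera endpoint
WIDTH, HEIGHT, FPS = 1280, 720, 1                     # stills only, so 1 fps is plenty

with pyvirtualcam.Camera(width=WIDTH, height=HEIGHT, fps=FPS) as cam:
    print(f"Virtual camera running as {cam.device}")
    while True:
        resp = requests.get(SNAPSHOT_URL, timeout=10)
        frame = cv2.imdecode(np.frombuffer(resp.content, np.uint8), cv2.IMREAD_COLOR)
        if frame is None:
            continue  # skip frames that fail to decode
        frame = cv2.resize(frame, (WIDTH, HEIGHT))
        cam.send(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))  # pyvirtualcam expects RGB
        cam.sleep_until_next_frame()
```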


Hello @Slynold
Thank you very much for your answer and the time you spent on it. I also came across OBS Studio over the holidays. I’m just getting my head around it.

By the way, your reference to version 4 of your first post is doubly great. Firstly, I didn't know about this edit function with revision history, and secondly, your thread has brought me a lot further.

Thank you, and have fun with the renovation and your family.

Thank you for this solution. One hang-up I'm having is that when I hit Enter to capture a new image, the 10W laser toolhead moves to the maximum X and Y and almost the highest Z (329, 347, 323.50 in my case). Therefore, each capture only covers part of the bed and also does not permit a valid camera calibration in LightBurn. Luban takes photos near the center of the bed as expected, so I'm hoping to get closer to center with this too.

My question is whether there is a settings file or command-line option to set the default position for the toolhead when Enter is pressed to get a new image.

Thanks!

I think a settings file is a good idea, as the current default setting is only valid for the A350.

Related to your problem: when I was setting up Lightburn I experienced the same issue sporadically. I managed to fix it by homing and setting the system to machine coordinates I believe. Not near my PC at the moment so I’ll double check and update this post with the exact commands when I get the chance.

I should have mentioned that my machine is the Snapmaker A350 so it’s odd that it moves to the upper corner if those are the defaults.

I’ve gone through many settings in Lightburn but none had any effect on the image sources since they’re controlled from the command line.

Appreciate the support!

Indeed, I understood you have an A350; I have experienced the same issue on my A350. What solved it for me was ensuring that the coordinate system was set to workspace coordinates. Before I start camera capture, I make sure to home (G28) and switch to workspace coordinates (G54). I have the same in my “End GCode” so that the machine is guaranteed to be in workspace coordinates when the job is finished.
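
For reference, those two commands in an “End GCode” block would simply be:

```gcode
; leave the machine homed and in workspace coordinates after every job
G28 ; home all axes
G54 ; select the workspace coordinate system
```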

However, if you’d like to go a more customized route, I have released a new version of the tool that can be configured using a config file: Release v1.0.3 · PolymerPrints/SnapmakerLightBurnHost · GitHub. You have to run the tool once for the config file to be generated, then you are free to edit. Note that the config file is only read on startup, so the tool needs to be restarted for the changes to take effect.
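
As an illustration only (I'm guessing at the exact key names, so check the file the tool actually generates before editing), the capture position entries might look something along these lines, with values adjusted for your machine and toolhead:

```json
{
  "X": 160.0,
  "Y": 120.0,
  "Z": 175.0
}
```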


Just tested this out and it’s working perfectly. You were right about the homing plus G54.

I was able to dial in the settings by moving it 5 mm or so at a time, and now it aligns the camera directly above the center of the bed. This should also help anyone with a 150 or 250 align theirs (assuming a 10W laser, obviously).

Thanks!


Hi everybody, I am quite amazed and impressed to see a solution for working with a camera and LightBurn. Unfortunately I am not able to install the softcam. I think I ran the softcam .bat, but I was not able to build a “project” or find the binary. I would be very happy if anybody can give me a hint on how to do it. Thanks a lot and best regards.

For softcam: completely extract the .zip file, then double click “RegisterSoftcam.bat”. After that, extract “Release.zip” (also found on Release v1.0.3 · PolymerPrints/SnapmakerLightBurnHost · GitHub) and run “SnapmakerLightburnHost.exe”. If this does not work, please let me know (make sure to share any errors that occur).

Hi Mayco, thanks a lot, now it is running fine. Thanks for your quick response and even more for the original program. And of course thanks also to @Slynold for the tutorial. I love the Snapmaker hardware, but I changed the slicer software long ago. And from now on I can change the laser software as well.


Hello.
Please help me out, as the calibration is not working.
Which type of lens should I choose in Lens Calibration?
Standard lens or fisheye lens?

When I choose the standard lens, the image looks like this: the picture on the right side looks as if it was taken with a fisheye lens.

With the fisheye lens, the picture is squashed vertically.


Is there a mistake in the setting somewhere?
Thank you in advance for your help.

Fisheye lens is the correct setting. The image on the right may look a bit morphed, but as long as the score is good you should be able to complete the calibration successfully. If not, please let me know.

Should SnapmakerLightBurnHost also work with Artisan?
Because for me it doesn't. It seems the whole communication via curl is not working with the Artisan.

Sorry, I don’t have an Artisan so I have not tried it. It’s possible that it uses a different API and thus will not work. If I can get my hands on the API documentation for Artisan I can add support.


Among the coordinates set in the config.json file, the Y coordinate is not being applied correctly. When I check on the touch panel, the X and Z coordinates match the values in config.json, but the Y axis always moves to position 5.0. I'm not sure what could be causing this. Can you help me understand the reason?

Are the coordinates specified in Config.json relative to the work origin rather than the machine origin? I believe an incorrect work origin is causing the head and bed positions to be misaligned. After powering on my SM2, when I move to the control screen using the touch panel, it prompts me to perform the homing operation. When homing completes, the touch panel displays the work coordinates as X = 0 mm, Y = 260 mm, Z = 0 mm. Then, when I send the capture command (Enter key) in LightBurn Host, the X and Y axes move, but the Z axis cannot move because it is at the 0 mm position. In the Config.json file, I have specified the settings for the 250 model as X: 160.0, Y: 120.0, Z: 175.0. I believe there might be some missing steps or settings related to setting the work origin. Can you help me identify them?