Result (seam displacement) pose and \SearchStop robtarget in OptSearch_1D
Hi,
I am using OptSearch_1D for seam tracking with a Servo Robot powercam.
I would like to understand two things:
A. The returned seam displacement pose (called 'Result'). With respect to which coordinate system is this pose expressed, and what is it meant to be used for?
B. The \SearchStop robtarget. I understand that it finds the beginning of the seam, and I am successfully using it to find the start of the seam for ArcLStart with preprocesstracking. However, I'd like to understand how the orientation of that robtarget is deduced. Is it an interpolation between the search positions?
The problem is that this returned robtarget sometimes has the wrong orientation when I move my workpiece. (My workpiece is a fillet joint, flat and parallel to the robot base, so it can rotate about Z and translate in X, Y and Z.) My nozzle can then hit one of the plates: although the position of the torch is correct, the orientation is not exactly 45 degrees.
How can I programmatically calculate my end position from this robtarget (the start of the seam), the length of the workpiece, and the sensor? I was actually expecting the sensor to be able to return a direction vector for the seam.
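In other words, the calculation I am after is essentially: end point = start point + length × unit direction along the seam. A rough Python sketch of just that math (all names here are mine for illustration; the missing piece is that the sensor does not seem to return `seam_dir`):

```python
def seam_end_point(p_start, seam_dir, seam_length):
    """Displace the searched start point by seam_length along the unit vector seam_dir."""
    return tuple(p + seam_length * d for p, d in zip(p_start, seam_dir))

# Example: a 250 mm seam running along base-frame X (hypothetical values)
p_end = seam_end_point((800.0, 50.0, 120.0), (1.0, 0.0, 0.0), 250.0)
print(p_end)  # (1050.0, 50.0, 120.0)
```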
Any help is appreciated.
Answers
Are you doing the search, displacing the seam, then welding (no active tracking while welding)?
Hi,
I am trying to search, find the start of my workpiece and the seam, then weld with active tracking.
Basically, all I know is the length of the workpiece. I want to programmatically find the start and end points of the weld (the endpoints of a vector in 3D space), so that the next workpiece can be placed in an approximate position (within tolerance, say at most 5 degrees of rotation and a 50 mm offset, so that the search can still find the seam), yet the tracker still yields the exact same path, with the same torch orientation relative to the welding path (45 degrees).
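Concretely, since my part only rotates about base Z, what I imagine is something like the following (a rough Python sketch with made-up names, just to illustrate the math, not actual RAPID): measure the seam direction, compute the yaw change relative to the taught direction, and rotate the taught torch quaternion by the same amount so it stays 45 degrees to the path.

```python
import math

def yaw_about_z(v_from, v_to):
    """Signed angle about base Z that turns v_from onto v_to (XY components only)."""
    return math.atan2(v_to[1], v_to[0]) - math.atan2(v_from[1], v_from[0])

def quat_mul(a, b):
    """Hamilton product; quaternions as (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def rotate_tool_quat(q_nominal, nominal_dir, measured_dir):
    """Pre-multiply the taught torch orientation by the yaw the part has turned."""
    ang = yaw_about_z(nominal_dir, measured_dir)
    q_z = (math.cos(ang / 2.0), 0.0, 0.0, math.sin(ang / 2.0))
    return quat_mul(q_z, q_nominal)

# Part rotated 5 degrees about Z: the torch orientation turns with it.
q = rotate_tool_quat((1.0, 0.0, 0.0, 0.0), (1, 0, 0),
                     (math.cos(math.radians(5)), math.sin(math.radians(5)), 0))
# q is approximately (0.99905, 0, 0, 0.02181), i.e. a 5-degree yaw
```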
Isn't this how people are using seam finding in general?
Thanks
See chapter 4 in the attached manual (for RobotWare 6).
This gives examples of how to search for the two end points of a weld.
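Searching the two end points also yields the direction vector you were expecting from the sensor: subtract the searched points and normalize. A minimal Python sketch (illustrative values only, not RAPID):

```python
import math

def seam_direction(p_start, p_end):
    """Unit vector from the searched start point to the searched end point."""
    d = [e - s for s, e in zip(p_start, p_end)]
    n = math.sqrt(sum(c * c for c in d))
    return tuple(c / n for c in d)

# Two searched points 300 mm apart along base X (hypothetical values)
print(seam_direction((800.0, 50.0, 120.0), (1100.0, 50.0, 120.0)))
# (1.0, 0.0, 0.0)
```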
Hi @graemepaulin
Thanks for your really useful link! A lot of info in there that is missing in the optical tracking manual.
So what I had not understood so far is that the search is only in one dimension and is used only to find the displacement.
This seems not to be the best approach when using a 3D laser camera like the powercam... I should be able to deduce the 3D location and direction of the weld with just one pass, shouldn't I? (The image contains not just a point but the reference side walls as well.) To get the 6DOF position of the part with the powercam, is the idea really to search from three different sides, as if it were a dot laser?
Isn't there a similar manual for optical seam tracking?
Thanks
I have messaged you with the optical tracking manual as it is not a public manual.
You need one of the following RobotWare options to be able to use the functionality:
- Sensor Interface
- Optical Tracking CAP
- Optical Tracking Arc
- Externally Guided Motion (EGM)
Hi again,
That's the manual I had been using, and we have sensor interface and optical tracking arc options installed.
I am just trying to wrap my head around what can and cannot be done when searching.
Does this mean that the only motion that the workpiece is expected to do is a translation (x,y,z) but to keep its original rotation?
I expected the search to fully adapt the path with a 6DOF offset of the start and end points of the workpiece.
Please share some more information about your sensor - I failed to find any hits with a Google search on "powercam 3D laser camera".
It's the predecessor of this one https://servo-robot.com/power-tracshr/ but the image acquisition principle is the same, I think: laser line triangulation.