
Coordinated work object definition

Hi people.

I have two robots here, one in front of the other, and they should work synchronized and coordinated. The main robot (R1) has an emitter installed on its tool. The second one (R2) has a receiver on its tool. This pair is used to inspect parts, and to do so they move along the desired area with R2 always following R1, "looking into its eyes". I have been doing this by configuring R1's and R2's tools, wobjects and base frames using nominal values from the drawings. For most of the parts, where the curvature is slight, this method achieved good results.
Now I'm dealing with a section of a spherical shell, something like those Chinese hats. With this part the tool is constantly being reoriented to stay orthogonal to the surface. It is impossible to build the real working cell with exactly the dimensions it was designed with, and these small differences (0.5 mm in the robot mounting pad, 0.2 mm in the tool, and so on) add up and result in a misalignment between R1's tool and R2's tool. During the inspection path you can see the angle between R1's and R2's tools change as the tools move, which attenuates the measurement. Sometimes there is something like a 3 mm offset between R1's TCP and R2's.

So, I was wondering if there is a method to define the coordinated work object as simply as the 3-point method used to define a traditional work object. With that I could calibrate the whole working cell's frames with their real dimensions and positions.

Thank you for any help.
Leo


Comments

  • osku
    Hi Leo
     

    Have you seen the Application Manual - MultiMove? It describes how to do a relative calibration of two robots (Calibration - Relative calibration). The accuracy might improve if you do the calibration with 10 targets. Don't forget to have really good TCPs during the calibration.

     

    -Osku
  • Well, I could solve the problem by myself. Here is what I did:
    1) Use a sharp-pointed, well-calibrated tool on the master robot (R1).
    2) Define a certain point in space, e.g. P0.
    3) From P0, mark a point PX, as far as practical from P0, along the X axis of R1's tool coordinates.
    4) From P0, mark a point PY, as far as practical from P0, along the Y axis of R1's tool coordinates.
    5) Equip the coordinated slave robot (R2) with another sharp-pointed tool and calibrate it.
    6) Create a new wobj in R2. This will be the coordinated wobj.
    7) In the R2 wobj definition, set ufprog to "FALSE" and update ufmec with the main robot's name, e.g. "Rob1" (see the RAPID sketch after this list).

    8) Jog R2 to P0. Make sure both tool tips touch.
    9) Start the wobj definition routine. Note that the user frame points should be unavailable, because those points are dynamically defined from R1's flange position.
    10) Mark P0 as X1, jog to PX and mark it as X2, then repeat for PY as Y1.
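
    For reference, this is roughly how the resulting coordinated work object looks as RAPID data in R2's task. This is only a sketch: wobj_coord and the mechanical unit name "ROB_1" are example names (use whatever your system calls the master robot), and the frame values shown are just the defaults before the definition routine fills them in.

        ! Coordinated work object in R2's task (example names).
        ! robhold = FALSE  : R2 does not hold the work object.
        ! ufprog  = FALSE  : the user frame is not programmed; the controller
        !                    updates it continuously from the coordinated unit.
        ! ufmec   = "ROB_1": R2's movements in this wobj follow R1's motion.
        ! The oframe part is what the 3-point routine (X1, X2, Y1) fills in.
        PERS wobjdata wobj_coord := [FALSE, FALSE, "ROB_1",
                                     [[0, 0, 0], [1, 0, 0, 0]],
                                     [[0, 0, 0], [1, 0, 0, 0]]];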

    Now the coordinated wobj is well defined, but the base frame of R2 may also be misdefined. That was my case. I needed to recalibrate R2's base frame, and the procedure is not complicated if the robots share a common working area, as described below:
    1) Use the sharp-pointed, calibrated tools as before.
    2) Select the slave robot to calibrate its base frame: Calibration:Rob2:Base Frame.
    3) Jog the robots to at least 3 points (5 or more recommended), keeping both TCPs in contact at each point.
    4) Use non-coplanar and non-collinear points: they constrain the frame in all directions and improve the numerical conditioning of the calibration.

    After calibrating the base frame, the misalignment was gone and all moves, linear or reorientation, ran perfectly.

    Best regards
    Leo
  • How often do you have to calibrate the robots? Or how often do you want to calibrate them? Every part?

    If the part is circular, I would use a SearchC command in conjunction with a touch probe, like those offered by Renishaw. The Renishaw touch probes have great accuracy and are reliable to roughly 1.0 µm, which would be more than enough accuracy.

    With SearchL or SearchC commands you could detect the parts themselves, or detect a "calibration" point to help define a common world coordinate system for the robots.
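
    A minimal sketch of such a probe search in RAPID, assuming a digital input diProbe wired to the probe and example names for the targets, tool and work object (none of these come from the original posts):

        ! Move toward pEnd and stop when diProbe is set; the position at the
        ! trigger is stored in pHit and can be reused to (re)define a frame.
        VAR robtarget pHit;
        SearchL \Stop, diProbe, pHit, pEnd, v50, tProbe \WObj:=wobjPart;

    Three such searches on known features would give the points needed to redefine a work object (for example with the DefFrame function) or to establish a common reference for both robots.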


  • Dear USLuser1,
    Thank you for your tip. The base frame calibration must be done every time the robots are physically moved, which for most people means only a few times ever. In fact, when I wrote this post I was wondering if there was a method, just like the wobj calibration, that could be applied to the base frame. The method I found is the one described in my last post. It is simple enough (just marking some points) and it proved to work, but it is mandatory to use well-calibrated tools.

    The work object calibration is done every time the part is moved or changed, so the robot knows where the origin is in each situation.

    The program for this spherical shell, and for the other parts I run here, was created with the Mastercam and Robotmaster software. With these applications I program the robot path from the CAD model of the part, so when I want to run it I must re-establish the wobj so that both coordinate systems coincide. This can be done without a touch probe, but of course there is the software cost of building the path.

    Especially in this case, the problem was not "finding" the part but aligning the two robots that work on it. With the base frame well calibrated, the secondary robot knows exactly where it is relative to the main robot, so there is no longer any gap or loss of positioning while the robots move and reorient in a coordinated way (see the short RAPID sketch at the end of this reply).

    Best regards
    Leo
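
    As a follow-up to the coordination point above: once the base frame and the coordinated wobj are calibrated, the synchronized moves in R2's task look roughly like this. It is only a sketch with example names (task_list, tReceiver, pInsp10, etc.), not my actual program; R1's task must contain matching SyncMoveOn/SyncMoveOff instructions and moves with the same \ID numbers.

        ! R2 task: enter synchronized mode with R1 and move in the coordinated
        ! work object, so every target is expressed relative to R1's motion.
        PERS tasks task_list{2} := [["T_ROB1"], ["T_ROB2"]];
        VAR syncident syncOn;
        VAR syncident syncOff;

        PROC InspectCoordinated()
            SyncMoveOn syncOn, task_list;
            MoveL pInsp10 \ID:=10, v100, z1, tReceiver \WObj:=wobj_coord;
            MoveL pInsp20 \ID:=20, v100, fine, tReceiver \WObj:=wobj_coord;
            SyncMoveOff syncOff;
        ENDPROC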