A six-axis Force/Torque (F/T) sensor measures the forces and torques along all six degrees of freedom of a robot. It can be integrated with a robot to perform tasks such as product testing, robotic assembly, grinding and polishing. In research, these sensors are used in robotic surgery, haptics, rehabilitation, neurology and many other applications. ATI's F/T sensors support multiple system interface options such as Ethernet (TCP or UDP), EtherNet/IP, DeviceNet, CAN bus and EtherCAT.
In this MecaNetwork example, we discuss the integration of ATI's Mini40 Net F/T sensor with the Meca500 using Python. We take advantage of the Meca500's velocity-control move APIs, combined with force sensor feedback, to create basic and advanced force control logic that you can use to build your own force control applications.
ATI Mini40 Net F/T sensor
M12 D-coded to RJ45 cable
Net F/T sensor power supply
Windows or Linux PC with Python 3.6 or above, with PyQt5 and pyqtgraph installed
NetFT Sensor Python Class:
To interface with the sensor, we have created a simple Python class that connects to the sensor and streams its values over UDP. UDP was chosen for its simplicity and lower processing overhead compared to the other communication interfaces. Depending on your application's requirements, you could choose a different interface option. The complete code for the NetFT Sensor class is available in the attached files as NetFT.py.
import socket

class Sensor:
    '''The class interface for an ATI Force/Torque sensor.

    This class contains all the functions necessary to communicate with
    an ATI Force/Torque sensor with a Net F/T interface using RDT.
    '''

    def __init__(self, ip):
        '''Start the sensor interface.

        This function initializes the class and opens the socket for the sensor.

        Args:
            ip (str): The IP address of the Net F/T box.
        '''
        self.ip = ip
        self.port = 49152
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self.mean = [0] * 6
        self.stream = False
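To illustrate what happens on the wire, the sketch below shows how a single streaming record could be requested and decoded per ATI's Net F/T RDT protocol (an 8-byte big-endian request, a 36-byte record with three sequence/status words followed by six signed 32-bit counts). The counts-per-unit scaling factor is configuration dependent; the value used here is only a placeholder, and the function names are ours, not part of the attached NetFT.py.

```python
import socket
import struct

# RDT request: command header 0x1234, command code, sample count (big-endian)
RDT_REQUEST = struct.Struct('>HHI')
# RDT record: rdt_sequence, ft_sequence, status, then Fx,Fy,Fz,Tx,Ty,Tz as int32 counts
RDT_RECORD = struct.Struct('>IIIiiiiii')

def parse_rdt_record(data, counts_per_unit=1000000):
    '''Convert one 36-byte RDT record into six float F/T values.
    counts_per_unit depends on the sensor calibration; 1000000 is a placeholder.'''
    fields = RDT_RECORD.unpack_from(data)
    return [c / counts_per_unit for c in fields[3:9]]

def read_single_sample(ip, port=49152):
    '''Request a single streaming record from the Net F/T box.'''
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(1.0)
    # command 2 = start high-speed streaming; sample count 1 = send one record only
    sock.sendto(RDT_REQUEST.pack(0x1234, 2, 1), (ip, port))
    data, _ = sock.recvfrom(1024)
    sock.close()
    return parse_rdt_record(data)
```

The NetFT class in the attachment wraps this same request/parse cycle in a continuous streaming loop.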
MecademicRobot Python Class:
Mecademic robots can be controlled via our Python API, available on our GitHub page. The Python API has classes for both robot control on port 10000 and feedback on port 10001. This integration example uses both ports.
Force/Torque Sensor setup:
Setting up the Force/Torque sensor is outside the scope of this MecaNetwork article. Please refer to ATI's Quick Start guide for detailed instructions on setting the IP address, biasing the output data and setting the TCP of the sensor.
Velocity Control Mode for Meca500:
From firmware 8.1.5 onwards, the Meca500 supports a velocity control mode, which opens up the possibility of implementing complex applications such as force control and online path correction. The jog panel and joystick jog of the Meca500 are based on the velocity control API. A significant advantage of the velocity control APIs is that the commands are not buffered like position control commands, and the trajectory generator of the Meca500 controller does not go into an error state if a move causes the robot to reach a joint limit or a singularity. Dynamic path planning can be accomplished by modifying the velocity arguments.
More information is available in Section 2.2.5, Position and Velocity Modes, of the Programming Manual. For the rest of this article, it is important to understand how the velocity mode works on the Meca500, as it is the foundational concept on which the force control implementations are based.
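The velocity mode boils down to streaming velocity commands faster than the velocity timeout expires. The sketch below shows that pattern using the same `_send` helper of the MecademicRobot control class that appears in the examples in this article; the function name and the 50 ms timeout are illustrative choices, not part of the API.

```python
import time

def creep_forward(robot, vel_z=10, duration=1.0):
    '''Stream TRF velocity commands for `duration` seconds.
    `robot` is assumed to be a connected MecademicRobot control object.'''
    # the robot stops on its own if no new velocity command arrives within 50 ms
    robot._send('SetVelTimeout(0.05)')
    t_end = time.time() + duration
    while time.time() < t_end:
        # velocity commands are not buffered; each call simply updates the target
        robot._send('MoveLinVelTRF(0,0,%s,0,0,0)' % vel_z)
        time.sleep(0.01)
    robot._send('MoveLinVelTRF(0,0,0,0,0,0)')  # explicit stop at the end
```

Because each command overrides the previous one, the loop body is also the natural place to recompute the velocity from sensor feedback, which is exactly what the force control examples below do.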
Application Example #1: Force controlled robot move
In this implementation, the robot moves towards a target pose either
1. until the target force is met, or
2. as long as the measured force is above or below a certain threshold.
Different scenarios could serve as an example here. In the first mode, consider an application where the robot needs to move towards a surface, say an oscillating grinding wheel for a polishing application. While a predefined target pose can be used, consider an application where the approach pose orientation is selected dynamically based on input from a vision system. In that case, it is not easy to accurately define the target pose at which grinding should start. Instead, the force sensor's feedback can be used to guide the robot along the Z axis of the TRF at a certain velocity until the force measured by the sensor is equal to or greater than a preset threshold, which indicates contact between the tooling and the grinding surface.
A simplistic implementation would follow the algorithm below:
while (ForceSensor_Z_axis < ThresholdForce) and (RobotPoseFeedback_Z < TargetPose_Z):
    MoveLinearInVelMode(0, 0, x, 0, 0, 0)
In Python, the implementation looks like this. The complete code is available as Polishing.py in the attached file.
def move_to_grindwheel(self, vel=25, force=-5):
    '''Starts the jog move towards the grindwheel at the specified vel
    along the Z axis of the TRF until the specified target force is
    detected by the sensor. The target pose's Z axis value is used as
    a backup condition to stop moving further. If no value is specified,
    the defaults are 25 mm/s and -5 N on the Z axis.
    '''
    print('Moving towards the Grindwheel...')
    self.force_threshold = force
    self.approach_vel = vel
    while True:
        # keep moving while the tool is above Z = 100 mm and the measured
        # Z force has not yet reached the (negative) threshold
        if (self.robot_pose.cartesian[2] > 100) and (self.sensor.value[2] > self.force_threshold):
            self.robot._send('MoveLinVelTRF(0,0,' + str(self.approach_vel) + ',0,0,0)')
            time.sleep(0.01)
        else:
            print('Move Completed')
            break
The second mode is a slight variant of the first: a certain sequence is executed continuously on a separate thread as long as the measured force is less than the target value. If the measured force exceeds the threshold, the application can be paused or aborted, depending on the application requirements. Such an implementation was used in this demo for testing automotive displays, where the force sensor reading was used to monitor forces during button press and touch screen probing events. The complete code is available in the attachment TouchScreenTest.zip.
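A minimal sketch of that second mode is a watchdog thread running alongside the test sequence. The sketch below assumes, as in the other examples, that `sensor.value` holds the latest [Fx..Tz] reading and that the robot object exposes `_send`; the threshold and period are illustrative. `PauseMotion` is a standard Meca500 command.

```python
import threading
import time

def monitor_force(robot, sensor, threshold=5.0, period=0.01):
    '''Pause robot motion if |Fz| exceeds the threshold.
    Returns an Event the caller can set to end monitoring.'''
    stop = threading.Event()

    def watch():
        while not stop.is_set():
            if abs(sensor.value[2]) > threshold:
                robot._send('PauseMotion')  # halt the running test sequence
                stop.set()
            time.sleep(period)

    threading.Thread(target=watch, daemon=True).start()
    return stop
```

Whether the exceedance should pause, abort, or merely flag the sequence is an application decision; here it simply pauses motion and stops monitoring.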
Application Example #2: Force guided robot jog
In this application, we map the normalized force sensor reading (-1 to +1) measured at the TCP to the arguments of a MoveLinVelTRF command. A continuously executing QThread then keeps sending the target velocity command within the SetVelTimeout period. With the velocity mode offering dynamic control, the robot can be guided by applying force at the TCP.
class ForceGuidedJog(QObject):
    completed = pyqtSignal()

    def __init__(self, robot, sensor, parent=None):
        QObject.__init__(self, parent=parent)
        self.continue_run = True
        self.sensor = sensor
        self.r = robot
        self.r.SetTRF(0, 0, 90, 0, 0, -45)
        self.r.ResumeMotion()

    def jog(self):
        maxVel = 250
        while self.continue_run:
            # scale the normalized sensor reading to a TRF velocity command
            cmd = 'MoveLinVelTRF(' + ','.join(format(i * maxVel, ".3f") for i in self.sensor.norm_value) + ')'
            self.r._send(cmd)
            if self.r.is_in_error():
                self.r.ResetError()
                self.r.ResumeMotion()
            time.sleep(0.001)
        self.completed.emit()

    def stop(self):
        self.continue_run = False
        # self.sensor.stop()
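The `norm_value` used in the jog loop above is assumed to be the raw sensor reading scaled into the -1 to +1 range. One possible way to compute it is sketched below; the full-scale values and the deadband (which keeps sensor noise from producing a slow drift of the robot) are illustrative numbers, not values from the attached code.

```python
def normalize(reading, full_scale=(10.0, 10.0, 10.0, 1.0, 1.0, 1.0), deadband=0.1):
    '''Map a raw [Fx, Fy, Fz, Tx, Ty, Tz] reading to the -1..+1 range
    used to scale MoveLinVelTRF arguments. Values are illustrative.'''
    out = []
    for v, fs in zip(reading, full_scale):
        n = max(-1.0, min(1.0, v / fs))       # clip at the chosen full scale
        out.append(0.0 if abs(n) < deadband else n)  # suppress sensor noise
    return out
```

With a deadband in place, the robot stays still until the operator applies a deliberate force at the TCP.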
To handle this parallel execution, we use PyQt5's QThread and QObject classes. ForceGuidedJog, subclassed from QObject, contains the method that jogs the robot using the force sensor input. During execution, this object instance is moved to a QThread instance, and upon completion the thread and the object are deleted to exit gracefully.
def start_jog_thread(self):
    self.thread1 = QThread()
    self.force_guide_jog = ForceGuidedJog(self.robot, self.sensor)
    self.ForceJogStop.connect(self.force_guide_jog.stop)
    self.force_guide_jog.moveToThread(self.thread1)
    # clean up the worker and the thread once the jog loop exits
    self.force_guide_jog.completed.connect(self.thread1.quit)
    self.force_guide_jog.completed.connect(self.force_guide_jog.deleteLater)
    self.thread1.finished.connect(self.thread1.deleteLater)
    self.thread1.started.connect(self.force_guide_jog.jog)
    self.thread1.finished.connect(self.force_guide_jog.stop)
    self.thread1.start()
Application Example #3: Force guided robot trajectory teaching
This application extends the force-guided jog with a logging feature that records the pose feedback available on the monitoring port (10001) of the Meca500. As the robot is jogged using the force sensor input, the joint feedback is logged at regular intervals. The saved file can then be used to replay the trajectory generated by the force-guided jog. This is useful in applications where a complex trajectory needs to be taught by manually moving the robot.
class TrajRecorder(QObject):
    completed = pyqtSignal()

    def __init__(self, parent=None):
        QObject.__init__(self, parent=parent)
        self.continue_run = True
        self.fb = RobotFeedback('192.168.0.100', '8.3.5')
        self.fb.connect()
        self.fb.get_data()
        self.last_value = self.fb.joints
        # Remove the previous trajectory log
        if os.path.exists("trajectory_record.prg"):
            os.remove("trajectory_record.prg")

    def record(self):
        while self.continue_run:
            self.fb.get_data()
            self.current_value = self.fb.joints
            # log a move only when the joint feedback has actually changed
            if self.current_value != self.last_value:
                with open("trajectory_record.prg", "a") as f:
                    f.write("MoveJoints" + str(self.current_value) + '\n')
                self.last_value = self.current_value
            time.sleep(0.001)
        self.completed.emit()

    def stop(self):
        self.continue_run = False
Application Example #4: Force based measurements in Testing and Inspection Applications
In this kind of application, the force sensor feedback is not tied to the robot's movement. Instead, the force feedback at specific instants of the robot trajectory is used in testing and inspection applications: the robot performs the moves that are part of the testing sequence, and the force sensor input is used to validate the pass/fail results.
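The pass/fail criterion can be as simple as checking that the reaction force at the press instant falls inside an expected window: too little force means no contact was made, too much suggests a jammed button or misalignment. A sketch with illustrative thresholds (the actual window depends on the part under test):

```python
def classify_press(force_z, min_force=1.0, max_force=4.0):
    '''Grade one button-press event from its measured Z force (in N).
    The force window is illustrative and part-dependent.'''
    if abs(force_z) < min_force:
        return 'FAIL: no contact'   # button missed or sensor not biased
    if abs(force_z) > max_force:
        return 'FAIL: overforce'    # jammed button or tool misalignment
    return 'PASS'
```

A test sequence would call this once per probe point and collect the results into a report.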
As shown above, these examples are only a small sample of the varied force control implementations possible with a Force/Torque sensor and the Meca500. The velocity mode of the Meca500 enables users to build much more complex applications and sophisticated force control and path correction algorithms as the application demands.
Please note that these examples are provided as-is by either Mecademic or its partners. They can be used as a starting point for development and testing, but Mecademic and its partners are under no obligation to provide support and are not liable for any errors or unintended behavior caused by the examples. These examples may be updated over time.