(6) Andrei - Reaction Force Control Test / GUI

The second task of the qualification phase requires Valkyrie to walk up to a closed door, push a button and then pass through the door. Valkyrie must therefore maintain balance both while walking and while pressing the button. For the current whole-body controller to balance the robot accurately, we need accurate estimates of the reaction forces acting on the body. My first task was to design an experimental setup that tests the simulator's (Gazebo's) ability to measure reaction forces and to provide the right control inputs.

Reaction Force Control Test

To accomplish this task, I created a simple pendulum that resembles a robotic arm with an actuated joint. The arm touches a solid box with a touch sensor attached to its surface, which measures the amount of force exerted. We emphasize that accurate reaction force data is necessary both for the safety of the robot and for the stability of the balancing process in whole-body control.

There are two main challenges here:

  1. Accurate control of joint actuation,
  2. Accurate measurement of reaction force. 

Joint control was initially accomplished through Gazebo's GUI. By directly manipulating the joint torque, I was able to apply a set force to the solid box. To simplify the procedure, the pendulum was initialized to a vertical, downward position, almost touching the box sensor. This eliminated the need to account for the force exerted by the pendulum's own mass and minimized the impulsive force at first contact with the box sensor. While this method seemed promising due to its ease of use, it left unknowns in the experimental procedure, since we could not determine exactly how force tracking is accomplished through the GUI. As a consequence, the results from the touch sensor were noisy, as seen in the figure below. Note that, in order to plot the sensor readings, I included an RQT plotter plugin (a ROS feature) in the Gazebo file, which made debugging the experimental procedure much easier.
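For reference, the following is a minimal sketch of how a contact-sensor plugin can republish its readings on a ROS topic for plotting. It is illustrative rather than our actual code (which lives in the RRBot_arm folder): it assumes a Gazebo 7-era API, that ros::init has already been called (e.g. by gazebo_ros), and a hypothetical topic name contact_force.

```cpp
// Sketch: republish Gazebo contact-sensor readings on a ROS topic so they
// can be plotted with the RQT plotter. Gazebo 7-era API assumed.
#include <cmath>
#include <functional>
#include <gazebo/gazebo.hh>
#include <gazebo/sensors/sensors.hh>
#include <ros/ros.h>
#include <std_msgs/Float64.h>

namespace gazebo
{
  class TouchSensorPlugin : public SensorPlugin
  {
    public: void Load(sensors::SensorPtr _sensor, sdf::ElementPtr /*_sdf*/)
    {
      this->sensor = std::dynamic_pointer_cast<sensors::ContactSensor>(_sensor);

      // Hypothetical topic name; assumes ros::init was already called.
      this->pub = this->nh.advertise<std_msgs::Float64>("contact_force", 10);

      // Run the callback each time the sensor produces a new reading.
      this->conn = this->sensor->ConnectUpdated(
          std::bind(&TouchSensorPlugin::OnUpdate, this));
      this->sensor->SetActive(true);
    }

    private: void OnUpdate()
    {
      // Accumulate the contact wrench over all contact points, then
      // publish the magnitude of the total contact force.
      double fx = 0.0, fy = 0.0, fz = 0.0;
      msgs::Contacts contacts = this->sensor->Contacts();
      for (int i = 0; i < contacts.contact_size(); ++i)
      {
        for (int j = 0; j < contacts.contact(i).wrench_size(); ++j)
        {
          const msgs::Wrench &w = contacts.contact(i).wrench(j).body_1_wrench();
          fx += w.force().x();
          fy += w.force().y();
          fz += w.force().z();
        }
      }
      std_msgs::Float64 msg;
      msg.data = std::sqrt(fx * fx + fy * fy + fz * fz);
      this->pub.publish(msg);
    }

    private: sensors::ContactSensorPtr sensor;
    private: event::ConnectionPtr conn;
    private: ros::NodeHandle nh;
    private: ros::Publisher pub;
  };
  GZ_REGISTER_SENSOR_PLUGIN(TouchSensorPlugin)
}
```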

One hypothesis for the noise in the sensor data is an oscillatory motion caused by the initial impulsive force applied to the solid box. To fix this problem, I applied a time-dependent torque input that transitions smoothly from 0 Nm to 10 Nm using the hyperbolic tangent function:

$$\tau(t) = 5\left[1 + \tanh\!\left(\alpha\,(t - t_0)\right)\right]\ \mathrm{Nm},$$

where $\alpha$ sets the steepness of the ramp and $t_0$ its midpoint.

The results show improved sensor measurements, with far fewer noisy data points than in the previous measurements.

Furthermore, the values closely match the theoretical value expected for this setup, obtained from the torque equilibrium equation. Since the pendulum hangs vertically, gravity produces no moment about the joint, and equilibrium about the joint axis gives

$$F = \frac{\tau}{L},$$

where $F$ is the contact force on the box, $\tau$ is the applied joint torque, and $L$ is the distance from the joint axis to the contact point; all variables are represented in the schematic below:
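For illustration (hypothetical numbers, since the actual dimensions are set in the URDF model): with the full ramp torque $\tau = 10$ Nm and a contact point $L = 1$ m from the joint, the expected steady-state contact force would be $F = 10$ N.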

These results confirm that these models and virtual sensors accurately represent the robot's motion and interaction with the environment. This is necessary for the button press, since we anticipated needing to incorporate this reaction force into our whole-body controller.

In terms of implementation, the whole model was built in standard URDF format, with two plugins: a Gazebo plugin for the touch sensor and a ROS plugin for torque control. These plugins had to be integrated into the simulation along with the physics model so that they all run on the same simulation clock. The challenge was to allow the ROS plugin to connect in real time (or rather, simulation time) with the physics simulator, and I had to learn a lot about Gazebo's code structure to integrate the ROS plugin with it. The solution was to register an update event inside the physics model, which lets the simulator update the controller plugin in step with the world simulation. Details can be found in the code in the RRBot_arm folder of our Bitbucket repository.
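As a sketch of this integration (not our exact implementation; the joint name and ramp parameters below are illustrative, and a Gazebo 7-era API is assumed), a model plugin can subscribe to the world update event so that its control law is evaluated once per physics step:

```cpp
// Sketch: a model plugin whose controller runs in lockstep with the
// physics engine by subscribing to the world update event.
#include <cmath>
#include <functional>
#include <gazebo/gazebo.hh>
#include <gazebo/common/common.hh>
#include <gazebo/physics/physics.hh>

namespace gazebo
{
  class TorqueRampPlugin : public ModelPlugin
  {
    public: void Load(physics::ModelPtr _model, sdf::ElementPtr /*_sdf*/)
    {
      this->joint = _model->GetJoint("pendulum_joint");  // illustrative name

      // Called once per simulation step, before the physics update, so the
      // controller and the world advance on the same simulation clock.
      this->conn = event::Events::ConnectWorldUpdateBegin(
          std::bind(&TorqueRampPlugin::OnUpdate, this,
                    std::placeholders::_1));
    }

    private: void OnUpdate(const common::UpdateInfo &_info)
    {
      // Smooth tanh ramp from 0 Nm to 10 Nm; alpha = 2 and t0 = 3 s are
      // illustrative values, not the ones used in the experiment.
      const double t = _info.simTime.Double();
      const double torque = 5.0 * (1.0 + std::tanh(2.0 * (t - 3.0)));
      this->joint->SetForce(0, torque);  // actuate axis 0 of the joint
    }

    private: physics::JointPtr joint;
    private: event::ConnectionPtr conn;
  };
  GZ_REGISTER_MODEL_PLUGIN(TorqueRampPlugin)
}
```

Subscribing to the world update event, rather than running the controller in a separate thread, is what ties the control law to simulation time instead of wall-clock time.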

Robot Command GUI

Next, I developed a simple programmatic GUI that connects to ROS, using callback functions provided by Marcelino. The GUI is designed for simple waypoint navigation and task management. It consists of a waypoint table describing the locations the robot has to reach, and communication buttons that can send task data to Valkyrie, stop Valkyrie, or shut down communication entirely by killing the corresponding ROS node. It is worth noting that the GUI was created programmatically rather than with Matlab's GUIDE tool, which is usually easier and faster to use, because the GUI was meant to scale to incorporate new functionality after the qualification period. It was also written in an object-oriented style so that the supplied software components can easily be adapted and modified in the future.

The following figure illustrates the GUI design; the code can be found on Bitbucket, under the GUI folder.

Lastly, I started working on a joint position controller for the hands and the Hokuyo sensor, until NASA updated the competition rules and required us to use their controllers.

Future Work

To improve on the current state of the qualification work, I aim to work with other team members on making the trial and testing procedure more automated. Once this is accomplished, we could introduce a reinforcement learning method to find the best parameterization for the qualification tasks.