Point cloud and 3D reconstruction

h3. 2. Point clouds

To compute point clouds from the robot head, invoke the following commands:

<pre>
roslaunch rh_cameras RHcam_stereo.launch
rosrun rh_stereo_proc RHpoint_cloud
rosrun rh_stereo_proc RHstereomatcher.py
</pre>

Once the above nodes are running, capture a set of stereo pairs as follows:

<pre>
rostopic pub -1 /RH/cmd/acquire rh_cameras/CamerasSync '{timeStamp: now, data: full}'
</pre>
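
The same trigger can also be published from a node. Below is a minimal rospy sketch; it assumes the rh_cameras/CamerasSync message carries the timeStamp and data fields shown in the rostopic call above, so check the actual message definition before relying on these names.

<pre>
#!/usr/bin/env python
# Minimal sketch: trigger a stereo-pair acquisition programmatically.
# The field names (timeStamp, data) are taken from the rostopic call
# above and may differ in the actual CamerasSync message definition.
import rospy
from rh_cameras.msg import CamerasSync

rospy.init_node('rh_acquire_trigger')
pub = rospy.Publisher('/RH/cmd/acquire', CamerasSync)
rospy.sleep(1.0)  # allow the publisher to register with subscribers

msg = CamerasSync()
msg.timeStamp = rospy.Time.now()
msg.data = 'full'
pub.publish(msg)
</pre>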

Note 1: the current version of RHstereomatcher.py takes around 15-20 minutes to compute the disparity images. A much faster version will be released soon.

Note 2: At the moment, RHdynamicCalibration saves the point cloud to the directory from which it was invoked. To visualise the computed point cloud, type in a terminal:

<pre>
pcd_viewer test_pcd.pcd -ax 0.5
</pre>
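
If pcd_viewer is not available, the saved file can still be sanity-checked by parsing its header, which is plain text even when the point data is stored in binary. A dependency-free Python sketch:

<pre>
# Minimal sketch: print the header of the saved PCD file to check its
# declared fields and point count.
def read_pcd_header(path):
    header = {}
    with open(path, 'rb') as f:
        for raw in f:
            line = raw.decode('ascii', 'ignore').strip()
            if not line or line.startswith('#'):
                continue
            key, _, value = line.partition(' ')
            header[key] = value
            if key == 'DATA':  # the header ends at the DATA line
                break
    return header

header = read_pcd_header('test_pcd.pcd')
print('fields: %s' % header.get('FIELDS'))
print('points: %s' % header.get('POINTS'))
</pre>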

h4. 2.1 Simulation mode

To test the point cloud computation, it is possible to read the disparity maps from disk instead of computing them with RHstereomatcher. To do this, the precomputed disparity maps (e.g. fMatch0.txt, fMatch1.txt and fMatch2.txt) must be placed at:

<pre>
../rh_stereo_proc/C3D/
</pre>

These disparity maps can be obtained by invoking rosrun rh_stereo_proc RHstereomatcher.py as described in the Point clouds section. Then, rh_stereo_proc and rh_calibration must be set to run in debug mode. To do so, edit the "Input" and "Debug_Mode" entries in rh_calibration/src/calibration_input.xml so that they look as follows:

<pre>
<!-- DEFINE FOR DEBUG PURPOSES ONLY!!
     The input to use for calibration.
     To use an image list -> give the path to the XML file containing the list of the images
-->
<Input>"src/simulation_nodes/input_images_models.xml"</Input>
<!-- If true (non-zero) the program will read the list in Input -->
<Debug_Mode>1</Debug_Mode>
</pre>
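
Since calibration_input.xml is an OpenCV FileStorage XML document (root element opencv_storage), the settings can be verified programmatically before launching. A minimal sketch using only the Python standard library; the file path is assumed to be relative to the current working directory:

<pre>
# Minimal sketch: verify which image list will be used and whether
# debug mode is enabled in calibration_input.xml.
import xml.etree.ElementTree as ET

root = ET.parse('rh_calibration/src/calibration_input.xml').getroot()
input_list = root.findtext('Input').strip().strip('"')
debug_mode = int(root.findtext('Debug_Mode'))

print('Input image list: %s' % input_list)
print('Debug mode: %s' % ('ON' if debug_mode else 'OFF'))
</pre>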

input_images_models.xml is stored in rh_calibration/src/simulation_nodes/. This XML file must contain the list of image pairs used to generate the disparity maps; it should look similar to the example below:

<pre>
<?xml version="1.0"?>
<opencv_storage>
<!-- List images by pairs, e.g.
     left1.tif
     right1.tif
     left2.tif
     right2.tif
     ...
     leftn.tif
     rightn.tif
 -->
<images>
/path/to/image/1_Left.tif
/path/to/image/1_Right.tif
</images>
</opencv_storage>
</pre>
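
Writing this list by hand is error-prone for more than a few pairs, so it can also be generated. A minimal sketch, assuming the images follow the 1_Left.tif / 1_Right.tif naming shown above; adjust image_dir and the glob pattern to your data:

<pre>
# Minimal sketch: generate input_images_models.xml from a directory of
# stereo pairs named like 1_Left.tif / 1_Right.tif. Note that sorted()
# orders names lexicographically (10_Left.tif before 2_Left.tif).
import glob
import os

image_dir = '/path/to/image'  # placeholder; point this at your data
lefts = sorted(glob.glob(os.path.join(image_dir, '*_Left.tif')))

lines = ['<?xml version="1.0"?>', '<opencv_storage>', '<images>']
for left in lefts:
    lines.append(left)
    lines.append(left.replace('_Left.tif', '_Right.tif'))
lines.extend(['</images>', '</opencv_storage>'])

with open('input_images_models.xml', 'w') as f:
    f.write('\n'.join(lines) + '\n')
</pre>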

Finally, to run the point cloud computation, type the following in a terminal:

<pre>
roslaunch rh_stereo_proc capture_simulation_c3d.launch
rosrun rh_stereo_proc RHstereomatcher_simulation.py
rosrun rh_stereo_proc RHpoint_cloud
</pre>

Thereafter, follow the instructions given in Note 2 of the Point clouds section.

h3. 3. Robot head description (URDF)

URDF and Xacro descriptions of the robot head can be found in *rh_ptu/description*. These should be updated to match the geometry adopted in each partner's lab. To verify the robot head model in rviz, the launch file _rh_ptu/launch/RH_ptu_urdf.launch_ allows a user to virtually control the robot head. This URDF description is used for hand-eye calibration and tf broadcasting. To run the launch file, simply type:

<pre>
roslaunch rh_ptu RH_ptu_urdf.launch
</pre>
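
Once the launch file is running, the broadcast transforms can be inspected from the command line with rosrun tf tf_echo source_frame target_frame, or queried from a node as in the minimal rospy sketch below. The frame names used here are placeholders; replace them with the frames actually defined in the URDF in *rh_ptu/description*.

<pre>
#!/usr/bin/env python
# Minimal sketch: query one transform from the tf tree broadcast for
# the robot head. '/base_link' and '/left_camera' are placeholder
# frame names; use the frames defined in rh_ptu/description.
import rospy
import tf

rospy.init_node('rh_tf_check')
listener = tf.TransformListener()
listener.waitForTransform('/base_link', '/left_camera',
                          rospy.Time(0), rospy.Duration(5.0))
(trans, rot) = listener.lookupTransform('/base_link', '/left_camera',
                                        rospy.Time(0))
print('translation: %s' % str(trans))
print('rotation (quaternion): %s' % str(rot))
</pre>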

+Note: At the moment, the clopema_robothead stack contains default values for the hand-eye and camera calibration. These defaults do not reflect the current configuration of the robot head.+

*If there are problems or bugs, please contact "Gerardo Aragon":mailto:Gerardo.AragonCamarasa@glasgow.ac.uk*