Karel's notes after final review

CALIBRATION OF STEREOHEAD

- move the robot to the home position, open gripper r1, and insert the calibration target into the gripper (with the chessboard facing the robot)

- rh_ptu/RH_ptu_launch.launch - dynamic calibration has to be off during calibration; disable this node:
<node pkg="rh_robot_interface" type="RHdynamicCalib" name="RHdynamicCalib" output="screen">
- RH_ptu_launch.launch is started together with the robot when run_rh:=true
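One way to keep dynamic calibration off is to comment the node out in RH_ptu_launch.launch before calibrating (a sketch; remember to restore it afterwards):

```xml
<!-- disabled during stereo calibration, re-enable when done:
<node pkg="rh_robot_interface" type="RHdynamicCalib" name="RHdynamicCalib" output="screen"/>
-->
```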

- rh_calibration/launch/RHcalibration_service_CVUT.launch
- this file also contains the parameters that define the dimensions of the calibration target
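For reference, OpenCV-style calibration consumes the target dimensions as a grid of 3D chessboard corner points in the target frame; a minimal sketch (the function name is mine, and the corner counts and square size below are placeholders, the real values come from the launch file above):

```python
import numpy as np

def chessboard_object_points(rows, cols, square_size_m):
    """3D coordinates of the inner chessboard corners in the target
    frame (z = 0 plane), ordered column-fastest as in the OpenCV
    calibration tutorials."""
    # 2D integer grid of corner indices, shape (rows*cols, 2)
    grid = np.mgrid[0:cols, 0:rows].T.reshape(-1, 2)
    pts = np.zeros((rows * cols, 3), np.float32)
    pts[:, :2] = grid * square_size_m  # scale indices to metres
    return pts

# e.g. a 6x9 inner-corner board with 25 mm squares:
pts = chessboard_object_points(6, 9, 0.025)
```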

- rh_robot_interface/src/calibration/calibration_rh_complete_CVUT.py

- capture only these 37 calibration positions; once the stereo calibration has been computed you can CTRL+C (the hand-eye calibration is not used anymore)

- after calibration:
- some calibration files are stored on the kinect1 PC in rh_cameras/calibrations/CVUT
- some calibration files are stored on "your client" PC in rh_calibrations/calibrations/CVUT
- regenerate the robot model (clopema_description/calibrate.sh)

VERGENCE DEMO - the cameras verge on a point selected in the left image

- rh_launch/launch/vergence_demo.launch
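For intuition only, the vergence angles for an idealized stereo pair fixating a 3D point can be sketched as below; this is illustrative geometry, not the rh_launch implementation, and the camera placement (both cameras on the x axis, initially looking along +z) is an assumption:

```python
import math

def vergence_angles(baseline_m, target_xyz):
    """Pan angles (rad, positive = toward +x) that make an idealized
    stereo pair fixate target_xyz. Left camera at (-b/2, 0, 0),
    right camera at (+b/2, 0, 0), both initially looking along +z."""
    x, y, z = target_xyz
    left = math.atan2(x + baseline_m / 2.0, z)   # left camera pans inward
    right = math.atan2(x - baseline_m / 2.0, z)  # right camera pans inward
    return left, right
```

For a point on the midline the two angles come out equal and opposite; the farther the point, the smaller the vergence.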

GRASPING

- rh_launch/launch/grasping.launch
- rh_feature_extraction/src/surface_features/RH_GraspingFeature_Node.py - wait until it has started completely
- rh_skills/src/grasping_actionlib_client.py

FLATTENING

- rh_launch/launch/flattening.launch
- rh_feature_extraction/src/surface_features/RH_GraspingFeature_Node.py - wait until it has started completely
- rh_skills/src/flattening_actionlib_client.py