====== ROS ======

===== Introduction =====

  * Read the documentation here: http://www.ros.org/wiki/ROS/Introduction

===== Installation =====

  * Get an overview of the installation instructions here: http://www.ros.org/wiki/cturtle/Installation/Ubuntu/SVN
  * Install rosinstall:

  sudo easy_install -U rosinstall

  * Create the directory where you want to store ros:

  cd local/src
  mkdir ros

  * As of 10.08.2010 it is better (more stable, sufficiently new) to use the cturtle ros distribution.

  rosinstall ros http://www.ros.org/rosinstalls/cturtle_base.rosinstall

  * In the last step rosinstall will try to download and compile cturtle. If you get errors, it is probably because you are missing some dependencies. Look at the error messages to find out the names of the Debian packages you have to install, then run the rosinstall command again until it finishes without errors.
  * Create a file called .bashrc.ros and put the following inside:

  export ROS_ROOT=${HOME}/local/src/ros/ros ;
  export PATH=${ROS_ROOT}/bin:${PATH} ;
  export PYTHONPATH=${ROS_ROOT}/core/roslib/src:${PYTHONPATH} ;
  export OCTAVE_PATH=${ROS_UP}/ros/core/experimental/rosoct/octave:${OCTAVE_PATH} ;
  #if [ ! "$ROS_MASTER_URI" ] ; then export ROS_MASTER_URI=http://leela:11311 ; fi ;
  if [ ! "$ROS_MASTER_URI" ] ; then export ROS_MASTER_URI=http://localhost:11311 ; fi ;
  export ROS_PACKAGE_PATH=${HOME}/local/src/ros/stacks:${HOME}/local/src/repositories/oid5;
  #export ROS_STACK_PATH=${ROS_ROOT}:${ROS_UP}/ros-pkg ;
  
  #source `rosstack find ias_semantic_mapping`/setup.sh
  
  NUM_CPUS=`cat /proc/cpuinfo |grep processor |wc -l`
  let "PAR_JOBS=${NUM_CPUS}+2"
  export ROS_PARALLEL_JOBS="-j${PAR_JOBS}"
  
  export ROS_LANG_DISABLE=roslisp:rosjava
  
  export ROS_IP=`ip addr show \`/sbin/route -n | awk '/UG/ {print $8}'\` | awk '/inet /{print $2}' | sed -e 's/\/.*//'`
  #export ROS_IP=192.168.137.2
  #HOSTNAME=`hostname`
  #if [[ "${HOSTNAME}" == "leela" ]] ; then echo "Hola"; else echo "lala" ;fi
  
  alias kcart_left="`rospack find kbd-cart-cmd`/kbd-cart-cmd -p kbd-left -c /lwr/left"
  alias kcart_right="`rospack find kbd-cart-cmd`/kbd-cart-cmd -p kbd-right -c /lwr/right"
  
  . ${ROS_ROOT}/tools/rosbash/rosbash

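Two of the derived values in .bashrc.ros above can be sanity-checked in isolation. A minimal sketch; the `inet` sample line is canned text standing in for real `ip addr show` output:

```shell
# Parallel build jobs: number of processors plus two, as in .bashrc.ros
NUM_CPUS=$(grep -c ^processor /proc/cpuinfo)
PAR_JOBS=$((NUM_CPUS + 2))
echo "-j${PAR_JOBS}"

# ROS_IP extraction: the awk/sed pair picks the address out of an `ip addr` line
SAMPLE='    inet 192.168.1.42/24 brd 192.168.1.255 scope global eth0'
echo "$SAMPLE" | awk '/inet /{print $2}' | sed -e 's/\/.*//'   # prints 192.168.1.42
```

The real ROS_IP line additionally asks /sbin/route for the default-gateway interface and feeds that interface name to ip addr show.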
  * Adjust ROS_MASTER_URI to point to the computer running the rosmaster that you want to use.
  * Inside your .bashrc add the following line:

  alias env_ros='source ${HOME}/.bashrc.ros'

  * Log out and log in again to reload your .bashrc.
  * Run env_ros.
  * Now you can use the ros utilities.
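The ROS_MASTER_URI guard in .bashrc.ros only assigns a value when the variable is still unset, so a URI exported before sourcing the file wins. A small sketch of both cases (http://myrobot:11311 is just a placeholder hostname):

```shell
# Unset: the guard falls back to localhost
unset ROS_MASTER_URI
if [ ! "$ROS_MASTER_URI" ] ; then export ROS_MASTER_URI=http://localhost:11311 ; fi
echo "$ROS_MASTER_URI"    # http://localhost:11311

# Already set: the guard leaves the existing value alone
export ROS_MASTER_URI=http://myrobot:11311
if [ ! "$ROS_MASTER_URI" ] ; then export ROS_MASTER_URI=http://localhost:11311 ; fi
echo "$ROS_MASTER_URI"    # http://myrobot:11311
```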

===== Installing extra ros packages =====

In the ros directory there is a subdirectory called stacks; this is where you can put extra packages. You just have to download these packages somehow and put them there.

Example: Gstreamer video acquisition:

  * Search for gstreamer on the http://www.ros.org/browse/list.php page. This will show a package called gscam.
  * cd into the stacks directory:

  cd $ROS_ROOT
  cd ../stacks

  * Download the code there:

  svn co http://brown-ros-pkg.googlecode.com/svn/tags/brown-ros-pkg

  * Now you can compile the code:

  rosmake gscam

  * Rosmake deals with ros dependencies: it will automatically compile any other ros packages that are dependencies.
  * Rosmake is pretty slow at checking dependencies, so if you have already compiled the program once and you know that the dependencies are already built, just roscd to the package you need and run make.

===== Gscam =====

Captures images from gstreamer video sources and publishes them to a ros topic.

  * Test your webcam:

  gst-launch --gst-debug=v4l2:2 v4l2src device=/dev/video1 ! video/x-raw-rgb,width=800,height=600,framerate=30/1 ! ffmpegcolorspace ! xvimagesink

  * You should see a window with the webcam image. Close this program.

  * Start using gscam:

  export GSCAM_CONFIG="v4l2src device=/dev/video0 ! video/x-raw-rgb,width=800,height=600,framerate=30/1 ! ffmpegcolorspace ! identity name=ros ! fakesink"
  rosrun gscam gscam

  * To look at the image:

  rosmake image_view
  rosrun image_view image_view image:=/gscam/image_raw
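Rather than editing the GSCAM_CONFIG string by hand for every camera, the pipeline can be assembled from a few variables. A sketch of the same v4l2 pipeline as above; the device path, resolution and framerate are example values to adjust:

```shell
# Build the gscam pipeline string from parameters (adjust to your camera)
DEVICE=/dev/video0
WIDTH=800
HEIGHT=600
FPS=30
export GSCAM_CONFIG="v4l2src device=${DEVICE} ! video/x-raw-rgb,width=${WIDTH},height=${HEIGHT},framerate=${FPS}/1 ! ffmpegcolorspace ! identity name=ros ! fakesink"
echo "$GSCAM_CONFIG"
```

Then `rosrun gscam gscam` picks the pipeline up from the environment, as before.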

==== Improving image quality with some cameras ====

With the Logitech C600 webcam one can get better (less noisy) image quality by setting the video mode to YUV. With gst-launch:

  gst-launch --gst-debug=v4l2:5 v4l2src device=/dev/video1 ! video/x-raw-yuv,format=\(fourcc\)YUY2,width=800,height=600,framerate=15/1 ! ffmpegcolorspace ! xvimagesink

For gscam:

  export GSCAM_CONFIG="v4l2src device=/dev/video1 ! video/x-raw-yuv,width=800,height=600,framerate=15/1,format=(fourcc)YUY2 ! ffmpegcolorspace ! video/x-raw-rgb ! identity name=ros ! fakesink"

The last conversion is necessary because gscam only accepts RGB images.
Note carefully that in the GSCAM_CONFIG export there are no backslashes around the "fourcc" part.
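The backslash difference is plain shell quoting, not a gstreamer rule: on the unquoted gst-launch command line the parentheses must be escaped because `(` is shell syntax, while inside the double-quoted GSCAM_CONFIG value they are already literal. A quick check that both spellings produce the same caps string:

```shell
# Escaped, as typed on the gst-launch command line
A=$(echo video/x-raw-yuv,format=\(fourcc\)YUY2)
# Quoted, as written inside GSCAM_CONFIG
B="video/x-raw-yuv,format=(fourcc)YUY2"
[ "$A" = "$B" ] && echo "identical caps strings"
```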

One can use gst-inspect to check the capabilities of the different gstreamer filters.

===== Camera calibration =====

  * Compile and calibrate the camera (gscam must already be running):

  rosmake camera_calibration
  rosrun camera_calibration cameracalibrator.py --size 5x4 --square 0.02464 image:=/gscam/image_raw camera:=/gscam

  * The last command uses the calibration board that comes with the pr2 robot.
  * Move the board until the calibration button activates. Try to move slowly so that the calibrator doesn't choose any blurred images, and also move the board to the corners of the image, where the distortion is most evident.
  * Save the calibration. This will create a file in /tmp with the calibration parameters inside.
  * Commit the calibration. This will create a file called camera_parameters.txt one directory up from where gscam is running.
  * Run the image undistorter:

  export ROS_NAMESPACE=gscam
  rosrun image_proc image_proc

  * To view the results:

  rosrun image_view image_view image:=/gscam/image_rect_color

==== Manual calibration ====

The algorithm that selects the pictures in camera_calibration is not perfect: it tends to select blurred images over sharp ones, and most of the time it doesn't use enough pictures. So here we explain how to calibrate using your own pictures.

  * Run image_view and take the pictures that you consider adequate for calibration by left-clicking:

  cd /tmp/
  rosrun image_view image_view image:=/gscam/image

Images are stored in the current directory.

  * Run the calibration from the images on disk:

  rosrun camera_calibration camera_calibrate_from_disk.py --size 8x6 --square 0.0247881 /tmp/frame00*

  * This will print the parameters to the screen. Replace them in an example camera_calibration.txt file (look in the gscam directory).

===== Marker tracking =====

==== Artoolkit in ros ====

  roscd; cd ../stacks
  git clone http://robotics.ccny.cuny.edu/git/ccny-ros-pkg.git
  rosmake artoolkit
  rosmake ar_pose

  * Add the markers that you want to detect in the file data/object_data2. For example:

  4x4_23
  data/4x4/4x4_23.patt
  25.0
  0.0 0.0

The first line is the name of the marker, the second is the marker pattern file, then the size in mm, and then the relative position(?).
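Entries like the one above can also be appended from the shell with a heredoc. A sketch, reusing the example values; the file name object_data2.example is hypothetical (the real file is data/object_data2 inside ar_pose):

```shell
# One marker entry is four lines: name, pattern file, size in mm, centre offset
cat >> object_data2.example <<'EOF'
4x4_23
data/4x4/4x4_23.patt
25.0
0.0 0.0
EOF
wc -l < object_data2.example
```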

  * Run ar_pose:

  rosrun ar_pose ar_multi /usb_cam/camera_info:=/gscam/camera_info /usb_cam/image_raw:=/gscam/image_rect

  * To look at the detected markers:

  rostopic echo /visualization_marker

===== Getting images from yarp to ros =====

  roscd; cd ../stacks
  svn co https://code.ros.org/svn/wg-ros-pkg/branches/trunk_cturtle/sandbox/yarp
  rosmake yarp

  * Get the tum-ros-internal repository and compile yarp2 and yarp_to_ros_image (you need git access to our repo; the tum-ros-pkg on the ros website is not complete and doesn't include all the stuff that is in tum-ros-internal):

  roscd; cd ../stacks
  git command that I still don't know
  rosmake yarp2
  rosmake yarp_to_ros_image

  * Run the yarp_to_ros_image package:

  rosrun yarp_to_ros_image yarp_to_ros_image

===== iCub and all this =====

==== Calibrate the icub cameras ====

  * Get icub images in ros:

  roscd yarp_to_ros_image
  rosrun yarp_to_ros_image yarp_to_ros_image
  yarp connect /icub/cam/left /yarp_to_ros_image

  * Calibrate the icub cameras:

  rosrun camera_calibration cameracalibrator.py --size 6x4 --square 0.040625 image:=/yarp_to_ros_image/image camera:=/yarp_to_ros_image

This is for our middle-sized checkerboard.

  * Do save and commit.
  * Now there is a camera_calibration.txt in the yarp_to_ros_image directory. Save this file: it is the calibration file for the camera.

==== Detecting markers ====

  * Run the yarp to ros image module:

  roscd yarp_to_ros_image
  rosrun yarp_to_ros_image yarp_to_ros_image

This will use the camera_calibration.txt file in the yarp_to_ros_image directory.

  * Connect the icub camera image to the yarp_to_ros_image module:

  yarp connect /icub/cam/left /yarp_to_ros_image

  * Run image_proc to undistort the image:

  export ROS_NAMESPACE=yarp_to_ros_image
  rosrun image_proc image_proc image_raw:=image

  * Run ar_pose for marker detection:

  rosrun ar_pose ar_multi /usb_cam/camera_info:=/yarp_to_ros_image/camera_info /usb_cam/image_raw:=/yarp_to_ros_image/image_rect

  * Detected markers can be read with:

  rostopic echo /ar_multi/visualization_marker

  * Start rviz and add the tf module.
  * Put some markers in front of the camera so that they get detected. At this point rviz will recognize the marker frames and the camera frames. Set the fixed and target frame to /r_eye3; then you will see the frames of the markers in rviz.