{"id":45267,"date":"2019-01-10T10:04:13","date_gmt":"2019-01-10T17:04:13","guid":{"rendered":"https:\/\/www.realsenseai.com\/uncategorized-cn\/how-to-multiple-camera-setup-with-ros\/"},"modified":"2019-01-10T17:04:13","modified_gmt":"2019-01-11T00:04:13","slug":"how-to-multiple-camera-setup-with-ros","status":"publish","type":"post","link":"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/how-to-multiple-camera-setup-with-ros\/","title":{"rendered":"How-to: Multiple camera setup with ROS"},"content":{"rendered":"\n<img loading=\"lazy\" decoding=\"async\" class=\"size-full wp-image-1383 alignnone\" src=\"https:\/\/realsenseai.com\/wp-content\/uploads\/2019\/02\/How-to-Multiple-camera-setup-with-ROS.jpg\" alt=\"\" width=\"1024\" height=\"576\" srcset=\"https:\/\/www.realsenseai.com\/wp-content\/uploads\/2019\/02\/How-to-Multiple-camera-setup-with-ROS.jpg 1024w, https:\/\/www.realsenseai.com\/wp-content\/uploads\/2019\/02\/How-to-Multiple-camera-setup-with-ROS-300x169.jpg 300w, https:\/\/www.realsenseai.com\/wp-content\/uploads\/2019\/02\/How-to-Multiple-camera-setup-with-ROS-768x432.jpg 768w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/>\n\nRealSense\u2122 D400 series depth cameras use stereo-based algorithms to calculate depth. One key advantage of stereo depth systems is the ability to use as many cameras as you want to within a specific scene. 
In this post, we will create a unified point cloud from multiple cameras using ROS.

For the initial demonstration, we set up two RealSense&trade; D435 cameras looking at the same area from two different points of view.
<p style="text-align: left;"><img loading="lazy" decoding="async" class="size-full wp-image-1385 alignnone" src="https://realsenseai.com/wp-content/uploads/2019/02/2_cams_setup_annotated_resize-768x576-1.jpg" alt="Two-camera setup" width="768" height="576" /></p>
<p style="text-align: left;">The cameras themselves have no data regarding their relative position; in Step 3, we will use a third-party program to set this up.</p>

<h3>Step 0: Set up your environment.</h3>
Install the RealSense SDK 2.0 (<em>librealsense</em>) for Linux following the instructions <a href="https://github.com/IntelRealSense/librealsense/blob/master/doc/installation.md">here</a>.
Install the ROS (not ROS 2) wrapper for <em>librealsense</em> from <a href="https://github.com/intel-ros/realsense/releases">here</a>. It is recommended to follow this set of <a href="https://github.com/intel-ros/realsense/blob/development/.travis.yml">instructions</a> for the installation.
<h3>Step 1: Obtain the camera serial numbers.</h3>
If you already know each camera's serial number, you can skip this step.

Otherwise:

1. Open a terminal and change directory to <em>catkin_ws</em>.

2. 
Make sure only <em>cam_1</em> is connected and start the <em>realsense2_camera</em> wrapper:
<pre class="lang:default decode:true ">roslaunch realsense2_camera rs_camera.launch</pre>
Note the serial number it reports in the following log lines:
<pre class="lang:default decode:true ">[ INFO] [1538983051.813336379]: setupDevice...
[ INFO] [1538983051.813399792]: JSON file is not provided
[ INFO] [1538983051.813427820]: ROS Node Namespace: camera
[ INFO] [1538983051.813442869]: Device Name: RealSense D435
[ INFO] [1538983051.813605893]: Device Serial No: 728312070349
[ INFO] [1538983051.813622583]: Device FW version: 05.10.03.00</pre>
The serial number in this case is <strong>728312070349</strong>. Record it somewhere and terminate the node using CTRL+C.

Repeat the step with only <em>cam_2</em> connected.
<h3>Step 2: Start both cameras.</h3>
Open 2 terminal windows and change directory to <em>catkin_ws</em> in each.

In <u>terminal 1</u> enter the following (don't forget to replace <strong>728312070349</strong> with the <em>cam_1</em> serial number you recorded in <u>Step 1</u>):
<pre class="lang:default decode:true">roslaunch realsense2_camera rs_camera.launch camera:=cam_1 serial_no:=728312070349 filters:=spatial,temporal,pointcloud</pre>
In <u>terminal 2</u> enter the following (again, fill in the correct serial number for <em>cam_2</em>):
<pre class="lang:default decode:true ">roslaunch realsense2_camera rs_camera.launch camera:=cam_2 serial_no:=&lt;serial_number_2&gt; filters:=spatial,temporal,pointcloud</pre>
We now have two cameras running, each publishing its own point cloud.
<h3>Step 3: Publish the spatial connection between cameras.</h3>
We estimate the translation of <em>cam_2</em> from <em>cam_1</em> at 70 cm on the X axis and 60 cm on the Y axis. 
We also estimate the yaw angle of <em>cam_2</em> relative to <em>cam_1</em> as 90 degrees clockwise; this is based on our knowledge of the setup in the image above. To simplify things, the coordinate system of <em>cam_1</em> serves as the coordinate system for the whole scene. These are the initial parameters we set for the transformation between the two cameras.

The following script calculates the transformation between the two cameras from the given parameters and publishes the frame transformation.

Open a third terminal window (<u>terminal 3</u>) and enter the following:
<pre class="lang:default decode:true">cd catkin_ws
python src/realsense/realsense2_camera/scripts/set_cams_transforms.py cam_1_link cam_2_link 0.7 0.6 0 -90 0 0</pre>
<h3>Step 4: Visualize the point clouds and fine-tune the camera calibration.</h3>
Open a fourth terminal (<u>terminal 4</u>) and run RViz, a 3D visualizer for displaying sensor data and state information from ROS:
<pre class="lang:default decode:true ">rviz</pre>
In the RViz window, do the following:
<pre class="lang:default decode:true ">1. Set "Fixed Frame" to "cam_1_link"
2. Add -&gt; By topic -&gt; /cam_1/depth/color/points/PointCloud2
3. Add -&gt; By topic -&gt; /cam_2/depth/color/points/PointCloud2</pre>
Now both point clouds are shown together. If you look closely, it is easy to see that the clouds are not exactly in position: some features are duplicated. Now it is time for fine calibration.

Switch to <u>terminal 3</u>, where <em>set_cams_transforms.py</em> is still running, and use it to fine-calibrate <em>cam_2</em> relative to <em>cam_1</em>. 
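Before fine-tuning, it helps to picture what the script publishes: the six command-line arguments (x, y, z, azimuth, pitch, roll) become a static transform from <em>cam_1_link</em> to <em>cam_2_link</em>, with the rotation expressed as a quaternion. Below is a minimal sketch of that conversion in plain Python; the Z-Y-X Euler convention and the helper name <em>euler_to_quaternion</em> are my own for illustration, not taken from the script's source.

```python
import math

def euler_to_quaternion(yaw_deg, pitch_deg, roll_deg):
    """Convert yaw/pitch/roll in degrees to a quaternion (x, y, z, w),
    assuming an intrinsic Z-Y-X rotation order."""
    y, p, r = (math.radians(a) / 2.0 for a in (yaw_deg, pitch_deg, roll_deg))
    cy, sy = math.cos(y), math.sin(y)
    cp, sp = math.cos(p), math.sin(p)
    cr, sr = math.cos(r), math.sin(r)
    return (
        sr * cp * cy - cr * sp * sy,  # x
        cr * sp * cy + sr * cp * sy,  # y
        cr * cp * sy - sr * sp * cy,  # z
        cr * cp * cy + sr * sp * sy,  # w
    )

# Initial guess for cam_2 relative to cam_1: 0.7 m on X, 0.6 m on Y, yaw -90 deg.
translation = (0.7, 0.6, 0.0)
rotation = euler_to_quaternion(-90, 0, 0)
print(translation, tuple(round(c, 4) for c in rotation))
```

With yaw = -90 degrees and zero pitch and roll, the quaternion reduces to a pure rotation about the Z axis, matching the initial guess passed to <em>set_cams_transforms.py</em> in terminal 3.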
The instructions are printed by the program:
<pre class="lang:default decode:true ">Using default file /home/doronhi/catkin_ws/src/realsense/realsense2_camera/scripts/_set_cams_info_file.txt
Use given initial values.
Press the following keys to change mode: x, y, z, (a)zimuth, (p)itch, (r)oll
For each mode, press 6 to increase by step and 4 to decrease
Press + to multiply step by 2 or - to divide
Press Q to quit</pre>
Below that, a constantly updating line shows the current mode, value, and step size after every keystroke.

Notice that the program prints the path of the current configuration file. It automatically saves its latest configuration to that file, to be used on the next run. If you have been running the program for a while and want to back up the configuration file, just copy it aside. Run the script without any parameters to learn more about changing the configuration filename.

There is no question that it would be much better to find the precise matrix between the cameras automatically. 
In the meanwhile, after a lot of fiddling around, I was relatively satisfied with the following numbers:
<table>
<tr><th>Parameter</th><th>Value</th></tr>
<tr><td>x</td><td>0.75</td></tr>
<tr><td>y</td><td>0.575</td></tr>
<tr><td>z</td><td>0</td></tr>
<tr><td>azimuth</td><td>-91.25</td></tr>
<tr><td>pitch</td><td>0.75</td></tr>
<tr><td>roll</td><td>0</td></tr>
</table>
The result is a more complete coverage of the scene:

<img loading="lazy" decoding="async" class="size-full wp-image-1384 alignnone" src="https://realsenseai.com/wp-content/uploads/2019/02/2_cameras_cloud_screenshot.jpg" alt="Merged point cloud from two cameras" width="381" height="341" />

<img loading="lazy" decoding="async" class="aligncenter size-full wp-image-2779" src="https://realsenseai.com/wp-content/uploads/2019/01/2_cams_cover-1024x713-1.gif" alt="Animated view of the merged point cloud" width="1024" height="713" />
<h3>Adding more cameras</h3>
Several factors limit the number of cameras that can be connected to a single computer: CPU usage, USB bandwidth, power supply, and the length of the USB cable. 
These are all covered at length in the <a href="https://dev.realsenseai.com/docs/multiple-depth-cameras-configuration" target="_blank" rel="noopener noreferrer">Multiple camera configuration white paper</a>.
To demonstrate this scenario, we created the following array:

<img loading="lazy" decoding="async" class="size-full wp-image-1382 alignnone" src="https://realsenseai.com/wp-content/uploads/2019/02/scematics_multiple.jpg" alt="Schematic of the three-camera array" width="558" height="550" />

<img loading="lazy" decoding="async" class="size-full wp-image-1387 alignnone" src="https://realsenseai.com/wp-content/uploads/2019/02/2_cams_setup_annotated_resize.jpg" alt="Annotated photo of the camera setup" width="1024" height="768" />

Both <em>cam_2</em> and <em>cam_3</em> are visible in <em>cam_1</em>'s point cloud. That makes it easier, though not trivial, to use <em>set_cams_transforms.py</em> to fix their locations in <em>cam_1</em>'s coordinate system.

When using multiple computers, we have to explicitly set the roscore host. 
In this demo, the first command in each terminal is:
<pre class="lang:default decode:true ">export ROS_MASTER_URI=http://&lt;ros host ip address&gt;:&lt;port&gt;</pre>
Here, <em>roscore</em>, <em>cam_1</em>, and <em>cam_2</em> run on <em>perclnx319</em>, while <em>cam_3</em> is on <em>perclnx217</em>. All the terminals are opened on roscore's computer, <em>perclnx319</em>.
The commands to run are as follows; remember to replace the serial numbers with your own.
<h4>Terminal 1</h4>
<pre class="lang:default decode:true ">roscore</pre>
<h4>Terminal 2</h4>
<pre class="lang:default decode:true ">export ROS_MASTER_URI=http://perclnx319:11311
roslaunch realsense2_camera rs_camera.launch camera:=cam_1 serial_no:=728312070349 filters:=spatial,temporal,pointcloud</pre>
<h4>Terminal 3</h4>
<pre class="lang:default decode:true ">export ROS_MASTER_URI=http://perclnx319:11311
roslaunch realsense2_camera rs_camera.launch camera:=cam_2 serial_no:=713612070696 filters:=spatial,temporal,pointcloud</pre>
<h4>Terminal 4 (used to operate the second machine, perclnx217 in our case)</h4>
<pre class="lang:default decode:true">ssh perclnx217
[Enter password]
export ROS_MASTER_URI=http://perclnx319:11311
roslaunch realsense2_camera rs_camera.launch camera:=cam_3 serial_no:=725112060487 filters:=spatial,temporal,pointcloud</pre>
<h4>Publish the spatial connection between cameras.</h4>
<em>cam_1</em> is rotated about 30 degrees relative to the left side of the imaginary square created by the three cameras. This camera is our coordinate system.
We estimate the translation of <em>cam_2</em> from <em>cam_1</em> at 90 cm on the X axis and 30 cm on the Y axis, and the yaw angle of <em>cam_2</em> at 120 degrees clockwise relative to <em>cam_1</em>. 
These are the initial parameters we set for the transformation between the two cameras.
As for <em>cam_3</em>, we estimate the translation on the X axis at 160 cm and on the Y axis at -20 cm, and the yaw at 170 degrees counter-clockwise.
The following script calculates the transformation from the given parameters and publishes the frame transform.
<h4>Terminal 5</h4>
<pre class="lang:default decode:true ">cd catkin_ws
export ROS_MASTER_URI=http://perclnx319:11311
python src/realsense/realsense2_camera/scripts/set_cams_transforms.py cam_1_link cam_2_link 0.9 0.3 0 -120 0 0 --file '_cams_set_info_1_to_2'</pre>
Press Q to quit.

Now set the transformation for <em>cam_3</em>:
<pre class="lang:default decode:true ">python src/realsense/realsense2_camera/scripts/set_cams_transforms.py cam_1_link cam_3_link 1.6 -0.2 0 170 0 0 --file '_cams_set_info_1_to_3'</pre>
Since <em>set_cams_transforms.py</em> is not a ROS package and you cannot run two nodes with the same name, we run one copy at a time.

From now on, the latest values for the relative transformations are saved in the specified files. If you want to modify the <em>cam_2</em> extrinsics, type:
<pre class="lang:default decode:true ">python src/realsense/realsense2_camera/scripts/set_cams_transforms.py cam_1_link cam_2_link --file '_cams_set_info_1_to_2'</pre>
To modify <em>cam_3</em>:
<pre class="lang:default decode:true ">python src/realsense/realsense2_camera/scripts/set_cams_transforms.py cam_1_link cam_3_link --file '_cams_set_info_1_to_3'</pre>
<strong>Note</strong>: Use two different file names to save the results for the two camera pairs.
<h3>Visualizing the point clouds and fine-tuning the camera calibration</h3>
<h4>Terminal 6</h4>
Enter the following command:
<pre class="lang:default decode:true ">rviz</pre>
In RViz, do the following:
<pre class="lang:default decode:true ">1. Set "Fixed Frame" to "cam_1_link"
2. 
Add -&gt; By topic -&gt; /cam_1/depth/color/points/PointCloud2
3. Add -&gt; By topic -&gt; /cam_2/depth/color/points/PointCloud2
4. Add -&gt; By topic -&gt; /cam_3/depth/color/points/PointCloud2
5. Add -&gt; By display type -&gt; TF
6. In the "Displays" window under TF &gt; Frames, enable only cam_1_link, cam_2_link and cam_3_link, just for clarity.</pre>
Now all three point clouds are shown together. It is easy to see that the clouds are not exactly in position: some features are duplicated. Now it is time for fine calibration.
Switching to <u>terminal 5</u>, where <em>set_cams_transforms.py</em> is still running, we use it to fine-calibrate <em>cam_2</em> and <em>cam_3</em> relative to <em>cam_1</em>.
The instructions are printed by the program:
<pre class="lang:default decode:true ">Using file /home/doronhi/catkin_ws/_cams_set_info_1_to_3.txt, read initial values from it.
Press the following keys to change mode: x, y, z, (a)zimuth, (p)itch, (r)oll.
For each mode, press 6 to increase by step and 4 to decrease.
Press + to multiply step by 2 or - to divide.
Press Q to quit</pre>
Without an automatic calibration tool it can be quite challenging to tune the cameras. Here are a few tips for calibration:
<ol>
 	<li>Calibrate one camera at a time.</li>
 	<li>Place cam_2 in the field of view of cam_1. In the RViz display, keep only the point cloud of cam_1 and the axis notation (TF) of cam_2. Now you can watch the axis notation of cam_2 and change its x, y, z parameters until it sits right on top of the camera as seen in cam_1's point cloud. You can also put a distinct object close by, directly in cam_2's line of sight, and modify its orientation parameters until the axis image points at the object as seen in cam_1's point cloud.</li>
 	<li>Now, also display the point cloud of cam_2. Put a board where both cameras can see it. 
Modify cam_2's parameters to rotate its view of the board until it is parallel to the board as seen from cam_1; then translate it, rotate again, and so on, iteratively, until the two are exactly aligned. Use the board to understand which parameter to modify, and your face for fine-tuning.</li>
 	<li>Repeat steps 2 and 3 for cam_3 as well.</li>
</ol>
Eventually, I was relatively satisfied with the following numbers.

For <strong><em>cam_2</em></strong>:
<table>
<tr><th>Parameter</th><th>Value</th></tr>
<tr><td>x</td><td>0.95</td></tr>
<tr><td>y</td><td>0.38125</td></tr>
<tr><td>z</td><td>-0.015625</td></tr>
<tr><td>azimuth</td><td>-104.25</td></tr>
<tr><td>pitch</td><td>0</td></tr>
<tr><td>roll</td><td>-0.5</td></tr>
</table>
For <strong><em>cam_3</em></strong>:
<table>
<tr><th>Parameter</th><th>Value</th></tr>
<tr><td>x</td><td>1.5875</td></tr>
<tr><td>y</td><td>-0.703125</td></tr>
<tr><td>z</td><td>-0.05</td></tr>
<tr><td>azimuth</td><td>163</td></tr>
<tr><td>pitch</td><td>1</td></tr>
<tr><td>roll</td><td>1.25</td></tr>
</table>
No doubt, a SLAM-based solution would beat the manual labor I went through to get this calibration, both in time and in results. 
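With these extrinsics, a point measured in cam_2's frame can be expressed in cam_1's (scene) frame, which is what TF does for us when RViz merges the clouds. Below is a minimal pure-Python sketch using the cam_2 values from the table above; the Z-Y-X Euler convention and the helper names are my own assumptions for illustration, not the wrapper's API.

```python
import math

def rotation_matrix(yaw_deg, pitch_deg, roll_deg):
    """3x3 rotation built from yaw/pitch/roll in degrees,
    assuming a Z-Y-X rotation order."""
    y, p, r = (math.radians(a) for a in (yaw_deg, pitch_deg, roll_deg))
    cy, sy = math.cos(y), math.sin(y)
    cp, sp = math.cos(p), math.sin(p)
    cr, sr = math.cos(r), math.sin(r)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def to_cam1(point, translation, rotation):
    """Express a point given in cam_2's frame in cam_1's frame:
    p_cam1 = R * p_cam2 + t."""
    return tuple(
        sum(rotation[i][j] * point[j] for j in range(3)) + translation[i]
        for i in range(3)
    )

# Fine-tuned extrinsics for cam_2 from the table above.
t_cam2 = (0.95, 0.38125, -0.015625)
R_cam2 = rotation_matrix(-104.25, 0.0, -0.5)

# A point 1 m in front of cam_2 (along its X axis), expressed in cam_1's frame.
print(to_cam1((1.0, 0.0, 0.0), t_cam2, R_cam2))
```

Transforming the frame origin (0, 0, 0) simply returns cam_2's position in the scene, which makes a quick sanity check on the numbers.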
Nevertheless, the final 3D model I got looks like this:

<div style="width: 536px;" class="wp-video"><video class="wp-video-shortcode" id="video-45267-1" width="536" height="432" preload="metadata" controls="controls"><source type="video/mp4" src="https://realsenseai.com/wp-content/uploads/2019/02/multi_cams_3d.mp4" /><a href="https://realsenseai.com/wp-content/uploads/2019/02/multi_cams_3d.mp4">https://realsenseai.com/wp-content/uploads/2019/02/multi_cams_3d.mp4</a></video></div>