{"id":45263,"date":"2019-07-15T17:10:19","date_gmt":"2019-07-15T23:10:19","guid":{"rendered":"https:\/\/www.realsenseai.com\/uncategorized-cn\/beginners-guide-to-depth\/"},"modified":"2019-07-15T23:10:19","modified_gmt":"2019-07-16T05:10:19","slug":"beginners-guide-to-depth","status":"publish","type":"post","link":"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/","title":{"rendered":"Beginner&#8217;s guide to depth (Updated)"},"content":{"rendered":"\nWe talk a lot about depth technologies on the RealSense\u2122 blog, for what should be fairly obvious reasons, but often with the assumption that the readers here know what depth cameras are, understand some of the differences between types, or have some idea of what it\u2019s possible to do with a depth camera. This post assumes the opposite \u2013 that you know nothing and are brand new to depth. In this post, we will cover a variety of types of depth cameras, and why the differences are important, what depth cameras are, how you might get started and more.\n<h3><strong>What are depth cameras?<\/strong><\/h3>\nStandard digital cameras output images as a 2D grid of pixels. Each pixel has values associated with it \u2013 usually we think of those as Red, Green and Blue, or RGB. Each attribute has a number from 0 to 255, so black, for example, is (0,0,0) and a pure bright red would be (255,0,0). Thousands to millions of pixels together create the kind of photographs we are all very familiar with. 
A depth camera, on the other hand, associates a different numerical value with each pixel: the distance from the camera, or \u201cdepth.\u201d Some depth cameras have both an RGB and a depth system, which can give pixels with all four values, or RGBD.\n\nThe output from a depth camera can be displayed in a variety of ways \u2013 in the example below, the color image is shown side by side with the depth image, where each different color in the depth map represents a different distance from the camera. In this case, cyan is closest to the camera, and red is furthest.\n\n<img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-4054\" src=\"https:\/\/realsenseai.com\/wp-content\/uploads\/2019\/05\/what_is_color_depth.jpg\" alt=\"What are depth cameras\" width=\"700\" height=\"236\" srcset=\"https:\/\/www.realsenseai.com\/wp-content\/uploads\/2019\/05\/what_is_color_depth.jpg 700w, https:\/\/www.realsenseai.com\/wp-content\/uploads\/2019\/05\/what_is_color_depth-300x101.jpg 300w\" sizes=\"auto, (max-width: 700px) 100vw, 700px\" \/>\n\nIt doesn\u2019t really matter which color values the depth map uses; they are chosen purely to make the distances easy to visualize.\n<h3><strong>Types of depth camera<\/strong><\/h3>\nThere are a variety of different methods for calculating depth, all with different strengths, weaknesses and optimal operating conditions. Which one you pick will almost certainly depend on what you are trying to build \u2013 how far does it need to see? What sort of accuracy do you need? Does it need to operate outdoors? Here\u2019s a quick breakdown of camera types and roughly how each one works.\n<h4><strong>Structured Light and Coded Light<\/strong><\/h4>\nStructured light and coded light depth cameras are similar, though not identical, technologies. They rely on projecting light (usually infrared light) from some kind of emitter onto the scene. 
The projected light is patterned, either spatially or over time, or some combination of the two. Because the projected pattern is known, the way the camera\u2019s sensor sees that pattern deformed by the scene provides the depth information. For example, if the pattern is a series of stripes projected onto a ball, the stripes would deform and bend around the surface of the ball in a specific way.\n\n<img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-4058\" src=\"https:\/\/realsenseai.com\/wp-content\/uploads\/2019\/05\/how_coded_light_works-1.jpg\" alt=\"How coded light works\" width=\"700\" height=\"411\" srcset=\"https:\/\/www.realsenseai.com\/wp-content\/uploads\/2019\/05\/how_coded_light_works-1.jpg 700w, https:\/\/www.realsenseai.com\/wp-content\/uploads\/2019\/05\/how_coded_light_works-1-300x176.jpg 300w\" sizes=\"auto, (max-width: 700px) 100vw, 700px\" \/>\n\nIf the ball moves closer to the emitter, the pattern would change too. Using the disparity between an expected image and the actual image viewed by the camera, distance from the camera can be calculated for every pixel. The <a href=\"https:\/\/www.intelrealsense.com\/coded-light\/#sr300camera\">RealSense\u2122 SR300 line<\/a> of devices consists of coded light cameras. Because this technology relies on accurately seeing a projected pattern of light, coded and structured light cameras do best indoors at relatively short ranges (depending on the power of the light emitted from the camera). Another issue with systems like this is that they are vulnerable to noise in the environment from other cameras or devices emitting infrared. 
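The disparity-to-depth step described above is simple triangulation, and the same relationship also underlies the stereo cameras discussed later. A minimal sketch, where the focal length and baseline are made-up illustrative numbers rather than any real camera's calibration:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Triangulate distance from how far a pattern (or feature) shifted.

    disparity_px: observed shift between expected and actual image, pixels
    focal_px:     focal length expressed in pixels
    baseline_m:   distance between emitter and sensor (or two sensors), m
    """
    if disparity_px <= 0:
        return float("inf")  # no measurable shift -> effectively at infinity
    return focal_px * baseline_m / disparity_px

# Illustrative numbers only: 640 px focal length, 5 cm baseline.
# A large shift means a close object; a small shift, a distant one.
print(depth_from_disparity(64, 640.0, 0.05))  # -> 0.5 (m)
print(depth_from_disparity(4, 640.0, 0.05))   # -> 8.0 (m)
```

Note how depth scales with the baseline in the numerator: this is exactly why a wider sensor spacing extends a camera's range.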
Ideal uses for coded light cameras are things like gesture recognition or background segmentation (also known as virtual green screen).\n\nThe new <a href=\"https:\/\/www.intelrealsense.com\/depth-camera-sr305\/\">RealSense\u2122 Depth Camera SR305<\/a> is a coded light depth camera and a great entry point for beginners to start experimenting with depth development. As a short-range indoor camera, it is a low-risk way to get started. Since it uses the RealSense\u2122 SDK 2.0, any code you write or anything you develop with this camera will also work with all the other depth cameras in the RealSense product lines, enabling you to upgrade later from short-range indoor cameras to longer-range, higher-resolution or outdoor ones.\n<a class=\"link_double\" style=\"color: #ffffff;\" href=\"https:\/\/www.intelrealsense.com\/coded-light\/#sr300camera\">Learn more<\/a> <a class=\"link_double\" style=\"color: #ffffff;\" href=\"https:\/\/store.realsenseai.com\/buy-intel-realsense-developer-kit-sr300.html\">Buy SR300<\/a>\n<h4><strong>Stereo Depth<\/strong><\/h4>\nStereo depth cameras also often project infrared light onto a scene to improve the accuracy of the data, but unlike coded or structured light cameras, stereo cameras can use any light to measure depth. For a stereo camera, all infrared noise is good noise. Stereo depth cameras have two sensors, spaced a small distance apart. A stereo camera takes the two images from these two sensors and compares them. Since the distance between the sensors is known, these comparisons give depth information. Stereo cameras work in a similar way to how we use two eyes for depth perception: our brains compare the slightly different images from each eye. 
Objects closer to us appear to move significantly from eye to eye (or sensor to sensor), whereas an object in the far distance would appear to move very little.\nBecause stereo cameras use any visual features to measure depth, they will work well in most lighting conditions, including outdoors. The addition of an infrared projector means that in low lighting conditions, the camera can still perceive depth details. The <a href=\"https:\/\/www.intelrealsense.com\/depth-camera-d435\/\">RealSense\u2122 D400 series<\/a> cameras are stereo depth cameras. The other benefit of this type of depth camera is that there is no limit to how many you can use in a particular space \u2013 the cameras don\u2019t interfere with each other in the way that a coded light or time of flight camera would.\n\nThe distance these cameras can measure is directly related to how far apart the two sensors are \u2013 the wider the baseline is, the further the camera can see. In fact, astronomers use a very similar technique to measure the distance of faraway stars: they measure the position of a star in the sky at one point in time, and then measure that same star six months later, when the earth is at the furthest point in its orbit from the original measuring point. 
In this way, they can calculate the distance (or depth) of the star using a baseline of around 300 million kilometers.\n\n<img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-4062\" src=\"https:\/\/realsenseai.com\/wp-content\/uploads\/2019\/05\/how_stereo_depth_works.jpg\" alt=\"How stereo depth works\" width=\"700\" height=\"398\" srcset=\"https:\/\/www.realsenseai.com\/wp-content\/uploads\/2019\/05\/how_stereo_depth_works.jpg 700w, https:\/\/www.realsenseai.com\/wp-content\/uploads\/2019\/05\/how_stereo_depth_works-300x171.jpg 300w\" sizes=\"auto, (max-width: 700px) 100vw, 700px\" \/>\n<a class=\"link_double\" style=\"color: #ffffff;\" href=\"https:\/\/www.intelrealsense.com\/stereo-depth\/\">Learn more<\/a> <a class=\"link_double\" style=\"color: #ffffff;\" href=\"https:\/\/store.realsenseai.com\/stereo-depth-products.html\">Buy depth cameras<\/a>\n<h4><strong>Time of Flight and LiDAR<\/strong><\/h4>\nEach kind of depth camera relies on known information in order to extrapolate depth. For example, in stereo, the distance between sensors is known. In coded light and structured light, the pattern of light is known. In the case of time of flight, the speed of light is the known variable used to calculate depth. LiDAR sensors, which you may be familiar with from things like self-driving cars, are a type of time of flight camera that uses laser light to calculate depth. All time of flight devices emit some kind of light, sweep it over the scene, and then time how long that light takes to get back to a sensor on the camera. Depending on the power and wavelength of the light, time of flight sensors can measure depth at significant distances \u2013 for example, being used to map terrain from a helicopter.\n\nThe primary disadvantage of time of flight cameras is that they can be susceptible to interference from other cameras in the same space and can also function less well in outdoor conditions. 
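The \u201ctime how long the light takes to get back\u201d step mentioned above is simple arithmetic with the speed of light. A sketch with an illustrative pulse time (not any sensor\u2019s API):

```python
C = 299_792_458.0  # speed of light in a vacuum, m/s

def tof_distance_m(round_trip_s):
    """Distance from a measured round-trip time: the pulse travels
    out and back, so the one-way distance is half the total path."""
    return C * round_trip_s / 2.0

# A pulse returning after ~6.67 nanoseconds came from about 1 m away,
# which shows how precise the timing electronics have to be.
print(tof_distance_m(6.67e-9))
```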
The depth image degrades in any situation where the light hitting the sensor may not have come from the camera itself but from some other source, such as the sun or another camera.\n\nThe new <a href=\"https:\/\/www.intelrealsense.com\/lidar-camera-l515\/\">RealSense\u2122 LiDAR Camera L515<\/a> is a different type of time-of-flight, or LiDAR, camera. While most LiDAR devices have mechanical systems that spin around to sweep the environment with light, which is then detected, the L515 uses a proprietary miniaturized scanning technology. This technology allows the L515 to be the world\u2019s smallest high-resolution LiDAR depth camera.\n\n<img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-4066\" src=\"https:\/\/realsenseai.com\/wp-content\/uploads\/2019\/05\/how_lidar_works.jpg\" alt=\"How LiDAR works\" width=\"700\" height=\"378\" srcset=\"https:\/\/www.realsenseai.com\/wp-content\/uploads\/2019\/05\/how_lidar_works.jpg 700w, https:\/\/www.realsenseai.com\/wp-content\/uploads\/2019\/05\/how_lidar_works-300x162.jpg 300w\" sizes=\"auto, (max-width: 700px) 100vw, 700px\" \/>\n<h3><strong>What can you do with depth?<\/strong><\/h3>\nAll depth cameras give you the advantage of additional understanding of a scene; more importantly, they give any device or system the ability to understand a scene in ways that don\u2019t require human intervention. While it\u2019s possible for a computer to understand a 2D image, that requires a significant investment of time in training a machine learning network (for more information on this topic you can check out <a href=\"https:\/\/www.intelrealsense.com\/machine-learning-and-depth-cameras\/\">this post<\/a>). A depth camera inherently gives some information without the need for training \u2013 for example, it\u2019s easier to distinguish foreground objects from background objects in a scene. 
This becomes useful in things like background segmentation \u2013 a depth camera can remove background objects from an image, allowing a green-screen-free capture.\nDepth cameras are also very useful in the field of robotics and for autonomous devices like drones \u2013 if you have a robot or drone navigating its way around a space, you would probably want it to automatically detect anything that appears directly in front of it, to avoid collisions. In the video below, that\u2019s exactly the use the depth camera serves, in addition to making a 3D map or scan of the space.\n\n<iframe loading=\"lazy\" title=\"Robotics navigation with Intel\u00ae RealSense\u2122 Tracking Camera T265 and Depth Camera D435\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/LBMsWJJxLXQ?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n\nThese are just a few of the use cases for depth cameras; there are <a href=\"https:\/\/www.intelrealsense.com\/use-cases\/\">many more<\/a>. 
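To make one of those use cases concrete: background segmentation reduces to a threshold on the depth value. A toy sketch on a hand-made 3x3 \u201cdepth map\u201d (a real pipeline would read per-pixel arrays from the camera SDK, but the logic is the same):

```python
# Each cell is a measured depth in meters; 0.0 marks "no data".
depth_map = [
    [0.8, 0.9, 3.5],
    [0.7, 0.8, 3.6],
    [3.4, 3.5, 3.7],
]

CUTOFF_M = 2.0  # treat anything closer than 2 m as foreground

# True marks foreground pixels; invalid (0.0) readings are excluded.
foreground_mask = [[0.0 < d < CUTOFF_M for d in row] for row in depth_map]

for row in foreground_mask:
    print(row)
# Applying this mask to the matching color image keeps only the subject,
# giving a green-screen-free cutout.
```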
What can you do with the added understanding of physical space for a computer or device?\n\n<a class=\"link_double\" style=\"color: #ffffff;\" href=\"https:\/\/www.intelrealsense.com\/developers\/\">Get Started<\/a>\n","protected":false},"excerpt":{"rendered":"<p>We talk a lot about depth technologies on the Real [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":42849,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"featured_image_focal_point":[],"inline_featured_image":false,"footnotes":""},"categories":[1199,1193],"tags":[598,531,516,1296,599],"capability_application":[],"industry":[],"class_list":["post-45263","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-how-to-cn","category-news-insights-cn","tag-coded-light-2","tag-lidar-2","tag-stereo-depth-3","tag-stereo-depth-cn","tag-time-of-flight-2"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO Premium plugin v26.7 (Yoast SEO v27.0) - https:\/\/yoast.com\/product\/yoast-seo-premium-wordpress\/ -->\n<title>Beginner&#039;s guide to depth (Updated) - RealSense<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/\" \/>\n<meta property=\"og:locale\" content=\"zh_CN\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Beginner&#039;s guide to depth (Updated)\" \/>\n<meta property=\"og:description\" content=\"We talk a lot about depth technologies on the Real [&hellip;]\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/\" \/>\n<meta property=\"og:site_name\" content=\"RealSense\" \/>\n<meta property=\"article:published_time\" content=\"2019-07-15T23:10:19+00:00\" 
\/>\n<meta property=\"article:modified_time\" content=\"2019-07-16T05:10:19+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.realsenseai.com\/wp-content\/uploads\/2026\/01\/beginners_guide_hero.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1920\" \/>\n\t<meta property=\"og:image:height\" content=\"1080\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"webmaster@freshwatercreative.ca\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"\u4f5c\u8005\" \/>\n\t<meta name=\"twitter:data1\" content=\"webmaster@freshwatercreative.ca\" \/>\n\t<meta name=\"twitter:label2\" content=\"\u9884\u8ba1\u9605\u8bfb\u65f6\u95f4\" \/>\n\t<meta name=\"twitter:data2\" content=\"8 \u5206\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/\"},\"headline\":\"Beginner&#8217;s guide to depth (Updated)\",\"datePublished\":\"2019-07-15T23:10:19+00:00\",\"dateModified\":\"2019-07-16T05:10:19+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/\"},\"wordCount\":1571,\"publisher\":{\"@id\":\"https:\/\/www.realsenseai.com\/cn\/#organization\"},\"image\":{\"@id\":\"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.realsenseai.com\/wp-content\/uploads\/2026\/01\/beginners_guide_hero.jpg\",\"keywords\":[\"Coded Light\",\"LiDAR\",\"Stereo depth\",\"Stereo depth\",\"Time of 
Flight\"],\"articleSection\":[\"How-To\",\"\u65b0\u95fb\u4e0e\u6d1e\u5bdf\"],\"inLanguage\":\"zh-Hans\"},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/\",\"url\":\"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/\",\"name\":\"Beginner's guide to depth (Updated) - RealSense\",\"isPartOf\":{\"@id\":\"https:\/\/www.realsenseai.com\/cn\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.realsenseai.com\/wp-content\/uploads\/2026\/01\/beginners_guide_hero.jpg\",\"datePublished\":\"2019-07-15T23:10:19+00:00\",\"dateModified\":\"2019-07-16T05:10:19+00:00\",\"breadcrumb\":{\"@id\":\"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/#breadcrumb\"},\"inLanguage\":\"zh-Hans\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"zh-Hans\",\"@id\":\"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/#primaryimage\",\"url\":\"https:\/\/www.realsenseai.com\/wp-content\/uploads\/2026\/01\/beginners_guide_hero.jpg\",\"contentUrl\":\"https:\/\/www.realsenseai.com\/wp-content\/uploads\/2026\/01\/beginners_guide_hero.jpg\",\"width\":1920,\"height\":1080},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"RealSense\",\"item\":\"https:\/\/www.realsenseai.com\/cn\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"\u65b0\u95fb\u4e0e\u6d1e\u5bdf\",\"item\":\"https:\/\/www.realsenseai.com\/cn\/category\/news-insigh
ts-cn\/\"},{\"@type\":\"ListItem\",\"position\":3,\"name\":\"How-To\",\"item\":\"https:\/\/www.realsenseai.com\/cn\/category\/news-insights-cn\/how-to-cn\/\"},{\"@type\":\"ListItem\",\"position\":4,\"name\":\"Beginner&#8217;s guide to depth (Updated)\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/www.realsenseai.com\/cn\/#website\",\"url\":\"https:\/\/www.realsenseai.com\/cn\/\",\"name\":\"RealSense\",\"description\":\"Powering Physical AI with Advanced Vision and Perception\",\"publisher\":{\"@id\":\"https:\/\/www.realsenseai.com\/cn\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/www.realsenseai.com\/cn\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"zh-Hans\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/www.realsenseai.com\/cn\/#organization\",\"name\":\"RealSense\",\"url\":\"https:\/\/www.realsenseai.com\/cn\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"zh-Hans\",\"@id\":\"https:\/\/www.realsenseai.com\/cn\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/realsenseai.com\/wp-content\/uploads\/2025\/07\/realsenseai_logo.jpeg\",\"contentUrl\":\"https:\/\/realsenseai.com\/wp-content\/uploads\/2025\/07\/realsenseai_logo.jpeg\",\"width\":200,\"height\":200,\"caption\":\"RealSense\"},\"image\":{\"@id\":\"https:\/\/www.realsenseai.com\/cn\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/linkedin.com\/company\/realsenseai\/\",\"https:\/\/www.youtube.com\/@IntelRealSense\"]},{\"@type\":\"Person\",\"@id\":\"https:\/\/www.realsenseai.com\/cn\/#\/schema\/person\/267263d5df51bbe26099751b79bf7d7a\",\"name\":\"webmaster@freshwatercreative.ca\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"zh-Hans\",\"@id\":\"https:\/\/www.realsenseai.com\/cn\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/f709b39fa8422d35a6d83876ff73052452
221b58a440c579b3494f5577e5bbc5?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/f709b39fa8422d35a6d83876ff73052452221b58a440c579b3494f5577e5bbc5?s=96&d=mm&r=g\",\"caption\":\"webmaster@freshwatercreative.ca\"},\"sameAs\":[\"https:\/\/realsenseai.com\"]}]}<\/script>\n<!-- \/ Yoast SEO Premium plugin. -->","yoast_head_json":{"title":"Beginner's guide to depth (Updated) - RealSense","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/","og_locale":"zh_CN","og_type":"article","og_title":"Beginner's guide to depth (Updated)","og_description":"We talk a lot about depth technologies on the Real [&hellip;]","og_url":"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/","og_site_name":"RealSense","article_published_time":"2019-07-15T23:10:19+00:00","article_modified_time":"2019-07-16T05:10:19+00:00","og_image":[{"width":1920,"height":1080,"url":"https:\/\/www.realsenseai.com\/wp-content\/uploads\/2026\/01\/beginners_guide_hero.jpg","type":"image\/jpeg"}],"author":"webmaster@freshwatercreative.ca","twitter_card":"summary_large_image","twitter_misc":{"\u4f5c\u8005":"webmaster@freshwatercreative.ca","\u9884\u8ba1\u9605\u8bfb\u65f6\u95f4":"8 \u5206"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/#article","isPartOf":{"@id":"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/"},"headline":"Beginner&#8217;s guide to depth 
(Updated)","datePublished":"2019-07-15T23:10:19+00:00","dateModified":"2019-07-16T05:10:19+00:00","mainEntityOfPage":{"@id":"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/"},"wordCount":1571,"publisher":{"@id":"https:\/\/www.realsenseai.com\/cn\/#organization"},"image":{"@id":"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/#primaryimage"},"thumbnailUrl":"https:\/\/www.realsenseai.com\/wp-content\/uploads\/2026\/01\/beginners_guide_hero.jpg","keywords":["Coded Light","LiDAR","Stereo depth","Stereo depth","Time of Flight"],"articleSection":["How-To","\u65b0\u95fb\u4e0e\u6d1e\u5bdf"],"inLanguage":"zh-Hans"},{"@type":"WebPage","@id":"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/","url":"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/","name":"Beginner's guide to depth (Updated) - RealSense","isPartOf":{"@id":"https:\/\/www.realsenseai.com\/cn\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/#primaryimage"},"image":{"@id":"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/#primaryimage"},"thumbnailUrl":"https:\/\/www.realsenseai.com\/wp-content\/uploads\/2026\/01\/beginners_guide_hero.jpg","datePublished":"2019-07-15T23:10:19+00:00","dateModified":"2019-07-16T05:10:19+00:00","breadcrumb":{"@id":"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/#breadcrumb"},"inLanguage":"zh-Hans","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/"]}]},{"@type":"ImageObject","inLanguage":"zh-Hans","@id":"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/#primaryimage","url":"https:\/\/www.realsenseai.com\/wp-content\/uploads\/2026\/01\/beginners_guide_hero.jpg","contentUrl":"https:\/\/www.realsenseai.com\/wp-content\/uploads
\/2026\/01\/beginners_guide_hero.jpg","width":1920,"height":1080},{"@type":"BreadcrumbList","@id":"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/beginners-guide-to-depth\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"RealSense","item":"https:\/\/www.realsenseai.com\/cn\/"},{"@type":"ListItem","position":2,"name":"\u65b0\u95fb\u4e0e\u6d1e\u5bdf","item":"https:\/\/www.realsenseai.com\/cn\/category\/news-insights-cn\/"},{"@type":"ListItem","position":3,"name":"How-To","item":"https:\/\/www.realsenseai.com\/cn\/category\/news-insights-cn\/how-to-cn\/"},{"@type":"ListItem","position":4,"name":"Beginner&#8217;s guide to depth (Updated)"}]},{"@type":"WebSite","@id":"https:\/\/www.realsenseai.com\/cn\/#website","url":"https:\/\/www.realsenseai.com\/cn\/","name":"RealSense","description":"Powering Physical AI with Advanced Vision and Perception","publisher":{"@id":"https:\/\/www.realsenseai.com\/cn\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.realsenseai.com\/cn\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"zh-Hans"},{"@type":"Organization","@id":"https:\/\/www.realsenseai.com\/cn\/#organization","name":"RealSense","url":"https:\/\/www.realsenseai.com\/cn\/","logo":{"@type":"ImageObject","inLanguage":"zh-Hans","@id":"https:\/\/www.realsenseai.com\/cn\/#\/schema\/logo\/image\/","url":"https:\/\/realsenseai.com\/wp-content\/uploads\/2025\/07\/realsenseai_logo.jpeg","contentUrl":"https:\/\/realsenseai.com\/wp-content\/uploads\/2025\/07\/realsenseai_logo.jpeg","width":200,"height":200,"caption":"RealSense"},"image":{"@id":"https:\/\/www.realsenseai.com\/cn\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/linkedin.com\/company\/realsenseai\/","https:\/\/www.youtube.com\/@IntelRealSense"]},{"@type":"Person","@id":"https:\/\/www.realsenseai.com\/cn\/#\/schema\/person\/2672
63d5df51bbe26099751b79bf7d7a","name":"webmaster@freshwatercreative.ca","image":{"@type":"ImageObject","inLanguage":"zh-Hans","@id":"https:\/\/www.realsenseai.com\/cn\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/f709b39fa8422d35a6d83876ff73052452221b58a440c579b3494f5577e5bbc5?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/f709b39fa8422d35a6d83876ff73052452221b58a440c579b3494f5577e5bbc5?s=96&d=mm&r=g","caption":"webmaster@freshwatercreative.ca"},"sameAs":["https:\/\/realsenseai.com"]}]}},"_links":{"self":[{"href":"https:\/\/www.realsenseai.com\/cn\/wp-json\/wp\/v2\/posts\/45263","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.realsenseai.com\/cn\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.realsenseai.com\/cn\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.realsenseai.com\/cn\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.realsenseai.com\/cn\/wp-json\/wp\/v2\/comments?post=45263"}],"version-history":[{"count":0,"href":"https:\/\/www.realsenseai.com\/cn\/wp-json\/wp\/v2\/posts\/45263\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.realsenseai.com\/cn\/wp-json\/wp\/v2\/media\/42849"}],"wp:attachment":[{"href":"https:\/\/www.realsenseai.com\/cn\/wp-json\/wp\/v2\/media?parent=45263"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.realsenseai.com\/cn\/wp-json\/wp\/v2\/categories?post=45263"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.realsenseai.com\/cn\/wp-json\/wp\/v2\/tags?post=45263"},{"taxonomy":"capability_application","embeddable":true,"href":"https:\/\/www.realsenseai.com\/cn\/wp-json\/wp\/v2\/capability_application?post=45263"},{"taxonomy":"industry","embeddable":true,"href":"https:\/\/www.realsenseai.com\/cn\/wp-json\/wp\/v2\/industry?post=45263"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}