{"id":45318,"date":"2020-01-29T17:02:44","date_gmt":"2020-01-30T00:02:44","guid":{"rendered":"https:\/\/www.realsenseai.com\/uncategorized-cn\/hand-tracking-overview\/"},"modified":"2026-03-19T15:54:59","modified_gmt":"2026-03-19T21:54:59","slug":"hand-tracking-overview","status":"publish","type":"post","link":"https:\/\/www.realsenseai.com\/cn\/news-insights-cn\/hand-tracking-overview\/","title":{"rendered":"Hand tracking and Gesture Recognition"},"content":{"rendered":"\nWhen we start to talk about hand tracking and gesture recognition, they can be easily confused for each other, so it\u2019s worthwhile to start with a brief explanation of both, what they are and how they differ. Both are methods which allow users to use their hands to interact with computers, without the need for touch, controllers or devices. In some cases, hand tracking or gesture recognition systems use markers, gloves or sensors, but for the most part the ideal system doesn\u2019t require the user to touch anything. 
These kinds of systems have applications in medical interfaces, for example, allowing a surgeon to view and interact with a screen without needing to touch it, or in augmented reality headsets, where the user interacts with a digital overlay on the real world.\n\nWhile there is some overlap, gesture recognition systems and hand tracking systems have one fundamental difference \u2013 in most cases, a gesture recognition system recognizes specific gestures and only those gestures, for example, a thumbs up to indicate an \u201cok\u201d or click, or a flat hand to indicate \u201cstop.\u201d A gesture-based system is usually limited to a small number of gestures, since people have a hard time remembering more than a few, but for that limited set of hand poses, the system will usually recognize them fairly robustly.\n\n<img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-8236\" src=\"https:\/\/realsenseai.com\/wp-content\/uploads\/2020\/01\/hand_and_gesture_recognition.jpg\" alt=\"Hand tracking and Gesture Recognition\" width=\"700\" height=\"980\" srcset=\"https:\/\/www.realsenseai.com\/wp-content\/uploads\/2020\/01\/hand_and_gesture_recognition.jpg 700w, https:\/\/www.realsenseai.com\/wp-content\/uploads\/2020\/01\/hand_and_gesture_recognition-214x300.jpg 214w\" sizes=\"auto, (max-width: 700px) 100vw, 700px\" \/>\n\nA hand tracking system, however, usually supports a more variable set of interactions, since hand tracking systems typically track either the volume of the hand or the individual joint and finger positions. This allows for a theoretically unlimited number of interactions with digital objects, just as we use our hands to interact with objects in the real world, but it can run into problems with occlusion \u2013 what does the system do when your fist is closed? Where are your fingers when one hand is behind the other? 
Is that your left hand palm up, or your right hand palm down? Because gesture systems are trained on a few specific gestures, they don\u2019t have the same problems, but they also don\u2019t have the same flexibility.\n\n<img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-8241 size-full\" src=\"https:\/\/realsenseai.com\/wp-content\/uploads\/2020\/01\/hand_tracking.jpg\" alt=\"Hand tracking\" width=\"700\" height=\"467\" srcset=\"https:\/\/www.realsenseai.com\/wp-content\/uploads\/2020\/01\/hand_tracking.jpg 700w, https:\/\/www.realsenseai.com\/wp-content\/uploads\/2020\/01\/hand_tracking-300x200.jpg 300w\" sizes=\"auto, (max-width: 700px) 100vw, 700px\" \/>\n\nAs an analogy that may help clarify the difference between hand tracking and gesture recognition, think of the difference between a phone with a keypad and a modern touchscreen. The keypad has a limited number of defined buttons, which perform a specific set of interactions very well; a touchscreen can perform an unlimited number of actions, although the more nuanced interactions make it easier to hit the wrong key or letter. Neither system is inherently better or worse than the other; instead, it\u2019s important to choose whichever is optimal for your specific use case and user needs.\n<h3>Developing hand tracking systems<\/h3>\nIn the broadest sense, most modern hand tracking solutions use a machine learning approach to develop a robust system for detecting hand positions. In general, machine learning systems use known, labeled data to allow a computer to make predictions about unknown but similar data. For example, by labeling hundreds of images of cats and dogs, a system might learn to distinguish between cats and dogs with reasonably high accuracy. 
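The labeled-data idea can be sketched with a toy nearest-neighbour classifier. Everything here is invented for illustration (the two-number feature vectors stand in for whatever features a real system would extract from thousands of labeled images):

```python
import math

# Toy illustration of learning from labeled data: each sample is a
# (feature_vector, label) pair, and an unknown sample is assigned the
# label of its nearest labeled neighbour.
LABELED = [
    ((0.9, 0.1), "cat"),
    ((0.8, 0.3), "cat"),
    ((0.2, 0.9), "dog"),
    ((0.1, 0.7), "dog"),
]

def classify(features):
    """Return the label of the closest labeled example."""
    _, label = min(LABELED, key=lambda pair: math.dist(pair[0], features))
    return label

print(classify((0.85, 0.2)))  # close to the "cat" examples, prints "cat"
```

The same scheme scales up conceptually: more labeled examples and richer features give better predictions on unseen data.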
By using a similar technique to label depth images of hands, it\u2019s possible to detect finger positions with reasonably high accuracy.\n<h3>Building the dataset<\/h3>\nIn <a href=\"https:\/\/ieeexplore.ieee.org\/abstract\/document\/8781633\" target=\"_blank\" rel=\"noopener noreferrer\">this paper<\/a> published by IEEE, the authors propose using RealSense depth cameras combined with colored gloves to accurately create the dataset for hand segmentation, the crucial first step in building a hand tracking system. In the context of depth cameras, segmentation generally refers to separating a specific foreground object or objects from unimportant background elements. For example, background segmentation can be used to extract a person from their background without the need for a green screen.\n\nBy using colored gloves, the authors of the paper were able to quickly and easily distinguish the left hand from the right, as well as distinguish individual fingers when they overlap or interact with each other, such as interlaced fingers. Their automatic annotation method reduces the need for human interaction with the data and should lead to more advanced and accurate hand tracking systems.\n<h3>Hand model estimation for Augmented Reality Glasses<\/h3>\nFor augmented reality glasses, it\u2019s important that a user can easily interact with digital items without needing controllers or other physical interfaces. Ideally, every surface becomes a potential touchscreen and every pen a stylus. One of the more challenging aspects of hand tracking for augmented reality specifically is that users can still see their own physical hands, with zero latency, so the estimation algorithm must be fast and low latency. 
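The depth-based segmentation idea described above can be sketched with a simple threshold: keep only the pixels whose depth falls inside a near-range window. The tiny "depth image" and the thresholds here are invented for illustration; a real pipeline would read frames from a depth camera:

```python
# Minimal sketch of depth-based foreground segmentation: keep pixels
# whose depth falls inside a near-range window and blank out the rest.
NEAR, FAR = 0.3, 1.2  # keep anything between 0.3 m and 1.2 m

depth_image = [  # depth values in metres; the low values are the "hand"
    [2.5, 2.5, 0.8, 0.9, 2.6, 2.5],
    [2.4, 0.7, 0.6, 0.7, 2.5, 2.6],
    [2.5, 0.8, 0.6, 0.8, 2.4, 2.5],
    [2.6, 2.5, 0.9, 2.5, 2.5, 2.4],
]

def segment(depth_rows, near, far):
    """Return a binary mask: 1 where a pixel is in the foreground window."""
    return [
        [1 if near <= d <= far else 0 for d in row]
        for row in depth_rows
    ]

mask = segment(depth_image, NEAR, FAR)
for row in mask:
    print("".join("#" if v else "." for v in row))
```

Real segmentation is considerably more involved (noise filtering, connected components, holes in the depth map), but the depth threshold is the core trick that makes a green screen unnecessary.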
In addition, since the environment and background are complete unknowns, the hand tracking system must be robust to a cluttered background.\n\n<img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-8246\" src=\"https:\/\/realsenseai.com\/wp-content\/uploads\/2020\/01\/hand_tracking_augmented_reality.jpg\" alt=\"Hand Tracking Augmented Reality\" width=\"700\" height=\"700\" srcset=\"https:\/\/www.realsenseai.com\/wp-content\/uploads\/2020\/01\/hand_tracking_augmented_reality.jpg 700w, https:\/\/www.realsenseai.com\/wp-content\/uploads\/2020\/01\/hand_tracking_augmented_reality-300x300.jpg 300w, https:\/\/www.realsenseai.com\/wp-content\/uploads\/2020\/01\/hand_tracking_augmented_reality-150x150.jpg 150w\" sizes=\"auto, (max-width: 700px) 100vw, 700px\" \/>\n\nIn the paper \u201c<a href=\"https:\/\/ieeexplore.ieee.org\/document\/8951959\" target=\"_blank\" rel=\"noopener noreferrer\">Real-Time Hand Model Estimation from Depth Images for Wearable Augmented Reality Glasses<\/a>\u201d the team proposes using a RealSense depth camera with an algorithm designed to perform best as part of an augmented reality headset \u2013 for example, they make simplifying assumptions such as that wrists always appear in the lower half of the depth image frame. With their algorithm, the authors achieved hand tracking accuracy of 85-98%, depending on background objects.\n<h3>EMG sensors for hand gesture recognition<\/h3>\nThere are a variety of reasons that using EMG sensors to track finger movement is desirable: non-invasive, on-skin electrodes are used to register muscle activity. To date, while this is an interesting technique, it is not a very accurate one, in part because the electrodes must be placed repeatedly and accurately in the same position on the forearm, something difficult to achieve outside of laboratory conditions. 
In <a href=\"https:\/\/www.mdpi.com\/1424-8220\/19\/16\/3548\/htm\" target=\"_blank\" rel=\"noopener noreferrer\">this paper<\/a>, the team used 24 electrodes fixed around the forearm of experiment test subjects using 3 elastic bands with 8 electrodes on each. The experiment also included an RealSense depth camera pointed at the subject\u2019s hands as they move through a series of defined and predetermined motions. By combining the data from the array of sensors with the depth images as ground truth, they were able to correlate finger position with the electrical signals from the muscles, allowing them to create a public dataset for use in a variety of applications.\n<h3>Prosthetic hand and finger control systems using hand tracking<\/h3>\nAn example of a useful application for EMG sensor-based systems is for prosthetic hands \u2013 many current systems require precise anatomical data, precise electrode placement. This limits the amount of control a user might have over their prosthesis \u2013 limiting them to predefined grips or gestures. Again using a machine learning approach, combining an RealSense depth camera with the EMG sensors, the authors of <a href=\"https:\/\/www.ncbi.nlm.nih.gov\/pmc\/articles\/PMC6775184\/\" target=\"_blank\" rel=\"noopener noreferrer nofollow\">this paper<\/a> were able to propose an alternative to conventional EMG sensor placement with an array EMG system to cover the user\u2019s forearm, allowing detection of muscle movement deep within the user\u2019s forearm, and better able to track the finger angles with more precision. 
As this work progresses, it could lead to increasingly nuanced prosthetic control.\n\n<img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-8251\" src=\"https:\/\/realsenseai.com\/wp-content\/uploads\/2020\/01\/prosthetic_hand.jpg\" alt=\"Prosthetic hand\" width=\"700\" height=\"467\" srcset=\"https:\/\/realsenseai.com\/wp-content\/uploads\/2020\/01\/prosthetic_hand.jpg 700w, https:\/\/www.realsenseai.com\/wp-content\/uploads\/2020\/01\/prosthetic_hand-300x200.jpg 300w\" sizes=\"auto, (max-width: 700px) 100vw, 700px\" \/>\n","protected":false},"excerpt":{"rendered":"<p>When we start to talk about hand tracking and gest [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":43063,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"featured_image_focal_point":[],"inline_featured_image":false,"footnotes":""},"categories":[1210,1193],"tags":[564,612],"capability_application":[],"industry":[],"class_list":["post-45318","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-skeletal-tracking-cn","category-news-insights-cn","tag-gesture-recognition-2","tag-hand-tracking-2"],"acf":[]}