Academic literature on the topic 'Azure Kinect'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Azure Kinect.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Azure Kinect"

1

Tölgyessy, Michal, Martin Dekan, and Ľuboš Chovanec. "Skeleton Tracking Accuracy and Precision Evaluation of Kinect V1, Kinect V2, and the Azure Kinect." Applied Sciences 11, no. 12 (June 21, 2021): 5756. http://dx.doi.org/10.3390/app11125756.

Abstract:
The Azure Kinect, the successor of Kinect v1 and Kinect v2, is a depth sensor. In this paper we evaluate the skeleton tracking abilities of the new sensor, namely accuracy and precision (repeatability). Firstly, we state the technical features of all three sensors, since we want to put the new Azure Kinect in the context of its previous versions. Then, we present the experimental results of general accuracy and precision obtained by measuring a plate mounted to a robotic manipulator end effector which was moved along the depth axis of each sensor and compare them. In the second experiment, we mounted a human-sized figurine to the end effector and placed it in the same positions as the test plate. Positions were located 400 mm from each other. In each position, we measured relative accuracy and precision (repeatability) of the detected figurine body joints. We compared the results and concluded that the Azure Kinect surpasses its discontinued predecessors, both in accuracy and precision. It is a suitable sensor for human–robot interaction, body-motion analysis, and other gesture-based applications. Our analysis serves as a pilot study for future HMI (human–machine interaction) designs and applications using the new Kinect Azure and puts it in the context of its successful predecessors.
2

Tölgyessy, Michal, Martin Dekan, Ľuboš Chovanec, and Peter Hubinský. "Evaluation of the Azure Kinect and Its Comparison to Kinect V1 and Kinect V2." Sensors 21, no. 2 (January 8, 2021): 413. http://dx.doi.org/10.3390/s21020413.

Abstract:
The Azure Kinect is the successor of Kinect v1 and Kinect v2. In this paper we perform brief data analysis and comparison of all Kinect versions with focus on precision (repeatability) and various aspects of noise of these three sensors. Then we thoroughly evaluate the new Azure Kinect; namely its warm-up time, precision (and sources of its variability), accuracy (thoroughly, using a robotic arm), reflectivity (using 18 different materials), and the multipath and flying pixel phenomenon. Furthermore, we validate its performance in both indoor and outdoor environments, including direct and indirect sun conditions. We conclude with a discussion on its improvements in the context of the evolution of the Kinect sensor. It was shown that it is crucial to choose well-designed experiments to measure accuracy, since the RGB and depth camera are not aligned. Our measurements confirm the officially stated values, namely standard deviation ≤17 mm, and distance error <11 mm at distances of up to 3.5 m from the sensor in all four supported modes. The device, however, has to be warmed up for at least 40–50 min to give stable results. Due to the time-of-flight technology, the Azure Kinect cannot be reliably used in direct sunlight. Therefore, it is convenient mostly for indoor applications.
3

Ivorra, Eugenio, Mario Ortega, and Mariano Alcaniz. "Azure Kinect Body Tracking under Review for the Specific Case of Upper Limb Exercises." MM Science Journal 2021, no. 2 (June 2, 2021): 4333–41. http://dx.doi.org/10.17973/mmsj.2021_6_2021012.

Abstract:
A tool for human pose estimation and quantification using consumer-level equipment is a long-pursued objective. Many studies have employed the Microsoft Kinect v2 depth camera, but with the recent release of the new Kinect Azure a revision is required. This work researches the specific case of estimating the range of motion in five upper limb exercises using four different pose estimation methods. These exercises were recorded with the Kinect Azure camera and assessed against the OptiTrack motion tracking system as a baseline. The statistical analysis consisted of evaluating intra-rater reliability with the intra-class correlation, the Pearson correlation coefficient, and the Bland–Altman statistical procedure. The modified version of the OpenPose algorithm with the post-processing algorithm PoseFix had excellent reliability, with most intra-class correlations being over 0.75. The Azure body tracking algorithm had intermediate results. The results obtained justify clinicians employing these methods, as quick, simple, and low-cost tools, to assess upper limb angles.
4

Neupane, Chiranjivi, Anand Koirala, Zhenglin Wang, and Kerry Brian Walsh. "Evaluation of Depth Cameras for Use in Fruit Localization and Sizing: Finding a Successor to Kinect v2." Agronomy 11, no. 9 (September 5, 2021): 1780. http://dx.doi.org/10.3390/agronomy11091780.

Abstract:
Eight depth cameras varying in operational principle (stereoscopy: ZED, ZED2, OAK-D; IR active stereoscopy: Real Sense D435; time of flight (ToF): Real Sense L515, Kinect v2, Blaze 101, Azure Kinect) were compared in context of use for in-orchard fruit localization and sizing. For this application, a specification on bias-corrected root mean square error of 20 mm for a camera-to-fruit distance of 2 m and operation under sunlit field conditions was set. The ToF cameras achieved the measurement specification, with a recommendation for use of Blaze 101 or Azure Kinect made in terms of operation in sunlight and in orchard conditions. For a camera-to-fruit distance of 1.5 m in sunlight, the Azure Kinect measurement achieved an RMSE of 6 mm, a bias of 17 mm, an SD of 2 mm and a fill rate of 100% for depth values of a central 50 × 50 pixels group. To enable inter-study comparisons, it is recommended that future assessments of depth cameras for this application should include estimation of a bias-corrected RMSE and estimation of bias on estimated camera-to-fruit distances at 50 cm intervals to 3 m, under both artificial light and sunlight, with characterization of image distortion and estimation of fill rate.
5

Albert, Justin Amadeus, Victor Owolabi, Arnd Gebel, Clemens Markus Brahms, Urs Granacher, and Bert Arnrich. "Evaluation of the Pose Tracking Performance of the Azure Kinect and Kinect v2 for Gait Analysis in Comparison with a Gold Standard: A Pilot Study." Sensors 20, no. 18 (September 8, 2020): 5104. http://dx.doi.org/10.3390/s20185104.

Abstract:
Gait analysis is an important tool for the early detection of neurological diseases and for the assessment of risk of falling in elderly people. The availability of low-cost camera hardware on the market today and recent advances in Machine Learning enable a wide range of clinical and health-related applications, such as patient monitoring or exercise recognition at home. In this study, we evaluated the motion tracking performance of the latest generation of the Microsoft Kinect camera, Azure Kinect, compared to its predecessor Kinect v2 in terms of treadmill walking using a gold standard Vicon multi-camera motion capturing system and the 39 marker Plug-in Gait model. Five young and healthy subjects walked on a treadmill at three different velocities while data were recorded simultaneously with all three camera systems. An easy-to-administer camera calibration method developed here was used to spatially align the 3D skeleton data from both Kinect cameras and the Vicon system. With this calibration, the spatial agreement of joint positions between the two Kinect cameras and the reference system was evaluated. In addition, we compared the accuracy of certain spatio-temporal gait parameters, i.e., step length, step time, step width, and stride time calculated from the Kinect data, with the gold standard system. Our results showed that the improved hardware and the motion tracking algorithm of the Azure Kinect camera led to a significantly higher accuracy of the spatial gait parameters than the predecessor Kinect v2, while no significant differences were found between the temporal parameters. Furthermore, we explain in detail how this experimental setup could be used to continuously monitor the progress during gait rehabilitation in older people.
6

Terven, Juan R., and Diana M. Córdova-Esparza. "KinZ: An Azure Kinect Toolkit for Python and Matlab." Science of Computer Programming 211 (November 2021): 102702. http://dx.doi.org/10.1016/j.scico.2021.102702.

7

Bailey, Janet L., and Bradley K. Jensen. "Telementoring: Using the Kinect and Microsoft Azure to Save Lives." International Journal of Electronic Finance 7, no. 1 (2013): 33. http://dx.doi.org/10.1504/ijef.2013.051755.

8

Antico, Mauro, Nicoletta Balletti, Gennaro Laudato, Aldo Lazich, Marco Notarantonio, Rocco Oliveto, Stefano Ricciardi, Simone Scalabrino, and Jonathan Simeone. "Postural Control Assessment via Microsoft Azure Kinect DK: An Evaluation Study." Computer Methods and Programs in Biomedicine 209 (September 2021): 106324. http://dx.doi.org/10.1016/j.cmpb.2021.106324.

9

Yeung, Ling-Fung, Zhenqun Yang, Kenneth Chik-Chi Cheng, Dan Du, and Raymond Kai-Yu Tong. "Effects of Camera Viewing Angles on Tracking Kinematic Gait Patterns Using Azure Kinect, Kinect v2 and Orbbec Astra Pro v2." Gait & Posture 87 (June 2021): 19–26. http://dx.doi.org/10.1016/j.gaitpost.2021.04.005.

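The journal articles above lean on a few recurring measurement procedures. Entries 1 and 2 quantify precision (repeatability) as the temporal standard deviation of repeated depth readings of a static target. The sketch below illustrates that idea with the community pyk4a Python bindings; pyk4a, a static scene, and a fully warmed-up sensor (the authors recommend 40–50 min) are assumptions of this illustration, not part of any cited paper. (Entry 6's KinZ toolkit offers comparable Python and Matlab access, but its API is not reproduced here.)

```python
# Estimate per-pixel depth repeatability (temporal standard deviation) over
# 100 frames of a static scene, in the spirit of the precision measurements
# in entries 1 and 2. Assumes pyk4a is installed, one Azure Kinect is
# connected, and the sensor has warmed up (the authors suggest 40-50 min).
import numpy as np
from pyk4a import FPS, Config, DepthMode, PyK4A

k4a = PyK4A(Config(depth_mode=DepthMode.NFOV_UNBINNED, camera_fps=FPS.FPS_30))
k4a.start()

frames = []
while len(frames) < 100:
    capture = k4a.get_capture()
    if capture.depth is not None:
        frames.append(capture.depth.astype(np.float32))  # depth in millimetres
k4a.stop()

stack = np.stack(frames)           # shape (N, H, W)
valid = (stack > 0).all(axis=0)    # pixels with a valid reading in every frame
per_pixel_sd = stack.std(axis=0)   # temporal SD per pixel, in mm
print(f"median per-pixel SD: {np.median(per_pixel_sd[valid]):.2f} mm")
```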
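
Entry 4 reports depth-error metrics that are easy to conflate: bias, standard deviation (SD), bias-corrected RMSE, and fill rate. Here is one plausible, self-contained reading of those definitions, computed over a synthetic 50 × 50 pixel patch at a known camera-to-fruit distance; the numbers are illustrative only, not the authors' data or code.

```python
# Illustrative computation of the depth-error metrics named in entry 4:
# bias, SD, bias-corrected RMSE, and fill rate for a 50 x 50 depth patch
# at a known ground-truth distance. Synthetic numbers, not the authors' code.
import numpy as np

true_distance_mm = 1500.0                    # known camera-to-fruit distance
rng = np.random.default_rng(0)
patch = true_distance_mm + 17.0 + rng.normal(0.0, 2.0, size=(50, 50))
patch[rng.random(patch.shape) < 0.02] = 0.0  # zeros mark invalid depth pixels

valid = patch > 0
fill_rate = valid.mean()                     # fraction of valid pixels
errors = patch[valid] - true_distance_mm     # signed per-pixel error, in mm
bias = errors.mean()                         # systematic offset
sd = errors.std(ddof=1)                      # spread around the bias
rmse_corrected = np.sqrt(np.mean((errors - bias) ** 2))  # bias-corrected RMSE

print(f"fill rate {fill_rate:.0%}, bias {bias:.1f} mm, "
      f"SD {sd:.1f} mm, bias-corrected RMSE {rmse_corrected:.1f} mm")
```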

Dissertations / Theses on the topic "Azure Kinect"

1

Baroya, Sydney. "Real-Time Body Tracking and Projection Mapping in the Interactive Arts." DigitalCommons@CalPoly, 2020. https://digitalcommons.calpoly.edu/theses/2250.

Abstract:
Projection mapping, a subtopic of augmented reality, displays computer-generated light visualizations from projectors onto the real environment. A challenge for projection mapping in performing interactive arts is dynamic body movements. Accuracy and speed are key components for an immersive application of body projection mapping and are dependent on scanning and processing time. This thesis presents a novel technique to achieve real-time body projection mapping utilizing a state-of-the-art body tracking device, Microsoft's Azure Kinect DK, by using an array of trackers for error minimization and movement prediction. The device's Sensor and Body Tracking SDKs allow multiple device synchronization. We combine our tracking results from this feature with motion prediction to provide an accurate approximation for body joint tracking. Using the new joint approximations and the depth information from the Kinect, we create a silhouette and map textures and animations to it before projecting it back onto the user. Our implementation of gesture detection provides interaction between the user and the projected images. Our results decreased the lag time created by the devices, code, and projector to create a realistic real-time body projection mapping. Our end goal was to display it in an art show. This thesis was presented at Burning Man 2019 and Delfines de San Carlos 2020 as interactive art installations.
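
The thesis above depends on the Azure Kinect SDKs' multi-device synchronization, in which daisy-chained cameras share a hardware trigger over sync cables. The sketch below shows how that configuration is exposed by the community pyk4a bindings; pyk4a itself, the device ids, and the 160 µs subordinate delay are assumptions of this illustration, and the thesis's own implementation is not shown here.

```python
# Sketch of two-device wired synchronization as exposed by pyk4a.
# One camera acts as master, the other as subordinate on the sync cable;
# the subordinate's delay keeps the two time-of-flight illuminators from
# interfering. Device ids and the 160 us delay are example values.
from pyk4a import Config, DepthMode, PyK4A, WiredSyncMode

master = PyK4A(
    Config(depth_mode=DepthMode.NFOV_UNBINNED,
           wired_sync_mode=WiredSyncMode.MASTER),
    device_id=0)
subordinate = PyK4A(
    Config(depth_mode=DepthMode.NFOV_UNBINNED,
           wired_sync_mode=WiredSyncMode.SUBORDINATE,
           subordinate_delay_off_master_usec=160),
    device_id=1)

subordinate.start()  # subordinates start first and wait for the master trigger
master.start()
captures = (master.get_capture(), subordinate.get_capture())
master.stop()
subordinate.stop()
```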

Book chapters on the topic "Azure Kinect"

1

Wang, Guangjun, Ming Cheng, Xueshu Wang, Yi Fan, Xin Chen, Liangliang Yao, Hanyuan Zhang, and Zuchang Ma. "Design and Evaluation of an Exergame System of Knee with the Azure Kinect." In Communications in Computer and Information Science, 331–42. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-16-5943-0_27.

2

Rosa, Benedetta, Filippo Colombo Zefinetti, Andrea Vitali, and Daniele Regazzoni. "RGB-D Sensors as Marker-Less MOCAP Systems: A Comparison Between Microsoft Kinect V2 and the New Microsoft Kinect Azure." In Advances in Simulation and Digital Human Modeling, 359–67. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-79763-8_43.

3

Dimitrijević, Dejan, Vladimir Todorović, Nemanja Nedić, Igor Zečević, and Sergiu Nedevschi. "Toolbox for Azure Kinect COTS Device to be Used in Automatic Screening of Idiopathic Scoliosis." In Trends and Innovations in Information Systems and Technologies, 544–52. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-45691-7_51.


Conference papers on the topic "Azure Kinect"

1

Romeo, Laura, Roberto Marani, Matteo Malosio, Anna G. Perri, and Tiziana D'Orazio. "Performance Analysis of Body Tracking with the Microsoft Azure Kinect." In 2021 29th Mediterranean Conference on Control and Automation (MED). IEEE, 2021. http://dx.doi.org/10.1109/med51440.2021.9480177.

2

Yun, Won Joon, and Joongheon Kim. "3D Modeling and WebVR Implementation Using Azure Kinect, Open3D, and Three.js." In 2020 International Conference on Information and Communication Technology Convergence (ICTC). IEEE, 2020. http://dx.doi.org/10.1109/ictc49870.2020.9289518.

3

Zhang, Chao, Xingchen Man, and Cheng Han. "Geometric Correction for Projection Image Based on Azure Kinect Depth Data." In 2020 International Conference on Virtual Reality and Visualization (ICVRV). IEEE, 2020. http://dx.doi.org/10.1109/icvrv51359.2020.00047.

4

Darwish, Walid, Quentin Bolsee, and Adrian Munteanu. "Robust Calibration of a Multi-View Azure Kinect Scanner Based on Spatial Consistency." In 2020 International Conference on 3D Immersion (IC3D). IEEE, 2020. http://dx.doi.org/10.1109/ic3d51119.2020.9376321.

5

Dobrea, Dan-Marius, Daniel Maxim, and Stefan Ceparu. "A Face Recognition System Based on a Kinect Sensor and Windows Azure Cloud Technology." In 2013 International Symposium on Signals, Circuits and Systems (ISSCS). IEEE, 2013. http://dx.doi.org/10.1109/isscs.2013.6651227.

6

Alaoui, Hamza, Mohamed Tarik Moutacalli, and Mehdi Adda. "AI-Enabled High-Level Layer for Posture Recognition Using the Azure Kinect in Unity3D." In 2020 IEEE 4th International Conference on Image Processing, Applications and Systems (IPAS). IEEE, 2020. http://dx.doi.org/10.1109/ipas50080.2020.9334945.

7

Kwon, Do Hyung, and Rose Gebhardt. "An Affordable, Accessible Human Motion Controlled Interactive Robot and Simulation through ROS and Azure Kinect." In HRI '21: ACM/IEEE International Conference on Human-Robot Interaction. New York, NY, USA: ACM, 2021. http://dx.doi.org/10.1145/3434074.3446946.

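
Entry 2 in this section feeds Azure Kinect depth frames through Open3D on the way to a Three.js WebVR view. As a rough sketch of only the first stage, the following converts a single depth frame into an Open3D point cloud; the stand-in frame and the placeholder intrinsics are assumptions of this illustration (real values come from the device calibration), not the authors' pipeline.

```python
# Sketch: lift one Azure Kinect depth frame (NFOV unbinned, 640 x 576) into
# an Open3D point cloud. The random frame and the intrinsics below are
# illustrative placeholders; real values come from the device calibration.
import numpy as np
import open3d as o3d

depth_mm = np.random.randint(500, 4000, size=(576, 640), dtype=np.uint16)
intrinsic = o3d.camera.PinholeCameraIntrinsic(
    640, 576, 504.0, 504.0, 320.0, 288.0)  # width, height, fx, fy, cx, cy

pcd = o3d.geometry.PointCloud.create_from_depth_image(
    o3d.geometry.Image(depth_mm), intrinsic,
    depth_scale=1000.0,  # input is in millimetres
    depth_trunc=5.0)     # drop points beyond 5 m
print(pcd)  # a point cloud ready for meshing, registration, or export
```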