pyKinectAzure

[Demo: Azure Kinect color and depth combination]

Python 3 library for the Azure Kinect DK sensor SDK.

Similar solutions

Part of the ideas in this repository are taken from the following repositories:

  • pyk4a: Really nice and clean Python3 wrapper for the Kinect Azure SDK.

  • Azure-Kinect-Python: A more complete library that also uses ctypes, as this repository does; however, it lacks examples of how to use it and is harder to use.

The objective of this repository is to combine the strong points of both repositories by creating an easy-to-use library that gives access to most of the functions of the Azure Kinect, and to provide sample programs that showcase how to use the library.

Prerequisites

  • Azure-Kinect-Sensor-SDK: required to build this library. To use the SDK, refer to the installation instructions here.
  • ctypes: required to load the SDK library.
  • numpy: required for the matrix calculations.
  • opencv-python: required for the image transformations and visualization.

Installation

pip install pykinect_azure

How to use this library

  • The library has been tested on Windows 10 and Ubuntu 20.04 with Azure Kinect SDK 1.4.0 and 1.4.1; it should also work on other operating systems.

    • Windows: The pyKinectAzure class requires the path to the k4a.dll module; make sure the path matches your Azure Kinect SDK version. By default, module_path is set to C:\\Program Files\\Azure Kinect SDK v1.4.0\\sdk\\windows-desktop\\amd64\\release\\bin\\k4a.dll.

    • Linux: The pyKinectAzure class requires the path to the libk4a.so module; make sure the path matches your Azure Kinect SDK version. On Linux, set module_path to /usr/lib/x86_64-linux-gnu/libk4a.so and follow the instructions from Microsoft to install the required packages.

    • Nvidia Jetson: The pyKinectAzure class requires the path to the libk4a.so module; make sure the path matches your Azure Kinect SDK version. On an Nvidia Jetson, set module_path to /usr/lib/aarch64-linux-gnu/libk4a.so and follow the instructions from Microsoft to install the required packages (a sketch of selecting the path per platform is shown after this list).

  • The pyKinectAzure class is a wrapper around the _k4a.py module that makes the library easier to understand. However, the pyKinectAzure class still exposes only a few of the Kinect Azure SDK methods.

  • The _k4a.py module already contains all the methods of the Kinect Azure SDK, so for more advanced use of the Kinect Azure SDK check the _k4a.py module.
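
As a quick orientation, the sketch below shows how module_path could be selected for the current platform (using the default paths listed above) and passed to the pyKinectAzure class. The import and constructor argument are assumptions for illustration; the example scripts in the repository are the authoritative reference for the actual API.

import platform

from pyKinectAzure import pyKinectAzure  # assumed import, mirroring the example scripts

# Default SDK library paths listed above; adjust them to your SDK version.
if platform.system() == 'Windows':
    module_path = 'C:\\Program Files\\Azure Kinect SDK v1.4.0\\sdk\\windows-desktop\\amd64\\release\\bin\\k4a.dll'
elif platform.machine() == 'aarch64':
    module_path = '/usr/lib/aarch64-linux-gnu/libk4a.so'  # Nvidia Jetson
else:
    module_path = '/usr/lib/x86_64-linux-gnu/libk4a.so'   # Linux x86_64

# Assumed constructor: the class is documented to need the path to the SDK module.
pyK4A = pyKinectAzure(module_path)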

Examples

For an example of how to obtain and visualize the depth data from the Azure Kinect DK, check the exampleDepthImageOpenCV.py script (a sketch of the OpenCV visualization step follows the commands below).

git clone https://github.com/ibaiGorordo/pyKinectAzure.git
cd pyKinectAzure/examples
python exampleDepthImageOpenCV.py
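
Whichever wrapper calls the script uses to grab the frame, the visualization part comes down to mapping the 16-bit depth values to an 8-bit color image with OpenCV. The snippet below is a minimal sketch of that step using a synthetic depth array as a stand-in for the captured frame; it is not the example script itself.

import cv2
import numpy as np

# Stand-in for a captured depth frame: 16-bit depth values in millimeters.
depth_image = np.random.randint(0, 4000, size=(576, 640), dtype=np.uint16)

# Scale the 16-bit depth to 8 bits and apply a color map for visualization.
depth_8bit = cv2.convertScaleAbs(depth_image, alpha=255.0 / 4000.0)
depth_color = cv2.applyColorMap(depth_8bit, cv2.COLORMAP_JET)

cv2.imshow('Depth image', depth_color)
cv2.waitKey(0)
cv2.destroyAllWindows()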

There is also an example of how to obtain and visualize the smoothed depth from the Azure Kinect DK; check the exampleSmoothDepthImageOpenCV.py script.

python exampleSmoothDepthImageOpenCV.py

Note: on Linux, make sure the user has permission to access the USB devices, or always run the examples as root by prefixing the commands with sudo. One way to grant user access is shown below.
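
One way to avoid running with sudo is to install the udev rules that ship with the Azure-Kinect-Sensor-SDK sources (the 99-k4a.rules file). The commands below are a sketch that assumes you have cloned the microsoft/Azure-Kinect-Sensor-SDK repository into the current directory.

# Copy the udev rules shipped with the Azure Kinect Sensor SDK sources
sudo cp Azure-Kinect-Sensor-SDK/scripts/99-k4a.rules /etc/udev/rules.d/
# Reload the rules and replug the device so the new permissions take effect
sudo udevadm control --reload-rules && sudo udevadm trigger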

[Azure Kinect smoothed depth image comparison]

Contribution

Feel free to send pull requests.

Bug reports are also appreciated. Please include as many details as possible.

TODO:

Wrappers for the Kinect Azure data

  • Create wrapper to read depth images.
  • Create wrapper to read Infrared images.
  • Create wrapper to read IMU data.
  • Create function to convert image buffer to image depending on the image type.
  • Create wrapper to transform depth image to color image.
  • Create wrapper to transform depth image to 3D point cloud.
  • Create function to visualize 3D point cloud.

Create examples

  • Example to visualize depth images.
  • Example to visualize passive IR images.
  • Example to plot IMU data.
  • Example to visualize Depth as color image.
  • Example to overlay depth color with alpha over real image.
  • Example to visualize 3D point cloud.

Body tracking

  • Create a library for the body tracking SDK, wrapped in the same way as the current library.
  • Combine image and skeleton data.
  • Generate 3D skeleton visualization.

Future ideas

  • Run deep learning models on Kinect data (OpenPose 3D skeleton, semantic segmentation with depth, monocular depth estimation validation).
  • Track passive infrared markers for motion capture analysis.