OpenNI User Guide

Table of Contents

License Notice
Overview
Natural Interaction
What is OpenNI?
Abstract Layered View
Concepts
Modules
Production Nodes
Production Node Types
Production Chains
Capabilities
Generating and Reading Data
Generating Data
Reading Data
Mock Nodes
Sharing Devices between Applications and Locking Nodes
Licensing
General Framework Utilities
Recording
Production Node Error Status
Backwards Compatibility
Getting Started
Supported Platforms
Main Objects
The Context Object
Metadata Objects
Configuration Changes
Data Generators
User Generator
Creating an empty project that uses OpenNI
Basic Functions: Initialize, Create a Node and Read Data
Enumerating Possible Production Chains
Understanding why enumeration failed
Working with Depth, Color and Audio Maps
Working with Audio Generators
Recording and Playing Data
Recording
Playing
Node Configuration
Configuration Using XML file
Licenses
Log
Production Nodes
Global Mirror
Recordings
Nodes
Queries
Configuration
Start Generating
Building and Running a Sample Application
NiSimpleRead
NiSimpleCreate
NiCRead
NiSimpleViewer
NiSampleModule
NiConvertXToONI
NiRecordSynthetic
NiViewer
NiBackRecorder
Troubleshooting
Glossary

License Notice

OpenNI is written and distributed under the GNU Lesser General Public License, which means that its source code is freely distributed and available to the general public.

You can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

OpenNI is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

Overview

Natural Interaction

The term Natural Interaction (NI) refers to a concept where human-device interaction is based on human senses, mostly focused on hearing and vision. Human-device NI paradigms render external peripherals such as remote controls, keypads, or a mouse obsolete. Examples of everyday NI usage include:

- Speech and command recognition, where devices receive instructions via vocal commands.
- Hand gestures, where pre-defined hand gestures are recognized and interpreted to activate and control devices. For example, hand gesture control enables users to manage living room consumer electronics with their bare hands.
- Body motion tracking, where full-body motion is tracked, analyzed and interpreted for gaming purposes.

What is OpenNI?

OpenNI (Open Natural Interaction) is a multi-language, cross-platform framework that defines APIs for writing applications utilizing Natural Interaction. OpenNI APIs are composed of a set of interfaces for writing NI applications. The main purpose of OpenNI is to form a standard API that enables communication with both:

- Vision and audio sensors (the devices that ‘see’ and ‘hear’ the figures and their surroundings).
- Vision and audio perception middleware (the software components that analyze the audio and visual data that is recorded from the scene, and comprehend it). For example, software that receives visual data, such as an image, and returns the location of the palm of a hand detected within the image.

OpenNI supplies a set of APIs to be implemented by the sensor devices, and a set of APIs to be implemented by the middleware components. By breaking the dependency between the sensor and the middleware, OpenNI’s API enables applications to be written and ported with no additional effort to operate on top of different middleware modules (“write once, deploy everywhere”). OpenNI’s API also enables middleware developers to write algorithms on top of raw data formats, regardless of which sensor device has produced them, and offers sensor manufacturers the capability to build sensors that power any OpenNI compliant application.

The OpenNI standard API enables natural-interaction application developers to track real-life (3D) scenes by utilizing data types that are calculated from the input of a sensor (for example, representation of a full body, representation of a hand location, an array of the pixels in a depth map, and so on). Applications can be written regardless of the sensor or middleware providers.

OpenNI is an open source API that is publicly available at www.openni.org.

Abstract Layered View

Figure 1 below displays a three-layered view of the OpenNI Concept with each layer representing an integral element:

- Top: Represents the software that implements natural interaction applications on top of OpenNI.
- Middle: Represents OpenNI, providing communication interfaces that interact with both the sensors and the middleware components that analyze the data from the sensor.
- Bottom: Shows the hardware devices that capture the visual and audio elements of the scene.

Concepts

Modules

The OpenNI Framework is an abstract layer that provides the interface for both physical devices and middleware components. The API enables multiple components to be registered in the OpenNI framework. These components are referred to as modules, and are used to produce and process the sensory data. Selecting the required hardware device or middleware component is easy and flexible.

The modules that are currently supported are:

Sensor Modules

- 3D sensor
- RGB camera
- IR camera
- Audio device (a microphone or an array of microphones)

Middleware components

- Full body analysis middleware: a software component that processes sensory data and generates body-related information (typically a data structure that describes joints, orientation, center of mass, and so on).
- Hand point analysis middleware: a software component that processes sensory data and generates the location of a hand point.
- Gesture detection middleware: a software component that identifies predefined gestures (for example, a waving hand) and alerts the application.
- Scene Analyzer middleware: a software component that analyzes the image of the scene in order to produce such information as:
  - The separation between the foreground of the scene (meaning, the figures) and the background
  - The coordinates of the floor plane
  - The individual identification of figures in the scene

Example

The illustration below displays a scenario in which 5 modules are registered to work with an OpenNI installation. Two of the modules are 3D sensors that are connected to the host. The other three are middleware components, including two that produce a person’s full-body data, and one that handles hand point tracking.

Modules, whether software or actual devices, that wish to be OpenNI compliant must implement certain interfaces.

Production Nodes

OpenNI defines Production Nodes, which are a set of units that have a productive role in the process of creating the data required for Natural Interaction based applications. Each production node can use other lower-level production nodes (read their data, control their configuration, and so on), and be used by other higher-level nodes, or by the application itself.

Example

The application wants to track the motion of a human figure in a 3D scene. This requires a production node that produces body data, or, in other words, a user generator. This specific user generator obtains its data from a depth generator. A depth generator is a production node that is implemented by a sensor, which takes raw sensory data from a depth sensor (for example, a stream of X frames per second) and outputs a depth map.

"Meaningful"3D data is defined as data that can comprehend, understand and translate the scene. Creating meaningful 3D data is a complex task. Typically, this begins by using a sensor device that produces a form of raw output data. Often, this data is a depth map, where each pixel is represented by its distance from the sensor. Dedicated middleware is then used to process this raw output, and produce a higher-level output, which can be understood and used by the application.

Common examples of higher level output are as described and illustrated below:

- The location of a user’s hand. The output can be either the center of the palm (often referred to as a ‘hand point’) or the fingertips.
- The identification of a figure within the scene. The output is the current location and orientation of the joints of this figure (often referred to as ‘body data’).
- The identification of a hand gesture (for example, waving). The output is an alert to the application that a specific hand gesture has occurred.

Production Node Types

Each production node in OpenNI has a type and belongs to one of the following categories:

- Sensor-related production nodes
- Middleware-related production nodes

The production node types that are currently supported in OpenNI are:

Sensor Related Production Nodes

- Device: A node that represents a physical device (for example, a depth sensor, or an RGB camera). The main role of this node is to enable device configuration.
- Depth Generator: A node that generates a depth map. This node should be implemented by any 3D sensor that wishes to be certified as OpenNI compliant.
- Image Generator: A node that generates colored image maps. This node should be implemented by any color sensor that wishes to be certified as OpenNI compliant.
- IR Generator: A node that generates IR image maps. This node should be implemented by any IR sensor that wishes to be certified as OpenNI compliant.
- Audio Generator: A node that generates an audio stream. This node should be implemented by any audio device that wishes to be certified as OpenNI compliant.

Middleware Related Production Nodes

- Gestures Alert Generator: Generates callbacks to the application when specific gestures are identified.
- Scene Analyzer: Analyzes a scene, including the separation of the foreground from the background, identification of figures in the scene, and detection of the floor plane. The Scene Analyzer’s main output is a labeled depth map, in which each pixel holds a label that states whether it represents a figure or is part of the background.
- Hand Point Generator: Supports hand detection and tracking. This node generates callbacks that provide alerts when a hand point (meaning, a palm) is detected, and when a hand point that is currently being tracked changes its location.
- User Generator: Generates a representation of a (full or partial) body in the 3D scene.

For recording purposes, the following production node types are supported:

- Recorder: Implements data recordings
- Player: Reads data from a recording and plays it
- Codec: Used to compress and decompress data in recordings

Production Chains

As explained previously, several modules (middleware components and sensors) can be simultaneously registered to a single OpenNI implementation. This topology offers applications the flexibility to select the specific sensor devices and middleware components with which to produce and process the data.

What is a production chain?

In the Production Nodes section, an example was presented in which a user generator type of production node is created by the application. In order to produce body data, this production node uses a lower-level depth generator, which reads raw data from a sensor. In the example below, the sequence of nodes (user generator => depth generator), in which each node relies on the one below it in order to produce the required body data, is called a production chain.

Different vendors (brand names) can supply their own implementations of the same type of production node.

Example:

Brand A provides an implementation (a module) of user generator middleware. Brand B provides separate middleware that implements a user generator. Both generators are available to the application developer. OpenNI enables the application to define which modules, or production chain, to use. The OpenNI interface enumerates all possible production chains according to the registered modules. The application can then choose one of these chains, based on the preference for a specific brand, component, or version and so on, and create it.

Note: An application can also be non-specific, and request the first enumerated production chain from OpenNI.

Typically, an application is only interested in the topmost node of each chain. This is the node that outputs the required data on a practical level; for example, a hand point generator. OpenNI enables the application to use a single node, without being aware of the production chain beneath this node. For advanced tweaking, there is an option to access this chain, and configure each of the nodes.

For example, if we look at the system illustration that was presented earlier, it described multiple registered modules and devices. Once an application requests a user generator, OpenNI returns the following four optional production chains to be used to obtain body data:

The above illustration shows a scenario in which the following modules were registered to OpenNI:

- Two body middleware components, each from a different brand.
- Two 3D sensors, each from a different brand.

This illustration displays the four optional production chains that were found for this implementation. Each chain represents a possible combination of a body middleware component and a 3D sensor device. OpenNI offers the application the option to choose from the above four production chain alternatives.

Capabilities

The Capabilities mechanism supports the flexibility of the registration of multiple middleware components and devices to OpenNI. OpenNI acknowledges that different providers may have varying capabilities and configuration options for their production nodes, and therefore certain non-mandatory extensions are defined by the OpenNI API. These optional extensions to the API are called Capabilities, and reveal additional functionality, enabling providers to decide individually whether to implement an extension. A production node can be asked whether it supports a specific capability. If it does, those functions can be called for that specific node (see the sketch after the list below).

OpenNI is released with a specific set of capabilities, with the option of adding further capabilities in the future. Each module can declare the capabilities it supports. Furthermore, when requesting enumeration of production chains, the application can specify the capabilities that should be supported as criteria. Only modules that support the requested capability are returned by the enumeration.

Currently supported capabilities:

- Alternative View: Enables any type of map generator (depth, image, IR) to transform its data to appear as if the sensor is placed in another location (represented by another production node, usually another sensor).
- Cropping: Enables a map generator (depth, image, IR) to output a selected area of the frame as opposed to the entire frame. When cropping is enabled, the size of the generated map is reduced to fit a lower resolution (fewer pixels). For example, if the map generator is working in VGA resolution (640x480) and the application chooses to crop at 300x200, the next pixel row will begin after 300 pixels. Cropping can be very useful for performance boosting.
- Frame Sync: Enables two sensors producing frame data (for example, depth and image) to synchronize their frames so that they arrive at the same time.
- Mirror: Enables mirroring of the data produced by a generator. Mirroring is useful if the sensor is placed in front of the user, as the image captured by the sensor is mirrored, so the right hand appears as the left hand of the mirrored figure.
- Pose Detection: Enables a user generator to recognize when the user is posed in a specific position.
- Skeleton: Enables a user generator to output the skeletal data of the user. This data includes the location of the skeletal joints, the ability to track skeleton positions, and the user calibration capabilities.
- User Position: Enables a Depth Generator to optimize the output depth map that is generated for a specific area of the scene.
- Error State: Enables a node to report that it is in "Error" status, meaning that on a practical level, the node may not function properly.
- Lock Aware: Enables a node to be locked outside the context boundary. For more information, see Sharing Devices between Applications and Locking Nodes.
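As an illustration of the capability-query pattern described above, here is a minimal sketch that checks for the Mirror capability on an existing depth node before calling its functions (the node name depth is illustrative):

```cpp
// A node can be asked whether it supports a specific capability;
// only if it does may the capability's functions be called
if (depth.IsCapabilitySupported(XN_CAPABILITY_MIRROR))
{
    depth.GetMirrorCap().SetMirror(TRUE);
}
```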

Generating and Reading Data

Generating Data

Production nodes that also produce data are called Generators, as discussed previously. Once these are created, they do not immediately start generating data, to enable the application to set the required configuration. This ensures that once the object begins streaming data to the application, the data is generated according to the required configuration. Data Generators do not actually produce any data until specifically asked to do so. The xn::Generator::StartGenerating() function is used to begin generating. The application may also want to stop the data generation without destroying the node, in order to store the configuration, and can do this using the xn::Generator::StopGenerating() function.
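A minimal sketch of this lifecycle for a depth node, using the C++ wrapper (error checking omitted for brevity):

```cpp
#include <XnCppWrapper.h>

int main()
{
    xn::Context context;
    context.Init();

    // Creating the node does not start data generation,
    // so the application can configure it first
    xn::DepthGenerator depth;
    depth.Create(context);

    // ... set the required configuration here ...

    // Begin generating data
    depth.StartGenerating();

    // ... read and process data ...

    // Stop generating without destroying the node;
    // its configuration is preserved
    depth.StopGenerating();

    context.Shutdown();
    return 0;
}
```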

Reading Data

Data Generators constantly receive new data. However, the application may still be using older data (for example, the previous frame of the depth map). As a result of this, any generator should internally store new data, until explicitly requested to update to the newest available data.

This means that Data Generators "hide" new data internally, until explicitly requested to expose the most updated data to the application, using the UpdateData request function. OpenNI enables the application to wait for new data to be available, and then update it using the xn::Generator::WaitAndUpdateData() function.

In certain cases, the application holds more than one node, and wants all the nodes to be updated. OpenNI provides several functions to do this, according to the specifications of what should occur before the UpdateData occurs:

- xn::Context::WaitAnyUpdateAll(): Waits for any node to have new data. Once new data is available from any node, all nodes are updated.
- xn::Context::WaitOneUpdateAll(): Waits for a specific node to have new data. Once new data is available from this node, all nodes are updated. This is especially useful when several nodes are producing data, but only one determines the progress of the application.
- xn::Context::WaitNoneUpdateAll(): Does not wait for anything. All nodes are immediately updated.
- xn::Context::WaitAndUpdateAll(): Waits for all nodes to have new data available, and then updates them.

The above four functions exit after a timeout of two seconds. It is strongly advised that you use one of the xn::Context::Wait[…]UpdateAll() functions, unless you only need to update a specific node. In addition to updating all the nodes, these functions have the following additional benefits:

- If nodes depend on each other, the function guarantees that the "needed" node (the lower-level node generating the data for another node) is updated before the "needing" node.
- When playing data from a recording, the function reads data from the recording until the condition is met.
- If a recorder exists, the function automatically records the data from all nodes added to this recorder.
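Continuing the sketch above, one iteration of a typical update loop might look like this (a sketch; the printf-based error reporting is illustrative):

```cpp
// Block until any node has new data, then update all nodes
// (a "needed" node is always updated before a "needing" node)
XnStatus nRetVal = context.WaitAnyUpdateAll();
if (nRetVal != XN_STATUS_OK)
{
    printf("Failed updating data: %s\n", xnGetStatusString(nRetVal));
}

// All nodes now expose their most recent data, for example:
const XnDepthPixel* pDepthMap = depth.GetDepthMap();
```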

Mock Nodes

OpenNI provides a mock implementation for nodes. A mock implementation of a node does not contain any logic for generating data. Instead, it allows an outside component (such as an application, or another node implementation) to feed it configuration changes and data. Mock nodes are rarely required by the application, and are usually used by player nodes to simulate actual nodes when reading data from a recording.

Sharing Devices between Applications and Locking Nodes

In most cases, the data generated by OpenNI nodes comes from a hardware device. A hardware device can usually be set to more than one configuration. Therefore, if several applications all using the same hardware device are running simultaneously, their configuration must be synchronized.

However, usually, when writing an application, it is impossible to know what other applications may be executed simultaneously, and as such, synchronization of the configuration is not possible. Additionally, sometimes it is essential that an application use a specific configuration, and no other.

OpenNI has two modes that enable multiple applications to share a hardware device:

- Full Sharing (default): In this mode, the application declares that it can handle any configuration of this node. The OpenNI interface enables registering callback functions for any configuration change, so the application can be notified whenever a configuration changes (by the same application, or by another application using the same hardware device).
- Locking Configuration: In this mode, an application declares that it wants to lock the current configuration of a specific node. OpenNI will therefore not allow "Set" functions to be called on this node. If the node represents a hardware device (or anything else that can be shared between processes), it should implement the "Lock Aware" capability, which enables locking across process boundaries.

Note: When a node is locked, the locking application receives a lock handle. Other than using this handle to unlock the node, the handle can also be used to change the node configuration without releasing the lock (in order that the node configuration will not be "stolen" by another application).

Licensing

OpenNI provides a simple licensing mechanism that can be used by modules and applications. An OpenNI context object, which is an object that holds the complete state of applications using OpenNI, holds a list of currently loaded licenses. This list can be accessed at any stage to search for a specific license.

A license is composed of a vendor name and a license key. Vendors who want to use this mechanism can utilize their own proprietary format for the key.

The license mechanism is used by modules, to ensure that they are only used by authorized applications. A module of a particular vendor can be installed on a specific machine, and only be accessible if the license is provided by the application using the module. During the enumeration process, when OpenNI searches for valid production chains, the module can check the licenses list. If the requested license is not registered, the module is able to hide itself, meaning that it will return zero results and therefore not be counted as a possible production chain.

OpenNI also provides a global registry for license keys, which are loaded whenever a context is initialized. Most modules require a license key from the user during installation. The license provided by the user can then be added to the global license registry, using the niLicense command-line tool, which can also be used to remove licenses.

Additionally, applications sometimes have private licenses for a module, meaning that this module can only be activated using this application (preventing other applications from using it).

General Framework Utilities

In addition to the formal OpenNI API, a set of general framework utilities is also published, intended mainly to ease the portability over various architectures and operating systems. The utilities include:

- A USB access abstraction layer (provided with a driver for Microsoft Windows)
- Certain basic data type implementations (including list, hash, and so on)
- Log and dump systems
- Memory and performance profiling
- Events (enabling callbacks to be registered to a specific event)
- Scheduling of tasks

These utilities are available to any application using OpenNI. However, they are not part of standard OpenNI, and as such, backwards compatibility is only guaranteed to a certain extent.

Recording

Recordings are a powerful debug tool. They enable full capture of the data and the ability to later stream it back so that applications can simulate an exact replica of the situation to be debugged.

OpenNI supports recordings of the production nodes chain; both the entire configuration of each node, and all data streamed from a node.

OpenNI has a framework for recording data and for playing it back (using mock nodes). It also comes with the nimRecorder module, which defines a new file format (".ONI") and implements a Recorder node and a Player node for this format.
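A sketch of setting up such a recording, assuming an existing context and depth node as in the earlier examples (the file name is illustrative):

```cpp
xn::Recorder recorder;
recorder.Create(context);

// Record to an .ONI file on disk
recorder.SetDestination(XN_RECORD_MEDIUM_FILE, "capture.oni");

// Add the depth node, compressed with one of the built-in codecs
recorder.AddNodeToRecording(depth, XN_CODEC_16Z_EMB_TABLES);

// From now on, each update (for example, via a Wait[...]UpdateAll() call)
// also writes the new frame to the file
```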

Production Node Error Status

Each production node has an error status, indicating whether it is currently functional. For example, a device node may not be functional if the device is disconnected from the host machine. The default error state is always OK, unless an Error Status capability is implemented. This capability allows the production node to change its error status if an error occurs. A node that does not implement this capability always has a status of "OK".

An application can check the error status of each node, although usually it only needs to know whether any node has an error status, and is less interested in which node (other than for user notification purposes). In order to receive notifications about a change in the error status of a node, the application can register a callback that will alert it of any change in a node's error status.

OpenNI aggregates the error statuses of all the nodes together into a single error status, called the Global Error Status. This makes it easier for applications to find out about the current state of a node or nodes. A global error status of XN_STATUS_OK means that all the nodes are OK. If only one node has an error status, that error status becomes the global error status (for example, if one sensor is disconnected, the OpenNI global error status is XN_STATUS_DEVICE_NOT_CONNECTED). If more than one node has an error status, the global error status is XN_STATUS_MULTIPLE_NODES_ERROR. In such a situation, the application can review all nodes and check which one has an error status, and why.

Backwards Compatibility

OpenNI declares full backwards compatibility. This means that every application developed over any version of OpenNI, can also work with every future OpenNI version, without requiring recompilation.

On a practical level, this means that each computer should ideally have the latest OpenNI version installed on it or, failing that, the latest OpenNI version required by any of the applications installed on that computer. In order to achieve this, we recommend that the application installation should also install OpenNI.

Getting Started

Supported Platforms

OpenNI is available on the following platforms:

- Windows XP and later, for 32-bit only
- Linux Ubuntu 10.10 and later, for x86

Main Objects

The Context Object

The context is the main object in OpenNI. A context is an object that holds the complete state of applications using OpenNI, including all the production chains used by the application. The same application can create more than one context, but the contexts cannot share information. For example, a middleware node cannot use a device node from another context. The context must be initialized once, prior to its initial use. At this point, all plugged-in modules are loaded and analyzed. To free the memory used by the context, the application should call the shutdown function.

Metadata Objects

OpenNI Metadata objects encapsulate a set of properties that relate to specific data, alongside the data itself. For example, a typical property of a depth map is its resolution (the number of pixels on both the X and Y axes). Each generator that produces data has its own specific metadata object.

In addition, the metadata objects play an important role in recording the configuration of a node at the time the corresponding data was generated. Sometimes, while reading data from a node, an application changes the node configuration. This can cause inconsistencies that may cause errors in the application, if not handled properly.

Example

A depth generator is configured to produce depth maps in QVGA resolution (320x240 pixels), and the application constantly reads data from it. At some point, the application changes the node output resolution to VGA (640x480 pixels). Until a new frame arrives, the application may encounter an inconsistency where calling the xn::DepthGenerator::GetDepthMap() function returns a QVGA map, while calling the xn::DepthGenerator::GetMapOutputMode() function reports that the current resolution is VGA. This can result in the application assuming that the depth map that was received is in VGA resolution, and therefore trying to access nonexistent pixels.

The solution is as follows: each node has its own metadata object, which records the properties of the data at the time it was read. In the above case, the correct way to handle the data would be to get the metadata object, and read both the real data (in this case, a QVGA depth map) and its corresponding resolution from this object.
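A sketch of this pattern for a depth node (assuming an existing depth node as in the earlier examples):

```cpp
xn::DepthMetaData depthMD;
depth.GetMetaData(depthMD);

// Resolution recorded when this frame was read -
// guaranteed to match the data buffer below
XnUInt32 xRes = depthMD.XRes();
XnUInt32 yRes = depthMD.YRes();

// The frame itself, consistent with xRes and yRes
const XnDepthPixel* pDepthMap = depthMD.Data();
```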

Configuration Changes

Each configuration option in the OpenNI interfaces comprises the following functions:

- A Set function for modifying the configuration.
- A Get function for providing the current value.
- Register and Unregister functions, enabling registration of callback functions to be called when this option changes.
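For example, a map generator's output mode follows this pattern. A sketch, assuming an existing depth node (the handler name and mode values are illustrative):

```cpp
// Called whenever the output mode changes
// (by this application, or by another one sharing the device)
void XN_CALLBACK_TYPE OnOutputModeChange(xn::ProductionNode& node, void* pCookie)
{
    // react to the configuration change
}

void ConfigureDepth(xn::DepthGenerator& depth)
{
    // Set
    XnMapOutputMode mode;
    mode.nXRes = 640;
    mode.nYRes = 480;
    mode.nFPS = 30;
    depth.SetMapOutputMode(mode);

    // Get
    XnMapOutputMode current;
    depth.GetMapOutputMode(current);

    // Register / Unregister
    XnCallbackHandle hCallback;
    depth.RegisterToMapOutputModeChange(OnOutputModeChange, NULL, hCallback);
    // ...
    depth.UnregisterFromMapOutputModeChange(hCallback);
}
```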

Data Generators

Map Generator

The basic interface for all data generators that produce any type of map.

Main functionalities:

- Output Mode property: Controls the configuration by which to generate the map
- Cropping capability
- Alternative Viewpoint capability
- Frame Sync capability

Depth Generator

An object that generates a depth map.

Main Functionalities:

- Get Depth Map: Provides the depth map
- Get Device Max Depth: The maximum distance available for this depth generator
- Field of View property: Configures the values of the horizontal and vertical angles of the sensor
- User Position capability
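A brief sketch of querying these properties on an existing depth node:

```cpp
// Maximum depth value this generator can produce
// (for typical sensors, in millimeters)
XnDepthPixel maxDepth = depth.GetDeviceMaxDepth();

// Horizontal and vertical field of view of the sensor, in radians
XnFieldOfView fov;
depth.GetFieldOfView(fov);
```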

Image Generator

A Map Generator that generates a color image map.

Main Functionalities:

- Get Image Map: Provides the color image map
- Pixel Format property

IR Generator

A map generator that generates an IR map.

Main Functionality:

- Get IR Map: Provides the current IR map

Scene Analyzer

A map generator that gets raw sensory data and generates a map with labels that clarify the scene.

Main Functionalities:

- Get Label Map: Provides a map in which each pixel has a meaningful label (for example, figure 1, figure 2, background, and so on)
- Get Floor: Provides the coordinates of the floor plane

Audio Generator

An object that generates Audio data.

Main Functionalities:

- Get Audio Buffer
- Wave Output Modes property: Configures the audio output, including sample rate, number of channels, and bits-per-sample

Gesture Generator

An object that enables specific body or hand gesture tracking.

Main Functionalities:

- Add/Remove Gesture: Turn on/off a gesture. Once turned on, the generator will start looking for this gesture.
- Get Active Gestures: Provides the names of the gestures that are currently active
- Register/Unregister Gesture callbacks
- Register/Unregister Gesture change

Hand Point Generator

An object that enables hand point tracking.

Main Functionalities:

- Start/Stop Tracking: Start/stop tracking a specific hand (according to its position)
- Register/Unregister Hand Callbacks: The following actions will generate hand callbacks:
  - When a new hand is created
  - When an existing hand is in a new position
  - When an existing hand disappears

User Generator

An object that generates data relating to a figure in the scene.

Main Functionalities:

- Get Number of Users: Provides the number of users currently detected in the scene
- Get Users: Provides the current users
- Get User CoM: Returns the location of the center of mass of the user
- Get User Pixels: Provides the pixels that represent the user. The output is a map of the pixels of the entire scene, where the pixels that represent the body are labeled with the user ID.
- Register/Unregister user callbacks: The following actions will generate user callbacks:
  - When a new user is identified
  - When an existing user disappears
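A sketch of reading basic user data from an existing user generator node (the node name userGen and the array size are illustrative):

```cpp
// IDs of the users currently detected in the scene
XnUserID aUsers[16];
XnUInt16 nUsers = 16;
userGen.GetUsers(aUsers, nUsers);

for (XnUInt16 i = 0; i < nUsers; ++i)
{
    // Center of mass of each detected user
    XnPoint3D com;
    userGen.GetCoM(aUsers[i], com);
}
```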

Creating an empty project that uses OpenNI

1. Open a new project or an existing one with which you want to use OpenNI.
2. In the Visual Studio menu, open the Project menu and choose Project properties.
3. In the C/C++ section, under the General node, select Additional Include Directories and add "$(OPEN_NI_INCLUDE)". This is an environment variable that points to the location of the OpenNI Include directory. (The default location is: C:\Program Files\OpenNI\Include.)
4. In the Linker section, under the General node, select Additional Library Directories and add "$(OPEN_NI_LIB)". This is an environment variable that points to the location of the OpenNI Lib directory. (The default location is: C:\Program Files\OpenNI\Lib.)
5. In the Linker section, under the Input node, select Additional Dependencies and add OpenNI.lib.
6. If you wish to use an XML file to configure OpenNI, you can start from the basic XML file that can be found in the OpenNI Data folder. (The default location is: C:\Program Files\OpenNI\Data.) For further information about OpenNI XML scripts, see Xml Scripts.
7. Ensure that you add the Additional Include and Library directories to both your Release and Debug configurations.
8. Your code files should include XnOpenNI.h if using the C interface, or XnCppWrapper.h if using the C++ interface.

Basic Functions: Initialize, Create a Node and Read Data

The following code illustrates the basic functionality of OpenNI. It initializes a Context object, and creates and reads data from a single Depth node.
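The listing below is a minimal sketch of such a program, using the C++ wrapper (the loop condition bShouldRun is illustrative, and error checks are abbreviated):

```cpp
#include <stdio.h>
#include <XnCppWrapper.h>

int main()
{
    XnStatus nRetVal = XN_STATUS_OK;
    xn::Context context;

    // Initialize the context object
    nRetVal = context.Init();
    // TODO: check error code

    // Create a DepthGenerator node
    xn::DepthGenerator depth;
    nRetVal = depth.Create(context);
    // TODO: check error code

    // Make it start generating data
    nRetVal = context.StartGeneratingAll();
    // TODO: check error code

    // Main loop
    bool bShouldRun = true;
    while (bShouldRun)
    {
        // Wait for new data to be available
        nRetVal = context.WaitOneUpdateAll(depth);
        if (nRetVal != XN_STATUS_OK)
        {
            printf("Failed updating data: %s\n", xnGetStatusString(nRetVal));
            continue;
        }

        // Take the current depth map
        const XnDepthPixel* pDepthMap = depth.GetDepthMap();

        // ... process the depth map ...
    }

    // Clean up
    context.Shutdown();
    return 0;
}
```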

Enumerating Possible Production Chains

The following code demonstrates how to gain fine control over the enumeration process. It enumerates production chains for producing User output, reduces the options using a basic query, and then chooses the first of all the possibilities.
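A sketch of such an enumeration, assuming an initialized context as in the previous sample (the Skeleton capability is used as an illustrative query criterion):

```cpp
// Build a query object: only chains whose user generator
// supports the Skeleton capability are wanted
xn::Query query;
query.AddSupportedCapability(XN_CAPABILITY_SKELETON);

// Enumerate all production chains that can produce User output
// and that match the query
xn::NodeInfoList possibleChains;
nRetVal = context.EnumerateProductionTrees(XN_NODE_TYPE_USER, &query,
                                           possibleChains, NULL);
// TODO: check error code

// Take the first matching chain
xn::NodeInfo selected = *possibleChains.Begin();

// Create the full production chain
nRetVal = context.CreateProductionTree(selected);
// TODO: check error code

// Get a handle to the topmost node of the chain
xn::UserGenerator userGen;
nRetVal = selected.GetInstance(userGen);
// TODO: check error code
```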
