Yingcai Xiao
Game Development with Kinect
Transcript
Page 1: Game Development with Kinect

Yingcai Xiao

Page 2: Video Game

Interactive animation:

user -> interface (look) -> action (feel) -> feedback (A/V, haptic)

Page 3: Video Game

User

Controller

Display

Game (Software)

Page 4: Video Game

Input Device Driver

Display Device Driver (GDI)

Game (Software)
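A toy sketch of how these layers cooperate (mine, not from the slides; every name in it is hypothetical): the game polls the input device driver, updates its state, and pushes feedback out through the display device driver.

    // Hypothetical layering sketch: input driver -> game software -> display driver.
    #include <iostream>
    #include <queue>

    struct InputEvent { char key; };

    std::queue<InputEvent> driverQueue;            // stand-in for the input device driver

    bool PollInput(InputEvent& e) {                // driver hands events to the game
        if (driverQueue.empty()) return false;
        e = driverQueue.front();
        driverQueue.pop();
        return true;
    }

    int playerX = 0;                               // game (software) state

    void UpdateGameState(const InputEvent& e) {    // action (feel)
        if (e.key == 'd') ++playerX;
        if (e.key == 'a') --playerX;
    }

    void Render() {                                // display driver stand-in (feedback)
        std::cout << "player at x=" << playerX << "\n";
    }

    int main() {
        driverQueue.push({'d'});                   // fake one keypress arriving
        InputEvent e;
        while (PollInput(&e)) UpdateGameState(e);  // look -> feel
        Render();                                  // A/V feedback
        return 0;
    }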

Page 5: Game controllers: input devices

The evolution of input devices:

CL (Command Line Input)
GUI (Graphical User Interface)
NUI (Natural User Interface)

Page 6: Kinect NUI - The Gang of Four

OpenNI: a general-purpose framework for obtaining data from 3D sensors

NITE: a skeleton-tracking and gesture-recognition library

SensorKinect: the driver for interfacing with the Microsoft Kinect

ZigFu: Unity Package (Assets and Scripts)

Page 7: OpenNI

Page 8: OpenNI

Page 9: OpenNI

Production Nodes:

• A set of components that have a productive role in the data-creation process required for Natural Interaction based applications.

• The API of the production nodes only defines the language.

• The logic of data generation must be implemented by the modules that plug into OpenNI.

• E.g., for a production node that represents the functionality of generating hand-point data, the logic of hand-point data generation must come from an external middleware component that is both plugged into OpenNI and knows how to produce such data.
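A minimal sketch of this split, assuming the OpenNI 1.x C++ wrapper (XnCppWrapper.h): the application programs against the node API, while the data itself comes from whichever module is registered with OpenNI.

    #include <XnCppWrapper.h>
    #include <cstdio>

    int main() {
        xn::Context context;
        XnStatus rc = context.Init();              // loads the registered plug-in modules
        if (rc != XN_STATUS_OK) {
            printf("Init failed: %s\n", xnGetStatusString(rc));
            return 1;
        }

        // The node API only defines the "language"; the depth-generation
        // logic comes from a plugged-in module such as SensorKinect.
        xn::DepthGenerator depth;
        rc = depth.Create(context);
        if (rc != XN_STATUS_OK) {
            printf("No depth module available: %s\n", xnGetStatusString(rc));
            return 1;
        }

        context.Shutdown();
        return 0;
    }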

Page 10: OpenNI

(1) body imaging (2) joint recognition (3) hand waving

Page 11: OpenNI: Sensor-Related Production Nodes

Device: represents a physical device (a depth sensor, or an RGB camera). Its main role is to enable device configuration.

Depth Generator: generates a depth-map. Must be implemented by any 3D sensor that wishes to be certified as OpenNI compliant.

Image Generator: generates colored image-maps. Must be implemented by any color sensor that wishes to be certified as OpenNI compliant.

IR Generator: generates IR image-maps. Must be implemented by any IR sensor that wishes to be certified as OpenNI compliant.

Audio Generator: generates an audio stream. Must be implemented by any audio device that wishes to be certified as OpenNI compliant.
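A short sketch (again assuming the OpenNI 1.x C++ wrapper) that creates the two most common sensor-related nodes and reads one depth frame:

    #include <XnCppWrapper.h>
    #include <cstdio>

    int main() {
        xn::Context context;
        if (context.Init() != XN_STATUS_OK) return 1;

        xn::DepthGenerator depth;                  // depth-map production node
        xn::ImageGenerator image;                  // colored image-map production node
        if (depth.Create(context) != XN_STATUS_OK) return 1;
        image.Create(context);                     // optional; fails if no RGB module

        context.StartGeneratingAll();
        context.WaitOneUpdateAll(depth);           // block until a new depth frame

        xn::DepthMetaData md;
        depth.GetMetaData(md);
        printf("depth %ux%u, center pixel = %u mm\n",
               (unsigned)md.XRes(), (unsigned)md.YRes(),
               (unsigned)md(md.XRes() / 2, md.YRes() / 2));

        context.Shutdown();
        return 0;
    }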

Page 12: OpenNI: Middleware-Related Production Nodes

Gestures Alert Generator: Generates callbacks to the application when specific gestures are identified.

Scene Analyzer: Analyzes a scene, including the separation of the foreground from the background, identification of figures in the scene, and detection of the floor plane. The Scene Analyzer’s main output is a labeled depth map, in which each pixel holds a label that states whether it represents a figure, or it is part of the background.

Hand Point Generator: Supports hand detection and tracking. This node generates callbacks that provide alerts when a hand point (meaning, a palm) is detected, and when a hand point that is currently being tracked changes its location.

User Generator: Generates a representation of a (full or partial) body in the 3D scene.
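A sketch of the callback pattern these nodes share, assuming the OpenNI 1.x C++ wrapper with NITE installed as the middleware module ("Wave" is one of NITE's built-in gesture names):

    #include <XnCppWrapper.h>
    #include <cstdio>

    // Fired when the middleware identifies the requested gesture.
    void XN_CALLBACK_TYPE OnGesture(xn::GestureGenerator&, const XnChar* name,
                                    const XnPoint3D*, const XnPoint3D*, void*) {
        printf("gesture recognized: %s\n", name);
    }

    // Fired while a gesture is in progress; unused here.
    void XN_CALLBACK_TYPE OnProgress(xn::GestureGenerator&, const XnChar*,
                                     const XnPoint3D*, XnFloat, void*) {}

    int main() {
        xn::Context context;
        if (context.Init() != XN_STATUS_OK) return 1;

        xn::GestureGenerator gesture;
        if (gesture.Create(context) != XN_STATUS_OK) return 1;   // needs NITE

        XnCallbackHandle h;
        gesture.RegisterGestureCallbacks(OnGesture, OnProgress, NULL, h);
        gesture.AddGesture("Wave", NULL);          // watch for a hand wave

        context.StartGeneratingAll();
        for (int i = 0; i < 300; ++i)
            context.WaitAnyUpdateAll();            // callbacks fire during updates

        context.Shutdown();
        return 0;
    }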

Page 13: OpenNI: Recording Production Nodes

Recorder: Implements data recordings

Player: Reads data from a recording and plays it

Codec: Used to compress and decompress data in recordings
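A sketch of a Recorder node writing depth frames to an .oni file, assuming the OpenNI 1.x C++ wrapper; a Player node (e.g., created via xn::Context::OpenFileRecording) can later replay the file:

    #include <XnCppWrapper.h>

    int main() {
        xn::Context context;
        if (context.Init() != XN_STATUS_OK) return 1;

        xn::DepthGenerator depth;
        if (depth.Create(context) != XN_STATUS_OK) return 1;

        xn::Recorder recorder;
        if (recorder.Create(context) != XN_STATUS_OK) return 1;
        recorder.SetDestination(XN_RECORD_MEDIUM_FILE, "capture.oni");
        // The codec argument selects how the recorded stream is compressed.
        recorder.AddNodeToRecording(depth, XN_CODEC_16Z_EMB_TABLES);

        context.StartGeneratingAll();
        for (int i = 0; i < 100; ++i)
            context.WaitOneUpdateAll(depth);       // each new frame is recorded

        context.Shutdown();
        return 0;
    }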

Page 14: OpenNI: Capabilities

Supports the registration of multiple middleware components and devices. OpenNI is released with a specific set of capabilities, with the option of adding further capabilities in the future. Each module can declare the capabilities it supports.

Currently supported capabilities:

Alternative View: Enables any type of map generator to transform its data to appear as if the sensor is placed in another location.

Cropping: Enables a map generator to output a selected area of the frame.

Frame Sync: Enables two sensors producing frame data (for example, depth and image) to synchronize their frames so that they arrive at the same time.

Page 15: OpenNI: Capabilities

Currently supported capabilities:

Mirror: Enables mirroring of the data produced by a generator.

Pose Detection: Enables a user generator to recognize when the user is posed in a specific position.

Skeleton: Enables a user generator to output the skeletal data of the user. This includes the locations of the skeletal joints, the ability to track skeleton positions, and user calibration.

User Position: Enables a Depth Generator to optimize the output depth map that is generated for a specific area of the scene.
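The Skeleton capability is the one games lean on most. A sketch of its typical flow, assuming the OpenNI 1.x C++ wrapper with NITE: calibrate each newly detected user, start tracking once calibration succeeds, then read joint positions.

    #include <XnCppWrapper.h>
    #include <cstdio>

    void XN_CALLBACK_TYPE OnNewUser(xn::UserGenerator& gen, XnUserID id, void*) {
        gen.GetSkeletonCap().RequestCalibration(id, TRUE);   // begin calibration
    }
    void XN_CALLBACK_TYPE OnLostUser(xn::UserGenerator&, XnUserID, void*) {}

    void XN_CALLBACK_TYPE OnCalibrated(xn::SkeletonCapability& cap, XnUserID id,
                                       XnCalibrationStatus status, void*) {
        if (status == XN_CALIBRATION_STATUS_OK)
            cap.StartTracking(id);                           // joints become available
    }

    int main() {
        xn::Context context;
        if (context.Init() != XN_STATUS_OK) return 1;

        xn::UserGenerator user;
        if (user.Create(context) != XN_STATUS_OK) return 1;  // needs NITE
        if (!user.IsCapabilitySupported(XN_CAPABILITY_SKELETON)) return 1;

        XnCallbackHandle h1, h2;
        user.RegisterUserCallbacks(OnNewUser, OnLostUser, NULL, h1);
        user.GetSkeletonCap().RegisterToCalibrationComplete(OnCalibrated, NULL, h2);
        user.GetSkeletonCap().SetSkeletonProfile(XN_SKEL_PROFILE_ALL);

        context.StartGeneratingAll();
        for (;;) {
            context.WaitAnyUpdateAll();
            XnUserID ids[8];
            XnUInt16 n = 8;
            user.GetUsers(ids, n);
            for (XnUInt16 i = 0; i < n; ++i) {
                if (!user.GetSkeletonCap().IsTracking(ids[i])) continue;
                XnSkeletonJointPosition head;
                user.GetSkeletonCap().GetSkeletonJointPosition(ids[i], XN_SKEL_HEAD, head);
                printf("user %u head at (%.0f, %.0f, %.0f) mm\n", ids[i],
                       head.position.X, head.position.Y, head.position.Z);
            }
        }
    }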

Page 16: OpenNI: Capabilities

Currently supported capabilities:

Error State: Enables a node to report that it is in "Error" status, meaning that on a practical level, the node may not function properly.

Lock Aware: Enables a node to be locked outside the context boundary.

Hand Touching FOV Edge: Alert when the hand point reaches the boundaries of the field of view.
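Because each module declares which capabilities it supports, an application is expected to test for a capability before using it. A minimal sketch with the Mirror capability (OpenNI 1.x C++ wrapper, assumed):

    #include <XnCppWrapper.h>
    #include <cstdio>

    int main() {
        xn::Context context;
        if (context.Init() != XN_STATUS_OK) return 1;

        xn::DepthGenerator depth;
        if (depth.Create(context) != XN_STATUS_OK) return 1;

        if (depth.IsCapabilitySupported(XN_CAPABILITY_MIRROR)) {
            depth.GetMirrorCap().SetMirror(TRUE);  // flip output left/right
            printf("mirroring enabled\n");
        } else {
            printf("this depth module does not declare the Mirror capability\n");
        }

        context.Shutdown();
        return 0;
    }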

Page 17: OpenNI: Generating and Reading Data

• Production nodes that also produce data are called Generators.

• Once created, they do not immediately start generating data; this enables the application to set the required configuration first.

• The xn::Generator::StartGenerating() function is used to begin generating data.

• The xn::Generator::StopGenerating() function stops it.

• Data Generators "hide" new data internally until explicitly requested to expose the most up-to-date data to the application, using the UpdateData request function.

• OpenNI enables the application to wait for new data to be available, and then update it using the xn::Generator::WaitAndUpdateData() function.
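Putting the slide's points together, a minimal end-to-end sketch (OpenNI 1.x C++ wrapper, assumed): configure the node first, then start generation, then repeatedly wait for and expose new data.

    #include <XnCppWrapper.h>
    #include <cstdio>

    int main() {
        xn::Context context;
        if (context.Init() != XN_STATUS_OK) return 1;

        xn::DepthGenerator depth;
        if (depth.Create(context) != XN_STATUS_OK) return 1;

        // Configure while the node is not yet generating.
        XnMapOutputMode mode;
        mode.nXRes = 640;
        mode.nYRes = 480;
        mode.nFPS = 30;
        depth.SetMapOutputMode(mode);

        depth.StartGenerating();                   // begin producing data
        for (int i = 0; i < 30; ++i) {
            depth.WaitAndUpdateData();             // wait, then expose newest data
            const XnDepthPixel* map = depth.GetDepthMap();
            printf("frame %d, first pixel = %u mm\n", i, (unsigned)map[0]);
        }
        depth.StopGenerating();

        context.Shutdown();
        return 0;
    }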