Channel: Coding4Fun Kinect Projects (HD) - Channel 9

Unity Asset - Kinect v2 with MS-SDK


Last week I introduced you to Rumen Filkov's (aka RF Solutions) Unity Asset - Kinect [v1] with MS-SDK. As I said then, here's another of his Unity Assets, this time focusing on the Kinect v2.

Kinect v2 with MS-SDK

image

Kinect v2 with MS-SDK is a set of Kinect v2 examples that uses several major scripts, grouped in one folder. The package contains over ten demo scenes. The avatars-demo demonstrates how to utilize Kinect-controlled avatars in your Unity projects. The gestures-demo shows how to use Kinect gestures in your scenes. The interaction demo presents hand controlled cursors and utilization of the hand grips to drag and drop 3d-objects. The overlay-demos show how to align 3d-objects to the Kinect video stream. The face-tracking demos present Kinect face tracking and HD face models. The speech recognition demo shows how to use Kinect speech recognition to control the player with voice commands. There are many other demo scenes too, like the background removal demo, depth collider demo, multi-scenes demo and fitting-room demo. This package works with Kinect v2 and v1, supports 32- and 64-bit builds and can be used in Unity Pro and Unity Personal editors.

This package is free for schools, universities, students and teachers. If you meet this criterion, send me an e-mail to get the Kinect-v2 package directly from me.

Customer support: First, see if you can find the answer you’re looking for on this page, in the comments below the articles or in the Unity forum. If it is not there, you may contact me, but please don’t do it on weekends or holidays. Like everybody else, I also need some free time to rest.

How to Run the Example:

...

Download:
The official release of ‘Kinect v2 with MS-SDK’-package is available at the Unity Asset Store.

...

Troubleshooting:
* If you get exceptions at the scene start-up, make sure ...

...

What’s New in Version 2.5:

...

Videos worth 1000 Words:
Here is a video by Ricardo Salazar, created with Unity5, Kinect v2 and “Kinect v2 with MS-SDK”, v.2.3:


 

Project Information URL: http://rfilkov.com/2014/08/01/kinect-v2-with-ms-sdk/

Project Download URL: https://www.assetstore.unity3d.com/en/#!/content/18708





Kinect to your Heart


Today Dwight Goins shares a great example of using one of the coolest features of the Kinect v2: heart rate detection...

Detecting heart rate with Kinect

When the latest Kinect sensor was unveiled more than a year ago at Build 2014, demos showed how it could determine a user’s heart rate without attaching sensors or wires to his or her body. But that was old news to regular followers of D Goins Insperience, the personal blog of Dwight Goins, a Microsoft Kinect for Windows MVP and founder of Dwight Goins Inc. As Goins revealed in February 2014, he had already devised his own application for detecting a person’s heart rate with the preview version of the latest Kinect sensor.  

Goins’ app, which he has subsequently refined, takes advantage of three of the latest sensor’s key features: its time-of-flight infrared data stream, its high-definition-camera color data stream, and face tracking. The infrared stream returns an array of infrared (IR) intensities from zero to 65,535, the color stream returns RGB data pixels, and the face tracking provides real-time location and positioning of a person’s face. He thus knew how to capture a facial image, measure its infrared intensity, and gauge the RGB color brightness level in its every pixel. The following video shows Goins' Kinect v2 heart rate detector in action.

Project Information URL: http://blogs.msdn.com/b/kinectforwindows/archive/2015/06/12/detecting-heart-rate-with-kinect.aspx

Kinectv2HeartRate

Kinect for Windows v2 Heart Rate Library

image

This is a .NET WPF application which uses the R statistical programming language engine, version > 3.12, so the R engine must be installed on the system running the application. R can be installed from here: http://cran.r-project.org/. The WPF application utilizes the Kinect RGB, IR, and Face streams of data to determine a region around the face and calculate a spatially averaged brightness over time. The averaged values are then divided by their respective standard deviations to produce unit-variance values, which are required as input to ICA algorithms. The values are saved into a CSV file for processing with other machine learning techniques and algorithms.
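The preprocessing step described above, spatial averaging followed by unit-variance scaling, can be sketched in a few lines. This is my own Python/NumPy illustration, not code from the repository:

```python
import numpy as np

def normalize_channels(frames):
    """Spatially average each frame per channel, then scale each
    channel's time series to zero mean and unit variance, as the
    ICA preprocessing step describes.

    frames: array of shape (T, H, W, C) with C channels (e.g. R, G, B, IR).
    Returns an array of shape (T, C).
    """
    # Spatial average over the face region for every frame and channel.
    series = frames.mean(axis=(1, 2))          # (T, C)
    # Center, then divide by the standard deviation -> unit variance.
    series = series - series.mean(axis=0)
    return series / series.std(axis=0)
```

Scaling each channel to unit variance keeps any single stream (for example the much brighter IR channel) from dominating the ICA mixing estimate.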

The basic approach is simple. When a person's heart beats, a volume of blood is pushed through various veins and muscles. As the blood pumps through the muscles, particularly in the face, more light is absorbed, and less brightness reaches a web camera's sensor. This change in brightness is very minute, but it can be extracted using mathematical tricks. The change in brightness is also periodic; in other words, it is a signal, or wave. If we can match that signal to a blood pulse, we can calculate the heart rate.

To match the change in brightness to a blood pulse we use Independent Component Analysis (ICA). This is the "cocktail party" concept, and it is the basis for finding hidden signals within a set of mixed signals. If two people are talking in a crowded room and you have microphones placed at various locations around the room, ICA algorithms let you take a mixed sample of signals, such as sound waves, and calculate an estimated separation of the mixture into components. If you match a separated component to the original signal of a person speaking, you have found that person in the crowded room.

This ICA concept is also known as blind source separation, and this project uses the JADE algorithm for R to produce the separation matrix of components for the R, G, B, and IR mixture of data. A fast Fourier transform is then applied to the separated components to find a frequency that matches the range of a heart rate.
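The last step, picking a heart-rate frequency out of a separated component, can be sketched like this. This is my own Python illustration using NumPy's FFT; the project itself does this in R after JADE separation, and the band limits here are my assumption:

```python
import numpy as np

def estimate_bpm(component, fps, lo_hz=0.75, hi_hz=4.0):
    """Estimate heart rate from one separated component by finding the
    dominant FFT frequency inside a plausible heart-rate band
    (0.75-4 Hz, i.e. 45-240 beats per minute).

    component: 1-D time series sampled at `fps` frames per second.
    """
    spectrum = np.abs(np.fft.rfft(component))
    freqs = np.fft.rfftfreq(len(component), d=1.0 / fps)
    band = (freqs >= lo_hz) & (freqs <= hi_hz)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0
```

Restricting the search to a physiological band is what keeps slow lighting drift and high-frequency sensor noise from being mistaken for a pulse.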

Project Source URL: https://github.com/dngoins/Kinectv2HeartRate

A couple of other times we've highlighted Dwight's work;

Contact Information:




Unity Asset - Kinect v2 with MS-SDK Tips, Tricks and Examples


For the last couple of weeks I've been highlighting the Kinect Unity Assets of Rumen Filkov (aka RF Solutions): Unity Asset - Kinect [v1] with MS-SDK and Unity Asset - Kinect v2 with MS-SDK.

Today I'm wrapping up the series by sharing a great blog post from Rumen on using his “Kinect v2 with MS-SDK” asset...

Kinect v2 Tips, Tricks and Examples

After answering so many different questions about how to use various parts and components of the “Kinect v2 with MS-SDK”-package, I think it would be easier if I share some general tips, tricks and examples. I'm going to expand this article over time with more tips and examples. Please drop by from time to time to check it out.

Table of Contents:

What is the purpose of all managers in the KinectScripts-folder
How to use the Kinect v2-Package functionality in your own Unity project
How to use your own model with the AvatarController
How to make the avatar hands twist around the bone
How to utilize Kinect to interact with GUI buttons and components
How to get the depth- or color-camera textures
How to get the position of a body joint
How to make a game object rotate as the user
How to make a game object follow user’s head position and rotation
How to get the face-points’ coordinates
How to mix Kinect-captured movement with Mecanim animation
How to add new model to the FittingRoom-demo

...

Project Information URL: http://rfilkov.com/2015/01/25/kinect-v2-tips-tricks-examples/

Finally, remember the Unity Asset Store is your friend... For example, check out all these Kinect Assets

image

Contact Information:




Raspberry Pi 2 and the Kinect, Making a Hand Held Scanner


This project is off to a great start and something I think we should keep an eye on...

Proof of concept 3D Scanner with Kinect and Raspberry Pi2

I am working on a proof-of-concept standalone mobile 3D scanner. Hopefully it will be possible to use a Raspberry Pi2 for this project. I have already posted a video on YouTube, and some people asked for short instructions on how to run the Kinect on the Raspberry Pi2. Here they come…

First I printed and modified a Kinect handle I found at Thingiverse. I remixed this handle and added a Raspberry Pi2 and a display mount to it. You can find the files at: http://www.thingiverse.com/thing:698577

IMG_2821_preview_featured

You can get the Raspberry Pi display from watterott.com; instructions for installing the display can be found on GitHub. I recommend using the current Raspberry Pi image, which you can also find on GitHub.

IMG_2823_preview_featured

Start with the clean display image. I used libfreenect for some experiments. It seems that libfreenect provides all the functionality offered by the Kinect. Let's start!

First of all we need to install all required libs. We start with an update of the package list.

...

Project Information URL: http://www.mariolukas.de/2015/04/proof-of-concept-3d-scanner-with-kinect-and-raspberry-pi2/




Kinect Helps Detect PTSD


As a cold war Army veteran, and with a son who deployed to Afghanistan, this post hit close to home...

Kinect helps detect PTSD in combat soldiers

...

According to the U.S. Department of Veterans Affairs, PTSD affects 11 to 20 percent of veterans who have served in the most recent conflicts in Afghanistan and Iraq. It’s no wonder, then, that DARPA (the Defense Advanced Research Projects Agency, a part of the U.S. Department of Defense), wants to detect signs of PTSD in soldiers, in order to provide treatment as soon as possible.

One promising DARPA-funded PTSD project that has garnered substantial attention is SimSensei, a system that can detect the symptoms of PTSD while soldiers speak with a computer-generated “virtual human.” SimSensei is based on the premise that a person’s nonverbal communications—things like facial expressions, posture, gestures and speech patterns (as opposed to speech content)—are as important as what he or she says verbally in revealing signs of anxiety, stress and depression.

The Kinect sensor plays a prominent role in SimSensei by tracking the soldier’s body and posture. So, when the on-screen virtual human (researchers have named her Ellie, by the way) asks the soldier how he is feeling, the Kinect sensor tracks his overall movement and changes in posture during his reply. These nonverbal signs can reveal stress and anxiety, even if the soldier’s verbal response is “I feel fine.”

SimSensei interviews take place in a small, private room, with the subject sitting opposite the computer monitor. The Kinect sensor and other tracking devices are carefully arranged to capture all the nonverbal input. Ellie, who has been programmed with a friendly, nonjudgmental persona, asks questions in a quiet, even-tempered voice. The interview begins with fairly routine, nonthreatening queries, such as “Where are you from?” and then proceeds to more existential questions, like “When was the last time you were really happy?” Replies yield a host of verbal and nonverbal data, all of which is processed algorithmically to determine if the subject is showing the anxiety, stress and flat affect that can be signs of PTSD. If the system picks up such signals, Ellie has been programmed to ask follow-up questions that help determine if the subject needs to be seen by a human therapist.

...

Giota Stratou, one of ICT’s key programmers of SimSensei, provided details on the role of the Kinect sensor. “We used the original Kinect sensor and SDKs 1.6 and 1.7, particularly to track the points and angles of rotation of skeletal joints, from which we constructed skeleton-based features for nonverbal behavior. We included in our analysis features encoded from the skeleton focusing on head movement, hand movement and position, and we studied their overall value by integrating them in our distress predictor models.”
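To make the idea of skeleton-based features concrete, here is a small Python/NumPy sketch of the kind of summary statistics one might compute from a tracked joint's positions over an interview segment. The function and feature names are my own illustration, not ICT's actual feature set:

```python
import numpy as np

def movement_features(joint_xyz):
    """Summarize one joint's motion (e.g. head or hand) over a segment.

    joint_xyz: (T, 3) array of the joint's 3-D positions, one row per frame.
    Returns simple nonverbal-behavior features: total path length,
    mean per-frame displacement, and overall positional variance.
    """
    # Per-frame displacement vectors and their magnitudes.
    steps = np.linalg.norm(np.diff(joint_xyz, axis=0), axis=1)
    return {
        "path_length": float(steps.sum()),
        "mean_step": float(steps.mean()),
        "position_variance": float(joint_xyz.var(axis=0).sum()),
    }
```

Features like these, computed per joint and per question, are the sort of inputs a distress-predictor model could consume alongside verbal cues.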

...

Key links

Project Information URL: http://blogs.msdn.com/b/kinectforwindows/archive/2015/07/01/kinect-helps-detect-ptsd-in-combat-soldiers.aspx




Building the NUI Future...


Today's presentation is from Vincent Guigui who talks about the future of the NUI.

Building the Future of User Experience

from Vimeo.

Kinect, Oculus, Holograms, Wearables, Smart Objects... Over the past few years, we have seen a rise of new devices and sensors coming into our everyday life.

This session will explain the principles of interfaces, what innovation is, and how to use these new devices to create more natural and more personal computing experiences by blurring the line between our world and the digital one.

Project Information URL: https://vimeo.com/131932860, http://fr.slideshare.net/gcrao78/ncraftsio-2015-future-of-user-experiences 




Researching with HoloLens (and be awarded Dev Kits and Cash)


You're a researcher. You've seen HoloLens. You've got a great idea for how you can use the HoloLens to change the world. You just need a little help... say, maybe two HoloLens dev kits and $100,000...

Academic Research Request for Proposals

Microsoft believes that mixed reality can be used to create new experiences that will contribute to advances in productivity, collaboration, and innovation. We engage with researchers across many disciplines to push boundaries in the state of the art at the intersection of software and hardware.

Microsoft HoloLens goes beyond augmented reality and virtual reality by enabling you to interact with three-dimensional holograms blended with your real world. Microsoft HoloLens is more than a simple heads-up display, and its transparency means you never lose sight of the world around you. High-definition holograms integrated with your real world will unlock all-new ways to create, communicate, work, and play.


Goals

The primary goal of this request for proposals (RFP) is to better understand the role and possible applications for holographic computing in society. Additional goals are to stimulate and advance academic research in mixed reality and encourage applications of holograms for novel purposes.

Proposals are invited from, but not limited to, the following areas of interest:

  • Data visualization
    • Example: Using mixed reality to make large data sets easier to navigate and understand
  • Evolution of pedagogy in STEM, medical, and design education
    • Example: Using existing 3D assets or new 3D assets for high-value training (e.g., interactive 3D models for medical training)
  • Future of communication and distributed collaboration
    • Examples: Remote training and support, first-responder emergency management, and virtual conferences
  • Interactive art and experimental media
    • Examples: Narrative storytelling, new forms of artistic expression, interactive journalism
  • Psychology-related topics
    • Examples: Human perception and human-computer interaction
  • Solving difficult problems and contributing new insights that are specific to the applicant’s field

Monetary and hardware awards

  • Microsoft anticipates making approximately five (5) awards consisting of US$100,000 and two Microsoft HoloLens development kits each. All awards are in the form of unrestricted gifts, which are delivered directly to the universities for the purpose of funding the winning proposals.
  • The awards are intended to be used for seed-funding larger initiatives, proofs of concept, or demonstrations of feasibility. It is important to understand that funding is not expected to continue after the first year and that PIs who are granted the Microsoft HoloLens Research Awards should therefore make every effort to use the award as one component of a diverse funding base in a larger or longer-running project. Proposals with a clear plan to secure co-funding are encouraged.

...

Submission process

Proposals must be written in English and submitted through the online application tool (https://cmt.research.microsoft.com/HoloLensRFP) no later than 11:30 P.M. (Pacific Daylight Time) on September 5, 2015.

...

Project Information URL: http://research.microsoft.com/en-us/projects/hololens/default.aspx




Mousing around with the Kinect v2


Friend of the Gallery and newly minted Microsoft Kinect MVP (Congrats!) Tango Chen is back with an extremely common request, using the Kinect to control a mouse (with source!)

Kinect v2 Mouse Control w/ Source Code

A mouse control application for Kinect v2, including a couple of options for various uses.

I’m so glad that I became a Microsoft Kinect MVP this July, so I think I need to do more things.

One request asked most often since the original Kinect came out is, “can I use my hand to control the mouse cursor?” You can find a few applications for doing this with the original Kinect on the web.

These days, people still ask me for this kind of application for the Kinect v2, and strangely I found there were no such downloadable applications on the web. So I figured I had to release a Kinect v2 version. And here it is, with source code!


Options:

  • Mouse Sensitivity
  • Pause-To-Click Time Required

    How long you hold your hand still to register a click

  • Pause Movement Threshold

    The radius of the circle your hand must stay inside, for a little while, for the pause to count as a click

  • Cursor Smoothing

    The larger it is, the more smoothly, but also the more slowly, the cursor moves.

  • Grip Gesture
    Grip to drag/click
  • Pause To Click
    Hold your hand and don’t move for a little while to click.
  • No clicks, move cursor only

Project Information URL: http://tangochen.com/blog/?p=2137

Project Download URL: Kinect V2 Mouse Control - EXE

Project Source URL: https://github.com/TangoChen/KinectV2MouseControl

Contact Information:

Related past posts you might find interesting;





Kinect Studio Revisited


We've highlighted the Kinect Studio a number of times...

...but it's been over a year since our last post on it, so this post from the Kinect for Windows Team is nice and timely.

Kinect Studio lets you code on the go

...

Luckily for Anup, Kinect Studio makes coding for Kinect for Windows applications a lot easier to pack into a crowded day. Kinect Studio, which is included in the free Kinect for Windows SDK 2.0, allows a developer to record all the data that’s coming into an application through a Kinect sensor. This means that you can capture the data on a series of body movements or hand gestures and then use that data over and over again to debug or enhance your code. Instead of being moored to a Kinect for Windows setup and having to repeatedly act out the user scenario, you have a faithful record of the color image, the depth data, and the three-dimensional relationships. With this data uploaded to your handy laptop, you can squeeze in a little—or a lot—of Kinect coding whenever time permits.

Let’s take a quick look at the main features of Kinect Studio. As shown below, it features four windows: a color viewer, a depth viewer, a 3D viewer, and a control window that lets you record and play back the captured data.

The four windows in Kinect Studio, clockwise from top: control window, color viewer, depth viewer, and 3D viewer

The color viewer shows exactly what you’d expect: a faithful color image of the user scenario. The depth viewer shows the distance of people and objects in the scene using color: near objects appear red; distant ones are blue; and objects in-between show up in various shades of orange, yellow, and green. The 3D viewer gives you a three-dimensional wire-frame model of the scene, which you can rotate to explore from different perspectives.
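The near-red-to-far-blue mapping the depth viewer uses can be illustrated with a simple color ramp. This is a Python sketch; the exact ramp stops and depth range are my own assumptions, not Kinect Studio's actual values:

```python
def depth_to_color(depth_m, near=0.5, far=4.5):
    """Map a depth in meters to an (r, g, b) tuple the way the depth
    viewer is described: near objects red, distant ones blue, with
    orange, yellow, and green in between.
    """
    stops = [  # normalized depth -> color
        (0.00, (255, 0, 0)),      # red (nearest)
        (0.25, (255, 165, 0)),    # orange
        (0.50, (255, 255, 0)),    # yellow
        (0.75, (0, 255, 0)),      # green
        (1.00, (0, 0, 255)),      # blue (farthest)
    ]
    # Normalize and clamp the depth into [0, 1].
    t = min(max((depth_m - near) / (far - near), 0.0), 1.0)
    # Linearly interpolate between the two surrounding ramp stops.
    for (t0, c0), (t1, c1) in zip(stops, stops[1:]):
        if t <= t1:
            f = (t - t0) / (t1 - t0)
            return tuple(round(a + f * (b - a)) for a, b in zip(c0, c1))
    return stops[-1][1]
```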

The control, or main, window in Kinect Studio is what brings all the magic together. Here’s where you find the controls to record, save, and play back the captured scenario. You can stop and start the recording by moving the cursor along a timeline, and you can select and save sections.

Once you’ve recorded the user scenario and saved it to your laptop in Kinect Studio, you can play it over and over while you modify the code. The developers at Ubi, for instance, employ Kinect Studio to record usability sessions, during which end users act out various scenarios employing Ubi software. They can replay, stop, and start the scenarios frame by frame, to make sure their code is behaving exactly as they want. And since the recordings are accessible from a laptop, developers can test and modify their Kinect for Windows application code just about anywhere.
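The record-once, replay-many workflow described above reduces to a tiny pattern, sketched here in Python. This is a toy stand-in for the idea, not the Kinect Studio file format or API:

```python
import time

class FrameRecorder:
    """Record timestamped frames once, then replay them into any
    handler as many times as needed -- the same idea Kinect Studio
    applies to sensor data, reduced to a toy.
    """

    def __init__(self):
        self.frames = []        # list of (timestamp, frame) pairs

    def record(self, frame, timestamp=None):
        """Capture one frame, stamping it with the current time by default."""
        self.frames.append((timestamp if timestamp is not None
                            else time.monotonic(), frame))

    def replay(self, handler, realtime=False):
        """Call handler(frame) for every recorded frame, optionally
        sleeping between frames to reproduce the original timing."""
        last = None
        for ts, frame in self.frames:
            if realtime and last is not None:
                time.sleep(max(0.0, ts - last))
            last = ts
            handler(frame)
```

Because the handler sees the identical frame sequence on every replay, a bug can be reproduced deterministically without anyone standing in front of the sensor again.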

Using Kinect Studio to perform and analyze user experience studies

For Anup, it means that he can code during the bus ride home or in bed, after his children have gone to sleep. “Kinect Studio doesn’t actually increase the number of hours in the day,” he says, “but it sure feels like it.”

Project Information URL: http://blogs.msdn.com/b/kinectforwindows/archive/2015/07/10/kinect-studio-lets-you-code-on-the-go.aspx




NextStage - Realtime Camera Tracking for Kinect


Today's commercial project shows off just how powerful the Kinect really can be...

NextStage

For the past year I’ve been developing an application called NextStage. NextStage turns the Kinect V2 into a realtime virtual production camera, by tracking retroreflective markers in a scene.

More information can be found at NextStagePro.com and in the video below:

This is full 6-degree-of-freedom tracking running in realtime. Compared to the 6-DOF tracking in Kinect Fusion, it does take more time to set up the markers. However, it can track over flat surfaces; it is less processor-intensive and doesn't require a powerful GPU like Fusion does; it can handle fast motion and dynamic objects in the scene; and it doesn't have the same drift errors that Fusion can have.

I know people don't normally post their applications on this forum, but I think there are some features relevant to Kinect developers and enthusiasts.

There are two versions of NextStage, and NextStage Pro can stream the tracking data out to other applications using the OSC framework. This stream includes the Kinect’s position in meters, quaternion rotation, euler rotation, and the Kinect timestamp. Since multiple applications can access the Kinect at once, you can run NextStage in the background and stream the data out to your own Kinect project.

The marker sets that NextStage uses to track markers can also be shared between installations of NextStage. This can be used to very quickly calculate the difference in position and rotations between multiple Kinects.
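Given two sensors' poses recovered from a shared marker set, the relative position and rotation can be computed with basic quaternion algebra. This Python sketch assumes each (position, quaternion) pair maps sensor-local points into a common world frame and uses (w, x, y, z) quaternion order; NextStage's actual conventions may differ:

```python
import math

def q_conj(q):
    """Conjugate of a quaternion (w, x, y, z)."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def q_mul(a, b):
    """Hamilton product of two quaternions."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def q_rotate(q, v):
    """Rotate vector v by unit quaternion q."""
    w, x, y, z = q_mul(q_mul(q, (0.0, *v)), q_conj(q))
    return (x, y, z)

def relative_pose(pos_a, quat_a, pos_b, quat_b):
    """Pose of sensor B expressed in sensor A's frame.
    Returns (position, quaternion)."""
    inv_a = q_conj(quat_a)  # for unit quaternions, inverse == conjugate
    d = tuple(b - a for a, b in zip(pos_a, pos_b))
    return q_rotate(inv_a, d), q_mul(inv_a, quat_b)
```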

I’ve been developing this application pretty much in a vacuum, but I’m very excited to finally get it out into the world. Please let me know if you have any questions, comments or concerns.

---

- Realtime Camera Tracking

When combined with infrared or retroreflective markers, NextStage is capable of instantly and accurately tracking position and rotation in 3D space.

- Instant Matchmoving

6DOF tracking lets users easily combine live action footage with virtual objects and sets, without the need for tedious frame-by-frame post processing.

- Depth-based Keying

Separate live action subjects from the background in realtime. Depth mattes let users place live action people or subjects on a virtual set without the need for green screen.

- Creative Effects

Depth mattes can be used as an instant, high quality garbage matte for green screen footage, or to quickly rotoscope actors and objects.

- HD Capture

Capture uncompressed RGBA footage in 720p with NextStage Lite, or sync tracking data to an external camera with NextStage Pro.

- Flexible Workflows

NextStage Pro lets users export 30hz tracking data to sync external cameras and devices at 24, 25 and 30 frames per second.

Project Information URL: https://social.msdn.microsoft.com/Forums/en-US/5b3ce727-7289-4a4c-a745-b635d157e9bc/nextstage-pro-realtime-camera-tracking-for-kinect?forum=kinectv2sdk, http://nextstagepro.com/




"Anatomy for Sculptors"


Today's post isn't really Kinect related, but is augmented reality and just kind of cool and not something I run across often...

New 3D Augmented Reality Book - Anatomy for Sculptors

This book with 3D model images will help painters, sculptors, illustrators and CG artists to develop sculptures, paintings or digital images.

Head & Neck Anatomy is the latest book from Anatomy for Sculptors. The thing that makes this book different from other medical books is its use of augmented reality. This advanced technology is used in the book to provide readers with 3D images of the head and neck.

By integrating 3D imagery into the book, readers will be able to understand the material better.

Project Information URL: http://x-tech.am/new-3d-augmented-reality-book/

ANATOMY FOR SCULPTORS

image

PDF e-book

A 226-page, easy-to-use human anatomy guide for artists, explaining the human body in a simple manner. The book contains keys to figuring out construction in a direct, easy-to-follow, and highly visual manner. Art students, 3D sculptors and illustrators alike will find this manual a practical foundation upon which to build their knowledge of anatomy – an essential background for anyone wishing to draw or sculpt easily and with confidence!

Uldis Zarins' book presentation on uartsy.com - http://www.uartsy.com/program-info/anatomy-for-sculptors-free-webinar-replay-july-2014




Things to check when running Kinect for Windows apps on Windows 10


Friend of the Gallery, Abhijit Jana, recently posted a couple of tips for those of you Kinect v2 devs moving to Windows 10.

Running Kinect for Windows applications on Windows 10 – Things you should verify

Running a Kinect v2 device and a Kinect for Windows application on Windows 10 is neither difficult nor different from what we have seen in earlier versions of the Windows operating system. You can run a Kinect for Windows application (either a desktop app or a store app) on Windows 10. However, in case you find that your device is not detected properly, your application is not running, or it is not able to read data from the sensor, please verify the following.

1.  Verify Device Settings

The very first thing you need to verify is whether your device is connected and loaded properly.

Go To PC Settings –> Devices –> Connected Devices

...

2.  Verify Privacy Settings

This setting needs to be verified only for a “Store App”. For a Kinect for Windows Store App, we must select the “Microphone” and “Video” capabilities in the app manifest file. This enables the app to access the camera and microphone on the targeted device.

...

Points to remember
  1. This privacy setting is only for Store Apps. A normal desktop app will work without it.
  2. Even if the app is not running, the PC settings will be available once the app is deployed with the capability added. You can make the necessary change in settings and then start the app as well.

Project Information URL: http://dailydotnettips.com/2015/08/01/running-kinect-for-windows-applications-on-windows-10-things-you-should-verify/

Contact Information:




Kinect'ing with Gregory Kramida, University of Maryland


Shahed Chowdhuri recently posted a great interview, something I don't see nearly often enough...

UMD Kinect Q&A: an interview with Gregory Kramida at the University of Maryland

We’re here with Gregory Kramida to talk about his Kinect group projects at the University of Maryland.

Gregory Kramida at UMD


1. Greg, tell us a little bit about yourself and your team.

2. How did you get started with Kinect development?

3. What programming languages and libraries/utilities are you using?

4. What kind of challenges and limitations have you faced? How did you overcome them?

5. How many Kinect sensors are you using from a single application? Can you share more details about your configuration/setup?

6. What are the practical applications of the work you’ve done so far? What is the future direction of your projects?

7. Do you have any advice for other Kinect developers out there?

[Click through to read the entire post, including the answers... ;) ]

Project Information URL: http://wakeupandcode.com/umd-kinect-qa/

Contact Information:




Hello Channel 9 and Hello Kinect


When a new Niner makes their first post about the Kinect, well that's got to be highlighted!

Everyone say hello to Amanda Lange and check out her first video...

Kinect 100 - August 2015

image

In this entry I'm going to discuss Kinect and how to get started.

There's a ton of resources on Kinect on Channel 9, so you may wonder why I'd do a video like this.

Mostly, it's because I run into a lot of confusion when I'm starting people out on Kinect, and I wanted to do a brief overview of the absolute basics to make getting started less intimidating!

For ideas for Kinect projects, check out the Coding4Fun channel! https://channel9.msdn.com/coding4fun/kinect

If you want to go much deeper, watch the Programming for Kinect for Windows V2 series! https://channel9.msdn.com/Series/Programming-Kinect-for-Windows-v2

Project Information URL: https://channel9.msdn.com/Blogs/Amanda-Lange/Kinect-100-August-2015

Contact Information:





Kinect v2 Minecraft for You


One of our most prolific Kinect developers, Kinect MVP and Friend of the Gallery Tango Chen (Jingzhou Chen), is back with another take on Minecrafting with the Kinect.

Minecraft in Real Life using Kinect v2

Hey guys, I just developed this Kinect v2 application a few days ago. It lets you be the Steve and interact with Minecraft stuff.

And, you can get it for free! For more information, please check out this Production Page.

Project Information URL: http://tangochen.com/blog/?p=2200

Kinect v2 Minecraft

Instruction

Body Interactions:

  • You have an ax and a block. At the start, the ax is in your right hand.
    To switch between the ax and the block, move your right hand behind your back and then bring it back.
  • You don’t need to collect resources. To place a block, simply do a push-down with your right hand.
  • To break blocks, swipe your right hand with an ax.
  • Even when you break the blocks at the bottom, the blocks above won't fall down. To apply gravity to all the blocks you have placed, do a jump.
  • Throw the ax like I did in the video!

Keyboard Control:

  • Hide yourself from the detectable range of the Kinect sensor and press G (G meaning “Ground”). This lets the program know your environment better, so if you have a chair in your room, the blocks will collide with it rather than pass through it. However, this is optional; the program still works if you don't do this.
  • Press R – Reload the application.
  • Press Esc – Exit the application.
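The "Press G" ground calibration amounts to capturing a baseline depth frame of the empty room and then flagging pixels in later frames that are measurably closer. Here is a hedged Python/NumPy sketch of that idea (my own reconstruction, not Tango Chen's code; the margin value is an assumption):

```python
import numpy as np

class GroundCalibration:
    """Sketch of the 'Press G' idea: capture a depth frame of the empty
    room, then flag pixels in later frames that are measurably closer
    than the baseline as belonging to the user or other new objects.
    """

    def __init__(self, margin_mm=100):
        self.margin_mm = margin_mm
        self.baseline = None

    def calibrate(self, depth_frame):
        """Store the empty-room depth frame (2-D array, millimeters)."""
        self.baseline = depth_frame.astype(np.int32).copy()

    def foreground_mask(self, depth_frame):
        """True where something sits in front of the calibrated scene."""
        if self.baseline is None:           # calibration is optional
            return np.zeros(depth_frame.shape, dtype=bool)
        depth = depth_frame.astype(np.int32)
        valid = (depth > 0) & (self.baseline > 0)   # 0 = no depth reading
        return valid & (depth < self.baseline - self.margin_mm)
```

With a baseline stored, furniture stays part of the background (blocks can collide with it), while anything that moves closer than the calibrated scene is treated as foreground.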

Requirements

Basically, any PC that can run other Kinect v2 applications (for example, the Kinect v2 sample applications in SDK Browser v2.0) can run this application.

Details:

  • A Kinect v2 sensor, which includes a power hub and USB cabling
  • Windows 8 or 8.1, Windows Embedded 8, or Windows 10
  • USB 3.0 controller
  • 64-bit (x64) processor
  • Physical dual-core 3.1 GHz (2 logical cores per physical) or faster processor
  • 4 GB of RAM
  • Graphics card that supports DirectX 11

Get It For Free

Yes, you can get it for free, but ...

Project Information URL: http://tangochen.com/blog/?page_id=2212

Contact Information:

Related past posts from Tango you might find interesting (See what I mean about prolific!);




Kinect To MVA


This week Coding4Fun is doing a Theme Week, focusing on the great free courses available from the Microsoft Virtual Academy.

We've highlighted many of these Kinect courses before, but today we're highlighting ALL the Kinect courses, in every language... :)

Coding for Kinect with Scratch

Level 100  |  Published: 4 months ago

Would you like to know how to build natural user interface (NUI) programs using Microsoft Kinect and the Scratch...

Programming Kinect for Windows v2 Jump Start

Level 200  |  Published: 1 year ago

Devs, are you looking forward to building apps with Kinect for Windows v2? In this Jump Start, explore the brand new...

Quick Start Challenge: Kinect v2 Sensor and openFrameworks

Level 100  |  Published: 8 months ago

Want to work with the Kinect sensor v2? In this hands-on lab, learn how to use the Kinect sensor v2 in an openFramework...

Introduction au développement Kinect V2

Level 100  |  Published: 7 months ago

This Kinect V2 development course is intended for beginners who have never worked with Kinect in any way...

Einführung in Kinect for Windows v2

Level 200  |  Published: 6 months ago

The “Kinect for Windows” course provides an overview of the Kinect sensor and walks you through the first steps...

快速入门 : Kinect for Windows v2 开发

Level 200  |  Published: 6 months ago

This series of application development courses covers every aspect of Kinect development, with in-depth analysis of major features including Kinect Fusion and HD Face, and an introduction to integrating Kinect apps with third-party libraries including Unity, Cinder, and openFrameworks...

Desarrolla tu primera aplicación para Kinect V2 con Visual Studio

Level 200  |  Published: 1 year ago

You will learn about the new features of Kinect for Windows V2.0 and how to develop your first Windows 8.1 application...




Unity, Kinect and Kristina

A few weeks ago we highlighted a new Kinect video on Channel 9, Hello Channel 9 and Hello Kinect. Since then we've had not only one but two new videos added by another Niner!

Gesture Control with Kinect and Unity made easy

In case you're curious about how to implement gesture control in your Unity game, you can easily follow my little video guide now. It is loosely based on Pete Daukintis' guide, which you can read here.

Project Information URL: https://channel9.msdn.com/Blogs/2p-start/Gesture-Control-with-Kinect-and-Unity-made-easy

DIY motion capture with Kinect 2, Unity and Cinema MoCap

Have you always wanted your characters to walk around like you do? Then look no further :) Using Kinect v2 in conjunction with Unity, you can record your own animation data, even without a marker suit. In my video I show you how to use the Cinema MoCap tool to record and import your animations onto a 3D model of your choice - have fun!

Here are the tools that I used:

Project Information URL: https://channel9.msdn.com/Blogs/2p-start/DIY-motion-capture-with-Kinect-2-Unity-and-Cinema-MoCap

Contact Information:




Awesome made with Kinect v2 and Unity

A few months ago we highlighted some Kinect Unity projects from Rumen Filkov (aka RF Solutions): Unity Asset - Kinect v2 with MS-SDK, Unity Asset - Kinect v2 with MS-SDK Tips, Tricks and Examples, and Unity Asset - Kinect [v1] with MS-SDK.

Today we're highlighting some of the great projects made with the Kinect v2 and Unity.

Made with Kinect v2 and Unity

Here I’m posting videos sent to me by users of the Kinect-v2 asset. They are better examples of what is possible with this Unity asset than many pages of long descriptions. I’m also going to update the videos on this page from time to time, with the cool new things I get from you.

  • Hoverboard VR Tech Demo by Sander Sneek – a virtual skateboard ride, offering “the most immersive experience possible with the current hardware available”:

  • Halo over the head and selfie-pictures, by the ever creative Ricardo Salazar:
  • Cool particles and gravity dance, by Ricardo Salazar:
  • A sample by momothemonster of the four scenes they built at Helios Interactive. These apps are live in 50 Best Buy stores across the US:
  • Minions – Digital Launching Campaign with Augmented Reality and Kinect, by Adrian Posteuca:
  • FireWings- Huawei P8 launch event, by Adrian Posteuca:
  • Interactive augmented reality installation, by Ricardo Salazar:

Project Information URL: http://rfilkov.com/2015/08/20/made-with-kinect-v2-asset/

Contact Information:




Vitruvius: Your Need for Kinect Dev Speed Solution

Vangos Pterneas, Friend of the Gallery, has just released one of the most exciting Kinect for Windows products I've seen in a long time. Being a Kinect MVP, he's heard and seen all the questions, needs and wants of the Kinect community, and in this one package (which offers tiers from free to platinum) he seems to touch all the bases!

Vitruvius: Developing Kinect apps in minutes

Vitruvius Kinect by Vangos Pterneas

During the past few months, I have been quite inactive. It was for a good reason, though. My team and I have been working on an ambitious project that helps Kinect developers build apps faster. I would like to introduce Vitruvius.

What is Vitruvius?

During the past few years, Kinect developers from all over the world have been creating outstanding Kinect apps, utilizing the power of Kinect SDK 2, Computer Vision, complex Mathematics, and Linear Algebra. Kinect development is hard and demanding. Through my blog posts (and your 600 comments), I have been trying to simplify the process and make it easier for students, researchers and engineers.

Kinect development should be fun.

So, after three years of professional experience and a lot of successful commercial projects, I developed a framework that accomplishes hard tasks using a few lines of code. Please welcome Vitruvius.

Vitruvius is a set of tools, utilities, controls and extensions that simplify Kinect development.

Features

Vitruvius is already used by big companies and institutions out there. I was proud to learn that NASA, XEROX and Michigan State University are actively using Vitruvius in their internal research projects.

Here’s exactly what’s included:

Avateering (3D model animation)

Animating a 3D model using Kinect is a very tough process. Not anymore. Vitruvius lets you animate your rigged FBX models with one line of code! Never worry about joint orientations or the complex mathematics you would otherwise need to apply. The following line of code animates a 3D model:

..
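To give a feel for how that one-liner might sit inside a Unity script, here is a hedged sketch. Everything except the `_avatar.Update(body)` call is standard Kinect v2 SDK body-frame plumbing; the `Avatar` component and its `Update(body)` method are assumed names for the Vitruvius avateering piece, not verified API:

```csharp
// Sketch only — "_avatar" stands in for the Vitruvius avateering component;
// its type and Update(Body) signature are assumptions, not confirmed API.
void Update()
{
    using (BodyFrame frame = _reader.AcquireLatestFrame())  // standard Kinect v2 SDK reader
    {
        if (frame == null) return;
        frame.GetAndRefreshBodyData(_bodies);

        foreach (Body body in _bodies)
        {
            if (body == null || !body.IsTracked) continue;

            _avatar.Update(body);   // the advertised one-liner: drives the rigged FBX model
            break;                  // animate from the first tracked body only
        }
    }
}
```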

Angle calculations

Calculating the angle between selected joints is also a matter of one line of C# code. Calculating angles helps you take human-body measurements accurately. You can specify the target axis (X, Y, or Z) or simply calculate the angle in 3D space. Vitruvius also provides a handy Arc control to display one or more angles on top of a body:

...
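Under the hood, a joint angle is just the angle between two 3D vectors. The sketch below computes it directly from Kinect SDK `CameraSpacePoint`s (standard SDK types; only the helper class name is an invention for this example):

```csharp
using System;
using Microsoft.Kinect;

static class JointMath
{
    // Angle in degrees at `center`, formed by the vectors center→start and center→end.
    public static double Angle(CameraSpacePoint start, CameraSpacePoint center, CameraSpacePoint end)
    {
        double ux = start.X - center.X, uy = start.Y - center.Y, uz = start.Z - center.Z;
        double vx = end.X - center.X,   vy = end.Y - center.Y,   vz = end.Z - center.Z;

        double dot  = ux * vx + uy * vy + uz * vz;
        double lenU = Math.Sqrt(ux * ux + uy * uy + uz * uz);
        double lenV = Math.Sqrt(vx * vx + vy * vy + vz * vz);

        return Math.Acos(dot / (lenU * lenV)) * 180.0 / Math.PI;
    }
}

// Usage — the elbow angle between shoulder, elbow and wrist:
// double elbowAngle = JointMath.Angle(
//     body.Joints[JointType.ShoulderRight].Position,
//     body.Joints[JointType.ElbowRight].Position,
//     body.Joints[JointType.WristRight].Position);
```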

HD Face Extensions

Kinect SDK includes the most powerful Face API in the world. The Face and HD Face APIs let you access over 1,000 facial points. Vitruvius provides an easy-to-use Face class populated with Eyes, Nose, Chin, Cheeks, Jaw and Forehead properties. No need to mess with thousands of points. Accessing facial properties has never been more straightforward:

...
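Based on the property names in the feature list above, usage could look like the following; the frame-to-`Face` conversion call is an assumption, not confirmed Vitruvius API:

```csharp
// Property names (Nose, Chin, Jaw) mirror the feature description above;
// the Face() extension method is a hypothetical name.
using (var frame = e.FrameReference.AcquireFrame())
{
    if (frame != null)
    {
        Face face = frame.Face();             // hypothetical: raw HD Face frame → Face object

        CameraSpacePoint nose = face.Nose;    // single named points instead of 1,000+ vertices
        CameraSpacePoint chin = face.Chin;
        CameraSpacePoint jaw  = face.Jaw;
    }
}
```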

Bitmap manipulation & Background removal

Converting a Kinect RGB, depth, or infrared frame into an image is, indeed, confusing. Each programming platform handles bitmaps differently: XAML represents bitmaps using the WriteableBitmap class, while Unity uses Texture2D. Vitruvius supports them all! It accesses the raw data of any Kinect frame and converts it to the corresponding image format.

...
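In WPF, for example, displaying the color stream reduces to an extension-method call along these lines. The `ToBitmap()` name follows the Vitruvius feature description; treat it as an assumption, while the surrounding event handler is stock Kinect v2 SDK:

```csharp
// WPF color-stream display; everything except ToBitmap() is standard Kinect v2 SDK.
void ColorReader_FrameArrived(object sender, ColorFrameArrivedEventArgs e)
{
    using (ColorFrame frame = e.FrameReference.AcquireFrame())
    {
        if (frame != null)
        {
            // Vitruvius-style conversion: raw frame data → a bitmap an <Image> element can show.
            camera.Source = frame.ToBitmap();
        }
    }
}
```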

Coordinate Mapping

Kinect developers have been struggling to convert points between 3D world-space and 2D screen-space. The Kinect SDK includes CoordinateMapper, a useful tool that converts between camera space and depth/color space. Vitruvius simplifies coordinate mapping with its Vector and Point functions:

...
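For reference, the raw SDK mapping that Vitruvius wraps looks like this (standard Kinect v2 SDK calls; the Vitruvius helper names themselves are not shown here):

```csharp
using Microsoft.Kinect;

// Map a joint's 3D camera-space position to 2D color-frame pixel coordinates
// using the stock Kinect v2 SDK CoordinateMapper.
KinectSensor sensor = KinectSensor.GetDefault();
CameraSpacePoint position = body.Joints[JointType.HandRight].Position;

ColorSpacePoint colorPoint = sensor.CoordinateMapper.MapCameraPointToColorSpace(position);

double screenX = colorPoint.X;   // pixel column in the 1920x1080 color frame
double screenY = colorPoint.Y;   // pixel row
```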

Body Extensions

Handling body data effectively is a core part of my Kinect philosophy. A lot of people ask me how to identify the closest body, the number of tracked joints, or the height of the players. The built-in Body class provides a lot of great information, but there is always something missing. Vitruvius fills these gaps and gives you everything you need to create a solid experience for your NUI applications:

...
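The kind of one-liners the paragraph describes could look like this; the helper names are inferred from the feature list above, not verified Vitruvius signatures:

```csharp
// Names (Closest, Height, TrackedJoints) inferred from the description — assumptions.
Body body = bodies.Closest();              // the tracked body nearest to the sensor

double height = body.Height();             // estimated player height in meters
int trackedJoints = body.TrackedJoints();  // count of currently tracked joints
```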

Gesture detection

Natural User Interfaces rely heavily on gestures. Gestures help users accomplish tasks easily using their hands or body. Vitruvius supports waving, swiping, and zooming gestures.

...
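A typical event-driven pattern for this, modeled on the author's earlier open-source Kinect gesture samples, might look as follows; the type and event names are assumptions for Vitruvius itself:

```csharp
// GestureController / GestureRecognized follow the pattern of the author's earlier
// gesture posts; treat them as assumed names for the Vitruvius API.
GestureController gestureController = new GestureController();

gestureController.GestureRecognized += (sender, e) =>
{
    switch (e.GestureType)
    {
        case GestureType.SwipeLeft:  /* e.g. previous page */ break;
        case GestureType.SwipeRight: /* e.g. next page */     break;
        case GestureType.Wave:       /* e.g. greet the user */ break;
    }
};

// Feed the controller from the body-frame handler:
gestureController.Update(body);
```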

Documentation & Samples

Vitruvius supports WPF, Windows Store and Unity3D. There is extensive MSDN-like documentation for every platform.

The following demos are included in the Download section:

  • Avateering
  • Angles
  • Face
  • Fitting Room
  • Frame View
  • Gestures
  • Green Screen

Additionally, Vitruvius integrates amazingly well with your existing projects, too. No need to re-invent the wheel or change your coding style. Simply call any function you want from your existing code. It’s that simple!

Performance

We are taking performance seriously. Vitruvius is designed to use as few resources as possible. It only consumes what’s really necessary. There are no duplicates or unnecessary copies. Everything is real-time. As long as your computer meets the standard Kinect SDK requirements, you can use Vitruvius at no risk.

Project Information URL: http://pterneas.com/2015/09/26/vitruvius/, Vitruvius.

Project Download URL: http://vitruviuskinect.com/download/

Contact Information:



