Monday, February 25, 2013

Sorting out the Kinect skeleton feature



The Kinect has some very clever software and firmware which enables it to capture the skeleton patterns of up to two people at the same time.  I have been tracking the progress of the Kinect since its introduction and bought the commercial release, as opposed to the Xbox one, when it became available in 2012.  With the rapid completion of the Inmoov robot print here and the successful testing of the Mini Maestro 24 servo controller with the Hitec HS-805BB servos used to move the arms and shoulders, I began to delve into the Kinect SDK.

The Kinect can capture skeleton patterns in two modes: full body...



...and upper body {seated}
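
Switching between the two modes in software looks to be a one-line affair.  Here is a minimal sketch, assuming the v1.x Kinect for Windows SDK's Microsoft.Kinect namespace (seated tracking, if I remember rightly, arrived with version 1.5 of the SDK):

    Imports Microsoft.Kinect

    Module TrackingModeDemo
        ' Switch a running sensor between full-body and seated skeleton tracking.
        ' Seated mode tracks only the upper-body joints.
        Sub UseSeatedMode(ByVal sensor As KinectSensor, ByVal seated As Boolean)
            If seated Then
                sensor.SkeletonStream.TrackingMode = SkeletonTrackingMode.Seated
            Else
                sensor.SkeletonStream.TrackingMode = SkeletonTrackingMode.Default
            End If
        End Sub
    End Module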




Extracting the Cartesian coordinates of the bones in three dimensions is straightforward.  Being a naturally suspicious sort, however, I decided to check how consistent the lengths of the individual bones actually were.  As I suspected, they vary.  This shouldn't be surprising, considering how wicked a problem it is to work out the position of bones buried in flesh and muscle from nothing but surface measurements of the body, and to do those calculations at a rate of thirty frames per second.
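
The bone length itself is just the straight-line distance between a bone's two end joints.  Here is a minimal sketch of that calculation, assuming the v1 SDK's Skeleton, Joint and SkeletonPoint types; the left forearm, for instance, runs from the left elbow to the left wrist:

    Imports Microsoft.Kinect

    Module BoneLength
        ' Distance in meters between two joints of a captured skeleton.
        Function JointDistance(ByVal skel As Skeleton,
                               ByVal a As JointType,
                               ByVal b As JointType) As Double
            Dim p As SkeletonPoint = skel.Joints(a).Position
            Dim q As SkeletonPoint = skel.Joints(b).Position
            Dim dx As Double = p.X - q.X
            Dim dy As Double = p.Y - q.Y
            Dim dz As Double = p.Z - q.Z
            Return Math.Sqrt(dx * dx + dy * dy + dz * dz)
        End Function

        ' Example: length of the left forearm for one captured frame.
        Function LeftForearmLength(ByVal skel As Skeleton) As Double
            Return JointDistance(skel, JointType.ElbowLeft, JointType.WristLeft)
        End Function
    End Module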

Just eyeballing the captured data, it appeared that the bone lengths were varying wildly.  When I took a mean and standard deviation for each bone, however, I discovered that the estimates were remarkably consistent.  I made a test data set using myself as the subject.  The measurements are in meters.


Bone        Side     Mean (m)   SD (m)
Shoulder    Left     0.217      0.017
Shoulder    Right    0.218      0.013
Humerus     Left     0.242      0.025
Humerus     Right    0.241      0.026
Forearm     Left     0.235      0.024
Forearm     Right    0.235      0.026
Palm        Left     0.089      0.034
Palm        Right    0.086      0.029
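
The statistics are nothing fancy, just a running list of per-frame lengths for each bone fed through a mean and sample standard deviation.  A sketch of that calculation:

    Imports System.Collections.Generic
    Imports System.Linq

    Module BoneStats
        ' Mean and sample standard deviation of a list of per-frame
        ' bone lengths, in meters.
        Sub MeanAndSd(ByVal lengths As List(Of Double),
                      ByRef mean As Double, ByRef sd As Double)
            mean = lengths.Average()
            If lengths.Count < 2 Then
                sd = 0.0
                Return
            End If
            Dim sumSq As Double = 0.0
            For Each v As Double In lengths
                sumSq += (v - mean) * (v - mean)
            Next
            sd = Math.Sqrt(sumSq / (lengths.Count - 1))
        End Sub
    End Module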










The numbers are not too bad.  The wrist/palm measurements are the worst.  I have extra code which I hope will give better results for the hands and fingers.

The next step will be to see if I can transform the coordinates into something more useful for the Inmoov by fixing the bone lengths to calibrated means.
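
Roughly, the plan is to keep the direction the Kinect reports for each bone but force its length to the calibrated mean, working outward joint by joint from the shoulder.  A sketch of that rescaling for a single bone; the function name and approach here are my own, not anything from the SDK:

    Imports Microsoft.Kinect

    Module BoneFixer
        ' Given the raw positions of a bone's parent and child joints,
        ' return a corrected child position that keeps the reported
        ' direction but forces the bone to its calibrated length (meters).
        Function FixBoneLength(ByVal parent As SkeletonPoint,
                               ByVal child As SkeletonPoint,
                               ByVal calibratedLength As Double) As SkeletonPoint
            Dim dx As Double = child.X - parent.X
            Dim dy As Double = child.Y - parent.Y
            Dim dz As Double = child.Z - parent.Z
            Dim raw As Double = Math.Sqrt(dx * dx + dy * dy + dz * dz)
            If raw < 0.000001 Then Return child   ' degenerate frame; leave it alone

            Dim scale As Double = calibratedLength / raw
            Dim corrected As SkeletonPoint
            corrected.X = CSng(parent.X + dx * scale)
            corrected.Y = CSng(parent.Y + dy * scale)
            corrected.Z = CSng(parent.Z + dz * scale)
            Return corrected
        End Function
    End Module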

Saturday, February 23, 2013

Kinect Progress 13-02-23



This morning I got a note from the UP 3D printer people saying that they wanted to do a Skype video session to put my printer right.  I thought they meant right then, but apparently I was wrong.  While the new extruder heater block helped matters, it didn't fix my UP.

In the meantime, I worked on the Kinect motion capture task.  I have it capturing the upper torso, head and arms and writing the results to a disk file.  My programme is very crude.  I simply identify the joints and put the coordinates in a list box.  Surprisingly, my PC and Visual Basic 2010 are able to keep up with the Kinect's 30 frames/second without any effort whatsoever.  After I record a session, I simply save the XML-formatted information to a file.
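
The file format is nothing clever.  It amounts to something like the sketch below, where the element and attribute names are my own invention rather than anything the SDK prescribes:

    Imports System.Collections.Generic
    Imports System.Xml.Linq
    Imports Microsoft.Kinect

    Module SkeletonSaver
        ' Turn one captured skeleton into an XML element of joint coordinates.
        Function FrameToXml(ByVal skel As Skeleton, ByVal frameNumber As Integer) As XElement
            Dim frame As New XElement("frame", New XAttribute("number", frameNumber))
            For Each j As Joint In skel.Joints
                frame.Add(New XElement("joint",
                                       New XAttribute("type", j.JointType.ToString()),
                                       New XAttribute("x", j.Position.X),
                                       New XAttribute("y", j.Position.Y),
                                       New XAttribute("z", j.Position.Z)))
            Next
            Return frame
        End Function

        ' Write a whole recorded session to disk.
        Sub SaveSession(ByVal frames As IEnumerable(Of XElement), ByVal path As String)
            Dim doc As New XElement("session", frames)
            doc.Save(path)
        End Sub
    End Module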

My next task is to restrict the saved output to the left upper torso, shoulder and arm and to convert the Cartesian coordinates to joint-specific coordinate systems which match the kinematics that Gael has designed into the Inmoov arm.
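
As a first cut at those joint-specific coordinates, the elbow flexion angle, for example, falls out of the angle between the upper-arm and forearm vectors.  A sketch, again assuming the v1 SDK's types; the mapping onto Gael's servo geometry is still to be worked out:

    Imports Microsoft.Kinect

    Module JointAngles
        ' Angle in degrees at the middle joint of a chain of three joints,
        ' e.g. shoulder-elbow-wrist gives the elbow flexion angle.
        Function AngleAt(ByVal skel As Skeleton,
                         ByVal proximal As JointType,
                         ByVal middle As JointType,
                         ByVal distal As JointType) As Double
            Dim a As SkeletonPoint = skel.Joints(proximal).Position
            Dim b As SkeletonPoint = skel.Joints(middle).Position
            Dim c As SkeletonPoint = skel.Joints(distal).Position

            ' Vectors from the middle joint out to its two neighbours.
            Dim ux As Double = a.X - b.X, uy As Double = a.Y - b.Y, uz As Double = a.Z - b.Z
            Dim vx As Double = c.X - b.X, vy As Double = c.Y - b.Y, vz As Double = c.Z - b.Z

            Dim dot As Double = ux * vx + uy * vy + uz * vz
            Dim lu As Double = Math.Sqrt(ux * ux + uy * uy + uz * uz)
            Dim lv As Double = Math.Sqrt(vx * vx + vy * vy + vz * vz)
            If lu = 0 OrElse lv = 0 Then Return 0

            Dim cosine As Double = Math.Max(-1.0, Math.Min(1.0, dot / (lu * lv)))
            Return Math.Acos(cosine) * 180.0 / Math.PI
        End Function
    End Module

Left elbow flexion, for instance, would then be AngleAt(skel, JointType.ShoulderLeft, JointType.ElbowLeft, JointType.WristLeft).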

Wednesday, February 20, 2013

Digging into the Kinect


I've been avoiding getting into .NET for years, but now I am having to get on with learning it.  The first thing I've encountered is a whole new set of controls called WPF {Windows Presentation Foundation} controls.  So far, so good.

Starting with the skeleton tracking routine written in VB.NET 2010, I dug around in the data structure and managed to locate the Cartesian coordinates of the major body joints that the Kinect is so adept at finding.  They are generated 30 times a second and fill up a list box rather quickly.
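
For anyone wanting to do the same digging, it boils down to handling the SkeletonFrameReady event and walking the joints of each tracked skeleton.  A minimal sketch, assuming the v1 SDK and leaving out error handling; in my programme the output goes into the list box rather than the console:

    Imports Microsoft.Kinect

    Module SkeletonDump
        Private skeletons() As Skeleton

        ' Called by the sensor roughly 30 times a second.
        Sub OnSkeletonFrameReady(ByVal sender As Object, ByVal e As SkeletonFrameReadyEventArgs)
            Using frame As SkeletonFrame = e.OpenSkeletonFrame()
                If frame Is Nothing Then Return
                If skeletons Is Nothing Then
                    ReDim skeletons(frame.SkeletonArrayLength - 1)
                End If
                frame.CopySkeletonDataTo(skeletons)
            End Using

            For Each skel As Skeleton In skeletons
                If skel.TrackingState <> SkeletonTrackingState.Tracked Then Continue For
                For Each j As Joint In skel.Joints
                    ' Joints that have left the visual field come back as
                    ' Inferred or NotTracked rather than Tracked.
                    If j.TrackingState = JointTrackingState.Tracked Then
                        Console.WriteLine("{0}: {1:F3} {2:F3} {3:F3}",
                                          j.JointType, j.Position.X, j.Position.Y, j.Position.Z)
                    End If
                Next
            Next
        End Sub
    End Module

The handler gets wired up with AddHandler sensor.SkeletonFrameReady, AddressOf OnSkeletonFrameReady once the skeleton stream has been enabled.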


Watching the graphics display, it appears that adding a simple motion smoothing routine would be very useful.  I'm surprised that something of the sort wasn't included in the firmware.

The scale of the coordinates appears to be meters.
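
What I have in mind is nothing more sophisticated than an exponential smoother applied to each coordinate, something like this sketch; alpha is just a tuning knob of my own, not anything from the SDK:

    ' Exponential smoothing of a stream of joint coordinates.
    ' alpha near 1.0 tracks quickly but jitters; near 0.0 is smooth but laggy.
    Public Class JointSmoother
        Private ReadOnly alpha As Double
        Private haveLast As Boolean
        Private lastX, lastY, lastZ As Double

        Public Sub New(ByVal alpha As Double)
            Me.alpha = alpha
        End Sub

        ' Replaces the raw coordinates with their smoothed values.
        Public Sub Update(ByRef x As Double, ByRef y As Double, ByRef z As Double)
            If Not haveLast Then
                lastX = x : lastY = y : lastZ = z
                haveLast = True
            Else
                lastX = alpha * x + (1.0 - alpha) * lastX
                lastY = alpha * y + (1.0 - alpha) * lastY
                lastZ = alpha * z + (1.0 - alpha) * lastZ
            End If
            x = lastX : y = lastY : z = lastZ
        End Sub
    End Class

One smoother per joint, fed every frame, should take most of the jitter out of the display.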

Monday, February 18, 2013

Kinect is active


I was able to get the Kinect going and run it through most of its paces.  It can capture a 3D radar image out to about 4-5 meters, as you can see in this colour-coded screen grab.




This perspective projection gives you, perhaps, a better feel for the depth of the space.




Oddly, it flips the data from left to right.  I don't know what that is about yet.


It does face tracking down to about a meter away from the Kinect and skeleton tracking down to about 2 meters.  I was unable to capture that with the very primitive Visual Studio apps that I have so far.  I thought it would show the orientation of the palm, but with what I have at the moment that doesn't appear to be possible.

I was also able to videotape the skeleton tracking option.  Here is the full body tracking.





I understand that Microsoft used quite a large farm of servers to develop the parameters for this feature.  Notice that you have trouble when a foot leaves the visual field of the Kinect.  Also notice the problem when you accidentally get one leg behind the other or behind an obstacle, in this case a stool.

Here we restrict tracking to the upper body, viz., "seated mode".




Notice that when the hands leave the visual field of the Kinect, the same thing happens that happened when the feet left the visual field in full-body mode.  Also note what happens when I placed first one and then both hands on my head.

I was able to locate some open-source finger-tracking software for the Kinect that was written in Spain.  I will be trying that next.

An alternative to hobby servos?




The University of Tokyo is going about creating animatronic muscles in an entirely different, yet simple-to-implement, way.  They basically use a block-and-tackle arrangement, rather than gears, to multiply the torque of a coreless motor.




A robot actuated with this approach reminds me of nothing so much as an antique sailing ship.





I am not much impressed with their robot's walking, but other movements are quite natural.





It apparently achieves a much better kW/kg ratio than more conventional robots.


Saturday, February 16, 2013

Hmmm....

Ran one of the HS-805BB high torque servos in a minor stress test.  It faulted out in about five minutes.  It overheated and stopped.  I let it cool off and tried it again and it worked.  I came back half an hour later and tried it yet again and it was stone dead.

This is NOT good news.  Those things cost $40 each.

HS-805BB servos arrive

Although the United States Post Office is due to stop Saturday deliveries, I was mistaken in thinking that they were going to stop today, so I have my high-torque HS-805BB servos for my Inmoov robot arm today instead of Tuesday.

I plugged one into the Pololu Mini Maestro 24 servo controller board and it appears to work perfectly.

Tuesday, February 12, 2013

Commissioning the Mini Maestro 24 servo controller



I managed to get the drivers installed and the Visual Basic 2010 example code running on my PC and demonstrated that I could drive a small test servo from my keyboard.
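
For anyone following along, the part that actually moves a servo boils down to sending the Maestro a Set Target command over its virtual serial port using Pololu's Compact Protocol.  A minimal sketch; the COM port name and channel number are placeholders for whatever your own setup uses, and targets are given in quarter-microseconds:

    Imports System.IO.Ports

    Module MaestroTest
        ' Send a Pololu Maestro "Set Target" command (Compact Protocol, 0x84).
        ' target is the pulse width in quarter-microseconds, e.g. 6000 = 1500 us.
        Sub SetTarget(ByVal port As SerialPort, ByVal channel As Byte, ByVal target As Integer)
            Dim cmd() As Byte = {CByte(&H84), channel,
                                 CByte(target And &H7F),
                                 CByte((target >> 7) And &H7F)}
            port.Write(cmd, 0, cmd.Length)
        End Sub

        Sub Main()
            ' "COM5" is just a placeholder; use whatever the Maestro's
            ' command port shows up as in Device Manager.
            Using port As New SerialPort("COM5", 9600)
                port.Open()
                SetTarget(port, 0, 6000)   ' centre the servo on channel 0
            End Using
        End Sub
    End Module

I believe the Maestro has to be set to USB Dual Port mode for the command port to accept these bytes; Pololu's own example code, as I understand it, talks to the board over native USB instead, but the serial route is hard to beat for a quick test.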


My old ATX PC power supply seems to be able to handle the test servo without a lot of bother.  It is rated at 30 amps, so with luck I won't have to do what Gael did with batteries and a charger just yet.  In theory, this controller should be able to handle the whole Inmoov robot; I hope it will at least handle one arm.




I ordered the HS-805BB servos.  They should be here by Friday.

Just discovered a spool of 250 yards of 8 lb monofilament.  That should work for the hand.

Monday, February 11, 2013

Left arm of Inmoov robot complete



Printing of the assembly for the left torso, shoulder, bicep, forearm, wrist and hand is complete.


Front view of left torso assembly, shoulder, bicep and base of forearm.


Detail of rear view of upper torso and shoulder.


Left arm raised.


Forearm flexed.


Forearm extended.


Detail of lead screw and collar controlling the left shoulder.

The Hitec HS-805BB servos that drive the torso and arm kinematics are on order.

Sunday, February 10, 2013

Printing Gael Langevin's Inmoov Robot

Andreas Maryanto and I have been working on animatronic hands for some time now, with Andreas' Dexhand project streets ahead of my own efforts.  About a year ago, I happened across a printed hand done by hairygael {Gael Langevin} on Thingiverse.  It was interesting enough that I printed a copy of it, but at that time Gael had not got very far with the design, so I neglected to follow his subsequent progress.

Big mistake!  About a week ago I discovered that Gael's modest beginnings had subsequently blossomed into a full-fledged animatronic robot upper half.



I was both shocked and completely entranced by what Gael had accomplished in so short a time.  Gael was kind enough to make his print files open source, so I immediately began a campaign to print a copy of his robot.  Using his files, I have so far managed to print out a copy of his Inmoov robot's left arm.



I apologise for the poor quality of the pictures.  My own camera was lost a few months ago when my son's car was broken into in Seattle, and I have been remiss in replacing it even though his insurance company was quite prompt in reimbursing him for the losses.

At the moment, I have not tried to install servomotors but have concentrated on understanding how the many pieces go together.  I shall be buying the torso, shoulder, bicep and wrist servos in the next week or two.