Notes from a Leap Motion Presentation, part I: JRuby

On Wednesday, February 13, 2013, I gave a somewhat casual presentation at HeatSync Labs to show off a few things I’ve learned about the Leap Motion. This is a write-up of most of that presentation. It has some additional detail that I did not cover that night.

What it is

If you don’t already know, the Leap Motion (or just Leap, which is easier to say and write) is a motion-detection device. Specifically, it detects hands, fingers, and “pointables” (things that might seem like a finger but are longer, like a quill or something). The product description says it covers eight cubic feet, but as a practical matter it seems to work best within a two-cubic-foot area. In the month or so that I’ve been hacking around with the Leap I find it most comfortable if I can keep my hand movements within a smallish area in front of me. I haven’t had a case where I felt the need to reach up as high as I can, or put my arms all the way out. There are a few demos available for the Leap and they all seem to work within a two-foot sweet spot. Still, I may run some tests just to see how much range I can get.

You can pre-order a Leap for US $69.00. I got mine through the Leap developer program; I’ve seen reports of there being 10,000 to 20,000 developers in the program. The Leap people seem quite determined that there be many programs available when the device is publicly available (currently estimated as this spring).

The device is about the size of a disposable lighter (there’s a picture on the product page). It’s really quite amazing. It will detect if there are any hands present, and how many (if any) fingers or pointable thingies. You can grab all sorts of detailed data, such as fingertip location, distances, palm rotation, and the like.

Leap has some forums that are publicly available and seem to be a pretty good source of myriad details. One thing I have not yet seen is a hardware tear-down. I imagine once the device is generally available some enterprising individual will take a peek inside and reveal the results.

The technical details are by no means required for doing fun and interesting things with the Leap, but I’m interested in “off-label” usage and have some experiments in mind.

Doing stuff

To use the Leap you need to plug it into a USB2 or USB3 port and start up a Leap executable. Think of this as the Leap “server”. You need this in order to get any data from the device. The SDK provides assorted compiled library files, including a jar file. There is sample code in assorted languages, all demoing more or less the same basic program. Ruby is not among these, but Java is, so that means JRuby. And Processing.

After running some demos (including a very neat “Leap visualization” program) I converted the Java sample to JRuby. I’ve been working on Windows 7 (there are drivers for Windows and OSX, with Linux to come). Porting the code was not difficult; the trickiest part is getting JRuby to find the required libraries. So, semi-pro tip: I use a .bat file to launch my JRuby programs, which allows me to set java.library.path, like so:

jruby -J-Djava.library.path=%LEAP_SDK_HOME%\lib\x64 %1

You need to set the LEAP_SDK_HOME environment variable, and update it when you get a newer SDK.
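For example (the install path here is just a placeholder; use wherever you unpacked the SDK):

set LEAP_SDK_HOME=C:\LeapSDK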

Once I ported the sample app I took a stab at extracting what seemed reusable into a JRuby Leap library. There’s not much to it; so far there hasn’t been a need. I expect it to evolve the more code I write. I’ll be providing some code here, but first I want to describe the general set-up.

The way a basic Leap program works is that the Leap device grabs what it can see and passes this on to your program as a “frame.” Your code pulls what it needs from the frame (hand count, finger location) and behaves accordingly. You need to create an instance of the Controller class (this talks to the Leap device), to which you pass an instance of a Listener class. The Listener class is where you do your frame handling, so what you really want is a subclass of the base Listener.

A Listener needs to implement a few methods, primarily onInit() and onFrame(). onInit() is like initialize() in a Ruby class. onFrame() is what gets called for each generated frame of data. It gets passed the Controller instance, from which you get the frame data (among other things). And off you go. The frame tells you about hands, fingers, etc.

Some code

This is what I’ve been using to load the Leap libraries into my JRuby code. I have it in a file named jleap-ng.rb. You can find the current version up on GitHub: github.com/Neurogami/jleap-ng.

It starts by pulling in the jar file:
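Something like this (a sketch rather than the exact file contents; in particular, the jar name, here assumed to be LeapJava.jar, may vary by SDK version):

require 'java'

# Pull in the Leap jar from the SDK, using the same environment variable as the launcher script
require File.join(ENV['LEAP_SDK_HOME'], 'lib', 'LeapJava.jar')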

Yes, this may be Windows-centric. Works for me, work in progress, etc.

Next I set up a base Listener class.
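It looks more or less like this (again, a sketch of the idea; the no-op callbacks are there so subclasses only override what they care about):

module Neurogami
  module Leap
    class BasicListener < com.leapmotion.leap.Listener
      def onInit controller
        warn 'Listener initialized'
      end

      def onConnect controller
        warn 'Leap connected'
      end

      def onDisconnect controller
        warn 'Leap disconnected'
      end

      def onExit controller
        warn 'Listener exited'
      end

      # Subclasses override this; it gets invoked for every frame of Leap data
      def onFrame controller
      end
    end
  end
end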

I also needed to make the Controller and Leap Vector classes nicely accessible:
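Assigning the Java classes to constants inside the module does the trick (a sketch):

module Neurogami
  module Leap
    # Make the Java classes available as Neurogami::Leap::Controller, etc.
    Controller = com.leapmotion.leap.Controller
    Vector     = com.leapmotion.leap.Vector
  end
end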

There are other classes in the com.leapmotion.leap package but I’ve not so far needed to wrap them in my module. It’s trivial to add them, though, if it comes up.

That should be enough to use for a basic Leap app, but I ran into something else. JRuby does a good job of making Java classes behave much like Ruby. For example, the Leap Controller class has a method addListener that you can invoke using the more typical Ruby form add_listener. When getting an array of Hand or Finger objects, where the Java code has get(0) you can use first. But not subscript notation. hand = frame.hands.first works, but not hand = frame.hands[0]. (Or at least not for me; if I’m doing something goofy here I’d love to know.)

This being JRuby, though, I can get all intertwingly with the Leap Java classes.

I was getting this exception when trying to use the subscript notation:

org.jruby.exceptions.RaiseException: (NoMethodError) undefined method `[]' for #<Java::ComLeapmotionLeap::HandList:0x4d223ca6>

So as to get a proper Ruby array to use I added this to the Neurogami::Leap module:
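It was something along these lines (the helper name here is my own, and all it does is walk the Java list with count and get to build a plain Ruby Array):

module Neurogami
  module Leap
    module_function

    # Convert a Leap list object (HandList, FingerList, ...) into a real Ruby array
    def rubyized_array list
      array = []
      list.count.times { |i| array << list.get(i) }
      array
    end
  end
end

hands = Neurogami::Leap.rubyized_array frame.hands
hand  = hands[0] # Subscript notation now works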

Presto. Now I can use subscript notation.

This is perhaps a sort of vanity addition; I didn’t really need this. This example, though, helps demonstrate something quite handy: If you (or I) want the built-in Leap classes to have additional or altered behavior then it can be added easily.

A Sample Program

This is not what I showed at my demo talk; that demo app (which I will cover later) does a few things that require some extra explanation and set-up. This example is sort of trivial but shows how you might use frame data. It is basically the Java demo that comes with the Leap SDK.

By the way, the Leap SDK is not at all final, and anything or everything you read here might change. Since the public release is looming I would not expect any major alterations, but just know that things are still fluid. I’ll be updating my code as things change.

I will not go into detail right now about the Leap API. You can only run this code if you have a Leap, and if you have a Leap I expect you have the SDK and API docs. I think, though, what’s shown here is pretty obvious.
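Here’s a minimal sketch of the program (treat the details loosely; scale_factor in particular is my assumption about the frame-comparison API of the SDK version I was using):

require 'jleap-ng'

include Neurogami::Leap

class SampleListener < BasicListener
  def onInit controller
    puts 'Sample listener initialized. Wave your hands around!'
  end

  def onFrame controller
    frame = controller.frame

    # Dump some basic data; handy for sanity-checking what the device sees
    puts "Frame id: #{frame.id}, hands: #{frame.hands.count}, fingers: #{frame.fingers.count}"

    if frame.hands.count > 0
      hand    = frame.hands.first
      fingers = hand.fingers

      if fingers.count > 0
        # Average the fingertip positions using the API's vector helpers
        average = Vector.new 0.0, 0.0, 0.0
        fingers.each { |finger| average = average.plus finger.tip_position }
        average = average.divide fingers.count
        puts "Hand has #{fingers.count} fingers, average tip position: #{average}"
      end

      # Compare against the previous frame to see if the hand has moved
      # closer or further away
      puts "Hand scale factor: #{hand.scale_factor @previous_frame}" if @previous_frame

      # More data dumping
      puts "Palm position: #{hand.palm_position}, palm normal: #{hand.palm_normal}"
    end

    @previous_frame = frame
  end
end

class Runner
  def self.run
    listener   = SampleListener.new
    controller = Controller.new
    controller.add_listener listener

    puts 'Press Enter to quit ...'
    $stdin.gets

    controller.remove_listener listener
  end
end

Runner.run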

The program starts by loading the jleap-ng library and including the module that defines the key Leap classes. Since it is subclassing BasicListener it only needs to re-implement onInit, in order to print out some useful “Yes we are running” text. If there is nothing you need to do when you initialize your listener then you could omit redefining this method.

You really cannot do without defining onFrame, though, because that’s where the flavor is.

The method starts by dumping some data. This can be useful if you are not getting the behavior you expect from your program. It’s a handy way, when getting started, to see that your device is detecting what you think it should.

Next, it looks to get the average position of all the fingers on the first hand it has found. You’ll note that the API provides some helper methods (plus, divide) for the vector arithmetic. I was also able to use conventional Ruby-style method calls (e.g. palm_normal instead of palmNormal) on the Leap classes.

I’ve been putting off investing a lot of time in clever gesture recognition while the API is in flux, but there is some low-hanging fruit to be picked by referencing previous frames.

The example program saves off each frame as @previous_frame and uses that to watch for changes in hand scaling. That is, it lets you see if a hand has moved closer or further away. Realistically you may want to track changes like this over more than one frame, and include some means of distinguishing deliberate from incidental motion.

The method finishes up with more data dumping. Again, look over the SDK docs to see what’s available to you, and play around.

The program wraps up with the code that kicks it all off. It’s perhaps a too-literal conversion from the Java sample; you don’t really need to create a single-method class to run this, though when you integrate the Leap into a larger program some kind of encapsulation can be useful.

Wrapping up this part

I run this program using my launcher script:

_runJrubyDemo.bat jleap-ng\jruby-example.rb

I’ve not yet tried this on OSX. I have no reason to think it wouldn’t work, modulo changes in the launching script.

I happen to be running JRuby 1.7.0 on Windows 7. It’s not the most current JRuby; I just haven’t gotten around to updating it. Again, I don’t expect any problems running the code with the current JRuby.

In the next part I’ll show the demo JRuby program I did actually run during my presentation. It’s much like the example program shown here (i.e. onFrame triggers interesting stuff) with a few twists. It reads Leap data and provides a Web socket server to drive a JavaScript/HTML5 game based on a few simple gestures, as well as sending OSC commands for some additional game sound effects.

The Leap Motion has its own Web socket server, and I’ll describe some of the pros and cons of using the Leap Web socket directly from the browser versus going through a proxy socket server.

I’ll be providing most (maybe all) of the code on GitHub: github.com/Neurogami/jleap-ng
