
Developing for LEAP on Looking Glass

NOTE: Windows is the way to go when it comes to LEAP, as they never released the hugely improved "Orion" update on Mac. Orion took the LEAP from being a glitchy novelty to actually being pretty great, even though the hardware hasn't changed since 2013. For many people, the reputation from back then still holds. The SDK for the version that works on Mac (V2) hasn't been updated since Orion came out, so it's pretty old now.

First, make sure you have all the right software: the LEAP runtime, the LEAP Unity SDK, the HoloPlay Unity SDK, and the example scene (all covered below).

How does LEAP work?

Unlike depth cameras like the Kinect or RealSense, the LEAP doesn't actually generate a depth map. It does use infrared emitters and receivers as well, but only to get an isolated, high-contrast image of the hands. It then feeds the last few frames into some pretty amazing computer vision operations to get a feasible idea of what all your hand bones are doing. That happens in the continuously running LEAP service (the runtime), which the LEAP Unity SDK then queries for tracking information. The SDK itself is responsible for higher-level tasks like generating the hand colliders and meshes, pinch detection, and so on.
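If you just want to peek at the raw tracking data the service hands over, you can poll the provider directly. Here's a minimal sketch, assuming the 4.x-era Unity SDK API (LeapServiceProvider.CurrentFrame, Frame.Hands, and the ToVector3() extension for Leap's vector type); the class and field names are my own:

```csharp
using Leap;
using Leap.Unity;
using UnityEngine;

// Logs palm positions and pinch strength by polling the LEAP service
// through a LeapServiceProvider every Unity frame.
public class HandLogger : MonoBehaviour
{
    // Drag the "Leap Motion Controller" object's LeapServiceProvider here.
    public LeapServiceProvider provider;

    void Update()
    {
        Frame frame = provider.CurrentFrame;
        foreach (Hand hand in frame.Hands)
        {
            // Leap.Vector -> UnityEngine.Vector3
            Vector3 palm = hand.PalmPosition.ToVector3();
            Debug.Log((hand.IsLeft ? "Left" : "Right") +
                      " palm at " + palm +
                      ", pinch strength " + hand.PinchStrength);
        }
    }
}
```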

What is LEAP good for in the Looking Glass?

The most straightforward use case for LEAP may also be the most powerful: using the collider hands to interact with simple 3D physics never gets old. More sophisticated interaction is possible with the (relatively solid) pinch detection and contextual UIs that pop out and attach to your hands.

The LEAP runtime

If it doesn't appear to be running in the tray, you can search for the "Leap Motion Control Panel" program, which should start it up. The icon turns green when the LEAP is detected, and you can double-click it to open the control panel. If your LEAP has been sitting around since 2013 (or if you just want to make sure it's performing optimally), I'd advise running the calibrator on the Troubleshooting tab. If you ever want to check the version, that's in the About tab.

The LEAP SDK

I'd strongly recommend downloading the example scene, which I've configured to be a pretty optimal starting point for LKG + LEAP. You could also start from one of the example scenes that come with the LEAP SDK, but most are set up for VR, with some stuff we don't want. Just don't try to set it all up from scratch; that's asking for trouble!

To get it running, import the LEAP SDK, HoloPlay Unity SDK, as well as the example scene all into one Unity project. Then open the LEAP+LKG scene. You should see this:

[Image: the LEAP + LKG example scene]

LeapServiceProvider/Controller

The central component of the LEAP SDK is LeapServiceProvider.cs, which usually lives on a "Leap Motion Controller" gameobject. Note this is distinct from LeapXRServiceProvider.cs, which is for headset-mounted tracking instead. The service provider draws a cone gizmo with white lines representing the tracking space of the LEAP, up to 2-3 feet away or so in meatspace. Scaling the controller object scales the cone, and thus the range of motion of the hands in the scene, but doing so will drastically deform the hands due to a frustrating quirk of the LEAP SDK: what we call "skeleton hands." The LEAP SDK assumes that everyone will build their scene to fit the controller's default (1,1,1) scale. The HoloPlay capture size defaults to 5, so you could scale everything down to 0.2 or so and play by LEAP's rules, but thankfully I figured out:

The scaling hack

Each individual hand object (physics and graphics hands) needs to match the scale of the controller object. If you just scale a parent of all the hands (a natural assumption), it'll look right most of the time, but every so often the controller instantiates a new skinny or stubby hand rather than re-enabling the old one. I've included a barebones script that handles this automatically: it holds an array of hands, scales each one to match the controller, and executes in edit mode. A rough sketch of the idea is below.
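For reference, here's a minimal reconstruction of what that auto-scaling script does. The class and field names here are my own assumptions; the actual script bundled with the example scene is the authoritative version.

```csharp
using UnityEngine;

// Keeps every hand object's scale locked to the controller's scale, even
// in edit mode, so freshly instantiated hands never come out deformed.
[ExecuteInEditMode]
public class MatchControllerScale : MonoBehaviour
{
    public Transform controller; // the "Leap Motion Controller" object
    public Transform[] hands;    // every physics and graphics hand object

    void Update()
    {
        foreach (Transform hand in hands)
        {
            if (hand != null)
            {
                hand.localScale = controller.localScale;
            }
        }
    }
}
```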

[Image: hands with stretched-out "long" fingers]

NOTE: why the long fingers? This only started happening with the latest SDK version, but it can occur. The fix is simple though: replace RiggedFinger.cs (in the LEAP SDK files) with this version. This is only necessary until the fix lands in a released SDK (4.4.1 or some such). The updated file is included with the example scene too.

The tracking space

As for positioning and scaling the tracking space, we generally let the visible area of the Looking Glass correspond to about a foot of physical height, starting about 4 inches above the LEAP (the minimum distance for solid tracking). A more subtle consideration is the angle relative to the Looking Glass, which sits at a 61 degree angle to the table. Picture it this way: when you move your meat hands around on a plane parallel to your table, do you want the 3D hands to look like they're moving parallel to the table as well, or parallel to the bottom of the Looking Glass block? The conceptually accurate offset is -29 degrees (90 minus the display's 61), but that always feels like too much for some reason, so I usually set it somewhere in between, -15 degrees or so.

[Image: tracking space angled 0 degrees]
[Image: tracking space angled 15 degrees]
[Image: tracking space angled 30 degrees]
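If you'd rather set that tilt from a script than eyeball it in the inspector, here's a tiny sketch. The component name and the [Range] bounds are my own; attach it to the controller object:

```csharp
using UnityEngine;

// Tilts the tracking space back by a configurable angle, per the
// compromise described above.
[ExecuteInEditMode]
public class TrackingSpaceTilt : MonoBehaviour
{
    // -29 = conceptually accurate for a 61-degree display, 0 = no offset.
    [Range(-29f, 0f)]
    public float tiltDegrees = -15f;

    void Update()
    {
        // Rotate around the local X axis; position/scale are set separately.
        transform.localEulerAngles = new Vector3(tiltDegrees, 0f, 0f);
    }
}
```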

If you're moving or animating the HoloPlay Capture, you can make the controller a child of it to have the hands stay localized in-frame as the camera moves around. Keep in mind that changing the capture size will require you to manually scale and reposition the controller (and adjust the pinch distance; see below).

The hands

The hands themselves are made up of a few separate components:

  • Graphical (meshes, renderers)
  • Physical (colliders, rigidbodies)
  • Attachments (transforms for connecting things to joints)
  • Pinch detector (pinchy pinch)

Graphical hands:

There are basic Capsule Hands made up of primitives, and the somewhat nicer rigged meshy ones called LoPoly Rigged Hands, which I use for everything.

Physics hands:

There are hands that use box colliders and hands that use capsule colliders. I think capsules are almost always superior, unless you happen to have some particularly boxy fingers? Some of the prefabs contain wrist/arm colliders too, but only the capsule hands can actually draw meshes for the wrist/arm. I'd suggest simply deleting the "Forearm" collider objects if you're using the LoPoly Rigged Hands. NOTE: If you ever do want to use these alternate prefabs, pull one of each hand (L/R) into the scene and replace the old references on the HandModels object.

Attachment hands:

Very useful if you want to have something following or pinned onto a specific hand joint. ALWAYS USE THIS instead of trying to attach something to the graphics or physics hands: the controller sometimes re-generates those, so any attachments you've parented to them can get lost. This is the same quirk behind the scaling fiasco described above.

The Attachment Hands component has a nice little window where you can toggle which bones have attachments, which summons new objects into existence. Any children you add to these will then be attached, and will pivot around the attachment point unless you zero out their local position and rotation.

[Image: hand attachment transforms]

In many cases, the palm transform is the smoothest and most reliable, but there are plenty of times when you want something to follow or interact with only a fingertip. For that sort of thing, it's usually way easier to set up specific colliders and functionality on an attached object than on a part of the hand prefab itself.
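As a sketch of that pattern, here's one way to pin an object to an attachment transform at runtime, assuming you've already generated a palm attachment via the AttachmentHands window (the names here are hypothetical):

```csharp
using UnityEngine;

// Parents a widget to an attachment transform so it follows that joint,
// then zeroes the local pose so it sits exactly on the joint rather than
// pivoting around it from an offset.
public class PinToAttachment : MonoBehaviour
{
    public Transform palmAttachment; // generated by the AttachmentHands window
    public Transform widget;         // the thing you want following the hand

    void Start()
    {
        widget.SetParent(palmAttachment, worldPositionStays: false);
        widget.localPosition = Vector3.zero;
        widget.localRotation = Quaternion.identity;
    }
}
```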

There's also a script called AttachmentHandEnableDisable.cs, which is critical in many cases: it enables/disables the attachments to match the tracking state of the hand they're on. It's usually preferable to turn off an attachment when its hand disappears, rather than having it stick to the last tracked position. You need one for each hand, and each needs a reference to its hand. Yes, I do think this simple functionality should just be a bool in AttachmentHands.cs instead.
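For the curious, the effective behavior is about this simple. This is just an illustration assuming HandModelBase's IsTracked property; use the real AttachmentHandEnableDisable.cs from the SDK:

```csharp
using Leap.Unity;
using UnityEngine;

// Mirrors one hand's tracking state onto one attachment hand, so the
// attachments vanish with the hand instead of freezing at the last
// tracked position.
public class AttachmentToggleSketch : MonoBehaviour
{
    public HandModelBase hand;        // the tracked hand to mirror
    public GameObject attachmentRoot; // the attachment hand to toggle

    void Update()
    {
        attachmentRoot.SetActive(hand.IsTracked);
    }
}
```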

Pinching:

PinchDetector.cs references a HandModel (either physics or graphics) and fires UnityEvents on pinch and release. The trigger distance is in world space, so you'll need to change it if you scale the controller. If the "ControlsTransform" bool is true, the PinchDetector object will move around relative to the hand; this is a useful way to get the pinch position, the midpoint between the thumb and index fingertips.
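You can also wire up pinch events from code instead of through the inspector. A minimal sketch, assuming the OnActivate/OnDeactivate UnityEvents that PinchDetector inherits from the SDK's Detector base class (the spawner and prefab here are hypothetical):

```csharp
using Leap.Unity;
using UnityEngine;

// Drops a prefab at the pinch position whenever a pinch begins.
public class PinchSpawner : MonoBehaviour
{
    public PinchDetector pinchDetector; // one per hand
    public GameObject prefab;           // the thing to drop

    void OnEnable()
    {
        pinchDetector.OnActivate.AddListener(OnPinch);
    }

    void OnDisable()
    {
        pinchDetector.OnActivate.RemoveListener(OnPinch);
    }

    void OnPinch()
    {
        // With ControlsTransform enabled, the detector's transform sits at
        // the thumb/index midpoint, i.e. the pinch position.
        Instantiate(prefab, pinchDetector.transform.position, Quaternion.identity);
    }
}
```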

Final notes

  • The attachment and pinch scripts don't actually need to be their own objects if you prefer to condense the whole setup.
  • There's a key binding in the LEAP SDK that will override the HoloPlay Capture's ctrl+E toggle binding.
  • We've occasionally seen the LEAP struggle when going through USB hubs or questionable cables.
  • It's also sensitive to infrared blasts from other equipment (Vive lighthouses, depth cameras, Roombas) and from direct sunlight, so keep that in mind.
  • At one point it seemed like a bunch of LEAPs in our office completely stopped working, even brand new ones we had just ordered. It turned out the runtime sometimes auto-updates the firmware, and the new version was breaking every LEAP we plugged in. If you're still having trouble detecting hardware, I'd try their firmware reset tool.