Leap Motion support

I have been working on getting my Leap Motion to work with Blender + Armory: https://github.com/yanngdev/armory-leap

It uses WebSockets, so it works great in the browser. I could not find a working WebSocket implementation in Krom/Kha, so I used a proxy file written by a Python script to make it work with Krom. If anyone has an idea about how to use WebSockets natively in Krom, feel free to share.
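For anyone curious, the browser side boils down to something like the sketch below. It is only a rough sketch: the ws://127.0.0.1:6437/v6.json endpoint and the focused message are my assumptions about the default Leap WebSocket service, and the class name is only illustrative.

```haxe
// Rough sketch of reading Leap frames in a browser (JS) build.
// Assumption: the Leap service exposes its WebSocket on the default
// ws://127.0.0.1:6437 endpoint and streams one JSON frame per message.
#if js
import js.html.WebSocket;
import haxe.Json;

class LeapSocket {
	var socket:WebSocket;

	public function new() {
		socket = new WebSocket("ws://127.0.0.1:6437/v6.json");
		socket.onopen = function(_) {
			// Ask the service to keep streaming frames even when the tab is not focused.
			socket.send(Json.stringify({focused: true}));
		};
		socket.onmessage = function(event) {
			var frame:Dynamic = Json.parse(event.data);
			if (frame.hands != null) {
				trace("hands tracked: " + frame.hands.length);
			}
		};
	}
}
#end
```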

As a next step, I will try to make a more realistic hand using a Blender skeleton.

Requirements:

  • a Leap Motion (!)
  • the latest version of the SDK, which supports strings without quotes for props (heavily used in the hand model)
  • tracking seems to work better with the Orion version of the Leap software (Windows only) than with v2 (the version available for macOS).

If you are interested, I previously made a similar script for the Blender Game Engine (available in my GitHub repositories).


I have a Leap Motion®; maybe I can test it for Armory.
I'm using macOS®, but I'm switching to Linux soon.

What is this sorcery! :open_mouth: Now I need to get a Leap Motion ASAP.

Looking forward to updates. Anything crucial I should fix in Armory to make implementing this easier for you?

In the next updates, I want to work on a better-looking hand, then prototype more physics interactions with objects. In the future I will also try to port the scripts to nodes for a full node-based setup, if possible.

One thing I truly miss is a WebSocket implementation in Krom. I went through the whole Armory tech stack and only found a WebSocketClient class in https://github.com/Kode/Kha/blob/master/Sources/kha/network/WebSocketClient.hx .
I tried all the Haxe implementations of WebSockets, and looked at haxe-js-kit and other ways of using socket.io with Haxe (since I understood the code is generated to JS), but they all led to a dead end.
I also could not find any other way to get the data over a (simpler) socket.
That is why I ended up using a proxy file loaded with kha.Assets.loadBlobFromPath (which supports absolute paths) to share the JSON data between the Leap Motion software and Armory, with a Python script in the middle.
I am open to any suggestions.
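For reference, the Krom-side workaround looks roughly like the sketch below. It is only a sketch: LeapProxyReader and the C:/tmp/leap_frame.json path are made up for the example, and it assumes the Python script in the middle keeps overwriting that file with the latest frame as JSON.

```haxe
import kha.Assets;
import haxe.Json;

class LeapProxyReader {
	// Hypothetical absolute path; the external script is assumed to keep
	// rewriting this file with the most recent Leap frame.
	static inline var PROXY_PATH = "C:/tmp/leap_frame.json";

	public static function poll():Void {
		// loadBlobFromPath accepts absolute paths, so the file can live outside the project.
		Assets.loadBlobFromPath(PROXY_PATH, function(blob) {
			var frame:Dynamic = Json.parse(blob.toString());
			if (frame.hands != null) {
				trace("hands tracked: " + frame.hands.length);
			}
		});
	}
}
```

In an Armory trait, poll() could be hooked into notifyOnUpdate and throttled to every few frames so the file is not re-read constantly.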


I didn't know anything about this tech until seeing this post. Now I want to recreate that scene from Johnny Mnemonic. The device itself is pretty affordable too, which is awesome. Looking forward to seeing what you come up with @yulbryn

Sorry to piggyback on this, but I am interested in a WebSocket solution to talk to a BCI (Emotiv Cortex). Did you ever do more on it?

@lubos, I don't suppose you have time to help out?

Can take a look once the final 0.3 build is out (this week). :hatching_chick:


I was looking at the Leap Motion site; is this still possible in current Armory? It would be a heck of an affordable real-time input if the data could be piped in.
They just did some real-time capture with a Leap in Unreal for a monster. They call it the cutting edge of production.
I call it wasting no time. They are clearly mapping the joints from different fingers to control limbs, hands and core/head in different passes. I think of it as possible hand/finger rigs. I think that would work well with the live recording Blender does.
Pardon the link: https://youtu.be/YiOByO8J7xg