It uses websockets, so it works great in the browser. I could not find a working implementation of websockets in Krom/Kha, so I used a proxy file written by a Python script to make it work with Krom. If anyone has an idea about using websockets natively in Krom, feel free to share.
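In the browser, the data simply comes from the Leap service's local WebSocket. A minimal sketch of that side, assuming the usual v2 endpoint on port 6437 (the exact URL/version path may differ for your SDK):

```haxe
// HTML5 target only: read Leap frames straight from the local WebSocket service.
// The endpoint below is the usual v2 default — an assumption, verify for your SDK
// (some versions expect a version path such as /v6.json).
import js.html.WebSocket;

class LeapBrowser {
    public static function connect(onFrame:Dynamic->Void):Void {
        var ws = new WebSocket("ws://127.0.0.1:6437");
        ws.onmessage = function(e) {
            // Each message is one JSON frame with the hands/pointables data.
            onFrame(haxe.Json.parse(e.data));
        };
    }
}
```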
As a next step, I will try to make a more realistic hand using a Blender skeleton.
Requirements:
a Leap Motion (!)
the last version of the SDK that supports strings without quotes for props (heavily used in the hand model)
tracking seems to work better with the Orion version of the Leap software (Windows only) than with v2 (available for macOS).
If you are interested, I previously made a similar script for the Blender Game Engine (available in my GitHub repositories).
In future updates, I want to work on a better-looking hand, then prototype physics interactions with objects. In the future I will also try to port the scripts to nodes for a full node-based setup if possible.
One thing I truly miss is an implementation of websockets in Krom. I went through the whole Armory tech stack and only found a WebSocketClient class at https://github.com/Kode/Kha/blob/master/Sources/kha/network/WebSocketClient.hx .
I tried all the Haxe implementations of websockets and looked at haxe-js-kit and other ways of using socket.io with Haxe, since I understood the code is generated to JS, but they all led to a dead end.
I also could not find any other way to get the data over a (simpler) socket.
That is why I ended up using a proxy file loaded with kha.Assets.loadBlobFromPath (which supports absolute paths) to share the JSON data between the Leap Motion software and Armory, with a Python script in the middle.
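Concretely, the Krom side looks something like this — a minimal sketch where the absolute path, polling rate and field names are placeholders, and the exact Kha signatures (loadBlobFromPath, Scheduler.addTimeTask, Blob.toString) should be checked against your Kha version:

```haxe
// Krom side: poll the proxy file that the Python script keeps rewriting
// with the latest Leap frame. Path and polling period are placeholders.
import kha.Assets;
import kha.Scheduler;

class LeapProxyReader {
    public static function start(onFrame:Dynamic->Void):Void {
        // Re-read the proxy file a few times per second.
        Scheduler.addTimeTask(function() {
            Assets.loadBlobFromPath("C:/leap_proxy/frame.json", function(blob) {
                // The blob holds the raw JSON frame written by the Python script.
                onFrame(haxe.Json.parse(blob.toString()));
            });
        }, 0, 1 / 30);
    }
}
```

On the other end, the Python script just keeps overwriting that same file with the latest frame it receives from the Leap service.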
I am open to any suggestions.
Didn’t know anything about this tech until seeing this post. Now I want to recreate that scene from Johnny Mnemonic. The device itself is pretty affordable too, which is awesome. Looking forward to seeing what you come up with, @yulbryn.
Was looking at the Leap Motion site — is this still possible in current Armory? It would be a heck of an affordable real-time input if the data could be piped.
They just did some real-time capture with a Leap in Unreal for a monster. They call it the cutting edge of production.
I call it wasting no time. They are clearly mapping the joints of different fingers to control limbs, hands and core/head in different passes. I think of it as possible hand/finger rigs. I think that would work well with the live recording Blender does.
Pardon the link: https://youtu.be/YiOByO8J7xg