Calling a Python package from a Node

What is the best way to call a Python package inside a Node?

(Example: you want to do something like calling the Python numpy package inside your own Armory Node.)

import numpy as np
x = np.array([[1, 2, 3], [4, 5, 6]])

Hello everybody!

I am a long-time fan of Blender (I still have a Blender C-key and contributed back then to the first crowdfunding campaign to make Blender free :slight_smile: ).
After a long absence from Blender I just stumbled across Armory (while looking for a BGE alternative) - it looks highly awesome! I subscribed on Patreon; you definitely deserve support!

Currently I am investigating whether it is possible to create playground simulation environments for machine learning applications with Armory (comparable to Unity3D with the ML-Agents library, just with everything integrated into Blender).
I would, for example, be highly interested in creating nodes that use TensorFlow+Keras, scikit-learn and other Python libraries.
As far as I understand, Python is a possible target for Haxe?
If Python were supported as an Armory export target, would we then be able to use @:pythonImport to make use of the huge range of existing Python libraries, or even write the full node code in Python? (Of course this would then only run on platforms with Python - but presumably that would be everything except iOS?)
This would open up a huge field of possibilities. I could imagine the (usually open-source-friendly) machine learning community would be all over such a highly customizable simulation environment!

In case Haxe Python target support for Armory turns out to be too difficult:
Perhaps an alternative could be to offer a convenient way to write RESTful services in Python that Haxe Armory nodes talk to? Then people could still use the usual Python editors in Blender, add a specific node where the Python script to execute can be selected, and bundle everything up in the same .blend file. Armory would create an executable that first starts the Python service(s) and then the Krom client, so the user wouldn’t have to care about the system boundary.
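To make the idea concrete, here is a minimal sketch of such a local Python service, using only the standard library plus numpy. The port, the JSON payload shape and the handler class are all made up for illustration; the point is just that an Armory/Haxe node could POST JSON to 127.0.0.1 and get a Python-computed answer back.

import json
from http.server import BaseHTTPRequestHandler, HTTPServer

import numpy as np


class NodeServiceHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON payload sent by the Armory node.
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")

        # Do the Python-only work, e.g. a numpy computation on the sent matrix.
        x = np.array(payload.get("matrix", [[1, 2, 3], [4, 5, 6]]))
        result = {"sum": int(x.sum()), "shape": list(x.shape)}

        # Send the result back as JSON.
        body = json.dumps(result).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    # Loopback only: the game client and the service run on the same machine.
    HTTPServer(("127.0.0.1", 8080), NodeServiceHandler).serve_forever()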

When (optionally) running the Python service directly inside the Blender process, people could even use that approach to remote-control Blender from Armory programs - and the other way around.
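If the service ran inside the Blender process itself, the main caveat is that bpy should only be touched from Blender’s main thread. A rough sketch of how a request handler thread could hand work over to the main thread via bpy.app.timers (the object name "Cube" and the command format are placeholders):

import queue

import bpy

# Commands queued by the service's handler thread(s).
commands = queue.Queue()

def process_commands():
    # Runs on Blender's main thread, so bpy access is safe here.
    while not commands.empty():
        obj_name, location = commands.get_nowait()
        obj = bpy.data.objects.get(obj_name)
        if obj is not None:
            obj.location = location
    return 0.1  # ask Blender to call us again in 0.1 s

bpy.app.timers.register(process_commands)

# A handler thread (e.g. the HTTP service above) would then simply do:
commands.put(("Cube", (0.0, 0.0, 2.0)))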

Edit: combined everything into a single posting.


This would be great. Not just running in Python, but perhaps even eventually exporting to Core ML or TensorFlow models for inclusion on iOS or Android.

I am really not sure how to get to Python via Haxe/Kha etc. Interesting…

I’m currently making a few other nodes, but I’d definitely like to be involved in some capacity in looking into this.

@AnadinX and @NothanUmber,
maybe you could be interested in AI, Python and Armory3D/Blender.
I am actually building my first neural network with Armory nodes/Haxe and hope to have some interesting results to share soon.

Maybe we could soon have a new “AI Armory” category on this forum :slight_smile: … @lubos?

Hi @Didier, did I understand correctly that you are implementing the neural network directly in Haxe? Or did you find a way to use the usual Python libraries? Having an Armory “AI library” would of course be great.
But for experimentation, having all the Python/TensorFlow/you-name-it stuff available would imho be important in order not to be sidetracked too much by reimplementing things that others have already put a lot of effort into - particularly when it’s not yet clear which approach really works for a particular problem.

E.g. there are some reinforcement learning policy implementations in keras-rl that could act as a starting point: https://github.com/keras-rl/keras-rl. Exposing the Blender scene as an OpenAI gym environment (https://github.com/openai/gym) could also be an interesting goal.
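For reference, a gym environment is just a class with reset() and step(), so a skeleton for “the Armory/Blender scene as a gym” might look roughly like the sketch below (classic gym API; the observation/action spaces and the way the scene would actually be queried - e.g. via a local service - are placeholders):

import gym
import numpy as np
from gym import spaces


class ArmorySceneEnv(gym.Env):
    """Hypothetical environment that forwards reset/step to a running scene."""

    def __init__(self):
        super().__init__()
        self.action_space = spaces.Discrete(4)                      # e.g. four movement actions
        self.observation_space = spaces.Box(-1.0, 1.0, shape=(8,))  # e.g. agent state vector

    def reset(self):
        # Ask the running scene to reset itself and return the first observation.
        return np.zeros(8, dtype=np.float32)

    def step(self, action):
        # Forward the action to the scene, advance one tick, then read back
        # the observation, reward and done flag from the scene.
        obs = np.zeros(8, dtype=np.float32)
        reward, done, info = 0.0, False, {}
        return obs, reward, done, info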

I am thinking about the merits of just having nodes that start external services, plus “smooth” inter-process communication (perhaps even as a web service - the loopback network is usually quite fast), so the scene could communicate with an external “AI service”. That service could then be written however the developer wants, so it wouldn’t be restricted to Python.
This would then also work from web targets - the “AI service” wouldn’t even have to run on the same machine. It could, for example, also be a Blender instance with the embedded Python interpreter running and access to all the bpy stuff.
One advantage would be that this wouldn’t pollute the otherwise completely platform-independent Armory/Kha/Haxe system with platform-specific constraints.
What would be good, though, is some mechanism to bundle everything up, and a way to automatically start and stop these external services with the application, so developers don’t suffer from the system boundary more than necessary.
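The “start and stop automatically” part could be as simple as a small launcher that spawns the service(s) before the game and tears them down when the game exits; a sketch with made-up file names:

import atexit
import subprocess
import sys

# Start the (hypothetical) Python AI service and make sure it is killed on exit.
service = subprocess.Popen([sys.executable, "ai_service.py"])
atexit.register(service.terminate)

# Block until the game itself (e.g. the Krom build) exits.
subprocess.run(["./my_armory_game"])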

I haven’t looked into the code yet. Perhaps some inter-process communication mechanism like this already exists, in order to allow Krom to pick up changes made in the Blender scene interactively? Could that perhaps be reused?

Yes, all in Haxe, encapsulated into Armory nodes.

Like you, I first looked at making calls to existing libraries like Keras. But in the end, as I also want something that keeps the advantages of Haxe - that is, efficient cross-platform development - I prefer to stay all in Haxe.

However, the capability to call a kind of AI service from Haxe could be promising too - like the one offered today by IBM Watson, for example.

Yep, having a native (Haxe) ML library is a great thing to have!
The service approach could be more for prototyping and experiments (or for using clusters of machines for expensive calculations on specialized hardware).
“Why don’t we have both?” <meme/> :slight_smile: