I was wondering to what extent it would be possible to call the Blender Python interpreter, and possibly a library installed in a virtual environment, from Armory.
The goal would be to call PyTorch and create real-time learning behaviours in-game.
My speciality is machine learning, but many of the libraries we use only exist in Python and would be a hassle to reimplement in other languages.
A crutch solution would be to call a shell, activate a virtual environment, run Python code from there, and communicate with it through pipes or fifos …
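To make the crutch concrete, here is a minimal sketch of that pipe-based approach, written entirely in Python so it is self-contained. The worker script is hypothetical: in a real setup it would live in the virtual environment and import PyTorch, while the game engine side would spawn the venv's interpreter instead of `sys.executable`. Here the worker just doubles every element of a 2-D float array, exchanged as one JSON line per message:

```python
import json
import subprocess
import sys

# Hypothetical worker script: a real one would import PyTorch and run a
# model; this one just doubles every element so the sketch stays runnable.
WORKER = r"""
import json, sys
for line in sys.stdin:                      # one JSON-encoded array per line
    data = json.loads(line)
    print(json.dumps([[2.0 * x for x in row] for row in data]), flush=True)
"""

def start_worker():
    # Spawn the interpreter; the engine would do the equivalent, pointing
    # at the venv's python binary instead of sys.executable.
    return subprocess.Popen(
        [sys.executable, "-c", WORKER],
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        text=True,
    )

def query(proc, array):
    # Send a 2-D float array down the pipe and read the reply back.
    proc.stdin.write(json.dumps(array) + "\n")
    proc.stdin.flush()
    return json.loads(proc.stdout.readline())

if __name__ == "__main__":
    proc = start_worker()
    print(query(proc, [[1.0, 2.0], [3.0, 4.0]]))  # [[2.0, 4.0], [6.0, 8.0]]
    proc.stdin.close()
    proc.wait()
```

JSON-over-pipes is slow for large tensors (a shared-memory buffer or binary framing would scale better), which is part of why this stays a crutch rather than a real integration.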
But that’s a crutch, and it’s what I ended up doing in Unity … Even Unity’s in-house solution is a hassle to use and is actually “a hack” to bypass the fact that you can’t put Python scripts, environments, interpreters, or libraries in the Unity assets … What they ended up building was an ML agent that can only learn when used by the game designer … which is not the most exciting way to use machine learning in games … We want the player to experience the learning.
So how much of a hassle do you think it would be to use Python code in Armory?
Let’s be clear: not Python code to manipulate Armory objects … Python code to run machine learning processes on standard data types …
basically, multidimensional arrays of ints or floats.