Android build

Tried the tutorial for mobile with the rotating head from the examples.
Awesome: running at 60 fps, very responsive to touch, and perfectly fluid motion when looking around.

Is it possible to detect an orientation change on the device? Would that need to go through Kha?
Currently it handles orientation changes by stretching to conform to one dimension.
Also, is it possible to listen for zooming (two-finger pinch) events?

That is great to hear!

It is possible, but I think only with a script. It goes through Kha, but is still super simple:

  • kha.System.screenRotation - to retrieve orientation
  • kha.System.windowWidth() and kha.System.windowHeight() - to retrieve screen pixel size
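As a rough illustration, the two calls above could be combined into a small per-frame check for orientation changes. This is an untested sketch, not a drop-in script; the class and function names are mine, only the `kha.System` calls come from the list above:

```haxe
// Hypothetical orientation watcher, assuming the kha.System APIs listed above.
class OrientationWatcher {
	var lastPortrait: Bool;

	public function new() {
		lastPortrait = isPortrait();
	}

	static function isPortrait(): Bool {
		// Comparing window dimensions is a simple fallback;
		// kha.System.screenRotation reports the rotation explicitly where supported.
		return kha.System.windowHeight() > kha.System.windowWidth();
	}

	// Call once per frame, e.g. from an update loop.
	public function update() {
		var portrait = isPortrait();
		if (portrait != lastPortrait) {
			lastPortrait = portrait;
			trace("Orientation changed, now " + (portrait ? "portrait" : "landscape"));
			// Re-layout UI, adjust camera aspect, etc.
		}
	}
}
```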

There is a way to detect input for multiple fingers, example script here.

I can also look into making this available to logic nodes. For two-finger zoom we could provide a pre-built utility function, since right now you have to manually get the distance between the fingers and see if it’s shrinking or expanding…
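The manual approach described above (track two touch points, compare finger distance over time) might look roughly like this. The `notify(touchStart, touchEnd, touchMove)` callback shape with `(index, x, y)` arguments is assumed from Kha's multi-touch example, so treat this as an outline rather than working code:

```haxe
// Rough pinch-zoom sketch on top of Kha's multi-touch Surface input.
// Assumes kha.input.Surface delivers (index, x, y) touch callbacks.
class PinchZoom {
	var touches = new Map<Int, {x: Int, y: Int}>();
	var lastDist = -1.0;

	public function new() {
		kha.input.Surface.get().notify(onTouchStart, onTouchEnd, onTouchMove);
	}

	function onTouchStart(index: Int, x: Int, y: Int) {
		touches.set(index, {x: x, y: y});
	}

	function onTouchEnd(index: Int, x: Int, y: Int) {
		touches.remove(index);
		lastDist = -1.0; // reset when a finger lifts
	}

	function onTouchMove(index: Int, x: Int, y: Int) {
		touches.set(index, {x: x, y: y});
		var t0 = touches.get(0);
		var t1 = touches.get(1);
		if (t0 == null || t1 == null) return; // need two fingers down
		var dx = t1.x - t0.x;
		var dy = t1.y - t0.y;
		var dist = Math.sqrt(dx * dx + dy * dy);
		if (lastDist > 0) {
			// Positive delta = fingers spreading (zoom in),
			// negative delta = fingers pinching (zoom out).
			var delta = dist - lastDist;
			trace("Zoom delta: " + delta);
		}
		lastDist = dist;
	}
}
```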

It is also possible to lock the rotation to a specific orientation, using the Properties - Render - Armory Project - Orientation property.


I see. I totally missed that multi touch example file. Thanks for pointing that out.
So we have both the option to lock and access to the events and sizes. Perfect.

If you do build pre-made gesture support, exposing hooks for on-screen command interfaces could be a major plus on top of multi-finger interaction. But I see there is a UI library side to this that I haven’t looked into yet, so that may already be possible.

I’ve got to look at the Kha API; I can probably use it for connecting with backends. This is great.