[SOLVED] Armory upgrade to Blender 2.9.X

Is the workaround really that ugly? :sweat_smile: I think it may be worth it if it's not a nightmare to maintain and/or can be self-contained instead of being spread messily all around the codebase... In the meantime, an issue or a message in the Blender chat to reach the UI maintainers and see what they think about these limitations would probably be a good idea.

Slightly redesigned trait panel, what do you think?

Pretty good! :+1:

2 Likes

It's at least ... unusual and a matter of opinion. I don't even know if it would work at all, but I can do some tests in the coming days. A feature request is also a good idea, maybe one even exists already (I also plan on contacting the author of the Scattering addon to maybe get some insight into how the preference editor thing works, maybe it uses a similar solution).

Sorry, another big wall of text is coming (beware of the two meanings of property, as in "Properties Editor" and "custom property"):

Because I left out many details: UI drawing methods like layout.prop() require a property that is "synchronized" with the drawing. Blender's data system only allows properties on ID blocks and some hand-picked other types like the addon's preferences. So the value of the property belongs to the instance that actually holds it; that can be an object, for example, or the entire window manager (like a global instance, but it isn't saved, which is why we have the "Arm" world). When drawing the property, you always need to pass that instance as well (like instance.value in any programming language).

Because areas/regions can't hold properties, we either need one instance of some type for every open properties editor, or we need as many enum properties (all the tabs are stored in an enum) in the window manager instance as we have editors. I would go for the latter because then we don't have to deal with data blocks the user can edit by accident. It is possible to hide text blocks (I used that a few years ago), so that could work as well. Maybe we can even replace the "Arm" world with that and add a small API for easier access.

Then, we need a way to identify each editor and give it a unique id (via pointer or hash()), and map each properties editor to an enum property that holds the current tab for that editor. The mapping lookup must be fast because it has to run on every redraw of the editor, since we need to tell the draw functions which enum to draw. The same goes for the poll() functions, so that the correct panels show up depending on the current editor context.

The most difficult part, I guess, is reacting to new/closed editors and creating the enum property accordingly. I don't know if this is possible with the msgbus module, but I don't think so; there is no logging in the info editor when opening a new editor. So again, a workaround is required. You could add some lines to the poll function of each properties editor's draw function that check whether the editor is "registered"/mapped yet, to fake a listener, and if not, create a property for it. But what to do when properties editors are closed? If we don't clean up after a properties editor is closed, it will create more and more properties and require more and more memory until you close Blender. So we need yet another workaround here. But I haven't researched yet whether somebody already found a solution for this, maybe we're lucky and it is possible :slight_smile:
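To illustrate the bookkeeping (without any bpy; the class, method and tab names like ARM_PROPERTIES are made up for this sketch), the fake-listener-plus-cleanup idea could look roughly like this:

```python
# Hypothetical bookkeeping for the "fake listener" idea, without bpy:
# each properties editor gets its own tab state, created lazily from the
# poll()/draw() path and cleaned up once the editor disappears.

class EditorTabRegistry:
    def __init__(self):
        self._tabs = {}  # editor id -> currently selected tab

    def get_tab(self, editor_id, default="ARM_PROPERTIES"):
        # Called from poll()/draw(): lazily "register" unknown editors.
        return self._tabs.setdefault(editor_id, default)

    def set_tab(self, editor_id, tab):
        self._tabs[editor_id] = tab

    def cleanup(self, open_editor_ids):
        # Called occasionally with the ids of all currently open
        # properties editors; drops state of closed editors so the
        # mapping doesn't grow until Blender is closed.
        for editor_id in list(self._tabs):
            if editor_id not in open_editor_ids:
                del self._tabs[editor_id]

registry = EditorTabRegistry()
registry.set_tab(1, "ARM_TRAITS")
registry.get_tab(2)        # lazily registered with the default tab
registry.cleanup({1})      # editor 2 was closed -> its state is removed
```

In real code the dict values would have to be actual enum properties and the cleanup would need some trigger, which is exactly the open question above.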

So yeah, this is a bunch of code that has to be well-documented, and I have no idea whether it works and whether it is fast. But the idea of having our own editor panel and being the first addon to do this is quite tempting of course :sweat_smile:


Edit: You could probably listen to changes in the length of the areas or regions attribute of the current screen (I always confuse area with region, sorry^^). But I don't think it works, as changing the length of a collection does not really update the attribute value (the collection object remains the same). Let's test that.

2 Likes

Ok, I get it, it's a workaround on top of a workaround on top of an experimental feature on top of a workaround on top of... :sweat_smile:

I read your "big wall of text" :laughing: three times, but there is still a lot I don't understand about the Blender API and the like. From what I gather, it sounds very tricky to get right. That's a shame.

Opening a feature request for a proper way to add a new tab seems like the better idea.

2 Likes

Basically yes, that sums it up very well... :sweat_smile: So far I couldn't find a way to listen for new/closing editors, so I guess it's not even possible (the end of the "workaround chain" fails).


Redesigned Add Trait menu:
AddTraitFromMenu
What do you think of having icons here as well? The menu is wider now because of that (400px width instead of 300).

Also, I implemented a #todo comment from the code: when invoking the operator from a search menu, there is now a checkbox that lets you decide if the trait should be added to the selected object or to the current scene.
AddTraitFromSearch

2 Likes

Sneak peek of the Nishita sky model with Armory clouds on top (ported from here) :grin:

Previously the clouds were just overlaid on top of the sky, which meant that they were always bright even when the sky was completely dark. Now (in the images above), they are multiplied with the sky color before mixing (to better approximate real clouds). What do you think? They lose some strength towards the horizon because of that...
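In Python-flavored pseudocode, the blending change boils down to something like this (per channel, values in 0..1; a toy sketch of the idea, not the actual shader code):

```python
# Before, the cloud color was overlaid directly onto the sky; now it is
# first multiplied with the sky color, so a dark sky yields dark clouds.

def mix(a, b, t):
    # Standard linear interpolation, as in GLSL's mix().
    return a + (b - a) * t

def shade_pixel(sky, cloud, cloud_alpha):
    # Multiply step: the cloud inherits the sky's brightness/tint.
    lit_cloud = [c * s for c, s in zip(cloud, sky)]
    return [mix(s, c, cloud_alpha) for s, c in zip(sky, lit_cloud)]

# A bright cloud over a pitch-black night sky now stays black:
night = shade_pixel([0.0, 0.0, 0.0], [0.9, 0.9, 0.9], 1.0)
```

This also makes the horizon falloff mentioned above plausible: where the sky color gets weaker, the multiplied clouds get weaker too.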

4 Likes

Redesigned the project flags panel based on the feedback a few posts above, what do you think?

armory_project_panel

I'm still unsure whether Export Tangents should be in the Exporter subsection; it is not important for the user where the code for that lives, its effects (precomputed tangent vectors) are more important.

The same goes for Minimize Data (which we should probably rename in the future, as it describes whether to export to .json or .arm); I'm not sure if it should just be in the Build section for said reasons (it influences the build output).

If there are too many headings on the left now, we can also leave them out and just use small margins between the subsections.


Also, I found a better way to make the cloud lighting more realistic: the clouds are now darkened if the sun is below a certain angle. This manipulates the cloud tracing function a bit, so I guess I will make it optional. It still needs tweaking, and I haven't tested the clouds with the old Hosek model yet; maybe they need other values/angles.
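The darkening could, for example, be a smoothstep over the sun elevation; the angles below are made up for illustration, the actual implementation may use a different curve:

```python
# Toy darkening factor for the clouds based on sun elevation (degrees):
# full brightness above `upper`, fully darkened below `lower`, with a
# smooth transition in between. The -5/10 degree bounds are placeholders.

def smoothstep(lower, upper, x):
    t = max(0.0, min(1.0, (x - lower) / (upper - lower)))
    return t * t * (3.0 - 2.0 * t)

def cloud_darkening(sun_elevation_deg, lower=-5.0, upper=10.0):
    return smoothstep(lower, upper, sun_elevation_deg)
```

Making `lower`/`upper` user-tweakable would fit the "make this optional" plan.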

I also thought about adding a way to set a stars texture and a scale, which would be displayed at night. Also, I still do not know how to translate Blender's air/dust and ozone settings into Mie or Rayleigh coefficients for the sky rendering. I found a lot of complicated math papers, but I still understand far too little.

Edit: tested it with Hosek and the sky doesn't seem to update with the sun position. Does anybody know what must be done to get a day/night cycle? There are comments in the Hosek-related code which suggest that it is implemented...

4 Likes

Hi,

As for the project flags panel, I guess the export tangents option could just stay there - if not, I think it could be moved down to the renderer section and renamed to "Precompute Tangents"?

I agree that Minimize Data could probably be renamed, as to me it sort of sounds like it compresses the data; from my understanding that's what the Asset Compression option does. So it could probably be renamed and moved to the build section.

As for the panel, I don't think it's too bad to leave out options for the user, provided that the defaults are good.


The new sky model looks great!

As for the air, dust and ozone settings, I haven't looked very thoroughly at it, but there might be some functions from Cycles that we can try to port:

As for the sky update when using Hosek-Wilkie, it needs to be recomputed every frame (or at least every time the sun direction changes). There's an example here - not sure if it still works though:

4 Likes

Thanks a lot!

I now renamed the discussed options to Precompute Tangents and Binary Scene Data. Maybe it makes even more sense to move the binary data option to the exporter panel where Optimize Data and Asset Compression reside? These options are global ("Arm" world) as well, but some of them only affect published builds. Btw, sorry for being so nitpicky here, but I think it's better to create a decent UI once than to change it with every update.

What do you think about the following idea? The Build section was renamed to Debug, the option now called Binary Scene Data was moved to the exporter panel, and the options in that panel now belong to two sections called Compilation and Data. Compilation still needs a better name (Compiler?), but Minify JS doesn't really belong to the compiler as it calls another tool in the background...

armory_project armory_exporter

Also, thanks for the links @Naxela. I didnā€™t have much time yet to really look into it, but understanding how Blender does it will help a lot!

3 Likes

Hi,

I think renaming Build to Debug makes sense, especially since verbose output and cached builds aren't essential for published/released builds - and being nitpicky is fine in this case, as the simpler the UI becomes, the easier it will be for new users.

I think Minify JS could possibly be moved to the Data section, as it saves space too - Not sure if it could be renamed to something like Code Compression or similar?

I take it you haven't implemented the sky radiance/irradiance for objects yet, or is that included in that code? If not, I'm wondering if there's some stuff from the Hosek-Wilkie implementation that can be reused for calculating the SH coefficients: https://github.com/armory3d/armory/blob/master/Sources/armory/renderpath/HosekWilkie.hx

4 Likes

That's a good idea. I also thought about renaming Compilation to Output or Code Output, that would work as well. I'm not sure what fits better, do you have a preference?

No, I haven't implemented it so far and I'm not yet sure if I'm able to, as there were some issues regarding gamma correction/brightness in general (details in this PR). I wasn't able to get a good-looking output, but I plan to revisit my old code some day.

The other problem for which I don't know a solution is how to handle dynamic skies, or dynamic world shaders in general, now that there is full node support. If there are visible changes in the world, the irradiance and radiance maps should be updated as well, at least via an update function like in the Hosek case. But even in the Hosek case there are at least some SH (spherical harmonics) values hardcoded in write_probes.py. Do you know how other engines or Blender handle this? I have no knowledge about stuff like this (physics/optics in general), so I'm a bit lost when doing research about radiance/irradiance/SH calculations. I'm unfortunately more of a script kiddie in that regard :laughing:

Also, do you know what the Hosek A-I, Z coefficients are used for? Is that radiance/irradiance related?


Small Nishita sky update: I was able to implement air and dust density settings; the values are just multiplied with the optical depth values in the shader. The output looks more or less the same as in Blender now. However, the ozone setting is more tricky; so far I wasn't able to get it working even when using the same density calculations Blender uses (which we btw can't use unless we include a copy of the Apache license). Blender's algorithm differs a lot in some places due to heavy optimizations or precomputations, and I have no idea what's going on there.
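Conceptually, the density settings just scale the optical depths before the transmittance is computed. A toy sketch with illustrative coefficients (these are common textbook-style Rayleigh/Mie values, not necessarily Armory's actual ones):

```python
import math

# Sketch of how air/dust density settings can scale precomputed optical
# depths before computing per-channel transmittance. Coefficients are
# illustrative placeholders, not the values used in the Armory shader.

RAYLEIGH = (5.8e-6, 13.5e-6, 33.1e-6)  # per-channel scattering coeffs (1/m)
MIE = 3.996e-6                         # single Mie coefficient (1/m)

def transmittance(od_rayleigh, od_mie, air_density, dust_density):
    return [
        math.exp(-(b * air_density * od_rayleigh + MIE * dust_density * od_mie))
        for b in RAYLEIGH
    ]
```

With density 1.0 this reduces to the unmodified model; density 0 removes that component entirely, which matches how the sliders behave in Blender.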

4 Likes

(which we btw canā€™t use unless we include a copy of the Apache license)

As long as it is not GPL or any other copyleft license, it seems to be fine AFAIK.

https://tldrlegal.com/license/apache-license-2.0-(apache-2.0)

2 Likes

Progress update:

  • I was finally able to implement ozone density for the new sky model, it now looks almost like Cycles. There are slight color differences because Cycles actually calculates 21 different wavelengths and we calculate only 3 (RGB). I could try manually adjusting the ozone absorption coefficients to make the color difference smaller, but I don't know if that works well with different density settings. What do you think? Use more or less physically correct values or adjust them?

    Top: Cycles, bottom: Armory (ozone density = 10)
    OzoneCycles

  • I implemented a sun disk with limb darkening (if enabled in the Nishita node) based on a paper about the Frostbite engine. I think it looks much better than Blender's sun (the difference is hopefully acceptable), but it still needs adjustments.

    The sun has this weird shape because of https://github.com/armory3d/armory/issues/1277#issuecomment-782749822, I hope there will be a solution for that soon.

    Should the sun disk automatically follow the sun in the scene (current behavior), or should it follow the elevation and rotation settings from the node?

  • I revisited my old radiance/irradiance export code and now it works. The irradiance now looks correct (it was way too bright before); the solution was to divide everything by 2, as is currently done in write_probes.sh_to_json().

    There is still the question of how to update the radiance/irradiance data when there are changes in the world shader. I have no idea how other engines do that (probably they also don't have such customizable worlds), but I have two ideas:

    • Render a small cubemap of the world if a recalculation is requested, then calculate irradiance/radiance from it. I have far too little knowledge of SH, for example, to be able to implement this method. Also, it requires a very fast world rendering so as not to slow the game down.

    • Add a light scenario system like the Unreal Engine has. This could be combined with the Lightmapper addon, and you would be able to interpolate between scenarios.

    But dynamic (ir)radiance is probably something for future PRs.

  • The Nishita sky model is quite slow in this form, as it is dynamically calculated each frame. With 16x8 samples (16 samples for the primary ray, 8 for the secondary; the default in the original implementation) and already some optimizations (5% faster on DirectX according to Intel GPA), I still only get an 18ms frame time for an empty scene on my mid-range PC, which is not acceptable. Lower sample counts are much faster with almost no visible differences, but it is still wasted computation time.

    Most realtime implementations use 2D/3D/4D LUTs to look up precalculated scattering values, but I first have to understand how that works. There are a bunch of papers that describe such techniques, but they are complicated and full of physics/optics I don't understand, so this will take some time and I'm not sure if I'm able to succeed. Maybe I will make the sample count adjustable and set it lower by default.

    The original implementation from Nishita himself uses a 2D LUT texture, I think, but there are faster variants using higher dimensions. The problem with those is that WebGL doesn't support 3D textures...
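As a rough illustration of the limb darkening mentioned above: the Frostbite-based shader uses per-channel coefficients, but the effect can be sketched with a standard linear limb-darkening law (the u = 0.6 value here is just a placeholder, not the value used in Armory):

```python
import math

# Simple linear limb-darkening law for a sun disk: intensity falls off
# from the disk center to the edge. The actual Frostbite-based
# implementation uses per-channel polynomial coefficients instead.

def limb_darkening(r, u=0.6):
    """Relative intensity at normalized disk radius r (0 = center, 1 = edge)."""
    mu = math.sqrt(max(0.0, 1.0 - r * r))  # cosine of the angle from disk center
    return 1.0 - u * (1.0 - mu)
```

The center stays at full intensity while the rim darkens, which is what makes the disk look less like a flat cutout.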

4 Likes

That looks cool :grinning:

I think it would be better to do just the most essential calculations, even if the looks are a bit compromised.

To give maximum control to the game developer, it would be better to set the sun elevation and rotation with the node. Or maybe have a checkbox in Blender to choose whether it should follow the sun lamp in game.

I think this would be a slightly easier option, and quicker to implement. But I may be totally wrong. If it is faster, we already have something to start with rather than nothing.

Maybe keep working with a 2D texture LUT and then change over to 3D when it gets to WebGL. :wink:

3 Likes

I'm not actually aware of how other engines handle this. I believe some might do it only at compile/build time, while others might offload it asynchronously to not slow down the render pipe. I'm not sure, but I think this might be how Space Engineers does it, since the reflections get updated after a second or so.

As for the A to I values, I believe these are the RGB values for the SH bands (the first is band 0, the next three are band 1 and the rest is band 2), while the Z vector holds the radians for the hemispherical direction.

This sort of helps explain it: Interactive Spherical Harmonic Visualization - Edilogues

The sky is looking really great! As for the values, I think having full parity with Cycles might be hard, so I guess making it look good while getting as close as possible is good enough, and the limb darkening seems like a good thing to have.

The second scenario you mention is something that I had slowly started on (related to this commit), by extending the Armory Bake panel with a Probe tab, similar to what the standalone Lightmapper currently has, but making it more manageable by baking radiance and irradiance to the Bundled folder, and adding some nodes that allow you to change your radiance (HDR file) and irradiance (c/coeff file) at runtime. The main problem I'm having is trying to figure out the best way to interpolate these values rather than having weird instant shifts.

I think 3D textures might be supported by WebGL2 now (since Three.js and Babylon.js seem to support it), but I don't think it's implemented in the Kha backend yet.

4 Likes

Thanks! This interactive visualization is awesome :slight_smile: Luckily we have cmft for now; I'm very afraid of the maths behind spherical harmonics, although it is probably not that difficult once you get the hang of it.

I think I will leave it as it is; it looks almost like Cycles apart from very slight differences for high ozone values, for example. Also, the area below the horizon looks different, but that's negligible I think.

That's good news. Do you know how Kha chooses between WebGL and WebGL2 (if that happens at all)? I guess I should ask Robert when I have more time again (exams are coming...) and then try to implement it. There are still some browsers that don't support WebGL2 and I'm not sure how important that is for us. The current cloud implementation, for example, also relies on 3D textures; it currently doesn't work in html5 at all.

I came up with my own basic 1D LUT implementation which runs at 60fps now (but still takes a lot of time compared to other shaders because it doesn't replace the loops/integral calculations), so I guess 2D LUTs are fast enough if we want to go the full html5 support route (or better: choose an approach based on the target). I still have to do more research on that topic though. My implementation has slightly visible artifacts, either because of wrong interpolation or because of different step sizes between the LUT calculation and the actual integral in the shader.

For now I have to back out a bit for exams but I will keep posting updates after that :slight_smile:

2 Likes

Yeah, I think cmft will do just fine - especially since the background lemon shape seems to come from the actual skydome rather than from cmft (HDR Is not uniform · Issue #1277 · armory3d/armory · GitHub).

I'm not sure, but I think it might be decided automatically based on the device's capabilities - I think, similar to three.js, it's WebGL2 by default with WebGL as a fallback - see: Kha/Backends/HTML5/kha/SystemImpl.hx at 1196088f3731c4120354e5b7eb863bdcf140baec · Kode/Kha · GitHub

True, there are always going to be some browsers that stubbornly don't adopt standards. That being said, I think it's fine to disregard Internet Explorer now, and Opera Mini is being deprecated - QQ, Baidu and KaiOS seem to have very minimal market share, and Safari (both Mac and iOS) can have it turned on manually, so I think it's all good.

I think at first it's good enough to use whatever works - from there on, things can always be improved gradually, performance-wise and visually :+1:

In any case, good luck with the exams :smiley:

4 Likes

So, I guess it's time for another update :slight_smile:

I got a 2D lookup table working for the Nishita sky model, resulting in a six times better performance by completely replacing the inner integral of the scattering calculations. The original version took ~25ms per frame, the 1D LUT version took ~19ms and now it's only ~3ms. The LUT only needs to be recalculated when the density settings (air, dust, ozone) change, and I plan to add an API for that as well as logic nodes (also for the Hosek-Wilkie model, which currently needs a Haxe script to update). This would allow caching presets of sky configurations.
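The 2D LUT idea can be sketched in plain Python like this: precompute the inner integral (optical depth from a point toward the sun) over a grid of (height, sun angle), so the per-frame shader loop only does the outer integral plus a texture lookup. Everything below (flat-atmosphere model, constants, resolutions) is a made-up toy, not the actual shader code:

```python
import math

# Toy version of the 2D LUT: precompute optical depth toward the sun
# as a function of height and sun angle. Flat-earth exponential
# atmosphere with illustrative constants.

SCALE_HEIGHT = 8000.0    # metres, exponential density falloff
ATMO_TOP = 100000.0      # metres, top of the atmosphere

def density(h):
    return math.exp(-max(h, 0.0) / SCALE_HEIGHT)

def optical_depth_to_sun(h, cos_sun, steps=32):
    # March from height h toward the sun (midpoint rule); this is the
    # "inner integral" the LUT replaces at runtime.
    if cos_sun <= 0.0:
        return float("inf")
    length = (ATMO_TOP - h) / cos_sun
    ds = length / steps
    return sum(density(h + cos_sun * (i + 0.5) * ds) for i in range(steps)) * ds

def build_lut(size=64):
    lut = []
    for j in range(size):                      # sun-angle axis
        cos_sun = (j + 0.5) / size
        row = [optical_depth_to_sun((i + 0.5) / size * ATMO_TOP, cos_sun)
               for i in range(size)]           # height axis
        lut.append(row)
    return lut
```

The real implementation additionally has to pick a parameterization (e.g. the squared height mentioned below) and handle the below-horizon case, which is where the artifacts discussed later come from.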

But there is still room for improvement on the performance side: I found a paper that renders the sky to a lower-resolution environment texture that can be sampled in the actual sky shader. The texture then only needs to be re-rendered when the sun changes its position. However, this obviously leads to a "non-steady" framerate, so if I implement this, I will make it optional.


Unfortunately, due to the big range of values, the LUT texture must be in RGB(A)128 format, which takes up a lot of bandwidth and might be problematic or even unsupported on older hardware. Also, the format doesn't support hardware interpolation as far as I know, so a LUT resolution of at least 128x128px is required to get a sky without visible banding (manual interpolation would require sampling the texture multiple times, which again slows things down). I still need to find out if it's possible to omit the currently unused alpha channel to require 1/4 less bandwidth. It probably has to be implemented for each back end (Kinc etc.) as there currently is no RGB128 format in Kha.

A solution could be to use the smaller RGBA64 format with downscaled values that get upscaled again in the shader (of course it would need to be tested whether the precision is enough), but there currently is no way to set the bytes for such a texture because there is no Float16 type in Kha. I asked Robert and he said that Kha should get an implementation someday, but I'm not sure what the priorities for this are and it probably will take some time.
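To get a feeling for whether half precision would suffice, one can round-trip values through IEEE 754 half floats in plain Python, independent of Kha, using the `struct` module's `'e'` format (Python 3.6+):

```python
import struct

# Round-trip some values through half precision (the format an RGBA64
# texture would store) and look at the relative error, to judge whether
# the precision is enough for the LUT values.

def to_half_bytes(values):
    return struct.pack(f"<{len(values)}e", *values)

def from_half_bytes(data):
    return list(struct.unpack(f"<{len(data) // 2}e", data))

original = [0.125, 3.14159, 1000.5]
roundtrip = from_half_bytes(to_half_bytes(original))
errors = [abs(a - b) / a for a, b in zip(original, roundtrip)]
```

Half floats carry roughly 3 decimal digits (11-bit significand), so whether that is enough depends on the value range after the downscaling step.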


Another (minor?) issue with the current 2D LUT implementation is that the sun halo got a bit weaker compared to the non-LUT implementation, and because I'm using a squared height for better precision at the horizon, the sky below the horizon looks pretty flat now (and I don't have the slightest idea why exactly that happens, it is only visible when the sun is low). The colors in general look a bit less saturated compared to the screenshots above, also only when the sun is low. I'm still trying to figure out why that happens.

I'm also working on implementing a proper (ir)radiance export for worlds including caching, so that will be included in the Nishita PR as well, which hopefully will come soon.

6 Likes

What about this method to avoid banding? Maybe it can be used as a workaround until Float16 support arrives.

3 Likes

That's a pretty good idea! I tried it, but it looks like such a simple dithering technique only works well for banding as slight as that shown in the linked article. If the LUT resolution is too low, the dithering doesn't help much because the bands are too different.

But I guess it is helpful even with the current 128px LUT resolution, to make the already barely visible banding even less noticeable. I measured the performance and it only slowed down the rendering by 0.1ms, so the dithering itself seems not to be very costly. I will experiment a bit to see if it improves the output, thanks for the idea!
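For reference, the basic idea of dithering before quantization can be sketched like this (a toy 1D version with a hypothetical 4-level quantizer and white noise, not the actual shader, which would typically use a screen-space noise pattern instead):

```python
import random

# Dithering trades banding for noise: add a random offset of up to one
# quantization step before quantizing, so that averaged over pixels the
# in-between values are recovered.

def quantize(x, levels):
    return round(x * (levels - 1)) / (levels - 1)

def quantize_dithered(x, levels, rng):
    step = 1.0 / (levels - 1)
    noise = (rng.random() - 0.5) * step
    return quantize(min(1.0, max(0.0, x + noise)), levels)

rng = random.Random(42)
# Plain quantization snaps 0.5 to a band; dithered samples average back
# toward 0.5 over many pixels.
samples = [quantize_dithered(0.5, 4, rng) for _ in range(10000)]
```

This also matches the observation above: if the bands are far apart (a very coarse LUT), the noise needed to hide them becomes objectionable itself.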

Edit: Using dithering looks much better for the sky at some sun angles, so I'll include it. Thanks again!


Another small update: I finally found the reason for the flat look below the horizon and I'm currently working on a solution. The problem is that I (again) didn't stick 100% to the paper implementations and instead did my own variant, because the proposed solutions often don't apply well to the scattering algorithm in the ported implementation. A fix should be coming soon.

5 Likes