How to animate alpha fade?

Hi,

I’ve got another question that probably nobody’s going to answer again lol but:

How do I manually animate the fade in an alpha blended material?

What I’ve tried, just to give you an idea of what I’m after: a Math (Multiply) node on the alpha connection, with a Value node that animates. I thought that since you can actually set this up, it would also animate. But perhaps that doesn’t work because the shaders are compiled from a static interpretation of the shader graph?

So I’m wondering… do I need to have a locator or something and a helper script to say: ‘This locator drives the alpha value of this object’?

Is there a simpler way, or is my assumption correct?

-S

In the material nodes you can use the ‘time’ attribute to animate, or you can use a Shader Data node to access uniforms, including time. You can also use a Value node with the Set Material Value Param logic node to control it.

Thanks @lapiznegro — the first part of what you mentioned (using attribute time) is great for atmospheric or procedural effects like lava flow or water movement.

What I’m specifically trying to solve, though, is timeline-based triggers that match up in real time with scene actions — for example, a fade keyed in Blender’s timeline that plays back in Armory exactly as previewed.

From what I understand, the reliable approach would be something like:

  1. Add a Shader Data node in the material with a parameter name like "fade", and multiply that into the alpha.
  2. Create an Empty (or any object) in Blender with a custom property also called "fade".
  3. Keyframe that property in the Blender timeline (so you can preview the fade there).
  4. In Armory, use Logic Nodes → Get Object Property(“fade”) → Set Material Param(“fade”) to push the animated value into the shader each frame at runtime.

That way, the Blender timeline drives a real-time uniform, and the same animation plays correctly in Armory. It still feels like a clunky workaround compared to the ideal of Armory supporting keyed node values directly.
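For reference, here’s roughly what step 4 could look like in Haxe instead of logic nodes. This is only a sketch under a couple of assumptions: that the keyed custom property actually shows up in object.properties at runtime (which I haven’t verified), and that registering a float link on iron.object.Uniforms is the right way to feed a named material parameter. FadeTrait is just a placeholder name.

```haxe
package arm;

// Sketch only: per-object trait that answers the renderer's request for a
// "fade" parameter with the owning object's "fade" custom property.
class FadeTrait extends iron.Trait {

    public function new() {
        super();

        notifyOnInit(function() {
            // Register a float link so the renderer can ask for custom float uniforms.
            if (iron.object.Uniforms.externalFloatLinks == null) {
                iron.object.Uniforms.externalFloatLinks = [];
            }
            iron.object.Uniforms.externalFloatLinks.push(fadeLink);
        });
    }

    function fadeLink(o: iron.object.Object, mat: iron.data.MaterialData, link: String): Null<kha.FastFloat> {
        // Only answer for this trait's object and the "fade" parameter.
        if (link != "fade" || o != object || o.properties == null) return null;
        return o.properties.get("fade");
    }
}
```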

It’s just about creating a workflow that lets artists work as they normally do, visualise things in Blender, and rely on them looking correct before the programmers even get their hands on it.

Is this the way forward, or is there a smarter way?

-S

Hey, I just thought of this idea — instead of trying to manually wire every object’s fade one by one, what if I just set up a scene-level fade manager that automatically finds every material with a fade uniform and drives them based on each object’s custom property?

So basically:

  • Every material has a Shader Data node called fade (multiplied into alpha).
  • Each object can have a custom property fade that’s keyframed in Blender’s timeline.
  • Then one global script or logic trait reads those values every frame and applies them as material params at runtime.

That would mean timeline animation still previews in Blender, but it also plays correctly in Armory — and I could still trigger fades programmatically when needed, without per-object setup.
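In Haxe terms the whole manager might collapse into a single hook, something like the sketch below. Same caveats as before: it assumes keyed custom properties are readable from object.properties at runtime and that the externalFloatLinks hook behaves the way I think it does; FadeManager is just a placeholder name.

```haxe
package arm;

// Sketch of the scene-level fade manager: one trait, attached once, handles
// every material that exposes a "fade" parameter. No per-object wiring.
class FadeManager extends iron.Trait {

    public function new() {
        super();

        notifyOnInit(function() {
            if (iron.object.Uniforms.externalFloatLinks == null) {
                iron.object.Uniforms.externalFloatLinks = [];
            }
            // Called while objects are drawn; answer only for the "fade" link,
            // using whatever object happens to be rendered at that moment.
            iron.object.Uniforms.externalFloatLinks.push(function(o: iron.object.Object, mat: iron.data.MaterialData, link: String): Null<kha.FastFloat> {
                if (link != "fade" || o == null || o.properties == null) return null;
                return o.properties.get("fade");
            });
        });
    }
}
```

Programmatic fades would then just be a matter of writing the value into object.properties from any other trait or logic node.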

Would that work, do you think? Could this be the best long-term way to handle timeline-linked fades and global control?

-S

For actions you can use markers; there is a node that retrieves that info. Blender keyframed properties are not exported by Armory, so to use them as object properties you will need to implement that yourself. Object properties are set in the Armory object properties panel.

Thanks for the suggestion, but markers don’t really fit my use case:

  • No nuance: they’re discrete triggers, not continuous curves (even with eased timing it feels robotic).
  • Different workflow: most animators expect to shape fades as curves inside Actions.
  • Extra management: you end up wiring many one-off hooks instead of a single, cohesive pipeline.

What I need: Actions must be the drivers.

My previous attempt failed because Armory only carries over object / bone / shape-key (and some UV) animation. I still want to combine meshes later (keep per-pixel index sorting + individual effects), so here’s the new plan:

My Plan so far:

  • Per-region masks: one mask per object/region (packed ARGB = 4 masks per map) so we stay GPU-friendly and compatible with combined geo.
  • One hidden control empty per region: exposes up to 9 effect channels (3×pos, 3×rot, 3×scale) because those transform channels do export in Actions.
  • Blender preview: empties can directly drive the shader in the editor for lookdev.
  • Animator-friendly: animators key custom properties on the empty (or just its transforms via Actions); empties are there but can be ignored during blocking.
  • Runtime: a small trait reads the empty’s animated channels and writes uniform values to the shader; masks gate the effects per region on the combined mesh (see the sketch just after this list).
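For the runtime bullet, this is the kind of trait I have in mind: one instance per control empty, copying its animated channels into named uniform values each frame. Sketch only: the channel naming is a placeholder, rotation is left out because it would need converting from the quaternion, and the uniform hook is the same assumption as before.

```haxe
package arm;

// Sketch: one instance per control empty. Each frame the empty's animated
// transform channels are copied into a shared map, and a single float link
// serves them to any material parameter named "<EmptyName>_<channel>".
class RegionChannels extends iron.Trait {

    static var channels: Map<String, Float> = new Map();
    static var linkRegistered = false;

    public function new() {
        super();

        notifyOnInit(function() {
            if (!linkRegistered) {
                linkRegistered = true;
                if (iron.object.Uniforms.externalFloatLinks == null) {
                    iron.object.Uniforms.externalFloatLinks = [];
                }
                iron.object.Uniforms.externalFloatLinks.push(function(o, mat, link): Null<kha.FastFloat> {
                    return channels.exists(link) ? channels.get(link) : null;
                });
            }
        });

        notifyOnUpdate(function() {
            var n = object.name;
            var t = object.transform;
            // Location and scale channels (rotation omitted in this sketch).
            channels.set(n + "_locx", t.loc.x);
            channels.set(n + "_locy", t.loc.y);
            channels.set(n + "_locz", t.loc.z);
            channels.set(n + "_sclx", t.scale.x);
            channels.set(n + "_scly", t.scale.y);
            channels.set(n + "_sclz", t.scale.z);
        });
    }
}
```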

Future considerations:

  • One-click setup (auto-generate masks, empties, and wiring when a toggle is enabled).
  • Depth/ordering bake so merging geometry doesn’t break layer order—goal is a sizable perf win once combined.

So far this looks to be most in line with my needs and what is possible in Armory.

-S

Got it working — here’s what fixed it:

The core issue was that my shader uniform wasn’t actually being exported from Blender. In Armory, a Shader Data node only works at runtime if the value is marked as a Parameter.

@lapiznegro

You did mention this but I wasn’t sure exactly where that was until now.

It’s in the Shader Editor: in the right-hand panel under ‘Armory’/‘Armory Material Node’ there is a ‘Parameter’ tick box. That’s what I was missing.

Once I ticked the “Parameter” box on my fade_test (and fade_test_preview) Value nodes, the engine finally exposed those uniforms to Haxe (and, I would assume, to Logic Nodes too).

From there, I used a custom trait (ShaderActions.hx) that updates the fade_test parameter each frame based on a controller object’s X position (or wiggles if none found). It also forces fade_test_preview = 0 at runtime, so Blender’s preview driver doesn’t fight the live value. But while I’m working on the animation I see the same motion in the viewport because the empty is also driving that shader value.
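For anyone following along, the trait boils down to roughly this. It’s a simplified sketch, not the literal file: ‘FadeController’ is a placeholder for whatever the controller object is called, and the uniform hook is one possible way of plumbing the value in, so adjust for your Armory version.

```haxe
package arm;

// Simplified sketch of ShaderActions: drives fade_test each frame from a
// controller object's X position (or a sine wiggle if none is found) and
// forces fade_test_preview to 0 so the Blender preview driver can't fight it.
class ShaderActions extends iron.Trait {

    var controller: iron.object.Object = null;
    var fade = 0.0;
    var elapsed = 0.0;

    public function new() {
        super();

        notifyOnInit(function() {
            // "FadeController" is a placeholder object name.
            controller = iron.Scene.active.getChild("FadeController");

            if (iron.object.Uniforms.externalFloatLinks == null) {
                iron.object.Uniforms.externalFloatLinks = [];
            }
            iron.object.Uniforms.externalFloatLinks.push(function(o, mat, link): Null<kha.FastFloat> {
                if (link == "fade_test") return fade;
                if (link == "fade_test_preview") return 0.0; // silence the preview value at runtime
                return null;
            });
        });

        notifyOnUpdate(function() {
            elapsed += iron.system.Time.delta;
            if (controller != null) {
                // Map the controller's X position into a 0..1 fade.
                var x = controller.transform.loc.x;
                fade = Math.max(0.0, Math.min(1.0, x));
            }
            else {
                // No controller found: wiggle so it's obvious the trait is running.
                fade = 0.5 + 0.5 * Math.sin(elapsed * 2.0);
            }
        });
    }
}
```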

The key setup steps:

  1. Two Value nodes:
  • Titles = ‘fade_test’ and ‘fade_test_preview’
  • Both checked as Parameter
  • Hooked into the material alpha (e.g. max(fade_test, fade_test_preview) * image alpha → Principled Alpha)
  2. Material set to Alpha Blend.
  3. Trait attached to root: arm.ShaderActions (file in Sources/arm/ShaderActions.hx).
  4. Clean + rebuild the project.

Should it be a scene-level trait instead? Open to advice.

Once “Parameter” was ticked and the trait reattached under the correct name/path, the fade looked exactly the same at runtime as it did in Blender.

I haven’t wired the custom properties to drive the empty yet, but I assume that will work fine since it’s all Blender-side stuff.

So I guess we can close this, as it’s pretty plain to me that there is currently no way other than this method to animate alpha in Armory.

You need to have an empty (or any other object) whose transforms are mapped to drive the values on the object.

-S

Is there a better way for Nodes than what I’m attaching?

AnimationAlpha.blend (1.2 MB)

It’s fairly similar to what I did in the Haxe script 🙂

The only further optimisation I can think of would be to omit actions that don’t have active movement etc., but that’s getting into details, and I’m not sure if that’s possible. I kind of want to script that in Haxe at some point, but I’m sure someone would love having that in nodes.

HOWEVER… I did end up using shader nodes, in the sense that I control them via Haxe.

Also, just to update: I went with bone-driven actions in the end. It’s far easier to animate when everything is within the rig of the thing that needs animating, after all, and it’s still nicely transferable between this and other games and software.

-S
