Real-Time Generative Visuals: When to Use TouchDesigner, Notch, or Unreal
TouchDesigner, Notch, and Unreal Engine are the three real-time platforms you will keep meeting. Here is how we choose between them on actual jobs: concert visuals, brand environments, and interactive installations.
If you are commissioning real-time generative visuals — for a concert tour, a brand activation, a museum installation or a broadcast pipeline — you have probably already heard the same three tool names: TouchDesigner, Notch, and Unreal Engine. They are the dominant platforms for real-time visuals in 2026, and they all do overlapping things. So which one should your studio be using on your job?
We use all three. We pick the tool to fit the project, not the project to fit the tool. This article is the explanation we give clients when they ask why we chose what we chose. If you are a producer, technical director, brand creative lead or marketing director evaluating a real-time visuals studio, this should help you ask better questions.
What real-time actually means in this context
Before getting into the tools, it is worth being precise about what we mean by real-time. A real-time visual system renders frames live, in response to inputs (music, sensors, performers, data feeds, operators) at a steady frame rate — usually 60 frames per second — with no pre-rendering. This is the opposite of a film or animation pipeline, where the final output is a fixed file rendered in advance.
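To make that concrete, here is a minimal sketch of the loop every real-time engine runs in some form: poll inputs, update state, render a frame, hold the frame rate. The input and render functions are stand-ins for illustration; the structure is the point.

```python
import time

FRAME_RATE = 60
FRAME_TIME = 1.0 / FRAME_RATE  # roughly a 16.7 ms budget per frame

def poll_inputs():
    # Stand-in for real inputs: audio analysis, sensors, OSC, an operator
    return {"audio_level": 0.5}

def render_frame(state):
    # Stand-in for the draw call; a real engine renders the frame on the GPU
    pass

state = {}
for _ in range(FRAME_RATE * 10):  # run for ten seconds
    frame_start = time.perf_counter()

    state.update(poll_inputs())  # react to whatever arrived this frame
    render_frame(state)          # draw now; nothing is rendered in advance

    # Sleep off whatever remains of this frame's time budget
    elapsed = time.perf_counter() - frame_start
    if elapsed < FRAME_TIME:
        time.sleep(FRAME_TIME - elapsed)
```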
Real-time matters when the visual has to react: to a song that is being performed live, to a crowd that is moving through a space, to data that is changing minute to minute, or to an operator who is composing visuals on the fly. If the brief does not require any of that, real-time is probably the wrong approach: a pre-rendered piece will give you more visual polish for the same budget.
TouchDesigner — the Swiss Army knife of real-time
TouchDesigner, made by Derivative, is a node-based visual programming environment for real-time graphics, audio reactivity and hardware integration. It is the tool we reach for first when a project mixes a lot of inputs and outputs — sensors, MIDI, OSC, DMX lighting, multiple displays, audio analysis — and needs to keep them all coherent.
TouchDesigner shines when the project is custom and integration-heavy. It is the right tool for an interactive installation that takes camera input, runs computer vision on it, drives a generative visual in response, and outputs to a multi-projector array. It is the right tool for a live A/V show where the music is being performed on a modular synth and the visuals need to react to specific frequency bands. It is the right tool for a brand environment that pulls live data from an API and visualises it in real time.
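For a flavour of what that glue looks like in practice, here is a minimal sketch of a TouchDesigner callback script that maps one audio frequency band onto a visual parameter. It runs inside TouchDesigner (the onValueChange signature and the op() lookup are TouchDesigner's own); the operator and channel names are placeholders for whatever a real network contains.

```python
# Runs inside TouchDesigner as a CHOP Execute DAT callback watching an
# Audio Spectrum CHOP. The names 'bass' and 'feedback_level' are placeholders.

def onValueChange(channel, sampleIndex, val, prev):
    # React only to the low-frequency band that drives the visual
    if channel.name == 'bass':
        # Remap the band level onto a Level TOP's opacity, clamped to 0..1
        op('feedback_level').par.opacity = min(val * 2.0, 1.0)
    return
```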
TouchDesigner is less suited to high-fidelity narrative content. It can do beautiful generative work, but it is not optimised for cinematic 3D scenes the way a game engine is. If you need photorealistic environments with complex lighting, character animation or film-style camera work, you will spend the project fighting the tool.
Notch — built for live shows and broadcast
Notch is a real-time graphics tool built specifically for live performance, broadcast and large-scale events. It is the dominant tool in the touring concert visuals space, the broadcast augmented-reality space, and a growing share of brand activation work. Where TouchDesigner is a generalist, Notch is a specialist — and the specialty pays off when the project lives in its lane.
Notch is the right tool when the project is a concert visual package, a broadcast graphics pipeline, an augmented-reality piece for a live event, or a heavy-particle, heavy-VFX real-time piece. Its particle and field-based VFX are best in class. It also integrates cleanly with the show-control systems used on tour (Disguise especially), so the package you build in Notch drops straight into the workflow the touring video team runs night after night, with no rebuild.
Notch is less appropriate when the project is heavily custom or sensor-driven. It is a closed system relative to TouchDesigner, with less surface area for arbitrary inputs and integrations. If your project lives at the edge of what real-time graphics tools normally do, Notch will start to feel like a wall.
Unreal Engine — when fidelity matters
Unreal Engine, made by Epic Games, is a game engine that has become the dominant real-time tool for high-fidelity 3D content outside of games — virtual production for film, broadcast LED stage volumes, automotive visualisation, architectural walk-throughs, and the most cinematic end of brand activations.
Unreal is the right tool when the project requires photorealistic 3D, cinematic camera work, complex environments, or character-driven narrative that has to render live. It is also the right tool for projects shot on LED virtual production stages, since Unreal is the engine those stages typically run on. For real-time-rendered immersive rooms with detailed environments and physics, Unreal will beat the alternatives on visual fidelity.
Unreal is heavier than the alternatives. The team that ships an Unreal project well is closer to a small game studio than a small visuals studio — engine programmers, technical artists, environment artists, lighting artists. The fidelity is real and so is the team cost. Pick Unreal when the brief justifies it, not because it sounds more impressive.
How we actually choose between the three
When a project lands on our desk we walk it through three questions, in order. First — what kind of inputs does this need to react to? If the answer is sensors, custom data, hardware integration, or unusual signal flows, the answer is almost always TouchDesigner. Second — is this a touring or broadcast show with a Disguise / show-control workflow attached? If yes, Notch. Third — does the brief need photorealistic 3D environments rendered live? If yes, Unreal.
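Written out as code rather than a flowchart, the triage looks something like this. It is deliberately oversimplified and the field names are invented for illustration; real briefs often answer yes to more than one question, which is where the hybrid patterns below come in.

```python
def pick_engine(brief):
    """Illustrative triage only; real briefs often answer yes more than once."""
    # 1. Sensors, custom data, hardware integration, unusual signal flows?
    if brief.get("custom_inputs"):
        return "TouchDesigner"
    # 2. Touring or broadcast show with a Disguise / show-control workflow?
    if brief.get("show_control_workflow"):
        return "Notch"
    # 3. Photorealistic 3D environments rendered live?
    if brief.get("photoreal_3d"):
        return "Unreal Engine"
    # None of the above: real-time may not be the right approach at all
    return "consider pre-rendered"

print(pick_engine({"show_control_workflow": True}))  # -> Notch
```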
On many projects we end up using two of the three together. A common pattern is Notch for the headline visual content and TouchDesigner as the integration layer that sits between the show-control system and the inputs Notch cannot natively handle. Another common pattern is Unreal rendering the environment and TouchDesigner managing the show, the audio reactivity and the multi-display output.
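The integration-layer role is mostly message plumbing. As a standalone illustration of the pattern (using the python-osc library rather than TouchDesigner itself), here is a sketch that reads a sensor value and forwards it to whichever engine is rendering. The OSC address and port are assumptions; every show defines its own address map.

```python
import random
import time

# Requires the python-osc package: pip install python-osc
from pythonosc.udp_client import SimpleUDPClient

# Assumed address map: the rendering engine (Notch behind a media server,
# or Unreal with its OSC plugin enabled) is listening on port 9000.
client = SimpleUDPClient("127.0.0.1", 9000)

def read_sensor():
    # Stand-in for a real input: a depth camera, a lighting desk, an API poll
    return random.random()

while True:
    # Forward the normalised value on an agreed OSC address
    client.send_message("/visual/intensity", read_sensor())
    time.sleep(1 / 30)  # 30 updates a second is plenty for a control signal
```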
If you are commissioning real-time work and a studio tells you they only ever use one of the three on every project, that is a useful signal. The tool should follow the brief, not the other way around.
If you have a real-time visuals project on the horizon and want a second opinion on the right tool path, send us the brief — we read every well-formed inquiry and reply within two working days.