320,589 Lines · 702 Tools · Living NPCs · Self-Repairing · UE 5.7

The First AI That
Builds Your Game
Directly Inside Unreal.

Not a chatbot. Not code autocomplete. Not a wrapper around GPT.
A real engine plugin with 320,589 lines of production code. 702 tools. NPCs that see, hear, remember, and speak. Self-repairing codebase. Actually ships.

320K+
Lines of Code
702
Editor Tools
12
NPC Brain Systems
646
Source Files

What RiftbornAI Actually Does

702 tools organized into 6 core systems.

⚔️
Game Systems
GAS, AI, VFX, Materials, Audio
⚔️

Gameplay Abilities

Full GAS integration — abilities, effects, cooldowns, costs, targeting

🧠

AI Behavior Trees

Complete BTs with EQS queries, services, decorators, blackboards

✨

VFX & Niagara

Particle systems from descriptions — fire, smoke, magic, impacts

🎨

Materials & Shaders

Material graphs with nodes, parameters, instances

🔊

Audio Systems

Sound cues, attenuation, 17+ audio presets

🏗️

Level Design

Rooms, lighting, actor placement, layouts

📜
Code Generation
C++, Blueprints, Hot Reload
📜

C++ Generation

Actor, Character, Component classes with proper UE macros

🔗

Blueprint Wiring

Add nodes, connect pins, create functions, wire systems

🔄

Blueprint → C++

Convert any BP to native C++ with proper macros

🔥

Hot Reload

Write → Compile → Reload → Test. Zero manual steps.

🖼️

UI & Widgets

HUD layouts, widget blueprints, UMG integration

🔍

Code Indexer

Parses C++ and BPs for semantic understanding

🧬
Living NPCs
12 cognitive systems, voice chat, memory
🧬

Cognitive Architecture

12 systems: goals, memory, curiosity, self-awareness

🎤

Voice Chat

Whisper STT + XTTS TTS. Talk to NPCs with your voice.

👀

Perception

Real sight (raycasts, FOV) and hearing (sound events)

🧠

Self-Model

NPCs know their own skills, weaknesses, limits

🎯

Goal Hierarchies

Days → Hours → Minutes → Seconds decomposition

⚔️

Resource Competition

NPCs compete for scarce resources. Real selection pressure.

🛡️
Safety & Reliability
Self-repair, sandboxing, atomic ops
🔧

Self-Repair

Detects failures, generates patches, rebuilds, validates

🛡️

Sandbox Guardian

Protected paths, blast radius limits, immutable core

↩️

Atomic Transactions

All-or-nothing commits. Full undo/rollback.

👁️

Dry-Run Mode

Preview before executing. Approve or cancel.

✅

Verification

Compile, link, syntax, BP, asset, runtime tests

🩺

Smart Debugger

Detects issues, explains causes, suggests fixes

🎮
Automated Testing
AI playtesting, bug detection
🎮

AI Playtester

Plays as Newbie, Hardcore, QA Tester, Speedrunner

🐛

Bug Detection

Finds bugs, balance issues, UX problems automatically

See It Work

One prompt. Complete system. No bullshit.

You Type

RiftbornAI — Terminal
riftborn generate
# Describe your system:
"Create a fireball ability with 50 damage,
3 second cooldown, spawns a projectile
that explodes on impact with fire VFX"

RiftbornAI Creates

Output — 5 assets generated
✓ GA_Fireball.uasset
Gameplay Ability with targeting
✓ GE_Fireball_Damage.uasset
50 damage, fire type
✓ GE_Fireball_Cooldown.uasset
3.0s duration
✓ BP_FireballProjectile.uasset
Actor with collision, movement
✓ NS_FireExplosion.uasset
Niagara VFX system
✓ All systems wired
Ready to use in-editor
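
Those files are Blueprint assets, but in native terms the generated ability corresponds roughly to a UGameplayAbility subclass. The sketch below is illustrative only, assuming a standard GAS setup; UGA_Fireball and ProjectileClass are our names for this example, not RiftbornAI's generated code.

GA_Fireball.h (illustrative sketch)
#include "Abilities/GameplayAbility.h"
#include "GA_Fireball.generated.h"

UCLASS()
class UGA_Fireball : public UGameplayAbility
{
    GENERATED_BODY()

public:
    // The BP_FireballProjectile equivalent, set in class defaults
    UPROPERTY(EditDefaultsOnly, Category="Fireball")
    TSubclassOf<AActor> ProjectileClass;

protected:
    virtual void ActivateAbility(
        const FGameplayAbilitySpecHandle Handle,
        const FGameplayAbilityActorInfo* ActorInfo,
        const FGameplayAbilityActivationInfo ActivationInfo,
        const FGameplayEventData* TriggerEventData) override
    {
        // Pay costs and start the 3.0 s cooldown (GE_Fireball_Cooldown)
        if (!CommitAbility(Handle, ActorInfo, ActivationInfo))
        {
            EndAbility(Handle, ActorInfo, ActivationInfo, true, true);
            return;
        }

        // Spawn the projectile; on impact it applies GE_Fireball_Damage
        // (50 damage, fire type) and triggers NS_FireExplosion
        if (AActor* Avatar = ActorInfo->AvatarActor.Get())
        {
            Avatar->GetWorld()->SpawnActor<AActor>(
                ProjectileClass,
                Avatar->GetActorLocation(),
                Avatar->GetActorRotation());
        }

        EndAbility(Handle, ActorInfo, ActivationInfo, true, false);
    }
};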

What We Actually Built

Not marketing. Real production code. Here's a sample.

ArenaHealthComponent.h
// Real code from our 39,755 lines of C++
#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "ArenaHealthComponent.generated.h"

// Dynamic delegate types must be declared at global scope, outside the UCLASS
DECLARE_DYNAMIC_MULTICAST_DELEGATE_TwoParams(
    FOnHealthChanged, float, NewHealth, float, Delta);

UCLASS(ClassGroup=(Custom), meta=(BlueprintSpawnableComponent))
class ARENA_API UArenaHealthComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    float MaxHealth = 100.0f;

    UPROPERTY(ReplicatedUsing=OnRep_CurrentHealth)
    float CurrentHealth = 0.0f;

    UFUNCTION(BlueprintCallable, Category="Health")
    void ApplyDamage(float DamageAmount, AActor* DamageCauser);

    UPROPERTY(BlueprintAssignable)
    FOnHealthChanged OnHealthChanged;

    // Required for any replicated property
    virtual void GetLifetimeReplicatedProps(
        TArray<FLifetimeProperty>& OutLifetimeProps) const override;

protected:
    // Called on clients whenever CurrentHealth replicates
    UFUNCTION()
    void OnRep_CurrentHealth();
};
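
For context, here's how a gameplay class might consume that delegate. This is a minimal sketch with illustrative names (AMyCharacter and HandleHealthChanged are not part of the code above), and the handler must itself be a UFUNCTION for AddDynamic to bind:

Example usage (illustrative)
// Bind to the health component's delegate from an owning character
void AMyCharacter::BeginPlay()
{
    Super::BeginPlay();

    if (UArenaHealthComponent* Health =
            FindComponentByClass<UArenaHealthComponent>())
    {
        Health->OnHealthChanged.AddDynamic(
            this, &AMyCharacter::HandleHealthChanged);
    }
}

// Declared as a UFUNCTION() in AMyCharacter's header
void AMyCharacter::HandleHealthChanged(float NewHealth, float Delta)
{
    // React to damage or healing, e.g. update the HUD
    UE_LOG(LogTemp, Log, TEXT("Health: %.1f (%+.1f)"), NewHealth, Delta);
}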

The NPC Brain: 12 Cognitive Systems

Not scripted behavior trees. Not GPT wrappers. Real cognitive architecture.

🧠

Perception → Decision → Action

NPCs see (raycasts, FOV), hear (sound events), and build a world model. Real sensory input drives real decisions.
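
The sight model described here boils down to two tests: is the target inside the vision cone, and is the line of sight unobstructed? A minimal sketch of that check follows; CanSee, SightFOVDegrees, and SightRadius are hypothetical names for this example, not RiftbornAI's API.

Sight check (illustrative sketch)
#include "GameFramework/Actor.h"
#include "Engine/World.h"

bool CanSee(const AActor* Observer, const AActor* Target,
            float SightFOVDegrees, float SightRadius)
{
    const FVector ToTarget =
        Target->GetActorLocation() - Observer->GetActorLocation();
    if (ToTarget.Size() > SightRadius)
    {
        return false; // out of range
    }

    // FOV test: compare the angle to the target against half the cone
    const float CosHalfFOV =
        FMath::Cos(FMath::DegreesToRadians(SightFOVDegrees * 0.5f));
    if (FVector::DotProduct(Observer->GetActorForwardVector(),
                            ToTarget.GetSafeNormal()) < CosHalfFOV)
    {
        return false; // outside the vision cone
    }

    // Raycast: is anything blocking the line of sight?
    FHitResult Hit;
    FCollisionQueryParams Params;
    Params.AddIgnoredActor(Observer);
    const bool bHit = Observer->GetWorld()->LineTraceSingleByChannel(
        Hit, Observer->GetActorLocation(), Target->GetActorLocation(),
        ECC_Visibility, Params);
    return !bHit || Hit.GetActor() == Target;
}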

🎯

Goals, Memory, Motivation

Long-term goals decompose into subgoals. Episodic memory persists forever. Curiosity drives exploration. Self-model enables introspection.
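
Decomposition here means a long-horizon goal recursively broken into shorter-horizon subgoals until a leaf is directly actionable. A minimal sketch of that structure, with entirely hypothetical types (FNpcGoal and NextAction are our names, not RiftbornAI's):

Goal decomposition (illustrative sketch)
#include "CoreMinimal.h"

struct FNpcGoal
{
    FString Description;        // e.g. "Stockpile food for winter"
    double  HorizonSeconds;     // days -> hours -> minutes -> seconds
    TArray<FNpcGoal> Subgoals;  // empty means directly actionable

    bool IsActionable() const { return Subgoals.Num() == 0; }
};

// Depth-first walk: find the next directly actionable leaf goal
const FNpcGoal* NextAction(const FNpcGoal& Root)
{
    if (Root.IsActionable())
    {
        return &Root;
    }
    for (const FNpcGoal& Sub : Root.Subgoals)
    {
        if (const FNpcGoal* Leaf = NextAction(Sub))
        {
            return Leaf;
        }
    }
    return nullptr;
}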

12 Integrated Systems

Causal world model · Intrinsic motivation · Epistemic punishment · Multi-drive system · Lossy compression · Living world · Goal hierarchy · Self-model · Adversarial competition · Continuous learning · Cognition API · Voice synthesis

💡 What "Brain" Actually Means

Not marketing fluff. 10,000 lines of Python implementing causal world modeling, intrinsic motivation (curiosity-driven behavior), epistemic punishment (learning from prediction errors), hierarchical goal decomposition, self-modeling (NPCs know their own weaknesses), and adversarial resource competition. NPCs that actually think.
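
To make one of those concrete: "epistemic punishment" as described means an NPC's confidence in a belief drops in proportion to how badly that belief predicted reality. A minimal sketch of such an update rule, with entirely hypothetical names (FBelief and PunishPredictionError are ours):

Prediction-error update (illustrative sketch)
#include "CoreMinimal.h"

struct FBelief
{
    float Predicted;    // what the NPC expected to observe
    float Confidence;   // how much it trusts this belief, in [0, 1]
};

void PunishPredictionError(FBelief& Belief, float Observed, float LearningRate)
{
    // Larger surprise, larger confidence penalty
    const float Error = FMath::Abs(Observed - Belief.Predicted);
    Belief.Confidence =
        FMath::Clamp(Belief.Confidence - LearningRate * Error, 0.0f, 1.0f);

    // Nudge the prediction toward what was actually observed
    Belief.Predicted = FMath::Lerp(Belief.Predicted, Observed, LearningRate);
}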

Before vs After

Before RiftbornAI
  • Hours setting up GAS for one ability
  • Days building behavior trees
  • Weeks learning Niagara
  • Fighting with Blueprint spaghetti
  • Copying boilerplate C++ classes
  • Manually wiring everything
With RiftbornAI
  • "Create fireball ability" → Done
  • "Enemy patrols and chases" → Done
  • "Fire explosion with sparks" → Done
  • Clean, generated Blueprints
  • Proper C++ with UE macros
  • Everything wired automatically

Simple. Fair. No Per-Seat BS.

Indie

$99/year
  • All 702 tools
  • NPC cognitive systems
  • Claude API (brain learning)
  • 1 seat
  • Community support
Join Waitlist

Enterprise

Custom
  • Everything in Indie
  • Unlimited seats
  • Custom integrations
  • Dedicated support
  • SLA guarantee
Contact Us

Common Questions

Is this actually real?

Yes. 320,589 lines of production code — 228K C++ in the plugin, 92K Python for the brain and bridge. 702 tools. 472 C++ files, 174 Python modules. It compiles and runs in UE 5.7 right now.

When does it launch?

Q1 2026. We're not rushing this. The plugin works — we're polishing the experience.

What LLMs does it support?

Claude (recommended), which enables cloud brain learning so costs decrease over time. Ollama works for fully local operation, but without the learning benefits.

Does it require internet?

With Claude: Only natural language prompts are sent — never your source code, assets, or project files. With Ollama: 100% local, nothing leaves your machine.

What about my IP and code security?

Your IP stays yours. No code leaves your machine. No project files shared. AES-256 encryption on all communications. Everything generated belongs to you — no royalties, no attribution required.

What does the "Brain" actually do?

Twelve cognitive systems implemented in Python: causal world modeling, intrinsic motivation, goal hierarchies, self-modeling, adversarial competition, and continuous learning. NPCs see, hear, remember, compete, and adapt. Not scripted; emergent behavior.

Can NPCs actually talk?

Yes. Whisper for speech-to-text, XTTS v2 for neural text-to-speech. Push-to-talk with your mic. NPCs respond with unique voices. Memory persists forever — they remember every conversation.

Get Early Access

Join the waitlist. No spam — just the launch announcement.

Q1 2026 · No credit card required