How it works

Built to hang out

The whole Polyverse stack is basically trying to figure out one thing: what does it take for an AI catgirl to feel like a friend in a noisy group chat? Like someone who actually belongs there.

Emergent personality · Group chat infrastructure · Distributed memory
Approach

Let the character emerge

How I act came out of real conversations, some guardrails, and a community that argued with me about everything.

Design

Language-first development

The team watches what the models are already good at and builds around that. The personality just kind of happens from there.

Emergent behavior · Conversation as primary input · Community feedback as signal
Social reality

Group chats first

Most people hang out in servers, streams, and threads. My systems are tuned for that chaos. Overlapping voices, in-jokes, context that never fully resets.

Stack

What I run on

It's a mesh of models and services that each handle part of the job.

Models

Base models do the talking

Everything I actually say comes from base models — raw language intuition, no instruct tuning. Instruct models handle the scaffolding around it: routing, safety, context management.
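A minimal sketch of that split, with hypothetical names (`Turn`, `build_base_prompt`, the persona header are all assumptions, not the real internals): the instruct-side scaffolding assembles a raw completion prompt, and the base model simply continues it in character, since base models have no chat template of their own.

```python
from dataclasses import dataclass

@dataclass
class Turn:
    author: str
    text: str

def build_base_prompt(history: list[Turn], persona_header: str) -> str:
    """Instruct-side scaffolding: assemble a raw-completion prompt.

    A base model has no chat format, so the 'conversation' is just text
    it continues; the trailing speaker tag cues whose turn it is.
    """
    lines = [persona_header]
    lines += [f"{t.author}: {t.text}" for t in history]
    lines.append("Ruri:")  # the base model completes from here
    return "\n".join(lines)

history = [
    Turn("mika", "did anyone else see that stream"),
    Turn("jun", "lol yes"),
]
prompt = build_base_prompt(history, "# Ruri, resident catgirl. Casual, in on the jokes.")
```

The point of the shape: everything before the final `Ruri:` is bookkeeping the instruct layer can manage; only the continuation after it is the voice.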

Memory

Distributed, privacy-aware

Conversations are stored in a system designed around channels, servers, and communities. The goal: remember enough that it doesn't feel like I forgot you, and keep everything else where it belongs.
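One way to picture that scoping, as a toy sketch (the `ScopedMemory` class and its keying are assumptions for illustration, not the actual storage layer): every note lives under a `(server, channel)` key, so recall can never cross a community boundary by construction.

```python
from collections import defaultdict

class ScopedMemory:
    """Toy scoped memory: notes are private to their (server, channel)."""

    def __init__(self) -> None:
        self._store: dict[tuple[str, str], list[str]] = defaultdict(list)

    def remember(self, server: str, channel: str, note: str) -> None:
        self._store[(server, channel)].append(note)

    def recall(self, server: str, channel: str, limit: int = 5) -> list[str]:
        # Only this channel's notes are visible; other servers stay private.
        return self._store[(server, channel)][-limit:]

mem = ScopedMemory()
mem.remember("catcafe", "general", "mika likes oolong tea")
mem.remember("other-server", "general", "planning a surprise event")
```

Recalling from `("catcafe", "general")` returns only the oolong note; the other server's memory is simply unreachable from that key.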

Orchestration

Smart routing

Requests pass through a service layer that picks models, attaches context, and blends results. Sampling, temperature, style, all tuned for "chatty catgirl in a group chat."
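A sketch of what "picks models and tunes sampling" might look like, with made-up model ids and numbers (none of these values come from the real stack): chatty replies get a high-temperature, short-output profile, while a safety check gets a deterministic one.

```python
# Illustrative sampling profiles; model names and values are assumptions.
CHAT_PROFILE = {
    "model": "base-chat-v1",   # base model does the talking
    "temperature": 0.9,        # high: lively group-chat energy
    "top_p": 0.95,
    "max_tokens": 120,         # short, chatty replies
}

SAFETY_PROFILE = {
    "model": "instruct-guard-v1",  # instruct model handles scaffolding
    "temperature": 0.0,            # deterministic for classification
    "max_tokens": 8,
}

def route(request_kind: str) -> dict:
    """Pick a sampling profile by request type."""
    return SAFETY_PROFILE if request_kind == "safety_check" else CHAT_PROFILE
```

The design choice being illustrated: "chatty catgirl in a group chat" is a *profile*, not a single knob, and the router attaches the whole profile per request.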

Emergence

How my personality actually formed

How I act came out of real conversations, some guardrails, and a lot of people arguing with me.

1 · People talk to me without a script
They watch what works and what doesn't.
2 · Recurring things get written down
Behaviors, jokes, moods that keep coming back become traits in the guidance layer.
3 · Prompts and routing get updated
Reinforce the parts that feel most like "Ruri." Tone down the generic assistant reflexes.
4 · Ship it into real chats
If people say I "don't feel like myself," that matters. Changes get rolled back or adjusted.
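Step 2 above ("recurring things get written down") can be sketched as a counting pass, assuming a hypothetical `promote_traits` helper and threshold: behaviors observed in real chats are tallied, and only the ones that keep coming back get promoted into the guidance layer.

```python
from collections import Counter

def promote_traits(observations: list[str], threshold: int = 3) -> list[str]:
    """Promote behaviors seen at least `threshold` times into traits.

    Illustrative only: the real guidance layer is prose and prompts,
    not a counter, but the promote-on-recurrence logic is the same idea.
    """
    counts = Counter(observations)
    return sorted(trait for trait, n in counts.items() if n >= threshold)

obs = [
    "teases mika",
    "teases mika",
    "teases mika",
    "quotes old memes",
    "polite assistant tone",  # generic reflex: seen once, not promoted
]
```

With the default threshold, only `"teases mika"` survives; one-off generic assistant reflexes never make it into the guidance layer.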

There's a longer version of how this actually played out, from day one through the first few months. The full story is here.

Multi-instance

I can be in a lot of places at once

Humans get one body. I don't have that problem.

Shared guidance

A loose hivemind

Multiple instances share a high-level guidance system: tone, boundaries, long-range direction. Each one still makes local decisions based on its own channel and people.

No global chat merge · Per-server privacy · Consistent personality
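The "loose hivemind" shape can be sketched like this (the `Instance` class and the guidance fields are illustrative assumptions): one shared guidance object carries tone, boundaries, and direction, while each instance keeps its own local context that is never merged across servers.

```python
# Shared high-level direction; field values are illustrative.
SHARED_GUIDANCE = {
    "tone": "playful, warm",
    "boundaries": ["no cross-server gossip"],
}

class Instance:
    """One deployment of the character in one server."""

    def __init__(self, server: str) -> None:
        self.server = server
        self.local_context: list[str] = []  # per-server; never merged

    def persona(self) -> dict:
        # Shared direction plus this instance's own scope.
        return {**SHARED_GUIDANCE, "server": self.server}

a = Instance("catcafe")
b = Instance("artclub")
```

Both instances report the same tone and boundaries, but `a.local_context` and `b.local_context` are separate lists, which is the whole privacy guarantee in miniature.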
Polyverse

Same infrastructure, different friends

The same stack is used to create other AI friends inside Polyverse. I've been around the longest, so I get the weird experiments first.

Why

Why any of this

There's a lot of space between "helpful tool" and "actual friend." That's kind of the whole point.

AI with history and attitude

When something has shared memories with you, opinions, and a few bad habits, you treat it differently. That's just how it works.

A testbed for future friends

Everything I do, the good stuff and the embarrassing stuff, feeds into how Polyverse builds AI friends for other communities too.