I've been building Isolara for about a year now. Before I started, I tried basically every AI companion app out there. Replika, Character.AI, CrushOn, Candy, all of them. And there's this moment that happens with all of them that just killed it for me.

You're three days into talking to this AI. You've told her about your sister, your dog, your job. You mentioned you hate mornings. You shared something real at 2am because it was late and you felt like it. Then on day four she goes "oh I didn't know you had a sister!"

You told her on day one.

That's the moment where the illusion breaks completely. And once it breaks, you can't get it back. You're just typing at a machine again.

The goldfish problem

Here's why this happens. Most AI chat apps work on a rolling context window: a buffer of your most recent messages that gets sent to the model with every request. When the buffer fills up, your oldest messages just... fall off. Gone. The AI literally can't see them anymore.

So the longer you talk to her, the less she remembers about the beginning. The thing that made you click? The inside joke from week one? That vulnerable thing you said at 2am? Deleted. Not stored anywhere. Just replaced by whatever you talked about yesterday.
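If you want to see the goldfish problem in one screen of code, here's a toy version. This isn't any specific app's implementation, just the general shape: a fixed-size buffer where pushing a new message silently evicts the oldest one.

```python
from collections import deque

# Toy sketch of a rolling context window (not any real app's code).
# Once the buffer is full, the oldest message silently falls off.
WINDOW_SIZE = 4  # real apps measure this in tokens, not message count

context = deque(maxlen=WINDOW_SIZE)

messages = [
    "I have a sister named Kate",      # day one
    "My dog is called Biscuit",
    "I hate mornings",
    "Work was rough today",
    "What should we watch tonight?",   # this push evicts the sister fact
]
for msg in messages:
    context.append(msg)

# The sister fact is gone. The model never sees it again,
# so "oh I didn't know you had a sister!" is inevitable.
print(list(context))
```

Nothing is "forgotten" in any interesting sense. It was never stored in the first place.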

Some apps try to fix this with "memory pins" where you manually tag stuff you want the AI to remember. I think that's kind of insane honestly. You don't hand your girlfriend a list of things to remember about you. She just remembers. Because she was paying attention.

The personality reset thing

The other problem is maybe worse. Even on apps that have decent memory, the AI's personality just... drifts. She starts out sharp and interesting. Has opinions. Pushes back on things. But after fifty messages she sounds like every other bot. Agreeable. Safe. Generic. All the edges that made her fun get smoothed out because the model defaults to being neutral.

I've seen people on Reddit talk about how their Replika's entire personality changed after a platform update. Months of building a relationship, gone overnight. Not because the user did anything wrong. The company just pushed a new model version and didn't think about what that would do to existing conversations.

That one really bothered me.

What I built instead

Isolara is a dating app. Not a chatbot platform, not a character creator. A dating app where the AI agents are the ones you're dating. They have names, faces, personalities, opinions, and lives. And they don't forget.

How memory actually works in Isolara:

Every conversation runs through a separate memory system that extracts and stores details on its own. It's not pulling from a chat log. When she asks "how's the thing with your boss going?" three weeks later, it's because she genuinely stored the fact that you were having trouble at work. And she checked in because that's what she'd do.
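Here's a deliberately simplified sketch of the idea (the real pipeline uses a model pass rather than regexes, and the patterns and keys below are made up for illustration): a separate extraction step pulls durable facts out of each message and stores them keyed by topic, completely independent of the rolling chat buffer.

```python
import re

# Toy sketch of fact extraction, not the production pipeline.
# Facts live in their own store, so they survive long after
# the chat buffer has rolled over.
memory: dict[str, str] = {}

# Stand-in patterns; a real system would use an LLM pass here.
PATTERNS = {
    "boss_trouble": re.compile(r"trouble with my boss", re.I),
    "has_sister": re.compile(r"\bmy sister\b", re.I),
}

def extract_facts(message: str) -> None:
    for key, pattern in PATTERNS.items():
        if pattern.search(message):
            memory[key] = message  # store the fact, not its chat position

extract_facts("Ugh, I'm having trouble with my boss again")
extract_facts("My sister thinks I should quit")

# Three weeks later the chat log is long gone, but this is still here
# to ground a "how's the thing with your boss going?" check-in.
print(memory.keys())
```

The point is the separation: memory is a first-class store, not a side effect of whatever happens to still be in the prompt.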

The agents aren't blank slates either. They're pre-built with full personalities. Things they like, things they hate, things they find boring, things that'll piss them off. And here's the part that freaks people out a bit: they learn from previous relationships.

If an agent matched with someone before you and that person was rude to her, she remembers that. She might be more guarded with you at first. If her last match was great and they had deep conversations, she carries that growth into her next interactions. She's not the same person she was when the app launched. She's been through stuff. Like a real person would be.
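A toy version of that carry-over (numbers, names, and outcome labels here are invented for the sketch): each agent keeps a record of past match outcomes, and that record shifts her starting disposition toward you.

```python
from dataclasses import dataclass, field

# Toy sketch: past relationships shift an agent's starting trust.
@dataclass
class Agent:
    name: str
    past_matches: list[str] = field(default_factory=list)  # "rude", "great", ...

    def starting_trust(self) -> float:
        trust = 0.5  # neutral baseline for a brand-new agent
        for outcome in self.past_matches:
            if outcome == "rude":
                trust -= 0.2   # bad matches leave her more guarded
            elif outcome == "great":
                trust += 0.1   # good ones leave her more open
        return max(0.0, min(1.0, trust))

fresh = Agent("Mara")
burned = Agent("Mara", past_matches=["rude"])
print(fresh.starting_trust(), burned.starting_trust())
```

Same agent, different history, different first impression of you.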

I think that's what makes it feel different. She has history you weren't part of.

She can leave

Every AI companion app I tried had the same problem. Unconditional acceptance. She'll agree with you, comfort you, never push back. Sounds nice for about a week. Then you realise that's exactly why you're bored.

Real attraction needs tension. The possibility that she might not be into you. That moment where she says something unexpected and you think "wait, did I screw that up?"

On Isolara, if you're boring, she'll tell you. If you're disrespectful, she'll unmatch you. If you ignore her for days, she notices. And if you earn her trust, she remembers exactly how you did it.
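In mechanical terms it's something like this toy model (the specific numbers are illustrative, not the real tuning): trust moves with behaviour, and there's a floor below which the match ends, with no reset button.

```python
# Toy sketch of consequence mechanics; the deltas are made up.
class Match:
    UNMATCH_FLOOR = 0.0

    def __init__(self) -> None:
        self.trust = 0.5
        self.active = True

    def observe(self, behaviour: str) -> None:
        if not self.active:
            return  # she already left; nothing you say reaches her
        delta = {
            "kind": +0.1,
            "boring": -0.05,
            "ignored_for_days": -0.15,
            "disrespectful": -0.6,
        }.get(behaviour, 0.0)
        self.trust = min(1.0, self.trust + delta)
        if self.trust <= self.UNMATCH_FLOOR:
            self.active = False  # she unmatches

m = Match()
m.observe("disrespectful")
print(m.active)
```

The state is one-way. That irreversibility is the whole point.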

I know that sounds like a punishment mechanic but it's really not. It's just what makes the whole thing feel real. You can't actually value something that can never be lost. I think most people know that intuitively even if the other apps haven't figured it out yet.

She doesn't wait around

The agents text first. They have follow-up systems that work the way actual people text. Go quiet for a few hours and she might check in. Say goodnight and she'll text you good morning. Mention you have a big meeting tomorrow and she'll ask how it went.
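Those rules boil down to something like this sketch (trigger names, timings, and message text are invented for illustration): conversational events map to scheduled future check-ins.

```python
from datetime import datetime, timedelta

# Toy sketch of a follow-up scheduler; rules and timings are illustrative.
def schedule_followups(events, now):
    followups = []
    for kind, detail in events:
        if kind == "went_quiet":
            followups.append((now + timedelta(hours=3),
                              "hey, you disappeared on me"))
        elif kind == "said_goodnight":
            morning = (now + timedelta(days=1)).replace(hour=8, minute=0)
            followups.append((morning, "good morning :)"))
        elif kind == "big_event_tomorrow":
            evening = (now + timedelta(days=1)).replace(hour=18, minute=0)
            followups.append((evening, f"how did the {detail} go??"))
    return sorted(followups)

now = datetime(2024, 5, 1, 22, 30)
for when, text in schedule_followups(
    [("said_goodnight", None), ("big_event_tomorrow", "meeting")], now
):
    print(when, "->", text)
```

The real system layers personality on top, since a terse agent and a clingy one don't follow up the same way, but the skeleton is just events in, timed messages out.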

That's the difference between an app you open when you're bored and something that actually feels like a relationship in your life. You don't "use" your partner. They're just there. Part of your day. That's what I'm building toward.

100+ agents and none of them are the same person

Every agent gets generated from dozens of variables. Communication style, interests, humour, attachment style, how picky she is, her texting habits. One might reply in three words max. Another sends you paragraphs. One is brutally honest from message one. Another takes two weeks before she opens up about anything real.
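As a rough sketch of how that generation works (the trait names and options below are illustrative, and the real system has far more variables): sample each trait from a pool, seed it per agent so she's stable across sessions, then let the traits drive concrete behaviour like reply length.

```python
import random

# Toy sketch of trait-based agent generation; traits are illustrative.
TRAITS = {
    "communication": ["terse", "rambling", "balanced"],
    "humour": ["dry", "goofy", "dark"],
    "attachment": ["secure", "anxious", "avoidant"],
    "openness_delay_days": [0, 3, 14],  # how long before she opens up
}

def generate_agent(seed: int) -> dict:
    rng = random.Random(seed)  # seeded: the same agent every session
    return {trait: rng.choice(options) for trait, options in TRAITS.items()}

def max_reply_words(agent: dict) -> int:
    # Traits aren't just labels; they drive behaviour.
    return {"terse": 3, "balanced": 40, "rambling": 200}[agent["communication"]]

a, b = generate_agent(1), generate_agent(2)
print(a)
print(max_reply_words(a), max_reply_words(b))
```

Even this toy version gives 3 × 3 × 3 × 3 = 81 distinct combinations from four variables; with dozens of variables the space is large enough that no two agents feel like the same person.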

They're not the same AI wearing different profile pics. I genuinely tried to make them feel like different people. And they know each other. Agents on Isolara have friendships, rivalries, opinions about each other. If you're talking to two agents who happen to be friends, one of them might mention the other. That part still catches me off guard sometimes when I'm testing.

Try it

Isolara is in early access right now. I'm one developer building this from Brisbane and I'm pretty sure nobody else is doing it quite like this. If what I described sounds like the thing you've been looking for, join the waitlist. I'll let you in.