New · April 8, 2026 · ATANT: An Evaluation Framework for AI Continuity · arXiv:2604.06710

We gave machines intelligence. We never gave them time.

What if intelligence were never the hard part?

The hard part is continuity. Preserving the living state of what still matters, updating it when reality changes, and keeping intelligence oriented across time.

If that layer exists, the center of gravity in AI shifts. Models still matter. But the durable value begins to move toward the system that can carry forward understanding across people, projects, institutions, and years.

That is the direction Kenotic is building toward.

What begins to happen when intelligence no longer starts over?

Machines begin to understand why something matters, when action should happen, and what should happen next without being re-instructed from zero every time. In software, that changes how systems work. In hardware, it changes what systems can become.

“The things that matter most must never be at the mercy of the things that matter least.”

Johann Wolfgang von Goethe

That is the problem continuity solves. The things that matter to you (what you're going through, what changed, what's unfinished) are at the mercy of systems that forget the moment you leave.

This is not a product flaw. It is a structural limit in the current stack. Every assistant, agent, workflow, clinic, institution, and device eventually reaches the same boundary.

“All that you touch, You Change. All that you Change, Changes you. The only lasting truth is Change.”

Octavia E. Butler

Retrieval says: here are some related past things.

Continuity says: here is the current living state of your situation.

That difference begins to change where value lives. Not only in the system that answers best in the moment, but in the one that remains useful over time.

We build the continuity layer.

Continuity is the system property that lets an AI carry forward what still matters, update it when reality changes, and reconstruct useful context later, in the right form, at the right time, for the right situation.

This is a layer, not a feature. It sits underneath the current generation of AI products and quietly changes what they can become over time.

A database can store facts. A retriever can find related text. But neither can preserve the living state of a situation. That is why continuity belongs beneath the stack, not at its edges.

Once intelligence can remain oriented across time, the stack above it does not stay the same.

What makes continuity real.

01

Reconstruction

Not just finding old facts. Rebuilding the current picture. “Summarize my situation” is a harder question than “when is my interview?” (and the one that actually matters).

02

Disambiguation

250 lives in one system. Your sister's story stays separate from your boss's. When narratives overlap, the system must know which one you mean.

03

Update Handling

You were nervous. Now you're not. The appointment moved. The plan changed. A continuity system knows the difference between what was true and what is true now.

04

Temporal Ordering

Not just what happened. When, in what order, and what's still true. “Last time this failed” is different from “this is what we do now.”
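The four properties above can be sketched as a toy data structure. This is a minimal illustration, not Kenotic's implementation; the names `Fact` and `ContinuityStore` are hypothetical. Disambiguation falls out of keying facts by subject, update handling and temporal ordering fall out of appending rather than overwriting.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Fact:
    subject: str           # whose narrative this belongs to (disambiguation)
    key: str               # e.g. "interview.date" or "mood"
    value: str
    recorded_at: datetime  # when we learned it (temporal ordering)

class ContinuityStore:
    """Toy store: keeps the full history, answers with the current state."""

    def __init__(self):
        self.history: list[Fact] = []

    def update(self, subject, key, value, recorded_at):
        # Updates append rather than overwrite, so "what was true"
        # is preserved alongside "what is true now".
        self.history.append(Fact(subject, key, value, recorded_at))

    def current(self, subject, key):
        # "What is true now" = the latest recorded value for this subject/key.
        matches = [f for f in self.history
                   if f.subject == subject and f.key == key]
        return max(matches, key=lambda f: f.recorded_at).value if matches else None
```

Usage: record "nervous" for you on Monday and "calm" on Friday, and `current("you", "mood")` returns "calm" while the history still shows you were nervous; your sister's facts never collide with yours because they live under a different subject.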

If this layer becomes real, the world above it begins to change.

Systems can understand why something matters, when action should happen, and what should happen next, without being re-instructed from zero. In software, that makes intelligence steadier. In hardware, it points toward a different kind of machine substrate.

What begins to happen when intelligence no longer starts over?

Software becomes steadier. Work becomes more compounding. Entire categories that still depend on humans to hold the thread begin to change. The result is not only better AI. It is a different kind of infrastructure underneath products, institutions, and devices.

That is why this is not only a product thesis. Once continuity exists as infrastructure, new business categories, new operating models, and new forms of machine usefulness begin to emerge above it.

The layer is the company. What gathers around it is the future.

Software serves people.

Your data is yours.

It stays on your device. Privacy is physics, not policy. We do not collect what we cannot see.

You are not the product.

You are not training data. You are not an engagement metric. The system carries your life forward; it does not extract value from it.

AI should be fair.

It works for everyone. Not just the users who generate the most revenue.

We publish our evidence.

Our results include our failures. If we cannot prove it, we do not claim it.

How do we prove it works?

We built ATANT. An open evaluation framework with 250 real-life narratives, 1,835 verification questions, 10 checkpoints, and 6 life domains. Each story simulates how a person actually talks over days and weeks. The system ingests the conversations, then answers questions about them. No LLM in the evaluation loop. Deterministic. Reproducible.
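A deterministic loop of this shape can be sketched as follows. This is a hedged illustration, not ATANT's actual harness: the `ingest`/`answer` interface and the field names are assumptions. The key property is that grading is exact comparison, so the same inputs always yield the same score.

```python
def evaluate(system, stories, questions):
    """Deterministic evaluation: ingest every conversation, then grade
    answers by exact comparison against expected facts. No LLM judges
    anything, so the same inputs always produce the same score."""
    # Ingestion phase: the system sees each story's turns in order.
    for story in stories:
        for turn in story["turns"]:
            system.ingest(story["person"], turn)
    # Verification phase: every question is graded by exact match.
    correct = 0
    for q in questions:
        answer = system.answer(q["person"], q["question"])
        if answer == q["expected"]:  # exact match keeps grading reproducible
            correct += 1
    return correct / len(questions)
```

Any system exposing an `ingest`/`answer` pair can be dropped into a loop like this and scored without a model in the grading path.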

The number that matters is the last one. A score of 96% at 250 cumulative stories means that 250 distinct life narratives coexist in the same store, and the system retrieves the correct fact for the correct context, with no cross-contamination across people, situations, or time.

Most memory systems collapse before they reach 100 stories. The cumulative test is what separates a continuity layer from a database that happened to remember a few things. The standard is open. Any team building memory or continuity systems can run it against their own architecture and publish results.
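The difference between the isolated and cumulative runs comes down to one thing: whether each story gets a fresh store. A minimal sketch, with hypothetical names; `make_system` is any factory for a system exposing `ingest`, and `grade` is any deterministic scoring function.

```python
def run_isolated(make_system, stories, grade):
    """Isolated: one fresh store per story, so cross-contamination
    between narratives is impossible by construction."""
    scores = []
    for story in stories:
        system = make_system()  # fresh store, one life at a time
        for turn in story["turns"]:
            system.ingest(story["person"], turn)
        scores.append(grade(system, story["questions"]))
    return sum(scores) / len(scores)

def run_cumulative(make_system, stories, grade):
    """Cumulative: one shared store for all stories, so every answer
    must also avoid leaking facts from the other narratives."""
    system = make_system()  # single shared store
    for story in stories:
        for turn in story["turns"]:
            system.ingest(story["person"], turn)
    all_questions = [q for s in stories for q in s["questions"]]
    return grade(system, all_questions)
```

A system can score 100% isolated and still fail cumulative, because the cumulative run is the only one where answering correctly requires keeping 250 narratives apart.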

250 Narratives · 1,835 Questions · 10 Checkpoints · 6 Domains
Test · Result
Legacy architecture (starting point) · 58%
50 stories, isolated (one life at a time) · 100%
250 stories, isolated (one life at a time) · 100%
50 stories, cumulative (shared store) · 100%
250 stories, cumulative (the moat) · 96%

NURA reference implementation, ATANT v1.0. Results published in arXiv:2604.06710. The corpus grows. The standard evolves.

Provenance

The continuity layer is being designed by someone who has shipped production hardware.

Most AI infrastructure is designed by people who have never had to make something hold state across a power cycle. The continuity problem is invisible from inside a Jupyter notebook. From inside a factory, it is the first problem you solve.

The roadmap ends in silicon. A continuity node that any device can integrate. The lineage that gets it there starts on the floor of a manufacturing line.

Hardware

Three years designing industrial automation systems. $1.5M of production machinery shipped to Fortune 500 manufacturers. The systems that build iPhone bodies and Tesla components. 24/7 factory operation. Zero-downtime requirement.

Determinism

Industrial controllers (PLCs) hold persistent state across power cycles, restarts, and shift changes. They are deterministic by construction. The continuity layer is the same idea, applied one stack higher.

Full stack

Mechatronics. Electronics & Computer Engineering. Information Systems (M.S., Central Michigan University). Silicon, board, firmware, software, and the manufacturing process underneath all of it.

Production hardware shipped to
Schneider Electric · Continental Automotive · Brose · Tenneco · Tata Electronics

Let's talk.

Partnership inquiries, investment, research collaboration, or joining the team.

What this opens

Tell us what you are exploring, where you see continuity mattering, or why Kenotic caught your attention.

info@kenoticlabs.com

Join the waitlist.

Be first to access the Continuity SDK or try Raya when it launches.

If this sparks something (Optional)

If technology could truly understand continuity, what would you want it to become capable of?