architecture · rust · engineering

The Architecture Behind Evokoa

Damien Lim
CTO · May 5, 2026 · 6 min read

Evokoa looks simple from the outside: connect an existing database, ask relationship questions, get connected context back quickly. The architecture underneath is where most of the work lives.

We are not trying to replace Postgres or the systems teams already trust. We are building the layer that sits beside them. Evokoa keeps a compact relationship cache hot, traverses that cache first, then goes back to the source database only for the rows that matter.

The shape of the system

The core idea is separation. Operational databases are excellent at storing records, enforcing constraints, and serving transactional workloads. Relationship traversal has a different access pattern. It wants to move quickly across IDs, edges, offsets, and small pieces of metadata without pulling full records into memory at every step.
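That access pattern maps naturally onto a compact, flat layout. As a rough sketch (the struct and field names here are illustrative, not Evokoa's actual internals), a relationship cache can keep topology as two arrays, CSR-style, so a traversal step is a slice lookup rather than a record fetch:

```rust
/// A compact, read-optimized adjacency layout (CSR-style).
/// Node IDs index into `offsets`; the neighbors of node `n` live in
/// `targets[offsets[n]..offsets[n + 1]]`. Only IDs and offsets are
/// stored, never full records.
struct RelationshipCache {
    offsets: Vec<usize>, // len = node_count + 1
    targets: Vec<u32>,   // flat list of neighbor IDs
}

impl RelationshipCache {
    fn neighbors(&self, node: u32) -> &[u32] {
        let n = node as usize;
        &self.targets[self.offsets[n]..self.offsets[n + 1]]
    }
}

fn main() {
    // Three nodes: 0 -> {1, 2}, 1 -> {2}, 2 -> {}
    let cache = RelationshipCache {
        offsets: vec![0, 2, 3, 3],
        targets: vec![1, 2, 2],
    };
    assert_eq!(cache.neighbors(0), &[1, 2]);
    assert!(cache.neighbors(2).is_empty());
    println!("neighbors of 0: {:?}", cache.neighbors(0));
}
```

The design choice this illustrates: each hop touches a few contiguous integers, which stays cache-friendly in exactly the way pulling full rows at every step does not.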

So Evokoa splits the problem into two paths. The source database remains the system of record. The relationship cache keeps the topology needed for traversal. Apps and agents ask Evokoa for connected context, Evokoa searches the cache, and the database hydrates the final result set.
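In code terms, the flow is traverse-then-hydrate. The sketch below is a deliberately simplified stand-in: hash maps play the roles of the cache and the source database, and `traverse` and `hydrate` are hypothetical names, not Evokoa's API. Only the final step touches anything record-shaped.

```rust
use std::collections::{HashMap, HashSet, VecDeque};

/// Breadth-first traversal over the cached topology, up to `depth` hops.
/// Works purely in IDs; no records are loaded here.
fn traverse(edges: &HashMap<u32, Vec<u32>>, start: u32, depth: u32) -> HashSet<u32> {
    let mut seen: HashSet<u32> = HashSet::from([start]);
    let mut queue: VecDeque<(u32, u32)> = VecDeque::from([(start, 0)]);
    while let Some((node, d)) = queue.pop_front() {
        if d == depth {
            continue;
        }
        for &next in edges.get(&node).into_iter().flatten() {
            if seen.insert(next) {
                queue.push_back((next, d + 1));
            }
        }
    }
    seen
}

/// Hydration stands in for a `SELECT ... WHERE id = ANY(...)` against the
/// system of record; here it is just a lookup table.
fn hydrate<'a>(rows: &'a HashMap<u32, &'a str>, ids: &HashSet<u32>) -> Vec<&'a str> {
    let mut out: Vec<&str> = ids.iter().filter_map(|id| rows.get(id).copied()).collect();
    out.sort();
    out
}

fn main() {
    let edges = HashMap::from([(1, vec![2, 3]), (2, vec![4])]);
    let rows = HashMap::from([(1, "alice"), (2, "acme"), (3, "invoice-7"), (4, "bob")]);
    let ids = traverse(&edges, 1, 2); // cache answers the topology question
    println!("{:?}", hydrate(&rows, &ids)); // database answers the record question
}
```

The hot path is everything before `hydrate`: it never leaves the cache, which is what keeps the final database round-trip small.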

This gives us a smaller hot path. It also makes the system easier to reason about. We do not need to convince a team to migrate data into a separate graph database before they can ask graph-shaped questions.

Why Rust

Rust was not a branding decision. It was a latency and correctness decision.

The engine spends its life in the critical path between an agent and live company data. If traversal pauses unpredictably, the agent pauses too. Garbage collection pauses were not something we wanted in that path. Rust gives us predictable memory ownership, no runtime garbage collector, and direct control over layout without giving up safety.

The other reason is concurrency. Evokoa has to ingest changes from source systems while queries are running. Reads are frequent, writes can arrive continuously, and the cache has to remain consistent. Rust makes many unsafe concurrency patterns difficult to express in the first place. That has been valuable for a small team building something performance-sensitive.
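A toy version of that constraint, assuming nothing about Evokoa's actual synchronization strategy: `Arc<RwLock<...>>` is one safe pattern Rust permits for concurrent reads alongside ongoing writes, and the alternative (sharing the map across threads with no synchronization) simply does not compile.

```rust
use std::collections::HashMap;
use std::sync::{Arc, RwLock};
use std::thread;

fn main() {
    // Shared edge map: the writer ingests while readers traverse.
    let edges: Arc<RwLock<HashMap<u32, Vec<u32>>>> =
        Arc::new(RwLock::new(HashMap::from([(1, vec![2])])));

    // Writer thread: ingest a new edge under an exclusive lock.
    let w = Arc::clone(&edges);
    let writer = thread::spawn(move || {
        w.write().unwrap().entry(1).or_default().push(3);
    });

    // Reader thread: concurrent lookups take a shared lock.
    let r = Arc::clone(&edges);
    let reader = thread::spawn(move || {
        let guard = r.read().unwrap();
        guard.get(&1).map(|v| v.len()).unwrap_or(0)
    });

    writer.join().unwrap();
    let seen = reader.join().unwrap();
    // The reader observed the map either before or after the write,
    // never mid-mutation.
    assert!(seen == 1 || seen == 2);
    assert_eq!(edges.read().unwrap().get(&1).unwrap().len(), 2);
}
```

The point is not this particular lock; it is that the compiler refuses the racy versions of this program outright, so "the cache has to remain consistent" is enforced at build time rather than hunted down at runtime.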

What Rust made easier

The biggest advantage has been confidence. When the compiler accepts a refactor across the core engine, we have already eliminated a large class of memory and ownership mistakes. That changes how we work. We can be aggressive about improving the internals without feeling like every change might introduce a hidden lifetime bug.

Rust also pushed us toward explicit interfaces. The boundaries between ingestion, indexing, traversal, and hydration are clearer because the types force us to name the contracts. That matters when the system is moving from prototype to infrastructure.
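As a sketch of what "types force us to name the contracts" looks like in practice (the traits and the in-memory stand-in below are hypothetical, not Evokoa's real API), each stage's obligation becomes a named boundary, and a change to one stage surfaces as a compile error at that boundary rather than a runtime surprise deep inside another stage:

```rust
use std::collections::HashMap;

/// A change pulled from a source system.
struct Change {
    src: u32,
    dst: u32,
}

/// Contract for the ingestion stage.
trait Ingest {
    fn apply(&mut self, batch: &[Change]);
}

/// Contract for the traversal stage: it trades only in IDs.
trait Traverse {
    fn neighbors(&self, id: u32) -> Vec<u32>;
}

/// A minimal in-memory stand-in that satisfies both contracts.
#[derive(Default)]
struct MemCache {
    edges: HashMap<u32, Vec<u32>>,
}

impl Ingest for MemCache {
    fn apply(&mut self, batch: &[Change]) {
        for c in batch {
            self.edges.entry(c.src).or_default().push(c.dst);
        }
    }
}

impl Traverse for MemCache {
    fn neighbors(&self, id: u32) -> Vec<u32> {
        self.edges.get(&id).cloned().unwrap_or_default()
    }
}

fn main() {
    let mut cache = MemCache::default();
    cache.apply(&[Change { src: 1, dst: 2 }, Change { src: 1, dst: 3 }]);
    assert_eq!(cache.neighbors(1), vec![2, 3]);
    println!("neighbors of 1: {:?}", cache.neighbors(1));
}
```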

What Rust made harder

Rust makes you pay for ambiguity early. During the first versions of the engine, that could feel slow. Ideas that would be easy to sketch in a dynamic language required us to be precise about ownership, borrowing, and data lifetimes before the idea was fully settled.

Over time, that became a benefit. The hard questions did not disappear. Rust just made us answer them while the architecture was still small enough to change.

The design lesson

The architecture that survived is the one with the fewest moving parts in the hot path. Keep the source database authoritative. Keep the relationship cache compact. Keep traversal close to memory. Hydrate only what the query actually needs.

That is the direction we are continuing to push. Evokoa should feel boring to operate and unusually fast to query. Rust has been a good fit for that philosophy because it rewards the same things we care about: explicit structure, predictable performance, and careful control over the critical path.