Deaf Community First

Why CLERC exists because of the Deaf community, not for it

People ask me what CLERC's strength is. They expect me to say the vision, or the data layer, or the AI. None of those are the answer.

CLERC's strength is the Deaf community. Without the Deaf community, CLERC does not exist.

That sentence is not a values statement. It is the most accurate technical description I can give of how this company works.

Not a values statement. A technical requirement.

Most companies that touch sign language frame their relationship to the Deaf community as a mission, a cause, or an inclusion strategy. CLERC does not. We're a tech company building infrastructure, and the reason the Deaf community sits at the center of our work is not ethical. It's structural.

Sign language is not a code that can be documented from the outside. It is a living language with three-dimensional grammar, regional variation, generational drift, and cultural context that no external observer can fully reconstruct. When a hearing team labels sign language, they make thousands of small judgment calls. Gloss boundaries. Register shifts. Prosodic markers. Regional variants. Each of those calls compounds silently into a corrupted dataset, and by the time a model trains on it, the damage is locked in.

The only way to build an honest data layer is to build it with the people who built the language. Everything else is approximation.

Inside the community, not adjacent to it

A decade in sign language AI taught me one thing. The bottleneck was never the model, the compute, or the architecture. The bottleneck was always the data underneath, and the data underneath was always built too far from the community.

CLERC is built around that lesson. Deaf founder. Deaf annotators. Native Deaf signers. The people who shape the corpus are the same people who carry the language. They know why a sign exists, why a movement means what it means, why a facial expression can turn a statement into a question.

This is not a recruiting choice. It is a methodology. The corpus is honest because the people building it grew up in the language.

Translation is not the goal. Understanding is.

The industry has spent a decade chasing translation. Sign language in, text out. Text in, avatar out. Most of it works inside controlled vocabularies and falls apart everywhere else.

The mistake is upstream of the model. Translation assumes the language is a string of tokens to be mapped onto another string. Sign language is not that. It is a system where space, motion, facial expression, eye gaze, and timing carry grammatical and semantic load simultaneously. Translating it without understanding it produces output that looks correct and means nothing.

What we are building is a data layer designed to make AI understand. Why a sign exists. Why a movement is shaped that way. Why two regional variants both mean the same thing. Why a facial marker turns a statement into a question. That kind of understanding cannot be reverse-engineered from raw video. It has to be encoded by the people who carry the language.
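To make that concrete, here is a minimal sketch of what one structured annotation record could look like. This is an illustration only, not CLERC's actual schema; every field name and value below is a hypothetical example of the kind of information a native Deaf signer can encode and raw video cannot.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class NonManualMarkers:
    """Facial and body signals that carry grammar alongside the hands."""
    brow: Optional[str] = None      # e.g. "raised" can mark a yes/no question
    mouthing: Optional[str] = None  # mouth pattern accompanying the sign
    head: Optional[str] = None      # head tilt, nod, or shake
    eye_gaze: Optional[str] = None  # gaze direction used for spatial reference

@dataclass
class SignAnnotation:
    """One annotated sign, recorded by a native Deaf signer (illustrative fields)."""
    gloss: str                    # working label for the sign
    handshape: str                # dominant-hand configuration
    movement: str                 # path and manner of motion
    location: str                 # where the sign is produced
    non_manual: NonManualMarkers = field(default_factory=NonManualMarkers)
    regional_variants: List[str] = field(default_factory=list)  # other forms with the same meaning
    register: Optional[str] = None        # formal, casual, etc.
    annotator_note: Optional[str] = None  # the "why" only a native signer can supply

# Hypothetical record: a brow raise turning a statement into a question.
example = SignAnnotation(
    gloss="FINISH",
    handshape="5",
    movement="outward twist",
    location="neutral space",
    non_manual=NonManualMarkers(brow="raised", head="forward tilt"),
    regional_variants=["FINISH-2"],
    register="casual",
    annotator_note="The brow raise marks this as a question, not a statement.",
)
```

The point of the sketch is the shape, not the fields: the grammatical load carried by space, motion, and face has to live in the record itself, and the judgment calls behind it have to come from people who grew up in the language.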

When the AI understands, the use cases follow. Translation is one of them. So are 3D avatars, SLR benchmarks, lesson generators, sign mapping across communities, and use cases we have not imagined yet.

From 19th-century education to 21st-century infrastructure

Laurent Clerc carried sign language education from France to the United States in 1816 and co-founded the first permanent school for the Deaf in the country. That move shaped two centuries of Deaf education and the entire trajectory of American Sign Language.

CLERC carries the same logic into a different layer. The 19th century built the schools. The 20th built the linguistic foundation. The 21st has to build the data infrastructure that AI runs on, and it has to be built by the same community, for the same reason. Otherwise the next century of sign language gets defined by people who do not sign it.

That is the lineage we sit inside. The name is not a nod. It is a commitment.

Built by the community, for everyone

The data we build is shaped by the Deaf community. The use cases it unlocks are not.

When the corpus is structured honestly, AI starts returning to the Deaf community what has been kept out of reach for too long: opportunity, agency, power. Not for a handful of users in a pilot program. For the majority of the community, at scale. That is the first reason this infrastructure exists.

And once the AI actually understands sign language, the same infrastructure scales outward. Hearing learners, linguists, interpreters, researchers, families, colleagues, entire industries that have never had a way to engage with sign language seriously. Built by the community, optimized for the world.

This is the inversion most actors in this space miss. They build for Deaf people and end up serving a narrow market with thin data. We build with the Deaf community and end up with infrastructure that scales far beyond it, because the data is real.

CLERC is not a vision. It is not a product. It is not even an AI. It is a community of native Deaf signers and Deaf annotators encoding their own language into the data layer of the next decade of AI.

Without that community, none of it exists.

CLERC is building the foundational data layer for Sign Language AI. Native Deaf signers. Structured from the start.

Follow @CLERC to track the build.