Ninety days.

That is the honest answer. Three months for the foundation to compound. Three months before AI models start citing you in responses. Three months before the dashboard shows numbers worth talking about.

Most agencies dodge this question or bury it behind caveats. We lead with it. Because if you understand why it takes 90 days, you will also understand why the results after 90 days are worth the wait.

What happens each month

Month 1 / Foundation

Technical audit. Schema deployment across every critical page: Organization, WebSite, BreadcrumbList, Article, FAQPage. Entity registration on Wikidata, Crunchbase, G2, LinkedIn. Canonical tags, robots directives, structured data validated and live.
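For concreteness, here is a minimal sketch of the kind of JSON-LD that month-one schema deployment produces. Every name, URL, and identifier below is a placeholder, not a real deployment:

```python
import json

# Hypothetical Organization + WebSite JSON-LD for a placeholder site.
# The sameAs links are where entity registration (Wikidata, LinkedIn,
# etc.) gets wired back into the schema.
schema = {
    "@context": "https://schema.org",
    "@graph": [
        {
            "@type": "Organization",
            "@id": "https://example.com/#org",
            "name": "Example Co",  # placeholder
            "url": "https://example.com/",
            "sameAs": [
                "https://www.wikidata.org/wiki/Q0000000",  # placeholder entity ID
                "https://www.linkedin.com/company/example-co",
            ],
        },
        {
            "@type": "WebSite",
            "@id": "https://example.com/#website",
            "url": "https://example.com/",
            "publisher": {"@id": "https://example.com/#org"},
        },
    ],
}

# Serialized, this becomes the payload of a
# <script type="application/ld+json"> tag in the page head.
print(json.dumps(schema, indent=2))
```

The `@id` anchors matter more than they look: they are what lets every other page type (Article, FAQPage, BreadcrumbList) point back at the same entity.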

By the end of month one, AI models can read your site. They can parse your entity. They know what you do, who you are, and how your content relates to itself.

You will not see citations yet. That is expected. The foundation is invisible until it compounds.

Month 2 / Indexing

Content gets indexed. Citation tracking goes live. The dashboard shows movement: edge queries start appearing, the long-tail questions where your structured data surfaces first. AI models are learning your entity. Not citing yet. Learning.


This is the month that feels slow. The data is being absorbed into model training windows and retrieval indexes. The structured content you deployed in month one is being processed, not ignored. But you cannot see the processing. You can only see the inputs and, later, the outputs.

Month 3+ / Compounding

First verifiable citations. Share of answer starts climbing. Each piece of content reinforces the last because of the cross-linking schema graph you built in month one. The curve bends upward.
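The "cross-linking schema graph" is less exotic than it sounds: each new Article node carries `@id` references back to the same Organization entity and to sibling articles, so every page published reinforces every page before it. A hedged sketch of that pattern, again with placeholder URLs and helper names of our own invention:

```python
import json

ORG_ID = "https://example.com/#org"  # placeholder entity anchor
SITE_ID = "https://example.com/#website"


def article_node(slug: str, related_slugs: list[str]) -> dict:
    """Build an Article node that links back to the org entity and
    mentions sibling articles, forming the cross-referencing graph."""
    return {
        "@type": "Article",
        "@id": f"https://example.com/{slug}/#article",
        "author": {"@id": ORG_ID},
        "publisher": {"@id": ORG_ID},
        "isPartOf": {"@id": SITE_ID},
        # Each article points at its siblings; new content strengthens old.
        "mentions": [
            {"@id": f"https://example.com/{s}/#article"} for s in related_slugs
        ],
    }


graph = {
    "@context": "https://schema.org",
    "@graph": [
        article_node("aeo-timeline", ["schema-basics"]),
        article_node("schema-basics", ["aeo-timeline"]),
    ],
}

print(json.dumps(graph, indent=2))
```

Two articles that reference each other and a shared entity anchor is the smallest possible version of the graph; the compounding claim is that each additional node adds edges to all the existing ones.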

After month three, every additional month builds on everything before it. Month four is stronger than month three. Month six is dramatically stronger than month four. The compounding is real and measurable. But only if you built the foundation correctly in month one.

Why it takes this long

AI models retrain on cycles. ChatGPT, Gemini, Perplexity, and Google's AI Overviews all pull from indexes that update on their own schedule, not yours. The schema you deploy in week two gets absorbed into the next training window or retrieval crawl. You cannot rush the retraining schedule. Nobody can.

This is not a criticism. It is physics. The data pipeline between "content published" and "content cited" has latency built in. Understanding that latency is the difference between quitting too early and compounding past your competitors.

Why faster promises are lies

If an agency promises AI citations in 30 days, they are either lying about the timeline or redefining "results" to mean something other than verifiable AI citations. Usually they are selling you Google Ads traffic and calling it AEO. Paid clicks are not citations. They disappear when you stop paying.

AEO is infrastructure. It builds equity. When an AI model cites your site, that citation persists in the model's training data and reinforces future citations. Ads do the opposite. They rent attention and leave nothing behind.

This timeline is not unusual

Traditional SEO takes three to six months to show meaningful results. Everyone in the industry knows this. AEO operating on a similar timeline should not surprise anyone. The mechanisms are different but the patience required is the same.

The difference is what you are building toward. SEO builds rankings on a results page that fewer people scroll through every quarter. AEO builds citations in the answers that replaced the results page. Same patience, better destination.

The compound effect is the point

Past month three, the mechanism is simple. New content reinforces old content through schema relationships. Entity authority deepens. Citation frequency increases because the model already trusts your structured data from prior training windows.

Stopping at month two resets the curve. You absorb the cost of the foundation but never collect the return. This is not a campaign you run for a quarter and evaluate. It is infrastructure you build once and maintain. The ongoing cost is a fraction of the initial build. The returns compound indefinitely.

We have seen it work

When we took on Montaic as a client, the foundation took 12 days to build. Schema across every page type. Entity registration on six platforms. Seven pillar articles with full structured data. From zero AI citations to a complete search foundation in under two weeks.

The foundation is built. The compounding is underway. That is the pattern: fast build, patient compounding. The 90-day timeline is not about how long it takes us to do the work. It is about how long it takes the models to absorb it.

The real question

The question is not "how long does AEO take?" It is "what happens if you wait another 90 days to start?" Every quarter you delay is a quarter your competitors use to register their entities, deploy their schema, and claim the citation slots you wanted.

AI search is not coming. It is here. The models are answering questions about your industry right now. The only question is whether they are citing you or someone else.