Annotations
When AI visibility metrics change, the first question is always why. Annotations answer that question by marking specific dates on your time-series charts with context about what happened — a content publish, a campaign launch, a site migration, a goal achievement. Without them, you're staring at lines on a chart with no narrative.
Why Annotations Matter
AI visibility metrics are shaped by many factors, including content changes, competitor activity, AI platform updates, and seasonal trends. When you see a metric improve, you need to know whether it was your work or coincidence.
Annotations create a visual timeline of your actions overlaid on your data. When a citation spike lines up with an annotation reading "Published comprehensive CRM comparison guide," you have the basis for a causal story — not just a correlation.
For SEO and AEO Professionals
Your work follows a pattern: research, optimize, publish, wait, measure. The "wait" part is where things get lost. By the time metrics shift (often 1-3 weeks after a content change), you've moved on to other tasks and may not remember exactly what changed when.
Annotations solve this by creating a permanent record:
- Content publishes: Mark the date you published or updated a key page. When you review Historical Metrics two weeks later and see brand consistency trending up, the annotation tells you exactly what drove it.
- Technical changes: Structured data updates, robots.txt changes, schema markup additions — these affect how AI platforms parse your site but are invisible in content. Annotate them so you can track the downstream impact.
- Competitor movements: Notice a competitor launched a major content hub? Annotate it. If your share of voice dips, you'll know it wasn't your content degrading — it was a competitive shift.
- Algorithm or platform changes: When AI platforms update their models or citation behavior, mark it. This helps separate "the world changed" from "our work had an effect."
For PR and Communications Teams
PR campaigns have defined timelines and milestones. Annotations let you map those milestones onto visibility data:
- Outreach phases: Mark when you began pitching, when the first placement published, and when the campaign wrapped. This creates a clear narrative arc on the charts.
- Key placements: When a target publication runs your story, annotate it. Then check Citation Intelligence to see if and when AI platforms start citing that publication in response to relevant queries.
- Events and launches: Product announcements, conference appearances, press events — all create visibility moments worth tracking. Mark the date so you can measure lift.
- Client milestones: If you're reporting to clients on campaign impact, annotations on the charts tell a visual story that raw numbers can't. "Here's when Forbes ran the piece, and here's when AI citations from Forbes appeared" is far more compelling than a table of metrics.
How Annotations Appear on Charts
Annotations show as diamond markers at the bottom of time-series charts in the Historical Metrics and Citation Intelligence dashboards. Each diamond sits on the date of the annotation.
Hovering over a diamond shows:
- The date of the annotation
- The title you gave it
This works across all chart types: Brand Consistency, Share of Voice, Citations Over Time, AI Readiness Score, and Grounding Search Gap Trends.
Creating Annotations
Manual Annotations
Navigate to Annotations in the property sidebar under Configuration. Click Add Annotation and provide:
| Field | Required | Description |
|---|---|---|
| Date | Yes | The date of the event. Defaults to today. |
| Title | Yes | A short label (up to 200 characters). This is what appears on hover in charts. |
| Description | No | Additional context for your team. Not shown on charts but available in the annotation list. |
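The field rules above can be modeled as a small record. This Python sketch is purely illustrative — the class and field names are assumptions, not Spyglasses' actual data model — but it encodes the constraints from the table: the title is required and capped at 200 characters, the date defaults to today, and the description is optional:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Annotation:
    # Hypothetical record mirroring the fields in the table above.
    title: str                                             # required; shown on chart hover
    event_date: date = field(default_factory=date.today)   # defaults to today
    description: str = ""                                  # optional; list view only

    def __post_init__(self) -> None:
        if not self.title:
            raise ValueError("title is required")
        if len(self.title) > 200:
            raise ValueError("title must be 200 characters or fewer")
```

Creating `Annotation(title="Published AI tools comparison guide")` would then produce a record dated today with an empty description.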
Good annotation titles are short and specific:
- "Published AI tools comparison guide"
- "TechCrunch coverage went live"
- "Added schema markup to product pages"
- "Competitor X launched content hub"
- "Gemini model update"
Auto-Generated Annotations
Some annotations are created automatically by Spyglasses:
- Project goal achievements: When a Project goal is hit (a tracked page is cited or a tracked publisher appears in an AI response), an annotation is automatically created at that date. These appear with an "Auto" badge in the annotation list.
Auto-generated annotations are tied to their project and include details about which goal was achieved.
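As a rough illustration of what an auto-generated entry carries, here is a hypothetical builder. The function and field names are invented for this sketch; only the behavior — an "Auto"-sourced annotation dated to the goal achievement and tied to its project — comes from the description above:

```python
from datetime import date

def auto_annotation_for_goal(project_name: str, goal_description: str,
                             achieved_on: date) -> dict:
    """Illustrative only: the kind of record the system might create
    when a project goal is hit. Real field names may differ."""
    return {
        "date": achieved_on.isoformat(),
        "title": f"Goal achieved: {goal_description}",
        "description": f"Project: {project_name}",
        "source": "Auto",  # what the "Auto" badge in the annotation list reflects
    }
```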
Managing Annotations
The Annotations dashboard shows all annotations for the property in a table:
| Column | Description |
|---|---|
| Date | When the event occurred |
| Title | The annotation label |
| Description | Additional context (if provided) |
| Source | "Manual" for your entries, "Auto" for system-generated ones |
You can edit any annotation's date, title, or description, and delete annotations you no longer need. Deletions are permanent.
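The table view above amounts to sorting entries by date and optionally filtering by source. A minimal sketch, assuming annotations are plain dicts keyed by the columns listed (this is not the Spyglasses API):

```python
def annotation_table(annotations, source=None):
    """Return annotations newest-first, optionally filtered by Source
    ("Manual" or "Auto") — an illustrative sketch of the dashboard table."""
    rows = [a for a in annotations if source is None or a["source"] == source]
    return sorted(rows, key=lambda a: a["date"], reverse=True)
```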
Best Practices
Annotate Before You Forget
The hardest part of annotation is remembering to do it. Create the annotation the same day you publish content or complete a technical change. Waiting a week means you'll either forget or get the date wrong.
Be Specific in Titles
"Content update" doesn't help you three months later. "Rewrote /blog/crm-comparison with AI-focused structure" tells you exactly what happened. The title is what appears on the chart, so make it informative enough to jog your memory.
Annotate Both Your Actions and External Events
Your annotations shouldn't only reflect your own work. Significant external events that affect AI visibility are equally worth tracking:
- A competitor's major content launch
- A known AI platform model update
- A Google algorithm change that affects grounding searches
- Seasonal shifts in your industry
This gives you a complete picture when interpreting metric changes.
Use Annotations for Reporting
When preparing monthly or quarterly visibility reports, pull up the Historical Metrics dashboard filtered to the reporting period. The annotation markers create a visual narrative: "Here's what we did, and here's how metrics responded." This is far more effective than a spreadsheet of dates and numbers.
Coordinate with Projects
If you're using Projects, many annotations will be created automatically when goals are hit. Supplement these with manual annotations for actions the system can't detect — content publishes, outreach milestones, technical changes. Together, the auto and manual annotations create a complete project timeline.
Don't Over-Annotate
Marking every minor edit or insignificant event creates noise. Focus on changes that could plausibly affect AI visibility: substantial content publishes, technical infrastructure changes, major competitor moves, and significant campaign milestones. If you wouldn't mention it in a report, it probably doesn't need an annotation.
Related
- Projects — Track specific efforts with goals, prompts, and annotations
- Historical Metrics — Time-series charts where annotations appear
- Citation Intelligence — Citation trend charts with annotation markers
- Property Pages — Manage pages tracked in SEO project goals