Learn
Free guides on accessibility, SEO, and AI search optimization. Understand what eiSEO scans for — and how to fix what it finds.
10 guides
Color Contrast
Color contrast is the difference in luminance between foreground text and its background. WCAG 2.1 requires a minimum contrast ratio of 4.5:1 for normal text and 3:1 for large text (at least 18pt/24px regular, or 14pt/roughly 18.7px bold). Insufficient contrast makes content difficult or impossible to read for users with low vision, color blindness, or those viewing screens in bright sunlight.
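The contrast ratio can be computed directly from the two colors. This sketch implements the relative-luminance and contrast-ratio formulas defined in WCAG 2.1 (the function names are our own):

```python
def srgb_channel(c8):
    """Linearize one 8-bit sRGB channel per the WCAG 2.1 formula."""
    c = c8 / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    """Relative luminance of an (R, G, B) color, 0.0 (black) to 1.0 (white)."""
    r, g, b = (srgb_channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colors, from 1:1 (identical) to 21:1."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black on white: the maximum possible contrast.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0

# Mid-gray #767676 on white: ≈ 4.54, just past the 4.5:1 normal-text minimum.
print(round(contrast_ratio((0x76, 0x76, 0x76), (255, 255, 255)), 2))
```

Running borderline brand colors through a check like this is a quick way to see whether text will pass before a scanner flags it.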
Image Alt Text
Alt text (the HTML alt attribute) is a short text description of an image that screen readers announce to visually impaired users. It also displays when an image fails to load and is used by search engines to understand image content. Every non-decorative image on a page must have meaningful alt text that conveys the image's purpose or information.
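A minimal sketch of both cases (the file names and descriptions are illustrative):

```html
<!-- Informative image: alt conveys the content, not the file name -->
<img src="/img/q3-revenue-chart.png"
     alt="Bar chart showing Q3 revenue up 18% year over year">

<!-- Decorative image: an empty alt tells screen readers to skip it -->
<img src="/img/divider-flourish.svg" alt="">
```

Note that a decorative image should get `alt=""` rather than no alt attribute at all; a missing attribute causes many screen readers to announce the file name instead.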
Heading Hierarchy
Heading hierarchy refers to the logical nesting of HTML heading elements (H1 through H6) on a page. A well-structured hierarchy starts with a single H1, followed by H2s for major sections, H3s for subsections, and so on without skipping levels. Screen readers use headings as a navigation shortcut, allowing users to jump between sections.
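A well-formed hierarchy might look like this (the topic is illustrative; indentation is only for readability):

```html
<h1>Complete Guide to Composting</h1>   <!-- exactly one H1 per page -->
  <h2>Getting Started</h2>
    <h3>Choosing a Bin</h3>
    <h3>What to Compost</h3>
  <h2>Troubleshooting</h2>              <!-- back up to H2, no skipped levels -->
```

Jumping from an H2 straight to an H4 is the kind of skipped level scanners flag, because it breaks the outline screen readers use for navigation.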
Keyboard Navigation
Keyboard navigation means that every interactive element on a page — links, buttons, form fields, menus, and custom widgets — can be reached and operated using only the keyboard (typically via the Tab, Enter, Space, and arrow keys). This is a fundamental accessibility requirement because many users cannot use a mouse.
ARIA Violations
ARIA (Accessible Rich Internet Applications) attributes provide extra semantic information to assistive technologies when native HTML elements are insufficient. ARIA violations occur when these attributes are used incorrectly — for example, applying an invalid role, using aria-labelledby to reference a non-existent ID, or adding redundant ARIA roles to elements that already have implicit semantics.
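A sketch of two common violations from the examples above, with fixes (IDs are illustrative):

```html
<!-- Violation: aria-labelledby references an ID that does not exist -->
<input type="search" aria-labelledby="searc-label">

<!-- Violation: redundant role; <button> already has an implicit button role -->
<button role="button">Save</button>

<!-- Fixed: reference a real ID and drop the redundant role -->
<span id="search-label">Search products</span>
<input type="search" aria-labelledby="search-label">
<button>Save</button>
```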
Focus Management
Focus management controls which element on the page currently receives keyboard input, indicated by a visible focus ring or outline. Proper focus management means interactive elements show a clear visual indicator when focused, focus moves logically through the page, and focus is programmatically moved to the right place when content changes (e.g., opening a modal should move focus into it).
Link Text
Accessible link text clearly describes the destination or purpose of a link without relying on surrounding context. Screen reader users often navigate by pulling up a list of all links on a page, so each link must make sense in isolation. Generic phrases like "click here", "read more", or "learn more" provide no information about where the link leads.
Form Labels
Form labels are HTML <label> elements that programmatically associate a text description with a form input (text field, checkbox, select, etc.) using the for attribute matched to the input's id. Without a label, screen readers announce a form field as just "edit text" or "checkbox" with no indication of what information the user should enter.
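Both standard association patterns, sketched with illustrative field names:

```html
<!-- Explicit association: for matches the input's id -->
<label for="email">Email address</label>
<input type="email" id="email" name="email">

<!-- Implicit association: wrapping the input in the label also works -->
<label>
  Subscribe to newsletter
  <input type="checkbox" name="subscribe">
</label>
```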
Alt Text vs. Title Attribute
The alt attribute and the title attribute serve different purposes on HTML images. The alt attribute provides a text alternative that screen readers announce to visually impaired users and that displays when an image fails to load. The title attribute generates a tooltip on mouse hover and provides supplementary information. Alt is required by WCAG for non-decorative images; title is optional and not reliably accessible.
Accessibility Benchmarking
Accessibility benchmarking is the practice of measuring and comparing WCAG compliance metrics across multiple websites to establish baselines, track progress, and identify competitive gaps. Rather than auditing a single site in isolation, benchmarking uses normalized metrics like errors per page and health scores to make fair comparisons between sites of different sizes. A site with 50 accessibility errors across 500 pages (0.1 errors/page) is performing better than one with 20 errors across 10 pages (2.0 errors/page).
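The normalization from the example above is simple to reproduce, which is exactly why it makes fair cross-site comparisons possible:

```python
def errors_per_page(total_errors, total_pages):
    """Normalize raw error counts so sites of different sizes compare fairly."""
    return total_errors / total_pages

# The example from the text: the larger site wins on the normalized metric.
site_a = errors_per_page(50, 500)   # 0.1 errors/page
site_b = errors_per_page(20, 10)    # 2.0 errors/page
print(site_a < site_b)  # True: site A performs better despite more raw errors
```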
13 guides
Title Tags
The title tag is the HTML <title> element that defines the page's title in the browser tab and, most importantly, as the clickable headline in search engine results pages (SERPs). It is widely considered the single most important on-page SEO element because it directly tells search engines and users what the page is about.
Meta Descriptions
A meta description is the HTML <meta name="description"> tag that provides a brief summary of a page's content. Search engines often display this text as the snippet beneath the title in search results. While not a direct ranking factor, the meta description is your primary tool for convincing searchers to click your result over competitors.
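Both elements live in the `<head>`. A sketch, using an illustrative page and the commonly recommended display lengths (roughly 50-60 characters for titles and 150-160 for descriptions before search engines truncate them):

```html
<head>
  <!-- Becomes the clickable headline in search results -->
  <title>Hiking Boot Care: Cleaning, Waterproofing, Storage</title>

  <!-- Often shown as the snippet beneath the title -->
  <meta name="description"
        content="Step-by-step guide to cleaning, waterproofing, and storing
                 hiking boots so they last for years on the trail.">
</head>
```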
H1 Structure
The H1 tag is the primary heading on a page and serves as the main content title visible to users. It signals to search engines the most important topic of the page. Best practice is to have exactly one H1 per page that closely aligns with the title tag but is written for the on-page reading experience rather than search result display.
Image Optimization
Image optimization is the process of delivering images at the correct size, format, and compression level while providing proper HTML attributes (alt text, width, height, lazy loading). Unoptimized images are among the most common causes of slow page loads on the web, directly impacting the Core Web Vitals scores that Google uses as a ranking signal.
Canonical Tags
A canonical tag (rel="canonical") is an HTML element placed in the <head> of a page that tells search engines which URL is the preferred or "canonical" version of that page. When the same content is accessible at multiple URLs (due to query parameters, www/non-www variations, HTTP/HTTPS, or syndication), the canonical tag consolidates all ranking signals to a single URL.
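A sketch with illustrative URLs; the same tag goes in the `<head>` of every duplicate variant:

```html
<!-- Placed on the preferred page AND on its duplicates, e.g. the
     ?utm_source=newsletter and www/HTTP variants of this URL -->
<link rel="canonical" href="https://example.com/shoes/trail-runners">
```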
Open Graph
Open Graph (OG) is a protocol originally created by Facebook that uses meta tags in the <head> of a page to control how content appears when shared on social media platforms. The four required OG properties are og:title, og:type, og:image, and og:url. Without them, platforms attempt to auto-generate a preview that is often inaccurate or visually unappealing.
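The four required properties from the text, sketched with illustrative values:

```html
<meta property="og:title" content="Hiking Boot Care Guide">
<meta property="og:type" content="article">
<meta property="og:image" content="https://example.com/img/boot-care-cover.jpg">
<meta property="og:url" content="https://example.com/guides/boot-care">
```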
Robots Meta
The robots meta tag is an HTML <meta name="robots"> element that instructs search engine crawlers whether to index a page and whether to follow its links. Common directives include index/noindex (whether to include the page in search results) and follow/nofollow (whether to pass link equity through the page's outbound links). An X-Robots-Tag HTTP header can also deliver these directives.
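A common combination, sketched both ways:

```html
<!-- Keep this page out of search results but still follow its links -->
<meta name="robots" content="noindex, follow">

<!-- The same directive delivered as an HTTP response header instead:
     X-Robots-Tag: noindex, follow -->
```

The header form is useful for non-HTML resources like PDFs, which have no `<head>` to put a meta tag in.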
Link Text SEO
Anchor text (link text) is the visible, clickable text of a hyperlink. Search engines use anchor text as a strong signal to understand what the linked page is about. Optimized anchor text uses descriptive, keyword-relevant phrases instead of generic text like "click here" or "read more", helping search engines connect your internal and external links to the right topics.
Canonical Tags vs. Redirects
Canonical tags and 301 redirects both address duplicate content, but they work differently. A canonical tag (rel="canonical") is an HTML hint telling search engines which URL is the preferred version while keeping both pages accessible to users. A 301 redirect physically sends users and crawlers from one URL to another, making the old URL inaccessible.
URL Structure
URL structure refers to the format and organization of web page addresses. SEO-friendly URLs use hyphens to separate words, maintain a flat hierarchy, use lowercase characters exclusively, and avoid unnecessary parameters or special characters. A well-structured URL communicates page content to both users and search engines before the page is even loaded.
Internal Linking
Internal linking is the practice of creating hyperlinks between pages on the same website. A strategic internal linking structure distributes link equity (ranking power) throughout your site, helps search engines discover and index all pages, and guides users to related content. Key concepts include anchor text strategy, hub pages, and avoiding orphan pages that have no inbound internal links.
Site Architecture
Site architecture is the hierarchical organization of pages, navigation systems, and URL structures that define how content is grouped and accessed on a website. Good architecture follows the 3-click rule (any page reachable within 3 clicks from the homepage), uses clear breadcrumb navigation, and organizes content into logical categories that match user intent and search engine crawl patterns.
Competitor SEO Analysis
Competitor SEO analysis is the process of scanning and comparing your website's technical SEO health against competitor sites using the same set of checks. Instead of auditing your site in isolation, you measure identical metrics — error counts, warnings, health scores, and errors per page — across your sites and your competitors, then identify where competitors outperform you and where you have an advantage. The comparison spans multiple scan engines (accessibility, SEO, AI SEO) to give a complete picture.
11 guides
Structured Data
Structured data is machine-readable markup (typically JSON-LD using the Schema.org vocabulary) embedded in your page's HTML that explicitly describes the content's type, properties, and relationships. It tells search engines and AI systems exactly what your content is — an article, a product, a recipe, an FAQ — rather than requiring them to infer it from unstructured text.
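A minimal JSON-LD sketch for an article, using the Schema.org vocabulary (names and dates are illustrative):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Hiking Boot Care Guide",
  "author": { "@type": "Person", "name": "Jane Example" },
  "datePublished": "2025-01-15",
  "dateModified": "2025-06-02"
}
</script>
```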
FAQ Blocks
FAQ blocks are structured question-and-answer sections on a page, marked up with FAQPage Schema.org structured data. They present information in the exact format that both traditional search engines (for featured snippets) and AI search engines (for synthesized answers) prefer to extract: a clear question followed by a concise, authoritative answer.
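A one-question FAQPage sketch (the question and answer are illustrative); each visible Q&A pair on the page gets an entry in `mainEntity`:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "How often should I waterproof hiking boots?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Every 3 to 4 months with regular use, or whenever water stops beading on the surface."
    }
  }]
}
</script>
```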
Author Attribution
Author attribution is the practice of clearly identifying who created a piece of content, both visually on the page and in structured data. This includes a visible author byline, a link to the author's bio page, and Person schema markup that connects the content to a real, identifiable author with credentials and expertise.
Freshness Signals
Freshness signals are indicators that tell search engines and AI systems when content was last published or updated. These include visible publication and modification dates on the page, datePublished and dateModified properties in structured data, and the actual content updates reflected in the page. AI search engines weigh freshness heavily when selecting sources for their answers.
AI Bot Directives
AI bot directives are rules you set in robots.txt and meta robots tags to control how AI company crawlers (GPTBot, Google-Extended, ClaudeBot, Bytespider, and others) access and use your content. These directives let you decide whether your pages can be crawled for AI training data, used in AI search results, or blocked entirely from specific AI systems.
Content Extractability
Content extractability measures how easily AI systems and web crawlers can parse, understand, and pull meaningful information from your page. Pages with clean semantic HTML, clear heading structure, well-organized sections, and content that is not locked behind JavaScript rendering or interactive widgets are highly extractable. Pages that rely on complex JavaScript frameworks, embed content in images or PDFs, or lack semantic structure are difficult for AI systems to process.
Statistics & Citations
Statistics and citations are verifiable data points, research findings, and attributed quotes included in your content with clear source references. This includes specific numbers, percentages, study results, and expert quotes that are formatted in a way AI systems can extract and attribute. Well-cited content signals authority and trustworthiness to both AI and traditional search engines.
Topic Clusters
Topic clusters are an SEO content strategy where a central pillar page covers a broad topic comprehensively, linked to and from multiple cluster pages that address specific subtopics in depth. This creates a web of semantically related content that signals topical authority to both traditional and AI search engines. AI systems particularly value topic clusters because they provide the comprehensive, interconnected knowledge needed to generate authoritative answers.
Content Freshness
Content freshness refers to how recently a page was published or last updated. AI search engines weigh freshness heavily when selecting sources for citations — ChatGPT prioritizes content updated within the last 30 days (giving it 3.2x more citations), while other platforms have their own recency thresholds. Freshness is signaled through dateModified schema markup, last-updated timestamps, and the actual recency of the information on the page.
AI Content Patterns
AI content patterns are specific content structures that make pages easier for AI search engines to extract, understand, and cite. Key patterns include definition blocks (a concise 40-60 word summary paragraph following each H2), evidence sandwiches (claim → supporting data → source citation), and self-contained answers (paragraphs that fully answer a question without requiring surrounding context). These patterns align with how AI systems parse and select content for generated answers.
AI SEO Competitive Analysis
AI SEO competitive analysis is the process of comparing your site's AI search readiness against competitor sites using the same AI SEO scan checks — structured data presence, FAQ and definition blocks, author attribution, freshness signals, AI bot access, content extractability, and citation patterns. By scanning competitors with identical criteria, you can identify where their content is better structured for AI extraction and where you already lead, then prioritize the specific gaps that are most likely costing you AI search citations.
2 guides
Robots.txt for AI Search Engines
Robots.txt is a text file at the root of your website that tells web crawlers which pages they can and cannot access. With the rise of AI search engines, robots.txt has become the primary way to control whether AI crawlers like GPTBot (OpenAI), ClaudeBot (Anthropic), Google-Extended (Google), and PerplexityBot (Perplexity) can crawl your content. Each AI company has its own crawler user-agent, and you need specific directives for each one to control access.
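A sketch of per-crawler directives using the user-agents named above; this particular policy blocks AI crawlers while leaving traditional search untouched, but any combination is possible:

```text
# Block AI crawlers from the whole site
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

# Traditional search crawlers remain allowed
User-agent: Googlebot
Allow: /
```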
llms.txt
llms.txt is a proposed standard file (placed at /llms.txt on your domain) that provides a structured, plain-text summary of your website specifically for large language models. While robots.txt controls whether AI crawlers can access your site, llms.txt helps them understand what your site is about, what content is most important, and how it's organized. Think of it as a "readme" for AI — a concise document that gives AI models the context they need to accurately represent your content in search results.
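A sketch following the proposed llms.txt format (Markdown with an H1 site name, a blockquote summary, and H2 sections of annotated links; the site and URLs are illustrative):

```text
# Example Outdoor Gear

> Guides and reviews for hiking and backpacking equipment, written by
> working trail guides.

## Guides

- [Boot Care](https://example.com/guides/boot-care): cleaning, waterproofing, storage
- [Trail Nutrition](https://example.com/guides/nutrition): meal planning for multi-day hikes
```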