# Robots.txt for MCP Analytics
# https://mcpanalytics.ai
# Welcome all search engines!

# Allow ALL crawlers full access, except the private/system files below.
# Note: a crawler that matches one of the named groups further down uses
# only that group's rules, so those bots get full access.
User-agent: *
Allow: /
# Only block actual private/system files
Disallow: /.git/
Disallow: /.github/
Disallow: /node_modules/
Disallow: /.env
Disallow: /package-lock.json
Disallow: /*.log
Disallow: /*.sh
Disallow: /test-ga4-live.html

# Specifically welcome major search engines
User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /

User-agent: Slurp
Allow: /

User-agent: DuckDuckBot
Allow: /

User-agent: Baiduspider
Allow: /

User-agent: YandexBot
Allow: /

# Allow social media crawlers for rich previews
User-agent: facebookexternalhit
Allow: /

User-agent: Twitterbot
Allow: /

User-agent: LinkedInBot
Allow: /

User-agent: WhatsApp
Allow: /

User-agent: Slackbot
Allow: /

# Welcome AI crawlers - we want to be indexed and cited
User-agent: GPTBot
Allow: /

User-agent: OAI-SearchBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: anthropic-ai
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: CCBot
Allow: /

# Allow SEO analysis tools
User-agent: AhrefsBot
Allow: /

User-agent: SemrushBot
Allow: /

User-agent: MJ12bot
Allow: /

User-agent: DotBot
Allow: /

# No crawl delay - we want fast indexing!
# Crawl-delay: 0

# LLM discovery files
# See https://llmstxt.org/ for the specification
# llms.txt: concise overview with links
# llms-full.txt: full content inline for deep retrieval

# Sitemap location - CRITICAL for SEO
Sitemap: https://mcpanalytics.ai/sitemap.xml

# Additional sitemaps if needed
Sitemap: https://mcpanalytics.ai/sitemap-articles.xml
Sitemap: https://mcpanalytics.ai/sitemap-tools.xml
Sitemap: https://mcpanalytics.ai/sitemap-analysis.xml