SEO: Sitemaps & Robots.txt

Manage your XML sitemaps and robots.txt file to help search engines discover and crawl your content efficiently.

Overview

Sitemaps and robots.txt are two fundamental tools for guiding search engine crawlers. Sitemaps tell crawlers which pages exist and how important they are. Robots.txt tells crawlers which parts of your site they may or may not access. RakuWP gives you full control over both from the panel.

XML Sitemaps

Accessing sitemaps

  1. Go to SEO in the sidebar and select your site.
  2. Click the Config tab, then select Sitemaps.

What sitemaps are generated

RakuWP generates XML sitemaps for each content type on your WordPress site: posts, pages, categories, tags, and author archives. The sitemaps page shows the available sitemaps with their entry counts.
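For illustration, a sitemap index tying per-type sitemaps together typically looks like the fragment below. The filenames and domain are placeholders; the exact paths the plugin emits may differ.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://example.com/sitemap-posts.xml</loc></sitemap>
  <sitemap><loc>https://example.com/sitemap-pages.xml</loc></sitemap>
  <sitemap><loc>https://example.com/sitemap-categories.xml</loc></sitemap>
  <sitemap><loc>https://example.com/sitemap-tags.xml</loc></sitemap>
  <sitemap><loc>https://example.com/sitemap-authors.xml</loc></sitemap>
</sitemapindex>
```

Each child sitemap then lists the individual URLs for that content type.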

News Sitemap

For news publishers, you can enable the News Sitemap toggle. This generates a separate sitemap following the Google News specifications, including only articles published within the last 48 hours. Google strongly recommends a News sitemap for publishers who want their articles to appear in Google News results.
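A single entry in a Google News sitemap looks roughly like this (all values below are placeholders; consult the Google News sitemap specification for the full schema):

```xml
<url>
  <loc>https://example.com/breaking-story</loc>
  <news:news>
    <news:publication>
      <news:name>Example Times</news:name>
      <news:language>en</news:language>
    </news:publication>
    <news:publication_date>2024-05-01T09:30:00+00:00</news:publication_date>
    <news:title>Example Breaking Story</news:title>
  </news:news>
</url>
```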

How sitemaps work

The plugin generates the sitemap XML dynamically on your WordPress site. Search engines discover the sitemap through the robots.txt file (which includes a Sitemap: directive) or when you submit it manually in Google Search Console or Bing Webmaster Tools.
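The Sitemap: directive mentioned above is a single line in robots.txt pointing at the sitemap's absolute URL, for example (domain and path are placeholders):

```
Sitemap: https://example.com/sitemap.xml
```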

Robots.txt

Accessing the robots.txt editor

  1. Go to SEO > Config > Robots.txt.

Editing rules

The visual editor lets you manage robots.txt rules without editing the file manually. Each rule consists of:

  • User-agent: The bot the rule applies to (* for all bots, or a specific name like Googlebot).
  • Disallow: Paths the bot should not crawl (e.g., /wp-admin/).
  • Allow: Paths the bot may crawl, even within a disallowed directory.
  • Crawl-delay: Seconds a bot should wait between requests (honored by some bots, such as Bingbot; Googlebot ignores it).
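Put together, a rule set using these four fields produces a robots.txt file like the following (a typical WordPress-oriented example, not the plugin's default output):

```
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
Crawl-delay: 10
```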

Syncing with WordPress

When you save robots.txt rules in the panel, they are synced to your WordPress site. The plugin intercepts WordPress's default robots.txt output and replaces it with your custom rules. This means you control robots.txt entirely from the RakuWP panel.

Best practices

  • Always ensure your sitemap URL is referenced in robots.txt.
  • Do not block CSS and JS files in robots.txt, as Google needs them to render pages correctly.
  • Use Disallow: /wp-admin/ but Allow: /wp-admin/admin-ajax.php for WordPress sites.
  • Submit your sitemap to both Google Search Console and Bing Webmaster Tools for faster discovery.
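One way to sanity-check rules like the /wp-admin/ pattern above before publishing them is Python's standard-library robots.txt parser. The URLs below are placeholders; note that this parser applies the first matching rule, so the Allow line is listed before the broader Disallow.

```python
from urllib import robotparser

# Hypothetical robots.txt content following the best practices above.
rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The admin area is blocked, but admin-ajax.php stays crawlable.
print(rp.can_fetch("*", "https://example.com/wp-admin/"))                # False
print(rp.can_fetch("*", "https://example.com/wp-admin/admin-ajax.php"))  # True
print(rp.can_fetch("*", "https://example.com/blog/hello-world"))         # True
```

This is only a local check of rule logic; always verify the live file with Google Search Console's robots.txt report as well.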