Blogger Sitemap & Robots.txt Generator
Generate sitemaps and robots.txt files optimized for Blogger / Blogspot blogs
Blog URL Required
Step-by-Step: Submit Sitemap to Google Search Console
- 1. Go to Google Search Console (search.google.com/search-console)
- 2. Select your Blogger blog property (or add it if not already added)
- 3. In the left sidebar, go to Sitemaps under "Indexing"
- 4. Click Add a new sitemap
- 5. Paste the sitemap path (e.g., sitemap.xml or sitemap-pages.xml)
- 6. Click Submit and wait for Google to process
- 7. Repeat for additional sitemaps (posts, pages, labels)
Sitemap Validation Check if your sitemap URL is valid
Blog URL Required
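The validation step above can be sketched as a simple format check. This is an illustrative sketch, not the tool's actual implementation — the function name and the list of accepted sitemap paths are assumptions:

```python
from urllib.parse import urlparse

# Known Blogger sitemap/feed path patterns (illustrative list)
VALID_SITEMAP_PATHS = ("/sitemap.xml", "/sitemap-pages.xml", "/feeds/posts/default")

def is_valid_sitemap_url(url: str) -> bool:
    """Rough format check: http(s) scheme, a host, and a known Blogger sitemap path."""
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https") or not parsed.netloc:
        return False
    return any(parsed.path == p or parsed.path.startswith(p) for p in VALID_SITEMAP_PATHS)

print(is_valid_sitemap_url("https://yourblog.blogspot.com/sitemap.xml"))    # True
print(is_valid_sitemap_url("https://yourblog.blogspot.com/search?q=test"))  # False
```

A check like this only verifies the URL's shape; confirming the sitemap actually exists still requires fetching it.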
Crawler Rules
Disallowed Paths Block crawlers from these paths
Sitemap Include in robots.txt
Crawl Delay Optional — not supported by Googlebot
Note: Crawl delay is not supported by Googlebot. It is supported by Bingbot, Slurp, and DuckDuckBot. Recommended values: 5-30 seconds.
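For reference, a crawl-delay rule scoped to the bots that honor it might look like the following (the delay values are illustrative, within the 5-30 second range recommended above):

```
User-agent: Bingbot
Crawl-delay: 10

User-agent: Slurp
Crawl-delay: 10

User-agent: DuckDuckBot
Crawl-delay: 10
```

Googlebot ignores these lines entirely; its crawl rate is managed by Google itself.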
Custom Directives Advanced — add your own rules
Common Blogger Robots.txt Templates
Standard / Recommended
Allow all crawlers, block /search. Best for most blogs.
Strict SEO
Block search, archive, preview, mobile redirect. Maximum SEO focus.
Open Access
Allow everything. No restrictions on any crawler.
Private / Block All
Block all crawlers. Useful for draft/private blogs.
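As a reference point, the Standard / Recommended template typically produces output along these lines (the tool's exact output may differ, and per the note further down, the Sitemap line should be omitted when pasting into Blogger's custom robots.txt editor):

```
User-agent: *
Disallow: /search
Allow: /

Sitemap: https://yourblog.blogspot.com/sitemap.xml
```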
How to Use This Tool Quick guide
Step-by-Step Instructions
- Enter your full Blogger blog URL (e.g., https://yourblog.blogspot.com or https://www.customdomain.com)
- Sitemap Tab: Click "Generate Sitemap URLs" to see all available sitemap URLs for your blog. Add labels for per-label sitemaps.
- Robots.txt Tab: Configure your crawler rules, disallowed paths, and options. Preview the generated robots.txt.
- Click the copy buttons to copy any URL or the full robots.txt content.
- Follow the instructions below to add these to your Blogger blog and Google Search Console.
How to Add in Blogger Implementation guide
Adding Sitemap to Google Search Console
- Open Google Search Console at search.google.com/search-console
- Verify ownership of your Blogger blog if you haven't already
- Go to Sitemaps in the left sidebar under "Indexing"
- Enter the sitemap path: sitemap.xml
- Click Submit. Repeat for sitemap-pages.xml if needed
Adding Custom Robots.txt in Blogger
- Go to Blogger Dashboard > Settings > Crawlers and indexing
- Find Custom robots.txt and click Enable
- Paste your generated robots.txt content into the text box
- Click Save changes
- Verify by visiting yourblog.blogspot.com/robots.txt
Important: Do NOT add the "Sitemap:" directive inside Blogger's custom robots.txt editor. Blogger handles sitemap declarations automatically. Only add the Sitemap line if you are using a custom domain setup.
Adding Custom Robots Header Tags in Blogger
- Go to Blogger Dashboard > Settings > Crawlers and indexing
- Find Custom robots header tags and click Enable
- Recommended settings:
- Homepage: all, noodp
- Archive and Search pages: noindex, follow
- Default for posts and pages: index, follow
Blogger Sitemap Guide Understanding Blogger's sitemap system
What is a Sitemap?
A sitemap is an XML file that lists all the URLs on your blog that you want search engines to crawl and index. It acts as a roadmap for search engine bots.
Blogger's Built-in Sitemaps
- sitemap.xml — Default sitemap showing up to 500 recent posts. This is the most commonly submitted sitemap.
- sitemap.xml?max-results=9999 — Full sitemap showing all posts (up to 9999). Useful for large blogs.
- sitemap-pages.xml — Sitemap for your static pages (About, Contact, Privacy Policy, etc.)
- feeds/posts/default — Atom feed of all posts. Can also serve as a secondary sitemap.
- feeds/posts/default/-/LABEL — Posts filtered by a specific label/category.
Sitemap Limits in Blogger
- Default sitemap shows only the latest 500 posts
- Use ?max-results=9999 to include more posts
- If your blog has over 500 posts, Google may not index all URLs from the default sitemap alone
- For very large blogs (over 9,999 posts), you may need to submit multiple sitemaps
- Blogger automatically generates and updates sitemaps — no plugin needed
Robots.txt Guide Understanding robots directives
What is Robots.txt?
Robots.txt is a text file at the root of your website that tells web crawlers which pages or sections they should or should not crawl. It's one of the first files search engines look for.
Common Robots.txt Directives
- User-agent: Specifies which crawler the rule applies to (e.g., Googlebot, *, Bingbot)
- Disallow: Tells the crawler NOT to access the specified path
- Allow: Explicitly allows a crawler to access a path (overrides Disallow)
- Sitemap: Points to the location of your XML sitemap
- Crawl-delay: Specifies a delay between requests (in seconds). Not supported by all crawlers.
Blogger-Specific Paths
- /search — Search results and label pages. Disallow to prevent duplicate content issues.
- /search?updated-min= — Archive pages by date. Disallow to reduce crawl waste.
- /p/ — Static pages. Allow for important pages (About, Contact), disallow if you have many low-value static pages.
- ?m=1 — Mobile version redirect parameter. Disallow to prevent mobile duplicate indexing.
- /feeds/ — RSS/Atom feeds. Usually fine to allow, but can disallow if preferred.
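Putting the paths above together, a stricter Blogger robots.txt could look like the following sketch (blocking /p/ or /feeds/ is optional and omitted here; the wildcard pattern for ?m=1 is supported by Googlebot and Bingbot but not every crawler):

```
User-agent: *
# Block search results, label pages, and date archives
Disallow: /search
# Block the mobile redirect parameter to avoid duplicate indexing
Disallow: /*?m=1
Allow: /
```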
Features List
Sitemap URL Generator
Auto-generate all Blogger sitemap URLs from your blog URL
Per-Label Sitemaps
Create sitemaps for individual blog labels/categories
GSC Submit URLs
Generate ready-to-submit URLs for Google Search Console
Robots.txt Builder
Configure crawler rules visually and generate robots.txt
Template Presets
One-click templates for common robots.txt configurations
Custom Directives
Add your own custom rules and directives
Sitemap Validator
Check if your sitemap URL follows the correct format
Copy to Clipboard
Copy individual URLs or full content with one click
Warnings System
Get notified about common robots.txt mistakes
Crawl Delay Config
Set crawl delay with recommended values
Mobile Responsive
Works on all devices — desktop, tablet, and mobile
No Dependencies
Runs entirely in-browser with zero external libraries
Frequently Asked Questions Click to expand
How often does Blogger update its sitemap?
Blogger automatically regenerates its sitemap whenever you publish, edit, or delete a post. Changes are typically reflected within a few minutes. You don't need to manually update or regenerate your sitemap.
Why is Google not indexing all my posts?
The default Blogger sitemap only includes up to 500 recent posts. If you have more posts, use sitemap.xml?max-results=9999 to include them. Also ensure your robots.txt isn't blocking important pages, and submit your sitemap in Google Search Console.
Should I disallow /search in robots.txt?
Yes, it's generally recommended for Blogger blogs. The /search path generates label pages and search results that can create duplicate content issues. Disallowing it helps focus crawl budget on your actual posts.
Can I use a custom domain with this tool?
Yes! This tool works with both blogspot.com URLs and custom domains. Simply enter your custom domain URL (e.g., https://www.myblog.com) and the tool will generate the appropriate sitemap and robots.txt URLs.
What's the difference between sitemap.xml and feeds/posts/default?
sitemap.xml is a proper XML sitemap optimized for search engines. feeds/posts/default is an Atom/RSS feed. While Google can consume both, the sitemap.xml format is preferred for search engine crawling. The feed is better for RSS readers.
Does crawl delay work with Google?
No. Googlebot does not support the Crawl-delay directive in robots.txt. Google controls its own crawl rate through Google Search Console settings. The crawl delay directive works with Bingbot, Slurp (Yahoo), and DuckDuckBot.
Can I add multiple sitemaps?
Yes, you can submit multiple sitemaps to Google Search Console. Common ones for Blogger include: sitemap.xml (posts), sitemap-pages.xml (static pages), and label-specific sitemaps for important categories.
Will changing my robots.txt affect my search rankings?
It can, both positively and negatively. Properly configured robots.txt can improve SEO by directing crawlers to important content. However, accidentally blocking important pages can cause them to be de-indexed. Always test changes carefully and monitor in Google Search Console.
Tips for Better SEO Best practices for Blogger blogs
Sitemap Tips
- Submit your sitemap to both Google Search Console and Bing Webmaster Tools
- Check sitemap status regularly in Search Console for crawl errors
- Use sitemap.xml?max-results=9999 if you have more than 500 posts
- Keep your blog's label structure organized for better per-label sitemaps
- Resubmit your sitemap after major changes to your blog structure
Robots.txt Tips
- Always check your robots.txt with the robots.txt report in Google Search Console (the standalone Robots Testing Tool has been retired)
- Never disallow your CSS or JS files — Google needs them to render your pages
- Use the "Disallow: /search" rule to prevent duplicate content from label pages
- Keep your robots.txt simple — complex rules can lead to unexpected blocking
- Verify your robots.txt by visiting yourblog.com/robots.txt after saving
General Blogger SEO Tips
- Write descriptive, keyword-rich post titles and meta descriptions
- Use proper heading hierarchy (H1, H2, H3) in your posts
- Optimize images with descriptive alt text and compressed file sizes
- Create internal links between related posts
- Enable custom robots header tags in Blogger Settings
- Use a responsive Blogger template for mobile-friendliness
- Submit your blog to blog directories and search engines
- Post consistently and focus on quality content
