Robots.txt Generator & Validator
Generate and validate robots.txt files with real-time syntax checking. Free tool with pre-built templates, SEO best practices, and instant validation.
Best Practices
- ✓ Always include a User-agent: * directive
- ✓ Add a Sitemap directive with the full absolute URL
- ✓ Don't block CSS/JS files (Google needs them to render your pages)
- ✓ Use * wildcards for pattern matching (e.g. Disallow: /search* blocks all matching subpaths)
- ✓ Test in Google Search Console after deployment
- ! robots.txt does NOT prevent indexing — blocked URLs can still appear in search results if linked externally; use a noindex meta tag (on a crawlable page) instead
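The practices above can be combined into a minimal file like this sketch (the domain and the /admin/ path are placeholders):

```txt
# Apply to all crawlers
User-agent: *
# Block a hypothetical private area, but keep its assets crawlable
Disallow: /admin/
Allow: /admin/assets/

Sitemap: https://www.example.com/sitemap.xml
```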
Deployment Instructions
Next.js
Place a static file at /public/robots.txt, or create /app/robots.ts (App Router) to generate it dynamically
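A dynamic /app/robots.ts might look like the sketch below. The domain and the /admin/ path are placeholder assumptions; in a real project you would also import the MetadataRoute type from 'next' and annotate the return type as MetadataRoute.Robots (omitted here to keep the sketch self-contained):

```typescript
// Sketch of /app/robots.ts (Next.js App Router).
// Next.js serves the returned object as plain text at /robots.txt.
export default function robots() {
  return {
    rules: {
      userAgent: '*',
      allow: '/',
      disallow: '/admin/', // hypothetical private area
    },
    sitemap: 'https://www.example.com/sitemap.xml', // placeholder domain
  };
}
```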
WordPress
Use the Yoast SEO plugin, or upload via FTP to the site root (public_html). The file must be accessible at yoursite.com/robots.txt
Static Sites
Place robots.txt in the root folder before build/deployment. For Vercel or Netlify, put it in the public or static folder so it is copied to the site root at build time
How to Use This Tool
Choose a Template or Start Blank
Select a pre-built template (WordPress, Next.js, eCommerce) or start with a blank canvas to build custom rules.
Configure User-Agents & Rules
Add user-agent directives (Googlebot, Bingbot, or * for all), then specify Disallow/Allow paths. Add a sitemap URL and an optional Crawl-delay (note that Googlebot ignores Crawl-delay).
Validate & Deploy
Real-time syntax validation highlights errors. Copy to clipboard or download as robots.txt. Upload to your site's root directory.
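The kind of line-by-line check the validator performs can be sketched as below. This is a simplified illustration, not the tool's actual implementation — the directive list and error messages are assumptions:

```typescript
// Directives this sketch accepts; real validators recognize a few more.
const KNOWN_DIRECTIVES = ['user-agent', 'disallow', 'allow', 'sitemap', 'crawl-delay'];

// Returns a human-readable error message per invalid line.
function validateRobotsTxt(content: string): string[] {
  const errors: string[] = [];
  content.split('\n').forEach((line, i) => {
    const trimmed = line.trim();
    if (trimmed === '' || trimmed.startsWith('#')) return; // skip blanks and comments
    const colon = trimmed.indexOf(':');
    if (colon === -1) {
      errors.push(`Line ${i + 1}: missing ':' separator`);
      return;
    }
    const directive = trimmed.slice(0, colon).trim().toLowerCase();
    if (!KNOWN_DIRECTIVES.includes(directive)) {
      errors.push(`Line ${i + 1}: unknown directive "${directive}"`);
    }
  });
  return errors;
}
```

For example, validateRobotsTxt('User-agent: *\nDisallow: /admin/') returns an empty array, while a line with no colon or an unrecognized directive produces one error each.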