Fixing the robots.txt file in Blogspot (Blogger) is essential for proper indexing and crawling of your site. Here's how you can do it:
Access Blogger Settings:
Log in to your Blogger account.
Go to the Settings section and scroll down to Crawlers and Indexing.
Enable Custom robots.txt:
Turn on the Custom robots.txt option.
Click on the Edit button to modify the file.
Customize Your File:
Add the following code as a starting point:
    User-agent: *
    Disallow: /search
    Allow: /
    Sitemap: https://yourblog.blogspot.com/sitemap.xml

Replace yourblog.blogspot.com with your actual blog URL.
Save Changes:
After editing, save the changes to apply the new robots.txt file.
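Once saved, Blogger serves the file at the root of your blog, so you can fetch it to confirm the live copy matches what you entered. Here is a minimal Python sketch; the blog address is a placeholder for your own:

    import urllib.request

    # Placeholder address: replace with your own blog's URL.
    url = "https://yourblog.blogspot.com/robots.txt"

    # Fetch and print the robots.txt Blogger is currently serving,
    # so you can compare it with the text you saved in Settings.
    with urllib.request.urlopen(url) as response:
        print(response.read().decode("utf-8"))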
Verify in Google Search Console:
Use the URL Inspection tool in Google Search Console to confirm that the pages you want indexed are crawlable and that /search URLs are reported as blocked by robots.txt.
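Beyond Search Console, you can also test specific URLs against the live file with Python's standard-library robots.txt parser. This is a minimal sketch; the blog address and the test paths are placeholders:

    import urllib.robotparser

    # Placeholder blog address: replace with your own.
    base = "https://yourblog.blogspot.com"

    # Load the live robots.txt from the blog.
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(base + "/robots.txt")
    rp.read()

    # A normal post should be crawlable; /search pages should be blocked.
    print(rp.can_fetch("*", base + "/2024/01/example-post.html"))  # expected: True
    print(rp.can_fetch("*", base + "/search/label/news"))          # expected: False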