How to Add Robots.txt in Blogger

Today I would like to share one of the most important factors in running a successful blog. Success on Blogger mainly depends on creating good content, but you should also set up a custom Robots.txt file in Blogger to improve your SEO. A correct Robots.txt file lets search engines crawl your blog successfully.

What is Custom Robots.txt in Blogger?

In Blogger, the Custom Robots.txt feature allows you to customize the instructions for search engine robots that crawl and index your blog.

By default, Blogger provides a basic robots.txt file that is optimized for Blogger blogs. However, you may need to customize the robots.txt file if you have specific crawling or indexing requirements.

How to Add Robots.txt in Blogger?

  1. Go to your Blogger dashboard and click on the "Settings" tab.
  2. Scroll down to the "Crawlers and indexing" section and click on the "Custom robots.txt" link.
  3. In the text box that appears, paste the following code:
User-agent: Mediapartners-Google
Disallow:


User-agent: *
Disallow: /search
Allow: /


Sitemap: https://www.money2fast.com/sitemap.xml
Sitemap: https://www.money2fast.com/sitemap-pages.xml
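Line by line: Mediapartners-Google is Google's AdSense crawler, and the empty Disallow line lets it reach every page so ads can be matched to your content. The second block applies to all other crawlers: Disallow: /search keeps them out of Blogger's search and label result pages (which are duplicate content), while Allow: / permits everything else. The Sitemap lines tell crawlers where your sitemaps live. If you want to see how crawlers will interpret these rules, here is a minimal sketch using Python's standard urllib.robotparser (www.money2fast.com is only this article's placeholder domain):

# A quick sanity check of the rules above, using only the Python standard library.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Ordinary crawlers may fetch posts but not /search (label and search-result) pages:
print(rp.can_fetch("Googlebot", "https://www.money2fast.com/2024/01/post.html"))  # True
print(rp.can_fetch("Googlebot", "https://www.money2fast.com/search/label/seo"))   # False

# The AdSense crawler is allowed everywhere:
print(rp.can_fetch("Mediapartners-Google", "https://www.money2fast.com/search/label/seo"))  # True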


  4. Replace www.money2fast.com with your own blog's URL.
  5. After saving the code, enable custom robots header tags and set them as follows:
  • Home page tags: choose "all" and "noodp".
  • Archive and search page tags: choose "noindex", "nofollow", "noarchive", and "noodp".
  • Post and page tags: choose "all" and "noodp".
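These header-tag choices control the robots meta tag that Blogger prints into each page's head section; the noindex on archive and search pages keeps those duplicate views out of search results. To confirm what a published page actually serves, you can run a small sketch like this (the URL is a placeholder; use one of your own pages):

# Print any robots meta tags a live page serves, to verify the header-tag settings.
import re
import urllib.request

url = "https://www.money2fast.com/"  # placeholder; replace with your blog's URL
html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
for tag in re.findall(r"<meta[^>]*name=['\"]robots['\"][^>]*>", html, re.IGNORECASE):
    print(tag)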
Important.
  • Be careful when editing your robots.txt file. If you make a mistake, it could prevent search engines from crawling your entire blog.
  • If you are not comfortable editing your robots.txt file, you can ask a Blogger expert to do it for you.
  • You may need to wait a few days for the changes you make to your robots.txt file to take effect.
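While you wait, you can check at any time whether the file Blogger is serving already matches what you saved, for example with a one-off fetch like this (placeholder URL again):

# Print the robots.txt file Blogger is currently serving.
import urllib.request

url = "https://www.money2fast.com/robots.txt"  # placeholder; use your blog's URL
with urllib.request.urlopen(url) as resp:
    print(resp.read().decode("utf-8"))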

How to Fix Robots.txt Error in Blogger?

The "Blocked by Robots.txt" message in Blogger indicates that search engine crawlers, also known as robots or spiders, are unable to access specific pages or files on your blog due to restrictions set in the robots.txt file. This file serves as a set of instructions for search engines, informing them which pages to crawl and which to avoid.

To fix this issue and allow search engines to properly index your blog, follow these steps.

1- Disable Custom Robots.txt.
  • Navigate to your Blogger dashboard and click on the "Settings" tab.
  • Under the "Crawlers and indexing" section, locate the "Enable custom robots.txt" option.
  • Toggle the "Enable custom robots.txt" switch to the "OFF" position. This will restore the default robots.txt file, which allows search engines to crawl your blog pages.
2- Submit Sitemap to Google Search Console.
  • Access the Google Search Console and sign in using your Google account.
  • Select your blog property from the list of sites you manage.
  • In the left-hand menu, click on "Indexing" and then choose "Sitemaps."
  • In the "Add a new sitemap" field, enter the URL of your blog's sitemap, which is typically "sitemap.xml" for Blogger blogs.
  • Click on "Submit" to notify Google Search Console about your sitemap.
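Blogger serves the posts sitemap at /sitemap.xml and a separate /sitemap-pages.xml for static pages, as in the code earlier. Before submitting, you can confirm the sitemap is reachable and count its entries with a short sketch like this (placeholder URL as before):

# Fetch the sitemap and count its entries before submitting it to Search Console.
import urllib.request
import xml.etree.ElementTree as ET

url = "https://www.money2fast.com/sitemap.xml"  # placeholder; use your blog's URL
root = ET.fromstring(urllib.request.urlopen(url).read())

# Handles both plain sitemaps (<url> entries) and sitemap indexes (<sitemap> entries).
ns = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
entries = root.findall(ns + "url") + root.findall(ns + "sitemap")
print(len(entries), "entries found in", url)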
3- Request Indexing of Blocked URLs.
  • Within the Google Search Console, navigate to "Coverage" and then select "Blocked by robots.txt."
  • Identify the specific URLs that are blocked by robots.txt.
  • For each blocked URL, click on the "Inspect" button.
  • In the inspection window, click on the "Request indexing" button. This will notify Google Search Console to recrawl and index the blocked URL.
4- Monitor Indexing Status.
  • Regularly check the "Coverage" section in Google Search Console to monitor the indexing status of your blog pages.
  • If any URLs remain blocked, repeat steps 2 and 3 to request indexing for those URLs.

Benefits of Robots.txt in Blogger

Here are some specific benefits of using robots.txt in Blogger.
  1. Improved website performance: By preventing search engines from crawling pages that are not important to your website, you can help to improve your website's performance.
  2. Improved SEO: By preventing robots from crawling and indexing unimportant or duplicate pages, you can help to ensure that your website's most important pages are the ones being ranked and displayed in search results.
  3. Improved search engine ranking: By preventing search engines from crawling duplicate content, you can help to improve your website's search engine ranking.
  4. Better use of crawl budget: Search engines have a limited crawl budget, which is the number of pages they will crawl per website. By keeping them away from unimportant pages, you help to ensure that enough crawl budget remains for your most important pages.
  5. Control over what gets crawled: You can prevent search engines from crawling and indexing pages that are not important to your website. For example, you might not want search engines to crawl your Blogger images or CSS files.
  6. Fewer duplicate content issues: If you have multiple versions of the same content on your website, you can use robots.txt to tell search engines which version they should crawl and index.
  7. Privacy protection: You can use a robots.txt file to prevent robots from crawling and indexing sensitive or private pages on your website.
Overall, robots.txt is a valuable tool that can help you improve your Blogger website's performance, search engine ranking, and crawl budget.
By: money2fast