

 

Setting a Custom Robots.txt File and Sitemap for Blogger - Get Indexed and Get Free Organic Traffic

How to set a custom robots.txt file and sitemap in Blogger to get your blog ready for search indexing


Hello, dear followers of zoom4display. In this video tutorial, I will explain in detail, for beginners, how to create a robots.txt file for Blogger, the correct way to create it, and why the robots file matters for getting your site's content indexed. First, let us explain what the robots.txt file means:




What is the Robots.txt file and why is it important?

In short, a robots.txt file is a file that directs search robots and spiders to the important pages on your site, the ones that should be indexed. It tells search spiders which pages matter most on your site and should appear in search results, and it keeps them away from unimportant pages that should not be crawled or shown on search engines.






Best custom robots.txt file.


User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://www.yoursite.blogspot.com/sitemap.xml



The robots.txt file is made up of three custom parts, each explained in detail here:





The first part is: User-agent: Mediapartners-Google

This rule is mainly intended for the advertisements on your blog, specifically Google AdSense. After you add this code, the AdSense crawler can analyze your blog to understand your site's content and serve ads that better match it.



The second part is: User-agent: *

This part of the code controls which paths crawlers are allowed to crawl, and which are blocked, within your site's content. For example, if you want all of your site's content to be indexed, add "Allow: /" immediately after this line, as in the robots.txt file attached to this explanation; if you want to block crawling of your site's internal search pages, add the command "Disallow: /search".



The third part is: Sitemap

This is the file that lists your site's entire content and is continuously updated. Search spiders check it every time they visit your site, so whenever you add a new post it can be found and indexed.
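To see how crawlers interpret these three parts together, here is a minimal, optional Python sketch using the standard urllib.robotparser module; the blog address and the example post path are placeholders for illustration only, so replace them with your own.

from urllib.robotparser import RobotFileParser

# The same rules as the custom robots.txt above, pasted inline so the
# sketch runs without fetching anything over the network.
RULES = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://www.yoursite.blogspot.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(RULES.splitlines())

blog = "http://www.yoursite.blogspot.com"

# Ordinary posts are allowed for every crawler ("Allow: /") ...
print(parser.can_fetch("*", blog + "/2024/01/my-first-post.html"))            # True

# ... while internal search-result pages are blocked ("Disallow: /search").
print(parser.can_fetch("*", blog + "/search?q=recipes"))                      # False

# The AdSense crawler (Mediapartners-Google) is not restricted at all.
print(parser.can_fetch("Mediapartners-Google", blog + "/search?q=recipes"))   # True

Running this should print True, False, True: posts are crawlable, internal search pages are blocked, and the AdSense crawler is unrestricted.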





Explanation of adding a custom robots.txt file to Blogger blogs




The name of the sitemap file used to register your blog's sitemap in Google Search Console:

sitemap.xml
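If you want to confirm what that sitemap actually contains before submitting it, this short Python sketch downloads the file and lists its <loc> entries (on a Blogger blog these can be post URLs or links to paginated child sitemaps). The address is a placeholder, so replace it with your own blog before running it.

import urllib.request
import xml.etree.ElementTree as ET

# Placeholder address - replace with your actual Blogger blog URL.
SITEMAP_URL = "http://www.yourblogname.blogspot.com/sitemap.xml"

with urllib.request.urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)

# Sitemap files put every tag in the standard sitemaps.org namespace.
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
locs = [loc.text for loc in tree.iter(NS + "loc")]

print(f"{len(locs)} <loc> entries found in the sitemap")
for url in locs[:5]:
    print(url)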


Link to test whether the robots.txt file you added is working:

http://www.yourblogname.blogspot.com/robots.txt
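If you prefer to check from a script instead of the browser, this small Python sketch simply downloads the file from that link and prints it; again, the blogspot address is a placeholder for your own blog.

import urllib.request

# Placeholder address - replace with your actual Blogger blog URL.
ROBOTS_URL = "http://www.yourblogname.blogspot.com/robots.txt"

with urllib.request.urlopen(ROBOTS_URL) as response:
    body = response.read().decode("utf-8")
    # HTTP 200 plus the rules you added means the custom robots.txt is live.
    print(f"HTTP {response.status}")
    print(body)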




Thank you for watching. That is everything in today's Blogger lesson on how to get your Blogspot blog ready for Google search engine indexing. I hope you find this tutorial useful and that it helps you. Please leave a comment with any opinion, suggestion, or question, subscribe and hit the like button, and don't forget to visit my blog for the best tech and life-hack articles and tutorials. Thank you, and see you in the next video.





About the Author

Hello, my name is Khalid. I am a blogger, developer, and the creator of the zoom4display and zoom4diet blogs. As you can see, I am very interested in and passionate about repairing things and finding new ideas to make our daily life easier at home, in the backyard, at work, and in tech i…
