How To Create Robots.txt File & Sitemap.xml File For SEO

Learn how to create a robots.txt file and a sitemap.xml file for SEO purposes. Watch this video to learn how to take advantage of these methods for better Google crawling.

Brought to you by our SEO services. In SEO circles, this method of controlling which pages should be indexed by the Google search engine is called PageRank sculpting or link sculpting.

Please NOTE: do NOT include the lone forward slash / by accident, because doing so directs user agents NOT to index your entire website (I’ve seen it happen). So whatever you do, do NOT use it like this:

User-Agent: Googlebot
Disallow: /
I repeat: do NOT place the lone forward slash / (see above), because that tells Googlebot NOT to index your entire website (meaning it will de-index your entire site, which no webmaster would want).
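For contrast, here is a sketch of a safe robots.txt that blocks only a single directory (the /private/ path is just a placeholder) while leaving the rest of the site crawlable:

```
User-Agent: Googlebot
Disallow: /private/

User-Agent: *
Disallow:
```

An empty Disallow line means "nothing is blocked", so all other crawlers can still index the whole site.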

So as far as the internet is concerned, you have the option to control which of your web pages you want indexed by Google, which is surely a good thing to consider for your online business.

You can read more about blocking certain webpages using the robots.txt file in Google’s guidelines for webmasters here:
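If you want to sanity-check your rules before uploading, Python’s standard library ships a robots.txt parser. This is a minimal sketch (the /private/ path and example.com URLs are placeholders, not rules from this tutorial) that confirms a rule blocks what you expect:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content: blocks only /private/ for Googlebot
rules = [
    "User-agent: Googlebot",
    "Disallow: /private/",
]

parser = RobotFileParser()
parser.parse(rules)  # parse() accepts the file's lines as a list

# can_fetch(user_agent, url) answers: may this crawler fetch this URL?
print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))         # True
```

The same check with `Disallow: /` would return False for every URL on the site, which is exactly the mistake warned about above.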

XML sitemaps are a way to tell Google about web pages on your site that it might otherwise not discover. Simply head over to:

The key points that I want to draw your attention to are: web crawlers (also known as web spiders) usually discover web pages by following hyperlinks from one page to another across the internet. Sitemaps supplement that data, allowing crawlers that support Sitemaps to become aware of all the URLs in the Sitemap you create, and to learn about those URLs using the associated metadata. Read more about the Sitemap standard here:
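To make that concrete, here is a minimal sitemap.xml sketch following the sitemaps.org 0.9 schema (the URL and date are placeholders). Only the loc tag is required; the others are optional metadata:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-01</lastmod>   <!-- optional: last modification date -->
    <changefreq>weekly</changefreq> <!-- optional: a hint, not a command -->
    <priority>1.0</priority>        <!-- optional: relative to your own pages only -->
  </url>
</urlset>
```

Add one url block per page you want Google to know about.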

For in-depth reading (most webmasters can skip this) you can visit:

You can even use other methods instead of the robots.txt file for Googlebot. Visit this page to learn more about the Robots meta tag and X-Robots-Tag HTTP header specifications:
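As a sketch of those two alternatives: the Robots meta tag goes inside a page’s HTML, while X-Robots-Tag is an HTTP response header your server sends (handy for non-HTML files like PDFs):

```html
<!-- Page-level directive, placed inside <head> -->
<meta name="robots" content="noindex, nofollow">
```

```
# HTTP response header equivalent (set in your server configuration)
X-Robots-Tag: noindex
```

Both tell crawlers to keep that single page (or file) out of the index without touching robots.txt.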


[shortcode2]

18 thoughts on “How To Create Robots.txt File & Sitemap.xml File For SEO”

  1. Hi, I am still learning, but Google removes underscores, so those URLs are ineffective as part of Google’s 200 SEO criteria. I had to start redirecting to new URLs with dashes. Pointless missing out on an easy bonus point.

  2. G’day mate, I am not familiar with web.com and how their CMS works. I have briefly looked at their site for any documentation but couldn’t find it. I encourage you to get in touch with their support team to see if they allow you to add one. If you have access to the hosting back-end, then I can assist you to create one and also to upload it. Let me know how you go, and thanks for stopping by.

  3. Yes, you save the file as plain text, so what is important is the file extension you save it as. Meaning, simply create your sitemap using a plain text editor of your choice, and once you place all the details of your URLs within the file, simply save it as:

    sitemap.xml

    making sure that the extension is .xml,
    then simply upload this file inside the public_html folder of your hosting account using any FTP program.
    Don’t forget to add your new sitemap to your Google Webmaster Tools account; I have many videos which explain how to add and also verify sitemaps. Here’s one video I think will be of interest to you.
    http://youtu.be/DZHYQ0otfBk
    Hope this helps, happy rankings

  4. FEEDBACK..
    Please check your sound next time you make a video. Your voice is understandable, but the volume of your recording is very low. Perhaps check your mic volume, or speak closer to the mic?

  5. How is this even done? You need access to the back office of the website? You save the robots.txt file and then transfer it to the website? Do you have any videos where you do this?

  6. Thanks for your great videos. I’m having a problem submitting my sitemap which I have not had before and don’t know how to resolve. On Google Webmaster Tools I get the message:

    Sitemap contains urls which are blocked by robots.txt.

    Can you help?

  7. Awesome tutorials! I learned tons of great info. Btw, I noticed that you’re using Yoast SEO to generate your sitemap; is there a way to set priorities using the Yoast plugin? Thanks

  8. When I try to create one, it shows a message like this:
    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php
    Why does the XML sitemap generator create errors when I submit my website URL?

  9. Very nice video Rankya!

    Is there any disadvantage to having different pages in the XML sitemap with the same priority? And what kind of advantage does a higher priority bring to my page, exactly?

  10. Hey hey, just a quick question. I used your robots.txt guide. When I test my sitemap on Google Webmaster, I get 7 warning messages with the following: Sitemap contains URLs which are blocked by robots.txt. I have looked everywhere on the net for someone with the same problem but can’t seem to find a solution. Would you have any ideas?

Leave a Reply