As Yahoo, Ask, and MSN have all reported this morning, they (and Google) have joined together to make sitemaps autodiscoverable via the robots.txt file. As long as you tell the search engines where your sitemap lives (in the robots.txt file), they will be able to find it and use it. Simply add the following line to your robots.txt file:
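Per the sitemaps.org protocol, the line takes this form (substitute your own Sitemap URL for the placeholder):

```
Sitemap: <sitemap_location>
```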
The <sitemap_location> should be the complete URL to the Sitemap, such as: http://www.example.com/sitemap.xml
This directive is independent of the user-agent line, so it doesn't matter where you place it in your file. If you have a Sitemap index file, you can include the location of just that file. You don't need to list each individual Sitemap that the index file references.
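To illustrate why placement doesn't matter, here's a minimal Python sketch of how a crawler could scan a robots.txt body for Sitemap lines. The `extract_sitemaps` helper is hypothetical, just for illustration; the directive name is matched case-insensitively and picked up anywhere in the file, independent of any User-agent group:

```python
def extract_sitemaps(robots_txt: str) -> list[str]:
    """Return every Sitemap URL declared in a robots.txt body.

    The Sitemap directive is case-insensitive and may appear anywhere
    in the file, outside of any User-agent group.
    """
    sitemaps = []
    for line in robots_txt.splitlines():
        # Strip trailing comments, then surrounding whitespace.
        line = line.split("#", 1)[0].strip()
        if line.lower().startswith("sitemap:"):
            # Split only on the first colon so the URL's own
            # "http:" is left intact.
            sitemaps.append(line.split(":", 1)[1].strip())
    return sitemaps

robots = """\
User-agent: *
Disallow: /private/
Sitemap: http://www.example.com/sitemap.xml
"""
print(extract_sitemaps(robots))  # ['http://www.example.com/sitemap.xml']
```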
You can get all the facts about this and everything you need to know about sitemaps at sitemaps.org.
7 thoughts on “Sitemap autodiscovery for the Big 4 via robots.txt”
Are you at the conference today?
No, I couldn’t make it to the conference.
Are you going to SMX in June? I’ll be there.
Probably not, but if I do, I’ll let ya know.
Here’s the direct URL to the FAQ which describes the implementation: http://www.sitemaps.org/protocol.html#informing
Dazzlin, never in Europe? I want to have the opportunity to meet the best SEO girl in the world….
🙂 Thanks for this wonderful blog.
Simleon, I’d never rule Europe out completely, but it’s not very likely. Well, at least not JUST for business reasons. If I manage to get there just for fun, I could throw business in as well. I’ll be sure to let you know if it ever happens. 🙂