Web Hosting - robots.txt

By default, the Web Hosting service places a robots.txt file containing a crawl-delay directive in the /httpdocs folder of each Shared Hosting domain:

User-agent: *
Crawl-delay: 10

This robots.txt file helps stagger the load that search-engine crawlers such as Google place on individual web sites. It is especially helpful when multiple bots may be indexing a site at the same time, which can make the site less responsive. Staggering crawler traffic also increases overall server capacity, since most sites are indexed frequently.
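As a minimal sketch of how a well-behaved crawler interprets the rules above, Python's standard-library robots.txt parser can read the crawl-delay directive (the snippet below embeds the default file shown earlier rather than fetching it from a live site):

```python
from urllib.robotparser import RobotFileParser

# The default rules shipped with the hosting service, embedded for illustration.
robots_txt = """\
User-agent: *
Crawl-delay: 10
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Crawlers that honor Crawl-delay wait this many seconds between requests.
print(parser.crawl_delay("*"))  # 10
```

A crawler honoring this directive makes at most one request every ten seconds, which is what spreads the indexing load over time.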

We recommend keeping this file in place with the crawl-delay directive, but feel free to edit it to suit your needs.
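For example, you might keep the crawl-delay while also excluding a directory from indexing and advertising a sitemap (the paths and domain below are hypothetical placeholders, not part of the default file):

User-agent: *
Crawl-delay: 10
Disallow: /private/
Sitemap: https://www.example.com/sitemap.xml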

If you have further questions, email

Keywords: search, google, bots, rate, crawl, engines, exclusion, security, allow, disallow, User-agent, sitemap, noindex
Doc ID: 62214
Owner: Jake S.
Group: DoIT Web Hosting
Created: 2016-03-25 10:21:02
Updated: 2024-03-18 07:14:40
Sites: DoIT Web Hosting