Robots (XIII)
For every web, we create a simple robots.txt
file that allows web crawlers to move freely around the entire site.
/var/www/<web>/robots.txt:
--------------------------
User-agent: *
Disallow:
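Rather than writing the file by hand for each web, the step above can be
scripted. The sketch below is a minimal example, assuming each web is a
directory directly below /var/www (the demo paths and the example.org
directory are illustrative, not part of the original setup):

```shell
#!/bin/sh
# make_robots DIR: write a permissive robots.txt into every
# web directory found directly below DIR.
make_robots() {
    for web in "$1"/*/; do
        [ -d "$web" ] || continue
        # A blank Disallow means: nothing is off-limits to crawlers.
        printf 'User-agent: *\nDisallow:\n' > "${web}robots.txt"
    done
}

# Demo against a throwaway tree; on the server you would simply
# run: make_robots /var/www
demo=$(mktemp -d)
mkdir "$demo/example.org"
make_robots "$demo"
cat "$demo/example.org/robots.txt"
```

An empty Disallow line is the standard way to permit everything; to block
crawlers entirely you would use "Disallow: /" instead.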