SEO Robots Generator - BestWebSites

Automatically generate your robots.txt data.
Robots.txt allows you to exclude pages and directories from search engines. Use robots.txt to exclude anything that should not be crawled, such as admin files, bin files, and the DotNetNuke login, register, terms and privacy controls. Excluding these files firstly prevents files that should not be publicly exposed from being indexed, and secondly prevents duplicate content issues in DotNetNuke.
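A generated robots.txt is a plain text file served from the root of the website. As a minimal sketch of the kind of output this generator produces (the paths below are illustrative only, not your actual site structure):

    # Rules apply to all crawlers
    User-agent: *
    # Block a system folder and the DotNetNuke login control
    Disallow: /bin/
    Disallow: /Default.aspx?ctl=Login

The User-agent line states which crawlers the rules apply to (* means all), and each Disallow line is a URL prefix that compliant crawlers should not request.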

Tick all folders that you want excluded by your robots.txt:
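Each ticked folder is written out as its own Disallow entry in the generated file. For example, ticking the Install, Providers and DesktopModules folders (hypothetical selections) would add lines such as:

    Disallow: /Install/
    Disallow: /Providers/
    Disallow: /DesktopModules/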


For each page on the website, exclude these controls:
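In DotNetNuke these controls are reached through the ctl parameter of every page's URL, so each page exposes its own login, register, terms and privacy URLs; this is the source of the duplicate content mentioned above. Assuming human-friendly URLs and a hypothetical Home page with tab ID 55, the generated entries for that page would look something like:

    Disallow: /Home/tabid/55/ctl/Login/Default.aspx
    Disallow: /Home/tabid/55/ctl/Register/Default.aspx
    Disallow: /Home/tabid/55/ctl/Terms/Default.aspx
    Disallow: /Home/tabid/55/ctl/Privacy/Default.aspx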


This website uses:
Select the type of friendly URLs used on this website. Note that selecting the wrong friendly URL type will result in an incorrectly generated robots.txt.
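The friendly URL type determines the form of the paths written into robots.txt, which is why the wrong setting produces entries that do not match your real URLs. As an illustration only (the page name Home and tab ID 55 are hypothetical, and exact formats depend on your DotNetNuke version and URL provider configuration), the same login exclusion could take either of these forms:

    # Friendly URLs disabled (query-string URLs)
    Disallow: /Default.aspx?tabid=55&ctl=Login
    # Human-friendly URLs
    Disallow: /Home/tabid/55/ctl/Login/Default.aspx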