  • #9917 Reply

    John
    Participant

    What is the use of robots.txt?

    #9931 Reply

    Virsen
    Participant

    Robots.txt uses directives such as User-agent and Disallow. With it, website owners tell crawlers which pages of the site they should not visit. The path a crawler takes through a website effectively starts with robots.txt.
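    As a minimal illustration (the folder name here is just an example), a robots.txt placed at the site root might look like this:

        User-agent: *
        Disallow: /private/

    The User-agent line names the crawler the rules apply to (* means all crawlers), and each Disallow line lists a path that crawler is asked not to visit.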

    #10733 Reply

    rogerdavid
    Participant

    Robots.txt is generally used to stop web crawlers or spiders from crawling parts of a website. Website owners use it where they specifically don’t want certain pages to be crawled. This is also known as the Robots Exclusion Protocol.
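    For instance, a robots.txt that asks every crawler to stay off the entire site would contain:

        User-agent: *
        Disallow: /

    Leaving the Disallow value empty instead means nothing is blocked. Note that the protocol is advisory: well-behaved crawlers honour it, but it is not an access-control mechanism.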

    #15395 Reply

    WebXeros Solutions
    Participant

    The most common usage of robots.txt is to ban crawlers from visiting private folders or content that gives them no additional information. This is done primarily to save the crawler’s time: bots crawl on a budget, so if you ensure they don’t waste time on unnecessary content, they will crawl your site deeper and more quickly.

    Some people choose to save bandwidth and allow access only to the crawlers they care about (e.g. Google, Yahoo, and MSN).
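    A sketch of that approach, assuming Googlebot is the only crawler you care about (other crawler names would vary), could be:

        User-agent: Googlebot
        Disallow:

        User-agent: *
        Disallow: /

    The empty Disallow under Googlebot allows it everywhere, while the wildcard group asks all other crawlers to stay away.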

    #15784 Reply

    social hub media

    Robots.txt is a text file that tells web robots which pages on your site to crawl and which pages not to crawl.
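    Many major crawlers also honour an Allow directive, so you can open a single page inside an otherwise blocked folder (the paths here are hypothetical):

        User-agent: *
        Disallow: /archive/
        Allow: /archive/index.html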
