A robots.txt file mainly serves to assign each crawler its own set of rules, distinguishing it from the other agents in a group: the file is parsed and instructs the robot as to which pages are not to be crawled. Note that a search engine crawler may still retain a cached copy of pages it fetched before a rule was added.
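As a minimal sketch of how a crawler parses robots.txt rules, Python's standard library provides `urllib.robotparser`; the rules and URLs below are hypothetical examples, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Parse an example robots.txt (normally fetched from a site's root).
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# The parser tells the robot which pages are not to be crawled.
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))         # True
```

In practice a crawler would fetch the live file with `rp.set_url(...)` and `rp.read()` before checking each URL it intends to visit.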