Robots.txt is a text file that tells search engine crawlers which parts of your website they should and shouldn’t crawl. Located in your website’s root directory, this file acts as a set of rules that well-behaved crawlers follow. Through robots.txt, you can ask crawlers to skip certain pages, such as admin areas or duplicate content, helping search engines focus their crawl budget on your important pages. Note that robots.txt is a directive, not an enforcement mechanism: it controls crawling, not indexing, so a page blocked here can still appear in search results if other sites link to it.
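A minimal robots.txt might look like the sketch below. The directives (User-agent, Disallow, Allow, Sitemap) are standard, but the paths and sitemap URL are placeholders you would replace with your own:

```
# Apply these rules to all crawlers
User-agent: *

# Ask crawlers to skip the admin area and internal search results (example paths)
Disallow: /admin/
Disallow: /search/

# Explicitly allow one page inside an otherwise disallowed directory (example path)
Allow: /admin/public-page.html

# Point crawlers to the sitemap (replace with your own URL)
Sitemap: https://www.example.com/sitemap.xml
```

The `*` wildcard in `User-agent` applies the rules to every crawler; you can also target a specific bot, such as `User-agent: Googlebot`, with its own set of rules.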
See also: Technical SEO, No-index tag