I want to block specific webpages on my website from Google Search. I want to verify my robots.txt disallow rules against the live site. How can I check robots.txt for specific webpages so I know my rules are working correctly?
Aniya
We can test live URLs with the robots.txt Tester, which is available in Google Search Console. Just enter the URL you want to check and click the Test button. It will show whether the allow and disallow rules in your robots.txt allow or block that URL.
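If you prefer to check rules locally before relying on the Search Console tester, Python's standard library can parse robots.txt rules the same way a crawler would. This is a minimal sketch using `urllib.robotparser`; the rules and URLs shown (`/private/` and `example.com`) are made-up placeholders, so substitute your own site's rules and pages.

```python
from urllib import robotparser

# Build a parser and feed it sample robots.txt rules directly.
# (You could also call rp.set_url(...) and rp.read() to fetch a live file.)
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",   # hypothetical rule blocking one directory
])

# can_fetch(user_agent, url) returns True if the URL is allowed.
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False -> blocked
print(rp.can_fetch("*", "https://example.com/public/page.html"))   # True  -> allowed
```

This only tells you how the rules parse; the Search Console tester additionally confirms what Googlebot actually sees on the live site.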
Here is the link for the robots.txt Tester: open it, enter your website page, and run the test, as shown in the image below. You may also read how to set a no-index tag for all canonical URLs on WordPress sites.
As you can see in the example above, the tested URL is blocked. Hope this is helpful for you.