Should I block duplicate pages using robots.txt?

Halfdeck from Davis, CA asks: "If Google crawls 1,000 pages/day, Googlebot crawling many dupe content pages may slow down indexing of a large site. In that scenario, do you recommend blocking dupes using robots.txt or is using META ROBOTS NOINDEX,NOFOLLOW a better alternative?"

Short answer: No, don't block them using robots.txt.

Learn more about duplicate content here: http://www.google.com/support/webmasters/bin/answer.py?answer=66359
Length: 02:13
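
The trade-off behind the short answer is worth spelling out: a robots.txt Disallow stops Googlebot from fetching a URL at all, while a META ROBOTS NOINDEX,NOFOLLOW tag can only take effect if the page is fetched, because the tag lives on the page itself. The sketch below illustrates the robots.txt side of that trade-off using Python's standard urllib.robotparser; the example.com paths and the sample Disallow rule are made up for illustration, not taken from the video.

    from urllib.robotparser import RobotFileParser

    # Hypothetical robots.txt for example.com that blocks a directory of
    # duplicate, printer-friendly pages for every crawler.
    ROBOTS_TXT = """\
    User-agent: *
    Disallow: /print/
    """

    parser = RobotFileParser()
    parser.parse(ROBOTS_TXT.splitlines())

    # The canonical article is still crawlable.
    print(parser.can_fetch("Googlebot", "http://www.example.com/article/123"))

    # The duplicate under /print/ is not. Because Googlebot never fetches this
    # URL, it can never see a META ROBOTS noindex tag placed on that page.
    print(parser.can_fetch("Googlebot", "http://www.example.com/print/article-123"))

Run as-is, this prints True for the canonical URL and False for the blocked duplicate, which is why an on-page noindex directive and a robots.txt block are not interchangeable.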
