In case anyone runs into this issue in the future, this seems to have fixed it:
Keep in mind it can take hours for the bots to catch up to your updated robots.txt.
While the "Disallow: /ucp" seems to have done the trick, I also added the "Disallow: /*sid=" as precaution to prevent the bot from attempting to crawl/index ANY session IDs in any links.
Google is now crawling the board as intended and is no longer eating up bandwidth.
Code:
User-agent: *
Crawl-delay: 5
Disallow: /ucp
Disallow: /*sid=
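For reference, here's a rough sketch of what the wildcard rule does and doesn't match; the URLs below are made-up examples, not actual links from my board:

Code:
# Blocked by Disallow: /*sid= (any path/query containing "sid="):
#   /viewtopic.php?t=123&sid=a1b2c3d4e5f6
#   /index.php?sid=0123456789abcdef
# Still crawlable:
#   /viewtopic.php?t=123
#   /viewforum.php?f=2

As far as I know, Googlebot ignores the Crawl-delay line (Bing and some other bots honor it), so the two Disallow rules are what actually did the heavy lifting here.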
While the "Disallow: /ucp" seems to have done the trick, I also added the "Disallow: /*sid=" as precaution to prevent the bot from attempting to crawl/index ANY session IDs in any links.
Google is now crawling the board as intended and is no longer eating up bandwidth.
Posted by Jx80 — Tue Jun 04, 2024 11:53 pm