Soo, I've been running low on bandwidth the past couple of months and realized it's allllll coming from Googlebot. Almost 1 GB of bandwidth drained this month from Googlebot alone!
It comes to my site *every day* and indexes all these little nooks and crannies of my site, like the edit profile, post new topic, rate this link, etc. Things which basically have no content. Ironically, the only page that shows up in a Google search for my site name is still the lonely, lonely index page.
(BUT, my pagerank is waay up!)
There is a way to slow Googlebot down if you're signed up for their Webmaster Tools, but that link doesn't work on my Webmaster Tools page, and even if it DID work, it would take about 90 days to take effect anyway.
I've googled this (ironically) and it seems people on all the other CMSs have the same problem with the greedy greedy Googlebot.
Any ideas, short of banning it outright via .htaccess, to slow it down or keep it from crawling irrelevant pages? Maybe there could be an addition to NukeSEO to make Google download the sitemap only once a week or so?
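One thing I'd try before touching .htaccess is a robots.txt in the site root, disallowing just the content-free pages. A rough sketch below — the paths are only examples of what PHP-Nuke-style URLs often look like, so they'd need adjusting to match the actual URLs on the site. Note that Googlebot is known to ignore the non-standard `Crawl-delay` directive (its crawl rate can only be set in Webmaster Tools), but some other crawlers of the era do honor it:

```
# Keep all well-behaved bots out of "nook and cranny" pages.
# Example paths only -- replace with your CMS's real URLs
# (edit profile, post new topic, rate this link, etc.)
User-agent: *
Disallow: /modules.php?name=Your_Account
Disallow: /modules.php?name=Forums&file=posting
# Ignored by Googlebot, but honored by some other crawlers:
Crawl-delay: 10
```

This wouldn't slow Googlebot itself down, but cutting out the irrelevant pages should shrink the bandwidth it eats, and it's a lot gentler than banning it.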