If a forum has thousands of topics, wouldn't allowing Googlebot to index the pages cause a huge increase in bandwidth usage? I'm assuming Googlebot would go through every single topic and every single page of that topic (which could multiply the number of pages spidered quite dramatically).
Google doesn't do that. They take care not to overload servers with their crawlers.
I've never done a search which turned up pages and pages of forum links from one site.
It can happen, but Google will hide some of the redundant pages from a site. Incidentally, Google will not index every forum page, since most will look very similar in content to other pages.
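If bandwidth from crawling is still a concern, a forum admin can steer crawlers away from the most redundant pages (printable views, search results, re-sorted listings) with robots.txt. A minimal sketch — the paths here are hypothetical examples, not any particular forum software's actual URLs:

```
# Hypothetical robots.txt for a forum.
# Path patterns are illustrative; adjust to your forum's actual URLs.
User-agent: *
Disallow: /printthread.php
Disallow: /search.php
# The * wildcard in paths is supported by Googlebot (not part of the
# original robots.txt standard, so other crawlers may ignore it).
Disallow: /*?sort=
```

This blocks crawling of duplicate views while leaving the canonical topic pages indexable, which addresses the bandwidth concern without hiding the forum's content from search.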