robots.txt allows you to tell spiders like Googlebot which pages not to index. It prevents the bot from loading unnecessary "garbage" pages, which saves your server resources and keeps the good content on your site from being diluted. For example, with phpBB3 you'll want to deny search.php, memberlist.php, faq.php, etc. phpBB3 hides these links and their content from bots by default, but it still serves a page that will get indexed, so you're better off denying them through robots.txt.
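A minimal sketch of what that might look like, assuming your board is installed under /phpBB3/ (adjust the paths to wherever your forum actually lives):

User-agent: *
Disallow: /phpBB3/search.php
Disallow: /phpBB3/memberlist.php
Disallow: /phpBB3/faq.php
Disallow: /phpBB3/posting.php
Disallow: /phpBB3/viewonline.php

The User-agent: * line means the rules apply to every bot that bothers to read the file.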
As mentioned, you can use Webmaster Tools to test your robots.txt file or even create one, or you can write your own in WordPad and just upload it to your site root. There's also a section that shows which pages Googlebot was denied from accessing. Keep in mind robots.txt only works with bots that respect it; rogue bots such as email harvesters will simply ignore it.
You can also use wildcards to match part of a string in a URL, which works with some search engines (Googlebot supports them, for instance). This is where Webmaster Tools really earns its keep, because you can check which pages are being denied to the bot and make sure you haven't inadvertently blocked pages you want indexed.
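For example, a wildcard rule along these lines (just a sketch; sid= is the usual phpBB3 session parameter, swap in whatever your URLs actually use) tells crawlers that understand * to skip any URL carrying a session ID:

User-agent: Googlebot
Disallow: /*?sid=
Disallow: /*&sid=

After uploading something like this, it's worth running a few of your real forum URLs through the Webmaster Tools robots.txt tester to confirm the pattern only catches what you intended.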
“Results! Why, man, I have gotten a lot of results! I have found several thousand things that won’t work.”