
Re: Out of the box phpBB 3 is awful for SEO

Posted: Mon Jun 25, 2007 11:21 am
by dcz
Yes, what is hidden from guests is hidden from bots too, even though you can let bots browse private forums if you decide to (which, again, would be cloaking).
The $user->data['is_registered'] value covers both guests and bots; in the templates it's S_REGISTERED_USER.
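For anyone wiring this into a MOD or a custom page, a minimal PHP-side sketch (assuming the standard phpBB3 session behaviour, where bots log in under their own bot accounts but are not flagged as registered; the template variable name below is just an example):

Code: Select all

// Sketch only: inside a phpBB3 page, $user and $template are already set up.
// Guests AND bots both end up in this branch, since neither is "registered"
// (bots do get their own user IDs, so a check against ANONYMOUS would miss them).
if (!$user->data['is_registered'])
{
    // Example switch so the .html template can skip member-only markup
    $template->assign_var('S_HIDE_MEMBER_ONLY', true);
}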

++

Re: Out of the box phpBB 3 is awful for SEO

Posted: Mon Jun 25, 2007 11:23 am
by Xabi
So, all the stuff between...

<!-- IF not S_USER_LOGGED_IN -->
<!-- ENDIF -->

would be hidden from bots and guests?

Re: Out of the box phpBB 3 is awful for SEO

Posted: Mon Jun 25, 2007 11:26 am
by dcz
dcz wrote: in the templates it's S_REGISTERED_USER.

Code: Select all

<!-- IF not S_REGISTERED_USER -->
<!-- ENDIF -->
will do it. S_USER_LOGGED_IN alone would not cover both bots and guests.
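To spell out the difference (a sketch, going by the standard phpBB3 session handling where bots carry a session under their own bot accounts but are not counted as registered):

Code: Select all

<!-- IF not S_REGISTERED_USER -->
    <!-- shown to guests AND to bots: neither counts as registered -->
<!-- ENDIF -->

<!-- IF not S_USER_LOGGED_IN -->
    <!-- shown to guests only: bots are "logged in" under their bot account,
         so they would get different content than guests (the cloaking worry) -->
<!-- ENDIF -->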

Re: Out of the box phpBB 3 is awful for SEO

Posted: Mon Jun 25, 2007 11:28 am
by Xabi
Ok! I will use it, thank you :)

Re: Out of the box phpBB 3 is awful for SEO

Posted: Mon Jun 25, 2007 12:04 pm
by thecoalman
I'd have to agree with dcz, and I had the same thoughts when I first saw this feature myself. No matter what label you put on it, whether you call it deceptive or harmless, a search engine is not going to know the difference; it's only going to know you're serving different pages to it than to users.

Just to add, hiding the link from the bot is not going to work anyway in a lot of cases. Assuming the memberlist here has its link hidden from bots, it's already been indexed by Google. It just takes one link somewhere, such as in a post or on another site, and the bot is going to follow it. You have to deny it with robots.txt.
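For what it's worth, a robots.txt sketch for that case (paths are examples, assuming the board lives in /forum/):

Code: Select all

# Tell crawlers not to fetch the member list or the user control panel at all.
# Unlike hiding the link in the template, this works even if the URL is
# linked from a post or from another site.
User-agent: *
Disallow: /forum/memberlist.php
Disallow: /forum/ucp.php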

Re: Out of the box phpBB 3 is awful for SEO

Posted: Mon Jun 25, 2007 12:51 pm
by Eelke
I think the bot recognition is very useful to stop the enormous list of guests from appearing when a bot happens to be spidering your site, thereby distorting the number of visitors, etc. But I am "receptive" to the arguments against using it to serve actually different content to bots than to other users.

Re: Out of the box phpBB 3 is awful for SEO

Posted: Mon Jun 25, 2007 3:40 pm
by pilch
Frank Rizzo wrote:Sure. Give me a day or two to check sites are indexing properly.

But for now I seriously recommend that if you haven't already done so you limit the permission of the bot group.....
Frank.

Did you experience issues with running two versions of your forum, one on v2 and one on RC1? As in, did your RC1 start getting indexed in SERPs when you perhaps weren't quite ready (or didn't want) to start having your site indexed?

I'm curious, if not a little panicky, about actually having two versions of my forum running on my server. I suppose I could just go in and edit the .htaccess.

It's been interesting to see dcz's comments - he helped me out loads with his phpbbSEO (hopefully that won't be considered SPAM by the powers that be). I too have (and dcz will vouch for this) put many hours of SEO work into my v2 forum. When I first started getting excited about the phpBB3 movement, my first thought was: awesome!! Then I quickly started realising (as I'm sure we all did) - OMG - but all of that work :(

My final point...

Thank you very, very much to ALL the team on OLYMPUS - we're all very fortunate to have such dedicated and committed guys and girls.

Where can I donate some PayPal spondoolie$ ??

Regards..

P.S. Should one be DEACTIVATING all the BOTS in the ACP for now? I'm purely in test mode (having just updated to RC2) and have no idea how soon I will be officially porting over to phpBB3 from v2. Thus, I don't want any RC* pages being indexed.

Re: Out of the box phpBB 3 is awful for SEO

Posted: Mon Jun 25, 2007 4:19 pm
by thecoalman
pilch wrote:
I'm curious - if not a little panicy about actually having two versions of my forum running on my server. spose I could just go in and edit the .htaccess.
By all means, password protect it. They can't index what they can't see. At the very least you should be denying the directory in robots.txt.
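If it helps, a rough .htaccess for the test directory (Apache assumed, paths are examples; create the password file with the htpasswd utility):

Code: Select all

# e.g. /forum_new/.htaccess - basic auth so nothing in here gets crawled
AuthType Basic
AuthName "phpBB3 test board"
AuthUserFile /home/yoursite/.htpasswd
Require valid-user

plus a matching Disallow: /forum_new/ line in robots.txt as a belt-and-braces measure.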

Re: Out of the box phpBB 3 is awful for SEO

Posted: Mon Jun 25, 2007 5:46 pm
by Frank Rizzo
pilch,

After I installed and upgraded I put the message board in the same folder so search engines are only seeing one version.

V2 was in mysite/forum
installed V3 in mysite/forum_new
ran the upgrade
(backed up) deleted mysite/forum
renamed mysite/forum_new to mysite/forum
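In shell terms, the last few steps were roughly the following (sketch only; paths are examples, and the database was backed up separately beforehand):

Code: Select all

tar czf forum_v2_backup.tar.gz forum    # archive the old v2 files
rm -rf forum                            # remove the v2 board
mv forum_new forum                      # phpBB3 now answers at the old URLs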

Google is there feeding for most of the day (albeit at a slow rate). Yahoo seems to have eaten a lot within the past week.

For google specifically I am now seeing the new pages being indexed which is a good start but saturation is not (yet) increasing.

If I search site:mysite.com I'm not seeing anything near the peak of what it used to show (35,000+ pages) but just around 3,000. I think this will change if G decides a deep crawl or whatever is required.

What really grinded my gears was the fact that G was returning descriptions based on the menu structure but after tweaking the templates this is now reading fine for some of the recently indexed pages. I still have supplementals but these are now mostly for pages indexed back in May.

---

On the point about cloaking I don't think boards should be so paranoid. If someone does fill in a complaint it would be a pretty sad thing for them to do.

As I say, if a site is cloaking deceptive information then that is totally out of order and worthy of a ban. But if you dress up pages so that search engines can a) read the site faster and more clearly, and b) store less information, while at the same time ensuring that the words the bot reads are the same as the words a user reads, then that's good for everyone.

What about news sites which return stories in SERPs, but when you click on them you are required to register before reading? Isn't that deceitful? Would Google start banning the national newspapers?

I think there is no need to be over cautious about this. As long as you are not deceiving the public then serving a bot friendly page is fine.

Re: Out of the box phpBB 3 is awful for SEO

Posted: Mon Jun 25, 2007 9:45 pm
by dhn
Frank Rizzo wrote:Don't believe me? Take a look at Google's cache of this messageboard:

http://www.google.co.uk/search?q=site:p ... rt=90&sa=N
Google's results will appear differently depending on the search keywords used in a query. This is not 1994, when only the meta tag was shown. Of course it will show the first text that appears and is most prominent to the indexer if NO keywords are given. But what are you even trying to prove?

Now let us try the search again with SEO as the keyword. Oh behold! We are actually seeing content within this topic appearing on the front page.

I am amazed at how some so-called SEO experts are still trying to validate their existence by using cheap tricks like that. I do consider SEO important, but stuff like that annoys me.

Re: Out of the box phpBB 3 is awful for SEO

Posted: Mon Jun 25, 2007 9:54 pm
by dhn
Frank Rizzo wrote: If I search site:mysite.com I'm not seeing anything near the peak of what it used to show (35,000+ pages) but just around 3,000. I think this will change if G decides a deep crawl or whatever is required.

What really grinded my gears was the fact that G was returning descriptions based on the menu structure but after tweaking the templates this is now reading fine for some of the recently indexed pages. I still have supplementals but these are now mostly for pages indexed back in May.
No no NO! Do NOT consider descriptions that are returned based solely on a search without keywords for anything but your personal Google cache statistics. Users will NEVER see them when they search with keywords (unless the keyword or keywords are inside the description). Ever wondered how Google was able to always show exactly the parts of the page that contain the words in the description?

Re: Out of the box phpBB 3 is awful for SEO

Posted: Tue Jun 26, 2007 8:09 am
by Frank Rizzo
I get that now.

Google will adaptively show descriptions based on keywords. If you are just checking site saturation you will get a nonsense description; use keywords in the search and it grabs text from around the keyword to use as the description.

Re: Out of the box phpBB 3 is awful for SEO

Posted: Tue Jun 26, 2007 8:18 am
by dcz
Frank Rizzo wrote: What about news sites which return stories in SERPs, but when you click on them you are required to register before reading? Isn't that deceitful? Would Google start banning the national newspapers?

I think there is no need to be over cautious about this. As long as you are not deceiving the public then serving a bot friendly page is fine.
All right, webmasterworld is not blacklisted and uses a similar feature, I know about it. What I do not know is why, and how long it will last, as I never start working on something that could not last.

Now, the real matter is, IMHO, that we should not play too much with the limits here, e.g. filter content for bots only, when doing it for guests as well is SEO-wise the same and fully safe (not 90% or even 99%). Playing with such limits means paying quite some attention to SEO on the one hand while, at the same time, not fully following its basic principles; it's not really consistent as a strategy. That's where too much is not useful, if not risky, IMHO. Because you cannot brush off the spam report possibility with just:
If someone does fill in a complaint it would be a pretty sad thing for them to do.
I don't think the one who successfully spam-reports a website would be the one ending up really sad.
Again, and to make things clear with regard to what dhn wisely said, I don't want to threaten anyone here. Actually, I started to provide SEO solutions after I got fed up reading stupid things about the supposedly secret techniques provided by some, often stating that only they could understand them. SEO is simple in principle; what is more complex is to define the appropriate strategy for a particular case, adapted to both the content type and the means and effort one is able to put into it.
About all this, my only message in the end is: why play with the limits if you're unsure about what you are doing, especially when it won't make any difference SEO-wise? (comparing bot-only and guest content filtering).
To me, if one technique is 100% safe, and if it's the same SEO-wise, there is absolutely no reason to avoid it.

Then, about testing two forums at the same time (phpBB2 & 3) with the same content: you should prevent the phpBB3 test board from being indexed (the best option is to fully lock the folder) and work on HTTP 301 redirects from phpBB2 to phpBB3 so you are able to safely switch when you're ready.
Just my two cents, but I'm pretty sure you don't want your site to become twice as big with (pseudo) duplicate content and then drop back to half that size after you get rid of phpBB2.
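As a starting point, something like this in the old board's .htaccess once the switch is made (only a sketch; it assumes the convertor kept the topic and forum IDs and that the new board lives in /forum/, so test it before relying on it):

Code: Select all

# 301 everything from the old phpBB2 folder to the new phpBB3 one.
# The original query string (?t=..., ?f=...) is carried over automatically,
# which only makes sense because the convertor keeps the old IDs.
RewriteEngine On
RewriteRule ^(.*)$ /forum/$1 [R=301,L]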

++

Re: Out of the box phpBB 3 is awful for SEO

Posted: Tue Jun 26, 2007 9:32 am
by Frank Rizzo
why play with the limits if you're unsure about what you are doing, especially when it won't make any difference SEO-wise
Do you mean: why bother feeding SEs more efficient content than guests get? I think it will make a big difference.

If there are two forums, one as-is and one with all the guff taken away, then which site is more likely to make the googler want to click?

search: Growing Widgets

How to grow widgets
How to grow widgets. Posted by Frank Rizzo on Fri Jan 02, 2004 4:02 pm. Caring for
your widgets is important if you want your widget community to grow.
w w w . widgetsworld . com

How to grow widgets
How to grow widgets. Caring for your widgets is important if you want your widget
community to grow. Adding fertilizers such as Nitro-Seo-Sulphate will nurture
w w w . widgetsworldcompetitor . com

The first listing has superfluous text (Posted by Frank Rizzo on Fri Jan 02, 2004 4:02 pm). That is totally useless information to the person searching. It is wasting real estate where you need to attract the user's attention quickly so that they will click on you.

The date 2004 would be off-putting too. Users may skip your listing thinking it is old and out of date. Not only that but what if the person posting it was called Booger? :D
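Tying this back to the switch discussed earlier, the relevant line in viewtopic_body.html could be wrapped like this (sketch only; the variable names are from the stock templates as far as I remember, so check your own copy first):

Code: Select all

<!-- IF S_REGISTERED_USER -->
    <!-- author and date only render for registered members, so guests and
         bots both get the leaner version and still see the same thing -->
    <p class="author">Posted by {postrow.POST_AUTHOR_FULL} on {postrow.POST_DATE}</p>
<!-- ENDIF -->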

Re: Out of the box phpBB 3 is awful for SEO

Posted: Tue Jun 26, 2007 9:38 am
by thecoalman
dcz wrote:
All right, webmasterworld is not blacklisted and uses a similar feature, I know about it. What I do not know is why, and how long it will last, as I never start working on something that could not last.
I think one thing to bear in mind is that webmasterworld has a lot of "trust". Apparently a lot of established sites use techniques that can be labelled "blackhat"; being trusted sites, the same rules may not apply to them as would to smaller sites. If cloaking is something that Google and the other search engines look for with an automated process, such as a "stealth" bot that simultaneously requests the same pages the regular bot does, I'd suggest this could possibly be a problem for smaller, less established sites.