dcz wrote: in the templates it's S_REGISTERED_USER.
Code:
<!-- IF not S_REGISTERED_USER --> <!-- ENDIF -->
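For readers less familiar with the phpBB template syntax, a guest-only block using that switch might look like this (the markup inside the conditional is illustrative, not taken from the stock templates):

```html
<!-- IF not S_REGISTERED_USER -->
    <!-- guests (and bots) see this -->
    <p>Register to read the full thread.</p>
<!-- ELSE -->
    <!-- logged-in members see this -->
    <p>Welcome back!</p>
<!-- ENDIF -->
```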
Frank.
Frank Rizzo wrote: Sure. Give me a day or two to check that the sites are indexing properly.
But for now, if you haven't already done so, I seriously recommend that you limit the permissions of the bot group.
By all means, password-protect it. They can't index what they can't see. At the very least you should be disallowing the directory in robots.txt.
pilch wrote: I'm curious, if not a little panicky, about actually having two versions of my forum running on my server. I suppose I could just go in and edit the .htaccess.
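Both suggestions are small config changes. A minimal sketch of each, assuming the second copy of the forum lives under a directory called /dev-forum/ (that path, and the .htpasswd location, are placeholders):

```text
# robots.txt (site root) -- asks well-behaved crawlers to skip the directory
User-agent: *
Disallow: /dev-forum/

# .htaccess (inside /dev-forum/) -- password-protects it outright
AuthType Basic
AuthName "Private copy"
AuthUserFile /path/to/.htpasswd
Require valid-user
```

Note that robots.txt only deters well-behaved bots; the password protection is what actually guarantees nothing gets indexed.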
Google's results will appear differently depending on the search keywords used in the query. This is not 1994, where only the meta description was shown. Of course it will show the first text that appears, and whatever is most prominent to the indexer, if NO keywords are given. But what are you even trying to prove?
Frank Rizzo wrote: Don't believe me? Take a look at Google's cache of this messageboard:
http://www.google.co.uk/search?q=site:p ... rt=90&sa=N
No, no, NO! Do NOT treat descriptions returned by a search without keywords as anything but your personal Google cache statistics. Users will NEVER see them when they search with keywords (unless the keywords happen to appear inside the description). Ever wondered how Google is always able to show exactly the parts of the page that contain the search words in the description?
Frank Rizzo wrote: If I search site:mysite.com I'm not seeing anything near the peak of what it used to show (35,000+ pages), just around 3,000. I think this will change if G decides a deep crawl, or whatever is required.
What really ground my gears was the fact that G was returning descriptions based on the menu structure, but after tweaking the templates these now read fine for some of the recently indexed pages. I still have supplementals, but those are now mostly pages indexed back in May.
All right, webmasterworld is not blacklisted and uses a similar feature; I know about it. What I do not know is why, or how long it will last, as I never start working on something that cannot last.
Frank Rizzo wrote: What about news sites which return stories in the SERPs, but when you click on them you are required to register before reading? Isn't that deceitful? Would Google start banning the national newspapers?
I don't think there is any need to be over-cautious about this. As long as you are not deceiving the public, serving a bot-friendly page is fine.
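The "bot-friendly page" idea boils down to a user-agent check at request time. A minimal sketch, where the crawler UA substrings are common real-world identifiers but the function name and page strings are invented for illustration:

```python
# Hypothetical sketch of "serving a bot-friendly page": known crawlers get the
# full, indexable post text, while ordinary guests get the register prompt.
KNOWN_BOTS = ("googlebot", "slurp", "msnbot", "ia_archiver")

def page_for(user_agent: str, full_text: str, guest_text: str) -> str:
    """Pick which version of the page to serve for this request."""
    ua = user_agent.lower()
    if any(bot in ua for bot in KNOWN_BOTS):
        return full_text   # crawler: serve the indexable content
    return guest_text      # ordinary guest: show the register prompt
```

Whether this crosses into deceptive cloaking depends, as the thread argues, on whether the bot copy honestly reflects what a registered visitor eventually sees.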
I don't think the one who successfully spam-reports a website would be the one to end up really sad.
If someone does fill in a complaint, it would be a pretty sad thing for them to do.
Do you mean, why bother feeding SEs more efficient content than guests? I think it will make a big difference.
why playing with limits if you're unsure about what you are doing, especially when it won't make any difference SEO-wise
I think one thing to bear in mind is that webmasterworld has a lot of "trust". Apparently a lot of established sites use techniques that could be labelled "blackhat", but being trusted sites, the same rules may not apply to them as would apply to smaller sites. If cloaking is something Google and the other search engines look for with an automated process, such as a "stealth" bot that requests the same pages the regular bot does, I'd suggest this could be a problem for smaller, less established sites.
dcz wrote: All right, webmasterworld is not blacklisted and uses a similar feature; I know about it. What I do not know is why, or how long it will last, as I never start working on something that cannot last.
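The "stealth bot" check described above could, in principle, be as simple as fetching a page twice, once as the declared crawler and once as an ordinary browser, and comparing the two copies. A hypothetical sketch of the comparison step only (the threshold and function names are invented; a real check would also normalise markup, follow redirects, and so on):

```python
# Hypothetical sketch: a sharp divergence between the copy served to the
# crawler UA and the copy served to a browser UA is a cloaking signal.
import difflib

def similarity(bot_copy: str, browser_copy: str) -> float:
    """Return a 0..1 ratio of how similar the two served pages are."""
    return difflib.SequenceMatcher(None, bot_copy, browser_copy).ratio()

def looks_cloaked(bot_copy: str, browser_copy: str,
                  threshold: float = 0.7) -> bool:
    """Flag a page whose crawler copy diverges sharply from the visitor copy."""
    return similarity(bot_copy, browser_copy) < threshold
```

Under this model, a site serving full articles to the bot and a bare "please register" page to visitors would score very low on similarity, which is exactly the pattern the thread worries a stealth crawl might catch.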