/viewtopic.php?t=123
/viewtopic.php?p=789
Code: Select all
/viewtopic.php?f=6&p=27994
/viewtopic.php?f=6&t=3424&view=previous
/viewtopic.php?t=7552&p=13110
/viewtopic.php?f=21&t=1659
/viewtopic.php?p=30907
/viewtopic.php?f=7&t=339
/viewtopic.php?f=26&t=16703&view=next
/viewtopic.php?p=36344
/viewtopic.php?p=3611
/viewtopic.php?f=18&t=21464
Add User 1.0.4
Auto Groups 2.0.2
Board Announcements 1.1.0
Change Post Time 1.0.1
CloudFlare IP 1.0.0
Contact Admin 1.3.7
Default Avatar Extended 1.2.2
Google Analytics 1.0.5
Group Template Variables 1.1.0
Hide Birthdays 1.0.1
Hide Newest User And Statistics Permissions 1.1.1-RC2
Joined date format 1.0.0
Last Post Avatar 1.0.3
LMDI Delete Re: 1.0.7
MailboxValidator Email Validator 1.0.0
phpBB3 SEO Sitemap 1.1.1
phpBB Media Embed PlugIn 1.1.1
PM Welcome 1.0.1
Post new topic 1.0.2
Previous / Next topic 1.0.3
Read other's topics Permission 1.1.0
Recent Topics 2.2.12
Round avatars 1.0.0
SEO Metadata 1.3.0
Show Guests in viewonline 0.2.0
Sortables Captcha 2.0.1
Who Is Online Extra Details by Informed Webmaster 0.9.3-BETA
Google is indicating that it found the canonical tag and that it's using that as "the" URL. So Google isn't ignoring these URLs; it's indicating they are duplicates of other URLs.
Code: Select all
<!-- IF U_CANONICAL -->
<link rel="canonical" href="{U_CANONICAL}">
<!-- ENDIF -->
It makes sense that Google has indexed slightly more pages than the number of topics: a topic may span more than one page (e.g., viewtopic.php?t=XXX&start=20), and each additional page gets its own canonical URL.
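To illustrate (the host and IDs here are made up for the example, and this assumes default phpBB behaviour): on a board at https://example.com, a topic URL and a post URL within that topic both declare the topic page as canonical, while the topic's second page declares its own canonical:

Code: Select all
<!-- /viewtopic.php?t=123 and /viewtopic.php?p=789 (a post on page one) both render: -->
<link rel="canonical" href="https://example.com/viewtopic.php?t=123">

<!-- /viewtopic.php?t=123&start=20 (page two) renders its own canonical: -->
<link rel="canonical" href="https://example.com/viewtopic.php?t=123&amp;start=20">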
Thank you. It seems Google has indexed only slightly more pages than the total number of topics we have.
Don't be surprised when it comes back with exactly the same message: failed. I've been doing those validation fixes and haven't had much success until recently.

Pfizz wrote: ↑Sun Feb 12, 2023 4:35 pm
UPDATE: Actually, I was able to resubmit all of those pages to Google for review in Google Search Console with just one click to validate them. The re-validation has started and I am now waiting to see the outcome; hopefully it will be positive. It seems all of those pages were flagged back in August of last year for some reason.
You can point Google at the news feeds to use as a sitemap.
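For example (hypothetical host and path; on phpBB 3.1+ the board-wide Atom feed lives at app.php/feed), you can submit the feed URL as a sitemap in Google Search Console, or advertise it in robots.txt:

Code: Select all
# robots.txt sketch only; replace https://example.com/forum with your board's base URL
Sitemap: https://example.com/forum/app.php/feed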
It's not necessarily a "huge" issue. The bot follows the link, finds the canonical tag identifying the source page, and that avoids the duplicate-content problem and settles which URL the bot will use in the index. The only real issue is that the bot is wasting your resources and its own when it follows links to duplicate content instead of indexing new content or re-indexing old content.
Code: Select all
<a {% if postrow.S_FIRST_UNREAD %}class="first-unread" {% endif %}href="{{ postrow.U_MINI_POST }}">{{ postrow.POST_SUBJECT }}</a>
Replace it with:
Code: Select all
<!-- IF not S_IS_BOT -->
<a {% if postrow.S_FIRST_UNREAD %}class="first-unread" {% endif %}href="{{ postrow.U_MINI_POST }}">{{ postrow.POST_SUBJECT }}</a>
<!-- ELSE -->
{{ postrow.POST_SUBJECT }}
<!-- ENDIF -->
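Since S_IS_BOT is a global phpBB template variable, the same guard can wrap any link a template emits. A generic sketch (SOME_URL and SOME_TEXT are placeholders for illustration, not real phpBB variables):

Code: Select all
<!-- Bots get the plain text; regular visitors get the link.
     SOME_URL / SOME_TEXT are placeholder names for this sketch. -->
<!-- IF not S_IS_BOT -->
    <a href="{SOME_URL}">{SOME_TEXT}</a>
<!-- ELSE -->
    {SOME_TEXT}
<!-- ENDIF -->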
Are there any other templates which need this fix?

thecoalman wrote: ↑Fri Oct 04, 2024 11:42 pm
You can fix these problems by editing the template and hiding the links from Google; many of them are already hidden. For example, in viewtopic_body.html around line 231, find the first snippet shown above.