Yes. There's no black magic in what the crawlers do. Google, Yahoo!, 百度, Bing, Яндекс and the rest follow every link and capture whatever they "see". If they see a "this topic does not exist" page, that gets captured as well (captured, not understood). In fact, some of them already do this: they can't be detected as bots, so guest permissions apply to them.
As a rule of thumb, handle guests and bots the same way: if you don't want bots to see some of your forums, deny them for guests as well, and vice versa. If you want guests to be able to read certain forums, don't forget to allow those for bots too.
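The rule of thumb above can be sketched as a tiny permission-resolution function. This is a hypothetical illustration (the names `GUEST_READABLE`, `MEMBER_READABLE`, and `readable_forums` are made up, not phpBB code); the point is that an unauthenticated crawler carries no login cookie, so it gets exactly the guest permission set:

```python
# Hypothetical sketch: which forums may a given request read?
# An undetected crawler sends no login cookie, so it resolves to the
# guest permission set -- there is no separate "bot" tier unless you make one.

GUEST_READABLE = {"announcements", "general"}          # guests AND bots see these
MEMBER_READABLE = GUEST_READABLE | {"members-only"}    # logged-in users see more

def readable_forums(user_id=None):
    """Return the set of forums this request may read.

    user_id is None for any unauthenticated request, which covers both
    human guests and crawlers that cannot be detected as bots.
    """
    if user_id is None:
        return GUEST_READABLE
    return MEMBER_READABLE
```

So denying a forum for guests (leaving it out of the guest set) automatically hides it from crawlers too, which is exactly why guest and bot permissions should be kept in sync.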
Affin wrote:
Tue Nov 20, 2018 9:51 am
The problem is probably not my English; rather, you do not want to understand me correctly.
Nobody is going to come to us anyway; still, it's probably best to just drop this.