What if we made websites that are difficult for Google's crawlers to understand, or easy to misinterpret as something else?
Can we create a web server that generates a shitty document out of a nice semantic one?
Has this been done?
just block anything with a known crawler user-agent, or anything coming from Google/Bing/etc. IP ranges
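Rough sketch of the user-agent half of that (hypothetical pattern list; UA strings are trivially spoofed, so real setups also verify crawlers by reverse DNS on the source IP):

```python
import re

# Hypothetical list of search-engine bot UA substrings; not exhaustive.
CRAWLER_PATTERNS = re.compile(r"Googlebot|bingbot|Slurp|DuckDuckBot", re.IGNORECASE)

def is_crawler(user_agent: str) -> bool:
    """Return True if the request's User-Agent looks like a search-engine bot."""
    return bool(CRAWLER_PATTERNS.search(user_agent or ""))
```

Hang that check off whatever middleware your server has and return a 403 (or the degraded page) when it fires.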
Of course it's been done. Some websites used to show full content ONLY to Google IPs/bots, so lots of their stuff would show up in search results... and then, once a real user entered the site, it would ask them to register.
>>61893617
I thought so.
The idea is to send bad or vague data to Google to devalue their results. That's why I figured a babby-tier server-side tool might be useful. Maybe something that feeds garbage to analytics too.
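The "shitty document out of a nice semantic one" part could be as dumb as flattening every meaningful HTML5 tag into generic divs before the response goes out to a detected bot. A sketch only; a real tool would use an actual HTML parser (e.g. lxml) instead of regexes:

```python
import re

# Semantic tags to flatten so the crawler loses document structure.
BLOCK_TAGS = ("article", "section", "nav", "header", "footer", "main", "h1", "h2", "h3")

def degrade(html: str) -> str:
    """Rewrite semantic HTML5 tags into anonymous <div>s."""
    for tag in BLOCK_TAGS:
        html = re.sub(rf"<{tag}(\s[^>]*)?>", "<div>", html)  # opening tags, with or without attributes
        html = re.sub(rf"</{tag}>", "</div>", html)          # matching closing tags
    return html
```

Humans get the semantic page, bots get the div soup, and the markup still renders.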