How can I stop a web page/document from being searchable COMPLETELY?
I know about nofollow, noindex, and robots.txt, but some crawlers don't play by the rules.
How can I give people access to data ONLY through a page on my website, without using passwords?
It's a legal document that I want visible, but only findable through my page.
Turn it into an image to make it harder to scrape. Then embed that image with some query parameter, like image.jpg?foo=bar,
and have a script intercept the request for the image, serving it only if the right value is supplied. Change the value every minute or so.
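A minimal sketch of that intercept step, in Python for illustration (the parameter name `foo`, the secret, the port, and the file `document.jpg` are all made-up stand-ins, not anything from the thread):

```python
import hashlib
import time
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs, urlparse

SECRET = "changeme"   # hypothetical shared secret known to the embedding page
WINDOW = 60           # the accepted value rotates every minute

def valid_tokens(now=None):
    """Tokens for the current and the previous minute, so a page
    loaded just before the boundary can still fetch its image."""
    bucket = int((time.time() if now is None else now) // WINDOW)
    return {hashlib.sha256(f"{b}{SECRET}".encode()).hexdigest()
            for b in (bucket, bucket - 1)}

class ImageHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the image only when the query parameter carries a
        # token for the current (or just-expired) window.
        query = parse_qs(urlparse(self.path).query)
        if query.get("foo", [""])[0] in valid_tokens():
            self.send_response(200)
            self.send_header("Content-Type", "image/jpeg")
            self.end_headers()
            with open("document.jpg", "rb") as f:  # hypothetical file
                self.wfile.write(f.read())
        else:
            self.send_error(403)

# To run: HTTPServer(("localhost", 8080), ImageHandler).serve_forever()
```

Anyone hitting the image URL directly a minute later gets a 403, because the token they scraped no longer matches.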
Depends where it goes in the end, and on the type of document, I presume. If your users are just going to print it off and sign it, then yeah, sure.
I'm no lawyer, though.
No, use a PHP script or something similar. One embeds the image, like: <img src="image.jpg?supersecretcode=gentoo">
and one generates the image output, like: if ($_GET['supersecretcode'] == 'gentoo') { ... }
Obviously, instead of using a static value as the code, you'd dynamically generate it from the same formula in both scripts: something like a hash of the current time plus a fixed string.
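That both-scripts formula could be sketched like this (Python for illustration rather than PHP; sha256, the 60-second window, and keeping "gentoo" as the fixed string are assumptions, not anything the thread specifies):

```python
import hashlib
import time

SECRET = "gentoo"  # the fixed string; use a real secret in practice
WINDOW = 60        # seconds before the code changes

def current_token(now=None):
    """Embedding side: hash of the current time window plus the
    fixed string, used as the supersecretcode value."""
    now = time.time() if now is None else now
    bucket = int(now // WINDOW)
    return hashlib.sha256(f"{bucket}{SECRET}".encode()).hexdigest()

def token_is_valid(token, now=None):
    """Serving side: accept the current and the previous window,
    so a page loaded just before the boundary still works."""
    now = time.time() if now is None else now
    bucket = int(now // WINDOW)
    return token in {
        hashlib.sha256(f"{b}{SECRET}".encode()).hexdigest()
        for b in (bucket, bucket - 1)
    }
```

Because both scripts derive the code from the same clock bucket and secret, they agree without ever passing the code between them, and any URL a scraper records goes stale within two minutes.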
I can set up a suitable, secure solution for you for only $50. By "secure", I mean that it will be vanishingly unlikely that the page would be findable by anyone except a human starting from another page.
No: make a submit button that calls a PHP script; the script then reads and sends a file that is otherwise inaccessible to a direct GET request (set up some .htaccess rules for this). That way no one can link directly to your document.
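The .htaccess side of that might look like the following (Apache 2.4 syntax; the filename is a hypothetical):

```apache
# .htaccess in the directory holding the document:
# deny every direct request for the file. A server-side script
# can still read it from disk and stream it out itself, because
# readfile() bypasses Apache's access checks.
<Files "legal-document.pdf">
    Require all denied
</Files>
```

Putting the file entirely outside the web root achieves the same thing without any .htaccess rules, since the web server then has no URL for it at all.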
You think parsing bots don't read the whole link in the src attribute, including the parameters? And that the scraper doesn't check the file type when it fetches it, before it's passed to the indexer?
People need to learn search engine fundamentals...
This whole thread is missing them.