07-21-2007, 07:01 PM
Hi. I created a website for a friend with a link to her resume. When she did a Google search for her name naturally the main page and the resume came up. She asked me if there's a way to make her resume accessible only when clicked on from her site; she doesn't want it openly available to anyone who searches her name.
Is there a way to do that?
I would do something with server-side code here; you should ask in the PHP forum about that. For example, set a session cookie on the index page, then have the resume page check whether that cookie exists: if it does, the visitor can stay; if it doesn't, redirect them to the main page.
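A minimal sketch of that idea in PHP — the filenames index.php and resume.php and the session key name are assumptions, not the actual site's names:

```php
<?php
// index.php — starting a session sets the session cookie in the visitor's browser
session_start();
$_SESSION['came_from_index'] = true;  // assumed flag name; any key works
?>
```

```php
<?php
// resume.php — only serve the resume if the visitor arrived via the index page
session_start();
if (empty($_SESSION['came_from_index'])) {
    // no session flag: send them back to the main page
    header('Location: index.php');
    exit;
}
// ...resume HTML follows here...
?>
```

Note this keeps casual visitors out, but a search bot that somehow accepted the cookie could still see the page, so it's access control for people more than for crawlers.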
Also, I don't know if Google has a feature where you can blacklist pages :S
07-21-2007, 08:10 PM
A simple way would be to use robots.txt (http://www.google.com/search?q=robots.txt) to prevent search bots from indexing the resume page. It would work on most search engines, and wouldn't require any crazy scripting.
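For example, assuming the resume lives at /resume.html (adjust the path to wherever it actually is), a robots.txt file in the site's root directory could contain:

```
User-agent: *
Disallow: /resume.html
```

The `User-agent: *` line applies the rule to all crawlers, and well-behaved bots (Google, Yahoo, MSN) will skip that URL. It's a request, not enforcement, though: the page is still publicly reachable by anyone with the link.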
07-22-2007, 04:55 AM
Or you could use .htaccess to restrict access to the page so that only the hosting server itself can reach it; that might work, too.
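Something like this in an .htaccess file alongside the page might do it (Apache syntax; the filename resume.html is an assumption, and this particular rule would block everyone except the local server, so it's likely too strict for this case):

```
<Files "resume.html">
    Order Deny,Allow
    Deny from all
    Allow from 127.0.0.1
</Files>
```

Most free hosts don't allow custom .htaccess files, so check Tripod's docs before relying on this.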
The robots.txt method would be more elegant, though.
07-23-2007, 05:08 AM
Thank you for your suggestions. My problem, though, is that the site is at a free hosting service (Tripod) and I have no access to the root directory. I was reading about Robots Meta Tags. Do you think that would work?
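The tag I was reading about looks like this; it goes inside the page's own <head>, so it wouldn't need root access:

```html
<head>
  <meta name="robots" content="noindex, nofollow">
</head>
```

From what I've read, `noindex` asks search engines not to list the page and `nofollow` asks them not to follow its links, but like robots.txt it only works for bots that choose to honor it.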
07-23-2007, 09:16 AM
The robots.txt file is a very basic text file where you simply 'tell' search robots which pages you don't want indexed. All the decent search engines' robots will adhere to the rules you set there, so disallowing indexing of your friend's resume page would probably be the easiest option:
This might help (http://www.gnc-web-creations.com/creating_robotstxt_file.htm)