“I have a question for you which I’ve been trying to figure out for a few months.
If I disallow a page in robots.txt that I don’t want (or need) in the index (terms and conditions, privacy statements, logins, etc.), those pages are eventually removed from the Google index and their PageRank toolbar turns gray. This would indicate that those pages are not crawled or indexed and therefore do not build PageRank.
But is this really the case? If I am removing pages from the index using robots.txt, am I inadvertently wasting PageRank by linking to those pages? Is the only way to effectively remove pages from the index and stop them building PageRank to add nofollow to all of the disallowed pages’ incoming links as well? I know that pages are given a gray toolbar when disallowed in robots.txt, but is this a lie?!
This leads me on to the next question! Do nofollow attributes actually cause the page’s PageRank to be redistributed to the remaining followed links on that page?
Any thoughts would be appreciated!”
The gray in your toolbar is not a lie. Google will not serve those pages in search results if you properly disallow them in your robots.txt. I’ve personally found Google to be quite obedient of robots.txt files. But other bots might not be as polite :-)
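For reference, a minimal robots.txt along these lines might look like the sketch below. The paths are hypothetical examples for the kinds of pages mentioned in the question, not anyone’s actual URLs:

```
# Applies to all well-behaved crawlers
User-agent: *
# Keep utility pages out of the crawl (example paths)
Disallow: /terms-and-conditions/
Disallow: /privacy/
Disallow: /login/
```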
Cover both bases to be safe, but don’t bother adding nofollow to the meta tags of those pages you are looking to keep out of the index. If for some crazy reason they get an IBL (inbound link) from an external site, you still want to pass that pop on to the rest of your site.
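If you do want to cover both bases, one hedged way to keep a page out of the index while still letting its link pop flow is a robots meta tag with noindex but without nofollow. One caveat worth stating: Google can only obey this tag if it is allowed to crawl the page, so a page blocked in robots.txt won’t have its meta tag seen at all.

```
<!-- Placed in the <head> of the page you want kept out of the index.
     noindex: don't show this page in search results.
     follow: still follow its links, so PageRank keeps flowing. -->
<meta name="robots" content="noindex, follow">
```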
Don’t forget guys and gals, if you need some FREE SEO ADVICE then drop me a line. It’s FREEEEEEEEE!