reputation in Google.

How to Avoid This SEO Failure

Always research the link profile and history of any domain name you register under. A qualified SEO consultant can do this, and there are also tools you can use to see what skeletons may be in the site's closet. Whenever I pick up a new domain, I like to let it lie dormant for at least six months to a year before trying to do anything with it. I want search engines to clearly differentiate the new incarnation of my site from its past life. It's an extra precaution to protect your investment.

SEO Failure #5: Pages That Won't Disappear

Sometimes sites have the opposite problem: too many pages in the search index. Search engines sometimes retain pages that are no longer valid, and if people land on error pages when coming from search results, that's a bad user experience. Out of frustration, some site owners list individual dead URLs in the robots.txt file, hoping Google will take the hint and stop indexing them. But this approach backfires: if Google respects the robots.txt file, it will never crawl those pages.
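You can see the trap for yourself with Python's standard `urllib.robotparser` module, which applies the same robots.txt rules crawlers follow. This is a minimal sketch; the domain and page path are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that lists an individual dead URL (the mistake).
robots_txt = """\
User-agent: *
Disallow: /old-page.html
""".splitlines()

rp = RobotFileParser()
rp.parse(robots_txt)

# A rule-respecting bot never requests the disallowed page,
# so it never sees the 404 and the URL lingers in the index.
print(rp.can_fetch("Googlebot", "https://example.com/old-page.html"))  # False

# Pages NOT listed in robots.txt remain crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/live-page.html"))  # True
```

The first check returning False is exactly the problem: the blocked URL can never be re-crawled, so its dead status is never discovered.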
So Google will never see the 404 status and learn that the pages are invalid.

How to Avoid This SEO Mistake

The first part of the fix is to not disallow these URLs in robots.txt. You WANT bots to crawl them and discover which URLs should be removed from the search index. After that, set up a 301 (permanent) redirect on each old URL, sending the visitor (and search engines) to the nearest replacement page on the site. This takes care of your visitors whether they come from search results or a direct link.

SEO Failure #6: Missed Link Equity

I followed a link from a college website and was greeted with a 404 (not found) error. That's not uncommon, except the link was to