So after Google’s last big indexing round-up, a number of different sites that I manage in one of my small businesses lost their rankings. I understand that Google has reworked its algorithms to try to clean out duplicate content and people who use scraper software, but a lot of legitimate sites seem to have gotten slapped as well.
There’s a great article about it here for anyone who might have suffered a similar fate.
If anyone has ideas on how to avoid showing duplicate content for site terms and policies pages, please let me know — I’m all ears. Online businesses, entrepreneurs, and even basic bloggers will benefit from cleaning up how the bots read their pages.
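One approach I’ve seen suggested (not sure it’s the only or best fix, so take this as a sketch) is to tell the crawlers not to index the boilerplate pages at all. A `noindex` meta tag on the terms/privacy pages, or a `Disallow` rule in robots.txt, should keep that near-identical text out of the index. The paths below are just placeholders for whatever your site actually uses:

```html
<!-- On the terms/privacy page itself: ask crawlers not to index it,
     but still follow any links on the page -->
<meta name="robots" content="noindex, follow">
```

Or the robots.txt equivalent (again, example paths only):

```
# robots.txt — block crawling of the boilerplate policy pages
User-agent: *
Disallow: /terms/
Disallow: /privacy-policy/
```

If the pages do need to stay indexed, a `rel="canonical"` link pointing at one preferred version is another option worth looking into.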