So after Google’s last big indexing round-up, a number of sites I manage for one of my small businesses lost their rankings. I understand that Google has reworked its algorithms to weed out duplicate content and scraper sites, but a lot of legitimate sites seem to have gotten slapped as well.
There’s a great article about it here for anyone who might have suffered a similar fate.
If anyone has ideas on how to avoid showing duplicate content on site terms and policy pages, please let me know; I’m all ears. Online businesses, entrepreneurs, and even basic bloggers would benefit from cleaning up how the bots read their pages.
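For what it’s worth, the one approach I’ve been looking at (just a sketch, and the page names below are placeholders for whatever your actual terms/privacy URLs are) is to keep the bots from indexing those boilerplate pages at all, either with a robots meta tag in each page or with robots.txt at the site root:

    <!-- In the <head> of each terms/policy page: keep the page
         out of the index but still let the crawler follow its links -->
    <meta name="robots" content="noindex, follow">

    # Or in robots.txt (paths here are examples):
    User-agent: *
    Disallow: /terms.html
    Disallow: /privacy.html

That way the identical legal boilerplate shared across several sites never gets compared as duplicate content in the first place. Still open to better ideas, though.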
Are you referring to the algorithmic change (Jagger), which included a PR update that began two months ago and ended about a month ago, or the more recent PR update, which seems to have reverted to normal?
I am referring to the algorithmic change about two months ago, and although some pages reverted to normal, most of my small publishing sites did not. I think it is primarily because of all the reciprocal links they have. (Which weren’t so much a problem before, but now Google is just hatin’ on them.)
I tried to click the link to the article, but it didn’t come up.
Thanks, Jared. I accidentally posted with a little glitch in the link. All clean now. Thanks for the feedback.