Back in the SEO heyday, people used to say that quantity was king. Even when they didn't say it outright, they strongly implied it, along with the idea that buying 1,000 links from article directories was the best investment you could make.

This led to some major abuses within the industry. Website marketers published and republished as many articles as possible, regardless of whether they were original or intelligent. Many also stuffed those articles with keywords so they would rank higher, with little thought for an actual human audience. Then came link building: people placed links wherever they could, whether or not the content was related, simply to boost their position in search results. Google Panda and Google Penguin have stopped these "gray" practices dead in their tracks.

Google Panda, released in February 2011 and named after Google engineer Navneet Panda, was the first of the two updates. Rather than rewarding keyword-stuffed articles with nothing new to say, the algorithm looked for unique, quality content, using what human quality raters actually liked and found helpful as its benchmark.

Penguin followed in April 2012. This update went further by targeting link schemes and other webspam tactics, penalizing offending sites by dropping them lower in the rankings. Many companies that had been relying on these "tricks" soon found themselves down-ranked.

The main purpose of these updates is to reduce web spam and improve the reputation of Google's search results among its users. Common sense tells anyone looking to improve their rankings to focus on producing higher-quality writing. Building links is still important, but the goal now is earning them from trusted sites. Keyword stuffing is a thing of the past, and content written for human eyes is the future.

So just what did Google Panda and Penguin do? They simply forced all of these Internet companies to work a little bit harder!