Google Panda


One of the most significant aspects of internet marketing today is the ability to rank highly in Google’s search engine. The most recent update to the engine’s algorithms – Google Panda 4.0 – has thrown many website and business owners off as they try to maintain a good ranking for their keywords. As with any Google update, some websites have benefited from the change, while others are now struggling because of it.


Some Panda Tips Directly from Google


Spam, Junk, and Outdated Content


Google’s search results primarily reward high-quality, fresh, and useful content, and a website that provides these values generally ranks well. The Panda update is designed to strengthen these values: its algorithms now target websites that contain spam or copied content and rank them less favorably. This means that a site with copied content, or one that is not updated often, will be the most heavily hurt by the Panda 4.0 update. Websites designed to host press releases are among the hardest hit. eBay also lost traffic after the update, largely because so many of its pages consist of thin, low-quality listing content.



In light of the nature of the update, the obvious solution is to keep your site up to date with fresh, relevant content. If your website has been negatively affected, then as far as Google is concerned, it does not contain useful, fresh content. Blogging is a good way to combat this: by keeping a relevant blog and updating it regularly, websites maintain their search engine optimization while keeping a good ranking. It is also advisable for blog posts to be written by someone who is already part of the company that owns the website, so that content stays authentic by both the company’s standards and Google’s.


Some Panda Tips Directly from Rand Fishkin


Planned Content


Although blogging is an excellent way to begin bolstering rankings under the new Google algorithm, it is only one action and will not on its own save a struggling site. If Google Panda is suddenly hurting your company, then you very likely do not have an active plan for keeping the website regularly updated with fresh, high-quality content. It is vital to have a content marketing strategy that spells out who the website is trying to reach, why it is trying to reach them, how it will reach them, and why that audience should want to be reached. Having a plan and a process is essential to maintaining rankings under Google Panda 4.0.


Websites now require more regular updates to remain in Google’s good graces. In most cases, websites that are commonly updated with useful content will do well after this update. On the other hand, websites with recycled content that are allowed to remain stale will more likely than not find themselves buried amidst thousands of other similar webpages far away from page 1.

The term SEO has gained so much importance in the past few years that it has become an essential element of online marketing. SEO, or search engine optimization, is a strategy that helps your webpage earn high rankings from various search engines, so that it is displayed among the top results. Of course, every search engine has different criteria for measuring the quality and authenticity of a web page. However, most SEO firms and marketing companies focus only on Google, as this search engine controls roughly 80% of the search engine market and generates nearly 60% of web traffic for top-ranked pages.


With this huge share of the market in its hands, Google feels responsible for making sure that all highly ranked content is of top quality, genuine, and beneficial to its users. Until recently, the problem was that Google’s rankings could easily be manipulated. Google ranked pages based on keyword usage, link building, anchor text, and so on – all of which SEO firms could easily game by building multiple websites with little content and the right keywords, all pointing at one particular webpage that would then earn the highest Google rankings. This was unfair and, frankly, made it difficult for Google users to get good results.


The New Updates By Google

Since Google was tired of all this manipulation of its web results, it released two new updates that have completely turned the tables on SEO firms: the Google Penguin and Google Panda algorithms, whose sole purpose is to redefine how web pages are judged and ranked in Google’s results.


The Google Penguin


The Google Penguin was released in April 2012. This algorithm determines how web pages are indexed and whether they are linked to the keywords that users actually type into the search bar.


For example, if you search Google for a directory online (ABC directory, for example), you should be able to find the direct link to the directory when you type in the keywords “ABC Directory”. Previously, however, many pages linked to a directory page, yet a direct link never appeared in the Google results until the keyword “” was searched. This meant that the domain was indexed but not the directory itself, and this was what the Google Penguin update rectified: sites that were not properly indexed and were propped up by lousy link building.


As a result, several web pages were penalized with poor ratings from Google that dropped their link from the first page of results to the 10th or beyond, and some were permanently banned from Google’s results altogether. The Google Penguin targeted websites stuffed with excessive keywords and links and analyzed them for further irregularities – irregularities that were then analyzed by the Google Panda update.


To recover your web page from a Google Penguin penalty, you would have to file a reconsideration request. However, most sites could not complete this step successfully, as it required you to first bring your site up to code and then wait for Google’s crawlers to recrawl the new version, which could take months. Some businesses are simply waiting for the next Penguin update in the hope that it will automatically lift the penalty from their webpage.


The Google Panda


The Google Panda was released before Penguin, and it did more damage to web page rankings than Penguin. In fact, once both updates were released, web page owners were baffled by how drastically their rankings fell, and confused about whether it was the Panda or the Penguin that did the damage. This turned out to be the best time to be in the SEO business: every web developer and coder was busy bringing web pages up to code so that their clients could once again profit from being at the top of the search results.


The Google Panda update was created because Google now demanded proper content: professional content that serves the needs of the user. So how does the algorithm determine whether a page has good or poor content, and whether it deserves a good rank?


There were a number of factors analyzed.


  • First, Google analyzed how much time users spent on each web page, the overall exit rate, and which specific pages had higher exit rates than others.
  • Then it analyzed how many times the same IP address returned to the same web page, i.e. how often users revisited it.
  • Then it analyzed the web content itself.
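As a rough illustration, the engagement signals in the list above could be approximated from ordinary access-log data. Everything below – the log format, field names, sample values, and metrics – is an assumption made for this sketch; Google’s actual signals and weights are not public.

```python
from collections import Counter

# Toy access-log records: (ip_address, page, seconds_on_page, was_exit).
# Hypothetical fields and values, purely for illustration.
log = [
    ("10.0.0.1", "/home",    45, False),
    ("10.0.0.1", "/article",  5, True),
    ("10.0.0.2", "/article",  8, True),
    ("10.0.0.1", "/home",    60, False),
    ("10.0.0.3", "/home",    30, True),
]

# 1. Average time users spent on each page.
time_per_page = {}
for _, page, secs, _ in log:
    time_per_page.setdefault(page, []).append(secs)
avg_time = {p: sum(t) / len(t) for p, t in time_per_page.items()}

# 2. Exit rate per page: the share of views that ended the visit there.
views = Counter(page for _, page, _, _ in log)
exits = Counter(page for _, page, _, was_exit in log if was_exit)
exit_rate = {p: exits[p] / views[p] for p in views}

# 3. Repeat visits: how often the same IP returned to the same page.
repeat_visits = Counter((ip, page) for ip, page, _, _ in log)

print(avg_time)       # average seconds on each page
print(exit_rate)      # fraction of views that were exits
print(repeat_visits)  # (ip, page) -> visit count
```

In this toy log, the short average time and 100% exit rate on "/article" are exactly the kind of pattern the article says Panda treated as a signal of thin content.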


The content that Google Panda demanded had to be professional and authentic, meaning no more plagiarism. A popular strategy before the Panda update was for firms to reuse the same content, building multiple websites with shuffled copies of it, all linking to one specific web page.


Now the Google Panda update sought out these web pages and lowered their ratings. It judges how in-depth the information on a page is, how much of it is repeated or copied from other sites, how many times keywords are used, and how naturally they are worked into the articles. Articles jammed with keywords were rejected as well, and heavy link building was no longer acceptable. Pages with an extreme number of links, for example more than 600,000, were carefully observed. Articles with fewer than 400 words were penalized with lower ratings, and hence the recommended word count rose to 500-800 words.


Such requirements pushed these webpages down the results and left their owners scrambling to abide by the new Google rules. SEO firms are now challenged to build authentic, genuine content for web pages as well as linking pages. There is no more manipulation – and even if you try to manipulate, you do so by posting more authentic information online, which once again benefits Google.


Author Bio

If you are looking for web solutions then contact Scott Heron, he has been promoting websites through SEO, PPC and content marketing for a few years now and is always happy to help.