Understanding Google Panda algorithm updates

In the last blog post, we discussed Google Penguin updates. Today, let's try to understand the Google Panda update. The first release of Google Panda was in February 2011. While Penguin updates went after backlinks, the main purpose of Panda updates is to watch the quality of website content. I believe the biggest effect on search results by far has come from Panda updates. Panda updates fight scraper sites and content farms. A scraper essentially copies content from various websites using web scraping, which can range from simple manual copy and paste to sophisticated web scraping software. Content farms, on the other hand, are websites with a large volume of low-quality content, often copied from other websites or repeated across multiple pages of the same website. Content farms usually spend effort understanding popular search keywords and spinning out content based on those phrases. Thus the Panda update was also called the Farmer Update :)

Unlike Penguin updates, Google Panda updates are more frequent; the algorithm is believed to have seen close to thirty refreshes. As mentioned, quality of content is the main focus of these updates. However, quality isn't confined to just the written quality of the content. Some of the major factors that could determine the quality of a website are:

Quality of written content
Are you an authority on the topic? Is your content original? Will a reader benefit from the content you have put on your website? These questions matter most for e-commerce sites, where there is a tendency to use minimal, duplicate or manufacturer-supplied wording as product descriptions. These are the factors that determine the quality of written content.

Duplicate content
If you are using the same content on different pages of your site (even with minimal wording changes), you risk being affected by Panda. One place this shows up unexpectedly is blogs: if you are using both tags and categories in your blog, the resulting archive pages can be considered duplicates. But don't worry, techniques like noindex tags help overcome this issue; another method is a canonical tag that tells Google which page to index (a short markup sketch follows the example below). Duplicate content is not limited to your own website either; it can be external duplicate content too, for example micro-sites or other blogs of yours where you copy-paste the same content or make only minimal changes. Here is a good Moz article that describes the various duplicate possibilities in the eyes of Google.

Let's take an example to understand the above two points better. Consider the book Half Girlfriend on Amazon or Flipkart. As you can see, the product description is kept different on each site, and both differ from the description Chetan Bhagat gives on his own website. Also, we can reach the product page on Amazon or Flipkart through multiple navigational paths or URLs (category pages, search, direct links etc.), but Google indexes only the main product page.
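
To make the two fixes concrete, here is a minimal markup sketch, assuming a tag archive page and a duplicate product URL (the URLs are placeholders, not from any real site):

    <!-- On a tag archive page that duplicates category content: keep it out of the index -->
    <meta name="robots" content="noindex, follow">

    <!-- On a duplicate or alternate product URL: point search engines to the preferred page -->
    <link rel="canonical" href="http://www.example.com/books/half-girlfriend">

Both snippets go inside the page's head section; the canonical tag is what lets Amazon-style sites expose many navigation paths while having only one page indexed.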

Ad-to-content ratio
One of the elements the algorithm uses to test quality is the number of ads shown. There is no known optimal ratio, so it's best to be judicious and keep the number of ads minimal to stay safe. The algorithm is believed to have matured considerably in judging both duplicate content and ad-to-content figures.

Technical aspects
Technical aspects also differentiate the quality of a website, including performance metrics such as page loading time, UX aspects like navigational ease and design, and factors like proper redirects, 404 pages and so on.

Finally, here is a Google Webmaster Central blog post on what Google considers a high-quality website.

While the overall check of Panda updates was on the quality of websites, the third version of Panda (specifically 3.3 onward) also focused on a few other aspects. One improvement was around local search results: showing results that are relevant from a user/content proximity point of view. Another improvement was related to link evaluation (using link characteristics like wording, the number of similar links etc. to understand the content of the linked page); this was taken to an altogether different level with the Penguin updates. Freshness of the website was also considered in this update.

Google Panda updates starting with 3.0 were so regular that it became difficult to focus on each update specifically. Industry experts even opined that the dead Google Dance was back with these regular refreshes. The Panda refreshes are now tracked by consecutive numbers rather than a versioning system. From late 2013, Google started to roll Panda updates into its indexing algorithms, making them part of the normal algorithm changes Google makes throughout the year. Panda 4.0 was released in May this year; it specifically improved how Google views the authority and originality of articles on the web. Here is a good case study on this from Search Engine Land. Another effect of Panda 4.0 was on how PR websites are ranked: many PR websites were affected and had to redefine their publishing guidelines. The latest one, 4.1 or the 27th Panda update, arrived in September 2014 and is expected to help small websites with original content. Websites taking recovery actions are also expected to benefit from this update.

One thing we can understand by studying these updates and their history is that all SEO practices revolve around them (link building, content marketing and so on…)

Understanding Google Penguin updates

Let's talk about the various Google search algorithms in the next few posts.

The main Google algorithm updates we often hear about are the Hummingbird, Panda and Penguin updates. Other components include Pigeon and the Knowledge Graph. It's interesting to note that most of these names were not coined by Google but by various industry sources in the SEO world. Also, the above names are the ones relevant today; if you have been a professional in this space for long, you would have heard of updates from Google Dance to Fritz to Caffeine and more. Moz provides a good history timeline of Google algorithm changes along with associated key news articles (Google Algorithm Change History by Moz). In the next set of blog posts, I will try to explain the different components of Google Search algorithm updates and how they may be relevant to your website.
Google Penguin Updates

What are Google Penguin updates?
Let's start with the Penguin update, since it had a recent refresh. When the Penguin update was first launched, it focused on technical aspects of SEO like keyword stuffing and link building. Link building strategies changed completely because of Penguin updates. Penguin has gone through six updates, and throughout, its main focus has been on backlinks and link building practices.
The first Penguin update came in 2012 and targeted reducing web spam, specifically penalizing websites violating Google's webmaster or quality guidelines. These malpractices include keyword stuffing, link farms, usage of doorway pages and duplicate content. There were two more refreshes of Penguin 1.0 in 2012.

The next major update (Penguin 2.0) came in 2013. Among the aspects this update addressed were how Google views advertorials, along with continued refinement of backlinking practices. Advertorials were not penalized outright; instead, websites had to post visible disclaimers so readers could recognize the advertorials. Fortunately, historic linking practices were not penalized retroactively; only malpractices beyond the Penguin 2.0 launch date were penalized. Another interesting change in this update was how an authority gets ranked better in a particular domain. For example, if your website is an authority on pediatric queries/articles, you may get a better ranking for those search keywords. Thus SEO effort became more aligned with content marketing strategies, which also meant a shift of SEO focus from outside to in.
The fifth Penguin update (Penguin 2.1) was released by Google in October 2013. While there were no visible, significant changes in this update, industry experts consider it to have improved the level of link-bait analysis and to identify spam at a deeper level in websites. Inherently, recovering from Penguin updates meant looking at your link building practices, and there were many cases of recoveries during this update. A study on the Google Penguin 2.1 update by MathSight later that year found a linkage between the quality of the content, anchor texts and links. Some of the factors included the quality of words and syllables in the body text, thus giving importance to readability. An impact due to anchor texts implies a possible impact due to internal links too: over-optimized anchor texts and associated internal links (like link grids) may be affected adversely by Penguin if they were built for SEO rather than from a UX perspective. Other impacts of Penguin 2.0 included increased importance given to contextual links (links that are contextual and relevant to an article), and penalties for article spinning and unethical usage of micro-sites. Some industry experts like Glenn Gabe argue that Penguin also targets SEO tactics like forum spam.

The latest Penguin update came this year: Penguin 3.0, released on October 17, 2014. This refresh is expected to demote further some sites with bad linking practices, while sites hit earlier that have taken recovery actions should benefit.

How to test whether your site is affected by Penguin updates?
We can understand whether there was an impact by looking at your traffic.
  1. For a quick check, do a site-wide search on Google using the 'site:' operator (for example, 'site:yourdomain.com'). If no pages from your site show up in the results, you are affected.
  2. Use free tools like Fruition or Panguin. These require access to your site analytics and overlay your traffic time chart with the key dates of algorithm updates, so you can conclude your site was affected if you see a drop in traffic around those dates.
  3. If the penalty was a manual one from Google, you will have received a message in Webmaster Tools.

How to recover from any Penguin penalties?
This is a comprehensive topic, beyond the scope of a single post. The most important step is a backlink analysis, and we have to be very careful while doing it to differentiate between links of varying quality. The best place to start is Google Webmaster Tools, which provides a sample of links to your website under 'Links to Your Site' in the Search Traffic section. You can also use third-party tools like Majestic SEO or Open Site Explorer. Group the links into those you think are malicious, those of high quality and so on. You can ask the website owners to remove any links you think are spammy or unworthy, or else use the Google disavow tool to ask Google to ignore links from dodgy sites. One needs to be very careful with disavow, though, since it can result in unwanted issues; make sure to read the Webmaster Tools help on this topic.
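
For illustration, the disavow tool takes a plain text file with one domain or URL per line and '#' marking comments. A minimal sketch, assuming the domains below are placeholders identified during your backlink analysis:

    # Spammy directory found during backlink analysis
    domain:spammy-directory.example
    # A single page can be disavowed by listing its full URL
    http://forum.example.net/thread/123

Using 'domain:' disavows every link from that site, so double-check your grouping before uploading the file.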

If you are not affected by Penguin updates, or you have a new website in the making, the best tactics to keep away from any penalties are:
  • Keep the content relevant for the reader and not the search engine
  • Use social media to gain relevance and links
  • Avoid any paid link building schemes
  • Avoid misusing on-page SEO techniques

Understanding Local SEO

As I mentioned in the last post, I will continue to write about digital marketing topics for some more time :) In this blog post, let's try to understand Local SEO: what it is and how to do it.

Local SEO deals with search results within a defined locality or geographic location. It can be considered a subset of overall SEO, focusing on showcasing your web details in SERPs for location-based search keywords. The Google search algorithm has evolved to streamline search results based on locality, intent of search and so on. Let's take an example: 'Dominos Cochin'.

A search for Dominos Pizzas in Cochin provides results as shown above, and two components are of interest to us. Along with the regular ten search results, there is a map at the top right with pins pointing to the locations of Dominos stores in Cochin, Kerala, India. On the left we can see Google Plus pages pointing to the Dominos store pages, populated from Google+ Local. Local SEO, in essence, is trying to get such a listing of your business to the top of such results. A similar experience is offered by other major search engines like Bing; for example, let's consider the same search query in Bing.

In the case of Bing search results, instead of Google+ Local pages we see Foursquare pages of Dominos. Thus, as you might have guessed, the first step in local SEO is simply creating the appropriate business listing in Google+ Local, Foursquare or Yahoo Listing. Creating this listing accurately helps populate the results in both the map and the local search result carousel. Having said that, it's not enough; another important aspect is citations. Citations are listings of your business in other online directories, like Yelp, Tripadvisor or Justdial. Any information that can identify your business can be considered a citation (like the name of the company or a phone number). The most important caveat here is to keep the information consistent across these platforms. In a way, this is going back to the old days of SEO: adding yourself to directories. That said, citations don't mean limiting yourself to these directories; the platforms can be anything from Google+ pages to niche platforms like Tripadvisor or simply your own blog.

Adding the listing in these directories is just the first step; the next obvious step is to keep the listing vibrant. One possibility is managing reviews on these sites, and Tripadvisor is a good example. Say you run a resort in Chennai: reviews, testimonials and photographs uploaded both by you and by visitors add credibility with both the search engine and prospective customers. In essence, this is an overlap between SEO, community management and social media, and it is equally applicable to Google+ Local or Foursquare.

The next pillar of your local SEO is your website itself. Adding a consistent citation on your website, whether on your contact page or in the footer, is very important; it is in fact one of the signals search engines use to decide the local relevancy of a website or business. Utilizing schema markup for this helps the website greatly. Schema markup conveys to search engines aspects like reviews and business hours from the website content and helps populate the search engine results pages.

For example, schema.org 'openingHours' markup tells search engines that your business is open Monday to Saturday from nine in the morning to nine at night, and if your site comes up in the SERPs for a keyword, those details can be shown along with the result.
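
Here is a minimal microdata sketch, assuming a hypothetical business (the name, phone number and hours are placeholders):

    <div itemscope itemtype="http://schema.org/LocalBusiness">
      <span itemprop="name">Example Resort</span>
      <span itemprop="telephone">+91-00-0000-0000</span>
      <time itemprop="openingHours" datetime="Mo-Sa 09:00-21:00">Mon-Sat, 9 am to 9 pm</time>
    </div>

The 'Mo-Sa 09:00-21:00' value follows the schema.org convention of day abbreviations followed by a 24-hour time range.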

Utilization of schema markup is not limited to NAP (Name, Address, Phone) or working hours; for the complete list of possibilities, visit the schema.org page on local business. Another important use of schemas is in managing testimonials. Having a page or section on your website for testimonials, marked up in a proper schema format, not only gains credibility with visitors; search engines may also show the testimonials in search results.
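
A hedged sketch of testimonial markup using the schema.org Review type (the reviewer and text are made up for illustration):

    <div itemscope itemtype="http://schema.org/Review">
      <span itemprop="author">A. Customer</span> rated us
      <span itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
        <span itemprop="ratingValue">5</span> out of <span itemprop="bestRating">5</span>
      </span>:
      <span itemprop="reviewBody">Wonderful stay; the staff were very helpful.</span>
    </div>

In practice you would also identify what is being reviewed (the 'itemReviewed' property); check schema.org for the properties search engines expect before relying on rich snippets.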

Finally, we haven't talked about the most important player in local SEO: the mobile device, which is becoming the backbone of local search day by day. Enhanced GPS capabilities, better screens and improved connectivity keep raising the quality of search on mobile devices. Thus the final pillar of an effective local SEO strategy is giving due respect to mobile: making your website readable on mobile, having an app strategy and so on.

If you are looking for an exhaustive list of factors affecting local SEO, consider reading Moz's Local Search Ranking Factors.

Let's take a deeper look into the Google+ Local search results coming up in SERPs. Populating the local listing carousel depends on a couple of aspects:

  1. The proximity between the business and the searcher's location
  2. How dynamic your listed page is in these directories. This includes not only how accurate the information is, but also how creatively you use the listing. For example, have you utilized the Photos & Videos section or the Additional Details section?
  3. Signals from regular search, such as how well you are reviewed, how good your general website is, and so on.

Blog posts on Digital Marketing

Lately, most of my posts on this blog have been related to digital marketing. The main reason is that I am passionate about the domain and am working closely with some great marketers (specifically on cross-channel attribution; more on this topic in a later post). So bear with me for a few more posts, as I will continue to write about these topics for some time. Considering that digital is taking a front seat in most marketing plans these days, I believe you will continue to gain something from reading these articles and find them interesting. Please share your valuable comments.

Signing off for the day, sharing these reports I came across on display advertising budget allocation by advertisers.

Google Trends and Keyword Research

Keyword research for SEO and SEM is a hot topic; a simple Google search for the term provides more than 18,10,000 results, and there are many paid and free tools available for it. In this blog post, I will concentrate on Google Trends as a keyword research tool. Google Trends shows search volume on a relative scale: how often a term is searched on Google relative to the total search volume. The beauty of the tool is that it allows comparison between multiple terms. Added capabilities include selecting the date range for search volume, showing news related to the search term and so on.

So how to include Google Trends in your market research?

The important aspect of Google Trends is that it provides historical information on how search volume has changed across the years. This gives us an idea not only of the trend and volume, but also of any seasonal effects. For example, consider the chart below, which shows the search volume trend for 'buy AC'. As we know, sales of air conditioners peak during the summer, and the trend validates that. A look at the geography map shows most of the searches coming from the northern part of India, where the effect of summer is stronger.

Let's consider another example. If your business concentrates on publishing audiobooks, one of the keywords to focus on would be 'audiobook'. However, there are three other possibilities: 'audio book', 'audio books' and 'audiobooks'. Google Trends lets you analyze simultaneously how these keywords do in terms of search volume.

Another use case is word order within a search phrase. Consider, for example, 'quality management consulting' versus 'consulting quality management'. As we can see, the search volumes vary, and we may choose one phrase over the other based on the trend.

Finally, an important benefit of keeping track of Google Trends is for content marketing. It helps on two fronts: keeping an editorial calendar that follows the trend, and developing content marketing themes based on rising/top keyword volumes. If you are in the search marketing domain, I am sure you have come across Google Trends being used as a keyword research tool; but beyond keyword research, it has wider utility as a market research tool. For example, in the audiobooks chart from the example above, searches for 'tamil audio books' are rising at a staggering 450%. If you are a business focused on publishing audiobooks, you may want to increase the number of Tamil titles.

Let me close this post with another interesting use of Google Trends: keeping track of key brand keywords. If you are the digital marketer at one of the leading online travel sites, this is a chart you will ponder over. It shows 'yatra.com' as a search term decreasing drastically over time while 'makemytrip' gained the leadership position.

A quick check of the term 'MBA' shows a decreasing trend ;)

Image courtesy of Stuart Miles at FreeDigitalPhotos.net

Naming new products and versions

New product naming is always a challenge and a fun experience. The name plays an important role, be it for just a product or for a brand. Naming products whose improvements are introduced rapidly is handled to an extent by including a versioning system: a V2 product is an enhanced version of the first entry and is presumed to be the better one. While naming a product, the following are the deciding factors -

The product name should align with your brand, be it the parent company or the segment. For example, iPad mini means a miniature version of the original iPad, and iPad Air is the lighter version. A good example from the online advertising space is Google's DoubleClick suite of products.

The product name should ring a bell without making the buyer think about what it means. For example, when the next version of the iPad Air was introduced, it was simply called iPad Air 2. Having said that, here is the interesting part: what about when Apple introduced all these products for the first time? At that time the iPod was new and the iPad was new. Even then, 'pod' suggested something related to music, and 'pad' something you hold in your hand to write or work on. Apple also used the 'i' prefix to relate and align the products with the rest of its line.

One example which completely screwed up this part is the kitchen appliance maker TTK-Prestige. Consider their induction cooktops, which use a weird naming convention to differentiate between models: PIC 3.0, PIC 6.0 V2, PIC 14.0. It is a natural tendency for the end user to think the 14.0 model is the best and the latest; however, that may not be the case at all!

Let's take another aspect. Assume we are going to market with an improved version of an already prominent product. We see many marketers naming such an offering 'Next Generation', a name so cliched that every Tom, Dick and Harry uses it. Things get complicated if the next-generation offering misses some favorite use cases the old version supported (even silly ones!); that sows doubt in the existing customer base about the maturity of the next generation.

Another challenge with this type of naming is what happens if we envision coming up with another version in the future; what will we call it then? It would look awkward to call it Next Generation V2... wouldn't it?

What do you think about naming products?

Shift from On-Page SEO to Off-Page SEO

As we discussed in the refresher note on SEO, Search Engine Optimization (SEO) techniques are generally classified into on-page SEO and off-page SEO. The explosion of social media usage and improvements in search engine algorithms have literally shifted the focus areas of an SEO professional. Gone are the days when on-page optimization techniques covered most of the SEO portfolio; in my opinion, the importance of on-page optimization has reduced drastically while off-page optimization requirements gain prominence. SEO activities are also becoming more of a day-to-day routine. Gone are the days when we completed all the on-page optimization requirements, like setting the tags properly and using keywords, and then waited for search engines to pick up our web pages. Today, search engine algorithms are sophisticated enough to customize search results based on the user's interest, browsing history, specific requirements and so on.
Consider for reference the updated report from Search Engine Land (The Periodic Table Of SEO Success Factors). As we can see, what matters more is how relevant your website is to the audience: Is it credible? Is the content shareable? These are the factors that affect the traffic to a website, and hence off-page SEO is getting more prominence these days. In short, off-page optimization techniques improve your search rankings; on-page SEO techniques help in sustaining those rankings.

On-page optimization techniques are the foundation on which you build your SEO strategies. Not giving due importance to on-page SEO is like not worrying about your building's foundation: irrespective of how many floors you build, how marvellous the architecture is or how well you do the interior decoration, the building won't stand long unless the foundation is done properly. Similar is the case with on-page versus off-page SEO.

Let's take an example to understand this. Assume you wrote a stellar article but didn't give due importance to setting its title and meta description properly. The search engine may still give it importance based on keyword density, and readers may share it across social media channels, but you stand to lose out on sustaining the ranking because of the bad meta tag usage. You lose out from the sharing perspective too, since the title, automatic description and so on that get pulled by social media platforms won't be proper. Thus on-page optimization requirements are the hygiene factors, and off-page optimization requirements are the motivational factors that keep people visiting and sharing your website.
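
As a minimal sketch of the on-page hygiene in question, here is what a well-set head section might look like (the titles and descriptions are placeholders; the og: tags come from the Open Graph protocol that social platforms read):

    <head>
      <title>Stellar Article Title | Your Brand</title>
      <meta name="description" content="A hand-written summary that search engines can show in SERPs.">
      <!-- Open Graph tags control how the page appears when shared on social media -->
      <meta property="og:title" content="Stellar Article Title">
      <meta property="og:description" content="A hand-written summary for social shares.">
    </head>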

SEO should be considered a strategic component of your marketing mix. The on-page versus off-page optimization discussion often pops up when a firm considers SEO an overhead effort. The shift of importance from on-page to off-page techniques epitomizes the fact that SEO is becoming more strategic in nature. Consider your SEO efforts an investment in building your brand: SEO is not about a web developer doing her part; it's a marketer's thought process in action. Hence it is important to keep a process in place to ensure on-page requirements are met, and to continue investing in off-page requirements to keep your website relevant and drive traffic!

If you are more interested in this topic, consider reading this Forbes interview with Sam McRoberts, a well-known SEO expert in the industry.

What do you think about on-page SEO v/s off-page SEO? Is the relevance of SEO tactics increasing or diminishing?

Dofollow, Nofollow - What to follow!

Dofollow, Nofollow, Doindex and No-index
These are four terms you often encounter while discussing Search Engine Optimisation (SEO). Understanding them is also important from two other perspectives: link building and social bookmarking. Let's briefly discuss these today.


Nofollow and Dofollow

This concept was introduced to overcome some of the earlier black-hat SEO techniques in link building. The number of links you get from outside is (was) an important factor in Google PageRank and other search engine rankings. As you might have guessed, this resulted in lots of spam, with folks trying to link to a website from all places possible (comments, forum threads, paid links and so on).

Simply put, nofollow is a value of the 'rel' attribute on HTML links. Setting rel to 'nofollow' tells search engines that the link should be discarded for search ranking purposes. The introduction of this concept helped reduce a lot of link spam; read this Wikipedia article for good coverage of spamdexing and link spam (Spamdexing-Wikipedia). A good place to see the importance of the follow/nofollow concept is the comments section of your blog. If you have a blog, I am sure you get lots of spam comments and many 'look-good-yet-spam' comments linking back to the commenter's website. There are WordPress plugins that set the nofollow attribute on comment links, so search engines won't give importance to those links from your site. Even in Blogger, when you add a link, there is an option to mark it as nofollow.

Nofollow can be applied at two levels: link level and content (page) level. A link is set to nofollow by adding the rel attribute to the anchor tag, while page-level nofollow is set using the robots meta tag, as in the sketch below.
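
Here is a minimal sketch of both levels (the URL is a placeholder):

    <!-- Link level: this particular link passes no endorsement -->
    <a href="http://commenter-site.example/" rel="nofollow">commenter's website</a>

    <!-- Page level: asks search engines to ignore all links on this page -->
    <meta name="robots" content="nofollow">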

The page-level snippet makes the entire web page or blog post nofollow for Google, so no link on the page is considered for link juice. Now let's understand the follow side: if a link or page doesn't have rel set to nofollow, it is considered followed by default. Some marketers explicitly write rel="dofollow" to signal that a link should be followed, but note that this is not a standard value; followed is simply the default behavior.

Having discussed all this, I would also say that over time the importance online marketers give to these attributes has gone down. Many popular sites like Facebook and Twitter have a nofollow policy, yet links coming from them are considered important for search engine rankings. The most important thing to keep in mind is that links are always good (at least for driving traffic, if not SERPs); we need to use them judiciously and not as a spamming tool. Here is Google's view on nofollow.


Noindex

The noindex attribute tells search engines not to include a web page in their index. It is used via the robots meta tag, mainly for large sets of utility pages (print versions, mobile variants etc.), pages involving a large amount of back-end communication, or intranet web pages. Usage and interpretation of noindex can vary between search engines, and we can set the attribute per search engine as shown below (read more in this Noindex-Wikipedia article).
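
A minimal sketch of both the generic and the search-engine-specific forms (these meta tags go in the page's head):

    <!-- Keep the page out of every search engine's index -->
    <meta name="robots" content="noindex">

    <!-- Target a specific crawler, e.g. Google's -->
    <meta name="googlebot" content="noindex">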

The basic terms discussed in this post are foundational for building good link building and social media strategies (we will look into these in later posts).

What’s your view about Nofollow/Dofollow; do you think search engines consider these important in their search algorithms?
Image courtesy of Stuart Miles at FreeDigitalPhotos.net