B School | Business | Marketing | Life ...


Friday, November 27, 2015

Book Review: Why I Failed by Shweta Punj

Why I Failed - Lessons from Leaders by Shweta Punj attracted my attention because most books on business and self-help talk about success, or how to succeed. This book is a collection of failure stories from well-known personalities. I started reading it expecting a different perspective; but to be honest, it disappointed me.

Most of the failure stories are told at a very high level and fail to describe the details. I am sure most readers wanted that extra piece of narrative on what happened, how it happened, its consequences and how the leaders overcame it. At times I felt the book is just another one talking about success and how to succeed :) The only thought I could take away from the book was that everyone, including the leaders we adore, goes through phases of failure.

The book tells 16 stories of leaders from various domains. The ones I found inspiring were those of Sabyasachi Mukherjee and Sminu Jindal. The least digestible were those of Abhinav Bindra (was the failure really due to the pressure of expectations, or just overconfidence?) and Narayana Murthy (because I have read marvellous narratives of it in other articles).

Overall, a good read; but don't over-expect. It felt more like a collection of how-they-succeeded stories than failure stories!

Monday, November 23, 2015

Understanding Google Panda algorithm updates

In the last blog post, we discussed Google Penguin updates. Today, let's try to understand the Google Panda update. The first release of Google Panda was in February 2011. While Penguin updates went after back links, the main purpose of Panda updates is to watch the quality of website content. I believe by far the biggest effect on search results has been made by Panda updates.

Panda updates fight scraper sites and content farms. A scraper essentially copies content from various websites using web scraping, which can range from simple manual copy-and-paste to the use of sophisticated web scraping software. Content farms, on the other hand, are websites with large volumes of low-quality content, often copied from other websites or repeated across multiple pages of the same site. Content farms usually spend effort understanding popular search keywords and spinning out content based on those phrases. Thus the Panda update was also called the Farmer update :)

Unlike Penguin updates, Google Panda updates are more frequent; the algorithm is believed to have seen close to thirty refreshes. As mentioned, quality of content is the main focus of these updates. However, quality is not confined to just the written content. Some of the major factors that could determine the quality of a website are:

Quality of written content
Are you an authority on the topic? Is your content original? Will a reader benefit from the content you have put on your website? This is of utmost importance for e-commerce sites, where there is a tendency to use minimal, duplicate or manufacturer wording as product descriptions. These are the factors that determine the quality of written content.

Duplicate content
If you are using the same content on different pages of your site (even with minimal wording changes), you risk being affected by Panda. One place where this bites unexpectedly is blogs: if you use both tags and categories in your blog, the resulting archive pages can be considered duplicates. But don't worry, techniques like noindex tags help overcome this issue. Another method is to use canonical tags to tell Google which page to index. Duplicate content is not limited to your own website; it could be outside duplicate content too - for example, micro-sites or your own blogs where you either copy-paste the same content or make only minimal changes. Here is a good Moz article that describes the various duplicate possibilities in the eyes of Google.
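To make the two techniques concrete, here is a minimal sketch of what they look like in a page's head section (the URL is a hypothetical example):

```html
<!-- On a duplicate or secondary page (say, a tag archive), a canonical tag
     points search engines to the single page you want indexed -->
<link rel="canonical" href="https://www.example.com/blog/my-original-post/" />

<!-- Alternatively, a noindex directive tells search engines to skip this page
     entirely while still following its links -->
<meta name="robots" content="noindex, follow" />
```

Use one or the other on the duplicate page, not the original: the canonical tag consolidates ranking signals onto the preferred URL, while noindex simply keeps the duplicate out of the index.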

Let’s take an example to understand the above two points better. Consider the book Half Girlfriend on Amazon or Flipkart. As you can see, the product description is kept different on both, and both are different from the description Chetan Bhagat gives on his website. Also, we can reach the product page on Amazon or Flipkart through multiple navigational paths or URLs (category, search, direct links etc.); but Google indexes only the main product page.

Ad to content ratio
One of the elements the algorithm uses to test quality is the number of ads shown. There is no known optimal ad-to-content ratio, so it's best to be judicious and keep the number of ads minimal. The algorithm is expected to have matured enough to understand reasonable duplicate and ad-to-content figures.

Technical aspects
Technical aspects also differentiate the quality of a website, including performance metrics such as page loading time, UX aspects like navigational ease and design, and factors like proper redirects, 404 pages and so on.

Finally, here is a Google Webmaster Central blog post on what Google thinks a high-quality website means.

While the overall check of Panda updates was on the quality of websites, the third version (specifically 3.3 onward) also focused on a few other aspects. One improvement was around local search results: this update concentrated on showing more relevant results from a user/content proximity point of view. Another improvement was related to link evaluation (using link characteristics such as wording and the number of similar links to understand the content of the linked page); this was taken to an altogether different level with the Penguin updates. Freshness of the website was also considered in this update.

Google Panda updates starting from 3.0 were so regular that it became difficult to focus on each update specifically. Industry experts even opined that the dead Google Dance was back with these regular refreshes. The Panda refreshes are now tracked by consecutive numbers rather than a versioning system. From late 2013, Google started to roll Panda updates into index algorithms, making them part of the normal algorithm changes Google makes throughout the year. Panda 4.0 was released in May 2014. This update specifically improved how Google views the authority and originality of articles on the web; here is a good case study on this from Search Engine Land. Another effect of Panda 4.0 was on how PR websites are ranked: many PR websites were affected and had to redefine their publishing guidelines. The latest, 4.1 or the 27th Panda update, came in September 2014 and is expected to help small websites with original content. Websites taking recovery actions are also expected to benefit from this update.

One thing we can understand by studying these updates and their history is that all SEO practices (link building, content marketing and so on) revolve around them.

Monday, November 16, 2015

Understanding Google Penguin updates

Let's talk about the various Google search algorithms over the next few posts.

The main Google algorithm updates we often hear about are Hummingbird, Panda and Penguin; other components include Pigeon and the Knowledge Graph. It’s interesting to note that most of these names were coined not by Google but by various industry sources in the SEO world. Also, the names mentioned above are the ones relevant today; if you have been a professional in this space for long, you would have heard of updates from Google Dance to Fritz to Caffeine and more. Moz provides a good timeline of Google algorithm changes along with associated key news articles (Google Algorithm Change History by Moz). In the next set of blog posts, I will try to explain the different components of Google's search algorithm updates and how they may be relevant to your website.
Google Penguin Updates

What are Google Penguin updates?
Let’s start with the Penguin update, since it had a recent refresh. When the Penguin update was first launched, it focused on technical aspects of SEO like keyword stuffing and link building. Link building strategies changed completely because of Penguin updates. Penguin has gone through six updates, and throughout, its main focus has been on back links and link building practices.
The first Penguin update was in 2012 and targeted reducing web spam, specifically penalizing websites violating Google’s webmaster or quality guidelines. These malpractices include keyword stuffing, link farms, usage of doorway pages and duplicate content. There were two more refreshes of Penguin 1.0 in 2012.

The next major update (Penguin 2.0) came in 2013. Among the aspects this update addressed were how Google views advertorials, along with continued refinement of back-linking practices. Advertorials were not penalized outright; instead, websites had to post visible disclaimers so readers could identify them. Fortunately, historic linking practices were not penalized retroactively; only malpractices after the Penguin 2.0 launch date were penalized. Another interesting change in this update was how an authority gets ranked better in a particular domain. For example, if your website is an authority on pediatric queries/articles, you may get a better ranking for those search keywords. SEO effort was thus getting more aligned with content marketing strategies, which also meant a shift of SEO focus from the outside in.
The fifth update of Penguin (Penguin 2.1) was released in October 2013. While there were no visible, significant changes in this update, industry experts consider it to have improved the level of link-bait analysis and to identify spam at a deeper level within websites. Inherently, recovering from Penguin updates meant looking at your link building practices, and there were many cases of recoveries during this update. A study on the Google Penguin 2.1 update by MathSight later that year found a linkage between the quality of the content, anchor texts and links. Some of the factors included the quality of words and syllables in the body text, thus giving importance to readability. An impact due to anchor texts means there could be an impact due to internal links too: over-optimized anchor texts and associated internal links (like link grids) may see an adverse effect from the Penguin update if they were meant for SEO rather than UX. Other impacts of Penguin 2.1 included increased importance for contextual links (links that are contextual and relevant to an article), and penalties for article spinning and unethical usage of micro-sites. Some industry experts, like Glenn Gabe, argue that Penguin also targets SEO tactics like forum spam.

The latest Penguin update, Penguin 3.0, was released on October 17, 2014. In this refresh, some sites with bad linking practices are expected to be demoted further, while sites hit earlier that have taken recovery actions may benefit.

How to test whether your site is affected by Penguin updates?
We can understand whether there was an impact by looking at the traffic.
  1. To understand the impact quickly, do a site-wide search on Google (using ‘site:’). If no pages from your site appear in the results, you are affected.
  2. Use free tools like Fruition or Panguin. These require access to your site analytics, and essentially provide a traffic time chart overlaid with the key dates when algorithm updates happened. You can conclude your site is affected if you see a reduction in traffic around those dates.
  3. If you think there is a possibility of a manual penalty from Google, you will see a message in Webmaster Tools.
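The overlay logic these tools apply can be approximated by hand: compare your average daily traffic in a window before a known update date against the window after it. A toy sketch in Python (the traffic figures are made up for illustration):

```python
from datetime import date, timedelta

def traffic_drop(daily_visits, update_day, window=7):
    """Percentage change in average daily visits in the `window` days
    after an algorithm update versus the `window` days before it."""
    before = [daily_visits[update_day - timedelta(days=i)] for i in range(1, window + 1)]
    after = [daily_visits[update_day + timedelta(days=i)] for i in range(1, window + 1)]
    avg_before = sum(before) / window
    avg_after = sum(after) / window
    return (avg_after - avg_before) / avg_before * 100

# Hypothetical traffic: a steady 1000 visits/day that halves after
# October 17, 2014 (the Penguin 3.0 release date)
penguin3 = date(2014, 10, 17)
visits = {penguin3 + timedelta(days=d): (1000 if d <= 0 else 500) for d in range(-7, 8)}
print(round(traffic_drop(visits, penguin3), 1))  # prints -50.0
```

A sharp negative number clustered around an update date is a hint, not proof; seasonality and other changes to the site can produce the same dip.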

How to recover from any Penguin penalties?
This is a comprehensive topic, beyond the scope of a single post. The most important step is a back-link analysis, and we have to be careful while doing it to differentiate between links of varying quality. The best place to start is Google Webmaster Tools, which provides a sample of links to your website in the ‘Links to Your Site’ section under Search Traffic. You can also use third-party tools like Majestic SEO or Open Site Explorer. Group the links into those you think are malicious, those of high quality, and so on. You could ask the website owners to remove any links you think are spam or not worthwhile, or else use the Google disavow tool to discount links from dodgy sites. But one needs to be very careful, since disavowing can result in unwanted issues; make sure to read the Webmaster Tools help on this topic.
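For reference, the disavow tool accepts a plain-text file with one entry per line; lines starting with # are comments. A minimal sketch, with hypothetical domains standing in for the dodgy sites you identified:

```text
# Contacted the owner of spamdomain1.com asking for link removal; no response
domain:spamdomain1.com

# Disavow a single spammy page rather than a whole domain
http://www.spamdomain2.com/forum/some-spammy-thread.html
```

Prefer the domain: form only when you are sure nothing on that domain links to you legitimately; otherwise list individual URLs.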

If you are not affected by Penguin updates, or are building a new website, the best tactics to keep away from penalties are:
  • Keep the content relevant for the reader and not the search engine
  • Use social media to gain relevance and links
  • Avoid any paid link building schemes
  • Avoid misusing on-page SEO techniques

Saturday, November 7, 2015

Understanding Local SEO

As I mentioned in the last post, I will continue to write about digital marketing topics for some more time :) In this blog post, let's try to understand Local SEO: what it is and how to do it.

Local SEO deals with search results within a defined locality or geographic location. It can be considered a subset of overall SEO, focused on showcasing your web details in SERPs for location-based search keywords. The Google search algorithm has evolved to streamline and showcase search results based on locality, intent of the search and so on. Let's take an example - 'Dominos Cochin'.

A search for Dominos Pizza in Cochin provides search results as shown above. Two components are of interest to us. Along with the regular 10 search results, a map appears at the top right with pins pointing to the locations of Dominos stores in Cochin, Kerala, India. On the left we see Google+ pages pointing to the Dominos store pages, populated from Google+ Local pages. Local SEO, in essence, is trying to get such a listing of your business to the top of such results. A similar experience is offered by other major search engines like Bing; for example, consider the same search query on Bing.

In the Bing search results, instead of Google+ Local pages we see Foursquare pages for Dominos. As you might have guessed, then, the first step in local SEO is creating the appropriate business listing in Google+ Local, Foursquare or Yahoo Listings. Creating this listing accurately helps populate both the map and the local search result carousel. Having said that, it's not enough on its own; another important aspect is citations. Citations are simply listings of your business in other online directories - like Yelp, Tripadvisor, Justdial and so on. Any information that can identify your business can be considered a citation (such as the company name or phone number). The most important caveat here is to keep the information consistent across these platforms. In a way, this is going back to the old days of SEO - adding yourself to directories. That said, citations don't mean limiting yourself to these directories; the platforms could be anything from Google+ pages to niche platforms like Tripadvisor, or simply your own blog.

Adding the listing in these directories is just the first step; the next obvious step is keeping the listing vibrant. One possibility is managing reviews on these sites, and Tripadvisor is a good example. Suppose you run a resort in Chennai: reviews, testimonials and photographs uploaded by both you and your visitors add credibility with both the search engine and prospective customers. In essence, this is an overlap between SEO, community management and social media, and it is equally applicable to Google+ Local or Foursquare.

The next pillar of your local SEO is your website itself. Adding consistent citation details to your website - whether on your contact-us page or in the footer - is very important; this is in fact one of the signals search engines use to decide the local relevancy of a website or business. Using schema markup for this helps the website greatly. Schema markup conveys to search engines aspects like reviews and business hours from the website content, and helps populate the search engine results pages.

For example, an opening-hours markup tells search engines that your business is open Monday to Saturday from nine in the morning to nine at night. If your site comes up in the SERPs for a keyword, this information can be shown alongside the result, similar to the screenshot below -

Utilization of schema markup is not limited to NAP (Name, Address, Phone) or working hours; for the complete list of possibilities, visit the schema.org page on local businesses. Another important use of schemas is in managing testimonials: having a page or section on your website for testimonials, marked up in the proper schema format, not only gains credibility with visitors but may also get them shown in search results.
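To make this concrete, here is a minimal sketch of schema.org LocalBusiness markup in JSON-LD form; every business detail below is a hypothetical placeholder:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Pizza Store",
  "telephone": "+91-484-0000000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Road",
    "addressLocality": "Cochin",
    "addressRegion": "Kerala",
    "addressCountry": "IN"
  },
  "openingHours": "Mo-Sa 09:00-21:00"
}
</script>
```

Notice that the name, address and phone fields are exactly the NAP details that should stay consistent across your directory citations; the markup simply restates them in a machine-readable form.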

Finally, we haven't talked about the most important player in local SEO - the mobile. Day by day, mobile devices are becoming the backbone of local search. Enhanced GPS capabilities, better screens and improved connectivity all help improve search quality on mobile devices. Thus the final pillar of an effective local SEO strategy is giving mobile its due respect: making your website readable on mobile, having an app strategy, and so on.

If you are looking for an exhaustive list of factors affecting local SEO, consider reading Moz's Local Search Ranking Factors.
Let's take a deeper look into the Google+ Local search results appearing in SERPs. Populating the local listing carousel depends on a couple of aspects -

  1. The proximity between the business and the searcher's location
  2. How dynamic your listed page is in these directories. This includes not only how accurate the information is, but also how creative you were in using the listing. For example, have you utilized the Photos & Videos section? Have you utilized the Additional Details section?
  3. Signals from regular search - such as how well you are reviewed, how good your general website is, and so on.