Useful Commands

Reading Time: 1 minute

Git Cheatsheet

To initialise (create) a git repository
git init
Add files to the branch
git add --all
Commit files
git commit -m "initial commit"
Add remote repo
git remote add origin https://github.com/<username>/<repo>.git
Push commit to the remote repo
git push -u origin master
Check status
git status
Check current branch
git branch
Change branch
git checkout <branch_name>
Pull changes from origin (remote branch) to master (local branch)
git pull origin master
For repos with unrelated histories use the --allow-unrelated-histories flag
Create a tag (annotated)
git tag -a <tag> -m "<tag comment>"
View tags on a branch
git tag
Search for a specific tag on a branch
git tag -l "<tag>"
A wildcard (*) search can be used within the quotes (")
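The commands above can be strung together into a minimal end-to-end session. This is only a sketch: the repository contents, tag name and identity settings below are illustrative, and the demo runs in a throwaway temporary directory.

```shell
# Minimal end-to-end session: initialise, commit, and tag a repository.
set -e
repo=$(mktemp -d) && cd "$repo"           # throwaway directory for the demo
git init -q
git config user.email "demo@example.com"  # illustrative identity for the commit
git config user.name  "Demo User"
echo "hello" > README.md
git add --all
git commit -q -m "initial commit"
git tag -a v1.0 -m "first release"
git tag -l "v*"                           # prints: v1.0
```

Pushing to a remote would follow the same pattern as above (git remote add origin …, then git push -u origin master).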

Skunk Works

Reading Time: 3 minutes

Companies often run into situations where employees are so focused on meeting quarterly goals and annual targets that they fail to notice and take advantage of key market opportunities which, if pursued, would give the company a significant competitive advantage in the longer term.

Every major technology company pours millions of dollars into Research and Development (R&D). Yet, apart from a handful of companies, most R&D initiatives do not result in any significant business benefit. The reason is not a lack of ambition or effort on the R&D team's part; the issue lies with the company not taking advantage of the discoveries and breakthroughs uncovered by the R&D department.

Kodak is somewhat of a clichéd example, but nevertheless its eventual demise accurately reflects the points highlighted above. Kodak was the company that invented the digital camera, yet it failed to take advantage of the opportunity. When Steve Sasson, an engineer at Eastman Kodak, presented the prototype digital camera back in 1975, the company's marketing and business departments did not appreciate it, partly because the digital camera was viewed as a threat to their existing business lines. Eventually, companies like Sony, Nikon, Canon and Samsung took advantage of this opportunity at Kodak's expense.

This scenario is common across almost all established businesses: a new product or service is viewed as a threat to the existing business lines and is not explored further.

There are two major contributing factors to this scenario.

1. R&D Initiatives Are Incremental in Nature

Most R&D initiatives are evolutionary rather than revolutionary in nature. Part of the problem is a focus on near-term gains rather than a long-term vision. R&D teams that produce results which can be pushed out to the market in the next iteration of existing products are more likely to succeed within their peer group. This takes the focus away from what could be done to truly disrupt the existing line of products and services a company offers. For most companies the R&D department is, in essence, just an extension of their market-reactive selves and is not truly autonomous.

2. Not Taking Advantage of the Discoveries

Discoveries which just incrementally enhance the existing products and services of a company are most likely to be pushed to the market. Breakthroughs and discoveries which fundamentally change the nature of a company’s products or services are considered too risky and never see the light of day.

Once a company achieves success, its desire to develop and disrupt its existing line of products and services is diminished. A much stronger desire to maximize its current profits presides over the company. And this is where the problem lies.

The conjunctive impact of these two factors contributes towards incumbency. Companies are under the false impression of being at the cutting edge of research and development in their fields when they are merely spending dollars on incremental, linear changes.

This is where Skunk Works thinking can be extremely useful.

What is Skunk Works?

Clarence Leonard "Kelly" Johnson pioneered this methodology at Lockheed in June 1943. Kelly Johnson and his Skunk Works team designed and built the XP-80 (the prototype version of the P-80) in only 143 days, seven days ahead of schedule.

As Peter Diamandis and Steven Kotler elaborate on this concept in their book Bold: How to Go Big, Create Wealth and Impact the World, there are three main ingredients to facilitate Skunk Works:

1. Isolation

To truly innovate, it is necessary for the innovation team to be detached from the rest of the organisation. This way the team can function autonomously and focus on initiatives that will truly disrupt and push forward a company's core business. Isolation is one of the key requirements for Skunk Works, as it stimulates risk taking, encourages original ideas and acts as a counter-force to organisational inertia.

2. Rapid Iteration

Rapid iteration is the practice of quickly trying out ideas or "experiments" and building on the knowledge gained from the results. This way the team gets almost instant feedback on its minimum viable product or service without having to spend large sums of money and time on developing a well-polished product that nobody wants. It also allows the team to incorporate customer feedback into the next iteration of the product.

3. Big Goals

Skunk Works ventures are created to tackle hard problems. The aim is to achieve difficult goals. Business-as-usual activities and operations are not a good fit for going skunk unless the idea is to fundamentally disrupt the existing processes.


The combination of big goals, isolation and rapid iteration fosters an environment which allows the team to be autonomous and to master the skills required to deliver purpose-driven big goals.


Further Reading:

  1. Link to the 14 Rules of Skunk Works – Kelly’s 14 Rules & Practices
  2. Kelly Johnson’s biography – Clarence “Kelly” Johnson: Architect of the Air
  3. Peter H. Diamandis and Steven Kotler – Bold: How to Go Big, Create Wealth and Impact the World
  4. Lockheed Martin’s Skunk Works

Lexical Normalisation of Twitter Data Presentation at SAI Conference 2015, London, UK

Reading Time: 1 minute

Highlights from my presentation at Science and Information Conference (SAI) 2015 in London, UK.

Watch the entire presentation here.

For presentation slides and further reading click here.

Migrating Amazon Linux AMI PV (Paravirtual) EC2 Instance to HVM (Hardware assisted Virtualization) Instance

Reading Time: 2 minutes


As of now there is no easy way to directly migrate an existing PV (T1) EC2 instance to an HVM (T2) instance. Using the following steps we will migrate data from an existing PV (Paravirtual) instance to a newly launched HVM (Hardware-assisted Virtualisation) instance.


STEP 1 – Prepare SOURCE Volume

PV Instance


  1. Stop the PV instance
  2. Using the AWS Console go to the PV instance root volume and create a snapshot
  3. From the snapshot create a new volume; we will call this the source volume


STEP 2 – Prepare TARGET Volume

HVM Instance


  1. Create and launch a new Hardware-assisted Virtualisation (HVM) instance with a similar configuration to the PV instance. Ensure that the new HVM instance uses the same base AMI as the PV instance and is in the same Availability Zone.
  2. Stop the HVM instance
  3. Using the AWS Console go to the newly created HVM’s root volume and create a snapshot
  4. From the snapshot create a new volume; we will call this the target volume
  5. Using the AWS Console:
    Attach SOURCE volume to the new instance as /dev/xvdf
    Attach TARGET volume to the new instance as /dev/xvdg
  6. Start the HVM


STEP 3 – Migrate Data from Source Volume (PV) to Target Volume (HVM)



  1. SSH into the new HVM instance and get root access by using:
    sudo su
  2. Mount the source and target drives using the following commands:
    mkdir -p /mnt/source && mount /dev/xvdf /mnt/source
    mkdir -p /mnt/target && mount /dev/xvdg1 /mnt/target

Please use ls -al /dev/xvd* to identify the drives if the above device labels differ on your machine.

  3. Delete everything but /boot on the target volume using the following command:
    cd /mnt/target && ls | grep -v boot | xargs rm -Rf
  4. Delete /boot on the source volume:
    rm -Rf /mnt/source/boot
  5. Copy the source volume’s data to the target volume, preserving all attributes:
    rsync -aAXHPv /mnt/source/ /mnt/target
  6. After the data has been copied across, navigate to the following directory on the target volume:
    cd /mnt/target/boot/grub
    We will update the grub.conf file to tell the kernel to disable SELinux on boot. Append the following to the kernel line in /boot/grub/grub.conf for your current kernel:
    enforcing=0
  7. After appending the above setting, the kernel line in your grub.conf file should end with enforcing=0.
  8. Stop the instance and detach all volumes using the AWS Console.
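The grub.conf edit above can also be scripted. The sketch below builds an illustrative sample file and applies the same change with sed; on the real instance you would run the sed command against /mnt/target/boot/grub/grub.conf instead (the kernel and initrd paths shown are placeholders).

```shell
# Sketch: append enforcing=0 to every kernel line in grub.conf.
# The sample file is illustrative; on the instance, target
# /mnt/target/boot/grub/grub.conf instead.
cat > grub.conf <<'EOF'
default=0
timeout=1
title Amazon Linux
    root (hd0,0)
    kernel /boot/vmlinuz root=LABEL=/ console=ttyS0
    initrd /boot/initramfs.img
EOF
sed -i '/^[[:space:]]*kernel/ s/$/ enforcing=0/' grub.conf
grep -c 'enforcing=0' grub.conf   # prints: 1
```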



STEP 4 – Attach the Target Volume to the HVM Instance as the Root Device


  1. Attach the TARGET volume to the new instance as /dev/xvda.
    Please check where the original root device was mounted on the HVM instance. In most cases it should be /dev/xvda.
  2. Start your HVM instance. It should be an exact replica of the PV instance.
    The PV instance and the source volume can now be deleted.

Design Thinking

Reading Time: 2 minutes


We are on the verge of a monumental shift in the way consumers choose and utilize products and services. We have seen the transition from the Industrial Age to the Information Age, which fundamentally changed the way we produce and consume, i.e. the basic building blocks of our economy. The transition was quite stark: we saw a shift from a product-based economy to a services-oriented economy.

With the maturation of the Information Age we are now moving away from an economy and a society built on the logical, linear, computer-like capabilities of the Information Age towards an economy and society built on the inventive, empathic, big-picture capabilities of what is rising in its place: the Conceptual Age.

The Conceptual Age belongs to a very different kind of person with a very different kind of mind: creators and empathizers, pattern recognizers, and meaning makers. Inventors, designers, artists, storytellers and big-picture thinkers are the kind of people primed to excel in the new era and reap society’s richest rewards.

For existing businesses to stay competitive, it is important to differentiate themselves from their competitors not only in terms of the quality of the service but also in terms of the overall experience consumers have when they interact with the brand. Businesses need to start thinking in terms of selling an experience rather than just a product or a service.

Consumers are now looking for a sense of personal satisfaction from any product or service that they utilize. This personal satisfaction depends on attributes like design, story, symphony, empathy and meaning: the fundamental human abilities that trigger a response resonating with consumers at a very personal level. These attributes are at the core of the design thinking approach.

So what is Design Thinking?

Design thinking refers to design-specific cognitive activities that designers apply during the process of designing.
— Willemien Visser, The Cognitive Artifacts of Designing 

“Design thinking is a human-centered approach to innovation that draws from the designer’s toolkit to integrate the needs of people, the possibilities of technology, and the requirements for business success.”
— Tim Brown, President and CEO, IDEO






IBM Design


Helvetica [2007]


A documentary about typography, graphic design, and global visual culture. Click here for movie homepage.


Design & Thinking [2012] 


An excellent documentary exploring the idea of design thinking. Click here for movie homepage.

Predictive Capacity of Meteorological Data

Reading Time: 1 minute


View Presentation: here

With the availability of high-precision digital sensors and cheap storage media, it is not uncommon to find large amounts of data collected on almost all measurable attributes, both in nature and in man-made habitats. Weather in particular has been an area of keen interest for researchers seeking to develop more accurate and reliable prediction models.

This paper presents a set of experiments which use prevalent machine learning techniques to build models that predict the day of the week given the weather data for that particular day (temperature, wind, rain, etc.), and tests their reliability across four cities in Australia: Brisbane, Adelaide, Perth and Hobart. The results provide a comparison of the accuracy of these machine learning techniques and their reliability in predicting the day of the week from the weather data. We then apply the models to predict weather conditions based on the available data.

Further Reading: The complete paper is available here.

Lexical Normalisation of Twitter Data

Reading Time: 1 minute

View Presentation: here

Watch the Highlights: here

Twitter, with over 500 million users globally, generates over 100,000 tweets per minute. The 140-character limit per tweet, perhaps unintentionally, encourages users to use shorthand notations and to strip spellings to their bare minimum "syllables" or elisions, e.g. "srsly".

The analysis of Twitter messages, which typically contain misspellings, elisions and grammatical errors, poses a challenge to established Natural Language Processing (NLP) tools, which are generally designed with the assumption that the data conforms to the basic grammatical structure of the English language.

In order to make sense of Twitter messages it is necessary to first transform them into a canonical form, consistent with the dictionary or grammar. This process, performed at the level of individual tokens (“words”), is called lexical normalisation. This paper investigates various techniques for lexical normalisation of Twitter data and presents the findings as the techniques are applied to process raw data from Twitter.
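As a toy illustration of token-level normalisation via dictionary lookup (the lookup table below is purely hypothetical; the techniques investigated in the paper also involve candidate generation and context):

```shell
# Toy dictionary-based lexical normalisation: each rule rewrites one
# out-of-vocabulary token to its canonical form (lookup table is illustrative).
cat > lexicon.sed <<'EOF'
s/\bsrsly\b/seriously/g
s/\bu\b/you/g
s/\b2moro\b/tomorrow/g
EOF
echo "srsly u should come 2moro" | sed -f lexicon.sed
# prints: seriously you should come tomorrow
```

A fixed lexicon like this only handles tokens seen before; unseen variants require the candidate-generation and ranking techniques the paper surveys.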

Further reading: The complete paper is available here.