Iterative development has long been a widely accepted method of product development, and even web development. You work with a team to determine the minimum features that must be included to ship or launch a product, then, as you receive feedback from users, you make adjustments and refinements. This process gets you to market the fastest and lets you adjust as you gather data and feedback, while still providing value to the end user of the product.
However, when it comes to working with companies on implementing an analytics solution on their website or application, there seems to be an overwhelming resistance to applying this iterative process. Instead of getting a base or standard implementation in place and collecting data as soon as possible, some are content to wait days, weeks, even months to create a “perfect” implementation. Yet the sooner you begin data collection, the faster you can adjust the types of data you are collecting and the sooner you can discover potential flaws in what may have seemed like a rock-solid plan.
When you are planning an implementation, you are not drawing blueprints for a skyscraper that must stand unchanged for decades. An analytics implementation is by its very nature an iterative and continual process, unless of course you never plan on adding any new content or features to your website. As you continually improve your site and its features, your implementation must evolve as well in order to provide you with the data you need.
The digital world is ruled by deadlines. Retail sites spend months preparing for Cyber Monday, media sites are in a constant state of flux with the ever-changing news cycle or latest sporting event, and consumer product and services sites are locked in a never-ending battle for market share. Each of these sites would benefit greatly from having a baseline of data collection as fast as possible, instead of collecting nothing at all until the perfect state is reached. No one can predict with any great certainty when the next surge of traffic will occur, so it becomes vital for all sites to collect what they can, when they can, and then adjust and optimize the implementation.
Deploy, monitor, analyze, adjust. Repeat the process… forever. Optimization is a revered and accepted practice in analytics when it comes to content; let's start the practice of iterative implementations so that we can optimize those as well.
3 thoughts on “Iterative Analytics Implementations”
I live by this mantra. At EE we got the minimum requirements and some wow pieces, cut the rest, then tweaked and improved every week. The joys of TMS meant we applied over 160 tweaks in 3 months with just two people; that rate of change is highly unusual in a multi-billion retail-driven organisation that is used to monthly release cycles. The joys of giving people end-to-end responsibility.
Thanks for sharing, Rudi.
This blog post is:
– a response to a “why” and “how” client question.
– a reference to remember how it works and the digital analytics implementation philosophy.