Slow is fast and fast is slow

In the summer of 2018, we ran a pilot to vet three candidate software systems for managing our key data. The small sample gave us a wealth of knowledge about each system's capabilities, limitations, and user experience, and showed how each would meet our requirements. It also afforded us an opportunity to see where our assumptions about quality, cost, and time were right and where they were wrong. This pilot run of three systems gave us far more than it cost and teed up the selected course of action for success. The result was a software suite that deployed on time, under budget, and delivered on all requirements.

So, with all of these benefits, why are pilots so often skipped? I offer two main reasons:

  1. Action bias – even though active work happens during a pilot, work that is not directly aimed at a deliverable is deemed inaction. By this logic, any action that is not end-state focused is less valuable.
  2. Cost – With tight budgets and lean organizations the norm, efforts spent vetting solutions that do not work out are a difficult sell. In the aforementioned software project, two-thirds of the vetted solutions were “wasted” in that they led to no deliverable, despite both cash and human resource expenditures. This is compounded by the time spent vetting, which delays the delivery of the software – in this case, a planned three-month (actual five-month) period that could have been spent working one of the solutions toward delivery.

I would challenge these two lines of thinking with the myriad benefits that were attained through piloting.

  • Knew what would not work – the three selected systems all promised to meet our needs. However, through testing, the means by which each would achieve our requirements – or where it could not – were quickly discovered. This made the decision process much easier when it came time to select a system, because we knew what we were signing up for. In short, we determined what would not work for a fraction of the cost of a full implementation.
  • Pull from the best of all systems – Even though two systems were not selected, some of their functionality informed the requirements of the chosen system. By refining our requirements set, the final choice was better than it would have been on its own.
  • Better understanding of the likely duration and cost of each task – By tinkering in each system, we were able to home in on a more accurate projected duration and cost for each task. This made the sell to our leadership much easier, since contingencies were much smaller and budgets could be set earlier. Better yet, we were more confident in our delivery date, which allowed ancillary system work to be scheduled and completed at a manageable pace.
  • Aware stakeholders – Lastly, we encountered few surprised stakeholders throughout the process, as they entered the project with a good idea of the final deliverable's capabilities, cost, duration, and quality. We had a better understanding of where slow points would be, when personnel resources would be required, and where variables would likely manifest. Converting some of our “unknown unknowns” into “known unknowns” derisked the project and made it an easier decision to tackle the big change.

Ultimately, the knowledge gained during a relatively inexpensive pilot pays for itself. So, I implore you to set aside the action bias and invest in the success of the project through a pilot.