Our admiration for the assembly line is so deep that we are suckers for the promise of “proven systems” regardless of their feasibility. We so treasure predictability and control that the promise seduces no matter how many times it is broken.
I started my career at what ultimately morphed into Accenture. That tenure coincided with the development of one of the early systems development methodologies—Method/1. The methodology was an attempt to make the process of understanding, designing, and implementing a technology solution to an organization’s business problem something that was manageable and repeatable. Method/1 made Accenture a lot of money and can be seen as a distant ancestor of a host of systems for building systems: Agile, Scrum, the Rational Unified Process, waterfall, Extreme Programming—the ingenuity of marketers in packaging and labeling ideas is endless.
All are variations on a theme. They embody a desire for control in a turbulent world. If Ford could manufacture identical cars and McDonald’s could guarantee the consistency of fries and shakes from coast to coast, then we ought to be able to turn out information systems with similar confidence in quality and predictability.
There seems to be an uptick in promises of proven systems in multiple settings, not simply in the arena of design and development. Given their appeal, it’s critical to recognize the limits of these promises.
The first limit is the dangerously thin data from which these proven systems are built. One of the first ideas pounded into your head when you are compelled to study statistics and research methods is that “data” is not the plural form of “anecdote.” Yet most proven systems are based on a handful of prior examples that happened to work.
Look at the history of Method/1. The late accounting firm Arthur Andersen & Co.—which birthed Accenture—automated the payroll department of a GE manufacturing plant in the 1960s. That first project failed, and Andersen rebuilt the system at a loss to make good. Andersen’s sales pitch to their second client boiled down to “we’ve learned what mistakes to avoid while our competitors have yet to make them.” Several projects later, Andersen consultants documented what they had done and packaged their ad hoc approach into a standard project plan for internal use called the “client binders.” As an accounting firm, Andersen was accustomed to protecting client identities and to thinking of audit processes as something common and repeatable for all clients. Consequently, the client binders were devoid of client specifics and mirrored the look and feel of standardized, repeatable audit processes.
In spite of their limitations and the thin data they were built on, the client binders gave Andersen’s consultants a better starting point for new projects and helped Andersen extend and consolidate their lead in the market. They worked best in the hands of experienced consultants who could use these materials to organize and support productive conversations with clients and prospects about how to structure and manage new efforts.
Then the methodology zealots took over. The experts took the crib notes that were valuable to experts and rewrote them into recipes for reasonably smart people with limited experience to follow. Repeatable, industrial processes promise good economics to those who invent them and acceptable quality to those who seek the outputs.
The fundamental problem is that today’s knowledge work doesn’t consist of repeatable, industrial processes, no matter how much we wish it did or how often we claim it does. I’ve written before about the problems this strategy presents (Repeatable Processes and Magic Boxes).
Where does that leave us?
Be suspicious of claims about proven systems. Look for the demands for creative leaps and flashes of insight hiding within seemingly innocuous steps. Look for potentially endless cycles of analysis with no stopping rule. Find the magic box. Be wary of maps that don’t show where the roads end and the dragons hide.