1.2. The Innovation/Chaos Paradox

Today's IT is big business for sure.

According to the Gartner Group's 2004 Worldwide IT Spending report, the money being funneled into this field will grow 5.5 percent a year through 2006, 2007, and 2008, reaching nearly $3 trillion. And Gartner Dataquest's Software Support Portfolio Review, which looked at current revenues for the worldwide software support services market, put that market alone at $52.3 billion.

Those numbers tend to separate IT from the business operations it supports. That's simply a convenient separation. Today's IT is the business. IT has become woven into the fabric of every business transaction. Remove cell phones from the picture today, and what adjustments would businesses have to make? Take away email. Take away word processors. What would become of the deal?

People in technology, the people who plan, design, develop, install, maintain, and monitor, are as key as any other component to the success of business enterprises. And at the macro level, the success story stands out. But the focus of the discussions in this book is not on the macro level but on the micro level. To begin moving toward that point, I'll take a step back.

Data processing has been around in American business since the late 1950s. The computers then weren't what we would recognize today as powerful, but by the early 1970s, these machines had become entrenched as core business tools, especially in transaction-intensive industries. Banks, insurance companies, utilities, and reservation systems all found computing to be an effective way to process and manage mountains of data. Data-processing centers back then were somewhat removed from the daily bustle of the business. They were set off in climate-controlled rooms, on raised floors, and access to these resources was strictly controlled. The people who did have their fingers on terminal keyboards, or who worked in the centers, were either highly trained engineers and mathematicians or staff working along rigidly codified lines. And the tool set was quite small: MVS, tape storage, COBOL and JCL, dumb terminals. These were the ubiquitous elements of just about any data center. And even though things did not always run as they should have, the centralization helped in the management of any problems that did crop up.

That was 30-plus years ago. Things began to change in the late 70s and early 80s, with the introduction of the personal computer and the runaway success of microtechnology. In fact, by 1990, data processing was gone; information processing had replaced it. Walk into any American company today, and PCs are everywhere, bounded by mini and midrange systems. More than that, software is everywhere. And nearly everyone is using all this technology, using it every day, automatically, without giving it so much as a thought. We now operate in the field of Information Technology Management. What are we managing?

We're managing servers and server farms, networks, routers, gateways, databases, spreadsheets, security profiles, email, word processors. And the development tool sets now available have grown so diverse as to be almost impossible to keep up with. What was in vogue seven years ago is passé today. The pace of change in IT is relentless. Innovation has become a fixed characteristic of this discipline, and that has pushed it further and further from the bounds of discipline.

Back in the 70s, maybe a handful of people in an organization were devoted to data management. Today, everyone has a voice in it. The tools have become so flexible that applications can be cranked out in incredibly short spans of time. The process-centric approach that was required to manage a data shop in 1975 has been displaced through innovation. I am not a programmer, but using a tool like Microsoft Access, I can build a database system that tracks project risks, and I can have a working prototype of it ready to look at before lunchtime. That innovation has shaped a technology field that many describe as weakly linked, reactive rather than proactive, and poorly focused. DataQuest reports that 70 percent of today's IT workers are between 22 and 37 years old. The average length of experience is 11 years. All those people, talented as they probably are, were raised in environments of click, fast, flash, and cash.
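
To make that point concrete, here is a minimal sketch of the same kind of throwaway prototype, written in Python with the standard library's sqlite3 module rather than Access. The table layout, column names, and helper functions are all hypothetical, invented only to show how quickly something workable comes together.

    import sqlite3

    # Throwaway risk-tracking prototype: one table, two helpers.
    # Schema and names are illustrative, not from any real system.
    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE risks (
            id          INTEGER PRIMARY KEY,
            description TEXT NOT NULL,
            probability REAL CHECK (probability BETWEEN 0 AND 1),
            impact      INTEGER CHECK (impact BETWEEN 1 AND 5),
            status      TEXT DEFAULT 'open'
        )
    """)

    def add_risk(description, probability, impact):
        """Record a new risk and return its row id."""
        cur = conn.execute(
            "INSERT INTO risks (description, probability, impact) VALUES (?, ?, ?)",
            (description, probability, impact),
        )
        conn.commit()
        return cur.lastrowid

    def top_risks(limit=5):
        """Open risks ranked by exposure (probability x impact)."""
        return conn.execute(
            "SELECT description, probability * impact AS exposure "
            "FROM risks WHERE status = 'open' "
            "ORDER BY exposure DESC LIMIT ?",
            (limit,),
        ).fetchall()

    add_risk("Vendor API may change before release", 0.4, 4)
    add_risk("Key developer on leave during testing", 0.7, 3)
    print(top_risks())

A few dozen lines, and it already stores, ranks, and reports risks. That ease is exactly the innovation, and exactly the source of the chaos, that this section describes.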

So that's how it is. What's the answer? Do we return to glass-walled rooms? I don't know anyone who would advocate that. But the paradox lies not with the click or the fast or the flash. It lies with the cash. We've already mentioned the size of the IT industry: trillions of dollars as a single enterprise. And most of that money is public money, channeled into companies through stockholders, through investors. Most of them know nothing about how the dollars allocated to IT are being spent. On top of that, I have seen that even top management has little clue as to how well those dollars are being spent. The core issue is that the dollars are in all likelihood being spent less than efficiently, with little control, with little insight into best practice, with little knowledge of crossover or hidden failures.

Philip Crosby, one of the leading process improvement pioneers, says in his famous book Quality Is Free (McGraw-Hill) that it's cheaper to do things right the first time, that it always costs more to have to go back and fix something. And process improvement is all about getting it right the first time.
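
Crosby's claim is, at bottom, simple arithmetic. The sketch below shows its shape in Python; the stage names and cost multipliers are hypothetical placeholders chosen for illustration, not figures from Quality Is Free.

    # Hypothetical cost multipliers for fixing the same defect at
    # successive stages. The steep escalation, not the exact numbers,
    # is the point Crosby makes.
    COST_TO_FIX = {
        "requirements": 1,    # caught where it was introduced
        "design":       5,
        "coding":       10,
        "testing":      50,
        "production":   150,
    }

    defects = 20  # defects introduced during requirements

    fixed_at_source = defects * COST_TO_FIX["requirements"]
    fixed_in_production = defects * COST_TO_FIX["production"]

    print(f"Fix at the source: {fixed_at_source} cost units")
    print(f"Fix after release: {fixed_in_production} cost units")
    print(f"Rework penalty:    {fixed_in_production / fixed_at_source:.0f}x")

Under these made-up multipliers, letting the same 20 defects escape to production costs 150 times what catching them at the source would have. Any real program's numbers will differ, but the direction of the inequality is Crosby's whole argument.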

The theme I want to put forward in this book is that process, the process that was born with this industry, can make a difference in today's technology organizations. The thoughtful implementation of a considered process improvement program can help a technology shop operate more effectively, with increased focus, with reduced waste, and with a better bead on its mission and the overall goals of the company.

More and more IT managers are adopting this mode of thought; I am not a lone voice crying in the wilderness. Today there is renewed interest in management through process. In the last decade, process programs have become more and more prevalent, and out of all the available options, three have moved to the top of the chain. These three are as follows:

  • The ISO 9001:2000 Quality Management Standard from the International Organization for Standardization

  • The Capability Maturity Model Integration (CMMI) from the Software Engineering Institute

  • Six Sigma, a methodology for improvement shaped by companies such as Motorola, Honeywell, and General Electric

These programs are not esoteric philosophies; they are not whiteboard theories. They offer practical, tangible guidelines. And they aren't fads either. Over time, each has proven itself in measurable, quantitative ways. They have been shown to visibly open up the production process and to heighten management effectiveness, not through increased control but through commonly designed and measured controls. And, ultimately, they have been shown to increase the quality of the products you produce, not only external products (those you ship) but internal products as well.



