2.5 The Golden Age of Software

If technology is at the heart of the new business agenda, then computer software must be its soul. Computer software is a collection of machine instructions programmed to execute the actions of a preconceived design. That design, and the actions that flow from it, directly exploit specific capabilities made possible by the associated computer hardware. As the underlying capability of the hardware changes, the software must be modified, altered or reprogrammed to take advantage of the new capability.

This continuous improvement process is often a function of the technology's maturity cycle, in which new capabilities are incorporated into the design as it becomes apparent that the technology can be applied to ever more demanding tasks. This seemingly modern phenomenon is found in the evolution of many technologies, each presenting the same set of challenges in use and development. Consider the evolution of the castle as a weapon of warfare: as the technology for waging war on castles advanced, castle design changed to preserve the defenders' military advantage. A medieval castle was primarily a technology of war, a machine used to defend and protect its occupants. The life cycle of English castle-building displays the same process improvement cycle. As castle design evolved, it incorporated both new military knowledge and empirical masonic knowledge to improve the tactical advantage of the defenders. In the later Middle Ages, castles reached a high level of efficiency and technological superiority, offering tremendous advantages in every facet of their design. Inventions such as arrow-loops, murder holes and crenellations were continuously enhanced to keep pace with the ever-improving design of bow and arrow technology. Because these structures were made of stone and are still standing, they provide a record of the improvements over time: visitors to a fourteenth-century castle can readily spot design improvements when comparing it to a twelfth-century castle. Similarly, a programmer reviewing source code in a third-generation language such as COBOL (Common Business-Oriented Language) can read the intention of a subroutine and incorporate new functions into the design of a new program in a language like XML (Extensible Markup Language).

The development of the disruptive technology of the gunpowder cannon shifted the tactical advantage to the attacker, and warfare became increasingly mobile, reducing the importance of the castle as a military machine. Here again, a disruptive technology radically alters the use and method of a traditional technology. The same shift could be observed when computing suddenly moved from terminals to GUIs (graphical user interfaces), as in the Apple Macintosh and Microsoft Windows.

The technological advances that drove the Industrial Revolution of the nineteenth century gave us the ability to produce new technologies in enormous quantities. Computer software, like manufacturing technology, has for the most part been adapted to repetitious work. Software development, like manufacturing work, has evolved into one person designing while all the others tend the machines or maintain the code, with a slow and predictable erosion of quality.

Regardless of their scale, software development projects are intricate systems with dynamic interactions, connected both to a supporting infrastructure and to a process or initiation sequence, often represented by a human being. Traditional software development methodologies inherited their design and control mechanisms from legacy project management systems such as Gantt charts, PERT and CPM, all with roots in post-World War II military construction projects. These project management methodologies focus on control and the time division of labour. They typically break a project down into discrete tasks, match the tasks to resources and apply the result to a timeline.
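To make that decomposition concrete, here is a minimal sketch in Python of the critical path calculation at the heart of CPM: the project is broken into discrete tasks with durations and dependencies, and the longest dependency chain fixes the timeline. The task names and durations are invented for illustration.

    from functools import cache

    # Each task has a duration (in days) and the tasks it depends on.
    # These values are illustrative, not drawn from the text.
    tasks = {
        "design":  {"duration": 5,  "depends_on": []},
        "code":    {"duration": 10, "depends_on": ["design"]},
        "test":    {"duration": 4,  "depends_on": ["code"]},
        "docs":    {"duration": 3,  "depends_on": ["design"]},
        "release": {"duration": 1,  "depends_on": ["test", "docs"]},
    }

    @cache
    def earliest_finish(name: str) -> int:
        # A task can start only when all its dependencies have finished;
        # its earliest finish is that start time plus its own duration.
        task = tasks[name]
        start = max((earliest_finish(d) for d in task["depends_on"]), default=0)
        return start + task["duration"]

    # The project cannot finish before its longest (critical) chain of tasks.
    print(max(earliest_finish(t) for t in tasks))  # -> 20

The sketch also exposes the weakness criticized here: everything must be known up front, so the plan is only as good as the initial decomposition.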

There have been countless discussions of how technology enables business reengineering; yet the processes of implementing technology, creating software and integrating with legacy systems are still burdened with ideas and methods left over from the information engineering concepts of the 1970s and 80s. During the 1990s, almost every technology project was labelled a reengineering project. In fact, very few were a radical redefinition of a process; most were process improvements or simple modifications of old systems. Reengineering became the management trend du jour because it was easy for most organizations to get funding by merely taking last year's project, giving it a new name and calling it a 'reengineering project'. Over three-quarters of reengineering projects were process improvements at best, and numerous others simply repackaged old projects to obtain new funding. Many of the projects touted as reengineering were never completed, added very little value, or failed outright. In most cases this was attributed not to the individuals involved or to the technologies engaged to reshape the business, but to a lack of understanding of what it means to 'reengineer'. Michael Hammer and James Champy's definition of reengineering was clear and precise: it meant fundamental rethinking and radical redesign. Rethinking a process is simple; unlearning all the years of process legacy derived from countless business changes is hard. Using technology as an enabler of an existing business process, or of a newly reengineered one, is relatively straightforward. But can we say that it adds value? Many of the technology failures of the 1990s might have been avoided if a simple test had been applied to reengineering projects: is the proposed project a radical departure from what already exists, and have we re-examined the fundamental way the process works? The question seems simple, yet it reveals a complex set of issues. To understand them we must first examine our information technology roots.

Most commercial technology implementation methodologies employed by today's technology organizations are overflowing with work product-based deliverables, coupled with complex methods and processes for creating, documenting and displaying the activities of a project. A brief look at where these methodologies came from, and how they evolved, lays a foundation for understanding why, and how, these dated methods need to migrate to more dynamic processes.

The Art of Programming

In the 1960s, commercial information technology was just data processing, not much more than a fast adding machine with some extended printing capabilities, albeit one that needed an entire support organization to keep it going. In the 1970s, corporations realized that information systems played an integral part in the day-to-day business of running the company. Great demands rose from the organizations using the technologies; consequently, the data processing department (sometimes called the 'IBM department') was inundated with requests for systems. Within a very short time, data processing organizations began to realize that the process of creating software was complex, tricky and ultimately akin to an art form, given the variability in the quality of programming from one programmer to another. Complicating matters further, programming was almost a method of self-expression for data processing personnel. Thus began the programmer's quest for efficiency, whose credo was 'if it took me this long to figure out, it should be twice as hard for the next person to work out'. Armed with good intentions, programmers worked to document their thinking while writing computer programs. However, they often jettisoned this additional task when deadlines loomed and the resulting program was desperately needed by the business, which was under the impression that program documentation could be written after the deadline was met. In the majority of cases this rarely happened, and orphan programs devoid of remarked instructions, that is, the notes which programmers use to document their thinking, hampered the data processing department's performance over time. Most organizations tried to develop a consistent approach to software development, and standards came along.
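Those 'remarked instructions' are simply comments. As a minimal illustration, in modern Python rather than the COBOL of the era, here is a routine documented the way those early standards intended; the interest calculation itself is an invented example.

    def month_end_balance(balance_cents: int, annual_rate: float) -> int:
        # Interest accrues monthly, so divide the annual rate by 12.
        monthly_rate = annual_rate / 12
        # The ledger stores amounts in whole cents, so round the result.
        return round(balance_cents * (1 + monthly_rate))

Stripped of its comments, the same routine still runs, but it becomes the 'orphan program' the text describes: the next programmer must reconstruct the thinking behind it.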

As the demand on information resources grew, information user groups put increasing pressure on management to squeeze data processing organizations into delivering functionality to business units and other end-user groups quickly. Typically, when those groups finally received computer software solutions, they were significantly reduced versions of the original designs, falling short of the expectations set during the design process. Users became discontented with the time taken to develop systems, and the 'clash of the technology titans' was born. Data processing (DP) groups rose to the occasion by creating quality control mechanisms that produced tangible documentation to show corporate management that business users really did not understand how to specify what they wanted in a computer system. Armed with reams of technical specifications which only technologists understood, many data processing organizations used this as a shield with which to deflect the wrath of management and reset the expectations of the user community.

In the early 1980s, data processing organizations gained top management's attention (primarily due to the rising cost of capital equipment and personnel) and began to change their image from simple processors of data to management information systems (MIS) groups. It was about this time that technology groups began to be labelled 'professionals'. The newly labelled MIS organizations took on much the same appearance as the early federalists during the founding of the United States, emphasizing a strongly centralized approach to hardware, data, applications and personnel. This centralized buying was justified by economies of scale and proved to be an effective short-term cost saver. One side effect was that many IT organizations became locked into one particular technology manufacturer, regardless of whether the technology solved the business problem. Many users found themselves with a brand new terminal on their desk and a software application or two that automated only a small part of their job while offering many functions and features alien to their business, because it was 'cheaper to do it this way'.

In the late 1970s and early 1980s, in an effort to reduce costs, organizations developed 'package mania' and purchased pre-built applications to drop in as new solutions. After experiencing a cultural backlash from user organizations, all claiming to have unique business processes, the next step was to try to modify these packages to fit the business. Most vendors disliked the idea of an organization modifying their packages and discouraged purchasers by stating that warranties would be violated and the package would no longer be supported. Organizations which changed the packages they had purchased soon found that unravelling the logic of the vendor's programmers was a project in itself. Adding to these pressures, modifications to these packages became a maintenance nightmare, with entire organizations devoted to supporting one or two central packages.

Modifying software code and developing new programs became of paramount importance to organizations, and then along came computer-aided software engineering (CASE) tools and information engineering. CASE tools allowed programmers to develop systems using generators, compilers, transform engines (application programs that convert data, information or programming code from a raw form to a refined one) and table-based functioning shells. These tools permitted the development of software applications in half the time required by the labour-intensive task of writing COBOL code. Since then, however, CASE has had mixed reviews from critics and practitioners. Many CASE tools were ahead of the technology and the people that supported them. The early versions generated character-based screens and were well on the way to central repository functionality. Then Microsoft Windows came along, and few vendors were prepared to drop their current development plans. Several tried to make the switch, but the market was not ready for GUI-based tools, so vendors had to spend double the money maintaining the character-based version while diverting research and development money to GUI tools. To compound the issue, users had unreasonable expectations of the initial CASE tools. Data bigots wanted these tools to be data focused (if not data centred) while the world was actually process oriented. Vendors, misjudging the data bigots' power in the market, focused on data tools instead of process tools. The result was not sufficient to deliver the cost savings users thought they would get.
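A minimal sketch of the 'transform engine' idea, converting a raw form to a refined one, might look like this in Python; the record layout is a hypothetical example, not one taken from any CASE product.

    # Raw records: surname and initial separated by a comma, year by a
    # semicolon, with inconsistent spacing and casing.
    raw_records = ["SMITH ,J;1985", "JONES,A ;1990"]

    def refine(record: str) -> dict:
        # Split the delimited raw form into named, cleaned fields.
        name_part, year = record.split(";")
        surname, initial = (field.strip() for field in name_part.split(","))
        return {"surname": surname.title(), "initial": initial, "year": int(year)}

    print([refine(r) for r in raw_records])
    # -> [{'surname': 'Smith', 'initial': 'J', 'year': 1985},
    #     {'surname': 'Jones', 'initial': 'A', 'year': 1990}]

Real CASE transform engines operated on far richer inputs, such as data models and programming code, but the principle of raw form in, refined form out is the same.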

Disillusionment set in, and many vendors paid a steep price, falling by the wayside and ending up buried in the CASE software graveyard. A few rose to the challenge, and Upper CASE tools were the result. The use of CASE tools is still very much alive, and vendors are rapidly evolving towards object-based repositories. That, however, would be the subject of another book.

Another phenomenon whose potential many MIS organizations failed to recognize, and thus did little to harness, was the introduction of the PC. PCs took the user community by storm. They gave users a certain freedom from the centralized, slow-moving machinery of the IT infrastructure, permitting the execution of functions no longer bound to the central computer through applications such as spreadsheets and word processing. Users could purchase a software package performing basic office functions for several hundred US dollars and, since it resided on their own computers, could bypass the slow-moving IT organization. The IT organization tried to get business units to standardize on one package or one vendor, and every IT department felt the need to support these packages internally, duplicating the support effort of the vendor. Many IT departments took on this responsibility and increased their workload, in many cases sidetracking their primary mission.

From Art to Science

In the mid to late 1980s, concepts such as information engineering (IE), joint application development (JAD) and rapid application development (RAD) sprouted from the ashes of major internal corporate battles as a new set of programming disciplines. In many cases, these techniques were presented as a way of involving the users, when in reality they were a means to perpetuate the centralized IT function. IE and its attendant books were, in most cases, fancy window-dressing on a dead group of principles which no one wanted to bury. After all, technology gurus such as James Martin (see Martin, 1986) charged people thousands of US dollars just to listen to them speak about it.

In the late 1980s and early 1990s, individuals began to realize the power of the PC. PC users started developing their own spreadsheets, notes files and databases to catalogue and retrieve the data used in everyday processing, and in the process the islands of information were born. During this time, many IT organizations struggled for centralized control of resources, while the corporate user community opposed this trend by insisting on decentralization. Many IT executives were embroiled in a delicate balancing act between these two forces, and thus failed to understand the users' primary motivation. Even today there is still a quest in IT organizations for 'writing new systems' rather than creatively applying packages and minimalist programming. Most IT organizations had (and still have) a backlog of system requests; as a consequence, users flocked to spreadsheets and other PC programs in order to get a sense of progress. Business computer users often create their own information system using spreadsheets just to feel some sense of control over their own data destinies. IT groups have been distributing architectures that allow users to implement their own solutions in a plug-and-play network, one that should in turn allow users to connect into a composite navigational structure in which the bulk of the computing occurs on the desktop, with the network acting as a data conduit supplying these applications with requested information.

During the latter part of the 1980s, users were told that they were empowered; in reality, they were being patronized. Current applications were redefined as 'assets' by the organization in order to attach some value to a dying set of technologies and principles. In fact, during this period centralized groups went through the process of decentralization, and decentralized groups became centralized, in an effort to retain control. Organizations mistakenly thought that the IT organization and its products could be part of the 'solution'; in reality, they were a major roadblock to progress. Without the huge problems they created, however, radical change might never have begun.

Organizations of the 1990s moved to 'empowered teams', cross-functional groups focused on solving business problems. IT organizations used this opportunity to establish close links with user communities, providing them with the tools to define specifications. Teams developed aggressive technology solutions, only to face mounting obstacles in the bureaucracy of transition organizations. IT organizations started changing in the closing years of the twentieth century and now aspire to a business process-focused set of activities. These organizations are rapidly learning that business processes are as important as 'data' in the corporate asset treasure chest. Companies combining business processes with the data supporting them will soon realize that an object-oriented approach to systems design can help them satisfy organizational requests for systems. This will be a major shift for them, both culturally and technologically.
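That object-oriented shift can be sketched in a few lines: the data supporting a process and the business process itself live together in one object, rather than in a program that manipulates records held elsewhere. The Invoice class and its approval rule below are illustrative assumptions, not drawn from the text.

    from dataclasses import dataclass

    @dataclass
    class Invoice:
        number: str
        amount: float
        status: str = "draft"

        def approve(self, limit: float) -> None:
            # The business rule travels with the data it governs.
            if self.amount > limit:
                raise ValueError("amount exceeds approval limit")
            self.status = "approved"

    inv = Invoice(number="INV-001", amount=950.0)
    inv.approve(limit=1000.0)
    print(inv.status)  # -> approved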

The effects of downsizing the organization led to a piecemeal approach to systems integration and a rush to client/server-based systems which promised cost savings. A great number of technology organizations quickly moved to implement client/server technology, only to find that integration between the new technology and the legacy systems was increasingly difficult and consumed more resources than estimated. Unfortunately, a number of chief information officers (CIOs) and IT managers failed to produce short-term benefits during the migration towards interconnective technologies, which could only yield long-term benefits. A number of CIOs learned a new definition of CIO ('career is over') as they became some of the first casualties of the information age. Individuals in technology organizations discovered that when management measures technology projects on very short-term delivery of value, the term 'pioneer' is also redefined, as 'people leading the charge of business change with a large number of arrows in their backs'. Many technology leaders realized that their technical infrastructure and organizations were not up to the challenge, being ill-prepared for the tremendous learning curve required to develop business solutions in the new environment. The situation was exacerbated when senior technical managers identified the users as the problem, blaming their inability to develop specifications that could be delivered within budgetary limits.

Blame for the early failures of client/server experiments can be placed at the feet of technology organizations which behaved like slow-moving dinosaurs. Fortunately, this organizational latency, comparable to the decline and demise of the dinosaurs and their eventual transformation into crude oil, created an environment of frustration which propelled management to initiate change and provided a seedbed for business process 'reengineering'. Had things run smoothly within technology organizations, and had business units not been frustrated by the efforts of the technology group, there would not have been such a burning desire to 'reengineer'.

In many organizations, the failure to teach people the fundamentals of new systems design is a major flaw in organizational planning and career development. Many technology organizations find large gaps in the fundamental knowledge of how to use new technology to transform business, not simply imitate it. A classic example is the comparison between the information contained in an old text-based computer screen and its contemporary counterpart crafted in the latest Windows display technology. In the majority of cases, the only substantial difference is the greater amount of information that now fits on the screen. What is needed is a fundamental rethinking of the information required, not simply more information. Here again we appear to be using technology to 'pour old wine into new bottles', as Hammer and Champy noted:

The fundamental error that most companies commit when they look at technology is to view it through the lens of their existing processes. They ask, ‘How can we use these new technological capabilities to enhance or streamline or improve what we are already doing?’ Instead, they should be asking, ‘How can we use technology to allow us to do things that we are not already doing?’[87]

IT groups should focus on integrating these technologies to provide information services and to create an Internet-like surfing capability with which corporate users can obtain business process-based information. These technology organizations need to make a fundamental shift from 'pushing' information at users to allowing each user to 'pull' only what he or she needs to support the business. Provoking this shift means altering the mindset within bureaucracies, and the behaviour of the workforce associated with delivering technological solutions, as we shall see below.
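To make the push/pull distinction concrete, here is a minimal sketch in Python, with hypothetical report names and figures.

    # Push: the IT group decides what every user receives.
    # Pull: the user asks only for the slice that supports the task at hand.
    reports = {
        "sales_by_region": {"emea": 120, "apac": 95, "amer": 210},
        "headcount":       {"emea": 40,  "apac": 25, "amer": 60},
    }

    def push_everything() -> dict:
        # Every report, whether the user needs it or not.
        return dict(reports)

    def pull(report: str, region: str) -> int:
        # Only the figure the user asked for.
        return reports[report][region]

    print(pull("sales_by_region", "apac"))  # -> 95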

[87] M. Hammer and J. Champy, Reengineering the Corporation: A Manifesto for Business Revolution (London: Nicholas Brealey, 2001), p. 89.



