Software's Difficult Past

In the 1970s, IT departments running large mainframes controlled most corporate software development projects. The mainframe was the infrastructure of the enterprise computing environment. COBOL was the language of choice, and any department with an adequate budget willing to wait out the average IT programming backlog of eighteen months could have the application it wanted developed or modified. Software was difficult to develop if for no other reason than that development was so tightly controlled by a small group of people with the necessary skills and access to expensive computers. In reality, much of the perceived unresponsiveness of centralized IT organizations was due not to any lack of software development skills or organizational structure, but simply to the software architectures imposed by COBOL and mainframes.

Mainframe-based enterprise software applications, such as payroll processing, were typically large monolithic programs in which even simple changes were difficult to implement. The complicated structure of such programs usually limited the number of people who could modify them to their original developers; it was cost-prohibitive for a new developer to learn enough about a large mainframe program to modify it. This is painfully obvious today as many organizations return to 1970s-era code and try to make it year 2000 compliant. For this reason, development managers would instead simply wait for the original developers to finish their current tasks and then assign them to go back and modify their earlier work. COBOL technology was well understood by the developers who programmed in it. Even in the rather simple model of centralized mainframe development organizations, however, people and process issues already carried as much weight as technology issues in determining the success of software development.

In the 1980s, inexpensive PCs and the popularity of simpler programming languages such as BASIC led to the start of IT decentralization. Even a small department with no formal IT staff could purchase a PC, figure out the details of DOS configuration files, and have a department member with a technical background learn BASIC. There was no longer always a requirement to wait eighteen months or more for a centralized IT organization to develop your software program. All of a sudden, large companies had dozens or perhaps even hundreds of "unofficial" IT departments springing up, with no backlog to complete, who could immediately start developing stand-alone applications. The only infrastructure they needed was a PC and plenty of floppy disks for backing up their programs. Software seemed easy for a moment, at least until a program grew larger than 64K or needed more than a single floppy drive's worth of storage. Even the year 2000 was only a far-off concern that crossed few developers' minds. Most PC applications couldn't access mainframe data, but most developers were too busy installing the latest OS upgrade to worry. Software development was still difficult; we were just too busy learning about PCs to notice.

One result of the 1980s PC boom was the creation of "islands of automation." While the software program on a stand-alone PC might have been very useful to its user, such programs often led to duplicated work and lower productivity for the organization as a whole. Probably the biggest productivity loss suffered by organizations was duplicate data entry: a stand-alone system could not communicate with a centralized system, yet both systems required the same data. Many organizations still suffer from this "multiple data entry" problem today, and it remains a challenge for software developers, who must reconcile different formats and different input errors when trying to collect and merge data. This reconciliation process, referred to as "data cleansing," is a well-known problem to anyone trying to build a large data warehouse from multiple sources, and data warehousing is one of the hottest new fields of software. Electronically connecting islands of automation, rather than resolving the problem, simply increases the volume of data that must be combined from various systems. As with many software development-related problems, the answer lies not in simply interconnecting diverse systems, but in doing so with a common software architecture that prevents such problems in the first place.
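
To make the data cleansing problem concrete, the following is a minimal sketch in Python. It is a hypothetical illustration, not drawn from any of the systems described above: it assumes a made-up mainframe extract that stores zero-padded customer IDs, "LAST, FIRST" names, and YYYYMMDD dates, and a made-up PC application that uses free-form names and MM/DD/YYYY dates. Before the two sources can be merged, each must be normalized to a common format.

    from datetime import datetime

    # Hypothetical records from two disconnected systems: a mainframe
    # extract and a stand-alone PC application. Field names, layouts,
    # and values are illustrative only.
    mainframe_rows = [
        {"CUST_ID": "00042", "NAME": "SMITH, JOHN", "LAST_ORDER": "19980301"},
        {"CUST_ID": "00107", "NAME": "DOE, JANE", "LAST_ORDER": "19971115"},
    ]
    pc_rows = [
        {"cust_id": "42", "name": "john smith", "last_order": "03/01/1998"},
        {"cust_id": "88", "name": "Bob Jones", "last_order": "12/24/1997"},
    ]

    def normalize_mainframe(row):
        # "SMITH, JOHN" -> "John Smith"; strip zero padding; parse the date.
        last, first = [part.strip() for part in row["NAME"].split(",")]
        return {
            "cust_id": int(row["CUST_ID"]),
            "name": f"{first.title()} {last.title()}",
            "last_order": datetime.strptime(row["LAST_ORDER"], "%Y%m%d").date(),
        }

    def normalize_pc(row):
        # Unify capitalization and the date format.
        return {
            "cust_id": int(row["cust_id"]),
            "name": row["name"].title(),
            "last_order": datetime.strptime(row["last_order"], "%m/%d/%Y").date(),
        }

    # Merge to one record per customer. Customer 42 was entered into both
    # systems; after normalization the duplicate is detected by its key,
    # and the mainframe version is kept.
    merged = {}
    for record in map(normalize_mainframe, mainframe_rows):
        merged[record["cust_id"]] = record
    for record in map(normalize_pc, pc_rows):
        merged.setdefault(record["cust_id"], record)

    for cust_id in sorted(merged):
        print(merged[cust_id])

Even in this toy example, every data source needs its own normalization rules; multiply that by dozens of islands of automation, and the cost of interconnecting systems without a common software architecture becomes clear.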

In the 1990s, corporations started to worry about centralized software development again. Microsoft Windows replaced DOS and brought a graphical user interface to stand-alone applications, along with a whole new level of programming complexity. Business managers realized that stand-alone PC applications might meet the needs of one department but did little to solve enterprise-wide business and information flow problems. At the same time, Unix finally matured to the point that it brought mainframe-level reliability to client-server systems. This helped connect some of those PC islands of automation, but at a cost: MIS directors often found themselves supporting three separate development staffs, one each for mainframes, Unix, and PCs.

In the second half of the 1990s, our kids suddenly started teaching us about the World Wide Web and the Internet. Almost overnight, network infrastructure went from connecting to the laser printer down the hall to downloading multi-megabyte files from a web server halfway across the world. With a few clicks, anyone who can figure out how to use a mouse can get stock quotes and Java-enabled stock graphs in a web browser; a few more clicks to register on the site, and you can be completing e-commerce transactions to buy or sell that same stock. The explosion of the Internet and its inherent ease of use instantly set the same expectations for access to corporate data, upwards of 80 percent of which is still stored on mainframes. Fewer computer users than ever before understand or even care about software development and its accompanying infrastructure. Software development, however, continues to be very difficult, and mostly for the same reasons.


