REALISATION OF RESPONSIBILITY IN INFORMATION SYSTEMS


Having seen that reflective responsibility allows the identification of the three basic dimensions of responsibility, the next sections will deal with the specifics of the relationship between reflective responsibility and the business use of information technology. We will now take a look at the realisation of responsibility in IS. For analytical purposes it makes sense to distinguish between different viewpoints here. We will therefore differentiate between reflective responsibility because of IS, for IS, and through IS. The idea behind this distinction is to demonstrate the difference between prospective and retrospective responsibility, and to show how the use of IT can affect the basis of responsibility ascription by influencing communication.

Reflective Responsibility Because of Information Systems

Of the different possible relationships between responsibility and IS, the one that comes to mind first is responsibility because of IS. This stands for all those situations that are caused by the use of IS in which responsibility could or should be ascribed. The emphasis here is on ex post ascriptions for objects that have already happened. Computers and information technology, and especially their business use, can affect rules and behaviour in many different ways. A prominent example is the one already discussed in some detail earlier on, the question of privacy and surveillance. Here, the use of IT is recognised as a moral problem that will lead to the ascription of responsibility ex post. The corporation, the CIO, the technician: they all can be held responsible for their actions and their use of IT. Possible consequences of this can include legal sanctions, moral sanctions, but also economic or social rewards.

Similar problems of responsibility because of IS can be found in other areas of what is usually referred to as computer ethics. One of the problems most frequently discussed is that of power. We all know Bacon's dictum that knowledge is power. This in itself can mean different things. It can mean the actualisation of potentialities of the human mind as well as control over information commodities. It can be seen as an intrinsic good or a means to something (Stichler, 1998, p. 175). The reason why power is often cited as the number one ethical problem in computers and information is usually that it is quite obvious that these systems can serve as means of power.

The giving, orchestrating, and taking of information is a basic use of power. Power is capability, the ability to get things done that one wants done. (Mason et al., 1995, p. 40)

information systems are one of the crucial media on which organizational power rests: power is based on organizational positions that provide access to the IS, and on the special skills for using and interpreting IS outputs. (Lyytinen & Hirschheim, 1988, p. 23)

Laudon and Laudon (1999, p. 453) point out that information systems change the distribution of power, money, rights, and obligations.

While it is clear to the observer that information systems change tasks, jobs, and obligations, and consequently also power relationships, it is less clear that this must be a bad thing. In fact some authors point out that it is not power itself that is the ethical problem, but its misuse (Langford, 1999b, p. 9). Furthermore, power in a formal sense not only does not have to be negative, it even has the status of a basic value. It is a condition for the actualisation of different life plans (Höffe, 1995, p. 146). Power should therefore be understood to be morally ambivalent (Gethmann & Gethmann-Siefert, 1996, p. X). It can be used for the good or the bad.

In a business setting it is important for managers to recognise this ambivalent moral status of power produced by information systems. Managers should also be aware that the use of information technology affects power distributions on all different levels of society. It has a lot to do with the business environment that managers find today. Globalisation and multi-national corporations that dominate international trade would not be possible in their current form without the capabilities of transmitting information (cf. Castells, 2000, p. 136f). Furthermore, the use of IT changes the political landscape in other ways. There is a drive towards international harmonisation of the global social, economic, and political world (Johnson, 2000, p. 22). Also, IT is gaining a more dominant role in the democratic process of power distribution. [20] Managers need to be aware of these developments because they play a part in them. The increasing use of IT in business sets the tone and prepares employees as users for the further spread of technology. A digital democracy, if we ever achieve it, will in large part build on competences that the citizens pick up at work.

Of more immediate importance to managers is how power distributions are changed on the meso-level of the firm. IT plays a role in and facilitates many changes in the way business is done. Looking at the management fads of the last few decades, one finds that most of them are based on an increased use of information and knowledge provided by technology. Many of these management fashions, if realised, result in severe organisational changes. Examples would be management theories such as just-in-time production and transport, or, more recently, business process reengineering (cf. Currie, 2000). Both lead to a restructuring of work processes and also of decision processes. They change how much power some members of an organisation hold over others. In IS literature, one therefore frequently finds the assumption that the introduction of IS is a destructive and creative process à la Schumpeter. That means that by introducing new technologies, one destroys old structures and processes, and replaces them with newer and presumably better ones. Especially business process reengineering (BPR) is often described as the reinvention of the business from scratch, and it is usually facilitated by the use of IT.

However, there is also the opposite view of the use of IT in organisations. While the introduction of new technologies may change the way things are done, it also has the capacity of preserving old structures that would be doomed without it. It can be argued that the old hierarchical production structure that we inherited from Ford and Taylor would be obsolete nowadays because it would not be able to process the required information if it were not for information technology. It can be argued that computer and information technology has entrenched the pre-existing employer-employee relationship (Johnson, 2001, p. 203). What is true for the company can also be extended to the state. A modern welfare state requires and processes a huge amount of information, and it would probably cease to function without technology. This is why Weizenbaum (1976, p. 250) could say that the computer is an instrument pressed into the service of rationalizing, supporting, and sustaining the most conservative, indeed, reactionary, ideological components of the current Zeitgeist, a comment that still rings true today.

Finally, the use of information technology and the developments just described also affect the use and distribution of power on the micro level. The individual may become empowered by IT or she may also be alienated by it. New processes can require greater autonomy and thus give employees the chance to make relevant decisions without supervision. On the other hand employees can be degraded to mere means of feeding data into computers.

There are certainly more aspects of the relationship between the business use of computers and power. However, for our purposes it suffices that several points have become clear. Firstly, the use of computers and the subsequent change in the distribution of power can be morally relevant, but it is morally ambiguous. Therefore, secondly, power issues related to computer use in business can be a reason for the ascription of responsibility. Management should at least be aware of that possibility. In fact it is plain to see that questions of power play a role in most other normative problems of IS. Thirdly, there are some points which are specifically relevant to reflective responsibility.

Questions of power are a good example of the close relationship of facts and norms. If a manager wants to act responsibly in the face of power changes, then one of the challenges is the clarification of the issues. Questions such as who used to have what power, what changed, why it changed, and who or what caused that change must be asked. The clarification of the problem will require the cooperation of the affected parties, who may not wish to cooperate. An added difficulty is that these power changes will in many cases be implicit and informal. Therefore there is also the challenge of finding means or institutions which will clarify the situation and support accountability. At the same time this sort of unclear situation will also require some space for prudent decisions and therefore some openness.

There is one last item with regard to power that managers must take into account when they want to exercise reflective responsibility. This is the basic problem that power endangers the legitimacy of discourses. Our argument so far was that reflective responsibility requires the participation of everybody involved, and that in this sense it can be seen as something like the realisation of a Habermasian discourse. There is a great number of philosophical problems with the realisation of discourses, because real discourses will always fall short of some of the ideal conditions that lend the idea of discourse its legitimacy. So far we have argued that reflective responsibility can accept sub-optimal solutions such as a necessarily imperfect real discourse because it aims at the improvement of circumstances and not at ethically perfect solutions. This argument will remain the basis for discursive responsibility, but one danger is that real-life power differences are so great or so detrimental to real discourses that they destroy their legitimacy. This is an area where management should become active. Usually management will hold an important position of power in responsibility discourses, and they should use this position not to press their own positions, but to facilitate a maximum of equality within discourses. This is of course easier said than done, and it will often be counter-intuitive because managers acquire their power in the first place to promote their causes. However, if they are serious about acting responsibly, then this use of power for the decrease of power differences will facilitate a successful approach. And there is one last issue in this context. In order to facilitate a discourse, to make it an instrument of legitimate ascriptions, and thus to move it close to the conditions of an ideal discourse, managers will need a capacity of judgment and they will need to use it prudently.
This prudence will then be used to provide structures of accountability which in turn will often be institutions that allow ascription. One can thus see that the problem of power raised by IS touches on all of the characteristics of reflective responsibility as developed above.

Next to privacy and power as an object of responsibility because of IS, another problematic area is that of intellectual property. The western market economy is based on the assumption of property. The idea usually is that individuals as well as collectives can own property and that they can do with it as they like. They especially have the right to exchange it against other property: they can buy and sell it. On the other hand, much of the criticism of this system, such as Marxism, is also based on property and its negative effects. There is a great amount of literature on property, its justification, and its role in society. Most of this, however, has been written with the example of physical goods in mind. In the computer and information age, the emphasis is shifting toward intellectual property. Unfortunately the entire field of intellectual property is ill defined and hard to navigate. There are definitions of intellectual property such as that of the World Intellectual Property Organisation (WIPO), which sees it as the right to, among other things, the results of intellectual activity in the industrial, scientific, literary, or artistic field (cf. Forester & Morrison, 1994, p. 57). On the other hand, it has to be objected that the term does not have a clear definition. Not only that, it can be said to be very misleading because it suggests that one can own ideas or ways of thinking about things, which in fact is not protected by any current approach to intellectual property (Snapper, 1995, p. 181). The use of the term also suggests that there is a coherent set of laws and rules governing the topic, when in fact there is a complex mix of legal and moral traditions, which are not only not coherent but sometimes even contradictory.

It is clear, however, that the area of intellectual property is steadily gaining in importance; this is mostly due to technological advances that are based on knowledge, information, and ideas rather than on physical goods. These technological advances can frequently be translated into services, goods, or business models which in turn generate huge profits. The prototypical example for this is software. Computer software can be sold as a good, and in fact some of the world's largest companies, such as Microsoft, Oracle, and SAP, do just that. Microsoft's CEO has become the richest person in the world just by doing so.

While intellectual property gains in importance, it also produces problems because the established rules governing property often do not apply to it. Property is a traditional concept originally based on physical manifestation (Barlow, 1995), and it can therefore not apply equally in the world of computers and cyberspace. In order to understand the specifics, it helps to contrast intellectual and traditional property. One difference is that one can trade manifestations of ideas such as images, books, or software CDs whose content nevertheless still belongs to the author (Weckert & Adeney, 1997, p. 65). This means that the acquisition of intellectual property follows different rules from those that apply to the acquisition of traditional property. The most striking examples are again software programs, where one usually has to agree to a licensing agreement before being allowed to install them on a computer. This leads to differences in the characteristics of property that many users are not aware of or do not understand. If a user has property in a car, then she can do with it as she likes. She can for example lend it to a friend, change the interior, or sell it to someone else. If she bought software instead, then the same activities would in most cases be considered illegal.

The rules that lead to these results are usually justified by quoting the necessities of the production of intellectual property. It often is extremely costly to produce one specific piece of information, to produce one piece of intellectual property (Mason, 1986). A large software program can contain thousands of man years of work and be worth millions of dollars. Once this first copy has been produced, however, further copies can be made with very little expense. This reproducibility is a characteristic of property items having to do with computers (Johnson, 2001, p. 93). It has been facilitated and spread even further by the Internet, where files can be stored and downloaded by any interested party. The copies are then usable without any decrease in quality. At the same time the original is not affected by the copying at all. This means that there is another big difference between traditional and intellectual property, which is the issue of theft. While theft deprives the owner of traditional property of her possession and thus decreases her utility, the same does not apply to intellectual property. Kuflik lists three senses in which a person who gets an idea from me need not be taking it away from me: (1) I can still think it; (2) I can still enjoy whatever praise or admiration others might be disposed to give to me as the person who thought of it first; and (3) I can still use it, to all the same personal advantage, in my own personal life (Kuflik, 1995, p. 173).

As we can see, intellectual goods are fundamentally different from goods for which the institution of property was originally developed. Also, computer data and computer programs do not fit the established rules for intellectual property (Moor, 1985). At the same time one can see numerous attempts to strengthen intellectual property rules and to extend the reach of those rules, especially to include computer readable data and programmes. In order to evaluate these efforts, it is helpful to take a brief look at the arguments for and against the protection of intellectual property.

There are several different arguments for intellectual property in general and for its application to IT in particular. First of all, there is the theory of property in the tradition of natural rights, whose most influential protagonist was John Locke (cf. Johnson, 2001). His idea is that we originally own ourselves, our bodies, and therefore everything that we produce. One acquires ownership in something by mixing one's labour with it. This theory can run into a lot of problems in the physical world, but for intellectual property it is deeply plausible. If you write a programme or collect data, then you create something that would simply not be there without you. Locke further discusses a proviso, which can be interpreted to mean that acquisition of property must not lead to anyone being worse off for it (cf. Nozick, 1974; Gauthier, 1986). This, too, seems to be the case for intellectual property, unlike for many examples of traditional property. Another natural rights approach would be that a program could be understood as an extension of the programmer, to which she could again claim ownership (Nissenbaum, 1995, p. 206).

The second group of arguments in favour of intellectual property are teleological ones. According to these arguments, intellectual property should be protected because of the positive effects it has (cf. Weckert & Adeney, 1997, p. 61). The institution of private property can be interpreted as a device encouraging parsimonious yet efficient use of resources (Donaldson & Dunfee, 1999, p. 128). This can of course be transferred to intellectual property. More frequently one can hear the argument that the protection of intellectual property is a necessary prerequisite for the production of software. Innovation and development of new software are seen as positive goods, which can only be guaranteed if the developers have the right to profit from their work. If copying software were legitimate and companies could not profit from their investments, then they would simply cease to invest. This argument is applied to corporations as well as to the individuals who produce intellectual property. Stallman (1995, p. 191) pointedly paraphrases the individual argument as follows:

I want to get rich (usually described inaccurately as 'making a living') and if you don't allow me to get rich by programming, then I won't program. Everyone else is like me, so nobody will ever program. And then you'll be stuck with no programs at all!

The two most important legal means for the protection of property rights in the area of computers and IT are copyright and patents. There is no space to discuss the intricacies of these here, but it should be noted that they are in fact temporal monopolies. Both ensure the exclusive rights of use to their holders for a limited period of time, and both effectively exclude competition. Given that the usual defence of competition is that it serves the common good, it is surprising that the limiting of competition is also justified with the same argument. The question whether intellectual property rights should be granted or not thus boils down to the empirical problem of which course of action will actually promote the common good to a greater degree (cf. Kuflik, 1995).

This last point opens a first counter-argument against the protection of intellectual property. Questions of illegal copying of software, but also of other intellectual property, are at the forefront of international trade agreements, and they are heavily emphasised in international trade relations. The World Intellectual Property Organisation and the intellectual property agreements that form part of the WTO treaties are just two examples. In fact these treaties aim at the international recognition of national standards. Apart from problems of implementation, enforcement, and international legal issues, this area has become a target for the opponents of globalisation. Some opponents argue that the monopolies that are erected in the name of intellectual property threaten the diversity of culture and only benefit some of the big corporations (Smiers, 2001). While this sort of argument refers mainly to other forms of intellectual property such as music, films, and literature, it can easily be extended to the realm of computers as well. The quasi-monopoly that the Microsoft Corporation holds in the area of PC system software is a good example.

Another argument against strong protection of intellectual property in the computing area involves the specific characteristics of software. The most important aspect is that the theft of software or data by copying does not necessarily affect the original owner. When someone steals my bicycle, the fact that makes this hardest to bear is that I am deprived of its use. It can be argued that this is in fact the reason why we find theft immoral, and why moral and legal institutions were built to avoid it. If this is so, then there are no good arguments why the same moral and legal standards should apply if the basis is not applicable (cf. Johnson, 2001, p. 156; Weckert & Adeney, 1997, p. 70). This can be interpreted as an attempt to invalidate the Lockean natural rights argument. Even if one follows Locke by agreeing that putting work into something will lead to the acquisition of property, it does not necessarily follow that intellectual property rights as we see them today are justified by this premise. The probable purpose of Locke's argument, that people should not be deprived of the enjoyment of the fruits of their labour, is not necessarily touched by allowing copies of software and data.

There is also an argument against the teleological defence of intellectual property. The assumption that the producers of software are individual profit maximisers who would cease to work if they did not make a large amount of money from their work is quite weak. Artists, scientists, or academics produce their work usually without the hope of financial rewards (Weckert & Adeney, 1997, p. 62). Some authors (cf. Stallman, 1995) have argued that programming, for example, is a highly satisfying activity that provides people with enough intrinsic motivation to be independent of financial rewards. If this is true, the argument that no more software would be written if we were to drop the protection of intellectual property is wrong. It stands to reason, however, that fewer people would do it and spend less time on it. This disadvantage would have to be weighed against potential advantages. There are some conceivable advantages, which have been listed by Stallman (1995). Chief among them is the increased productivity. Even though less software might be produced, more people could use it because it would be free. Furthermore, existing programs could more easily be adapted because source code would be known; programmers could get a better education for the same reason. Finally, duplicate efforts might be avoided.

In fact, the origins of today's success of computer technology took place in an environment of free access to intellectual property. However, business interests were opposed to this, and they carried the day with legislators on an international scale (De George, 1998, p. 53). This is ironic because the current economic success is based on the set of intellectual property rules which, had they been in place earlier, would most probably have stunted the development of the Internet (Schiller, 1999, p. 10), and presumably most of the other developments in IT as well. Furthermore, it can be argued that the economic success of high technology is only possible because companies could draw on publicly funded and owned research. The capitalist success of IT thus depends on the communist approach to knowledge that is generally adopted by academia (Himanen, 2001, p. 60).

This contradictory theoretical background of intellectual property does not make it any easier for management to deal with it. Managers who want to act responsibly in questions of intellectual property in relation to computers and IT will often find one general rule: act legally. It is a common assumption that following the law will in most cases lead to ethical behaviour and that it is thus the sensible thing to do (cf. De George, 1998, p. 53f). A big advantage of this stance is that it avoids the tricky questions of morality and property. As we have seen, there are some quite strong moral arguments that reject the institution of intellectual property. Solutions like Johnson's (2001, p. 41), "I argue that it is wrong to make a copy because it is illegal, but not because there is some prelegal immorality involved in the act", simply circumvent having to find a solution to moral questions. In this sense they follow Montaigne's moralism by simply endorsing common practices independent of their justification.

However, this solution is often not really helpful because the legal positions are not clear either. This is the case in situations where more than one country is involved and therefore national law does not apply, or where it is unclear which law applies. Even though there is an increasing number of international agreements with the purpose of unifying the legal framework of intellectual property, many questions are still open. Furthermore, intellectual property rights are subject to intensive modifications, and it can be argued that some of the basic assumptions are being changed because of the specific questions arising in IT. One pertinent example is the change of interpretation of patent law and whether it is applicable to software. Up until the 1980s, American courts held that computer programs were mathematical algorithms and therefore not subject to patent law. This interpretation has changed in the last few years, and now patents are granted for software on a regular basis. Without being able to discuss the background of such changes and their potential results, it is clear that managers must find orientations in their dealing with intellectual property that go beyond a simple adherence to the law. This is where the question of responsibility enters the arena and where the concept of reflective responsibility can prove helpful.

If a manager faces a potential normative problem with intellectual property, the solution in the spirit of reflective responsibility will be similar to what was suggested in the case of privacy. First of all the pertinent reality must be established. That means that subject, object, and norms should be discovered. This in turn implies that the stakeholders must be identified and their claims should be discussed. Potential stakeholders here, apart from the usual suspects such as managers, employees, and stockholders, will probably include authors and programmers, users and customers, but also the political system and the general public. Again management will run into the problem that this process, which is modelled on the idea of a discourse, will not be possible to the point where it could create universal legitimacy. And again the answer to this objection would be that the mere attempt to have a stakeholder discourse promises advantages that can outweigh the disadvantages. But there is no guarantee that this will be so.

But even if a real stakeholder discourse is impossible, the theoretical background of reflective responsibility offers some hints as to the interpretation of intellectual property. First, property in general can be seen as an institution that guarantees accountability. It is generally assumed that owners must take responsibility for what is theirs, be it their thoughts, their houses, or the product of their work. Thus understood, property is an institution that facilitates the ascription of responsibility and therefore promises to improve its realisability. In this sense a strong protection of intellectual property is to be welcomed. However, while this argument is a strong support for the institution of property in general, it seems to continually lose weight in the area of IT-related intellectual property. That is because it relies on the assumption that rights imply duties, specifically that the right to property implies the duty to take responsibility for it. Anybody who has ever installed a commercially distributed programme on his or her computer knows that at the start of the installation, they must agree to a licensing agreement. These licensing agreements tend to explicitly contradict what was just said about rights and duties. The producers usually reserve all rights to the program, but they categorically rule out any responsibility (cf. De George, 1998, p. 54f; Nissenbaum, 1995, p. 534). In any other area apart from software, this sort of agreement would run counter to our beliefs. It stands to reason that the identification of the underlying norms of a responsibility ascription would identify this as a contradiction, and that most potential stakeholders would not accept it (even though factually we accept it because we see no alternative). The conclusion is that a responsible manager who wants to claim the rights to intellectual property should also accept the duties that come with it.

Furthermore, there are genuinely social questions caused by the business use of IT. Computers change not only the business organisation of work, but also the social ways of doing things. On the one hand, this allows new industries to develop and has produced unknown riches in some parts of the world. At the same time it leads to rationalisation and loss of employment in other parts. Business and IT combined are the driving force behind the modern form of globalisation, which is one of the clearest examples for a generally perceived need for responsibility (even though it is completely unclear who could be responsible for what and how). [21] IT and business interests also increasingly colonise other areas of society where they played a much smaller part up until now. A good example here is that of education. Under the heading of e-teaching or e-learning, we see a large number of initiatives of moving IT into the educational process at all different levels. This development is right now becoming a fact that most educational institutions have to accept, and at the same time it is also becoming a multi-billion-dollar market.

Questions of privacy, power, and property are of immediate interest to managers because they make decisions based on these issues, or decisions that change the status quo. What most of the problems from this area have in common is that they are within what is frequently called the sphere of responsibility of the manager anyway. Managers would consider themselves, or be considered by employees and superiors, to be responsible for solving these problems. In these cases it is therefore plausible that the idea of reflective responsibility would be attractive to managers trying to understand how they can discharge their responsibilities, and we have tried to show how it offers an approach to these problems that allows them to do so.

There are on the other hand also moral problems caused by the business use of IT that are no longer within the sphere of influence of individual managers. These problems can still have a severe impact on the way the economy in general and a company in particular are run. They can also lead to individual responsibilities for individual managers. It is therefore necessary for managers to keep this higher level in mind in order to be aware of potential conflicts or requirements. The actual list of changes in society and the way it is organised that are due to the use of IT is again potentially infinite and impossible to discuss comprehensively. There is also the problem of the ambiguity of most changes. Whatever new technology is introduced, some people will welcome it and some people will object to it (Schwartz & Gibb, 1999).

The first and most general observation is that IT is increasingly becoming an agent of change. The way we act and interact seems to change because of the use of technology. Some of the buzzwords of business theory of the last few years are closely linked to IT. One example would be business process reengineering/redesign, which proposes that businesses should reinvent themselves and redesign their processes according to technical possibilities (cf. Turner, 1998). Another example is the idea of the virtual organisation, where the borders between and within classical organisations dissolve, and organisational units are created for certain tasks and only exist for the duration of the task (cf. Pennings, 1998).

Critics of these ideas might point out that they are just intellectual fashions and often do not live up to their promises. This is not the place to weigh these claims. However, it is clear that these ideas, whether well-grounded or just fads, do have a real impact on the way the economy is organised. Spectacular examples such as the French Vivendi, a gas and water supplier turned media conglomerate (and now going bankrupt), the German Mannesmann, a steel producer turned biggest German mobile phone operator (before it was sold to Vodafone), or the American Enron, an energy supplier turned virtual marketplace (before it went bankrupt), prove that the changes are real enough. Technology and organisation are closely intertwined, not only on the meso level of the organisation, but also on the macro level of the economy. Even though the dot.com bubble has burst and most of the hype has gone, e-commerce and e-business are here to stay. The advantages of reduced transaction and agency cost, combined with a high degree of customer convenience and service potential, make sure that many markets will continue to develop in the direction of e-commerce. Since every organisational change has the potential to affect someone's moral rights and obligations, it can also be seen as an object of responsibility. Among the specific problems one can count the change in the nature of work, unemployment caused by rationalisation, the power shifts discussed above, and many more. The social changes that ICT engenders on all levels of the economy must therefore be counted among the reasons for responsibility because of IS.

The changes caused by IS are not confined to the area of economic activity in a narrow sense. Another area that has the potential to change many of our institutions and the way society is organised is the use of IT in state, government, and administration. This field, which is usually discussed under the heading of e-government, deals with the way societies use new technologies in their processes of government and administration. Many governments from national to municipal now offer some of their services on the Internet or are at least considering doing so. In many Western states, for example, citizens can file their tax returns or receive information electronically. This can obviously lead to new problems:

  • How can the state guarantee the privacy of the data it receives?

  • For which purposes can the state use the data?

  • Can information given for tax purposes be used to find parents who do not pay alimony or who at the same time get social welfare payments?

  • How does the state authenticate the users of the new services?

Apart from these rather technical questions, there are also some issues involving the very legitimacy of the state. Here, the use of IT can be described as having positive as well as negative effects. On the one hand the widespread use of IT and especially the Internet promises new possibilities of participation in government that were so far impossible. Groups of interested citizens can be included in decision processes, they can create national and international special interest groups, and generally a new dimension in the flow of information can be realised. On the other hand there is the danger that those parts of society that are excluded from the new technologies will be further marginalised or that a technical bias in the perception of reality could influence what topics are considered relevant. [22] Again, businesses and managers are not the only stakeholders in this process, but they need to be aware of these issues because they are among the stakeholders. Furthermore, commercial interests are increasingly shaping the development of technology, which in turn can have repercussions on the legitimacy of e-government.

This section was meant to demonstrate that there is a multitude of ways in which IS can become the reason for the ascription of responsibility. A manager confronted with this list of issues and with the normative problems that form its background might feel overburdened. How is any one human being expected to take this multitude of problems into account and react responsibly? This is exactly the problem of the traditional approach to responsibility, and it is where the reflective approach can offer some directions. One of the strengths of reflective responsibility as it was introduced here is that the manager trying to act responsibly does not have to be aware of all of the aspects that are relevant to the decision or action in question. In order to act responsibly, the manager must answer to those who are affected. This implies that the manager tries to identify those parties and is willing to communicate with them and consider their viewpoints. All of the details that were discussed so far will then emerge insofar as they are relevant to the specific situation. This also facilitates dealing with the fact that the issues we have discussed tend to be interrelated.

As an example of the relationship between different normative questions in IS, let us return to privacy/surveillance. While privacy is a multi-faceted problem in its own right, it is also closely related to other problems, not least of all to those that were discussed in this section. Privacy, for example, can be framed in terms of power, intellectual property, or social issues. The invasion of employee privacy by the employer is very clearly a use of power, which can be interpreted quite well in Foucauldian terms.

Surveillance of employee Internet access or email use, especially when the employee is not certain whether she is being monitored, is a good example of a Foucauldian interpretation of IS. Feeling permanently watched, with the possibility of being punished for infractions of the rules, is the defining feature of the Panopticon. Of course other questions of power also play a role: Who has the power to start surveillance? What power do employees have to stop it or to monitor the monitors? Similarly, privacy is related to intellectual property. If we take a strong approach to intellectual property, then the result can be that everybody owns all information concerning themselves. This would translate into people's right to control who can use what information about them for what purpose. For an employer wanting to monitor its employees, this would mean that it would be an infringement of property rights to use information about them without their consent. Therefore, a company using technical means of surveillance on its employees could be said to act in a self-contradictory fashion if it insists on a strong protection of its intellectual property rights in other respects. And finally, privacy can of course also be seen in the light of the social repercussions it produces. The way a society deals with privacy is an expression of its values, but it can also change its values.

Coming back to our example of the CIO who has to make the decision whether or not to introduce surveillance software, we see that this is not a decision where the responsibilities that may result from it can easily be estimated in advance. This is why a traditional approach is bound to fail. A reflective approach, however, does not face this problem because it is by definition open and able to accommodate differing views. The discourse between the stakeholders allows the determination of the relevant subjects, the corresponding objects and instances, as well as the other determinants. Initiating a responsibility discourse therefore requires, first of all, the willingness to go ahead with it and then to accept the results. It is important to note that such a responsibility discourse may produce more than just one ascription. In our example the immediately affected parties would be the CIO herself, the employees, and higher management. Additionally, one could see the shareholders, customers, regulatory bodies, and many more as affected. Starting a discourse with the immediately affected persons might lead to several ascriptions. The CIO, for example, might be held responsible for the technical proficiency of the surveillance measures and for making sure that surveillance is carried out according to the rules. Management could be held responsible for the supervision, while a new body or institution could be created and charged with the task of creating rules for acceptable surveillance. The CIO might then be held responsible for assembling this new body and chairing it. It is easy to see that we could continue this imaginary case study further. What is important to understand is that these ascriptions, while plausible, might happen this way or might look completely different. It is impossible to foresee the material form of the resulting ascription.

What also becomes clear is that despite the effort involved in initiating the process of ascription and despite the possibility of failure, this reflective approach to responsibility is not only viable, it is in many situations the only recourse someone who wants to act responsibly has. Discursive participation of stakeholders offers the biggest possible resource of knowledge and expertise, thereby overcoming as far as possible the limitations of knowledge, power, and causality of traditional ascription. It thereby relieves individuals to the point that they regain their ability to act. In this sense responsibility is in fact the answer to the normative problems of modernity because it allows addressing the normative problems caused by risk and complexity without requiring the strong standards of most other ethical theories. The formal idea of reflective responsibility can therefore offer a promising approach to normative problems caused by IS, where other approaches fail. Having shown this, we will shift our emphasis toward the future aspect of responsibility under the heading of responsibility for IS.

Reflective Responsibility for Information Systems

Responsibility for IS differs from responsibility because of IS in that it relates to intentional activity with regard to IS: it is prospective rather than retrospective. That means that responsibility for IS looks at what managers would think of first when asked about their responsibility. How is a system designed, how is it maintained, how do business processes link in with technology, and who are the persons to whom all of these things are ascribed? The distinction between responsibility because of and responsibility for IS is slightly artificial because the boundary is unclear. If responsibility for a system is not properly discharged, that can lead to responsibility because of the results of this negligence. On the other hand the experience of being ascribed responsibility because of IS will in many cases lead to decisions that aim at institutionalising responsibility for it.

The difference between the two aspects runs roughly analogous to the distinction between prospective and retrospective responsibility, which was discussed in general terms in Section 5.2.6. Accordingly, there is also the difference between transitive and reflexive ascription. While responsibility because of IS as discussed in the last section tends to focus on things that have happened and tries to establish a responsibility relationship to third parties, responsibility for IS focuses on the future and will therefore have to rely to a higher degree on the voluntary assumption of the responsibility in question. Again, it is important to stress that these two aspects tend to go hand in hand and that they are mutually dependent. Nevertheless, the analytical distinction makes sense because it allows the identification of implications that become clearer when one concentrates on only one aspect.

The field of responsibility for IS is related to or contains many aspects that cannot be discussed here. There is, for example, the entire question of political responsibility for IS. States and governments make decisions that determine how technological development will progress or what sort of technology is likely to succeed. These decisions are made under a great deal of uncertainty because technological development is hard to estimate for more than a short time span. There is a whole field of science that deals with this sort of decision influencing technological development, not only IT development, which goes under the name of technology assessment.

For the purposes of this text, however, we will concentrate on the intermediate range of technology use in organisations and thus on the manager's view. Managers' responsibility for IT in business can be divided according to several criteria. One could distinguish internal from external responsibility or look at the temporal horizon. Responsibility for IS in the short-term sense is just one example of responsibility for certain fields or areas of business. In this sense every line manager is responsible for the correct working and adequate servicing of the systems in his or her area. More interesting is the long-term responsibility for IS, because this is where the reflective approach can help matters most.

The prospective aspect of responsibility for IS can be most usefully applied at those stages where the actual manifestation of information systems is still unclear and where, therefore, considerations of responsibility can effect the greatest changes. These are the early stages of systems analysis, design, or development. If a decision has to be made about the use of IT, about the acquisition of new systems, an upgrade, or any other change, then it is a direct consequence of the theory of reflective responsibility that the ethical and social consequences must be considered as early as possible. Rogerson (1998, p. 16) names the deskilling of jobs, redundancy, and the break-up of social groupings as examples of what project management should think of early on. There are again the same two arguments for this prospective action that we have encountered before. On the one hand it is in the business interest of any organisation to make sure that its investments have a high probability of success. On the other hand it is an imperative resulting from the idea of responsibility that the process of answering to those who are affected by a decision begin as early as possible. Since the first argument comes naturally to businesses, we will put more stress on the second one. The early stages of systems analysis and design should not be misunderstood as pure technicalities where ethics enters only in the form of externally determined specifications, if at all. General managers and IT managers who hold positions of responsibility for system design must be aware that this responsibility expressly also covers an ethical angle. That means that they have to be aware of the fact that a good and appropriate systems analysis and recommendations for design makes it far less likely that there will be either practical or ethical problems with the resultant system (Langford, 1999b, p. 68). 
At least as important is that they recognise the fact that the process of analysis and design is already value-laden before it begins. This refers partly to the last section, to the relationship of IT and fundamental philosophical questions. The questions that are asked before and during the design process, which naturally determine which answers can be given, depend on the worldview of the persons involved. They therefore necessarily imprint this worldview or life-world on the answer that can be given to them. The myth of amoral computing must therefore be destroyed before the planning process can start properly. Managers and technicians must be aware that design often involves ethical decisions based on social context and social values, and the technology transmits or embodies the value decisions (and assumptions) made during the design process (Huff & Martin, 1995, p. 82).

If this is realised, then it also becomes clear that solitary design by individuals or small groups of experts will run the risk of idiosyncrasy. The ideas that were presented earlier as models for reflective responsibility, namely Habermasian discourse or the stakeholder approach, will then automatically enter the picture as viable alternatives. In order to overcome one-sided views and conceptions, the affected parties should participate and they should do so as early as possible.

The discussion of reflective responsibility for IS must necessarily remain on a rather abstract level because, by its very nature as a concern with future developments, it cannot go into details. However, there are two areas about which one can make observations from this general point of view. These concern the reliability of computers and the resulting necessity to design systems for fault tolerance on the one hand, and questions of the framework in which responsibility can be realised on the other. These will now be discussed under the headings of reliability and fault tolerance.

Many of the moral problems produced by computers and information systems are caused by their not functioning as intended. Examples range from the computer crash that wipes out a day's work or more, to incorrect data in FBI files, to failing security features in nuclear power plants. Business examples could include proprietary databases whose integrity is breached by hackers or a lack of order fulfilment caused by faulty customer files. This sort of problem is an object of reflective responsibility for IS because it can at least be partly addressed by forward-looking use of technology. Some solutions can be found in the design of information systems, but in order to account for them, the subject of responsibility first has to be aware of them.

A first question to ask is therefore how reliable information systems are. It turns out that this relatively simple question is not simple to answer. For many other potentially safety-critical technologies, we have clear rules for the evaluation and judgment of reliability. The engineer responsible for the development of a bridge, for example, would be able to clearly state how much weight that bridge can carry, for how long it is designed to last, and other factors of reliability. The same sort of exactness is rarely if ever achieved for computers. The reasons for this are manifold. One of them is the lack of a clear and agreed-upon language for communication about the subject. Risks of software or hardware can often best be expressed in statistical terms, and these are frequently translated into misleading or false statements of a non-statistical nature (cf. Corbato, 1995).

Another source of fundamental problems with the reliability of computers and information systems is their systemic nature and the resulting complexity (Littlewood & Strigini, 1995). One reason for the complexity of computers is the interaction of hardware and software. Reasons for failure can be found in either one and also in a lack of compatibility between the two. An added difficulty can be human error, such as incorrect operation or maintenance of equipment. System failure can be caused by unusual combinations of problems from several of these categories (Borning, 1995). Accidents are usually caused by complex interaction between various components and activities, and it is mostly a mistake to try to attribute them to a single cause (Leveson & Turner, 1995).

There are many other reasons for a lack of reliability of software caused by complexity. One human-centred problem is the cooperation of people from different functional areas with different backgrounds. This sort of interdisciplinary collaboration can lead to misunderstandings that influence systems design and can only be detected in later stages (Rogerson, 1998). A more technical cause of problems is that traditional laws of engineering do not necessarily apply to software. Software is not linear, and a reliability test cannot rely on the interpolation of testing results. Therefore, if one wanted to be certain to exclude faults completely, one would have to test every single combination of input and output and check whether the results are correct. This sort of check is impossible even for relatively small programs, because the permutations of inputs and outputs are astronomical, and a complete check for most programs would be infeasible even if one used the combined power of all computers. What makes matters worse is that even if one could check a program completely, the identification of errors and their rectification would most likely introduce new errors, so that after every cycle of checking and correcting, one would have to start over again. For programs with between 100,000 and 2 million lines of code, the chances of introducing a severe error during the correction of original errors are so large that only a small fraction of the original errors should be corrected (Forester & Morrison, 1994, p. 121).
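The combinatorial argument above can be made concrete with a little arithmetic. The following sketch is purely illustrative (the input width and testing rate are assumptions, not figures from the text): even a function taking just two 32-bit integers has an input space far too large to test exhaustively.

```python
# Back-of-the-envelope estimate of exhaustive testing effort.
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_to_test_exhaustively(input_bits: int, tests_per_second: float) -> float:
    """Years needed to try every possible combination of input bits."""
    combinations = 2 ** input_bits
    return combinations / tests_per_second / SECONDS_PER_YEAR

# Hypothetical example: a function taking two 32-bit integers (64 bits of
# input space), tested at one billion input combinations per second.
years = years_to_test_exhaustively(64, 1e9)
print(f"{years:.0f} years")  # about 585 years
```

And this covers only a single pair of integer arguments; realistic programs have vastly larger input spaces, which is why exhaustive testing is ruled out in principle, not merely in practice.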

An added problem is that even if one could solve all of these problems, the guaranteed correctness of the program would only refer to the relationship between the underlying model and the program. Whether the underlying model is correct and adequate is a question that cannot even be addressed by these measures. Some of the most spectacular malfunctions of computer systems were caused by faulty models. In 1960, for example, a newly installed early warning system used by NATO to detect Warsaw Pact attacks indicated that the United States was under nuclear attack by Soviet missiles with a probability of 99.9%. The reason for this false report turned out to be the rising moon, which the Ballistic Missile Early Warning System had misinterpreted (cf. Borning, 1995). This sort of mistake is impossible to rule out completely by prior testing because the malfunction was not caused by faulty programming, but by an inadequate model of reality. Users and managers of IS must therefore be aware of the fact that even by using new technologies such as program verification, programs can never be proven correct; in other words, one cannot be sure that they will do what is intended (Smith, 1995).

Reflective responsibility for information systems will have to accept the fact that absolute reliability of computers is impossible to guarantee. The reflexivity requires that the expected results of actions be considered and that the realisability of a responsibility ascription be part of the ascription process. A simplistic statement requiring managers to take responsibility for the full functionality of information systems could, in light of the difficulties with ensuring functionality, not count as reflectively responsible. Reflective and prospective responsibility for IS will therefore have to accept the reality of system faults and try to find a way to deal with them. One way to do this is to design fault tolerance into systems. There are different definitions and realisations of fault tolerance. On the one hand there are technical definitions. These tend to stress that the functionality of systems must be retained even in cases where parts fail (Kornwachs, 1996). For information systems this is a relevant viewpoint because it takes into account the nature of the systems. A technical way of achieving this is to build in redundancy. One can have different design teams develop several versions of a program in the hope that the teams will not make the same errors (Littlewood & Strigini, 1995). Another way to achieve fault tolerance is to limit the criticality of any one part. That means that critical parts of a system should be constructed with as little complexity as possible, that backup systems are still less complex, and that main and backup systems operate completely independently.
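The redundancy idea just mentioned, letting independently developed versions of a program compute the same result and comparing their outputs, can be sketched as a simple majority vote. This is a toy illustration of the principle, with hypothetical version functions and a deliberately planted fault, not a production fault-tolerance mechanism:

```python
from collections import Counter

def majority_vote(results):
    """Return the value produced by most versions; fail if there is no majority."""
    value, votes = Counter(results).most_common(1)[0]
    if votes <= len(results) // 2:
        raise RuntimeError("no majority: versions disagree")
    return value

# Three hypothetical, independently developed versions of the same
# computation; version_b contains a deliberate fault.
def version_a(x): return x * x
def version_b(x): return x * x + 1  # faulty version
def version_c(x): return x ** 2

def fault_tolerant_square(x):
    # Run all versions and accept the majority result, so a single
    # faulty version is outvoted by the other two.
    return majority_vote([v(x) for v in (version_a, version_b, version_c)])

print(fault_tolerant_square(7))  # 49; the faulty version is outvoted
```

The sketch also shows the limitation discussed in the text: if the teams make the same error, all versions agree on a wrong answer and the vote happily accepts it, which is why independence of the versions is the crucial assumption.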

While such methods are certainly necessary and it appears to be highly desirable that they be implemented in safety-critical systems such as control systems of nuclear power plants, airplanes, or military hardware, users and managers should be aware of their limitations. No formal proof can guarantee the reliability of systems, and even the best laid-out improvements of safety features can fail. In fact, the continuous attempts to improve safety can themselves become liabilities (Borning, 1995).

An important point for managers and technicians to realise is that reliance on technology and belief in its infallibility can have dire consequences. Human beings are unable to supervise and control dynamic systems without mistakes. Therefore, the hope that mistakes in complex systems can be ruled out by human supervision is doomed to failure (Bergmann, 1996). It is doubtful whether the distinction between technical and human errors makes any sense at all. One can argue that most if not all faults are caused by human action (De George, 1998; Leveson & Turner, 1995). The attempts, usually heard after technological catastrophes, to exculpate technical and socio-technical systems on the grounds of singular human errors make little sense in this light.

What is necessary is the acceptance of the old wisdom of errare humanum est. If decision makers go so far as to accept their fallibility, which as we have seen is another characteristic of reflective responsibility, then the question is: How do we deal with the risks and lack of reliability of all technical systems and especially of information systems? There is clearly no easy answer to this, but the idea of reflective responsibility can suggest an approach. The first step is to identify the moral question of the use of risky technology as such. References to technology, to fault tolerance, and to specific problems can be used to conceal the moral nature of the question. If technology is used and if it has the potential of affecting people's lives and rights, then this is a moral issue. It has to be recognised and treated as such. Reflective responsibility is helpful in this regard because the first step in ascribing it is the identification of the relevant reality. That means that the facts of the matter must be addressed as well as the normative foundations. The next point where the theory of reflective responsibility is helpful in dealing with technical systems is its emphasis on the participation of the affected parties. Solipsistic decisions concerning complex systems are likely to fail because they tend to neglect relevant aspects. Using a participatory approach as suggested by reflective responsibility can overcome this problem. Finally, reflective responsibility is aware of its own fallibility, which is another important aspect. It indicates that even under ideal circumstances, the entire process of ascription may fail, which in the case of prospective responsibility for technological systems can mean that the system fails and that unforeseen results appear. This realisation leads on to another level of responsibility ascription, namely to the question of who is responsible for the distribution of responsibilities. 
What that means is that in view of the fallibility of responsibility, it must be ruled out that responsibility can be assumed merely on the grounds of personal interest. There must be institutions that ensure that responsibility can only be assumed or ascribed by parties that are competent to do so. Of course we are surrounded by such institutions. A middle manager of an electricity supplier cannot take responsibility for building a nuclear power plant. In the case of business information technology, however, such meta-responsibilities are not yet established. Who is responsible for the way technical development will go? Who makes steering and controlling decisions, and on what grounds? Right now there seems to be an implicit consensus in the industrialised world that market interests should shape the future development of information technology and its use. It is unclear up to what point these forces have a legitimate right to control development, and where the social, political, or legal limits are or should be. This responsibility of a higher order is something managers of IS must be aware of, and in order to act responsibly, they have to assume their role in it as experts and decision makers.

Responsibility as a social construction needs to take more into account than just the nature of its object, which was just discussed as the fallibility of information systems. It also needs to take into account its own social nature. Managers who have responsibilities for the use of computers and information technology must ensure that the ascription process can be realised, that it has consequences, and that it follows the agreed-upon rules. We are not going to dwell on this point for too long because it is mostly the specific application to IS of the general directions that resulted from the analysis of the concept of reflective responsibility in Chapter 6. It does seem useful, however, to recall what these consequences are and how they apply to IS.

Before one can discuss the framework of responsibility for IS, one must recognise its nature. That means its characteristics as an ascription must be established and, as a result, its social nature must be accepted. If this is the case and the manager or group in question agrees to accept responsibility for IS, then there is a multitude of aspects that need to be covered. These start with external factors for which managers have no direct responsibility, but which can be partly ascribed to them. This refers to the social framework as it is provided by society and the state. Many of the questions regarding responsibility matters in IS are regulated or can at least be subject to official regulation. Questions of health, security, ergonomic design, etc., are regulated by law in most societies. For reflective responsibility this means that there is a normative basis for ascriptions. The first step of clarifying the issues can in many cases be greatly facilitated by such official social rules. While these rules will in most cases be part of the responsibility environment that the actors have to adapt to, they are at the same time an object of responsibility from the reflective point of view. If official regulations can clarify the ascription, then reflective responsibility demands that these regulations be realised. For the individual manager, this can mean that from her primary responsibility for IS, a secondary responsibility for the collaboration in the development of social rules can arise. An example might be a manager responsible for the implementation of a new system who finds that safety standards of a certain type of device are insufficient or do not exist. In this case a new sort of moral responsibility for the development and implementation of adequate safety standards can flourish. 
That can mean that the manager provides means to professional bodies that define such standards, that she collaborates as an expert on the development, or that she uses her role as a citizen in a democracy to effect similar change. The framework of responsibility, when viewed from the reflective viewpoint, therefore also becomes an object of responsibility.

This responsibility for the framework, for the realisability of responsibility, of course goes beyond the social and political frameworks. There are a great number of issues that could be counted under the heading of framework which are placed within the classical areas of responsibility of management. That means that managers are aware of them or at least could be aware of them, and that they have the power to influence them. Among them are those areas where morality and economic reasoning coincide, such as the question of efficiency. Efficiency is an economic good, which on a business level usually stands for the adherence to the economic principle of minimisation or maximisation. Either way, the amount of input is to be optimised with regard to the amount of output. This is not only an economic aim, but it is also a moral good (cf. Donaldson & Dunfee, 1999). Efficiency is morally relevant because it is the basis of wealth. The ethical defence of capitalism usually emphasises this aspect, the fact that markets produce the maximum welfare, which can in turn be seen as the basis of a moral life.
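The economic principle mentioned above can be restated as simple arithmetic: efficiency is the ratio of output to input, and it improves either by minimising input for a fixed output or by maximising output for a fixed input. The following sketch illustrates this; the figures are purely hypothetical and serve only to make the two variants of the principle concrete.

```python
# Hypothetical illustration of the economic principle: efficiency as the
# output-to-input ratio. Both variants of the principle raise this ratio.

def efficiency(output: float, input_: float) -> float:
    """Return the output-to-input ratio."""
    return output / input_

# Baseline: 100 units of output for 50 units of input.
baseline = efficiency(100, 50)    # 2.0

# Minimisation principle: same output, less input.
minimised = efficiency(100, 40)   # 2.5

# Maximisation principle: same input, more output.
maximised = efficiency(120, 50)   # 2.4

# Either way, the ratio improves on the baseline.
assert minimised > baseline and maximised > baseline
```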

While efficiency as management's responsibility is probably hardly contentious because it clearly benefits everybody involved (even though it can of course come into conflict with other values such as employment), there are other framework issues where management needs to become active. We have seen before that one of the conclusions to be drawn from the idea of reflective responsibility is that it has to consider the viability of ascriptions and that this in many cases will require the introduction or maintenance of institutions. These institutions should ensure that the ascription of responsibility stands a chance of being realisable and acceptable. In other words, they should help to increase the viability of responsibility. This can be done by improving the clarity of all of the dimensions involved. An important aspect of this will be the improvement of accountability. Accountability is of high importance in information systems because their systemic nature, the problem of the many hands that design and use them and the combination of hardware, software, and people, often makes it hard to determine which person or group caused which result. Institutions of accountability are meant to overcome this problem by establishing clear guidelines for the determination of causalities. It is hard to say what these institutions would look like in any particular circumstances, but examples are easy to imagine. They can range from organisational measures such as clear job descriptions and understandable outlines of tasks, over technical measures such as well-defined levels of access, to broader questions of organisational culture, which determines whether accountability is taken seriously at all. Also, accountability should be recognised as something that is not an end in itself and not an absolute entity, but something that itself can only survive in a social environment and under norms that are conducive to its development (Johnson & Mulvey, 1995).
Accountability and responsibility in the sense the words are used here are therefore mutually dependent.

A final important aspect with regard to the framework of responsibility for IS and accountability concerns side effects. These pose a serious problem for responsibility ascriptions because they complicate the process due to the lack of intention that is inherent to them. Traditionally, in law as well as in ethics, the subject of responsibility had to show an intention; it had to fulfil the mens rea requirement in order to be held responsible. Side effects are defined by their not being intended, and therefore responsibility ascriptions for them tend to be difficult. There are basically two ways of dealing with this problem. On the one hand one can define responsibility without intention. This is what the legal doctrine of strict liability does. In strict liability, producers of goods are held liable for damage caused by these goods, independent of their knowledge of defects or their intention of selling them defectively. On the other hand one can attempt to realise responsibility by removing side effects from their obscurity and by explicitly making them objects of responsibility. The two approaches are clearly related. The doctrine of strict liability aims at improving the responsibility of producers for side effects and thus implies that it is possible to move these effects to the centre of attention. Managers trying to anticipate responsibility for side effects can do so by using different organisational and institutional measures. The most important step is the raising of consciousness that side effects exist and that they are considered to be objects of responsibility. The realisation of responsibility by discursive methods, by stakeholder analysis, or by any other means that includes a considerable number of affected parties is a good way of doing this. The more people who have a chance of participating in discourses about decisions, the higher the probability of something unexpected being detected before it becomes serious.
The downside is of course that decision processes become unwieldy and, in a worst-case scenario, do not happen at all any more. It is therefore part of managers' duty to weigh these considerations and decide to what point such discourses can be led. This then brings us to another aspect of reflective responsibility, to the faculty of judgment which is necessary for the successful ascription of responsibility. It becomes clear that managers who want to act responsibly and who are in a position to do so need this capacity in order to be successful.

The aspect of prospective responsibility for computers and information systems in a business environment is by definition highly complex because it deals with developments that are still to come. However, the two points discussed here, the recognition of the fallibility of information systems and the emphasis on framework and institutions, can go some way in helping managers identify areas where they should become active. One important aspect of the framework is that it has to build a bridge to the retrospective view, to responsibility because of IS. Institutions should not only clarify who should look out for what, but they must also make it clear what the consequences for neglecting these institutions are. Institutional and organisational rules put in place should make both of the temporal aspects clear to the affected parties.

Apart from the two temporal aspects of the relationship between responsibility and IS, there is a third aspect which aims less at the establishment of responsibility ascriptions than at the impact that IS has on the processes of responsibility itself. This is what will be discussed as responsibility through IS in the next section.

Reflective Responsibility Through Information Systems

While the last two sections were used to analyse what it can mean to ascribe responsibility because of and for information technology in a business setting, this one will look at how the use of computers and IT changes the nature of these ascriptions. It is obvious that the use of different media of communication can change the character of communication, its moral and informational content, and consequently also the character of responsibility. However, it is less clear what the relevant mechanisms of change are and how they are to be evaluated. The answer to these questions depends on the choice of theory used for the description of communication. In order to demonstrate the effects of IT on communication and responsibility, we will use the Habermasian theory of communication because it is already well introduced in this text and because it has already been demonstrated that it is a useful theory with regard to responsibility due to the structural similarity between the two.

According to Habermas, human beings communicate in order to survive and prosper in the world. Communicative action serves these aims, but it goes beyond that. Communicative action is the highest form of human action (compared to pragmatic and strategic action), because it recognises the others not only as objects of manipulation but as subjects or, to use a Kantian expression, as ends in themselves. We all live in our respective life-worlds which form the horizon of our realities. These life-worlds, however, are not purely idiosyncratic, but they are constructed and modified using communication. That means that our life-worlds are always there as a sort of background conviction that only becomes relevant at the moment when it leads to an impediment to collective action. If two or more life-worlds (more exactly: aspects of life-worlds) collide and thereby change their status from an unconscious background resource to an issue of contention, then we come to the point where discourses become necessary. Discourses in the Habermasian sense are discussions that take place in an ideal speech situation or at least under the conscious effort to produce circumstances that resemble the ideal speech situation as far as possible. In order to achieve an ideal speech situation, the participants should:

endeavour to ensure that (a) all voices in any way relevant can get a hearing, and that (b) the best arguments we have in our present state of knowledge are brought to bear, and that (c) disagreement or agreement on the part of the participants follows only from the force of the better argument and no other force. (Habermas, quoted in Ess, 1996b, p. 216)

Every statement made in a discourse is implicitly accompanied by three validity claims: truth, rightness, and authenticity. [23] That means that whenever something is said in a discourse (or outside of discourse as well, but in everyday circumstances it may be less obvious), then the speaker implies that the proposition is true, that it is normatively right and acceptable, and that she means what she says, that she is authentic or truthful. It is these three validity claims that necessitated this brief repetition of Habermas' theory because they allow a good insight into how computers and information technology impact on the process of responsibility ascription.

Validity claims in the Habermasian sense are important for responsibility ascriptions because they constitute the first step in conforming with the idea of reflective responsibility, which is clarity of the notion. As was shown in Section 6.3, the first and most important step in ensuring an open and effective ascription of responsibility is to clarify what exactly is involved. That means that the dimensions must be identified as well as the relevant norms and factual circumstances. This is where validity claims enter the picture. The clarification of all of these points is only necessary in those cases where they are controversial. Controversial claims, however, need to be argued for or against. These arguments are based on validity claims. The outcome of the arguments determines how responsibility is ascribed, to whom, with which consequences, etc. It is therefore clear that if IT affects validity claims, it also affects responsibility ascriptions.

The use of computers and IT can affect validity claims in different ways. First of all they change the starting position with regard to claims by affecting our life-world. The shared life-worlds in western industrialised societies nowadays contain entities which 50 years ago would have appeared ridiculous, such as cellular phones or virtual reality. That means that our shared reality changes, and accordingly our claims to truth, which are based on our reality, change as well. Another way in which computers change claims to truth is by influencing our epistemological criteria. According to Postman (1992, p. 115), the recourse to computers has become the modern-day equivalent of "it is God's will" and has roughly the same consequences, namely immunising statements against criticism.

Another aspect of the changes of validity claims incurred by the use of IT are the anthropological effects discussed earlier. Humans are seen as computers or computers are viewed as humans. Both views affect the basis of our assumptions about humanity. Our view of humanity, on the other hand, is the basis of responsibility ascriptions. This leads back to the conditions that subjects have to fulfil, and it is clear that humans, viewed as information-processing systems, will not be considered free and thus will generally fail to be responsible.

Apart from these fundamental ways in which IT can change responsibility ascriptions by affecting fundamental notions, there are also some effects it can have just by acting as a medium of communication. This thought is easily understood by looking at the two most important innovations in computer-mediated communication: hypertext and email. Email is not equivalent to other means of communication because it crosses the line between several of them. On the one hand it is similar to written communication as we find it in letters or books. On the other hand the rapidity of exchange and the ubiquity of its use make it appear closer to face-to-face or telephone communication (O'Leary & Brasher, 1996). In some respects the characteristics of email make it an ideal medium for the exchange of validity claims:

In terms of argument patterns, the rhythm of email and mailing list exchange encourages opposing manifestos and summaries but also quick movement from what you just said to the arguments and presuppositions behind it. Positions get examined from a variety of angles, and there will be demand for backing on specific points. This makes email a good medium for the kind of dialogue that Habermas speaks of, which demands justification for each speech act and inquires into the validity and sincerity of claims. (Kolb, 1996, p. 16f)

At the same time email threatens classical structures that were erected with the purpose of guaranteeing the truth of claims. In the paper-based world, there are structures such as peer review or editors which aim at ensuring that published material adheres to certain standards. Mullins (1996, p. 276) refers to these structures as "gatekeeping functions". While it has always been possible to circumvent such structures, it is nevertheless true that this was only possible by investing a considerable effort. Therefore we are used to a certain standard of written texts, and these have a highly plausible prima facie claim to truth. For email this additional filter does not exist, and it therefore again transcends the division of written and spoken communication with a direct impact on validity claims.

The second example of the effect of IT on validity claims is hypertext. The introduction of hypertext is probably one of the main reasons for the success of the World Wide Web. Clicking on a link and thereby moving to another document, another thought, another thread of discussion is easy even for those who know little of computers. At the same time it seems to appeal to us, maybe because it is closer to our way of thinking than traditional long and linear texts. It seems that hypertext in this respect reflects other social developments which move away from long and strenuous activities toward shorter and more entertaining ones. A good example of this trend is to be found in e-commerce and its approach to customers. Consumers have to deal with a flood of information that they cannot possibly process and most of which is therefore ignored. The emphasis in e-commerce is therefore increasingly on catching the customer's attention, and one can frequently hear talk of attention as the central resource in the information age (cf. Zerdick et al., 2001; Liebl, 1999). While it is not easy to see what exactly the effects of such developments on social relations are, it is clear that they can impact on real discourses and on our evaluation of validity claims.

On the other hand, the use of computers and information technology offers new chances for responsibility ascriptions. New interest groups can form and new objects of responsibility can be identified. While this argument is often used to demonstrate the democratic nature of new technologies, especially the Internet, it is also applicable to an organisational setting. Group collaboration systems, for example, can help increase communication between members of one organisation and thereby stimulate new areas of responsibility. This is often described as the purpose of collaboration systems even if the term responsibility is not used.

While information technology can help identify dimensions of responsibility and the formation of interests, it can also be helpful in conducting responsibility ascriptions. If we continue to apply Habermas' theory of communication to responsibility claims, then one of the conclusions is that real discourses, which can be responsibility discourses, need to approximate ideal discourses in order to produce valid results. From the point of view of reflective responsibility, the validity and viability of ascriptions is important. Therefore real responsibility discourses should be as close to the ideal speech situation as possible. It is possible to employ information technology for these purposes. According to Lyytinen and Hirschheim (1988, p. 24f), information systems can have emancipatory effects. They can make social relationships more symmetrical, thereby reducing organisational power, and they can allow new interpretations of data by redistributing access. Empirical research in systems which have the express purpose of facilitating communication between hierarchies, for example group decision support systems, suggests that some features of computer-mediated communication can live up to these emancipatory promises (cf. Laudon & Laudon, 1999). An important factor here seems to be anonymity. By using anonymous communication in computer-mediated discussions, real-life power differences can be overcome and discussions can be held in a more detached and objective atmosphere. This allows participants who would otherwise be kept out of the discussion to voice their opinions. Since the equality of the speakers and their mutual recognition is a key part of Habermas' ideal speech situation, one can conclude that the use of IT can facilitate discourses and help approximate them to ideal discourses.

There are of course also downsides to responsibility ascriptions by discourses mediated by computers and IT. A first counterargument could use the same starting point as the last one and argue that IT has the opposite effect, that it moves discussions away from ideal discourses. This can happen in several ways. Firstly, IT-mediated discussions require technical access and competencies in the use of technologies. The majority of humanity has neither and is therefore excluded from them. But also in an organisational setting where management could ensure equal access, the ability and willingness to use technologies varies and thereby changes the power that individuals can wield in their social setting. Secondly, technology-mediated discourses also offer the possibility of hidden manipulations of power. Group decision support systems (GDSSs), for example, tend to have a central mediator who controls the entire communication. This allows her to set the agenda and to rule which contributions are admitted and which ones are not. Similarly, many other technologies allow manipulations of communication of which the participants are not aware.

Another argument against the use of technology in responsibility ascriptions is the formal structure of information systems. It is true, particularly of information systems used in businesses, that they are rather fixed and that they thereby violate some of the ground rules for discourses, the chance to express opinions through argumentation. Also, commercial information systems usually do not aim at testing people's opinions but aim at facilitating action. In this way they can produce particular social relations (cf. Lyytinen & Hirschheim, 1988, p. 24). This happens because information gets objectified in information systems, and therefore discourses and responsibility ascriptions get frozen and objectified too (cf. Ulrich, 2001).

Another problem with the use of information technology in ascribing responsibility is information overload. Even under the best of circumstances, where IT helps identify the dimensions, clarify the rules, and facilitate discourse, it comes up against the limitation of human reason. Humans can only process a limited amount of information, and the information available through technological means multiplies this amount. The old axiom of economics that more of a good is always better therefore is not applicable to information (Kolb, 1996).

Most of these points refer to the effect of the use of computers and information technology on responsibility and discourses in general. They are nevertheless easily applied to business or organisational responsibility ascriptions. Going back to the central idea of this chapter, to the question of what managers can learn from the theory of reflective responsibility with regard to IS, it is easy to see that the aspect of responsibility through IS is quite ambivalent. Managers who want to be responsible, who want to facilitate or realise ascriptions of responsibility, will in many cases have to rely on the medium of ICT for this purpose. This last section has shown that the use of this medium is not neutral, but can improve or degrade the quality and thus the acceptability of responsibility ascriptions. This knowledge is important because it affects the viability of responsibility and therefore also its ethical acceptability.

[20] For a more detailed discussion of the ethical impacts of IT on democracy, see Stahl (2001b).

[21] For a more detailed discussion, see Stahl (2001c).

[22] For a more complete overview of the impact IT has on the ethical aspect of democracy, see Stahl (2001b).

[23] One sometimes finds references to a fourth claim, to understandability. For our purposes the three above claims are sufficient; since the fourth claim is not reflected in most of the literature, we will just leave it aside.




Responsible Management of Information Systems
ISBN: 1591401720
Year: 2004
Pages: 52
Authors: Bernd Stahl
