Preliminary Considerations About Human Inquiry

In this section, we begin by describing some implications of cultural development for the understanding of human inquiry, which have been widely discussed over the last 30 years within the modernism-postmodernism debate. Especially in response to the "anything goes" debate, we use Kuhn's (1996) concept of a paradigm to develop a framework for the conceptualization of different paradigms of inquiry, which will be applied in the following sections.

Winds of Change: From Modernism to Postmodernism

Societal and cultural influences play a major role when we try to understand human inquiry and science in general. Organizations are cultural constructs. Nature, wherever it occurs within the range of human consciousness, is already culturally 'coated.' Technology, itself a result of cultural achievements, continually becomes part of culture as it evolves.

Culture can be regarded as neither a temporal nor a spatial constant. Historically, however, culture's influence upon human cognition was ignored most of the time. With his famous dictum "homo mensura," Protagoras gave us a first hint of the cultural conditionality of human cognition. But it took until the beginning of the 20th century for cultural anthropology and philosophy of culture to emerge.

Modernism and postmodernism are differentiated as two cultural epochs. This difference is of interest especially with respect to their corresponding conceptions of human inquiry.

In general, modernism may be regarded as a framework for the epoch of the industrialized society or, in philosophical terms, the "Age of Enlightenment." Even if there is no identifiable 'starting point' of modernism, five developments are regarded as constitutive of the modern age: the emergence of the nation state, the advent of science, economic and technological progress, the rise of the West, and the secularization and democratization of society (Allen, Braham & Lewis, 1992; Hall & Gieben, 1992). Scientific development led to an intellectual revolution with a subsequent rejection of superstition, tradition, and religious authority. Technological advances, together with the economic transformation of the 18th and 19th centuries, introduced a period of self-sustaining growth. Growth was also supported by the rise of the West as the main economic and political power through worldwide exploration, colonization and exploitation of resources.

The fundamental assumption of modernism is thus basically positivist, coupled with a naïve ontological realism and an epistemological objectivism. Until the end of the 19th century, the mechanization of the world was, at least for most researchers, the foundation for the explanation of virtually any problem. Technology improved production and living conditions; a belief in progress dominated questions on the development of society and science.

Along with all the progress made during modernism, most industrialized societies went through a structural change that economists describe with the "three-sector hypothesis" (Fourastié, 1954). The service sector is becoming more important today. Due to this change, however, the formerly modern workforce is confronted with entirely different problems, to which its modern world view (with the assumptions and attributes listed above) no longer provides appropriate answers. Many of the basic assumptions of modernism no longer hold in today's world. This led philosophers to question modern issues using a different framework: postmodernism.

Postmodernism, which has its origins in a fundamental criticism of modernism, is first found in the arts and architecture at the beginning of the 20th century (Bell, 1962; Pannwitz, 1917). From there, three concepts of the postmodern evolved, which, according to Gmür (1991), can be summarized as (1) the postmodern as the opposite of the modern, (2) the postmodern as a standstill in progress, and (3) the postmodern as a new, radical modern. The differences between many postmodern views do not always become clear. Science itself is evolving as well. Postmodern concepts of science break with modern ideas such as "truth" or the "unity of science" and encompass "multimethodological approaches" which are highly dependent on social and cultural context (Bell, 1926; Hassard, 1993; Rosenau, 1991). Cultural and scientific developments, influenced by the postmodern turn, have also led to a different view of human inquiry and knowledge. Classic (i.e., modern) concepts of "knowledge," which regard it as an objective entity, are superseded by postmodern concepts, which view knowledge as culturally determined, subjective, or social. As a consequence, epistemology, the classic field of philosophy dealing with human inquiry and concepts of "knowledge," now also discusses these new concepts (Lyotard, 1984; Rorty, 1991; Sorri & Gill, 1989).

In summary, postmodern critique signifies a general process of de-legitimization. In the scientific sphere, a loss of confidence in rational theory, in the safeguards of rigorous research methods, in the capacity for objective knowledge, and in the promise of steady progress in the growth of knowledge can be asserted (Gergen, 1996), whether under the 'postmodern' label or not. However, developing a sound philosophical foundation of human inquiry and knowledge has not become easier, since there is a multitude of positions to choose from, as we will see in the following sections.

The Concept of "Paradigm of Inquiry"

Being in favor of the intellectual diversity supported by postmodern conceptions of science, but wary of "anything goes" in its literal meaning, we argue that researchers must specify the (pre-)suppositions their work is based on and can be judged against. This demand does not necessarily preclude the simultaneous use of different methods and methodologies. But a mixture of methodologies based on incompatible ontological, epistemological, or anthropological presuppositions has to be rejected. Otherwise, due to their incommensurable (pre-)suppositions, multimethodological approaches will allow no unequivocal interpretation of the results obtained, rendering their claims arbitrary.

Grounding research programs within a specific discipline is of limited use, because such 'local' concepts hinder transdisciplinary research. We argue that the foundation has to be laid on a meta-scientific level. With reference to the philosophy of science, and drawing on the concept of "paradigm" developed by Kuhn (1996), we propose a framework for the conceptualization of so-called "paradigms of inquiry." While philosophers mostly desist from classifying positions, we want to use the framework for a classification, even if it is reductionistic. This framework not only allows the positioning of paradigms relative to each other, but also supports the conceptualization of new ones. Thus, it serves descriptive as well as prescriptive purposes. We are well aware that all frameworks have one problem in common: the dimensions used for their composition are arbitrary, and the 'borders' imposed by the categories chosen may imply distinctions which are not always appropriate to make, as we will see when describing sociopragmatic constructivism.

An unspecified reference to Kuhn can be considered problematic, since Kuhn himself used the term "paradigm" in at least 21 different senses (Masterman, 1970, p. 61). This makes it necessary to take a closer look at the concept of "paradigm" used here.

Kuhn's concept of "paradigm" is the result of an analysis of both the history and the sociology of science. It concludes in a rejection of induction and falsification as the driving forces in the development of science. Thereby, the thesis of continuity in science, in the sense of cumulative scientific progress or a constant growth of the knowledge available to mankind, is negated. Rather, scientific development (and the development of knowledge along with it) is seen as characterized by revolutions that are followed by phases of relative stability, and so forth.

According to Kuhn, the development of a science starts with a pre-scientific, pre-paradigmatic phase, which is characterized by a non-systematic search for and analysis of facts (observation data) and, for the most part, an absence of theory. What may look like endless freedom at first glance, for example in choosing the object or methods of research, appears on closer inspection to be limited and framed by existing social and cultural experience (Fleck, 1935). Possible ways out of this phase are strongly influenced by the concept of the "scientific community" which, through interaction between participating scientists, leads to the development of different "schools" (de Solla Price, 1963). Within each of these schools, a consensus on different aspects of their science forms over time, which is then communicated in more or less formal ways. As Kuhn (1996) writes:

"A scientific community consists [] of the practitioners of a scientific specialty. To an extent unparalleled in most other fields, they have undergone similar education and professional initiations; in the process they have absorbed the same technical literature and drawn many of the same lessons from it. Usually the boundaries of that standard literature mark the limits of a scientific subject matter, and each community ordinarily has a subject matter of its own. There are schools in the sciences, communities, that is, which approach the same subject from incompatible viewpoints" (p. 177).

The increasing structuring of research topics and research methods opens the paradigmatic phase of a science. Pre-science becomes normal science, "firmly based upon one or more past scientific achievements, achievements that some particular scientific community acknowledges for a time as supplying the foundation for its further practice" (Kuhn, 1996, p. 10). Kuhn (1996) did not give an unequivocal description of the function of a paradigm in normal science, which has led to intense debates on this issue (Banville & Landry, 1989; Lakatos & Musgrave, 1970). Masterman's (1970) attempt to classify Kuhn's different uses of the term differentiates three concepts of "paradigm": the metaphysical, the sociological, and the construct paradigm (pp. 65–68).

The metaphysical paradigm represents the most extensive consensus possible within a science: a worldview or Weltanschauung. Worldview, as understood by Kuhn, implies that perception is influenced by experience. In this sense it is like a 'lens' through which we perceive the world. Changing from one worldview to another is not a continuous process, but rather a radical shift. It is impossible to view the world through both 'lenses' at once. The world, as seen with the old worldview, has a different 'Gestalt' than the one seen with the new one. The two cannot be compared; they are incommensurable (Kuhn, 1996, pp. 111–117).

The sociological paradigm encompasses "the entire constellation of beliefs, values, techniques, and so on shared by the members of a given community" (Kuhn, 1996, p. 175) and is a concretion of the metaphysical paradigm. Taking the social dimension into consideration when describing the sciences emphasizes the subjectivity of their self-conception. At the same time, the concept of "objectivity" in science has to be dismissed. Instead, the sciences are to be considered socio-cultural phenomena, which can only be interpreted with respect to their social, historical, and cultural background. Theories based on the idea of a cumulative development of science and knowledge are unable to explain fashions and recurrent ideas in science, as Kuhn (1996) illustrated with several examples from the history of science.

The construct paradigm is the most concrete form of a paradigm. It refers to the methodical layer of science, to specific tools, instruments, and procedures for producing and collecting data. Many authors have challenged Kuhn's (1996) thesis of a single dominating paradigm within normal science (p. 19), criticizing it as 'monistic' (Banville & Landry, 1989, p. 49). Questioning Kuhn's argument, they have tried to demonstrate the existence of multiple paradigms in a science, as Culnan (1986) does for the field of management information systems. Masterman (1970) differentiates between single-paradigm and multi-paradigm sciences, the former applying to the natural sciences, the latter to the social sciences (pp. 73–76). Kuhn (1996) revises his paradigm concept, describing paradigms as "constellations of group commitment" (p. 181). In order to avoid further terminological problems, he proposes the term "disciplinary matrix." The main elements of this matrix are "symbolic generalizations," "common beliefs," "shared values," and "shared exemplars" (pp. 182–187).

Postulating, as we do, a paradigmatic foundation of research has also been criticized. Banville and Landry (1989) assert, "one should not distract researchers from their daily activities and ask them to try to set up a set of rules to be called a paradigm; rather, one could observe how these researchers proceed, elaborate a model and propose it as a paradigm" (p. 50). But this view is itself questionable if we take Kuhn's (1977) characterization of normal science into consideration: "Normal scientists never examine their paradigm critically, in particular the paradigmatic theory. They simply use the theory uncritically as an instrument for puzzle solving" (p. 141).

This description of missing reflection in science can equally be taken as an explanation for the missing philosophical foundation of ISR. Normal research within a paradigm never questions its own basic assumptions. But even if someone wants to make a conscious choice between paradigms, e.g., in a crisis situation, things do not get easier: the existence of several adoptable paradigms in a research field raises the question of which paradigm or which 'school' the researcher should follow. The decision in favor of one particular paradigm should be well considered, since its consequences have a fundamental impact on all subsequent work. As Lincoln (1990) states:

"The adoption of a paradigm literally permeates every act even tangentially associated with inquiry, such that any consideration even remotely attached to inquiry processes demands rethinking to bring decisions in alignment with the world view embodied in the paradigm itself" (p. 81).

The consequences of adopting a paradigm result in several advantages and disadvantages:

"For example, research paradigms facilitate the practice of a community of scholars by providing shared assumptions about the nature of phenomena, a vocabulary for representing such phenomena, and criteria for evaluating scholarly work. On the other hand, frames are constraining when they reinforce unreflective reliance on established assumptions and knowledge, distort information to make it fit existing cognitive structures, and inhibit creative problem solving" (Orlikowski & Gash, 1994, pp. 176–177).

For this reason, Kuhn (1996, p. 178) emphasizes the importance of concurrent or sequential affiliation with different scientific communities, resulting in the "enlargement of the horizon" aimed at by inter- or transdisciplinary research.

Postmodern concepts of science have challenged the modern monoparadigmatic concepts, accompanied by an increasing propagation of antipositivist epistemological positions, which reject ontological realism. Giving up the idea of an objective reality, the problem of "missing ontological unity" (Kanitschneider, 1991) can be interpreted in a new way. Research results no longer have to claim absolute truth or absolute insight; instead, they have to be evaluated on the basis of their pragmatic value. Because the goals of research result from individual or social demands, research becomes increasingly subjective. Therefore, postmodern concepts of science support multi-paradigmatic approaches. These do not completely eliminate the 'framing' problem mentioned above, but they ease it by allowing multiple perspectives through different paradigms.

Our position is not a general rejection of multi-paradigmatic approaches. We postulate, however, that research paradigms (and their corresponding methodologies) have to be internally consistent. The analysis and evaluation of the results obtained have to take place within a dedicated methodology (endogenous criticism). This proposition is entirely reconcilable with Mingers' (2001) critical pluralism. He proposes a critical, yet multimethodological, approach that assembles the base of methods needed in the particular research situation, so that one is not restricted to a single paradigm. It will, in any case, be necessary to assign methods to a certain paradigm of inquiry, because without this assignment the results will relapse into the "anything goes" pluralism already mentioned.

Our concept comprises the categories ontology, epistemology, anthropology, and methodology. Similar frameworks have been used, for example, in sociology (Guba, 1990), organization theory (Burrell & Morgan, 1979; Morgan & Smircich, 1980), and information systems research (Iivari, 1991; Orlikowski & Baroudi, 1991).

This concept of "paradigm of inquiry" can be interpreted as a morphological box (Zwicky, 1969) whose possible instances are restricted by several constraints. For example, a naïve ontological realism is not compatible with a radical constructivist epistemology, which in turn is not compatible with a deterministic conception of the human being or with a nomothetic methodology. From the perspective of cognitive science, paradigms of inquiry can also be viewed as frames (Minsky, 1975) or schemata (Rumelhart, 1980), hence as prototypical forms of conceptualization guiding the process of inquiry. The concept of "paradigm of inquiry" is bound to a corresponding theory of culture (e.g., modernism, postmodernism), since the latter provides the holistic background of our conception of the world (Weltanschauung). With the advent of postmodern organization theories, the impact of theories of culture on paradigms of inquiry has become obvious in a discipline closely related to IS research.
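To make the morphological-box reading of the framework more concrete, the following minimal sketch (in Python) enumerates combinations of the four categories and filters out those that violate a set of compatibility constraints. The category values and the constraint pairs used here are simplified assumptions of our own, chosen only to illustrate the mechanics; they are not an exhaustive rendering of the framework or of the philosophical positions discussed below.

```python
from itertools import product

# Categories of the framework (Figure 1), each with a few illustrative values.
# NOTE: the values and the incompatibility pairs below are simplified
# assumptions for illustration only.
CATEGORIES = {
    "ontology":     ["naive realism", "nominalism", "idealism"],
    "epistemology": ["objectivism", "radical constructivism", "social constructivism"],
    "anthropology": ["determinism", "voluntarism"],
    "methodology":  ["nomothetic", "ideographic"],
}

# Constraints of the kind mentioned in the text, as unordered incompatible pairs.
INCOMPATIBLE = [
    {"naive realism", "radical constructivism"},
    {"radical constructivism", "determinism"},
    {"radical constructivism", "nomothetic"},
]

def is_consistent(choice: dict) -> bool:
    """A candidate paradigm of inquiry is consistent if it violates no constraint."""
    chosen = set(choice.values())
    return not any(pair <= chosen for pair in INCOMPATIBLE)

names = list(CATEGORIES)
all_combinations = [dict(zip(names, combo)) for combo in product(*CATEGORIES.values())]
consistent = [c for c in all_combinations if is_consistent(c)]

print(f"{len(consistent)} of {len(all_combinations)} combinations are internally consistent")
```

Read this way, the framework makes explicit which positions can be combined into a coherent paradigm of inquiry and which mixtures would relapse into the "anything goes" pluralism criticized above.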

In the next paragraphs we provide a non-exhaustive description of the issues underlying the categories chosen to constitute the framework depicted in Figure 1. Since there is a host of definitions available in the literature, our description also serves as an explication of our understanding of these categories.

Ontology
Relevant questions: What is the nature of the "knowable"? What is the nature of "reality"? Is reality external to the individual, imposing itself on individual consciousness, or a product of individual cognition?
Debate(s): Realism vs. Nominalism; Realism vs. Idealism; Realism vs. Anti-Realism

Epistemology
Relevant questions: What is the nature of the relationship between the knower (the inquirer) and the known (the knowable)? What are the grounds of knowledge? What is truth?
Debate(s): Positivism vs. Anti-Positivism; Objectivism vs. Subjectivism; Empiricism vs. Rationalism

Anthropology
Relevant questions: What is "human nature"? What is the relationship between human beings and their environment?
Debate(s): Determinism vs. Voluntarism; Primacy of the Individual vs. Primacy of the Society

Methodology
Relevant questions: How should the inquirer go about finding out knowledge?
Debate(s): Nomothetic vs. Ideographic


Figure 1: Framework for the Conceptualization of Paradigms of Inquiry

Ontology

In the 17th century, the concept of "ontology" was developed to describe a branch of philosophy dealing with the nature of being, even though early Greek philosophers had already been concerned with ontological problems.

In the Middle Ages, the ontological debate was concerned with the nature of universals (as opposed to particulars) and is characterized by the two distinct positions of realism and nominalism. Realists believe that universals have an existence independent of the mind. For nominalists, universals are just signs or names referring to single objects or sets of objects. Thus, they are dependent on the mind and have no existence of their own. Generalizing the two positions, one can say that realists assume the existence of a structured world, with the structure being independent of the mind. Nominalists, on the other hand, deny the existence of a structured world. Structure, and the names referring to it, are the results of cognitive processes and thus depend on the mind.

Ontology changed dramatically with Kant. According to his criticism, ontology "presumptuously claims to supply, in systematic doctrinal form, synthetic a priori knowledge of things in general" (Kant, 1787, B303). In his argument, he opposes realism to idealism: realism means that we perceive objects whose existence and nature are independent of our perceptions, whereas idealism means that they are dependent on our perception. Unsatisfied with both positions, he argues:

"Thoughts without content are empty, intuitions without concepts are blind. It is, therefore, just as necessary to make our concepts sensible, that is, to add the object to them in intuition, as to make our intuitions intelligible, that is, to bring them under concepts. These two powers or capacities cannot exchange their functions" (Kant, 1787, B75).

Kant thus reversed the classical view of cognition, a reversal known as the "Copernican turn" in philosophy. Instead of understanding knowledge as conforming to objects, we have to understand objects as conforming to the conditions of the possibility of our knowing. Thus, human knowledge is limited to appearances; we are not able to know the "things-in-themselves."

For Kant (1787), the "scandal of philosophy" is that no proof has yet been given of the "existence of things outside of us" (Bxl). For Heidegger (1962, p. 249), by contrast, the scandal is "not that this proof has yet to be given, but that such proofs are expected and attempted again and again," thus making the question of the reality of the external world a pseudo-problem.

Heidegger (1962) conceptualizes "being" as "Dasein," which means essentially "being-in-the-world." He understands "world," similar to Husserl's (1986) concept of the "life-world," as the "everyday world" which is disclosed to us by pre-scientific experiences. These experiences are gained from using entities in the world as "tools" that are ready to hand. Only when tools fail do we engage in theoretical cognition, thereby entering the scientific world. When we acquire knowledge about entities in the world, we always see them in a "context of significance" in which these entities are related to each other and gain their meaning. Thus, entities do not have an existence of their own, but are always dependent on the existence of other entities.

Epistemology

Early Greek philosophers were already concerned with the nature of knowledge, but the concept of "epistemology," as we understand it today, was developed only in the second half of the 19th century to describe not only a theory of knowledge but also a theory of science (Köhnke, 1986, p. 58).

The central question of epistemology concerns the relation between the knower and the knowable. This question is based upon an assumption canonized by Descartes (1998) in his distinction between "res extensa" (i.e., body) and "res cogitans" (i.e., mind).

Another question of epistemology concerns the status of knowledge. Going back to Plato's distinction between knowledge and belief, knowledge has been characterized as "justified true belief." This characterization leads to the question: how can we justify our beliefs in order to gain knowledge? The central criterion for this justification is truth, which in turn raises the question of what truth is.

There are two distinct positions in epistemology: empiricism and rationalism. Empiricism is based on the assumption that all knowledge is derived from our experiences. As we observe the world around us, information about it is perceived as sense-data by our senses and imprinted on the mind. The mind is therefore understood as a "tabula rasa" or "white paper" (Locke, 1959, II.i.2).

According to empiricists, the criterion for truth, and thus for knowledge in general, is described by the "correspondence theory of truth," which holds that we have gained true knowledge about the world if our perceptions are in correspondence with it. In other words, propositions about the world are true only if they correspond with the facts. 'Correct' perceptions can therefore be understood as perfect mappings of some domain of the world into our minds. A problematic issue for empiricism is thus that it cannot explain the knowledge we have of things that cannot be found in experience (e.g., quarks).

Rationalists question the idea of sense-data being the source of knowledge. For them, knowledge can only be derived from reason. In its most radical form, rationalism is based on Plato's concept of "innate ideas." In his famous cave analogy, Plato argues that, while observing the world, we actually perceive only shadows of the real world. Only with reason are we able to recollect the innate ideas, which provide us with true knowledge about the real world.

Dismissing the concept of "innate ideas," rationalists are confronted with the problem of finding other criteria of true knowledge. With his famous dictum "cogito ergo sum" (I think, therefore I am), Descartes (1998) provides us with such a criterion: reason itself. He argues that even the most radical skeptic has to admit that being skeptical about something requires the existence of reason. However, denying sense-data as a source of knowledge has to be regarded as a questionable proposition, since it cannot explain the existence of knowledge that cannot be gained without experiences mediated through our senses (e.g., language).

The argument provided by Kant in his criticism of ontology, stating that "thoughts without content are empty, intuitions without concepts are blind" (op. cit.), is thus equally true for epistemology. In order to gain knowledge about the world, we have to have some access to the world and some faculty allowing us to structure our experiences. In Kant's (1787) words: "The understanding can intuit nothing, the senses can think nothing. Only through their union can knowledge arise" (B75).

With advances in neurophysiology and cognitive psychology, a new epistemology called "radical constructivism" (Glasersfeld, 1996; Watzlawick, 1984) has gained popularity, also in information systems research (Floyd, Züllinghoven, Budde & Keil-Slawik, 1992). Radical constructivists argue that we cannot acquire knowledge about our environment, since sensory input does not inform us about the qualitative nature of the stimuli. Our nervous system and mind have access only to information about perturbations in our sensory organs. Consequently, all our knowledge about the 'world' is fictitious, a construction of our mind. The concept of "truth" has therefore been dismissed and substituted by the concept of "viability" (Glasersfeld, 1996). Mental constructions are considered knowledge if they help us to deal with our environment. It does not matter whether knowledge corresponds with the environment or not; even if it did, we would not be able to recognize it.

Whereas radical constructivism is an individualistic approach, social constructivism (Berger & Luckmann, 1966), developed in the sociology of knowledge and thus not necessarily to be considered an epistemology, explains knowledge as a result of social interaction. There are no objective criteria for knowledge. Rather, the concept of "truth" is based on the consensus theory of truth, which holds that within a certain community a proposition is considered true if a consensus about its truth has been attained.

Anthropology

The concept of "anthropology" in the sense of a science of human nature was developed in the 16th century. Casmann (1594) describes human nature as dualistic: "Anthropologia est doctrina humanae naturae. Humana natura est geminae naturae mundanae, spiritualis et corporeae, in unum hyphistamenon unitae, particeps essentia."

The early understanding of anthropology was that of a natural science whose goal was to explain human nature on the basis of biological or physiological concepts. But philosophers soon discovered that reason, the feature distinguishing human nature from the nature of all other beings, could not be explained in terms of natural science. The resulting distinction between humans' rational and empirical nature has led to the classical body-mind dualism, which has dominated anthropology ever since.

Kant, emphasizing the complementarity of the empirical and rational nature of humans, introduced anthropology as a philosophical problem, but only in the early 20th century, with the writings of Scheler (1928), Plessner (1928) and Gehlen (1940), was philosophical anthropology inaugurated as an independent discipline.

Already in Plato's "Republic" and Aristotle's "Politics," we find an understanding of human nature which goes well beyond the classical body-mind dualism and stands contrary to the still dominant individualistic anthropology: the understanding of human nature as a result of man's cultural embeddedness. Social and cultural anthropology thus try to explain human nature on the basis of man's always already given social and cultural context. This effort is based on the assumption that a human being without social and cultural context is not imaginable; a solitary human being would not be a human being.

From a methodological point of view, individualistic anthropology seeks to explain human nature on the basis of the sole individual, whereas social and cultural anthropology seek to explain human nature by considering culture and society as primary sources of human nature.

In the early 20th century, a new approach towards the understanding of human nature was developed. Cassirer (1967, p. 23) refers to the concept of the "functional circle" (Funktionskreis) (Uexküll & Kriszat, 1934), which describes the anatomical basis of organisms' interaction with their environment as two coupled systems: the "Merknetz" (receptor system) and the "Wirknetz" (effector system). He argues that humans are different in a certain respect: the receptor and effector systems are extended by a symbolic system, which adds a new dimension of reality. Similarly, Vygotski (1978) writes in the early 1930s: "The use of auxiliary signs breaks up the fusion of the sensory field and the motor system and thus makes new kinds of behavior possible" (p. 35). Man does not live in a merely physical universe, but in a symbolic universe. He cannot "confront reality immediately; he cannot see it, as it were, face to face. Physical reality seems to recede in proportion as man's symbolic activity advances" (Cassirer, 1967, p. 25). The resulting understanding of human nature is thus based on the understanding of man as "animal symbolicum" (Cassirer, 1967, p. 26).

Methodology

The fact that we have to follow a certain path to reach a destination is the idea behind the concept of "method." In other words, a (scientific) method is a certain procedure we have to follow in order to pursue a certain goal (usually to gain knowledge), and methodology is the science concerned with methods. Its interest lies in the development of descriptive or prescriptive theories about methods, and in the determination of the requirements that have to be fulfilled in order to make a certain method applicable in certain situations. The latter task is especially important for the sciences, because the concept of the methodic foundation of knowledge developed by Plato can still be regarded as the most crucial criterion for scientificity. Prescriptive methodology therefore provides frameworks for scientific work against which the quality of research can be judged. Such a framework does not only comprise propositions about the methods themselves, but also about the underlying ontological, epistemological, and anthropological presuppositions, as well as their implications for using the methods. Methodology can thus be regarded as encompassing our concept of "paradigm of inquiry."

Within methodology we can distinguish two different strands: the nomothetic and the ideographic. Nomothetic methodology is the foundation of quantitative research and of the natural sciences. It is explanatory, focused on 'hard' methods, deductive, ahistoric, and seeks to discover universal (natural) laws. The latter implies the assumption of their objective existence and that the researcher has no impact on the research findings. Ideographic methodology, on the other hand, is the foundation of qualitative research and is thus mostly applied in the social sciences and humanities. It is explorative, focused on 'soft' methods, inductive, historic, and seeks to understand and to make sense of the phenomena under investigation. This implies, for example, that the experiences, knowledge, and values held by the researcher have an impact on the research findings.

From the beginnings of science in the history of mankind, science was characterized by monomethodological approaches: there was only one single way to gain knowledge. In his now famous book "Against Method," Feyerabend (1972) questions this monism and argues in favor of methodological pluralism. With the rise of postmodern concepts of science, drawing heavily on Feyerabend's ideas, methodic and methodological pluralism has gained increasing popularity, especially in the social sciences (Guba, 1990; Hassard, 1993; Rosenau, 1991). As we have already stated above, multimethodological approaches are, due to their foundation in incommensurable ontological, epistemological, and anthropological (pre)suppositions, prone to the fallacies of an unreflective "anything goes" attitude. This makes it necessary to develop criteria for the scientificity of multimethodological approaches. An overview of issues of multimethodology has been given by Mingers and Gill (1997).


