Mixed Method and Quantitative Research Methodologies to Advise Business Executives

Introduction
Research is the scaffolding of any industry. The capability to examine vital areas within a given sector enables better-informed decisions, which are paramount to successful competition. Researchers and consultants are realizing how critical it is to select a methodology suited to gathering the most valuable data for an organization's needs.

This report examines various research methods and explores the use of quantitative and mixed methods research methodologies as applied to the field of technology consultation and advising business executives. In addition, the differences between qualitative and quantitative research are discussed. The report also explores other classifications of research paradigms relevant to technology consultation, as well as new applications of quantitative and mixed methods research within this topic area.

The focal point of investigating technology consultation and advising senior business executives may initially appear to fall within the realm of research methodologies considered quantitative. Today, however, many technology and consultation professional publications increasingly note the importance and application of qualitative approaches, most notably when the research is aimed at understanding attitudes, human interactions and perceptions, product preferences, and other areas of business that are considered subjective (Green & Preston, 2005). In addition to quantitative methodologies, this report investigates the manner in which mixed methods research, which contains elements of both quantitative and qualitative approaches, is employed within the same broad area of technology consultation. This report also examines issues surrounding the use of technology consultants in a business setting and how the technology consultant employs various research methodologies to generate and support strategic business decisions.

The field of technology consultation has no formal anchor or starting point within the extant literature. Because the field is generally considered a unique combination of management consulting and practitioner-based technology expertise, it is often referred to simply as technology consultation (Canback, 1998; Carlson, 2004; Schein, 1999). While many researchers have applied the term consulting to a wide range of business activities, such as specialized outsourcing and requesting external assistance, this report considers all activities in which advice regarding technology is sought externally as "technology consultation". Furthermore, the terms "researcher" and "consultant" are used synonymously when discussing research topics applied to a business environment. The nonspecific nature of technology consultation and its unique combination of technology and business acumen has led to the presentation of many models within the consulting paradigm (Canback, 1998; Shays, 2003; Stipe & Panko, 1996).

Providing expert consultation that incorporates technology-based data and operational business requirements is the impetus of the technology consulting field (Clevenson, Guimaraes, & Yoon, 1998). As this field continues to mature, it is probable, based on current trends and the increased organizational dependency on the technology consultant, that the subject matter will be incorporated into both qualitative and quantitative studies in the future.

Logical positivism presents a cause-and-effect model, generally associated with quantitative research methodologies, that has demonstrated notable importance within the literature on technology consultation and on understanding the elements behind IT-related initiative failures and successes (Kappelman, McKeeman & Zhang, 2006). In contrast, while important and contributory to the field of technology consulting overall, qualitative research methods have yet to gain a significant foothold in this area, owing to the strong influence of the technology practitioner (ITGI, 2003).
This report will examine where both research paradigms [quantitative and mixed method] are applied within the area of technology consultation and illustrate the factors that lead the researcher and technology consultant to provide objective and supportable consultation to business executives.

Evolution of Research Methods
With the beginning of scientific research came the understanding that a single type of method was to be employed for all subjects; to depart from the accepted method by incorporating other approaches was to call into question the findings of the study and cast doubt on the researcher's abilities. Times have since changed markedly: in the twenty-first century a mixed method approach is not only acceptable alongside the sole application of quantitative methods, but is increasingly becoming the modern researcher's methodology of choice (Watts, 2000).
As the fields of scientific research evolved to become more rigorous and methodical, researchers began to discuss critical areas within many studies that appeared to go unaddressed because of weaknesses in the methodologies of that era. These areas of research were first distinguished as the humanities, represented by history and the social sciences in the form of ethnography, and the natural sciences, in realistic and descriptive forms. Through continued evolution scientific research took two major paths: (a) empirical or factual sciences, and (b) non-empirical or more formal and theoretical bodies of knowledge. Empirical research draws conclusions from the gathered data and employs deduction as the main form of reasoning. Deduction is a process by which, given the properties of a population, the researcher is able to infer the properties of a sample of that population. In contrast, non-empirical research draws conclusions using induction as the main form of reasoning. Induction, within the context of scientific research, is a process by which the researcher, given the properties of a sample of a population, infers the properties of the population as a whole.

This distinction between the two forms of reasoning [deduction and induction] facilitated the development of two main approaches to scientific research: quantitative or empirical research and investigation, and qualitative research using exploration and phenomenological design. Robson (2002) also suggests that quantitative research is generally associated with deductive reasoning. This results from the perspective that hypotheses within quantitative studies are based on previously known facts, which are then tested and accepted or rejected based on the results of the analysis. Neuman (2003) suggests that qualitative measurement is inductive with the goal being to allow the researcher to note specific observations that lead to conceptual definitions and concepts. Neuman further adds that using a qualitative approach enables the researcher to iterate the research process permitting the study to unfold, while at the same time allowing the researcher to refine and guide its direction based on the most recent data received.
Campbell and Fiske first introduced mixed methods research, which combines qualitative and quantitative approaches, in the late 1950s (Rocco, Bliss, Gallagher, & Perez-Prado, 2003). The researchers suggested combining qualitative and quantitative research methods so as to accurately measure a psychological trait, and wanted to ensure that any variance was attributable to the actual trait rather than to the method itself (Creswell, 1994; Rocco et al., 2003). Greenhalgh (1997) supports this approach and further suggests that the legitimacy of qualitative methods is enhanced through the incorporation of quantitative research methods, otherwise known as triangulation.

Triangulation evolved to include using multiple data collection and analysis methods, multiple data sources, multiple analysts, and multiple theories or perspectives. Other researchers clarified the notion that the purpose of triangulation is to test for consistency rather than to achieve the same result using different data sources or inquiry approaches. Inconsistencies are seen as an opportunity for developing further insight into relationships between the methods chosen and the phenomenon studied, thus allowing researchers and the readers of their reports, alike, to improve their understanding of that phenomenon. (Rocco et al., 2003, p. 20)

Straub, Gefen & Boudreau (2004) support the notion of triangulating research efforts and further suggest that combining the positivist and non-positivist viewpoints can lead to a more viable research approach.

The formal research process can vary depending on the environment and the phenomenon under observation; however, many researchers agree that conducting formal research generally includes nine steps: (a) formulation of the research objectives, (b) development of an in-depth understanding of the applicable theories, (c) development of an analytical framework, (d) development of hypotheses and grid analysis, (e) identification of the participants and aims, (f) collection of experimental material, implementation, and operationalization, (g) data sampling, (h) analysis, and (i) drawing supportable conclusions based on the findings of the study (Sogunro, 2002). The characteristics and distinctions among these research steps and paradigms are illustrated in the following sections.

Quantitative and Qualitative Analysis
Quantitative research starts with a notion that the researcher conveys via a hypothesis. The researcher, conducting specific measurements, develops data from which a conclusion is then drawn by deduction. Examining the strengths of this approach, Greenhalgh (1997) suggests repeatability and reliability are the primary benefits afforded to the researcher: measurements taken at one point, under closely approximated conditions, yield very similar results during follow-up or future studies. However, because quantitative research restricts certain types of analysis, qualitative research is recognized as the methodology more appropriate for studying human relations (Trochim, 2002).

The objective of quantitative research is to develop and employ mathematical models, theories and hypotheses pertaining to natural phenomena. The process of measurement is central to quantitative research because it provides the fundamental connection between empirical observation and mathematical expression of quantitative relationships. (http://en.wikipedia.org/wiki/Quantitative_research)
Given that human relations is defined as interaction among humans, it becomes clear why quantitative research is unsuitable for many of the research topics and variables associated with developing the necessary depth of understanding within this and other applicable areas [advising executives and technology consulting] (Trochim, 2002).

Qualitative and quantitative methodologies are vastly different approaches to conducting scientific research. Robson (2002) supports this notion and suggests, “quantitative research is associated with the testing of theories, whilst qualitative research is associated with the generation of theories” (p. 46). According to Winter (2000), in quantitative research the researcher is isolated from the participants of the research and plays no role requiring direct contact. Scientific rigor and objectivity, being the mainstays of quantitative research, have led the researcher to employ methods that minimize the injection of bias and the researcher’s interactions with participants of the study.

Quantitative research methodology stems from the theory that science can provide objective declarations regarding the acceptance or rejection of hypotheses, which are based on and derived from deductive reasoning (Bolree, 2003; Creswell, 2000; Creswell, 2003). The theoretical underpinnings of quantitative research can be traced to the early twentieth century, when quantitative methods were narrowly viewed as the only approach capable of providing truth or real-world substantive results. Today, quantitative research is well embedded in the area of technology consulting in terms of developing financial models, but it does not enjoy the same popularity in other areas of technology consulting, including user acceptance, user interface design, and other human factors associated with technology implementations (Brynjolfsson, Malone, Gurbaxani, & Kambil, 1994).
According to Bolree (2003), the fundamental idea of logical positivism (frequently referred to as quantitative science) is that all knowledge has a basis in empirical observation, which is then supported through the application of mathematical and statistical processes. Bolree (2003) also offers the idea of the verification principle, which asserts that meaning is derived from theoretical declarations only if they can be tested empirically.

Straub et al. (2004) reiterate that deduction is the main concept [mindset] used by the positivist researcher and provide four steps employed during the deductive process:
1. Testing internal consistency, i.e., verifying that there are no internal contradictions.
2. Distinguishing between the logical basics of theory and its empirical, testable, predictions.
3. Comparison with existing theory, showing that the new theory advances knowledge.
4. Empirical testing aimed at falsifying the theory with data.
(http://dstraub.cis.gsu.edu:88/quant/)
In contrast to the quantitative researcher, Straub et al. (2004) assert that the qualitative researcher views the world as a social construction that varies depending on the observer's and interpreter's frame of mind and the specific phenomenon under observation. In this regard reality is viewed as highly subjective. Coates (2004) supports this perspective of qualitative inquiry and adds that the qualitative approach, while not having a universally accepted definition, is a process uniquely established to investigate social issues; in the case of business and technology, the research problems associated with these combined areas are often complex.

One of the most widely used quantitative research approaches, next to descriptive statistics, is hypothesis testing (Creswell, 2003). Testing hypotheses involves the researcher drawing from the existing body of knowledge a proposition about a cause-and-effect relationship within some phenomenon. For top management decision cases, such a hypothesis can concern the dependence of company profitability on the demand for different company products, fixed costs and changes in the company's variable costs, or the expansion of company markets (Cooper & Schindler, 2006). According to Creswell (2003), hypothesis formulation is one of the more complex tasks required of the researcher and is not purely a matter of utilizing a statistics and/or quantitative research expert. This being the case, it becomes obvious why business managers and executives must employ expert researchers [consultants] when seeking answers to critical organizational problems.
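To make this concrete, the following Python sketch tests a hypothesized dependence of company profitability on product demand using a simple linear regression. The quarterly figures, variable names, and 0.05 significance threshold are invented for illustration; they are not drawn from the studies cited above, and a real engagement would use the client's data and a fuller model.

```python
# Minimal sketch of a quantitative hypothesis test, assuming hypothetical
# quarterly data on product demand and profitability.
from scipy import stats

demand = [120, 135, 150, 160, 170, 185, 200, 210]          # units sold per quarter
profit = [11.2, 12.1, 13.5, 13.9, 15.0, 16.2, 17.8, 18.1]  # profit in $ millions

# H0: profitability does not depend on demand (slope = 0)
# H1: profitability changes with demand (slope != 0)
result = stats.linregress(demand, profit)

print(f"slope = {result.slope:.3f}, r^2 = {result.rvalue ** 2:.3f}, p = {result.pvalue:.4f}")
if result.pvalue < 0.05:
    print("Reject H0: the data support a dependence of profitability on demand.")
else:
    print("Fail to reject H0: no statistically significant dependence detected.")
```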

The perspective that usefulness and truth are derived from quantitative methods is readily observed within the area of technology consultation; however, there are those who debate the complete dependence on this paradigm for the creation of new knowledge and urge a more human-oriented approach (Johnson & Onwuegbuzie, 2004). Other researchers and scholars in the field propose that consulting theory and practice is ever changing, and, coupled with the added frequency of change due to technology, one approach to conducting research is not sufficient to cover the dynamics of the two areas combined (Canback, 1998; Lewis, 2005). Only under the most exceptional circumstances do the areas of consulting and technology adoption or implementation apply persistent rules governing research and theoretical application. According to Lewis (2005), within the area of technology consulting, new practices, theories, and technological innovations continue to rapidly replace the older methods of conducting and applying the results of scholarly research.

Mixed Methods Analysis
Johnson and Onwuegbuzie (2004) eloquently define mixed methods research as "the class of research where the researcher mixes or combines quantitative and qualitative research techniques, methods, approaches, concepts or language into a single study" (p. 12). When combined skillfully by expert researchers, the two approaches together have a greater ability to reveal relevant truth and to enhance the accuracy and usefulness of the findings.

The goal of mixed methods research is not to replace either of these approaches but rather to draw from the strengths and minimize the weaknesses of both in a single research study and across related studies. If you visualize a continuum with qualitative research anchored at one pole and quantitative research anchored at the other, mixed methods research covers the large set of points in the middle area. If one prefers to think categorically, mixed methods research sits in a new third chair, with qualitative research sitting on the left side and quantitative research sitting on the right side. (Johnson & Onwuegbuzie, 2004, pp. 14-15)

According to Minger (2001), many traditional researchers consider the results from qualitative research to be less robust, and therefore less reliable, than those provided by more traditional quantitative methods. Adcock & Collier (2001) posit that data validity and accuracy of measurement are two paramount aspects of conducting scholarly research. Much of the debate stems from the manner in which the researcher is required to interact with participants of the study. Many traditional researchers hold that, as a consequence of this interaction, the results of the study are not sufficiently objective and cannot withstand scientific scrutiny (Kuhn, 1996). Peebles (2006) argues that many studies within the field of management consulting attempt to camouflage this potential flaw with the research perspective of logical positivism, which can provide a veil of objectivity. In reference to technology consulting, a phenomenological perspective (equated here with qualitative inquiry) could examine how business executives view the growing organizational dependency on the consultant, while a logical positivist perspective (equated here with quantitative measurement) could test a hypothesis about whether consultants are employed more in large organizations than in small ones. A combination of these methods could well provide results that reflect an entirely new way of exploring technology consulting, taking into account the consulting environment existing today (Johnson & Turner, 2003).

Regarding mixed methods, Greenhalgh (1997) asserts that mixed methodologies "illustrate a commendable sensitivity to the richness and variability of the subject matter" (p. 740). It is therefore logical to assume that mixed method research is capable of providing richer perspectives on the combined technology and consulting paradigms. It should be noted that it is not uncommon to discover articles within the extant literature that are political or prejudicial in nature, or that favor one viewpoint over another without accompanying scholarly evidence. While not an absolute, this has the potential to increase the real or perceived level of bias, and it is this real or perceived increase in bias that limits a study's scientific application and usability within the consulting field.

Quantitative and Mixed Methods Comparison
As previously mentioned, quantitative research has been the venerable convention for generating empirical knowledge in many sciences. Within organizations and business environments, quantitative research methods have evolved from the traditional application of evaluating quality control processes to utilizing survey instruments to measure subjective issues such as team performance, organizational change, and societal factors influencing the organization (Price, 1997). The mathematical and statistical processes used to explore quantitative data afford the researcher a structure from which a comprehensive analysis can be conducted. Within the realm of scientific research, the rationale for studying a particular phenomenon may include developing a richer perspective regarding causality or specific relationships among the data, general exploration, description, or prediction (Cooper & Schindler, 2003). Ultimately, even with this wide range of reasons to study a phenomenon, the selection of the appropriate research methodology remains paramount. Additionally, Winter (2000) notes that all aspects of gathering data and conducting formal research bear directly on "how the results of the research are to be internally or externally generalizable" (p. 7).
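As a small illustration of the survey-based measurement described above, the sketch below scores a hypothetical five-item team-performance scale and computes Cronbach's alpha as a basic reliability check. The items, responses, and the commonly cited 0.70 rule of thumb are assumptions for demonstration only, not results from the cited literature.

```python
# Minimal sketch, assuming six respondents answering a five-item Likert scale
# (1-5) intended to measure a subjective construct such as team performance.
import numpy as np

responses = np.array([
    [4, 5, 4, 4, 5],
    [3, 4, 3, 4, 4],
    [5, 5, 4, 5, 5],
    [2, 3, 2, 3, 3],
    [4, 4, 4, 5, 4],
    [3, 3, 4, 3, 4],
])

composite = responses.mean(axis=1)             # each respondent's scale score
k = responses.shape[1]                         # number of items
item_var = responses.var(axis=0, ddof=1)       # sample variance of each item
total_var = responses.sum(axis=1).var(ddof=1)  # variance of the summed scale

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total)
alpha = (k / (k - 1)) * (1 - item_var.sum() / total_var)

print(f"mean team-performance score = {composite.mean():.2f}")
print(f"Cronbach's alpha = {alpha:.2f} (values above roughly 0.70 are often treated as acceptable)")
```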

Research that combines methods, unlike a purely quantitative or qualitative approach, offers unique benefits beyond merely providing a veil of objectivity. Regarding mixed method research, Rocco et al. (2003) contend that "… mixed methods research allows for the exploratory process beginning with empirical evidence of the particular process and leads to a level of abstracting/theorizing/generalizing and the confirmatory process of hypothesis testing of theories" (p. 21).

Proponents of mixed method research apply triangulation as a means to control the limitations and weaknesses associated with the more traditional quantitative and qualitative research approaches (Hall & Rist, 1999). Abusabha & Woelfel (2003) provide three conditions under which mixed methods research is appropriate: "when all data have both an objective and a subject[ive] component[, when] the researcher requires cross-validation of the results[, and when] the researcher seeks to cancel out, somewhat, the corresponding weaknesses associated with pure qualitative and quantitative approaches" (p. 569).
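A rough sketch of triangulation as a consistency check, in the spirit of the passage above: the snippet compares a quantitative survey score with the frequency of a positive theme coded from interviews, department by department. The departments, scores, and theme counts are hypothetical, and the rank correlation is only one of many ways such a cross-check could be made.

```python
# Hypothetical triangulation check: do survey scores and coded interview themes
# point in the same direction across departments?
from scipy import stats

survey_score = {"Finance": 3.2, "IT": 4.1, "Sales": 2.8, "HR": 3.9, "Ops": 3.5}
positive_theme_mentions = {"Finance": 4, "IT": 9, "Sales": 2, "HR": 8, "Ops": 5}

departments = sorted(survey_score)
scores = [survey_score[d] for d in departments]
mentions = [positive_theme_mentions[d] for d in departments]

rho, p = stats.spearmanr(scores, mentions)
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")
# A strong positive rho suggests the two data sources are consistent; a weak or
# negative rho flags an inconsistency worth exploring further, which matches the
# view of triangulation as a test for consistency rather than identical results.
```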

Conclusion
Providing sound consultation to business executives is paramount within the technology-consulting arena. The ability to identify quality information and to recognize reliable business research on which high-risk decisions can be based provides the foundation for making sound, well-informed business decisions (Cooper & Schindler, 2006).

Technology consultants [researchers] can have a positive influence at the strategic and tactical levels. Many consultants conduct extensive secondary research for their clients. As a result, consultants often have a significant influence on research design, in both customer research and the selection of proprietary models (Cooper & Schindler, 2006). In cases where the consultant does not personally conduct the data collection, they are often involved in interpreting the results (Cooper & Schindler, 2006; Robson, 2002). With a variety of factors to consider, such as organizational size, research subject, funding, timeframe, and the overall goals of the research, some consultants elect to conduct both qualitative studies (focus groups and expert interviews) and quantitative studies (survey, observation, and descriptive) on knowledge, attitudes, opinions, and motivations as they seek new opportunities or solutions to their clients' problems.

Regarding the use of quantitative and mixed methods research for business applications, the researcher must be able to determine what constitutes good research practice to ensure the study generates dependable data through professionally applied procedures, which in turn can be used for decision making. As previously stated, the quantitative research approach attempts to make precise measurements of reality; in business research this can be consumer behaviour, employee knowledge, or market analysis. Furthermore, quantitative methodologies provide supporting rationale and justification for business-oriented questions where statistical analysis and numerical manipulation are required. While there are many quantitative research methodologies, the survey is considered the most widely used for data gathering within a business setting (Cooper & Schindler, 2006). The results of survey research often assist organizations in making critical business venture and investment decisions, as well as support many quality assurance and organizational performance programs (Robson, 2002).

The purpose of mixed method research within the realm of enabling management to make better decisions is twofold: to provide an additional layer of validity to qualitative studies, and to allow the researcher and research sponsor to become actively involved in data collection and data interpretation. Mixed method research allows the researcher to become immersed in the phenomenon under study, so that any knowledge gained can be used to adjust the data extracted from the next participant (Cooper & Schindler, 2006). Data derived from a quantitative study often consist of responses that are coded and categorized for the purpose of conducting statistical analyses and numerical manipulation. In slight contrast, mixed method research allows the researcher and consultant not only to conduct statistical analyses but also to convey subjective thoughts and feelings regarding events, situations, and interactions. This leads to more robust findings that could potentially be more wide-ranging and applicable to the business phenomenon under study.
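As a brief sketch of how coded responses and qualitative material can be analyzed together, the snippet below assigns hypothetical open-ended comments to themes with a simple keyword scheme and then summarizes a numeric rating by theme. The data, the keyword rules, and the theme labels are all invented for illustration; real qualitative coding is considerably more careful and usually performed by human analysts.

```python
# Minimal mixed-methods sketch: code open-ended answers into themes, then
# summarize a quantitative rating for each theme. Hypothetical data throughout.
from collections import defaultdict
from statistics import mean

responses = [
    {"rating": 2, "comment": "training materials were thin and hard to follow"},
    {"rating": 4, "comment": "training sessions were helpful and well paced"},
    {"rating": 3, "comment": "support was slow to respond after go-live"},
    {"rating": 5, "comment": "great support from the consulting team"},
    {"rating": 2, "comment": "rushed schedule left no time for testing"},
]

def code_theme(comment: str) -> str:
    """Very simple keyword-based coding scheme (illustrative only)."""
    if "train" in comment:
        return "training"
    if "support" in comment:
        return "support"
    if "rush" in comment:
        return "schedule pressure"
    return "other"

ratings_by_theme = defaultdict(list)
for r in responses:
    ratings_by_theme[code_theme(r["comment"])].append(r["rating"])

for theme, ratings in ratings_by_theme.items():
    print(f"{theme}: n = {len(ratings)}, mean rating = {mean(ratings):.1f}")
```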

Executives who utilize mixed methods of both qualitative and quantitative approaches usually seek a richer understanding of reality. Greenhalgh (1997) suggests that researchers who employ mixed method research, as well as the business executives relying on the findings of the study, "realize the necessity to study participants in their natural setting so as to make sense of phenomena in terms of understanding the unique meanings and complexities associated with human behaviour" (p. 740).

The importance of skilled professionals conducting and interpreting research results, either quantitative or mixed method, cannot be overstated. In most business research scenarios a formal methodology is based on the purpose of the research, its schedule, its budget, the topics being studied, and the researcher’s skills and preferences. Without skilled researchers to oversee and interpret formal research efforts and results, even the best applied methodologies and most appropriate studies are predestined for failure (Cooper & Schindler, 2006).

There are many tools available to the consultant for the unique application of both quantitative and mixed method research within the business setting. For example, consultants can employ data analysis packages that use quantitative historical data drawn from business integration and business process reengineering efforts. These data analysis tools, along with expert interviews, can be leveraged together to analyze specific product markets, projections of future product demand, outsourcing selection, partnerships, consumer trends, and other product life cycle management issues. These are examples of important business decisions, with varying degrees of strategic impact, in which a combination of research approaches might be helpful.
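To ground the demand-projection example, the following sketch fits a simple linear trend to hypothetical historical demand and projects the next two periods. The figures, the choice of a linear model, and the use of NumPy are assumptions for illustration; a consultant would normally evaluate more suitable time-series methods against the client's actual data.

```python
# Illustrative linear-trend projection of product demand from historical data.
import numpy as np

periods = np.arange(1, 9)                                    # eight past quarters
demand = np.array([310, 325, 345, 360, 355, 380, 400, 415])  # hypothetical units sold

slope, intercept = np.polyfit(periods, demand, deg=1)        # least-squares trend line

for future in (9, 10):                                       # project the next two quarters
    forecast = slope * future + intercept
    print(f"quarter {future}: projected demand of about {forecast:.0f} units")
```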

Mixed method research involving human relations, which aligns well with the consulting paradigm, is necessary in helping organizations develop a clearer understanding of matters involving human interaction and organizational transformation (Scott, 2003). Sustaining a continuous awareness of human relations research is crucial to providing the best opportunity for creating a team-oriented environment and increasing individual employee satisfaction and performance (Scott, 2003). According to Storey (1998), without hands-on knowledge of recent findings throughout the field, human resource departments would not be fully capable of effecting positive change in their workers' environment through the application of relevant findings. Storey (1998) further advocates:
From the theoretical perspective the contribution to the organization stemming from human resource management flows naturally from the kind of emphasis now given in mainstream business strategy courses to concepts such as core competences, intangible assets, intellectual capital, organizational capability and knowledge management. (p. 273)

Both quantitative and mixed method research methodologies possess unique strengths and weaknesses, and each has the potential to be employed for top management advisory purposes. The specific disadvantages of the methodologies within each of these two major approaches [mixed method and quantitative] must be considered prior to executing any research study, so as to mitigate potential errors and to ensure the correct alignment of the approach with the intent of the research (Cooper & Schindler, 2006; Dale, 2005). Mixed methods are best employed when senior management desires to understand staff attitudes regarding certain aspects of a study, such as organizational practices or processes, or requires a descriptive response to its business questions. The technology consultant can also use mixed method research to elaborate on quantitative research results, guide the selection of cases in qualitative studies, identify unobserved heterogeneity in quantitative data, develop a richer understanding of statistical findings, discover quality problems with quantitative measurement instruments, and examine the scope of results from a qualitative study (Dale, 2005).

In the context of researching a question posed from a business perspective, quantitative methods continue to prove important. Consultants and researchers often find quantitative methods valuable for the creation of analytical and discrete models that illustrate, solve, or predict problems relating to economics and finance, quality assurance, and operations (Clevenson, Guimaraes, & Yoon, 1998; Hagedorn, 1982). It is apparent, based on this research and the conclusions and recommendations stemming from the literature review, that under the appropriate conditions both mixed and quantitative methods are suitable for advising the business executive.

References
Abusabha, R., & Woelfel, M.L. (2003). Qualitative versus quantitative methods: Two opposites that make a perfect match. Journal of the American Dietetic Association, 103(5), 566-569.

Adcock, R., & Collier, D. (2001). Measurement validity: A shared standard for qualitative and quantitative research [Electronic version]. American Political Science Review, 95(3), 529-546.

Barcus, S., & Wilkinson, J. (1986). Handbook of management consulting services. New York: McGraw Hill Book Company.

Biech, E. (1999). The business of consulting: The basics and beyond. San Francisco: Jossey-Bass/Pfeiffer.

Biswas, S., & Twitchell, D. (2002). Management consulting: A complete guide to the industry. New York: John Wiley.

Block, P. (1999). Flawless consulting: A guide to getting your expertise used (2nd ed.). San Francisco: Jossey Bass Publishers.

Bower, M. (1982). The forces that launched management consulting are still at work. Journal of Management Consulting, 1(1), 4-6.

Brynjolfsson, E., Malone, T.W., Gurbaxani, V., & Kambil, A. (1994). Does information technology lead to smaller firms? Management Science, 40(12), 1628-1644.

Canback, S. (1998). The logic of management consulting. Journal of Management Consulting, 10(2), 3-11.

Carlson, L. W. (2004, September-October). Using technology foresight to create business value. Research Technology Management, 51-60.

Carlson, S. (2004). A hard eye on what IT buys [Electronic version]. Chronicle of Higher Education, 50(36), A35-A36.

Carr, N.G. (2004). Does IT matter? : Information technology and the corrosion of competitive advantage. Boston: Harvard Business School Press.

Clevenson, A., Guimaraes, T., & Yoon, Y. (1998). Exploring expert system success factors for business process reengineering. Journal of Engineering and Technology Management, 15, 179-199.

Green, A., & Preston, J. (2005). Speaking in tongues – diversity in mixed methods research. International Journal of Social Research Methodology, 8(3), 167-171.

Greenhalgh, T. (1997). Papers that go beyond numbers. British Medical Journal, 315, 740.

Hall, A.L. & Rist, R.C. (1999). Integrating multiple qualitative research methods (or avoiding the precariousness of a one-legged stool) [Electronic version]. Psychology & Marketing, 16(4), 291.

Johnson, R. B., & Onwuegbuzie, A. J. (2004). Mixed methods research: A research paradigm whose time has come. Educational Researchers, 33(7), 14-26.

Johnson, R. B., & Onwuegbuzie, A. J. (2004, April). Validity issues in mixed methods research. Paper presented at the annual meeting of the American Educational Research Association, San Diego, CA.

Johnson, R. B., & Turner, L. A. (2003). Data collection strategies in mixed methods research. In A. Tashakkori, & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioural research (pp. 297-319). Thousand Oaks, CA: Sage.

Kuhn, T. S. (1996). The structure of scientific revolutions. Chicago: The University of Chicago Press.

McEvilley, M. (2002). The essence of information assurance and its implications for the Ada community. Special Interest Group of Ada. Retrieved November 4, 2006, from http://www.acm.org/sigada/.

Minger, J. (2001). Combining IS research methods: Towards a pluralist methodology. Information Systems Research, 12(3), 240-259.

Neuman, W.L. (2003). Social research methods: Qualitative and quantitative approaches. Boston: Allyn & Bacon.

Patton, M.Q. (2002). Qualitative research and evaluation methods. (3rd ed.) Thousand Oaks, CA: Sage.

Pfeffer, J., & Sutton, R. I. (2006, January). Evidence-based management. Harvard Business Review, 84(1), 63-74.

Quantitative research (2006). Retrieved October 20, 2006, from http://en.wikipedia.org/wiki/Quantitative_research.

Robson, C. (2002). Real world research (2nd ed.). Malden, MA: Blackwell Publishing.

Remenyi, D., & Money, A.H. (2000). The effective measurement and management of IT costs and benefits. Oxford, England: Butterworth-Heinemann.

Rocco, T.S., Bliss, L.A., Gallagher, S., & Perez-Prado, A. (2003). Taking the next step: Mixed methods research in organizational systems. Information Technology, Learning and Performance Journal, 21(1), 19-29.

Scott, K. B. (2004, March). An analysis of factors that have influenced the evolution of information assurance from World War I through Vietnam to the present. Unpublished master’s thesis, Department of Systems and Engineering Management, Graduate School of Engineering and Management, Air Force Institute of Technology, Wright Patterson Air Force Base, OH.

Scott, W. (2003). Organizations: Rational, natural, and open systems (5th ed). Upper Saddle River, NJ: Prentice Hall.

Serena Software. (2005). The impact of Sarbanes-Oxley on IT and corporate governance. Retrieved October 29, 2006, from http://techfinder.theinquirer.net/shared/write/vnu/collateral/WTP/10000000248_01655_06186_49041_76942_85573_Sarbox_White_Paper.pdf?ksi=10000011056&ksc=21260749240.

Sogunro, O.A. (2002). Selecting a quantitative or qualitative research methodology: An experience [Electronic version]. Educational Research Quarterly, 26(1), 3-10.

Storey, J. (1998). Strategic human resource management. London: Sage/Open University Business School.

Straub, D., Gefen, D., & Boudreau, M. (2004). The IS World quantitative, positivist research methods website. Retrieved November 16, 2006, from http://dstraub.cis.gsu.edu:88/quant/

Trochim, W.M. (2002). Qualitative measures. Retrieved November 22, 2006, from http://trochim.human.cornell.edu/kb/qual.htm.

Watts, B.L. (2000). Mixed methods make research better. Marketing News 34(5), 16.

Winter, G. (2000). A comparative discussion of the notion of ‘validity’ in qualitative and quantitative research [Electronic version]. The Qualitative Report, 4 (3-4). Retrieved November 10, 2006, from http://www.nova.edu/ssss/QR/QR4-3/winter.html.