Proposal
At the beginning of the 21st century, information and communication technology became an integral part of the corporation. Information technologies help organizations manage and control processes, develop their infrastructure, and change. IT has been the most rapidly developing area of technology in recent years, with developments coming so fast that researchers have difficulty keeping up with them and forming conclusive interpretations of their effects on organizations. The rapid advent of computer applications, the Internet, and other forms of information and communication technology has major implications for organizations and their management, yet it remains difficult to say exactly what effects these technologies have and why. Research on their effects on public organizations, especially until recently, has been scarce. The rationale for this research paper is to investigate and analyze new trends in the application of information systems and their impact on communication, corporate design, structure, and productivity. It is assumed that advances in technology, especially computer, information, and communications technology, have presented organizations and managers with dramatic new challenges and opportunities, and researchers have been pressing to develop the theoretical and research grounding needed to understand and manage these developments.
Increasingly, organizational researchers and managers must assess the influence of information technology on organizational design. The advent and dissemination of computers, the Internet, e-mail, and other forms of information and communication technology have transformed organizations and working life within them, and they continue to have dramatic effects. Managers’ strategic choices also determine structure: managers may divide an organization into divisions and departments designed to handle particular markets, products, or challenges that have been chosen for strategic emphasis. Organizations face varying degrees of uncertainty depending on how much more information they need than they actually have. As this uncertainty increases, the organizational structure must process more information. Organizations employ a mix of alternative modes for coordinating these activities. First, they use the organizational hierarchy of authority, in which superiors direct subordinates, answering their questions and specifying rules and procedures for managing the information-processing load. As uncertainty increases, it overwhelms these approaches. The next logical strategy, then, is to set plans and goals and allow subordinates to pursue them with less referral up and down the hierarchy and with fewer rules.
The objectives of the research are to (1) identify the main applications of information technology in a given organization; (2) analyze their impact on a corporate organization; (3) identify the main trends and factors that influence the implementation of information technology; and (4) discuss the impact of information technology on employee relations. Experts on IT tend to report that the more salient effects in industry include the extension of computing technology into design and production applications, such as computer-aided design, in which computer programs carry out design functions, and computer-aided manufacturing, in which computers actually control machinery that carries out the manufacturing process. Computer-integrated manufacturing links together the machinery and the design and engineering processes through computers. Ultimately, an integrated information network links all major components of the organization, including inventory control, purchasing and procurement, accounting, and other functions, in addition to manufacturing and production. These developments, according to expert observers, support an evolution from mass production to mass customization, in which manufacturers and service organizations produce large quantities of goods and services that are more tailored to the preferences of individual customers than previously possible. In addition, observers suggest that computerized integration of production processes has effects on organizational structures and processes. Computer-integrated manufacturing reportedly moves organizations toward fewer hierarchical levels, tasks that are less routine and more craftlike, more teamwork, more training, and more emphasis on cognitive problem-solving skills than on manual expertise. These components of the framework combine to influence the way technological initiatives play out.
The framework helps to explain why even very similar technological initiatives can have very different outcomes, owing to different organizational and institutional influences on their implementation. Fountain also describes how such influences raise formidable challenges for the successful utilization of IT in government, given strong, often entrenched organizational and institutional pressures.
The research will pay special attention to computer technology and its impact on the corporate organization. Computer technology and the Internet have also become more influential in organizational decision-making processes. For many years organizations have been using computers to store large data sets and retrieve information from them, but more recently the capacity for active utilization of those data has advanced, so that computer-based management information systems (MIS) have become very common. An MIS typically provides middle-level managers with ready access to data they can use in decision making, such as sales and inventory data for business managers, and client processing and status data for managers in public and nonprofit organizations. Decision support systems provide software that managers can use interactively. Many organizations currently utilize geographic information systems (GIS), which provide information about facilities or conditions in different geographic locations. A GIS might allow a planner to designate any particular geographic location in a city and pull up on the computer screen a diagram showing all the underground utility infrastructure, such as pipelines and electric cables, at that location. An executive information system provides MIS-type support, but at a more general, strategic level, for the sorts of decisions required at higher executive levels.
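To make the MIS idea concrete, the following is a minimal sketch of the kind of query such a system might answer for a middle-level manager. The data set, field names, and reorder threshold are hypothetical illustrations, not the schema of any real system.

```python
# Hypothetical inventory records of the sort an MIS might store.
inventory = [
    {"item": "printer paper", "region": "north", "units": 120},
    {"item": "printer paper", "region": "south", "units": 35},
    {"item": "toner", "region": "north", "units": 8},
    {"item": "toner", "region": "south", "units": 60},
]

def low_stock(records, threshold):
    """Return (item, region) pairs whose stock is below the threshold."""
    return [(r["item"], r["region"]) for r in records if r["units"] < threshold]

def units_by_region(records):
    """Aggregate total units held in each region."""
    totals = {}
    for r in records:
        totals[r["region"]] = totals.get(r["region"], 0) + r["units"]
    return totals

print(low_stock(inventory, 40))   # items a manager may need to reorder
print(units_by_region(inventory))
```

A production MIS would run such queries against a database rather than an in-memory list, but the decision-support pattern, turning stored operational data into a manager-ready summary, is the same.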
It is supposed that information technology allows a corporation greater decentralization of functions, thus ensuring effective management and control. Computers, the Internet, electronic mail, and other forms of information and communication technology make possible more elaborate and interactive networking of people and organizational units, both within and between organizations. Some organizations have moved away from traditional hierarchical and departmental reporting relationships to forms of virtual organization and dynamic network organization, in which a central hub coordinates, via e-mail and the Internet, other units that formally belong to the same organization as well as organizations formally outside it (such as contractors or agencies with overlapping responsibilities). Advances in IT reportedly lead to smaller, more decentralized organizations and to better coordination both internally and with external entities.
Literature Review
The field of organization theory provides many valuable concepts and insights. It raises important issues for those interested in corporate design and management, in the application of information systems to corporate settings, and in modern business. The book Does IT Matter? by N. G. Carr (2004) discusses the problem of IT application in modern business, the pros and cons of different information systems, and their impact on a corporation. The book also addresses technological transformation and universal strategy, IT investments, and technological change. Related empirical work has asked how much the work in a unit involves the same tasks and issues, how easy it is to know whether the work is being done correctly, and similar questions. The researchers found relationships between the structures and coordination processes in organizational units and the nature of their tasks. Some units, such as those that handled applications for unemployment compensation, had tasks low in uncertainty (low in variability and difficulty). The employees mainly filled out and submitted application forms for the persons who came in to seek unemployment compensation. These units had more plans and rules and fewer scheduled and unscheduled meetings than other units, and relatively little horizontal communication among individuals and units. Other units had tasks higher in uncertainty, such as the unemployment counseling bureau, which helped unemployed people seek jobs. This task involved many variations in the characteristics of the clients—in their needs and skills, for example—and often there was no clearly established procedure for responding to some of these unique variations. In this bureau, employees relied little on plans and rules and had more scheduled and unscheduled meetings and more horizontal communication than other units. Units that were intermediate on the task dimensions fell in the middle ranges on the structural and coordination dimensions.
Thus, in many government agencies, despite external political controls, subunits tend toward more flexible structures when they have uncertain, nonroutine, variable tasks. Similarly, many organizations have purposely tried to transform routine work into more interesting, flexible work in order to better motivate and utilize the skills of the people doing it.
The book X-Engineering the Corporation by J. Champy (2002) pays special attention to change management and its application to information technology. Complicating the analysis of technology further, various studies have found weak relationships between structure and technology, sometimes finding that size influences structure more than technology does. Research indicates that technology shows stronger effects on structure in smaller organizations than in larger ones. Similarly, the effects of task characteristics on structure are strongest within task subunits; that is, the task of a bureau within a larger organization has a stronger relationship to the structure of that bureau than to the structure of the larger organization. In sum, size, technology, structure, and other factors have complex interrelationships. The author underlines that many contemporary organizations operate under such great uncertainty that these basic modes become overloaded, so they must pursue additional alternatives. First, managers can try to reduce the need for information. They can engage in environmental management to create more certainty through more effective competition for scarce resources, through public relations, and through cooperation and contracting with other organizations. They can create slack resources (that is, create a situation in which they have extra resources) by reducing the level of performance they seek to attain, or they can create self-contained tasks, such as profit centers or groups working independently on individual components of the work. Alternatively, managers can increase information-processing capacity by investing in vertical information systems, such as computerized information management systems, or by creating lateral relations, such as task forces or liaison personnel. Thus, managers have to adopt coordination modes in response to greater uncertainty and information-processing demands.
In a recent work, Information Systems Management, G. Philip (2007) exemplifies the movement among organizations and organization design experts toward increasing emphasis on flexibility and rapid adaptation to complex and quickly changing challenges. The technostructure consists of analysts who work on standardizing work, outputs, and skills—the policy analysts and program evaluators, strategic planners, systems engineers, and personnel training staff. The support staff units support the organization outside the work flow of the operating core—for example, mail room, food service, and public relations personnel. The different positions must be coordinated through the design of the organization’s superstructure. All organizations do this in part through unit grouping, based on any of a number of criteria: knowledge and skill (lawyers, engineers, social workers), function (police, fire, and parks and recreation employees; military personnel), time (night shift, day shift), output (the products produced by the different divisions of a corporation), clients (inpatients or outpatients; beneficiaries of insurance policies), or place (the regional offices of business firms, the federal government, and many state agencies; precincts in a city police department). An action-planning system, by contrast, specifies not the general result or standard but the details about actions that people and groups are to take. In the modules just mentioned, the applications from clients are placed in file folders that move from point to point in the modules as different people do their part of the work on the case. The filing clerks are trained in a system for moving and keeping track of the files (there are many thousands of them for each module) so that files will not be lost and can be located at any given time. As the clerks move the files around the module, they log them in when they arrive at certain points, using a bar code scanner similar to those used in supermarkets.
The careful specification of the actions of the file clerks in this file-tracking system is essential to coordinating the different specialists in the module and to assessing the coordination of the work among all the modules.
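The file-tracking logic described above can be sketched in a few lines: each scan of a folder's bar code logs its current checkpoint, so any file can be located at any time. The class name, checkpoint labels, and case IDs here are hypothetical, not drawn from any actual agency system.

```python
class FileTracker:
    """Minimal sketch of a bar-code file-tracking system."""

    def __init__(self):
        self._location = {}   # file_id -> last scanned checkpoint
        self._log = []        # full movement history, in scan order

    def scan(self, file_id, checkpoint):
        """Record that a folder's bar code was scanned at a checkpoint."""
        self._location[file_id] = checkpoint
        self._log.append((file_id, checkpoint))

    def locate(self, file_id):
        """Return the last checkpoint at which the file was scanned."""
        return self._location.get(file_id, "not yet logged")

tracker = FileTracker()
tracker.scan("case-10482", "intake desk")
tracker.scan("case-10482", "benefits examiner")
print(tracker.locate("case-10482"))   # last logged checkpoint
```

Keeping both the current location and the full scan history is what allows managers to assess coordination across modules, not just answer "where is this file now?"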
Simon (2007) examines the problem of a corporate organization and its dependence on technology. The analysts discussed in the preceding historical review have either concentrated on industrial organizations or sought to develop generic concepts and theories that apply across all types of organizations. Such organizations might respond to differences in size in different ways than other organizations, such as business firms, do. When the contingency theorists analyzed environments, they typically concentrated on environmental uncertainty, especially as a characteristic of business firms’ market environments, and showed very little interest in political or governmental dynamics in organizational environments.
Laudon & Laudon (2005) analyze technology in terms of the type of interdependence among workers and units that the work requires. Organizations such as banks and insurance companies have mediating technologies. They deal with many individuals who need largely the same set of services, such as checking accounts or insurance policies. Their work involves pooled interdependence because it pools together such services and sets of clients. They establish branches that have little interdependence with one another and formulate standardized rules and procedures to govern them. Units with long-linked technologies exhibit sequential interdependence: one unit completes its work and passes the product along to the next unit, which completes another phase of the work, and so on. Plans and schedules become an important coordination tool for these units. Units with intensive technologies have a reciprocal pattern of interdependence. The special units in a hospital or a research and development (R&D) laboratory need to engage in a lot of back-and-forth communication and adjustment in the process of completing the work. These units must be close together and coordinated through mutual adjustments and informal meetings. Thompson contended that organizations may have all these forms of interdependence. They will first organize together those persons and units that have reciprocal interdependence and require high levels of mutual adjustment. Then they will organize together those units with sequential interdependence, and then group units with pooled interdependence. Analyzing many studies of structure, Philip (2007) found some support for Laudon and Laudon’s (2005) observations. Philip and others concluded that studies have tended to find that organizational units with high interdependence were much less likely than those with low interdependence to rely heavily on standardized work procedures.
Another very influential perspective on information technology, developed by Baschab et al. (2007), argues that work processes vary along two main dimensions: the frequency with which exceptions to normal procedures arise, and the degree to which those exceptions are analyzable (that is, the degree to which they can be solved through a rational, systematic search). If a machine breaks down, often a clear set of steps can lead to fixing it. If a human being breaks down psychologically, usually few systematic procedures lead as directly to diagnosis and treatment. Organizational technologies can rank high or low on either of these two main dimensions. Routine technologies involve few exceptions and provide clear steps in response to any that occur (high analyzability). In such cases, the work is usually programmed through plans and rules, because there is little need for intensive communication and individual discretion in performing the work. For examples of routine technology, researchers usually point to the work of many manufacturing personnel, auditors, and clerical personnel. At the opposite extreme, nonroutine technologies involve many exceptions, which are less analyzable when they occur. Units and organizations doing this type of work tend toward flexible, “polycentralized” structures, with power and discretion widely dispersed and with much interdependence and mutual adjustment among units and people. Units engaged in strategic planning, R&D, and psychiatric treatment apply such nonroutine technologies.
Craft technology involves infrequent exceptions but offers no easily programmed solutions when they occur. Government budget analysts, for example, may work quite routinely but with few clear guidelines on how to deal with the unpredictable variations that may arise, such as unanticipated shortfalls. These organizations tend to be more decentralized than those with routine technologies. Engineering technology involves many exceptions but also offers analyzable responses to them. Engineers may encounter many variations, but often they can respond in systematic, programmed ways. Lawyers and auditors often deal with this type of work. When an Internal Revenue Service (IRS) auditor examines a person’s income tax return, many unanticipated questions come up about whether certain of the person’s tax deductions can be allowed. The auditor can resolve many of the questions, however, by referring to written rules and guidelines.
Dennings (2001) states that organizations with engineering technologies tend to be more centralized than those with nonroutine technologies, but more flexibly structured than those with routine technologies. The author reviewed numerous studies that showed that organizational units with routine technologies had more formal rules and procedures and fewer highly educated and professional employees. Concerning internal and external coordination, most large government agencies, like business firms and nonprofit organizations, now have an intranet, an Internet-based network within the organization with access restricted to designated organizational members. To maintain security of data about individual citizens and about such sensitive matters as national security, these intranet arrangements usually require elaborate provisions for controlled access. Some government employees now carry with them devices that periodically inform them of newly assigned access codes for their agency’s intranet because the codes are changed periodically as a security precaution. These examples indicate that IT has provided significant improvements and opportunities for government, its employees, and the clients of government agencies.
In the study Fluency with Information Technology: Skills, Snyder (2007) examines the main concepts of information technology and their application in the modern corporation. As one might expect, IT raises many challenges for managers in government, some of which are daunting. Some of these issues are new, but some involve application of familiar topics and challenges similar to those encountered in managing any significant operation or initiative. Executives and managers confront challenges in strategic planning for IT itself and in integrating IT into more general plans and strategies, as well as in procurement and purchasing, creating organizational structures and designs to incorporate IT and adapt to it, training, recruiting, and many other areas. The author finds that the framework treats IT developments as emerging from interactions among objective technologies such as computer hardware and software, organizational forms such as bureaucracies and networks, and institutional arrangements such as cultural and legal conditions.
Two research studies, McAfee (2006) and Danziger and Andersen (2002), examine the problem of information technology in particular organizations. Danziger and Andersen (2002) examine the effects of IT in public administration and its pros and cons for this type of organization. The managers perceived that any structural changes caused by IT implementation in public agencies have little impact on organizational performance (measured as improved ease of communication and improved technical decision making). However, the managers tended to regard IT adoption as having a direct positive impact on technical decision making (as opposed to an impact on decision making by way of influences on structure). Although county government managers may respond differently to developments in IT than state and federal managers do, the lack of perceived structural effects of IT is striking. McAfee (2006) pays attention to corporations in developing countries and possible difficulties caused by economic and social growth patterns.
In sum, organization theorists have generally addressed structure from a generic perspective, devoting little attention to the distinctive structural attributes of public organizations, even though some important studies have concentrated on public agencies. These general points apply to most organizations, however, and the discussion here gives examples specifically involving public organizations. A number of studies indicate that an organization’s structure also depends on the nature of its work processes, or technologies and tasks. Researchers use a wide variety of definitions of technology and tasks, such as the interdependence required by and the routineness of the work.
Methodology
The framework of the research will be based on qualitative data-collection methods; the study will rely on observation and case-study methods. Observation is the most frequently used data-collection method in qualitative research. By contrast, quantitative research begins with theory: from theory, prior research is reviewed, and from the theoretical frameworks, hypotheses are generated. These hypotheses lead to data collection and the strategy needed to test them. The data are analyzed according to the hypotheses, and conclusions are drawn. These conclusions confirm or conflict with the theory, thereby completing the cycle.
First, participant observation (in which the observer is obvious to and involved with the subjects) is less valid than a questionnaire would be for sensitive data. Second, the observer’s expectations affect what he or she sees and reports, reducing the validity of the data. Third, and even more complex, is the lack of expectations that results when no structure is given to the observer a priori. If only one observation is taken, it will reflect one small portion of reality and will not capture the essence of the culture or the situation. An example of this would be a visitor from outer space coming to Ohio during the winter and reporting that there were trees without leaves. Another outer-space visitor arriving in summer would report trees with leaves. Both would be accurate in their observations, but neither would accurately reflect the actual situation. One other caution exists: the observer’s sense perceptions are not always accurate. Particularly in a nonstructured observation situation, the element of surprise can dominate the sensory input of the observer, rendering reported data invalid. All validity concerns described here affect both participant and nonparticipant observations. Compared to participant-observation strategies, the validity of nonparticipant-observation strategies is greater because there is no reactivity among the subjects to the presence of the researcher. This reduction in bias, however, does not cancel out the other biasing (invalidating) effects (Denzin and Lincoln 1995).
The main criteria applied to the selection of an organization are that it be a large, modern organization with 1,000 or more employees and that it introduced information technology no less than one year ago. These criteria will help ensure the quality of the data and information collected for this research. First, because the subjective bias of the observer affects his or her reporting, having several observers from several backgrounds (or points of view) report on the same phenomena can increase validity. Coalescing their data reduces sensory-deficiency and misinterpretation error. Second, structuring the observation increases validity by focusing the attention of the observers on certain characteristics and events. Third, placing the observation on a scientific foundation by stating a hypothesis up front increases validity by avoiding distortion. Fourth, nonparticipant observation, as opposed to participant observation, increases validity. And, fifth, using observation only for studying those phenomena that are appropriate to this method (e.g., nonverbal behaviors and social interactions) increases validity (Denzin and Lincoln 1995).
The case-study method is one more design strategy under the qualitative rubric. Case studies can be single-subject designs or based on a single program. At the beginning, the research issues will be turned into more specific and researchable problems, followed by techniques and examples of how to collect, organize, and report case-study data. The case study is also argued to be a helpful procedure when one is interested in such things as diagnosing learning problems, undertaking teaching evaluations, or evaluating policy. Consistent with the assumptions of qualitative research philosophy, the critical emphasis in case studies is revealing the meaning of phenomena for the participants. Denzin and Lincoln (1995) acknowledge this assumption, claiming that case-study knowledge is concrete, contextual, and interpreted through the reader’s experience. They prefer case-study methods because of their epistemological similarity to a reader’s experience, and they particularly note the reasonableness of assuming the natural appeal of the case approach. Case-study data come from the strategies of information collection described in Figure 2: interviews, observations, documents, and historical records. There are three steps in conducting a case study: (1) assemble the raw case data; (2) construct the case record; and (3) write the case-study narrative (Dicks et al 2005).
Validity is considered to be an advantage of case studies because of their compatibility with reader understanding; in other words, they seem natural. The validity limitations already described for observational data apply to case studies as well. However, counterbalancing information from documents with data from observation and interviews strengthens the resulting validity. Invalidity in one set of data can be checked by conflicting or supporting results from the other sources, which is a type of triangulation. Case-study methodology has potential for increased validity for several reasons. First, because multiple data-collection techniques are used (e.g., interview, document study, observation, and quantitative statistical analysis), the weaknesses of each can be counterbalanced by the strengths of the others. Conclusions related to a certain aspect of a phenomenon under study need not be based solely on one data source. Second, validity may be increased by checking the interpretation of information with experts. Third, with case studies there are generally a variety of data sources, and there should be a structural relationship among these sources (Dicks et al 2005). To the extent that findings are consistent within the case, validity is enhanced. Conceptually, this is similar to giving a battery of tests to obtain an estimate of consistency in the underlying constructs. Fourth, using a scientific method in which one hypothesizes something about the case and collects data to determine whether the hypothesis should be rejected could add to validity and also help future researchers determine starting places for their research. All of these approaches would tend to improve understanding of the case and give in-depth descriptive information. To estimate the applicability of this study, one needs deep descriptors to clearly define the characteristics of the sample.
Without description of sufficient detail, one cannot form a clear sense of socioeconomic status, culture, and so on.
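The triangulation check described above (comparing findings from multiple data sources for convergence on the same conclusion) can be sketched as a simple tally. The source names and coded conclusions below are hypothetical illustrations, not findings of this study.

```python
def triangulate(findings):
    """findings maps a data source name to its coded conclusion on one
    aspect of the phenomenon. Returns the consensus conclusion and the
    share of sources agreeing with it."""
    counts = {}
    for conclusion in findings.values():
        counts[conclusion] = counts.get(conclusion, 0) + 1
    consensus = max(counts, key=counts.get)   # most frequently coded conclusion
    return consensus, counts[consensus] / len(findings)

# Hypothetical coded findings from three data sources on one aspect.
findings = {
    "interviews": "IT flattened reporting lines",
    "documents": "IT flattened reporting lines",
    "observation": "no structural change",
}
conclusion, agreement = triangulate(findings)
print(conclusion, round(agreement, 2))
```

A low agreement share flags an aspect whose sources conflict and therefore needs further checking, which is exactly the counterbalancing role triangulation plays in case-study validity.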
The object of persistent observation as a design feature is to achieve depth of meaning from the data (i.e., what seems salient in the setting). Like the other criteria, it was originally described for ethnographic research. To comply with this criterion, the researcher focuses in detail on the most relevant factors in an ethnographic study. The emerging domains of meaning, then, are based on a depth of understanding. To apply this characteristic to the Fuller study (not an ethnographic study) requires examining how the researcher determined what labels to apply to the emerging themes of the managers’ experiences (Dicks et al 2005).
It is assumed that any interpretation of data is only as good as the accuracy of those data. If the data on which researchers are basing judgments are faulty, one cannot expect the conclusions to be accurate. Obviously, one can have appropriate inferences and conclusions without collecting data. Einstein knew his theory of relativity was correct before he had the data to support it. He spent most of his life collecting data to support the accuracy and usefulness of his theory because he, too, probably accepted the assumption that one can only have confidence in a research theory to the degree that one has reliable and valid data that support the theory. Therefore, high confidence is achieved only if judgments are based upon accurate data. Finally, it is important to reemphasize that one method is not necessarily better than another. It is also important when evaluating research to determine how well the research adds to the body of knowledge and to determine what it suggests needs to be done next. We believe that the model presented, the qualitative-quantitative interactive continuum, can facilitate the evaluation, planning, and conduct of research by providing a framework within which to conceptualize it (Dicks et al 2005).
The sample of subjects is drawn to reflect the population. After the pretest measures are taken, the treatment conducted, and posttest measures taken, a statistical analysis reveals findings about the treatment’s effects. To support repeatability of the findings, one experiment usually is conducted and statistical techniques are used to determine the probability of the same differences occurring over and over again. These tests of statistical significance result in findings that confirm or counter the original hypothesis. Theory revision or enhancement follows. Fountain concluded that this and other examples suggest the impediments to major IT initiatives linking and coordinating diverse agencies and programs, and the likelihood that developments in IT applications will involve more modest projects and changes. These procedures are deductive in nature, contributing to the scientific knowledge base by theory testing. This is the nature of quantitative methodology. Because true experimental designs require tightly controlled conditions, the richness and depth of meaning for participants may be sacrificed. As a validity concern, this may be a limitation of quantitative designs (Dicks et al 2005).
Project Management: Gantt Chart
A Gantt chart will be used as the core of project management. Some charts and diagrams, once they are developed and the planning process is completed and moved forward to the implementation stage, can become management tools. In conducting and managing the project, charts and diagrams can be used as a framework for monitoring what is being done and seeing that the sequence is followed and each segment is accomplished at the right time. Diagrams can be used as an aid to determine whether resources need to be reallocated at crucial points in the program (Clark, 1977). Leaders who perceive a plan to be important will advocate its implementation; to implement a plan, leaders must work the plan, stay the course, and see it to completion. In the implementation of a plan, administrators will find various types of diagrams helpful. With the advent of computer capabilities and the accessibility of appropriate software, charting and diagramming have become much easier, and the end result of the planner’s efforts can look much more professional while providing clarity and readability. At one time, freehand diagrams, done with the style of an architectural draftsman, were often considered some of the better efforts to present such materials. Persons who, by their own admission, could not draw a straight line used rulers and templates to help in the process of making flow charts. One problem inevitably occurred when templates were used: the size of the figure traced from the template was either too small or too large. Consequently, professional draftsmen were frequently enlisted to make clear and professional-looking charts and diagrams (Clark, 1977).
To construct a Gantt chart, the planner begins with the placement of vertical and horizontal axes on a page. The vertical axis is placed toward the left side and the horizontal on the top of the page. The various functions and tasks to be completed in a project are listed along the vertical axis. A timeline with measured increments is recorded along the horizontal axis. Typically, the scale or time calendar is shown at the top of the page. Each function or activity along the vertical axis, listed in sequence from first to last, is numbered. The time when the function or task is to begin and end is determined, and a line, usually a heavy or wide line, is extended horizontally from the estimated starting date to the time or date when the function or task is to be completed (Philip, 2007). The thickness or texture of the timeline can be varied to communicate different meanings to the reader. Symbols can be added to the Gantt chart that represent report documents or specified activities. Each Gantt chart is titled, and a key for symbols or the meaning of various lines, as depicted by different line widths or compositions, should also be included (Clark, 1977). Gantt charts can be constructed in various ways; in fact, variations are encouraged if such modifications help clarify the chart. The final Gantt chart is designed to contain a title, the timeline, the functions to be accomplished, and the bars and symbols designed to show the activities or to depict when reports are to be prepared and due or when meetings will be scheduled. Finally, a key is included at the bottom of the figure to explain the “language” of the chart. Those responsible for performing the various functions should be identified. In the evaluation of a building principal, some tasks or functions would be done jointly by the principal and the person responsible for conducting the evaluation.
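The construction steps above — numbered tasks on the vertical axis, a timeline across the top, a bar from each start date to each end date, and a key at the bottom — can be sketched programmatically. This is a minimal text-based sketch; the task names and the twelve-week timeline are hypothetical placeholders, and a real project chart would use the study's actual functions and calendar dates:

```python
# Each tuple: (number, function/task, start week, end week) -- inclusive,
# listed in sequence from first to last, as the construction steps require.
tasks = [
    (1, "Literature review", 1, 4),
    (2, "Survey design",     3, 5),
    (3, "Data collection",   5, 9),
    (4, "Data analysis",     8, 11),
    (5, "Final report",      11, 12),
]

def gantt(tasks, weeks=12):
    """Render a simple text Gantt chart: tasks down the left,
    a weekly scale across the top, and a bar per task."""
    header = " " * 23 + "".join(f"{w:>3}" for w in range(1, weeks + 1))
    lines = [header]
    for num, name, start, end in tasks:
        bar = "".join("  #" if start <= w <= end else "  ."
                      for w in range(1, weeks + 1))
        lines.append(f"{num}. {name:<20}" + bar)
    lines.append("Key: # = scheduled activity, . = idle")
    return "\n".join(lines)

print(gantt(tasks))
```

The overlapping bars (e.g. survey design beginning before the literature review ends) show at a glance where activities run concurrently, which is the property that makes the finished chart useful as a management device.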
Other tasks would be the responsibility of the person conducting the evaluation, and some responsibilities would be in the hands of the principal. While it would be possible to use varied types of timelines to depict such information, especially with the availability of computer assistance, such information is generally placed in the accompanying documentation (Philip, 2007). The completed Gantt chart is primarily useful as a management device, since it is easy to see at a glance whether appropriate progress has been made and all requirements met. Documentation to accompany a Gantt chart is an important dimension of the total effort. Typically, the documentation accompanying a Gantt chart will have the same heading or title as the activities listed on the vertical axis of the chart. An introductory statement should be included, and each numbered line or item on the chart should be incorporated in the documentation, with the corresponding number used for each item.
Bibliography
- Baschab, J., Piot, J. & Carr, N. 2007, The Executive’s Guide to Information Technology, 2nd edition. Wiley.
- Carr, N. G. 2004, Does IT Matter? Information Technology and the Corrosion of Competitive Advantage. Harvard Business School Press.
- Champy, J. 2002, X-Engineering the Corporation: Reinventing Your Business in the Digital Age, 1st edition. Warner Business Books.
- Clark, W. 1977, The Gantt Chart: A Working Tool of Management, 2nd edition. Pitman.
- Danziger, J. N. & Andersen, K. V. 2002, The Impacts of Information Technology on Public Administration: An Analysis of Empirical Research from the “Golden Age” of Transformation. International Journal of Public Administration, 25 (5), 591.
- Denning, P. J. 2001, The Invisible Future: The Seamless Integration of Technology into Everyday Life, 1st edition. McGraw-Hill.
- Denzin, N. K. & Lincoln, Y. S. 1995, Handbook of Qualitative Research. Sage, Thousand Oaks, CA.
- Dicks, B., Mason, B., Coffey, A. & Atkinson, P. 2005, Qualitative Research and Hypermedia. Sage, London.
- Laudon, K. C. & Laudon, J. P. 2005, Management Information Systems: Managing the Digital Firm, 9th edition.
- McAfee, A. 2006, Mastering the Three Worlds of Information Technology. Harvard Business Review, pp. 141-147.
- Philip, G. 2007, Information Systems Management, 7th edition. Prentice Hall.
- Simon, H. A. 2007, Administrative Behavior, 4th edition. Free Press.
- Snyder, L. 2007, Fluency with Information Technology: Skills, Concepts, and Capabilities, 3rd edition. Addison Wesley.