Information Technology and the Research University
Managing the Digital Ecosystem
Administrators must lead their universities into a future in which every constituency has distinct needs and every decision has implications for all.
Benjamin Disraeli’s ironic comment, “I must follow the people. Am I not their leader?” aptly describes the feelings of many college and university administrators as they develop institutional plans for information technology (IT) that will support research, teaching, and learning in the coming decades. The context in which we are expected to lead our institutions in IT decisions has changed dramatically. We are experiencing unprecedented technological change emerging from a much greater diversity of sources than ever before. Students, faculty, and staff arrive at the beginning of each school year with new ideas (and associated hardware and software) for using information technologies that will enable them to accomplish their diverse goals—educational, professional, and personal. Technology firms, along with increasingly influential open-source software development efforts, present us with a staggering number of technologies that hold promise for enabling our fundamental missions of creating and transferring knowledge. What leadership strategies are appropriate in such a complex, dynamic, and unpredictable context?
We’ve gone through a qualitative rather than merely a quantitative transition in the nature of IT. We have recently become accustomed to thinking of IT as one of the many infrastructures we provide on campus and for the national academic community. But the complexity and dynamism of IT, especially in academe, now warrant thinking about it in different terms: as an IT ecosystem. Computing in higher education has evolved from islands of innovation, to activities that depend on campuswide and worldwide infrastructures, to an ecosystem with many niches of experimentation and resulting innovation. These niches are filled by faculty and students who do what they do with IT following whatever motivations they may have, undirected by a central authority. But, as in any ecosystem, they are connected to the whole and often depend on a set of core technical, physical, and social services to survive. Many innovations depend on services—networking, authentication and authorization mechanisms, directories, communication protocols, domain-naming services, global time services, software licensing, etc.—becoming widely available beyond the niche in which they develop. A simple example: almost any IT innovation that uses the network to share information depends on the “domain-naming system” to find the devices with which it needs to communicate. Managing ecosystems calls for a very different set of strategies than those used to administer islands of innovation or the more static, top-down services that characterize many infrastructures.
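That name-resolution dependence can be made concrete. Before any networked program can communicate, it must ask the domain-naming system to translate a name into an address; the short Python sketch below shows the step that nearly every networked application performs implicitly (the hostname passed in is arbitrary):

```python
import socket

def resolve(hostname: str) -> str:
    """Translate a name into an IPv4 address via the domain-naming system.

    Nearly every networked application performs a lookup like this before
    it can communicate, which is why DNS is a core service on which every
    niche innovation silently depends.
    """
    return socket.gethostbyname(hostname)

# resolve("localhost") returns the loopback address "127.0.0.1"
```

If this single lookup fails, everything layered on top of it fails as well; the health of one shared core service determines the viability of countless niches.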
Academic computing ecosystems, especially (but not exclusively) at research universities, are tremendously complex, and understanding this complexity is crucial for selecting effective management strategies. These ecosystems evolve and change, often very quickly, as innovations developed by faculty and students are absorbed by the ecosystem. Unlike other campus infrastructural systems, IT systems have important feedback loops. For example, successful implementations of “course management systems” (Web-based applications for posting syllabi, readings, assignments, etc.) require continuous communication between those providing the central service and the faculty and students using it. The result is evolutionary modification, often at a rapid pace, of both the technological tool itself and the ways in which the faculty choose to use it. If this mutual modification through feedback is ineffective, the tool will quickly become irrelevant and die. Many on-campus innovations rapidly become “necessities” and redefine the nature of the core services needed to sustain them. Unlike natural ecosystems, the IT ecosystem cannot evolve without some external management. Nevertheless, top-down planning, uninformed by the diverse niches of use and innovation, will not work.
Twenty-five years ago computers were relatively large, relatively rare, and used for computation. The communities at the university interested in computing were small, isolated, and largely self-sufficient. Institutional leadership and planning entailed helping these communities get resources to acquire the next fastest mainframe or the newest micro- or minicomputer for word processing or data acquisition and data processing.
By 1990, personal computers (PCs) were ubiquitous. Each had computing power that exceeded most of the mainframes of the previous decade, and some of them were networked together and to the NSFnet. IT planning had become more complex, but the shape of leadership needed from the central administrations and the associated strategies were fairly clear: Find resources to build campus networks; invest in wide-area networking for research; fund opportunities for faculty to experiment with using PCs for research, teaching, and learning; and hire staff to look after the support needs of those using computing for a wide variety of purposes. In short, build and sustain an IT infrastructure for our individual institutions and for higher education as a whole.
Over the past 15 years, computers have become useful for communication, new forms of knowledge representation, knowledge management, visualizing complex data, customer relations management, music storage and transfer, video editing, gaming of all kinds, grid computing, and something else new seemingly every day. The user community now includes everyone on campus. IT tools have become essential utilities without which we cannot function. At the same time, they are a defining force in shaping the future of our core missions of knowledge creation and transfer. Furthermore, the cheap microprocessor and the Internet created a tipping point sometime during this period, when the hub of innovation moved from a small core of experts to a vast number of users.
By the turn of the 21st century, the flow of IT innovations was no longer largely from the university to the students. Except perhaps for supplying the bandwidth for Internet connections and educational discounts on expensive software, many colleges and universities have little to offer to their students who arrive with computers (often more than one), cell phones, personal digital assistants, iPods, digital cameras, Sony PlayStations, Xboxes, LCD TVs, Bluetooth-enabled devices, e-mail accounts, personal Web pages, blogs, and high expectations for the role IT will play in their education.
There is a similar story regarding faculty, not just in science and engineering but in all disciplines. They depend on a wide variety of software, operating systems, computer configurations (including the emerging small supercomputers known as “compute clusters”), multiple servers under their control, and access to all of their resources at any time from any place in the world. They are increasingly involved in cross-institutional research groups that require everything from sharing large data sets to remote digital access to supercomputers and state-of-the-art research instruments.
For students and faculty, the number and kinds of information technologies they expect the university to support is more diverse than ever before and more rapidly changing than most would have imagined when universities began their commitment to IT as a fundamental infrastructure. Documents such as the National Science Foundation report Revolutionizing Science and Engineering through Cyberinfrastructure highlight some of the issues. For example, faculty routinely work with colleagues at distant institutions, depending on e-mail, instant messaging, wikis, videoconferencing, file-sharing protocols, cybertrust mechanisms, and many other IT tools. Grid computing (the use of many computers working together to solve large-scale problems) will require not only high-bandwidth network connections but data architectures and authentication and authorization protocols that ensure the coordination and validity of the calculations and data.
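The coordination-and-validity problem that grid computing raises can be sketched in miniature. The following Python fragment is an illustrative stand-in of our own, not real grid middleware: a large sum is split into chunks, each "remote" computation (here, just a local worker thread) returns a partial result plus a simple validity token, and the coordinator checks each token before combining results.

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(lo: int, hi: int) -> tuple:
    """Compute one chunk of the work; on a real grid this would run remotely."""
    s = sum(range(lo, hi))
    return s, s % 97  # result plus a checksum-like validity token

def grid_sum(n: int, workers: int = 4) -> int:
    """Coordinate the chunks, validating each partial result before combining."""
    step = n // workers
    chunks = [(i * step, n if i == workers - 1 else (i + 1) * step)
              for i in range(workers)]
    los, his = zip(*chunks)
    total = 0
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for s, token in pool.map(partial_sum, los, his):
            if token != s % 97:  # reject a corrupted partial result
                raise ValueError("partial result failed validation")
            total += s
    return total

# grid_sum(1000) == sum(range(1000)) == 499500
```

Real grids face these same two problems, coordination and trust, at vastly larger scale, which is why shared authentication, authorization, and data-validity protocols belong among the core services.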
The management challenge
Although many have noted the need to adapt to the changing IT environment, few have offered clear suggestions about how university leadership must change to deal with the complexity of the situation. We believe that some unusual management strategies are needed, and some very difficult decisions will have to be made in response to a few important trends, including:
- The proliferation of niches of IT applications has made providing the core services for sustaining the system as a whole more complex to manage.
- The costs of providing the levels of IT services expected by both faculty and students working in these niches are increasing because of this complexity.
- Information technologies emerging from niche innovation are being rapidly identified as essential tools for research, teaching, and learning—tools without which success cannot be achieved—and the rate at which this is happening is accelerating.
A core business of higher education is innovation in research and teaching, which is one of the ways in which IT ecology at the university differs from that in the commercial sector. In the latter, some efficiencies in IT are produced by the standardization of hardware and software. Experimentation with new hardware and software is centrally planned and approved. In contrast, a guiding principle for IT support strategies at universities has been to encourage experimentation by both faculty and students. We keep the networks open and provide at least best-effort support for a wide range of software and hardware. Not surprisingly, the result has been a proliferation of technologies and high expectations that universities will create core services that support whatever niche users evolve. Diversity is both a value for our ecosystem and a drain on scarce resources.
That computing is a sine qua non of contemporary scientific research requires no additional arguments from us. What is less well known is how widely the dependence on IT has spread through all disciplines. It would have been hard to imagine 30 years ago that academic philosophers would be deeply involved in fields such as computational logic and computational epistemology or that the theater department would join with computer scientists to create an entirely new academic enterprise called “entertainment technology.” And what will be upsetting to some is that IT is becoming equally essential in teaching and learning. The history of technology-based or technology-enhanced learning is not one of substantial demonstrable improvement in learning outcomes. But things are changing. As those developing e-learning tools draw increasingly from the body of knowledge and techniques of cognitive science, effective technology-enhanced learning is becoming a reality.
Carnegie Mellon faculty have had measurable success in using cognitively informed computer-based instruction in several areas. The earliest results came from intelligent tutoring systems developed using John Anderson’s theory of cognition. Researchers used painstaking talk-aloud sessions to understand how novices and experts solve a problem. They then developed software that compares steps taken by the user with these cognitive models of how others solve problems to provide intelligent individualized feedback to students working their way through problems. The result was Cognitive Tutors for middle- and high-school students that have produced documented improvements in the learning of algebra, geometry, and other subjects. Expanding the application of cognitive science (and Cognitive Tutors) to online courses for postsecondary education, we have had considerable success with courses developed as part of Carnegie Mellon’s Open Learning Initiative (OLI), which is devoted to developing high-quality, openly available, online courses. The OLI’s goal is to embed in the course offering all of the instruction and instructional material that an individual novice learner needs to learn a topic at the introductory college level. These efforts are part of a worldwide movement to develop IT tools to provide effective transfer of knowledge.
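The model-tracing idea behind such tutors can be illustrated with a toy sketch (our simplification for exposition, not Carnegie Mellon's actual software): each student step toward solving 2x + 3 = 7 is compared against an expert solution path and a catalog of known misconceptions, and the feedback is individualized accordingly.

```python
# Toy model-tracing tutor for the equation 2x + 3 = 7 (illustrative only).
CORRECT_STEPS = ["2x = 4", "x = 2"]  # the expert model's solution path
BUGGY_STEPS = {                      # known misconceptions and targeted feedback
    "2x = 10": "It looks like you added 3 instead of subtracting it.",
    "x = 8": "Remember to divide both sides by 2, not multiply.",
}

def tutor_feedback(step_index: int, student_step: str) -> str:
    """Trace the student's step against the cognitive model and respond."""
    if student_step == CORRECT_STEPS[step_index]:
        return "Correct - continue."
    if student_step in BUGGY_STEPS:  # a recognized misconception
        return BUGGY_STEPS[student_step]
    return "That step doesn't match any path the model knows; try again."
```

The real systems build their "correct" and "buggy" rules from painstaking cognitive-science studies rather than hand-listed strings, but the feedback loop is the same: match the learner's step to a model, then respond to that individual step.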
An additional development at universities is the pace at which successful IT experiments are expected to become part of the IT utility. Consider the difference between e-mail and wireless networking. E-mail was invented as part of the ARPANET in 1971, yet it was not really an expected part of universities’ IT infrastructure until the late 1980s and early 1990s. Very early wireless networking was first deployed as an experiment at Carnegie Mellon between 1994 and 1997. By 2001, not only was the entire campus covered by a commercial 802.11b network, but students and faculty expected that network to be ubiquitous and always available. Research In Motion (RIM) introduced the first BlackBerry wireless handheld device in 1998. By 2002, many faculty and staff on campuses had started to use BlackBerries. Today, many expect and depend on BlackBerry (or Treo or other cellular) connectivity to keep up with their e-mail and calendars as they travel. Students, faculty, and staff will arrive this fall with a range of wireless devices that they expect to connect to their university e-mail, course management systems, central calendars, the university portal, and other core information services provided by the university.
We should anticipate even more challenges. The expectation of our constituents is that if a technology is available and it can help them accomplish their goals, the university should provide whatever core services are required to support it.
We propose the following as critical strategies for addressing new challenges posed by IT ecosystems in higher education.
Creating more robust feedback loops between the members of the academic community who are generating IT innovations and those responsible for supplying a sustainable environment. We have described how IT over the past 25 years has changed from innovation and use by the few to innovation and use by the many on campus. Without gathering data from the many, no central organization can effectively predict what the university must do to provide the core services that will both sustain everyday uses and enable the evolution of innovative ones. The diversity of sources of use and innovation requires new and more aggressive techniques for gathering data. Central IT organizations must engage in information-gathering outreach unheard of in earlier times. University leadership must create conduits of communication, and encourage active participation in them by faculty, students, and staff, so that services can be developed that best support their needs and expectations.
Collaborating with other universities to develop shared solutions for both intra- and interinstitutional support for academic IT. There are many examples of collaborative IT efforts among universities, ranging from regional education and research networks to consortia for software licensing to joint software development. Recent examples include the National LambdaRail, a project by a consortium of research universities, along with Internet2, to create an all-optical, extremely high-bandwidth network that will serve the bandwidth and network-research needs of higher education; and “Sakai,” a project led by MIT, Indiana University, the University of Michigan, and Stanford University to develop an open-source course management system (technology to support Web posting of course materials and collaborative work) for higher education. Universities could do much more, but resources are limited. As IT organizations at our various universities are called on to provide more and more services without additional resources, collaborative work does and will suffer. Leaders face the difficult choice between keeping up with increasing daily pressures on basic IT services and supporting collaborative projects with other institutions to build sustainable and evolving IT core services for the future.
Selecting among adaptive and nonadaptive technologies when allocating resources, based on their contributions to our fundamental missions. Under resource limitations, not all IT applications can be equally supported. Difficult decisions are thus required about which expectations to meet and which to disappoint. Creating the robust feedback loops mentioned above will help universities better adapt to changing needs and expectations. But more than understanding use and user expectations is required. We also need to make choices based on solid data about the contribution of the many IT applications to our central research and teaching missions. Although faculty are in the best position to identify what IT effectively supports their research, even here there are questions about whether the methods they use are globally effective and efficient. For example, several universities are now encouraging faculty to forgo having their compute clusters near their offices in favor of locating them in central machine rooms. The theory is that the university will incur less overhead and be able to provide better professional IT support for central farms of clusters than for clusters distributed all over campus. If this is true, it is reasonable for university leaders to find ways to encourage shared-resource strategies at the cost of some individual convenience. The same principle applies to many core services that are currently replicated across our institutions for the sake of convenience, at the potential cost of global inefficiencies and lack of interoperability.
Judgments about the relative contributions of educational technologies to fulfilling the core mission of knowledge transfer are also essential. Despite having been burned repeatedly by claims that technology will transform learning outcomes, central leadership remains reluctant to deny requests for potentially promising new technologies for teaching and learning. Although finding effective technologies has often been a matter of trial-and-error experimentation, there is increasing information from the cognitive and learning sciences about what is likely to help and what is likely to hurt, and it can guide us about where to place our bets. We have not yet really broken the pattern of deploying new technology with only the hope that we will find effective pedagogical applications for it.
Take the fairly old notion of a laptop requirement on campus. Many vendors have sold K-12 districts and universities on the notion that equipping all students with a laptop will “obviously” improve learning outcomes. Yet it is nearly impossible to find a study that reports anything more than anecdotes about use or user satisfaction at “laptop universities” or that controls for other educational innovations introduced at the same time as the laptops. In a recent study conducted by the Office of Technology for Education and the Eberly Center for Teaching Excellence at Carnegie Mellon, we learned that distributing laptops to students in a case study actually discouraged some collaborative learning behaviors, which learning sciences have shown improve learning outcomes. The study also indicated positive consequences of laptop ownership. The point is not that universal laptop ownership is not a good thing; rather, that before widely deploying a technology, we should understand both what problems we are trying to address and what we already know about how that technology might solve the problems.
In the absence of rigorous data, we cannot afford to invest in proliferating devices and software that merely appear to hold some promise to improve learning. Another example is the increasingly vociferous claim that educational software needs to exploit the fact that the current generation of learners are “natural multitaskers.” This seems an increasingly dubious or at least complicated claim in the light of developing evidence that multitasking is accompanied by reduced cognitive attention. Leadership is required to say “no” in the face of unsubstantiated claims that a technology will transform teaching and learning.
Revitalizing commitment to open standards to ensure the sustainable and evolvable development of IT in academe. Finally, education must do more to return some sanity to the IT standards movements. Our current IT ecosystem’s (somewhat fragile) stability depends on far-sighted work on “open standards” that has allowed software and hardware to interoperate and has enabled structured, shared services. Most of these standards originated at individual universities. Members of broader academic communities helped guide them through sanctioning bodies and lobbied for vendor acceptance. Enabling diverse IT ecosystem configurations depends on the existence of open standards that allow many different devices and pieces of software to coexist, communicate, and use common features of the environment. It has always been a challenge to convince vendors to adhere to open standards when there is commercial advantage to be gained by creating features that step outside those protocols. The situation is not getting better. Indeed, there is a subtle deception in claims of adherence to standards. If a product mostly follows a standard, the vendor will say it “adheres.” But the hard reality faced by those implementing the products is that anything short of complete adherence often causes the service to fail outright or requires time-consuming customization. Very large vendors can often succeed by stepping outside the standards and encouraging the adoption of their proprietary systems as a “top-to-bottom” solution for an institution. But even smaller vendors will often opt to ignore open-standard options if they deem them too great an obstacle to getting their product to market in a timely fashion.
Indeed, more than the vendors must bear the blame for open standards not playing the role they should in sustaining academia’s diverse and evolving array of applications. Standards definitions have often become too cumbersome, and the ratification process too slow. The communities developing standards seem to lose touch with both the vendors and the open-source movements that should be using those standards in the creation of products. The effort to develop the IMS and SCORM standards, which are meant to allow interoperability of course management systems and repositories of related materials, has become a multiyear marathon producing standards so detailed that vendors and university software developers are reluctant to use them. This could result in each university opting for its own closed standard, which would make it impossible for universities to take advantage of shared core services and much more difficult for successful innovations to spread.
University leaders must insist that standards be usable, that they be developed and documented in a timely manner, and that they can be easily adopted by commercial vendors and independent open-source developers. Collectively, colleges and universities constitute a large market and are thus in a position to play a powerful role in the standards game. We can use our intellectual strengths to lobby researchers in academia and the corporate sector who are engaged in creating standards, as well as organizations such as the Institute of Electrical and Electronics Engineers, who are responsible for ratifying them. Finally, we can use our buying power to reward vendors that build features that really adhere to open standards.
The implications of the transition from IT as infrastructure to IT as ecosystem are profound for leaders in higher education. We are already making tough choices about how to structure IT organizations and allocate resources to IT. Our arguments here suggest that IT organizations should start to look and act differently than they do today: They should be part of robust feedback loops with the dynamically changing niches of innovation throughout the university community; they should be looking beyond their own walls to collaboration with other institutions; and they should help revitalize the open-standards movements that are so critical to sustaining diversity in our ecosystems. But this means hard choices for university leadership outside of IT. University leaders must partner with faculty, students, and IT leadership to make some hard choices about how to sustain those niches in the ecosystem that are most valuable for our core missions. It means some paths of innovation and some core services will be starved. It means diverting resources from services we could provide now to fuel the collaboration and development of standards that will sustain us into the future. In a context in which IT gives us all instant gratification—we want to use it now!—these will not be popular decisions. Few leadership decisions to sustain an ecosystem for the future at the cost of current expectations and needs are.
Joel M. Smith ([email protected]) is vice provost and chief information officer and Jared L. Cohon is president of Carnegie Mellon University.