A number of years ago Thomas H. Crowley, then director of Computer Science Research at Bell Laboratories in New Jersey, offered an after-hours course in computer literacy for adults. Discussing the impetus for choosing computers to solve problems, Crowley identified three main positive reasons why people turn to them for a variety of applications: (1) making a job more economical (savings of time or money) or more convenient; (2) making feasible a job that is impossible to do by any other method; and (3) using the computer to gain insight into a process by modeling or simulating it (Crowley). These three reasons for computing make the machine a powerful tool in all areas of modern life.
One often finds that more than one of Crowley's reasons applies to a particular computer application. The three often overlap in both common and exotic applications, and it is hard to find applications in which at least one does not fit. The computer has become a positive force in most areas of life because of its characteristics: its flexibility, generality, speed, and accuracy. Even though Crowley identified them some years ago, they are still relevant to current computer usage. Let us keep Crowley's perceptive remarks in mind as we focus on four major fields of application today: education, medicine and science, the media, and government. After we look at applications software commonly used in many fields, we shall consider typical applications in each of these areas and highlight some of the frontiers of modern computer usage.
Most users of personal computers today work with several kinds of computer programs that are called collectively applications software. Users ordinarily choose them to increase their productivity in doing day-to-day work; consequently, these applications programs are also called productivity software. Whereas formerly packages carried out single functions, nowadays they are often bundled together in a multifunctional package that can share data among functions. Such software, called an integrated package, usually includes word processing, spreadsheets, and database capacity. Most people adopt applications software for economy and convenience, but both feasibility and insight often have a place in its use.
The five most common types of application are word processing, spreadsheets, database packages, graphics, and communications. After a short description here, each kind of applications package will be treated in detail in a later chapter.
Word processing is using a computer like a typewriter to prepare documents, but it offers far more flexibility than typing. Even basic word processors make revision and editing easier than on a typewriter. And they offer features that were never envisioned for typewriters, such as the ability to select different character fonts, stylistic options, and spell checking. Some modern word processors even have typesetting capabilities that allow the user to do desktop publishing. They are now far more than substitutes for typewriters, offering an ease of use and a range of document preparation features never before available on personal computers.
Spreadsheets are analogous to automated accounting sheets, numerical figures laid out in rows and columns. The spreadsheet's advantage over manual accounting techniques is that it can do calculations automatically and update the ledger immediately. Suppose you are making a budget for the month and do not want to spend more than a quarter of your assets for housing. Your spreadsheet would have a slot for income and another for housing expenses, to be calculated at no more than one fourth of income. If you have put the formulaic relationship between income and housing into the spreadsheet, adjustments to income trigger the software to recalculate new housing figures conveniently and quickly. As you enter different income figures, the computer automatically refigures the housing budget to reflect the quarter-of-income allowance for that purpose. Ease of planning and insight into the budget process contribute to the popularity of spreadsheets.
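The budget example above can be sketched in a few lines of code. This is only an illustration of the recalculation idea, not how a real spreadsheet is built; the function name and the income figures are hypothetical.

```python
# A minimal sketch of spreadsheet-style recalculation: the formulaic
# relationship (housing = one fourth of income) is stored once, and
# every new income entry triggers an automatic refiguring.

def recalculate(income):
    """Apply the budget formula: housing is capped at one fourth of income."""
    return income * 0.25

# Entering different income figures automatically updates the housing
# allowance, just as the chapter describes.
for income in (1200, 1600, 2000):
    print(f"income {income:>5} -> housing budget {recalculate(income):>7.2f}")
```

A real spreadsheet generalizes this to a whole grid of cells, each of which may hold a value or a formula referring to other cells.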
In principle, database and file management packages are analogous to manual filing systems, such as an index card file for a mailing list of 5000 contributors to a charity campaign. The card file is probably organized alphabetically by the last names of the people on the list. Also included on each name card is other associated information, such as the contributor's address, broken down into street, city, state, and ZIP code; the amount of the contribution; and the number of years the person has been giving. If the list of givers is very long, reordering the index card file to pull out all people in a certain ZIP code area who have contributed over $100 in the last three years would be tiresome and troublesome. However, when the information is put into a file management program, a one-file database of contributors, the process of sorting and searching becomes easy, for the application provides these functions as standard features. Most database packages can handle multiple files and interrelate them for powerful control over large amounts of information. Simple searches for ZIP codes, although not impossible with a set of index cards, are convenient and economical with a file management package. In addition, complex searches like the one proposed, a kind of computing for insight, become feasible.
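The complex search proposed above, tedious with index cards, is a one-line query once the records are in a file management program. The sketch below uses a few made-up contributor records to show the idea; the names, ZIP codes, and amounts are hypothetical.

```python
# Hypothetical contributor records mirroring the index-card fields in
# the text: last name, ZIP code, contribution amount, years of giving.
contributors = [
    {"last": "Adams", "zip": "29201", "amount": 250, "years": 5},
    {"last": "Baker", "zip": "29201", "amount": 75,  "years": 2},
    {"last": "Cohen", "zip": "10001", "amount": 300, "years": 4},
]

# The search from the text: everyone in a certain ZIP code area who has
# contributed over $100 and has been giving for at least three years.
matches = [c for c in contributors
           if c["zip"] == "29201" and c["amount"] > 100 and c["years"] >= 3]

# Sorting, the other standard feature mentioned, is equally easy.
for c in sorted(matches, key=lambda c: c["last"]):
    print(c["last"], c["zip"], c["amount"])
```

With 5000 real records the query would look exactly the same, which is the point: the package supplies sorting and searching as standard features.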
Computer graphics packages allow the user to treat the computer screen as a drawing palette and then print the results in black and white or color. They are commonly used either for creation of freeform art and design or for graphing numerical information. More complex graphics programs for computer-aided design (CAD) allow engineers to model building plans and machine tool designs directly on their screens. Today artists are using graphics packages to design logos for companies and lay out advertising copy, often in conjunction with desktop publishing packages with word processing features. Since it is often said that a picture is worth a thousand words, people regularly choose to graph their numerical or financial data to make them easier to understand -- an aspect of computing for insight. Creation of line and bar graphs and pie charts is often a standard graphic option in spreadsheet programs.
The increasing popularity of the Internet in recent years has led some commentators to suggest that the latest "killer app," jargon for an innovative, widely adopted computer application that makes some person or business rich and famous, will be software for data communications, such as Web browsing tools. Yet data communications covers a wide variety of useful applications linking computers and the telephone system. Equipped with a modem, hardware that changes computer signals into telephone signals and vice versa, a home computer user can access information from a variety of public and private sources or send messages throughout the world. Using the phone connection to check airline schedules and reserve tickets at home can save time and money. For a doctor seeking medical advice for an unusual condition, quick access to remote medical databases may make feasible the saving of a patient's life. Electronic mail, the sending of a message to another person's "mailbox" on another computer, and computer conferencing, multiple users commenting by electronic mail on an issue of common interest, offer extended two-way transfers of information and provide insight not readily feasible by other means.
Popular acceptance of these common kinds of applications software has been the major reason why the personal computer has been so successful. The machines that run the packages are reasonably priced; the ease of use and adaptability of their software have spurred microcomputer sales in all fields. Indeed, all areas of the professions, business, and government have incorporated these tools for a multitude of common and unusual tasks. The combination of productivity software, microcomputers, and general users represents a successful instance of the Computer Triangle. The result has been the burgeoning personal computer revolution throughout the world.
Applications software like word processing and the increasing presence of the Internet represent only a sample of common computer uses found at all levels of education from kindergarten to college. Innovative strategies in computer-assisted instruction (CAI) include student multimedia projects, simulations in history and science, and electronic mail between students geographically dispersed throughout the world. Yet questions and debates still persist about the methods and quality of incorporating computers in education.
No one questions that computing in education is a big business. According to the Software Publishers Association, roughly $2.4 billion was spent on educational technology in public schools (kindergarten through grade 12) in 1994, and IBM Academic Consulting estimates more than $6 billion for college and university purchases in the same year. New educational paradigms, illustrated in Table 2-1, are being developed for incorporating computer support in instruction that is two-way, collaborative, and interdisciplinary.
The traditional lecture method of delivery, with students sitting passively in a classroom, is giving way to networked classes and the teacher's becoming more of a mentor to guide student exploration of a topic. Students can communicate with each other through class networks and do team projects in a variety of multimedia (Reinhardt). Proponents of these digital computer technologies in education argue that they make instruction more vivid and better fitted to individual student needs. In addition, administrators and public officials often believe that computers will lessen the spiraling costs of education. Models of distance learning and resource consolidation, with computer resources distributed on the Internet, such as the University of Texas WWW Lecture Hall, offer such hopes.
Yet many faculty members are not convinced. A study at the University of Southern California reported in 1995 that fewer than 5 percent of college faculty members were using computers in instruction to enrich learning. Many teachers, trained in the lecture mode of instruction, report that they do not have the time to invest in learning new technology. They are skeptical of the benefits that the new modes of delivery will make in the education of their students. In many institutions personal commitments of faculty to educational computing do not lead to tenure or promotion. And faculty are even more suspicious of administrators' arguments for cost savings coming from technology (DeSieno).
Joseph Weizenbaum, a computer science pioneer at MIT, has been a longtime critic of hastily adopted educational technology. He warns that people should not expect computers to replace the most valuable relationship in the educational process, the human contact between teacher and student. Weizenbaum thinks that going to the library and reading a novel about the Great Depression is better than gleaning loads of facts and statistics about it from a database. A multimedia database on the Depression contains an anonymous someone's selective gathering of information about a complex period, while a novel like James Farrell's Studs Lonigan provides a capsule reading of real life. Teachers should try to foster a student's natural talent for imagination and hold on to it as long as possible. Weizenbaum certainly agrees with much educational research that computers should be used carefully in conjunction with well-planned classroom teaching (Brady).
Few would argue that computers can replace a good teacher. But they can provide supplemental dimensions and demonstrations that are hard to achieve by other methods. Let us examine how teachers are developing the new instructional paradigms suggested in Table 2-1 in multimedia, simulations, networking, and distance learning on the Internet. In the successful cases, the three elements of the Computer Triangle -- hardware, software, and people -- are working together for better education.
Much use of computer-assisted instruction in schools involves drill-and-practice software: programs designed to teach a particular kind of knowledge and to drill students for mastery. Some of this software is especially appealing to young learners, because it incorporates color graphics and sound feedback. Programs to teach the phonics approach to reading fit into this category, along with foreign language and mathematics drills and supplementary lessons for remedial students.
As students get older, drill-and-practice software tends to become boring and lose its appeal, unless it is cleverly designed to hold their interest. A multimedia program in American history called Point of View (Scholastic Software) uses a mixture of timelines on politics and popular culture, census statistics, and recorded sound to overcome the element of boredom. It takes students' interest in this important subject and their probable lack of knowledge about parts of it as a point of departure for their investigations. Students can follow their individual preferences in working through the program's materials.
For instance, students can choose to display several different historical timelines at the same time. As they view the dates of important rock music events in the pop culture of the 1960s, they can also display the political events of the Kennedy-Johnson era, including the civil rights struggle and the Vietnam War. They can play recorded excerpts of important speeches, such as Kennedy's inaugural ("Ask not what your country can do for you") or Martin Luther King's "I have a dream" speech, from the March on Washington. The possibilities for graphing census statistics are extremely varied. Students can choose to graph the numbers of white and black residents in several states. As they move through several decades on the timeline, the graphs are automatically updated and displayed on the screen. Students can save materials from the package to use in developing their own research papers with word processing programs. The package intentionally embodies a flexible, interactive learning environment to encourage students to develop their own research strategies and to learn more about American history than their textbooks contain.
Creative teachers at all levels of education have always found ways to incorporate innovative teaching aids and strategies in their classes. Computers add new capabilities not available by other means. One positive example is the "computer-infused classroom" in a middle school in Shoreham, New York. There a creative teacher, Robert Vlahakis, includes three computing components in an American history course: simulation, telecommunication, and standard software packages (Vlahakis).
Using data communications software, Vlahakis's students correspond with peers in other schools by electronic mail and extract stories from a public on-line news service about current events related to class materials. They are learning how to gather and use information, a skill that will be more important than memorization of facts in the next century, when massive databases on all subjects will be accessible from computer keyboards. With their word processors and desktop publishing software, students work in teams to prepare their own newspapers of important historical periods. They gather textual and pictorial materials, type or scan them into the computer, and compose newspapers such as the Jamestown Memo or the Plymouth Bay Sun for the Revolutionary period. For them the computer has made American history come alive.
Simulation software also provides these students with insight into the historical process which they experience vicariously. Having to make decisions based on the available evidence shows students that the answers to real world problems are usually open ended. It forces them to use higher-level thinking skills, like logic and deduction, to solve a problem. Shoreham students use a simulation package called The Other Side (Tom Snyder Productions) to negotiate a peace between two nations, and they rate the experience a highlight of the course.
To Vlahakis, "Computers have added another dimension to the teaching style that I have always employed -- an experiential, problem-solving approach that encourages students to question, discuss, and analyze history. The computer has made it more efficient for me to focus on the individual needs of my students in a more diversified learning environment." All three of the positive reasons for computing are involved in this class: efficiency, insight, and the feasibility of trying individual teaching strategies not available in other ways.
With the availability of easy-to-use authoring software, some students have begun presenting their research investigations as multimedia productions. An eighth grader in Montreal used a program called HyperStudio (Roger Wagner Publishing) in the fall of 1995 to present the issues of the controversial and divisive referendum on secession of the province of Quebec. Included in this impressive presentation were the text arguments of both sides, the Oui and Non parties, as well as video clips of newsfilm showing rallies on both sides. The student also used a camcorder to present his own views on film. Using the capability of the software to access the World Wide Web, the production included a number of links to national government opinion and the editorial stands of important Canadian newspapers in Montreal and Toronto. The viewer of the lesson was able to leave the program to see the viewpoints of major opinion organs on this important vote and return to the student's presentation with only a few clicks of the mouse. This middle school student was too young to vote in the election, but through his exploration and multimedia production he was probably as qualified as many adults to make an informed choice.
Similar multimedia term papers were created in a freshman literature course on the novels of Raymond Chandler, coordinated by Professor Matthew Bruccoli at the University of South Carolina. Under Bruccoli's guidance, the students chose a chapter of one of Chandler's California detective novels to annotate for the modern college reader. According to Bruccoli, "these were a bunch of students born . . . thirty years after these novels were published. The cars meant nothing to them, the brand names that were . . . assumed to be recognizable to the reader of the 1940s and 1950s. Packard -- the kids had never seen a Packard -- and Los Angeles, Hollywood of the early '40s and '50s, are about as remote from them as the planet Mars. If . . . the kind of literature known as social history . . . is going to survive, ways must be found to make the material accessible." To prepare their annotations, the students did library research on the reality of Chandler's California -- historical personages, events, maps, fauna and flora, etc., and used a program called MediaLink (InterEd) to add hypertext links to their chapters. One student annotated a footnote on the modern German composer Paul Hindemith with a biographical note and a segment from his symphony in E flat. Another focused on the area of Los Angeles called Central Avenue and discovered that this was the neighborhood that eventually fed the Watts riots of 1965. Of her multimedia presentation, this student said, "I liked it better than doing a normal term paper because you learn so much more. . . . When people come to view your work, they're actually seeing examples of what's going on." (Van Gelder)
Such projects exhibit several of the new educational methods described previously, including individual exploration and diversity in presentation of materials in various media. The teacher had the students investigate unfamiliar topics under his guidance, and the computer allowed them to explain those topics to others, in color with sound and images attached to their traditional footnotes. The ease of use of the authoring software permitted the students to create multimedia term papers on their own. The computer gave them the ability to present various media resources that otherwise would not have been possible, and both the students and their audience gained insight into the richness of literature through the hypertext assignment.
Teachers report that simulations are especially useful in teaching mathematics and science. Mathematics is hard to teach partially because many complex functions are difficult for students to visualize. The highly respected mathematical package called Mathematica (Wolfram Research) overcomes this problem (Grady). It can carry out almost any kind of mathematical operation -- numerical, symbolic, or graphical. For instance, it will take an expression like x^2 - 1, factor it into (x - 1)(x + 1), and graph it. It can do differentiation and integration in calculus or graph functions in two or three dimensions. Formerly such capabilities were available only on large systems and were clumsy to use; this program makes them easy to use on large personal systems used in many schools and colleges, like 486s and Pentiums and larger Macintoshes. Math educators are beginning to rethink their assignments to catch up with Mathematica's potential to enliven the teaching of mathematics, especially for college students.
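The factoring example above can be verified even without symbolic software. Mathematica performs the factorization symbolically; the sketch below (in Python, not Mathematica) merely checks numerically that the factored form agrees with the original expression at many sample points.

```python
# Numerical spot-check of the identity x^2 - 1 == (x - 1)(x + 1).
# A symbolic system like Mathematica proves this algebraically; here
# we only confirm agreement at a range of integer sample points.

def original(x):
    return x * x - 1

def factored(x):
    return (x - 1) * (x + 1)

for x in range(-10, 11):
    assert original(x) == factored(x)

print("x^2 - 1 equals (x - 1)(x + 1) at all sampled points")
```

Since both sides are polynomials of degree two, agreement at more than three points in fact guarantees the identity holds everywhere, a small illustration of the mathematical reasoning such packages automate.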
In the experimental sciences, educational simulations often meet several needs. Perhaps the subject is hard to understand without experiment, or unsafe for students to experiment with in real life, such as infectious disease. The high cost of equipment required for an experiment may preclude having it available for student use. Sometimes a simulation cuts out a long delay in waiting for a reaction, or shows something impossible to observe, such as what happens at the atomic level (Trollip). Simulation packages in mechanics, electromagnetism, and modern physics, produced at Stanford by Blas Cabrera, present animated results on the computer screen so that students can visualize not only the processes in progress but also the final results. Students can try 16 experiments on their own, several times if necessary, in order to understand the complexities of topics like planetary motion or rocket ballistics. They learn at their own pace and can grasp the interrelationships of the physical variables at work.
Accurate information and speedy access to it are probably more important in medicine than in most fields, given the life-and-death situations that doctors and nurses face every day. Medical databases of current research findings and treatments, well indexed by categories, were one of the first major uses of on-line databases. In Computers and the Information Society, James Radlow tells the story of a California doctor in the 1980s who was able to save the life of a patient with uncontrolled bleeding by tapping into MEDLINE, a well-known medical database, from his home in the middle of the night. Among summaries of 19 articles on this type of bleeding, he found an English translation of a German case which seemed to match his patient's symptoms and described a new blood replacement therapy. The doctor ordered the treatment to begin immediately and found the patient much better in the morning. Access to knowledge through data communications and up-to-date on-line medical databases saved the patient's life.
Today "telemedicine networks" allow specialists in large medical centers to do diagnosis at long distance through computerized two-way videoconferencing, using high-speed data communications. Small towns have access to the knowledge of big-city physicians on-line, whereas they could never afford to have such doctors in their communities. For instance, in May 1994, a woman in rural Georgia had a rotting skin lesion on her leg that was not responding to treatment by local doctors. Rather than sending her to Atlanta or Augusta, the hospital beamed up a specialist at the Medical College of Georgia in Augusta, whose examination of the lesion via videoconferencing identified the problem as a staph infection. He prescribed a long dose of antibiotics, and the woman did not have to lose her leg. Although the 1980s doctor in California found the answer to his problem with a computer database, in this case the medical school dermatologist was able to see the wound and make a quick diagnosis in real time. The computer brought the contact between patient and doctor at a distance up close and personal (Cowley et al.).
Patients themselves are getting in the act of understanding their symptoms and drugs with so-called "doc-in-a-box" software kits and on-line support groups and forums. A woman in Wisconsin having complications from breast cancer surgery joined the growing CHESS (the Comprehensive Health Enhancement Support System) computer network at the suggestion of her hospital and found lots of advice from fellow patients with similar problems regarding her illness and medicines. CompuServe, an on-line computer information service, has a forum where patients can log on and ask questions, which are often answered by doctors. "Doc-in-a-box" software CD-ROMs, expected to be a $500 million industry by 1996, package information in multimedia that can be accessed after a patient types in a set of symptoms. The program then responds with a set of possible diagnoses and an encyclopedia article on each condition. Also included is information about thousands of drugs and medical procedures. As a cost-saving measure, some HMO plans are training nurses to use software of this kind to make preliminary diagnoses over a toll-free phone line. With computer assistance patients get advice in an impersonal and anonymous way, and studies indicate that those who have logged on tend to like it. Yet software creators and forum organizers emphasize that these approaches are meant to supplement regular medical care, not replace it (Cowley).
Keeping accurate records on hospital patients is essential; but for nurses, the load of paperwork can be overwhelming. Many hospitals have installed a portable computer in patient rooms for encoding data like temperature and pulse rate as they are gathered. Using a barcode scanner, nurses can read drug and food requirements encoded on patients' armtags. Because the workstations are networked throughout the hospital, medical departments like radiology can file test results electronically as they are completed, and they are instantly available in any room. Treatment documentation and accounting records are also on-line. Errors due to illegible handwriting or verbal misunderstandings can be avoided, since the patient's master record is stored in typed form in the hospital's main computer. Not only does the networked hospital have more accurate records quickly accessible, but recordkeeping is more economical as well.
Hospitals routinely employ computer monitoring of critically ill patients when nurses are otherwise engaged. In intensive care units, for example, patients are wired with electronic sensors that monitor their vital signs, such as heartbeat and temperature. This information is collected continuously and automatically transmitted to the nurses' station, where it is displayed on a computer screen. If some vital sign falls outside the normal range, the machine beeps a warning. For example, if a heart patient's pulse falls suddenly, the machine alerts the nurse on duty, who can then respond immediately. In applications like this, the computer allows an instant diagnosis not available by other methods and can truly be a lifesaver.
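The monitoring logic described above is, at its core, a simple range check repeated continuously. The sketch below is a toy version of that idea; the normal ranges and readings are illustrative values chosen for the example, not clinical standards.

```python
# Toy version of intensive-care monitoring: each incoming reading is
# checked against a normal range, and a warning is produced when a
# vital sign falls outside it. Ranges here are illustrative only.

NORMAL_RANGES = {
    "pulse": (60, 100),          # beats per minute
    "temperature": (36.1, 37.8), # degrees Celsius
}

def check(sign, value):
    """Return a warning string if the reading is out of range, else None."""
    low, high = NORMAL_RANGES[sign]
    if not (low <= value <= high):
        return f"ALERT: {sign} = {value} outside normal range {low}-{high}"
    return None

# A stream of readings, as if transmitted from bedside sensors.
readings = [("pulse", 72), ("pulse", 45), ("temperature", 38.5)]
for sign, value in readings:
    warning = check(sign, value)
    if warning:
        print(warning)  # in a real unit, this beeps the nurses' station
```

Real monitoring systems add trend analysis and alarm prioritization on top of this basic comparison, but the instant-diagnosis benefit the text describes comes from exactly this kind of continuous automated check.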
Medical technology based in computers has added new diagnosis tools unimaginable only a few years ago. The most well-known one is probably computerized axial tomography, better known as the CAT or CT scanner. Today hospitals regularly use such devices to find brain tumors and other cancers and later to check for recurrence after surgery or chemotherapy. The equipment blends x-ray with computer graphic imagery techniques. The machinery takes a series of x-ray photo cuts through the bodily area under study, at slightly different angles. The computer then merges these separate images and displays them graphically on a color screen, showing different kinds of body tissue in different colors. Magnetic resonance imaging (MRI scanning) employs a similar technology using the interaction of radio waves with hydrogen nuclei in water and fatty tissue in the body to get safe and reliable computer images of the internal organs (Redington and Berninger). With both machines, doctors have a picture of the body through computer imaging that was not possible with earlier diagnostic tools.
Computers are being used creatively to assist handicapped people through electronic prosthetics, or the replacement of missing body parts with artificial limbs or organs. Programming a computer to control artificial limbs falls into the category of cybernetics. The early computer pioneer and mathematician Norbert Wiener, of MIT, coined the term from the Greek word kybernetes, meaning "pilot" or "helmsman," to refer to the study of control mechanisms in both animals and machines. Suppose a person has been in a car accident and lost the lower part of her left leg; yet the nerve endings are still active. Medical researchers can produce an artificial leg with computer chips that sense nerve signals to move or bend the leg. When the woman wants to walk, her brain signals the sensor on the prosthetic leg; it receives the message and moves the appropriate parts of the leg and foot. Research in cybernetics is producing wonderful progress in prosthetics, helping handicapped persons and accident victims to regain bodily functions like walking and talking.
Ordinary personal computers are already working wonders for handicapped people. Voice synthesizers, chin switches, head pointers, and braille printers have opened up communications for handicapped students not even imagined a few years ago. The mouse, which users roll around on the table to select items on the computer screen, has become an input device for people who cannot use a pencil or a keyboard. A Swedish computer science student paralyzed from the shoulders down uses a head pointer, which monitors his head movements, to choose options on Macintosh menus. He then activates the option he wants with a puff switch, into which he blows a puff of breath. On a special machine that emulates a keyboard, controllable from his head pointer, he picks out a character and types it with the puff switch. The student has already developed a typing speed of 100 keystrokes per minute.
Hypermedia tools that open windows and present text, graphics, and sound with the click of a button are now accessible to people who do the clicking with a chin switch and a nod of their head. Speech recognition packages allow blind students to control their machines with their own voices. Many of these hardware options can be added to standard machines without enormous cost. Liz Vantrease, an active composer of opera and other music who has slowly debilitating Lou Gehrig's disease, has called the computer "an empowering device" for the disabled: "It can give you back some of the things that were so unfairly stolen from you by illness, accident, or birth defect" (Meng). The computer extends a handicapped person's sensory and motor abilities and thus makes life more truly manageable.
In the traditional sciences we find a variety of applications that depend first on the computer as a very fast and accurate calculating engine and then as a powerful graphic imaging system. These fields apply principles of mathematics to real-world applications, often using differential equations to model natural processes. Today supercomputers and high-powered graphical workstations are the staple tools of the research scientist. Supercomputers, the most expensive computers in the world, come in different configurations, but they share one common trait: the ability to do repetitive mathematical calculations at incredible speeds. Their speed and power are usually measured in gigaflops (GFLOPs), billions of floating point (mathematical) operations per second. Newer supercomputers have now reached peak speeds of a trillion floating point operations per second, called a teraflop (TFLOP), a thousand-fold improvement over a gigaflop machine.
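The gigaflop-to-teraflop comparison is easy to make concrete with back-of-the-envelope arithmetic. The workload figure below is hypothetical, chosen only to make the thousand-fold speedup visible.

```python
# Back-of-the-envelope FLOPS arithmetic: the same workload on a
# gigaflop machine versus a teraflop machine. The workload size is
# a made-up illustrative figure, not a real benchmark.

GFLOP = 1e9    # floating point operations per second
TFLOP = 1e12   # a thousand times faster

operations = 3.6e15  # hypothetical simulation workload (operation count)

hours_giga = operations / GFLOP / 3600  # seconds -> hours
hours_tera = operations / TFLOP / 3600

print(f"At 1 GFLOP: {hours_giga:.0f} hours")
print(f"At 1 TFLOP: {hours_tera * 60:.0f} minutes")
```

A computation that would tie up a gigaflop machine for over a month of continuous hours finishes in about an hour at teraflop speed, which is why forecast horizons have stretched as supercomputers have improved.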
Weather forecasting is an everyday instance of supercomputing. Accurate weather prediction depends on the collection of temperature, wind velocity, and air pressure readings from many stations. For decades scientists have known that solving a series of complex mathematical equations containing data from thousands of weather stations will give an accurate forecast. The problem has been that the number of calculations required to solve the equations is so large that, by the time they are solved by ordinary methods, the weather has already changed. The early days of computing made the first practical use of the formulas possible: accurate one-day forecasts were calculated with the computational power of the mainframe computers of that era. Nowadays we expect accuracy about five days in advance, because supercomputers have increased calculating power so dramatically. In an earlier time natural disasters like Hurricane Hugo (1989) would probably have been more damaging, because forecasters would have been unable to make accurate predictions well in advance of the storm.
Current weather research at the National Center for Atmospheric Research in Colorado combines the resources of a Cray supercomputer, for computation, with full-color graphical workstations, to model the behavior of thunderstorms. Researchers at the Ohio Supercomputer Center are combining weather data with water level and flow data to model the hydrodynamic conditions in Lake Erie. A few years ago this polluted, almost dead body of water was little understood. Now supercomputer modeling allows forecasts of water conditions about every six hours -- roughly the same frequency as weather forecasts ("Visualization News").
What is evolving is a new field called computational science. It is based on scientific visualization, which Donna Cox of the National Center for Supercomputing Applications has called "the process of representing, as computer graphic images, the results of simulation computations initially expressed in numbers." First the numbers are generated through supercomputing, and detailed graphic images are created. Then graphic workstations present the results for close analysis. Many of the natural phenomena displayed cannot be well understood without a visual image. As Cox has asked, how can one understand the billions of numbers that come from a supercomputer? Visualization helps to solve the problem of "numerical overload" (Cox).
In 1987, the National Science Foundation issued a report on the importance of visualization in scientific computing. In 1989 the Federal High Performance Computing Program listed 21 "Grand Challenges for which solution is likely to be possible using systems developed under this initiative." It calls for massive research funding in such areas as human genetics, the molecular basis for disease, prediction of biochemical effects in the design of new drugs, simulation of ocean currents, aerodynamic transportation design, and the study of ozone depletion in the upper atmosphere. It envisions a close working partnership between computer scientists and their colleagues in all fields of the physical, biological, and earth sciences.
In computational science, then, the scientist can now gain insight into real-world processes not possible with earlier technologies. In this new endeavor, computer science is viewed as more than a service field that produces powerful pictures. It is central to the initial mathematical modeling of real-world phenomena and to the visual understanding of the numerical outcomes. Suppose a scientist wants to display on a computer screen a moving color image made up of more than 1 million spots of color, called pixels, that change shade at least 15 times a second. The computational power required to calculate 15 million changes and send them to the screen in one second is a job for a supercomputer or a high-end workstation. Without such capability the researcher cannot see what is happening in real time, as the event is happening. With the computer imaging, the response seems instantaneous to the user, because human speeds of perception are much slower than computer calculations.
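The arithmetic behind that example takes only a few lines to check. The exact screen dimensions below are assumed for illustration, since the text specifies only "more than 1 million" pixels:

```python
# A hypothetical screen of 1024 x 1024 pixels (assumed dimensions;
# the text says only "more than 1 million spots of color").
width, height = 1024, 1024
pixels = width * height             # 1,048,576 pixels -- just over a million

frames_per_second = 15              # each pixel changes shade 15 times a second
updates_per_second = pixels * frames_per_second

print(updates_per_second)           # 15728640 -- roughly 15 million changes
```

Every one of those 15 million updates must be calculated and delivered to the display within a single second, which is the workload that demands a supercomputer or high-end workstation.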
This sort of scientific computing is very expensive. In fact, the National Science Foundation program of funding five National Centers for Supercomputer Applications (at Carnegie Mellon, Princeton, Cornell, University of California at San Diego, and University of Illinois at Champaign-Urbana) had a budget of about $100 million a year through 1995 (La Breque). Supercomputing centers tend to be joint projects of universities and research institutes, which share the computational resources and costs. Thus the first of our reasons for computing, economics, joins the other two, feasibility and insight, in the explosion of scientific research through supercomputers and visualization.
Television viewers of the mid-1980s remember the clever series of Coca-Cola ads featuring the talking head Max Headroom, a personality who existed only as a disembodied image on a TV screen. Max became so popular that an ABC series was built around his adventures "twenty minutes in the future" (Fisher). The show featured hip humor, quick cinematic cutting and music, and, for the first time on network television, integrated computer graphics. Computer-generated images moved behind Max, who appeared himself to be a computer image, but was in actuality an actor made up to look like a digital creation. The result was a series about a telegenic society of constant, multiple intrusions of media into daily life. Max Headroom was not the first use of computer video in movies, but it did break ground in network television. Today even local TV stations employ easy-to-use video graphics technology for special effects in production and advertising.
Computers are now an integral part of journalism and the print media. In fact, desktop publishing has changed forever the production of newspapers and magazines. "Hot type" newspaper pages, produced in the past with Linotype machines and cases of leaded type, have been replaced with cold type, the production of typefonts and pages on computer screens. Printed by laser and xerographic processes, cold type is not limited to black and white, but appears increasingly in color. The costs are lower, the range of printing possibilities is far greater, the time lag is shorter from layout to printing, and the whole enterprise can be controlled in an editorial office.
Newspaper reporters regularly carry portable laptop computers on assignment. Thousands of reporters write their stories in the field and send them back to the home office via the built-in modem and communications software in the computer. In the newsroom, staff reporters use networked terminals to check facts and produce their stories. The managing editor can call up their stories on his terminal and edit them for final printing.
In the layout department, workers use desktop publishing software to set up the pages of the paper, including the mix of stories, graphics, and digitized photographic images that make up a newspaper page. By 1990 the Dallas Times Herald included between 20 and 30 illustrations a day created on a Macintosh, along with advertising photographs that had been computer-enhanced with graphics packages. The director of computer imaging reported a doubling of computer-produced illustrations in one year, at no increase in labor or production costs (Matazzoni).
The San Francisco Examiner went high-tech when Soviet President Gorbachev visited the Bay Area in June 1990 to meet the very short deadlines of its midday and afternoon editions. Technicians videotaped the television coverage of Gorbachev's arrival at the Soviet consulate; transferred a single frame of the tape to the newspaper's "electronic darkroom" software, run on a Macintosh; enhanced the image to print quality; and in a few minutes had the color photo on press. For later editions a reporter snapped pictures with a new Sony digital camera, which records photos on floppy diskettes rather than print film. These were easily processed in the electronic darkroom, and the final afternoon editions carried photos of Gorbachev taken that morning, before he left the area. Both technologies achieved record speed for color photography in the production process (San Francisco Examiner).
Surely one of the most well-known high-tech success stories for computers in the newspaper business is USA Today, the flashy national paper delivered daily across the country. Since it began in 1982, USA Today has been produced with computer graphics and four-color printing. Like the San Francisco Examiner, the paper began capturing photo images from television in 1990, using software called Adobe Photoshop, developed by the company that led the rush to laser printing with its PostScript software (Adobe Systems). Before printing, these digital TV photos can be edited and enhanced.
More amazing is the paper's nationwide printing process, based on satellite distribution of digitized pages. Copy for the paper is sent out every night from Washington headquarters to printing locations all over the nation. First the print pages are digitized with computer hardware; then the digitized images are sent out to the printing sites via satellite. A black-and-white page takes 3 minutes to transmit; color pages, richer in digitized information, take about 6 minutes. The printers receive the images from the satellite dishes and recreate and print the pages between midnight and 2 a.m., and the paper is available all over the country that morning. Here is an application not feasible without computer technology for digitization of images and data communication by satellites. The result is the first truly nationwide daily English language newspaper (Korzenlowski).
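The only hard figures the text gives are the transmission times; the digitized page size below, and therefore the implied satellite channel rate, are hypothetical assumptions used only to show how times, sizes, and channel rate relate:

```python
# Hypothetical figures: the text gives only the transmission times
# (3 minutes for black-and-white, 6 for color); the page size and the
# channel rate derived from it are assumptions for illustration.
page_megabytes = 10                       # assumed size of one digitized B&W page
seconds_bw, seconds_color = 3 * 60, 6 * 60

megabits = page_megabytes * 8             # 80 megabits per page
rate_mbps = megabits / seconds_bw         # implied satellite channel rate

print(round(rate_mbps, 2))                # 0.44 megabits per second

# A color page, "richer in digitized information," takes twice as long,
# which is consistent with roughly twice the data over the same channel.
color_megabytes = rate_mbps * seconds_color / 8
print(round(color_megabytes, 1))          # 20.0 -- about twice the B&W page
```

Whatever the actual page sizes were, the doubled transmission time for color pages reflects the extra digitized information they carry, not a slower channel.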
In a society as large as the United States, computers have been used since their earliest days to keep track of data of all kinds. Herman Hollerith, working for the Census Bureau, invented punched-card tabulating machinery to carry out the 1890 census; UNIVAC I, the first commercial computer, helped with the 1950 count. Based on 1980 census figures, Kenneth C. Laudon described the massive numbers of people about whom records are now being kept by government and quasi-governmental agencies:
At last count there were 50 million Social Security beneficiaries, 95 million individual and 75 million business taxpayers, 21.2 million recipients of food stamps, 10.6 million recipients of Aid to Families with Dependent Children (AFDC), 24 million criminals and 60 million civilians with fingerprints at the FBI, 3.9 million elderly receiving Supplemental Security Income (SSI), 21.4 million recipients of Medicaid, 61.8 million people covered by private health plans, more than 500,000 doctors and dentists who generated 1.1 million office visits, 49.8 million public school students, 9.5 million arrests, 294,000 people in jail, 5.8 million defense industry workers, 2 million members of the armed forces, 36 million living veterans of all wars, 51 million credit card holders, 62 million credit records held in private credit data systems, 154 million registered motor vehicles, and 140 million licensed drivers.
In 1987 a government report listed 26,682 big mainframe computers with a total of 173,069 terminals, as well as a growing number of personal computers (99,087) in federal service (Wilk). Computer spending and usage have continued to spiral upward in the intervening years, as illustrated by the graph in Figure 2-21. Whereas a little more than $15 billion went into federal information technology in fiscal year 1987-88, the latest figures for 1994-95 show an investment of about $25 billion a year. Certainly the number of microcomputers the federal government owns continues to grow, but we have no current figures for state and local jurisdictions.
Governments at all levels use computers to function more efficiently, effectively, and even more democratically. Computers seek out waste and fraud in government programs. They help to streamline operations by merging information collected from different agencies, for better planning, allocation of resources, and distribution of services. They work to collect taxes fairly and to discover cheaters, as well as to distribute monies to the less privileged members of the society. They use information collected about the economy and the society to make plans for future growth and development. They monitor criminal activity and seek out lawbreakers.
In addition, the federal government uses computers for our collective defense. Federal agencies reporting the most computers in 1985 were the Defense Department (66 percent of the mainframes and 39 percent of the micros in the federal government), the Department of Energy, including nuclear regulation (10 percent of the mainframes), the Justice Department, including the FBI (10 percent of the micros), and the Treasury Department, including the Internal Revenue Service (4 percent of the mainframes and 6 percent of the micros).
Studies of the effects of increasing computer usage in government have focused discussion among constitutional scholars on the shifting divisions of power among the three branches of the federal government, and on the power relations between the federal government and the states. One question concerns the computer's role in upsetting the historical checks and balances embodied in the Constitution among governmental bodies. The agencies that possess most of the federal government's computers are departments of the executive branch, controlled by Cabinet members. With this massive control over computer information resources, the president and the executive branch seem to have an unfair advantage in the distribution of power.
Congress, since the early 1970s, has added its own considerable computer resources -- for example, for modeling the federal budget process in the Congressional Budget Office. It regularly requires the executive branch to share its computer-produced materials for consideration in the legislative process. Nevertheless, after passage of the Gramm-Rudman-Hollings law requiring automatic deficit reduction, Congressman Michael Synar of Oklahoma brought suit questioning its constitutionality. He felt that Congress had unwisely given over its constitutional responsibility for budgeting not to the President but to a computer (Kraemer and King).
Since the election of 1994, the continuing battle between the Republican Congress and the Democratic White House has reached onto the Internet. Before the election, the White House went onto the World Wide Web with one of the most popular home pages: http://www.whitehouse.gov. Visitors can view not only photos of the mansion but also position papers from the Clinton administration. Similarly most executive department agencies have added Web sites with the strong encouragement of the Clinton-Gore administration. In early 1995, Speaker of the House Newt Gingrich unveiled his rival service for the Congress called Thomas (named for Thomas Jefferson): http://thomas.loc.gov. This site offers the full text of pending legislation and the Congressional Record. Both groups believe that getting the word (and in some cases, the picture) out about the government can only improve communication with the voting public. Taxpayers paid for the development of the Internet over the last 25 years, and they can now log on and actively watch the struggle between the two most visible branches of government (Weingarten).
The judiciary, the least computerized of the three branches, can also require information from the executive departments, and has the power to rule ineligible whatever evidence it chooses. In other words, the judiciary can decide in particular cases whether information is or is not relevant and thus may stymie the executive branch's arguments by ruling its massive data files inadmissible to the case at hand. Computers have only intensified the continual tug of war for power between the branches of the federal government.
In relations between the federal government and the states and cities, the evidence is more mixed. Especially in social welfare and law enforcement, the federal government has more presence on the local level now than ever before. With the provision of welfare funds from states and national agencies to cities and counties come mandates about how the programs must be administered and monitored. This degree of governmental control is made possible with computers; whether it is desirable is a controversial political issue.
Federal incentives for law enforcement funds can require standardization and sharing of information at all levels through such agencies as the National Crime Information Center. NCIC was formed in 1967 to collect and distribute criminal information: files on stolen property and firearms, lists of wanted criminals and missing persons, and criminal histories of people arrested for or convicted of serious crimes. Gradually state and local police departments have merged their records into NCIC and coordinated their computer hardware to allow almost instant access to these massive criminal files. David Nemecek, director of NCIC, reported in August 1990 that the agency got more than 1 million queries a day for criminal information from more than 64 thousand law enforcement agencies (Boyd). Without such centralization and standardization of criminal records and the ability to access them through communications links, there could be no effective nationwide coordination of law enforcement. On the other hand, critics allege that NCIC is vulnerable to abuse. Computer Professionals for Social Responsibility has, in fact, persuaded the FBI to spell out policies prohibiting the use of NCIC to track "suspicious" individuals.
In recent years, using digitized imaging, computer programs have been developed to compare fingerprints automatically, replacing the time-consuming process of checking fingerprint cards manually. In one case reported in the Christian Science Monitor (9 June 1988), a burglar who murdered a woman in San Francisco in 1978 and left behind his fingerprints was caught when the police department got the software eight years later. A police inspector was able to find the man's prints in less than 4 minutes -- a job he had been unable to accomplish in almost a decade of checking records visually. The murderer was picked up, admitted the crime, and is now in jail.
In 1988 these digitized fingerprint records had not yet been incorporated into NCIC, for two reasons. Several competing encoding standards from different software producers were in use, and the FBI did not have the funds to encode its more than 20 million criminal fingerprints -- that is, more than 200 million individual prints, 10 per person. While the FBI worked to complete this job by 1993, states had been buying the same software and sharing data among themselves. With their own computers and budgets, local jurisdictions have kept control over their own criminal data, and with microcomputers approaching the power of mainframe machines this trend will grow.
Certainly the most massive use and expenditure on computers in the U.S. government has been in defense of the nation. The famous and controversial Star Wars weapons program of the 1980s, properly called the Strategic Defense Initiative (SDI), was based on the use of computers to monitor attacking missiles and automatically deploy defensive missiles to destroy them. The program has been vilified by some computer professionals as too expensive and unworkable, and credited by supporters with helping to end the Cold War (because the Soviet Union had neither the technology nor the finances to mount a credible alternative). Some critics alleged that SDI was far more than a defensive system and could be used for a preemptive first-strike nuclear attack.
Star Wars aside, modern warfare was changed forever by the so-called smart weapons that aided soldiers, sailors, and pilots in the 1991 Persian Gulf War, which has been called "the first information war" by author Alan Campen. High-tech successes included Tomahawk cruise missiles, which sought out and destroyed their targets with pinpoint accuracy, and Patriot missiles, which used computer-generated trajectories to intercept incoming Iraqi Scuds. In a continuing research project called the supercockpit, pilots wear "virtual reality helmets." As in a video game, computers in the helmets show pilots the terrain ahead of them, the enemy weapons approaching, and the evasive actions they can take. Research includes voice-control systems that allow planes to be flown by speech commands. If the pilot is unable to fly the plane, the computer, dubbed the "pilot's associate," takes over (Thomas and Barry).
War is becoming depersonalized; the enemy is not seen by the soldier or the pilot, but found and destroyed electronically. Even in the ranks of defense analysts, critics contend that General George Patton was right when he said, "Wars may be fought with weapons, but they are won by men. It is the spirit of men who follow and of the man who leads that gains the victory." A lot of faith and federal research money have been invested in the Strategic Computing Program of the Defense Department and in the workability of automated weapons. Whether, in these times of federal deficits and the end of the Cold War, defense budgets can keep pace with 1980s levels and sustain the research required by these new computing systems remains an important question for national debate in Congress and among the American people.
For individual computer users, as well as in education, the medical and physical sciences, the media, and at all levels of government, the computer is a positive technology that offers economy, insight, and applications that would be impossible to do otherwise. It enables its users to be more productive and to do enormous good. It gives reality to the old slogan Information Is Power for individuals and organizations alike. For some handicapped people, the computer is literally empowering, and for government it makes possible the exercise of power for social good in a complex, modern mass society. Yet like any technology, computers bring inevitable costs, not all of them monetary. We must turn now to the other side of the equation, the societal consequences of the computer age.
Like the American literature course at the University of Texas described in Chapter 1, Major Curtis Carver's computer science course at West Point is available at all hours of the day and night on the military academy's World Wide Web site. At West Point all cadets get a computer when they arrive, and the faculty can place assignments on the Web. Major Carver's "textbook of the future" contains more than a gigabyte of course-related resources on the Web: 200 audio files; 200 graphic files; 37 digital movies; multimedia slide shows for every lesson; student papers from earlier semesters; and definitions, search terms, and practice quizzes for each exam. Students may send queries to the teacher and take exams on-line, and the computer will do automatic scoring of objective questions. They type answers to essay questions, which are then graded later as electronic mail.
One pioneering feature of this WWW course is the learning styles quiz at the start of the course. Students answer a series of questions on the computer designed to determine what kind of learners they are -- verbal, visual, aural, etc. The questionnaire is scored automatically by the machine for each student, and lesson files are then presented to them individually in the most advantageous way for their understanding. For example, verbal students will get the text documents first, whereas visual learners will be offered graphics and videos first. Although all of the material is always available, the computer presents it for each cadet in a personalized way to make it easy to comprehend. Students have control from their dorm rooms over how and when they will study the lessons outside of class (Carver).
Courses like this one suggest possibilities for the growth of distance learning and collaborative efforts among colleges and universities. The computer is the middleman dispensing multimedia and accepting responses and queries through the medium of the World Wide Web. Although the course was put together for local students at West Point, anyone with the Web address and proper access codes can view the lessons. Compared with some other sites offering a variety of materials, which may be protected by copyright law from use by others, this West Point course was developed at a public institution; and its resources are accessible in the common domain.
Other common distance learning modes include distribution of course materials on videotape or via live satellite television. Combining a computer with these methods provides for easier remote feedback between students and their teacher. For instance, MediaLink for Macintosh includes several interactive network features suitable for use on the Internet for distance learning. Students equipped with an audiovisual Mac can watch a television course live or on videotape and still have all of the multimedia capacity in lesson materials. If they log onto the Internet, they can correspond with their teacher about assignments. With their addresses, the teacher can transmit media resources, including sounds and graphic files, directly to students. Such items as text assignments, short quizzes to fill out, or graphics for further information can be sent to all the students through their Internet addresses. Two participants can collaborate on a document in an on-line conference in real time, such as working on a class report at long distance. The term "real time" refers to the ability to respond to computer activity immediately, as it is happening. As each person keys text into the computer, the other party can see the typing at the other end of the Internet line. With the computer as intermediary, a number of the new learning paradigms shown in Table 2-1 can be realized. Distance learning can move from passive viewing to active two-way participation among teachers and students, a true community of learners who may be widely separated by geographical distance.
computer-aided design (CAD)
computerized axial tomography (CAT or CT scanner)
magnetic resonance imaging (MRI scanning)
Strategic Defense Initiative (SDI)
1. All of the following characteristics make the computer a powerful force in many areas of application except
a. its flexibility
b. its speed
c. its accuracy
d. its compatibility among different models and manufacturers
2. Applications software that is called "integrated"
a. works on black-and-white screens only
b. has several programs that can interact with each other's data
c. is handled in character mode
d. can be run only on a mouse-driven computer
3. Which of the following computer strategies represent new methods of educational pedagogy?
a. distance learning
b. network classrooms
c. team projects in multimedia
d. all of the above
4. All of the following are reasons for the popularity of computer simulations in the natural sciences except one. Identify it.
a. Students gain experience in on-line networking in the laboratory.
b. Sometimes experiments are not safe for students to do in real life.
c. Equipment for some experiments is too costly for many laboratories.
d. Simulations cut down on delay in waiting for reactions.
5. Which of the following does not characterize modern scientific research in business and universities?
a. supercomputers
b. graphical workstations
c. dissection of laboratory animals with robots
d. scientific visualization
6. The era of computerized journalism is characterized by all of the following except
a. reporting with laptop computers equipped with modems
b. cold type publishing
c. voice-activated typesetting
d. on-line versions of newspapers and magazines
7. Which of the following is not a reason that governments at all levels use computers to carry out their business?
a. to function more efficiently
b. to discourage dissent
c. to manage governmental services more effectively
d. to spread their benefits more democratically
8. Choose from the following list a project that exemplifies computers used for national defense.
a. SDI, the Star Wars project
b. the Tomahawk cruise missile
c. the Patriot missile
d. all of the above
9. T F Word processing is simply typewriting using a computer keyboard for input and a printer for output.
10. T F The chief advantage of a spreadsheet over manual accounting techniques is its capacity to do arithmetic calculations automatically.
11. T F A major reason why the personal computer has become so popular is that various kinds of applications software which run on it can do so many useful jobs.
12. T F In the future experts expect computers to become a primary replacement for the human contact between teacher and student.
13. T F One of the main reasons why USA Today could become America's first national paper, delivered throughout the country daily, was satellite distribution and printing of the downloaded text at local sites.
14. T F Studies show that by the year 2000 as many Americans will get their news from the World Wide Web as from printed newspapers.
15. T F Most of the computers in the federal government are in departments of the legislative branch controlled by Congress.
16. T F Military leaders unanimously agree that future armies will have few ordinary foot soldiers, whose role will be taken over by a technological, robotic force.
17. Computer programs for common applications like word processing or spreadsheets, which users choose to increase their day-to-day productivity, are often called _____________________.
18. _________ packages are analogous to manual filing systems, such as using index cards for a mailing list.
19. Electronic mail and computer conferencing represent two instances of the burgeoning applications software field called ___________________.
20. _______ is a well-known medical database of current research findings and treatments, well indexed and suitable for on-line searches.
21. Computers are being used frequently in the field of ___________, the replacement of missing body parts with artificial limbs or organs.
22. The new field of _____________________ combines two major computer capacities, the ability to do massive amounts of mathematical calculations quickly and to represent the results as graphic images.
23. When a computer calculates the results of an experiment and presents the results as soon as the data have been read, the process is called a _________ application.
24. The ________________________________________ was created to collect and distribute criminal information, such as the names of wanted criminals and missing persons and criminal records of people convicted of serious crimes.
1. Go to the administrative office of your academic department or college to see what kinds of applications software are used there. For what sorts of jobs are specific kinds of software used? What particular brands are favored and why?
2. Investigate the role of computer-assisted instruction in your college or university. Are writing courses taught with computers? Do the social sciences use on-line databases? What kinds of simulations are found in the social and physical sciences?
3. Do writing teachers recommend spelling and grammar checkers at your college? Are calculators encouraged in mathematics classes? If not, what reasons are given for discouraging such computer aids?
4. American health care remains the most expensive in the world. For a recent year, research the extent to which economists credit expensive high-tech computer devices like CAT and MRI scanners with increasing overall health costs.
5. Look up the 21 Grand Challenge problems identified by the Federal High Performance Computing Program in 1989. Find out whether natural scientists at your institution do research related to those problems, and write a report on the role of computing in the effort.
6. Investigate the role of computers in journalism on your campus or in your community, including print, radio, and television.
7. Take a topic like Bosnia, the Super Bowl, or something else current, and see what sorts of information you can find about it on World Wide Web versions of newspapers, magazines, and television networks. Is the depth of coverage suitable for writing a report, or would you be better off using traditional sources in the library? How easy is it to use the indexing systems available on the Web?
8. Given the widespread use of NCIC crime files, visit a local police station and find out whether the files are queried, for what purposes, and how often. How reliable do officials think the NCIC information is?
9. The Tomahawk cruise missile and Stealth aircraft technologies used in the 1991 war with Iraq were actually developed in the 1970s. Investigate the computer component of some current weapon system under development.
Discussion Questions
1. With increasing use of CAI, how is the role of the classroom teacher changing? Are computers in education generally a positive trend?
2. What effects will health care reforms discussed in Washington have on the spread of high-tech medicine? Would Americans want cost controls to hold back the growth of medical technology?
3. The growth of "big science" involving supercomputers is thought by some to be too expensive for continued support by the federal government, at least at levels sustained in the Cold War. Discuss.
4. Because print and broadcast media are merging materials into common digital formats, the media industries of the future will probably coalesce into giant newspaper/TV/movie conglomerates. Discuss this statement.
5. Most of the computers in the federal government are used in the departments of the executive branch, such as Defense, Treasury, and Justice. As a result, the computer tips the balance of power toward the president and away from Congress and the judiciary, undermining the checks and balances embodied in the Constitution. Defend or refute this point.
6. Wars in the future will be automated with computerized weapons systems. The days of the foot soldier are rapidly ending. Discuss this statement.
Answers to Review Questions
Multiple-Choice: 1. d; 2. b; 3. d; 4. a; 5. c; 6. c; 7. b; 8. d
True/False: 9. F; 10. T; 11. T; 12. F; 13. T; 14. F; 15. F; 16. F
Fill-in: 17. productivity software; 18. database; 19. data communications; 20. MEDLINE; 21. prosthetics; 22. computational science; 23. real-time; 24. National Crime Information Center (NCIC)