Chapter 3

This situation will change, LYNCH said. He distinguished the CD-ROM medium itself from the publishing practices that have grown up around it. For LYNCH the problem with CD-ROM is not its portability or its slowness but the two-edged sword of having the retrieval application and the user interface inextricably bound up with the data, which is the typical CD-ROM publication model. It is not a case of publishing data but of distributing a typically stand-alone, typically closed system (software, user interface, and data) on a little disk. Hence all the between-disk navigational issues, as well as the impossibility in most cases of integrating data on one disk with that on another. Most CD-ROM retrieval software does not network very gracefully at present. However, in the present world of immature standards and limited understanding of what network information is or what the ground rules are for creating or using it, publishing information on a CD-ROM does add value in a very real sense.

LYNCH drew a contrast between CD-ROM and network pricing, and in doing so highlighted something bizarre in information pricing. A large institution such as the University of California will find vendors who offer to sell information on CD-ROM for a price per year in four digits, but who, for the same data (e.g., an abstracting and indexing database) on magnetic tape, regardless of how many people may use it concurrently, will quote a price in six digits.

What is packaged with the CD-ROM in one sense adds value—a complete access system, not just raw, unrefined information—although it is not generally perceived that way. This is because the access software, although it adds value, is viewed by some people, particularly in the university environment where there is a very heavy commitment to networking, as being developed in the wrong direction.

Given that context, LYNCH described the examples demonstrated as a set of insular information gems: Perseus, for example, offers nicely linked information but would be very difficult to integrate with other databases, that is, to link seamlessly with source files from elsewhere. It resembles an island, and in this respect is similar to numerous stand-alone projects that are based on videodiscs, that is, on the single-workstation concept.

As scholarship evolves in a network environment, the paramount need will be to link databases. We must link personal databases to group databases and to public databases in fairly seamless ways, which is extremely difficult in the environments under discussion, where copies of databases proliferate all over the place.

The notion of layering also struck LYNCH as lurking in several of the projects demonstrated. Several databases in a sense constitute information archives without a significant amount of navigation built in. Educators, critics, and others will want a layered structure—one that defines or links paths through the layers to allow users to reach specific points. In LYNCH's view, layering will become increasingly necessary, and not just within a single resource but across resources (e.g., tracing mythology and cultural themes across several classics databases as well as a database of Renaissance culture). This ability to organize resources, to build things out of multiple other things on the network or select pieces of it, represented for LYNCH one of the key aspects of network information.

Contending that information reuse constituted another significant issue, LYNCH commended to the audience's attention Project NEEDS (i.e., National Engineering Education Delivery System). This project's objective is to produce a database of engineering courseware as well as the components that can be used to develop new courseware. In a number of the existing applications, LYNCH said, the issue of reuse (how much one can take apart and reuse in other applications) was not being well considered. He also raised the issue of active versus passive use, one aspect of which is how much information will be manipulated locally by users. Most people, he argued, may do a little browsing and then will wish to print. LYNCH was uncertain how these resources would be used by the vast majority of users in the network environment.

LYNCH next said a few words about X-Windows as a way of differentiating between network access and networked information. A number of the applications demonstrated at the Workshop could be rewritten to use X across the network, so that one could run them from any X-capable device (a workstation or an X terminal) and transact with a database across the network. Although this opens up access a little, assuming one has enough network capacity to handle it, it does not provide an interface for developing a program that conveniently integrates information from multiple databases. X is a viewing technology that has limits. In a real sense, it is just a graphical version of remote log-in across the network. X-type applications represent only one step in the progression towards real access.

LYNCH next discussed barriers to the distribution of networked multimedia information. The heart of the problem is a lack of standards to provide the ability for computers to talk to each other, retrieve information, and shuffle it around fairly casually. At the moment, little progress is being made on standards for networked information; for example, present standards do not cover images, digital voice, and digital video. A useful tool kit of exchange formats for basic texts is only now being assembled. The synchronization of content streams (i.e., synchronizing a voice track to a video track, establishing temporal relations between different components in a multimedia object) constitutes another issue for networked multimedia that is just beginning to receive attention.

Underlying network protocols also need some work; good, real-time delivery protocols on the Internet do not yet exist. In LYNCH's view, highly important in this context is the notion of networked digital object IDs, the ability of one object on the network to point to another object (or component thereof) on the network. Serious bandwidth issues also exist. LYNCH was uncertain if billion-bit-per-second networks would prove sufficient if numerous people ran video in parallel.

LYNCH concluded by offering an issue for database creators to consider, as well as several comments about what might constitute good trial multimedia experiments. In a networked information world the database builder or service builder (publisher) does not exercise the same extensive control over the integrity of the presentation; strange programs may "munge" one's data before the user sees it. Serious thought must be given to what guarantees integrity of presentation. Part of that is related to where one draws the boundaries around a networked information service. This question of presentation integrity in client-server computing has not been stressed enough in the academic world, LYNCH argued, though commercial service providers deal with it regularly.

Concerning multimedia, LYNCH observed that good multimedia at the moment is hideously expensive to produce. He recommended producing multimedia with either very high sale value, or multimedia with a very long life span, or multimedia that will have a very broad usage base and whose costs therefore can be amortized among large numbers of users. In this connection, historical and humanistically oriented material may be a good place to start, because it tends to have a longer life span than much of the scientific material, as well as a wider user base. LYNCH noted, for example, that American Memory fits many of the criteria outlined. He remarked the extensive discussion about bringing the Internet or the National Research and Education Network (NREN) into the K-12 environment as a way of helping the American educational system.

LYNCH closed by noting that the kinds of applications demonstrated struck him as excellent justifications of broad-scale networking for K-12, but that at this time no "killer" application exists to mobilize the K-12 community to obtain connectivity.

******

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
DISCUSSION
* Dearth of genuinely interesting applications on the network a slow-changing situation
* The issue of the integrity of presentation in a networked environment
* Several reasons why CD-ROM software does not network
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

During the discussion period that followed LYNCH's presentation, several additional points were made.

LYNCH reiterated even more strongly his contention that, historically, once one goes outside high-end science and the group of those who need access to supercomputers, there is a great dearth of genuinely interesting applications on the network. He saw this situation changing slowly, with some of the scientific databases and scholarly discussion groups and electronic journals coming on as well as with the availability of Wide Area Information Servers (WAIS) and some of the databases that are being mounted there. However, many of those things do not seem to have piqued great popular interest. For instance, most high school students of LYNCH's acquaintance would not qualify as devotees of serious molecular biology.

Concerning the issue of the integrity of presentation, LYNCH believed that a couple of information providers have laid down the law at least on certain things. For example, his recollection was that the National Library of Medicine feels strongly that one needs to employ the identifier field if one is to mount a database commercially. The problem with a real networked environment is that one does not know who is reformatting and reprocessing one's data once one enters a client-server mode. If the data passes through a Z39.50 server, for example, it becomes anybody's guess what clients are doing with it. A data provider can say that his contract will permit clients to have access to his data only after he vets them and their presentation and makes certain it suits him. But LYNCH held out little expectation that the network marketplace would evolve in that way, because it would require too much prior negotiation.

CD-ROM software does not network for a variety of reasons, LYNCH said. He speculated that CD-ROM publishers are not eager to have their products really hook into wide area networks, because they fear it will make their data suppliers nervous. Moreover, until relatively recently, one had to be rather adroit to run a full TCP/IP stack plus applications on a PC-size machine, whereas nowadays it is becoming easier as PCs grow bigger and faster. LYNCH also speculated that software providers had not heard from their customers until the last year or so, or had not heard from enough of their customers.

******

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
BESSER
* Implications of disseminating images on the network; planning the distribution of multimedia documents poses two critical implementation problems
* Layered approach represents the way to deal with users' capabilities
* Problems in platform design; file size and its implications for networking
* Transmission of megabyte-size images impractical
* Compression and decompression at the user's end
* Promising trends for compression
* A disadvantage of using X-Windows
* A project at the Smithsonian that mounts images on several networks
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Howard BESSER, School of Library and Information Science, University of Pittsburgh, spoke primarily about multimedia, focusing on images and the broad implications of disseminating them on the network. He argued that planning the distribution of multimedia documents posed two critical implementation problems, which he framed in the form of two questions: 1) What platform will one use and what hardware and software will users have for viewing of the material? and 2) How can one deliver a sufficiently robust set of information in an accessible format in a reasonable amount of time? Depending on whether network or CD-ROM is the medium used, this question raises different issues of storage, compression, and transmission.

Concerning the design of platforms (e.g., sound, gray scale, simple color, etc.) and the various capabilities users may have, BESSER maintained that a layered approach was the way to deal with users' capabilities. A result would be that users with less powerful workstations would simply have less functionality. He urged members of the audience to advocate standards and accompanying software that handle layered functionality across a wide variety of platforms.

BESSER also addressed problems in platform design, namely, deciding how powerful a machine to design for when the largest number of users have the least capable machines and one still desires higher functionality. BESSER then proceeded to the question of file size and its implications for networking, discussing mainly still images. For example, a digital color image that fills the screen of a standard mega-pel workstation (Sun or NeXT) will require one megabyte of storage for an eight-bit image or three megabytes of storage for a true-color, twenty-four-bit image. Lossless compression algorithms (that is, computational procedures in which no data is lost in the process of compressing [and decompressing] an image; the exact bit representation is maintained) might bring storage down to a third of a megabyte per image, but not much further than that. These sizes make it difficult to fit an appropriately sized set of images on a single disk or to transmit them quickly enough over a network.

With full-screen mega-pel images of roughly a third of a megabyte each, one gets 1,000-3,000 full-screen images on a one-gigabyte disk; a standard CD-ROM holds approximately 60 percent of that. Storing images the size of a PC screen (just 8-bit color) increases capacity to 4,000-12,000 images per gigabyte, 60 percent of which again gives the capacity of a CD-ROM. This creates a major problem: one cannot store many full-screen, full-color images with lossless compression alone; one must compress them further or use a lower resolution. For megabyte-size images, anything slower than T-1 speed is impractical. For example, on a fifty-six-kilobaud line, it takes about three minutes to transfer an uncompressed one-megabyte file, and even this assumes ideal circumstances (no other user contending for network bandwidth). Thus, questions of disk access, remote display, and current telephone connection speed make transmission of megabyte-size images impractical.
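
The arithmetic behind these figures can be sketched in a few lines (a minimal illustration in Python; the image sizes, line speed, and disk capacities are simply the numbers quoted in the talk, and the ideal-case assumption of an uncontended line is BESSER's):

    # Back-of-the-envelope reproduction of the storage and transmission figures above.
    MEGAPEL_8BIT = 1_000_000          # bytes: full-screen, 8-bit image on a mega-pel workstation
    MEGAPEL_24BIT = 3 * MEGAPEL_8BIT  # bytes: 24-bit, true-color version
    LOSSLESS_RATIO = 3                # lossless compression brings the 8-bit image to about a third

    GIGABYTE = 1_000_000_000
    CDROM_FRACTION = 0.6              # a CD-ROM holds roughly 60 percent of a one-gigabyte disk

    compressed = MEGAPEL_8BIT / LOSSLESS_RATIO
    print(f"Compressed full-screen image: ~{compressed / 1e6:.2f} MB")
    print(f"Images per one-gigabyte disk: ~{GIGABYTE / compressed:.0f}")
    print(f"Images per CD-ROM:            ~{CDROM_FRACTION * GIGABYTE / compressed:.0f}")

    # Transfer time for an uncompressed one-megabyte file on a 56-kilobaud line,
    # assuming no other user is contending for bandwidth.
    line_bps = 56_000
    seconds = MEGAPEL_8BIT * 8 / line_bps
    print(f"One megabyte over 56 kbaud: ~{seconds / 60:.1f} minutes")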

BESSER then discussed ways to deal with these large images, for example, compression and decompression at the user's end. In this connection, how much one is willing to lose in the compression process and what image quality one needs in the first place remain open questions. What is known is that compression aggressive enough to solve the size problem entails some loss of data. BESSER urged that more studies be conducted on image quality in different situations, for example, what kinds of images are needed for which disciplines, and what image quality is needed for a browsing tool, an intermediate viewing tool, and archiving.

BESSER remarked two promising trends for compression: from a technical perspective, algorithms that use what is called subjective redundancy employ principles from visual psycho-physics to identify and remove information from the image that the human eye cannot perceive; from an interchange and interoperability perspective, the JPEG (i.e., Joint Photographic Experts Group, an ISO standard) compression algorithms also offer promise. These issues of compression and decompression, BESSER argued, resembled those raised earlier concerning the design of different platforms. Gauging the capabilities of potential users constitutes a primary goal. BESSER advocated layering or separating the images from the applications that retrieve and display them, to avoid tying them to particular software.
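
As a concrete illustration of the lossy trade-off BESSER describes, the following minimal sketch (assuming the Pillow imaging library and a hypothetical 8-bit scan file named scan.tif, neither of which figures in the projects discussed) saves the same image at several JPEG quality settings and reports the resulting file sizes:

    import os
    from PIL import Image

    # Work on an 8-bit grayscale rendering of the scan; JPEG cannot store bilevel images.
    img = Image.open("scan.tif").convert("L")

    for quality in (95, 75, 50):
        out = f"scan_q{quality}.jpg"
        img.save(out, "JPEG", quality=quality)   # lower quality discards more detail
        print(f"quality={quality}: {os.path.getsize(out)} bytes")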

BESSER detailed several lessons learned from his work at Berkeley with Imagequery, especially the advantages and disadvantages of using X-Windows. In the latter category, for example, retrieval is tied directly to one's data, an intolerable situation in the long run on a networked system. Finally, BESSER described a project of Jim Wallace at the Smithsonian Institution, who is mounting images in an extremely rudimentary way on the CompuServe and GEnie networks and is preparing to mount them on America Online. Although the average user takes over thirty minutes to download these images (assuming a fairly fast modem), images have been downloaded 25,000 times.

BESSER concluded his talk with several comments on the business arrangement between the Smithsonian and CompuServe. He contended that not enough is known concerning the value of images.

******

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
DISCUSSION
* Creating digitized photographic collections nearly impossible except with large organizations like museums
* Need for study to determine quality of images users will tolerate
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

During the brief exchange between LESK and BESSER that followed, several clarifications emerged.

LESK argued that the photographers were far ahead of BESSER: It is almost impossible to create such digitized photographic collections except with large organizations like museums, because all the photographic agencies have been going crazy about this and will not sign licensing agreements on any sort of reasonable terms. LESK had heard that National Geographic, for example, had tried to buy the right to use some image in some kind of educational production for $100 per image, but the photographers will not touch it. They want accounting and payment for each use, which cannot be accomplished within the system. BESSER responded that a consortium of photographers, headed by a former National Geographic photographer, had started assembling its own collection of electronic reproductions of images, with the money going back to the cooperative.

LESK contended that BESSER was unnecessarily pessimistic about multimedia images, because people are accustomed to low-quality images, particularly from video. BESSER urged the launching of a study to determine what users would tolerate, what they would feel comfortable with, and what absolutely is the highest quality they would ever need. Conceding that he had adopted a dire tone in order to arouse people about the issue, BESSER closed on a sanguine note by saying that he would not be in this business if he did not think that things could be accomplished.

******

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
LARSEN
* Issues of scalability and modularity
* Geometric growth of the Internet and the role played by layering
* Basic functions sustaining this growth
* A library's roles and functions in a network environment
* Effects of implementation of the Z39.50 protocol for information retrieval on the library system
* The trade-off between volumes of data and its potential usage
* A snapshot of current trends
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Ronald LARSEN, associate director for information technology, University of Maryland at College Park, first addressed the issues of scalability and modularity. He noted the difficulty of anticipating the effects of orders-of-magnitude growth, reflecting on the twenty years of experience with the Arpanet and Internet. Recalling the day's demonstrations of CD-ROM and optical disk material, he went on to ask if the field has yet learned how to scale new systems to enable delivery and dissemination across large-scale networks.

LARSEN focused on the geometric growth of the Internet from its inception circa 1969 to the present, and the adjustments required to respond to that rapid growth. To illustrate the issue of scalability, LARSEN considered computer networks as including three generic components: computers, network communication nodes, and communication media. Each component scales (e.g., computers range from PCs to supercomputers; network nodes scale from interface cards in a PC through sophisticated routers and gateways; and communication media range from 2,400-baud dial-up facilities through 45-Mbps backbone links, and eventually to multigigabit-per-second communication lines), and architecturally, the components are organized to scale hierarchically from local area networks to international-scale networks. Such growth is made possible by building layers of communication protocols, as BESSER pointed out. By layering both physically and logically, a sense of scalability is maintained from local area networks in offices, across campuses, through bridges, routers, campus backbones, fiber-optic links, etc., up into regional networks and ultimately into national and international networks.

LARSEN then illustrated the geometric growth over a two-year period— through September 1991—of the number of networks that comprise the Internet. This growth has been sustained largely by the availability of three basic functions: electronic mail, file transfer (ftp), and remote log-on (telnet). LARSEN also reviewed the growth in the kind of traffic that occurs on the network. Network traffic reflects the joint contributions of a larger population of users and increasing use per user. Today one sees serious applications involving moving images across the network—a rarity ten years ago. LARSEN recalled and concurred with BESSER's main point that the interesting problems occur at the application level.

LARSEN then illustrated a model of a library's roles and functions in a network environment. He noted, in particular, the placement of on-line catalogues onto the network and patrons obtaining access to the library increasingly through local networks, campus networks, and the Internet. LARSEN supported LYNCH's earlier suggestion that we need to address fundamental questions of networked information in order to build environments that scale in the information sense as well as in the physical sense.

LARSEN supported the role of the library system as the access point into the nation's electronic collections. Implementation of the Z39.50 protocol for information retrieval would make such access practical and feasible. For example, this would enable patrons in Maryland to search California libraries, or other libraries around the world that are conformant with Z39.50 in a manner that is familiar to University of Maryland patrons. This client-server model also supports moving beyond secondary content into primary content. (The notion of how one links from secondary content to primary content, LARSEN said, represents a fundamental problem that requires rigorous thought.) After noting numerous network experiments in accessing full-text materials, including projects supporting the ordering of materials across the network, LARSEN revisited the issue of transmitting high-density, high-resolution color images across the network and the large amounts of bandwidth they require. He went on to address the bandwidth and synchronization problems inherent in sending full-motion video across the network.

LARSEN illustrated the trade-off between the volume of data, in bytes or orders of magnitude, and the potential usage of that data. He discussed transmission rates (particularly, the time it takes to move various forms of information), and what one could do with a network supporting multigigabit-per-second transmission. At the moment, the network environment includes a composite of data-transmission requirements, volumes, and forms, ranging from steady to bursty (high-volume) and from very slow to very fast. This aggregate must be considered in the design, construction, and operation of multigigabit networks.
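
The flavor of that trade-off can be sketched with a small calculation (illustrative only; the payload sizes and link speeds below are assumptions chosen to span the range LARSEN describes, not figures from his talk):

    # Rough transfer times for representative payloads over links of different speeds.
    payloads_bytes = {
        "abstract (10 KB of text)": 10_000,
        "full-screen image (1 MB)": 1_000_000,
        "minute of compressed video (~10 MB)": 10_000_000,
    }
    links_bps = {
        "56 kbaud dial-up": 56_000,
        "T-1 (1.5 Mbps)": 1_500_000,
        "45 Mbps backbone": 45_000_000,
        "1 Gbps research link": 1_000_000_000,
    }

    for payload, size in payloads_bytes.items():
        times = ", ".join(f"{link}: {size * 8 / bps:,.2f} s" for link, bps in links_bps.items())
        print(f"{payload} -> {times}")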

LARSEN's objective is to use the networks and library systems now being constructed to increase access to resources wherever they exist, and thus, to evolve toward an on-line electronic virtual library.

LARSEN concluded by offering a snapshot of current trends: continuing geometric growth in network capacity and number of users; slower development of applications; and glacial development and adoption of standards. The challenge is to design and develop each new application system with network access and scalability in mind.

******

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
BROWNRIGG
* Access to the Internet cannot be taken for granted
* Packet radio and the development of MELVYL in 1980-81 in the Division of Library Automation at the University of California
* Design criteria for packet radio
* A demonstration project in San Diego and future plans
* Spread spectrum
* Frequencies at which the radios will run and plans to reimplement the WAIS server software in the public domain
* Need for an infrastructure of radios that do not move around
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Edwin BROWNRIGG, executive director, Memex Research Institute, first polled the audience in order to seek out regular users of the Internet as well as those planning to use it some time in the future. With nearly everybody in the room falling into one category or the other, BROWNRIGG made a point re access, namely that numerous individuals, especially those who use the Internet every day, take for granted their access to it, the speeds with which they are connected, and how well it all works. However, as BROWNRIGG discovered between 1987 and 1989 in Australia, if one wants access to the Internet but cannot afford it or has some physical boundary that prevents her or him from gaining access, it can be extremely frustrating. He suggested that because of economics and physical barriers we were beginning to create a world of haves and have-nots in the process of scholarly communication, even in the United States.

BROWNRIGG detailed the development of MELVYL in academic year 1980-81 in the Division of Library Automation at the University of California, in order to underscore the issue of access to the system, which at the outset was extremely limited. In short, the project needed to build a network, which at that time entailed use of satellite technology, that is, putting earth stations on campus and also acquiring some terrestrial links from the State of California's microwave system. The installation of satellite links, however, did not solve the problem (which actually formed part of a larger problem involving politics and financial resources). For while the project team could get a signal onto a campus, it had no means of distributing the signal throughout the campus. The solution involved adopting a recent development in wireless communication called packet radio, which combined the basic notion of packet-switching with radio. The project used this technology to get the signal from a point on campus where it came down, an earth station for example, into the libraries, because it found that wiring the libraries, especially the older marble buildings, would cost $2,000-$5,000 per terminal.

BROWNRIGG noted that, ten years ago, the project had neither the public policy nor the technology that would have allowed it to use packet radio in any meaningful way. Since then much had changed. He proceeded to detail research and development of the technology, how it is being deployed in California, and what direction he thought it would take. The design criteria are to produce a high-speed, high-quality, secure, license-free device (packet radio), at a low one-time cost, that one can plug in and play today, forget about, and use for access to the Internet. By high speed, BROWNRIGG meant 1 and 1.5 megabits per second. Those units have been built, he continued, and are in the process of being type-certified by an independent underwriting laboratory so that they can be type-licensed by the Federal Communications Commission. As is the case with citizens band, one will be able to purchase a unit and not have to worry about applying for a license.

The basic idea, BROWNRIGG elaborated, is to take high-speed radio data transmission and create a backbone network that at certain strategic points will "gateway" into a medium-speed packet radio (i.e., one that runs at 38.4 kilobits per second), so that perhaps by 1994-1995 people like those in the audience could, for the price of a VCR, purchase a medium-speed radio for the office or home, have full network connectivity to the Internet, and partake of all its services, with no need for an FCC license and no regular bill from the local common carrier. BROWNRIGG presented several details of a demonstration project currently taking place in San Diego and described plans, pending funding, to install a full-bore network in the San Francisco area. This network will have 600 nodes running at backbone speeds, and 100 of these nodes will be libraries, which in turn will serve as gateway ports to the 38.4-kbps radios that will give coverage for the neighborhoods surrounding the libraries.

BROWNRIGG next explained Part 15.247, a new rule within Title 47 of the Code of Federal Regulations enacted by the FCC in 1985. This rule challenged the industry, which has only now risen to the occasion, to build a radio that would run at no more than one watt of output power and use a fairly exotic method of modulating the radio wave called spread spectrum. Spread spectrum in fact permits the building of networks so that numerous data communications can occur simultaneously, without interfering with each other, within the same wide radio channel.

BROWNRIGG explained that the frequencies at which the radios would run are very short wave signals. They are well above standard microwave and radar. With a radio wave that small, one watt becomes a tremendous punch per bit and thus makes transmission at reasonable speed possible. In order to minimize the potential for congestion, the project is undertaking to reimplement software which has been available in the networking business and is taken for granted now, for example, TCP/IP, routing algorithms, bridges, and gateways. In addition, the project plans to take the WAIS server software in the public domain and reimplement it so that one can have a WAIS server on a Mac instead of a Unix machine. The Memex Research Institute believes that libraries, in particular, will want to use the WAIS servers with packet radio. This project, which has a team of about twelve people, will run through 1993 and will include the 100 libraries already mentioned as well as other professionals such as those in the medical profession, engineering, and law. Thus, the need is to create an infrastructure of radios that do not move around, which, BROWNRIGG hopes, will solve a problem not only for libraries but for individuals who, by and large today, do not have access to the Internet from their homes and offices.

******

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
DISCUSSION
* Project operating frequencies
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

During a brief discussion period, which also concluded the day's proceedings, BROWNRIGG stated that the project was operating on four frequencies. The slow-speed radios operate at 435 megahertz and will later move up to 920 megahertz. As for the high-speed radios, the 1-megabit-per-second units will run at 2.4 gigahertz and the 1.5-megabit-per-second units at 5.7 gigahertz. At 5.7 gigahertz, rain can be a factor, but it would have to be tropical rain, unlike what falls in most parts of the United States.

******

William HOOTON, vice president of operations, I-NET, moderated this session.

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
KENNEY
* Factors influencing development of CXP
* Advantages of using digital technology versus photocopy and microfilm
* A primary goal of CXP; publishing challenges
* Characteristics of copies printed
* Quality of samples achieved in image capture
* Several factors to be considered in choosing scanning
* Emphasis of CXP on timely and cost-effective production of black-and-white printed facsimiles
* Results of producing microfilm from digital files
* Advantages of creating microfilm
* Details concerning production
* Costs
* Role of digital technology in library preservation
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Anne KENNEY, associate director, Department of Preservation and Conservation, Cornell University, opened her talk by observing that the Cornell Xerox Project (CXP) has been guided by the assumption that the ability to produce printed facsimiles or to replace paper with paper would be important, at least for the present generation of users and equipment. She described three factors that influenced development of the project: 1) Because the project has emphasized the preservation of deteriorating brittle books, the quality of what was produced had to be sufficiently high to return a paper replacement to the shelf. 2) The system had to be cost-effective, which meant that it had to be cost-competitive with the processes currently available, principally photocopy and microfilm. 3) CXP was only interested in using new or currently available product hardware and software.

KENNEY described the advantages that using digital technology offers over both photocopy and microfilm: 1) The potential exists to create a higher quality reproduction of a deteriorating original than conventional light-lens technology allows. 2) Because a digital image is an encoded representation, it can be reproduced again and again with no resulting loss of quality, as opposed to the situation with light-lens processes, in which there is a discernible difference between a second and a subsequent generation of an image. 3) A digital image can be manipulated in a number of ways to improve image capture; for example, Xerox has developed a windowing application that enables one to capture a page containing both text and illustrations in a manner that optimizes the reproduction of both. (With light-lens technology, one must choose which to optimize, text or the illustration; in preservation microfilming, the current practice is to shoot an illustrated page twice, once to highlight the text and the second time to provide the best capture for the illustration.) 4) A digital image can also be edited: density levels can be adjusted to remove underlining and stains and to increase the legibility of faint documents. 5) On-screen inspection can take place at the time of initial setup and adjustments made prior to scanning, factors that substantially reduce the number of retakes required in quality control.
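
A minimal sketch of the kind of density adjustment described in point 4, assuming the Pillow imaging library; the file name and threshold value are illustrative, and this is not the Xerox software CXP actually used:

    from PIL import Image, ImageOps

    page = Image.open("brittle_page.tif").convert("L")   # 8-bit grayscale scan
    page = ImageOps.autocontrast(page)                   # stretch contrast for a faint original

    THRESHOLD = 180                                       # tuned per document
    # Pixels lighter than the threshold drop out, removing light stains from the binary copy.
    binary = page.point(lambda v: 255 if v > THRESHOLD else 0).convert("1")
    binary.save("brittle_page_clean.tif")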

A primary goal of CXP has been to evaluate the paper output printed on the Xerox DocuTech, a high-speed printer that produces 600-dpi pages from scanned images at a rate of 135 pages a minute. KENNEY recounted several publishing challenges in producing faithful and legible reproductions of the originals, which the 600-dpi copy for the most part successfully met. For example, many of the deteriorating volumes in the project were heavily illustrated with fine line drawings or halftones or came in languages such as Japanese, in which the buildup of characters composed of varying strokes is difficult to reproduce at lower resolutions; a surprising number of them came with annotations and mathematical formulas, which it was critical to be able to duplicate exactly.

KENNEY noted that 1) the copies are being printed on paper that meets the ANSI standards for permanence, 2) the DocuTech printer meets the machine and toner requirements for proper adhesion of print to page, as described by the National Archives, and thus 3) the paper product is considered to be the archival equivalent of preservation photocopy.

KENNEY then discussed several samples of the quality achieved in the project that had been distributed in a handout, for example, a copy of a print-on-demand version of the 1911 Reed lecture on the steam turbine, which contains halftones, line drawings, and illustrations embedded in text; the first four loose pages in the volume compared the capture capabilities of scanning to photocopy for a standard test target, the IEEE Std 167A-1987 test chart. In all instances scanning proved superior to photocopy, though in one instance only slightly so.

Conceding the simplistic nature of her comparison of the quality of scanning with that of photocopy, KENNEY described it as one representation of the kinds of settings that could be used with the scanning capabilities of the equipment CXP uses. KENNEY also pointed out that CXP investigated the quality achieved with binary scanning only, and noted the great promise in gray-scale and color scanning, whose advantages and disadvantages need to be examined. She argued further that scanning resolutions and file formats represent a complex trade-off among the time it takes to capture material, file size, fidelity to the original, on-screen display, printing, and equipment availability. All these factors must be taken into consideration.

CXP placed primary emphasis on the production in a timely and cost-effective manner of printed facsimiles that consisted largely of black-and-white text. With binary scanning, large files may be compressed efficiently and in a lossless manner (i.e., no data is lost in the process of compressing [and decompressing] an image—the exact bit-representation is maintained) using Group 4 CCITT (i.e., the French acronym for International Consultative Committee for Telegraph and Telephone) compression. CXP was getting compression ratios of about forty to one. Gray-scale compression, which primarily uses JPEG, is much less economical and can represent a lossy compression (i.e., not lossless), so that as one compresses and decompresses, the illustration is subtly changed. While binary files produce a high-quality printed version, it appears 1) that other combinations of spatial resolution with gray and/or color hold great promise as well, and 2) that gray scale can represent a tremendous advantage for on-screen viewing. The quality associated with binary and gray scale also depends on the equipment used. For instance, binary scanning produces a much better copy on a binary printer.
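
As an illustration of the binary, lossless path CXP emphasizes, the following minimal sketch (assuming a Pillow build with libtiff support and an illustrative bilevel page file; the forty-to-one figure is CXP's, and actual ratios vary with page content) writes the same scan uncompressed and with Group 4 compression and compares the sizes:

    import os
    from PIL import Image

    page = Image.open("page_binary.tif").convert("1")     # 1-bit (bilevel) page image

    page.save("page_uncompressed.tif", compression=None)  # raw 1-bit TIFF
    page.save("page_g4.tif", compression="group4")        # lossless CCITT Group 4

    raw = os.path.getsize("page_uncompressed.tif")
    g4 = os.path.getsize("page_g4.tif")
    print(f"uncompressed: {raw} bytes, Group 4: {g4} bytes, ratio ~{raw / g4:.0f}:1")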

Among CXP's findings concerning the production of microfilm from digital files, KENNEY reported that the digital files for the same Reed lecture were used to produce sample film using an electron beam recorder. The resulting film was faithful to the image capture of the digital files, and while CXP felt that the text and image pages represented in the Reed lecture were superior to those of the light-lens film, the resolution readings for the 600-dpi output were not as high as standard microfilming. KENNEY argued that the standards defined for light-lens technology are not totally transferable to a digital environment. Moreover, they are based on a definition of quality for a preservation copy. Although making this case will prove to be a long, uphill struggle, CXP plans to continue to investigate the issue over the course of the next year.

KENNEY concluded this portion of her talk with a discussion of the advantages of creating film: it can serve as a primary backup and as a preservation master to the digital file; the digital file could then become the print or production master, and service copies could take the form of paper, film, optical disk, magnetic media, or on-screen display.

Finally, KENNEY presented details re production:

* Development and testing of a moderately high-resolution production scanning workstation represented a third goal of CXP; to date, 1,000 volumes have been scanned, or about 300,000 images.

* The resulting digital files are stored and used to produce hard-copy replacements for the originals and additional prints on demand; although the initial costs are high, scanning technology offers an affordable means for reformatting brittle material.

* A technician in production mode can scan 300 pages per hour when performing single-sheet scanning, which is a necessity when working with truly brittle paper; this figure is expected to increase significantly with subsequent iterations of the software from Xerox; a three-month time-and-cost study of scanning found that the average 300-page book would take about an hour and forty minutes to scan (this figure included the time for setup, which involves keying in primary bibliographic data, going into quality control mode to define page size, establishing front-to-back registration, and scanning sample pages to identify a default range of settings for the entire book—functions not dissimilar to those performed by filmers or those preparing a book for photocopy).

* The final step in the scanning process involved rescans, which happily were few and far between, representing well under 1 percent of the total pages scanned.

In addition to technician time, CXP costed out equipment (amortized over four years), the cost of storing and refreshing the digital files every four years, and the cost of printing and binding a paper reproduction with book-cloth binding. The total amounted to a little under $65 per single 300-page volume, with 30 percent overhead included, a figure competitive with the prices currently charged by photocopy vendors.
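
The cost arithmetic can be sketched as follows (a minimal illustration; only the roughly $65 total, the 30 percent overhead, and the hour-and-forty-minute scan time come from the talk, while every other dollar figure is a placeholder to be replaced with real numbers):

    PAGES_PER_VOLUME = 300
    SCAN_HOURS_PER_VOLUME = 1 + 40 / 60     # about an hour and forty minutes, setup included

    technician_rate = 15.00                 # dollars per hour, placeholder
    equipment_per_volume = 10.00            # equipment amortized over four years, placeholder
    storage_refresh_per_volume = 5.00       # storing and refreshing the digital files, placeholder
    print_and_bind_per_volume = 12.00       # printing plus book-cloth binding, placeholder

    direct = (technician_rate * SCAN_HOURS_PER_VOLUME
              + equipment_per_volume
              + storage_refresh_per_volume
              + print_and_bind_per_volume)
    total = direct * 1.30                   # add 30 percent overhead
    print(f"Cost per {PAGES_PER_VOLUME}-page volume: ~${total:.2f}")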

Of course, with scanning, in addition to the paper facsimile, one is left with a digital file from which subsequent copies of the book can be produced for a fraction of the cost of photocopy, with readers afforded choices in the form of these copies.

KENNEY concluded that digital technology offers an electronic means for a library preservation effort to pay for itself. If a brittle-book program included the means of disseminating reprints of books that are in demand by libraries and researchers alike, the initial investment in capture could be recovered and used to preserve additional but less popular books. She disclosed that an economic model for a self-sustaining program could be developed for CXP's report to the Commission on Preservation and Access (CPA).

KENNEY stressed that the focus of CXP has been on obtaining high quality in a production environment. The use of digital technology is viewed as an affordable alternative to other reformatting options.

******

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
ANDRE
* Overview and history of NATDP
* Various agricultural CD-ROM products created inhouse and by service bureaus
* Pilot project on Internet transmission
* Additional products in progress
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Pamela ANDRE, associate director for automation, National Agricultural Text Digitizing Program (NATDP), National Agricultural Library (NAL), presented an overview of NATDP, which has been underway at NAL the last four years, before Judith ZIDAR discussed the technical details. ANDRE defined agricultural information as a broad range of material going from basic and applied research in the hard sciences to the one-page pamphlets that are distributed by the cooperative state extension services on such things as how to grow blueberries.

NATDP began in late 1986 with a meeting of representatives from the land-grant library community to deal with the issue of electronic information. NAL and forty-five of these libraries banded together to establish this project—to evaluate the technology for converting what were then source documents in paper form into electronic form, to provide access to that digital information, and then to distribute it. Distributing that material to the community—the university community as well as the extension service community, potentially down to the county level—constituted the group's chief concern.

Since January 1988 (when the microcomputer-based scanning system was installed at NAL), NATDP has done a variety of things, concerning which ZIDAR would provide further details. For example, the first technology considered in the project's discussion phase was digital videodisc, which indicates how long ago it was conceived.

Over the four years of this project, four separate CD-ROM products on four different agricultural topics were created, two at a scanning-and-OCR station installed at NAL and two by service bureaus. Thus, NATDP has gained comparative information on the relative costs. Each of these products contained the full ASCII text as well as page images of the material, or between 4,000 and 6,000 pages of material per disk. Topics included aquaculture; food, agriculture and science (i.e., international agriculture and research); acid rain; and Agent Orange, which was the final product distributed (approximately eighteen months before the Workshop).

The third phase of NATDP focused on delivery mechanisms other than CD-ROM. At the suggestion of Clifford LYNCH, who was a technical consultant to the project at this point, NATDP became involved with the Internet and initiated a project with the help of North Carolina State University, in which fourteen of the land-grant university libraries are transmitting digital images over the Internet in response to interlibrary loan requests—a topic for another meeting. At this point, the pilot project had been completed for about a year and the final report would be available shortly after the Workshop. In the meantime, the project's success had led to its extension. (ANDRE noted that one of the first things done under the program title was to select a retrieval package to use with subsequent products; Windows Personal Librarian was the package of choice after a lengthy evaluation.)

Three additional products had been planned and were in progress:

1) An arrangement with the American Society of Agronomy, a professional society that has published the Agronomy Journal since about 1908, to scan and create bit-mapped images of its journal. ASA granted permission first to put this material into electronic form and then to distribute it, to hold it at NAL, and to use the electronic images as a mechanism to deliver documents or print out material for patrons, among other uses. Effectively, NAL has the right to use this material in support of its program. (Significantly, this arrangement offers a potential cooperative model for working with other professional societies in agriculture to do the same thing: put the journals of particular interest to agricultural research into electronic form.)

2) An extension of the earlier product on aquaculture.

3) The George Washington Carver Papers—a joint project with Tuskegee University to scan and convert from microfilm some 3,500 images of Carver's papers, letters, and drawings.

It was anticipated that all of these products would appear no more than six months after the Workshop.

******

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
ZIDAR
* (A separate arena for scanning)
* Steps in creating a database
* Image capture, with and without performing OCR
* Keying in tracking data
* Scanning, with electronic and manual tracking
* Adjustments during scanning process
* Scanning resolutions
* Compression
* De-skewing and filtering
* Image capture from microform: the papers and letters of George Washington Carver
* Equipment used for a scanning system
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Judith ZIDAR, coordinator, National Agricultural Text Digitizing Program (NATDP), National Agricultural Library (NAL), illustrated the technical details of NATDP, including her primary responsibility, scanning and creating databases on a topic and putting them on CD-ROM.

(ZIDAR noted a separate arena, apart from the CD-ROM projects though the processing of the material is nearly identical, in which NATDP is also scanning material and loading it on a NeXT microcomputer that in turn is linked to NAL's integrated library system. Thus, searches in NAL's bibliographic database will enable people to pull up actual page images and text for any documents that have been entered.)

In accordance with the session's topic, ZIDAR focused her illustrated talk on image capture, offering a primer on the three main steps in the process: 1) assemble the printed publications; 2) design the database (database design occurs in the process of preparing the material for scanning; this step entails reviewing and organizing the material, defining the contents—what will constitute a record, what kinds of fields will be captured in terms of author, title, etc.); 3) perform a certain amount of markup on the paper publications. NAL performs this task record by record, preparing work sheets or some other sort of tracking material and designing descriptors and other enhancements to be added to the data that will not be captured from the printed publication. Part of this process also involves determining NATDP's file and directory structure: NATDP attempts to avoid putting more than approximately 100 images in a directory, because placing more than that on a CD-ROM would reduce the access speed.
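
The directory rule at the end of that step can be illustrated with a short sketch (the paths and naming scheme are hypothetical, not NATDP's actual layout):

    import shutil
    from pathlib import Path

    MAX_PER_DIR = 100                       # keep each CD-ROM directory under ~100 images
    source = Path("scanned_pages")          # flat directory of captured page images
    target = Path("cdrom_master")

    for index, image in enumerate(sorted(source.glob("*.tif"))):
        bucket = target / f"dir{index // MAX_PER_DIR:03d}"
        bucket.mkdir(parents=True, exist_ok=True)
        shutil.copy2(image, bucket / image.name)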

This up-front process takes approximately two weeks for a 6,000-7,000-page database. The next step is to capture the page images. How long this process takes is determined by the decision whether or not to perform OCR. Not performing OCR speeds the process, whereas text capture requires greater care because of the quality of the image: it has to be straighter and allowance must be made for text on a page, not just for the capture of photographs.

NATDP keys in tracking data, that is, a standard bibliographic record including the title of the book and the title of the chapter, which will later either become the access information or will be attached to the front of a full-text record so that it is searchable.

Images are scanned from a bound or unbound publication; in NATDP's case, chiefly from bound publications, because often they are the only copies and the publications are returned to the shelves. NATDP usually scans one record at a time, because its database tracking system tracks the document in that way and does not require further logical separation of the images. After performing optical character recognition, NATDP moves the images off the hard disk and maintains a volume sheet. Though the system tracks electronically, all the processing steps are also tracked manually with a log sheet.

ZIDAR next illustrated the kinds of adjustments that one can make when scanning from paper and microfilm, for example, redoing images that need special handling, setting for dithering or gray scale, and adjusting for brightness or for the whole book at one time.

NATDP is scanning at 300 dots per inch, a standard scanning resolution. Though adequate for capturing text that is all of a standard size, 300 dpi is unsuitable for any kind of photographic material or for very small text. Many scanners allow for different image formats, TIFF, of course, being a de facto standard. But if one intends to exchange images with other people, the ability to scan other image formats, even if they are less common, becomes highly desirable.

CCITT Group 4 is the standard compression for normal black-and-white images, JPEG for gray scale or color. ZIDAR recommended 1) using the standard compressions, particularly if one attempts to make material available and to allow users to download images and reuse them from CD-ROMs; and 2) maintaining the ability to output an uncompressed image, because in image exchange uncompressed images are more likely to be able to cross platforms.

ZIDAR emphasized the importance of de-skewing and filtering as requirements on NATDP's upgraded system. For instance, scanning bound books, particularly books published by the federal government whose pages are skewed, and trying to scan them straight if OCR is to be performed, is extremely time-consuming. The same holds for filtering of poor-quality or older materials.
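
A minimal sketch of these two clean-up steps, assuming the Pillow imaging library; the skew angle here is supplied by hand, whereas a production system would detect it automatically, and the file names are illustrative:

    from PIL import Image, ImageFilter

    page = Image.open("skewed_page.tif").convert("L")

    SKEW_DEGREES = 1.8                                    # measured or estimated per page
    # Rotate the page back to square so that OCR has straight lines of text to work with.
    straightened = page.rotate(SKEW_DEGREES, expand=True, fillcolor=255)

    # A small median filter removes speckle from poor-quality or older originals.
    cleaned = straightened.filter(ImageFilter.MedianFilter(size=3))
    cleaned.save("deskewed_page.tif")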

ZIDAR described image capture from microform, using as an example three reels from a sixty-seven-reel set of the papers and letters of George Washington Carver that had been produced by Tuskegee University. These resulted in approximately 3,500 images, which NATDP had had scanned by its service contractor, Science Applications International Corporation (SAIC). NATDP also created bibliographic records for access. (NATDP did not have such specialized equipment as a microfilm scanner.)

Unfortunately, the process of scanning from microfilm was not an unqualified success, ZIDAR reported: because microfilm frame sizes vary, occasionally some frames were missed, which without spending much time and money could not be recaptured.

OCR could not be performed on the scanned images of the frames. Because the text bled, OCR runs simply output text that could not even be edited. NATDP tested for negative versus positive images, landscape versus portrait orientation, and single- versus dual-page microfilm, none of which seemed to affect the quality of the image; but on none of them could OCR be performed.

In selecting the microfilm they would use, therefore, NATDP had other factors in mind. ZIDAR noted two factors that influenced the quality of the images: 1) the inherent quality of the original and 2) the amount of size reduction on the pages.

The Carver papers were selected because they are informative and visually interesting, treat a single subject, and are valuable in their own right. The images were scanned and divided into logical records by SAIC, then delivered, and loaded onto NATDP's system, where bibliographic information taken directly from the images was added. Scanning was completed in summer 1991 and by the end of summer 1992 the disk was scheduled to be published.

Problems encountered during processing included the following: Because the microfilm scanning had to be done in a batch, adjustment for individual page variations was not possible. The frame size varied on account of the nature of the material, and therefore some of the frames were missed while others were just partial frames. The only way to go back and capture this material was to print out the page with the microfilm reader from the missing frame and then scan it in from the page, which was extremely time-consuming. The quality of the images scanned from the printout of the microfilm compared unfavorably with that of the original images captured directly from the microfilm. The inability to perform OCR also was a major disappointment. At the time, computer output microfilm was unavailable to test.

The equipment used for a scanning system was the last topic addressed by ZIDAR. The type of equipment that one would purchase for a scanning system included: a microcomputer, at least a 386 but preferably a 486; a large hard disk, 380 megabytes at minimum; a multi-tasking operating system that allows one to run some things in batch in the background while scanning or doing text editing, for example, Unix or OS/2 and, theoretically, Windows; a high-speed scanner and scanning software that allows one to make the various adjustments mentioned earlier; a high-resolution monitor (150 dpi); OCR software and hardware to perform text recognition; an optical disk subsystem on which to archive all the images as the processing is done; and file management and tracking software.

ZIDAR opined that the software one purchases was more important than the hardware and might also cost more than the hardware, but it was likely to prove critical to the success or failure of one's system. In addition to a stand-alone scanning workstation for image capture, then, text capture requires one or two editing stations networked to this scanning station to perform editing. Editing the text takes two or three times as long as capturing the images.

Finally, ZIDAR stressed the importance of buying an open system that allows for more than one vendor, complies with standards, and can be upgraded.

******

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
WATERS
* Yale University Library's master plan to convert microfilm to digital imagery (POB)
* The place of electronic tools in the library of the future
* The uses of images and an image library
* Primary input from preservation microfilm
* Features distinguishing POB from CXP and key hypotheses guiding POB
* Use of vendor selection process to facilitate organizational work
* Criteria for selecting vendor
* Finalists and results of process for Yale
* Key factor distinguishing vendors
* Components, design principles, and some estimated costs of POB
* Role of preservation materials in developing imaging market
* Factors affecting quality and cost
* Factors affecting the usability of complex documents in image form
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Donald WATERS, head of the Systems Office, Yale University Library, reported on the progress of a master plan for a project at Yale to convert microfilm to digital imagery, Project Open Book (POB). Stating that POB was in an advanced stage of planning, WATERS detailed, in particular, the process of selecting a vendor partner and several key issues under discussion as Yale prepares to move into the project itself. He commented first on the vision that serves as the context of POB and then described its purpose and scope.

WATERS sees the library of the future not necessarily as an electronic library but as a place that generates, preserves, and improves for its clients ready access to both intellectual and physical recorded knowledge. Electronic tools must find a place in the library in the context of this vision. Several roles for electronic tools include serving as: indirect sources of electronic knowledge or "finding" aids (the on-line catalogues, the article-level indices, registers for documents and archives); direct sources of recorded knowledge, such as full-text images; and various kinds of compound sources of recorded knowledge (the so-called compound documents of Hypertext, mixed text and image, and multimedia).

POB is looking particularly at images and an image library: the uses to which images will be put (e.g., storage, printing, browsing, and use as input for other processes), OCR as a process subsequent to image capture and to the creation of an image library, and also possibly generating microfilm.

While input will come from a variety of sources, POB is considering especially input from preservation microfilm. A possible outcome is that the film and paper which provide the input for the image library eventually may go off into remote storage, and that the image library may be the primary access tool.

The purpose and scope of POB focus on imaging. Though related to CXP, POB has two features which distinguish it: 1) scale—conversion of 10,000 volumes into digital image form; and 2) source—conversion from microfilm. Given these features, several key working hypotheses guide POB, including: 1) Since POB is using microfilm, it is not concerned with the image library as a preservation medium. 2) Digital imagery can improve access to recorded knowledge, through printing and network distribution, at a modest incremental cost over that of microfilm. 3) Capturing and storing documents in digital image form is necessary to further improvements in access. (POB distinguishes between the imaging, or digitizing, process and OCR, which at this stage it does not plan to perform.)

Currently in its first or organizational phase, POB found that it could use a vendor selection process to facilitate a good deal of the organizational work (e.g., creating a project team and advisory board, confirming the validity of the plan, establishing the cost of the project and a budget, selecting the materials to convert, and then raising the necessary funds).

POB developed numerous selection criteria, including: a firm commitment to image-document management, the ability to serve as systems integrator in a large-scale project over several years, interest in developing the requisite software as a standard rather than a custom product, and a willingness to invest substantial resources in the project itself.

Two vendors, DEC and Xerox, were selected as finalists in October 1991, and with the support of the Commission on Preservation and Access, each was commissioned to generate a detailed requirements analysis for the project and then to submit a formal proposal for its completion, including a budget and costs. The terms were that POB would pay the losing vendor. The results for Yale of involving a vendor included: broad involvement of Yale staff at a relatively low cost, which may have long-term significance in carrying out the project (twenty-five to thirty university people are engaged in POB); a better understanding of the factors that affect corporate response to markets for imaging products; a competitive proposal; and a more sophisticated view of the imaging markets.

The most important factor that distinguished the vendors under consideration was their identification with the customer. The size and internal complexity of the companies were also important factors; POB was looking at large companies with substantial resources. In the end, the process generated two competitive proposals for Yale, with Xerox's the clear winner. WATERS then described the components of the proposal, the design principles, and some of the costs estimated for the process.

The components are essentially four: a conversion subsystem; a network-accessible storage subsystem for 10,000 books (for which POB expects storage at 200 to 600 dpi); browsing stations distributed on the campus network; and network access to the image printers.

Among the design principles, POB wanted conversion at the highest possible resolution. Assuming TIFF files with Group 4 compression, TCP/IP, and an Ethernet network on campus, POB wanted a client-server approach with image documents distributed to the workstations and made accessible through native workstation interfaces such as Windows. POB also insisted on a phased approach to implementation: 1) a stand-alone, single-user, low-cost entry into the business, with a workstation focused on conversion and allowing POB to explore user access; 2) movement to higher-volume conversion with network-accessible storage and multiple access stations; and 3) high-volume conversion, full-capacity storage, and multiple browsing stations distributed throughout the campus.
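
By way of illustration only (the proposal specifies formats, not code), a minimal sketch in Python using the Pillow imaging library shows how a bilevel page image might be written as a TIFF file with Group 4 compression of the kind POB assumed; the file names and the 600-dpi figure are assumptions made for the example.

    # Hypothetical sketch: storing a scanned page as a Group 4-compressed
    # TIFF file. Requires the Pillow library (pip install Pillow); the
    # file names below are illustrative only.
    from PIL import Image

    # Open a scanned page and force it to 1-bit black-and-white, the
    # form that CCITT Group 4 compression expects.
    page = Image.open("scanned_page.png").convert("1")

    # Save as TIFF with Group 4 compression, recording a 600-dpi scan
    # resolution in the file header.
    page.save("scanned_page.tif", format="TIFF",
              compression="group4", dpi=(600, 600))

    # Read the file back; Pillow decompresses transparently.
    stored = Image.open("scanned_page.tif")
    print(stored.size, stored.mode)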

The costs proposed for start-up assumed the existence of the Yale network and its two DocuTech image printers. Other start-up costs are estimated at $1 million over the three phases. At the end of the project, the annual operating costs estimated primarily for the software and hardware proposed come to about $60,000, but these exclude costs for labor needed in the conversion process, network and printer usage, and facilities management.

Finally, the selection process produced for Yale a more sophisticated view of the imaging markets: the management of complex documents in image form is not a preservation problem, not a library problem, but a general problem in a broad, general industry. Preservation materials are useful for developing that market because of the qualities of the material. For example, much of it is out of copyright. The resolution of key issues such as the quality of scanning and image browsing also will affect development of that market.

The technology is readily available but changing rapidly. In this context of rapid change, several factors affect quality and cost, to which POB intends to pay particular attention, for example, the various levels of resolution that can be achieved. POB believes it can bring resolution up to 600 dpi, but an interpolation process from 400 to 600 dpi is more likely. The variation in microfilm quality will prove to be a highly important factor. POB may reexamine the standards used to film in the first place by looking at this process as a follow-on to microfilming.
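
As an illustration of what such an interpolation step might look like (the resampling filter and file names here are assumptions, not part of POB's plan), a short Python sketch using the Pillow library upsamples a 400-dpi scan to an effective 600 dpi:

    # Hypothetical sketch: upsampling a 400-dpi page image to an
    # effective 600 dpi by interpolation. Resampling is done in
    # grayscale and the result is thresholded back to a bilevel image;
    # the bicubic filter is an illustrative choice.
    from PIL import Image

    SOURCE_DPI, TARGET_DPI = 400, 600

    scan = Image.open("page_400dpi.tif")          # illustrative file name
    scale = TARGET_DPI / SOURCE_DPI
    new_size = (round(scan.width * scale), round(scan.height * scale))

    interpolated = (scan.convert("L")             # work in grayscale
                        .resize(new_size, Image.Resampling.BICUBIC)
                        .convert("1"))            # back to bilevel
    interpolated.save("page_600dpi.tif", format="TIFF",
                      compression="group4",
                      dpi=(TARGET_DPI, TARGET_DPI))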

Other important factors include: the techniques available to the operator for handling material, the ways of integrating quality control into the digitizing work flow, and a work flow that includes indexing and storage. POB's requirement was to be able to deal with quality control at the point of scanning. Thus, thanks to Xerox, POB anticipates having a mechanism which will allow it not only to scan in batch form, but to review the material as it goes through the scanner and control quality from the outset.

The standards for measuring quality and costs depend greatly on the uses of the material, including subsequent OCR, storage, printing, and browsing. But especially at issue for POB is the facility for browsing. This facility, WATERS said, is perhaps the weakest aspect of imaging technology and the most in need of development.

A variety of factors affect the usability of complex documents in image form, among them: 1) the ability of the system to handle the full range of document types, not just monographs but serials, multi-part monographs, and manuscripts; 2) the location of the database of record for bibliographic information about the image document, which POB wants to enter once and in the most useful place, the on-line catalog; 3) a document identifier for referencing the bibliographic information in one place and the images in another; 4) the technique for making the basic internal structure of the document accessible to the reader; and finally, 5) the physical presentation of those documents on the CRT. POB is ready to complete this phase now; one remaining decision involves which material to scan.

******

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ DISCUSSION * TIFF files constitute de facto standard * NARA's experience with image conversion software and text conversion * RFC 1314 * Considerable flux concerning available hardware and software solutions * NAL through-put rate during scanning * Window management questions * +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

In the question-and-answer period that followed WATERS's presentation, the following points emerged:

* ZIDAR's statement about using TIFF files as a standard referred to a de facto standard: TIFF is what most people use and can typically exchange with other groups, across platforms, and occasionally even across display software.

* HOLMES commented on NARA's unsuccessful experience in attempting to run image-conversion software or to exchange files between applications: files that are supposedly TIFF go into other software that is supposed to accept TIFF but cannot recognize the format or deal with it, which renders the exchange useless. Regarding text conversion, he noted the different recognition rates obtained simply by substituting the make and model of scanner in NARA's recent test of a new company's "intelligent" character-recognition product. In the selection of hardware and software, HOLMES argued, software no longer constitutes the overriding factor it did until about a year ago; it is now important to look at both.

* About a month ago, Danny Cohen and Alan Katz of the University of Southern California Information Sciences Institute began circulating as an Internet RFC (RFC 1314) a standard for a TIFF-based interchange format for Internet distribution of monochrome bit-mapped images, which LYNCH said he believed would come to be used as a de facto standard.

* FLEISCHHAUER's impression from hearing these reports and thinking about AM's experience was that there is considerable flux concerning available hardware and software solutions. HOOTON agreed and commented at the same time on ZIDAR's statement that the equipment employed affects the results produced. One cannot draw a general conclusion, for example, that it is difficult or impossible to perform OCR on material scanned from microfilm simply because that proved true with one device, one set of parameters, and one set of system requirements; numerous other people are accomplishing just that, perhaps using other components. HOOTON opined that both the hardware and the software are highly important. Most of the problems discussed today have been solved in numerous different ways by other people, and though it is good to be cognizant of various experiences, this is not to say that what held in one case will always be so.

* At NAL, the through-put rate of the scanning process for paper, page by page, with OCR performed, ranges from 300 to 600 pages per day; without OCR the rate is considerably faster, although how much faster is not known. These figures are for scanning from bound books, which is a much slower process.

* WATERS commented on window-management questions: DEC proposed an X-Windows solution, which was problematical for two reasons: POB's requirement to be able to bring images down to the workstation itself and manipulate them there, and network usage.

******

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ THOMA * Illustration of deficiencies in scanning and storage process * Image quality in this process * Different costs entailed by better image quality * Techniques for overcoming various deficiencies: fixed thresholding, dynamic thresholding, dithering, image merge * Page edge effects * +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

George THOMA, chief, Communications Engineering Branch, National Library of Medicine (NLM), illustrated several of the deficiencies discussed by the previous speakers. He introduced the topic of special problems by noting the advantages of electronic imaging. For example, it is regenerable because it is a coded file, and real-time quality control is possible with electronic capture, whereas in photographic capture it is not.

One of the difficulties discussed in the scanning and storage process was image quality which, without belaboring the obvious, means different things for maps, medical X-rays, or broadcast television. In the case of documents, THOMA said, image quality boils down to legibility of the textual parts, and fidelity in the case of gray or color photo print-type material. Legibility boils down to scan density, the standard in most cases being 300 dpi. Increasing the resolution with scanners that perform 600 or 1200 dpi, however, comes at a cost.

Better image quality entails at least four different kinds of costs: 1) equipment costs, because a CCD (i.e., charge-coupled device) with a greater number of elements costs more; 2) time costs, which translate into actual capture costs, because manual labor is involved and because more data must be moved around in the scanning devices, across the network, and into storage; 3) media costs, because at high resolutions larger files have to be stored; and 4) transmission costs, because there is simply more data to be transmitted.
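
A rough calculation makes the scaling concrete (the page dimensions and the 10:1 compression ratio are illustrative assumptions, not figures from the presentation); the short Python sketch below shows how the raw size of a bilevel page grows with the square of the scan density:

    # Rough illustration of how file size, and hence media and
    # transmission cost, grows with scan density. Assumes an
    # 8.5 x 11 inch page scanned bilevel (1 bit per pixel); the 10:1
    # compression ratio is only a typical order of magnitude for
    # Group 4 compression of text pages.
    PAGE_WIDTH_IN, PAGE_HEIGHT_IN = 8.5, 11.0

    for dpi in (300, 600, 1200):
        pixels = (PAGE_WIDTH_IN * dpi) * (PAGE_HEIGHT_IN * dpi)
        raw_bytes = pixels / 8                    # 1 bit per pixel
        print(f"{dpi:>5} dpi: {raw_bytes / 1e6:6.2f} MB raw, "
              f"~{raw_bytes / 1e7:5.2f} MB at 10:1 compression")

Doubling the scan density quadruples the amount of data per page, which is why the media and transmission costs rise so quickly at higher resolutions.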

But while resolution takes care of legibility, other deficiencies in image quality have to do with contrast and with elements of the scanned page or image that need to be removed or clarified. THOMA therefore proceeded to illustrate various deficiencies, how they are manifested, and several techniques for overcoming them.

Fixed thresholding was the first technique described, suitable for black-and-white text, when the contrast does not vary over the page. One can have many different threshold levels in scanning devices. Thus, THOMA offered an example of extremely poor contrast, which resulted from the fact that the stock was a heavy red. This is the sort of image that when microfilmed fails to provide any legibility whatsoever. Fixed thresholding is the way to change the black-to-red contrast to the desired black-to-white contrast.
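
A minimal sketch of fixed thresholding, written in Python with NumPy (the threshold value and the page dimensions are assumptions made for illustration; actual scanning devices offer many selectable threshold levels, as noted above):

    # Hypothetical sketch of fixed (global) thresholding: every pixel is
    # compared against a single threshold chosen for the whole page,
    # which is appropriate when contrast does not vary over the page.
    import numpy as np

    def fixed_threshold(gray_page, threshold=128):
        """Map an 8-bit grayscale page (0 = black, 255 = white) to a
        bilevel image: pixels at or above the threshold become white,
        all others become black."""
        return (gray_page >= threshold).astype(np.uint8) * 255

    # Example: a poor-contrast page (e.g., dark text on heavy red stock
    # that scans as midtone gray) can be pushed to black-on-white by
    # choosing a threshold between the text and background gray levels.
    rng = np.random.default_rng(0)
    page = rng.integers(90, 170, size=(3300, 2550), dtype=np.uint8)
    bilevel = fixed_threshold(page, threshold=130)
    print(bilevel.min(), bilevel.max())           # prints 0 and 255

When contrast does vary across the page, a single global threshold no longer suffices; that is the case addressed by the dynamic thresholding listed among THOMA's techniques.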

