Microfilm has a future?

Thursday, August 2nd, 2018

Microfilm is profoundly unfashionable in our modern information age, but it has quite a history — and may still have a future:

The first micrographic experiments, in 1839, reduced a daguerreotype image by a factor of 160. By 1853, the format was already being assessed for newspaper archives, and the processes continued to be refined throughout the 19th century. Even so, microfilm was still considered a novelty when it was displayed at the 1876 Centennial Exposition in Philadelphia.

The contemporary microfilm reader has multiple origins. Bradley A. Fiske filed a patent for a “reading machine” on March 28, 1922, a pocket-sized handheld device that could be held up to one eye to magnify columns of tiny print on a spooling paper tape. But the apparatus that gained traction was G. L. McCarthy’s 35mm scanning camera, which Eastman Kodak introduced as the Recordak in 1935, specifically to preserve newspapers. By 1938, universities had begun using it to microfilm dissertations and other research papers.

During World War II, microphotography became a tool for espionage and for carrying military mail, and agencies soon recognized that massive, cross-referenced archives of information gave them an advantage. Libraries adopted microfilm by 1940, after realizing that they could not physically house an ever-growing volume of newspapers, periodicals, and government documents. As the war concluded in Europe, a coordinated effort by the U.S. Library of Congress and the U.S. State Department also put many international newspapers on microfilm as a way to better understand quickly changing geopolitical situations. Collecting and cataloging massive amounts of information, in microscopic form, from all over the world in one centralized location led to the idea of a centralized intelligence agency in 1947.

It wasn’t just spooks and archivists, either. Excited by the changing future of reading, in 1931, Gertrude Stein, William Carlos Williams, F. T. Marinetti, and 40 other avant-garde writers ran an experiment for Bob Brown’s microfilm-like reading machine. The specially processed texts, called “readies,” were something between an art stunt and a pragmatic solution to libraries’ need for more shelf space and better delivery systems. Over the past decade, I have redesigned the readies for 21st-century reading devices such as smartphones, tablets, and computers.

By 1943, 400,000 pages had been transferred to microfilm by the U.S. National Archives alone, and the originals were destroyed. Millions more were reproduced and destroyed worldwide in an effort to protect the content from the ravages of war. In the 1960s, the U.S. government offered microfilm documents, especially newspapers and periodicals, for sale to libraries and researchers; by the end of the decade, copies of nearly 100,000 rolls (with about 700 pages on each roll) were available.
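Those figures imply a striking total. A back-of-envelope check, using the approximate roll and page counts quoted above:

```python
# Back-of-envelope arithmetic for the microfilm figures above.
# Both counts are the approximate numbers given in the text.
rolls = 100_000        # rolls available for sale by the end of the 1960s
pages_per_roll = 700   # approximate pages on each roll

total_pages = rolls * pages_per_roll
print(f"About {total_pages:,} pages on film")  # About 70,000,000 pages on film
```

On the order of 70 million pages, in other words, from the U.S. government's sales program alone.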

Their longevity was another matter. As early as May 17, 1964, as reported in The New York Times, microfilm appeared to degrade, with “microfilm rashes” consisting of “small spots tinged with red, orange or yellow” appearing on the surface. An anonymous executive in the microfilm market was quoted as saying they had “found no trace of measles in our film but saw it in the film of others and they reported the same thing about us.” The acetate in the film stock was decaying after decades of use and improper storage, and the decay also created a vinegar smell—librarians and researchers sometimes joked about salad being made in the periodical rooms. The problem was solved by the early 1990s, when Kodak introduced polyester-based microfilm, which promised to resist decay for at least 500 years.

Microfilm got a competitor when National Cash Register (NCR), a company now known for introducing magnetic-stripe and electronic data-storage devices in the late 1950s and early ’60s, marketed Carl O. Carlson’s microfiche reader in 1961. This storage system placed more than 100 pages on one four-by-six-inch sheet of film in a grid pattern. Because microfiche was introduced much later than microfilm, it played a reduced role in newspaper preservation and government archives; it was more widely used in emerging computer data-storage systems. Eventually, electronic archives replaced microfiche almost entirely, while its cousin microfilm remained in use.

Microfilm’s decline intensified with the development of optical-character-recognition (OCR) technology. In the 1930s, Emanuel Goldberg designed a system that used OCR to search microfilm, reading characters on film and translating them into telegraph code. At MIT, a team led by Vannevar Bush designed a microfilm “rapid selector” that could quickly locate information on film. Ray Kurzweil further improved OCR, and by the end of the 1970s he had created a computer program, later bought by Xerox, that was adopted by LexisNexis, which sells software for electronically storing and searching legal documents.

[...]

Today’s digital searches allow a reader to jump directly to a desired page and story, eliminating one downside of microfilm. But there’s a trade-off: Digital documents usually omit the context. The surrounding pages in the morning paper or the rest of the issue of a magazine or journal vanish when a single, specific article can be retrieved directly. That context includes more than a happenstance encounter with an abutting news story. It also includes advertisements, the position and size of one story in relation to others, and even the overall design of the page at the time of its publication. A digital search might retrieve what you are looking for (it also might not!), but it can obscure the historical context of that material.

xkcd Digital Resource Lifespan

The devices are still in widespread use, and their mechanical simplicity could help them last longer than any current electronic technology. As the webcomic xkcd once observed, microfilm has better lasting power than websites, which often vanish, or CD-ROMs, which most computers can no longer read.

The xkcd comic gets a laugh because it seems absurd to suggest microfilm as the most reliable way to store archives, even though it will remain readable for 500 years. That lasting power keeps it a mainstay in research libraries and archives. And as cutting-edge technologies race toward ever more rapid obsolescence, past (and passed-over) technologies such as the microfilm machine won’t go away. They’ll remain, steadily doing the same work they have done for the past century, for at least five centuries more, provided the libraries they are stored in stay open and the humans who would read and interpret their contents survive.

Comments

  1. Candide III says:

    Several companies are developing a sort of microfilm based on fused silica glass. If the data is written as images, the only equipment needed to read it is a good microscope. The data is estimated to last >100 million years and the density per cubic inch is about the same as a DVD.

  2. Mike in Boston says:

    If the data is written as images, the only equipment needed to read it is a good microscope.

    That’s a lot more appealing than needing to rely on microfilm readers.

    The Rosetta Disk strikes me as sort of a gimmick, but the underlying “HD-Rosetta” technology is certainly impressive.

  3. Wang Wei Lin says:

    A hard copy is the only viable archiving technique for long timeframes. If all you need is a magnifying glass, then all later societies, regardless of advancement, will be able to access the record. As proof, the Oracle Bones of China have lasted thousands of years. Now that’s low tech.

  4. Kirk says:

    In the long run, everything is prone to deterioration and decay. Witness the efforts we’re going to when trying to read the carbonized remains of Roman scrolls preserved at Herculaneum.

    What would be interesting would be to pause and think about how one might best preserve data over the truly long haul: My suspicion is that you might be wanting to look around the natural environment, and take note of which species have been around the longest, unchanged since ancient times, and who have an abundance of genetic information. One might want to ask the question “Why would a plant or animal species be hauling around this much genetic deadweight, in terms of information, and not use it…?”. From there, you could identify things you might want to take a long, hard look at with regards to figuring out why the hell something like that would last, and what might possibly be encoded in their genome besides instructions on how to make a lungfish or flowering plant…

    Frankly, if I were looking to record for posterity, and I were a master of biological sciences, I might just code my data into something that I also “froze” genetically, so that changes would be minimized down the ages.

    Just suggestin’…
