Today, it’s a little difficult to think of a time when digital data didn’t exist. We get so much of our information from browsing the web that imagining a time before the internet, before computers, and before metadata is hard to do unless you lived it. The evolution from data to metadata has been a long and fascinating one, though, helped along by some of the most brilliant men who ever lived. From Melvin Dewey—of the Dewey Decimal System—to Alan Turing, whose life was recently dramatized in the Oscar-nominated film The Imitation Game, these innovative thought leaders can be thanked for the information age and everything it has brought us.
The Beginning: Melvin Dewey and the Decimal System
Born in 1851, Melvin Dewey was the first man to push data in the direction of metadata and digitization. At just 21, while working at the Amherst College library, Dewey was tasked with designing a new system for organizing and cataloging the library’s books. Before Dewey’s involvement, the Amherst College library—and most other libraries, for that matter—had organized books by their acquisition dates. There were obvious benefits to this system for librarians: when new books came in, they could simply be placed at the end of a shelf. For library guests looking for relevant texts, however, the system was next to useless.
Dewey devised a system that organized books based on subject classes. Each book was given a three-digit code based on class—with classes including subjects like religion, language, art & recreation, and literature. A book could then be assigned decimal numbers to determine where on the shelf it belonged in relation to other books in the library. In addition to making it possible for guests to find the books they were looking for in the library, the Dewey Decimal System also made it easier for librarians to re-shelve books after they were returned. It essentially provided a numerical map of every shelf in the library, providing an exact numbered location for every book in circulation.
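The idea above can be sketched in a few lines of code. This is a minimal illustration, not a real library-cataloging system: the call numbers and titles are made up, but they show how a Dewey call number (class digits plus optional decimals) gives every book a single sortable shelf position.

```python
# Illustrative sketch: Dewey call numbers sort as plain numeric values,
# so sorting them reproduces the exact shelf order of the books.

def shelf_order(call_number: str) -> float:
    """Convert a Dewey call number like '812.54' into a sortable value."""
    return float(call_number)

# Hypothetical catalog entries (call number -> subject).
books = {
    "200": "Religion (general)",
    "400": "Language (general)",
    "811.3": "American poetry",
    "812.54": "American drama",
}

# Walking the sorted call numbers is walking the shelves left to right.
for call in sorted(books, key=shelf_order):
    print(call, "->", books[call])
```

Re-shelving a returned book is just a lookup of where its call number falls in this ordering, which is exactly the "numerical map" the article describes.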
The Next Steps
Unsurprisingly, Melvin Dewey came to be known as the “Father of Modern Librarianship.” He even went on to help found the American Library Association (ALA). He was also one of the men who took the first steps toward metadata.
Another thought leader who helped data on its path to digitization was Charles Babbage. A British mathematician born in 1791, Babbage passed away in 1871—around the same time that Dewey was devising his Decimal System. If Dewey is the father of the modern library, then Babbage is “the father of the computer.” Unlike Dewey, though, Babbage didn’t live to see his ideas fully realized. In 1822, Babbage formulated an idea for what he called the “difference engine,” a machine that could “compute polynomials of a variable.” The concept for the machine was viable, but the technology of the time meant that Babbage wasn’t actually able to build it—though he did try, spending a fortune in funding along the way.
Eventually, the difference engine was abandoned in favor of a different machine: the “analytical engine.” The analytical engine was, conceptually, a computer that was meant to process and analyze information. This machine was even more complicated than the difference engine, which meant that—once again—the technology didn’t exist yet to bring Babbage’s vision for sophisticated computing to fruition.
It would be nearly 100 years before a British scientist named Alan Turing would devise and build a functional system along the lines of what Babbage had envisioned. In 1936, Turing published a paper, “On Computable Numbers,” proposing a “Universal Machine” that would be capable of “decod[ing] and perform[ing] any set of instructions.” He theorized that, just as humans follow set procedures to perform tasks, a computer could be built to do the same. Three years later, Turing began working for the British government on an electromechanical codebreaking machine called the Bombe. The Bombe was ultimately the machine that broke the Nazi Enigma code during World War II, making it possible for the Allies to read coded messages sent within the German military. The accomplishment was key to the Allied victory in World War II.
Turing also went on to design one of the earliest stored-program digital computers, the Automatic Computing Engine (ACE), and, perhaps even more notably, he was the first person to seriously propose that computers could replicate human thought. In 1950, after the war was over, Turing wrote a paper, “Computing Machinery and Intelligence,” that detailed the concept of the “imitation game”: a test of how closely a computer’s answers to questions could resemble a human’s responses. This concept, now generally referred to as the “Turing test,” was a key turning point for the study of artificial intelligence.
The Coining of the Term “Metadata”
The term “metadata” was coined in 1971 by tech expert Bernard Plagman. Yet according to Merriam-Webster, “metadata” is defined simply as “data that provides information about other data.” By that definition, metadata isn’t anything new; it has been around for ages. The Dewey Decimal System is, at its core, a metadata system designed to aid organization and convenience. And as PhotoMetadata.org notes, the photography profession has long relied on metadata. Information compiled about a photograph—captions noting who is in the picture or when it was taken, bylines identifying the photographer, copyright information, etc.—all technically qualifies as metadata.
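The “data about data” definition is easy to see in code. This is a hypothetical sketch—the field names below are illustrative, not drawn from any real photo-metadata standard:

```python
# The photo itself is the data: here, just a filename and a size.
photo = {"filename": "eclipse.jpg", "size_bytes": 2_048_576}

# The metadata is separate, structured information *about* that photo.
photo_metadata = {
    "caption": "Solar eclipse over the lake",
    "byline": "Photo by J. Smith",
    "copyright": "(c) 2017 J. Smith",
    "date_taken": "2017-08-21",
}

# Printing the metadata describes the photo without being the photo.
for field, value in photo_metadata.items():
    print(f"{field}: {value}")
```

The same pattern holds for the Dewey Decimal System: the call number is not the book, but structured information about where the book belongs.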
Thanks to the accomplishments of Charles Babbage and Alan Turing, it became possible to replicate and store that information digitally, without any loss in fidelity. As a result, most of us think of metadata today as a digital term, usually referring to information about a website or a file. Document management software like eFileCabinet even takes metadata into account, ensuring that all metadata stays a part of its respective file or folder—regardless of how files are moved, edited, or sent to new users. In other words, managing metadata today is as important as managing data itself—something that Bernard Plagman foresaw when he coined the term 45 years ago.