Friday, November 14, 2008

The Social Life of Film Preservationists

Gracy’s piece (Gracy, Karen. Film Preservation: Competing Definitions of Value, Use, and Practice. Chicago: Society of American Archivists, 2006. Chap. 8) is really interesting from our perspective as students in the iSchool, because it examines the work that a film archivist actually does, from inspecting the film, to negotiating with labs, to selecting what can be saved. It would be helpful to study many different types of positions in this field (libraries, archives, digitization), since professionally there are so many different kinds of positions and kinds of work to be done, and this type of anthropological writing, examining the work that people do, is so rare. We read a chapter of Gracy’s work in Caroline Frick’s Politics of Preservation course last year. It reminded me of readings I did in Doty’s Users course about studying the way that people work, share information, and gain knowledge on the job, primarily through gaining experience and watching other ‘experts’ in the field (The Social Life of Information, Brown and Duguid). This is why interviewing people and working on case studies of archives and other institutions can be so helpful for students: people on the job can succinctly sum up the main issues within the field.

For me, the most interesting aspects were the political discussions about deciding which films should be saved. This is something that a digital preservationist really decides, and there is never a right answer.

Friday, November 7, 2008

Viewing two digitized film collections

I looked at two archives online (note: It seemed that The University of Maryland site’s videos were not accessible to people outside of the University of Maryland).

The first site I looked at was the Internet Archive (archive.org), which has a lot of interesting films to watch. I had been on this site before. Anyone can upload videos here. Collections are subdivided into subject categories, or people can just use a keyword search. I first looked in the “Cultural and Academic Films” category, and found the University of Pennsylvania’s Museum of Archaeology and Anthropology collection, which includes a collection of digitized 16mm films made by Watson Kintner, during his travels around the world from 1933 to 1969. I watched “1967 #12 Tunisia.”

It is available in downloadable MPEG 1, MPEG 2, MPEG 4 and streaming formats. Like most films on the Internet Archive site, the work is not formally cataloged, just tagged with a few words. It does, though, include a shot list and reel numbers. This must all have been compiled originally at the University of Pennsylvania and uploaded by them.

When viewed “full screen” it looks bad, with lots of blocky compression artifacts. When it is not blown up, it doesn’t look so bad. It’s silent and in color. The color looks beautiful to me, though maybe it is a little faded.




“Theme from Shaft Hip-Hop Remix Video” (2008) was listed as a featured open-source video, so I thought I would check it out. I'm not sure how they decide to 'feature' videos. The footage of this one was interesting to watch, but I have to say I didn't really get into the remix.



This was just something that someone made, so it isn’t cataloged, though there are a good number of tags on the work. I’m wondering how some of this stuff is not under copyright, since it is in the open source section, and I don’t really believe that this video could just be used by anyone. Since it comes from a lot of different clips, some scenes look much better than others, which is interesting to look at in a digitization context. For the most part, it looks pretty badly pixelated though, probably the worst of anything I watched today.

It’s simply described as, “A Hip-Hop remix of Isaac Hayes' "Theme from Shaft" strictly for the B-Boys. The video includes classic footage from Wattstax, Shaft, Wild Style and other classics paying tribute to Mr. Hayes' contributions to Hip-Hop culture.” (Presumably this was written by the person who made this).

It’s available in Quicktime and MPEG4.

Finally I looked at the amazing “Disneyland Dream,” which is 1955 home movie footage of the Barstow family, who won a nationwide contest to take a trip to Disneyland. In 1995, Mr. Barstow narrated the story, turning the film into a little documentary, and he is quite funny:



It’s available in MPEG 1, 2 and 4, as well as streaming. Again, it looks pretty pixelated. I chose this item because the Internet Archive shows how many times things have been downloaded, and this was one of the most downloaded videos. They also allow users to rate the videos with stars, which is somewhat helpful. There’s no information about how long a video is until you start playing it; this was true of all of them. This one also was not cataloged, just tagged. The Internet Archive gives information about the producer, but since everyone uploads their own content, I would assume that most do not include information about the digitization process, and these three were no exception. The Internet Archive does provide thumbnails from every minute of a video, which, while definitely not ‘cataloging,’ does give the user a little more information before he or she watches the film.

Overall, I had no problem viewing the streaming files. They didn't start and stop, and they just played in the browser, which is simple.

I also looked at the site “American Memory” motion picture collection from the Library of Congress, http://memory.loc.gov/ammem/browse/ListSome.php?format=Motion+Picture

I ended up viewing ‘Early Films of New York, 1898-1906.’ These files were much better cataloged than the Internet Archive collection, which makes sense because they are from the Library of Congress, while on the Internet Archive anyone can add anything. They also have more information about how the videos can be viewed, which is helpful. Users can view the videos in RealMedia, QuickTime, and MPEG formats, all streaming. As a Mac user, I was able to easily use the QuickTime and MPEG formats, though they are quite small. (The MPEG is a little bigger.) And the quality is not great in either version.

One thing that is really nice about this site is that they give you contextual information about the collection. “New York City at the Turn of the Century,” “America at the Turn of the Century,” and “Pioneer Cameramen” are some of the educational pages that this collection includes. I would think that this could be a good resource for teachers. While this is not really a way to ‘describe’ the work, it does give the user more background information, which I would think a library would want to provide. Users are able to browse by subject, search by keyword, or view an entire list of the films available. I browsed by subject. These records included information about the copyright holders, date created, cameraman, location of filming, and descriptions. They also include the location of the physical film, as well as a digital ID number, which of course would be useful.

I watched a “Sleighing Scene” created by Thomas Edison’s company in 1898. http://memory.loc.gov/cgi-bin/query/S?ammem/papr:@FILREQ(@field(SUBJ+@od1(Sleighs--New+York++State+--New+York+))+@FIELD(COLLID+newyork))

As well as a film of the demolition of the Star Theatre in New York, over approximately 30 days in 1901:

http://memory.loc.gov/cgi-bin/query/S?ammem/papr:@FILREQ(@field(SUBJ+@od1(Star+Theatre++New+York,+N+Y+++))+@FIELD(COLLID+newyork))

And a parade in Washington Square Park, “Parade of "exempt" firemen / American Mutoscope and Biograph Company.”

http://memory.loc.gov/cgi-bin/query/S?ammem/papr:@FILREQ(@field(SUBJ+@od1(Washington+Square+Park++New+York,+N+Y+++))+@FIELD(COLLID+newyork))

Thursday, October 30, 2008

Film generation

Today my topic is: “Film generation (original negative, fine grain or projection print, answer print, work print, etc.)”

For a preservationist, it’s important to think about the life-cycle of a film object, starting with its beginning in the camera, through its adolescence as it is edited and processed, its middle-age as it reproduces into prints, to its somewhat final resting place, as its decaying body is kept alive in an archive. Now, with digital technology, many of the traditional processes used in the creation of a film are done digitally, but film archives, of course, collect old films, which could have been created in any number of ways. When dealing with issues related to ‘restoration,’ preservation, and arguments about different versions of a work, knowing what element of the film you have in your collection could be important.

Generally, it seems that film archivists want to preserve the version that is as close to the first generation as possible, though later versions may also be valuable and interesting for research purposes. Each time a film is copied, the copy is a generation further from the original, and quality can be lost. We have probably all noticed that quality is lost when film is transferred to video, but the same thing can happen through generations of film, especially if the elements are not handled well, or if it is a popular film and many copies are made.
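To make the idea of generational loss concrete, here is a toy sketch. It is not based on any measured film data (the retention figure is invented purely for illustration), but it shows why archivists prize early-generation elements:

```python
# Illustrative only: model each duplication generation as retaining a
# fixed fraction of the previous generation's quality. The 0.9 figure
# is a made-up number, not a measured property of any film stock.
def quality_after(generations, retention=0.9):
    return retention ** generations

for g in [0, 1, 3, 5]:
    print(f"generation {g}: {quality_after(g):.2f} of original quality")
```

The compounding is the point: even a small per-generation loss adds up quickly, which is why a fifth-generation print can look noticeably worse than the camera negative.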

As I’m unfamiliar with a lot of this terminology, I’ll try to provide definitions of some of the things I’ve been asked to write about:

OCN: The original camera negative (OCN) is defined on Wikipedia as “the film in a motion picture camera, which captures the original image. This is the film from which all other copies will be made. It is known as raw stock prior to exposure.” (http://en.wikipedia.org/wiki/Original_camera_negative).


Workprint: “After the film is processed by the film lab, they will assemble the camera rolls into lab rolls of 1200 to 1500 feet. Workprints may be made for viewing dailies or editing the picture on film.” (http://en.wikipedia.org/wiki/Original_camera_negative).

“A workprint is a rough version of a motion picture, used by the film editor(s) during the editing process. Such copies generally contain original recorded sound that will later be re-dubbed, stock footage as placeholders for missing shots or special effects, and animation tests for in-production animated shots or sequences.” (http://en.wikipedia.org/wiki/Workprint).

Answer print:

“Answer print refers to the first version of a given motion picture that is printed to film after color correction on an interpositive. It is also the first version of the movie printed to film with the sound properly synced to the picture… They are used by the filmmaker and studio to ensure that the work going in to the film during the post-production process is cohesive with the final goals for the project.” (http://en.wikipedia.org/wiki/Answer_print).


Interpositives and Internegatives

“After approval of the answer print, interpositives (IPs) and internegatives (INs) are created, from which the release prints are made.

…the IPs and INs are regarded as the earliest generation of the finished and graded film, and are almost always used for transfers to video or new film restorations.” (http://en.wikipedia.org/wiki/Original_camera_negative).


“An interpositive, IP or master positive is an orange-based motion picture film with a positive image made from the edited camera negative… The interpositive is made after the answer print has been approved. All lights and opticals from the answer print are repeated when striking the interpositive, and once the IP exists, the original negative can be vaulted.

…[it] historically has had only one purpose, namely, to be the element that is used to make the internegative…the only time the IP is touched is on the occasion of making the first or a replacement internegative. Since interpositives are used so rarely, they are usually the film element that is in the best condition of all the film elements." (http://en.wikipedia.org/wiki/Interpositive).

“An internegative is motion picture film stock used to make release prints for distribution to movie theatres” (http://en.wikipedia.org/wiki/Internegative).

Release print:

“A release print is the reel of film that is sent to a movie theater for exhibition.” (http://en.wikipedia.org/wiki/Release_print).

I’m assuming that the fine grain print and the projection print are somewhat similar to what is called here the “release print.” Anyone with more information can correct me on that. Thanks.

Thursday, October 23, 2008

What’s going on right now in news and television preservation?

I'm sure there is a lot going on in both of these areas, but I decided to focus on a project currently underway, funded by the Library of Congress, called Preserving Digital Public Television. ( http://ptvdigitalarchive.org/ ) While some programs are still shot on film or on analog video, many are digitally shot, and virtually all of them are digitally edited. This digital content needs to be preserved. The program site states that there is no mandate and little funding for the preservation of the rich cultural resource that is public television. The project is working on developing a repository and standards, creating a test model, creating guidelines for appraisal and looking at possible sources of funding for the project.

The Web site also discusses the difference between preserving the analog materials and the born-digital files:

“We are rapidly approaching the “tapeless environment” – where programs will live solely as “disembodied” assets, attached to their metadata, distributed and stored in a totally digital environment. This introduces an entirely new set of issues and problems relating to long-term program preservation, for which no coordinated strategy yet exists in public television.

While digital material is generally easier to duplicate, it is also more fragile. Hard disks fail at the rate of roughly 2 percent per year, which means that digital materials must be constantly checked, backed up, restored, and migrated from older to new disks.”
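That constant checking is usually done with checksums, sometimes called "fixity checks." Here is a minimal sketch of the idea using Python's standard library; the file contents are stand-ins, not real program files:

```python
# A minimal sketch of fixity checking: record a checksum for each file
# at ingest, then recompute it at audit time. A mismatch signals that
# the stored copy has been corrupted and must be restored from backup.
import hashlib

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# On ingest, record the checksum alongside the file.
original = b"program master file contents"
recorded = sha256_of(original)

# On a later audit, recompute and compare.
assert sha256_of(original) == recorded          # intact copy passes
assert sha256_of(b"bit-rotted bytes") != recorded  # corruption detected
```

In a real repository this runs over every stored file on a schedule, which is what makes the "constantly checked" part of the quote above an actual workflow rather than an aspiration.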

They have been making inventories of at risk programs and creating educational materials on file formats and metadata. They are also working with Turner Broadcasting to develop a standard MXF wrapper for preservation of program files.

In the future they hope to work preservation into the workflow of news and television programs. They are working with PBS and NYU on this project. The organization of the project is made more difficult because video and analog materials are not all stored together, and even though they are public programs, copyright is an issue.

CPB has been developing the PBCore metadata standard (http://www.pbcore.org/). PBCore is similar to the Dublin Core (http://dublincore.org/) schema. Similar elements include Identifier, Title, Subject, Description, Creator, Language, etc. Some elements I did not recognize from Dublin Core were AudienceLevel and AudienceRating.
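As a rough sketch of that overlap (the element names follow the two standards' documentation, but the record values here are invented):

```python
# Which PBCore elements map straight onto Dublin Core, and which are
# broadcast-specific additions. Values below are invented examples.
dublin_core_overlap = {"Identifier", "Title", "Subject", "Description",
                       "Creator", "Language"}
pbcore_only = {"AudienceLevel", "AudienceRating"}

record = {
    "Identifier": "wnet-2008-001",        # hypothetical program ID
    "Title": "Example Public TV Program",
    "AudienceLevel": "General",
    "AudienceRating": "TV-G",
}

for element in record:
    shared = element in dublin_core_overlap
    print(element, "- shared with Dublin Core" if shared else "- PBCore-specific")
```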

To compare what the U.S. is doing to other countries, the Web site notes that while the U.S. is having problems making things accessible online because of copyright, the BBC, the Institut National de l’Audiovisuel (INA) in France, and programs in the Netherlands and Japan are all working to put massive amounts of material online and accessible.

Finally, if you are interested, here is a Web cast of Nan Rubin speaking about preserving digital public television at Thirteen: http://www.loc.gov/today/cyberlc/feature_wdesc.php?rec=3848

Friday, October 17, 2008

VRA Core

According to the VRA Core Web site (http://www.vraweb.org/projects/vracore4/), VRA Core 4.0, the most recent iteration of this metadata schema, was created by the Data Standards Committee of the Visual Resources Association (VRA, an organization dedicated to research and education related to managing images), and is geared towards the ‘cultural heritage community.’ It can be used to describe works of visual culture, images that document those works, as well as collections of these objects. Version 4.0, released in 2007, is closely related to the content standard CCO (Cataloging Cultural Objects). VRA Core 4.0 can also be used within METS.

The VRA Core Web site includes a PDF, “VRA Core 4.0 Introduction,” from which the following information is derived. VRA Core does not really have any required element types; however, there are 5 core element types that generally should be described for all works. Those are:

WORK TYPE (what)
TITLE (what)
AGENT (who)
LOCATION (where)
DATE (when)

In VRA Core 4.0, “Agent” is a broader term used instead of “Creator,” and can be refined using sub-elements, such as “name,” “role,” and “culture.” “Culture” would include data about the culture or nationality of that agent. Another element, “Cultural Context,” denotes the culture within which the work or image was created, which is not always the same.

The VRA Data Standards Committee has also developed an XML Schema for VRA Core 4.0 elements. As such, data is divided into elements, sub-elements and attributes. (The VRA Core 3.0 used ‘qualifiers’ instead of sub-elements and attributes.)

In VRA Core 4.0, attributes can be used to modify an element or sub-element. Some attributes are global because “they can be used to modify any element or sub-element rather than being tied to any specific one.” These include: extent, dataDate, href, pref, refid, rules, vocab, source, and xml:lang.

The XML Schema allows collections to share information with others, and VRA Core 4.0 includes both a restricted and unrestricted schema, to offer collections some options for sharing.

VRA Core suggests on their site using controlled vocabularies to populate certain metadata fields. Suggestions include: Thesaurus for Graphic Materials (TGM), Art & Architecture Thesaurus (AAT), Union List of Artist Names (ULAN), and the Thesaurus of Geographic Names (TGN). They also suggest using a standard for formatting the data in the record, the Cataloging Cultural Objects (CCO) guidelines.

As with the Dublin Core standard, every record describes only one object. This is referred to as the ‘1:1 principle.’ However, the “relation” element can show the ways in which different objects and resources are related. These relations can be ‘work to work,’ ‘image to work,’ or ‘image to collection/work to collection.’

There are also administrative elements, like “source” and “rights.” “Source” operates differently for a work and an image record. For the Work Record, “source” includes data about where the cataloging information is taken from, whereas for the Image Record, “source” includes data about the publication or person from whom the image is taken.
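Out of curiosity, here is a hedged sketch of what a small VRA Core 4.0-style work record might look like, built with Python's standard library. The element and attribute names follow the element set described above, but I've omitted the official namespace handling, and all the values are invented:

```python
# Sketch of a VRA Core 4.0-style work record covering the five core
# element types (work type, title, agent, location, date). Values are
# invented; real records also carry the VRA namespace, which is omitted.
import xml.etree.ElementTree as ET

work = ET.Element("work")
ET.SubElement(work, "worktype").text = "painting"
ET.SubElement(work, "title", pref="true").text = "Untitled Landscape"

# "agent" replaces 3.0's "creator" and is refined with sub-elements.
agent = ET.SubElement(work, "agent")
ET.SubElement(agent, "name", type="personal").text = "Unknown Artist"
ET.SubElement(agent, "role").text = "painter"
ET.SubElement(agent, "culture").text = "American"

ET.SubElement(work, "location").text = "Example Museum"  # hypothetical
ET.SubElement(work, "date", type="creation").text = "1890"

print(ET.tostring(work, encoding="unicode"))
```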

Monday, October 6, 2008

File Systems: An Important Topic for Digital Storage and Preservation

Before I started looking into what file systems were, or why they might be important for digital preservation, I had an idea of what they might be: the way that files are organized on your computer. There is, of course, a little bit more to it than that, and those of us interested in the organization of digital information, should know about the basics.

Different computer operating systems have different file systems for organizing information, both for presentation to the user through the GUI, and for organizing data on the hard disk for efficient access by the computer. The design of the file system shows up in the way that ‘files’ are arranged in ‘folders’ within ‘directories’ on your computer. Is there a flat or a hierarchical structure? How long can file names be?

According to Wikipedia (http://en.wikipedia.org/wiki/File_system):
“More formally, a file system is a special-purpose database for the storage, hierarchical organization, manipulation, navigation, access, and retrieval of data.”

Common file systems include NTFS, the standard for Windows NT, including Windows 2000, Windows XP, and Windows Vista. It replaced a file system called FAT, improving on it in a number of ways, according to Wikipedia (http://en.wikipedia.org/wiki/NTFS), including support for metadata, improved and speedier access to files, more security, and "journaling," which means that the file system logs its own changes, so that less information is lost in a crash. Macs now use a file system called HFS+.
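To see some of what a file system tracks beyond the file contents themselves (size, timestamps, hierarchy), here is a small standard-library sketch; the file and folder names are made up:

```python
# A file system stores metadata about every file (size, modification
# time, location in the hierarchy), which os.stat and os.walk expose.
import os
import tempfile

with tempfile.TemporaryDirectory() as root:
    sub = os.path.join(root, "photos")
    os.mkdir(sub)
    path = os.path.join(sub, "scan001.tif")  # hypothetical file name
    with open(path, "wb") as f:
        f.write(b"\x00" * 1024)

    info = os.stat(path)
    print("size in bytes:", info.st_size)
    print("last modified:", info.st_mtime)

    # Walking the hierarchy mirrors how the GUI presents folders.
    for dirpath, dirnames, filenames in os.walk(root):
        print(dirpath, dirnames, filenames)
```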

I actually (surprisingly) found a really interesting and funny article on the history of file systems from Ars Technica: “From BFS to ZFS: past, present and future of file systems” (http://arstechnica.com/articles/paedia/past-present-future-file-systems.ars) by Jeremy Reimer, from March 2008.

Thursday, October 2, 2008

The Format Wars: VHS

VHS (Video Home System) was developed by JVC in 1976 and beat out Sony’s Betamax, which had higher playback quality, presumably because VHS could record for longer and cost less. VHS emerged in the 1980s as the standard for home viewing and recording and reigned until it was superseded by the DVD, which was introduced in 1997. Many studios stopped releasing VHS tapes by 2006, opting instead for DVDs. However, VHS tapes are still popular for recording television shows. VHS cassettes house ½ inch wide magnetic tape with a recording time of between two and six hours (Wikipedia: VHS, http://en.wikipedia.org/wiki/VHS).

Wikipedia goes into more of the technical information on VHS:

“VHS tapes have approximately 3 MHz of video bandwidth, which is achieved at a relatively low tape speed by the use of helical scan recording of a frequency modulated luminance (black and white) signal, with a down-converted "color under" chroma (color) signal recorded directly at the baseband. Because VHS is an analog system, VHS tapes represent video as a continuous stream of waves, in a manner similar to analog TV broadcasts.”


VHS tapes suffer from degradation of quality whenever they are copied. According to the EAI site on preservation of video (http://resourceguide.eai.org/preservation/singlechannel/basicquestions.html) there is no universal preservation format for videotapes. They state that DigiBeta is currently the archival standard.

Some formats similar to the VHS cassette are VHS-C, which was used in home camcorders, and Super VHS (S-VHS), a higher-quality version of VHS aimed at professionals. (Video Preservation Web site from Stanford: http://videopreservation.stanford.edu/vid_id/vhs.html).

This is a pretty cute video made about the current format war between VHS and DVDs:



The virtual museum of vintage VCRs includes some information about VHS formats, besides other video formats:

http://www.totalrewind.org/mainhall.htm

And the following includes some information about the very important 'Betamax' case, which deals with copyright law related to consumers’ rights to record copyrighted video material for in-home use:

http://www.museum.tv/archives/etv/B/htmlB/betamaxcase/betamaxcase.htm

Thursday, September 25, 2008

Looking at File Formats and Digital Preservation: MPEG-4

MPEG-4 Part 14 (one part of the Moving Picture Experts Group’s MPEG-4 suite of standards for compressing streams of audio and video) is a file format also referred to as MPEG-4, mp4, and m4a (among others), used to store digital audio and digital video streams. The official filename extension for MPEG-4 Part 14 files is .mp4. (According to the Wikipedia article: MPEG-4 Part 14.)

The Library of Congress' site on digital preservation of file formats states, "This format is intended to serve web and other online applications; mobile devices, i.e., cell phones and PDAs; and broadcasting and other professional applications."

According to a page found on NYU’s film preservation site, the fact that MPEG-4 files are able to contain audio, video, and subtitle streams makes it difficult to determine the content of these files, and could cause problems when migrating the files, as the different stream types are all encoded differently. Apple began using the .m4a file extension to distinguish certain audio files from other types of MP4s, but this practice is not universal, and some audio MP4s will not have this file extension. The format also includes standardized intellectual property protection coding, which could make it difficult to play back those files in the future.
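For the curious, MP4 files are structured as "boxes": a 4-byte big-endian length, a 4-byte type code, then a payload, with a leading 'ftyp' box declaring the file's "brand" (which is one clue to whether a file is audio-only). Here is a sketch of reading that header; the bytes are hand-made stand-ins for a real file's opening box:

```python
# Parse the leading box of an MP4-style byte stream. The fake_header
# below is a hand-crafted stand-in: a 16-byte 'ftyp' box whose major
# brand 'M4A ' would suggest an audio MP4.
import struct

fake_header = struct.pack(">I", 16) + b"ftyp" + b"M4A " + struct.pack(">I", 0)

def first_box(data: bytes):
    size, box_type = struct.unpack(">I4s", data[:8])
    return size, box_type.decode("ascii")

size, box_type = first_box(fake_header)
print(box_type)                           # 'ftyp'
major_brand = fake_header[8:12].decode("ascii")
print(major_brand)                        # 'M4A '
```

Tools that identify file formats for preservation purposes do essentially this, walking the boxes to work out what streams a container actually holds.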


While audio MP4 files are considered to be of superior quality with smaller file sizes than MP3s, some say that licensing issues keep MP4s from becoming more popular. The NYU site states, “The MP4 file is open standard; however, the codecs (compression schemes) are commercially licensed. Over two dozen companies claim patents on the MPEG4 suite. These licenses cover the manufacture and sale of devices or software and, for some content disseminators, levy fees according to number of endusers or the extent of content delivered.” The fact that MP4s are not as widely popular may mean that they will become obsolete in the future. The NYU site further states that, “The Florida Center for Library Automation (FCLA) digital archive has given the file format a rating of "low-confidence" for its digital preservation recommendations.”

MPEG-4 files can be played by: iTunes, QuickTime Player, Winamp, RealPlayer, VLC Media Player, foobar2000, Avidemux, KSP Sound Player, Media Player Classic, MPlayer, and Nero Media Player.

Apple includes an info page on MPEG-4s; they write that their QuickTime application was the foundation for development of the multimedia MP4 file format: http://www.apple.com/quicktime/technologies/mpeg4/. They write, "Just as QuickTime does, MPEG-4 also scales to transport media at any data rate — from media suitable for delivery over dial-up modems to high-bandwidth networks." This seems to be one of the main strengths of this format: its ability to provide higher quality files over a variety of different data rates. They also write on the Apple site that the MPEG-4 standard allows interoperability between different playback products.

Thursday, September 18, 2008

Image Management/Editing Software

Today I’m researching and writing about image management software (IMS) and image editing software (not to be confused with Digital Asset Management Systems!). I’ll sketch out a general definition of these kinds of tools, however, since we are talking about different, unique software programs that do very different things, there is often crossover in terms of functionality. It just really depends on what the software you decide to get can actually do, and what you need for your particular digitization project.

My professor Maria Esteva gives a good general description of these image management/editing programs in a report that she wrote in 2007 for the Texas Heritage Digitization Initiative called “Recommendations and Best Practices for Management of Derivative Digital Objects”:

“The purpose of IMS is to aid imaging projects from image creation in a scanner or digital camera, through storage in a hierarchical file storage system, digital assets management system (DAM), or institutional repository system (IR). Available in the market and as open source free tools there are various types of IMS with different functionalities.”

Image management software and image editing software help between the stages of capturing the image and getting the image to its somewhat final resting place, when it is published or somehow made available to users. Esteva further states that image management/editing software can be used to enhance and edit images (crop, color correct, resize, rename, etc.), while other products edit and help to organize and describe images, which can be very helpful if you are dealing with a great number of images.

While depending on your project you may decide that you need to purchase a product, there are also free versions of image management software that can be used if you are operating on a shoestring, as many preservation projects are. Some of these are open-source, which means that they can be modified to suit the purposes of your program. It is important to have someone on your team with some programming background if this is the option you choose.

TASI (a non-profit digital media resource center from the UK) did a survey of different types of image management software and compares them here: http://www.tasi.ac.uk/advice/delivering/imsoftware.html

It was updated in February 2008, so it should be pretty up-to-date, though these things are always changing.

Just a few (of many) things to keep in mind when looking for the image management software for your projects are:

• What file formats does the software support? Just a few, or tons? You may just need a few, but it helps to have different formats if you want to have master files, and derivatives.

• Are you able to rename, resize, convert formats, easily copy/move/delete images from various locations, create categories of images, view thumbnails or other versions of the images, and edit/crop? Can you do these things in batches? That’s important because batch processing can save lots of time, which saves money.

Also, in terms of image editing, TASI writes, “Some systems will apply edits directly to the image as you make them - which makes it impossible to recover an earlier version if you change your mind. Some systems will create a copy, providing you with a level of version control. Other systems store details of the edits you make and only actually apply them to the image when it is exported or saved into a different size or format” (http://www.tasi.ac.uk/advice/delivering/choose-ims.html).

• Can you easily navigate and search within the images?

• Can you use descriptive metadata to search, retrieve and track the project status (in batches)? Can this descriptive metadata and other technical metadata be moved to other systems easily?

• Does it automatically create backup copies?

• Is it easy to record who did what to which images? Can you control who is able to do what? Since IMS can be used to manage workflows this can be important. The TASI site states, “If you're using a system to manage your workflow, then versioning and auditing features (e.g. who edited the image) are worth looking out for. These can support your quality assurance (QA) processes, helping you to pick out where any errors may be being introduced” (http://www.tasi.ac.uk/advice/delivering/choose-ims.html).


The TASI site points out that with commercial products it is important to keep in mind that the vendors may not stay in business in order to be able to update the systems or provide service on them. You also may not be able to change or even know about certain parts of a commercial software package. And finally, many of these software products are not made for educational or heritage markets, but instead for professional photographers and corporate clients (http://www.tasi.ac.uk/advice/delivering/ims2.html).

Thursday, September 11, 2008

Reaction to the North Carolina ECHO Project Management Guidelines

I just finished going over the North Carolina ECHO Project Management Guidelines, which include a good, brief overview of the steps needed to manage any small or large, institutional digitization project.

One of the main points brought up is that ‘change’ is a necessary and inherent part of any of these projects, and building in the ability not only to deal with change, but possibly to use it as a tool or an asset, is an important part of achieving your project goals. To that end, they talk about staffing your project, and the fact that while the skill sets of employees are important, a certain amount of learning and training has to happen with every position, not only once but continually, so it is important that employees are flexible and can take direction and/or learn well. I think that this has a lot to do with the ability of everyone working on the digitization project to work well with others and communicate well.

The project guidelines state that the creation of a training manual can be helpful as well, and I’ve found this to be really true. Any job I’ve worked at that has given me a training manual (not one that is overly complicated, out-dated, or too hard to get through) has helped me to get my job done more efficiently and accurately, without having to take another person's time to answer simple questions. Of course, it is important to have people talking to people directly, to be able to answer more complex questions or take staff members through the steps initially, but too often, without a manual, employees find themselves thrown into unanticipated situations, not sure what the protocol is, and possibly doing the wrong thing until they are able to realize it through their own experience and mistakes. While learning through mistakes and experience is important, it can also sometimes be avoided with just a little bit more information at someone's fingertips. Starting new jobs can be overwhelming, and it helps to have a document that staff can refer to when they are working on their own.

I also found the breakdown of about how much time each part of the process should take interesting. The actual image capture was only about 15% of the process, while each of the other tasks, like selection, preparation, creating the metadata and the Web site, and even outsourcing, took just as much time or more (of course this will all depend on the project). I think that once you have the system set up and you are digitizing massive amounts of pages or objects, more and more of the time would be spent on the actual image capture and editing, but I'm not sure, as this isn't my area of expertise.
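To make a breakdown like this concrete, you could budget staff hours from the percentages. The ~15% figure for image capture is the one I remember from the guidelines; the other shares (and the total hours) below are made-up placeholders, just to show how the arithmetic might work:

```python
# Illustrative only: "image capture" at 15% follows the guidelines as I
# read them; every other number here is a hypothetical placeholder.
breakdown = {
    "selection and preparation": 0.25,
    "image capture": 0.15,
    "metadata creation": 0.30,
    "web site and delivery": 0.20,
    "quality control": 0.10,
}

total_hours = 1000  # hypothetical total project budget in staff hours

# The shares should account for the whole project.
assert abs(sum(breakdown.values()) - 1.0) < 1e-9

for task, share in breakdown.items():
    print(f"{task}: {share * total_hours:.0f} hours")
```

Even with rough numbers, laying the budget out this way makes it obvious how small a slice the scanning itself is compared to everything around it.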

Thursday, September 4, 2008

Cornell’s “Moving Theory into Practice” Tutorial

This tutorial (http://www.library.cornell.edu/preservation/tutorial/) was a great introduction to digital imaging basics, though a bit overwhelming for a beginner like me. I couldn’t process it all, but will definitely go back to it once I actually start working on projects and need the details to help me figure out what I’m doing. The additional readings looked helpful as well.

The formulas for benchmarking the quality of the digitized images, which this tutorial really stresses, were particularly difficult for me to wrap my mind around right now. I understood the underlying concepts, but found myself skimming over the details of the actual measurements. What often seems to be important is establishing reference points, so that the image outputs can be standardized, and you don’t find yourself (or the technician) making subjective judgements throughout the process. It seems that the more images that one digitizes, the harder it would be to keep the changes accurate without having these reference points to go back to. As we discussed in class, color can be particularly difficult to tweak, and can be quite subjective.
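As I understand the reference-point idea, a scanning target has patches with known values, and each scan of the target is checked against them, so drift gets caught objectively instead of by eye. A hypothetical sketch (the patch values and tolerance are made up, not from the tutorial):

```python
# Hypothetical reference points: gray patches on a scanning target with
# known values (0-255). A real target's values would come from its
# documentation; these are placeholders.
KNOWN_PATCHES = [26, 77, 128, 179, 230]

def max_deviation(measured):
    """Largest difference between measured patch values and the target."""
    return max(abs(m - k) for m, k in zip(measured, KNOWN_PATCHES))

def scan_in_spec(measured, tolerance=5):
    # A pass/fail check replaces a subjective judgement call.
    return max_deviation(measured) <= tolerance

print(scan_in_spec([25, 78, 126, 181, 229]))  # small drift: passes
print(scan_in_spec([40, 90, 140, 190, 240]))  # consistent shift: fails
```

The same idea extends to color, where (as we discussed in class) a numeric tolerance matters even more because eyeballing color is so subjective.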

For me, the most interesting and complicated issues are those dealing with metadata and file management. These areas of digitization processes (including the networks, IT infrastructures, etc.) seem much more obscure to me than other aspects of the process (i.e. selections, scanning, and reformatting decisions) and I want to learn more about how and where the information is stored and accessed once it has been digitized. Obviously, digital preservation is also very related to these file management issues, and is something that people in our field are particularly concerned with. There don’t seem to be many answers about how to preserve all of these digital objects.

One of our readings referred to a digital project created by the BBC in the 1980s, the Domesday Project, which was related to the original Domesday Book compiled in 1086. The information collected in the 1980s (videos and more about British citizens) was almost inaccessible within just a decade and a half, as the equipment that was used to access it became obsolete. The original Domesday Book from 1086 is still readable. Here’s an article from the BBC about the project and attempts to save the info: http://news.bbc.co.uk/1/hi/technology/2534391.stm .

They finally figured it out using an emulator, but it seems like it was a huge project. Scary.

The Collodion Process

Collodion is a chemical discovered in 1846 (see the Wikipedia entry on collodion: http://en.wikipedia.org/wiki/Collodion), and used in one of the first photographic processes, aptly named the collodion process. It is also known as the wet-plate process because it involves pouring liquids onto a glass plate (Schimmelman, The Tintype in America 1856-1880, p. 13). Frederick Scott Archer first published information about the collodion process he was working on in 1851, which was notable because it created a glass plate negative rather than a positive (Schimmelman, 13). The Daguerreotype, which was popular at the time, created a single positive image. The wet-plate collodion process could also yield direct positives: Ambrotypes (on glass with a black backing) and Tintypes (on dark metal). Positives could also be printed from the glass negatives onto paper, such as albumen paper. These negative glass plates were also notable because the prints they created were of high quality (Schimmelman, 13-14).


According to Robert Leggat, on his Web site, which includes an interesting discussion of the history of the collodion wet-plate process, the process was faster and cheaper than other early photographic processes, which led to its popularity in the United States. Though, apparently it was also messy and not very easy to do, because initially the whole process, including developing, had to happen in about ten minutes. And it was explosive. Oh, and also, because Archer never patented his process, many people were able to just do it for free, which added to its popularity (Robert Leggat, “The Collodion Process,” in A History of Photography, http://www.rleggat.com/photohistory/history/collodio.htm).



It was most popular from the 1850s until the 1880s, according to the Getty Museum’s Web site, which also has a video that gives a nice sort of re-enactment of the wet-plate collodion process: (http://www.getty.edu/art/gettyguide/videoDetails?cat=2&segid=1726).


The Wikipedia article on collodion lists the steps to the wet-plate collodion process
(http://en.wikipedia.org/wiki/Collodion):

* Clean the glass plate (extremely well)
* Flow the glass plate with "salted" (iodide/bromide) collodion
* Immerse the plate in a silver nitrate bath (for 3-5 minutes)
* Expose the plate (can range from less than a second to several minutes)
* Develop the plate (using an iron-based developer)
* Fix the plate (with potassium cyanide or sodium thiosulfate)
* Varnish the plate (with a varnish made from gum sandarac, alcohol, and lavender oil)

In terms of the unique aesthetics of the collodion process, Schimmelman writes that the paper prints (photographs) made by the process had “the familiar look of a monochromatic wash drawing on paper, an advantage for those who by their creative energy wanted to nudge photography toward fine art” (14).

A dry-plate process was created a few decades later, in 1871, by Richard Leach Maddox, according to an article, “The Preservation of Glass Plate Negatives,” by Greta Bahneman (http://www.webjunction.org/470/articles/content/439665). This process was chemically a little different and basically allowed the photographer to develop the negative glass plate later, which gave photographers more flexibility. Bahneman gives a very detailed analysis of preservation concerns when handling the glass plates (wet and dry), and even includes a section on digitizing, suggesting that it needs to be done for preservation and access purposes, and can be completed with either a flat-bed scanner or a digital camera.

In terms of preservation and digitization, obviously the fact that the negatives are glass poses some problems.

Here is a discussion on photo.net, in which someone is asking for advice about scanning glass plate negatives: http://photo.net/digital-darkroom-forum/00QH6E . There doesn’t seem to be a real consensus about best practices for these. People on this discussion board don’t think that making prints of the negatives first and then digitizing the prints is necessary, or cost-effective for an organization without much money. Some said that the negatives could be digitized on scanners, but they have to be handled very carefully.

The glass can also cause reflections during the image-capture. This page from the Bancroft Library (http://bancroft.berkeley.edu/collections/casedphotos/digitization.html) describes a digitization project involving ambrotypes, and the problems faced because of the reflective surface of the glass. They were using a digital camera for this project.

According to “Preservation issues in digitizing historical photographs,” a page created by the European Commission on Preservation and Access that lists SEPIA (Safeguarding European Photographic Images for Access) guidelines for preservation when digitizing old photographs, ambrotypes and tintypes are quite sensitive to abrasion because they do not have the protective covering that daguerreotypes have (http://www.knaw.nl/ecpa/sepia/workinggroups/wp4/guidelines.html).


Print source:
Schimmelman, Janice G. The Tintype in America 1856-1880. Philadelphia: American Philosophical Society, 2007.