MediaConch in action! Issue #1

Hey Eddy! Introduce yourself please.

Hey Ashley! I’ve recently become a “Denverite” and have started a new job as an Assistant Conservator specializing in electronic media at the Denver Art Museum (DAM). Before that, I was down in Baton Rouge, Louisiana, working as a National Digital Stewardship Resident with Louisiana Public Broadcasting (LPB). When I’m not working, I like to listen to podcasts, read comics, play bass, and stare into the endless void that is “twitter” (@EddyColloton). I’m on a big H. P. Lovecraft kick right now, so let me know if you have any recommendations (The Dunwich Horror is my current fav, but At the Mountains of Madness is a close second).

What does your media ingest process look like? Does your media ingest process include any tests (manual or automated) on the incoming content? If so, what are the goals of those tests?

The ingest procedures I’m using for the Denver Art Museum are pretty different from the ones we worked out at Louisiana Public Broadcasting, for all kinds of reasons. The two institutions have very different types of collections, and they use their repositories very differently, too.

At the Denver Art Museum, ideally, material will enter the digital repository upon acquisition. Ingest procedures need to be tailored to the eccentricities of a particular media artwork, yet flexible enough to cover the wide array of media works that we acquire (websites, multi-channel video installations, software-based artworks, or just a collection of TIFF files). With this in mind, we’re using Archivematica for ingest of media into our digital repository. It allows us to automate the creation of METS-wrapped PREMIS XML documentation, while manually customizing which microservices we choose to use (or not use) as we ingest new works. Some of the microservices I use on a regular basis are file format identification through Siegfried; metadata extraction with MediaInfo, ExifTool, and Droid; and normalization using tools like FFmpeg.
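
To give a rough idea, here is a by-hand equivalent of a few of those microservices (illustrative file names, and assuming the sf, mediainfo, and exiftool command-line tools are installed; this is just a sketch, not how Archivematica actually invokes them):

sf video.mov          # file format identification with Siegfried (PRONOM IDs)
mediainfo video.mov   # technical metadata extraction
exiftool video.mov    # embedded metadata extraction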

Things couldn’t be more different at LPB. All completed locally produced programming automatically becomes part of the LPB archive. The LPB Archive is then responsible for preserving, describing, and making that content accessible through the Louisiana Digital Media Archive (LDMA), located at http://www.ladigitalmedia.org/. LPB’s ingest procedures need to allow for a lot more throughput, but there’s much less variability in the type of files they collect compared to the DAM. With less of a need for manual assessment, LPB uses an automated process to create MediaInfo XML files, an MD5 checksum sidecar, and a MediaConch report through a custom watchfolder application that our IT engineer, Adam Richard, developed. That code isn’t publicly available unfortunately, but you can see the scripts that went into it on my GitHub.
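
The gist of that automation is easy to sketch, though. Something along these lines (a minimal, hypothetical stand-in for the watchfolder logic; it assumes the mediainfo, md5sum, and mediaconch CLIs are installed, and lpb_policy.xml is an illustrative name, not LPB’s actual policy file):

for f in /watchfolder/*.mov; do
  mediainfo --Output=XML "$f" > "$f.xml"                         # MediaInfo XML sidecar
  md5sum "$f" > "$f.md5"                                         # MD5 checksum sidecar
  mediaconch --policy=lpb_policy.xml "$f" > "$f.mediaconch.txt"  # MediaConch report
done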

Where do you use MediaConch? Do you use MediaConch primarily for file validation, for local policy checking, for in-house quality control, for quality testing for vendor files?

Primarily policy checking, as a form of quality assurance. At LPB, most of our files were being created through automated processes. Our legacy material was digitized using production workflows, to take advantage of existing institutional knowledge. This was very helpful, because we could then repurpose equipment, signal paths, and software. But using these well-worn workflows also meant that occasionally files would be encoded incorrectly, aspect ratio being one of the most common errors. We would check files against a MediaConch policy as a way of quickly flagging such errors, without having to invest time watching and reviewing the file ourselves.
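
To make that concrete, a stripped-down policy for catching a wrong aspect ratio could be written and run like this (the policy contents and file names here are illustrative, not LPB’s actual policy; the XML follows MediaConch’s policy format, where each rule tests a MediaInfo field):

cat > aspect_ratio_policy.xml <<'EOF'
<?xml version="1.0"?>
<policy type="and" name="Aspect ratio check">
  <rule name="DAR is 4:3" value="DisplayAspectRatio" tracktype="Video" occurrence="*" operator="=">1.333</rule>
</policy>
EOF
mediaconch --policy=aspect_ratio_policy.xml capture.mov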

At the Denver Art Museum, we plan to use MediaConch in a similar way. The videotapes in the museum’s collection will be digitized by a vendor. Pre-ingest, I plan to run tests on the files for quality assurance. After fixity checks, I will use MediaConch to make sure the vendor met our target encoding and file format. I intend to use the Carnegie Archive’s Python script from their GitHub to automate this process. Once I know that the files are encoded to spec, I will be creating QCTools reports and playing back the files for visual QC. I’ve been following the American Archive of Public Broadcasting’s QC procedures with interest to see if there are any tricks I can cop from their workflow.
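
In outline, that pre-ingest sequence looks something like this (file names are illustrative; dam_vendor_spec.xml is a stand-in for our actual policy, and qcli is QCTools’ command-line report generator):

md5sum -c delivery_manifest.md5                       # fixity check against the vendor's manifest
mediaconch --policy=dam_vendor_spec.xml tape042.mov   # confirm target encoding and wrapper
qcli -i tape042.mov                                   # generate a QCTools report for visual QC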

At what point in the archival process do you use MediaConch?

Basically pre-ingest for both LPB and the DAM. When using MediaConch as a policy checker, my goal is to make sure we’re not bothering to ingest a file that is not encoded to spec.

Do you use MediaConch for MKV/FFV1/LPCM video files, for other video files, for non-video files, or something else?

I use MediaConch for MKV/FFV1/LPCM video files and for other types of video files as well. At LPB we were using MediaConch as a policy checker with IMX50-encoded files in a QuickTime wrapper and H.264-encoded files in a .mp4 wrapper. You can find the policies I created for LPB here, and I talk through the rationale of creating those policies in the digital preservation plan that I created for LPB, available here (MediaConch stuff on page 24 and page 45). I’m happy to report that LPB is currently testing a new workflow that will transcode uncompressed .mov files into lossless MKV/FFV1/LPCM files (borrowing heavily from the Irish Film Archive’s lossless transcoding procedures, as well as the CUNY TV team’s “make lossless” microservice).

At the DAM, we’ll be using MediaConch as a policy checker with QuickTime/Uncompressed/LPCM files, and for validation of our MKV/FFV1/LPCM normalized preservation masters.

My understanding is that MediaConch is going to be integrated into Archivematica’s next release. I’m really looking forward to that update, since at the DAM we have decided to create MKV/FFV1/LPCM files for any digital video in the collection that uses a proprietary codec or an obsolete format. A lot of the electronic media in the museum’s design collection comes from the AIGA Archives, which the DAM collects and preserves. A ton of the video files from the AIGA Archives were created in the aughts, and they use all kinds of wacky codecs – my favorite so far is one that MediaInfo identifies as “RoadPizza” (apparently a QuickTime codec). Given that I don’t want to rely on the long-term support of the RoadPizza codec, we’re normalizing files like that to MKV/FFV1/LPCM through an automated Archivematica microservice that uses the following FFmpeg command (which I cobbled together using ffmprovisr):

# FFV1 version 3, intra-frame only (-g 1), CRCs on each of 16 slices per frame, 16-bit PCM audio:
ffmpeg -i input_file -c:v ffv1 -level 3 -g 1 -slicecrc 1 -slices 16 -c:a pcm_s16le output_file.mkv

To be clear, we are also keeping the original file; we transcode a second version just to be cautious.

Through that implementation we intend to use the Archivematica MediaConch microservice to validate the encoding of the video files that we have normalized for preservation.
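
Outside of Archivematica, the same check can be run by hand; invoked without a policy, MediaConch runs its implementation checks against the format specifications (the output shown is the gist, not verbatim):

mediaconch output_file.mkv    # prints a pass/fail result per file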

Why do you think file validation is important to the field?

I wish this were a joke, but it honestly helps me sleep better at night. MediaConch and melatonin make for a well-rested AV archivist/conservator, hah. I like knowing that the video files that we are creating through automated transcoding processes are up to spec, and comply with the standards being adopted by the IETF.

Also, using MediaConch as a policy checker saves me time, and prevents me from missing bonehead mistakes, of which there are loads, because I work with human beings (and possibly some aliens, you never know).

Anything else you’d like to add?

Just want to offer a big thanks to the MediaConch team for everything that they do! I know there’s a pretty big overlap betwixt team Conch, CELLAR, and the QCTools peeps – you’re all doing great work, and regularly making my job easier. So thanks for that.

To read more from Eddy, check out his Louisiana Public Broadcasting Digital Preservation Plan


veraPDF 1.6 released

The latest release of veraPDF is available to download. The validation logic and test corpus of veraPDF 1.6 have been updated to comply with the resolutions of the PDF Association’s PDF Validation Technical Working Group (TWG). The TWG brings together PDF technology experts to analyse PDF validation issues in a transparent way. It also connects veraPDF to the ISO committee responsible for PDF/A.

The GUI and command line applications feature a new update checker which lets you know if you are running the latest version of the software. If you’re using the GUI application, select “Help->Check for updates”; command line users can type “verapdf --version” to ensure they have the latest features and fixes.

Other fixes and improvements are documented in the release notes: https://github.com/veraPDF/veraPDF-library/releases/latest


Download veraPDF

http://www.preforma-project.eu/verapdf-download.html


Help improve veraPDF

Testing and user feedback are key to improving the software. Please download and use the latest release. If you experience problems, or wish to suggest improvements, please add them to the project’s GitHub issue tracker: https://github.com/veraPDF/veraPDF-library/issues or contact us through our mailing list: http://lists.verapdf.org/listinfo/users.

User guides and documentation are published at: http://docs.verapdf.org/.


PREFORMA International Conference – Shaping our future memory standards

To find out more about veraPDF and the PREFORMA project, join us at the PREFORMA International Conference in Tallinn on 11-12 October 2017. For more information see: http://finalconference.preforma-project.eu/.


About

The veraPDF consortium (http://verapdf.org/) is funded by the PREFORMA project (http://www.preforma-project.eu/). PREFORMA (PREservation FORMAts for culture information/e-archives) is a Pre-Commercial Procurement (PCP) project co-funded by the European Commission under its FP7-ICT Programme. The project’s main aim is to address the challenge of implementing standardised file formats for preserving digital objects in the long term, giving memory institutions full control over the acceptance and management of preservation files into digital repositories.


NEM Summit 2017 – call for abstracts

The NEM Summit is an international conference and exhibition, open to co-located events and organised every year since 2008 by the NEM Initiative (New European Media – European Technology Platform – www.nem-initiative.org) for all those interested in the broad area of Media and Content. Over the years, the NEM Summit has grown to become an annual not-to-be-missed event, providing attendees with a key opportunity to meet and network with prominent stakeholders, access up-to-date information, discover latest technology and market trends, identify research and business opportunities, and find partners for upcoming EU-funded calls for projects.


The 10th edition of the NEM Summit conference and exhibition will be held in the Spanish capital, Madrid, at the exciting venue of the Museo Reina Sofía. Please reserve the dates to attend the NEM Summit 2017 and take part in discussions on the latest developments in European media, content, and creativity.

NEM Summit 2017 – Call for Extended Abstracts

  • Extended abstracts should be two A4 pages long, with the possibility of providing further supporting information
  • Extended abstracts must be submitted by 26 June 2017
  • Received contributions will be evaluated on a fast track, with results announced by 26 July 2017
  • More information can be found in the attached Call for Extended Abstracts
  • The submission portal is available on the NEM Initiative website at www.nem-initiative.org

Further details about the NEM Summit 2017, opportunities to participate and exhibit at the event, and online Summit registration will be provided soon on the NEM Initiative website at www.nem-initiative.org.

Download the call for abstracts (PDF, 91 kb)



Photoconsortium Annual Event, hosted by CRDI. Public seminar and general assembly.

Image: Ajuntament de Girona, Vista exterior de l’absis de la Catedral de Girona, Public Domain.

Hosted by Photoconsortium member CRDI, the 2017 Annual Event of Photoconsortium was organised in the beautiful city of Girona (Spain), seat of an important audiovisual archive of millions of photographs and films, and many hours of video and sound recordings, mostly from private sources.

The archive is managed by CRDI, a body of the city’s municipal government created in 1997 with the mission to discover, protect, promote, provide access to, and disseminate the cinematographic and photographic heritage of the city of Girona.

Photos and follow-up


Friday 9th June 2017

PUBLIC SEMINAR: PHOTOCONSORTIUM INFORMATIVE SESSION

Languages of the seminar: English, Catalan and Spanish

Chair: David Iglésias, Officer of Photoconsortium Association

09:45 Welcome message by Joan Boadas, Director of CRDI

10:00 Prof. Fred Truyen, KU Leuven. President of Photoconsortium Association. Presenting the Photography Collection in Europeana

10:15 Antonella Fresa, Promoter Srl. Vice-president of Photoconsortium Association. Photoconsortium, the expert hub for photography

10:30 Pierre-Edouard Barrault, Operations Officer at Europeana. Publishing in Europeana – Tools, Data & good practices

11:30 Coffee break and networking

12:00 Sílvia Dahl, Laia Foix. Photography deterioration terminology. A proposal to broaden the EuropeanaPhotography vocabulary.

12:30 Pilar Irala. The Jalon Ángel collection.

13:00 Debate

14:00 Lunch

15:00 – 16:00 Visit to the Cinema Museum


On the day before, 8th June, the General Assembly of Photoconsortium members took place.



PREFORMA hands-on sessions in Europe

Photo credit: CC BY-SA Sebastiaan Ter Burg.

During May 2017, PREFORMA organised successful hands-on sessions and workshops in several places to explain to participants what conformance checking means, why file format validation is so important in long-term digital preservation, how to create their own policy profiles, and how to download, install, configure and use the conformance checkers to analyse their files. These workshops invite participants to bring their own files and analyse them with the PREFORMA tools.


Barcelona, 10 May 2017

The first hands-on session was organised in Barcelona with members of the Official Association of Librarians (Col·legi Oficial de Bibliotecaris i Documentalistes de Catalunya – COBDC), to show the functionalities offered by the DPF Manager for checking TIFF files.

It took place in the premises of COBDC on the 10th of May, and the session was conducted by Sònia Oliveras from the Girona City Council and Xavier Tarrés and Víctor Muñoz from Easy Innova. The attendees, who manage large numbers of TIFF files, came mostly from local and national memory institutions and had not previously been aware of any file format conformance checker; the tools developed by the PREFORMA project were the first solution of this kind available to them.


Amsterdam, 28 May 2017


Photo credit: CC BY-SA Sebastiaan Ter Burg.

A second hands-on session was organised in Amsterdam in the framework of The Reel Thing XL workshop, focusing on the challenges of using FFV1 and MKV for film digitisation.

Presentations were delivered by Erwin Verbruggen (Netherlands Institute for Sound and Vision), introducing PREFORMA and the AV challenges; Jérôme Martinez (MediaArea.net), introducing MediaConch; Eva Verdoodt & Noortje Verbeke (VIAA), presenting their film digitisation workflow and considerations using FFV1/MKV; and Reto Kromer (reto.ch), presenting the latest developments in the FFV1 standardisation as far as colour information for films is concerned.


Photo credit: CC BY-SA Sebastiaan Ter Burg.

The session closed with a panel discussion, guided by the British Film Institute, on the practical thresholds for implementing FFV1/MKV in film digitisation. The discussion covered the importance of the standardisation effort, as well as the possibilities for crowdfunding further development of the standards and of the MediaConch tool.

The workshop was attended by 25-30 participants, mainly from film archives and film scanning services, including the British Film Institute, the Irish Film Institute, INA, the Catalan Film Archive, the Austrian Film Archive, the German Film Archive, VIAA, Sound & Vision, and Picturae Digitisation Services.


Quedlinburg, 29 May 2017

The third hands-on session was organised in Quedlinburg by SPK in cooperation with the Museum Association of Saxony-Anhalt, focusing again on the DPF Manager.

The session was embedded in a general meeting of the Working Group Digitisation of the Museum Association (AG Digitalisierung MVSA). There were 22 participants, mainly from medium-sized and small museums: museum directors, curators, and IT staff. There was a general introduction on file formats for digital preservation, especially for text and images, followed by an introduction to PREFORMA and the tools created in the project.

Participants brought their own laptops and installed the DPF Manager. After that, the functionalities available in the GUI version were explained, and participants were able to try both the conformance checker and the policy checker.

In the end the participants agreed that the DPF Manager is a valuable tool for their digitisation work, not only for digital preservation but also for checking image files produced by external companies in the framework of a museum’s digitisation project.


Stockholm, 30 May 2017


Photo credit: CC BY-SA Sebastiaan Ter Burg.

Finally, the last hands-on session organised by PREFORMA in May was hosted at the National Archives of Sweden and focused on PDF/A and the use of the veraPDF conformance checker. The session brought together 21 people working with archives and records management issues at public and private memory institutions, state and municipal agencies and organisations, and SMEs.

The Riksarkivet team gave an introduction to the seminar, followed by a brief walkthrough of the PDF/A format. The hands-on part was divided into two main blocks: the first focusing on conformance checking, the second on policy checking. Each block began with a demonstration by the Riksarkivet team and was then followed by practical exercises in which the participants used veraPDF to check the conformance and policy compliance of their sample files.
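
For readers who want to reproduce those two blocks at home, the corresponding veraPDF command-line calls look roughly like this (file names are illustrative; --flavour selects the PDF/A level to validate against, and --policyfile applies a local Schematron policy):

verapdf --flavour 1b sample.pdf                    # conformance check against PDF/A-1b
verapdf --policyfile local_policy.sch sample.pdf   # policy check with a Schematron file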

The seminar ended with an informal discussion of its results and of whether the participants’ initial expectations had been met. These expectations were mainly about learning more about PDF/A to better understand its “pros and cons”, but also about learning about validation and the PREFORMA conformance checkers. The overall feedback from the participants was very positive, and participants were interested in continuing to follow the developments of the PREFORMA project, possibly as part of an informal Swedish reference group.


Additional workshops and seminars will be organised by the PREFORMA partners after the summer. Stay tuned at www.preforma-project.eu!


Preservation courses at the IS&T conference Archiving 2017

Photo: House of Blackheads and St Peter’s Church Tower at dusk, Riga, Latvia, by David Iliff, CC BY-SA 3.0.

Riksarkivet (the Swedish National Archives) and PACKED, together with veraPDF and Easy Innova, were invited to the conference to give a course about formats for preservation on the 15th of May. The framework of the course was based on the work done in Riksarkivet’s research and development programme ArkivE 2.0 on fundamental principles for the selection of formats, which was applied within the PREFORMA project.


The participants were given a general overview of the meaning of “format” and of digital preservation. Within that framework, the value and importance of a conformance checker, and what makes a format appropriate in a specific use case, were explored. The course covered the technical, legal and archival challenges facing those who work with digital preservation and are handed the task of recommending and selecting “preservation formats”.


A demonstration of the difference between file identification and conformance checking provided the participants with a much-appreciated practical connection to the theoretical presentation. A more concrete and detailed view of formats was also given through presentations by veraPDF on PDF/A and by Easy Innova on TIFF and TI/A.


The course had 12 pre-registered participants (mainly librarians, academics, photographers and digitisation companies), of whom 11 attended and 10 gave their evaluation through the feedback form provided by IS&T. Participants’ backgrounds included the Swedish Media Conversion Center (formally part of Riksarkivet), the Latvian National Library, the Dutch city of Leiden’s heritage network, and ancestry.com. The overall feedback was very positive.


MediaConch Newsletter #10 – June 2017

Updates

We are pleased to announce the progress we’ve made this year. There are new updates to MediaConch that expand its capabilities, user stories to share, and new events related to implementation checking and the standardization of open audiovisual formats.

The MediaConch team recently collaborated with VIAA on tests to migrate their JPEG2000/MXF collection to FFV1/Matroska. Learn more about our process and findings!

The development of MediaConch has been closely following the work of the IETF CELLAR working group which is creating specifications for EBML, Matroska, FFV1 and FLAC. Review the latest versions of those documents.

Some Highlights:

  • MediaConch optimized its FFV1 parser, which will allow upcoming versions to provide more comprehensive implementation checking of FFV1 against its specification.
  • We improved the Matroska checker, particularly to support the CELLAR working group’s development of the EBML Schema, which defines Matroska’s structure in a manner similar to an XML Schema.
  • Attachments in Matroska files may now be analyzed with other implementation checkers. For instance, MediaConch can now call veraPDF or DPF Manager to assess PDF or TIFF attachments.
  • In collaboration with Tate, we have added a TN2162 policy checker to MediaConch. This policy assesses uncompressed video in QuickTime against Apple’s list of additional requirements that affect that combination of formats.
  • Several bugs were reported and fixed. Thanks to our users for their reports!
  • See what’s new in MediaConch’s GUI and CLI!

Jérôme, Reto, and Kieran recently presented MediaConch and CELLAR standardization at the Reel Thing Conference in Amsterdam.


The presentations and conversations at the Reel Thing demonstrated a growing interest in the standardization process of Matroska and FFV1 as well as methods to integrate these formats into preservation environments. Conversations focused on use of these formats for film preservation challenges and collaborative work to develop more tools around lossless video.

Next steps:

  • “FrameMD5” computation in MediaConch, in order to compare the resulting file to the source file, pixel by pixel, after a conversion to Matroska/FFV1, provided the conversion includes a “FrameMD5” of the source file (see the sketch after this list).
  • Adding more information (specific location of the error in the bitstream) when FFV1 validation fails.
  • Stabilization of the software, with bug fixes and more automatic non-regression tests.
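
For context, this kind of frame-level comparison can already be approximated with FFmpeg alone; a rough sketch (file names are illustrative):

ffmpeg -i source.mov -f framemd5 source.framemd5        # per-frame MD5s of the source
ffmpeg -i converted.mkv -f framemd5 converted.framemd5  # per-frame MD5s of the FFV1/Matroska copy
diff source.framemd5 converted.framemd5 && echo "losslessly transcoded"

If the decoded frames are bit-identical, the two reports match and diff exits successfully.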

Ashley recently interviewed Eddy Colloton from the Denver Art Museum about his process and experience using MediaConch. Read it here!


Upcoming Events

June 21st, 2017 4:45 – 5:30 pm PDT: Ashley will discuss MediaConch and format standards in her talk How Open Source Audiovisual Tools Help Archivists (And You Too!) at Open Source Bridge this week in Portland, OR.

July 10th, 2017 2:00 – 5:00 pm EDT: Dave will be hosting An Archivists’ Guide to Matroska workshop at the Metropolitan New York Library Council.

July 19th, 2017 15:20 – 16:50 CEST: CELLAR (Codec Encoding for LossLess Archiving and Realtime transmission) will hold its second face-to-face meeting during IETF 99 at the Hilton in Prague. The final agenda will be published on June 23rd, 2017. Tessa and Jérôme will be attending in person.

September 20th, 2017 from 9:45a – 11:15a CEST: Jérôme and Dave will host a workshop: Checking Audiovisual Conformance, for IASA at the Ethnological Museum of Berlin.

October 11th – 12th, 2017: The PREFORMA International Conference will be held at the National Library of Estonia, Tõnismägi 2, Tallinn.


Latest Downloads

Download MediaConch’s latest release or a daily build.


New Release Notes

What’s new in MediaConch 17.05

GUI/CLI/Server/Online

  • Less verbose output by default
  • CSV output (useful for automation with the command line)
  • Option for creating policy from a file directly in the policy editor
  • New policy example based on Apple’s TN2162 which defines requirements when storing uncompressed video in QuickTime
  • Analyze attachments in Matroska files (especially useful with the PDF or TIFF plugins, for validating attachments)
  • Better support for some broken Matroska files, displaying more information about why the file is broken
  • More Matroska and FFV1 validity tests
  • Performance improvements
  • For Mac users, MediaConch is now available directly in the Mac App Store, with automatic updates

The MediaConch project has received funding from PREFORMA, co-funded by the European Commission under its FP7-ICT Programme.


Feedback

MediaArea is eager to build a community of collaborators and testers to integrate the software in their workflows and participate in usability testing. Please contact us if you’d like to be involved!


DPF Manager v3.3 released and available to download



We present a new update of the DPF Manager. This new version can be downloaded as usual from the PREFORMA Open Source Portal.

This new version includes several improvements and fixes. One of the most interesting new features is the Statistics Module. This new section reports the most common tags found in the analysed files, as well as their values. For the ISO standards validated against, it also shows the errors found, ordered by frequency.

The rest of the enhancements and bugfixes of this new release are listed below.

  • Improved the efficiency of the reports section.
  • Filter unreadable tags in reports.
  • Fixed some errors related to the metadata fixer.
  • Added a default configuration to the periodical checks definition.

In the next release, we plan to create a benchmarking tool in which the DPF Manager will be compared with other well-known TIFF validators such as JHOVE and JHOVE2.

Don’t forget to give us feedback to continue improving the project!


EVA London 2017: conference, exhibitions, a research workshop and numerous demonstration sessions

Held annually in July, EVA London is one of the international Electronic Visualisation & the Arts conferences. The first EVA conference was held in 1990, with the intention of creating a space for people using, or interested in, new technologies to share their experiences and network in a friendly, collaborative atmosphere. EVA London’s focus is on the development and application of visualisation technologies in various domains, including art, music, dance, theatre and the sciences.

EVA London 2017 is held on 11-17 July 2017, with a preliminary event on 10 July, and:

  • has a focus on visualisation for the arts and culture – interpreted broadly to include its implications, effects, and consequent strategies and policies
  • covers the burgeoning creative uses of digital media for works of art and creative productions
  • is a networking event for groups and projects, including European projects and groups
  • includes a free-of-charge Research Workshop for MA, MSc and PhD students and others, to share their research in a friendly and informal setting
  • is inspiring and informative, collaborative and friendly

Website and full programme: http://www.eva-london.org/eva-london-2017/programme/ 

EVA London’s Conference themes 2017 include:

  • Digital Art
  • Data, Scientific and Creative Visualisation
  • Digitally Enhanced Reality and Everyware
  • 2D and 3D Imaging, Display and Printing
  • Mobile Applications
  • Museums and Collections
  • Music, Performing arts, and Technologies
  • Open Source and Technologies
  • Preservation of Digital Visual Culture
  • Virtual Cultural Heritage
  • Virtual Worlds and Video Game Art



Pratt Institute School of Information in NY – Master of Science Museums and Digital Culture

The Master of Science in Museums and Digital Culture (MDC) of the Pratt Institute in NY (School of Information) is an innovative program that breaks new ground as the first museum master’s degree designed for the digital world, advancing the concept of a museum studies program. The program focuses on the ways in which museums use digital technology and media to enhance services and collections and engage with visitors across physical and virtual contexts.

Featuring partnerships and fellowships with NYC’s leading museums, the MDC program prepares students with the knowledge and digital skills needed for careers in today’s information- and technology-rich museum environments. Through structured practicums and field research, students develop into innovative and creative leaders in the museum field.

This cutting-edge program, designed for the 21st-century museum in the digital age, focuses on the digital life of museums across collections, galleries and activities. It prepares graduates with the knowledge and skills necessary for careers in this rapidly changing field, and with the ability to engage and interact with today’s diverse and connected global audiences.


The curriculum builds on commonalities of knowledge and skills across GLAM (galleries, libraries, archives, and museums) and addresses emerging areas of the museum field including digital information behavior, digital seeing and aesthetics, digital curation, and the integration of the physical and digital life of the museum, so that:

  • Students learn from expert faculty about key areas of study including visitor engagement, user experience, digital curation and preservation.
  • Students gain skills and knowledge in all aspects of digital technology use across the museum sector, physical and digital, real and virtual.
  • Graduates are prepared to take on leadership roles as museum professionals who can meet the challenges of museums in our digital world.

More information and contacts: https://www.pratt.edu/academics/information/degrees/museums-and-digital-culture-ms/