Interview with Kathryn Gronsbell
gronsbell-mediaconch

Kathryn with MediaConch

Hey Kathryn! Introduce yourself please.

Hi Ashley! I’m the Digital Collections Manager at Carnegie Hall. I develop and support sustainable practices around the digital asset lifecycle to ensure the availability and integrity of material related to the Hall and its history, collections, programs, and operations. I can be found talking about the struggling mass transit infrastructure in NYC, metadata quality assessment, and my Great Pyrenees rescue pup on Twitter.

What does your media ingest process look like? Does your media ingest process include any tests (manual or automated) on the incoming content? If so, what are the goals of those tests?

In 2012, the Carnegie Hall (CH) Archives started a multi-year initiative to digitize the majority of our physical holdings for preservation and access. We outsource our paper, video, audio, and film reformatting to different vendors and use a digital asset management system (DAMS) to organize, catalog, and present the material. Our quality control (QC) procedures for incoming digitized material are available on Carnegie Hall’s GitHub. The process enables control and documented oversight from the point of hard drive / FTP delivery from a digitization vendor to ingest into our DAMS.

We aim to reduce risk while expediting the review and verification process with the QC procedures. The QC procedures increase our own accountability (How long does it take us to process 1 batch? What step is most time-intensive? Where can we expedite work by using different tools or workflows?) and allow us to better vet the continued work of our vendors (Are batches from the same vendor failing the same steps over time?). Another priority of the QC workflow is the ability to actually do it – the work is split between me and our Asset Cataloger, Lisa Barrier.

Where do you use MediaConch? Do you use MediaConch primarily for file validation, for local policy checking, for in-house quality control, for quality testing for vendor files?

Our primary use case for MediaConch is local policy checking against the digitization specs. We outsource our digitization, so quality testing the vendor output is a built-in function of our policy checking. We chose to balance manual review with automated testing: we perform manual visual/aural QC on 25% of a batch (or more, if the batch is small) and run MediaConch against the entire batch. I wrote a script which summarizes the MediaConch output to help expedite the review process for this step. We save MediaConch reports to an internal network drive for future use – we hope to build a digital repository in which we can submit submission information packages (SIPs) which contain information like the XML metadata from vendors and MediaConch reports.
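[Ed. note: Kathryn’s actual summary script lives on Carnegie Hall’s GitHub. As a rough, hypothetical sketch of the idea — assuming MediaConch’s plain-text report format, which prefixes each file’s result with pass! or fail! — a batch summarizer might look like this:]

```python
from collections import Counter

def summarize_mediaconch(report_text):
    """Tally per-file results from a MediaConch plain-text report.

    Assumes each result line starts with 'pass!' or 'fail!'
    followed by the file path.
    """
    counts = Counter()
    failed_files = []
    for line in report_text.splitlines():
        line = line.strip()
        if line.startswith("pass!"):
            counts["pass"] += 1
        elif line.startswith("fail!"):
            counts["fail"] += 1
            # keep the path so failures can be reviewed individually
            failed_files.append(line.split("!", 1)[1].strip())
    return counts, failed_files
```

Running MediaConch with a policy over a whole batch and feeding the combined report to a summarizer like this surfaces the failing files at a glance, instead of requiring a scroll through per-file output.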

At what point in the archival process do you use MediaConch?

MediaConch is part of Carnegie Hall’s pre-ingest procedures.

kg-mediaconch-summary

Carnegie Hall Github page for MediaConch commands with terminal window

Do you use MediaConch for MKV/FFV1/LPCM video files, for other video files, for non-video files, or something else?

Our specs vary by source format, so we pass a variety of things through MediaConch (audio and video). We will be reviewing and likely revising our digitization specs in the next year, and MediaConch’s ability to support Matroska, FFV1, etc. may play a role in our decision-making process.

Why do you think file validation is important [or whatever you are doing]?

There is an argument for verifying that requested policies are being enforced on material digitized as a service, and for the ability to do so in-house, with a low learning curve. For my work in the CH Archives, I focus on how each step fits into the larger picture of what the entire workflow aims to achieve – accountability, dependability, and reproducibility.

Because of staff changeover and other factors, not all of our QC is completed in what I consider a ‘normal’ time frame. There is material digitized in 2013 that hasn’t made it past half of the QC workflow (pause for screams of horror). In updating and improving our QC strategy, we acknowledge that the reality of our procedures means batches may be processed asynchronously or in a wildly delayed timeframe, and we manage those unfortunate symptoms of the prioritization juggling act that is preserving/archiving moving image material.

Anything else you’d like to add?

MediaConch is an incredible resource for Carnegie Hall. There was a few-month period where all QC on audiovisual files screeched to a halt. We were using MDQC, a policy checker built on top of ExifTool and MediaInfo, for audiovisual material. We ran into an issue which prevented us from analyzing files over ~200GB, despite a few months of troubleshooting. Many of Carnegie Hall’s archival study recordings are full-length concerts and performances, so we have some big uncompressed files to process. We still use MDQC for our still image material (concert programs, flyers, posters, photographs) but have transitioned to using MediaConch for any audiovisual material in the QC process. Without this tool, we would have a bigger QC backlog and would have needed to invest more money and time in determining how to facilitate policy checking of audiovisual files. Thank you, MediaConch creators, maintainers, and contributors!

rabin

“Concerts at Carnegie Hall – Michael Rabin, 1955” Michael Rabin, GIF Courtesy of the Carnegie Hall Archives

Thanks so much, Kathryn! For even more fun from the Carnegie Hall team, check out their Linked Data project repository.


DPF Manager 3.4 available to download

dpf_3.4

 

A lot of improvements and new features are included in the new release of the DPF Manager.

The most important change is the new Quick Check functionality, which performs a fast validation of files; if desired, files can later be analyzed in more detail with a full check. Reports now include hints that explain how to fix errors (whenever possible). A new metadata fix has been added to repair ASCII encodings (a common baseline error produced by many programs and operating systems). And PDF reports can now be viewed directly from the GUI, instead of in an external PDF viewer.

This release also includes some minor improvements and a set of bug fixes, which are listed on the GitHub page among the issues closed for the current milestone.

The benchmarking tool announced in the previous release is in progress and will be ready in the July release.


ICT Proposers’ Day 2017

ICT Proposers’ Day 2017 will take place on 9 and 10 November in Budapest, Hungary. This networking event centres on European ICT Research & Innovation with a special focus on the Horizon 2020 Work Programme for 2018-20. An Opening Ceremony and Social Event will be organised by the Hungarian Ministry of National Development and will take place on 8 November.

The event will focus on the 2018 Calls for Proposals of the Horizon 2020 Work Programme in the field of Information & Communication Technologies. It will offer an exceptional opportunity to build quality partnerships with academics, researchers, industrial stakeholders, SMEs and government actors from all over Europe.

ict_proposers_day_banner_600x400px_22552_43

The programme will include:

  • Networking sessions where potential proposers present their project ideas, organised according to the Pillars and Topics of the Work Programme 2018-20;
  • Information sessions on how to prepare and submit a proposal;
  • Information stands on the ICT-related topics of the Work Programme 2018-20 and the content of the calls for proposals;
  • A European Commission information desk to supply information on the content and logistics of the event;
  • Booths, organised by village, which serve as meeting points for people interested in the same research topics;
  • Ample space for informal networking and bilateral meetings between participants;
  • Workshops and Face2Face Brokerage organised by Ideal-Ist.

More info and registration: https://ec.europa.eu/digital-single-market/en/events/ict-proposers-day-2017

 


Communicating the Museum – CTM17 Los Angeles: Museums Beyond Walls

ctm los angeles

Communicating the Museum (CTM) was launched in 2000, and since then over 5,000 professionals from the cultural sector have attended the conference.

Following a very successful edition in Paris in June, the 19th edition of the international conference Communicating the Museum will take place in Los Angeles, 6-9 November 2017. Join us to discuss the role of museums in society and how they reach out to diverse communities. Discover how museums share their expertise and collections beyond the building. International experts from arts organisations will show us how they reach out to society through culture. John Giurini from the J. Paul Getty Museum and Miranda Carroll from LACMA are preparing an amazing itinerary to discover the most innovative art scene of Los Angeles.

angels-angel-angelic-768x512

img © The Los Angeles Music Center

Confirmed speakers include:

  • Lars Ulrich Hansen, Head of Communication, Kunsten Museum of Modern Art Aalborg, Denmark
  • Jennifer Northrop, Director of Marketing and Communications, San Francisco Museum of Modern Art, USA
  • Tim Marlow, Director of artistic programmes, Royal Academy of Arts, United Kingdom
  • Cathy Pelgrims, Head of Public and Education, Museum aan de Stroom, Belgium
  • Ann Philbin, Director, Hammer Museum, USA
  • Abhay Adhikari, Founder, Digital Identities, Sweden
  • Shirani Aththas, Manager, Communications & Public Affairs, Australian National Maritime Museum, Australia

More info and registration: http://www.agendacom.com/communicating-the-museum-2017-los-angeles/tickets/

 


DIGIMAG Journal issue 76 – SMART MACHINES FOR ENHANCED ARTS

digimag img

Artificial Intelligence (AI) and Machine Learning (ML) are often treated as synonyms, not least because they are the buzzwords of this decade. But they are not. Both, however, concern the ability of machines to perform and complete tasks in a “smart” way, challenging human intelligence and specificity.

With machines becoming ever more intelligent, Machine Learning is today not only an interesting and challenging topic but a crucial discipline. If computing was initially just a matter of calculation, it has now moved beyond simple “processing” and also implies “learning”. In the age of Big Data and the IoT, machines are asked to go beyond pure programming and algorithmic procedures: to make predictions from data, perform OCR and semantic analysis, learn from past experience, and adapt to external inputs, reaching into the domain of human production and processes.

As Gene Kogan and Francis Tseng write in their in-development book “Machine Learning for Artists”, we can “pose today to machines a single abstract problem: determine the relationship between our observations or data, and our desired task. This can take the form of a function or model which takes in our observations, and calculates a decision from them. The model is determined from experience, by giving it a set of known pairs of observations and decisions. Once we have the model, we can make predicted outputs”.

So the subject of Machine Learning, and Artificial Intelligence methods more generally, reaches far beyond the fields of technology and science, impacting the arts, product design, experimental fashion, and creativity in general. As ML features fit with digital art practices, we are led to explore how some AI techniques can be used to enhance human performative gestures and models of creativity.

How can biological systems and machine intelligence collaborate to create art, and what is the cultural outcome for our society? What is the new role of creativity in this scenario? How will the contemporary art world face a future generation of automated artificial artists/designers, able to learn from creatives themselves or to have a direct impact on human creativity? Will the anthropocentric vision of the creative process behind artistic creation be affected by new intelligent neural networks?

With this call, Digicult seeks contributions on these topics, especially from individuals active in the artistic and academic fields (curators, critics, hackers, fabbers, creative producers, lab managers, activists, designers, theorists, independent and academic writers, scholars, artists, etc.).

Deadline: 01 September 2017

More info: http://www.digicult.it/digimag-journal/

digimag

About DIGIMAG

Digimag Journal is an interdisciplinary online publication seeking high-standard articles and reviews that focus on the impact of the latest technological and scientific developments on art, design, communication, and creativity. Following the former Digimag Magazine, it is based on international calls for papers on given subjects and provides readers with comprehensive accounts of the latest advancements in the international digital art and culture scene. It is published by Digicult Editions, free of charge as PDF, EPUB, and MOBI, and in print on demand.


Call for artists: LE MERAVIGLIE DEL POSSIBILE

In December 2017, Kyber Teatro will organise in Cagliari (Italy) the fourth edition of its International Theatre, Art and New Technologies Festival,

Le meraviglie del Possibile

LMDP1

LMDP Festival is the first of its kind in Italy. It aims to promote the interrelation between artistic and technological languages.

Kyber Teatro – a spin-off of the L’Aquilone di Viviana theatre company, and the creator and manager of the Festival – invites Italian and international artists to submit projects on the theme “Interaction between arts and technology” through an open call.

Who can attend

Participation is open to artists of all nationalities, working individually or in groups.

Eligible projects

  • Theatrical plays, performances.
  • Installations that explore and realize the interaction between artwork, exhibition space and observers with the contribution of technology.

Application (deadline 21st September 2017)

The theme of the fourth edition of the LMDP Festival is the interrelation between theatre, arts, and new technologies.

The application must contain:

• Artist’s CV;

• Detailed description of the project (in PDF);

• Technical rider;

• Selection of max 5 photos;

• Links to audio/video material (Vimeo or YouTube).

Only selected artists will be notified of the results, by 1st October 2017.

IMMAGINE LMDP4

Publication

By applying for the call, artists agree that their projects may be presented at the Festival. Selected artists must provide a short biography and an abstract of the project. They also agree that material related to the project may be published on the Festival website and/or presented to the press for promotional purposes.

 

Archiving process

Artists authorise Kyber Teatro – L’Aquilone di Viviana to present their work, store the material, and make it accessible through the Festival’s website. All rights to the artworks and images remain with the artists. The organisation is also entitled to document the event in all its phases through audio recordings, video, or images.

 

Application materials must be sent to: info@kyberteatro.it

 

Kyber Teatro – L’Aquilone di Viviana Soc. Coop.

Via Newton 12, 09131 Cagliari

Tel: +39 0708607175 – Mob: + 39 3470484783

info@kyberteatro.it

 


TWA cultural heritage Digitisation Grant 2017 for UK-based digitisation projects

Following a successful 2016 and excellent bids from archives and other memory institutions last year, the TWA Digitisation Grant has relaunched with a fresh tranche of funding in 2017.

The fund offers grants of up to £5000 for UK archives, special collections libraries and museums to digitise their collections.

Last year’s esteemed judging panel will return to assess the grant bids and select the winners: ARA chief executive John Chambers; HLF-appointed special advisor Claire Adler; and senior digitisation consultant at TownsWeb Archiving, Paul Sugden.

The Grant can be used to fund the digitisation of bound books, manuscripts, oversize maps and plans, 35mm slides, microfilm/fiche, glass plate negatives, and other two-dimensional cultural heritage media. It can also be used to fund opening up access to heritage collections online.

The deadline for applications is 7th July 2017.

How to apply and more details at:

https://www.townswebarchiving.com/twa-digitisation-grant/

Matt-Scanning-Digitisation-Grant-2017


Europeana 1914-1918 thematic collection launches during Europeana Transcribathon Campus Berlin 2017

Officially launching the new Europeana 1914-1918 thematic collection, Europeana Transcribathon Campus Berlin 2017 marks the next milestone for the crowdsourcing digital archive dedicated to the historical conflict, and puts a spotlight on the involvement of its community.

On 22 and 23 June, the Berlin State Library will host the Europeana Transcribathon Campus Berlin 2017. Over two days, teams from three generations and several European countries will compete to digitally transcribe as many World War One documents as possible, and link them to other historical sources such as early 20th century newspapers. Transcribathons gather people from across Europe and online to create digital versions of handwritten items found on Europeana 1914-1918. These innovative events are the latest crowdsourcing initiative enriching the Europeana 1914-1918 digital archive. Since their launch in November 2016, several million characters across more than 12,000 documents, from love letters to poems, have been transcribed.

Frank Drauschke, of the Europeana 1914-1918 project team, says: “Most sources on Europeana 1914-1918 are written by hand, and often hard to decipher. Transcribathon aims to help us ‘polish’ a raw diamond by making private memorabilia readable online. We utilise the power of our community to transcribe as many private stories and documents as possible, from diverse languages and regions of Europe, and make them available to the public.”

These unique resources found on Europeana 1914-1918 have been collected and digitized since 2011 during collection days and online uploads inviting people to submit their personal documents. During  Europeana Transcribathon Campus Berlin 2017, Europeana 1914-1918, previously living on a separate website, will officially move platform and re-launch as a new Europeana thematic collection. This move onto the Collections site aims to broaden the current audience by opening up World War One related content to all Europeana visitors and to enrich their experience. People can now discover digital versions of testimonies handwritten 100 years ago, complemented by millions of digitized newspapers and documents provided by libraries and archives. Linking user generated content with other historical sources makes it possible to view them within the bigger picture. And thanks to the ability to search across the Europeana platform, people can now also easily access related items from the other four thematic collections: Europeana Art, Music, Fashion and Photography.

Europeana Transcribathon Campus Berlin 2017 is organized by Europeana, Facts & Files and the Berlin State Library, in cooperation with the German Digital Library and Wikimedia.

transcribathon

Europeana is Europe’s digital platform for cultural heritage, collecting and providing online access to tens of millions of digitized items from over 3,500 libraries, archives, audiovisual collections and museums across Europe, ranging from music, books, photos and paintings to television broadcasts and 3D objects. Europeana encourages and promotes the creative reuse of these vast cultural heritage collections in education, research, tourism and the creative industries.

Europeana Collections are the result of a uniquely collaborative model and approach: the web platform is provided by Europeana, the content comes from institutions across Europe, while consortiums provide the theme and editorial expertise to bring the content alive for visitors through blogs and online exhibitions.

Europeana 1914-1918 is a thematic collection that started as a joint initiative between the Europeana Foundation, Facts & Files, and many other European partner institutions. It originates from an Oxford University project in 2008. Since 2011, over 200,000 personal records have been collected, digitized and published. These events have now expanded to over 24 countries across Europe, building up an enthusiastic community of about 9,000 people.

Europeana Transcribe is a crowdsourcing initiative that allows the public to add their own transcriptions, annotations and geo-tags to sources from Europeana 1914-1918. Developed by Facts & Files and Olaf Baldini, piktoresk, the website is free to use and open to all members of the public. New contributors can now register and submit their own stories within the Europeana Collections site.

Europeana Newspapers is making historic newspaper pages searchable by creating full-text versions of about 10 million newspaper pages. www.europeana-newspapers.eu

Europeana DSI is co-financed by the European Union’s Connecting Europe Facility.


International Survey on Advanced documentation of 3D Digital Assets

The e-documentation of Cultural Heritage (CH) assets is inherently a multimedia process and a great challenge. It is addressed through digital representation of the shape, appearance, and conservation condition of the heritage/cultural object, for which the 3D digital model is expected to become the standard representation. 3D reconstructions should progress beyond current levels to provide the necessary semantic information (knowledge/story) for in-depth study and use by researchers and creative users, offering new perspectives and understandings. Digital surrogates can add a laboratory dimension to on-site explorations, opening new avenues in the way tangible cultural heritage is addressed.

The generation of high-quality 3D models is still very demanding, time-consuming, and expensive, not least because the modelling is carried out for individual objects rather than for entire collections, and the formats provided in digital reconstructions/representations are frequently not interoperable and therefore cannot easily be accessed, re-used, or preserved.

survey

This 15-20 minute survey aims to gather your advice concerning the current and future e-documentation of 3D CH objects. We would appreciate your taking the time to complete it.

Please access the survey HERE

Your responses are voluntary and will be kept confidential. Responses will not be identified by individual; all responses will be compiled together and analyzed as a group. The results of this survey will be published before the end of this year on the channels of Europeana Professional (pro.europeana.eu), CIPA (Comité International de Photogrammétrie Architecturale – http://cipa.icomos.org/), the Digital Heritage Research Lab (http://digitalheritagelab.eu/dhrlab/lab-overview/), and Digitale Rekonstruktion (http://www.digitale-rekonstruktion.info/uber-uns/).


VIEW Journal Celebrates Fifth Anniversary with New Interface

VIEW Journal started five years ago as the first peer-reviewed, multimedia and open access e-journal in its field. The online open access journal now has a fresh new look. Its new interface makes reading and navigation easier. More importantly, it now offers room for discussion – with the possibility to leave comments and responses under every article. Articles still feature embedded audiovisual sources. The journal continues to provide an online reading experience fit for a 21st century media journal.

view

Fifth Anniversary

VIEW Journal was started by EUscreen and the European Television History Network. It is published by the Netherlands Institute for Sound and Vision in collaboration with Utrecht University, Université du Luxembourg, and Royal Holloway University of London. A heartfelt thank you goes to all the authors, the editorial board, and the team, who have worked hard over the years to build a journal with renown.

For the past five years, VIEW has published two issues per year. The journal’s aim – to offer an international platform in the field of European television history and culture – still stands. It reflects on television as an important part of our European cultural heritage and is a platform for outstanding academic and archival research. The journal was and remains open to many disciplinary perspectives on European television; including but not limited to television history, television studies, media sociology, media studies, and cultural studies.

Issue 10: Non-Fiction Transmedia

With the new design it also proudly presents its 10th issue on Non-fiction Transmedia. This issue was co-edited by Arnau Gifreu-Castells, Richard Misek and Erwin Verbruggen. The issue offers a scholarly perspective on the emergence of transmedia forms; their technological and aesthetic characteristics; the types of audience engagement they engender; the possibilities they create for engagement with archival content; technological predecessors that they may or may not have emerged from; and the institutional and creative milieux in which they thrive.

You can find the full table of contents for this issue below. We wish you happy reading and look forward to your comments on the renewed viewjournal.eu.

 

Table of Contents

EDITORIAL

DISCOVERIES

EXPLORATIONS