Europeana AGM 2017

The Europeana Network Association AGM 2017 took place on 6 December in Milan, giving attendees the chance to meet colleagues and peers, exchange professional knowledge and find inspiration in a lovely museum setting in the run-up to 2018.


The meeting took place in a friendly atmosphere in which discussion of plans for 2018, reflection on 2017 activities and inspiring talks drawing on different experiences with digital cultural heritage set the scene for the official launch of Europeana’s support for the 2018 European Year of Cultural Heritage.

 


image by PostScriptum via Facebook

 


Museo Nazionale Scienza e Tecnologia Leonardo Da Vinci, CC BY-SA

 

There were five good reasons to be there:

1. Be part of the European Year of Cultural Heritage launch in Milan on 7 December with our new #AllezCulture campaign #SoDoWe

Whether you can attend or not, please spare 10 minutes for the new #AllezCulture campaign dedicated to the Value of Culture in support of the European Year of Cultural Heritage 2018! The European Year of Cultural Heritage will launch at the Culture Forum on 7 December in Milan. To spread the word, we want to create a massive @EuropeanaEU thunderclap on Twitter on the evening of the event: follow the instructions below to help us make noise and properly launch the European Year of Cultural Heritage.

  • Go to Thunderclap
  • Press the ‘support with Twitter’ button
  • Copy and paste this tweet: “Everyone @EuropeanaEU supports #EuropeforCulture #SoDoI” or “Everyone @EuropeanaEU supports #EuropeforCulture #SoDoWe” (optionally adding the @handle of your organisation)
  • Add any extra information you want to include
  • Send the tweet. It will be released during the AGM on 6 December.

2. Learn how to demonstrate your institution’s Impact with the new Impact Playbook

Having an impact beyond the numbers is a challenge many of us face. This year’s AGM is all about your projects. It is a chance to share knowledge about your own activities, but also to learn from others: we are looking for the best projects to create showcase studies for the Impact Framework. Are you working on a project that might have significant social or economic value? Would you like to get time, support and expertise from Europeana’s impact team to assess the impact of your project?

3. Work together on common cultural heritage issues with your peers

Fair exchange is no robbery! Build your own network by meeting up with colleagues from cultural heritage institutions, aggregators, research and education, as well as developers and makers, with whom you can discuss issues and find solutions.

The AGM will provide you with plenty of opportunities to exchange ideas and problems, build your own network and help others. Don’t forget: your attendance is sponsored, and your institution will benefit from your learnings! Let us know if you need advice on a specific subject, or wish to be put in contact with peers facing similar issues: we will do our best to connect you.

4. Help the Europeana Network Association become an influential body at national and European levels

The Europeana Network Association is entering its fourth year. It needs to be even more vibrant and active to create change in Europe’s cultural heritage arena. We already have the opportunity to influence topics from copyright reform to funding, from education to impact – but we need to band together more strongly. Being more vociferous will make us a force that really matters.

The Europeana Network Association needs to:

  • represent its members
  • cover topics that are of relevance and importance to the sector
  • influence how cultural heritage is perceived nationally and at European levels
  • support the future of Europeana

The European Year of Cultural Heritage can help us promote the need for and position of digital cultural heritage in society; we need to find ways to build on it. Giving your time to these common goals will help make Europe a better place.

5. Predict next year’s trends and bring them into the Europeana business plan

Being able to see or predict trends, and to act on them at the right time, helps maintain leadership and keeps an organisation or ecosystem relevant. This kind of input, above and beyond the maintenance of existing systems and work plans, is needed to create the Europeana Business Plan 2018.

Bring your brain and dreams to this year’s AGM, and help make 2018 the year of change for Europeana!

Don’t miss any of this: register now!

 

 

 


Paul Kneale: Capturing the Digital Age through Art

Paul Kneale (b. 1986) is interested in how the physical world is constantly translated into a digital language. His work explores the ways in which the digital facets of our existence can be manifested and reimagined in the physical object. In an interview with i-D, Kneale defined the Internet as ‘a whole way of being in the world’, and his practice investigates the role that art can play in this new enigmatic dimension.


The significance of Kneale’s work resides as much in the process of creation as in the final result. The artist has recently pioneered a technique of ‘scanner painting’, in which he uses an open scanner to capture an abstract impression of the ambient light and atmospheric conditions in his studio. The impression is then printed onto a canvas, forging a connection with the art-historical tradition. Finally, the artist attempts to crystallise immaterial entities such as time by superimposing numerous scans made at different speeds and resolutions. In this way, several moments in time co-exist on the same surface.


Paul Kneale received his Master of Fine Arts from the Slade School of Fine Art, London, in 2011 and has been working closely with the ARTUNER art platform since 2015.

Coinciding with the 57th Venice Biennale, his work is currently on show at the Thetis Gardens in the Arsenale Novissimo, and he is participating in the group exhibition ‘Contemporary Photography Forum’ (7 November 2017 – 8 April 2018) at the Boca Raton Museum of Art in Florida, which examines how the medium of photography has shifted with the advent of new technologies.

Kneale is also an accomplished writer who has contributed theoretical essays to art publications such as Frieze and Spike. He is the author of the experimental short story ‘Ex Oriente Lux’ and the ebook ‘New Abject’.

More about Paul Kneale | www.paulkneale.net | @freewillsalad | Paul Kneale on ARTUNER



Adobe steps up on standards adoption

The November 2017 updates for Acrobat DC, now available from Adobe, include some significant news for those who pay attention to ISO standards adoption.

This release, 18.xx for those on the “Continuous” track, includes the following, among other improvements:

Accessibility: Adobe has advanced its tool set to make it easier to remediate tables and ensure they are tagged in accordance with PDF/UA.

PDF/A and veraPDF: In their release notes, Adobe states: “veraPDF is a European Union (EU) project that was executed under the lead of the PDF Association. veraPDF is a purpose-built, open source, file-format validator covering all PDF/A parts and conformance levels. Now, Acrobat preflight tools can find and report issues that are not compatible with the veraPDF tests.”

For those interested, more information is available about veraPDF and the veraPDF test suite.
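
For those who prefer to script such checks outside Acrobat, here is a minimal sketch (an illustration only, not part of Adobe’s release) that calls the veraPDF command-line validator from Python. It assumes the verapdf launcher is installed and on the PATH, and that the --flavour option selects the PDF/A part and conformance level as described in the veraPDF documentation; the directory name is a placeholder.

```python
# Minimal sketch: batch PDF/A checking by shelling out to the veraPDF CLI.
# Assumes `verapdf` is on PATH and accepts `--flavour 1b` for PDF/A-1b.
import pathlib
import subprocess

def check_pdfa(path: pathlib.Path, flavour: str = "1b") -> bool:
    """Run veraPDF against one file; treat a clean exit as a pass."""
    result = subprocess.run(
        ["verapdf", "--flavour", flavour, str(path)],
        capture_output=True, text=True,
    )
    # veraPDF writes a machine-readable report to stdout; here we only
    # look at the exit status and keep the report for later inspection.
    return result.returncode == 0

for pdf in pathlib.Path("incoming").glob("*.pdf"):   # placeholder folder
    print(pdf.name, "pass" if check_pdfa(pdf) else "needs review")
```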

PDF 2.0: Adobe has announced that Acrobat and Reader can now open and process PDF files that claim conformance with ISO 32000-2 (PDF 2.0), the current version of the PDF specification.

Additionally, Acrobat will now retain files as PDF 2.0 when saving, except when the user specifically requests downgrading to an older version of PDF.


Interview with Brian E. Davis of Oregon State University


Hey Brian! Introduce yourself please.

Hi Ashley! I’m Brian, the head of the Digital Production Unit for the Special Collections & Archives Research Center at Oregon State University Libraries & Press. I’ve been at OSU since the summer of 2012.

My background is pretty broad and a bit random. I was the visual materials archivist at Arizona State University from 2005 to 2008, and I worked as a digital production developer at Duke University right after leaving ASU. Through a series of unfortunate events, I spent a few years as a media services librarian at a small liberal arts college in North Carolina. Although outside the library/archives world, my earliest somewhat related work was installing exhibits at an art museum and working as a movie theater projectionist. My experiences from both of those jobs still influence the work I do today. Oh yeah, I went to grad school and should mention that. I have a master’s in computer science and I’ve been involved with programming to some degree since the late ’90s.

As you might expect, the Digital Production Unit provides digitization services for the library. I sometimes joke that I’m a unit of one, since I’m the only staff member doing standards-based digital production work in the library. My work spans the entirety of our digital production process, from prepping physical materials and performing minor stabilization/repairs all the way through to digitization and later-stage digital preservation. I’m responsible for a lot of things. Not having ALL the resources is both a good thing and a bad thing. I won’t mention the bad parts of it, but the best thing about it is that I’ve been able to come up with ways to be more efficient.

Luckily, I have three part-time student staff and between the four of us, we produce a substantial number of files for both of our repositories (ScholarsArchive@OSU & Oregon Digital), as well as for our preservation storage. In terms of day-to-day work, my students take care of most of the flatbed digitization while I tackle the materials prep, book scanning, videotape transfers, quality control, and digital preservation.


What does your media ingest process look like? Does your media ingest process include any tests (manual or automated) on the incoming content? If so, what are the goals of those tests?

I should mention up front that we don’t typically work with files coming in from other sources. Virtually everything we work with is produced in my unit and we produce a large number of PDF, TIFF, and MKV files each week. Every file gets reviewed. #NoFileLeftBehind

Neither of our digital repositories is currently being used as a preservation repository. That simply means that I make reduced-quality access files for them. It also means that I’m juggling hundreds of derivative files spread across a number of workstations and production servers, along with all the master files. It can get a bit confusing. My focus is on the preservation-level files, and those are moved onto two separate ZFS storage systems via a BagIt protocol. One is a local machine that I configured primarily for video storage and the other is a network share managed by our IT department. Because these files are preservation master files, my testing is to verify that the files were produced in accordance with our local policies, to perform format-specific identification, validation and characterization, and to run fixity checks.
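
As a rough illustration of that bag-and-verify step (a sketch only, not Brian’s exact setup; the directory name is a placeholder), the bagit-python library can create the bag with checksums and re-run the fixity check before anything is registered in preservation storage:

```python
# Minimal sketch using bagit-python: bag a batch of preservation masters,
# then re-open and verify it (full fixity check) after the copy to storage.
import bagit

BATCH = "masters/2017-11_batch01"   # placeholder directory

# Turn the directory into a BagIt bag, writing md5 and sha256 manifests.
bag = bagit.make_bag(BATCH, checksums=["md5", "sha256"])

# Later, e.g. after the bag has been copied to the ZFS share,
# re-open it and validate payload checksums against the manifests.
bag = bagit.Bag(BATCH)
if bag.is_valid():
    print("Fixity verified; safe to register in preservation storage.")
else:
    print("Checksum mismatch or missing payload; investigate before ingest.")
```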

I don’t utilize watched folders as much as I probably could, so most of the processes are somewhat manual. I prefer using command-line tools for most things. However, I often wrap terminal commands inside AppleScripts and install them as “right-click actions” across our macOS workstations. This is how I’ve implemented BagIt, FFmpeg, and a few other tools we use daily. I chose this route both to ease my students into taking on some digital preservation work and also to make certain tasks easier for myself. These actions are connected to a number of command-line tools and keeping things updated is key to making it all work. Thank God for Homebrew.
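
The wrappers described above are AppleScript “right-click actions”; purely as an analogue of the idea (a Python sketch rather than the actual OSU scripts, with illustrative FFmpeg settings rather than local specifications), a small command wrapper might look like this:

```python
# Sketch of a simple wrapper around a command-line tool: take master files
# as arguments and cut H.264/AAC access derivatives with FFmpeg.
# The encoding settings below are illustrative, not a local standard.
import subprocess
import sys
from pathlib import Path

def make_access_copy(master: Path) -> Path:
    """Transcode one preservation master to an access MP4 next to it."""
    access = master.with_suffix(".access.mp4")
    subprocess.run(
        ["ffmpeg", "-i", str(master),
         "-c:v", "libx264", "-pix_fmt", "yuv420p", "-crf", "23",
         "-c:a", "aac",
         str(access)],
        check=True,   # raise if FFmpeg exits with an error
    )
    return access

if __name__ == "__main__":
    # Usage: python make_access.py master1.mkv master2.mkv ...
    for arg in sys.argv[1:]:
        print("wrote", make_access_copy(Path(arg)))
```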

Quality control for PDF files can be burdensome, and as the only reviewer I need to find ways to be more efficient. In that spirit, I’ve abandoned Adobe Acrobat Pro’s Preflight and moved to using Quick Look to scroll through the PDF and validating with MediaConch (GUI). Things move quite a bit faster this way. I’ve used command-line versions of veraPDF and Ghostscript in the past, but I like the speed of my new workflow.

We produce hundreds of TIFF files each week – more than any other filetype. I use DPF Manager for quality control on TIFF files, since it can parse file directories and determine whether files are valid and whether our digitization specifications have been followed. Verifying this information in a batch process eliminates the extra steps of manually checking with Photoshop or Bridge. I generally use the command-line version of DPF Manager on my Linux machine for this task, since my other two workstations are often busy processing video files and PDFs.
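
DPF Manager does the real conformance checking here; as a complementary sketch of the same batch idea (using Pillow rather than DPF Manager, with placeholder policy values and paths), a quick directory walk can flag TIFFs that fail to open or drift from a simple local spec:

```python
# Spot-check sketch (Pillow, not DPF Manager): flag TIFFs that will not
# open or that drift from a simple local spec. Policy values and the
# directory path are placeholders, not OSU's actual specifications.
from pathlib import Path
from PIL import Image

ALLOWED_MODES = {"RGB", "I;16", "L"}   # placeholder policy
MIN_DPI = 400                          # placeholder policy

def check_tiff(path: Path) -> list:
    problems = []
    try:
        with Image.open(path) as img:
            if img.format != "TIFF":
                problems.append(f"unexpected format {img.format}")
            if img.mode not in ALLOWED_MODES:
                problems.append(f"unexpected mode {img.mode}")
            dpi = img.info.get("dpi")
            if dpi and min(float(v) for v in dpi) < MIN_DPI:
                problems.append(f"resolution below spec: {dpi}")
    except OSError as exc:
        problems.append(f"cannot open: {exc}")
    return problems

for tiff in sorted(Path("scans/batch01").glob("*.tif*")):
    issues = check_tiff(tiff)
    print(tiff.name, "OK" if not issues else "; ".join(issues))
```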

As someone juggling many, many processes and workflows, I try to keep things as simple as I can when it comes to my videotape digitization. I use vrecord for all of my analog video transfers and capture as MKV/FFV1/LPCM. I use a variety of other tools for digital videotape, like VLC, QuickTime, and Final Cut Pro X, but I maintain format-appropriate MKV/FFV1/LPCM specs for the preservation-level files. MediaConch is used to validate and to check that my transfer policies have been followed.
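
A minimal sketch of that validation pass, assuming the MediaConch CLI is installed and that a local policy file (the file name below is a placeholder) encodes the MKV/FFV1/LPCM transfer specs:

```python
# Sketch: run a MediaConch policy over every Matroska capture in a folder
# and report which files need a second look before bagging. Assumes the
# `mediaconch` CLI is on PATH and accepts `--policy=<file>` as documented.
import subprocess
from pathlib import Path

POLICY = "policies/ffv1_mkv_preservation.xml"   # placeholder policy file

def conforms(path: Path) -> bool:
    result = subprocess.run(
        ["mediaconch", f"--policy={POLICY}", str(path)],
        capture_output=True, text=True,
    )
    # MediaConch prints a pass/fail summary per file; be conservative and
    # treat any reported failure (or a tool error) as "do not bag yet".
    return result.returncode == 0 and "fail" not in result.stdout.lower()

for mkv in sorted(Path("captures/2017-11").glob("*.mkv")):   # placeholder
    print(mkv.name, "ready to bag" if conforms(mkv) else "needs review")
```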

As soon as our files check out, I right-click to bag them and manually push them up to preservation storage.


Where do you use MediaConch? Do you use MediaConch primarily for file validation, for local policy checking, for in-house quality control, for quality testing for vendor files?

I’ve used MediaConch during the quality control process for my video files for a while now. I use it for validation and policy checking. I just started using it for PDF validation after struggling to find an efficient process.

Files coming into the library from vendors do not undergo any sort of quality control process or validation check. This is because the files simply do not come through my unit. Hopefully that will change because I kind of know what I’m doing with this stuff. ¯\_(ツ)_/¯

At what point in the archival process do you use MediaConch?

Pre-ingest. MediaConch is the first tool that I run during quality control. It’s an integral first step as it determines what happens next. If the file doesn’t pass, I need to figure out why and correct that problem. If it passes, then we can happily move onto the next step.

Do you use MediaConch for MKV/FFV1/LPCM video files, for other video files, for non-video files, or something else?

I use MediaConch for all of my MKV/FFV1/LPCM video files. As I mentioned, I recently started using it for our PDFs and it’s a real time saver. I still use DPF Manager for our TIFF files, but it would be nice to use MediaConch across the board. I’m a sucker for uniformity and may explore porting my configuration specs into MediaConch.


Why do you think file validation is important?

When I first started working in my current position, if a file opened then that was good enough for everyone. But as we all know, there’s more to it than that.

I moved to PDF/A-1b for PDFs coming out of our digital production process in 2012. To keep to that particular flavor of PDF, I configured a number of Adobe Acrobat actions for students to ensure that they’re saving as PDF/A-1b. There are times when those actions stop working or a student decides to do something else. Whatever the case, the file validation that MediaConch does helps me catch those files early in a project.

I’m not immune to making mistakes either. I literally sit in the middle of a donut of computers and multitask the day away. There are a number of vrecord settings that are easy to mis-select during my videotape transfer process and MediaConch policies are my insurance. Moving from a Betacam SP workflow down to an EP-recorded VHS tape workflow, I could fail to drop the bit depth down to 8. My local policy for low quality VHS ensures that I don’t end up pushing a larger 10-bit transfer up to preservation storage. This makes everyone happy, especially the systems people who maintain our storage. Did I mention that they hate me and my big files?

Anything else you’d like to add?

I’ve been informally testing Archivematica for close to two years now, primarily on image files that we produce. The machine that I’m running it on is a Mac Pro from 2008 and it chokes a bit on some of the video files, so I haven’t done a great deal of testing on those. However, my library is moving forward with a large-scale Archivematica pilot later this year and I’m very much looking forward to trying out the MediaConch integration. That is, if I can convince them to run a development release of it.



Interview with Patricia Falcao and Francesca Colussi of Tate


Hey Patricia and Francesca! Introduce yourselves please.

PF: My name is Patricia Falcao and I am a time-based media conservator at Tate. I’ve had different roles within Time-based Media Conservation over the last 8 years, but most recently I have been responsible for the acquisition of new artworks into the collection. That means I work with a team of people to ensure that we have all the information, media, equipment and software that we need to preserve these works. As part of that, I am very interested in checking that the files we receive from artists don’t have any issues that might affect their sustainability or playback, and also that, when we receive tapes and migrate their contents to file, the resulting files are consistent among themselves and with our specifications.

FC: My name is Francesca Colussi and I’m one of the senior time-based media conservation technicians at Tate. Within our department I am part of the team that installs and takes care of displays and exhibitions. I’ve been at Tate for a bit more than a year and I have a pretty eclectic professional background, ranging from art publishing to photography and video production. Before Tate I was managing the studio of a video artist, where I started to deepen my interest in video archiving, equipment research and exhibition format production. I have a special interest in analysing the dependencies between specific files, software and hardware, especially in video production. Media encoding is a key part of our workflow, and my colleagues and I try to make sure we have the ideal exhibition format to perfectly match the equipment we use for each display and exhibition.

What does your media ingest process look like? Does your media ingest process include any tests (manual or automated) on the incoming content? If so, what are the goals of those tests?

There are two different types of ingest:
1. Material that we hold or receive on tape, which we must migrate to file, and
2. Videos that we receive as files, either because they were produced as files (often in some flavour of QuickTime/ProRes) or, in some cases, because they are tape-based video that was transferred at another institution or by the artist.

Compared to AV archives we receive a small number of files, typically under 100 files/year, but they are all part of extremely valuable artworks that Tate has to preserve for the long-term.

We transfer our tapes in external facilities, and we are always present for the transfers, so we can do an initial quality control at that point. We look for image errors and try to identify whether they are on the tape or were caused by the transfer. At this stage we also check the resulting files with MediaConch to see if the files being produced are what we requested. Further to that, we want to be sure that the content of any file (both from tape and born-digital) is what it is meant to be, so that we have the right artwork, only the right artwork and all of the artwork. We also need to look for image errors and understand whether they are intended or are there by mistake. We usually watch the video for any flaws and in parallel look at the file in QCTools. We are still learning to use QCTools to its full potential; we are growing in confidence in the tool and our ability to use it, but we also still want to look at the total duration of the videos. We look at the video files in three players, typically QuickTime 7, VLC and QCTools: on the first pass we look at the whole video, and then we only sample the video in the two other players. This helps bring out issues such as inconsistent metadata for aspect ratio. We also look at the metadata with MediaInfo and make sure that what we received is what we expected. This sometimes allows us to identify issues with colour space or aspect ratio. We are also seeing/hearing more and more complex audio tracks, from 5.1 to 9.1, and we have started to use Audacity to check those.
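
As an illustration of that metadata comparison (a sketch only, assuming the pymediainfo wrapper around MediaInfo; the file names are placeholders), the fields that tend to expose inconsistencies, such as codec, frame size and display aspect ratio, can be pulled into one summary per file:

```python
# Sketch: summarise the MediaInfo fields compared across deliverables,
# so an inconsistent aspect ratio or codec stands out at a glance.
# Assumes the pymediainfo wrapper (and the MediaInfo library) is installed.
from pymediainfo import MediaInfo

def summarise(path: str) -> dict:
    info = MediaInfo.parse(path)
    summary = {"file": path}
    for track in info.tracks:
        if track.track_type == "Video":
            summary.update(
                codec=track.format,
                width=track.width,
                height=track.height,
                display_aspect_ratio=track.display_aspect_ratio,
            )
        elif track.track_type == "Audio":
            # Note the channel count so 5.1/9.1 layouts stand out early.
            summary["audio_channels"] = track.channel_s
    return summary

for f in ["work_master_A.mov", "work_master_B.mov"]:   # placeholder names
    print(summarise(f))
```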

These checks are usually done on the files in our Interim Storage, which we copy over from the artist-supplied hard drive using either Exactly (from AVPreserve) or Bagger (from the LoC), each of which creates all the checksums in a bag. Once the checks above are made, we use Archivematica to transfer to our Archival Storage, but that is currently still in the testing phase.

Where do you use MediaConch? Do you use MediaConch primarily for file validation, for local policy checking, for in-house quality control, for quality testing for vendor files?

PF: As I said above, in my context we use MediaConch to check files that have resulted from migration. We usually have one or two migration sessions a year, and MediaConch makes comparing those files among themselves and with specifications a lot easier. Dave Rice analysed the specifications for QuickTime wrapper/V210 compression and created a profile for us that has already raised a series of questions about our processes and is making us reconsider how to transfer from tape to file.

FC: I mainly use MediaConch as a comparison and ‘problem solving’ tool to spot anomalies in exhibition format files, so I would say I use it both for local policy checking and for in-house quality control. Our Exhibition Files are produced specifically each time we are preparing a work to be displayed, and when encoding a video file for an exhibition we need to comply with the media player settings (in our case mainly Brightsign) and take into consideration the projector we are going to use. Sometimes unexpected problems occur, like playback issues, glitches or simply the player being unable to read the file. The first step is always to check the file visually if it’s a video, but then a multiple set of tools is necessary to get a deeper understanding of the issues.

MediaConch is particularly useful when I prepare files for multiple-channel installations that originally have the same or similar properties. For example, if we use the same settings to encode all the files and some of them appear faulty, I use MediaConch to create a local policy and spot the issues. Sometimes it’s challenging, as the rules of dependency between equipment and files vary; Dave Rice helped us explore the possibility of developing a specific policy to check the files against the Brightsign specs, which I find a very interesting challenge.

At what point in the archival process do you use MediaConch?

PF: We would usually use MediaConch either when we receive new files, before preparing an AIP/Bag for storage or after a migration from Tape to file.

Do you use MediaConch for MKV/FFV1/LPCM video files, for other video files, for non-video files, or something else?

PF: So far we’ve used it for QuickTime files with either V210, ProRes, DV or H.264 compression, but also for other video files, like MPEG-1/2 Video.

FC: I mainly use it for video or sound files, ProRes, H.264 and MPEG-1/2 Video are the most common files I handle.

Why do you think file validation is important?

PF: Because it will highlight any issues that a file may have that may not be immediately visible in the image or even the metadata but that may impact a file’s sustainability in the long-term.

FC: I totally agree with Patricia. The validation done earlier, during the acquisition process, makes sure we have an issue-free master file to refer to when producing any exhibition formats.

Anything else you’d like to add?

PF: Over the last few years, through our collaboration with Dave Rice and the tool developers, we have been completely sold on using open source tools for our archiving workflows. It is such an impressive community working in this field. Congratulations!


MediaConch Users Survey

MediaArea is immensely grateful to have been involved in the PREFORMA challenge over the past three years. Through this initiative, MediaArea has been given the opportunity to further contribute to the cultural heritage sector through the development of the open source audiovisual conformance checker, MediaConch.

To better understand our users and plan more efficiently for the future of this software, MediaArea would appreciate your feedback via this MediaConch Users Survey.

The survey assumes that you have used MediaConch in some capacity and should take 5-10 minutes to complete.


Shaping our future memory standards (2/2)

Source: Open Preservation Foundation blog post by Becky Mc Guiness

Following on from the notes from day 1 of ‘Shaping our future memory standards’, the final PREFORMA conference, here are the notes from day 2:

Monika Hagedorn-Saupe from the Prussian Cultural Heritage Foundation was the first speaker on day 2 of the conference. She spoke about the ‘Situation and perspectives for digital preservation among cultural heritage institutions’.

She has been involved in the nestor network, which aims to raise awareness about digital preservation and build a network among organisations in Germany. Their aim was not to develop a technical solution.

Nestor produced some of the first reports on digital preservation. They carried out a survey of their network in 2005 and found very few organisations had a strategy or plan for digital preservation, and that knowledge about file formats was very low. Very few organisations had someone responsible for digital preservation within their institution.

They have since produced a handbook for digital preservation, scientific reports and guidelines, and have set up the Nestor seal for trustworthy archives. They have a dedicated working group investigating format detection and have introduced veraPDF and DPF Manager as important tools to their network.

Development of digital preservation knowledge is still quite uneven. Some organisations are yet to be convinced of the importance of file format validation; in other areas, however, there is an increasing understanding, particularly around text and image formats and standards. AV formats are still being evaluated, and they are at the stage of monitoring the formats validated by MediaConch.

The second talk of the day was ‘Back to the future? Digital preservation needs of future users anticipated today’ by Milena Dobreva-McPherson of University College London Qatar. She commented that through the summaries of the projects she can see that a lot of useful work has been done, but also recognised that many other topics remain for future work. She noted that it is difficult to address the subject of the future and compared it to Alice in Wonderland: how do you know if you have got there, if you don’t know where you are going?

Preservation has many aspects to it. It is seen as a roadmap, and its users are today’s users – we cannot predict who the future users will be. There are many horror stories about how much data might be lost, but these are based on projections; you don’t hear that many actual horror stories of lost data.

Research data has many similar challenges to digital preservation, but it remains a separate area at the moment. Digital preservation funding has been replaced by research data in the EU funding streams. Again, there was a question about responsibility in relation to preservation. Who should be responsible for preserving research data? Librarians? Institutional repositories? Researchers? Research funding agencies? A new kind of professional?

Milena observed that there are two ways of looking to the future: speculating versus creating. PREFORMA has taken the second approach: it is creating tools, new models and new thinking in digital preservation.

We need joint efforts to take digital preservation forward. There are not many new initiatives coming up. We need to think about the next steps and what kinds of projects we want in future. We need to define what makes digital preservation skills unique and explain how they are connected to the issues. Milena called on participants to blend with other communities and be stronger in communicating our value. We are good at hiding behind the words ‘long term’. Users want results now. We have success stories and we should be more active in sharing them.

The final panel session addressed ‘Business models around open source software’, chaired by Peter Bubestinger-Steindl of AudioVisual Research & Development. The discussion began with a show of hands: all of the audience uses some kind of open source software, but only some of them have ever paid for it.

Open source is often perceived as free work and commercial software is seen as the ‘opposite’. Open source is also confused with freeware. Jérôme Martinez from MediaArea said that his organisation is commercial and they develop open source software. He still needs to explain what open source is nearly every day. Someone needs to pay for it – developers don’t just work for free, they need to make a living.

Carl Wilson explained how the Open Preservation Foundation’s business model is different. We are not raising revenue from software sales; our income currently comes from membership fees and projects. It means that any organisation can download the software for free, and OPF members get support in using the tools we maintain, including JHOVE, fido and now veraPDF. The OPF was set up off the back of a large research project (Planets) to sustain its results. The software we adopt is driven by our members’ interests; for example, we adopted JHOVE at the request of our members. JHOVE is widely used, but currently only paid for by OPF members.

As a user, Klas Jadeglans of the National Archives of Sweden explained that open source software can be problematic to adopt. When memory institutions want to buy software they are used to going through a procurement process. Without anything to procure, it is difficult to spend money on open source software. Klas tests and uses open source software to demonstrate the benefits internally. It is easier to get funding later to improve the software.

Julia Kim explained that the Library of Congress recently became members of the BitCurator consortium. It was difficult to find a logical accounting code with the administrators and managers for the membership. More transparency around different business models would help users to ‘sell’ the idea of open source software internally. She commented that JHOVE is integrated into their repository – it has become a standard because everyone uses it. She is now also relying on MediaConch for her work, but is uncertain of how long it will take to integrate it into their production environment.

There is still a lot of confusion about open source business models. If there is not a price tag, how do you deal with it? Users want to pay for it, as otherwise it ‘feels like stealing’ but there is not a process in place.

Secondly, there is still a problem in management with the perception of the quality of open source software. Many users are so reliant on it now that this perception is gone, but it’s still an issue at management level. It’s difficult to pay for add-on services if the price point is zero – that sets the anchor for additional costs. We need to break this link between price and quality. Quality is independent of a licence.

Carl pointed out that the internet is built on open source software. It’s reliable and runs 24 hours a day. This was highlighted by the Heartbleed bug: it transpired that the affected library was maintained by one person in Germany; it was fixed, and internet commerce continues to rely on it. Encryption is another area where open source has benefits that aren’t always intuitively apparent. Open source means that the code can get lots of eyes on it and be thoroughly tested. When Sony’s Blu-ray encryption was released, it was cracked within 24 hours. Open source is not a new alternative that came later; it is the foundation of a lot of the software and systems we use today.

One of the main advantages of open source is that you can test and quality assure your software. By doing this you can mitigate the perception of lack of quality. The community is also very important. We are working in a niche area and using open source means you are not dependent on a single developer (or company). If you want to modify the software you can pay someone else to do it.

Borje Justrell gave the closing remarks for the conference, ‘Looking after PREFORMA’. He explained that the three conformance checkers would be sustained by the suppliers through different means, and thanked everyone for their participation in the conference and for their feedback throughout the project. He encouraged the audience to read the new PREFORMA handbook, which summarises the project’s work.


DPF Manager Workshop

[English]

On Wednesday, December 13, PACKED and the Royal Library of Belgium, in collaboration with the University of Girona, organise a workshop, “Quality Control of TIFF Files”. The workshop is intended for digital archivists who are responsible for the long-term preservation of TIFF files.

During the workshop you will learn how to gain an insight into the technical properties of TIFF files and what you can do to preserve them for future generations. The workshop is conceived as a hands-on session, where participants get started with TIFF files from their own collection. The purpose is to determine the durability of your TIFF files using the DPF Manager Conformance Checker and to create a policy profile for the files in your collection.

The workshop takes place in the Royal Sky Room 2 (6th floor) of the Royal Library in Brussels. The number of participants is limited to fifteen. Please register with Piet Janssens (piet.janssens@kbr.be) before Friday, November 27th.

This workshop has been made possible through the PREFORMA project, funded by the Seventh Framework Program for Research and Innovation of the European Commission.

[Dutch]

On Wednesday 13 December, PACKED and the Royal Library of Belgium, in collaboration with the University of Girona, are organising a workshop, “Quality Control of TIFF Files”. The workshop is aimed specifically at digital archivists who are responsible for the preservation of TIFF files.

During the workshop you will learn how to gain insight into the technical properties of TIFF files and what you can do to preserve them for future generations. The workshop is conceived as a hands-on session in which participants work with TIFF files from their own collection. The aim is to determine the sustainability of your TIFF files using the DPF Manager Conformance Checker and to draw up a preservation profile for the files in your collection.

The workshop takes place in the Royal Sky Room 2 (6th floor) of the Royal Library. The number of participants is limited to fifteen. You can register with Piet Janssens (piet.janssens@kbr.be) until Friday 27 November.

This workshop has been made possible by the PREFORMA project, funded by the European Commission’s Seventh Framework Programme for Research and Innovation.

[French]

On Wednesday 13 December, PACKED and the Royal Library of Belgium, in collaboration with the University of Girona, are organising a workshop on the “Quality Control of TIFF Files”. The workshop is aimed at digital archivists responsible for the preservation of TIFF files.

During the workshop you will learn about the technical properties of TIFF files and good practices for preserving them. The workshop is designed as a hands-on session in which participants can work with TIFF files from their own collection. The objective is to determine the sustainability of your files using the DPF Manager Conformance Checker and to create a policy profile for the files in your collection.

The workshop will take place in the Royal Sky Room 2 of the Royal Library of Belgium and will be held in English. The number of participants is limited to fifteen. You can register with Piet Janssens (piet.janssens@kbr.be) until Friday 10 November.

This workshop is supported by the PREFORMA project, funded by the European Commission’s Seventh Framework Programme for Research and Innovation.


AMIA 2017 Conference

AMIA is a nonprofit international association dedicated to the preservation and use of moving image media. AMIA supports public and professional education and fosters cooperation and communication among the individuals and organizations concerned with the acquisition, preservation, description, exhibition, and use of moving image materials.

Programmed by professionals working in the field, the annual AMIA Conference is the largest gathering of motion picture and recorded sound archivists and interested professionals.  More than 550 annual attendees include members and colleagues from around the world.

The goal of the Conference is to present a broadly based program that speaks to the wide range of attendees with a balance of theory and practice, inviting new ideas and concepts that may stimulate additional interest, involvement and educational benefit. The conference provides an opportunity for colleagues and those interested in the field to meet, share information and work together. For newcomers to this vibrant, dynamic and committed community, networking with other AMIA members and industry professionals is invaluable for professional development. AMIA conference registration includes participation in all regular sessions and screenings and some special events.

The 2017 edition of the AMIA Conference will be held from 29 November to 2 December 2017 in New Orleans.

 


 

For further details about the Conference, go to www.amiaconference.net.

For more information about AMIA, events and membership, go to www.AMIAnet.org.


veraPDF Workshop

[English]

On Friday, November 24, PACKED and the Royal Library of Belgium, in collaboration with the Open Preservation Foundation, organise a workshop, “Quality Control of PDF Files”. The workshop is intended for digital archivists who are responsible for the long-term preservation of PDF files.

During the workshop you will learn how to gain an insight into the technical properties of PDF files and what you can do to preserve them for future generations. The workshop is conceived as a hands-on session, where participants get started with PDF files from their own collection. The purpose is to determine the durability of your PDF files using the veraPDF Conformance Checker and to create a policy profile for the files in your collection.

The workshop takes place in the Royal Sky Room 2 (6th floor) of the Royal Library in Brussels and will be taught in English. The number of participants is limited to fifteen. Please register with Piet Janssens (piet.janssens@kbr.be) before Friday, November 10th.

This workshop has been made possible through the PREFORMA project, funded by the Seventh Framework Program for Research and Innovation of the European Commission.

[Dutch]

On Friday 24 November, PACKED and the Royal Library of Belgium, in collaboration with the Open Preservation Foundation, are organising a workshop, “Quality Control of PDF Files”. The workshop is aimed specifically at digital archivists who are responsible for the preservation of PDF files.

During the workshop you will learn how to gain insight into the technical properties of PDF files and what you can do to preserve them for future generations. The workshop is conceived as a hands-on session in which participants work with PDF files from their own collection. The aim is to determine the sustainability of your PDF files using the veraPDF Conformance Checker and to draw up a preservation profile for the files in your collection.

The workshop takes place in the Royal Sky Room 2 (6th floor) of the Royal Library and the working language is English. The number of participants is limited to fifteen. You can register with Piet Janssens (piet.janssens@kbr.be) until Friday 10 November.

This workshop has been made possible by the PREFORMA project, funded by the European Commission’s Seventh Framework Programme for Research and Innovation.

[French]

On Friday 24 November, PACKED and the Royal Library of Belgium, in collaboration with the Open Preservation Foundation, are organising a workshop on the “Quality Control of PDF Files”. The workshop is aimed at digital archivists responsible for the preservation of PDF files.

During the workshop you will learn about the technical properties of PDF files and good practices for preserving them. The workshop is designed as a hands-on session in which participants can work with PDF files from their own collection. The objective is to determine the sustainability of your files using the veraPDF Conformance Checker and to create a policy profile for the files in your collection.

The workshop will take place in the Royal Sky Room 2 of the Royal Library of Belgium and will be held in English. The number of participants is limited to fifteen. You can register with Piet Janssens (piet.janssens@kbr.be) until Friday 10 November.

This workshop is supported by the PREFORMA project, funded by the European Commission’s Seventh Framework Programme for Research and Innovation.