Control Issues: A Report of SXSW ’18

We went to the SXSW Conference this year to reach an audience of tech developers with our session Hacking the Library of Congress. As you might expect from an emerging technology conference, sessions on virtual reality (VR, 48 sessions) and blockchain (29 sessions) dominated the week. At the Virtual Cinema, attendees demoed a variety of VR and augmented reality (AR) experiences – some of the most compelling of which were full-sensory mixed reality (MR) works such as Meow Wolf’s The Atrium. Panelists painted visions of the future that were as hopeful (blockchain democratizing the web) as they were frightening (AI taking over humanity). Attendees took breaks at rest stations with VR vacation experiences and baby goats.

A winding line of concert-goers waiting for venue doors to open at South By Southwest

The hundreds of panels, meet-up sessions, and networking events created a bubble of more than 50,000 attendees within the city – a bubble that not even alarming local events and realities managed to burst.

Labs was interested in seeing past the hype of emerging technologies like artificial intelligence (AI) and blockchain to how they can be applied responsibly in cultural heritage spaces. Speakers from industries leading the way in adopting AI and machine learning revealed issues around transparency and control that are especially important for us to consider. How do we get our authoritative holdings to compete with invisible Google algorithms? How do we keep up with the demands of digital preservation in a market that doesn’t value it? How do we push back on our dependency on electronic journal vendors with ever-increasing subscription prices? We are interested in exploring the hidden controls of new technologies and how they could affect our profession.

In the session AR/VR Evolution or Revolution?, Tony Parisi of Unity Technologies spoke of the inevitability that all computing platforms will eventually move to the VR/AR space. The more important question, he argued, is what consumers will decide their “Personal Reality” should be. Given that attendees experienced a different reality than the rest of Austin that week, this trajectory gave us pause. A critical component of our work in cultural heritage is to present people with things that may not be pleasant in order to teach important lessons about history (the US Holocaust Memorial Museum is a great example). Doing that work in a world dominated by “Personal Realities” will be more critical than ever.

Case studies from the pharmaceutical industry using VR for empathetic education really resonated with us. Companies are designing experiences that train doctors on what it’s like to tour a smoker’s lung or to experience the symptoms of chronic illnesses such as migraines. The collections libraries serve create empathetic connections all the time, and we see huge potential to amplify these moments of transformation between a patron and an object through contextualized virtual and mixed reality. Great examples of this potential include the Smithsonian American Art Museum’s “SAAM VR” product created with Intel and NASA’s “Beethoven’s 5th” VR film created in collaboration with Google.

a virtual reality beach break station at which a person is lounging in a hammock wearing a virtual reality headset in front of a sunset scene backdrop

Dell and Intel provided relaxing virtual reality beach breaks for SXSW attendees.

Artificial intelligence, the glue that makes these technologies possible, was discussed in the session “Letting Go: Designing for an AI You Can’t Control” by UX designers representing Bonsai, Facebook, Singularity University, and Evoke.ai. As AI removes the need for humans to wield the technology directly, designers are freed to point AI at problems and to dictate what positive outcomes should look like. Instead of comparing users to some type of average, for example, the technology can adapt to who users are in the moment and how their behavior changes over time (an approach referred to as “behavior systems”). This shift in the traditional power paradigm makes it important for AI products to be transparent – showing users a breadcrumb trail of decisions, and giving them multiple opportunities to say no as they engage with the product. Microsoft’s chatbot “Tay”, which was shut down only 16 hours after its 2016 launch when it began tweeting racist and sexual statements, was cited as an example of how real an AI system failure can be – and which populations it can hurt.
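The “breadcrumb trail” idea from the session can be made concrete. Here is a minimal sketch, in Python, of what such a trail might look like: every proposed action is recorded with a human-readable reason, and nothing counts as approved until the user explicitly says yes. The `DecisionTrail` and `DecisionRecord` names are our own hypothetical illustration, not code from any of the products discussed.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DecisionRecord:
    """One step in the breadcrumb trail shown to the user."""
    action: str
    reason: str
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )
    user_approved: bool = False  # nothing proceeds until the user opts in

class DecisionTrail:
    """Append-only log of every decision an AI product proposes."""
    def __init__(self):
        self.records = []

    def propose(self, action: str, reason: str) -> DecisionRecord:
        record = DecisionRecord(action, reason)
        self.records.append(record)
        return record

    def approve(self, record: DecisionRecord) -> None:
        record.user_approved = True

# The system proposes; the user sees the reason and gets a chance to say no.
trail = DecisionTrail()
step = trail.propose("personalize feed", "recent reading history suggests maps")
trail.approve(step)
```

The key design choice is that the trail is append-only and always visible: a user (or an auditor) can review every action and the stated reason for it, which is the transparency the panelists argued for.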

The human role, more than ever, is to bring an ethical lens and critical questions to the design process. We must understand where the data is coming from, bring diverse perspectives into the process before the system is formed, and understand how a failure can affect certain populations. Here we see many parallels to the great data initiatives already underway in the cultural heritage community by groups such as Always Already Computational, Frictionless Data, DocNow, and Wikipedia’s Women in Red.

Of all of the technologies mentioned, blockchain proved to be the most elusive and misunderstood by attendees. Kim Jackson of Singular TV and the documentary filmmaker Alex Winter predicted that the hype around blockchain as an easy money scheme will soon fade, and that a technology with many useful applications will one day be as invisible to users as – to paraphrase Winter – JavaScript is to the web. The speakers discussed possible outcomes of a decentralized web, such as users conducting monetary transactions directly (eliminating the need for banks) and users controlling their own personal information, as opposed to being forced to submit PII repeatedly to every new online service they subscribe to. We think blockchain could be a huge game-changer for cultural heritage institutions. A digital ledger system could help authenticate the digital items we serve, ensuring the integrity of our primary resources when they are cited by third parties. Blockchain technology could also improve our understanding of how the items we make available are used. Imagine the implications for measuring impact and fundraising if we could see the history of every person who had cited an item!

In closing, we appreciated the opportunity to sort through the hype, fear, and promise of emerging technologies at SXSW as some of the only cultural heritage representatives in attendance. Labs looks forward to piloting these technologies (except the goats) on our experiments page at the Library of Congress and to using this blog to offer thoughtful critique of our experiences.
