
Usability and Analytics in Libraries, Archives, and Museums


The following is a guest post by Jefferson Bailey, Fellow at the Library of Congress’s Office of Strategic Initiatives.

Librarians are often accused of talking in a language that can be difficult for non-librarians to understand. While jargon is part of constructing a professional identity, it becomes problematic when it hinders how librarians communicate with their patrons. That miscommunication can have a negative impact on the design and subsequent usability of library websites.

Shuo Yang, University of Michigan School of Information

Web usability aims "to make websites more usable, useful, and accessible," according to Usability.gov. But libraries are not always so good at ensuring usability. A recent paper examined a number of library usability studies and noted the frequent disjunction between library terminology and user understanding, or, as Library Journal succinctly put it, "users don't know what libraries are talking about." The design aesthetic of library websites often perpetuates this confusion. Web analytics, however, offer one method of assessing the quality and usability of library, archive, and museum websites.

Drawing on the ability to monitor and measure user interactions with online content, web analytics can offer detailed information on page views, unique visitors, navigation paths, and other statistics of website use. An examination of analytics reports can expose how users are, or aren’t, discovering, accessing, and interacting with digital resources.
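By way of illustration, the Python sketch below shows how basic metrics such as page views, unique visitors, and navigation paths can be derived from a simplified visit log. The log format and the sample records are invented for the example; real analytics tools work from much richer data, such as full HTTP server logs or JavaScript page tags.

```python
# A minimal sketch of deriving basic web metrics from a simplified visit log.
# The (visitor_id, timestamp, path) record format is an assumption for the
# example, not a real analytics export.
from collections import Counter, defaultdict

# Hypothetical sample of (visitor_id, timestamp, path) records.
visits = [
    ("visitor-1", "2012-03-01T09:00", "/digitalpreservation/"),
    ("visitor-1", "2012-03-01T09:02", "/digitalpreservation/about/"),
    ("visitor-2", "2012-03-01T10:15", "/digitalpreservation/"),
    ("visitor-2", "2012-03-01T10:20", "/digitalpreservation/collections/"),
    ("visitor-1", "2012-03-02T14:30", "/digitalpreservation/"),
]

# Page views: one count per record, grouped by page.
page_views = Counter(path for _, _, path in visits)

# Unique visitors: distinct visitor identifiers.
unique_visitors = len({visitor for visitor, _, _ in visits})

# Navigation paths: the ordered sequence of pages each visitor viewed.
paths = defaultdict(list)
for visitor, timestamp, path in sorted(visits, key=lambda record: record[1]):
    paths[visitor].append(path)

print("Page views:", dict(page_views))
print("Unique visitors:", unique_visitors)
print("Navigation paths:", dict(paths))
```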

In turn, this shows site administrators and collection managers how refining their websites to enhance the user experience can maximize popularity, ease of use, and effectiveness. Analytics projects have been chronicled for library, archive [ppt] and museum websites. Web analytics have also been a piece of broader data analysis efforts to study collection use and expenditures.

We use tools to generate analytics reports for digitalpreservation.gov and blogs.loc.gov/digitalpreservation, and we occasionally have some “fresh eyes” evaluate our site. Last month we were lucky enough to have Shuo Yang, from the University of Michigan’s School of Information, spend his Alternative Spring Break in our offices performing advanced analytics of the web metrics related to this blog. Shuo is pursuing his M.S. in Information with a specialization in Human Computer Interaction and Information Analysis and Retrieval.

As Shuo described, “the purpose of the project was to analyze what kind of blog posts could bring more visits. I used many scientific methods to conduct this project:  information retrieval, natural language processing, text mining, time series analysis, and statistics.” Metrics themselves, of course, are simply raw data and need additional interpretation and visualization to make them comprehensible. As Shuo noted, “I created a filter with a lot of rules to retrieve visit data of each posted blog.” He also “defined 6 variables that could be categorized into two categories: quantity variables and quality variables to analyze the blog.” Over the course of the project, he used “web analytics software, the programming languages Python, R, and SQL, and Unix line commands” all to extract, structure, analyze, and visualize trends across our blog posts.
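The post does not spell out Shuo's filter rules or his six variables, but the Python sketch below illustrates the general approach he describes: filtering an analytics export down to individual blog posts, then computing quantity variables (totals of raw visits) and quality variables (engagement measures). The report format, the filter rule, and the four metrics shown are assumptions for illustration only, not Shuo's actual definitions.

```python
# A rough sketch of per-post filtering and variable extraction. The filter
# rule and the metrics below (two quantity variables, two quality variables)
# are illustrative assumptions, not the actual rules or six variables used
# in Shuo's analysis.
import csv
from io import StringIO

# Hypothetical export from an analytics tool: one row per page per day.
raw_report = """page_path,date,pageviews,unique_visitors,avg_time_on_page,bounce_rate
/digitalpreservation/2012/03/web-archiving/,2012-03-01,120,95,180,0.42
/digitalpreservation/2012/03/web-archiving/,2012-03-02,80,60,150,0.50
/digitalpreservation/2012/03/personal-archiving/,2012-03-01,200,170,210,0.35
/category/web-archiving/,2012-03-01,40,30,60,0.70
"""

def is_blog_post(path):
    """Filter rule (an assumption): keep individual post URLs,
    drop category and archive pages."""
    return path.startswith("/digitalpreservation/2012/")

posts = {}
for row in csv.DictReader(StringIO(raw_report)):
    if not is_blog_post(row["page_path"]):
        continue
    post = posts.setdefault(row["page_path"], {
        "pageviews": 0, "unique_visitors": 0,
        "time_samples": [], "bounce_samples": []})
    # Quantity variables: totals of raw visit counts.
    post["pageviews"] += int(row["pageviews"])
    post["unique_visitors"] += int(row["unique_visitors"])
    # Quality variables: engagement measures averaged across days.
    post["time_samples"].append(float(row["avg_time_on_page"]))
    post["bounce_samples"].append(float(row["bounce_rate"]))

for path, metrics in posts.items():
    avg_time = sum(metrics["time_samples"]) / len(metrics["time_samples"])
    avg_bounce = sum(metrics["bounce_samples"]) / len(metrics["bounce_samples"])
    print(path, metrics["pageviews"], metrics["unique_visitors"],
          round(avg_time), round(avg_bounce, 2))
```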

A visualization of category-related blog web metrics created by Shuo Yang.

Shuo’s great work gave us insight into a variety of trends for how the blog is accessed and used and the popularity of certain topics and characteristics of posts. For the sake of office comity, I won’t reveal the most popular blogger, but the graphic here shows a visualization of the most popular blog posts by topic. These and other findings will help us effectively craft our communications strategy and online presence as well as (hopefully!) keep this blog engaging and interesting.
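For readers who want to build a similar category-level view of their own data, the short matplotlib sketch below produces a basic bar chart of page views by post category. The category names and counts are placeholders, not the blog's actual figures.

```python
# A minimal sketch of a category-level visualization using matplotlib.
# The categories and counts are placeholder values for illustration.
import matplotlib.pyplot as plt

categories = ["Web Archiving", "Personal Archiving", "Tools", "Events"]
pageviews = [1200, 950, 700, 400]  # placeholder values

plt.figure(figsize=(8, 4))
plt.bar(categories, pageviews)
plt.ylabel("Page views")
plt.title("Blog page views by post category (illustrative data)")
plt.tight_layout()
plt.savefig("category_pageviews.png")
```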

Advanced programming skills like Shuo's are not necessary to analyze web metrics. A general web search will turn up a number of free analytics tools that can be installed and used fairly easily, and there are also existing usability-tested, free library website templates.

It also bears mentioning that the overall effectiveness of an institution's web presence depends on an awareness and use of emerging technologies for open access, online exhibition, and support of like-minded projects.

Finally, it is worth remembering that, for all the power of analytics, unique visits and bounce rates will not improve the website or resources of an organization that does not have well-defined goals. Little-visited online collections or webpages may support those goals in ways not best ascertained through metrics analysis. Analytics, then, are merely one important piece of a much larger toolkit institutions can use to effectively craft their online presence and support their overall mission.
