Doing usability against the "Catholic Portal"

This posting describes a process for iteratively studying usability issues against the "Catholic Portal" with the expectation that it will be applied by each institutional member of the Digital Access Committee within the current calendar year. The posting is divided into the following sections:

  • Why do usability?
  • Expectations
  • Locally required resources
  • Doing the work
  • Summary and conclusion

This document is also available as a PDF document for printing, a second PDF document designed as a set of slides, and just for fun, an EPUB file for your mobile device.

Why do usability?

Why do usability? Because very few things in life are truly intuitive. We -- the totality of the Catholic Research Resources Alliance -- have worked hard to build a tool facilitating Catholic scholarship. This tool is functional, and by that we mean it is always online, does not crash, and does not return incorrect information. In order for the "Portal" to be successful, it needs to go beyond functionality to usability. This means it needs to be easy to use, contain a limited amount of jargon, and be perceived by its intended audience as a time saver.

Expectations

Usability is an iterative process benefiting from the experience of many people. For these reasons, each institutional member of the Catholic Research Resources Alliance's Digital Access Committee is expected to conduct usability studies against the Portal before the end of this calendar year. Using this posting as a framework, this means five or six usability studies will be done against the Portal before Christmas. The exact times when these studies will be done, and the exact way they are facilitated are left up to each institutional member as long as this document is used as a "recipe" and the balance of the Committee is consulted. This is a group process.

Locally required resources

Fiscally speaking, usability is not an expensive process. Instead, the greatest costs are measured in people's time. In order to do this work each institutional member of the Digital Access Committee will need:

  • a team of at least 2 people, but 3 or 4 is much better
  • approximately 40 hours of time to be spent by the team
  • money to compensate testers' time
  • usability software, optional

The most important resource is the people. At a minimum, two are required. One will facilitate tests. The other will record observations. But there are many tasks that need to be done besides facilitation and observation. It is better to have three or four people on the team. This will make scheduling testers easier, reduce the possibility of overwhelming anybody, and take advantage of different people's individual talents.

The whole usability process will consume approximately 40 hours of staff time. If you assume six to eight hour-long usability sessions will be facilitated at your institution, and two people are doing the work, then actually doing the tests will "cost" about 16 hours of time. Evaluating the totality of the tests may take another 2 or 3 hours per person, so the cost is increased by 4 to 6 hours. Scheduling people to participate is one of the most difficult parts of usability and requires a great deal of coordination. Expect to spend another 4 hours just finding and getting qualified and representative people to participate in your study. As the year progresses, we expect the Portal to change, and consequently the usability studies will change. Time will need to be spent coordinating with the Digital Access Committee on these changes, another 3 to 4 hours. If you plan to take advantage of usability software, time will need to be spent purchasing the software as well as practicing with it. Finally, time will need to be spent documenting the experience so it can most effectively be shared with the Committee.
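As a rough sanity check, the time estimates above can be tallied in a few lines of code. The figures for software practice and documentation are our own assumptions, since exact hours for those steps are not given above.

```python
# Rough staff-time budget for one round of usability studies,
# assuming two team members and eight hour-long sessions.
budget = {
    "facilitating tests": 16,         # 8 sessions x 2 people
    "evaluating results": 6,          # up to 3 hours x 2 people
    "scheduling testers": 4,
    "coordinating with the Committee": 4,
    "practicing with software": 4,    # assumed figure
    "documenting the experience": 6,  # assumed figure
}

total = sum(budget.values())
print(f"total staff hours: {total}")
```

Summing these estimates lands right around the 40 hours of staff time mentioned above.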

Usability studies cost the time of testers. It is customary to compensate these people for their time. The amount of compensation will be guided by your local policies, but things like food, gift certificates, small spending sprees at the local bookstore, or services are all examples. You might need to allocate as much as $10-$25 per usability participant for compensation.

Computer software exists to help facilitate usability studies. At the very least such software records how the participant interacts with the system being tested. More full-featured software also records the participants' facial expressions and auditory responses. For the Macintosh we suggest Silverback, which costs about $80. A popular Windows application is called Morae Recorder and costs just under $200. Employing software in your studies makes it easier to be thorough in your evaluation as well as enabling you to share individual studies. At the same time, the software adds a bit of complexity and expense.

Doing the work

Once the goals of usability and expectations are understood, and once the resources have been allocated, it is time to actually do the work. The process is more stratified and iterative than it is sequential. In other words, it is not always necessary to complete one step before starting the next. The steps include:

  • refining the usability tasks to be studied
  • practicing with the technology, optional
  • scheduling testers
  • facilitating the studies
  • evaluating the results
  • reporting on the results

The following sections elaborate on each of these items.

Refining the usability tasks

Usability studies are done in an effort to learn how systems can be made easier to use, free of jargon, and perceived as a time saver by the intended audience. Since a primary focus of the "Portal" is to create "access to those rare, unique and uncommon research materials", the usability study must test how well the Portal facilitates these goals. The initial set of usability studies done at Notre Dame included the following tasks:

  1. Identify the library or archive holding the papers of Dorothy Day.
  2. Find a record whose author is Graham Greene. Create an account, then add the Graham Greene record to your favorites, tagging it as "ggreene."
  3. Locate resources, including primary resources, on the Catholic Conference for Interracial Justice.
  4. Find a set of records on the topic of "Catholic social action." Choose 1-3 from the retrieved set and email them to yourself for future reference.
  5. Locate materials on the topic of sermons and the Lutheran church.
  6. Who owns "Our Sunday Visitor Records"? What telephone number would you call in order to schedule a time to visit the collection?
  7. Which library has the most French-language materials in the "Portal"?
  8. What is the most frequently used word in the pamphlet owned by Notre Dame entitled "Pastoral instruction for the application of the Decree of the Second Vatican Ecumenical Council on the Means of Social Communication"? (hint: see the record with the call number BV 4319).

Notice how the tasks touch on many different aspects of the Portal. They focus on the finding of diverse materials, identifying where they are physically located, and actually using some of them.
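Task 8 asks for the most frequently used word in a digitized pamphlet. For the curious, once the full text is in hand, answering that question is a few lines of work. The sketch below is a hypothetical illustration and not part of the Portal itself; the sample text and stopword list are ours.

```python
import re
from collections import Counter

def most_frequent_word(text, stopwords=()):
    """Return the most common word in text, ignoring case,
    punctuation, and an optional collection of stopwords."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in stopwords)
    word, _ = counts.most_common(1)[0]
    return word

# A toy example; the real pamphlet's full text would come from the Portal.
sample = "The means of social communication serve the Church, and the Church serves."
print(most_frequent_word(sample, stopwords={"the", "of", "and"}))
```

Without a stopword list, common function words like "the" will almost always win, which is why the sketch filters them out first.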

Since we expect to take the results of usability studies and apply them to the Portal interface as soon as feasible, the tasks outlined above may become moot over time. Consequently you will need to see how the Portal is evolving, discuss your institutional study with the Digital Access Committee, and combine the result with your own personal experiences and skills to create a new set of questions.

You don't want more than ten tasks in any set of usability studies. Any more than that and the tests take too long to facilitate and are difficult to evaluate. Put another way, create a list of tasks that can be studied in less than an hour.

Practice

If you opted to use computer software to help with your usability studies, then you will have to acquire it and practice with it. "Practice makes perfect," and it makes you look good when facilitating the studies.

Scheduling testers

Scheduling usability testers has got to be one of the more difficult steps. Not only does it take a long time, but it also requires a lot of coordination and selectivity. First and foremost, it is important to schedule testers who are representative of our target audience -- scholars. The Portal is intended to support research in all things Catholic. The material in the Portal is archival in nature, leans towards primary literature, and requires deep prior knowledge in order to adequately interpret. In general, this is not necessarily a system designed for undergraduates. Please make every effort to schedule faculty and graduate students working within the Portal's subject domain.

Schedule the testers for no more than an hour at a time. Thirty to forty minutes will be spent on doing the tasks. The balance of the time can be spent on discussion and elaboration.

Between four and eight testers is usually considered enough for usability studies, but schedule about ten with the idea that some will unexpectedly drop out. It is much easier to ask people not to come than it is to find people to participate at the last minute.

Facilitating the study

You've created your list of tasks. You've practiced with the optional software. Your testers are arriving. The hard parts are now behind you, and it is time to actually do a study. The process is easy. Here's how:

  1. One person facilitates the study, and another person takes notes.
  2. Thank the participant for their time.
  3. Remind the participant that they are not being tested, but rather the Portal's interface is being tested. There are no wrong answers.
  4. Emphasize to the participant the critical importance of thinking out loud. By doing so it will be much easier for you, the facilitator, to understand what is going on, and it will be easier for the note-taker to record the results.
  5. When everybody is ready, go through the tasks one by one. Try really hard not to interfere with the completion of the tasks. If the participant is really off base, then intervene, but don't do so too quickly. This part of the process is sometimes difficult to watch. Be patient. Remember, you are not being tested either, only the Portal's interface.
  6. After all the tasks have been completed, have a discussion. Ask the participant what they liked, disliked, and thought was easy or hard. Consider asking, "If you could change one thing about the interface, then what would it be?"
  7. Thank the participant for their valuable time, and don't forget to give them their compensation.
  8. After the participant has gone, you may want to discuss the study among yourselves, and it is a really good idea for the note-taker to transcribe their notes for the evaluation process.

Evaluating the results

Once all the studies have been facilitated it is time to evaluate the results. The goal of the evaluation process is to come up with a prioritized list of things you think need to be improved with the Portal's interface. The key word here is "prioritized". We are sure there are many things that can be improved, but considering limited time and resources, some things need to be more important than others.

To create the prioritized list, try this:

  1. Read the written notes, and review the optional software recordings. Based on our experience, we think it is easier to read the notes "across rather than down". In other words, we found it is easier to see patterns between the studies when we compared & contrasted the responses to each question from each participant as opposed to looking at each participant as a whole. For example, we looked at all the notes for task #1, then all the notes for task #2, then task #3, etc.
  2. Based on the notes, create a prioritized list of three to five items where each item is something you think needs to be addressed. Be prepared to cite which tester and which task demonstrates the issue you think needs to be fixed.
  3. Based on your professional opinion, create a second prioritized list of three to five items where each item is something you think needs to be fixed.
  4. Bring together your team of people -- have a meeting.
  5. Go around the room asking everybody for their prioritized items, first based on the notes and second based on professional opinion. Record everything on a whiteboard, and add tick marks to items repeated by team members.

In the end you ought to have a list of as many as a dozen issues to be addressed, and you ought to be able to sort them by the number of tick marks each received. Here at Notre Dame we had as many as five people on our team, and our list ended up looking like this:

  • 6 - set search filter to off by default
  • 5 - enable sending of more than one email at a time
  • 4 - clarify difference between canonical and remote [files]
  • 3 - remove autocomplete feature
  • 2 - re-do text mining language
  • 1 - tweak facets to be more descriptive or complete
  • 1 - retain links of original EAD file in local EAD file
  • 1 - respect my browser preferences
  • 1 - remember [search] results after creating account
  • 1 - make local EAD file the default
  • 1 - implement authority control (cross-reference) functionality
  • 1 - highlight search words in result [list]
  • 1 - explain what facets are
  • 1 - enable further search [refinements] after selecting "archival records"
  • 1 - confirm adding to favorites
  • 1 - add addresses and phone numbers to records

Based on these sorts of lists it is easy to see which items are priorities and which are not. "Librarians love lists."
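The whiteboard-and-tick-mark exercise above is easy to reproduce in code once each team member's items are written down. The sketch below is a minimal illustration, assuming the item labels have been normalized so that identical concerns match exactly; the votes shown echo a few items from our list but are otherwise made up.

```python
from collections import Counter

# Each team member's prioritized items, one list per person.
# These labels are illustrative, echoing a few items from the list above.
votes = [
    ["set search filter to off by default", "remove autocomplete feature"],
    ["set search filter to off by default", "re-do text mining language"],
    ["set search filter to off by default", "remove autocomplete feature"],
]

# Count how many team members raised each issue, then print
# the issues sorted by number of "tick marks", highest first.
tally = Counter(item for person in votes for item in person)
for item, count in tally.most_common():
    print(count, "-", item)
```

The output has the same shape as the whiteboard list above: a tick-mark count followed by the issue, sorted so the highest-priority items come first.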

Reporting on the results

This is the last step. Simply document what you discovered -- most importantly your prioritized list of issues -- and share it with the Digital Access Committee. As the results come in, a concerted effort will be made to address them as soon as feasible, or at least before the next round of usability studies to be conducted by other institutional members.

Summary and conclusion

Usability studies are an effective way of learning how software interfaces can be improved. They do not need to be expensive in terms of money, but they do require time and effort. Usability studies, like software, are never done. There are always things that can be improved. Consequently, usability studies are iterative. Implement something. Test it. Apply the results. Repeat. By working together we -- the Catholic Research Resources Alliance -- can share the load, draw from a wide variety of experiences, and ultimately create a better Portal interface. On your mark. Get set. Go!
