Gateways 2018 has ended
Wednesday, September 26, 2018 • 11:15am - 11:35am CDT
Morning-a-month Website Usability Testing for a Materials Science Gateway


Science gateways, which offer access to scientific tools and large amounts of data through web portals and other applications, are often run by the research groups that produce those tools and data. As these gateways proliferate, usability becomes a key concern. Gateway developers want users to interact with their gateways easily and efficiently, yet design choices are typically made by members of the research groups themselves and can be ad hoc, biased, and generally unpredictable. Ensuring that users can make full use of a gateway therefore calls for a usability study.

The resources needed for a large-scale usability study are significant. Ideal usability testing would compare multiple versions of a user interface to determine which version users prefer. Beyond the time and effort needed to produce these versions, many participants must be recruited, scripts must be written, and a large amount of time must be dedicated to the testing sessions themselves.

While the benefits of large-scale usability testing are obvious, research groups often have neither the user numbers nor the rate of development that would warrant such an effort, and they frequently lack the resources to fund and staff one.

When starting a usability testing initiative at the Materials Project, our approach was to make usability testing as easy as possible while still obtaining valuable information. We adopted a "one morning a month" model and held three testing sessions over the course of three months, with 10 participants in total. Each session followed a script to keep the participant experience consistent; scripts were slightly tweaked between sessions to gauge the impact of the script itself on user behavior. Each session was streamed live to observers in another room and recorded for later review, and observations were written down immediately after each session ended. With only 10 participants and a minimal budget, we were able to draw conclusions about web design and user behavior specific to our portal. We hope these conclusions, along with our notes on the testing process itself, prove useful to developers of other science gateways.

Balcones Room, Commons Conference Center 10100 Burnet Road, Bldg 137, Austin, TX 78758
