A Web Site Usability Study
by Larry Lilly for the Mountain Plains Regional Resource Center (MPRRC) in conjunction with WebAIM (Web Accessibility in Mind) at Utah State University.
Contents:
■Executive Summary
■Methods
▷Dates
▷Participants
▷Procedure
▷Test Environment
▷Performance Measures
■Test Results and Discussion
▷Task Completion Data
▷Participant Commentary
▷W3C Web Content Accessibility Guidelines
■Recommendations
■Appendices
▷Appendix A: MPRRC Web Site Test Participant Questionnaire
▷Appendix B: MPRRC Web Site Nondisclosure and Tape Consent Form
▷Appendix C: MPRRC Web Site Orientation Script
▷Appendix D: MPRRC Web Site Task Scenarios
▷Appendix E: MPRRC Web Site Data Sheet
▷Appendix F: MPRRC Web Site Post Test Questionnaire
Executive Summary
This report presents a comprehensive analysis of the usability testing performed on the MPRRC Web site. Its purpose is to identify accessibility and navigational concerns with the Web site discovered through testing with visually impaired individuals.
The Methods section outlines the general information and procedures used in conducting the test. For the most part, the test followed the original plan. However, the number of participants was reduced from four to two.
The Results and Discussion section presents the findings. Even with just two participants, valuable data was gathered. Much of the data was qualitative rather than quantitative; that is, I obtained more information from what the participants thought of the site than from statistical measures such as average times or button clicks. These results are then discussed in conjunction with the Web Content Accessibility Guidelines produced by the W3C.
The next section contains the Recommendations. Based on the test results in combination with the Web Content Accessibility Guidelines, I recommend the addition of a site map. I also recommend a link that skips the initial navigation buttons found on each page.
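As a sketch of the second recommendation, such a link is typically placed at the very top of each page's body so that a screen reader such as JAWS encounters it before the navigation buttons. The anchor name below is illustrative, not taken from the MPRRC site.

```html
<!-- Skip-navigation link: illustrative sketch only; the anchor
     name "content" is an assumption, not from the MPRRC site. -->
<body>
  <a href="#content">Skip navigation buttons</a>
  <!-- ... navigation buttons appear here ... -->
  <a name="content"></a>
  <!-- ... main page content begins here ... -->
</body>
```

A screen reader user who follows the link jumps directly to the main content instead of hearing every navigation button read aloud on every page.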
Finally, I recommend continued usability testing on the Web site. Not only would MPRRC benefit from the information obtained, but so would the WebAIM project at the Center for People with Disabilities. I wish to thank them, particularly Paul Bohman, for absorbing this usability test into WebAIM’s similar research, thus allowing me access to their equipment, resources, and pool of participants.
Included in the Appendix at the end of this report are copies of all test forms used in this usability study.
Methods
This exploratory usability test gathered extensive data through direct observation of test participants. The purpose was to determine the accessibility of the MPRRC Web site to individuals with visual impairments. The test accomplished this through a series of tasks designed to identify problem areas within the site. These problem areas were exposed through performance data and participant comments. The remainder of this section details the test methodology.
Dates
The usability tests were conducted on April 19 and April 26, 2000.
Participants
A total of two participants were tested. They were chosen according to the following criteria. They have:
▷a debilitating visual impairment.
▷computer experience and use software designed for the visually impaired.
▷experience using a Windows operating system.
▷experience using the Internet.
The participants were recruited with the help of Paul Bohman, technical coordinator of WebAIM, located at the Center for People with Disabilities. One participant required JAWS, a screen reader program that reads the contents of the screen aloud. The other used ZoomText, a program that enlarges the information on the screen.
Both participants have extensive experience with the Internet, indicating on the usability test participant questionnaire (found in the appendix) that they each have well over two years of experience. Presently, both use the Internet every day, and one stated that he uses it an average of three to four hours a day. When asked to rank their ability to use the
Internet on a scale of one to five, with one being a “beginner” and five being “very competent,” both participants rated themselves a four.
Procedure
Greeting
I, Larry Lilly, greeted the participants as they entered the computer lab in the CPD building. They were seated at a computer station and given a short oral questionnaire designed to gather basic background information. (See the Test Participant Questionnaire located in the appendix.)
After the questionnaire was completed, the issue of confidentiality was discussed. Each testing session was initially to be videotaped, but because of scheduling problems, only the second participant's session was taped. Each participant was asked to sign a nondisclosure statement. (See the Nondisclosure and Tape Consent Form located in the appendix.) At that time I began videotaping the session.
Orientation
The participants next received a short, verbal, scripted introduction and orientation to the test. This explained the purpose and objective of the test, the reasons not to discuss the test until all testing was complete, and what was expected of them. (See the Orientation Script located in the appendix.)
The participants were assured that the Web site, not they themselves, was the subject of the evaluation, and that they should perform the tasks as if they were alone, with no one observing them.
Performance Test
The performance test involved a series of tasks each participant performed while being observed. The test consisted of five tasks. (See the task descriptions in the table on page six or the Task Scenarios located in the appendix.) Each task involved finding a specific page within the Web site and orally indicating that it had been completed successfully.
The participant’s efforts in finding the requested information were observed. Notes and observations were recorded on a data sheet. (See the Data Sheet located in the appendix.) The participants were also encouraged to vocally express their thoughts as they progressed through each task. At the completion of each task the participant was given another task until all five were completed.
During each task, elapsed time and errors were noted. The test monitor, Larry Lilly, also took notes concerning participant behaviors, comments, and any other circumstances that might affect the results, such as the screen reader failing to read the site contents.
Participant Debriefing
After all the tasks were complete, the test monitor debriefed each participant. The debriefing session included a short questionnaire asking the participant to answer specific questions pertaining to the test. (See the Post Test Questionnaire located in the appendix.)
After the questionnaire was completed, the monitor asked the participant to comment on his performance and offer any observations. Finally, the test monitor asked specific questions relating to observations made during the test.
Test Environment
The test attempted to simulate the environment in which the visually impaired participants would normally use the Internet. More important were the participants' software requirements for accessing the Internet.
Access to the CPD computer room addressed both concerns: the participants were familiar with the environment (both had been involved in previous testing at this location), and the room provided the specific software each participant needed.
Performance Measures
The following performance measures were collected and calculated:
▷The percentage of tasks successfully completed.
▷The reasons for task failure.
▷The observations of participants concerning the usability of the site.
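The first measure above is a simple ratio. As a minimal sketch of the calculation (the task outcomes below are illustrative placeholders, not actual data from this study), it could be computed as:

```python
# Completion-rate calculation for the first performance measure.
# The outcomes below are illustrative placeholders, not study data.
task_results = [True, True, False, True, False]  # success/failure per task

completed = sum(task_results)                    # count of successful tasks
percentage = 100.0 * completed / len(task_results)
print(f"{completed} of {len(task_results)} tasks completed ({percentage:.0f}%)")
```

With five tasks per participant, each failure lowers the rate by twenty percentage points, which is one reason qualitative commentary carried more weight than this statistic in a two-person study.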