Heuristic Evaluation Report

Client: University Information Technology Services, Indiana University Bloomington
Website: http://itaccounts.iu.edu/
Contact: Youn-Kyung Lim
Consultants: Jenny Brown, Laura Brunetti, Vince Diaz, Xi Zhu
Date: October 4, 2007

Introduction

A usability test of the Indiana University Account Management Service (AMS) was conducted during the week of September 17, 2007, and a heuristic evaluation was conducted the following week. This report presents the findings from the heuristic evaluation and compares the two assessments.

The Account Management Service website is primarily intended for Indiana University students, faculty, and staff, as well as guests, to create and manage their computing accounts and passphrases.

For the heuristic evaluation, four reviewers first met to agree on the set of accepted guidelines ("heuristics") to be used, which were developed by Jakob Nielsen [1]. Each reviewer then spent one to two hours using the site and evaluating it against the heuristics. The reviewers then met for a debriefing session, during which the problems that were found were prioritized. In addition, the reviewers compared the heuristic evaluation with the usability test that was conducted the week prior.

Detailed Findings

For the usability test, problems were classified according to the standard set forth in a document adapted from a handout by Randolph Bias for the course Information Architecture and Usability Studies at the School of Information, University of Texas at Austin (http://www.ischool.utexas.edu). Please see the Appendix for the descriptions and definitions of the classifications. The same criteria were used to prioritize the problems found during the heuristic evaluation.

The usability test uncovered four problems, one major and three minor. The heuristic evaluation found those same four problems, but also discovered 29 more: one critical, 25 moderate, and three minor. The following tables contain the detailed findings, grouped by the heuristic used to evaluate them.
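The problem counts above can be tallied in a short sketch. The numbers are those stated in this report; the grouping into two dictionaries is only illustrative.

```python
# Problems found by the usability test (all four were also found
# by the heuristic evaluation).
usability_test = {"major": 1, "minor": 3}

# Additional problems found only by the heuristic evaluation.
heuristic_only = {"critical": 1, "moderate": 25, "minor": 3}

shared = sum(usability_test.values())   # problems found by both methods
new = sum(heuristic_only.values())      # problems unique to the heuristic evaluation
total_heuristic = shared + new          # all problems the heuristic evaluation found

print(shared, new, total_heuristic)
```

This makes the comparison concrete: the heuristic evaluation surfaced 33 problems in total, versus four for the usability test.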
Visibility of system status
The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.

Table 1: System Status Visibility

Severity | Issue | Present in
Major    | The system is not well known or easy to find; it is difficult to initiate a session with AMS when you don't have the URL. | Usability Test, Heuristic Evaluation
Minor    | The indication of what page a user is on, or has already visited, is not clear enough. Currently, when a link is selected, a small triangle pointer appears next to the link, but the links remain the same color; users who miss the triangle indicator will not know what page they are on. The indicator actually disappears on three pages. | Usability Test, Heuristic Evaluation

Match between system and the real world
The system should speak the users' language, with words, phrases and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.

Table 2: System vs. Real World

Severity | Issue | Present in
Moderate | When creating a new account, the wording users must agree to is rather vague: users must agree to "further the university's mission," but that mission is never explained, nor is it explained how it can be furthered. | Heuristic Evaluation
Moderate | When setting up passphrase questions, users are asked to "register" the questions, when they are really picking a question and answering it. The word "register" is not the users' language. | Heuristic Evaluation

User control and freedom
Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.

Table 3: User Control and Freedom

Severity | Issue | Present in
Moderate | When creating a new account, there is no way to return to the home page. | Heuristic Evaluation
Moderate | When creating a new account, any time the "Cancel" button is pressed, it goes to the "Help" screen. | Heuristic Evaluation
Moderate | When creating a new account and it is discovered that an account already exists, the "Exit" button does nothing. | Heuristic Evaluation
Moderate | When creating a new account and it is discovered that an account already exists, the error message gives no instructions for how to correct the problem; it also states "Your user name is" but then does not list the user name. | Heuristic Evaluation
Moderate | When enrolling to be able to reset the passphrase, the "Exit" button does nothing. | Heuristic Evaluation
Moderate | When setting up the passphrase questions, if you don't enroll, you must visit a support center in person to change the passphrase; if this is done, it cannot be undone. | Heuristic Evaluation
Moderate | When changing a current passphrase, the only way to return to the previous screen is to close the additional browser window. | Heuristic Evaluation
Moderate | In the "Help" screen, the "Close" button works in Internet Explorer but not in Mozilla. | Heuristic Evaluation

Consistency and standards
Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.

Table 4: Consistency & Standards

Severity | Issue | Present in
Moderate | The indicator of what page you are on disappears when you log in to create email, when you manage email, and at the Guest AMS homepage. | Heuristic Evaluation
Moderate | The "Help" button appears only on the Create New Accounts page. | Heuristic Evaluation
Moderate | On most pages the navigation bar is bold, but it becomes unbold on the faculty/staff pages and when setting up passphrases. | Heuristic Evaluation
Moderate | Most links change the page to new content in the same layout, with the navigation bar on the left, but three pages (Help, Manage Passphrase, and Manage Guest Accounts) open new windows that do not share that layout. | Heuristic Evaluation
Moderate | In the Change Current Passphrase screen, only the last link at the bottom is italicized. | Heuristic Evaluation
Moderate | In View Accounts, there is no link to Help, whereas all other pages provide one. | Heuristic Evaluation
Moderate | When setting the primary email, there is no Submit button. | Heuristic Evaluation
Moderate | In the Passphrase Maintenance screen, the "Clear/Start over" button clears only information added after the "Submit" button is pressed and an error message is received. | Heuristic Evaluation

Error prevention
Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action.

Table 5: Error Prevention

Severity | Issue | Present in
Critical | When asked to provide a student ID, users are given a tool to find the ID based on name, birthdate, and Social Security Number. This tool was unable to find a valid IU student. | Heuristic Evaluation
Moderate | When setting up the passphrase questions, if you don't enroll, you must visit a support center in person to change the passphrase. It is too easy to miss the warning about that and accidentally enroll. | Heuristic Evaluation

Recognition rather than recall
Minimize the user's memory load by making objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.

Table 6: Recognition vs. Recall

Severity | Issue | Present in
Moderate | When creating a new account, users are told to refer to a webpage, but only the URL is given; it is not a live link. | Heuristic Evaluation
Moderate | The tool for finding a student ID (based on name, birthdate, and Social Security Number) opens in a new window. When the ID is found through the tool, users must go back to the previous window and type it in; the tool does not pre-fill the field. | Heuristic Evaluation

Flexibility and efficiency of use
Accelerators -- unseen by the novice user -- may often speed up the interaction for the expert user such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.

Table 7: Flexibility and Efficiency

Severity | Issue | Present in
Minor    | The navigation links in the upper left corner are redundant when users are already at the page listed in the link. For example, when users are on the home page, the "home" link is active, but pressing it does nothing. This occurs with all the navigation links in that section. | Usability Test, Heuristic Evaluation
Moderate | When creating new accounts, only one can be created at a time; there is no way to create multiple accounts at once. | Heuristic Evaluation

Aesthetic and minimalist design
Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.

Table 8: Aesthetic & Minimalist Design

Severity | Issue | Present in
Minor | The aesthetics and overall experience of the website are not engaging. | Usability Test, Heuristic Evaluation
Minor | Several pages repeat the same content. | Heuristic Evaluation
Minor | When managing accounts, the faculty/staff services are listed on all screens, though they are not appropriate for all users. | Heuristic Evaluation
Minor | When managing accounts, the "Forward Email" and "Set Primary Email" buttons end up at the same page; likewise, when creating a new account, the "Cancel" and "Help" buttons go to the same page. | Heuristic Evaluation

Help users recognize, diagnose, and recover from errors
Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.

Table 9: Error Recognition, Diagnosis, & Recovery

Severity | Issue | Present in
Moderate | Throughout the site, many error messages provide easy recognition, and some help diagnose the problem, but few support recovery. | Heuristic Evaluation

Help and documentation
Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.

Table 10: Help & Documentation

Severity | Issue | Present in
Moderate | The Help page provides contact information only; it is not focused on user tasks. | Heuristic Evaluation
Moderate | Likewise, the Help page does not offer concrete resolutions. | Heuristic Evaluation
Moderate | There is no built-in search function. | Heuristic Evaluation

Analysis of the Two Methods

The heuristic evaluation uncovered a significantly larger number of problems than the usability test. This may be because the number of reviewers was larger than the number of usability test subjects, and because each reviewer spent more time using the system than the test subjects did. In addition, the reviewers had a set of guidelines to help them look for problems, while the test subjects were focused on completing tasks. Additionally, according to a paper by Hanna Yehuda and Jennifer McGinn [2], heuristic evaluations are less costly: there is no need for a lab in which to perform the tests, or to pay recruitment fees or incentives.

However, while more problems may have been found by the heuristic evaluation, that method has two disadvantages. First, the validity of the guidelines developed by Nielsen has been questioned, and other guidelines do exist [3]. Second, finding many usability experts can be problematic.

Usability testing, meanwhile, has its own drawbacks:
• It requires more preparation
• The think-aloud protocol used during testing is unnatural
• The testing takes place in a controlled environment and is therefore not realistic

However, usability testing is not without its merits. It provides the point of view of non-experts, who are the most likely users of the system; it allows facilitators to concentrate on specific areas; it is standardized, so the results can easily be reproduced; and, while it is not realistic, it provides a closer approximation of individual usage than heuristics do.
In a paper comparing usability testing and heuristic evaluations, Robin Jeffries and Heather Desurvire [4] assert that "a usability test identifies problems that will plague the actual users of the application." While our heuristic evaluation found a significantly larger number of problems than the usability test, we cannot say for certain that all of the problems we cite will be considered problems by users. The same article also explains that software developers are more likely to be swayed by input from users than from "so-called experts"; developers give more credibility to someone who actually uses the product on a regular basis: "Developers may doubt that a problem in the user-interface exists, but when they see the user actually experience that problem in the laboratory, they change their minds quickly."

Reflection

Our team experienced varying degrees of difficulty with both user testing and heuristic evaluation. We have learned from these difficulties by reflecting on the process, and we have come up with ways to ease the difficulties we experienced with the two methods. We found user testing to be more difficult than heuristic evaluation.


User testing required a much larger body of preparatory work. We had to first decide which user group to focus on, and then find users who fit that profile and were willing to test the website; this also involved synchronizing many people's schedules to arrange the test sessions. In addition to recruiting, a substantial amount of paperwork had to be created, and the test required setting up a webcam and two different software applications to record the sessions.

We found the most difficult tasks to be recruiting and scheduling users who fit our user group, creating good scenarios, and writing good debriefing questions. To recruit the right users, and more than just three of them, we would begin finding users earlier in the process; we would have liked more time for this. Recruiting more users would allow us to test all the user groups the website was designed for, and with more users, the scenarios and tasks could cover all features of the website. The final issue we will address in our next user test is the amount of feedback the debriefing questions prompted: in our first test, many were answered with a simple yes or no. We would like debriefing questions that prompt users to give more descriptive answers.

Our team experienced a few difficulties while conducting the heuristic evaluation, but they were not as severe as those experienced during user testing. We found it difficult to be comprehensive in detecting problems with the website.
This difficulty could be addressed with more practice using the heuristic evaluation method, and possibly by spending more time on the evaluation of the site. A harder difficulty to address was personal bias: we found that, as individuals, we are predisposed to find certain problems more than others, and we also found ourselves biased toward the problems we had encountered in user testing. To address this, we would conduct the heuristic evaluation before user testing. The final difficulty concerned the nature of the problems we found: would they actually be problems for users? This can be addressed by comparing our findings with the problems found in user testing.

Overall, user testing proved to be more difficult than heuristic evaluation. In the future we would like to conduct more thorough user testing with more users, followed by debriefing questions that prompt in-depth responses. We would conduct our heuristic evaluation before the user test to avoid gaining a bias toward certain problems, and we will use our past experience to be as comprehensive in our heuristic evaluations as possible. The members of Team 5 will use this reflection to conduct better user tests and heuristic evaluations in the future.

References

[1] Nielsen, J. (1994). Heuristic evaluation. In Nielsen, J., and Mack, R.L. (Eds.), Usability Inspection Methods. John Wiley & Sons, New York, NY.

[2] Yehuda, H., and McGinn, J. (2007). Experience report: Coming to terms: comparing and combining the results of multiple evaluators performing heuristic evaluation. In CHI '07 Extended Abstracts on Human Factors in Computing Systems.

[3] Axup, J. (2002). Comparison of Usability Evaluation Methods. http://userdesign.com/usability_uem.html

[4] Jeffries, R., and Desurvire, H. (1992). Usability testing vs. heuristic evaluation: was there a contest? ACM SIGCHI Bulletin, 24(4).


Appendix

Table 11: Randolph Bias Error Severity Classification

Severity | Definition
Critical | The identified issue is so severe that the user will not be able to complete the task, and may not want to continue using the website.
Major    | Users can accomplish the task, but only with considerable frustration and/or the performance of unnecessary steps. The user will have great difficulty circumventing the problem; users can overcome the issue after they have been shown how.
Moderate | The user will be able to complete the task in most cases, but will have to make some moderate effort to get around the problem. They may need to investigate several links or pathways through the system to determine which option will accomplish the intended task. Users will most likely remember how to perform the task on subsequent encounters with the system.
Minor    | The problem occurs only intermittently and can be circumvented easily, but is irritating. It could also be a cosmetic problem.
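For prioritization during a debriefing session, the severity classification above can be expressed as a simple ordering. This is only a sketch: the numeric ranks and the sample findings are our own illustration, not part of the Bias handout.

```python
# Severity levels from the Randolph Bias classification, most to least severe.
# The numeric ranks are illustrative and serve only to sort findings for triage.
SEVERITY_RANK = {"Critical": 0, "Major": 1, "Moderate": 2, "Minor": 3}

# A few sample findings (severity, short description) from this report.
findings = [
    ("Minor", "Redundant navigation links on the current page"),
    ("Critical", "Student ID lookup tool fails to find a valid student"),
    ("Moderate", "'Cancel' button opens the Help screen"),
]

# Sort so the most severe findings are addressed first.
triage = sorted(findings, key=lambda f: SEVERITY_RANK[f[0]])
print([severity for severity, _ in triage])
```

Sorting by a shared rank table like this keeps the prioritization consistent across reviewers, which is how the four of us merged our individual problem lists during the debriefing.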

