Monthly Archives: September 2011


Konigi is a rather different UX blog compared to UX Booth, which I recommended before; to be precise, it is different from most UX blogs. Written by Michael Angeles, it is a blog sharing showcases, tools, and knowledge with UX practitioners.

As mentioned in many comments on this blog, it is a fantastic website for checking out inspiring designs and useful tools. I took a quick tour around the website and found things definitely invaluable for my future use. For example, you can find tools designed by Konigi itself, such as Graph Paper and OmniGraffle Wireframe Stencils, as well as other popular tools recommended by Konigi. For instance, I found Usaura, a website running free 5-second tests, quite interesting. And by the way, the graphic interface design of Konigi itself is a good showcase for UX designers: great alignment, great hierarchy and grouping, and information that is simple and easy to find. Another section of Konigi that appeals to me a lot is the Wiki section, where you can find fruitful lists of UX-related information, including terminology, deliverables, and even UX job resources.

Well, in the end, if you still insist on finding recent topics about UX design, as you can in UX Booth, Konigi does have its blog articles under the “Blog” section. I will say it is definitely a website worth checking out from time to time.

Major hierarchy of Konigi: Home, Blog, Showcases, Tools, and Wiki


Recommended tools by Konigi



RAA: Customize your heuristics?

Paper discussed:

Kientz, J.A., Choe, E.K., Birch, B., Maharaj, R., Fonville, A., Glasson, C. & Mundt, J. (2010) Heuristic evaluation of persuasive health technologies. Proceedings of the 1st ACM International Health Informatics Symposium (IHI ’10), 555-564. doi: 10.1145/1882992.1883084.

1. Purpose of the research:

Develop a set of 10 heuristics intended to find problems in persuasive technologies, and compare them with Nielsen’s heuristics to see whether specifically designed heuristics could be more helpful for evaluating persuasive technologies.

2. Methods:

2.1 How to Define Heuristics

The research group first reviewed the related literature and compiled a master list of all usability guidelines and heuristics for persuasive technologies. Then they narrowed down the list by combining similar guidelines, prioritizing them, and discussing them in a process similar to affinity diagramming, and ultimately came up with 10 heuristics. The list of 10 heuristics enables evaluators to focus on the most important aspects and also allows researchers to make a comparison with Nielsen’s 10 heuristics.

The 10 heuristics are as follows (you can find an explanation of each one in the paper):

Appropriate Functionality; Not Irritating or Embarrassing; Protect Users’ Privacy; Use of Positive Motivation Strategies; Usable and Aesthetically Appealing Design; Accuracy of Information; Appropriate Time and Place; Visibility of User’s Status; Customizability; and Educate Users.

As you might see, these heuristics have some overlap with Nielsen’s. This was intentional and necessary, because Nielsen’s list reflects fundamental usability principles.

2.2 How to Conduct Evaluation

The researchers chose two web-based applications to evaluate: Mindbloom and MyPyramid Blast Off. The former is a website designed to track the progress of users’ life goals, including health goals, while the latter is an online game aiming to educate children about healthy food choices. These two examples of persuasive technologies were chosen because they could be accessed easily by any evaluator at any place with an internet connection.

The researchers also recruited 10 evaluators, among whom were graduate students in HCI-related programs, one game designer, and one web coordinator. They were then randomly assigned to two groups, experimental and control, which evaluated the applications using the new heuristics and Nielsen’s heuristics, respectively. The evaluation process basically followed Nielsen’s instructions; however, the researchers assigned the severity ratings afterwards, instead of the evaluators.

3. Main Findings:

There were several interesting findings.

The researchers claimed that the new heuristics could discover more severe issues, discover severe issues more frequently, and find more issues that are useful for improving the persuasive aspects of the evaluated interface.

What’s more interesting to me is that they found the first two heuristics in the list had the highest number of issues attributed to them. This phenomenon was consistent in both the experimental and control groups. This finding suggests that the order of heuristics given to evaluators might influence their findings. Thus we could intentionally randomize the order, or place the heuristics in order of importance, to get better data.
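The randomization idea is trivial to implement. Here is a minimal sketch in Python (the per-evaluator seeding is my own assumption for reproducibility, not something described in the paper):

```python
import random

# The ten heuristics from the paper, in the authors' published order.
heuristics = [
    "Appropriate Functionality",
    "Not Irritating or Embarrassing",
    "Protect Users' Privacy",
    "Use of Positive Motivation Strategies",
    "Usable and Aesthetically Appealing Design",
    "Accuracy of Information",
    "Appropriate Time and Place",
    "Visibility of User's Status",
    "Customizability",
    "Educate Users",
]

def order_for_evaluator(evaluator_id):
    """Return an independently shuffled copy of the heuristics for one
    evaluator, so no single heuristic always appears first."""
    rng = random.Random(evaluator_id)  # fixed seed per evaluator
    shuffled = heuristics[:]
    rng.shuffle(shuffled)
    return shuffled
```

Each evaluator sees the same ten items, just in a different order, so order effects average out across the group.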

4. Analysis:

I will give this paper 3 out of 5 points. Though the findings are somewhat interesting, and valuable for the evaluation of persuasive technologies, I doubt its scientific significance for the following reasons:

First, as the authors stated at the beginning of the paper, designing specialized heuristics was already a trend in the usability evaluation field. What this paper did was duplicate the research procedure of similar papers and apply it to the evaluation of persuasive technologies.

What’s more, I saw too much manipulation by the researchers in the experiment. For example, since the recruited evaluators were not expert-level (average self-rated experience = 2.3, where 1 = no experience and 4 = very experienced), the researchers had to re-group the issues found during evaluation and assign the severity ratings themselves. Without any verbalized comments from an observer (see also Nielsen’s instructions), it was hard to tell how much subjective opinion was added during this process, which might skew the final conclusions.

Generally, I don’t doubt the overall conclusion of this paper, and I agree that specialized heuristics could facilitate the evaluation of specific applications better than Nielsen’s heuristics. However, for a research paper analyzed with statistics, I would like to see more rigid control of the experiment.

All About Designs — Equation Editor in Word

As we discussed before, a good design should take a user-centered approach; that is, don’t make your users think, and let them use the product nice and easy. A bad design does the opposite:

Equation Editor in Microsoft Word

What I am using is not the newest version of Microsoft Word (the 2008 version for Mac), and I have noticed that newer versions have improved on this somewhat, but it is still worth talking about as a case study.

Say I was working on my document and needed to insert an equation. The first thing to do is find where the function is. I started, naturally, with the Insert menu, but could not find an option called “Equation”, which was the search goal in my mind. Then I googled it; thanks to the many identical questions posted there, I learned it should be under “Object”. That doesn’t make much sense to me, since anything inserted could count as an “object”. I am not even sure I could remember where to find it next time.

Next, I clicked the equation option under “Object”, typed in my equation, and was ready to insert it. But I was scared when I found no button saying anything like “Insert it!”. What was I supposed to do now? Copy and paste the equation from this editor window, look for other buttons in the toolbar, or just close it? Would I lose the equation I had typed if I closed it? With so many question marks in my head, I decided to try my luck and just close it to see what would happen. To my surprise, the equation was in the right place in the document, waiting for me! After all this confusion and surprise, only one question was left: why not just give me a simple “Insert” button?

Equation editor view

According to J. Nielsen’s ten usability heuristics, there are some design principles this design violates.

First, visibility of system status. An “Insert” button with immediate feedback (the equation appearing in the text) would let the user know what is going on. Even better, a display that updates in the document in real time would let users know they won’t lose anything if they close the editor, since the equation is inserted as they type it.

Second, recognition rather than recall. That is, make things visible! The designer should make the function of inserting equations visible to users, rather than testing their imagination and asking them to remember the path if they want to use the function again.

RAA: Care more about intended users rather than general public

RAA stands for: Research Article Analysis

Paper discussed:

Das, A., Faxvaag, A., & Svanas, D. (2011). Interaction design for cancer patients: do we need to take into account the effects of illness and medication? Proceedings of the 2011 annual Conference on Human Factors in Computing Systems, 21-24. doi: 10.1145/1978942.1978946

1. Purpose of the research:

Examine cancer patients’ ability to use a patient-centered information system, and their needs for the system, in order to improve the usability of the next version of the system.

2. Methods:

One way to accomplish the research goal is to establish whether there are significant differences in task performance between particular patient groups and average computer users. The authors proposed the hypothesis that a group of cancer patients would have significantly more difficulties using a web-based healthcare system compared to a control group of healthy individuals.

The study was set up as an observational case-control study with an experiment in which cancer patients and healthy controls were observed while they conducted tasks using a web-based healthcare system. Semi-structured interviews and questionnaires were used afterwards to collect additional information. The whole process was captured on video and analyzed to evaluate usability based on the ISO definition of usability (effectiveness, efficiency, and satisfaction).

3. Main Findings:

Effectiveness: the cancer patients experienced more difficulties than the healthy controls on the entire task and on all five of its subtasks (lower completion rates). Efficiency: the measured “time on task” was not very useful, because cancer patients quickly gave up and were given assistance while healthy controls quickly finished the tasks, which led to similar times on task. Satisfaction: the SUS score might be affected by the cancer patients’ motivation to use the system.
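For readers unfamiliar with the SUS score mentioned above: the System Usability Scale is a standard ten-item questionnaire, and its scoring rule is well known even though the paper does not spell it out. A minimal sketch of the standard computation:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items are positively worded (contribute response - 1);
    even-numbered items are negatively worded (contribute 5 - response).
    The sum is scaled by 2.5 to give a score from 0 to 100.
    """
    if len(responses) != 10:
        raise ValueError("SUS uses exactly 10 items")
    total = 0
    for i, r in enumerate(responses):
        if i % 2 == 0:          # items 1, 3, 5, 7, 9 (positive wording)
            total += r - 1
        else:                   # items 2, 4, 6, 8, 10 (negative wording)
            total += 5 - r
    return total * 2.5

print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # best possible: 100.0
```

Note that the resulting 0-100 number is not a percentage, which is one reason a single SUS figure is easy to over-interpret when, as here, motivation differs between groups.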

In conclusion, the authors claimed that effectiveness is the main issue faced by cancer patients, due to their impaired physical and cognitive abilities. A patient-centered system should be designed with the intended users in mind, rather than the average general public.

4. Analysis:

I will give this paper 4 out of 5 points. Four points for its novel and sympathetic consideration of the design of patient information systems, as well as its clearly conveyed experiments. One point is lost for the relatively small sample size and the lack of in-depth suggestions about the specific improvements that should be made. For future study, research could be carried out to examine the exact problems that cancer patients face. Possible approaches include eye-tracking experiments and the think-aloud method, so that both objective and subjective descriptions could be gained.

A Good Research Tool: ZOTERO

ZOTERO is a free and easy-to-use tool to collect, organize, cite, and share research sources. Combined with Firefox, it is very convenient to use while we are searching for different resources.

I recommend it because it is very well-designed software for users — easy to install, easy to learn, and easy to use. I will not go into detail about how to use ZOTERO, because you can pick it up really quickly. One key feature that might facilitate our classroom study is the “Group” function. It allows us to set up a group to share research sources among its members. For instance, we could build a folder named “RAA” below the root folder to share the information and PDF files of the RAA articles we choose. For each article, people could also add notes below it, conveying their personal opinions. In this way, all members could have a clear archive of all sources and avoid wasting time on duplicated work such as downloading. I found it very useful in my past research for sharing articles within my lab or among project team members, and hopefully ZOTERO can help with yours too.

Finally, below is how it looks in my Firefox:

Fig. 1 A screenshot of a website with a hidden ZOTERO.

Fig. 2 A screenshot of a website with an expanded ZOTERO.


UX Booth is one of my favorite blogs about user experience. The blog is run by a group of people called the UX community. It delivers informative articles and resources on usability, interaction design, and user experience. The blog is updated regularly and has a readership composed mainly of beginning-to-intermediate user experience and interaction designers (people just like us).

If you take a look at this blog, it is itself a pretty nice, user-friendly website (of course, it has to be!). The map of the blog is easy to grasp (very good visibility!). As the blog consists mainly of articles discussing and sharing different aspects of user experience, it provides a clear archive of related topics, which you can find either at the top or the bottom of the page. Other forms of information, such as video recordings, podcasts, tools and books they recommend, or even get-togethers they organize, are archived under “Resources”.

Overall, I think this blog is pretty cool. It has things you might already know about, and more you might not have heard of. This is good for us as learners: you can reflect on your own experience, and you can learn new things as well. You can also join the community and contribute as a guest writer. When I subscribed to this blog in Google Reader, it told me that 13,693 readers had already subscribed (don’t leave yourself out!), and that the update frequency is 1.4 posts per week. Check out the latest post, “Personas: putting the focus back on the user”. Isn’t it appealing to you?