NOTE: Because of a confidentiality agreement with Inteum,
I cannot show images of the actual product, but every
other aspect of the usability evaluation is detailed here.
Introduction
This project required a usability evaluation of the Inteum Inventor
Portal.
The Inventor Portal is a product of Inteum, a company that works
primarily in the field of Technology Transfer.
What is Technology Transfer?
Technology Transfer is the process of sharing skills, knowledge,
scientific findings, processes, methods, samples, and facilities
among governments and/or universities so they can be distributed
to users who can use them to create products, processes,
materials, applications, and/or services.
Technology Transfer is used to convert ideas, concepts, designs, and other
early-stage inventions into commercial products that can be distributed,
marketed, and sold to consumers.
The process of Tech Transfer takes a product to market by utilizing all factors that
are legally relevant to commercialization.
Meeting and Understanding the Client's Objectives
The company we worked with for this project was Inteum, which
develops industry-leading intellectual property management
software. Their clients include technology transfer offices at over
400 installations in 26 countries around the globe,
including universities, research institutions, federal labs,
corporations, and startups.
The team had their first meeting with their point of contact at Inteum,
where they were given access to the portal (including one admin
account and five inventor accounts).
The product from Inteum that we conducted our usability
evaluation on was the Inventor Portal, which allows users to
complete the following tasks:
1. Create a new disclosure with details about their new invention.
2. Edit their disclosure before submitting it for approval.
USER FLOW MAP (Creating a Disclosure)
Project Objectives
1. Identify obstacles that users may face while creating and
editing a disclosure. These obstacles are only centered on the
interface and the overall usability of the portal.
2. Identify if a user would recommend the portal to a potential
inventor who needs a patent to make their product marketable.
3. Identify what recommendation they would give to their own Tech Transfer
Office (TTO) based on their experience with the Inventor Portal.
Research Questions
1. How easily and successfully can users create a disclosure?
2. How easily and successfully can users edit an existing disclosure
according to the remarks they received?
3. What obstacles do users encounter while creating or editing a
disclosure?
1. What questions do users have while filing a disclosure?
2. What questions do users have about the overall design of
the portal? Do they find it easy to navigate?
3. Do users have difficulty finding the remarks from the TTO?
4. We gathered the following qualitative data during the course of
our evaluation:
1. Whether the Inventor Portal was easier or more difficult to
use than the user initially expected
2. Real-time observations or initial opinions that the user has
about the Inventor Portal from comments made using the
Think-Aloud protocol while completing tasks
5. At the end of the study we also had the following quantitative
data:
1. User satisfaction scores from the System Usability Scale
(SUS) questionnaire
2. Success and failure rates for different tasks and subtasks
Methodology
Aspects of the Inventor Portal We Evaluated
The goal of the evaluation was to learn about the experience of
inventors in using the system’s core disclosure creation functions. To
do this, we created two test scenarios. Scenario A had the participants
work through the entire disclosure creation process. They began on
the dashboard of the Inventor Portal, created a new invention, gave
it a title, moved through each section of the form that makes up the
disclosure editing portion of the web app, and finally submitted the
disclosure to the TTO. This scenario allowed us to evaluate the ease
of use and intuitiveness of the entire disclosure creation process.
Because the larger disclosure process may also involve using the
Inventor Portal to communicate back and forth with the TTO to get
more information or make clarifications, we also included a short
scenario for editing disclosures. Scenario B evaluated the
discoverability of remarks left by Inventor Portal administrator users
(who would typically be employees of a TTO). Participants were
then asked to make the requested changes to the disclosure and
resubmit it.
Dates of Study
Our evaluation took place over a two-week period from
April 6th to April 20th. Our pilot study took place on April 6th, and
the remaining evaluations took place between April 11th and 20th. The specific dates
on which evaluations took place were April 11th, 12th, 16th, and 20th.
Each evaluation took around one hour total.
Evaluation Environment
All evaluations took place at RIT in Golisano rooms 2293 and 2289.
Each evaluation took place in the evaluation room, where one
moderator sat with the participant. The observation room was
behind a two-way mirror, and there were always one to three
observers in the observation room.
Participants
In total we conducted this evaluation with nine participants:
six graduate students, one TTO staff member, one professor, and one
undergraduate student. Each participant had varying levels of
familiarity with the concept of Tech Transfer.
Tasks and Scenarios
We evaluated the Inventor Portal through two scenarios: creating a
new disclosure and editing an existing disclosure based on a remark
left by a member of their university’s TTO. The task and subtasks for
each scenario are as follows:
Scenario A-Create a Disclosure
Scenario A involved the participant creating a disclosure on the
Inventor Portal. All information needed to fill out the disclosure was
provided in an information sheet. The Task List below details each
step/subtask.
The moderator read the script for Scenario A (see Appendix A), then
gave the participant their task materials, including a task
breakdown/description (Appendix C) and the information sheet
(see Appendix B).
Task List: Create a disclosure for your project
Subtask 1: Select the right type of disclosure.
Subtask 2: Upload the correct document.
Subtask 3: Add a subscriber.
Subtask 4: Add an inventor.
Subtask 5: Add an interest.
Subtask 6: Add a marketing target.
Subtask 7: Add funding.
Subtask 8: Submit the disclosure for review.
Scenario B-Edit a Disclosure (10 minutes)
Scenario B involved the participant editing an existing disclosure
based on a remark made by the administrator of the portal. The
remark clearly detailed what needed to be edited in the disclosure.
The moderator read the script for Scenario B (see Appendix A) and also
gave a copy of the task breakdown/description to the participant to
refer back to if needed (see Appendix C).
Task List: Edit an existing disclosure for a project
Subtask 1: Find the disclosure that was returned to the inventor.
Subtask 2: Find comments made by the admin.
Subtask 3: Edit the disclosure according to the comments.
Subtask 4: Submit the new version of the disclosure.
Test Design Matrix
For our evaluation we used a within-subjects design, which means
that each participant attempted both scenarios. The order of the
scenarios was counterbalanced to minimize any learning effects
that could potentially influence performance in the second
scenario. Below in Table 2 you can see the order in which scenarios
were given to each participant.
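As an illustration of the counterbalancing, the sketch below shows one simple way the two scenario orders can be alternated across participants. The participant IDs and the assignment logic are illustrative only; the actual assignments appear in Table 2.

```python
# Minimal sketch of counterbalancing in a within-subjects design:
# alternate participants between the two possible scenario orders so
# that any learning effect is spread evenly across Scenario A and B.
# (Participant IDs are hypothetical.)

SCENARIO_ORDERS = [("A", "B"), ("B", "A")]

def assign_orders(participant_ids):
    """Alternate the two scenario orders across participants."""
    return {
        pid: SCENARIO_ORDERS[i % len(SCENARIO_ORDERS)]
        for i, pid in enumerate(participant_ids)
    }

if __name__ == "__main__":
    print(assign_orders([f"P{n}" for n in range(1, 10)]))
    # e.g. P1: ('A', 'B'), P2: ('B', 'A'), P3: ('A', 'B'), ...
```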
Measurements Taken
For each evaluation, several measurements were taken, including a
standardized usability questionnaire known as the System
Usability Scale (SUS), task success and failure rates, and the
participant’s opinions about their interactions with the Inventor Portal.
According to usability.gov, “The System Usability Scale (SUS)
provides a “quick and dirty”, reliable tool for measuring the
usability. It consists of a 10 item questionnaire with five response
options for respondents; from Strongly agree to Strongly disagree.
Originally created by John Brooke in 1986, it allows you to
evaluate a wide variety of products and services, including
hardware, software, mobile devices, websites and applications.”
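For reference, the standard SUS scoring rules convert the ten responses into a single 0-100 score: each odd-numbered item contributes its response minus one, each even-numbered item contributes five minus its response, and the sum is multiplied by 2.5. A minimal Python sketch of that calculation is shown below; the example responses are illustrative, not data from this study.

```python
def sus_score(responses):
    """Compute a single participant's SUS score (0-100).

    `responses` is a list of ten integers from 1 (strongly disagree)
    to 5 (strongly agree), in questionnaire order.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd-numbered (positively worded) items: response - 1
        # Even-numbered (negatively worded) items: 5 - response
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Example with made-up responses (not actual study data):
print(sus_score([4, 2, 4, 3, 3, 2, 4, 3, 3, 2]))  # 65.0
```

The study-wide score is simply the mean of the individual participant scores.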
Task success was defined as successfully completing a subtask using
data provided in the Information Sheet (see Appendix B). What the
participant entered for each section did not have to match what was
in the Information Sheet exactly, as long as they demonstrated an
understanding of how to input information into each section of the
portal.
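The sketch below shows one way per-subtask success rates can be tallied from pass/fail observations. The participant IDs, subtask names, and outcomes in the example are placeholders, not the results reported in Tables 3 and 4.

```python
from collections import defaultdict

def success_rates(observations):
    """observations: iterable of (participant, subtask, passed) tuples."""
    passed = defaultdict(int)
    attempted = defaultdict(int)
    for _participant, subtask, ok in observations:
        attempted[subtask] += 1
        passed[subtask] += int(ok)
    return {task: passed[task] / attempted[task] for task in attempted}

# Placeholder data for illustration only:
sample = [
    ("P1", "Add an inventor", True),
    ("P2", "Add an inventor", False),
    ("P1", "Add funding", True),
]
print(success_rates(sample))  # {'Add an inventor': 0.5, 'Add funding': 1.0}
```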
At the end of the post-evaluation questionnaire we asked
participants if they would recommend this system to their university’s
TTO and why or why not. We asked these questions to assess the
participants’ opinions regarding the portal and its ease of use.
Performance during the evaluation does not always reflect the
impression participants are left with after using the system.
Deviations from Test Plan
The information sheet was changed to specify which type of
disclosure the participant was required to enter. Some
participants had selected disclosure types that existed in the test
environment but weren’t intended to be used in this evaluation.
The name of the account holder from which the participant would
complete the task was added to the moderator script. This made
certain elements of the tasks clearer, as the current user’s name
appears a few times during the process, and participants needed to
know that was supposed to be them.
The participant was given some time to review the information sheet
before the task officially started.
The moderator script also specified when to open the browser tab
integral to each task and scenario.
A pre-task checklist was created to make sure everything was in
place.
The definition of Tech Transfer was added to the moderator’s script
so it could easily be communicated to the participant in the event
they wanted information on it before the task.
The moderator script also included a prompt explaining what a
disclosure was to the participant.
We decided to not count the number of errors participants made for
each task/subtask. Instead we focused on identifying the most
significant obstacles participants faced.
Findings
Quantitative Results:
The task success and failure rates were recorded and calculated for each
subtask in Scenario A and Scenario B (see Table 3 and Table 4).
Task Success/Failure Rate
Scenario A: Create a disclosure
SUS (System Usability Scale)
We used the standardized System Usability Scale (SUS)
questionnaire (see Appendix D) as a post-evaluation questionnaire.
The Inventor Portal’s score is 65.3, which suggests participants
found it somewhat difficult to use: a SUS score above 68 is
considered above average, and anything below 68 is below
average in terms of user satisfaction.
We also asked a question at the end to assess whether a participant
would be willing to recommend this system to their university’s TTO.
Qualitative Results:
Participant Feedback:
We also asked the participants about their opinions of the Inventor
Portal in the post-evaluation questionnaire.
The positive feedback indicates that:
the search function is convenient
the layout of the website is consistent and straightforward
The negative feedback is listed below:
There are a lot of fields to fill in and the process is tedious, such as
adding inventors individually.
It is tricky to enter contributions and significance while not knowing
all the limitations.
The use of the different terms “Remark” and “Comment” for the same
meaning is confusing.
Comments made by the TTO are not distinguished from those made by
team members, which made them difficult to find.
Usability Problems Identified
Creating a new disclosure
Participants were confused about what ‘significance’ meant and made
several assumptions about it (Heuristic 2, Internal Consistency). In
addition, participants also had issues dividing the level of
contribution among all the inventors (contributors) for an invention
(Heuristic 5, Frequently Used Functions Optimize).
Participants assumed that multiple fields (e.g. contributors, marketing
targets) could be added and saved at the same time (Heuristic 5,
Frequently Used Functions Optimize; Heuristic 12, User Control and
Freedom).
The error notification “Total contribution cannot be greater than
100” appeared even when the contribution input was below 100
(Heuristic 7, Perceptibility of Feedback).
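We did not diagnose the cause of the false error, but the hypothetical sketch below illustrates the behavior participants expected: the warning appears only when the running total genuinely exceeds 100%. This is not the portal's actual validation code.

```python
# Hypothetical sketch of a contribution check that only raises the error
# when the total would genuinely exceed 100%. Illustrative only; not the
# Inventor Portal's implementation.

def contribution_error(existing_contributions, new_contribution):
    """Return an error message only when the total would exceed 100%."""
    total = sum(existing_contributions) + new_contribution
    if total > 100:
        return f"Total contribution cannot be greater than 100 (got {total})"
    return None

print(contribution_error([60, 30], 10))  # None: exactly 100 is allowed
print(contribution_error([60, 30], 20))  # error: total would be 110
```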
Some of the terms on the form seemed to cause confusion. Users
came up with varying definitions for terms like Significance, Interests,
and Marketing Targets (Speak the User’s Language).
Editing a disclosure
Some participants did not find the remark made by the admin easily.
Many expected a message from the TTO to be highlighted or
otherwise made to stand out from other elements of the disclosure.
One participant was confused over whether the comment made on
an existing disclosure was the same as the remark that an inventor
could make while creating a new disclosure.
Design Recommendations
We recommend adjusting the interaction for adding items to forms
so that users can add multiple items at a time. For simpler items like
Interests, a checkbox list or list builder design pattern may work well.
(see this website for more information about that design pattern:
For adding Inventors, many participants expected an Inventor
to be added to the list when clicking “choose.” If the list of Inventors
were visible and editable while searching for and adding Inventors,
either by placing the list in the pop-up or by pulling the “Add
Inventor” form out of the pop-up and onto the main form area, the
design could accommodate adding multiple users at once. It would also
make it easier to manage the contribution percentage, since users
would be able to see when the total exceeds 100% and correct it
without having to leave the “add inventor” form.
Some participants also expected percentages to dynamically
change to accommodate what they entered, or for the system to
allow them to set all Inventors to automatically have equal
contribution.
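As an illustration of that expectation, the hypothetical sketch below splits 100% evenly across all inventors and rounds so the percentages still sum to exactly 100. This is not a feature of the current portal; it only sketches the behavior participants described.

```python
# Hypothetical "split contribution equally" helper: divide 100% across all
# inventors and distribute any leftover percentage points so the values
# still sum to exactly 100.

def equal_contributions(num_inventors):
    base = 100 // num_inventors
    remainder = 100 - base * num_inventors
    # Give the leftover percentage points to the first few inventors.
    return [base + 1 if i < remainder else base for i in range(num_inventors)]

print(equal_contributions(3))  # [34, 33, 33]
print(equal_contributions(4))  # [25, 25, 25, 25]
```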
We noticed during the evaluation sessions that participants were
often confused by terms used in the form, or they interpreted the
meaning of a term differently than the system intended. Adding
documentation that provides brief definitions for each section and
term that has a system-specific meaning may help users map the
various terms and headings to their function in the disclosure process.
When searching for a comment made by the TTO in Scenario B,
many users scanned quickly through the entire disclosure, expecting
something to pop out at them. Once they realized there weren’t any
highlighted messages, they didn’t have much trouble finding the
remarks. However, in a real situation where a disclosure may have a
lot more information in it, users may have a difficult time discovering
all changes made by admins, as they blend into the rest of the form.
We recommend adding callouts or highlighting changes made by
other users, especially changes that were made by admins before
setting a disclosure back to draft. This will make communication
between Inventors and the TTO more discoverable and efficient.
Further Research
We found that most of the Inventor Portal’s interactions were easy for
participants to discover and understand. The obstacles uncovered
were mostly related to processes that took a few more steps than
participants were expecting, or with participants being confused by
a lack of explanation or feedback. The next research steps we
recommend would involve testing prototypes of potential changes to
the design to address the usability problems we found. Testing early
prototypes will reveal which design choices will be most helpful in
improving the usability of the inventor portal.
Final Report
The team then compiled the data generated from the individual
evaluations into a final report.