Engaging Qualtrics:
Creating Interactive Training Modules
Sam Dowell
Yuanjie Dai
Mike Floyd
Kaylea Algire
Principles of Instructional Design
February 17, 2013
Introduction
The Center for Instructional Technology (CIT) perceives a need for an online component to the face-to-face Qualtrics training program for several reasons. First, there is no efficient way to assess learned skills with the current training program. Second, demand is higher than can be met through face-to-face sessions alone. Third, face-to-face delivery does not allow the facilitator to adapt the training to reach learners where they are in terms of schedule, learning preferences, or location. Developing an online component will potentially address all of these needs and also allow CIT staff to reallocate resources to other projects.
Needs Analysis
CIT at JMU has developed and conducted a three-hour training program in a face-to-face environment over the past five years. For the past year, David Stoops has facilitated these trainings as part of his position with CIT. The training begins with a best-practices presentation by Dr. Peter De Michele from the Office of Institutional Research. Dave then finishes the training with a step-by-step facilitated presentation of the program. However, as demand has grown, CIT has determined that it would be more efficient and effective to add an online component to its face-to-face offerings.
Learner Analysis
Typically, a participant in the face-to-face sessions is a faculty or staff member in the JMU community. The mean age of participants is thirty-five, and most are of middle income, though there is a blend of ethnic and socioeconomic backgrounds. Both genders are equally represented, and all participants have at least a bachelor’s degree.
This training is not currently open to graduate students, but an online training would allow them to become familiar with the software if their departments negotiated accounts for them. For example, the Adult Education/Human Resource Development program has several department accounts that it allows students to use to conduct thesis research.
Target Audience
Age: 25-60
Education: Mostly bachelor’s degree holders, with some master’s and doctorate degree holders
Computer Skills: Varies
Gender: Mix of men and women
Attitudes: Required training, with paid time at the training
Experience: Mixture of experienced Qualtrics users and new users
Ethnic Background: Mostly Caucasian, with a few Asian and African American participants
Income: Median- to upper-income participants
Context Analysis
This is a required training before participants receive access to JMU’s Qualtrics license. Learners arrive with varying amounts of experience with the program. For example, a professor who came from another university that already used Qualtrics enters at an expert level; however, there is currently no way for such a person to be granted access without going through the training. Each session has a different mix of learner experience, which presents challenges for facilitation.
The target audience of each training session is faculty, staff, or graduate students who are conducting customer-service satisfaction surveys or research, offering conference registration, or conducting departmental voting. The sessions are generally held in the computer lab in CIT’s office in Rose Library. There are thirty-two computers on computer tables with chairs, and each participant has access to a computer in order to work with the program during the session. White boards line the perimeter of the room, and the desks are oriented in rows toward the front of the room, where an LCD projector is mounted on the ceiling above a screen. Dave also uses flip charts and markers to facilitate learning. The facilitator workstation is located at the front of the room, to the side. There is also an option for private face-to-face training sessions, which Dave conducts in participants’ offices or labs; these can involve a variety of available technologies and levels of technology experience.
Personas
“Susan” is a new professor at James Madison. She teaches several sections on the effect of the media on politics. She is interested in conducting a survey to test the effect of mass media and advertising on the most recent presidential election, and she wants to create a survey that she can distribute easily to JMU students, staff, and faculty as well as non-JMU members of the community. While doing her initial research, she finds out from her department head and CIT that she needs to go through training for Qualtrics before she can design her survey. Unfortunately, she is unable to attend the next training because it conflicts with one of her scheduled classes. The training after that is not being offered for a month, and the grant application that could fund her research is due in two weeks. She has to complete the training as soon as possible to start collecting data. Right now, she will have to cancel her class to attend the training.
“Mr. Jones” has been working at JMU for ten years in the Center for Multicultural and International Student Services. He was recently approached by the director of CMISS to design and offer a training program on diversity awareness for faculty and staff. In order to get a good idea of areas for improvement, existing programs, and the effectiveness of those programs, he would like to send his colleagues a survey that is easily accessible and efficient. From his director, he finds out that he must participate in the Qualtrics training session at CIT. However, Mr. Jones travels for his position and is unable to adjust his schedule to attend the scheduled sessions. When he contacts CIT to schedule a one-on-one meeting, he is informed that the facilitator cannot meet him for another two months. Mr. Jones must have an initial lesson plan prepared and ready for his director by the end of the month. Right now, he will not be able to meet his deadline.
Instructional Goal
Users will be able to
take more control of their learning experience after the successful completion
of the online Qualtrics modules. After
learner self-sufficiency, efficient reallocation of CIT resources is the next
major goal for this instructional design project.
Content Analysis
This conceptual framework shows the five main instructional units and their connection to the successful completion of survey creation using Qualtrics. With these five units, students will be able to achieve the main instructional goal.
Behaviors/Content
· Best practices for closed- and open-ended questions
· Access to Qualtrics
· Create a new survey
· Understanding Look and Feel options
· Understanding Survey Options
· Create five basic question formats
· Skip logic
· Editing options
· Launch survey
· Collecting results
· Analyzing results
Content Checking/SME Information
Elaine Roberts and David Stoops will act as the subject matter experts (SMEs) for this project. Elaine Roberts is a recent AHRD graduate and currently works at CIT as a Learning Technologies Support Coordinator. In her current position, Elaine coordinates student employees and supports faculty development; she is acting as an advisor for the instructional design process of this project. David Stoops is an Educational Technology Consultant. He currently facilitates all of the Qualtrics face-to-face sessions and house-call sessions, and he is our primary SME on this project.
User Interface
Need to know: Open Qualtrics, Login, Tasks, Folders, Create New Survey, Where to Save, Name
Nice to know: Use existing survey

Macro Knowledge
Need to know: Look and Feel, Templates, Survey Options
Nice to know: Collaboration, My Survey Options

Question Creation
Need to know: Create questions (1. Multiple Choice, 2. Static Content, 3. Matrix Table (Likert), 4. Text Entry, 5. Special Questions), Layout, Skip Logic, Editing, Preview
Nice to know: Advanced Logic Options

Launch
Need to know: Publish, Share
Nice to know: Collaborate

Results
Need to know: Data Analysis, View Reports, Responses, Download Data, Stats Section
Nice to know: Cross Tabulation
Rationale
Skills listed in the nice-to-know column represent advanced Qualtrics skill sets. It would be nice if participants were familiar with these skills; however, it is not necessary for them to master them to complete this training program. These skills may be mentioned where applicable to the content area, but participants who are interested will be referred to the Qualtrics University website to learn more. Furthermore, the SMEs noted that the most critical piece of this project right now is the question-creation piece. Once the prototype of this unit is complete, the others can be explored in more depth.
Performance Agreement
User Interface Skills
Objective: Recognize and describe key features of the Qualtrics interface in order to design an effective survey.
Evaluation task: Participants will show successful completion of this task by identifying where and how to change the look and feel of their Qualtrics survey. They will have the opportunity to practice the skill with software-simulation activities within each module before a short assessment.

Macro Knowledge and Launching Skills
Objective: Illustrate knowledge of “survey options” by preparing and launching a survey and submitting the password-protected link to the instructor.
Evaluation task: The final product of a successfully designed survey will be the main evaluation piece for these two skill sets. Students must be able to understand how the software works (macro knowledge) and also apply that knowledge by launching the final product. Each section will also have practice and assessment pieces.

Question Creation
Objective: Demonstrate understanding of Qualtrics by composing a survey that includes five distinct question types, as outlined in the session: 1) Multiple Choice, 2) Static Content, 3) Matrix Table (Likert), 4) Text Entry, 5) Special Question (user’s choice).
Evaluation task: This module will be broken into chapters, each covering a different question type and its creation. Students will practice each question-creation piece before a final assessment. Participants will also complete skip logic and editing as part of this unit. Each of these skills will be assessed with a short quiz at the end of each module.

Results Skills
Objective: Locate and identify the tools needed to collect and view results.
Evaluation task: Participants will complete a simulation of locating and viewing results for successful completion of this unit.
Objective: Demonstrate understanding of results tools in Qualtrics by interpreting compiled results.
Evaluation task: Participants will complete a simulation of locating and viewing results for successful completion of this unit.
Instructional Module
After discussing the format and target audience with the SMEs, the decision was reached that Articulate will allow the modules to be distributed across viewing platforms, protect the integrity of the design, and allow CIT to build simple and easily replicated software simulations. Each module can be followed through to completion by the novice, or only the final assessments can be taken by the expert. Time for completion depends on the participant: ideally, up-front content knowledge will last between ten and fifteen minutes, following adult learning theory, and each of these segments will be followed by software simulations that can be completed within ten to fifteen minutes, depending on the participant. The following mockups are sample designs for the modules.
Figure 1: Basic Layout of Modules and Sitemap
Figure 2: Sample Layout for Question Creation Module
Figure 3: Sample Layout for User Interface Module