Friday, April 19, 2013

Program Evaluation and Improvement (Yuanjie and Kanita)




Program evaluation video





Program evaluation is an essential organizational practice in social programs. However, it is not practiced consistently across program areas, nor is it sufficiently well integrated into the day-to-day management of most programs.

Effective program evaluation is a systematic way to improve and account for public health and social service actions by involving procedures that are useful, feasible, ethical, and accurate.

The CDC provides a framework composed of six steps that must be taken into account in any evaluation.

Step 1: Engage stakeholders

Step 2: Describe the program

Step 3: Focus the evaluation design

Step 4: Gather credible evidence

Step 5: Justify conclusions

Step 6: Ensure use and share lessons learned

Stakeholders must be engaged in the inquiry to ensure that their perspectives are understood. When stakeholders are not engaged, an evaluation might not address important elements of a program’s objectives, operations, and outcomes.

Program descriptions convey the mission and objectives of the program being evaluated. Descriptions should be sufficiently detailed to ensure understanding of program goals and strategies.

The evaluation must be focused to assess the issues of greatest concern to stakeholders while using time and resources as efficiently as possible

An evaluation should strive to collect information that will convey a well-rounded picture of the program so that the information is seen as credible by the evaluation's primary users.

The evaluation conclusions are justified when they are linked to the evidence gathered and judged against agreed-upon values or standards set by the stakeholders.

Deliberate effort is needed to ensure that the evaluation processes and findings are used and disseminated appropriately.

 

One of the most significant benefits of a program evaluation is that it communicates the need to make improvements.

Examples of the types of improvements that an evaluation may reveal could include:

·         The elimination of program services and activities that do not achieve program outcomes

·         The addition of other services and activities that are better designed to achieve outcomes

·         Acquiring more adequate resources to support program services and activities

·         Targeting a different group of participants to receive program services because too few of the specified participants are available to receive services

These and/or other types of program improvements that are revealed by the evaluation findings should be viewed and communicated as an opportunity to make the program better.


How have you used the CDC framework in your line of work to evaluate something?

What other frameworks do you find to be more effective than this particular framework?

What are some possible ways in which you can engage stakeholders?

What important factors can you take away from the YouTube video linked above?




Reference:

Posavac, E. J., & Carey, R. G. (1980). Program evaluation: Methods and case studies. Englewood Cliffs, NJ: Prentice-Hall.

Eddy, D. M. (1998). Performance measurement: Problems and solutions. Health Affairs, 17(4), 7-25.

Thursday, April 4, 2013

Why Should We Beta Test?

By: Katie Horst

What is beta testing?
Beta testing is when a product is tested by real customers in their real environments before it is released. During a beta test, the company and the customers form a partnership in order to help the product reach its full potential.

More about beta testing:


Why should we beta test?
The overall purpose of conducting a beta test is to create the best possible product. By running a beta test, the company receives feedback and gains more knowledge about its product before it is ever released. This way, improvements can be made before the product is available to the public. Beta tests can help develop a product with improved customer satisfaction, better reviews, increased sales, fewer problems and lower support costs, a more positive brand image, etc. It's essential to do a beta test in order to make sure that you are delivering a product that consumers actually want. Beta tests also provide an opportunity for customers to get involved and help improve a product that they're interested in.

I found it interesting that companies such as Sony, with PlayStation, have programs that allow the public to participate in their beta testing. Here are the terms and conditions a person must follow: http://us.playstation.com/support/useragreements/ps_public_beta_test_program.html. PlayStation gives people early access to certain things like games. Those people are then required to provide feedback about things such as the chat rooms, connectivity, and usability of the product.

Doing a beta test:
Click here for a great article if you want to know more about implementing a beta test. The article looks at why a beta test should be done. It also provides general guidelines, the planning process, the evaluation process, and other helpful tips about doing a beta test.

Questions:
1)      Have you ever participated in a beta test, or have you ever implemented one? 
2)     How did you go about conducting a beta test for your project?
3)     By doing a beta test, do you think this will better your product?
4)     From your personal experiences, why do you think we should beta test?


Saturday, March 30, 2013

Why Do We Evaluate Instruction?


Evaluation is an integral part of the instructional design process. The purpose of evaluation after instruction is to provide feedback for improvement. Forming and utilizing an effective evaluation approach allows for continuous improvement in training system quality. Evaluations are set in place to help determine the value of the learning program. By using assessment and validation tools, data is provided to help in the evaluative process. Overall, the purposes of the evaluation phase of ADDIE are as follows:
1. Feedback
2. Control
3. Research
4. Intervention
5. Power Games

Feedback revolves around how well the learning materials and course content link to the objectives set forth at the beginning.

Control and Research are the processes of connecting the material learned in the training program to its real-world application.

Intervention is when the results from the evaluative process influence change in the course program or its delivery.

Lastly, power games, which are not always prevalent, are an area of manipulation. This is the part of evaluation where results are skewed to serve organizational politics.

The steps to evaluation include:

1. Plan the evaluation
2. Determine the evaluation design
3. Develop the instruments
4. Collect data
5. Analyze the data (see the sketch below)
6. Report the findings
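
To make steps 4 through 6 concrete, here is a minimal sketch in Python of collecting paired pre/post test scores, analyzing the gain, and reporting the findings. The learner IDs and scores are hypothetical, purely for illustration.

# Steps 4-6 of the evaluation process, sketched with made-up data.
pre_scores  = {"A01": 55, "A02": 70, "A03": 60}   # step 4: scores before training
post_scores = {"A01": 85, "A02": 90, "A03": 80}   # step 4: scores after training

# Step 5: analyze the data -- each learner's gain and the mean gain.
gains = {pid: post_scores[pid] - pre_scores[pid] for pid in pre_scores}
mean_gain = sum(gains.values()) / len(gains)

# Step 6: report the findings.
for pid, gain in sorted(gains.items()):
    print(f"Learner {pid}: +{gain} points")
print(f"Mean improvement: {mean_gain:.1f} points")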

While evaluation may seem like one overarching principle, there are three types of evaluation: formative, summative, and confirmative.

Formative evaluation is continuous evaluation throughout the content. One example would be checkpoints throughout a training program that change the learner's path based on responses to questions.

Summative evaluation is a check of knowledge and understanding at the end of a training program that evaluates the learner. Think of the root of the word, sum-: a total measure of understanding. One example of summative evaluation would be a test at the end of a training course to check for understanding.

Lastly, confirmative evaluation takes place after a period of time has elapsed between the end of a training program and the evaluation. An example is retesting the learner six months after the training to check that learning occurred and was retained from the coursework.

Watch this YouTube video for further understanding of Evaluation types.



Questions:
1. When was a time that you had to evaluate instruction and how did you do so?
2. How would you apply the 3 types of evaluation to your Instructional Design Project?
3. What is your feeling on Power Games? Do you think evaluation measurements should be changed based on organizational politics?
4. Evaluation relates to Kirkpatrick's 4 Levels: Reaction, Learning, Behavior, and Results. Which levels match with the types of evaluation (formative, summative, confirmative)?









References


Bramley, P., & Newby, A. C. (1984). The evaluation of training part I: Clarifying the concept. Journal of European & Industrial Training, 8(6), 10-16.

Foxon, M. (1989). Evaluation of training and development programs: A review of the literature. Australian Journal of Educational Technology, 5(2), 89-104. http://www.ascilite.org.au/ajet/ajet5/foxon.html


http://www.learningsolutionsmag.com/articles/530/

Friday, March 22, 2013

Integrating New Technologies



This Rethinking Learning video from the MacArthur Foundation presents several thought-provoking comments on getting 21st-century students out of the 20th-century classroom, and some innovative examples of technology integration in education.

We are all familiar with the annual Horizon Report, "designed to identify and describe emerging technologies likely to have an impact on learning, teaching, and creative inquiry in higher education." In this report, key trends and technologies are profiled in terms of projected time-to-adoption, ranging from one to five years. So, as educational technologists, the question isn't what new technologies are coming, or even necessarily when they will become mainstream, but how we should integrate these new technologies to best serve teaching and learning.

The constant proliferation of new tech options affords learners a never-ending opportunity to interact with peers, instructors, and course content in novel and engaging ways. At the same time, the pace of technological innovation is so rapid that it is no longer possible for instructors to maintain real-time awareness of the latest options and to thoughtfully, creatively, and successfully apply them to educational contexts. And that's okay. In the words of instructional technologist Andrew Marcinek, "Educators should not pace education at the same pace at which technology moves. It is far too fast, and too sudden." Surveying the technology environment, understanding the educational context, and meeting learner and instructor needs at their current level of ability remain key when investigating a potential educational technology solution.

For reflection:
  1. In terms of new technology integration in the classroom, what is your experience, either as instructor or student? 
  2. Have you ever been required to use a "tech solution" that didn't match your needs? What happened? 
  3. As discussed in the MacArthur Foundation video, how can we more actively link informal, technology-rich learning outside of school with formal learning inside school?

References

Boss, S. (2011, Sep. 7). Technology Integration: What Experts Say. Retrieved March 22, 2013 at
http://www.edutopia.org/technology-integration-experts

Edutopia staff. (2007, Nov. 5). Why Do We Need Technology Integration? Retrieved March 22, 2013 at http://www.edutopia.org/technology-integration-guide-importance

Johnson, L., Adams, S., and Cummins, M. (2012). The NMC Horizon Report: 2012 Higher Education Edition. Austin, Texas: The New Media  Consortium. Retrieved March 22, 2013 at
http://www.nmc.org/publications/horizon-report-2012-higher-ed-edition

MacArthur Foundation. (2010). Rethinking Learning: The 21st Century Learner. [web video]. Retrieved March 22, 2013 at http://www.nmc.org/publications/horizon-report-2012-higher-ed-edition

Monday, March 18, 2013

Friday, March 15, 2013

VMRC Prototypes

Below is the link to our prototype for the Power Tools and Fire Safety training modules for the Virginia Mennonite Retirement Community:

http://firesafetyandpowertoolstraining.weebly.com/


Alex Lee Jones, Sarah Peachey, and Shana Ryman


LFCC Math Tutorials- Alicia, Tara, Katie

Here is our prototype for the LFCC Math Center project. Our project involves creating math tutorials for students to review before taking a math placement test. We are still working on our quizzes and supplemental material for our client. Please review and leave your comments!




Lord Fairfax Community College Academic Center for Excellence Math Tutoring Videos

Tara Cassidy, Alicia Grasso, Katie Horst  

March 14th, 2013


Module 1.1: Fractions: Defining the Basics

Objectives:
LFCC students taking Module 1.1 will be able to apply basic math concepts to fractions. After completing Module 1.1, the learner will be able to:
  • Define fractions and mixed numbers
  • Identify the parts of a fraction
  • Recognize when a fraction is in simplest form
  • Discriminate between a proper and improper fraction
  • Recognize the difference between dividing by zero and dividing into zero.

The proposed module is a streaming video tutorial that will prepare LFCC students to successfully apply college math concepts without a calculator when they take the math placement test. The core content in lesson 1.1 is basic college math vocabulary and concepts: fractions in simplest form; proper and improper fractions; mixed numbers; division by zero; division into zero.
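
A quick worked illustration of that last distinction, since it trips up many learners: dividing into zero yields zero, while dividing by zero is undefined.

0 ÷ 5 = 0 (dividing into zero: zero split five ways is still zero)
5 ÷ 0 is undefined (dividing by zero: no number multiplied by 0 gives 5)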

The abstract math information will be paced for the novice learner and made concrete by presentation in video format, incorporating a student resource person (LFCC Math Consultant Mr. Jeremiah Dyke) into the video, offering a text transcription of Mr. Dyke's instruction, and providing a supplemental glossary of math terms to accompany the video tutorial.
*Screen shot from video that introduces LFCC Math Consultant, Mr. Jeremiah Dyke


After each math concept is introduced and verbally explained by Mr. Dyke while he visually represents the concepts on his whiteboard, the video will pause and present the learner with an embedded quiz question to check for comprehension. This approach will enable comfortable self-pacing and avoid overloading the novice by breaking the larger set of introductory concepts and vocabulary into smaller, more manageable pieces of information.
*Screenshot of the student view of Mr. Dyke’s visual representation of a concept



The learner will not proceed through the tutorial without correctly answering the assessment questions for each concept, in order as presented, for a given module. Learning will be measured by a comparison of pre-module and post-module scores on a basic fraction vocabulary quiz. Learners will not advance to the next module until the post-module score equals 100% correct.
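
As an illustration of that mastery rule, here is a minimal sketch in Python of a gate that withholds the next module until the post-module score is 100%. The questions and answer key are hypothetical stand-ins for the module 1.1 items.

# A mastery gate: the learner repeats the module until every answer is correct.
quiz = [
    ("What is 0/1?", "0"),
    ("In 12/6, what is the numerator?", "12"),
]

def score(responses):
    """Fraction of responses that match the answer key."""
    correct = sum(given == answer for (_, answer), given in zip(quiz, responses))
    return correct / len(quiz)

def may_advance(responses):
    # Advance only on a perfect post-module score.
    return score(responses) == 1.0

print(may_advance(["0", "6"]))   # False: one wrong answer, so the learner repeats
print(may_advance(["0", "12"]))  # True: 100% correct, next module unlocks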
Self Check:

0/1 = ?

a.   1
b.   0
c.   0.1

Representation of an embedded quiz question for module 1.1

Self Check:

12/6

What is the numerator in this fraction?

a.   12
b.   6
c.   2

Representation of an embedded quiz question for module 1.1



The tutorial and self-check quiz questions will be self-paced in order to accommodate LFCC students with widely varying levels of math ability who wish to prepare effectively for the math placement test. Learners can engage the tutorial either as a simple refresher or as a first exposure to these math concepts, accessing this online module at any time and moving through the tutorial at their own comfort level.

Materials Needed For this Module:
  • A computer or web-enabled mobile device
  • Reliable Internet connection
  • Headphones (if completing the module in a public space)

Amanda, James and Kelly Prototype

You can find our prototype materials at http://ahrd610.weebly.com. There you will find two images (a slide and a clipboard). To download the storyboard of our training program, click on the slide, and to download the follow-up quiz, click on the clipboard.

Our prototype does not yet include videos and benchmark quizzes that we plan to incorporate into our training program later on to make it more concrete. Please give us feedback on the materials provided so we can improve!

Thursday, March 14, 2013

Chad, Lauren, Kanita Prototype


We will be using Storyline to create our prototype. This is a rough draft, with a PowerPoint showing the flow of one of three trainings. The other two training courses, not shown, include a course on setting up and using Blackboard as well as a course on what is required of online faculty.

What core content will you include in your lesson?

The core content includes:
1.     Facts about LFCC
2.     HR logistics
3.     Email set-up
4.     Syllabus creation
5.     Understanding how to use Blackboard
6.     Expectations of online faculty

How will you assess learning?

The course assessments will vary, ranging from practice scenarios to proper writing techniques and process assessments. Furthermore, when faculty properly complete what is required of them without error, it will show understanding and retention of skills and knowledge. There will also be an assessment in accordance with policies.

How will you make the information concrete?

For the information about the campus, a short video and activity will make the information concrete. HR logistics will be handled in the course. The email set-up section will walk learners through the process, with a link that allows them to go and do it directly afterward. In terms of syllabus building and knowing what is to be included, there will be an activity to choose the incorrect syllabus.

How will you control the step size?
We will break steps down accordingly and make sure they are learned before moving forward; steps will be grouped according to topic area.

How will you handle pacing?
This is self-paced learning. The learner can start and stop the training program as they please. The course will build upon itself and not progress until the mini-assessments throughout the course have been passed. We do not want learners to race through the entire course without proper retention and then have to repeat it.

How will you address cognitive load?
With activities and checkpoints we will help to ease cognitive load, along with printable employee process aids.

What format will you use?
We will use online training modules that are self-paced and completed individually.





This is a rough draft of what the course look and feel will consist of, using the branding guidelines provided by LFCC.

Qualtrics Modules Prototype




Friday, March 1, 2013

How can we motivate learners? Sam Dowell and James Goldberg

We started talking about motivation in class last night, and I wanted to use this blog to present several abstract ways of thinking about motivation that renowned theorists are making cases for. There is no right and wrong way to motivate learners, just different ways that are either more or less effective. The goal is to stimulate your minds and give you some new concepts to consider.

Daniel Pink has an interesting theory about motivation that rejects the traditional reliance on extrinsic rewards when divergent thinking is involved. Evidence of this is shown in the candle problem he discusses. Instead, Pink claims that motivation is driven by autonomy, mastery, and purpose. For those who haven't read his book Drive, or would like to brush up on the content, I have attached a link to a TED talk he gave on motivation. He's a really good speaker, and listening to him helped me gain a better understanding.

The following YouTube video was created by Sir Ken Robinson, who advocates changing education paradigms to improve motivation.


He claims that our current school system is too standardized and that this stunts students' motivation. Two quick points he makes are worth highlighting. First, we are trying to motivate our kids more and more through medication (Adderall, Ritalin, etc.); this manufactured motivation turns what should be aesthetic learning into anaesthetic learning, which defeats the whole purpose of divergent learning. Second, schools are too standardized, which stifles divergent learning. Our educational system is too factory-based: students are grouped by age, which is not a very good indicator of learning ability, everyone is taught the same thing, and there is normally only one correct answer. He claims that the key to motivating learners is to allow them to think divergently and break away from the standardized norms of education.

I came across one last video that is very short but provides an interesting concept that I found to be true in my specific case. Derek Sivers claims that if you keep goals to yourself, you are more likely to achieve them. Conversely, if you tell your friends your goals, they are less likely to happen. At first this concept sounds like baloney, but after thinking back on it, I found that when I told my friends about different short-term goals I was trying to achieve, I normally didn't end up achieving them. Take a look at this video so that you can discuss your opinion on it in the critical thinking section below:

Critical Thinking:

I have provided you with a couple of different "left-field" theories, so to speak, each backed by research studies up to a point. I have come up with a few questions below that are based on the blog posting and are geared to help establish your perception of each one.

1. Give an example of how you could incorporate a candle-problem-type task into learning, and then explain whether you think just having a problem like that in place is enough to motivate and engage the learners, or whether you think there is a lot more to it.

2. Do you think the structure and standardization of today's school system has a big impact on motivation? What changes can you think of to improve this?

3. Do you agree or disagree with Sivers' concept? If YES, how do you think an individual goal differs from the goals in the mission, vision, goals, and values section of your company? Would you change the way your company's goals were presented at all? If NO, why not?

Friday, February 22, 2013

How Do We Design for a Global Audience? (Kelly & Kyla)



In class last night, we spent some time discussing the different theories that can be applied to instructional design. Gauging your audience and its needs is an important aspect of choosing which theories to align your instruction with. Looking at things like learning style, demographics, experience, and ability is important when deciding how to structure your learning programs. But what if your audience is made up of learners from somewhere on the other side of the planet? One of the first steps in the design process is to determine the best way to cater to your audience. On a global scale, this is especially important, because people of different nationalities and cultures not only learn in different ways, but have different norms that should be considered when a designer is creating instruction for them. This means being culturally aware. According to the Centre for Cultural Diversity in Ageing (2010), cultural awareness "entails an understanding of how a person's culture may inform their values, behaviour, beliefs and basic assumptions." Being culturally aware is one of the fundamental requirements when designing instruction on a global level.

How culturally aware are you? Take this quiz and find out! http://www.kwintessential.co.uk/resources/culture-test-1.html


Quesenbery and Szuc (2012) believe that when designing for a global audience the core techniques of design do not vary. Nevertheless, they suggest additional techniques: plan for globalization, decide on your global strategy, get the language right, and create a good local experience (Quesenbery & Szuc, 2012). Planning for globalization includes researching the culture, the locations, and similar products (or instruction) that have been implemented before. Researching this information will identify areas that need adjusting, whether it is the text, layout, procedures, or context. Understanding what makes the audience different, yet the same, will start to align the concepts within the instruction and allow the product (or instruction) to be distributed throughout all desired markets.

Deciding on your global strategy is the next step in designing for a global audience. Having a clear goal for the project will give the project direction. In most cases companies start off with a goal directed at one sector and later broaden it; however, some companies do start out with a broader view that allows them to avoid focusing too heavily on one sector (Quesenbery & Szuc, 2012). After the goal is set, selecting the approach for a global brand is recommended. Three approaches are suggested: one product, local control, or a global template with local variations. One product allows little deviation in how the product is used or understood; this design is for a primary market and for products that are mainly hardware-based. Under local control, local companies make their own design decisions about the layout and content of the site, and have the most control. The final approach is a global template with local variations: creating a single design for all markets while allowing variations for local needs is the core of this approach, which tries to find a middle ground. All of these approaches aim to create a clear global strategy (Quesenbery & Szuc, 2012).

The next step in designing for a global audience is getting the language right. Translating the content consists of not only translating the text, but also changing the format from right to left or left to right if need be, and making the information appropriate in style and tone.

The last step Quesenbery and Szuc (2012) addressed is creating a good local experience. Whatever you design for a global audience must be accepted by the user. Being able to adapt your product (or instruction) to the local environment will dictate the success of the design (Quesenbery & Szuc, 2012).
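
The "global template with local variations" approach can be sketched in a few lines of code. The following Python fragment is a toy illustration (not from Quesenbery and Szuc): one shared page template, with each locale supplying its own text and reading direction. The locale table is an assumption made up for the example.

# One shared template; each locale contributes local text and direction.
LOCALES = {
    "en-US": {"greeting": "Welcome", "direction": "ltr"},
    "ar-SA": {"greeting": "أهلاً بك", "direction": "rtl"},  # Arabic for "welcome"
}

def render_header(locale_code):
    """Fill the global template with the locale's text and text direction."""
    loc = LOCALES[locale_code]
    return f'<h1 dir="{loc["direction"]}">{loc["greeting"]}</h1>'

print(render_header("en-US"))  # left-to-right English header
print(render_header("ar-SA"))  # the same template rendered right-to-left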

Questions to consider:

1. What are some experiences you’ve had with differing cultural norms? How did you react to these differences?


2. If you were to design a training program for learners from another country, what first steps would you take in the design process?


3. If you were to design a training program for a globally diverse audience, what are some concerns you’d have in the development of your instruction? Why?

Extras:


This is a TED ed video talking about how Twitter and Facebook speak to a global audience:
http://www.ted.com/talks/ethan_zuckerman.html






References



Quesenbery, W., & Szuc, D. (2012). Design for a global audience. In Global UX: Design and research in a connected world (pp. 171-194). Waltham, MA: Morgan Kaufmann.



Centre for Cultural Diversity in Ageing. (2010). Cultural awareness. Retrieved from http://www.culturaldiversity.com.au/practice-guides/cultural-awareness.

Sunday, February 17, 2013











Engaging Qualtrics: Creating Interactive Training Modules
Sam Dowell
Yuanjie Dai
Mike Floyd
Kaylea Algire

Principles of Instructional Design
February 17, 2013


Introduction
The Center for Instructional Technology perceives a need for an online component to the face-to-face Qualtrics training program for several reasons. First, there is no efficient way to assess learned skills with the current training program. Second, demand is higher than can be met with face-to-face sessions. Third, face-to-face training does not allow the facilitator to adapt the training to reach learners where they are in terms of schedule, learning preferences, or location. Developing an online component will potentially address all of these needs and also allow CIT staff to reallocate resources to other projects.
Needs Analysis
CIT at JMU has developed and conducted a three-hour training program in a face-to-face environment for the past five years. A year ago, David Stoops began facilitating these trainings as part of his position with CIT. The training begins with a best-practices presentation by Dr. Peter De Michele from the Office of Institutional Research. Dave then finishes the training with a step-by-step facilitated presentation of the program. However, as demand grows, CIT has determined that it would be more efficient and effective to add an online component to its face-to-face offerings.
Learner Analysis
Typically, a participant in the face-to-face sessions is a faculty or staff member in the JMU community. The mean age of participants is thirty-five, and most are middle income; however, there is a blend of ethnic and socioeconomic backgrounds. Both genders are equally represented, and all participants have at least a bachelor's degree.
This training is not currently open to graduate students, but an online training would allow them to become familiar with the software if their departments negotiated accounts for them.  For example, the Adult Education/Human Resource Development program has several department accounts that it allows students to use to conduct thesis research. 
Target Audience
Age: 25-60
Education: Mostly bachelor's degree holders, with some master's and doctorate degree holders
Computer Skills: Varies
Gender: Mix of men and women
Attitudes: Required training with paid time at the training
Experience: Mixture of experienced Qualtrics users and new users
Ethnic Background: Mostly Caucasian with a few Asian and African American participants
Income: Median- to upper-income participants
Context Analysis
This is a required training before participants receive access to JMU's Qualtrics license. Learners arrive with varying amounts of experience with the program. For example, a professor who came from another university that already used Qualtrics comes in at an expert level; however, there is currently no way for them to be granted access without going through the training. Each session has a different mix of learner experience, which presents challenges within the facilitation.
The target audience of each training session is faculty, staff, or graduate students who are conducting customer-satisfaction surveys or research, offering conference registration, or conducting departmental voting. The sessions are generally held in the computer lab in CIT's office in Rose Library. There are thirty-two computers on computer tables with chairs, and each participant has access to a computer in order to work with the program during the session. There are whiteboards around the perimeter of the room, and the desks are oriented in rows towards the front of the room. There is an LCD projector mounted on the ceiling with a screen at the front of the room. Dave also uses flip charts and markers to facilitate learning. The facilitator workstation is located at the front of the room to the side. There is an option for private face-to-face training sessions, which Dave conducts in the participants' office or lab; this can mean a variety of available technologies and levels of technology experience.
Personas
“Susan” is a new professor at James Madison.  She teaches several sections on the effect of the media on politics.  She is interested in conducting a survey to test the effect of mass media and advertising on the most recent presidential election.  She wants to create a survey that she can distribute easily to JMU students, staff, and faculty and non-JMU members in the community.  While doing her initial research she finds out from her department head and CIT that she needs to go through training for Qualtrics before she can design her survey. 
Unfortunately, she is unable to attend the next training because it is during one of her scheduled classes.  The next training is not being offered for a month, and the grant application that could fund her research is due in two weeks.  She has to do the training as soon as possible to start collecting data.  Right now she will have to cancel her class to attend the training. 
“Mr. Jones” has been working at JMU for ten years in the Center for Multicultural and International Student Services.  He recently was approached by the director of CMISS to design and offer a training program for diversity awareness among faculty and staff.  In order to get a good idea about areas of improvement, existing programs and the effectiveness of these programs he would like to send a survey to his colleagues that will be easily accessible and efficient.  From his director he finds out that he must participate in the Qualtrics training session at CIT.
However, Mr. Jones travels for his position and he is unable to adjust his schedule to attend scheduled sessions.  When he contacts CIT to schedule a one-on-one meeting, he is informed that the facilitator cannot meet him for another two months.  Mr. Jones must have an initial lesson plan prepared and ready for his director by the end of the month. 
Right now he will not be able to meet his deadline.

Instructional Goal                      
Users will be able to take more control of their learning experience after the successful completion of the online Qualtrics modules.  After learner self-sufficiency, efficient reallocation of CIT resources is the next major goal for this instructional design project.
Content Analysis








This conceptual framework shows the five main instructional units and their connection to the successful completion of survey creation using Qualtrics.  With these five units, students will be able to complete the main instructional goal.
Behaviors/Content
·         Best practices for closed- and open-ended questions
·         Access to Qualtrics
·         Create a new survey
·         Understanding Look and Feel options
·         Understanding Survey Options
·         Create five basic question formats
·         Skip logic (see the sketch after this list)
·         Editing options
·         Launch survey
·         Collecting results
·         Analyzing results
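
To make the skip logic item concrete, here is a minimal sketch in Python of what a skip rule encodes: if a respondent picks a given answer, the survey jumps past questions that no longer apply. The question IDs and the rule are hypothetical illustrations, not Qualtrics' internal format.

# A skip rule: (question, chosen answer) -> question to jump to.
skip_rules = {
    ("Q1", "No"): "Q4",  # e.g., "Used Qualtrics before?" answered "No" skips Q2-Q3
}

def next_question(current_id, answer, default_order):
    """Return the next question ID, honoring any skip rule."""
    if (current_id, answer) in skip_rules:
        return skip_rules[(current_id, answer)]
    idx = default_order.index(current_id)
    return default_order[idx + 1] if idx + 1 < len(default_order) else None

order = ["Q1", "Q2", "Q3", "Q4"]
print(next_question("Q1", "No", order))   # Q4 -- the rule fires
print(next_question("Q1", "Yes", order))  # Q2 -- normal flow continues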
 Content Checking/SME Information
Elaine Roberts and David Stoops will act as the subject matter experts for this project. Elaine Roberts is a recent AHRD graduate and currently works at CIT as a Learning Technologies Support Coordinator. In her current position Elaine works to coordinate student employees and support faculty development. She is acting as an advisor for the instructional design process for this project. David Stoops is an Educational Technology Consultant. He currently facilitates all of the Qualtrics face-to-face sessions and house-call sessions. He is our primary SME on this project.


Need to Know vs. Nice to Know

User Interface
Need to know: Open Qualtrics; Login; Tasks
Nice to know: Folders

Create New Survey
Need to know: Where to Save; Name
Nice to know: Use existing survey

Macro Knowledge
Need to know: Look and Feel; Survey Options
Nice to know: Templates; Collaboration; My Survey Options

Question Creation
Need to know: Create Questions (1. Multiple Choice, 2. Static Content, 3. Matrix Table (Likert), 4. Text Entry, 5. Special Questions); Skip Logic; Editing
Nice to know: Layout; Preview; Advanced Logic Options

Launch
Need to know: Publish; Share
Nice to know: Collaborate

Results
Need to know: Data Analysis; View Reports; Responses
Nice to know: Download Data; Stats Section; Cross Tabulation
Rationale
Skills listed in the nice-to-know column represent advanced Qualtrics skill sets. It would be nice if participants were familiar with these skills; however, it is not necessary for them to master them to complete this training program. These skills may be mentioned if applicable to the content area, but participants will be referred to the Qualtrics University website to learn more if they are interested. Furthermore, the SMEs mentioned that the most critical piece right now for this project is the question creation piece. Once the prototype of this unit is complete, the others can be explored in more depth.
Performance Agreement

User Interface Skills
Objective: Recognize and describe key features of the Qualtrics interface, in order to design an effective survey.
Evaluation task: Participants will show successful completion of this task by identifying where and how to change the look and feel of their Qualtrics survey. They will have the opportunity to practice the skill before a short assessment with software simulation activities within each module.

Macro Knowledge and Launching Skills
Objective: Illustrate knowledge of "survey options" by preparing and launching a survey, and submitting the password-protected link to the instructor.
Evaluation task: The final product of a successfully designed survey will be the main evaluation piece for these two skill sets. Students must be able to understand how the software works (macro knowledge) and also apply that knowledge by launching the final product. Each section will also have practice and assessment pieces.

Question Creation
Objective: Demonstrate understanding of Qualtrics by composing a survey which includes five distinct question types, as outlined in the session:
                1) Multiple Choice
                2) Static Content
                3) Matrix Table (Likert)
                4) Text Entry
                5) Special Question (user's choice)
Evaluation task: This module will be broken down into chapters. Each chapter will cover a different question type and its creation. Students will practice each question creation piece before a final assessment piece. Participants will also complete skip logic and editing as part of this unit. Each of these skills will be assessed with a short quiz at the end of each module.

Results Skills
Objective: Locate and identify tools needed to collect and view results.
Evaluation task: Participants will complete a simulation of locating and viewing results for successful completion of this unit.

Objective: Demonstrate understanding of results tools in Qualtrics by interpreting compiled results.
Evaluation task: Participants will complete a simulation of locating and viewing results for successful completion of this unit.

Instructional Module
After discussing the format and target audience with the SMEs, the decision was reached that Articulate will allow the modules to be distributed across viewing platforms, protect the integrity of the design, and allow for simple and easily replicated software simulation for CIT. Each module can be followed through to completion by the novice, or only the final assessments can be taken by the expert. Time for completion depends on the participant. Ideally, up-front content knowledge will last between ten and fifteen minutes, following adult learning theory. Each of these segments will be followed by software simulations that can be completed within ten to fifteen minutes, depending on the participant. The following mockups are sample designs for the modules.







Figure 1: Basic Layout of Modules and Sitemap




Figure 2: Sample Layout for Question Creation Module
Figure 3: Sample Layout for User Interface Module