We are taking a break from blog posts for a little while as we work on a book about instructional design practices and processes.
In instructional technology, gamification is still one of the most talked-about methods for connecting learners with content, especially in online learning. Proponents of gamification believe in its ability to provide a better learning experience because of its “fun factor”. At its heart, gamification is the incorporation of game-playing elements, like competition or point scoring, into a non-game situation as a way to engage learners. The key to understanding and thinking about gamification is learner engagement. In library instruction it has often been employed through digital badges: an instructional event takes place and a badge can be earned for participation, thus increasing engagement.
Game-based learning, however, is an instructional design approach grounded in defined learning outcomes, in which a game format is used to reinforce educational goals. It is often an authentic learning experience that uses active learning in a game-like format to help learners apply subject matter. One example of game-based learning at the UC San Diego Library is the Jeopardy game within our Academic Integrity workshop, which reinforces plagiarism concepts.
Gamification and game-based learning often blend harmoniously, yet there is a clear distinction between the two. Of the two, gamification of instruction is the more likely to go awry, in part because it lacks learning outcomes as a grounding principle. Learners can become distracted by the chase for badges or flashing icons in online learning environments and fail to comprehend the presented content. To keep the gaming of your instruction on point, focus on the learning objectives first, before considering elements of engagement.
Although a considerable amount of instructional design planning and testing goes into the instruction we provide in the library, not everything goes according to plan. Occasionally we find ourselves having to rework or refine a lesson, activity, or learning object. One recent example relates to a library scavenger hunt activity that introduces new freshmen and transfer students to the library’s spaces, services, and resources. The activity is in its third year, and we recently made a slight change that had unintended consequences for our Information Desk. One of the first stops on the scavenger hunt requires students to locate the Information Desk and enter into their mobile devices the activity code they find there. Originally the sign at the desk was temporary and printed on neon paper, as shown in Image 1 below. Due to the success of the activity, we moved to permanent signage, and in that process we changed the sign to look like Image 2.
During the first quarter after the signage change, we noticed that the Information Desk staff had put a sticky note on the sign with an arrow pointing to “Help”. The Information Desk was being inundated by students asking for the activity code. This confusion took up staff time and negatively impacted the desk workflow. It was clear to us that the issue was associated with the new sign, as we hadn’t experienced this problem in previous quarters. Our first solution assumed a user issue, so we checked the activity instructions for clarity of language. In doing so, we changed the instruction language to refer to the “activity code” instead of the “validation code.” The next quarter, the sticky note was back. Although we had clarified the language in the activity instructions, the sign design was still problematic. It was then that we determined that the new sign conveyed to students “come here for activity help” as opposed to “the activity code is Help”. Clearly this was a sign design flaw rather than a user issue. We then decided to change the code from “Help” to “Desk”, a word with no assistive meaning attached to it.
It may be difficult at times for a developer to look critically at the work they have done but it is a necessary part of evaluation. It is all too easy to assign fault based on how others behave as opposed to the design of a lesson or object itself. This experience is a reminder to us that in many cases when something is not working the way it is intended, the issue at hand is most likely a design issue and not a user issue.
A lot of the work that I do as an Instructional Technologies Librarian is focused on creating eLearning objects: the collection of content or assessment items used in a virtual learning environment. This includes creating objects that support users in completing a specific task (i.e. performance support), like LibGuides, how-to screencasts, videos, and online tutorials. The most time-consuming and robust objects I create for eLearning are online tutorials that may be embedded into a learning management system (LMS) or accessed via a link on a webpage. The tutorials are designed to offer students the ability to learn new content and then actively use their knowledge to perform tasks, play games, etc. To do this work as a non-programmer, I rely on rapid authoring tools.
A rapid authoring tool is software that helps a designer build self-contained tutorials, much like the online tools that help you create your own website. The software does the background programming for you and allows you, as a designer, to focus on applying sound instructional design principles to the content you are creating. The rapid authoring tools on the market vary in degree of sophistication. Products like Articulate Storyline have a small learning curve because the software framework builds on pre-existing PowerPoint skills. On the other end of the spectrum are products like ZebraZapps, which enables the designer to do some amazing game-like simulations but has a much steeper learning curve. My comfort level with rapid authoring tools is somewhere in the middle: I want to be able to create engaging interactions and make minor code adjustments, but I don’t want to have to write my own JavaScript to create a learning interaction.
Here are some things I look for when shopping for a rapid authoring tool.
Learning curve. I don’t necessarily need a short learning curve, but I do need a product that offers tutorials, how-to instructions, and a community of support. It is likely that your technology department will not support the rapid authoring tool you select, so you’ll need access to external help.
Functionality. I need the tool to enable me to upload a package to an LMS or send out web links. I need it to work on all browsers and devices, Flash or no Flash, and ideally to help me meet Section 508 accessibility standards.
Compatibility. I also look at the tool’s compatibility with other software our department uses, or built-in functionalities like audio editing features. Captivate, Articulate Storyline (depending on the version), and Lectora by Trivantis are examples of rapid authoring tools that are compatible with other software packages.
If you are thinking about adding active learning elements to your e-learning, a rapid authoring tool can make it possible. Educational content becomes more engaging, students are able to apply what they have learned, and elements of assessment can be added through quizzes and test scores. With some creativity, the possibilities for engaged student learning are endless.
Would you like to learn more about how an Instructional Design Librarian and an Instructional Technology Librarian work together? Check out our slide deck from our presentation at LOEX and Library Instruction West this year. Feel free to contact us if you’d like to learn more.
We know it has been a while since we’ve posted. Despite our dearth of posts on this blog, we’ve been busy publishing and presenting. Below is a list of what we’re doing in the next several months.
- Tamara Rhodes recently published a user experience (UX) article in the Journal of Library Administration: “Our experience with user experience: Exploring staffing configurations to conduct UX in an academic library.”
- Amanda Roth, Dominique Turnbow, Crystal Goldman and Lia Friedman have an article about the First Year Experience scavenger hunt in an upcoming issue of Library Hi Tech News: “Building a Scalable Mobile Library Orientation Activity with Edventure Builder.”
- LSV has two articles in the upcoming issue of Communications in Information Literacy.
- Crystal Goldman, Dominique Turnbow, Amanda Roth, Lia Friedman and Karen Heskett have an article about the First Year Experience Program: “Creating an Engaging Library Orientation: First Year Experience Courses at UC San Diego.”
- Dominique Turnbow and Annie Zeidman-Karpinski will be contributing an article about doing assessment in one-shot instruction: “Don’t use a hammer when you need a screwdriver: creating assessment that matters.”
- Dominique Turnbow and Amanda Roth will have a chapter in an upcoming book, Distributed Learning: Incorporating Online Options into Your Information Literacy Instruction, published by Elsevier about the instructional design approaches used in many of our e-learning projects: “Engaging Learners Online: Using Instructional Design Practices To Create Interactive Tutorials.”
- Tamara Rhodes and her LIS professor have submitted an article to Reference Services Review: “Hispanics and public libraries: Assessing their health information seeking behaviors in the e-health environment.”
- Karen Heskett provided a continuing medical education class for physicians through PACE (Physician Assessment &amp; Clinical Education): “Information Retrieval, Digital Libraries and the Use of Evidence-Based Principles for Clinical Decision Making.”
- Karen Heskett participated in an online conference offered by AMIGOS about her Problem Based Learning (PBL) work with the School of Medicine’s PBL curriculum. Information about the conference is on LISN.
- Amanda Roth, Dominique Turnbow and Crystal Goldman will be presenting our work with the First Year Experience Program at the CARL Conference: “Adding Value to Library Orientations: The First-Year Experience Program at UC San Diego.”
- Dominique Turnbow and Amanda Roth will be presenting about how we work together to produce our e-learning at Library Instruction West: “Walking the path together: creating an instructional design team to elevate learning” and again at LOEX 2016: “Future Reimagined: Shaping Teaching Through Design.”
- Crystal Goldman and Dominique Turnbow will be presenting about the flipped classroom redesign of MMW 121 at Library Instruction West: “Guided Adventures in Team Hiking: Collaborations between Librarians and Writing Program Faculty to Flip the One-Shot Library Workshop.”
- Amanda Roth and Dominique Turnbow will be presenting about the plagiarism tutorial at Library Instruction West: “Drab to Fab: Elevated practices for active learning online.”
- Crystal Goldman and Tamara Rhodes are presenting about their work with the Making of the Modern World (MMW) writing program at LOEX 2016: “Recycling the First-Year One-Shot Workshop: Using Interactive Technology to Flip the Classroom.”
- Crystal Goldman will be presenting about the 6th College writing program (CAT) redesign at the AAAS-PD conference: “Active Learning Online and in the Classroom: Scaffolding and Assessing Library Instruction within a Multi-Course Writing Program.”
- Lia Friedman and Gayatri Singh will be doing a presentation/workshop at the National Diversity in Libraries Conference in August: “Wikipedia: A Bridge from Basic Markup to the Research Cycle.”
Written by Amanda Roth and Dominique Turnbow
What is performance support?
Have you ever been to a great workshop or training session and felt capable of doing a newly learned task or activity, only to find that when you sit down to do the task, you don’t quite remember where to start? Do you think it’s likely that your students are experiencing the same thing after a library workshop? One way to address this issue is through the use of a performance support tool. Performance support provides “just in time” instruction that informs and guides someone so they can complete a task. Performance support tools address an immediate need. Here are a few examples:
- Step-by-step instructions (either verbal or pictorial)
- Decision trees
- How-to videos or animated images
Here are some library-related examples:
- Step-by-step instructions on how to use a database
- A decision tree helping students determine when to use a citation
- A video on how to use UC-eLinks to get full-text articles
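As an illustration, the citation decision tree mentioned above could be expressed as simple branching logic. This is a hypothetical sketch, not an actual library tool; the questions and their wording are assumptions made for the example.

```python
# Hypothetical sketch of a "when should I cite?" decision tree.
# The questions and outcomes are illustrative only.

def needs_citation(uses_others_words_or_data: bool,
                   is_common_knowledge: bool,
                   is_your_own_idea: bool) -> str:
    """Walk a student through a simple citation decision."""
    if uses_others_words_or_data:
        return "Cite it (quote or paraphrase with attribution)."
    if is_common_knowledge:
        return "No citation needed."
    if is_your_own_idea:
        return "No citation needed, but keep notes on your sources."
    return "When in doubt, cite it."

# Example: a student is paraphrasing a journal article
print(needs_citation(uses_others_words_or_data=True,
                     is_common_knowledge=False,
                     is_your_own_idea=False))
```

A real decision tree would likely be rendered as a flowchart or interactive page rather than code, but the branching structure is the same.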
Notice that performance support tools are created to help an individual perform a task. To help solidify this idea, here are a few library-related examples of materials that are not performance support. Information is provided, but there is no associated task.
- A list of databases
- An image of the publication process explaining primary, secondary and tertiary sources
- A link to Ask a Librarian
When should I use performance support?
Connie Malamed (aka the eLearning Coach) provides guidelines for when to use performance support instead of relying on instruction. Performance support tools help compensate for the limitations of memory. They are best used when an activity occurs infrequently; when a process is complex, involves many steps, or has many attributes; when there is little time or few resources to devote to instruction; and especially for English as a second language (ESL) learners. By adding performance support tools to your LibGuides or as handouts, you are providing the refresher instruction a student needs after class, assisting them at their time of need.
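These criteria can be treated as a simple checklist: if any of them applies, a performance support tool is worth considering. The sketch below paraphrases the criteria from the guidelines above; it is an illustrative aid, not something from Malamed's own materials.

```python
# Illustrative checklist for "should I build a performance support tool?"
# Criteria paraphrased from the guidelines discussed above.

CRITERIA = [
    "the task relies heavily on memory",
    "the activity occurs infrequently",
    "the process is complex or involves many steps",
    "there is little time or few resources for formal instruction",
    "many learners use English as a second language",
]

def suggest_performance_support(answers: list) -> bool:
    """Return True if any criterion applies -- a cue to build a
    performance support tool rather than rely on instruction alone."""
    return any(answers)

# Example: an infrequent, multi-step task (e.g. exporting citations)
print(suggest_performance_support([True, True, True, False, False]))
```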
Another reason to use performance support is as a replacement for information that is not covered in the classroom. For example, in an undergraduate database searching class, we could spend more time on the thought process behind pre-search activities, like formulating a research question and selecting keywords, and less time on the mechanics of how to get an article. In this case, you could provide a link to our UC-eLinks video on a LibGuide or course website and refer students there when they are ready to obtain the full text.
How do I determine when to use performance support or instruction?
It all starts with well-written learning outcomes. Imagine how you are going to teach the outcome. Do you need to explain concepts? Is it procedural? Your goal here is to categorize them. For example, the outcome below is procedural.
“Given a list of article databases with descriptions recommended for their course topics, students will be able to identify at least two that are relevant to their topic.”
In order to complete this task, students would:
- Go to the course guide with the list of article databases.
- Read the descriptions for each database.
- Select two that are most relevant.
While you can easily provide five to ten minutes in your workshop for students to do this, it is also possible to have a performance support tool (i.e. written instructions on the course guide or an animated image) for students to consult when they are ready to do this. You can then use that valuable in-class time to address learning outcomes that are more conceptual in nature, which may be more difficult to teach using a performance support tool.
Above all, performance support should only be used to help students complete a discrete task. Just because the information is provided on a LibGuide or through a tutorial or video does not make it performance support. It could just be online instruction.
The benefit of adding performance support tools to your instructional tool kit is that it extends your reach beyond the classroom and provides targeted educational support when it is needed.
If you would like help determining whether you should use a performance support tool, or help creating one, contact us via our consultation form.