By Reuben Tozman
“We didn’t cause the business to achieve its objectives, but our data supports the system for helping the business achieve its objectives. You can do this by designing your interventions to work within the system (I’m not talking LMS or any technology for that matter) and generating data that’s important for the business.”
The business environment, learner profiles, training environment, and IT infrastructure are all things that instructional designers consider in their design plans. Yet for many in the instructional design space, the term "big data" probably seems neither interesting nor relevant to the craft of design.
In the coursework leading to a master's in educational technology, any discussion about using data to inform the design process is generally tied to creating courses that improve test scores. There is nothing about designing experiences to generate a specific type of data.
Where does big data fit in?
By Rick Wilson
I ended the previous article in this series (Naked Truths and Fundamentals) with Fundamental Three on analysis and audits.
I want to step back and add a few additional words on audits. Content represents tangible things – assets in many formats and forms. Content has identity. It has a name, key-word associations, tags, and other structure; and these characteristics make content useful. The strategy is about content value. Content represents ideas, concepts, advice, insight, direction, understanding, and much more. We say it is the knowledge of our business. How important is that?
Content resides in various places, often called repositories – some more useful than others. That is why Fundamental Three, accurate and reliable inventories and audits, is essential in the fight for purposeful content and for the ability to understand what content is and what it can do. Conducting an inventory has significant value, although the process is prone to errors and inconsistencies. My personal quest is for better ways to conduct analysis and audit activities: a tool that expedites and improves the quality and sophistication of content inventory development and auditing.
Revisit Part 2: http://www.learningsolutionsmag.com/articles/757/
By Janet Buckenmeyer, Casey Barczyk, Lori Feldman, Emily Hixon
This study summarizes the results of a program evaluation of the Distance Education Mentoring Program (DEMP), an ongoing initiative at Purdue University Calumet, Indiana (USA) designed to enhance the development of online courses by mentoring faculty in instructional design principles and technology. The evaluation covers a four-year period and is based on a survey of 47 protégé-participants, who are both faculty members and clients of the program, using an anonymous online questionnaire. The research questions yielded evidence that focused on two broad themes, one of which was faculty participation, satisfaction, and university impact of the program.
The second theme addressed the programmatic modifications required by a changing faculty client base. Analysis showed that 30 percent of the university's faculty had participated in the program and were teaching 44 percent of the online courses offered by the university. This suggests that the DEMP was making a mainstream impact on faculty views and abilities related to the online delivery of material. Participants were satisfied with the DEMP and its effectiveness, which was related to the collaborative nature of the program. It was also found that faculty participating in later cohort groups of the DEMP had different needs, which necessitated building more structure and accountability into the program. Policy implications for program administrators are discussed to help universities develop a competitive advantage in the growing market for online education.
By Daniel Fusch
Debates continue in the public sphere over the quality and efficacy of online instruction, with research studies citing quite different outcomes, further confusing the issue. The heart of the matter is that not all online instruction is equal: institutions still differ widely in the level of planning that goes into the online instruction they provide and in the level of preparation and training provided for online instructors.
To see success with an online learning initiative, hiring and training for specific competencies is critical. Director of instructional design and development Larry Ragan and a number of his colleagues at Penn State World Campus (including Janet May, Paula Bigatel, Shannon Kennan, and Brian Redmond) have for some time been engaged in defining competencies for online instructors with some specificity.
This week, we interviewed Larry Ragan and Brian Redmond, who guided the panel of researchers for the Competencies for Online Teaching Success (COTS) study, to learn more about which competency they would cite as most critical, and about the specific activities that can help instructors in online courses develop it.
By Marc J. Rosenberg
“Get your people together and talk about this. The stakes are high; make adjustments and set a better course. As the ancient Chinese proverb puts it, ‘If you don’t change your direction, you’ll end up exactly where you are headed.’”
A basic building block of a successful and sustainable eLearning program is a solid strategy. Most organizations say they have one, but when you look under the hood there is often a lot of weakness. Here are ten top mistakes people often make when building their eLearning strategy.
By Joe Ganci
“You choose the tools you need depending on the instructional design you need to implement, the venues to which you need to deliver, your budget and schedule, and other factors. I hope this table helps you to reach your eLearning goals.”
Recently the eLearning Guild published a research report written by yours truly, entitled Rapid eLearning Authoring: Top Tools, in which I analyzed the results of the continuous surveys to which thousands of Guild members respond. The results proved enlightening. In the report, I lay the groundwork for understanding and adopting tools, and then cover in good detail each of the top-seven tools. If you are a paid Guild member, I encourage you to download the report.
Following the publication of that report, I thought it would make sense to take the seven tools that survey respondents reported using the most and compare their features for you. While it is not always easy to compare the power and ease of use of two tools, it is certainly possible to place the features most people want and expect in an eLearning development tool side-by-side and review them.
However, it’s important to note that not all apples are the same: two applications may claim that they allow you to create quizzes, but one may let you create much more powerful quizzes than the other. For that reason, I decided to indicate whether a feature is partially supported or fully supported.
Continued at: http://wp.me/p11BlP-Hi
By John DiGiantomasso
“By separating Content, Style, and Flow, and integrating extensibility, an extended Learning Content Management System allows courseware authors to leverage their learning content and present it in countless different ways for a wide variety of target platforms and in a remarkably short timeframe.”
If you have used PowerPoint (or countless other tools, including word processors and spreadsheets), you understand the concept of separating content from style. With these tools, you can enter plain content and apply a number of “style sets” to it, changing its appearance. Experience eventually teaches users that it is a bad idea to embed styles within content. With embedded styles, it is a time-consuming effort to restructure content for use in other places, where the styles in the various re-used bits would not match or would conflict.
And yet, we routinely structure our eLearning content in such a way that we embed “flow.” “Flow” refers to the context or sequence of elements. Many instructional design models, as well as most if not all authoring tools, assume a certain strategy for instruction or learning. Each strategy typically involves an implicit flow, and this flow becomes embedded in eLearning as well.
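The separation the article describes can be sketched in a few lines of code. The following is a hypothetical illustration, not taken from any actual LCMS: all names (`content`, `plain_style`, `linear_flow`, and so on) are invented for the example. The point is that the same plain content can be rendered through interchangeable styles and reordered through interchangeable flows, because neither is embedded in the content itself.

```python
# Content: plain learning objects with no presentation or sequence baked in.
content = {
    "intro":  {"title": "Welcome", "body": "Course overview."},
    "quiz1":  {"title": "Check-in", "body": "What did you learn?"},
    "wrapup": {"title": "Summary", "body": "Key takeaways."},
}

# Style: interchangeable renderers that can be applied to any content item.
def plain_style(item):
    return f"{item['title']}: {item['body']}"

def html_style(item):
    return f"<h2>{item['title']}</h2><p>{item['body']}</p>"

# Flow: an ordered sequence kept outside both content and style.
linear_flow = ["intro", "quiz1", "wrapup"]
review_flow = ["wrapup", "quiz1"]  # same content, different sequence

def render(flow, style):
    """Assemble a course view from independent flow, style, and content."""
    return [style(content[key]) for key in flow]

print(render(linear_flow, plain_style)[0])  # Welcome: Course overview.
```

Swapping `plain_style` for `html_style`, or `linear_flow` for `review_flow`, changes the output without touching the content, which is the leverage the article attributes to an extended LCMS.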
Continued at: http://wp.me/p11BlP-Hg
By Ben Betts
“I’m reminded of an old adage from a Professor of mine who used to remind me on a regular basis that ‘not all models are right, but some are useful.’ Unfortunately, I’m not convinced that 70/20/10 is actually useful either.”
Where’s the research?
I’ve heard plenty of people like Doug Lynch tell us there is no peer-reviewed basis for the model. I’ve searched for peer-reviewed journal literature to corroborate the model but I can’t find any, despite many suggestions that a solid research basis exists. I’ve had conversations with a number of colleagues in academia who are generally of the same opinion — 70/20/10 is a model based on what “seems” to fit.
Unfortunately, “seems to fit” is a trend that we don’t need any more of in workplace learning. Learning Styles “seemed to fit.” There is plenty of “seems to fit” evidence for 70/20/10, ranging in quality from anecdotal blog posts to studies like the one conducted by the Education Development Center (EDC), often quoted as the basis of most “70%” work. The EDC research is often cited as corroborating the claim that 70% of workplace learning is informal in nature, but it makes no reference to the 20% or 10% parts of the model. That distinction comes from Lombardo and Eichinger as part of their “Career Architect” process, a proprietary approach to assessing and developing leadership.

Here the waters muddy further as overlapping definitions kick in. What the EDC research might call informal, Lombardo and Eichinger would call “learning from others,” and the definition often changes depending on whom you speak to. It is all rather confusing and is certainly far from a concrete foundation on which to effect grand change.