"After e-learning delivery and deployment, how would you drive adoption?" was a question posed by my colleague and co-blogger in the "Explore Learning Solutions" forum.
I am putting up my answer here on my blog as well. Read on...
I was thinking about this question while browsing Twitter when I saw a tweet mention user adoption. The link led me to a slideshow on user adoption of SharePoint. While SharePoint is seemingly unrelated to the user adoption mentioned in the question, there are some fundamental similarities.
If we think back to the ADDIE model, its fourth phase, Implementation, is followed by the final phase, Evaluation. Evaluation happens at two levels: formative and summative.
Summative evaluation "provides information on the product's efficacy (its ability to do what it was designed to do)." On the surface, this may seem a fairly simple definition. So what's the big deal, you may ask... "If a majority of learners clear the assessment module with a score of 80% or more, then obviously the program is successful," would be the typical response.
However, there is a hitch. Learners pass only "when they attempt the assessment," and they attempt it only "when they have accepted the responsibility of going through the course." With adult learners, it is notoriously difficult to get that buy-in, and this, I feel, is not just a function of the course design, structure, or content presentation style.
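To make this concrete, here is a minimal sketch in Python (the numbers are purely hypothetical) showing how the same assessment results can look very different depending on whether you count only the learners who attempted the course or everyone it was rolled out to:

# Hypothetical rollout figures -- not from any real program.
enrolled = 200     # learners the course was assigned to
attempted = 90     # learners who actually took the assessment
passed = 80        # learners who cleared it with 80% or more

# The figure usually reported: pass rate among those who attempted.
pass_rate_attempters = passed / attempted   # about 89% -- looks like a success

# The figure that reflects adoption: pass rate across everyone enrolled.
pass_rate_enrolled = passed / enrolled      # 40% -- the buy-in problem shows up here

print(f"Pass rate among attempters: {pass_rate_attempters:.0%}")
print(f"Pass rate among all enrolled: {pass_rate_enrolled:.0%}")

A summative evaluation that reports only the first number can declare the program a success even though more than half the intended audience never engaged with it at all.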
There are various dynamics at play in the user acceptance of an e-learning course.
I have listed these here briefly. Each point can be expanded and dealt with in greater depth.
~Individual learner characteristics and perceptions: This is the need each person feels for the piece of knowledge delivered in the program. Do the learners perceive themselves as tech-savvy enough to attempt and navigate through an e-learning course with confidence?
~The e-learning program characteristics: The nature of the design, interactivities, content presentation style and clarity, complexity or simplicity of flow, etc.
~The delivery mode: Is the program easily accessible as and when required? Are users allowed to take it at their own convenience? Can they access it from home or outside the office? Do they have easy access to the internet? Is there a tracking methodology?
~Organizational characteristics and support: Is this the first e-learning course for the organization? Does the learners' immediate supervisor believe in e-learning as a means of effective training? Is there a system of reward and recognition for good performers? Is the training program tied to some kind of assessment/interview that can lead to promotion? Does the organization give learners some amount of free time that can be used as "learning time," so that they can go through the course within work hours? Is there any online help available to facilitate adoption, at least in the initial phases? Is there any "bonus," perhaps non-monetary in nature, for proponents/early adopters of the program that will inspire them to spread the word to their colleagues?
~User acceptance testing: Are the users asked for their honest feedback about the course? This is part of the UAT and summative evaluation that need to be carried out to gauge efficacy.
All of these questions have multiple answers, and there is no right or wrong here. But each of the answers is a driver of user acceptance.
Here's an excerpt on user adoption from a post called "The technology is ready. Are you?" at http://onlignment.com/2009/07/the-technology-is-ready-are-you/ ...
The post talks about an organization's role in the adoption of something as seemingly simple as the use of Voice over IP (VoIP).
The excerpt:
"...As is so often the case with technology solutions, the real issue is with the implementation. Too often, the implementation is considered a success once the software has been rolled out across the organisation. In fact this is when the real work should begin.
Employees must be provided with the right equipment; if you want to use VoIP, make sure they have good quality headsets. Ensure that every user knows how to set up and use that equipment. Despite what vendors tell you, none of the tools are so intuitive that people can be expected to use them without some support and training. Invest the time at this point to check that everything technical works, and I do mean everything. Set up a pre-recorded webinar and get every user to log in and make sure they can navigate through it and that their audio and video works. This is a much simpler thing to deal with if you plan for it and ramp up your helpdesk support for the testing period. It’s certainly easier than trying to deal with the issue on an ad-hoc basis once someone is supposed to be taking part in a live session..."
Would love to hear your thoughts...