On emergent organization designs, future of work, and the impact of the digital era.
Wednesday, July 29, 2009
Application Training Programs: What's the Big Deal?
The following para from Karl Kapp’s post called Yes, We Should Keep ADDIE, HPT and ISD Models is the trigger for my post. His post is an old one—almost 3 years old—yet extremely relevant in our situation today. I have pasted the para below:
“Products typically aren’t designed, built and marketed in a day. Why should training for that product be designed, built and marketed in one day? Same with software systems that take years to develop and then they want training created and delivered in a week...? What are they thinking and what are we thinking when we AGREE to the unreasonable demands?”
This was further reinforced by a detailed post by Sreya, a co-blogger, called Challenges and solutions to technical software product training: Gathering Information. The post illustrates in detail the time required to understand an application and gather relevant information before training can be designed.
Currently, I am in the middle of creating a simulation-based training program for certain applications for a well-known organization. The application itself has been under design and development for quite a few months now. The organization wishes to roll out the application along with the training modules—a noble and logical endeavor, no doubt. Users need to know how to move around within the application before they run amok on the live system and wreak havoc.
Wherein Lies the Problem
1. This need for a training module was not identified earlier, and the training design team, i.e., us, came into the picture only a week or so back.
2. The client is sadly under the impression that capturing screenshots of the application using a Rapid e-Learning Authoring Tool like Captivate and putting those inside a UI is what goes by the name of sim training modules.
3. The application is large, complex, and will be used by users of multiple roles performing varied tasks for very different business reasons.
4. This means the technical team developing the application cannot always provide the business reasons behind all the tasks. They can explain the usage without necessarily knowing the logic.
What is the way ahead?
The onus, I think, is squarely on us to explain the following to the client and other stakeholders:
1. The training design team MUST be given time to understand the application from the following angles:
a. Different user profiles (Roles)
b. Tasks that each profile will perform
c. Business need for each task
d. Business need of the application
e. Whether the application is a new rollout or an upgrade (this defines the design of the training which would be a Product Upgrade Communication in the latter case)
2. This first step analysis has to be followed by the design team going through the application themselves. This helps a designer understand where separate Instruction Tips will be required to explain the logic of the step or set it in a larger context for the users.
3. A set of users needs to be identified who can be the test group for the first few modules. Based on this user feedback, the modules and program can be revised.
4. All of this can happen smoothly only once the application design and development are over. If the logic of the application changes in the interim, it will directly impact the design of the training as well—not only because different screen grabs would be needed but, most importantly, because the Instruction Tips would change.
Going back to what Karl Kapp began his post with—the need to have our fundamental models in place, ADDIE, ISD and HPT—I firmly agree.
Without thorough analysis, it would be foolhardy to claim that we can design a training program that will work. A badly designed training program is as likely to wreak havoc as zero training.
Today, when the need to justify every cent spent is HIGH, an organization stuck with a training program that does not work—read no/below average User Adoption—will not be a happy client. User Adoption = ROI + improved efficiency + greater productivity
Therefore, it is time we put our foot down and claim the time it will take us to deliver effective training programs based on sound instructional designs. The client has to be a partner in this and provide organizational support post implementation to drive adoption.
We cannot “throw away good design in favor of fast design.”
What are the Results of Following an Instructional Design Process? by Karl Kapp
Rapid eLearning Tools: eLearning Technology by Tony Karrer
Rapid E-learning Authoring Tools from Kineo
• Articulate www.articulate.com
• Atlantic Link www.atlantic-link.co.uk
• PointeCast www.pointecast.com
• Qarbon www.qarbon.com
• SCATE www.scate.com
• Udutu www.udutu.com
Thursday, July 23, 2009
The World has a New Mania: The "Learn English" Mania
A wonderful talk by Jay Walker on the English Learning mania that is sweeping across the world--2 billion worldwide are trying to learn English...most of all in China. China will become the world's largest English speaking country!
English--the language of problem solving...a common language for the world to solve its common problems...!
Sunday, July 19, 2009
Social Networking: As Easy as Pie
The sheer number of options can be overwhelming. Then, there are subtle and not-so-subtle differences between these options--some of the differences are inherent in the features of the networking tool and some have been superimposed by the users over a period of time. The best example is of course the way Twitter is used today...
For such people, how would collaborative learning or informal learning work in this age of technology? They may have, and I am sure do have, a tremendous amount of knowledge and experience to share but are simply unable to do so because of the channels of communication in use today...How can we facilitate this sharing and ease them into the world of technology?
The feeling of a newcomer as I see it...for a lighter touch! :)
In Response to "The fewer the competitors, the harder they try" from Economist.com
The article discusses the "n"-effect, where "n" stands for the number of participants, and the outcome of several experiments conducted to understand the relationship between the number of participants in a competition and the motivational level of the competitors.
"Two behavioural researchers, Stephen Garcia at the University of Michigan and Avishalom Tor at the University of..."
This seems to imply that when there are "too" many participants in any event, a kind of inertia or attitude of "giving up" sets in. I call this the "someone else is sure to get it done" feeling. As I read the Economist article, I was reminded of a chilling incident described by Malcolm Gladwell in his bestseller, "The Tipping Point."
In this classic, published in 2000, Gladwell describes an infamous incident that took place in New York City in 1964, when a young woman called Kitty Genovese was chased by her assailant over a span of 30 minutes, attacked thrice and stabbed to death in full view of 38 of her neighbours. At the time, none of the witnesses called the police.
Further experiments proved that people rush to the aid of a distressed person when they feel they are the only ones around to help. When the number increases beyond a certain point, the responsibility and the motivation are diffused. What exactly the Tipping Point is, and the perfect "n" that can maintain motivation, is contextual and relative.
Thus, the conclusion that social psychologists like Latane and Darley arrived at is this: "the lesson is not that no one called the police despite the fact that 38 people heard her scream; no one called the police because 38 people heard her scream." She would probably have had a better chance of survival on an empty street with a lone bypasser.
This seems to be true of examinees in an exam hall. When they see a huge number of people competing in the same exam, they lose motivation, thinking that there are bound to be a lot of people much better than them.
The same scenario can be transposed to a job interview. When we walk in for an interview and see one other or maybe two candidates for the same post, we kind of brace ourselves and tell ourselves we should be able to bag this one. When the number of candidates for the same post is very high, an automatic, unconscious feeling of negativity or despair sets in. We tend to think, "Oh my god! I'll never get this one. Look at the number of people applying. Some of them are bound to have skills I don't even possess..." This becomes the deciding factor.
So, what is the Tipping Point, be it for an exam, a competition, or a willingness to rush to the aid of someone in distress...?
Saturday, July 18, 2009
Penny Pinching but Valuable Client and a Unique, Challenging Project
How would you convince a "penny-pinching but valuable prospect" (a phrase from Cathy Moore’s blog: http://blog.cathy-moore.com/2009/05/four-ways-to-move-your-learners-from-clueless-to-confident/) of the need for a training program you are recommending?
As you talk to the client, you realize the training is much needed and has a huge potential—both for you and the client i.e., business wise as well as true value addition.
Here's the real-life scenario:
1) The client:
a) Is a valuable one with the possibility of becoming a long-term client
b) Expresses a tentative/possible need for a multimedia training program
c) Also mentions very clearly that budget for this program is minimal to non-existent
d) Requires the program to be attention grabbing and as “innovative” as possible
2) The training program delivery environment is challenging:
a) There would be no PCs for the learners
b) The program would be used to support the trainers in their ILT sessions
c) There would, in most cases, be no conference or seminar rooms for dissemination and training; the possibility of the program being shared in an open-air environment, in broad daylight with the help of a simple projector and a screen was very real
d) This, of course, led to a challenge in design, colour usage, font size, on-screen animation, and structure
e) The learner would have no control over the program in terms of being able to stop/play/replay…the trainer would have to gauge the learner’s reaction and carry out the actions as and when required
f) The program needed to have all the features that would attract learners with the psychographics and demographics mentioned below
g) To make things fun, the program was of a highly technical nature
3) The target audience is a very different set from the usual:
a) They have never been exposed to e-learning
b) Have rudimentary to non-existent knowledge of computers
c) Have minimal access to PCs and internet
d) Have low reading skills and poor attention span
e) Probably attending formal training of any kind for the first time
f) The English language could be a bit of a challenge for some
g) Would not be the kind at all to ask questions to clarify doubts
h) Would unquestioningly accept as gospel truth whatever was onscreen
i) Would need to quickly transfer the training to practical work
4) The business need of the program (our analysis):
a) To train mechanics on product maintenance in a country where the product was being launched for the first time
b) To create a brand presence for the product in the international market
c) To act as support for their advertising campaigns (not yet suggested)
d) To instill confidence in customers of the product that they would get reliable after-sales and repairing services from trained mechanics, if required
e) To ensure trust in the product grows
f) Points b, c, d, and e to act as drivers to boost sales in the country of launch
All of the above needs were to be addressed at minimal cost; the client, however, was an immensely valuable one.
5) What did we do?
a) We sat down
b) We discussed possible solutions that would be cheap enough for the client
c) We discussed solutions that would, most importantly, fulfill the needs mentioned above
d) We discussed solutions that would impact the learners
e) Most importantly, the solution should result in an ROI from the business perspective (point 4 above)
f) It should result in a feasible business deal for us as well
g) It should keep the client coming back for more such programs
6) Results of our brainstorming: (considering all the different parameters mentioned above)
a) The program would need to have a very strong and appealing visual design that could withstand open-air dissemination, support ILT sessions and still be relevant should it be taken on PCs by individuals at any given point
b) Content on screen would need to be minimal and simple
c) The voice over would need to explain the concepts clearly, simply
d) The graphics/images would need self-explanatory animations, supporting voice over, smooth and easy transitions, and a few supporting onscreen keywords for emphasis and retention
e) The program would have to be self-running, yet at a pace that a majority of learners would be able to keep up with
f) Follow sound instructional strategies and structure that would:
i) make logical sense to the learners,
ii) build expectancy, enabling the learners to predict what would follow and aiding assimilation,
iii) have a theme that is constant across all the topics (this would again support point ii above),
iv) at no point be a cognitive overload for the learners yet convey all the important technical knowledge
v) enable learners to actually take the knowledge to the field by showing the practical applications (theory would be minimal in such a training program, appearing only to clarify/logically support a practical tip)
7) Addressing the low cost factor (we thought this deserved a separate point of its own, as it became our biggest challenge):
a) We identified the constant/common elements across different topics during the content analysis and project scope definition phase
b) Created a structure that would hold true for a majority of the topics, i.e., could be repeated across the topics
c) Decided on the templates and animation flow and other design factors that can be standardized, repeated
d) Created a prototype course and evaluated it for effectiveness in the setting described above with the client’s involvement
e) All of this helped us to create a course design that was quickly implementable, had reusable elements (Learning Object driven) and succeeded in bringing down the cost
8) The final step:
a) We took our solution and analysis to the client
b) Presented our solution and showed the business benefits that could accrue—not only in terms of training (which would become much less painful for the trainers) but also the consequences of a training well designed and effectively delivered
c) The client was convinced...they will still continue to be penny-pinching, but we got the project! :)
9) Our key learnings:
a) Carry out a very thorough needs analysis and delve into the root cause of the training need
b) Map it to the business and understand the business drivers very clearly
c) Understand the audience and training environment with absolute clarity
d) NEVER hesitate to ask questions till you have all the information
e) Warn the client UPFRONT that this has to be a combined effort for the program to be successful, and allow them to be involved even in the design decisions
f) This way, not only will you get their buy-in but also their trust and respect
I would love to know if you have similar experiences and how you have dealt with them…
Thursday, July 16, 2009
New Skills for Learning Professionals
This month's Big Question from LCB has received a huge response. I have pasted the question here:
"In a Learning 2.0 world, where learning and performance solutions take on a wider variety of forms and where churn happens at a much more rapid pace, what new skills and knowledge are required for learning professionals?"
This is not a very new question and has been doing the rounds in different forums. My colleague and I had begun a similar discussion on our "explore learning solutions" forum as well. We also have a fair idea why this question is cropping up in various forms...Today, the role of learning professionals has undergone, and is undergoing, a paradigm shift. The main reason is of course the shifting, evolving technology and a flatter world with organizations spread across the globe.
As I read through the different responses to this month's Big Q, I saw that most of the major points had been covered.
Therefore, instead of trying to make the same point using different words, I thought I would synthesize my learnings from the different posts. I have not yet been able to cover all the posts but from what I have read, I have made the following jottings for myself. As I read the rest, I intend to add to this list.
The interpretations are mine and so is the synthesis I have arrived at. The original writer's intent may have been different.
The key trends I have spotted so far focus on a learning professional's ability to:
Keep learning: Professionals have to become constant learners to be in tune with the ever-increasing flow of knowledge
Be a knowledge networker: Be capable of identifying patterns and making connections
Be a solution provider: Use the best solutions and be able to bind different learning ecologies
Be tech savvy: Be comfortable with technology, especially the tools that can be used to create a "learning ecosystem or end-to-end learning solutions" at organizational levels (talking of organizational training here since that forms a large part of our "business")
Be adaptive: Be adaptive enough to see the link between theories of andragogy and present-day technological innovations.
Be a social networker: Be good at networking and connecting with the "right" people because there is no way one can know everything. The only way to keep up with the rapid pace of tech development is to share and participate in online conversations.
Don't think of yourself as an expert: Be aware of the vanishing concept of "expertise". Today, everyone can easily gain expertise with access to information and the tools available to share and disseminate.
Have a sound grasp of theories: Linked to the previous point, this puts a lot of pressure on learning professionals to constantly "innovate" and be creators of sound instructional design. The only way we stand out is through the design/form/structure/format in which we present the learning. Hence, it becomes even more critical to know the theories of learning in depth and apply them in course designs if we are still to make a case for why organizations should spend on asking us to create a training program.
Be open to criticism and be courageous: Have the courage to put one's understanding and perspectives out there for all to read, respond to, criticise, disagree with, argue over, rip apart...This can preferably be done via blogs, which are also a platform that allows one to synthesize and analyze one's thoughts. Without this kind of "putting oneself on the block" attitude, it will not be possible to learn.
Understand business drivers: Be a consultant and be able to talk to Line-of-Business managers and grasp the true need of the training. This means being able to ask the right questions and not only focus on the "learning" but the end need of the learning. What is the gap that is impacting the bottom line? That is all that matters to an organization.
All of this means that it is well-nigh impossible for an individual to develop all of these skills on one's own. There is a growing and imperative need to collaborate. To understand how collaboration is different from coordination or cooperation, read the post When Should We Collaborate? The whole in this case is way greater than the sum of the parts, and knowing how to acquire the necessary information is as important as, if not more important than, the specific information itself.
The list of posts I have read so far are:
# Mohamed Amine Chatti - New Skills for Learning Professionals
# Harold Jarche - 2008 article on Skills 2.0
# Clive Shepherd
# Jay Cross - Informal Learning blog.
# E-Learning Curve Blog: Learning Professionals’ Skills 2.0
# Natalie - What Should Learning Professionals Know Today?
# Gina - Adventures in Corporate Education
# Jane Bozarth - New Skills for Learning Professionals
# Harold Jarche -Skills for learning professionals
# Clark Quinn: Web 2.0 Learning Skills
Driving User Adoption of e-Learning Programs
~The implementation is only the beginning
~The culture of the organization should actively support and drive post-implementation adoption
~Users/learners should perceive themselves as being appreciated and recognized for adapting to a new method
Applying these to the adoption of e-learning in an organization:
Mistakes that many organizations tend to make that get in the way of user adoption are:
1. Managers responsible for team productivity and performance define the learning objectives and ask a vendor to create an e-learning course, failing to take the opinion of end users (what the managers perceive as difficulties may not be the real/root cause of productivity or quality failures)
2. Insist on "jazz" which, if not aligned with the end learning, can distract and confuse users (Refer to: Could animations hurt learning? by Cathy Moore)
3. Feel that transferring a PPT into Flash is all there is to e-learning (the "lipstick on a pig" effect takes place...Refer to: How to avoid putting lipstick on a pig, again by Cathy Moore)
4. Think of the implementation of an e-learning program as the end of their responsibility, a task to be ticked off on their list for that quarter (the course is out there, learners will go through and performance will improve)
5. Be unwilling to go the whole way and invest in a course that can be made truly effective; the incentive for an e-learning course, from an org's perspective, is often to cut down on training cost and the hassles of coordination rather than to improve learning efficiency
All of these points lead to the creation of a training program and an atmosphere that is not conducive to adoption.
Mistakes that most vendors tend to make that get in the way of user adoption are:
1. Accepting the brief from an organization's representatives and creating the course that they want but not necessarily the one they need
2. Failing to do a thorough "Learning Needs Assessment" and thus barking up the wrong tree
3. Failing to carry out UAT because "implementation is not our problem; it's up to the organization to do what they want with the course"
4. Thinking that "jazz" can make up for sound instructional strategies and design
5. Not bothering to help the organization with suggestions on adoption (often, for an organization, walking the path of e-learning may be a new measure with no set path to tread; a vendor/consultant/learning solutions designer, in such cases, needs to suggest means of inspiring and driving user adoption...)
After e-Learning delivery and deployment, how would you drive adoption?
I am putting up my answer here in my blog as well: Read on...
I was thinking about this question and generally going through Twitter when I saw a tweet mention user adoption. The link led me to a slideshow on user adoption of SharePoint. While this is apparently unrelated to the user adoption mentioned in the question, there are certain fundamental relations.
If we think back to the ADDIE model, the second-to-last phase is implementation, followed by evaluation. Now, evaluation happens at two levels: formative and summative.
Summative evaluation "provides information on the product's efficacy (its ability to do what it was designed to do)." This may seem, on the surface, a fairly simple definition. So, what's the big deal, you may ask..."If a majority of learners clear the assessment module with 80%+, then obviously the program is successful," would be the typical response.
However, there is a hitch to this. Learners pass only "when they attempt the assessment." They pass "when they have accepted the responsibility of going through the course." With adult learners, it is notoriously difficult to get this buy-in, and this, I feel, is not only a manifestation of the course design or structure or content presentation style.
There are various dynamics at play in the user acceptance of an e-learning course.
I have listed these here briefly. Each point can be expanded and dealt with in greater depth.
~Individual learner's characteristics and perceptions: This is the need each person feels for that piece of knowledge delivered in the program. Do the learners perceive themselves as being tech savvy enough to attempt and navigate through an e-learning course with confidence?
~The e-learning program characteristics: The nature of the design, interactivities, content presentation style and clarity, complexity or simplicity of flow, etc.
~The delivery mode: Is the program easily accessible as and when required? Are users allowed to take it at their own convenience? Can they access it from home or outside the office? Do they have easy internet access? Is there a tracking methodology?
~Organization characteristics and support: Is this the first e-learning course for the organization? Does the learners' immediate supervisor believe in e-learning as a means of effective training? Is there a system of reward and recognition for good performers? Is the training program tied to some kind of assessment/interview that can lead to promotion? Does the organization provide learners with some amount of free time that can be utilized as "learning time" so that learners can go through the course within work hours? Is there any online help available to facilitate adoption, at least in the initial phases? Is there any "bonus", maybe non-monetary in nature, for proponents/initial adopters of the program that will inspire them to spread the word to their colleagues?
~User Acceptance Testing: Are the users asked for their honest feedback about the course? This is a part of UAT plus summative evaluation that needs to be carried out to gauge the efficacy.
All of these questions have multiple answers, and there is no right or wrong here. But each of the answers is a driver of user acceptance.
Here's an excerpt from a post called The technology is ready. Are you? at http://onlignment.com/2009/07/the-technology-is-ready-are-you/ on user adoption...
The post talks about an organization's role in the adoption of something as apparently simple as the use of Voice over IP (VoIP).
"...As is so often the case with technology solutions, the real issue is with the implementation. Too often, the implementation is considered a success once the software has been rolled out across the organisation. In fact this is when the real work should begin.
Employees must be provided with the right equipment; if you want to use VoIP, make sure they have good quality headsets. Ensure that every user knows how to set up and use that equipment. Despite what vendors tell you, none of the tools are so intuitive that people can be expected to use them without some support and training. Invest the time at this point to check that everything technical works, and I do mean everything. Set up a pre-recorded webinar and get every user to log in and make sure they can navigate through it and that their audio and video works. This is a much simpler thing to deal with if you plan for it and ramp up your helpdesk support for the testing period. It’s certainly easier than trying to deal with the issue on an ad-hoc basis once someone is supposed to be taking part in a live session..."
Would love to hear your thoughts...
Wednesday, July 15, 2009
What is a Minimalist Training Model?
A Minimalist Training Model would need to have the following features:
1. Be Learning Object (LO) driven. When instructional content is broken down into small chunks that are yet contextually meaningful and self-standing, a good designer will automatically eliminate the redundant and retain the essential. It is not about "simplification" but about providing only the required information. So, the first characteristic of Minimalism gets taken care of, and the client does not pay for "extraneous" effort.
2. Business impact. These LOs or Shareable Content Objects (SCOs) have a direct impact on the client's training cost because these can be used "as is" or in combination/permutation with other SCOs to form training packages quickly.
As the term training implies (as opposed to "learning", which is much broader in nature), these are typically goal/objective-driven performance-support programs. Being driven by specific goals, all effective training programs address the "task at hand".
Again, redundancy gets taken care of and what is required for a specific project or to acquire a skill set is communicated.
Thus, for clients the business impact of a Minimalist Training Model is the ability:
~ to create multiple training programs
~ for specific needs
~ as and when required
~ at minimal cost
~ and the least redundancy
~ by combining existing SCOs in different permutations
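Purely as an illustration of the reuse idea above (the SCO names and learner roles here are invented for the sketch, not taken from any actual project), the "combine existing SCOs in different permutations" point can be pictured as assembling role-specific packages from one shared pool:

```python
# Hypothetical sketch: assembling role-specific training packages from a
# shared pool of reusable, self-standing Learning Objects (SCOs).
# All SCO names and roles below are invented for illustration.

# Each SCO is tagged with the learner roles it is relevant to.
sco_pool = {
    "login_basics":    {"clerk", "manager", "auditor"},
    "create_invoice":  {"clerk"},
    "approve_invoice": {"manager"},
    "audit_trail":     {"auditor", "manager"},
}

def build_package(role):
    """Pick every SCO relevant to a role -- no duplication, no extras."""
    return sorted(name for name, roles in sco_pool.items() if role in roles)

for role in ("clerk", "manager", "auditor"):
    print(role, "->", build_package(role))
```

Note how the same "login_basics" object serves every package: each SCO is authored once and recombined per audience, which is exactly where the cost saving described above comes from.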
Without applying the theories of minimalism, one cannot create LOs.
3. Create Performance Support Solutions (PSS): This is to address point 3 raised by you. You are absolutely right when you say that no matter how many sims and case studies we create, it is still the actual experience that will count.
It is, therefore, much better to create PSS rather than elaborate, comprehensive programs to meet the needs of working adults. Elaborate programs that begin with concepts, go on to theories and examples and then give typical assessments are more academic in nature and may not always be efficacious in actually bridging skill gaps or enforcing a behavioral change. The latter needs are more aptly met with PSS and, needless to say, PSS are always based on the Minimalist Approach, because PSS ALWAYS aim to meet only the required need and can take the form of Help files, Job Aids, FAQs, Forums, Manuals, How-to Process guides, etc.
4. When does minimalism work? The Minimalist approach works:
~ when the training need is clear,
~ when the audience has a very specific and common performance gap(s) or a new skill acquisition need,
~ when the audience has approximately the same level/type of pre-knowledge,
~ when there is adequate support from the organization in terms of immediate access to training programs,
~ when there is access to performance support solutions
5. Benefit to client.
~ The savings in training program creation accrued over a period of time
~ The savings in time and effort
~ Catering to exact needs of audience thus reducing training seat time
~ Catering to the exact performance gap thereby allowing learners to start immediately on meaningfully realistic tasks
~ "Byte"-sized, self-standing training modules facilitating assimilation and dissemination
What else do you think can be added?
What is Performance Support?
Performance Support (PS) is, says Tony Karrer, “Making information available to workers instead of forcing them to memorize it. That’s how we use Google and corporate wikis and instant messenger.”
What makes PS effective is the targeted learning that it provides and the “pull” approach inherent in its definition. PS, by definition, is access to information and knowledge as we need it, when we need it. It does not “push” information at learners in a pre-defined package but instead allows for selective learning. And this works best for adult learners. Remember, adults learn best if they learn in context, can apply their learning to their immediate work, and feel the need for that learning from within.
Thus, our programs based firmly on these tenets of PS are created after careful and thorough learner analysis, performance gap study and situational/contextual investigation. The programs are so designed as to allow learners to “pull” the information they need to execute their work better.
The programs follow a minimalist design approach—that is, there is no redundancy, which can easily put off adult learners, who anyway have busy work schedules and are pressed for time.
To summarize, how do the Performance Support Solutions, as we design them, facilitate productivity, growth and innovation?
Performance Support facilitates all of the above by empowering employees to:
– Execute better
– Deliver higher quality, and thus less rework
– Demonstrate greater productivity
– Improve customer experience
– Display confidence by reducing dependencies on others
– Reduce time to deliver thus freeing up time for skill building, brainstorming, innovative thinking
The core idea is to reduce the need for training by providing information, aids, and learning on-demand tools at the moment of need.
When is Performance Support needed? Conrad Gottfredson aptly describes five moments of need when PS is required. He calls them "Learning at the moment of need."
1. When learning for the first time
2. When learning more
3. When applying what was learned or trying to remember
4. When things go wrong
5. When things change -- and there is now a new way to perform
Our industry has focused on the first two; it's now time for us to figure out how to address needs three through five as well. In today's scenario, change management is going to be the biggest challenge and will call for innovative measures--in business processes and training deliveries.
Our easy-to-use performance-support solutions for software applications are principally designed to help learners during those five moments to:
– Understand the various features of the application
– Be aware of the purpose and the business need of the application (business process guidance)
– Understand the different roles and processes the application may support
– Get up to speed with the application
– Practice using it in a safe environment
– Understand the benefits of optimal usage
– See the consequences of making errors and learn from the mistakes
– Work independently and accurately
Please add to the thoughts and provide inputs.
Here's a link to a video that shows the difference in attitude between "hands on learning" and "you shove it down my throat" information, or "pull" vs. "push" as mentioned earlier on...