Introducing eLearningQA



At Uncanny Owl, we like to experiment with new product ideas. Our latest is a service called eLearningQA and it’s targeted at elearning professionals.

In our experience, many elearning companies and designers don’t have a strong grasp of testing. Learners are rarely involved in the development process (or only superficially), even though their use of the elearning ultimately determines a project’s success or failure. Moreover, elearning professionals typically do technical testing themselves, using platforms and devices that don’t necessarily reflect what their audiences are doing. With the rise of Bring Your Own Device environments, mobile learning and bigger audiences, robust testing is becoming even more important. Unfortunately, many companies don’t have the tools or expertise to perform testing themselves.

We created eLearningQA to try to address that gap. It focuses on four testing areas to improve elearning programs: usability testing (is it easy for the audience to use?), cross-platform testing (does it work for everyone?), load testing (will our LMS go down when the launch email goes out?) and expert feedback (is it actually a good program?). Most elearning companies can’t do these things themselves, especially with objectivity. They may not have the tools or people to perform neutral usability testing, but it’s essential that they know their programs are easy to navigate and use before launch. And how many companies can perform both manual and automated testing to make sure their elearning works on the 20+ desktop and mobile platforms their learners use? Or see how their LMS holds up when 500 learners are in there at once? Even if companies can track down people and tools to support testing, few will understand how to interpret the results and take appropriate corrective action.

Once companies start realizing the importance of testing and see how much it can improve programs, we hope they turn to eLearningQA. We can perform or coordinate all of that testing, and even better, we can interpret the results and suggest practical and cost-effective strategies to make improvements. After all, if companies have the opportunity to spend 3% of their budget to deliver a better program that improves learning outcomes by 20%, wouldn’t they consider it? We hope so, and that’s why we’re testing the market to see if there’s a place for eLearningQA.
