The Uncanny Owl Blog
We recently created a number of screencasts for clients to support software training. As public-facing screencasts for enterprise software, the standards had to be very high. That meant a single person couldn’t do everything: we needed a professional voice actor, a software SME, and someone from our team to handle instructional design and editing. While we’ve certainly created many screencasts before, I looked online for workflow best practices for producing high-quality screencasts with multiple contributors.
Unfortunately, all I could find was a lot of complaints and very little guidance. For this type of screencast you can’t wing it; having the SME record the video and then building a script around it just doesn’t work. And getting the voiceover done prematurely, without proper planning, tends to mean rework. So what’s the best approach?
While it might not work for every situation, we’ve developed an approach to screencasting that generally works well for Uncanny Owl. Here are the steps we follow:
- Plan everything. We start off by looking at the planned outcomes and objectives. What does the learner need to get out of the screencast? What’s the best way to achieve it using the software? This certainly requires a lot of collaboration with an SME and production of a draft script. We can have the software up on the screen and walk through exactly what the learner needs to see and how best to explain it.
- Record the video. The SME does the actual interaction with the software while someone from our team observes the recording, doing an initial script reconciliation and verifying that the pace is appropriate. With 2-3 good takes, there’s typically enough video to perform a more detailed reconciliation and to tailor the script as needed. If anything needs to be recaptured, it can be done almost immediately. It may even help to have someone read the script aloud while the SME performs the activities.
- Finalize the script. This needs to get as close to perfect as possible before it goes to the voice actor. Try recording it yourself and listening to it both with and without the video to make sure it flows well and is straightforward for learners.
- Record the audio track. Since the voice talent may not have the video context to work with, include directional cues in the script as required.
- Put everything together. Cut up the audio and add it to the video, syncing everything so it seems like the person interacting with the software is also the person speaking. Trim or extend the video as appropriate, and make sure any edits don’t hurt the pacing or cause problems with mouse movements. Once that’s done, the screencast is ready for publishing!
That workflow has produced generally good results for us, but we’d love to hear your tips for better screencasts in the comments.
Sometimes the elearning solution you want costs more than you want to spend (or more than you can spend!). Balancing budget and scope is always a challenge, and the cost of elearning can vary widely depending on the context and requirements. According to a 2010 research report by the Chapman Alliance, the cost of a 1-hour elearning course can range from an average of about $10k for a basic, linear course with static media to as much as $50k for a highly interactive and dynamic program. In this article, we’ll look at some ways to keep your project costs lower when you work with elearning vendors.
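To get a feel for how quickly those per-hour figures add up, here’s a rough back-of-the-envelope sketch in Python. The numbers are just the 2010 Chapman Alliance averages cited above; the function and its tiers are illustrative, not a real pricing model.

```python
# Rough cost estimate for an elearning project, using the 2010
# Chapman Alliance per-finished-hour averages cited above.
# Illustrative only; real vendor quotes depend heavily on scope.

COST_PER_HOUR = {           # USD per finished hour of elearning
    "basic": 10_000,        # linear course, static media
    "interactive": 50_000,  # highly interactive, dynamic program
}

def estimate_cost(hours: float, level: str) -> float:
    """Return an estimated total cost for `hours` of finished content."""
    return hours * COST_PER_HOUR[level]

# e.g. a 30-minute basic module vs. a 2-hour interactive program
print(estimate_cost(0.5, "basic"))        # 5000.0
print(estimate_cost(2.0, "interactive"))  # 100000.0
```

Even a modest half-hour interactive module lands in the tens of thousands, which is why the tips below focus on trimming scope and unknowns rather than haggling over rates.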
Make sure your goals and objectives are clearly defined before including any outside parties in an elearning project. What do you really need and what are the expected outcomes? Risk and unknowns are going to increase quote costs and potentially lead to expensive rework late in the project. The more you can define and prepare up front, the less you’ll need to spend.
Keep everything as simple as possible. Use animation and interactivity sparingly to improve knowledge transfer, not just to look good. Think about what really needs to be custom and what existing resources can be leveraged.
Compile all of your subject material and organize it for easy hand-off. This step can save a lot of time by eliminating expensive research and review cycles. Where material exists but isn’t in a format suitable for easy incorporation into elearning, make improvements. Make everything as simple and straightforward for the vendor as possible: if vendors can see up front exactly what will be provided, they can better assess their effort and price the work accordingly.
Reduce review and testing requirements. Perhaps 1 or 2 review sessions with 3 people in a room are enough, rather than 3 rounds with 6 people conducted by email, with updates in between. When it comes time to test the elearning, maybe a handful of users on 3 platforms is enough, rather than extensive cross-platform testing with a large pilot group.
All of these ideas should help lower costs while not significantly changing the scope and outcomes of your project. Try doing whatever you can in-house and make the vendor experience as easy and straightforward as possible.
If you have any other tips, feel free to add them in the comments below!
We covered the launch of www.torontoelearning.com in a previous post and just completed some updates for April. Since stabilizing the platform, we decided to test it with a completely different market—regional track events. The new website, www.torontotrackdays.com, has already seen huge interest from the local racing community (250+ organic likes on Facebook within 6 weeks of launch!) and has become the most complete source of track event information for the Toronto area. So far, we’re very pleased with the results and its positioning for the upcoming track season.
Toronto Track Days also gave us an interesting opportunity to test some new technical ideas that we can apply to future elearning projects. (We’ve had a number of complex platform requests recently, so it helps to know how far we can push our sites with available tools.) The event calendar we used for torontoelearning.com wasn’t robust enough for the large number of events we required, nor could it handle the complex filtering and sorting that users would need, so we developed a Gravity Forms solution. Now users and businesses can enter events, see them immediately, and make changes as needed, all without our intervention. With so many early users, it’s been a great experiment in user-generated content and in organizing contributions so they add value for visitors. We can apply many of the lessons to elearning and social sharing, like how to better engage users and encourage active participation.
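The Gravity Forms side of this is configuration rather than code, but the filtering and sorting behavior the calendar needed can be sketched in a few lines. The event records and field names below are hypothetical, and Python stands in purely for illustration; the actual site runs on WordPress.

```python
from datetime import date

# Hypothetical user-submitted event records, illustrating the kind of
# data a track-day calendar collects (all names and dates are made up).
events = [
    {"title": "Lapping Day",   "venue": "Mosport",      "date": date(2012, 5, 12)},
    {"title": "Novice School", "venue": "Shannonville", "date": date(2012, 4, 28)},
    {"title": "Open Track",    "venue": "Mosport",      "date": date(2012, 6, 2)},
]

def upcoming(events, venue=None, after=date(2012, 4, 1)):
    """Filter events by optional venue and cutoff date, then sort chronologically."""
    matches = [e for e in events
               if e["date"] >= after and (venue is None or e["venue"] == venue)]
    return sorted(matches, key=lambda e: e["date"])

# All upcoming Mosport events, earliest first
for e in upcoming(events, venue="Mosport"):
    print(e["date"], e["title"])
```

The point of letting users submit their own records is that the filter and sort logic stays generic: new events simply appear in the right place without anyone on our side touching the data.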
If you have any questions about technical considerations or other lessons learned from our Toronto Track Days experiment, post in the comments and we’ll be happy to provide feedback!