An Experiment in Guest Blogging

Last week I decided to write an article on e-learning and see if a popular e-learning news source might be interested in publishing it. Even if it didn’t work out, I thought I might be able to get some feedback and insight on how guest blogging worked. And if it was accepted, maybe we’d get a little more traffic and recognition for Uncanny Owl.

Because it was just an experiment, I didn’t put much work into the article. That may have been a mistake, because the article took off and got a lot of unexpected attention. It was viewed by over 1,500 people in the industry, shared more than 125 times on social networks, and heralded by the site editor as one of the most exciting things he’d seen in months. The response was overwhelming, and it’s kind of exciting to see industry heavyweights cite your work.

The experiment was undeniably a success. We’ve had almost 50 visitors check out our website as a direct result of the article, we’ve established more credibility, and the quality backlinks will help our search engine placement. I guess it also means we need to write more… Any suggestions for new topics?

If you want to check the article out, it’s available here: http://elearningindustry.com/the-state-of-offline-mobile-learning. It’s admittedly rough, but now we’re better prepared for next time.

Update: My second attempt at an article had similar results and is available here: Positive Feedback – Lessons from a 2-Year-Old.

Finding the Right Tools

We’re still on the hunt for the right tools for the right workflow – and we’re failing miserably. With so many SaaS products on the market today, it should be easy to find something that fits our needs, right?

Surprisingly, we’re struggling. We’ve spent the better part of two days looking for the right tool to manage client projects. It shouldn’t be that hard, right? Set up some tasks, organize them into a project, distinguish between billable and non-billable hours, track time, invoice clients, and maybe even give clients access to see what’s going on. But apparently nobody can get it quite right, and to find out what’s missing, these services need me to give them credit card info just to check out their products in more detail.

So what have we found? A supposedly great project management tool doesn’t handle time tracking. Task managers don’t allow planning future tasks, and forget about durations, dependencies and non-billable time. Easy invoicing built in? Not if you’re in Canada. Another seemingly great tool won’t let us see all of our tasks at a glance – we have to drill down into 5 potential projects to figure out what’s going on at the company level.

All of the niche products showing up now also rely on “integrations” to fill the gaps in their offerings. Rather than offer a complete product, they let you hook up your XYZ subscription, which does a great job! So instead of finding one complete service, we’re paying hundreds a month for lots of little incomplete services and still winding up with gaps. Is this really the future of SaaS? And why isn’t anyone doing project management in a way that fits our needs? Yes, we’ve checked out Basecamp, Harvest, Copper, Planscope, Toggl, Tempo, MinuteDock – you name it.

Offline Learning and the LMS

Uncanny Owl recently completed a project that required redesigning the interface for an offline iPad application that captures test data. It used FileMaker desktop and mobile applications to track test scores without any kind of network access, for eventual consolidation and reporting. It did the job, but it wasn’t intuitive, it didn’t integrate with other learning data, and ongoing maintenance could be difficult.


While outside of our scope, we did a little digging into possible alternatives for this type of scenario. It turns out that there really aren’t a lot of tools that support offline data capture for eventual upload into an LMS! The Tin Can API looks promising, but there are very few LMS options and authoring tools that support it, and those that do are very expensive (particularly for the capture of modular test data). There are HTML5 possibilities too, but they would require some customization to get data into an LMS.
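To make the Tin Can API idea concrete, here’s a rough sketch of how offline capture could work: statements are queued locally on the device while there’s no network, then flushed to a Learning Record Store later. The actor, verb, and activity URIs below are made-up examples, and this is only an illustration of the statement format, not a production client.

```python
import json
import uuid
from datetime import datetime, timezone

def make_statement(actor_email, verb_id, activity_id, score_scaled):
    """Build a minimal Tin Can (xAPI) statement for one answered test item."""
    return {
        "id": str(uuid.uuid4()),
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "actor": {"objectType": "Agent", "mbox": f"mailto:{actor_email}"},
        "verb": {"id": verb_id, "display": {"en-US": "answered"}},
        "object": {"objectType": "Activity", "id": activity_id},
        "result": {"score": {"scaled": score_scaled}},
    }

class OfflineQueue:
    """Accumulate statements while offline and serialize them for later upload."""

    def __init__(self):
        self.pending = []

    def record(self, statement):
        self.pending.append(statement)

    def serialize(self):
        # This JSON array is what would eventually be POSTed to the LRS's
        # /statements endpoint once the iPad regains connectivity.
        return json.dumps(self.pending)

queue = OfflineQueue()
queue.record(make_statement(
    "learner@example.com",
    "http://adlnet.gov/expapi/verbs/answered",
    "http://example.com/assessments/module-3/q1",
    0.8,
))
```

Because each statement carries its own timestamp and unique ID, the LRS can accept a whole backlog at once without losing the original ordering of offline activity.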

With all the interest in mobile learning, where are the tools that support offline learning? Yes, maybe something like Storyline could work, but one assessment question per screen is a big limitation for our scenario. What tools are you using to support offline learning on iPads?

Hypothesis Testing – Finding Participants


Further to the last Uncanny Owl post, we’re finding it more difficult than expected to get certain groups to complete our surveys. We’re trying to collect data for our Canada Translates project, and to do that we need feedback from two groups – translators and businesses that need translation. Finding translators was easy: we found a board used by translators, posted a compelling offer, and waited for the responses to trickle in.

But where do you find random people who need translation? We’ve tried Google ads, our networks, targeted landing pages and more, but we’re still not getting submissions. Here’s one of the pages that’s not attracting interest or converting: www.canadatranslates.ca/canadian-translation/. How can we improve our participation rate?

On a related note, here’s an important lesson that we learned: be very careful with Google’s “Broad Match” option in AdWords. The scope was far too broad to be of any use, and we spent more on advertising than we should have. Multiple, targeted ads were much more effective.

Hypothesis Testing – A Lesson in Hacking

Last week we recovered from a catastrophic data loss with one of our hosting providers. This week some of our hypothesis testing attracted the interest of a rather clever script kiddie looking to mine Amazon gift cards. We’re having lots of fun online. 🙂

We’re currently trying to get feedback from translation customers and providers over at www.canadatranslates.ca. To do that, we’ve got some links and ads directing potential clients to surveys to collect some of the data we need to validate our business model. As an incentive to participate, we offered professional translators $5 gift cards to complete a 5-minute survey. A few people participated the first day the site went live but it wasn’t popular.

Overnight on the second day, 40 surveys trickled in. This was pretty shocking – how did we go from 1 every 8 hours to 6 per hour? I looked closer and the submissions didn’t make sense. Values weren’t aligned with what we expected, email addresses didn’t match names, submission times weren’t far enough apart… the data was just too suspicious. I took everything down while I investigated. Whoever tried to mine the gift cards did a pretty good job of making the submissions look legit. Entries came from Canadian IP addresses in different parts of the country, and unique Canadian addresses accompanied each submission. After adding a captcha and some mandatory fields that required valid text input, I broke our visitor’s script and made it easier to identify false entries. It was a pretty annoying lesson to learn, though, and I had to manually clear out all the bad data from our records.


Lessons Learned

For our next surveys, here’s what we’ll do differently:

  • Make several text fields mandatory that require unique submissions and at least a sentence or two of text
  • Include a captcha
  • Block multiple entries from a single IP address
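The IP and timing checks above can also be automated before any gift card goes out. This is only a sketch – the field names, thresholds, and sample data are made up for illustration – but it shows the basic idea of flagging entries that share an IP address or arrive in an unnaturally fast burst:

```python
from collections import Counter
from datetime import datetime

def flag_suspicious(submissions, min_gap_minutes=10):
    """Return the indices of entries that share an IP or arrive too close together."""
    flagged = set()

    # Rule 1: more than one entry from the same IP address.
    ip_counts = Counter(s["ip"] for s in submissions)
    for i, s in enumerate(submissions):
        if ip_counts[s["ip"]] > 1:
            flagged.add(i)

    # Rule 2: consecutive entries spaced closer than the minimum gap.
    ordered = sorted(range(len(submissions)), key=lambda i: submissions[i]["time"])
    for a, b in zip(ordered, ordered[1:]):
        gap = submissions[b]["time"] - submissions[a]["time"]
        if gap.total_seconds() < min_gap_minutes * 60:
            flagged.update((a, b))

    return flagged

entries = [
    {"ip": "203.0.113.5", "time": datetime(2013, 6, 1, 9, 0)},
    {"ip": "203.0.113.9", "time": datetime(2013, 6, 1, 9, 4)},   # only 4 minutes later
    {"ip": "203.0.113.5", "time": datetime(2013, 6, 1, 21, 0)},  # repeat IP
    {"ip": "198.51.100.7", "time": datetime(2013, 6, 2, 10, 0)}, # looks legitimate
]
suspicious = flag_suspicious(entries)
```

Neither rule is proof of fraud on its own – shared office IPs are common – but together they give a shortlist to review by hand instead of paying out first and cleaning up later.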

Luckily our survey was small and we had few submissions. If our work had been more popular, our guest’s interference could have cost us $200 and invalidated our research data.

Hello World

Welcome to Uncanny Owl’s official blog! This is where we’ll informally talk about some of the things we’re working on and some of the things we’re learning. It’s also where we’ll ask for help from you.