Don’t do a pilot in your digital workplace

I see a lot of enterprise social network (ESN) pilots where Yammer is ‘given’ to the IT department (who have already been using instant chat since 1996 / 99) and the Internal Comms team (who mostly sit together) as a trial. Feedback often boils down to whether individuals like Yammer, rather than whether it enhances core work. This isn’t a pilot at all, and it isn’t user research; it’s personal learning at best.

A pilot should be a study, with defined objectives (lessons to learn, things to change), a measurement process, and a prepared group of people selected because they stand to benefit from those objectives.

But often, a pilot ends up just being about giving a team or department the new tech and then seeing how they use it – with no structured feedback system and little thought to whether they need the new tech.

Of course, there’s nothing wrong with exploring the features of a new tech system; we have to be open to new ways of working and learn about the options ‘out there’. But tech should suit the jobs in hand (or the new requirements that come from a new objective or strategy), and pilots shouldn’t be about plonking software in front of people and asking if they like it.

Stop, start, continue

So, as you plan your pilot study or ‘soft roll-out’, have a clear idea of which tech and channels people can stop using when they start using the new tech / channel.

A huge concern that clients always raise with me is adding to a person’s workload – giving them a new thing to do just to get through the workday. This is why I’m always keen to see new tech and channels replace existing tech and channels. I’m not suggesting a strict ‘one in, one out’ approach every time, but it would be a fine tactic in many organisations with overly complicated digital workplaces.

Measures

You want metrics / stats so you know how much of a success the pilot is. Avoid merely discovering whether people liked the new thing; liking a new way of working might well be important to adoption, but people will almost certainly like something that’s easy to use and directly helps them complete everyday tasks and achieve existing goals.

Assuming you expect the new tech / new ways of working to massively benefit the organisation and your colleagues, you’ll want to see it rolled out across all relevant departments or teams. So, stakeholders will want evidence, rather than only conjecture, to convince them to go through the change.

The measures should relate to the objectives; perhaps you want to reduce delays in a certain process, increase participation in an initiative, or eliminate errors from a particular product / service.

It’s a good idea to get metrics about productivity / outcomes / ease of use before the pilot – measure the existing process / tool use. Then measure halfway through the pilot, and see whether one-to-one coaching can help people. Finally, measure at the end and see if things are on a par or improved. Knowing how good or bad things are now will help you, decision makers, and end users appreciate the benefits of the change if the pilot study produces great results.

How to run a decent pilot study

The most valuable pilots are well structured, with measures and checks, set goals (if objectives are too specific, just have measurable goals), and the right people involved.

You should know what makes the pilot a success, and what counts as not successful enough to justify making the expected change across the organisation / division. Your success criteria should be explicitly stated upfront.

Make it real

The pilot should be about existing work (or new work as part of your change management strategy) so that participants have real need of the new tech / system / channel and benefit from using it. Serving a defined business need is the way to go, and my entire argument here.

You might know the exact processes or ways of working that need radically improving, and then seek out the appropriate technology to suit. That’s the ideal. But you, in your wisdom, might have certain new tech / channels available and just kinda go looking for the processes that could be improved with them; this is often the case when organisations want to get the most out of their Office 365 subscriptions.

So, find a process or outcome that you imagine can be optimised with your new tech. It might be as simple as reducing email between two groups of people (and therefore reducing process lag) such as Procurement and Facilities. It might be as brilliant as replacing internal paper forms with digital forms and online workflow. It might be about replacing a well-used but ‘broken’ channel with a new channel with additional capabilities for fieldworkers or salespeople.

The idea is to replace something that is already happening, not add something to participants’ workload. The idea is to help with real work that is already important, not merely add a new communication channel without a purpose.

People should be trained a little. This will help you consider the learning curve of the new system, helping you plan the cost and time needed for a training offering for the full roll out. I won’t discuss training methods, as it all depends.

A ‘real’ pilot study checklist:

  • a defined purpose (objectives, goals)
  • a defined process or outcome to replace or support
  • agreed things to stop and start during the pilot
  • measures and checks throughout
  • defined success criteria
  • participants selected by appropriateness
  • a little training for the participants
  • a ‘champion’ to go to with questions and problems
  • a lightweight risk management plan that simply enables people to go back to the old ways if the champion and pilot owner agree.

And have a plan for fully adopting the new tech / channel if the results are so good that stakeholders have to back it!

Wedge Black

I support ClearBox in everything we do online, and I assist clients that are considering redeveloping or replacing their intranet platform. I worked in global and regional organisations as the intranet manager as part of the comms team, before becoming an intranet consultant. I'm the founder of the Intranet Now annual conference. I’ve tweeted about intranets and comms for fifteen years now.
