#277

How to use experiments to accelerate your career development

Experimentation is a great way to learn at work and accelerate opportunities for your career.

In this podcast episode, Helen and Sarah talk about practical ways you can put experiments into action and try out different approaches to support your development.


Episode Transcript

Podcast: How to use experiments to accelerate your career development

Date: 3 May 2022


Timestamps

00:00:00: Introduction
00:02:13: A description of experiments
00:03:41: The benefits of experimenting…
00:03:57: …1: they can help you to get unstuck
00:04:29: …2: they can reduce perfection pressure
00:04:57: …3: they allow us unconstrained curiosity
00:06:13: …4: you can find your fit before you commit
00:08:48: Top tips for experimenting…
00:09:00: …1: always have a hypothesis
00:13:05: …2: explore your execution-to-experimentation scale
00:18:29: …3: let people know when you're experimenting
00:21:34: …4: find your experiment energisers
00:25:44: …5: collect feedback fast and frequently
00:29:52: …6: aim low
00:34:20: …7: measure what matters to you
00:36:58: …8: celebrate successful experiments, not successful outcomes
00:39:23: Recap of top tips
00:39:59: Final thoughts

Interview Transcription

Helen Tupper: Hi, I'm Helen.

Sarah Ellis: And I'm Sarah.

Helen Tupper: And you're listening to the Squiggly Careers podcast, where every week we talk about a topic to do with work, to help you get a little bit more confident and feel a little bit more in control of your career.  And if it's your first time listening to the podcast, you do have a few episodes to catch up on; but don't worry, because every episode is supported with a PodNote and a PodSheet, so you can listen and absorb and hear our stories and hear all the actions we've got for you, and then you can download the PodSheet and it will just help you a little bit more with your reflection, and maybe it will give you something to talk through with somebody else as well.

Today's episode is all about how to use experiments to be better at your job.  When I was looking at some definitions of what an experiment is, it all felt very science-y and not very Squiggly.

Sarah Ellis: It's probably fair enough, given it's where they came from, I think.

Helen Tupper: Yeah, and it's such a shame though, because actually as we've started to, I guess, embrace experiments in the work that Sarah and I do, we've really seen the benefit of them, both for us in our jobs today, like how can we improve and try new things out in the job we're doing today, and give yourself some space and permission to do that; but also, how you can make some experiments for your career more broadly.  And with that in mind, we want to share in today's episode lots of different ideas for action, so that you can use experiments to be better at your job, and use them to help you explore your future a bit more too.

Sarah Ellis: Yeah, I think we have to let go of the image in our mind, which I can't help but create; you know, every time someone says experiment, I just think "Bunsen burner"!  I wasn't brilliant at science at school, and I just remember that being the one bit I really enjoyed, when you got to get the Bunsen burners out.

Helen Tupper: I don't know if you know, I got a D in my science GCSE.

Sarah Ellis: Did you?

Helen Tupper: Yeah.  I retook it, because I was so embarrassed about it, so I retook it over the summer and got a C, so not much better!  But, science was never my hotspot, so yeah, I also probably have a difficult relationship with the idea of experiments being science-y.  But now I've reframed them as being Squiggly, I'm all for an experiment.

Sarah Ellis: I really like Margaret Heffernan's way of describing experiments.  I found it really helped me to reframe what you're trying to achieve with them, and why you'd spend time thinking about experiments.  And she describes them as, "They're how we prototype the future we want to create", which I appreciate that's not a very everyday definition either, but I find it quite a propelling definition in terms of, given the amount of uncertainty and change that exists in our careers and in our organisations, the idea of standing still, of things staying the same, probably feels unrealistic, but also not that motivating for most of us.

So, I really like this idea of prototyping, of trying stuff out.  So, you don't necessarily have to use the word "experiment", but I think certainly the conclusion we've both come to is that although you can use different phrases, there is a lot about the process and the approach of experimentation that is really helpful to adopt as you're thinking about what this looks like for you in your job and in your career; because, I think we both found, as we've been doing more experiments, where you maybe forget to do things, or you maybe don't approach an experiment in quite the right way, you miss out on some of the value.

That's one of the things we also wanted to talk about today, is almost how do you extract the most value from an experiment?  I think there are almost fundamental building blocks that you have to have in place to make the most of them, if you're going to really do this well and in a useful way.

Helen Tupper: And I suppose there's the "Why bother?"  So, if you're going to do these and you're going to learn, you're going to adapt and you're going to get insights, what is the bigger benefit for us in our careers?  We've got four, so maybe if I do a couple, Sarah, and you do a couple for everyone, then we hope that we'll have convinced you to take some effort with your experiments.

So, the first way that experiments can help you is they can help you to get unstuck.  So, they're a bit of a forcing function for you to do something differently.  Often in our careers, we can find ourselves just doing things the way we've always done them before, because that's maybe a bit easier, or you're on autopilot, because you're trying to do so much stuff.  And at times when our careers change around us, that can lead us to getting a bit stuck.  But experimentation, particularly when you almost plan to do it, it's a forcing function for you to do things differently and create new opportunities and new ideas.

The second thing that an experiment can help you with is, it can reduce perfection pressure.  I have really appreciated this, because when you think about something as being an experiment, it doesn't need to be perfect, you don't need to get it right, and you are giving yourself permission basically to learn.  It's like intentionally learning, and even if it fails, there's something I can learn from failure, which takes the pressure off everything always succeeding and going the way you always want it to, because it's not really about that.

Sarah Ellis: The third reason why experiments are so helpful is, I think they give us the opportunity to almost have unconstrained curiosity and perhaps even before unconstrained curiosity, I think most of us acknowledge that we probably need just more curiosity generally in the work that we do and in our jobs, because it so often gets deprioritised. 

There's a brilliant Harvard Business Review article, called Building a Culture of Experimentation, and in that article, which we'll link to in the show notes, there's a quote that says, "Everyone in the organisation, from the leadership down, needs to value surprises", and I quite like that idea, because it almost is quite a challenging statement.  You know you often hear, and I've definitely heard this from people I've worked for, "I don't like surprises, I don't want to be surprised", and I understand that sentiment; but that idea of going, "Oh, I'm surprised, perhaps that worked better than we expected", and not just thinking, "Okay, fine" and moving on to the next thing, but, "Why did that work better than expected?" or, "That really didn't work, but we thought it would".

So what does that tell us, that curiosity and enquiry that I think comes from successful experiments?  Actually, there are some really good HBR articles on experiments, a couple of which we'll refer to later, but they're worth reading, because I found them really helpful as we were preparing for today.

Then, the final reason why we thought it might be helpful to think a bit like a Squiggly scientist is about finding out your fit before you commit, which unintentionally rhymes.  This is really when we're looking at experiments through the lens of where your career could take you.  So, I think as Helen described, you've got experiments in your day job to do a better job of the role you're doing right now, and then you've got, well how could you experiment with exploring your possibilities, about thinking about your career? 

If there's a role you're interested in, thinking about, or a career change that you're exploring, think about what you could experiment with: how could you start side projects or volunteer, or work on some campaigns or causes, just to help you experiment with, "Does that squiggle feel like it's going to be a good fit for me?"  Actually, when we were researching this, I thought of one of my friends, Becky, who is a scientist, as in a real-life scientist.

Helen Tupper: A proper one, not a Squiggly Career scientist, a proper one!

Sarah Ellis: No, she's got a PhD and everything!  She's also interestingly a very successful squiggler.  So I was like, "She's like a scientist crossed with a Squiggly Career person".  She's gone from being a scientist to working in consulting, she's worked in really big charities and she now runs her own shop, which is called The GreenHouse, in a place called Ripon, which is in Yorkshire here in the UK, and it's all about renewables, an eco-friendly, very accessible shop.  And she's building that from scratch, having never run a shop before, and started her own Instagram, again having never really done anything on social media before.

I was chatting to her and just said, "Do you think your scientist brain has helped you with squiggling?"  I don't think she'd ever quite thought of it in that way before, but then it was really interesting, because she said, "I treat each squiggle like an experiment", and I hadn't told her what we were going to be talking about; and she said, "It means I stay objective and curious, rather than feeling like a failure, if it's not a complete success.  And, all experiments are useful if you approach them in the right way".  And then she shared loads of really good top tips with me on applying her scientist brain to squiggling.  I thought, "Right, that is absolute gold dust!" 

So, we're going to take lots of that and perhaps turn it into a LinkedIn article, I'll perhaps ask her to write a few of those things, we'll put those things on Instagram, but you'll hear a few more examples from her as well as we get into our top tips.  But I just thought it was interesting to find a person who had both of those things.  That was me testing my hypothesis that if you're a scientist, you can squiggle successfully, which I've decided is true!

Helen Tupper: So, we've got eight top tips for you now, so that you can put this idea of experiments practically into action in your career, and we'll talk them all through in turn and try to bring them to life as much as possible for you.

So, tip number one is to always have a hypothesis.  This does go back to science at school stuff really.  You need to know what you're trying to prove or disprove, otherwise it's just doing something because you can do it.  But you've got to really understand what it is that you're trying to learn by doing something differently.

Sarah Ellis: Yeah, I had a previous manager who was very good at always saying, "What's the essay question here?" or, "Be really clear about what's the problem you're trying to solve".  And sometimes, I think we are so keen to get started, maybe particularly if you're a doer, we've talked before about thinkers versus doers, and so often our tendency is to jump straight into action, to feel like we're making progress.  And I think if we haven't got the clarity on the question, on the hypothesis, you've skipped a really important stage.

I'm a natural thinker, so I always want to do this, I'm naturally drawn to spending some time figuring this out, but a few times I can think of examples in my career where I haven't, and I haven't because there's been that pressure to make progress.  So I was just, "I think I broadly know what I'm trying to achieve, so I'm just going to get started", and then things have got messy and taken longer, and I've really regretted it.  I can think of some really specific projects where I haven't done this, and they've been experiment-type projects, and then I've gone back and almost had to connect some dots in slightly unhelpful ways, and people have asked questions where I've thought, "I should know the answer to that, but I don't, because we just tried to get tasks done too quickly".

Helen Tupper: I think as well, when you're trying to test something and you're trying to have a hypothesis, it's useful if you're trying to test one thing at a time.  So, let's say I'm testing a new way of presenting and I think, "What I'm going to do today is I'm going to try an experiment with new presentation software, so I'm not going to use PowerPoint, I'm going to do something completely different; and at the same time, I'm going to experiment with standing up when I present, rather than just sitting down; and, I'm going to experiment with only asking questions, not giving answers".

Sarah Ellis: That's a lot.

Helen Tupper: Yeah, it's a lot.  It's quite a lot for you to do, but it also makes it a bit harder to learn, because the feedback that you'll get, and we'll talk about feedback as part of our tips; but the feedback that you'll get, it will be hard to know which one of the things was working.  So, when you have a hypothesis, something you want to test, just make sure it's quite focused, or you might not be able to learn a lot from it, because it will all just kind of become one and the same thing.

But to give you an example of something that we were trying to test and experiment with, one of the bits of insight that we have had, and if you're a regular listener, you'll probably know this, we produce quite a lot of assets, a podcast a week, a PodSheet a week, lots of different sessions, lots of different social media posts, and we had had some insight that it can be difficult for people to find what they need.  There's so much stuff that people don't know where to start with some of the Squiggly resources that we put out into the world, which is maybe a nice problem to have, but we want to make it easy for people to get the help they need.

So, we were experimenting with, could a Squiggly Career Knowledge Navigator be a useful tool?  So, I started with a prototype, which was a PDF that basically started with a question that people had and then developed one of those flow charts, "Do you need to know this?  Do you not know this?  Yes, no", and it navigated them to the resource that would be most helpful for them.  And doing it as a PDF meant that it was something I could do quite quickly.  And the thing that we were trying to test was, was this useful, in light of the insight we'd had about people finding what they needed to know; was this a useful way of helping people to navigate the Squiggly Career knowledge we put out there? 

The answer was a big yes, loads and loads of people loved it.  Now, do I think that that is the right tool?  No, but do we now know that some kind of knowledge navigator, maybe it sits on our website in the future, or maybe it's a bot, or we don't know what it is; but that idea of knowledge navigator is definitely useful for people.  And we had the hypothesis and we put something into the world that proved that point.

Sarah Ellis: So, our second top tip for experimenting is, have an execution-to-experimentation scale.  This is something that has worked really well for us at Amazing If, so we're essentially sharing something that we've been doing, because we see how helpful it is.  And ironically, this is not an exact science, even though we're learning from scientists today, but in everybody's job, I think you have a range of projects that you're doing as part of your role, and some of those projects are not the place to experiment. 

Somebody might argue differently with me, I don't know; someone might say, "Maybe we should have experiments as part of everything that we're doing".  But certainly in our experience, what we've found is there are certain things where it is much more about execution, it is about getting things as close to right first time as you possibly can.  Maybe it's about more incremental improvements, maybe it's more about efficiency, and that's what you're trying to achieve with those things.  And almost being clear about what's at that end of the scale; maybe what are the projects you've got that do sit more in the middle, where there's some opportunity to experiment, perhaps with a certain part of what you're doing, to Helen's point, you can have really small experiments; and then there might be projects as a whole, where the whole project is an experiment, so everything about it is about experimenting.

I think, having that very clearly understood for yourself, but also having those conversations as a team and with your manager can be really useful.  So, we found that really, really helpful, in terms of just the conversations we're having within our Amazing If team.  So, at one end of the scale, we've had an experiment project, which has been our Squiggly Career Advocates Programme, and that has been a nine-month experiment.  So, that's definitely one end of that experiment scale that we were talking about, execution-to-experimentation scale. 

Then in the middle, we probably have things like our workshops, where we're continually trying to find small, micro-experiments, about how can we experiment with either, maybe it's new technology, new ways of getting feedback, new ways of people interacting in those workshops.  So there'd be, let's say, in an average workshop, there's a lot about just executing and doing a brilliant job, but there might be one thing that we're experimenting with. 

Then there are some things which are all just about executing, where it's maybe something that we've done before and we just need to get right.  Perhaps it's things like, after we've done workshops, we send out playlists of things to read, watch and listen to, and we just want those to work effectively, we want all the hyperlinks to be correct, and we want our partners who we work with to get those really quickly.  So, there's not loads of room to experiment with those right now. 

It doesn't mean that that's not a dynamic scale and that things don't change, but I think so often we don't look at our role through the lens of that execution-to-experimentation scale, and just by us doing that, I think we have done some things differently, but also identified more opportunities to experiment as a result.

Helen Tupper: In listening to you, I've got a matrix emerging in my mind, Sarah, because that's how my weird little brain works.

Sarah Ellis: Oh, here we go!  It's a Monday morning matrix, we're recording this on a Monday morning.

Helen Tupper: A Monday morning matrix, just what everyone needs in their lives!  I was thinking about, and I might share this in PodPlus this week actually, if anyone who's listening wants to come and see the matrix in action; but I was thinking about autopilot and adaptability.  So for example, I don't think executing on autopilot is great for people, because that's where you might make mistakes, you might get bored, you might miss opportunities.  Whereas actually, executing and adapting in some ways, so you're getting feedback, executing is not a bad thing, but you want to be able to adapt a bit as well.  I think that also, experimenting and being adaptable, that's when you're going to learn the most, so you're taking those insights and you're doing something different with them, I feel like that might be a really interesting spot.

I'm going to play around with this matrix, we'll see.  Come to PodPlus.  Maybe I'll experiment with a matrix, so people can give me some feedback!  Is that experimenting on experimenting?

Sarah Ellis: Well, one thing for you to think about as you're designing your matrix, I was just reflecting on there is, I wonder whether, maybe based on our experiences in our careers so far and your tolerance of failure, like how frequently you've failed before, that probably really impacts how you feel about experimenting more frequently.  Because I think, certainly when I've reflected on this for myself, working in quite big organisations, where I wouldn't say there's a real culture of experimentation, I don't think that's that unusual, I've done some jobs that had more experimenting in, and that's probably helped me.

But even the idea of failing feels quite hard for a lot of us, whereas probably there are some people who have done jobs, or maybe been in certain organisations where there's more of an experimentation culture.  So, I think we're all going to bring our own sense of, how do we feel about experimenting, how uncomfortable is it going to make us feel, how stretched will we be; and I think just recognising that is helpful, because I've certainly found we have done so much more intentional experimenting in Amazing If over the last year, and I find that much, much easier now than I did 12 months ago. 

If I think about where we were 12 months ago with it, I still found it really difficult, almost the idea of some things will fail along the way, and you're a bit less in control.  So, I think you have to let go of a bit of control, which I always find quite hard.  So, I think there might be a feelings element to the matrix as well, just to add in a new dimension.

Helen Tupper: Well, I'll play around with it and we'll see.

Sarah Ellis: Have fun.

Helen Tupper: Yeah, well I'll have fun.  Well, experiments can be fun!  So, the third idea for action and tip for you to take away is all about labelling and letting people know when you're experimenting.  So for example, let's take that idea forward that I am going to do something with this matrix, and it's going to happen in PodPlus the week this podcast comes out, so that's every Thursday morning, 9.00am, and I don't let people know.

So I go, "Here's a model" and I talk it through, and people are thinking, "This model does not make sense.  This is not a helpful model.  Helen's gone a bit rogue".  That's not really helpful for them, and not really helpful for me!  Whereas, if I said to people, "I'm just going to try a new tool out that I've been experimenting with to see if it helps to bring any clarity to this topic, let me talk it through", and I bring it to life for people, then I'm getting more buy-in from people.  They know what I'm doing, they're primed that it might not be perfect, that takes the pressure off me, and then I can just say to people, "So, what do you think; does it work, does it not work?"

When you label and let people know, it's better for them and better for you.  It means that you're priming them for feedback and you're taking that pressure off from people at the same time.  I do this quite a lot in workshops actually, and I'll say to people, "I'm trying something out for the first time today, I'd love to get your thoughts on it".  I feel like it's just better for everybody when you label and let people know that an experiment is coming.

Sarah Ellis: Well, even yesterday, you shared with me over WhatsApp something that we're going to try on Instagram.  When you shared it with me for some feedback, you were very clear, "We are experimenting with XXX", and then you'd got a specific question for me that you wanted me to feed back on, and I found that really helpful; because if I had looked at that through an execution lens, my feedback would have been very different to when I looked at it through an experiment lens.  So then, I could be much more helpful for you, and also you do have more freedom, I think, when you're experimenting.  You're like, "I get we're trying some new stuff out here.  Let's think about how we can be helpful for each other, what's going to be useful at this point?" because we never want to slow experiments down, we want to get them out into the world; because, until you get it out in the world, you can't learn from it.

Of course, you want your hypothesis first, but you were very clear in describing to me what we're trying to do here and then saying, "And, what do you think about this?"  I was like, "Let's do that thing", and then very quickly we moved on.  I mean, we did that over two or three messages in WhatsApp.  So, I think that was a good example of where you'd labelled something, and I actually didn't know that beforehand.  So, if you hadn't told me that, I think you'd have got a very different response from me, probably not a WhatsApp message, probably more of a voice message!

Helen Tupper: It definitely controls the critique, and Sarah loves giving feedback, so I actually particularly find it quite helpful, because it's like, "Hold the critique and help me with the experiment".  It's a slightly different positioning of the support that I need, particularly from Sarah, on certain points.  Other points I'll be, "Look, we need to put this out for lots of critique, because it's going to be a big part of what we're doing in the future", and that's a very different lens that I'm asking Sarah to look through.  I think that's the point here; you're giving someone a lens to look through that's going to be useful with what you're trying to learn from the experiment.

Sarah Ellis: And so, our tip number four is about finding your experiment energisers.  So, I was reflecting on the most successful experiments that I feel like I've run or been part of, either in my jobs, or thinking about my career and my career possibilities, and I do think there are some people who are naturally brilliant at experiment energising.  They are the people who help you to build better experiments, and that's because they ask really good questions, they motivate you, they're really good challengers, often they're on your side and they're trying to support you to succeed.

There are some people where I think every time I spend time with that person, if I'm trying to do something new, or that's making me a bit uncomfortable, or that might fail, I feel like just spending time with that person will make that experiment better, because perhaps they'll remind me that I've not been clear about the hypothesis, or perhaps they just ask me a question about, "Well, are you clear about how you're going to measure the right things as part of this process?"  And they've got the confidence and the clarity to do that.

It's always easier to not be in it, isn't it; it's easier to be an experiment energiser than running it yourself, but I do think some people have this as their natural super-skill.  They appreciate the opportunity to -- it's quite playful.  Actually when you read about experimenting, and I was watching a really good TED Talk, again from a Nobel Prize-winning scientist, and there's even the word "play" in the TED Talk description, and this idea of being quite playful and having that freedom when it comes to experiments, I think, is quite important.

I was reflecting that often, my tendency might be to be a bit of a solo experimenter, so I think I really enjoy experiments, but quite often by myself, and there's a few challenges to that.  Sometimes, it means that you hold on to your ideas too tightly, but I've learnt that I think my experiments are always better when it's not just me.  But that could just be one other person, or it could be 50 people, and that might change throughout an experiment.

So, when we've been doing things like, we've done some Squiggly Careers stories over the past four or five months, where we've been asking people to do a video story of their career so far, and that's been a really small team of us experimenting with those and making those happen.  But I've also run that idea past a few people outside of Amazing If, and those people have different roles, I think. 

Some people are more about making that experiment happen, like our Amazing If team are there to make that experiment happen, but those external people are the people who have made me think, "Maybe we could do this in a slightly different way?" or, "Have we thought about…?" or, "What would happen if…?", you know, all of those really useful questions that I just think propel your experiment into new areas, new territories, encourage you to explore as many different things and learn as many different things as you can as you're going.  So, if you're thinking about, whether it's for your career or your job, it's that question that we come back to for a few different areas, I think, which is, "Have you got the right people around you?"

Someone actually emailed me after a workshop last week, and I was talking about, "One of my strengths is, I love starting things from scratch, I love a blank piece of paper", and they emailed me afterwards saying, "How do you do that?" because they were saying, "I'm the opposite".  And it was so interesting, because then this person also said, "I want to experiment and have some new ideas, but also I've got a deadline of three weeks".  I was like, "Okay, well you're putting yourself under a lot of pressure to try an experiment if you've got a very clear deadline, and also very clear outcomes".  So, I think almost choosing those experiments and then choosing the right people around you, they feel like really important things to notice as you're thinking about experiments.

Helen Tupper: And when you mentioned there about freedom, experiments in freedom, it made me think actually, because freedom is one of my values and I do really like experiments, and I like having the space to go and do those things and run with them.  And I think, if you are an individual who also identifies with freedom as a value, or if you're a manager of somebody in your team who has freedom as a value, giving them some space to experiment, or giving yourself permission to experiment, is probably going to be quite fulfilling for you, because it's going to give you what you need in terms of your values.

So, our fifth top tip then for helping you to be a Squiggly Career scientist is all about collecting feedback fast and frequently, and I think this is probably the one area -- there's lots of areas I could improve on, but this is probably top of my list.  So, whenever you've got your hypothesis and you do your experiment and you're bringing people in, all the things we've talked about so far, what's really important is that once you've done your experiment, you get some feedback.

You've got to work out, was it impactful; did it help you learn about the thing that you wanted to learn about?  If you move on too quickly from an experiment onto the next thing, then you've missed an opportunity to gather the insight that will help you know whether this was useful to do or not.  So think about, "Who do I need to get feedback from?"  Maybe if you're doing that thing with presenting, maybe it's somebody who was in the meeting, and actually you probably want to do that quite quickly, whilst they can remember what you did.  Get lots of feedback and get it as close to the moment as you can. 

I was actually thinking, Sarah, about an experiment that we are doing at the moment with somebody new in our team.  So, because our business has grown quite quickly, and because we're a small business, so people do lots of different things, we often experiment with roles that we need in the team.  So, we're just bringing somebody new into the team on a seven-week learn-and-see role, which is about us experimenting with, "Is this the role that we need for the business?" and that person that we're doing the experiment with is thinking, "Is this the right role and opportunity for me?"

So, we're both doing this experiment together, and what we have got in place is the period of time, so it's seven weeks, so we know we're going to get feedback at the end of it; but every two weeks, we're having a review, a catch-up and review as part of this learn-and-see period to work out what's working well, what's not working, what could be "even better if" during this period of time.  And, I don't know whether at the end of seven weeks, that's going to be the right job for us, and whether it's going to be the right opportunity for someone else, but we've been really transparent about that with getting lots of feedback along the way, and the aim really is to learn and we'll put that insight into action after the end of the seven-week period.

So, I think that is another example for us of an experiment we've got going on in the business, but where we're collecting lots of fast and frequent feedback too.

Sarah Ellis: I think what's so nice about that experiment as well is, it's the opposite of what you'd automatically assume that you'd do in that scenario.  So usually, if you needed some quick help, often those contracts for employing someone are quite transactional.  You go, "Okay, I need a freelance person, or someone who can just come in and solve a problem for a bit", and almost the last thing that you do is learn and see; because, if you were thinking about where that would naturally sit on an execute-to-experiment scale, you'd go, "It's about execute".

Helen Tupper: Yeah, fix it fast.

Sarah Ellis: Fix it fast, yeah.  Now, I get that sometimes that might also be right, but what I think you've done such a brilliant job of identifying here is, we do have an opportunity to experiment.  It's not so urgent that somebody has to just go in and work their way through a list of 15 things, 1 to 15, as quickly as they can; we are figuring out a role and we're figuring out what's the right role and we've got a lot of questions.  And you can think about that for a long time, but probably the best way to learn, we're finding, is by doing, and to get people started.

Also, your point about transparency is really important, because if somebody coming to work with us thought it was about executing and we think it's about experimenting, then you're going to get a real mismatch of experience and expectations very quickly.  So, I think it's also a really nice challenge to think for yourself, you know where you might default, we can't help but sometimes default to, "We've always done things in a certain way, so that feels like that's an execute"; it's to always just press pause for that five minutes and go, "Is it; or could it be an experiment?"

Sometimes the answer might be, "Not now", and I think sometimes it might be, "Not this year", but not now doesn't mean not ever.  There was definitely a period of time, six or nine months ago, when we would have had a "not now" answer to that exact same question, but it's really good that we're continuing to revisit that, and it's that unlearning and relearning, isn't it, that you've got to get good at, I think, with experimenting.

Our sixth tip for experimenting is about aiming low, which sounds counterintuitive potentially, but also quite appealing, I think, in our world of relentless high expectations of ourselves and each other.  And this is some borrowed brilliance from Jim Collins, who you might have heard me interview previously on the podcast.  What he describes is that when you're figuring out experiments and when you're designing experiments essentially, try and make them as low cost, low risk and low distraction as possible.  Then, once you validate what works and what doesn't, then make sure you really concentrate on the things that you've got confidence in.

This comes from his book, Great By Choice, and really what he's talking about here is almost designing businesses and designing business models, but I think you can then apply this to where your career could take you, or when you're thinking about experiments as part of your day job.  I think it's sometimes hard to do low risk, low cost and low distraction to tick all of those three boxes, but it's a good ambition to have.  I thought about this quite a lot yesterday as I was reading about this and I was like, "The best experiments probably do need to start in this way, because then it helps to get something off the ground, because if it's high in any of those areas, it just makes it much harder". 

So, I was trying to think, what's our best example of a low/low/low, like low cost, low risk, low distraction, and I was probably thinking just our business as a whole.  It's the longest, lowest-risk experiment probably that I've ever done, in that when we started Amazing If, it was low cost in that it didn't cost us anything to start Amazing If, we didn't have loads of investors, and we still don't, and we didn't put loads of money into it; it was low risk in that, if Amazing If hadn't worked, and lots of versions of Amazing If didn't work, and lots of things that we did didn't work, it didn't matter to anyone really other than us; and it was low distraction, we'd both still got our day jobs.

From 2013 until January 2020, that was the period of time where we were still both doing, at least one of us was still doing other things.  I mean, Jim would probably say that's too long!  There's probably a point, I'm sure, where low risk and low distraction becomes unhelpful.  But if you're thinking about it as, "Well, did it do the job?  Did that experiment of Amazing If do the job of figuring out whether something worked or not, so that we could then have confidence in what we were then going to deliver as a business, it did that", but it did take us quite a long time to get to that point, because we had other priorities and we had other things that we were doing.  So, it was probably on a very dramatic end of the low scale.

But equally, when I think about things that I've done in my job before, where you've been trying to experiment with getting something new off the ground, again, if you can just make it low, aim quite low, almost like prototype -- people talk about minimum viable products, don't they, "What's your MVP?"  I think if you can get your MVP and your prototype out as quickly as you can, it just helps you to get underway and to start learning. 

It's one of the things I always think you're really good at.  You won't procrastinate; "Can I make a PDF out of it?", I always feel, is your starting point!

Helen Tupper: That is so true!  Do you know what, as you were talking I was thinking about, do you remember a while ago, Sarah, that Christmas when we decided to create something like 50 Amazing If ideas for action?  Do you remember we did it, this was years ago, this was 2013 or 2014 or something, and we produced it; again, it was a PDF.

Sarah Ellis: No, what was it again?

Helen Tupper: It was like 50 Amazing Ideas for Action.

Sarah Ellis: Oh, I do remember.

Helen Tupper: Do you remember?

Sarah Ellis: I do!

Helen Tupper: And that was relatively low cost, it was just me and Sarah and PowerPoint.

Sarah Ellis: Oh, no, I know where you're going with this story now; oh no!

Helen Tupper: I wasn't going to bring in the bad bit. 

Sarah Ellis: Okay.

Helen Tupper: I will now bring in the bad bit, because Sarah just brought it up, which was basically, there were some design challenges between the two of us, but we resolved those.  My point was more that, that didn't cost us anything, because it was just our time over a Christmas period; pretty low risk, it didn't really matter if it didn't work; and we did it during the holidays.  I think it was pre-kids as well, so it didn't really take away from work.  But getting that out into the world was actually, I think, an early version of us testing useful ideas for action, which have now become a really big part of our business and part of our values.  But I think it was an early experiment towards that.

Sarah Ellis: Yeah, and our learning from that is, don't use two different versions of PowerPoint and not tell each other about it, because it creates arguments!

Helen Tupper: I do also remember that too.  So, tip number seven is to measure what matters to you.  If you're going to do experiments, you really do need to embrace the data, you've got to work out, "What are you going to measure?" but it does need to be meaningful to you.  I quite enjoy this bit, because I think you can create your own metrics.  It doesn't have to be the conventional things, it could be, how happy does it make you; or, how useful?  That's one of the metrics that matters to us.

With Squiggly Career Advocates, one of the experiments that Sarah mentioned, we wanted to understand whether bringing that community together helped them to amplify and accelerate their career development, and that was a metric that mattered to us, because our mission is to make careers better for everybody.  So, that was one of the things that we measured when that community concluded recently.

So, defining the data that you want to measure is really important, and I think the earlier you can do that, the better, because then you can make sure that you can measure it.  Otherwise, if you're doing it at the end and you're thinking, "Wouldn't it have been amazing if we could have measured this?" it's a little bit too late.  Whereas, if you know it from the outset, then you can design around the data you're trying to collect.

Sarah Ellis: This is my personal biggest "even better if", I think by some margin.  I've spent a lot of time thinking about experiments, because I find this really interesting, and I think there's lots of value in this for all of us.  But so often, I think of this way too late, and then you've missed the moment.  I think I miss the moment too frequently.  If you miss the moment, you miss opportunities to learn.  You've not been clear about measuring what matters early enough, and then you've missed the moment to collect it, and then it's too late.  It's that capturing value again that I think I sometimes need to get a lot better at.

I was speaking to my Squiggly scientist friend, Becky, our go-to guru for this episode, although I don't think she quite expected that to happen, but that is what happened, as I started firing loads of messages at her; and she was saying to me, so obviously she's running a shop for the first time, having never done that before, and she has a shop diary on her phone which is her way of tracking not only all of the data, and she is a bit of a spreadsheet whiz, so I've got no doubt she'll have all of the facts and all the figures; but interestingly, she's also tracking her feelings.

So, one of the things that she's identified already is, enjoyment is a big part of the reason that she's doing this, this Squiggly Career experiment.  So, when she then has a choice or a decision to make about, "Is this something I want to keep doing?  Do I want to renew the lease on that shop?" she'll have data both in the form of facts and feelings together.  What I noticed about that, which I thought, "Yeah, that's really good", is that she's done that from the start and from scratch, and she's really committed to that every day as something to continually keep coming back to, so that in the moment where she needs it, it's all ready for her to reflect on.

Our final idea for action, so number eight, and we will do a quick recap for you in a second, is to celebrate successful experiments, not successful outcomes.

Helen Tupper: I love this tip, it's my favourite one, Sarah.

Sarah Ellis: Yeah, left the best until last.  Let's hope people are still listening by this point, otherwise it's a bit of a waste, isn't it?!  This does come from one of the HBR articles both Helen and I have read, which says that one of the biggest barriers, or probably the biggest barrier, to experiments in organisations is not capacity, it's culture.  So, everyone will probably say, "I haven't got time for this", and I understand that, but I think often there is this sense of, maybe organisations generally, and organisations are just the people, saying, "We want to be ambitious, we want to try new things out", but for every experiment that succeeds, probably about ten don't.  That, apparently, is about the average.

If your emphasis as a team, or if as a leader you're setting a kind of culture which is more about efficiency or predictability or winning, in quite a binary way, then experiments feel wasteful, they don't feel like a useful way for people to spend their time.  So, I think this doesn't have to be you changing an organisational culture overnight, but this could be about you reflecting on your role and your team and thinking, "Are we creating an environment where we are encouraged to experiment, where we are supported to experiment, at least some of the time?" 

I think this is probably the biggest win that we've had in Amazing If; I don't think we'd intentionally not done this before, but I don't think we'd intentionally done it either.  Those experiments don't just happen; I think you have got to decide that this is a useful thing for your team or for your organisation, that this is going to help you create new value, to spot new opportunities, to be better; and I think we are seeing that.  I think that's probably why Helen and I are -- this is probably a bit of a longer episode, because we are really passionate about this, and we can see the value that it's brought to (a) how much we're enjoying our roles, but also (b) the value that we're offering, in terms of delivering on our purpose to make careers better for everyone.

So, I think as part of this, you've got to think about, "What does a successful experiment look like?" so have you done all those steps that we've talked about, the hypothesis, labelled it, you've let people know, you've thought about the metrics that matter, you've got your fast feedback; have you done all of those things?  If you've done all of those things and it's failed, that is a successful experiment, because it's all about learning and being curious.

Helen Tupper: So, let's just recap then on those eight top tips, so that you can put experiments into action in your job today, and to help you with your career: (1) always have a hypothesis, (2) explore your execution-to-experimentation scale, (3) label them and let people know, (4) find your experiment energisers, (5) collect feedback fast and frequently, (6) aim low, (7) measure what matters to you, and (8) celebrate successful experiments, not successful outcomes.

Sarah Ellis: So, we hope you found that a useful episode.  We would love to hear any experiments that you're running, or anything that you've done to help more experiments happen in your teams or your organisations, anyone we can borrow brilliance from; we're always very open to that.  And, if you do have two minutes to rate, review, subscribe to the podcast or to share it with someone else, that really helps us to scale Squiggly, and plus we love reading all of your reviews.  So, if you get one minute this week to do that, we'd be very grateful.

Helen Tupper: So, you can send those experiments through to helen&sarah@squigglycareers.com, or just get in touch with us on Instagram, where we're @amazingif.  But thank you so much for listening today, everybody, speak to you again soon.

Sarah Ellis: Thanks everybody, bye for now.


Our Skills Sprint is designed to create lots more momentum for your learning, making it easier to learn a little every day.

Sign up for the Skills Sprint and receive an email every weekday for 20 days, a free guide to get you started, recommended resources, and a tracker to log your learning.