Insight Analysis methodology – part 2

Part 1 explored the overall insight analysis cycle. This post unpacks the ‘Analysis’ part.

Often, after the briefing, analysts go off to crunch the data, and it can seem as though your project has gone into a black box: you don’t know what’s going on until, at some point, a PowerPoint with pretty graphs appears with some results.

We find that different projects unfold in different ways, but we’re nearly always going through this same process to get from questions to answers.

Analysis steps

The first step, sorting out the data, can be the hardest and most time-consuming.
Working on the data definitions helps get everyone on the same page about what your story will be about. (What do you mean by ‘active donor’? Someone giving a gift in the last two years? OK, anything else? What if they engaged in some other way instead, does that change how we consider them?)
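
As a minimal sketch of pinning a definition like that down in code, assuming a pandas DataFrame of gifts (the donor_id and gift_date column names, and the data, are invented for illustration):

    import pandas as pd

    # Hypothetical gift history: one row per gift (column names are assumptions).
    gifts = pd.DataFrame({
        "donor_id": [1, 1, 2, 3],
        "gift_date": pd.to_datetime(["2023-05-01", "2024-11-12",
                                     "2021-02-03", "2024-06-30"]),
    })

    # Agreed definition: an 'active donor' gave at least one gift in the last 2 years.
    cutoff = pd.Timestamp.today() - pd.DateOffset(years=2)
    last_gift_by_donor = gifts.groupby("donor_id")["gift_date"].max()
    is_active = last_gift_by_donor >= cutoff
    print(is_active)  # donor_id -> True/False under the agreed definition

The value is less in the code than in having ‘gift in the last 2 years’ written down where everyone can see it and challenge it.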

Clean datasets are hard to come by, but trash in means trash out, so it’s worth at least deciding how good you think the data is, laying out the caveats, and knowing your margins of error.
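
A few quick, cheap checks help you decide how far to trust a dataset. A sketch, again with invented columns and values:

    import pandas as pd

    # Hypothetical gift data with some typical problems baked in.
    gifts = pd.DataFrame({
        "donor_id": [1, 2, 2, None],
        "gift_date": pd.to_datetime(["2024-01-05", "2019-07-20",
                                     "2019-07-20", "2030-01-01"]),
        "amount": [25.0, -5.0, -5.0, 100.0],
    })

    print(gifts.isna().mean())       # share of missing values per column
    print(gifts.duplicated().sum())  # count of fully duplicated rows
    print(gifts["gift_date"].min(), gifts["gift_date"].max())  # plausible dates?
    print((gifts["amount"] <= 0).sum())  # zero or negative gifts to investigate

None of this proves the data is good; it tells you which caveats to write down before you start.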

Everyone approaches analysis in their own style. For me there is no ABC of data exploration, just as there’s no ABC of how to be a detective or write a novel: people doing those things work at their craft and find their own ways and means.

Interpretation is the step many people miss out, because it’s easy to present observations as though they were answers. Just as thrashing out questions and hunches with clients lets you tap into their gut feel about what’s going on (they do know their activity), checking in with your clients for their opinions when interpreting will pay off: I bet they will know something that helps you interpret a trend or other interesting finding.

Telling the story is also an underrated part of the process. You often only get one shot to put your findings across and influence what your clients take forward. Don’t be boring! Don’t hide behind your caveats! Tell the story. Make recommendations backed up by your findings and invite discussion and questioning of what you’ve found. The outcome should be some solid insight to take forward into action.

 

Insight Analysis methodology – part 1

I came to insight analysis with a mixed background. I had been a fundraiser, and because I’d worked in small organisations I’d tried my hand at many types of fundraising. The other side of my experience came from science, where I studied psychology and neuroscience and worked on research in those areas. From that I learned scientific method, stats and data.

This is the general-purpose methodology that I drew up as I was finding my feet in my non-profit Fundraising Analysis role, a role that was new in my organisation as well. I knew fundraising and fundraisers, and I knew data and analysis, but this methodology brought those things together in a way that allowed me to explain the work to others.

Insight Analysis cycle

 

What to bear in mind during the process

Objectives:
What do we want to achieve? (overall Goals – increase income/improve donor care)
What do we want to do? (plan for Activities – design a new product, improve a process)

Questions and Hunches:
Questions are often the place people start. There’s nothing wrong with that for brainstorming a starting point, but the sooner it links back to your objectives, the sooner you focus in on what you really need.
Bearing our objectives in mind, what do we already know?
What do we want to know – what will help towards our objectives?
Less is more! The fewer the questions you go into the analysis with, the more focused it will be and the easier it will be to action or iterate onwards from.

Hunches (or hypotheses) about what you think the answers to the questions will be are really important. The move towards ‘insight-driven decision making’ is often tagged as the move away from gut feel towards data. I find that using client intuition and experience to form hypotheses, and to sense-check findings as you go along, keeps you on track towards useful results. Hunches are where you start to combine gut feel with the data.

Analysis: this is where you crunch the data. I unpack this in part 2.

Insight:
We have our answers, and a story that we can understand. What do we do now?

Action:
Information from insight feeds into the planning of actions.

Evaluation:
What does success look like?
How are we going to measure that?
Do we need any different/additional data capture?
Evaluation is analysis too: checking what happened, then iterating back through insight, action and evaluation while you tweak or optimise whatever you were working towards.
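
As a sketch of what that measurement step can look like, suppose success was defined up front as a lift in response rate (the campaign names and figures below are invented):

    import pandas as pd

    # Hypothetical evaluation: did the new appeal format lift response rate?
    results = pd.DataFrame({
        "campaign": ["old_format", "new_format"],
        "mailed": [10000, 10000],
        "responded": [420, 510],
    })
    results["response_rate"] = results["responded"] / results["mailed"]
    print(results)

    # Check against the success measure agreed before the activity ran.
    lift = results.loc[1, "response_rate"] - results.loc[0, "response_rate"]
    print(f"Lift in response rate: {lift:.1%}")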

And when you’re done – on to New Objectives!

In part 2 we unpack what happens under the hood of analysis.

Data, Reporting, Analysis and Insight

We use these terms all the time, often interchangeably. As more organisations work to harness the power and potential of their data, these words increasingly enter the vocabulary of everyday life.

In gathering requirements for data, reporting, analysis and insight, and delivering them as outputs to my clients, it has really helped to agree what the words mean so that we all speak the same language. As a result, when we talk about each output we can ask, and brief for, the more important questions:

what’s it for?

who’s it for?

what will it help you to achieve?

Understanding the different outputs

  1. Data
     Raw information
     Example: the contents of your spreadsheet or the output of a database query
  2. Reporting
     Data collected together and delivered regularly
     Example 1: What’s new? New sign-ups this week
     Example 2: How are we doing? Income month to date
  3. Analysis
     Summarised and visualised data
     Example: What has happened? A graph comparing income from campaigns over time (sketched below)
  4. Insight
     Interpretation of analysis
     Example: Why did it happen and what should happen next? This campaign did better than others because… so next campaign we should use… or test…
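
To make the ladder concrete, here is a minimal sketch in Python with pandas, using invented campaign figures: the raw rows are data, the month-to-date total is reporting, and the per-campaign summary you would graph is analysis. Insight is the interpretation you layer on top, which no query can produce for you.

    import pandas as pd

    # Data: raw rows, as they might come back from a database query (invented).
    gifts = pd.DataFrame({
        "campaign": ["Spring", "Spring", "Summer", "Summer", "Summer"],
        "gift_date": pd.to_datetime(["2024-03-02", "2024-03-15", "2024-06-04",
                                     "2024-06-10", "2024-06-21"]),
        "amount": [20.0, 50.0, 35.0, 10.0, 80.0],
    })

    # Reporting: a regular, repeatable figure such as income month to date.
    month_start = pd.Timestamp("2024-06-01")
    mtd_income = gifts.loc[gifts["gift_date"] >= month_start, "amount"].sum()
    print(f"Income month to date: {mtd_income}")

    # Analysis: summarised data you could visualise, e.g. income by campaign.
    print(gifts.groupby("campaign")["amount"].sum())

    # Insight is the step the code can't take for you: why Summer beat Spring,
    # and what to use or test in the next campaign.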



There are no stupid questions

There are no stupid questions, there are only questions.

Then why do we feel nervous about asking when we don’t know?
Because we are not the question, we’re the person asking the question, and we can feel stupid. We’re not even afraid that we are stupid. We fear feeling stupid. We fear others might think we’re stupid or worse.

‘What if I’ve totally missed the point?’
‘What if this is the most obvious thing ever?’
‘I’m in the room with senior people, what if I make a fool of myself?’
‘Will asking for clarity sound like I’m criticising?’

Asking questions moves things forward. It enables clarity and gets everyone on the same page in understanding an issue. It challenges assumptions; that’s not the same as challenging a person’s authority, and it’s not personal to want to understand a situation better.

One of the colleagues I’m working with on a project asks the best question, and asks it often:
‘I didn’t get the point you just made, can you go over that again?’
Yes, of course. And yes, it is up to me to make sure there is clarity if I’m putting something across.

“If you can’t explain it to a six year old, you don’t really understand it.”

Who said that? Einstein? Richard Feynman? I don’t actually know, maybe nobody! But I like it, and anyone doing a technical job would do well to remember it: speak in plain English and invite questions at every opportunity.

And when you get that feeling in your stomach that you have a question to ask, raise your hand and do it. It’s the responsibility of the person you ask to deal with it; it’s not for you to make your questions unstupid. If there are others around you, they are probably wanting to ask the same thing. Fight the fear.

Knowns and Unknowns

Known knowns – even when you’re sure of what you know, question it, and be willing to test it.

A fact is not a fact just because you’ve found some corroborative evidence that supports it. A fact is a fact when you’ve gone looking for the evidence to discount it. If you can test your hypothesis like this, with impartial curiosity and a willingness to be proved wrong, you will always learn something.
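
In data terms, that means querying for the counterexample rather than counting the confirmations. A toy sketch, with an invented hypothesis and invented data:

    import pandas as pd

    # Invented hypothesis: "donors recruited at events always give again".
    donors = pd.DataFrame({
        "donor_id": [1, 2, 3, 4],
        "source": ["event", "event", "web", "event"],
        "gave_again": [True, True, False, False],
    })

    # Go looking for the evidence that would discount it, not just support it.
    counterexamples = donors[(donors["source"] == "event") & ~donors["gave_again"]]
    print(counterexamples)  # one counterexample is enough to kill an 'always'

Donor 4 discounts the hypothesis; finding that row teaches you more than the two confirmations do.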

Known unknowns – once you’ve defined a gap in your knowledge, ask yourself what filling that gap will achieve. What will you do then? Or is it just ‘interesting’?

Unknown unknowns – these will always surface during the search and testing of the other two.

‘We don’t know what we don’t know’. So start with what you do know.

Start with what you want to achieve.

What do you know? Are you sure?

Now name something you don’t know but want to find out. What will you do once you have it?

Focus on objectives. Test existing assumptions. Identify the gaps. Fill the gaps. Tell the story. Take action.