Diverge & Converge – Finding Balance

Whether it’s unpicking the prickliest of thorny data problems or designing engaging and informative data visualisations, you need both divergent and convergent thinking.

As an analyst, I know I use both, and I believe that data analysis and insight work lends itself strongly to the design thinking ‘double diamond’ model: two waves of divergent-convergent thinking, the first diamond focusing on the problem, the second on the solution.

I work both creatively and analytically, can generate ideas and whittle them down, can reflect and imagine as well as prioritise and act. So why do the words ‘brainstorming session’ still make my internal organs converge in on themselves? In the past I would have concluded that perhaps it was because ‘I’m just not an ideas person’, that ‘I’m not as creative as my peers’. Writing that down, I see it for what it is: inner critic talk. What is an ideas person? It’s a person. Any person. Perhaps it’s not the ideas-y bit that I’ve found challenging; I think it’s the experience of imbalance between the divergent and convergent parts of the thinking process.

What I like about these process diagrams is their symmetry. There is balance between the divergent and convergent elements. When brainstorming, how balanced do you feel the use of time is between these two? Shorten the front and you will miss good ideas from outside the box, shorten the back and you risk rushing too quickly to a choice based on whim rather than evidence.

My analytical leanings mean some of my best contributions come during convergence. Don’t get me wrong: I can come up with new ideas, and I know how to use and continually strengthen that muscle. However, there’s also a real joy in adding flesh to the bones of an idea, whether it’s yours or not. For example, I love weighing solutions against each other using something like an Impact vs Ease matrix in order to add some rigour to the choice stage. Because of all this, I have felt underused, undervalued and disappointed when I’ve been part of processes that have spent hours on diverging only to follow with mere minutes converging.
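Coming back to that Impact vs Ease matrix for a moment, here is a minimal sketch of what the scoring could look like in code. The ideas, the 1–5 scores and the simple ‘quick win’ cut-off are all invented for illustration; a real session would agree the scale and weighting together.

```python
# A toy Impact vs Ease matrix: score each idea 1-5 on both axes,
# then sort so the "quick wins" (high impact, high ease) surface first.
# Ideas and scores here are made up for illustration only.
ideas = [
    {"idea": "Automate the weekly report", "impact": 4, "ease": 5},
    {"idea": "Rebuild the data warehouse", "impact": 5, "ease": 1},
    {"idea": "Add a request intake form",  "impact": 3, "ease": 4},
]

for item in sorted(ideas, key=lambda i: i["impact"] + i["ease"], reverse=True):
    quadrant = "quick win" if item["impact"] >= 3 and item["ease"] >= 3 else "consider later"
    print(f'{item["idea"]}: impact={item["impact"]}, ease={item["ease"]} -> {quadrant}')
```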

These are the pointers I’m trying to keep in mind for any sessions that I’m facilitating for my team or others.

  • Equal time on divergent and convergent thinking is time well spent.
  • Ensuring that facilitators are aware of who in the room excels at each type of thinking can also help get the most out of participants, allowing them to play to their strengths and encouraging them to engage and contribute fully.
  • For any collaborative session, allowing prep time, reflection time and the opportunity to add thoughts later is more inclusive of neurodiversity in your group. It enables those who think more clearly alone to cogitate and contribute in their own time, while quick collective thinkers can take the lead in a live session.
  • Allowing separate sessions for divergent and convergent thinking reaps huge benefits as it enables everyone involved to switch hats for each exercise. The emphasis can therefore change from quantity of ideas to quality of ideas, from horizon scanning to planning, from dreams to reality, from thoughts to actions.

Scaling teams part 3: Request and response

Here we continue the series exploring what’s needed to scale analytics and insight teams (parts one and two here). In this post we’re looking at the request funnel – the ways in which the team receives, and responds to, requests from stakeholders.

What type of flow are we looking at?

  • Flows from stakeholders to the team
  • Flows from the team to stakeholders
  • Flows between multiple stakeholders
  • Flows between analysts within the team
  • Flows between analysts and other collaborators outside the team
  • Flows between team leadership and analysts to support and guide

Automation vs the human touch

How can we make request and response practices both efficient and personal? This is a challenge for any team, and one that becomes harder as a team scales. The manual ways of working that function well for a small team (e.g. using a shared email inbox) become more laboured and harder to manage with a larger team, while the growing volume of standard requests lends itself to an automated approach such as a Service Desk.

But which of us actively enjoys engaging with a Service Desk system as a user? And if we don’t like it, will our stakeholders?

Using my current team as the example, there are broadly four types of request that come into the team regularly:

  1. Requests for access to the Tableau server
  2. Bug reports or requests for changes to existing dashboards
  3. Requests for new dashboards or pieces of analytics and insight work – strategic or operational level
  4. Ad hoc requests for analytics support and advice from those doing their own analytics work

Example 1
The only one of these with any automation in request processing is no. 1. However, the current version of that process combines email correspondence with an automated form to fill in. It doesn’t work clearly or smoothly for users, and therefore gives (potentially brand-new) stakeholders the worst of both worlds.

We have not yet implemented a solution here but I’ve been thinking through an approach:

  • Map the user journeys through whatever request systems you have. We care most about our stakeholders having a good experience and secondly about ease of process for the team.
  • Prioritise the worst and busiest channel – in our case it is the request for Tableau Access.
  • Outline all the steps in the journey and how they work, finding out the context and history for why it is the way it is. Team members have made decisions in good faith getting us to this point, we need to be mindful of that. If at all possible, also ask users directly about their experience of the process.
  • Create solution options that use either entirely human or entirely automated processes so you are clear on the differing paths.
  • Then, if you want to create a combination (increasing the human touch in the automated route, or the automation in the human route), map the process options from the user’s point of view first and the team’s second.
  • If you find yourself saying ‘we can’t do that because the system doesn’t let us’, pause before implementing the system-based approach you’ve come up with: you have hit a point where you are compromising user experience for team or system convenience. Do you really want to make that compromise?
  • Think of ways to hide automation within your human process, so that a user would not necessarily be aware of the automatic processing but the human gets to save time – for example, using standard email responses topped and tailed by a person dealing with the query (a sketch of one such template follows this list).
  • Think of ways to embed human touch into your automatic process e.g. an online form that is written in a friendly human tone.
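As a sketch of that ‘hidden automation’ idea, here is what a standard response topped and tailed by a person might look like. The request type, wording and turnaround time are all hypothetical.

```python
# A hypothetical standard response for Tableau access requests.
# The greeting and sign-off are written by a person; the middle is boilerplate,
# so the stakeholder gets a human reply while the team saves typing time.
STANDARD_BODY = """\
Thanks for your request for access to the Tableau server.

To set you up we need three things:
  1. Your team and role
  2. The dashboards or workbooks you need to see
  3. Approval from your line manager (a forwarded email is fine)

Once we have those, access is usually granted within two working days.
"""

def draft_reply(greeting: str, sign_off: str) -> str:
    """Top and tail the standard body with a personal greeting and sign-off."""
    return f"{greeting}\n\n{STANDARD_BODY}\n{sign_off}"

print(draft_reply("Hi Sam, lovely to hear from you.", "Best wishes,\nThe Data & Insight team"))
```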

Example 2
We also want to improve communication around request types 3 and 4. We have already been able to cut down a huge amount of ‘fire-fighting’ work in the team by meeting analysis and ad hoc requests with a standard set of questions for the requester to write answers to. This not only helps the team obtain more concrete and more detailed information about the request asynchronously (i.e. it doesn’t require a meeting unless the stakeholder is really stuck for where to start), it also provides a valuable exercise for the stakeholder in clarifying their own thinking. We’re now wanting to see if we can structure this process slightly more by having a form to fill in, rather than simply asking the questions via email.

Key questions we always look to include are below, and we try to keep them in friendly plain English (a sketch of how they might translate into a structured form follows the list):

  1. What are you working on and what is the overall objective you are working towards?
  2. Who is the work to be delivered to – who are the main stakeholders?    
  3. What will the stakeholders be using the information for? 
  4. If this is for dashboarding, what kind of regularity is the reporting for? 
  5. Tell us about the analysis you need. What questions do you want to answer? What measures do you want to visualise? 
  6. Where is your data coming from? Tell us about your data sources. 
  7. What timeframe are you working to? 
  8. What other context can you give us? 
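As a sketch of how those questions might translate into a structured form (Example 2), here is a minimal representation of an intake form as data. The field names, answer types and dropdown options are my own invented illustration, not our actual form; the point is simply that a form turns free-text email answers into consistent, structured fields.

```python
# A hypothetical intake form schema: each entry maps a question to the kind of
# answer we want back, so requests arrive in a consistent shape.
INTAKE_FORM = [
    {"field": "objective",    "question": "What are you working on and what is the overall objective?", "type": "text"},
    {"field": "stakeholders", "question": "Who is the work to be delivered to?",                         "type": "text"},
    {"field": "usage",        "question": "What will the stakeholders use the information for?",         "type": "text"},
    {"field": "frequency",    "question": "If this is for dashboarding, how regular is the reporting?",  "type": "choice",
     "options": ["one-off", "weekly", "monthly", "quarterly"]},
    {"field": "questions",    "question": "What questions do you want to answer?",                       "type": "text"},
    {"field": "data_sources", "question": "Where is your data coming from?",                             "type": "text"},
    {"field": "timeframe",    "question": "What timeframe are you working to?",                          "type": "date"},
    {"field": "context",      "question": "What other context can you give us?",                         "type": "text"},
]

def missing_answers(response: dict) -> list[str]:
    """Return the fields a requester has left blank, so we can chase them once, politely."""
    return [f["field"] for f in INTAKE_FORM if not response.get(f["field"])]
```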

What are our next steps?

Example 1: I’m going to be working on improving the Tableau Server request process so that it is smoother and quicker for users and keeps the ability for them to have human interaction where this is valuable. I believe that if we achieve this, we will have a process that is quick and easy for the team as well.

Example 2: We’re going to trial a structured form for analysis and ad hoc requests, asking the key questions above in a set format rather than via email.

Scaling teams Part 2: Kanban

In part 1 we established the areas of flow to consider when improving and scaling analytics teams (items in bold we cover in this post):

  • Flows from stakeholders to the team
  • Flows from the team to stakeholders
  • Flows between multiple stakeholders
  • Flows between analysts within the team
  • Flows between analysts and other collaborators outside the team
  • Flows between team leadership and analysts to support and guide

Workflow management as a Kanban

Kanban is an agile method for managing team or project workflow: work items are represented as cards that move across columns on a board (for example from ‘to do’ through ‘in progress’ to ‘done’), making the status of everything visible at a glance.

Trello is a powerful tool for managing work in Kanban style, even on the free tier. Flexible and adaptable, it gives you a lot of freedom in how to do things. Other tools such as Jira are far more powerful in terms of what they allow you to do; they are also pricey. It may well suit a larger analytics team to work with a more powerful tool such as Jira, especially since it provides more in the way of reporting features, giving a manager the ability to track progress directly from the data stored in the system. However, the behaviours needed to work within a Kanban structure can be established with Trello for free, allowing a new or maturing team to settle into the way of working first, then switch as needed to a tool that might expand or speed up practices in the future. People, process, technology usually works best in that order.

How can you create the structure for best practice behaviours in Trello as a Kanban?

In part 1 we established the ways of working that can best help move change forward:

  • Agreeing collectively ways of working to test and iterate.
  • Collectively setting expectations of behaviour.
  • Setting up structures and tools to enable agreed ways of working.
  • Agreeing some accountability mechanisms to help expectations to be met.
  • Realising how much of the work boils down to building relationships based on trust and good communication.

How did we put this into practice?

Deciding together what to test
We decided to test moving the main team Trello board from a column per person to a Kanban structure.

Expectations set

We agreed to ensure the board was up to date for our Monday team meeting and that we would update our cards so that anyone looking at one could see at a glance what the item was, and the current status. We agreed that each card would have a team member responsible for it.

Accountability
As team manager I took it as my responsibility to make sure that items were clear every Monday and that each team member also had a summary card with their name on it, listing their priorities for the week.

Tools to help us: using functionality in Trello
To show who owned the cards we added Members: one person assigned to each card on the board to show who it belonged to.

We agreed to add Checklists to cards so that small tasks were grouped together under ‘pieces of work’ which could make their way across the board. This took some adjustment. There is a sense of achievement that comes from moving a card to done; however, if every tiny task has its own card, the board becomes cluttered and confusing. If each card is a piece of work, then its checklist can show the tasks that are coming up and need to be completed before the card moves on.

After holding a session to agree the team’s six-month priorities, we put these on the far left of the board and created Labels for each. That way we could label cards with the priority they fit under, making it easy to see which work built towards our objectives and which was ‘other’.
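For anyone wanting to script this set-up rather than click through it, here is a minimal sketch against the Trello REST API using the requests library. The board IDs, names and credentials are placeholders, and the endpoints are the standard v1 ones as I understand them; treat it as an illustration to check against the current API documentation rather than a finished script.

```python
# A sketch of wiring up the behaviours above via the Trello REST API:
# one label per six-month priority, a member per card, and a checklist
# holding the small tasks inside each 'piece of work'.
import requests

BASE = "https://api.trello.com/1"
AUTH = {"key": "YOUR_API_KEY", "token": "YOUR_TOKEN"}  # placeholder credentials

def create_priority_label(board_id: str, name: str, colour: str = "green") -> str:
    """Create a label on the board for one of the team's six-month priorities."""
    r = requests.post(f"{BASE}/labels", params={**AUTH, "idBoard": board_id, "name": name, "color": colour})
    r.raise_for_status()
    return r.json()["id"]

def tag_card(card_id: str, label_id: str, member_id: str) -> None:
    """Attach a priority label and a responsible team member to a card."""
    requests.post(f"{BASE}/cards/{card_id}/idLabels", params={**AUTH, "value": label_id}).raise_for_status()
    requests.post(f"{BASE}/cards/{card_id}/idMembers", params={**AUTH, "value": member_id}).raise_for_status()

def add_task_checklist(card_id: str, tasks: list[str]) -> None:
    """Add a checklist of small tasks to a 'piece of work' card."""
    r = requests.post(f"{BASE}/checklists", params={**AUTH, "idCard": card_id, "name": "Tasks"})
    r.raise_for_status()
    checklist_id = r.json()["id"]
    for task in tasks:
        requests.post(f"{BASE}/checklists/{checklist_id}/checkItems", params={**AUTH, "name": task}).raise_for_status()
```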

Trust and Communication
In each team meeting we checked in to see how the new structure was working for people. Through this process we could gauge what was working (people were positive) and how it helped (people particularly liked listing their priorities for the week ahead; it helped them stay on track and push back on other requests). We could also get feedback on what didn’t work so well and discuss together how it could be adapted so we could test again over the next week or two. An example of this was our set-up of columns: we started with too many, each representing a different aspect of progress on a card, so we agreed the definitions and streamlined them as we went.

This is an ongoing process of test, reflect, adapt which is agreed as a team, keeping things collaborative, productive, flexible and positive. There is no right answer for exactly what the structure of your workflow management should be, but this is an approach that can help you find your own best way of doing things.

Scaling teams Part 1: Flow

I’ve just started a contract to head up a Data & Insight team with the goal of scaling from two team analysts up to six in the next six months. Naturally, my mind has been buzzing with ideas about what key things are needed to achieve successful scaling.

The team has been both stretched and under-resourced for the best part of a year and is about to lose one of the existing two analysts. They are a good team, recognised and respected for the value they add to the organisation. Their analytics practices are good – they have a well-managed Tableau server set up with processes and standards in place governing what is published there. They also have some of their main data sources housed in a data warehouse, feeding Tableau.

Working from the set up that this team already has, these are the elements that we are reviewing, improving and adapting in order to scale the team.

  1. Managing the delivery of the programme using Trello.
  2. Gathering requests from stakeholders using a shared email inbox.
  3. Documentation and communication focused around Microsoft Teams, OneDrive and Sharepoint.
  4. Using the website as a shop window communicating what the team offer.
  5. Building out a team structure based on domain areas, one for each new Senior Analyst to lead.

Reflecting on this list as a whole, there’s a strong theme here about creating smooth and effective flow. Flow of information or action into, out of, and across the team. Good flow owes more to establishing consistent behaviours from team members than it does to any particular tools used. Establishing well-flowing work practices takes:

  • Agreeing collectively ways of working to test and iterate.
  • Collectively setting expectations of behaviour.
  • Setting up structures and tools to enable agreed ways of working.
  • Agreeing some accountability mechanisms to help expectations to be met.
  • Realising how much of the work boils down to building relationships based on trust and good communication.

The challenge to creating flow is to create structure that is easy, productive, logical and helpful to the people doing the work. Anything that feels onerous, tedious or overly dutiful will form rocks in the river, blocking the flow.

What are the stages of the analytics and insight pipeline?

  • Request
  • Triage and prioritisation
  • Briefing
  • Data, analysis and insight work
  • Pre-delivery
  • Delivery
  • Retro
  • Follow up

What are the flows that we want to make sure are smooth, easy and productive throughout the pipeline?

  • Flows from stakeholders to the team
  • Flows from the team to stakeholders
  • Flows between multiple stakeholders
  • Flows between analysts within the team
  • Flows between analysts and other collaborators outside the team
  • Flows between team leadership and analysts to support and guide

Averages: Is it mean to use the mean?

There’s an often-quoted anecdote about the flaw of averages which recounts the story of Lt Gilbert Daniels. In the 1950s, the US Air Force gave him the unenviable task of taking the physical measurements of over 4,000 pilots with the aim of finding the ‘Goldilocks’ set of averages that would lead to better design of fighter plane cockpits. Define the measurements of the ‘average man’ and you can design the average cockpit, and most pilots will fit it.
Won’t they?
Apparently not.

“Out of 4,063 pilots, not a single airman fit within the average range on all 10 dimensions.” Todd Rose

In any statistical analysis it is second nature to grab the mean average as the overview stat of choice. The ill-fitting cockpit, however, is a reminder that in many circumstances with real-world data, the mean average might not actually describe anyone or anything real.

This plays out in fundraising data all the time.
‘Could you give us the average gift from this activity, please? We want to see what people raise so we can try to make fundraising more effective next time.’

Well, the short answer is no. I can give you the mean average gift from a fundraising activity, and that information will help you forecast likely income from running such an activity again (assuming similar circumstances). It will not show you anything interesting about how people are behaving. Therefore it is unlikely you will be able to influence that fundraising behaviour in the future.

In these circumstances, rather than providing the single figure to explain a distribution, I encourage clients to look at the shape of the distribution itself. A histogram of donation amounts, or a graph of banded income will nearly always show a positive skew – a bulge at the lower monetary end and a long tail of rarer larger amounts at the upper end. Plotting amount given by number of givers looks like this:

[Figure: a positively skewed distribution of gift amounts plotted against number of givers, with the mean and median marked]

Take the mean average and you don’t see the shape; you just get a number in the middle. It doesn’t tell you that the bulge exists or where it is, and it doesn’t tell you that the larger amounts are outliers. A fundraising activity where most people give £10 could have a mean gift of several hundred pounds because a few people gave £1,000, which could be hugely misleading.

Using the median (the middle value – literally the number half-way through the data when sorted) is less affected by the outliers and therefore more useful in showing a truer picture of average activity – what people are typically capable of raising. As you can see, it’s closer to the bulge than the mean.

The mode (the exact amount that the highest number of people gave) shows you where that bulge is. It is often useful to think of it as a psychological anchor that people are rooted to. A fundraiser can use this anchor to look at whether any of their marketing is driving people towards it (by use of a prompt), or whether it is a window into a wider cultural anchor that might be worth working with rather than against.
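Here is a minimal sketch of how differently the three measures describe the same skewed data. The gift amounts are made up, but the shape mirrors the £10-with-a-few-large-gifts example above.

```python
# Made-up gift amounts with the typical positive skew:
# lots of small gifts and a handful of very large ones.
from statistics import mean, median, mode

gifts = [10] * 60 + [20] * 25 + [50] * 10 + [1000, 1000, 1000, 2500, 5000]

print(f"mean:   £{mean(gifts):.2f}")    # £121.00 here, dragged upwards by the outliers
print(f"median: £{median(gifts):.2f}")  # £10.00, the half-way gift, close to the bulge
print(f"mode:   £{mode(gifts)}")        # £10, the anchor amount most people gave
```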

As with many things in the realm of insight analysis, the advice is to go into investigations with an idea of what you really want to understand. It is good to think about what an average is telling you and what you might be able to do about it.

Are you creating a cockpit that will fit nobody? Is it better to get the measure of different people and have some different fits of cockpit, or one cockpit design that is adjustable?  In marketing terms could you be designing different messages for different audiences? Or might you design a general fundraising proposition that could be customised to different audiences?

One-size does not fit all and in some cases it fits nobody at all, whether that’s for cockpits or marketing messages. Don’t be mean – no supporter you’re talking to is merely average.

I very much recommend this article taken from Todd Rose’s book The End of Average. 


A footnote on gender:
An interesting addition to the pilot story is what happened with the equivalent discovery of the mismatch between average body measurements and real body shapes in women. Whilst the pilot finding led to acknowledgement that there was no ‘average man’ and cockpit design needed to adapt accordingly, the interpretation of the finding for women was rather different. In that case, it was the opinion of the experts of the time that the average ‘ideal’ for women’s bodies was not wrong – women were. They should get in shape quick-smart.

It’s never a simple question of deriving ‘fact’ from ‘data’, it’s the interpretation of meaning that counts. To that end, it matters who’s doing the interpreting, who gets to tell the story, and who faces the consequences of conclusions.

Insight Analyst vs Software Developer: Puzzles

At home, my resident software developer and I discuss data, tech and coding among other more domestic topics. These other topics include: what we’re going to have for dinner, why the purchase of a new shed was never part of the plan of action prior to destruction of the old one, and whether or not socks can ever really be said to be ‘temporarily resting’ on the living room floor; a whole other blog is needed to tackle these questions.

When discussing technical matters it’s interesting to note what we agree on, and where our mindsets differ.

One Christmas, we’re at my mother’s home with my brothers and, as is customary in my family, we are lapping up various puzzle supplements from the Christmas newspapers. I alight on a Futoshiki and get stuck in, and then get stuck.

Futoshiki according to Wikipedia: The puzzle is played on a square grid. The objective is to place the numbers such that each row and column contains only one of each digit. Some digits may be given at the start. Inequality constraints are initially specified between some of the squares, such that one must be higher or lower than its neighbor. These constraints must be honored in order to complete the puzzle.

[Figure: an example Futoshiki grid]

Got it? Right, so I’m stuck and I can’t find the next logical deduction in the sequence. I ask the software developer (SD) if he wants to help by taking a look at the Futoshiki with me. Despite previous suggestions that slightly more maturity and slightly less borderline racism might suit him better, SD replies ‘I think you’ll find it’s pronounced Fucko-shit-o’. Moving swiftly on, we establish that SD is not interested in puzzles of this kind. He finds them frustrating and feels no satisfaction upon completion, concluding that the only thing to be learned from puzzles is how to get better at those puzzles, something of no use to anyone. He does, however, enjoy coding problems, and therefore looks at our puzzle problem as a classic case of ‘if something is worth doing, it’s worth coding a computer to do it for you’.

Thus started the race. I was to continue trying to solve the puzzle I was stuck on and complete it before SD wrote a programme that could solve any and all 5×5 Futoshiki grids. I won in terms of speed, although I had to use brute force on the logic at one point, which felt like cheating: unable to make a deduction, I built scenarios for each of three options, following each through until I was sure that two would fail, thereby confirming the correct path with the third. SD won in terms of comprehensiveness. His code used the brute force of computational power to run through every option for every row in a puzzle until it found the solution, giving him a general solver. From my perspective his approach completely misses the joy of working through a puzzle yourself; from his perspective he had all the same fun solving his own puzzle, which also resulted in a concrete, workable thing at the end of it.
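For the curious, here is a reconstruction of the brute-force idea (my own sketch, not SD’s actual code): generate every permutation of 1 to N for each row, then keep the first combination whose columns also contain each digit once and whose inequality constraints all hold. Even for a 5×5 grid that is (5!)^5, roughly 25 billion row combinations, before the given digits prune anything – exactly the explosion described below.

```python
# A brute-force Futoshiki solver sketch: try every permutation of 1..n for each
# row, then keep the first combination whose columns also contain each digit
# exactly once and whose 'less than' constraints all hold.
from itertools import permutations, product

def solve_futoshiki(n, givens, constraints):
    """givens: {(row, col): value} for digits printed in the grid.
    constraints: [((r1, c1), (r2, c2))] meaning cell (r1, c1) < cell (r2, c2)."""
    row_options = []
    for r in range(n):
        # Keep only row permutations consistent with any given digits in that row.
        row_options.append([
            p for p in permutations(range(1, n + 1))
            if all(p[c] == v for (gr, c), v in givens.items() if gr == r)
        ])
    for grid in product(*row_options):
        # Every column must contain each digit exactly once...
        if any(len({grid[r][c] for r in range(n)}) != n for c in range(n)):
            continue
        # ...and every inequality constraint must hold.
        if all(grid[r1][c1] < grid[r2][c2] for (r1, c1), (r2, c2) in constraints):
            return grid
    return None

# A tiny made-up example: a 4x4 grid with one given digit and one constraint.
print(solve_futoshiki(4, {(0, 0): 1}, [((1, 0), (1, 1))]))
```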

Neither of us need to stop there with Futoshikis. I can continue to come to each puzzle fresh, perhaps even repeating ones in order to reduce the number of deductions needed to solve it. I could also move on to solving larger grids. SD could also tackle larger grids and the task of generalising a solution that solved grids of any size. At this point, victory potentially swings back in favour of the human analyst. I should be able to use the same deduction techniques again and again, larger grids will take more time but will still likely be doable until such time as I become bored or die. SD on the other hand will have to radically alter the brute force approach. Whilst computational power elegantly outweighs human endeavour for smaller grids, the processing time needed for such an approach on larger and larger grids multiplies exponentially. This means at some point he will have a large grid that, while still solvable, would take all the time between now and the heat death of the universe for the computer to complete. To avoid this, SD would need to figure out a far more sophisticated algorithm, one that (arguably) mimics human reasoning to a greater extent; some sort of artificial intelligence. So perhaps next Christmas, as I enjoy bending my brain with my Futoshiki, SD will try creating an artificial me to solve all possible ones.

Deconstructing the mathematical bridge

For music students everywhere, musical analysis is incredibly useful for many purposes. You can pick out different structures from a musical work, pinpointing particular styles and influences. You can learn to recognise and recreate a composer’s sonic signature in orchestration, rhythm, melody and harmony. Identifying and contemplating these features aids and increases the overall appreciation and understanding of musical works by providing information that connects us back through time to historical context, biography and musical purpose. And, importantly, these are aspects of music that you can hear.

When I was studying for a music degree, we were taught Schenkerian analysis, which I and my closest peers found abstract, boring and pointless beyond its performance as an intellectual exercise. I’m not saying it is pointless, I’m saying it felt pointless and reductive. The joke about any Schenkerian analysis is that what you do is take a great classical work (let’s say Beethoven’s 5th) and reduce it down to the kernel structure of Three Blind Mice.

How neat, how elegant, what a charming coherent unity it all has! Gah, I hated it! How can this possibly add to my understanding when I’m left with so much less than the sum of the parts I started with?

A tutor sympathised with my frustrations, recounting the story of the mathematical bridge in Cambridge. The bridge is hundreds of years old, originally designed by Isaac Newton. Very unusual in design, it is made entirely of straight timbers arranged into an arch via some very sophisticated engineering. So ingenious was it that it held perfectly in place between two buildings by its unique balance of tension and compression alone. There it remained for many years until some inquisitive mathematicians wished to understand the design better. They dismantled the bridge with the intention of putting it back exactly as it had been. They failed. Having taken it apart without reaching a full understanding of how it worked, they were unable to restore it to its former elegance, and today the bridge is held in place by rivets. Too much analysis can indeed spoil things.

As I start again in psychotherapy I find myself thinking of the mathematical bridge often. Will I discover a new and greater understanding of myself, allowing for appreciation, healing and positive change? Or will I be reduced to some disparate sum of my parts that perhaps won’t add up to the whole I started with? Nobody wants to be Three Blind Mice after believing that they might be, or have the potential to be, Beethoven’s 5th.

What comfort then to discover that the story of the bridge is a myth? The original design (which was nothing to do with Newton at all!) always had rivets in it; there was simply a time when they could not be seen. The bridge has been dismantled and rebuilt twice allowing for maintenance.

There’s a bittersweet quality to myth-busting. The magic of a story that resonates so strongly has to be true, no? As Stewart Lee often ends his ridiculous flights of fancy: ‘This story is not true, but what it tells us is true’. The mathematical bridge myth reinforces our deep fears about self-discovery and change, helping us hide; that’s why we like it. It’s less romantic to go in search of the rivets to tighten them up a bit, but perhaps that’s what the bridge really needs to keep it in place.

Painting pictures with data: Emma’s year of adventure.


How’s this for a year’s project?

THE RULES:

  • Do something new each day

  • Do it with someone else

  • Document it

Pretty daunting, huh?

That’s why I’m not doing it. But Emma Lawton is. That’s incredible. What’s more incredible is that in 2013 at the age of 29, Emma was diagnosed with Parkinson’s. And, like so many people with Parkinson’s, she radiates a determination to focus on the things she can do more than the things she can’t.

Late last year Emma sent a call out at the Parkinson’s UK office to anyone who wanted to spend a lunchtime with her teaching her something. I wrote back and said she should come and geek out with me and Myuran over some analytics. She said yes. Hooray! A win for spreadsheets!

In prepping for Emma’s session, it was tricky to decide how to approach ‘analytics’ as a subject for an hour’s chat in a way that would do anything more than scratch the surface. It’s a vast topic that can easily get bogged down in stats if that’s not what a person is interested in. Instead, we did what we often do with clients and chose to focus on the person rather than the ‘data’. We took inspiration from Emma’s up-front questions to us, which were all about how important we considered learning new things to be. Combining that idea with the knowledge that Emma’s own professional background includes design, we decided to play around with dataviz.

It turned out to be a fruitful starting point to ask Emma about her own project, regarding the year she was planning out and acting on as a potential dataset. She told us she had a planning spreadsheet that she was very proud of and would we like to see it (would we ever – hooray for spreadsheets!) Then we started asking questions. What did she really want to achieve by doing this? What kinds of stories would she want to be telling people? What pictures might she paint to illustrate those stories? What information might she collect that could form the material for those pictures if we thought of them as graphs or infographics?

Analysis can be defined as the summarising and visualising of information for the purpose of gaining insight.

Whilst Emma is writing a blog post for every activity filled with rich qualitative information about her experiences and thoughts on each, summarising and visualising the project as a whole after a year is more challenging based on reminiscence and journalling alone. That’s where data capture comes into play. By noting down a few pointers in a set format for every activity in a spreadsheet (hooray!), she can build material she can later mine for patterns, patterns she could draw. She could capture almost anything about her activities, but based around the ideas she gave us that the project was about people and about how doing this stuff made her feel, we kept focus there. The data capture includes names and dates, information on the activity, how the person she meets is connected to her and various measures of how she feels about the activity including how new, challenging and satisfying it was plus the brainwave of a dropdown list of emojis for overall feel.
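As a sketch of what one captured row might look like in that set format (the column names and values here are my guesses for illustration, not Emma’s actual spreadsheet):

```python
# One hypothetical row of the capture: a fixed shape that can later be
# summarised and visualised, alongside the richer blog post for the day.
activity = {
    "date": "2017-01-16",             # made-up date
    "activity": "Geeking out over analytics and dataviz",
    "person": "Myuran",               # who the activity was done with
    "connection": "colleague",        # how that person is connected to Emma
    "newness": 4,                     # 1-5: how new it felt
    "challenge": 2,                   # 1-5: how challenging it was
    "satisfaction": 5,                # 1-5: how satisfying it was
    "overall_feel": "😊",             # chosen from a dropdown list of emojis
}
```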

Is this reductive? Yes.
Can we capture the richness of Emma’s emotional experiences with dropdown lists of ratings and emojis? No.
Might it allow the viewing of wood rather than trees? Yes.
Might it reveal something of her experience that might otherwise remain hidden? Yes.

Meeting, speaking and sharing ideas with Emma was a joy and I felt that in that hour we explored in microcosm what we do on any project with any client. We meet people where they are and focus on what they tell us is important to them. We then work with them to think of how data can be useful as a lens through which to look, not to the exclusion of other lenses, but in addition for the provision of a different angle.

I can’t wait to see what Emma’s year ends up looking like to her.

Catch Emma’s analytics blog post at the fuck it list. And do explore the rest of her adventures. Perhaps there’s even something she could come and learn with you?


Why ‘I don’t have time’ is a lie to yourself and others

At work, time poverty is a lie, an illusion. In a company where everyone works the same hours a week, time is cancelled out as a factor in the productivity equation. Time doesn’t exist.

So what truths does that leave?

‘I don’t know what I’m trying to achieve’

Sounds simple, but your goals and objectives need articulating on paper, saying out loud, and revisiting often.

Why are you at work? What are you doing there? What will be produced or delivered to prove you did something? What changes will have been made, how will things look, what will things feel like when you’ve done what you’re doing?

‘I take on more than I can do’

Workload is a real issue. A volume of work that is unrealistically high will hamper productivity. Knowing your objectives will make this one easier. Making conscious choices about what you say yes and no to is then possible.

‘I’m not clear on my priorities’

So you have said no to some things but still feel time poor for what you have on your list and you’re stuck on how to prioritise. Knowing your objectives also makes this one easier. When you know what you’re focusing on, you can rank different tasks, choose to work on the important ones. Urgent is not the same as important and you can choose to work through that issue too.

Admit and accept that when you say
‘I didn’t have time for that’,
you actually mean
‘That is not my priority right now, I chose this other thing instead’.

‘This is taking longer than I thought. I don’t actually know how to do this task’

Trying to complete tasks that you do not possess the skills, knowledge or experience to successfully execute is time consuming. And unnecessary.

  • Identify what’s needed that you don’t have.
  • Who can help you?
  • Swallow your pride and ignore any inner voice telling you not to ask for help.
  • Reach out and ask questions.
  • Accept help offered, learn what you need to, or share the task with others.

‘This is taking longer than I thought. There’s likely to be a more efficient way to do this that I haven’t explored’

You do possess the skills, knowledge or experience to successfully execute your task. But the way that you’re doing it is time consuming. Open yourself up to new methods, or solutions for automating parts of the task you’re trying to do.

Creating efficiencies or automation itself takes an investment of time and effort. But that up front investment is repaid on every occasion you repeat the task and reap the reward of the time saved then.

Creating more time is an impossible problem that nobody can solve.
The good news is that you can solve any and all of these other problems.

Who, what, why and Venn

Vacation is a good time to contemplate life and career, fulfillment and frustration. It’s a good time to assess if you have the right balance in your life, and, if not, what might be missing.

And what better way to make that assessment than using the favourite of data visualisers everywhere – the Venn diagram.

[Figure: the ikigai Venn diagram – four overlapping circles for what you love, what you’re good at, what the world needs and what you can be paid for, with passion, mission, profession and vocation at the intersections and ikigai at the centre]


I enjoyed examining this and relating it to my own situation. I even found myself unpicking some of it: aren’t mission and vocation the wrong way round? It depends on your understanding of the semantics, I guess. Also, what does it mean to be excited at the same time as complacent and uncertain? What kind of people actually find themselves occupying that space? It reminded me just how powerful Venn diagrams can be for exploring concepts.

What I loved the most was trying to decide on the tweaks I can make to draw me closer to the middle. For me, it’s allowing myself to ensure that I continue doing the things I love, and not being drawn too strongly out of the LOVE circle and into that space between profession and vocation. I think this happens to me sometimes. In future I’d like to remember to make choices that are enriching for me too, rather than always falling into the ‘pleaser’ habit of prioritising what others are asking me for.