A woman’s place is in the script: gender balance stats at Pixar

Photo by Klaas on Unsplash

The documentary series Inside Pixar includes the brilliant story of Jessica Heidt, one of the animation studio’s script supervisors.

Jessica’s story got me very excited because it is a story about research, curiosity, data and about how evidence alongside advocacy can make change.

Jessica works on scripts at Pixar, keeping track of all script changes through the multitude of versions that any production goes through and making sure all the right characters are saying the right lines in the right order to tell the story to the audience.

She read some research about the gender imbalance in film scripts and it got her thinking about how she had a unique perspective and opportunity to look at this issue at Pixar. Working manually in a spreadsheet, Jessica started counting the number of lines of dialogue and the number of characters in a script. She then compared the number of male and female characters, as well as the number of lines that male and female characters had. Working on Cars 3 at the time, she found that the male-to-female script ratio was around 9:1. Jessica started conversations on this topic with her colleagues and began regularly showing her script stats to senior staff at key production milestones. People got interested and changes were made as the production went along – both the creation of female characters to add to the story, and the reallocation of lines of dialogue from male to female characters to balance out the general texture of the script.
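Jessica’s manual tally can be sketched in a few lines of code. This is a minimal sketch with made-up dialogue and an assumed character-to-gender lookup – not her actual spreadsheet or tool:

```python
from collections import Counter

# Hypothetical script: each line of dialogue as (character, line text)
script = [
    ("Alex", "We leave at dawn."),
    ("Blake", "I'll get the maps."),
    ("Alex", "Good. Casey, you're with me."),
    ("Casey", "On it."),
    ("Alex", "Let's move."),
]

# Assumed lookup from character name to gender
gender = {"Alex": "M", "Blake": "M", "Casey": "F"}

# Lines of dialogue per gender
lines = Counter(gender[char] for char, _ in script)

# Number of distinct characters per gender
characters = Counter(gender[char] for char in {c for c, _ in script})

print("lines per gender:", dict(lines))
print("characters per gender:", dict(characters))
```

From counts like these, a male-to-female line ratio falls straight out, ready to show at a production milestone.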

Jessica then worked internally with a colleague to create a tool that would do the counting in the script software automatically – no more spreadsheets. In the documentary, Jessica describes her pride at Pixar’s latest release Soul achieving near gender parity. And it’s amazing to hear that it’s now part of Pixar’s vision to continue to measure and ensure balance in the future – not by imposing a 50:50 split in every film necessarily, but by aiming for a cumulative balance over the output of 5 years.

This story exemplifies how insight can drive change and is a fantastic case study for any insight analyst. Jessica cared about the issue of gender equality in movies and that led to her asking the question about Pixar’s own productions – how balanced were they? She found a way to produce clear and unbiased statistics to show the situation and she communicated that information to colleagues. Using the evidence in combination with personal persuasion, Jessica inspired changes to be made. She then worked on automating the process of data collection so that those stats could be quickly and easily reported at every key decision-making moment of the production process, embedding the insight and ongoing changes into the normal day-to-day activity of the company. Jessica Heidt is a worthy winner of Pixar’s Unsung Hero Award and an inspiration to insight analysts everywhere.

Purpose, Vision & Pride

Photo by Reiseuhu on Unsplash

Whether building a new team or function from scratch (maybe it’s just you!) or scaling up a team, intentionally defining your purpose is a great way to get clarity on what’s important, which will help enormously with future prioritising.

A purpose statement is well worth crafting in a group, perhaps even with an outside facilitator. For an analytics team, your purpose covers why you are here, what you do towards the wider vision of the organisation, who you deliver to and how. Others have written better about how to do this.

Vision – Roadmap – Plan

Vision is the description of where you want to go – Liz Ryan from humanworkplace.com talks about long-term vision being the view from the clouds. Objectives and goals are the hilltop view, the big picture of what’s going on and the ability to see if you’re headed in the direction you wanted to go. Initiatives, projects and day-to-day tasks are down in the weeds at ground level. On top of all this, in analytics, there are always the rabbit holes you go down from there into the actual data – so there are many levels to our work!

My passion is for creating the staircases between cloud and hilltop, and then again between hilltop and ground level. My experience is that where there is too much separation between these viewpoints, alignment slips and work becomes disconnected from vision. As a result, people can feel isolated in their jobs without a sense of contributing to the wider purpose of whatever organisation they work for.

What creates the staircase between cloud and hilltop?
The act of translating a long-term vision into reality comes in two parts:

  1. Getting really specific about what the vision means in reality. Define those terms and say what it would look, feel, sound and smell like in that future reality. As an example, many analytics teams set a vision of being a ‘sector leader’. The next step is to articulate in specific terms what it means to be a sector leader.
  2. Roadmap from where you are now to where you want to be in a timeframe. An outline plan, a roadmap, sets out where you are now, what needs to be achieved to reach your vision point, the path forward and key milestones along the way. It is not a detailed plan: it must be flexible to change and can’t be set in stone. However, it must give a sense of the order in which change needs to come. Some change is dependent on other change, so certain things must come first. A roadmap allows you to define which elements of your change are foundational:
    • what must be in place for you to succeed in the future?
    • what changes are needed to enable other changes?
    • what is most important and will create most value?
    • what order do you need to do things in? What is for now, what’s next, what’s for the future?

What creates the staircase between hilltop and ground?
The act of translating your roadmap into a plan involves:

  1. Planning out what you need to do now
  2. Setting objectives with short timeframes and specific deliverables

Measurement and Pride

The integrity of the staircases from ground to hilltop to cloud viewpoint is strengthened by both measurement and pride. It is vital to put in place some form of measurement of how you are doing, to enable you to adapt, improve and change direction as needed.

From ground to hilltop, the measurements can focus on whether tasks are being completed and how well.

  1. What outputs have you delivered?
  2. Are you doing what you said you were going to do?
  3. How well are you doing what you said you were going to do?

From hilltop to cloud, the measurements can focus on whether the work you are doing is having the kind of outcome and impact that you wanted – are you moving closer to your vision?

  1. What are the outcomes of what you have delivered? What has happened as a result of your work?
  2. How well do these outcomes show an improvement on the past?
  3. How much closer are you to your vision?

What about Pride?

Perhaps this is just another corny quote but I like it (and have no idea who said it):

Don’t wait until you’ve reached ​
your goal to be proud of yourself.​
Be proud of every step you take ​
toward reaching that goal.

The Internet…

This is incredibly important as an ingredient in effective measurement. Any evaluation, KPIs, monitoring, tracking and reporting could be improved by thinking about what there is to be proud of from the work done. Learning lessons about what to improve is, of course, essential too. But without pride in what has gone well, and without acknowledging and celebrating it, learnings will feel weighty and burdensome. You can be proud of successes, and proud to have been bold enough to fail, knowing now something that you did not know before.

The Pride Spectrum

As an individual or as a team, where do you fall on the pride spectrum?
How might you aim more for the middle?
What are you proudest of?
How do you celebrate success?

Scaling teams part 4: Document management

Here we continue the series exploring what’s needed to scale analytics and insight teams (parts one and two, and three here).

What type of flow are we looking at?
All of them…

  • Flows from stakeholders to the team
  • Flows from the team to stakeholders
  • Flows between multiple stakeholders
  • Flows between analysts within the team
  • Flows between analysts and other collaborators outside the team
  • Flows between team leadership and analysts to support and guide

When a team is small and the team members can keep most of the information they need for day to day business in their heads, a neatly managed document system is nice to have, but you can definitely get by with free and loose as long as people get what they need when they need it.

To scale to a medium or large team, you need to decide how things are done and where documents are held. If you don’t, people produce inconsistent documentation (inefficient and can harm reputation), and people don’t know where to find docs (inefficient and plain frustrating). Additionally, by making these decisions, you set in place important parts of the collaboration workflow which enables people to work together easily. For this last part, there are many advantages to opting for cloud-based documentation by default such as Google Drive or OneDrive.

Document Management model

Advantages of Cloud solutions

  1. Remote workflow – it’s easier to work directly in a browser than through a networked connection such as VPN.
  2. Single document collaboration – comments, suggestions and edits can be incorporated on a single shared collaborative document. No more emailing of documents, no more multiple versions, no more collating changes between multiple versions edited by different people on different drafts. And people get to see each other’s comments which is more transparent and enables fuller reflection.
  3. More chat, less email – when you don’t need to email docs, you can opt to share docs and chat back and forth via a chat platform such as Google chat, Teams or Slack. This saves you from long email chains and enables more informal and quickfire conversations to evolve as you work with others. And if things are sensitive or confidential, then the chat can be private too.

Disadvantages of Cloud solutions


  1. Functionality of online docs – browser based versions of Word/Docs, Excel/Sheets, Powerpoint/Slides or whatever it is you use are still not as good as their desktop siblings. Sometimes they don’t do what you want or need them to and you have to revert to desktop, or cry, or both.
  2. Reliance on the internet – unavoidable with reliance on the cloud. Fairly unavoidable in any remote working situation since your VPN connection also won’t work without internet. It’s not really feasible to keep an up-to-date copy of everything on your local computer. Mitigate by investing in good internet connectivity.
  3. Reliability of the system – issues like storage space, back-up and network speed do arise. This is only a disadvantage if your set-up is a bad one and these issues are not dealt with. Mitigate by investing well in and maintaining your set-up; you would have to do this for a non-cloud solution too.
  4. If you don’t like change, you won’t like adapting to new ways of working – I can’t help you here.

I think the advantages of a well thought through cloud-based document management workflow vastly outweigh the disadvantages (most of which are simply bad management). I’m biased on this but all I can say on it is that done right, it works very well and you can just get on with it and not worry about filing.

Diverge & Converge – Finding Balance

Whether it’s unpicking the prickliest of thorny data problems or designing engaging and informative data visualisations, there is the need to use both divergent and convergent thinking.

As an analyst, I know I use both and I believe that data analysis and insight work lends itself strongly to the design thinking ‘double diamond’ model (two waves of divergent-convergent thinking, the first diamond focusing on problem, the second on solution).

I work both creatively and analytically, can generate ideas and whittle them down, can reflect and imagine as well as prioritise and act. So why do the words ‘brainstorming session’ still make my internal organs converge in on themselves? In the past I would have concluded that perhaps it was because ‘I’m just not an ideas person’, that ‘I’m not as creative as my peers’. Writing that down, I see it for what it is: inner critic talk. What is an ideas person? It’s a person. Any person. Perhaps it’s not the ideas-y bit that I’ve found challenging; I think it’s the experience of imbalance between the divergent and convergent parts of the thinking process.

What I like about these process diagrams is their symmetry. There is balance between the divergent and convergent elements. When brainstorming, how balanced do you feel the use of time is between these two? Shorten the front and you will miss good ideas from outside the box, shorten the back and you risk rushing too quickly to a choice based on whim rather than evidence.

My analytical leanings mean some of my best contributions are within convergence. Don’t get me wrong, I can come up with new ideas, and I know how to use and continually strengthen that muscle; however, there’s also a real joy in adding flesh to the bones of an idea, whether it’s yours or not. For example, I love weighing solutions against each other using something like an Impact vs Ease matrix in order to add some rigour to the choice stage. Because of all this, I have felt underused, undervalued and disappointed when I’ve been part of processes which have spent hours on diverging only to follow with mere minutes of converging.
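The Impact vs Ease comparison is simple enough to rough out in code. A sketch with hypothetical ideas and scores, using impact multiplied by ease as one possible combined score:

```python
# Hypothetical ideas from a brainstorm, scored 1-10 for impact and ease
ideas = {
    "Automate the weekly report": (8, 6),
    "Rebuild the data model": (9, 2),
    "Tidy the dashboard filters": (4, 9),
}

# Rank by a simple combined score: impact multiplied by ease
ranked = sorted(ideas, key=lambda k: ideas[k][0] * ideas[k][1], reverse=True)

for name in ranked:
    impact, ease = ideas[name]
    print(f"{name}: impact={impact}, ease={ease}, score={impact * ease}")
```

In a live session the scores come from the group, of course – the arithmetic just adds a little rigour to the choice stage.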

These are the pointers I’m trying to keep in mind for any sessions that I’m facilitating for my team or others.

  • Equal time on divergent and convergent thinking is time well spent.
  • Ensuring that facilitators are aware of who in the room excels at each type of thinking can also help get the most out of participants, allowing them to play to their strengths and encouraging them to engage and contribute fully.
  • For any collaborative session, allowing prep time, reflection time and the opportunity to add thoughts later is more inclusive of neurodiversity in your group. It enables those who think clearer alone to cogitate and contribute in their own time, while quick collective thinkers can take the lead in a live session.
  • Allowing separate sessions for divergent and convergent thinking reaps huge benefits as it enables everyone involved to switch hats for each exercise. The emphasis can therefore change from quantity of ideas to quality of ideas, from horizon scanning to planning, from dreams to reality, from thoughts to actions.

Scaling teams part 3: Request and response

Here we continue the series exploring what’s needed to scale analytics and insight teams (parts one and two here). In this post we’re looking at the request funnel – the ways in which the team receives, and responds to, requests from stakeholders.

What type of flow are we looking at?

  • Flows from stakeholders to the team
  • Flows from the team to stakeholders
  • Flows between multiple stakeholders
  • Flows between analysts within the team
  • Flows between analysts and other collaborators outside the team
  • Flows between team leadership and analysts to support and guide

Automation vs the human touch

How can we make request and response practices both efficient and personal? This is a challenge for any team, and one that becomes harder when scaling teams. This is because the manual ways of working that function well with a small team (e.g. using a shared email inbox) become more laboured and harder to manage with a larger team and the volume of standard requests lends itself to an automated approach such as a Service Desk.

But which of us actively enjoys engaging with a Service Desk system as a user? And if we don’t like it, will our stakeholders?

Using my current team as the example, we have broadly four types of request that come into the team regularly:

  1. Requests for access to the Tableau server
  2. Bug reports or requests for changes to existing dashboards
  3. Requests for new dashboards or pieces of analytics and insight work – strategic or operational level
  4. Ad hoc requests for analytics support and advice from those doing their own analytics work

Example 1
The only one of these that has any automation in request processing is no. 1. However, the current version of that process has combined email correspondence and an automated form to fill in. It doesn’t work clearly or smoothly for users and therefore gives (potentially brand new) stakeholders an experience of the worst of both worlds.

We have not yet implemented a solution here but I’ve been thinking through an approach:

  • Map the user journeys through whatever request systems you have. We care most about our stakeholders having a good experience and secondly about ease of process for the team.
  • Prioritise the worst and busiest channel – in our case it is the request for Tableau Access.
  • Outline all the steps in the journey and how they work, finding out the context and history for why it is the way it is. Team members have made decisions in good faith getting us to this point, we need to be mindful of that. If at all possible, also ask users directly about their experience of the process.
  • Create solution options that use either entirely human or entirely automated processes so you are clear on the differing paths.
  • Then, if you want to create a combination – increasing human touch in the automated route or automation in the human route – map the process options from the user point of view first and the team view second.
  • If you find yourself saying ‘we can’t do that because the system doesn’t let us’, pause before implementing that system-based approach you’ve come up with because you have hit an instance where you are compromising user experience for team or system experience. Do you really want to make that compromise?
  • Think of ways to hide automation within your human process so that a user would not necessarily be aware of the automatic processing but the human gets to save time. For example using standard email responses topped and tailed by a person dealing with the query.
  • Think of ways to embed human touch into your automatic process e.g. an online form that is written in a friendly human tone.

Example 2
We also want to improve communication around request types 3 and 4. We have already been able to cut down a huge amount of ‘fire-fighting’ work in the team by meeting analysis and ad hoc requests with a standard set of questions for the requester to write answers to. This not only helps the team obtain more concrete and more detailed information about the request asynchronously (i.e. it doesn’t require a meeting unless the stakeholder is really stuck for where to start), it also provides a valuable exercise for the stakeholder in clarifying their own thinking. We’re now wanting to see if we can structure this process slightly more by having a form to fill in, rather than simply asking the questions via email.

The key questions we always look to include are below; we try to use friendly plain English:

  1. What are you working on and what is the overall objective you are working towards?
  2. Who is the work to be delivered to – who are the main stakeholders?    
  3. What will the stakeholders be using the information for? 
  4. If this is for dashboarding, what kind of regularity is the reporting for? 
  5. Tell us about the analysis you need. What questions do you want to answer? What measures do you want to visualise? 
  6. Where is your data coming from? Tell us about your data sources. 
  7. What timeframe are you working to? 
  8. What other context can you give us? 

What are our next steps?

Example 1: I’m going to be working on improving the Tableau Server request process so that it is smoother and quicker for users and keeps the ability for them to have human interaction where this is valuable. I believe that if we achieve this, we will have a process that is quick and easy for the team as well.

Example 2: We’ll trial a structured form for analysis and ad hoc requests, replacing the questions we currently send via email.

Scaling teams Part 2: Kanban

In part 1 we established the areas of flow to consider when improving and scaling analytics teams (items in bold we cover in this post):

  • Flows from stakeholders to the team
  • Flows from the team to stakeholders
  • Flows between multiple stakeholders
  • Flows between analysts within the team
  • Flows between analysts and other collaborators outside the team
  • Flows between team leadership and analysts to support and guide

Workflow management as a Kanban

Kanban is an agile method for managing team or project workflow.

Trello is a powerful tool for managing work in Kanban style, even using the free settings. Flexible and adaptable, it provides a lot of freedom for how to do things. Other tools such as Jira are far more powerful in terms of what they allow you to do. They are also pricey. It may well suit a larger analytics team to work with a more powerful tool such as Jira, especially since it provides more in the way of reporting features giving a manager the ability to track progress directly from the data stored in the system. However, the behaviours needed to work within a Kanban structure can be established with Trello for free, allowing a new or maturing team to settle into the way of working first, then switching as needed to a new tool that might expand or speed up practices in the future. People, process, technology usually works best in that order.

How can you create the structure for best practice behaviours in Trello as a Kanban?

In part 1 we established the ways of working that can best help move change forward:

  • Agreeing collectively ways of working to test and iterate.
  • Collectively setting expectations of behaviour.
  • Setting up structures and tools to enable agreed ways of working.
  • Agreeing some accountability mechanisms to help expectations to be met.
  • Realising how much of the work boils down to building relationships based on trust and good communication.

How did we put this into practice?

Deciding together what to test
We decided to test moving from a person per column to a Kanban structure for the main team Trello board.

Expectations set

We agreed to ensure the board was up to date for our Monday team meeting and that we would update our cards so that anyone looking at one could see at a glance what the item was, and the current status. We agreed that each card would have a team member responsible for it.

As team manager I took it as my responsibility to make sure that items were clear every Monday and that each team member also had a summary card with their name on it, listing their priorities for the week.

Tools to help us: using functionality in Trello
To show who owned the cards we added Members, one per person, so that anyone could see at a glance who each card on the board belonged to.

We agreed to add Checklists to cards so that small tasks were linked up together under ‘pieces of work’ which could make their way across the board. This took some adjustment. There is a sense of achievement that comes from moving cards to done. However, if all tiny tasks have their own card then the board becomes cluttered and confusing. If each card is a piece of work, then checklists can show the tasks that are coming up and need to be completed before the card moves on.

After holding a session to agree the team’s 6 month priorities we put these on the far left of the board and created Labels for each. That way we could label cards with the priority that they fit, making it easy to see what work built towards our objectives and which was ‘other’.

Trust and Communication
In each team meeting we checked in to see how the new structure was working for people. Through this process we could gauge what was working (people were positive) and how it helped (people particularly liked listing their priorities for the week ahead; it helped them stay on track and push back on other requests). We could also get feedback on what didn’t work so well and discuss together how that could be adapted so we could test it again over the next week or two. An example of this was our set-up of columns: we started off with too many columns, each representing a different aspect of progress on a card, so we agreed the definitions and streamlined them as we went.

This is an ongoing process of test, reflect, adapt which is agreed as a team, keeping things collaborative, productive, flexible and positive. There is no right answer for exactly what the structure of your workflow management should be, but this is an approach that can help you find your own best way of doing things.

Scaling teams Part 1: Flow

I’ve just started a contract to head up a Data & Insight team with the goal of scaling from two team analysts up to six in the next six months. Naturally, my mind has been buzzing with ideas about what key things are needed to achieve successful scaling.

The team has been both stretched and under-resourced for the best part of a year and is about to lose one of the existing two analysts. They are a good team, recognised and respected for the value they add to the organisation. Their analytics practices are good – they have a well-managed Tableau server set up with processes and standards in place governing what is published there. They also have some of their main data sources housed in a data warehouse, feeding Tableau.

Working from the set up that this team already has, these are the elements that we are reviewing, improving and adapting in order to scale the team.

  1. Managing the delivery of the programme using Trello.
  2. Gathering requests from stakeholders using a shared email inbox.
  3. Documentation and communication focused around Microsoft Teams, OneDrive and Sharepoint.
  4. Using the website as a shop window communicating what the team offer.
  5. Building out a team structure based on domain areas, one for each new Senior Analyst to lead.

Reflecting on this list as a whole, there’s a strong theme here about creating smooth and effective flow. Flow of information or action into, out of, and across the team. Good flow owes more to establishing consistent behaviours from team members than it does to any particular tools used. Establishing well-flowing work practices takes:

  • Agreeing collectively ways of working to test and iterate.
  • Collectively setting expectations of behaviour.
  • Setting up structures and tools to enable agreed ways of working.
  • Agreeing some accountability mechanisms to help expectations to be met.
  • Realising how much of the work boils down to building relationships based on trust and good communication.

The challenge to creating flow is to create structure that is easy, productive, logical and helpful to the people doing the work. Anything that feels onerous, tedious or overly dutiful will form rocks in the river, blocking the flow.

What are the stages of the analytics and insight pipeline?

  • Request
  • Triage and prioritisation
  • Briefing
  • Data, analysis and insight work
  • Pre-delivery
  • Delivery
  • Retro
  • Follow up

What are the flows that we want to make sure are smooth, easy and productive throughout the pipeline?

  • Flows from stakeholders to the team
  • Flows from the team to stakeholders
  • Flows between multiple stakeholders
  • Flows between analysts within the team
  • Flows between analysts and other collaborators outside the team
  • Flows between team leadership and analysts to support and guide

Averages: Is it mean to use the mean?

There’s an often-quoted anecdote about the flaw of averages, which recounts the story of Lt Gilbert Daniels. In the 1950s, the US Air Force gave him the unenviable task of taking the physical measurements of over 4,000 pilots with the aim of finding the ‘goldilocks’ set of averages that would lead to better design of fighter plane cockpits. Define the measurements of the ‘average man’ and you can design the average cockpit and most pilots will fit it.
Won’t they?
Apparently not.

“Out of 4,063 pilots, not a single airman fit within the average range on all 10 dimensions.” Todd Rose

In any statistical analysis it is second nature to grab the mean average as the overview stat of choice. The ill-fitting cockpit, however, is a reminder that in many circumstances with real-world data, the mean average might not actually describe anyone or anything real.

This plays out in fundraising data all the time.
‘Could you give us the average gift from this activity, please? We want to see what people raise so we can try to make fundraising more effective next time.’

Well, the short answer is no. I can give you the mean average gift from a fundraising activity, and that information will help you forecast likely income from running such an activity again (assuming similar circumstances). It will not show you anything interesting about how people are behaving. Therefore it is unlikely you will be able to influence that fundraising behaviour in the future.

In these circumstances, rather than providing the single figure to explain a distribution, I encourage clients to look at the shape of the distribution itself. A histogram of donation amounts, or a graph of banded income will nearly always show a positive skew – a bulge at the lower monetary end and a long tail of rarer larger amounts at the upper end. Plotting amount given by number of givers looks like this:


Take the mean average and you don’t see the shape; you get a number somewhere in the middle. It doesn’t tell you that the bulge exists or where it is, and it doesn’t tell you that the larger amounts are outliers. A fundraising activity where most people give £10 could have an average gift of several hundred pounds due to a few people giving £1000, which could be hugely misleading.

Using the median (the indicator of the middle that literally points to the number half-way through the data) is less likely to be skewed by the outliers and therefore is more useful in showing a truer picture of average activity – what people are potentially capable of raising. As you can see, it’s closer to the bulge than the mean.

The mode (the indicator of the exact amount that the highest number of people gave) will show you where that bulge is. It can often be thought of as a psychological anchor that people are rooted to. A fundraiser can use this anchor to look at whether any of their marketing is driving people towards it (by use of a prompt) or whether this is a window into a wider cultural anchor that it might be worth working with rather than against.
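To make the contrast concrete, here is a quick sketch with hypothetical donation amounts – a bulge of small gifts plus a handful of large outliers:

```python
from statistics import mean, median, mode

# Hypothetical gifts: a bulge of small donations plus a few large outliers
gifts = [10] * 21 + [25] * 5 + [1000] * 3

print(mean(gifts))    # 115 – pulled far above what most people gave
print(median(gifts))  # 10 – sits in the bulge
print(mode(gifts))    # 10 – the most common gift, a possible anchor
```

The mean lands on a figure that not one of the 29 donors actually gave; the median and mode both point at the bulge.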

As with many things in the realm of insight analysis, the advice is to go into investigations with an idea of what you really want to understand. It is good to think about what an average is telling you and what you might be able to do about it.

Are you creating a cockpit that will fit nobody? Is it better to get the measure of different people and have some different fits of cockpit, or one cockpit design that is adjustable?  In marketing terms could you be designing different messages for different audiences? Or might you design a general fundraising proposition that could be customised to different audiences?

One-size does not fit all and in some cases it fits nobody at all, whether that’s for cockpits or marketing messages. Don’t be mean – no supporter you’re talking to is merely average.

I very much recommend this article taken from Todd Rose’s book The End of Average. 

A footnote on gender:
An interesting addition to the pilot story is what happened with the equivalent discovery of a mismatch between average body measurements and real body shapes in women. Whilst the pilot finding led to acknowledgement that there was no ‘average man’ and cockpit design needed to adapt accordingly, the interpretation of the finding for women was rather different. In this case, it was the opinion of the experts of the time that the average ‘ideal’ for women’s bodies was not wrong – women were. They should get in shape quick-smart.

It’s never a simple question of deriving ‘fact’ from ‘data’, it’s the interpretation of meaning that counts. To that end, it matters who’s doing the interpreting, who gets to tell the story, and who faces the consequences of conclusions.

Insight Analyst vs Software Developer: Puzzles

At home, my resident software developer and I discuss data, tech and coding among other more domestic topics. These other topics include: what we’re going to have for dinner, why the purchase of a new shed was never part of the plan of action prior to destruction of the old one, and whether or not socks can ever really be said to be ‘temporarily resting’ on the living room floor; a whole other blog is needed to tackle these questions.

When discussing technical matters it’s interesting to note what we agree on, and where our mindsets differ.

One Christmas, we’re at my mother’s home with my brothers and, as is customary in my family, we are lapping up various puzzle supplements from the Christmas newspapers. I alight on a Futoshiki and get stuck in, and then get stuck.

Futoshiki according to Wikipedia: The puzzle is played on a square grid. The objective is to place the numbers such that each row and column contains only one of each digit. Some digits may be given at the start. Inequality constraints are initially specified between some of the squares, such that one must be higher or lower than its neighbor. These constraints must be honored in order to complete the puzzle.


Got it? Right, so I’m stuck and I can’t find the next logical deduction in the sequence. I ask software developer (SD) if he wants to help by taking a look at the Futoshiki with me. Despite previous suggestions that slightly more maturity and slightly less borderline racism might suit him better, SD replies ‘I think you’ll find it’s pronounced Fucko-shit-o’. Moving swiftly on, we establish that SD is not interested in puzzles of this kind. He finds them frustrating and feels no satisfaction upon completion, concluding that the only thing that can be learned from puzzles is how to get better at those puzzles, something of no use to anyone. He does however enjoy coding problems, and so he looked at our puzzle problem as a classic case of ‘if something is worth doing, it’s worth coding a computer to do it for you’.

Thus started the race. I was to continue to try to solve the puzzle I was stuck on and complete it before SD wrote a programme that could solve any and all 5×5 Futoshiki grids. I won in terms of speed, although I had to use brute force on the logic at one point, which felt like cheating. Unable to make a deduction, I built scenarios for each of three options, following through until sure that two would fail, therefore confirming the correct path with the third. SD won in terms of comprehensiveness. His code used the brute force of computational power to run through every option for every row in a puzzle until it found the solution, thus providing a generalised solver. From my perspective his solution completely misses the joy of working through a puzzle yourself; from his perspective he had all the same fun in solving his own, which also resulted in a concrete workable thing at the end of it.
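For the curious, SD’s actual code isn’t reproduced here, but a brute-force solver of the kind described can be sketched in a few dozen lines of Python. This is my own illustrative version, not his: a backtracking search that tries each digit in each empty cell, rejecting any placement that repeats a digit in a row or column or violates an inequality clue.

```python
# Illustrative brute-force Futoshiki solver (not SD's actual code).
# grid is an n x n list of lists; 0 marks an empty cell.
# constraints are ((r1, c1), (r2, c2)) pairs meaning cell1 < cell2.

def solve(grid, constraints):
    n = len(grid)

    def ok(r, c, v):
        # Row and column uniqueness.
        if v in grid[r] or any(grid[i][c] == v for i in range(n)):
            return False
        # Check inequality clues, treating still-empty cells as satisfied.
        grid[r][c] = v
        valid = all(
            grid[a[0]][a[1]] == 0 or grid[b[0]][b[1]] == 0
            or grid[a[0]][a[1]] < grid[b[0]][b[1]]
            for a, b in constraints
        )
        grid[r][c] = 0
        return valid

    def backtrack(pos):
        if pos == n * n:
            return True          # every cell filled consistently
        r, c = divmod(pos, n)
        if grid[r][c] != 0:      # pre-filled digit: skip it
            return backtrack(pos + 1)
        for v in range(1, n + 1):
            if ok(r, c, v):
                grid[r][c] = v
                if backtrack(pos + 1):
                    return True
                grid[r][c] = 0   # undo and try the next digit
        return False

    return backtrack(0)

# A tiny 4x4 example: one given digit and one 'less than' clue.
puzzle = [
    [0, 0, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 0, 1],
]
clues = [((0, 0), (0, 1))]  # top-left cell < its right-hand neighbour
solve(puzzle, clues)
```

On a 4×4 or 5×5 grid this runs in a blink; the catch, as the next paragraph explains, is how badly this exhaustive approach scales as the grids grow.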

Neither of us needs to stop there with Futoshikis. I can continue to come to each puzzle fresh, perhaps even repeating ones in order to reduce the number of deductions needed to solve it. I could also move on to solving larger grids. SD could also tackle larger grids and the task of generalising a solution that solves grids of any size. At this point, victory potentially swings back in favour of the human analyst. I should be able to use the same deduction techniques again and again; larger grids will take more time but will still likely be doable until such time as I become bored or die. SD, on the other hand, will have to radically alter the brute force approach. Whilst computational power elegantly outweighs human endeavour for smaller grids, the processing time needed for such an approach multiplies exponentially as the grids get larger. This means at some point he will have a large grid that, while still solvable, would take all the time between now and the heat death of the universe for the computer to complete. To avoid this, SD would need to figure out a far more sophisticated algorithm, one that (arguably) mimics human reasoning to a greater extent; some sort of artificial intelligence. So perhaps next Christmas, as I enjoy bending my brain with my Futoshiki, SD will try creating an artificial me to solve all possible ones.
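To put rough numbers on that blow-up (my own back-of-envelope figures, not SD’s): a completely naive search that tries every digit from 1 to n in every one of the n² cells has nⁿ² candidate fillings to sift through, and that count grows monstrously with the grid size.

```python
# Back-of-envelope count of candidate grids for a completely naive
# brute-force search: every digit 1..n in every one of the n*n cells.
def naive_candidates(n):
    return n ** (n * n)

for n in (4, 5, 7, 9):
    print(f"{n}x{n} grid: {naive_candidates(n):.2e} candidate fillings")
```

A 5×5 grid already has around 3×10¹⁷ candidates; by 9×9 the count passes 10⁷⁷, in the region of the number of atoms in the observable universe. Pruning (as the backtracking above does) helps enormously, but the underlying combinatorial explosion is why ever-larger grids eventually demand a smarter, more deduction-like algorithm rather than more raw computing power.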

Deconstructing the mathematical bridge

For music students everywhere, musical analysis is incredibly useful for many purposes. You can pick out different structures from a musical work, pinpointing particular styles and influences. You can learn to recognise and recreate a composer’s sonic signature in orchestration, rhythm, melody and harmony. Identifying and contemplating these features aids and increases the overall appreciation and understanding of musical works by providing information that connects us back through time to historical context, biography and musical purpose. And, importantly, these are aspects of music that you can hear.

When I was studying for a music degree, we were taught Schenkerian analysis, which I and my closest peers found abstract, boring and pointless beyond its performance as an intellectual exercise. I’m not saying it is pointless, I’m saying it felt pointless and reductive. The joke about any Schenkerian analysis is that what you do is take a great classical work (let’s say Beethoven’s 5th) and reduce it down to the kernel structure of Three Blind Mice.

How neat, how elegant, what a charming coherent unity it all has! Gah, I hated it! How can this possibly add to my understanding when I’m left with so much less than the sum of the parts I started with?

A tutor sympathised with my frustrations, recounting the story of the mathematical bridge in Cambridge. The bridge is hundreds of years old, originally designed by Isaac Newton. Very unusual in design, it is made entirely of straight timbers arranged into an arch via some very sophisticated engineering. So ingenious was it that it held perfectly in place between two buildings by its unique balance of tension and compression alone. There it remained for many years until some inquisitive mathematicians wished to understand the design better. They dismantled the bridge with the intention of putting it back exactly as it had been. They failed. By taking it apart and not reaching a full understanding of how it worked, they were unable to restore it to its former elegance, and today the bridge is held in place by rivets. Too much analysis can indeed spoil things.

As I start again in psychotherapy I find myself thinking of the mathematical bridge often. Will I discover a new and greater understanding of myself allowing for appreciation, healing and positive change? Or will I be reduced to some disparate sum of my parts that perhaps won’t add up to the whole I started with? Nobody wants to be Three Blind Mice after believing that they might be, or have the potential to be, Beethoven’s 5th.

What comfort then to discover that the story of the bridge is a myth? The original design (which was nothing to do with Newton at all!) always had rivets in it; there was simply a time when they could not be seen. The bridge has been dismantled and rebuilt twice allowing for maintenance.

There’s a bittersweet quality to myth-busting. The magic of the story that resonates so strongly has to be true, no? As Stewart Lee often ends his ridiculous flights of fancy: ‘This story is not true, but what it tells us is true’. The mathematical bridge myth reinforces our deep fears about self-discovery and change, helping us hide, that’s why we like it. It’s less romantic to go in search of the rivets to tighten them up a bit, but perhaps that’s what the bridge really needs to keep it in place.