The problem with feedback

Back in the ’90s, John Laws led an ad campaign for Valvoline. It had the catchphrase “oil ain’t oil”.

It focused on the supposed quality and excellence of the oil in much the same way as John West did with salmon.

This focus on quality and excellence had me thinking lately about data: whether in fact ‘data ain’t data’ and data is not neutral.

In an article for The Atlantic, Megan Ward provides a history of feedback. She touches on its origins in improving industrial machine efficiency and finding fault. The problem is that in recent times it has been appropriated as a tool for managing people, treating them as a form of human machinery.

Positive ratings are a kind of holy grail on sites like Yelp and TripAdvisor, and negative reviews can sink a burgeoning small business or mom-and-pop restaurant. That shift has created a misunderstanding about how feedback works. The original structure of the loop’s information regulation has been lost.

Ward explains that this confuses things, and in the process we risk reducing the activity to noise, rather than any sort of purposeful meaning and change.

I was particularly reminded of this during a recent holiday to Fiji. I had some points of frustration about the place where we stayed and thought that it might be worth providing feedback. However, the longer I stayed, the more I realised that such feedback would most likely miss the mark. Rather than improve the experience for others, as I imagined feedback should, it would more likely be weaponised and lead to worse working conditions for the staff. To put the issues in context, each was dealt with in a timely manner; in some respects that is all you can ask for. Providing feedback would also take away from what actually made the whole time most hospitable: the people. I decided not to provide it.

Another scenario that comes to mind is performance reviews in schools. I remember there was political outrage a few years ago that the vast majority of teachers in Victoria seemingly moved up their increment each year. It was felt by some that the review process was not weeding out underperforming teachers. The problem I had then (and have now) is that this judges the process against the wrong purpose. Teachers are not steam engines in need of optimisation towards some sort of greatness. Instead, they require feedback and follow-up based on particular contexts and conditions. This is why performance reviews are different to coaching programs. Jon Andrews explains this difference as improvement versus development.


The question that often feels overlooked when it comes to feedback is who or what is it actually for? It is easy enough to collect clicks and likes, but without purpose it can quickly just become noise. Data ain’t data; to treat it so misunderstands its purpose and its association with feedback.


If you enjoy what you read here, feel free to sign up for my monthly newsletter to catch up on all things learning, edtech and storytelling.

Data Ain’t Data – Reflecting on Feedback by Aaron Davis is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

15 thoughts on “Data Ain’t Data – Reflecting on Feedback”

  1. Nice post Aaron. A helpful discussion of the purpose and intent of feedback. Feedback as fault finding or error correction, as in mechanical systems, is really unhelpful when working with humans!

  2. Thanks Chris. I take issue when we think the answer to improvement is collecting a wide range of data points in the hope of finding some sort of fix. I feel that this often strips agency from teachers to drive their own development.

  3. Marcus Buckingham and Ashley Goodall dive into the world of feedback. They argue that in many respects, it fails to achieve the intended outcome.

    Focusing people on their shortcomings doesn’t enable learning; it impairs it.

    Buckingham and Goodall highlight three theories that those who believe in feedback often accept as true:

    That other people are more aware than you are of your weaknesses, and that the best way to help you, therefore, is for them to show you what you cannot see for yourself.
    That the process of learning is like filling up an empty vessel: You lack certain abilities you need to acquire, so your colleagues should teach them to you.
    That great performance is universal, analyzable, and describable, and that once defined, it can be transferred from one person to another, regardless of who each individual is.

    In response, they propose a number of strategies to support the development of others, including:

    Look for outcomes
    Replay your instinctive reactions
    Explore the present, past, and future

    This is something I have written about too, discussing the problem of feedback.

  4. Chris Woolston dives into the problematic world of performance reviews. He speaks with a number of experts in the area, including Herman Aguinis, who explains that the process is in many respects broken:

    All too often, Aguinis says, formal performance reviews become a self-serving exercise in politics, not a realistic examination of an employee’s strengths and weaknesses. “Some managers will give biased ratings on purpose,” he says. “I have personally seen a supervisor giving a bad employee a good rating just so that employee could get promoted out of his unit.”

    The answer is not to remove reviews, but instead to make them more regular, thereby making the feedback more meaningful:

    To really understand the value of their employees, Aguinis says, managers should double down on the practice of everyday management. That means checking in on employees every day and giving them real-time feedback on things they’re doing well and areas where they can improve. “When performance is a conversation, when it’s not something that happens just once a year, the measurement becomes very easy and straightforward with no surprises,” he says. He adds that it’s important to gather input from many different people within the system – peers as well as supervisors. “The best source of data is often not the manager,” he says.

    This is another interesting post which captures some of the problems with feedback and the challenges of self-determined learning in a world ruled by numbers.

