Feedback fail: discover the risk of Agile’s data-driven Achilles heel

Written by Phil Seaton.

Sometimes Analog Information Gives the Clearest Direction

Are you in danger of data-driven deceit?

Agile is a thousand different things. Agile could be the prefix in “agile scrum,” or it could be the vague notion of “moving fast.” Engineers might mutter something about “continuous [delivery / deployment / integration].” Microsoft has a page in the search results that appears right after the dictionary definition of the word. Did you know there’s a manifesto? It’s full of short, vague principles, like import this in Python. You can’t go wrong.

Incremental Improvement without Dogma.

That’s what I think of when I think of agile. But buried in all the ideas and books and blogs and seminars and real-life experiences of teams everywhere is a forgotten assumption, the Achilles’ Heel of Agile. There’s one shortcut to muck up agile fast. Ready?

Feedback Is Not Optional

Agile is a contract. In exchange for fast, incremental progress, you get fast, incremental feedback. The fast feedback enables another increment of progress in the right direction. Over time, many increments can add up to something sensible and directional.

In the frenzy over agile, most people focus on the incremental progress. Some forget about the incremental feedback that’s also part of the contract.

Without incremental feedback, agile can appear to work. But that’s Agile’s Achilles’ heel: a development methodology that looks reasonable and productive, but fails to deliver customer value.

Without feedback, development will be directionless. A thousand minimum viable products will add up to a thousand little pieces. Customers may never see value if every MVP chips away in a different direction. This violates rule #1 from the manifesto:

Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.

Here to save the day: business processes that guarantee incremental feedback. There’s a raft of these focused on team internals, such as scrum retrospectives and manager one-on-ones.

The danger of mindless feedback

I’m interested in the company-killing product feedback mechanisms. You know these heroes already: Data Science, A/B Testing, Objectives and Key Results (OKR), Key Performance Indicators (KPIs).

Each of these data-driven tools can fulfill the contract of agile. But there’s a danger: they’re habitual, automatic, and purposely dehumanized. Together, these data-driven strategies carry more gravitas than any gut decision could. “The data tells us” is the start of a sentence that can’t be contradicted.

Anything automatic can and will stop working if circumstances change. Every single one of these useful tools can be used to justify bad direction.

Many companies and teams have institutionalized feedback in one way or another. Just as it’s possible to have feedback “built in” to your workflows and processes, it’s possible for those very same “feedback enabled” processes and workflows to lie to you. How? By making you think you’re getting feedback when you’re not. Here are some scenarios, along with skeptical questions worth asking.

Data Science + A/B Testing

We know for sure that more people click the orange button in the bottom right than the blue button on the top right. Therefore, the orange button is better.

Is the fact that more people clicked / made it through the funnel / gave us their firstborn the only metric we care about? A/B tests almost always focus on the short term, narrow goal: more of X now.

But perhaps the business also has long-term goals, which might be served by the other option, or neither option? Perhaps it also wants to place qualitative limits on some options that seem too annoying / lack integrity / violate the mission.
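The narrowness is visible in the math itself. Below is a minimal sketch (with made-up numbers, not from the article) of the kind of calculation behind “the orange button wins”: a two-proportion z-test. All it can ever answer is whether one variant got more of X than the other right now; long-term goals and qualitative limits are simply outside its vocabulary.

```python
from math import sqrt

def two_proportion_z(clicks_a, views_a, clicks_b, views_b):
    """Two-proportion z-test: is variant A's click rate different from B's?"""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    # Pooled click rate under the null hypothesis of "no difference".
    p = (clicks_a + clicks_b) / (views_a + views_b)
    se = sqrt(p * (1 - p) * (1 / views_a + 1 / views_b))
    return (p_a - p_b) / se

# The orange button "wins" on clicks...
z = two_proportion_z(clicks_a=540, views_a=10_000, clicks_b=470, views_b=10_000)
print(f"z = {z:.2f}")  # |z| > 1.96 means significant at the 5% level
```

Note what the test never sees: retention, trust, mission fit. A statistically significant lift on a narrow metric is still just “more of X now.”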


OKRs + KPIs

We worked hard to define our Objectives, and we attached Key Results in numerical form so we’d know when we succeeded. Likewise, we worked hard to define our Key Performance Indicators so that they reflect the direction we want to grow in.

It’s great to have goals, and it’s great to know when you’ve reached them. Numbers have a nice binary quality: you either make your numbers or you don’t.

But means arguably matter as much as ends, or more. Did you make your numbers in a sustainable, repeatable way? KPIs can be modified to measure completely different things; are you sure yours are right? What if the business changes and the KPIs aren’t updated?
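One cheap defense against “made the number, missed the point” is to pair each KPI with a guardrail metric and check both together. Here is an illustrative sketch; the metric names, values, and thresholds are all hypothetical, not from the article.

```python
def kpi_report(kpis: dict, guardrails: dict) -> list[str]:
    """Flag KPIs whose target was hit while a paired guardrail metric slipped.

    kpis: name -> (actual, target); guardrails: name -> (actual, floor).
    """
    warnings = []
    for name, (actual, target) in kpis.items():
        target_hit = actual >= target
        g_actual, g_floor = guardrails[name]
        if target_hit and g_actual < g_floor:
            warnings.append(
                f"{name}: target met, but guardrail below floor "
                f"({g_actual} < {g_floor}). Is this sustainable?"
            )
    return warnings

warnings = kpi_report(
    kpis={"ad_revenue": (120_000, 100_000)},   # made the number...
    guardrails={"ad_revenue": (0.61, 0.70)},   # ...while retention fell
)
print("\n".join(warnings))
```

The point is not this particular check; it’s that a KPI read in isolation can certify a win that the surrounding metrics would call a loss.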

Solution for bad feedback: find a deeper connection

For me, the lesson isn’t that Data Science, Analytics, A/B Testing, or Business Metrics are bad or don’t work. Most of the time, they’re useful tools. But it can be hard to see the human reality behind all the numbers at work these days. We have numbers that can be used by a clever automaton to justify even the worst decisions, business or otherwise.

The risk is an uncritical use of data fed right back into the next iteration.

I’m lazy. When I find something that works, I like to repeat it without thinking.

To make sure I’m critical of my agile team’s feedback, I try to break the routine from time to time. I try to connect with someone who actually talks with customers, at least once per sprint.

Lunch with someone in sales. Coffee with customer success. Calls with customers. Talking with actual people, face to face, about the product. That’s how I try to avoid the risk of data-driven agile.

Is the first thing you hear about a recent product decision negative? Don’t dismiss it out of hand. Your numbers say doubling the number of ads on the page was the right move… but was it?

If you get too many of those wrong, all those MVPs stop adding up to anything.

Was this piece thought provoking? Did I miss something huge, or hit the nail on the head? Please reach out in the comments, and connect on LinkedIn. I want to learn from your perspective!

Feedback fail: discover the risk of Agile’s data-driven Achilles heel was originally published in freeCodeCamp on Medium.