Episode 4: Just Counting
OK Actually is a podcast for people who are working hard, still falling behind, and starting to wonder if the problem is them. It's not.
Each episode — always under 25 minutes — we dig into what's truly broken and figure out how to build a solution that can actually work.
In this episode: vanity metrics aren't just a marketing problem — they're everywhere, and they're undermining your ability to diagnose anything accurately. We've spent the last few episodes on wrong diagnoses, ground truth, and friction. This one is about the data that feeds all of it. My husband Jeff says reporting without targets is just counting. After this episode, you'll know exactly what he means.
00:00 Hallmark Movie Obsession
03:18 Episode Setup: Measurement
05:56 Failure Mode 1: Counting Without Targets
08:27 Failure Mode 2: The Vanity Number Trap
10:01 Failure Mode 3: Data Collected, Action Not Taken
14:55 Three Questions Filter
18:30 Fix Your Reports
The three-question filter — apply it before you send the next report:
- Do you have a target? Not just a number, but a number with a "should be" attached.
- Do you have a benchmark? Do you know what good looks like relative to something, whether it's last period, best in class, or your own stated goal?
- Are you tracking a delta? Is anything changing, do you know why, and do you know what you're going to do about it?
If the answer is no to all three, you have a vanity metric. You're counting.
Find me here:
Get clear. Get sorted. Get going. Stay sane.
Transcript
A few years ago, Jeff and I got really into watching Hallmark movies, mainly Christmas movies that were super cheesy. I think actually, if we're being really technical here, it was when Netflix put out A Christmas Prince, and the absurdity and the cliché of it was something that sort of started as a hate watch, and then we realized we were kind of enjoying it. But over time, as we got more into Hallmark movies and started watching them off-season, there are just so many elements we really liked and could count on. You know, it's this idea not necessarily of the highest quality of filmmaking, but reliability and consistency and knowing that no matter what's going on in the world, there will be a happy ending. You're basically guaranteed that happy ending, and frankly, only low-stakes situations all taking place in under an hour and a half. Hallmark just does exactly what it says on the tin.
You might be pleasantly surprised, which you will be if you watch a charming little movie called Villa Amore that features a donkey named Baci and, like, beautiful picturesque shots of the Tuscan countryside. But you're never really gonna be disappointed, because, you know, when you see Lacey Chabert on screen, you know she's just gonna show up and do the same thing she did in the last movie, but with a different job this time, and it's gonna be fine. You're going to have a good time.
And because there aren't that many surprises, Jeff and I have added in a layer of entertainment, which is a scoring matrix. We have fifty-plus different dimensions that are frequent scenarios in Hallmark movies, obviously different criteria for Christmas versus non-Christmas movies, and we score every movie that we watch against this list. This includes things like lead character is up for a big promotion, prominent eccentric townie, family secret recipes or shared baking projects, and the ever-present wow dress moment, which only counts if the guy actually says "wow." Needy child gets a present, which is my favorite. Business name is questionable or punny, and we even have one for inaccurate or fake business materials. That last one was added after a particular movie we watched where there were multiple pie charts on screen in a big pitch or presentation, and they didn't even add up to 100%.
And we just were like, "Surely whoever was in charge of this for the set decoration could have gotten this right if they really cared." But the thing is, the idea of fake business graphs might actually be one of the more realistic things we've seen in a Hallmark movie, because I've seen pretty close to fake business graphs in real life, in real business environments. That's because a lot of those graphs aren't there to convey actual information. They're just there to signal that serious business is happening. You see some charts, and you know business is getting done. We're able to clock it instantly when we're watching something like a Hallmark movie, and we can laugh at it. But the reality is we'll go to work on Monday, and we either make or are presented with a pretty similar graph.
I'm Karen Doak. This is OK Actually, the show where we get clear, get sorted, get going, and stay sane.
And today, we're gonna talk a lot about how a mediocre approach to measurement doesn't really get you anywhere. In earlier episodes, we talked about making sure you're solving the right problem, and while measurement is often used at the end of a process to assess performance, poor measurement is often what kick-starts inaccurate problem-solving overall. If you're using incorrect or unhelpful or unnecessary data to drive decisions and problem-solving, that's another way you can end up wasting time and resources on the wrong problem.
In our first episode, we talked about the wrong diagnosis, that you might be working on the wrong problem entirely, just like I had been. And then we talked about ground truth in the next episode, where even when you think you've found the right problem, you need to test whether that's actually verified and load-bearing and not just a story that has good posture. And then in our episode last week, we talked about friction, the symptoms that make the right work painful to do even when you've found the right problem. So this episode is about a new trap that lives even earlier in the chain. Before you can diagnose well, before you can find ground truth, before you can even identify where that friction actually is, you need information that's real and actionable. And a lot of what gets handed to us as information isn't either of those things. Sometimes it's wrong, sometimes it's bad, sometimes it's just a distraction.
Vanity metrics was a term that we used a lot when we were talking about marketing in the early days of social media, I'm talking late aughts to early tens, when we were just sharing impressions, followers, page views. It was a number that sounds big but ultimately means nothing. The truth is that vanity metrics are still everywhere, and when you think about the kinds of data that you're presented with on a regular basis, you're looking at a lot of things that tell you very little. My husband, Jeff, works with data every single day, and he has a great quote that is actually good enough that I attribute it to him instead of simply claiming it as my own, which is that reporting without benchmarks or targets is called counting. I think when you apply that barometer to most of the reports you get, if you ask whether most things you're looking at are sharing real information that you can action against, or whether you're just looking at whether a number's bigger or smaller than the previous week, I think you'll be surprised at how often that kind of thing is happening to you.
There are a few different failure modes that I've observed in all of this. So the first one is counting without a target. I've seen and sat through vendor QBRs with so many exported graphs that just tell me how many times I did a certain action, without any framing around whether that's good or bad, and no recommendations on how I'm supposed to do anything differently. Then there's the number of pieces of software that just add dashboards so that you can have another way to check things. How often are you really reading every data point on that dashboard versus just being like, "Yep. Got it. I guess I'll look at that at some point"? Very rarely are those data points really driving the decisions you're making.
If you've been a software vendor or you've hired software vendors, you've absolutely sat through this QBR, the business review where you're hoping for insight and you're given pie charts. And I think in general, I wanna call out that a pie chart is a red flag for garbage reporting. So you end up with four slides that say what happened, but rarely ask, "Okay, do I care?" You know, how many people logging in is good? How often should they log in? Does it matter if they have different jobs? If the platform is exporting information that gets emailed around, maybe no one needs to log in, because they're getting what they need.
I've had customers churn where they were never using the platform, and then suddenly everyone was, and someone on our side thought that was a great sign because there was an increase in usage, when that increase was actually just everyone logging in to export their own data, settings, and notes. Millions sounds like winning, but what are you winning? What is the so what? What is the concept of what good actually looks like? Are you just counting until infinity? And if not, what are you counting to?
I think the concept of getting ten thousand steps a day is something that sort of falls under this bucket. It is a target, but it's a bit of an arbitrary one. As a woman in my 40s, I've been told I need to make sure I'm getting sufficient steps in every day, and then that 10,000 number came in, and I don't know very many people who don't live in a city who are able to hit that kind of number every single day. So now we're wearing devices and watches and rings and pacing in our kitchen at 10:00 PM trying to add a few numbers, but there's no real point to it, and there's no real change over time. Yes, more movement is better than being inactive, but who cares about the number of actual steps you got versus just knowing that you were more active that day and you took steps, no pun intended, towards increased movement?
Failure mode number two is the vanity number trap. I think so much about when I used to work in marketing and social media, and in those early days of digital media, everything came back to the number of impressions. So in 2009, something that might have been said boastfully is, "This story was shared by yahoo.com, so we've reached 200 million unique monthly visitors."
When you know something is buried on a subpage within Yahoo News, it's absolutely not being seen by 200 million people. It's not being seen by two million people. It's not being seen by 200,000 people. So we're wildly misrepresenting things. But we didn't have a more accurate number. And then, because we've been putting those giant numbers in front of everyone, when better measurement finally comes around and more accurate numbers are available, the real numbers look lower. It looks like you're doing a bad job. It makes it look like you're failing. Suddenly you're having conversations like, "Well, last year you told me you were reaching 200 million people, and now you're saying it's 20,000?" But I'm like, "No, I know for sure those 20,000 people saw it, they read every word, they got every part of our message. I don't know anything about those made-up 200 million people. I don't know if even two of them really saw it or read it." Those vanity numbers set expectations, and those vanity numbers in some cases were part of compensation or performance reviews or annual targets, because there wasn't anything better, and then suddenly everyone's trapped by their own data.
The last failure mode I've observed is just this idea of data being collected and collected, but no action being taken. I think a lot about NPS and a past employer where we collected NPS, or Net Promoter Score, data from our customers, with customer feedback, and it was not good. Customers were really critical of us. They were frustrated with how long it took to implement the software, how long it took to get some support tickets resolved, all things that I and my team had said were problems but weren't being fixed. So then I'm like, "Great, we're gonna have a meeting. I have real customer data that no one can argue with. It's gonna affirm everything I've been saying for months. I'll bring this forward. Change will come." Ideally, I wouldn't even need to do this. Ideally, I'd be able to say, "Hey, guys, 11 customers this week have said our support is worse than any other vendor they have," and that would be enough to make a change. But fine.
Ever the optimist, I thought looking at these survey responses without any sort of personal intervention or personal opinion was gonna be just the thing to fix it. What I didn't count on is that while no one could argue with the number, they could argue with all of those verbatims and all of the actual survey responses from real customers. So we're looking at the number in the meeting, and everyone is going, "Oh my gosh, we need to fix this. This is so bad. This is so low. We need to do something." But then when they look at the feedback attached to the number, the why-is-this-so-bad rationale, those very same people, the ones who could fix support or fix implementation, are like, "Hmm, I don't really think that's true," or, "I bet I know which customer said that. They're always complaining. That's not what's happening." And instead of making a commitment to really fix things, it got dismissed. All that feedback got dismissed, because at the end of the day, those leaders thought that they knew what customers wanted better than customers did themselves. So nothing would change, and then the next quarter, when the number was down even further, everyone was surprised and confused, and they would say, "Why do you think the number went down, Karen? Did we do something different?" And it's like, no. In fact, we did the opposite. We did the same thing that they hated, but we did it even longer, and that is worse.
While this might seem like a stretch, I think a little bit about Wordle when I think about data collected and no action taken. I know my streak, I know my average, I know my three-versus-four count, but then one day my dad texted to tell me that my one brother who got two twos in a week is clearly the family genius, and I'm thinking, um, no, that's not what that data means. None of that actually changes how I play the game. It doesn't change how anyone plays the game. It changes my view of my father and makes me lightly resentful of my brother for a brief moment in time. So you're just tracking, and that's fine for Wordle, but it's less fine when you're trying to drive real business outcomes.
I have been places where benchmarking was done so well, and the data that we could pull on what similar customers and similar companies were doing was able to really make a difference. Getting a chance to show a customer how they're performing right next to someone just like them, same budget, same size, similar industry. In marketing programs, being able to optimize based on both your own performance and your competitors' makes it even more effective, because you're all in competition for the same eyeballs and attention. So when we presented that information to them, we could say, "This is where you are, this is where best-in-class customers are, and this is what you need to do to get there." That is data being used for good. That is data that adds so much value. And when customers improve because of it or become best in class themselves, they get promoted. They're getting celebrated within an organization, and they're feeling grateful and appreciative of you and how you helped them. That's exactly the kind of thing that you want more of.
And yet there are so many places where, instead of great benchmarking, instead of truly actionable information, instead of defining and setting a standard on what best in class might look like, we just get a little dashboard sent to us every week and one number is bigger or smaller. There might be a little green or red triangle next to it, and somebody might reply once a month and be like, "Why did this go down?" And then there's a few emails sent around that and nothing really changes. As I'm saying all of this, surely The New York Times has enough data to do something better too. Where is the benchmarking from The New York Times? They know everything about me. I'm gonna try to submit that as a note.
So here's what I actually want you to take from this episode. Not just a way to evaluate the reports landing in your inbox, but a filter you can apply to anything you're producing too. Because most of us are doing both. We're sitting in QBRs where nobody can answer those questions, and we're also sending and sharing information where we couldn't answer them either. And so the filter works in both directions. Before you produce a report, before you build a dashboard, before you schedule the review, I want you to ask three questions. And if you're on the receiving end of someone else's data, ask them.
First, do you have a real target? Not an arbitrary one, not just a number, but a number with a "this should be" attached to it. This is where we are. This is where we should be. Are we better than that? That's amazing. We should celebrate. Are we worse than that? Here's how we're gonna get there. Second, do you have a benchmark? Do you know what good looks like relative to something, whether it's best in class, your own stated goal, or last period? Are you tracking against that, or are you just watching a number move without knowing if the movement means anything? And third, are you tracking a delta? Is anything changing, and do you know why? Do you know how much it has to change for it to matter? And do you know what you're gonna do about it? Because if it's no to all three of those things, you have a vanity metric, and you're just counting.
And counting can feel like rigor. It can have the contours of measurement, but it's not connected to a decision. And if it's not connected to a decision, it's not doing a job. I want you to think about one specific report you own right now, one dashboard you send, one number you track, one meeting you run where data gets presented. Can you answer all three questions about it? If not, what would it take for you to be able to?
I know that AI makes it so much easier for us to analyze everything, so I can only imagine the number of reports that are being created today that are just, again, looking at a million things, maybe creating a benchmark off of a ChatGPT hallucination. Maybe creating a real target, but again, not one that has been discussed or decided on by anybody. We saw this kind of thing happen with impressions in 2009, and then all the dashboards in 2015, and now AI is generating more reports faster than ever, some of them benchmarked against data a language model invented. And people are getting even more data thrown at them and being asked to do something with it without having an opportunity to ask, "Is this data accurate? Is it helpful? Is it telling me anything other than that it's more or less than last week?"
I wanna be clear, I'm not actually anti-counting. Jeff and I score every Hallmark movie we watch. I know my Wordle streak. I check my steps. And none of that has a target or a benchmark, at least not a real, verified one. And I genuinely do not care, because it's either fun to do, or it feeds my little competitive streak with my family, or it lets me feel like my plot-thin but happy Hallmark movies have a little more weight to them, and that's fine. The problem isn't counting for its own sake when it adds something. The problem is when we bring that same energy into organizations where someone is paying for that report, when someone is spending hours building a dashboard, and we're still not asking what decision it's supposed to inform. That's not fun counting. That's expensive counting.
I'm not sure that the steps you're getting pacing around the kitchen at eleven o'clock at night to close a ring are really the ones that are making a difference. And that impression number is not driving business impact if nobody ever saw the content, because it's buried at the bottom of a page. And your NPS score is certainly not gonna improve if you just read it out loud and then move on to the next agenda item. And whatever report you thought of a few minutes ago, the one that you own, I hope you can answer all three questions about it before it goes out next time. Your graph needs to mean something. Even in a Hallmark movie, some of us, admittedly maybe a slightly more OCD pair who are married to each other and put way too much time and attention into watching those movies, are pausing just to call it out when that graph doesn't mean anything.
I would love to know your stories of mediocre measurement or measuring mediocrity, and even more, I'd love to know what report you're going to fix. You can email me. My information is in the show notes. I'm Karen Doak. This is OK Actually, the show where we get clear, get sorted, get going, and stay sane.
:get going, and stay sane.