There’s an idea I occasionally hear expressed among data professionals, which is that we believe we have the best vantage point in the entire organization to understand what’s really going on¹. This is because of the cross-functional nature of data work—metrics don’t consider the company’s org chart when they move, meaning that as data professionals we frequently work with data generated by all sorts of unrelated teams, systems and business processes. It also means our work frequently brings us into contact with people in a variety of roles and on disparate teams, whether as stakeholders of the work or as partners in getting it done. This environment does give data folks exposure to bits and pieces of many different things, and by virtue of the type of work we do, we can’t help but see patterns and problems.
Unfortunately, these cross-domain glimpses are not always great data. People don’t come into conversations with data professionals intending to give us exactly the information we need to form an accurate bigger picture. Sometimes they’re in an especially good or bad mood, and we consequently get a dose of optimism or a frustrated rant. Sometimes they come into meetings with us unprepared, and we get a disorganized dump of whatever happens to be on their mind that day. Sometimes they’ll even have the explicit goal of pushing their agenda on us, whether blatantly or subtly. The end result is a disorienting array of conflicting viewpoints, and before we know it, we’re staring into a dizzying kaleidoscope that only we can see.
There’s definitely something to that idea that data professionals have a unique opportunity to spot problems, but the real challenge is knowing what to do with the information we have. Some folks have the inclination to take responsibility for every issue they see, but this impulse can easily get you into trouble—I’ve never met a data professional who felt like they didn’t have enough to do. Others are quick to flag problems or raise alarms, only to grow frustrated when no one takes their concerns as seriously as they do—I’ve also never met a data professional who didn’t create problems for themselves by having different² standards than the people they work with.
Proactivity and an ownership mindset are crucial traits for anyone who works in data, but so is an awareness of the limits of your time, energy and organizational credibility. Once you’ve developed the ability to spot and describe problems, the next big skill to master is choosing your battles. Most folks learn this lesson the hard way³, but it’s possible to avoid a lot of grief by slowing down and asking yourself a few questions.
Does anyone else think this is a problem?
Just because you think something is a problem doesn’t mean anyone else does. This is not to say that something isn’t a problem just because you’re the only one that sees it right now, but rather that you should be cautious about expending a ton of energy on something when no one else understands why you’re doing it. In many cases, you will seem like you’re wasting time or resources trying to fix something that isn’t broken or is low priority to address, and if there’s an interpersonal dimension to the problem, it can make you seem political or petty. Leaning too hard into problems no one else sees ruins your credibility, and trust is one of your most important currencies as a data professional.
Again, just because you’re the only one that sees a problem right now doesn’t mean that it isn’t one—it just means that you need to do some legwork. Start by trying to define the problem in a precise and productive way. “The marketing team doesn’t care about data quality” is too vague and accusatory, and it says nothing about how any people or business processes are being negatively impacted. “The paid campaign performance report keeps breaking because the marketing team is naming ads in the wrong format” is better because it includes a root cause and explains why this is a shared problem between the data team and the marketing team.
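Sometimes it helps to go one step further than a well-framed sentence and show the problem concretely. Below is a minimal sketch of what that could look like, assuming a hypothetical ads export and a made-up naming convention of channel_campaign_YYYYMMDD; the table, column names and pattern are all illustrative, not a real standard:

```python
import pandas as pd

# Hypothetical naming convention: channel_campaign_YYYYMMDD, e.g. "paid-social_spring-sale_20240301".
AD_NAME_PATTERN = r"^[a-z-]+_[a-z0-9-]+_\d{8}$"

def find_misnamed_ads(ads: pd.DataFrame) -> pd.DataFrame:
    """Return the ads whose names don't match the agreed-upon format."""
    bad = ~ads["ad_name"].str.match(AD_NAME_PATTERN, na=False)
    return ads.loc[bad, ["ad_id", "ad_name"]]

# Toy data standing in for whatever your ad platform export looks like.
ads = pd.DataFrame({
    "ad_id": [101, 102, 103],
    "ad_name": ["paid-social_spring-sale_20240301", "Spring Sale FINAL v2", "search_brand_20240315"],
})
print(find_misnamed_ads(ads))  # surfaces the one ad name that will break the report
```

A short list of specific offending ad names turns “the marketing team keeps breaking the report” into something the marketing team can actually act on.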
Specificity is important in the context of organizational problems as well. It’s hard for an executive to be sympathetic to a data team manager who says the team is too bogged down with short-term tasks to do “higher leverage” work. They likely hear the same thing all the time from leaders and departments all across the company, and there’s a good chance they have no idea what “higher leverage” data work would even mean. “The product analytics team is going to stop doing follow-up analyses on inconclusive experiments because we need to expand our customer segmentation framework to include the new B2B product line” is better because it explains what the short-term tasks are, what you could be doing instead, and why the tradeoff is worth making.
Once you’ve better defined your problem, you may want to get a second opinion. Share your framing with someone else at your company who you trust, and crucially, who can be objective about the problem. It can be therapeutic to share a frustration with a peer or friend who will more naturally take your side, but you want someone who will help you validate your claim, not your emotions. Listen to what they push back on or find confusing, and to any alternate theories or explanations they counter with.
If you can’t convince them your problem is a problem, you may need to go back to the drawing board and find a new way to make your case, or you may want to back off. No buy-in is a recipe for spinning your wheels.
Is this your battle to fight?
Let’s say you’re able to convince someone that your problem is real and that it needs to be solved. That means you’re good to dive in, right?
Well, maybe.
The cross-functional nature of data work means we get exposed to a lot of problems, some of which are problems a data professional should own and some of which are problems that are revealed through data. The latter may still be something you can help solve, but you should tread carefully as you feel out the situation.
Take for example a scenario many data folks find themselves in—working with two teams that have conflicting interests. Let’s say you work for a company that allows you to host your own hobbyist forum, and you work with two PMs who are respectively responsible for discovery features and repeat engagement features. Since you work with both, you develop a pretty strong sense for how these two sets of products interact and impact each other’s metrics, and eventually you see a trend where the harder the company leans into discovery features, the worse reengagement metrics look.
You manage to successfully frame the problem. “We are bringing in so many new users through our discovery features that it is degrading the experience for existing users,” you say. “Places where we used to see extended back and forth conversations are now swamped with questions from beginners asking what is the best way to get into the hobby, and long-tenured users are now posting and commenting less than before.” The two PMs hear you out, and while they both acknowledge the trend you’ve identified, they disagree about the implications.
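A claim like that carries more weight when it comes with a simple cut of the data behind it. As a rough illustration only, not the actual analysis from this story, a sketch along the following lines, assuming a hypothetical posts export and an arbitrary 365-day cutoff for “long-tenured,” would show whether veteran activity really is falling as new-user activity rises:

```python
import pandas as pd

# Hypothetical export: one row per post, with the author's signup date attached.
posts = pd.read_csv("posts.csv", parse_dates=["posted_at", "author_signup_at"])

# Label each post by the author's tenure at the time of posting (365 days is an assumed cutoff).
tenure_days = (posts["posted_at"] - posts["author_signup_at"]).dt.days
posts["cohort"] = (tenure_days >= 365).map({True: "long-tenured", False: "new"})

# Monthly post volume per cohort: is long-tenured activity declining as new-user volume grows?
monthly = (
    posts
    .groupby([posts["posted_at"].dt.to_period("M"), "cohort"])
    .size()
    .unstack("cohort", fill_value=0)
)
print(monthly)
```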
The discovery PM says user growth is essential—the existing community was too niche and bringing in new people will help broaden the set of things discussed on the forum. The reengagement PM thinks the discovery approach isn’t targeted enough—while they agree the forum needs new users, these users are low-value, contributing nothing to the community but noise since most post once and never return. The PMs go back and forth for days, asking you to run analysis or pull data to prove or disprove their arguments.
Maybe this exchange feels fun at first, like you’re shaping the conversation around an important strategic decision, but before long you develop a clear opinion about what’s happening and you take the side of the reengagement PM. The discovery PM won’t give up, though, and insists there’s another way to cut the data that will clarify everything once and for all.
Perhaps you’re tempted to help them with this analysis because you know that slice of data will actually do the opposite and poke holes in the discovery PM’s argument. But sometimes, facts and logic don’t work. You uncovered the problem, but the longer this goes on, the clearer it becomes these PMs are misaligned on product strategy. Once you sense that a disagreement is becoming more philosophical than quantitative, you’re in exactly the type of situation that data professionals should back out from.
Even if you strongly agree with one of the PMs, it’s not your job to get the other one to come along—this is not a data problem, but instead someone else’s problem that was revealed through data. You’ve done your part by surfacing it to the correct parties, and now you should disentangle yourself. Save your energy and attention for things that are actually your job.
Of course, there will be times when you identify a real problem that falls outside your scope, but unfortunately doesn’t really fall into anyone else’s scope either. That could be a problem worth solving, but you should ask yourself one more question first.
What happens if this problem doesn’t get solved?
You have finite time, so the order in which you solve problems matters. Picking a battle to fight means that you’re choosing it above other potential uses of your time. When you take a step back and look at the bigger picture, does it really seem like this has to happen first?
Tech debt is a classic example of this type of problem. Maybe the code for your active subscriber metric is an absolute mess and everyone on your team breaks into a nervous sweat at the thought of having to modify it. You all agree you have to refactor it at some point and that it has to be someone on your team that does it because you’re the only ones with enough business context. But of course, your roadmap is already jam-packed, which makes it tough to prioritize maintenance work. Besides, you rarely touch the code now, so the complexity isn’t actively hurting you today. If there’s a lot of talk that your company is going to release a new subscription tier sometime next year but there are no hard dates on the calendar yet, would it be better to scope out the refactor now and tackle it sprint by sprint, or is this a problem you should ignore until a new subscription tier is officially on the roadmap?
Data people love talking about opportunity sizing, and this is your chance to do some. Is this problem one that is hurting productivity, and if so, whose? How much time is it costing them in a week, a quarter, a year? Can it be solved simply by no longer doing something, or would it require changing someone’s behavior or investing in new technology or headcount? Can it be broken into smaller problems, and if so, which of them are easiest or most impactful to resolve? Does it become harder to fix the longer you put it off?
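If it helps to make those questions concrete, even a crude back-of-the-envelope calculation can reframe the discussion. Every number in the sketch below is a made-up placeholder; the point is the shape of the math, not the figures:

```python
# Back-of-the-envelope opportunity sizing; all numbers are hypothetical placeholders.
people_affected = 3          # analysts who regularly touch the subscriber metric code
hours_lost_per_week = 1.5    # extra time per person spent working around the mess
loaded_hourly_cost = 90      # rough fully-loaded cost per analyst hour
refactor_hours = 80          # estimated one-time cost of the refactor

annual_cost_of_doing_nothing = people_affected * hours_lost_per_week * 52 * loaded_hourly_cost
refactor_cost = refactor_hours * loaded_hourly_cost

print(f"Annual drag of the status quo: ${annual_cost_of_doing_nothing:,.0f}")
print(f"One-time refactor cost: ${refactor_cost:,.0f}")
print(f"Payback period: {refactor_cost / (annual_cost_of_doing_nothing / 52):.1f} weeks")
```

Crude as it is, a number like the payback period gives you and your stakeholders a shared basis for deciding whether this battle beats the other things competing for your time.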
You also may find that as you wait, the surrounding circumstances dramatically change the equation. Company strategy and priorities develop, people and teams come in and out of the picture, and markets shift to create new pressures. Would your problem still exist if any of those things happened, and would your understanding of the best possible solution be the same? Don’t go overboard with thinking through the possibilities, but make sure you understand and feel comfortable with the tradeoffs of prioritizing this above other uses of your time. Delaying action and deciding not to act at all can be just as meaningful as actively engaging with a problem, so don’t underestimate the value of waiting.
—
Data work brings you into contact with all sorts of people and problems, and if you’re not careful, you can easily find yourself in a situation where you’re pulled into so many of those problems that you can’t contribute meaningfully to any of them. You have limited time and energy—don’t waste it on battles that aren’t worth fighting.
1. Depending on who’s saying it, it might be wishful thinking as much as it is an idea.
2. They’re not always better, and they’re not always right. Sometimes data folks’ standards are higher when they should be lower, and vice versa.
3. Myself especially.