
  • The Difference Between Analytics and Analysis

    November progress report, noting that I worked out one fewer day compared to October

    I am something of a Peloton fanatic. You can find me (by my nom de guerre “LeftShark”) somewhere on a leaderboard every single morning. Coincidentally, I’m also an analytics junkie, and one of my favorite parts of a workout is afterward, when I crunch the numbers to see how I’m doing. Peloton provides all sorts of graphs and metrics, and it’s great to be able to wade through them and see my progress over the weeks and months. As a bonus, after the first week of the month, Peloton emails me a review of the prior month.

    At the beginning of December, I received my November report. It told me that I had worked out thirty days in November. Below the calendar that had every day filled in, there was a one-sentence commentary on my performance:

    This is 1 less day than last month

    If you’re thinking: “It should be 1 fewer day”, I applaud your grammar. Still, my point in bringing this up is that October has 31 days, and November only 30. So if I work out every single day for both months, it is inescapable that I will work out one fewer day in November.

    When I first read this attempt at insight, it made me laugh, but it also disappointed me a little. It’s a glaring example of a modern-day problem, that of Analytics vs. Analysis, and a reminder that we have plenty of the former and far too little of the latter.

    For most of human history, a lack of sufficient information was a huge impediment to decision making and planning. By contrast, today we often find ourselves with far more data than we can manage. Pages and pages of data. The problem is no longer that we don’t have enough information; it’s that we have so much information we don’t know what to make of it. The oft-used but suitable metaphor for trying to gain insights from a flood of data is “drinking from the firehose.” The providers of the data often try to do some number crunching for us, to help us draw conclusions and plan, which is where “data” leaves off and “analytics” begins.

    This year I played Yahoo’s Fantasy Football. After each week’s games, Yahoo would send me a recap of my team’s performance. At first, I was really delighted with the analysis, because it was written in a dramatic, sports-page style: “Jay Ajayi destroyed the competition with 15 runs for 100 yards,” and such. For the first couple of weeks, I looked forward to this recap, but then I started to spot patterns in the report. By the middle of the season, I had stopped reading. Most of the “analysis” was actually very superficial, based on perhaps one number, using canned phrases that more often than not didn’t reflect reality in any meaningful way. Some players were alternately praised and criticized in the same report depending on which statistic the algorithm was comparing at that moment.

    Of course, I never thought for a minute that there was an actual human being at Yahoo covering my fantasy football team, and I don’t think that anyone at Peloton is reviewing my performance data and writing my monthly report. But the superficiality of this so-called “analysis” shows me how far we still have to go. With the power of computers, analytics is easy. Analytics is just sums and averages and plotting changes over time. It’s the analysis that’s hard, and that’s the part that no one seems to be able to automate.

    Professionally, I see this all the time when dealing with website analytics. Say a web page has a “bounce rate” of 80%. (“Bounce rate” is how frequently a visitor to a web page leaves the site without visiting any other pages.) Is that a good thing, or a bad thing? Conventional wisdom is that it’s a bad thing: your visitor is leaving! But what if the visitor found the information they were looking for? A bounce rate that high might indicate that your web page is perfect. So, it’s either terrible or terrific, and the analytics alone are not going to tell you which. For that you still need a human, someone who can take into account the intent of the web page and the nuances that explain a high bounce rate.
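
    The arithmetic behind a number like that is trivial, which is exactly the point: the analytics part is easy to automate. Here is a minimal sketch in Python of how a bounce rate gets computed, using made-up session records (the data layout is my own illustration, not any particular analytics tool’s schema):

        # Hypothetical sessions: each visit is the list of pages viewed.
        sessions = [
            ["/pricing"],                      # single-page visit -> a bounce
            ["/home", "/pricing", "/signup"],  # multi-page visit  -> not a bounce
            ["/blog/analytics-vs-analysis"],   # single-page visit -> a bounce
            ["/home"],                         # single-page visit -> a bounce
            ["/home", "/contact"],             # multi-page visit  -> not a bounce
        ]

        # A "bounce" is a session in which exactly one page was viewed.
        bounces = sum(1 for pages in sessions if len(pages) == 1)
        bounce_rate = bounces / len(sessions)

        print(f"Bounce rate: {bounce_rate:.0%}")  # 60% for this made-up sample

    Computing the number takes three lines; deciding whether that 60% is good news or bad news still takes a person who knows what the page is for.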

    Algorithm-based analysis has gotten better over the years, to be sure, and a carefully written algorithm can lead to insights even without human intervention. Indeed, a lot of the tasks involved in human-powered analysis are the sorts of things that computers are good at (comparing trends, etc.), and it is only a matter of time until computer-generated analysis becomes much more valuable. Until then, we must not confuse analytics with analysis, and we must not content ourselves with the former when we really need the latter, no matter how cool the charts and infographics that companies use to try to impress us.

    I appreciate the analytics. More analysis, please.

    Here’s my background on the subject of this essay, so you can decide how much credence to give the opinions I present.

    I have twenty years or so of experience working on the web, from back before the days of web analytics (and Google, for that matter). Somewhere in my file cabinet is my Google Analytics IQ certification. I have also spent at least one Saturday poring over my stats from Peloton and Strava, and for better or worse, I can tell you on any given day how I’m doing against my weekly and yearly goals.