“Looking for supernovae is mostly a matter of not finding them.”
— Bill Bryson
Evans is a patient guy. Each evening, he goes outside, wraps himself in the night and watches the sky. He waits, actively. Evans sweeps his 16-inch telescope across the sky like a windshield wiper, observing upwards of 400 galaxies in a single evening. It’s just Evans, his telescope and his uncanny ability to identify patterns of stars in galaxies.
Evans said something very interesting about his craft, about what it’s like to invest so much effort for only a few moments of obliterating euphoria: the moment when light that has traveled millions of years through space finally meets his eyes. Evans said:
“There is actually a certain value in not finding anything. It helps cosmologists to work out the rate at which galaxies are evolving…the absence of evidence is evidence.”
The absence of evidence is evidence. I get tangled in this statement.
Most often in life we look to “things” for direction—data, statistics, trusted perspectives, our own experiences—yet I wonder if the quiet, the absence of direction, is where we’d find more meaningful answers.
Looking at the digital world and our obsessive use of data in understanding human behavior:
We use analytics data for directional cues to design more intuitive user paths through a web experience. We establish event tracking and conversion goals to understand how users interact with our content. We look at social metrics to determine what information people find useful, who they’re influenced by, what they enjoy, what (we think) tips them over the edge into a buying decision.
When we direct all our effort toward identifying patterns in data that only explain what people did, I wonder how many opportunities we miss—left on the table because we often don’t think outside the parameters of our own biases and assumptions.
There’s much we can learn about user behavior by looking at voids in data. I call this “invisible data.” Gleaning value from invisible data requires a slight yet pivotal lens adjustment. It’s less about what people did, and more about what they didn’t do: where they didn’t click, what they didn’t search for, what they didn’t read, what they didn’t share. Then it’s asking “why?” with a willingness to hunt down a deep enough understanding of our users to craft a hypothesis, and then test it.
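In practice, that lens adjustment can be as simple as diffing what users *could* have done against what they actually did. Here’s a minimal, hypothetical sketch in Python—all element names and data are invented for illustration, not drawn from any real analytics tool:

```python
# Hypothetical sketch: surface "invisible data" by comparing what was
# trackable against what users actually did.

# Elements we instrumented with click tracking on a page (invented names).
tracked_elements = {"nav-home", "nav-pricing", "nav-docs", "signup-cta", "footer-contact"}

# Elements that actually appeared in the click-event log over some period.
clicked_elements = {"nav-home", "signup-cta"}

# The void: elements users could have clicked but never did.
never_clicked = tracked_elements - clicked_elements

for element in sorted(never_clicked):
    print(f"never clicked: {element}")
```

The output isn’t an answer in itself—it’s the starting point for the “why?”: is the element invisible, mislabeled, in the wrong place, or simply unwanted?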
Because making sense of data isn’t the tricky part. Understanding the people who generate it is. They are fickle, easily frustrated, attention-deficit beings whose behaviors, habits and experiences cannot be tracked in a spreadsheet. And when we only use existing data to better understand them, we should recognize that it’s only a fraction of the story.
A glimpse of the full story requires Evans-like patience and a habit of looking at the absence of evidence as evidence.