For years, the business world has been enraptured by the concept of big data. But the era of big data will not last forever. In fact, the replacement knocking on the door is one that might sound counter-intuitive: small data.
Conventional wisdom suggests that data aggregation will only increase in size and scale. With an ever-expanding consumer base with evolving tastes and an explosion of connected devices and digital channels to create and extract data, how could it not?
But as we reach the point where most forward-looking businesses have “digitally transformed” and successfully used the vast amount of data to their advantage, the foundation is shaking. Interestingly enough, the main factors driving this change are the consumers providing much of the data, and the technology that has made use of this data. Let’s examine each of these elements, and the path forward for businesses to use data in the best way possible.
Amidst privacy concerns, consumers push back
For much of the big data era, businesses have held the power. But with immense grassroots advocacy and legislation such as the EU’s GDPR and the California Consumer Privacy Act, the pendulum of power has swung toward consumers.
As members of the business community, it’s all too easy to be both frustrated and skeptical of the consumer mandate to reclaim privacy control. While consumers may bristle at what they view as an invasion of privacy, people still demand access to every type of product at a moment’s notice, at a reasonable cost, and with the best promotions. In fact, data about consumer behaviors and desires has led to these products—from organic to private label—being created on a higher scale, with increased innovation. And just as businesses have started to revamp their technology infrastructure to drive the most value from big data, they’re being asked to pull back.
But consumers have a strong argument that grows stronger with every wayward and invasive advertisement, particularly when it results from seemingly irrelevant data, ranging from how long someone spends on a webpage to whether they’re using mobile or desktop. Most people can easily recall receiving an ad for something they had already bought. Consumers appreciate when brands understand their motivations and preferences—underpinned by data aggregation—but they want to take the reins and customize their own journeys.
In lieu of traditional data aggregation, some companies are experimenting with a barter model: in return for consumers sharing their data, businesses offer monetary rewards or value-added services in a gamified setting, such as extended free trials of their apps. Blockchain is also playing a role in helping consumers monetize their own data.
For better or worse, brands are accountable for how they’re using data—they can’t embrace a black box mentality. And with every new opportunity for consumers to “opt out,” the big data pool becomes smaller. Brands need to get creative.
AI is now capable of doing more with less
One of the ironies of the big data era is that it may have played a part in making itself obsolete.
The basic ideas of deep learning have been around for decades, but have only taken off recently, with big data as the primary driver. Predictive machine learning algorithms historically required big data to drive value, because when you have more data to work with, you can train larger neural networks with more inputs to learn from and improve their “intelligence.”
With the explosion of the Internet of Things (IoT) and consumers’ digital footprints—from in-store purchases to social media posts—AI is yielding higher performance than it ever could before. But as the technology continues to advance in capability, businesses will actually need less data going forward to reach the point where machine learning becomes valuable.
Ingestion of data points has become more automatic, and human operators need to do less sifting to make sense of it all. Among other evolutions of AI, image processing technologies are increasingly able to characterize a visual image automatically, without hand-holding. Deep learning allows embedded neural networks to work directly on raw data and output a simple recognition of the image, with no hand-crafted features in between.
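To make the "raw data in, recognition out" flow concrete, here is a minimal, purely illustrative sketch: a tiny two-layer network classifying a fake flattened 8×8 "image" into one of three hypothetical labels. The weights are random stand-ins for what training on data would learn; real image classifiers are far deeper, but the shape of the pipeline is the same.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Standard activation: zero out negative values.
    return np.maximum(0, x)

def softmax(x):
    # Turn raw scores into probabilities that sum to 1.
    e = np.exp(x - x.max())
    return e / e.sum()

# Random weights stand in for learned parameters.
W1 = rng.normal(size=(64, 16))   # 64 pixels -> 16 hidden features
W2 = rng.normal(size=(16, 3))    # 16 features -> 3 class scores

image = rng.random(64)           # a fake flattened 8x8 grayscale image
hidden = relu(image @ W1)        # the network extracts features itself
probs = softmax(hidden @ W2)     # a probability for each label

labels = ["cat", "dog", "car"]   # hypothetical label set
print(labels[int(probs.argmax())], probs.round(3))
```

The point of the sketch is the absence of any hand-written rules: the pixels go in one end, a label and confidence come out the other, and everything in between is learned from data.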
As with any AI use of data, the most important element is the quality of the data. Aggregated data that is thorough, inclusive, and regularly updated and re-examined may meet this standard. “Dirty” data means less intelligent analysis, resulting in more inefficiencies, corrections, and costs. And to abide by new and emerging legislation—and meet consumers’ expectations—the data brands gather needs to be paired with the necessary privacy safeguards. Big or small, quality must be at the core.
Depth is more important than breadth
Although it sounds intimidating to narrow the focus of the data pool, opportunities abound. Based on the consumer mandate and the evolution of technology, the light at the end of the data tunnel rests in the depth, not breadth, of consumer data. Even though over the past decade data has been generated on a higher scale than ever before, less than 1% of data created is actually collected and analyzed. And it’s not as though big data has driven resounding success with advertising: according to a recent U.S.-based NielsenIQ survey, 75% of respondents said advertisers have not improved or are worse at reaching them at the right time with the right product than they were five years ago.
Today’s economy runs on trust and transparency. For brands, this comes from using the right data, not necessarily more data. For example, a wellness-oriented grocery brand should know which demographics prefer plant-based meat alternatives, but not which technology gadgets those consumers are purchasing. In a similar vein, retailers should have a deep knowledge of their base of consumers with sugar-related dietary issues, to determine whether they should be stocking more or fewer varieties of cereal. With the fragmentation of consumer preferences, it’s essential for brands to build data sets that are deep and specific.
The business world won’t shift to a small data landscape overnight, but there are avenues to take. Machine learning at the edge represents one way to bring small data execution to life. In edge computing, resources like compute, storage, and memory are more limited, but for specific vertical applications—or problems that call for a less complex model trained on smaller but deeper data sets—edge deployments can be a good fit. Yet for most companies, migrating to a small data era represents a future state. Some have succeeded with their big data initiatives, while others continue to struggle.
Regardless of the path they’ve taken to get here, brands willing and able to narrow their focus and market niche further than they have can put themselves on the path toward leveraging their data in a smaller, deeper fashion. Relevance and personalization are the key dynamics at play for brands to use data to their advantage.
This article originally appeared on CMS Wire.