Like oil, information is worthless in its original form.
Once refined, distributed and commercialised, however, it can help companies predict future events.
Like most developments in the data economy, predictive analytics is not without its own set of challenges. If you’re considering investing in scaling an algorithm to support your business across its digital strategy, here’s what you need to consider.
Since the birth of the data economy, debate has raged over which group actually creates the value. The stakeholders are the producers and aggregators of data, the users who are represented in the data, and the consumers of data.
Notable commentary holds that the trade-off for using a service for free is the willing surrender of a user’s data. Others have argued that users should profit from the exploitation of their data, while policy experts have questioned whether a user retains any right to access that data once it has been anonymized. Is a data monopoly acceptable, or is it anticompetitive? Twitter gradually placed restrictions on its data feeds, moving away from an open-data strategy; in doing so, it treats data, and its exclusivity, as a profit center.
Technical solutions allow companies to strip out the metadata that identifies an individual person, leaving only their footprints in the digital sand. When deciding your company’s stance on the issue, take into account the brand promise, customer perception and the legalities of the matter, so that an off-hand discovery of your data use is not met with shock.
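As a minimal sketch of what stripping identifying metadata can look like: the record fields, the salted-hash pseudonymisation step and the coordinate coarsening below are all illustrative assumptions, not a prescribed method.

```python
import hashlib

# Hypothetical event records; the field names are illustrative assumptions.
events = [
    {"user_id": "alice@example.com", "lat": 51.5074, "lon": -0.1278, "action": "search"},
    {"user_id": "bob@example.com", "lat": 48.8566, "lon": 2.3522, "action": "click"},
]

SALT = "rotate-me-regularly"  # assumption: a salted hash stands in for real pseudonymisation


def anonymise(record):
    """Replace the identifying field with a salted hash and coarsen the location."""
    out = dict(record)
    out["user_id"] = hashlib.sha256((SALT + record["user_id"]).encode()).hexdigest()[:12]
    # Round coordinates so the behavioural 'footprint' survives
    # but the exact position does not.
    out["lat"] = round(record["lat"], 1)
    out["lon"] = round(record["lon"], 1)
    return out


anonymised = [anonymise(e) for e in events]
print(anonymised[0])
```

Note that salted hashing is pseudonymisation, not true anonymisation: if the salt leaks, identities can be re-derived, which is exactly why the legal review mentioned above matters.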
Sentiment and intent analysis are the cornerstone of predictive analytics for advertisers; this is why search advertising, which captures intent directly, is far more effective than display advertising. As with Google’s Waze, another application is predicting traffic patterns in real time, allowing marketers to determine how best to serve location-specific offers that drive drive-through visits or physical footfall.
Technical solutions in this area rely on middleware that can aggregate data from multiple sources; the combined data then has to be processed by increasingly powerful machine-learning systems before it makes sense. The modelling step (intent) and parameter fitting (sentiment) are then applied within a framework built around generative models.
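A minimal sketch of the two stages described above: the "middleware" step is reduced to normalising and merging two differently shaped feeds, and sentiment is a crude lexicon score. The feed names, field names and word lists are all illustrative assumptions, not any particular vendor's API.

```python
# Two hypothetical feeds with different schemas (assumed field names).
crm_feed = [{"user": "u1", "text": "love the new menu"}]
social_feed = [{"handle": "u2", "post": "terrible wait times"}]

# Assumed toy sentiment lexicons.
POSITIVE = {"love", "great", "good"}
NEGATIVE = {"terrible", "bad", "slow"}


def normalise(record):
    """Middleware step: map differently shaped feeds onto one schema."""
    if "text" in record:
        return {"user": record["user"], "text": record["text"]}
    return {"user": record["handle"], "text": record["post"]}


def sentiment(text):
    """Crude lexicon score: +1 per positive word, -1 per negative word."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)


merged = [normalise(r) for r in crm_feed + social_feed]
scored = [{**r, "sentiment": sentiment(r["text"])} for r in merged]
print(scored)
```

In practice the lexicon score would be replaced by a trained model, but the shape of the pipeline, normalise then merge then score, stays the same.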
It is often presumed, and hoped, that with the right systems in place, data can be contextually served to exchanges and used to reach the right audience at the right time in the right tone. So it’s fair to say that not all data holds equal value for all buyers. Waze’s traffic-prediction tools can help businesses map their supply chains and help restaurants plan time-sensitive offers to drive footfall, but that same data may be of little value to a high-end car dealership whose customers sit well outside the income range represented in the data set.
The development of pricing models for data exchanges and data ecosystems will vary with the assumptions we make about the value of data to each customer segment. When exchanges are not competing with potential customers over how the data is used, they represent a viable business model as data providers.