Decision Intelligence: why entity resolution is foundational to success

By Jamie Hutton, CTO and Co-founder of Quantexa.


Companies are inundated with data, and data teams are swimming against the tide, barely able to keep their heads above water, let alone generate insights or spark innovation. Dell Technologies labelled this situation the ‘data paradox’: businesses are gathering data faster than they can analyze and use it, yet they constantly need more data than their current capabilities can handle. All of this conspires to make organization-wide, data-driven Decision Intelligence difficult to achieve.

Truthfully, the odds are stacked against CDOs, CTOs and Chief Architects. They’re under pressure from the board to deliver tangible business outcomes, but are understandably focused on getting the fundamentals right: system architecture, data unification, data quality and so on. The temptation is to wait until these challenges are solved before deploying Decision Intelligence. But in reality, data will never be perfect, and while organizations wait, substantial value is being lost.

Building locks to handle the river of data

Pressure to deliver use cases is both an ongoing opportunity and a barrier to true Decision Intelligence, which can be seen as another digital transformation paradox. Let us view enterprise data as a river, constantly fed by tributaries of yet more internal and external data. Data teams looking to make sense of the river siphon volumes of it off into oxbow lakes, keen to contain the overwhelming volume of data they are contending with. The result is products and services cut off from the river: they may work in isolation, but they are not part of the organization’s whole data strategy. A typical enterprise will struggle to understand what data it has and where that data is at any given time. The result is a lack of trust in the data being used to support strategic and operational decisions.

Instead of siphoning data into oxbow lakes, CDOs should look to build locks that sit across the data river. They need a solution that is built for the high volumes of distributed, disparate data they are dealing with, and that is equipped to handle the often-attendant data quality issues. Good data management is foundational to everything organizations do. Yet while 97% of organizations invest in data initiatives, barely a quarter report having successfully created a data-driven organization (NewVantage Partners, 2022). To keep our river simile flowing, the more sophisticated approach is to create locks along the river: controlling the flow and making sense of it, while keeping it part of the overall whole.

How entity resolution can overcome traditional master data management shortcomings

Across industries, the first challenge is unifying disparate and siloed data at scale. You will have structured and unstructured data, both internal and external, all of which refers to real-world people, businesses, contacts and so on. The capability required is called Entity Resolution (ER): the ability to resolve records about individuals, businesses, products and more into complete, 360-degree views of people, organizations and places.

ER software is industry agnostic. It’s as suitable for helping a global bank understand its customers and counterparties as it is for a government seeking to create digital identities for its citizens. At the core of both ambitions is the problem of eliminating duplicate records and combining multiple records for the same person or entity. ER uses an iterative approach that compares and combines data from multiple systems to produce the most accurate match possible. This allows organizations to convert vast quantities of low-quality data, full of duplications and ambiguities, into meaningful, accurate descriptors of each entity.
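
To make the mechanics concrete, here is a minimal, illustrative sketch in Python of that iterative compare-and-combine loop. The record fields, similarity measure and matching threshold are all assumptions made for the example; production ER engines use far richer, configurable matching models.

```python
from difflib import SequenceMatcher

def normalize(record):
    """Lower-case and strip every field so cosmetic differences don't block a match."""
    return {k: str(v).strip().lower() for k, v in record.items() if v}

def similarity(a, b):
    """Fuzzy similarity between two field values, from 0.0 to 1.0."""
    return SequenceMatcher(None, a, b).ratio()

def is_match(rec_a, rec_b, threshold=0.85):
    """Average the similarity of the fields two records share; match if it clears the threshold."""
    shared = set(rec_a) & set(rec_b)
    if not shared:
        return False
    score = sum(similarity(rec_a[f], rec_b[f]) for f in shared) / len(shared)
    return score >= threshold

def resolve(records):
    """Iteratively merge matching records until no further matches are found."""
    entities = [normalize(r) for r in records]
    merged = True
    while merged:  # repeat whole passes: each merge can enable new matches
        merged = False
        for i in range(len(entities)):
            for j in range(len(entities) - 1, i, -1):
                if is_match(entities[i], entities[j]):
                    entities[i] = {**entities[j], **entities[i]}  # combine attributes
                    del entities[j]
                    merged = True
    return entities

# Three source systems describing the same person slightly differently.
crm    = {"name": "Jon Smith",      "email": "jon.smith@example.com"}
sales  = {"name": "Jonathan Smith", "email": "jon.smith@example.com", "phone": "0207 946 0000"}
portal = {"name": "jon smith",      "phone": "0207 946 0000"}

entities = resolve([crm, sales, portal])
print(entities)  # a single resolved entity carrying name, email and phone
```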

Unlike traditional master data management approaches, ER does not require a data transformation exercise to force data into a specific format. This approach to data wrangling is also continuous, which gets to the heart of our river and oxbow lake simile: ER capabilities implemented on siloed data are cut off from the whole. Let’s say an existing customer of a company signs up for a new product. Without continuous entity resolution, the organization may well treat them as a new customer, rather than correctly linking them with their existing profile.
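
Continuing the hypothetical sketch above, continuous ER means each incoming record is first tested against the entities already resolved, so a returning customer is enriched rather than duplicated. The `resolve_incoming` helper and the sign-up fields are invented for illustration.

```python
def resolve_incoming(new_record, entities, match_fn):
    """Link an incoming record to an existing entity, or create a new one."""
    candidate = normalize(new_record)  # reuses normalize/is_match from the sketch above
    for entity in entities:
        if match_fn(entity, candidate):
            # Existing customer: enrich the profile with any new attributes.
            entity.update({k: v for k, v in candidate.items() if k not in entity})
            return entity
    entities.append(candidate)  # genuinely new customer
    return candidate

# The sign-up form captured only a name and phone number, but the phone number
# already sits on a resolved entity, so no duplicate customer is created.
signup = {"name": "J. Smith", "phone": "0207 946 0000", "product": "savings"}
print(resolve_incoming(signup, entities, is_match))
```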

Why context is king in Decision Intelligence

According to Gartner, 65% of executives say the decisions they make are more complex than they were two years ago, and 53% face more pressure to explain or justify their decision making. No decision stands on its own, which is why decisions should be evaluated in a context-sensitive manner, looking beyond the scope of any individual record or event. With our example of an existing customer signing up for a new product, context allows us to understand the relationships this customer may have with other people, organizations and places within the data.

Using financial services as an example, contextual monitoring allows organizations to secure a wider, enriched view of the customer associated with any given transaction. A bank’s monitoring system may well be set up to flag any transaction above £50,000, a threshold chosen because it represents a change in behavior, above and beyond the usual transactions for a normal customer. However, understanding the context around such a payment might reveal that it is a deposit for a house purchase, which dramatically changes the risk profile.

That enriched view includes the source of funds for the purchase, whether the customer is in a high-risk geography or associated with negative news, and whether the customer is linked to someone on a high-risk watch list. Suddenly, the bank can expose that risk and elevate the alert, because it has the additional context, via the customer’s connections, to know what the risk actually is.
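
As a rough illustration, the sketch below contrasts a bare threshold rule with a context-aware one. The risk factors, weights and field names are invented for the example; in practice the context comes from the customer’s resolved entity and its connections, as described above.

```python
WATCH_LIST = {"acme shell ltd"}  # hypothetical watch-listed party

def naive_alert(txn):
    """The traditional rule: flag anything over £50,000."""
    return txn["amount"] > 50_000

def contextual_alert(txn, entity):
    """Keep the threshold, but let context raise or suppress the alert."""
    if txn["amount"] <= 50_000:
        return False
    risk = 0
    if entity.get("geography") == "high-risk":
        risk += 1
    if entity.get("negative_news"):
        risk += 1
    if WATCH_LIST & set(entity.get("linked_parties", [])):
        risk += 2  # links to watch-listed parties weigh heaviest
    if txn.get("purpose") == "house deposit" and entity.get("verified_source_of_funds"):
        risk -= 2  # a benign, explained payment
    return risk > 0

txn = {"amount": 75_000, "purpose": "house deposit"}
customer = {"verified_source_of_funds": True, "linked_parties": []}
print(naive_alert(txn), contextual_alert(txn, customer))  # True False: context clears the alert
```

Add a watch-listed name to `linked_parties` and the very same payment is escalated instead of cleared: the rule hasn’t changed, the context has.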

Although this is just one example, it shows how contextual monitoring can help build a solid data foundation from which enterprises can deliver more detailed analysis and, ultimately, improved decision making.

Knowing when (and when not) to automate decision making 

Decision Intelligence refers to the ability of humans to make better-informed decisions. Automation plays a huge part in this, but its true strength lies in knowing when to raise alerts for humans to then take stock and take control of the situation. In the example above, most transactions associated with house purchases will be benign; but if there are contextual risk factors associated with a transaction, the final call can and should be left to a human.
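
A minimal sketch of that routing policy might look like the following; the labels are hypothetical, but the principle is that clean, high-volume cases are cleared automatically while anything carrying contextual risk is handed to an analyst.

```python
def route_decision(txn, risk_factors):
    """Automate the clear-cut volume; escalate anything with contextual risk."""
    if not risk_factors:
        return {"decision": "auto-clear", "reasons": ["no contextual risk factors"]}
    return {"decision": "escalate-to-analyst", "reasons": risk_factors}

print(route_decision({"amount": 75_000}, []))
print(route_decision({"amount": 75_000}, ["linked party on watch list"]))
```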

Artificial intelligence and machine learning sit at the heart of these capabilities. Automating manual, high-volume operational decisions allows organizations to maximize efficiency and cost savings. And transparent models mean each decision is explainable, with full visibility for security and regulatory requirements, as well as for model validation and optimization.
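
One simple way to picture that transparency is a scoring model that emits, alongside every decision, the per-factor contributions that produced it. The weights and factor names below are assumptions for illustration only.

```python
# Hypothetical, human-readable factor weights for the sketch.
WEIGHTS = {
    "amount_over_threshold":    0.4,
    "high_risk_geography":      0.3,
    "watch_list_link":          0.6,
    "verified_source_of_funds": -0.5,
}

def score_with_explanation(features):
    """Score a case and return exactly which factors drove the decision."""
    contributions = {f: WEIGHTS[f] for f, present in features.items()
                     if present and f in WEIGHTS}
    total = sum(contributions.values())
    return {"score": round(total, 2), "alert": total >= 0.5,
            "explanation": contributions}  # full visibility into each factor

print(score_with_explanation({"amount_over_threshold": True, "watch_list_link": True}))
# {'score': 1.0, 'alert': True, 'explanation': {'amount_over_threshold': 0.4, 'watch_list_link': 0.6}}
```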

Generative AI co-pilots are particularly useful in this context. Organizations are rightly concerned about ‘black box’ thinking, but with a generative AI co-pilot they can interrogate its output, asking why and how it generated a specific alert or recommendation. However, it can only do so with real efficacy if it has constant access to the river of data flowing through the organization.

It’s for that reason that organizations should look to Decision Intelligence platforms that sit across the whole of an organization’s data. A platform that supports flexible deployment options, native or containerized, for public or private cloud will enable Decision Intelligence across the whole organization.
