August 9, 2017

Operationalizing Data-Driven Decisions: A 5-Step Methodology

Scott Zoldi


Making effective decisions across an enterprise, a government, or a planet involves more than using well-designed software to analyze data. To operationalize data-driven thinking, you need a decision environment that can be replicated consistently, in multiple situations.

Most organizations fail at this because they simply don’t have a decision methodology in place. Decisions are “just made,” and things happen as a result. Do stakeholders ever reflect on the effectiveness of that decision? Or are the factors that led to a decision, and its result, buried in complexity?

To get there, companies need to establish a simple, repeatable process for deciding, a best practice that works in any situation and that allows stakeholders to reflect on and learn from the decisions they make.

Five Steps Toward Operationalization

Importantly, the methodology should not require IT resources to implement, and should cover five key stages:

1. Codify the decision process and domain expertise so both can be easily examined, repeated and shared. Codification is the critical first step: it creates a knowledge and logic audit trail for all decision logic, one that transcends individuals and organizations. Once documented, decision logic can be shared, improved, updated, audited, tested, and simulated. Perhaps equally important, decision logic codified as a business process can be implemented consistently throughout an enterprise and with any end user.
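To make this concrete, here is a minimal sketch in Python of decision logic codified as named, inspectable rules. Everything in it (rule names, thresholds, applicant fields) is a hypothetical illustration, not any particular product's logic:

    # A minimal sketch of codified decision logic: each rule is a named
    # predicate, so the full set can be listed, shared, audited, and tested
    # rather than living in someone's head. All names and thresholds are
    # hypothetical.
    RULES = [
        ("minimum_age",    lambda a: a["age"] >= 18),
        ("income_floor",   lambda a: a["annual_income"] >= 20_000),
        ("debt_ratio_cap", lambda a: a["debt"] / a["annual_income"] <= 0.4),
    ]

    def decide(applicant):
        """Apply every rule; return the decision and the rules that failed."""
        failures = [name for name, rule in RULES if not rule(applicant)]
        return ("approve" if not failures else "decline", failures)

    print(decide({"age": 30, "annual_income": 55_000, "debt": 30_000}))
    # -> ('decline', ['debt_ratio_cap'])  since 30000/55000 > 0.4

Because each rule carries a name, the same logic can be reviewed by a compliance team, re-run in a test harness, or handed to a new analyst without translation.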

2. Record the decision and the factors and data that led to it. Many business decisions, customer interactions or treatments are governed by regulations and must remain compliant with legal or business best practices. By capturing the decision logic as well as the data and analytics that informed a business decision or process, an enterprise creates an auditable paper trail that endures as knowledge workers come and go. As people move on to new projects, an enterprise knowledge or decision management solution can capture and document details that would otherwise be lost with the original contributors.
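As a sketch of what such a record might contain (the field names and file path here are assumptions for illustration), each decision can be appended to an immutable log together with its inputs and the version of the logic that produced it:

    # A sketch of an auditable decision record: capture the inputs, the
    # version of the codified logic, and the outcome, and append them to
    # an append-only log that outlives any individual contributor.
    import datetime
    import json

    def record_decision(log_path, inputs, logic_version, outcome):
        entry = {
            "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "logic_version": logic_version,  # which rules/models were used
            "inputs": inputs,                # the data the decision saw
            "outcome": outcome,              # what was decided
        }
        with open(log_path, "a") as log:     # append-only paper trail
            log.write(json.dumps(entry) + "\n")

    record_decision("decisions.jsonl",
                    inputs={"age": 30, "annual_income": 55_000, "debt": 30_000},
                    logic_version="rules-v1.2",
                    outcome="decline")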

3. Model the analytics used to make decisions, with models that can be managed and re-purposed. Part of the challenge with the rise of an increasingly competitive analytics and data science practice is leveraging best practices across an organization. Simply creating predictive analytic algorithms is not enough. Sharing best practices and fostering cross-organizational collaboration creates opportunities to efficiently scale the use, and the power, of analytics throughout an enterprise. In addition, building more powerful analytics and decision processes by connecting otherwise disparate parts of an organization steadily improves its ability to connect with, and impress, customers (e.g., connecting new customer onboarding with customer lifecycle marketing, fraud detection, upsell marketing, etc.).
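One way to picture this is a model registry, sketched below in deliberately toy form: models carry metadata (owner, version, description) so other teams can discover and reuse them instead of rebuilding them. The names and the in-process dictionary are illustrative assumptions; a real deployment would use a shared registry service:

    # A toy model registry: models are stored with metadata so they can be
    # managed, discovered, and re-purposed across teams. Illustrative only.
    REGISTRY = {}

    def register_model(name, version, owner, predict_fn, description=""):
        REGISTRY[(name, version)] = {
            "owner": owner,
            "description": description,
            "predict": predict_fn,
        }

    def get_model(name, version):
        return REGISTRY[(name, version)]["predict"]

    # One team registers a (hypothetical) fraud score...
    register_model("fraud_score", "1.0", owner="risk-team",
                   predict_fn=lambda txn: min(1.0, txn["amount"] / 10_000),
                   description="Toy score that scales with transaction amount")

    # ...and another team, say onboarding or marketing, reuses it.
    score = get_model("fraud_score", "1.0")({"amount": 2_500})
    print(score)  # -> 0.25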

4. Optimize the models as business conditions and data change to ensure they deliver the results you want. Optimization of the decision automation process provides two distinct opportunities:

  • First, a chance to fine-tune decisions, offers, or treatments to better align with business priorities.
  • Second, the ability to simulate how changes to the decision-making process, new analytics, or new data would affect customer interactions or decisions.

Simply adding analytics or codifying business rules does not, by itself, connect either to a business's priorities. This is the value that optimization provides: articulating, simulating and deploying decisions in a manner that keeps an organization aligned and on track, as in the sketch below.
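As a small illustration of the simulation side (the records and the threshold change are hypothetical), a proposed change to the decision logic can be replayed against historical data to see how it would shift outcomes before anything is deployed:

    # "Simulate before deploy": replay historical cases under a proposed
    # rule change and compare outcomes. Records and thresholds hypothetical.
    history = [
        {"annual_income": 55_000, "debt": 30_000},   # debt ratio ~0.55
        {"annual_income": 40_000, "debt": 14_000},   # debt ratio  0.35
        {"annual_income": 90_000, "debt": 38_000},   # debt ratio ~0.42
    ]

    def approval_rate(records, debt_ratio_cap):
        approved = sum(1 for r in records
                       if r["debt"] / r["annual_income"] <= debt_ratio_cap)
        return approved / len(records)

    print("current  (cap=0.40):", approval_rate(history, 0.40))  # 1/3 approved
    print("proposed (cap=0.45):", approval_rate(history, 0.45))  # 2/3 approved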

5. Adapt models so they can be applied to multiple decision scenarios. Improve decisions by measuring results, evaluating successes (including champion/challenger testing, which compares the accepted process against alternatives) and optimizing further. Learning, adapting, and evolving analytics is what machine learning and artificial intelligence are all about: the more you learn, the better you get at predicting or informing a decision-making process.
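A champion/challenger test can be sketched in a few lines: route a small random fraction of cases to the challenger strategy, record outcomes per arm, and compare. The strategies, the score field, and the 10% split below are all assumptions for illustration:

    # Champion/challenger sketch: the accepted process (champion) keeps
    # most traffic while a challenger gets a small random slice, so the
    # two can be compared on real outcomes. All details hypothetical.
    import random

    def champion(case):    # the accepted decision process
        return "approve" if case["score"] >= 0.6 else "decline"

    def challenger(case):  # the alternative under test
        return "approve" if case["score"] >= 0.5 else "decline"

    outcomes = {"champion": [], "challenger": []}

    def route(case):
        arm = "challenger" if random.random() < 0.10 else "champion"
        decision = (challenger if arm == "challenger" else champion)(case)
        outcomes[arm].append((case, decision))  # later joined with results
        return decision

    for score in (0.55, 0.72, 0.40, 0.65):
        route({"score": score})
    print({arm: len(seen) for arm, seen in outcomes.items()})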

Whether that learning happens manually or automatically is really the difference between a traditional learning loop and artificial intelligence. The value is in automating and refining improvement: leveraging technology and analytic algorithms to surface trends that may not be obvious, improve outcomes faster, make connections that would otherwise be invisible and, at the end of the day, use data to be more nimble, agile and efficient.

With this decision methodology in place, organizations can improve the outcomes of their decisions, and more people can engage at crucial points in the deciding process.

The software is willing. It's not necessary to build an environment like this from scratch. (In fact, attempting to do so would be onerous.) Decision modeling and management solutions, built on decades of analytics experience, are well suited to supporting a structured, repeatable process for deciding. These solutions are emerging now in part because they can be delivered from the cloud, which puts enormous resources at virtually any company's disposal, no matter its size or the extent of its on-premises IT investment.

Using these software solutions and a rigorous methodology, decision-makers across many functions and lines of business can:

  • Determine what they need to make the decision and at what point they’ll consider it complete
  • Understand the decision in the context of related processes, systems and events
  • Visualize information that would otherwise be difficult or impossible to understand through text and numbers alone
  • Apply this approach to other decision scenarios

This is where decision-enabling software is headed. Already today, a subject matter expert (a risk manager, for instance) can model a business decision and execute it without pulling IT resources away from other tasks. Operationalizing data-driven decisions in this way was unthinkable even a few years ago, but it is a reality today in organizations of all sizes, in all industries.

About the author: Scott Zoldi is Chief Analytics Officer at FICO, responsible for the analytic development of FICO's product and technology solutions, including the FICO Falcon Fraud Manager product, which protects about two thirds of the world's payment card transactions from fraud. While at FICO, Scott has authored 79 analytic patents, with 39 granted and 40 pending. He is actively involved in the development of new analytic products using artificial intelligence and machine learning technologies, many of which leverage streaming innovations such as adaptive analytics, collaborative profiling, deep learning, and self-learning models. Scott is most recently focused on applying streaming self-learning analytics to the real-time detection of cybersecurity attacks and money laundering. He serves on the boards of directors of Tech San Diego and the Cyber Center of Excellence, and received his Ph.D. in theoretical physics from Duke University.
