How to build decision intelligence software

This post was originally published by Cameron Afzal at Towards Data Science

Four lessons from launching my first product

Photo by Vladislav Babienko on Unsplash

We are awash in data, grappling with the complexity of problems we can increasingly model, but still struggle to interpret. We need sound decision-making in important domains like healthcare, finance, and infrastructure now more than ever.

Software plays a key role in helping us make sense of the world. Software products provide us with methods to overcome cognitive blindspots, expand our awareness, and make connections. Decision intelligence tools represent a quantum leap in practical applications of the information that surrounds us.

Applying new technologies is the tricky part. We want scalable solutions that take advantage of advancements in computation and algorithms. That’s where product leadership comes in. We need to guide the creation of the software so it can be leveraged by users who drive progress in their organizations. Decision intelligence software has a significant role to play in improving the performance of companies. Product leaders who harness best practices shape outcomes by offering solutions that can be widely adopted.

In this article, I will illustrate the importance and meaning of decision intelligence software and cover the important factors to consider when building it, drawing upon my experience designing and launching a new product. I’ll discuss users who are decision-makers managing complex systems (water networks), but the techniques generalize across domains.

I hope these lessons will be helpful for other technology professionals who develop analytics, decision support, and other data-oriented enterprise products to improve decision quality.

The lessons

  1. Design to maximize the human-computer partnership
  2. Framing the problem is the problem
  3. Tailor the analysis to each user type and workflow step along the data stream
  4. Open up the data to facilitate connections (interfaces are key)

In general, we wield decision intelligence to translate information into actions and improve outcomes. Decision intelligence is often a combination of data science, business intelligence, decision modeling, and management. The mix will depend on the organization type and problem context.

In the context of network planning, decision intelligence unites three approaches. We need optimization to generate optimal plans, data science to interpret them, and decision analysis to converge on a way forward.

These networks are complex systems, requiring software that produces complex data. But what good are data if people aren’t able to use them?

I defined the vision for Aperture, a product we launched this year at Optimatics, which offers a platform powered by artificial intelligence. Aperture provides a bridge between optimization results and users who make business decisions, so they fully realize the benefits of multi-objective optimization. Our algorithms allow users to consider many criteria and decisions, to explore and evaluate thousands of plans to improve their network. The catch? There is no one best solution.
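
To make “no one best solution” concrete, below is a minimal sketch of Pareto dominance filtering, the idea behind multi-objective trade-off sets. The plan names and objective values are invented for illustration; this is not Optimizer’s actual algorithm.

```python
# A minimal sketch (not Optimizer's implementation) of Pareto dominance
# over candidate plans, assuming both objectives are minimized.

def dominates(a, b):
    """True if objectives a are no worse than b everywhere and strictly better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(plans):
    """Keep only nondominated plans: the trade-off set users must weigh."""
    return [p for p in plans
            if not any(dominates(q["objectives"], p["objectives"])
                       for q in plans if q is not p)]

plans = [
    {"name": "Plan A", "objectives": (4.2, 0.30)},  # (cost in $M, risk)
    {"name": "Plan B", "objectives": (5.1, 0.12)},
    {"name": "Plan C", "objectives": (5.5, 0.35)},  # worse than A and B on both
]

for plan in pareto_front(plans):
    print(plan["name"])  # Plans A and B survive; neither beats the other
```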

The multi-objective optimization solution exploration process. (Image by the author.)

Users go from searching for solutions to trying to understand an array of strategies while balancing competing objectives. Excelling at this balance is especially important for our customers. Their stakes are high since everybody needs water. So our challenge:

An Aperture dashboard for optimization results. (Image by the author.)

Users required a way to sift through and make sense of the optimization data. Responding to that need, we built Aperture, a decision intelligence toolkit consisting of interactive web dashboards and a cloud-based data platform.

Technology is only one side of the equation. Complexity increases if a system is difficult to model or the decision-maker grapples with competing objectives, meaning no single approach is correct. The product must meet the need for users to frame and decide, necessitating a two-way interface between the decision-maker and the software. To realize the potential of AI to improve decision quality, we had to foster the elusive human element.

(1) Design to maximize the human-computer partnership

The primary driver for the adoption of enterprise technology is efficiency. Customers want to increase their return on investment. Optimization is a potent way for companies to boost their performance.

It’s often infeasible for humans to analyze the breadth and depth of resulting data required to make informed decisions. Algorithms can arrive at insights humans cannot through sheer computational force. But we mere mortals have a key role to play at the beginning and end of the simplified, iterative workflow I’ve outlined below.

The big picture decision-making workflow. (Image by the author.)

Despite all the excitement about predictive analytics, machine learning is only one approach within artificial intelligence as a whole (Optimizer embodies the search paradigm, applied to the planning problem domain). I think the most exciting prospect of artificial intelligence is to augment human judgment and help us with what we aren’t so great at. Part of that is parsing through large quantities of data, sure, but there’s more, like expanding our perspective.

We need a division of labor and interpretable data for decision-makers to take full advantage, a bridge between the real world and the data streams.

Sticking with the network optimization example with the products I manage, Optimizer and Aperture, our decision stack includes engineering judgment and domain expertise, applied metaheuristic algorithms (tools for discovery), high-performance computation, and user interfaces.

We can simplify the dichotomy to be between our users (the Humans) and the computational end of the toolkit (the Algorithms). We follow the rule of human-algorithm specificity to let the computers do what they’re best at and provide the most value for our users.

  • Humans are still needed to frame the problem. Even the most advanced AI cannot define its own objective function. Humans define the decision options (the realm of possibilities) and desired outcomes in the form of objectives and criteria.
  • Algorithms encode information, evaluate solutions, parse models and scenarios, map the decision space to the objective space, then post-process the data with statistical analysis, such as calculating the probabilities and impacts of decisions (see the sketch after this list).
  • At the end of the workflow, Humans decide. They use judgment and data tools to balance trade-offs, explore the data, and home in on an approach, with help from statistics, visualizations, and interfaces.
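
As a concrete example of that post-processing step, here is a small sketch of one statistic such tooling might surface: how often each decision option appears among shortlisted plans, a quick way to spot likely “no-regret” decisions. The plans and decision names are invented for illustration.

```python
from collections import Counter

# Illustrative only: count how often each decision option appears across
# a shortlist of nondominated plans. Decision names are hypothetical.

shortlisted_plans = [
    {"upgrade_pipe_12", "new_pump_station"},
    {"upgrade_pipe_12", "expand_reservoir"},
    {"upgrade_pipe_12", "new_pump_station", "expand_reservoir"},
]

counts = Counter(d for plan in shortlisted_plans for d in plan)
for decision, n in counts.most_common():
    print(f"{decision}: in {n / len(shortlisted_plans):.0%} of shortlisted plans")
# upgrade_pipe_12 appears in 100% of plans: a likely no-regret decision.
```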

It’s important to note that not all knowledge is encoded in available data. Many real-world considerations are not captured in the model, not sensed or accessible to the algorithms. That’s why only certain organizations like Google can organically leverage data to the fullest: because they are built upon data. Noise — data in excess or of dubious quality — is just as significant an issue.

A major pain point is translating logic between human and computer and distilling information in useful ways. I wanted our software to solve this: to address the entire workflow and take full advantage of the partnership to help our users make decisions. But to accomplish that, we’d have to start with framing.

(2) Framing the problem is the problem

In the beginning, there was silence. Then there was noise.

Customers use software to solve a problem. For decision intelligence software, defining the problem is not only the starting point, it’s also the most critical point. Everything else depends on the frame: what data are incorporated, the available alternatives, and the performance metrics that decision-makers use to measure progress. The software is a vehicle, providing the controls and current location, sometimes even navigational guidance. High-quality decisions are the destination.

Users need to be able to delineate the constraints, priorities, and available resources. Data collection and later interpretation must be motivated by an end goal, leading to more realistic and useful results. The frame should also include reference points for users to make useful comparisons and dictate where to focus in terms of outcomes.
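
One way software can support framing is to treat the frame itself as first-class data that users can inspect and refine. The sketch below is a hypothetical frame definition; the field names, constraints, and decision options are assumptions for illustration, not Aperture’s actual schema.

```python
from dataclasses import dataclass

# Hypothetical sketch of a decision frame captured as first-class data.
# Field names and options are invented; this is not Aperture's schema.

@dataclass
class Objective:
    name: str
    direction: str  # "minimize" or "maximize"

@dataclass
class Frame:
    objectives: list[Objective]
    constraints: dict[str, float]       # e.g. budget ceilings, service floors
    decision_options: list[str]         # the realm of possibilities
    reference_plan: str = "status_quo"  # baseline for comparisons

frame = Frame(
    objectives=[Objective("capital_cost", "minimize"),
                Objective("supply_reliability", "maximize")],
    constraints={"budget_usd": 5_000_000, "max_pressure_violations": 0},
    decision_options=["upgrade_pipe_12", "new_pump_station", "expand_reservoir"],
)
```

A frame held this way can be versioned and revisited as priorities shift, which matters later when decisions are iterative.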

Technology isn’t inherently good. Data aren’t inherently useful. The new wave of decision intelligence products must help guide users to define their frames. Effective software should enable decision-makers to consider the big picture and define objectives with minimal friction. But the design considerations will differ given the user type and where they are in their journey.

(3) Tailor the analysis to each user type and workflow step along the data stream

Just as we appeal to the specialization between humans and machines, we also need to build the interface and content of analyses to match users and their workflow steps. The user journeys, of course, depend on the product application.

We built Aperture to address the entire planning workflow. Aperture allows technical users to discover, compare, and inspect strategies and their impacts. They curate Optimizer plans so executive users can then decide on a plan of action and present it to stakeholders.

An example technical user journey for Aperture. (Image by the author.)

I was inspired to expand the user workflow to create value for decision-makers along the data stream, enabling customers to realize the full potential of their software subscription. Each workflow step is a dashboard with its own analyses. Our design principles were that each should be easily shareable, interactive, adaptable, and interconnected. The dashboard data reflect the latest optimization results, and a selection of plans in one interface component or dashboard is mirrored in all the others.
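
To illustrate the interconnected principle, here is a generic observer-pattern sketch of a shared selection model that every dashboard component subscribes to. It shows the design idea only, not Aperture’s actual implementation.

```python
# Generic sketch of linked selection: one shared model, many observers.
# Any component that changes the selected plans notifies all the others.

class SelectionModel:
    def __init__(self):
        self._selected: set[str] = set()
        self._observers = []

    def subscribe(self, callback):
        self._observers.append(callback)

    def select(self, plan_ids):
        self._selected = set(plan_ids)
        for notify in self._observers:
            notify(self._selected)

model = SelectionModel()
model.subscribe(lambda s: print("scatter plot highlights:", sorted(s)))
model.subscribe(lambda s: print("map view highlights:", sorted(s)))
model.select({"plan_042", "plan_107"})  # both views update together
```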

The design started with user motivation. Executive users want to make sense of data to make defensible business decisions. They are motivated by a drive for clarity and transparency, and to ultimately manage risk. Strategic plans require coordination to execute and are often iterative, so they refine frames with new information or priorities using control points like budget thresholds. That meant we had to open up our software.

(4) Open up the data to facilitate connections (interfaces are key)

Opening up the data provides an impact magnifier for decision intelligence technology. By bringing more users into their ecosystem, a tech company can increase engagement, deliver more value, and scale their solutions.

Now by opening up, I don’t mean a free-for-all of open access without regard. The key is to translate complexity into clarity. Data discovery must be anchored in context, with stories and interfaces shaped for the user and purpose, and salient data filtered for feasibility and noise. All of this underscores the importance of curation (which I’ll explore in my next article) to increase the relevance, usefulness, and accessibility of the information.

When embarking on the product that would become Aperture, I noticed that for every person who used Optimizer, several more technical users would use the data coming out of it. And for every one of those technical users, there’d be other stakeholders and decision-makers downstream who’d want to examine the results and provide feedback.

So we had the opportunity to engage and deliver value to that range of users spanning from modelers and planners to utility executives and the public. To accomplish that we’d need to expand our software offerings and the workflows addressed to give users opportunities to collaborate and share plans. It was time to open up Optimatics by opening up the planning process to a range of decision-makers.

We lean heavily on data visualization, mapping, and summary statistics to open up the data (shout out to d3.js, Leaflet, pandas, and scikit-learn). Our customers can interpret, act on, and communicate optimization results to improve decision quality when planning their water networks. But each product leader will assemble their own toolkit given their users’ needs.
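
As a small illustration of the summary-statistics side of that toolkit, here is how one might profile and filter optimization results with pandas. The columns and values are invented for the example.

```python
import pandas as pd

# Illustrative results table; real data would come from an optimization run.
results = pd.DataFrame({
    "plan": ["A", "B", "C", "D"],
    "capital_cost_musd": [4.2, 5.1, 3.8, 4.9],
    "reliability": [0.92, 0.97, 0.88, 0.95],
})

# Headline statistics a dashboard might surface alongside the charts.
print(results[["capital_cost_musd", "reliability"]]
      .describe().loc[["min", "mean", "max"]])

# Cheapest plan meeting a reliability floor: a typical curation filter.
feasible = results[results["reliability"] >= 0.90]
print(feasible.sort_values("capital_cost_musd").head(1))
```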

A year ago, I set out to solve a problem: to help our customers make sense of their data to improve the quality of their business decisions. I ended up discovering opportunities to create a platform that facilitates connections between data, tools, and people. We scaled the impact of our products by opening them up and now enjoy broader adoption as our users can access richer insights. I hope you will find these lessons useful for your own product development journeys.

Software and AI can wade through valuable data to aid in the fight against entropy. But it takes thoughtful design and engineering of the products to bring that to fruition. The potential of innovative tools to improve complex decision-making in important domains has never been greater. Let’s go build.
