
Improving Risk Evaluation Through Advanced Analytics

The volume of data we generate and copy doubles every two years, and IDC projects it will reach 44 trillion gigabytes by 2020. That creates significant new opportunities for business. Indeed, as The Economist has noted, data has become more valuable than oil.
Unlike oil, though, data must be wrangled from many sources. Ensuring you have the right data, that it's fresh and that you're using it to yield accurate results is challenging, and it's quite complex.
Traditionally, organizations have used people and manual processes to collect, review and report on data. That's no longer a viable strategy. Manual systems simply can't keep up with the growing volume of transactions and data processing required to make timely, informed decisions. Overwhelmed by higher demands and expectations for quality assurance, personnel and processes across industries have reached capacity.
Legacy rules-based analytical models don’t scale to meet today’s requirements for evaluating risk associated with this flood of data. Fortunately, technology innovation is fueling significant improvements in analytics. That’s enabling organizations to address the data onslaught and use it to their advantage.
Overcoming the Limitations of Legacy Rules-Based Models
Early decision models rely on hard-coded rules or very simple linear simulations with a limited number of features or variables. These models use sub-sampling and analyze cases as groups, ignoring important data sources that can impact outcomes. Restricted to making decisions at the group level, they are incapable of examining individual cases.
Legacy models also are limited by the perspective of those creating the rules, making it hard to incorporate unanticipated parameters. As new rules are added for each new case discovered, it becomes more difficult to see gaps and overlapping rules within hundreds of rulesets. This process yields too many false positives and exceptions. And over time it becomes too complex to maintain.
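To make the limitation concrete, here is a minimal sketch of what such a rules engine looks like in Python; the thresholds, field names and rules are invented for illustration, not drawn from any real system.

```python
# A minimal sketch of a legacy rules-based risk check; thresholds and
# field names are invented for illustration. Each newly discovered
# fraud pattern forces another hand-written rule, and overlaps between
# rules quickly become hard to see.

RULES = [
    ("high_amount",     lambda t: t["amount"] > 10_000),
    ("foreign_country", lambda t: t["country"] not in ("US", "CA")),
    ("night_activity",  lambda t: t["hour"] < 6),
    # ...hundreds more accumulate as new cases are discovered
]

def flag_transaction(txn: dict) -> list[str]:
    """Return the names of every rule the transaction trips."""
    return [name for name, check in RULES if check(txn)]

# A legitimate traveler making a late-night overseas purchase trips
# all three rules: a false positive a human must now clear by hand.
print(flag_transaction({"amount": 12_500, "country": "FR", "hour": 2}))
# ['high_amount', 'foreign_country', 'night_activity']
```

Every rule in isolation looks reasonable; it is the accumulation, and the interactions among rules, that makes the system brittle.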
Automated decision-making is not new. Computers sift through data, spotting patterns and anomalies far faster than humans can. However, the application of artificial intelligence (AI) and machine learning is ushering in the next generation of analytics modeling using intelligent automation.
Improving Accuracy With Next-Generation Technologies
At first glance, intelligent automation approaches might seem similar to existing solutions, but technology innovation brings considerable change to the outcomes. New capabilities make more granular models possible, shifting from group to individual decisions.
Increased transaction volumes and higher data availability provide the inputs required for more accurate results. Greater processing power from cluster computing, distributed data management and cloud storage enables this data to be collected and used. Finally, advanced algorithms in AI and machine learning examine data quickly, update frequently, and learn from patterns and correlations to improve results over time.
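As an illustration of individual-level scoring, the sketch below trains a gradient-boosted classifier on synthetic data with scikit-learn; every feature, label and threshold is a stand-in, assuming only that per-case features and historical outcomes are available.

```python
# A minimal sketch of individual-level risk scoring on synthetic data;
# features, labels and the decision threshold are invented for
# illustration, not a production design.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 10_000
X = rng.normal(size=(n, 5))  # per-transaction features
# Synthetic outcome with a nonlinear term a hand-written linear rule
# would miss, but a learned model can capture.
y = (X[:, 0] + 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.5, size=n) > 1.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)

# Unlike a segment-level rule, the model returns a risk score for each
# individual case and can be retrained as new data arrives.
scores = model.predict_proba(X_test)[:, 1]
print(f"Flagged {int((scores > 0.9).sum())} of {len(scores)} cases at a 0.9 threshold")
```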
Shifting the Roles of Rules
To take full advantage of new capabilities in AI and machine learning, the function of rules must shift. Instead of designing analytical models using rules to make decisions, businesses can develop rules of learning. This approach entails using algorithms to process all cases and examine individual behaviors. This allows organizations to move beyond sampling and segment-based treatments.
Sampling data only allows automated decision-making to identify segments that share high-level characteristics. This over-generalization increases the chances of false positives and negatives. While human intelligence remains a critical part of data analysis, it is limited to previously identified cases. Machine learning techniques, on the other hand, process huge amounts of data in real time to quickly identify newly emerging unknown patterns.
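For the unsupervised side, the sketch below shows one common technique, scikit-learn's IsolationForest, surfacing an emerging cluster of unusual cases with no labels and no hand-written rules; the data is synthetic and the contamination rate is an assumed parameter.

```python
# A minimal sketch of unsupervised anomaly detection: no labelled
# cases or predefined rules, so a newly emerging pattern can surface
# on its own. The feature layout is invented for illustration.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
normal = rng.normal(loc=0.0, scale=1.0, size=(9_900, 4))   # routine behavior
emerging = rng.normal(loc=4.0, scale=0.5, size=(100, 4))   # new, unseen pattern
X = np.vstack([normal, emerging])

detector = IsolationForest(contamination=0.01, random_state=0).fit(X)
labels = detector.predict(X)  # -1 marks anomalies, 1 marks inliers

print(f"{(labels == -1).sum()} cases flagged for review out of {len(X)}")
```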
Even in known cases and with supervised learning, it is more efficient to let computer algorithms discover relationships and analyze causality automatically. Manually applying rules is error-prone, limited in analytical scope, and subject to human bias.
Deploying Advanced Analytics at Scale
Though many organizations are enthusiastic about adopting intelligent automation for risk assessment, few solutions are widely deployed in live operations. Challenges include a shortage of experienced data scientists, gaps between business and technology expertise, and a maturity curve for deploying prototyping tools at scale in production environments. Most organizations have valuable data but struggle to draw insight from it, lacking the expertise to create algorithms that extract knowledge. A shortage of data science experience limits the sophistication of analytical models, and often the right technical personnel don't have a deep understanding of the business. For more meaningful modeling, technical and business experts must work together to align with organizational goals.
Indeed, one report suggests America alone is short more than 151,000 data scientists. Another article indicates it will only get worse, as the need for data scientists is poised to increase 28% by 2020. A McKinsey Global Institute study about competing in a data-driven world comments on the data scientist shortage and related challenges.
“The biggest barriers companies face in extracting value from data and analytics are organizational. Many struggle to incorporate data-driven insights into day-to-day business processes,” McKinsey reports. “Another challenge is attracting and retaining the right talent – not only data scientists but business translators who combine data savvy with industry and functional expertise.”
Finally, many advanced analytical models in development today are prototypes. To reap the benefits of AI and machine learning, a model needs to scale in production. Hybrid approaches, in which the model is built offline, insights are generated, and rules are then written manually from those insights, restrict the model's capabilities. Transitioning from prototype to production is easier when partnering with third parties that specialize in integrating advanced analytics pipelines into operations.
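As a small illustration of that transition, the sketch below persists a fitted model with joblib so a separate serving process can score live cases directly, rather than translating offline insights into manual rules; the model, features and file path are stand-ins.

```python
# A minimal sketch of moving a prototype model into a serving process;
# the training data, model choice and path are stand-ins.
import joblib
from sklearn.linear_model import LogisticRegression

# Offline prototype: fit on historical data (a tiny stand-in here),
# then persist the fitted estimator to disk.
X_hist = [[0.1, 0.2], [0.9, 0.8], [0.2, 0.1], [0.8, 0.9]]
y_hist = [0, 1, 0, 1]
joblib.dump(LogisticRegression().fit(X_hist, y_hist), "risk_model.joblib")

# Production service: load the model once and score each case as it
# arrives, keeping decisions inside the automated pipeline instead of
# in hand-built rules derived from offline insights.
live_model = joblib.load("risk_model.joblib")

def score(features: list[float]) -> float:
    """Return the model's risk probability for a single case."""
    return float(live_model.predict_proba([features])[0, 1])

print(score([0.85, 0.7]))
```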
To keep pace with a rapidly moving, globally connected digital world, analytics approaches that embrace intelligent automation will pave the way to faster, more accurate risk evaluation.
About the Author: Dr. Ali Akbari is the senior artificial intelligence lead at Unisys in Asia-Pacific, specializing in unstructured data and image processing. He has enabled the practical application of artificial intelligence and big data in multiple industries to address their real-life business challenges, including risk management. Unisys data scientists and subject matter experts put advanced analytics into practice, helping organizations in the public and private sectors to improve efficiency, assess risk and enhance customer experiences.
Related Items:
Beware of Bias in Big Data, Feds Warn
ADP Leverages Huge Database to Gauge Worker Attrition
Banks Consider the Rewards and Risks of Big Data