Compliance Analytics Gaps: How to Fix Bad History with Novel Data Bounding

Close Gaps in Compliance Analytics with Data Bounding

It feels like the stakes for predicting and managing risk rise with each passing year. Does the universe really need to keep dishing out market, environmental and health crises while regulated industries are in the midst of massive digital transformations and analytics platform modernizations?

The insurance industry is facing down the compliance deadline for the FASB’s Long-Duration Targeted Improvements (LDTI) standard. Public carriers should be on the cusp of the final transition to running new models developed over the past two years.

Similarly, most financial institutions should have their Current Expected Credit Losses (CECL) compliance transitions in the rearview mirror. However, many credit unions have yet to reach key milestones toward implementing the new credit standard.

And for private and smaller entities just now converting new roadmaps and processes into trial implementations, what are the lessons learned from their larger counterparts to help smooth the process?

Timing, business considerations and the scope of modernization may vary among companies impacted by new regulatory standards, but there is one unavoidable similarity: all will surface analytics gaps. Deadlines can’t be missed, but issues must be addressed. All that’s left is determining how and when.

Over the coming months, I will highlight some common gaps we have encountered while building analytics platforms for clients to help guide you through updates in your LDTI or CECL compliance strategy – whether it’s prioritizing your post-deadline improvements or anticipating and addressing gaps on the way to implementing new models.

Analytics platforms built to meet regulatory requirements and accounting standards (LDTI, CECL, IFRS 9, IFRS 17, CCAR, etc.) inevitably carry common gaps that can lead to trouble down the road. Addressing the root causes of these flaws delivers more than better risk assessment and refined forecasting; it can help you run a smarter business. Common gaps include:

  • Historical data flaws
  • Manual preparation of external data (economic indicators, auxiliary internal data)
  • Stress testing with the same regulatory models
  • Calculating and/or allocating adjustments
  • Publishing results

All of these gaps introduce potential exposure to deficiencies in IT or business controls over the platform, the business process of reporting a forecasted exposure, and ultimately the reserve calculation. At a minimum, they will burden those responsible for each part of the regulatory modeling process (data prep, model development, model deployment, model execution, and reporting) with additional work in each reporting cycle.

Addressing these gaps with controlled, automated processes will eliminate hours of manual work preparing data, adjusting models, and/or performing controls. When you solve gaps with automation, you gain back time to do more than the minimum with your platform and the insights it is generating.

We’ll start this blog series by focusing on historical data and the data bounding methods that automate steps to prepare macroeconomic data at a time when forecasts looked like nothing credit risk experts had previously encountered.

Facing down unprecedented data volumes for an unknown future

Source systems change, business acquisitions happen, and even if you have a master data management platform in place, there are inevitably sets of historical data negatively influencing your analytics and/or reporting workloads. The severity of these flaws varies widely, but even minor flaws, such as a missing FICO score or an incorrect NAICS code, can have material impacts on loss forecasts.

Automation can correct these data integrity issues and add quality checks leading to more accurate and reliable model results and reporting.
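As a minimal sketch of what such automated checks might look like, the Python snippet below flags missing FICO scores and malformed NAICS codes in a loan-level table. The column names (fico_score, naics_code) and the specific rules are illustrative assumptions, not a prescription for any particular platform.

```python
import pandas as pd

def run_quality_checks(loans: pd.DataFrame) -> pd.DataFrame:
    """Flag common loan-level data integrity issues before they reach the models.

    Column names (fico_score, naics_code) are illustrative placeholders.
    """
    issues = pd.DataFrame(index=loans.index)

    # Missing FICO scores, or scores outside the standard 300-850 range
    issues["missing_fico"] = loans["fico_score"].isna()
    issues["fico_out_of_range"] = (
        loans["fico_score"].notna() & ~loans["fico_score"].between(300, 850)
    )

    # NAICS codes should be 2- to 6-digit numeric codes
    issues["bad_naics"] = ~loans["naics_code"].astype(str).str.fullmatch(r"\d{2,6}")

    # Return only the flagged records, with the reason columns attached,
    # so they can be reviewed or corrected before model execution
    return loans.loc[issues.any(axis=1)].join(issues)
```

A check like this, run automatically at the start of each reporting cycle, turns hours of data hunting into a short review of an exception report.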

Whether it’s uncharacteristic results, unexplained issues with account-level input data, or issues and changes with supplemental input data, understanding the cause is time-consuming. Tracing results back to input data, and then tracing those data back to their source systems, often takes more time than running the regulatory reporting process end-to-end.

We’ve been in the trenches trying to understand anomalies in input data and model results, and we know all this does is take away from the time needed to understand the correct results, come up with recommendations for adjustments, and present to the compliance committees. Corios has seen real world benefits for our clients by building checks and balances and implementing dynamic, yet controlled, processes to account for anticipated issues along the analytics pipeline.

The use case I’ll highlight here is one regarding the bounding or massaging of macroeconomic forecasts. This process may have several different names, depending on your organization. Here I’m specifically referring to the process of manipulating the macroeconomic series that drives regulatory forecasts.

A model for shifting from unbound to bound data

With the right combination of forces, historical data leveraged into current bounded data sets will produce insight-driving analytics. The key is pinpointing how and where the data can coexist and work in concert in your roadmap and models.

It’s important to establish the most relevant connection points and parameters. In most cases, we are implementing a ceiling and/or floor to raw or transformed macroeconomic variables, such as national GDP, quarterly change in employment rate, or month-over-month change in oil prices. Like it or not, models can only be informed by what has happened in the past.

When a global economic event occurs that causes macroeconomic forecasts to behave unlike anything seen in history, let alone the historical training set, strange things are bound to happen. In the relatively short time new credit risk accounting standards have been in effect, several events have driven global and local economic volatility, the likes of which have not been seen before.

Any global event will have wide-reaching market impacts as well as impacts to specific industries that are not duplicated elsewhere. The COVID-19 pandemic, for instance, drove extreme volatility in macroeconomic forecasts that directly influenced credit risk models. The Russia-Ukraine conflict, natural disasters, and data providers changing their macroeconomic indexes have all led to volatility and a need for dynamic processes when preparing or using this critical input data.

The experience and skills to design such processes can be hard to find within a team or company. Adopting novel methods for tackling data gaps can require outside partners who bring not only experience with novel market events, but the absence of preconceptions about what the data can and cannot do.

In one specific case, a client needed to manipulate economic forecasts because a handful of variables were so far beyond the range of any historical data used during development that the models predicted a total loss for more than 90% of some portfolios. While there were many reasons to anticipate significant impacts to the economy, no one expected this degree of severity.

Once the driving factors for the nonsensical forecasts were understood, the next step was manipulating the economic data manually. This caused issues almost immediately. The controls for generating the data were compromised, which required new compensating controls. Clearly, we needed to automate the adjustment of economic data to ensure completeness, accuracy, and timely completion.

The root cause was the unprecedented month-over-month and quarter-over-quarter change in a handful of economic forecasts. To address this issue, we had a few options:

  1. manually manipulate the economic data file
  2. develop code changes to the economic data preparation process
  3. recode or recalibrate models
  4. set a ceiling and floor (bound) for the problematic forecasts within the models’ implementations

The most pragmatic way we have identified to implement these forecast adjustments is bounding transformed variables based on a weighted value of their historical maximum and minimum. We then add the bounding parameters (maximum, minimum, and weighting) for use in the model implementation, as indicated in the graphic below.

In other words, the model was slightly modified to use the raw economic data only if the value falls within the weighted ceiling and floor; otherwise the weighted ceiling or floor itself is used. Some analysis is usually required to determine where inflection points occur, and then these parameters are set in a lookup table.
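As a minimal sketch of that per-value logic (the parameter names and the example weight are illustrative assumptions, not the client’s actual values), the bounding might look like this in Python:

```python
def bound_forecast(value: float,
                   hist_max: float,
                   hist_min: float,
                   weight: float = 1.25) -> float:
    """Clamp a transformed macroeconomic forecast to a weighted ceiling and floor.

    hist_max and hist_min come from the historical training data for the series;
    the weight (an illustrative 1.25 here) lets forecasts move somewhat beyond the
    historical extremes without feeding nonsensical values into the models. This
    sketch assumes change-type variables whose historical minimum is at or below zero.
    """
    ceiling = weight * hist_max
    floor = weight * hist_min
    if value > ceiling:
        return ceiling
    if value < floor:
        return floor
    return value  # raw value is within bounds, so use it as-is
```

The lookup table then simply carries hist_max, hist_min, and weight for each affected series or modeling segment.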

Graph: the inflection point of the unbounded data (red line) versus the selected alternate weighting/bounding; bounds based on historical maximum and minimum change factors prevent model forecasts from straying too far outside historical extremes.

The process of bounding can be easily activated or deactivated by using different versions of the lookup table. The models will first try to use the bounded values, but if the bounding is not active, the raw values will be used. This is a global setting for the series, simplifying the process and the number of inputs. Horizon-level bounding could be implemented; however, it brings unnecessary complexity into the data preparation process and model inputs.
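Below is a hedged sketch of how a small lookup table could drive this behavior; the table structure, column names, and fallback-to-raw mechanism are assumptions for illustration rather than the exact production design.

```python
import pandas as pd

# Illustrative lookup table: one record per impacted modeling segment.
# An "inactive" version of this table would simply contain no records,
# so every segment falls back to its raw values.
bounding_params = pd.DataFrame({
    "segment":  ["auto", "card", "mortgage"],
    "hist_max": [0.08, 0.12, 0.05],
    "hist_min": [-0.06, -0.10, -0.04],
    "weight":   [1.25, 1.25, 1.50],
})

def apply_bounding(raw: pd.DataFrame, params: pd.DataFrame) -> pd.DataFrame:
    """Clamp raw macro values where bounding parameters exist; otherwise pass through.

    `raw` is assumed to have 'segment' and 'value' columns.
    """
    merged = raw.merge(params, on="segment", how="left")
    has_params = merged["weight"].notna()

    ceiling = merged["weight"] * merged["hist_max"]
    floor = merged["weight"] * merged["hist_min"]
    bounded = merged["value"].clip(lower=floor, upper=ceiling)

    # Segments without parameters (bounding not active) keep their raw values
    merged["bounded_value"] = bounded.where(has_params, merged["value"])
    return merged
```

Swapping in the "active" or "inactive" version of the parameter table is then a controlled data change rather than a code change.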

Often, we find the simplest solution, when all aspects of the process are considered, is the best solution. Some might opine that the simplest intervention would be to manipulate the economic data directly. This, however, brings in many collateral complexities when you consider the need for controlled data preparation processes.

And we specialize in remedies for the complex. In this case, bounding was the most practical solution because it required only a few simple code changes and a small lookup table with four columns and one record for each of the impacted modeling segments.

Gap-free analytics platforms without the data mystery

Using a parameter file to activate or deactivate bounding, set values, and set weightings is a pragmatic, controlled approach to addressing a problem many didn’t see coming but that will undoubtedly occur again. Regardless of whether it’s this gap or another chasm in the data, getting to the core of issues is the start of framing corrective steps that put the data back on the predictive track.

When your LDTI and CECL compliance is stable, you’ll need to prepare for the next standard – inevitable with the rapidly increasing environmental and health-related risks to the markets. We have found consistent success in building platforms with automated data integrity and quality checks, flagging issues before they enter the system. Building dynamic processes to account for these issues also goes a long way in enabling the users to take matters into their own hands.

Better builds reduce the client time spent on endless odysseys searching data and running models to no avail, and bounding has proven to yield more reliable model results and reporting. So once the mystery is solved, the real fun in data storytelling can begin.

Count Us In on More Data Analytics

Corios transitions from pandemic to post-recession data analytics storytelling

What a weird several years we’ve all experienced. The pandemic presented all businesses with unique circumstances to overcome. It is possible the biggest challenge was staying the course in data analytics transformation efforts while pushing all employees into the digital landscape full time. Even as we managed to keep Corios working through COVID, we chose to stand up to adversity and consider our options beyond simply riding out the turmoil.

Would we be satisfied with the pre-pandemic status quo? Or would we instead choose to mature and grow our management analytics business?

When 2022 arrived, we chose evolution of the Corios Way. We retained our core team throughout the pandemic, so, instead of a rebuild, our expansion is underway with purpose and a recommitment to our storytelling vision.

We are still here, simplifying the complex in data strategy and humanizing the mechanical in analytics for high business value. And there is so much more to do.

Way beyond Portland: Living values in hybrid mode

The pandemic put a spotlight on heavy culture shifts for a lot of firms. Work from home, remote connectivity and the cloud were not yet widely adopted or operational. Our own dedication to the cloud for ourselves and our clients, with SOC 2- and PCI-compliant management controls, made for a smooth shift to Work-From-Wherever.

Now, as some of the “old ways” of office culture are coming back, not only are we celebrating team events in person at our Portland HQ, but we are also adding work-from-work locations in new markets. In October we opened a new office in Denver, where Austin Barber leads our Credit Risk and Compliance practice. In 2023, we will continue to explore where in the U.S. we can put down more roots as we expand the team and serve new clients.

Simplifying access to Corios data analytics solutions

Speaking of Corios practices, delivery of our service and solution capabilities is now restructured into three areas that better reflect the way our clients seek out data and analytics project support. After five years mostly dedicated to the Analytics Modernization practice, I have turned over that responsibility to Tallack Graser. In addition to the Credit Risk and Compliance practice, we formalized analytics roadmap and operational analytics (think marketing, customer value) into a single strategic support offering for clients.

On the energy side, Corios VP John Willey’s knowledge and leadership highlight the natural structuring of a Utility Analytics practice serving electric utilities. With a sound solution approach in Corios Lightning, created around the energy needs of California, we are well poised to meet the unique circumstances driving analytics for electricity suppliers anywhere in the U.S.

Each practice focuses on our domain expertise, application of analytics solutions and account management in the business areas where we are strongest. This establishes a platform for company growth that anticipates the data and analytics demands our clients will face in the future.

What’s not changing? The Corios Way – our approach for how we help clients migrate functional, secure, and compliant workloads to the cloud and open source environments. More than just a process, it demystifies the complexity with a fundamentally different approach to tackling these giant tasks.

From human brand to data transformation solution

We anchor our data and analytics transformation expertise and leadership with four software-based solutions created under the Corios Rosetta brand banner. These provide an instrumental foundation for our clients as well as our delivery team to reduce time to market, reduce cost and improve operational risk for analytics cloud migration and modernization.

Our next motion for Rosetta is investments that make the solutions more accessible through partners like Amazon Web Services (AWS). Giving companies an entrée into the scope of their analytics migration is the natural extension of our commitment to keeping the task as transparent as possible. And making the task more human – for both broad modernization initiatives as well as focused and near mission critical efforts like the ones John continues to lead in the utilities sector.

Beyond the technology and resource benefits, the more important outcome our clients experience is unifying different cultural threads previously fragmented and splintered across their organization. The way clients and the Corios team engage using Rosetta helps reconfirm analytics is a team sport, played by real people, not algorithms.

Real people are at the core of building our own team consistently and prudently. So even as we are in growth mode, we are carefully adding the right new people.

Notable analytics talent is coming out of the pandemic woodwork

As we close out 2022 our team has already grown. We have expanded our project resources with two energetic and focused associates already hard at work on several client projects. Steven Maxwell and Ria Kim bring diverse backgrounds and work experiences to the Corios team and yet both fit easily into the weave of the company culture.

Key to company trajectory, we also added two new team members focused on growing our brand and market value. Big goals for the future and our solutions demand greater attention on our go-to-market strategy and building stronger relationships with both customers and partners alike. Adding Amelia Johnson-Lewis and Jason Kempson in marketing and sales gives us the first coordinated team approach and is already showing us that scaling a boutique firm is more than do-able.

The Song Remains the Same: stay sharp, serve with purpose and multiply value

I see the market and companies of all sizes fully grasping the shift of their data and analytics strategy from nice-to-have to need-to-have. And with any luck, quickening the pace of data access and analytics insight will improve how companies compete as well as how we deal with uncertainties like pandemics and recessions.

If we have learned anything from the past three years it is to not make projections, but rather ground deeper into our values to meet the opportunities we unveil. As we look forward into 2023 and beyond, we see data and imagine all the untold narratives just waiting to be revealed. Somewhere between harnessing the data and reaping the benefits of better decision making are the people key to translating analytics into action.

So, count us in because we are ready to play on. Oh, and I meant to ask:

What’s your story?

Eating the elephant, one iteration at a time

Estimating Value at Risk for an IFRS17 Insurance Portfolio via Monte Carlo Simulation using SAS Viya

Supporting IFRS17 portfolio cash flow modeling and simulation

Corios has been busy lately supporting our client’s actuaries as they implement the IFRS17 standard on their set of insurance portfolios. The purpose of this engagement is to better estimate the Value at Risk (VaR) on their portfolios’ liability for remaining coverage (LRC) and liability for incurred claims (LIC). LIC in turn includes both the liability for claims reported but not fully reserved and the liability for claims incurred but not yet reported. The approach we and our client are following uses a cash flow projection analysis that is conceptually similar to the way our banking clients model future cash flows for secured and unsecured lending portfolios.
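For readers who want a feel for the mechanics, here is a minimal, hypothetical Python sketch of the Monte Carlo idea. The actual engagement runs on SAS Viya with policy-level cash flow projections; the distribution, parameters, and scale below are invented purely for illustration.

```python
import numpy as np

def simulate_cash_flow_var(n_sims: int = 100_000,
                           horizon_quarters: int = 12,
                           expected_claims: float = 1_000_000.0,
                           claims_volatility: float = 0.15,
                           discount_rate: float = 0.01,
                           confidence: float = 0.99,
                           seed: int = 42) -> float:
    """Estimate Value at Risk on projected claim cash flows via Monte Carlo.

    Each scenario applies a lognormal shock to the expected quarterly claim
    outflow, discounts the cash flows to present value, and reports VaR as the
    gap between the high-confidence tail and the mean present value.
    """
    rng = np.random.default_rng(seed)
    quarters = np.arange(1, horizon_quarters + 1)
    discount = (1.0 + discount_rate) ** -quarters           # per-quarter discount factors

    shocks = rng.lognormal(mean=0.0, sigma=claims_volatility,
                           size=(n_sims, horizon_quarters))
    cash_flows = expected_claims * shocks                   # simulated quarterly outflows

    pv = cash_flows @ discount                              # present value per scenario
    return float(np.percentile(pv, confidence * 100) - pv.mean())
```

A real LRC/LIC model would replace the lognormal shock with cash flows projected from lapse, mortality/morbidity, and expense assumptions at the policy or cohort level.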

Read More

Five D’s of Analytic Model Deployment

Moving your models from the lab to the field for business impact

The Challenges: Business Adoption of Analytic Models

In order to increase business adoption of analytic models, there is a great deal of work that must occur in addition to model development, and it extends well beyond the model development team.

  • First, businesses need to establish connections between model scores and business decisions. This connection usually takes place outside the analytics team building the model.
  • Second, the data structures and systems used by model developers for building models are often different from the ones that will be used for implementation in production. Adaptation of the model asset for production should account for these differences.
  • Third, businesses must be able to easily interpret, assess, and catalogue the model scores and the changes in scores over time on an ongoing basis.
  • Fourth, to deploy and execute these models in a production information technology environment and in the field requires diligence, planning, design, execution, and quality assurance practices that are not commonly adopted by model developers.

The Five Ds of Model Deployment

The purpose of this chapter is to provide a set of best practices for analytic model deployment, organized into five phases that we’ve nicknamed the “Five Ds.” They are:

  1. Develop: Developing and packaging models
  2. Decisions: Tying operational business decisions to model scores
  3. Data: Operationalizing analytic model deployment in a specific data architecture
  4. Delta: Monitoring the workflow and numeric performance of analytic models in the field
  5. Deploy: Implementing analytic models via a software development life cycle

Read More

Model governance checks: Stability, Performance and Calibration

Benchmarks for CCAR and IFRS17 practitioners

Some enterprises build a formal model governance practice in order to comply with industry standards such as CCAR (banking industry) or IFRS17 (insurance industry). Others know that building a sound predictive model governance discipline is a great way to improve the quality of business decisions. Here are some well-tested practices for ensuring the three pillars of model governance: Stability, Performance and Calibration (a quick illustrative check for each is sketched after the list).

  1. Stability: Can I rely on the process that generates my enterprise data as stable and believable?
  2. Performance: Can I predict the difference between good and bad outcomes, or between high and low losses?
  3. Calibration: Can I make those predictions accurately?
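As a quick, generic illustration of what a check behind each pillar might look like (not tied to any particular CCAR or IFRS17 platform; the array inputs and thresholds are assumptions), consider:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def population_stability_index(dev_scores: np.ndarray,
                               current_scores: np.ndarray,
                               bins: int = 10) -> float:
    """Stability: PSI between development-sample scores and current scores.

    Common rule of thumb: below 0.10 is stable, 0.10-0.25 warrants review,
    above 0.25 signals a population shift.
    """
    edges = np.quantile(dev_scores, np.linspace(0.0, 1.0, bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf
    dev_pct = np.histogram(dev_scores, edges)[0] / len(dev_scores)
    cur_pct = np.histogram(current_scores, edges)[0] / len(current_scores)
    dev_pct = np.clip(dev_pct, 1e-6, None)
    cur_pct = np.clip(cur_pct, 1e-6, None)
    return float(np.sum((cur_pct - dev_pct) * np.log(cur_pct / dev_pct)))

def performance_gini(outcomes: np.ndarray, scores: np.ndarray) -> float:
    """Performance: Gini coefficient (2*AUC - 1) for separating good from bad outcomes."""
    return 2.0 * roc_auc_score(outcomes, scores) - 1.0

def calibration_ratio(outcomes: np.ndarray, predicted_probs: np.ndarray) -> float:
    """Calibration: observed event rate over average predicted probability (ideal is near 1)."""
    return float(outcomes.mean() / predicted_probs.mean())
```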

Read More