
Solving the Right Problem: A Case Study in Risk Management

By Steve Zagoudis.

“Successful problem solving requires finding the right solution to the right problem. We fail more often because we solve the wrong problem than because we get the wrong solution to the right problem.” – Russell L. Ackoff [1]

Managing risk starts with identifying and solving the right problem. A recent client experience bears this out. A bank's external regulator did not have confidence in the results of 30-year income projections. Even after multiple model runs, something was definitely off.

Companies use sophisticated modeling tools to forecast the value of their assets and liabilities some 20 to 30 years into the future. The goal is to determine future cash flows, including projected interest income or expense. Typically, multiple scenarios are run using different interest rates, each with an assigned probability, so that executives can plan accordingly. This modeling consumes vast amounts of raw data, especially in a financial institution.
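
To make the scenario weighting concrete, here is a minimal sketch in Python of a probability-weighted interest income projection. The rates, probabilities, and single fixed-balance loan are hypothetical illustrations, not the inputs of any actual model.

```python
# Minimal sketch of a probability-weighted interest income projection.
# The scenario rates, probabilities, and loan balance are hypothetical
# illustrations, not actual model inputs.

balance = 1_000_000.0  # outstanding principal, assumed constant (USD)
years = 30             # projection horizon

# Each scenario pairs an annual interest rate with an assigned probability.
scenarios = [
    (0.03, 0.25),  # rates fall
    (0.05, 0.50),  # base case
    (0.08, 0.25),  # rates rise
]
assert abs(sum(p for _, p in scenarios) - 1.0) < 1e-9  # probabilities sum to 1

# Expected (probability-weighted) interest income for a single year.
expected_income = sum(balance * rate * prob for rate, prob in scenarios)

# Projected over the full horizon on the constant balance.
projection = [expected_income] * years

print(f"Year-1 expected interest income: ${expected_income:,.0f}")
print(f"{years}-year total: ${sum(projection):,.0f}")
```

A production model would project amortizing balances, repricing dates, and prepayment behavior for every instrument on the book, which is exactly why the volume and quality of the input data matter so much.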

There are three major components of interest rate projection models that can influence the accuracy of the results:

1. Quality and sophistication of the modeling tool
2. Model Risk scenarios and basic assumptions used for the model run
3. Quality and accuracy of the data fed into the model

Our client institution had received a Matter Requiring Attention (MRA) from their regulator, giving them several months to remediate the issue and report back. The client's Model Risk team was convinced the variations were caused by the modeling tool. They switched to a different tool and reran the tests, at a cost of millions of dollars and months of effort. The problem was not with the modeling tool.

Next, the team modified the basic scenarios and assumption sets used to run the model. The results showed vastly different answers, which was expected since they had changed the underlying business rules. But both modeling tools still showed the same unacceptable variations. The problem was not with their model assumptions.

We suspected the problem lay in the third component of the model: the quality and accuracy of the underlying data fed into it. My mentor and first employer at Standard Oil was a huge fan of Gane and Sarson Data Flow Diagrams (DFDs). Whenever we started a project at Standard Oil, our first task was to draw a Level 0 DFD, putting the system in question in the middle of the page and showing all data flows in and out.

We created this DFD at the start of the project to help us understand the scope of the data feeds into the model. It turned out the institution had multiple copies of its commercial loan data scattered across different systems and data warehouses. Market Risk defined each of the model runs using data from different systems, assuming all copies of the data were the same. They were not. Thus the right problem was identified and rectified. The new results were validated and reported to the regulators.

We subsequently gave the client recommendations for new information governance procedures and technical solutions to mitigate data risk going forward. The goal was to provide new confidence in interest rate forecasts. Not only was the immediate problem solved, but our approach helped to future-proof this critical part of the client’s business. Unfortunately for this company, the same problem of data inconsistency with their financial models surfaced again in their financial reporting and public disclosures, triggering new regulatory and audit issues.

Even in this advanced information age, institutions continue to operate without full knowledge of their true data sources and data quality, which makes it difficult to know whether they are solving the right problem. We developed the Reconciliation Control Framework® for this type of situation. In very simple terms, it answers a basic question: does A = B = C? Our framework is an example of data triangulation. According to BetterEvaluation:

“Triangulation facilitates validation of data through cross verification from more than two sources. It tests the consistency of findings obtained through different instruments and increases the chance to control, or at least assess, some of the threats or multiple causes influencing our results.”

In the case of our client, the InfoCheck™ control reconciled the loan data across the multiple systems, proving day in and day out that A = B = C and offering a permanent reduction in enterprise risk.
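
A reconciliation of this kind can be sketched in a few lines of Python. The system names, loan keys, and balances below are invented for illustration; a production control such as InfoCheck™ would add tolerances, timing alignment, and audit reporting on top of this core comparison.

```python
# Minimal sketch of an A = B = C reconciliation across three systems.
# System names, loan keys, and balances are hypothetical illustrations.

system_a = {"LN-1001": 250_000, "LN-1002": 410_500}  # e.g., loan origination system
system_b = {"LN-1001": 250_000, "LN-1002": 410_500}  # e.g., risk data warehouse
system_c = {"LN-1001": 250_000, "LN-1002": 409_900}  # e.g., finance data mart

def reconcile(*systems):
    """Return loan keys whose values disagree (or are missing) across systems."""
    all_keys = set().union(*systems)
    breaks = {}
    for key in sorted(all_keys):
        values = [s.get(key) for s in systems]
        if len(set(values)) > 1:  # a missing record or a mismatched balance
            breaks[key] = values
    return breaks

for loan, values in reconcile(system_a, system_b, system_c).items():
    print(f"Break on {loan}: A / B / C = {values}")
```

Any key that is missing from one system, or carries a different balance, surfaces as a break to be investigated before that data feeds a model run.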

With the work of Russell Ackoff, Gane and Sarson, and so many others as guidance, it is possible to identify and solve the right problems.

In the next blog in this series, we will complete the remaining sections of the Data Governance policy.

References

[1] Ackoff, R. L., 1974. Redesigning the Future: A Systems Approach to Societal Problems. John Wiley & Sons, New York.

