Industry Insights

September 23, 2019
Balance Sheet Optimization
By: Alla Gil, Founder and CEO – Straterix

Balance sheet optimization has been the Holy Grail of corporate management for many years. But to quote Fintekminds on the topic, “A lot has been tried and written on the subject, but very few have been able to achieve meaningful results.”

There have been attempts to treat institutions’ (and corporations’) balance sheets as portfolios of assets, reducing balance sheet optimization to traditional investment portfolio optimization in a Markowitz mean-variance framework. That framework works by finding the best expected return (mean) on portfolio assets for each level of risk (defined as portfolio variance). The difficulty in applying this framework, however, is that a bank’s balance sheet optimization, while looking at capital allocation from a risk-reward perspective, must also consider the institution’s full capital structure (e.g., loans, securities, deposits and wholesale funding).
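To make the mechanics concrete, here is a minimal mean-variance sketch in Python. All expected returns and covariances are hypothetical; for each required level of expected return, it finds the long-only asset weights that minimize portfolio variance:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical expected returns and covariance for three asset classes
mu = np.array([0.04, 0.06, 0.09])           # expected annual returns
cov = np.array([[0.0025, 0.0010, 0.0012],
                [0.0010, 0.0064, 0.0030],
                [0.0012, 0.0030, 0.0144]])  # annual covariance matrix

def min_variance_weights(target_return):
    """Weights minimizing portfolio variance for a required expected return."""
    n = len(mu)
    constraints = [
        {"type": "eq", "fun": lambda w: w.sum() - 1.0},           # fully invested
        {"type": "eq", "fun": lambda w: w @ mu - target_return},  # hit the target mean
    ]
    bounds = [(0.0, 1.0)] * n  # long-only for simplicity
    res = minimize(lambda w: w @ cov @ w, x0=np.full(n, 1.0 / n),
                   bounds=bounds, constraints=constraints)
    return res.x

# Trace the efficient frontier: one optimal mix per level of required return
for target in (0.05, 0.06, 0.07, 0.08):
    w = min_variance_weights(target)
    print(f"target {target:.0%}: weights {np.round(w, 2)}, "
          f"stdev {np.sqrt(w @ cov @ w):.2%}")
```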

The main assumption in the mean-variance portfolio optimization framework is that the portfolio variance-covariance structure is constant over the optimization horizon. Since this assumption holds only over short horizons (a week, a month or at most a quarter), the portfolio must be rebalanced periodically; that is, it must be liquid enough to do so. Such rebalancing is infeasible when it comes to managing the balance sheet and making strategic decisions like M&A, divestitures, capital raising, determining risk appetite and establishing long-term goals for the institution.

There are several reasons for this. First, due to the longer time horizons of strategic decisions, changes in market conditions can cause dramatic shifts in correlations between asset classes. Second, if one decides to extend certain loans or merge with another institution, it is not easy (and often not possible) to reverse these decisions. Finally, there are numerous regulatory constraints from capital and liquidity angles that trigger respective changes on the asset and liability sides (e.g., maintaining certain levels of ALLL, high-quality liquid assets and capital ratios).

In addition to the above challenges to balance sheet optimization in the Markowitz framework, regulatory and investment communities expect institutions to incorporate stress considerations into their baseline projections of capital, profit and allowance. It’s not enough to find a balance sheet composition that is optimal from a risk-return perspective. One must also explain why assets or liabilities would behave in a certain way and what macroeconomic and market environment factors caused this behavior.

Even the definition of the risk-return trade-off is very different when it comes to balance sheets. It is not clear what is being optimized – capital, return on equity or profits – or what the relevant measures of risk should be, with the variance of the above Key Performance Indicators (KPIs) being only one of the candidates. And the constraints of balance sheet optimization are numerous and dynamic – capital and liquidity ratios should never fall below a certain level over the optimization time horizon, earnings should never decline by more than X in any single quarter, worst-case net income should exceed Y, etc. These are difficult to incorporate into a Markowitz framework.

From the regulatory perspective, it is crucial to predict how institutions may respond to adverse macroeconomic scenarios, because practical balance sheet optimization depends on an evolving macro-market scenario and on management actions that are scenario-dependent.

Given these challenges, most optimization approaches simplify the problem by either dealing with narrow silos of an institution and/or by using very restrictive assumptions (e.g., a single time step) to maintain computational transparency and still use the static mean-variance approach. As stated in a recent European Central Bank research paper, “Financial institutions need a new approach from the very senior strategic level on how to optimally manage their balance sheets and make them more resilient to potential new stress, both from capital and liquidity perspectives.”

Multiple forward-looking scenarios are the only way to incorporate longer-term risks into analysis. An institution’s balance sheet can be projected on these scenarios, but manually building a few dozen scenarios is not enough. What’s needed is the full distribution that incorporates potential extreme events and their consequences, captures dynamic correlations between PDs, LGDs and EADs, and allows the integration of macroeconomic and market risk drivers and their impact on various deposit segments, non-interest income (e.g., fees), demand for new loans, etc. Such scenarios include all components necessary to calculate high-level KPIs: capital and liquidity ratios, ALLL and ECL, funding costs, net income and earnings. An institution can then overlay its specific covenants on the scenarios (e.g., when the unfunded commitment will be closed, or additional collateral required).
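As a rough illustration of the scenario-generation step, the sketch below draws correlated shocks for a few risk drivers using a Cholesky factor. The drivers, correlations and scales are purely illustrative, not a production scenario engine:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical quarterly shocks for three risk drivers: rates, PD, deposit growth
corr = np.array([[ 1.0, -0.4,  0.3],
                 [-0.4,  1.0, -0.5],
                 [ 0.3, -0.5,  1.0]])
vols = np.array([0.50, 0.15, 2.0])     # per-quarter shock scales (illustrative)
chol = np.linalg.cholesky(corr)

n_scen, n_qtrs, n_drivers = 10_000, 8, 3
z = rng.standard_normal((n_scen, n_qtrs, n_drivers))
shocks = z @ chol.T * vols             # correlated driver shocks, all scenarios

# Each scenario is now a joint path of macro/market drivers that downstream
# models can map into PDs, LGDs, deposit balances, fees and new-loan demand
print(shocks.shape)                    # (10000, 8, 3)
```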

Once all the details of critical outcomes are calculated on all of the generated scenarios at every monthly or quarterly point over the time horizon, comprehensive balance sheet performance can be assessed (e.g., expected net income in Q1, 95th percentile worst case net income in Q1, same for capital and liquidity ratios, same for Q2 and others). Then one can back-track (reverse) scenarios leading to particular outcomes, identify specific risk drivers’ values producing these outcomes and design potential balance sheet alternatives that might mitigate adverse outcomes and/or strengthen the good ones.
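A toy sketch of that assessment step, with randomly generated net income paths standing in for a real scenario engine, might compute the expected and 95th percentile worst-case outcome for each quarter and then back-track the scenarios in the tail:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical engine output: net income ($MM) for 10,000 scenarios x 8 quarters
n_scen, n_qtrs = 10_000, 8
shocks = rng.normal(0.0, 12.0, size=(n_scen, n_qtrs))
net_income = 25.0 + shocks.cumsum(axis=1) * 0.3   # toy persistent dynamics

for q in range(n_qtrs):
    exp_ni = net_income[:, q].mean()
    worst_ni = np.percentile(net_income[:, q], 5)  # 95th percentile worst case
    print(f"Q{q+1}: expected {exp_ni:6.1f}, 95% worst case {worst_ni:6.1f}")

# Back-track: which scenarios drive the final-quarter tail, and what did
# their paths look like on the way there?
tail = net_income[:, -1] <= np.percentile(net_income[:, -1], 5)
print("mean path of tail scenarios:", np.round(net_income[tail].mean(axis=0), 1))
```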

Multitudes of such balance sheet alternatives or managerial actions will create a cloud of choices from which an efficient frontier can be formed. Each balance sheet alternative (represented by the weights of balance sheet and P&L segments) has many outcomes (represented by statistical expectations and the worst cases of all key indicators). Select two dimensions – one of which will represent risk (e.g., by how much capital can drop over the time horizon) and another return (e.g., expected profitability over the horizon) – and calculate all necessary regulatory and internal constraints (e.g., capital and liquidity ratios, maximum drop in earnings in a single quarter). Now the frontier can be constructed – for each level of risk, what is the balance sheet alternative with the maximum return that satisfies all constraints?
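A sketch of that frontier construction could look like the following, where the cloud of alternatives, the risk and return measures and the constraint levels are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical cloud: each alternative scored on risk, return and constraints
n_alt = 5_000
risk = rng.uniform(0.0, 10.0, n_alt)                 # worst-case capital drop (%)
ret = 2.0 + 0.8 * risk + rng.normal(0, 1.5, n_alt)   # expected profitability (%)
min_capital_ratio = rng.uniform(7.0, 13.0, n_alt)    # worst-case ratio over horizon
max_qtr_earnings_drop = rng.uniform(0.0, 30.0, n_alt)

# Keep only alternatives satisfying all regulatory/internal constraints
feasible = (min_capital_ratio >= 8.0) & (max_qtr_earnings_drop <= 20.0)

# For each risk bucket, the frontier point is the feasible max-return alternative
bins = np.linspace(0, 10, 11)
for lo, hi in zip(bins[:-1], bins[1:]):
    mask = feasible & (risk >= lo) & (risk < hi)
    if mask.any():
        print(f"risk {lo:4.1f}-{hi:4.1f}: best return {ret[mask].max():5.2f}%")
```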

This approach to balance sheet optimization requires only high-level data, helps one discover hidden pockets of risk through the generation of unprecedented scenarios that capture increases in correlation, and makes it possible to overlay dynamic decisions on these scenarios (e.g., reinvestment, refinancing, risk appetite, capacity and limits). Not only is it fully transparent, it incorporates all types of risk considerations and truly informs an institution’s strategic decisions.



Disclaimer: The views and opinions expressed in this article are those of the author(s) and do not necessarily reflect the official policy or position of the Financial Managers Society.

About the Author

Formerly the Managing Director and Global Head of Strategy Advisory at Goldman Sachs, Citigroup and Nomura Securities, Alla Gil has more than 25 years of experience as a quantitative researcher and strategic advisor on asset-liability and tail-risk modeling for banks, insurance companies and pension funds worldwide.



September 9, 2019
The Importance of Portfolio Duration
By Robert Segal, CEO, Atlantic Capital Strategies, Inc.

Following a recent meeting in Washington, the Federal Reserve lowered the target range for the Fed Funds rate to 2% to 2.25%. The central bank also stopped reducing the size of its balance sheet two months earlier than expected. The FOMC took action in light of the implications of global developments for the economic outlook as well as muted inflation pressures. While the Committee noted a robust labor market and low unemployment rate, it remains concerned about slowing levels of business investment.

In a press conference following the announcement, Chairman Jerome Powell said the move was intended to insure against downside risks from weak global growth and trade policy uncertainty, and to promote a faster return of inflation to the 2% target. Rather than indicating a major turn in policy, Powell referred to the action as a “mid-cycle adjustment.” He also suggested that after a period of time, the Fed could go back to hiking rates.

For now, bond market participants see more rate cuts coming. According to fed funds futures, traders are pricing in a 100% chance of a rate reduction at the September 18th FOMC meeting. After that gathering, there are even odds for two more moves by the end of the year, which would bring the range to 1.25% to 1.50%.

Financial institutions have been living with a flat yield curve for some time. As an example, the Treasury yield spread as measured by the difference between two- and ten-year maturities was just 13 basis points on the FOMC’s July 31st meeting date. That compares with a slightly higher 29 basis points on the same date in 2018.

As bond prices have increased, investors continue to chase performance. Money has been pouring into bond funds globally this year, with net buying on pace to reach a record $455 billion in 2019, according to Barron’s. Over the past decade, global bond funds received $1.7 trillion of inflows.

At the same time, the funds are getting riskier as managers try to squeeze out better returns in a low-yield world. A recent Morningstar analysis found that even straightforward funds have increased their holdings of lower-rated bonds and emerging-market debt to juice returns. As interest rates keep falling, bond investors are increasingly “reaching for yield.”

Treasury officers should remember that a main objective of the investment portfolio is to enhance profitability while balancing the institution’s sensitivity to interest rate changes. Accordingly, management should establish a target level of duration that considers the institution’s balance sheet, income requirements and risk tolerance.

Maintaining duration is an essential factor in supporting margin and maximizing net interest income. As portfolios age, duration can fall unless cash flows are reinvested back out on the curve; this “opportunity cost” limits earnings potential. When interest rates decline, for example, portfolios comprised of mortgage securities usually shorten as prepayments ramp up. Investment officers should closely monitor their portfolio and take steps to ensure target durations are preserved to protect net interest income.
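As a rough illustration of that drift, a plain textbook Macaulay duration calculation shows how a bond’s duration shortens as it ages (coupon, yield and maturity figures are hypothetical):

```python
def macaulay_duration(coupon_rate, ytm, years, freq=2):
    """Macaulay duration (in years) of a par-100 fixed-coupon bond."""
    periods = int(years * freq)
    c, y = coupon_rate / freq * 100.0, ytm / freq
    cfs = [c] * periods
    cfs[-1] += 100.0  # principal returned with the final coupon
    pv = [cf / (1 + y) ** t for t, cf in enumerate(cfs, start=1)]
    price = sum(pv)
    return sum(t * v for t, v in enumerate(pv, start=1)) / price / freq

# A 3% 10-year bond ages into a 7-year bond: its duration drifts shorter
print(f"10y duration: {macaulay_duration(0.03, 0.02, 10):.2f} years")
print(f" 7y duration: {macaulay_duration(0.03, 0.02, 7):.2f} years")
```

Reinvesting maturing cash flows back out on the curve is what restores the target duration.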

Especially in this environment, caution is advised with regard to investment types that may underperform over a cycle. These include certain CMOs and callable Agency bonds in falling-rate scenarios and low-coupon tax-exempt municipals with rising rates.

CMOs
In a base case scenario, many “plain vanilla” CMOs are projected to have an intermediate-term average life, such as four to five years. However, CMOs are often highly sensitive to movements in interest rates. Changes in the rate at which homeowners sell their properties, refinance or otherwise pre-pay their loans could result in large swings in cash flow. In a falling-rate scenario, the securities could shorten dramatically, causing the investor to reinvest cash flows at appreciably lower yields. Investors in these securities may not only be subject to this prepayment risk, but also to heightened market and liquidity risks.

Callable Agency Bonds
Callable bonds expose investors to the same type of reinvestment risk. Issuers tend to call the bonds when interest rates fall so they can refinance at lower costs. A typical Agency issue carries a first call period of three to twelve months, which may not be enough protection. As bonds are redeemed, the investor receives the original proceeds but is unable to match the original return. Moreover, callable Agency bonds cannot appreciate much above par due to the call feature, and this limits any lift to the institution’s capital from unrealized securities gains.

Municipal Bonds
Tax rules describe a price threshold for determining whether a discount bond should be taxed as a capital gain or ordinary income. If a municipal bond is purchased at a large discount, the security’s price appreciation can be taxable, which reduces realized investment income. For this reason, many financial institutions prefer to purchase municipal bonds at a premium, since a larger portion of the expected return is comprised of coupon income. Investment officers may want to tread carefully with low-coupon municipal bonds, as these issues could depreciate rapidly in a rising-rate environment.
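The sketch below applies the commonly cited de minimis rule of thumb (a discount of 0.25% of face value per full year to maturity) to show how the purchase price determines the tax treatment. Treat it as an approximation and confirm specifics with tax counsel:

```python
def de_minimis_cutoff(face, whole_years_to_maturity):
    """Price below which a market discount is generally taxed as ordinary
    income under the 0.25%-of-face-per-year rule of thumb (illustrative
    helper; confirm actual treatment with tax counsel)."""
    return face - face * 0.0025 * whole_years_to_maturity

cutoff = de_minimis_cutoff(100.0, 10)   # 97.50 for a 10-year bond
for price in (99.0, 96.0):
    treatment = "capital gain" if price >= cutoff else "ordinary income"
    print(f"bought at {price:.2f}: discount taxed as {treatment}")
```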

Of course, every institution should consider its asset-liability position when making investment decisions. With the market at a potential turning point, investment officers need to pay extra attention to performance in multiple rate scenarios to protect the balance sheet from undesired outcomes.




Disclaimer: The views and opinions expressed in this article are those of the author(s) and do not necessarily reflect the official policy or position of the Financial Managers Society.

About the Author

Robert Segal is the founder and CEO of Atlantic Capital Strategies, Inc., which provides investment advisory services for financial institutions. He has over 35 years of experience in the banking industry, having worked in several community banks with roles in mortgage banking sales, trading and ALM. Bob is also currently a Director-at-Large on the FMS Board of Directors.




August 26, 2019
Applying Profitability Through Cube Analysis
By Ben Braun, Associate Director of Finance, Associated Bank

Vendors and institutions alike talk about the value that profitability analysis can bring to an organization. Generally, this comes with a price tag associated with implementing proper costing techniques, funds transfer pricing assumptions and other reporting needs. But what an institution really needs to know prior to committing is:

What are some creative ways the organization can utilize the information to add value?

Foundational knowledge of utilization starts with the profitability cube framework, wherein any given side of the cube shows a different dimension or view. When the cube turns, a different dimension comes into view, but the cube remains the same size. A cube can have many dimensions, but it is up to an institution to define which ones are important to it. Here are a few examples of profitability dimensions (outlined in Figure 1):

- Business Unit
- Customer
- Product
- Geographic
- Officer

Profitability is neither destroyed nor created by flipping the cube. For instance, the total of financial metrics in the product dimension would equal the same metrics in the business unit dimension.
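A minimal sketch of that invariance, using a handful of invented account-level records, shows the same total profit however the cube is flipped:

```python
from collections import defaultdict

# Hypothetical account-level profitability records tagged with each dimension
records = [
    {"business_unit": "Branch A", "product": "Mortgage", "profit": 120.0},
    {"business_unit": "Branch A", "product": "Deposit",  "profit": 45.0},
    {"business_unit": "Branch B", "product": "Mortgage", "profit": 80.0},
    {"business_unit": "Branch B", "product": "Deposit",  "profit": 55.0},
]

def roll_up(dimension):
    """Flip the cube: total the same records along a chosen dimension."""
    totals = defaultdict(float)
    for r in records:
        totals[r[dimension]] += r["profit"]
    return dict(totals)

print(roll_up("business_unit"))  # {'Branch A': 165.0, 'Branch B': 135.0}
print(roll_up("product"))        # {'Mortgage': 200.0, 'Deposit': 100.0}
# Either view sums to 300.0: flipping the cube neither creates nor
# destroys profitability
```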

Understanding the components of the profitability cube is vital to determining what analyses an institution would like to leverage after implementing profitability analysis. The following three examples highlight the value an institution may derive from a few sides of the cube.

FIGURE 1: Financial Institutions Profitability Cube


Business Unit
Traditionally, financial institutions want to understand which branches drive profitability for the organization. If staffed/funded appropriately, the larger branches (in terms of deposits and loans) tend to be the more profitable branches. However, there are varying components that impact a branch’s performance.

One aspect to review involves the balance sheet strategies surrounding mortgages. A few options an institution has are to 1) sell the balance and servicing rights to another institution; 2) sell the balance and retain the servicing rights; or 3) retain the balance and servicing rights. These corporate-level strategies can directly impact the profitability of an individual branch, which is why institutions need to evaluate these components:

Median home sale price in the market – What size are the mortgages being originated in the market?

Average gain on sale – If an institution is selling any components, they’ll recognize a gain on sale. What is the average gain on sale the institution is achieving?

Cost to originate – What is the cost to fulfill the mortgage through origination? What amount of commission is an institution paying to the officer originating? This can be the most involved.

Knowing these components can help an institution evaluate a branch’s mortgage portfolio and determine if it needs to change strategies to enhance growth. For instance, take these hypothetical figures:

- Mortgage origination balance – $150,000
- Average gain on sale – 2.50%
- Cost to originate – $3,000

These figures would drive a $3,750 gain ($150,000 x 2.50%) and a $750 pre-tax income ($3,750 – $3,000). It would be accretive to sell mortgages above $120,000, given the gain on sale and fulfillment assumptions. If an institution can access the data underlying these mortgage assumptions, it will be better able to understand business unit (branch) profitability and whether its balance sheet strategies are in line with long-term profitability growth.
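A short worked version of that arithmetic, using the same hypothetical figures, makes the breakeven explicit:

```python
# Hypothetical figures from the example above
BALANCE = 150_000
GAIN_ON_SALE = 0.025      # 2.50% average gain on sale
COST_TO_ORIGINATE = 3_000

gain = BALANCE * GAIN_ON_SALE             # $3,750
pretax_income = gain - COST_TO_ORIGINATE  # $750
print(f"gain: ${gain:,.0f}, pre-tax income: ${pretax_income:,.0f}")

# Breakeven: the balance where the gain on sale just covers fulfillment cost
breakeven = COST_TO_ORIGINATE / GAIN_ON_SALE
print(f"selling is accretive above: ${breakeven:,.0f}")  # $120,000
```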

FIGURE 2: Dot Plot Diagram Outlining Market Pricing


Geographic
Exception pricing on deposits is a consistent struggle for many institutions. For varying reasons, a banker might offer a ‘promotional/special’ rate to a customer he or she would like to retain. This type of technique can often be profitable for an organization when deployed appropriately. But how does an institution evaluate if there are certain markets or officers abusing exception pricing?

An institution can generate a dot plot diagram that ranks given markets from smallest to largest across the x-axis and average interest expense rates along the y-axis (Figure 2). Visual representation will allow the institution to analyze trends. Are there outliers? Is there a consistent trend? What questions arise from the shape and correlations of the dots? These questions offer valuable insight into market pricing, but individual bankers are driving the results. Looking at the absolute interest expense might not be as helpful as comparing markets to one another. But what drives average market pricing?
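Here is a sketch of how such a diagram might be built; all market sizes and rates are randomly generated for illustration, with two outliers planted to show what an exception-pricing anomaly would look like:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# Hypothetical markets ranked smallest to largest by deposits
n = 25
deposits = np.sort(rng.uniform(20, 500, n))        # $MM
avg_rate = 0.90 + 0.12 * rng.standard_normal(n)    # avg interest expense (%)
avg_rate[[5, 17]] += 0.45                          # plant two outliers to spot

plt.scatter(range(1, n + 1), avg_rate, s=deposits * 0.4)  # dot size = deposits
plt.xlabel("Market rank (smallest to largest deposits)")
plt.ylabel("Average interest expense (%)")
plt.title("Exception-pricing review by market")
plt.show()
```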

Further analysis is aided by creating a similar diagram that ranks individual bankers, allowing managers to compare and contrast portfolios to hold bankers accountable to their results. There may be justification for pricing performance, but without this type of profitability analysis, the institution will never be able to determine accountability.

Customer
The low-hanging fruit of profitability analysis is often identifying high-usage, low-fee customers that strictly utilize the institution’s branches for transactional purposes.

Given the availability of data, a customer making a significant volume of deposits at given branches will stick out as an anomaly. These low-hanging-fruit customers can be identified by high volumes of deposits, low overall balances and ACH withdrawals consistent with the number of business days in a year. The customer utilizes an institution’s branch network to transact business, but transfers the deposits to another institution where he or she might have a larger corporate relationship.
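A minimal screen along those lines might look like the following, with invented customer records and thresholds chosen purely for illustration:

```python
# Hypothetical customer activity screen for branch-heavy, low-balance accounts
customers = [
    {"name": "Cust 1", "deposit_count": 640, "avg_balance": 9_500,  "ach_withdrawals": 252},
    {"name": "Cust 2", "deposit_count": 35,  "avg_balance": 85_000, "ach_withdrawals": 12},
]

BUSINESS_DAYS = 252  # ACH sweeps on roughly every business day are the tell

flagged = [
    c for c in customers
    if c["deposit_count"] > 250
    and c["avg_balance"] < 25_000
    and abs(c["ach_withdrawals"] - BUSINESS_DAYS) <= 10
]
print([c["name"] for c in flagged])  # ['Cust 1']
```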

Identifying these relationships and having appropriate conversations regarding minimum balances or fees (or gaining additional components of the relationship from other institutions) is the quick win this profitability analysis can provide.

The key to meaningful profitability analysis is understanding that profitability is relative, not absolute. It is up to an institution to develop assumptions that define who and what it sees as profitable. If a customer is deemed ‘unprofitable’ through profitability analysis, a decision to get that customer out of the institution may not always be the best strategy. Instead, it may be better to identify a comparable relationship (with similar attributes) that is interacting in a profitable way, and to initiate conversations that evaluate how to migrate this customer to a more profitable interaction given what is known about other profitable customers.

Conclusion
There are endless opportunities for institutions to utilize profitability analysis. Beginning the journey toward full profitability cube analysis is difficult if an institution doesn’t envision what it hopes to know after implementation. Therefore, start with the end in mind.




Disclaimer: The views and opinions expressed in this article are those of the author(s) and do not necessarily reflect the official policy or position of the Financial Managers Society.

About the Author

As the Associate Director of Finance for Associated Bank, Ben Braun works in the areas of strategy, marketing and finance. Ben is also currently a Director-at-Large on the FMS Board of Directors.





August 12, 2019
Breaking Down the Silos in Managing Risk
By Michael Berman, Founder and CEO, Ncontracts

There’s a difference between risk management that starts at the top and risk management that trickles up.

When risk management starts at the top it is thorough and unified. Risk tolerances and risk appetite drive strategic decisions – all decisions. Everyone is following the same approach. In contrast, when risk management trickles up, it’s anyone’s guess what’s really happening. Each business line, department or area does what it wants, how it wants, with no regard for the big picture. It’s a huge waste of resources that creates redundancies, inefficiencies and discrepancies.

The trickle-up approach may have worked in the past when risk management was limited to credit risk, but not today. As regulatory guidance has expanded in scope over the past few years, the overlap between different areas of risk management has grown significantly. Enterprise risk management, business continuity planning, compliance, cybersecurity and vendor management can no longer be thought of as standalone elements of an institution’s operational risk management program – they are intertwined.

At too many institutions, the IT department handles cybersecurity, compliance tackles vendor management and someone else in IT oversees business continuity planning. The result is an organization comprised of silos. Each team meticulously follows regulatory requirements and best practices for risk management – without ever considering the possibility that someone else in the institution might be tackling a similar task. Consider security breaches of critical vendors. They touch at least five key areas of risk management:

1. Vendor management
Regulators want institutions to know what provisions and policies are in place to ensure notification of a security breach at a critical third-party vendor.

2. Cybersecurity
FFIEC’s Cybersecurity Assessment Tool specifically asks if all critical vendors are required by contract to notify the institution when there is a security breach.

3. Business continuity planning
An institution should know how long it will take critical vendors to notify it of a security breach.

4. Compliance
The Gramm-Leach-Bliley Act specifically mentions that vendors with access to protected data should be required to notify the institution of a security breach.

5. Enterprise risk management
The institution needs to determine if critical vendors are required to notify it of a security breach.

In theory, overlapping requirements like these should make risk management simpler for institutions – one person or team can address these concerns and report back to everyone who needs the information. But that’s not always what happens. Too often institutions rely on a decentralized approach to risk management, which is a problem for three reasons.

1. Redundancies
Consider the aforementioned security breach. There could be five or more groups compiling lists of third-party vendors, assessing the criticality of individual vendors and determining which vendors should report breaches and when. When it comes time to test controls, each control is tested five times instead of simply testing it once and sharing the findings with everyone involved. This repetition isn’t thorough – it’s just a waste of time and resources.

2. Inefficiencies
There may also be five or more teams monitoring and setting policy for security breaches of critical vendors. Instead of working cooperatively to leverage resources and information, each group starts from scratch. The compliance department doesn’t benefit from IT’s knowledge of cybersecurity. The vendor management and contract teams don’t necessarily understand the expectations of business continuity planning. Enterprise risk management isn’t providing the overall leadership needed to make the process function smoothly. It’s a waste of expertise.

3. Discrepancies
When different groups share overlapping responsibilities and don’t realize it, it can create conflict as each group sets different standards and notification times. For instance, the IT team may require breach notification within one hour, while compliance may say 24 hours. Meanwhile, each group may be using different standards to assess vendor risk, producing inconsistent results. Discrepancies like these are red flags for regulators.

Institutions can avoid these problems with a unified approach to risk management – putting in place systems to connect disparate functions and areas so that every requirement can be studied from multiple perspectives. Enterprise risk management should serve as an umbrella for all other areas of risk management – including compliance. Not only does this ensure that an institution’s business strategies are integrated into risk decisions, it also centralizes data so risk management can be viewed holistically.

FIGURE 1: Siloed Decision-Making


ERM Simplifies Banking
The idea of top-down risk management can intimidate institutions where the function is currently spread out. After all, existing processes may appear to be working. However, what seems easier or faster isn’t always right. Just look at the difference between making a strategic decision using ERM versus a siloed approach.

The chart in Figure 1 represents a siloed approach to decision-making. In this case, an institution is faced with the risk of losing small business lending market share to unregulated nonbank competition. To combat the threat, the institution decides to offer unsecured small business loans funded within 24 hours. The competition is doing it. The bank wants to remain competitive. Decision made.

The institution made a decision, but it didn’t follow any sort of process. Someone had an idea and the institution ran with it. There’s no systematic discussion to thoroughly analyze the potential risk involved. There’s no thought as to whether all stakeholders have been consulted. Maybe compliance is invited in, maybe it isn’t. What about IT? Marketing? Other key departments? The opportunity to uncover risks (and opportunities) is lost by failing to include key areas.

Worse yet, once the decision is made and marching orders are passed on, this siloed approach is likely to produce redundancies. Let’s say the institution is outsourcing this new platform to a third-party vendor. That introduces third-party risk, which ties into cyber risk, reputation risk, compliance risk and even credit and financial risk. How will these risks be addressed? If each department attacks third-party risk individually, it will result in an inefficient duplication of resources and also introduce the potential for conflicting results. With different areas using different standards for assessing elements of third-party risk, there will likely be conflicting work that leads to complications.

While the chart in Figure 1 may look simple and ordered, in reality it’s ineffective and inefficient because there are no connections.

Now let’s look at strategic decision-making with ERM as pictured in Figure 2. The ERM chart is a bit harder to follow. It’s more complicated. But it’s not creating complications, it’s uncovering them, revealing overlap and the need for communication.

FIGURE 2: Strategic Decision-Making with ERM


It may seem like a big effort, but using ERM will actually lead to less work.

Using an ERM approach to strategic decision-making means the institution knows exactly who needs to be in the room before a decision is made. It uncovers problems and conflicts early on, allowing them to be addressed from the beginning when a program has the most flexibility. It allows different areas to benefit from existing work and reach agreement. It leads to smarter decision-making – and that leads to less work for everyone.

But that only happens when ERM work is integrated into the strategic decision-making process. Rather than making a decision and then telling the CFO about it, the institution gets input while it’s still possible to make changes. ERM is about understanding that risk management can’t happen in a silo. By nature, risks are interrelated and uncovering them requires a systematic approach. Risk must be considered collaboratively from the very beginning to make informed decisions.

The status quo may seem acceptable, but managing risks individually instead of with ERM is likely to be wasting resources and creating messes that could have been prevented with foresight. When silos are eliminated, risk management results in better oversight, greater efficiency and lower costs. That’s why risk management needs to start at the top.




Disclaimer: The views and opinions expressed in this article are those of the author(s) and do not necessarily reflect the official policy or position of the Financial Managers Society.

About the Author

Michael Berman is the founder and CEO of Ncontracts, a leading provider of risk management solutions. He has an extensive background in legal and regulatory matters, and was involved in numerous regulatory, compliance and contract management challenges during the course of a legal career that included several general counsel posts.






Have a topic you want to share? 

We’d love to hear from you:
Mark Loehrke
Editor and Director, Publications and Research
Direct: 312-630-3421
Email: mloehrke@FMSinc.org