Industry Insights

August 12, 2019
Breaking Down the Silos in Managing Risk
By Michael Berman, Founder and CEO, Ncontracts

There’s a difference between risk management that starts at the top and risk management that trickles up.

When risk management starts at the top it is thorough and unified. Risk tolerances and risk appetite drive strategic decisions – all decisions. Everyone is following the same approach. In contrast, when risk management trickles up, it’s anyone’s guess what’s really happening. Each business line, department or area does what it wants, how it wants, with no regard for the big picture. It’s a huge waste of resources that creates redundancies, inefficiencies and discrepancies.

The trickle-up approach may have worked in the past when risk management was limited to credit risk, but not today. As the scope of regulatory guidance has expanded over the past few years, the overlap between different areas of risk management has grown significantly. Enterprise risk management, business continuity planning, compliance, cybersecurity and vendor management can no longer be thought of as standalone elements of an institution’s operational risk management program – they are intertwined.

At too many institutions, the IT department handles cybersecurity, compliance tackles vendor management and someone else in IT oversees business continuity planning. The result is an organization made up of silos. Each team meticulously follows regulatory requirements and best practices for risk management – without ever considering the possibility that someone else in the institution might be tackling a similar task. Consider security breaches of critical vendors. They touch at least five key areas of risk management:

1. Vendor management
Regulators want institutions to know what provisions and policies are in place to ensure notification of a security breach at a critical third-party vendor.

2. Cybersecurity
FFIEC’s Cybersecurity Assessment Tool specifically asks if all critical vendors are required by contract to notify the institution when there is a security breach.

3. Business continuity planning
An institution should know how long it will take critical vendors to notify it of a security breach.

4. Compliance
The Gramm-Leach-Bliley Act specifically mentions that vendors with access to protected data should be required to notify the institution of a security breach.

5. Enterprise risk management
The institution needs to determine if critical vendors are required to notify it of a security breach.

In theory, overlapping requirements like these should make risk management simpler for institutions – one person or team can address these concerns and report back to everyone who needs the information. But that’s not always what happens. Too often institutions rely on a decentralized approach to risk management, which is a problem for three reasons.

1. Redundancies
Consider the aforementioned security breach. There could be five or more groups compiling lists of third-party vendors, assessing the criticality of individual vendors and determining which vendors should report breaches and when. When it comes time to test controls, each control is tested five times instead of simply testing it once and sharing the findings with everyone involved. This repetition isn’t thorough – it’s just a waste of time and resources.

2. Inefficiencies
There may also be five or more teams monitoring and setting policy for security breaches of critical vendors. Instead of working cooperatively to leverage resources and information, each group starts from scratch. The compliance department doesn’t benefit from IT’s knowledge of cybersecurity. The vendor management and contract teams don’t necessarily understand the expectations of business continuity planning. Enterprise risk management isn’t providing the overall leadership needed to make the process function smoothly. It’s a waste of expertise.

3. Discrepancies
When different groups share overlapping responsibilities without realizing it, conflict can arise as each group sets different standards and notification times. For instance, the IT team may require breach notification within one hour, while compliance may say 24 hours. Meanwhile, each group may be using different standards to assess vendor risk, producing conflicting results. Discrepancies like these are red flags for regulators.

Institutions can avoid these problems with a unified approach to risk management – putting in place systems to connect disparate functions and areas so that every requirement can be studied from multiple perspectives. Enterprise risk management should serve as an umbrella for all other areas of risk management – including compliance. Not only does this ensure that an institution’s business strategies are integrated into risk decisions, it also centralizes data so risk management can be viewed holistically.

FIGURE 1: Siloed Decision-Making

ERM Simplifies Banking
The idea of top-down risk management can intimidate institutions where the function is currently spread out. After all, existing processes may appear to be working. However, what seems easier or faster isn’t always right. Just look at the difference between making a strategic decision using ERM versus a siloed approach.

The chart in Figure 1 represents a siloed approach to decision-making. In this case, an institution is faced with the risk of losing small business lending market share to unregulated nonbank competition. To combat the threat, the institution decides to offer unsecured small business loans funded within 24 hours. The competition is doing it. The bank wants to remain competitive. Decision made.

The institution made a decision, but it didn’t follow any sort of process. Someone had an idea and the institution ran with it. There’s no systematic discussion to thoroughly analyze the potential risk involved. There’s no thought as to whether all stakeholders have been consulted. Maybe compliance is invited in, maybe it isn’t. What about IT? Marketing? Other key departments? The opportunity to uncover risks (and opportunities) is lost by failing to include key areas.

Worse yet, once the decision is made and marching orders are passed on, this siloed approach is likely to produce redundancies. Let’s say the institution is outsourcing this new platform to a third-party vendor. That introduces third-party risk, which ties into cyber risk, reputation risk, compliance risk and even credit and financial risk. How will these risks be addressed? If each department attacks third-party risk individually, it will result in an inefficient duplication of resources and also introduce the potential for conflicting results. With different areas using different standards for assessing elements of third-party risk, there will likely be conflicting work that leads to complications.

While the chart in Figure 1 may look simple and ordered, in reality it’s ineffective and inefficient because there are no connections.

Now let’s look at strategic decision-making with ERM as pictured in Figure 2. The ERM chart is a bit harder to follow. It’s more complicated. But it’s not creating complications, it’s uncovering them, revealing overlap and the need for communication.

FIGURE 2: Strategic Decision-Making with ERM

It may seem like a big effort, but using ERM will actually lead to less work.

Using an ERM approach to strategic decision-making means the institution knows exactly who needs to be in the room before a decision is made. It uncovers problems and conflicts early on, allowing them to be addressed from the beginning when a program has the most flexibility. It allows different areas to benefit from existing work and reach agreement. It leads to smarter decision-making – and that leads to less work for everyone.

But that only happens when ERM work is integrated into the strategic decision-making process. Rather than making a decision and then telling the CFO about it, the institution gathers input while it’s still possible to make changes. ERM is about understanding that risk management can’t happen in a silo. By nature, risks are interrelated, and uncovering them requires a systematic approach. Risk must be considered collaboratively from the very beginning to make informed decisions.

The status quo may seem acceptable, but managing risks individually instead of with ERM is likely to be wasting resources and creating messes that could have been prevented with foresight. When silos are eliminated, risk management results in better oversight, greater efficiency and lower costs. That’s why risk management needs to start at the top.

Disclaimer: The views and opinions expressed in this article are those of the author(s) and do not necessarily reflect the official policy or position of the Financial Managers Society.

About the Author

Michael Berman is the founder and CEO of Ncontracts, a leading provider of risk management solutions. He has an extensive background in legal and regulatory matters, and was involved in numerous regulatory, compliance and contract management challenges during the course of a legal career that included several general counsel posts.

July 22, 2019
CECL: Overcoming Key Challenges
By Nick Ansley, Partner, Financial Institutions Practice, Wipfli LLP

Financial institutions are facing a number of significant issues as they prepare to adopt the new Current Expected Credit Losses (CECL) impairment model. As implementation deadlines begin bearing down, it may be helpful to review three of the common challenges banks and credit unions are wrestling with and to consider a few ideas for working through these challenges.

Challenge 1: Where to Start?
Switching from an incurred-loss calculation to an expected-loss calculation is a substantial change. ASU 2016-13 provides examples for a wide range of methodologies that would be acceptable under CECL, but provides no suggestion as to which methodology would be best for a particular institution. Trying to understand the different CECL methodologies and weighing their pros and cons can be a time-consuming process.

Recommendations: Regulators have consistently indicated that they expect the sophistication of a CECL solution to be commensurate with the complexity of the institution’s loan portfolio. As a result, a smaller community-based institution likely will not require the same level of complexity in its methodology as a multi-billion dollar institution.

When evaluating the various CECL methodologies, it is apparent that there is a trade-off between simplicity and precision. Certain methodologies, such as the loss rate, remaining life, migration and vintage methods are less complicated, but may offer a little less precision. Conversely, methodologies like probability of default and discounted cash flow offer greater levels of precision, but are more complicated to develop. Financial institutions struggling with where to start may wish to consider working with the simpler methodologies initially. If these less-complex methodologies appear to provide reasonable results, it may be unnecessary to study other alternatives. If, however, the simpler methodologies don’t seem to provide sound estimates, this may suggest a more sophisticated solution is needed. For institutions looking to work with a more precise methodology, it may prove beneficial to invest in a third-party software solution to help with the analysis.
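To make that trade-off concrete, the simplest end of the spectrum can be sketched in a few lines of Python. Everything below is a hypothetical illustration – the pool, the figures and the single average annual rate are invented – and a real CECL loss-rate model would further adjust that historical rate for the remaining life of the portfolio, current conditions and forecasts:

```python
# Hypothetical sketch of a simple loss-rate method: the average historical
# charge-off rate for a loan pool, applied to its current amortized cost balance.

def loss_rate_reserve(charge_offs, balances, current_balance):
    """Average of the annual charge-off rates, times the current pool balance."""
    annual_rates = [co / bal for co, bal in zip(charge_offs, balances)]
    average_rate = sum(annual_rates) / len(annual_rates)
    return average_rate * current_balance

# Five years of invented charge-offs and year-end balances for one loan pool
charge_offs = [120_000, 95_000, 80_000, 110_000, 70_000]
balances = [10_000_000, 11_000_000, 12_000_000, 12_500_000, 13_000_000]

reserve = loss_rate_reserve(charge_offs, balances, current_balance=14_000_000)
print(f"Estimated reserve: ${reserve:,.0f}")
```

The appeal of this style of method is exactly what the paragraph above describes: it is transparent and easy to validate, at the cost of some precision.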

Challenge 2: Data Availability
Most loan systems have the current data necessary to develop a CECL solution, but going back and obtaining that data from previous years has been a challenge for many financial institutions. In some cases necessary loan reports are simply no longer available, while in other instances institutions have discovered certain key loan fields (such as risk rating) have been overwritten with more recent data. The lack of reliable historical data has hindered financial institutions as they attempt to develop an expected loss calculation.

Recommendations: While large financial institutions have created data warehouses for accumulating and testing data, this is probably not a practical option for most institutions. In some instances, certain historical data simply may not be recoverable. Institutions should start early to identify these data gaps and to determine if there are any effective workarounds to help overcome them. For instance, if loan risk ratings from prior periods have been overwritten on the loan system, it may be possible to utilize older watch list reports as an alternative way of gathering the data. In instances where an effective workaround cannot be identified, institutions should at a minimum start gathering their current data. Private companies have until 2022 before CECL becomes effective; while not ideal, these institutions should still have three years (2019-2021) of historical data to consider by the date of adoption.

If obtaining data is proving to be a substantial challenge, an institution may wish to consider the Weighted Average Remaining Life (WARM) methodology. WARM is unique in that it utilizes an average annual loss rate similar to the one used in the current allowance calculation. The methodology does require forward-looking projections of how the loan portfolio is expected to pay down in the future, but because it requires less historical data, it may provide a viable approach for institutions struggling to obtain data from previous periods.
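Sticking with purely hypothetical numbers, the WARM idea – an average annual loss rate applied to the portfolio balance projected for each remaining year of its life – might be sketched as:

```python
# Hypothetical WARM-style sketch. The annual loss rate, paydown rate and
# remaining life below are invented inputs, not figures from the article.

def warm_reserve(current_balance, annual_loss_rate, annual_paydown_rate, remaining_years):
    """Apply an average annual loss rate to each year's projected, declining balance."""
    expected_loss = 0.0
    balance = current_balance
    for _ in range(remaining_years):
        expected_loss += balance * annual_loss_rate
        balance *= 1 - annual_paydown_rate  # forward-looking paydown assumption
    return expected_loss

# $14M pool, 0.25% annual loss rate, 30% expected annual paydown, 4-year life
print(f"Estimated reserve: ${warm_reserve(14_000_000, 0.0025, 0.30, 4):,.0f}")
```

The paydown projection is the piece the article highlights: it is the one input that cannot come from historical loss data and must be supported by a forward-looking estimate.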

Challenge 3: Historically strong asset quality
Coming out of the 2007-2009 recession, many institutions tightened credit standards and placed a renewed emphasis on asset quality. Improved underwriting, combined with an extended economic recovery, has led to very low charge-off rates for many financial institutions. While this is great for the performance of financial institutions, it does create a unique challenge in estimating future expected credit losses. When CECL was first introduced, many assumed that adoption would require most institutions to significantly increase their allowance for loan losses. However, now that many institutions have gone through an entire loan cycle with minimal losses, the overall impact of CECL is less certain. How will financial institutions forecast future losses when the most recent loan cycle has performed at historically strong levels?

Recommendations: ASU 2016-13 requires institutions to not only consider past performance but also current conditions and “reasonable and supportable” forecasts. This means that institutions will need to adjust historical loss information for both current economic conditions and for forecasts of how conditions will change moving forward. Ideally, these forecasts will be supported by historical information. For example, if an institution anticipates that a recession is likely in the near future, it may be beneficial to look back to the latest recession period to see the impact on loss rates.

Even for institutions with access to substantial amounts of historical data, developing supportable forecasts will be one of the more challenging elements of building a CECL model. For institutions looking to purchase a third-party software solution, evaluation of the model’s forecasting process is an important consideration.

Developing a compliant CECL calculation is a substantial project. With the effective date of CECL still several years out for most financial institutions, there may be a temptation to put off starting the CECL process. But delay is likely to prove a poor strategy.

Financial institutions will undoubtedly face numerous challenges as they work to develop an appropriate approach; it is important to anticipate these challenges and build necessary time into the implementation process to work through both expected and unexpected issues. Institutions that begin the process early will be better positioned to develop strong solutions to their challenges, and to have a strong CECL calculation in place by the adoption date.

About the Author

Nick Ansley has more than 15 years of experience working with financial institution clients, with particular expertise in the areas of external audit, accounting and commercial loan review.

July 8, 2019
Enterprise Risk Management for the Boardroom
By L. Randy Marsicano, NCRM, CRISC, Professional Services Senior Manager, WolfPAC Solutions

Have you ever felt challenged while preparing for an ERM program presentation? Ever had one go badly?

Enterprise Risk Management is, by definition, a process, so reporting your program’s results is a process as well. Your success in delivering engaging, simplified results has less to do with which report you choose to present than with understanding your audience and how they consume data. Wouldn’t it be great if, during your preparation, you knew how to build a reporting narrative tailored to your audience’s consumption of information, in an organized and engaging manner? To do so, you’ll have to start by getting past some of the common myths surrounding ERM presentations.

Myth #1: All Communication is the Same
Imagine your car suddenly makes a funny noise. When you make an appointment with your mechanic, you describe the situation in great detail and participate in a mutual negotiation of what needs to be fixed, the timeline and acceptable payment terms. Now imagine updating your spouse on the car situation. Will you go into that same level of detail? Probably not. You may simply state that you had an issue, weighed out the potential solutions, agreed on a price and ask for a ride home! Take this scenario one step further – imagine explaining the same situation to your boss. Wouldn’t you simply say “My car is in the shop, I’m working from home today and am available on my cell if you need me”?

The same situation is described, but presented very differently depending on the audience and how they consume information. If you can get your head around that, you may also agree that you should communicate ERM program results differently to the first line of defense, the second line of defense and your board.

Myth #2: All People are the Same
In 1924, lawyer and psychologist William Moulton Marston studied the concepts of will and a person’s sense of power, and their effect on personality and human behavior. Through this research, the DISC profile emerged. Today, we can benefit from understanding different personality types and how they consume information. At a high level, four DISC traits have been identified, each with its own communication style:

- Dominance (sometimes called the Eagle): A direct and results-oriented personality, this profile consumes information quickly and at a high level, without delving into details.
- Influence (sometimes called the Parakeet): With an outgoing, high-spirited and lively personality, this profile consumes high-level information but prefers a more personal approach.
- Steadiness (sometimes called the Dove): Known as having a calm and sensitive personality, this profile methodically consumes information and may desire direct involvement.
- Conscientiousness (sometimes called the Owl): As a reserved and analytical personality, this profile consumes logical and detailed information.

Understanding people’s specific personality types is important, because the right information presented the wrong way may distract from your message.

Myth #3: One Report Does it All
In helping people prepare for ERM boardroom presentations, I notice that some individuals simply ask which reports to print. Although the value of reports should not be dismissed, they are only a supporting player. According to the RMA Governance and Policies Workbook, “risk reports shouldn’t create paper, they should create dialogue. Information reported without context can be extremely dangerous.”

Providing constructive dialogue on ERM programs is essentially telling a good story – complete with a beginning, a middle and an end (or rather, with a process, results and conclusion):

- Process: This includes the period considered, what was covered and who participated
- Results: What did we learn, what are the threats to the business, are appropriate controls in place and are we safe?
- Conclusion: Lessons learned and action plans

Myth #4: You Can Put This Together Quickly
We all have a friend who waits until Christmas Eve to shop for gifts. But preparing a relevant, succinct and effective presentation is not the same as Christmas shopping – it takes time, and must be done over time. Discerning people will see right through a quickly pieced-together presentation.

Now that we have dispelled some of the myths around ERM programs, here is some simple yet effective advice for presenting your ERM program:

1. Start early. Begin by writing down the basic framework and key messages. Seek to understand early what information may be missing, and put together a plan to get it.

2. Make sure you understand how your audience consumes information. If you don’t have the opportunity ahead of time, be ready to quickly determine which trait you are talking to and adjust accordingly. When there is more than one personality in the audience, communicate to the highest ranking person in the room – most likely an “Eagle”. If the highest ranking person is not an Eagle, but someone of influence is, you may still need to start communicating in “Eagle-ease,” but quickly get to areas of detail to accommodate the other styles. Parakeets, Owls, and Doves tend to have more patience than an Eagle.

3. Craft your story. Your presentation should start with the process, or how you got there. This will lay the groundwork and help your audience understand what it is they’re looking at. Results should be communicated with the appropriate detail, but be prepared to drill down into some of the higher-risk areas if asked. Always end with lessons learned and next steps, which can include how results will be used, remediation put in place and linkages to strategic programs.

A well-structured and communicated program shows value not only in the effort, but in the presenter as well. Good luck!

About the Author

Randy Marsicano is a Senior Manager of Professional Services in the WolfPAC Solutions Group, overseeing all Enterprise Risk Management Advisory Services. He has nearly 30 years of experience designing and implementing risk management, vendor management, technology and operational management programs, and works closely with community institutions to create and improve their ERM programs and drive costs down.

June 17, 2019
CECL Status Check: How Prepared is Your Financial Institution?
By Rick Martin, Product Manager, Financial & Risk Management Solutions, Fiserv

It’s already clear that the effect of CECL – the Financial Accounting Standards Board’s (FASB) Current Expected Credit Loss standard – will be far-reaching. The CECL accounting standard, which requires banks and credit unions to record expected losses whenever they make a new loan (at the time of origination or purchase), is scheduled to go into effect as follows:

- Q1 2020 (March 31) for SEC-filing public organizations
- Q1 2021 (March 31) for non-SEC-filing public organizations
- Q1 2022 (March 31) for non-public business entities
- Q1 2022 (March 31) for credit unions

Despite these fast-approaching effective dates, many financial institutions do not have the historical data in place to calculate life-of-loan losses as required by this new credit loss model. Consequently, it may come as no surprise that financial institutions are working at a breakneck pace to prepare themselves for CECL. As financial institutions ramp up to this new standard, considering the following seven tasks will not only help them ensure they have the right data (and enough of it), but also save time and money down the road.

Driving CECL Implementation by Committee
By now, financial institutions should have implementation committees in place, with responsibilities assigned along the three main categories: operational, credit and compliance. These committees should include senior leaders from finance, accounting, lending, risk, operations/IT, compliance and retail.

Beyond the essential to-do lists, institutions should assign responsibility for integrating systems and processes across the organization and re-evaluating growth strategies.

Selecting a Methodology
Financial institutions have a great deal of flexibility when it comes to selecting a methodology for evaluating expected credit loss. In fact, the only requirement is to choose a method that is appropriate and practical – one that can reasonably estimate the expected collectability of financial assets and be applied consistently over time. The right choice depends on the type and size of loans issued by the institution (for instance, car loans versus mortgages) and other internal and external factors. In other words, different methods for different asset groups.

Compiling the Data
Compiling all of the data necessary to comply with CECL often represents the most challenging, time-consuming aspect of the process. For financial institutions, gathering the right historical data is a staggering task.

Begin by learning what data is required for CECL compliance and compare it to existing data to identify gaps. How will missing data be acquired to fill the gaps? Define how data from external or internal sources can be used. Look to external or aggregated sources from peer institutions. Institutions can also extrapolate through historical analysis of data within their organization. Determining how to put processes in place to capture the necessary data going forward is paramount.

Applying Economic Forecasts to Loans
Moving from the incurred-loss model to an expected-loss model will force financial institutions to consider how to apply economic forecasts to the valuation and protection of loan portfolios. Economic forecasts can provide valuable insight into future performance. For instance, unemployment rates could indicate shortfalls on car loan repayments or other short-term loans, while regional growth factors could positively affect repayment/refinancing of home loans, mortgages and longer-term loans.

Once the external data best suited to a loan portfolio has been assessed and gathered, forecasts can be developed to provide an advanced look at loan performance and reserve requirements.

Storing and Managing Data
Data management, retention and storage should also be considered when adapting to CECL standards, and many institutions are assessing their IT systems and planning investments to meet current and future requirements. Going forward, they’ll also need to put the right systems in place to gather data, and develop governance strategies to ensure its retention. By combining data from financial accounting and risk management systems, it can be managed and analyzed in a meaningful way.

Gaining a Strategic Advantage
Fortunately, all of this data preparation can provide a competitive advantage for financial institutions. CECL requirements mark the first time this much data has been aggregated at the individual financial instrument level. And once that history – that instrument-level data – is captured, good things can happen. It can reveal valuable insights and trends that can help the institution improve decision-making around credit risk, interest rates and profitability.

For instance, look at demand for different types of loans over time, or other key factors. Data can be pooled and correlated by collateral or type – including mortgages, auto loans, credit cards and more – and further segmented by cost center, loan officer, FICO score or geography. Practical analysis of this depth of instrument-level history gives financial institutions a solid foundation for understanding their markets and metrics, including how portfolios behave and where potential opportunities lie.
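As a purely illustrative sketch of that kind of pooling and segmentation – the records, field names and figures below are invented, not drawn from any system mentioned in the article – instrument-level data can be rolled up by loan type and then by a second dimension such as geography:

```python
from collections import defaultdict

# Invented instrument-level records; every field name and value is illustrative.
loans = [
    {"type": "auto", "region": "Southeast", "balance": 18_000, "fico": 702},
    {"type": "auto", "region": "Midwest", "balance": 22_500, "fico": 661},
    {"type": "mortgage", "region": "Southeast", "balance": 240_000, "fico": 745},
    {"type": "mortgage", "region": "Midwest", "balance": 310_000, "fico": 688},
]

# Pool balances by loan type, then segment each pool by region
pools = defaultdict(lambda: defaultdict(float))
for loan in loans:
    pools[loan["type"]][loan["region"]] += loan["balance"]

for loan_type, regions in sorted(pools.items()):
    for region, total in sorted(regions.items()):
        print(f"{loan_type:9s} {region:10s} ${total:,.0f}")
```

The same rollup could be cut by FICO band, loan officer or cost center instead of region, which is what makes instrument-level capture so much more flexible than pre-aggregated pool data.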

Because it facilitates risk analysis across interest rates, liquidity, credit, market and regulatory capital, additional loan and credit data helps institutions forecast and reduce losses. In addition to generating more accurate budget projections, using this data to inform strategic decisions can help lower overall risk and deliver better-managed returns for every stakeholder, including borrowers.

Making Technology Do the Work for You
Making progress on CECL implementation is more urgent than ever considering the fast-approaching deadline. The best bet for financial institutions to meet the deadline and be positioned to strategically leverage all of that required data is to turn to a consultative vendor with the right CECL technology. Utilizing these resources can have a financial institution up and running with a CECL solution in as little as two months.

Consider these five factors when investigating CECL technologies: data management; methodology; reporting; technology integration; and strategic guidance and expertise. Having the right solution not only ensures compliance and minimizes reserve requirements, but also provides insight into data for strategic value over the long term.

Regardless of the ultimate choice, it is crucial for institutions to accelerate their implementation plans so they can approach the coming deadline with confidence.

About the Author

Rick Martin is a product manager in the Financial & Risk Management Solutions division at Fiserv. He has more than twenty years of experience in banking and financial technology, and educates financial institutions on ensuring compliance with CECL and other standards and regulations.

May 20, 2019
White Paper:
Pricing and Elasticity in Financial Institutions: What happens when pricing changes?

By Matthew E. Speed, Vice President, Ceto and Associates

Summary: The financial services industry employs several methods of making pricing decisions on its products and services. While there is no agreed-upon industry best practice for how products and services are priced, pricing inefficiencies have a detrimental effect on income. Furthermore, various regulations in the industry significantly impact pricing strategy and must be taken into consideration.

“The single most important decision in evaluating a business is pricing power,” says Warren Buffett, CEO of Berkshire Hathaway.1 “If you’ve got the power to raise prices without losing business to a competitor, you’ve got a very good business. And if you have to have a prayer session before raising the price by 10%, then you’ve got a terrible business.”

The above statement is true regardless of the industry, which is why it is often quoted. The financial services industry – specifically banks and credit unions – has struggled with pricing over the last several decades. As the industry grew more competitive, giving everything away became commonplace. This “strategy” worked during times of higher interest rates and less regulation around fee income, which is why new banks opened at a rate of over 100 institutions per year from 2000 to 2007.2

May 6, 2019
The Road to Higher Profits
By Alec Hollis, Director, ALM Strategy Group, ALM First Financial Advisors, LLC

Banks want to achieve above-average profitability. Profitable growth is a critical element to success as an organization. Long-run commercial viability occurs when an organization delivers value to its constituents in a profitable and sustainable manner. But how do banks get that way? Are there distinguishing characteristics of high-profit banks?

The year 2018 was a blockbuster one for the banking industry, thanks in large part to the Tax Cuts and Jobs Act (TCJA). The industry earned $236.7 billion in 2018, a whopping 44.1% improvement over the $164.3 billion earned in 2017, and return on assets (ROA) reached 1.35% – its highest point in over seven years. According to the FDIC’s Quarterly Banking Profile, however, the 44.1% increase in full-year net income would have been only an estimated 13.5% given a normalized tax rate.
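The headline growth figure is straightforward to reproduce from the two FDIC net income totals cited above:

```python
# Verifying the year-over-year net income growth cited in the text.
net_income_2017 = 164.3  # $ billions, FDIC full-year net income
net_income_2018 = 236.7  # $ billions

growth = (net_income_2018 - net_income_2017) / net_income_2017
print(f"{growth:.1%}")  # 44.1%
```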

Drawing conclusions from the FDIC’s published data, it’s clear that asset size is a factor in profitability. Medium-sized and larger banks enjoy a significant profitability advantage over smaller banks. For example, banks under $100 million in assets have an ROA disadvantage of 33 basis points (bps) against the industry’s 1.35%, much of which can be attributed to the greater efficiency that comes with scale. However, the discussion of size and performance recalls the chicken-and-egg conundrum – or, as statisticians would put it, correlation does not imply causation.

The effectiveness of an institution’s management team shapes its performance, and hence its size. Growth for the sake of growth is no substitute for profits. The wrong incentives related to growth could lead to uncontrolled increases in operating expenses and a loss of a competitive advantage. Rather, management teams should focus on delivering value in a profitable manner. Growth then becomes a natural byproduct, which can bring scale and further improvements in efficiency.

To view performance outside of the traditional confines of asset size, we created a bank screener to profile high-profit banks. To start, we filtered out the largest of banks, removing banks above $20 billion in assets and other unrepresentative specialty banks.

From there, our high-profit benchmark (HPB) contains banks with the following criteria:
- ROA and ROE higher than the industry in four out of the past five years
- A higher ROA today than five years ago
- Non-performing assets not greater than 1.20% of assets (double the industry’s aggregated figure)
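The three criteria above amount to a simple screen over five years of call-report history. A minimal sketch of that screen follows; the field names and sample figures are illustrative assumptions, not the authors' actual implementation, and the industry benchmark series here is invented for demonstration:

```python
# Hypothetical sketch of the high-profit benchmark (HPB) screen described
# above. The real screen would run against FDIC call-report data; the
# industry figures and sample bank below are illustrative only.

INDUSTRY_ROA = [1.01, 0.97, 1.04, 0.97, 1.35]  # last five years (illustrative)
INDUSTRY_ROE = [9.0, 8.6, 9.3, 8.8, 11.98]     # illustrative
NPA_CEILING = 1.20                              # % of assets (double the industry figure)

def qualifies(bank):
    """Apply the three HPB criteria to one bank's five-year history."""
    # Count years in which the bank beat the industry on both ROA and ROE
    beats = sum(
        1
        for roa, roe, i_roa, i_roe in zip(
            bank["roa"], bank["roe"], INDUSTRY_ROA, INDUSTRY_ROE
        )
        if roa > i_roa and roe > i_roe
    )
    return (
        beats >= 4                            # beat the industry in 4 of 5 years
        and bank["roa"][-1] > bank["roa"][0]  # higher ROA today than five years ago
        and bank["npa_pct"] <= NPA_CEILING    # NPAs no more than 1.20% of assets
    )

sample = {
    "roa": [1.10, 1.15, 1.20, 1.30, 1.81],
    "roe": [10.0, 10.5, 11.0, 12.0, 16.0],
    "npa_pct": 0.4,
}
print(qualifies(sample))  # True
```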

Of the 5,406 institutions reporting to the FDIC, 575 banks made the HPB, representing about 11% of the total number of banks. These are institutions without excessive credit losses and with a strong track record of performance over the past five years. The asset size distribution is very similar to that of the broader industry, indicating that high-profit banks across all asset sizes are represented. The distribution does skew slightly larger, however, as the average asset size is $873 million in the HPB compared to $670 million on average for banks under $20 billion in assets. In our findings, the factors these high-profit banks share are expense control, leverage and balance sheet structure.

How High-Profit Banks Do It
Expense control is one factor that leads to higher profits, and arguably the most important one. Operating overhead stands out as the most statistically significant factor in profitability, as HPBs posted non-interest costs of 2.57% of average assets and a 50.38% efficiency ratio – both significantly lower than those of their similarly sized peers. HPBs also have lower interest costs and higher net interest margins (NIMs). On the other hand, non-interest income and fee income don’t seem to be key factors; HPBs appear to earn less from these diversified sources of income relative to larger banks. Overall, HPBs outpaced the broad industry by a wide margin last year, generating a 1.81% ROA and a 16.03% ROE.
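For readers less familiar with the efficiency ratio cited above, it is conventionally computed as non-interest expense divided by total revenue (net interest income plus non-interest income) – lower is better. The figures below are illustrative, not drawn from the HPB data:

```python
# Conventional efficiency ratio: non-interest expense / total revenue.
# A lower ratio means the bank spends less to generate each dollar of revenue.

def efficiency_ratio(noninterest_expense, net_interest_income, noninterest_income):
    return noninterest_expense / (net_interest_income + noninterest_income)

# Illustrative bank spending roughly $50 per $100 of revenue:
print(f"{efficiency_ratio(50.4, 80.0, 20.0):.2%}")  # 50.40%
```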

Leverage also stands out, but more so when comparing to banks on the smaller side. Larger banks tend to make better use of economically cheaper debt relative to high-cost owners’ capital. Interestingly, HPBs have higher capital ratios than the broader industry – including the largest banks – but risk-based capital ratios are about the same. This indicates HPBs are more likely to utilize risk-based capital, which leads us to the next point.

Balance sheet structure influences performance, at least for the time being when credit performance is strong. Loan-to-deposit and loan-to-asset ratios are significantly higher than the broader industry. HPBs also have higher deposit-to-asset ratios, perhaps giving them a cost of funds advantage.

Ultimately, profitability is a result of many factors. Market forces are certainly a big part of this discussion – once again, think back to 2018’s tax tailwind. A bank’s financial statement performance, however, suffers from the drawback that it is not risk-adjusted. That is the purpose of asset-liability management (ALM) – to increase profits by reducing risks that may adversely impact profitability. Should market forces move unfavorably, efficient, well-run banks will be the best positioned to survive.

Figure 1: The Index of High-Profit Banks Compared to the FDIC's Compiled Data

Disclaimer: The views and opinions expressed in this article are those of the author(s) and do not necessarily reflect the official policy or position of the Financial Managers Society.

About the Author

As a Director for the ALM Strategy Group at ALM First Financial Advisors, Alec Hollis performs asset liability management strategy research for financial institutions, implements firm-wide ALM modeling procedures and assists in the execution of client balance sheet hedging programs.

April 22, 2019
The Growing Push for Financial Literacy
By Robert Segal, CEO, Atlantic Capital Strategies, Inc.

The Massachusetts legislature approved a bill in January that requires state education officials to establish financial literacy standards for students in kindergarten through grade 12. The goal is to train students in skills that will help them become financially self-supporting adults, with topics that include understanding loans, renting or buying a home, saving for college and investing for retirement.

“Financial institutions have said that when they interact with young customers, they’re seeing a lot of young people not fully grasping everything from what credit cards are to compound interest to just general costs once they’re out of high school and college,” said Massachusetts State Senator Jamie Eldridge, who sponsored an original version of the bill.

Also in January, New Jersey Acting Governor Sheila Oliver signed a law that requires the state Board of Education to include financial literacy instruction in the curriculum for students in grades six through eight in public schools across the state. The new law says the lessons should equip students with tools for “sound financial decision-making,” with content covering budgeting, savings, credit, debt, insurance and investments.

“Financial responsibility is an important acquired and learned life skill, and with the increasing financial challenges millennials face, it is a skill that must be a necessary part of our educational curriculum,” said Oliver.

John Pelletier, director of the Center for Financial Literacy at Champlain College in Vermont, reported that only five states received an “A” grade for providing the appropriate financial education for students. He further noted that studies continue to show that financial literacy is linked to positive outcomes like wealth accumulation, retirement planning and avoiding high-cost alternative financial behavior like payday lending and paying interest on credit card balances. Conversely, he says, financial illiteracy was partly to blame for the Great Recession, and that in order to minimize the impact of any future recession or financial crisis, Americans must be educated in personal finance.

The Center asserts that high schoolers are the prime candidates for financial education for the following reasons:

- The number of financial decisions an individual must make continues to increase, and the complexity of financial products continues to grow;
- Many students do not understand that one of the most important financial decisions they will make in their lives is choosing whether they should go to college after high school;
- Most college students borrow to finance their education, yet they often do so without fully understanding how much debt is appropriate for their education;
- Children are not learning about personal finance at home, with a 2017 T. Rowe Price survey noting that 69% of parents are reluctant to discuss financial matters with their children;
- Employee pension plans are disappearing and being replaced by defined contribution retirement programs, which impose greater responsibilities on young adults to save and invest.

It seems most Americans would agree with the study’s conclusions. The National Foundation for Credit Counseling’s (NFCC) “2017 Consumer Financial Literacy Survey” reports that 42% of adults gave themselves grades C, D or F with regard to their personal finance knowledge; 27% have not saved anything for retirement; 32% have no savings; 60% do not have a budget; and 22% do not pay their bills on time.

In a 2015 report, the FINRA Investor Education Foundation revealed that vast improvement in credit behavior resulted from state-mandated personal finance education. The study evaluated the effect on credit scores and delinquencies over a three-year period in the states of Georgia, Idaho and Texas. Individuals in school during the third year following the inception of the program showed greater benefits from personal finance instruction, with credit scores increasing by 10.89 points in Georgia, 16.19 points in Idaho and 31.71 points in Texas, while ninety-day-plus delinquencies dropped nearly 2% in all three states by the third year. FINRA found that if a rigorous financial education program is carefully implemented, it can improve the credit scores and lower the probability of delinquency for young adults.

The data suggest that financial literacy is more than just a “feel-good” exercise. According to most research, consumers who understand the basics of personal finance are more profitable for the banks and credit unions that provide them with financial education. Individuals who participate in these programs tend to be open to advice from that institution and generally say they’re likely to bring business to them.

The FDIC has shown that partnerships with non-profit organizations and local government agencies are key components in outreach efforts. The FDIC stresses that a well-executed strategy is mutually beneficial to banks, their community partners and consumers. Across the nation, a number of depository institutions work with established groups from the local community to provide financial education. This builds trust and, in turn, educates consumers about the benefits of using banking services and the lasting advantages that a banking relationship offers in gaining access to other financial products.

Promising opportunities exist for banks that are considering developing continuing, sustainable relationships with consumers. Financial institutions not yet participating may wish to explore partnering with various state agencies and/or non-profit organizations in order to support their customer base and ensure the long-term viability of their communities.

Disclaimer: The views and opinions expressed in this article are those of the author(s) and do not necessarily reflect the official policy or position of the Financial Managers Society.

About the Author

Robert Segal is the founder and CEO of Atlantic Capital Strategies, Inc., which provides investment advisory services for financial institutions. He has over 35 years of experience in the banking industry, having worked in several community banks with roles in mortgage banking sales, trading and ALM. Bob is also currently a Director-at-Large on the FMS Board of Directors.

Have a topic you want to share? 

We’d love to hear from you:
Mark Loehrke
Editor and Director, Publications and Research
Direct: 312-630-3421