Industry Insights

October 22, 2018
Hedge Accounting: A Q&A with Dan Morrill and Ryan Henley

As the FASB’s hedge accounting standard continues its march to implementation – and with early adoption permitted – many companies may be wondering exactly how the changes will impact their operations. To get a better handle on life under the new rules, FMS checked in with Dan Morrill of Wolf & Company and Ryan Henley of Stifel for a quick summary of the changes to be aware of and the opportunities that may emerge for financial institutions.

Q: Which institutions would you expect to see the greatest impact from the hedge accounting changes?
A: While the accounting changes improved hedge relationships of several types, the rule changes predominantly allow for much greater flexibility in hedging fixed-rate instruments (converting fixed-rate assets or liabilities to floating), otherwise known as a fair value hedge. Given the current rate environment, the institutions that would benefit most are those at risk of shrinking net interest margins in a continued rising rate environment, as funding costs increase while fixed-rate asset yields remain constant.

Q: What are some of the new opportunities that have emerged as a result of these changes?
A: There are two primary strategies currently employed as a result of the changes. First, institutions are hedging assets that typically are offered in the market in a generic fixed-rate form. Fixed-rate loans or bonds of a longer maturity can now be converted to floating-rate asset classes by entering into an interest rate swap utilizing the new hedging rules. Secondly, institutions can create funding strategies paired with these same hedging alternatives to arrive at a cheaper funding profile for a given interest rate risk position.

Q: For those institutions that were hedging under the old rules, how significant will these changes be?
A: FASB’s effort significantly simplifies the accounting for hedge relationships (in constructing, measuring and monitoring them). For institutions with legacy hedge relationships, this applies both to the future hedging strategies they would employ and to the opportunity to amend existing hedge relationships upon adoption. It will simply lead to a cleaner hedging platform for the institution going forward.

Q: Which change in particular do you see as having the most impact on the operations of banks and credit unions?
A: The ability to now hedge fixed-rate assets (swap fixed-rate instruments to float) gives an institution a tremendous amount of flexibility in product offerings to its client base. If the institution’s core market desires a longer-term fixed-rate product (in the consumer or commercial space), management can originate into this demand and then subsequently adjust the interest rate risk of the product without client involvement and in a clean accounting manner.

Q: Are there any effects of these changes that might be seen as a negative?
A: Generally speaking, most accounting changes carry with them a set of considerations or consequences that are not always favorable. Importantly, ASU 2017-12 was an attempt to rectify previous issues within hedge accounting. As a result, the rule really only improves upon the legacy framework. In our opinion, there are no negative consequences, as it affords greater flexibility than before.

Q: What should institutions be doing to prepare for these changes, or to make sure they’re in the best possible position to take advantage of them?
A: It is important to note that a significant number of institutions are currently early adopting the standard to take advantage of these rules. Why is this so important? Because it is likely that a competitor will be employing the strategies mentioned above due to the added flexibility. There are considerations upon adoption that must be analyzed, such as whether the institution has any legacy hedge relationships. If so, should these relationships be amended upon adoption (leaning on the transition provisions of the rule), and does the institution have investment securities classified as held-to-maturity that are eligible to be transferred to available-for-sale as permitted by the standard? If so, does it make economic sense to do so for each eligible instrument? These are some of the questions to be asking now.

Disclaimer: The views and opinions expressed in this article are those of the authors and do not necessarily reflect the official policy or position of the Financial Managers Society.

About the Authors

Ryan Henley is a Managing Director and the Head of Depository Strategies at Stifel. In this position, he provides ongoing analysis and balance sheet strategies to financial institutions and portfolio managers nationwide, as well as a broad variety of analysis related to economics, interest rates, investments and interest rate risk management strategies.

Dan Morrill is a Principal at Wolf & Co. and is responsible for the firm’s Professional Practice group. In addition to leading the Audit and Accounting Committee, Dan conducts training on technical issues, performs quality control reviews, participates in learning and development initiatives and conducts technical research.

October 15, 2018
Strategic Uses for Customer Profitability
By Brad Dahlman, Sr. Product and Consulting Services Manager, ProfitStars

As accounting/finance professionals, we spend the majority of our time focused on delivering accurate financial reporting, but less time determining how the information will be used by the business units tasked with driving the institution’s success.

As a result, accounting/finance managers often want to have deep conversations about Funds Transfer Pricing (FTP) methodologies or the benefits of full absorption costing rules. While having good business rules is key to providing accurate results, it is equally important to focus on how front-line employees should be using customer profitability data to effectively drive business decisions.

Identification and Protection of Key Clients
In the more than 100 customer profitability installations I have overseen, it has been universally true that over 180% of a financial institution’s profitability comes from the top 20% of clients. This dramatic concentration of profit among relatively few clients demonstrates the importance of identifying these key clients and putting in place strategies to ensure they never leave your institution. While most well-run institutions have a good idea of who many of their top clients are, there are always some surprises – especially with deposit/service clients that don’t go through credit underwriting processes.

In Figure 1, a well-performing $800-million bank shows average profitability for each client in the top 10% as $4,623 annually. Losing any of these clients will certainly hurt, so the organization must:

1. Identify them
2. Assign relationship officers to these key accounts
3. Put in place programs or rewards to ensure the client is satisfied, including providing key profit information to tellers and personal bankers so they can properly address fee waiver requests
4. Track lost “key” clients
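
The identification step above can be sketched in a few lines. This is a hypothetical illustration: the client names and profit figures are invented, and the simple ranking-by-decile logic is an assumption, not the method of any particular profitability system. Note how the top 20% of clients can contribute more than 100% of total profit once unprofitable relationships are netted in.

```python
# Illustrative sketch: rank clients by annual profit and flag the top tier.
# Client names and profit figures below are hypothetical assumptions.

def flag_key_clients(profits, top_share=0.10):
    """Return the highest-profit clients making up the top `top_share` of the base."""
    ranked = sorted(profits.items(), key=lambda kv: kv[1], reverse=True)
    cutoff = max(1, int(len(ranked) * top_share))
    return dict(ranked[:cutoff])

clients = {
    "Acme Mfg": 9200, "Baker LLC": 6100, "Cole Trust": 4400, "Dunn Inc": 2800,
    "Evans": 900, "Frey": 350, "Gage": 120,
    "Hale": -2000, "Ives": -4000, "Judd": -6000,  # unprofitable relationships
}

key_clients = flag_key_clients(clients)              # top 10% -> 1 client here
top20 = flag_key_clients(clients, top_share=0.20)    # top 20% -> 2 clients

# Because the bottom clients lose money, the top 20% contribute
# more than 100% of the institution's total profit.
total_profit = sum(clients.values())
print(key_clients)                                   # {'Acme Mfg': 9200}
print(sum(top20.values()), total_profit)             # 15300 11870
```

Once the key clients are flagged, the remaining steps (assigning relationship officers, feeding profit data to tellers, tracking attrition) are workflow questions rather than calculations.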

Figure 1: Profit Is Highly Concentrated

Effectively Pricing New Transactions
The second use for customer profitability data is in pricing new transactions. As new business requests (loan or deposit) are considered, institutions with a customer profitability system should:

1. Understand current profitability (i.e. “before”)
2. Price the new transaction, considering various pricing scenarios and terms
3. Assess the “after” – or post-approval profitability – to ensure an adequate return (profit/ROE)
4. Provide only those options to the client that meet targeted profitability thresholds
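
The "before/after" check described above amounts to comparing relationship-level return before and after the proposed deal. The sketch below is a hypothetical simplification: the balances, profit figures and the 12% ROE hurdle are invented for illustration, and real systems would allocate capital and cost far more granularly.

```python
# Hypothetical sketch of the "before/after" pricing check.
# All rates, balances and the ROE hurdle are illustrative assumptions.

def relationship_roe(net_income, allocated_capital):
    """Return on allocated capital for a client relationship."""
    return net_income / allocated_capital

def meets_hurdle(current_income, current_capital,
                 new_income, new_capital, hurdle=0.12):
    """Compare relationship ROE before and after adding the proposed deal."""
    before = relationship_roe(current_income, current_capital)
    after = relationship_roe(current_income + new_income,
                             current_capital + new_capital)
    return before, after, after >= hurdle

# Existing relationship: $6,000 annual profit on $40,000 of allocated capital.
# Proposed loan adds $1,500 of profit but requires $20,000 more capital.
before, after, ok = meets_hurdle(6000, 40000, 1500, 20000)
print(before, after, ok)  # 0.15 0.125 True -> deal dilutes ROE but clears hurdle
```

Only pricing scenarios where `ok` is true would be offered to the client under step 4.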

Segmentation and Marketing Strategies
The third major use for customer profitability data is by the marketing department. Accurate customer profitability data is often loaded into CRM/MCIF systems, rather than relying on those systems' rudimentary built-in profitability tools. Once imported, the data is used to segment clients and develop marketing campaigns targeted around both product usage and profitability.

While many CRM/MCIF systems have basic profitability analysis included, it is essential to have one consistent view of profitability for use by finance/business leaders, tellers or other front-line personnel, as well as marketing. As such, data from a sophisticated profitability system should ideally be fed into the CRM/MCIF system.

Evaluation of Relationship Managers’ Performance
The fourth major use for customer profitability data is evaluation of relationship managers’ performance. The concept here is to determine the value of a relationship manager’s portfolio at the beginning and end of the year to assess profit improvement.

In most institutions today, relationship manager goals often revolve around production goals like growing loans and/or deposits. While growth is indeed a positive measure, we also want to make sure these goals align with profitability goals. Without profitability targets, relationship managers will be incentivized to simply “price down” transactions to win business that could negatively affect the institution’s overall financial performance. In other words, not all deals are profitable!

When a relationship manager has profitability growth goals, he or she is encouraged to find ways to make transactions profitable. Access to customer profitability data and effective pricing tools are key elements in this process.
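
The beginning-versus-end-of-year measurement described above is simple arithmetic; the sketch below uses invented portfolio figures purely to illustrate it.

```python
# Sketch of the beginning-vs-end-of-year portfolio measurement.
# Portfolio profit figures are hypothetical assumptions.

def profit_improvement(start_profit, end_profit):
    """Dollar and percentage change in a relationship manager's portfolio profit."""
    change = end_profit - start_profit
    return change, change / start_profit

change, pct = profit_improvement(start_profit=410_000, end_profit=451_000)
print(change, pct)  # 41000 0.1 -> a 10% profit-growth year
```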

Customer profitability systems have been available in the market for many years. However, the number of financial institutions that have accurate, sophisticated customer profitability systems and use them in the manners described above are few.

In my experience, community bank clients who actively use their customer profitability systems have experienced 8-10 basis points of additional profitability over their peers. As your institution considers future growth and profit objectives, it is therefore worth asking this basic question: “Do we provide our front-line staff with the information to effectively engage with clients to grow profitability?”

If the answer to this question is “no,” perhaps 2019 should be the year your organization explores customer profitability and pricing solutions.

Disclaimer: The views and opinions expressed in this article are those of the author(s) and do not necessarily reflect the official policy or position of the Financial Managers Society.

About the Author

Brad Dahlman is a Senior Product and Consulting Services Manager for ProfitStars focusing on the importance and uses for relationship profitability. In addition to developing a customer profitability system in the late 1990s, Brad has personally led the installation of customer profitability solutions at over 100 financial institutions over the past two decades. He has a broad background in banking and has held various positions in finance, audit, operations and technology with several mid-sized community banks.

October 1, 2018
FMS Quick Poll: Views On Fintech
By Financial Managers Society

Thanks to dozens of think pieces and the OCC’s recent decision to offer a new special charter, fintech may be one of the buzziest issues in the banking industry right now. But when it comes to actually working with or competing against these potential new players, just how many FMS members actually have a fintech bee in their bonnets? If the results from our latest Quick Poll are any indication, the buzz is more of a low hum for now.

Among the 76 respondents to our poll (80% from banks/thrifts and 20% from credit unions, ranging in asset size from $199 million or less to $9.99 billion), 18% see financial technology companies solely as potential partners, 12% see them only as competitors and 61% see them as both potential partners and competitors (Figure 1). Meanwhile, a small segment (8%) says fintech players are neither partners nor competitors, and one respondent wrote in to note, with compelling honesty, that their institution was actually not yet sure exactly how to view fintech players.


With a cumulative 79% seeing fintech players as possible partners, it is perhaps not surprising that more than one in four respondents (26%) report already working with a fintech firm on some strategic initiative (Figure 2). Another 39% note that while they are not yet collaborating with a fintech partner, they’re interested in doing so and looking for potential opportunities. The remaining 34%, however, have no plans to partner with a fintech company at this point.


Whether currently working with a fintech partner or just considering doing so, we wanted to know which areas of the institution were most likely to benefit from a fintech collaboration, and P2P payments (63%) and digital account opening (58%) were the runaway favorites, with mobile banking (34%), mobile bill pay (28%) and online bill pay rounding out the top five (Figure 3). Among the 16% of respondents who selected “other” for this question, several see fintech companies as ideal partners for streamlining or digitizing lending services, while other write-ins included “business intelligence” and “customer/member analytics.”


For those respondents already working with fintech partners, when asked what types of initiatives they were focusing on several reported that they are indeed utilizing fintech know-how to streamline and automate their loan processes – from improving loan workflow to building a small business lending portal.

In many respects, FMS members in this Quick Poll echoed the findings from our larger industry-wide proprietary research project from earlier this year. In Community Mindset: Community Bank and Credit Union Viewpoints 2018, nearly half of the 400 community bank and credit union leaders surveyed saw fintech players as both potential partners and competitors, while 32% viewed them only as potential partners and 20% saw them strictly as competitors.

Thank you to everyone who took the time to share their thoughts in this FMS Quick Poll!

September 24, 2018
Evaluating the Liquidity Within Your 1-4 Family Portfolio
By Mark Cary, Sr. Vice President, FTN Financial Capital Assets

As loan-to-deposit ratios and liquidity ratios reach five-year highs and lows, respectively, now is the time to take stock of all of your potential liquidity options. While most liquidity contingency funding plans include options for raising deposits, selling investment securities or obtaining other funding (such as FHLB advances), one often overlooked option as a valuable source of potential liquidity is the 1-4 family first lien mortgage portfolio.

Got Liquidity?
In 2014, the national marketing arm for the dairy industry retired the popular “Got Milk?” ad campaign that featured celebrities in milk moustaches, but let’s borrow a twist on it to highlight the need for liquidity in today’s market. Figure 1 below highlights the increasing loan/deposit ratios for banks. This increase, coupled with an increasing cost of funds, can put a real strain on liquidity (and earnings).

Figure 1: U.S. Commercial Banks and Savings Banks ($250M-$25B) - Loans-To-Deposits Ratio

Cost of Funds Is Also Rising (Significantly)
As Figure 2 indicates, cost of funds – which is comprised largely of interest-bearing deposits – has increased recently, and it does not seem like this will be ending anytime soon. With digital deposits on the rise and the availability of information on the internet about competing deposit accounts, the war for deposits will likely be more intense than it has ever been.

Figure 2: U.S. Banks ($250M-$25B) - Cost of Funds, Trailing 8 Quarters

Traditional Sources May Not Be Enough
While it has yet to be seen, the impending war for deposits may cause financial institutions to consider sources of liquidity outside of their usual arsenal. Normally, a financial institution’s potential remedies for a shortfall in funding include, but are not limited to:

1. Selling Available-for-Sale (“AFS”) securities
2. Raising rates on deposits
3. Tapping non-core sources such as brokered deposits, FHLB advances or other lines
4. Loans from correspondent banks

As rates rise, the costs for implementing the above strategies could get very expensive. A potentially more efficient option is the liquidity that resides within the on-balance-sheet portfolio of 1-4 family mortgage loans.

The 1-4 Family On-Balance-Sheet Portfolio as an Alternative
The 1-4 family loan portfolio represents an often overlooked source of potential liquidity. Of all loan product types, the 1-4 family portfolio often includes the most liquid and most price-efficient loans from a secondary marketing standpoint, and can be segregated into three separate liquidity grades as follows:

Agency Grade
Loans in this category meet all the general criteria for purchase by one of the agencies (Fannie Mae or Freddie Mac) subject to a loan file documentation review. These are loans that were agency-eligible at origination, or that could become agency-eligible as the loans season or as some corrective action is taken.

Private Grade
Loans in this category may fail one or more of the criteria for purchase by the agencies, but are acceptable for purchase by private investors (usually other financial institutions) subject to a loan file documentation review. These loans would be subject to standard secondary marketing guidelines related to FICO score, LTV, DTI, etc.

Portfolio Grade
The loan data fails one or more criteria for purchase in the standard secondary mortgage market. The loan may be a good credit risk and a performing asset, but from an economic perspective, its profile indicates it should be retained in the portfolio rather than being sold in the secondary market. In other words, the price to sell into the secondary market is quite a bit lower than the value to hold the loans to term.

Agency and private grade loans are collectively referred to as “investment grade” and are the most liquid loans within the entire loan portfolio. Over 80% of the loans in the average portfolio meet these criteria, representing a large source of untapped liquidity.
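
The three-tier grading above can be sketched as a simple classifier. This is a hypothetical illustration only: the FICO, LTV and balance thresholds below are invented for demonstration and are not actual agency or private-investor purchase criteria.

```python
# Minimal sketch of the three-tier liquidity grading described above.
# Thresholds (FICO, LTV, conforming balance limit) are illustrative assumptions.

def liquidity_grade(fico, ltv, balance, conforming_limit=453_100):
    """Classify a 1-4 family loan as agency, private, or portfolio grade."""
    agency_ok = fico >= 620 and ltv <= 0.80 and balance <= conforming_limit
    private_ok = fico >= 680 and ltv <= 0.90  # e.g. a jumbo acceptable to banks
    if agency_ok:
        return "agency"
    if private_ok:
        return "private"
    return "portfolio"  # fails standard secondary-market criteria; hold to term

print(liquidity_grade(fico=740, ltv=0.75, balance=300_000))   # agency
print(liquidity_grade(fico=720, ltv=0.70, balance=900_000))   # private (jumbo)
print(liquidity_grade(fico=590, ltv=0.95, balance=150_000))   # portfolio
```

In practice each grade would also hinge on a loan file documentation review, which no data-only screen can replace.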

Strategies to Improve Liquidity Using Whole Loans
As with any strategy involving the balance sheet, it is important to understand all potential options. When evaluating funding strategies, one additional option to consider is the sale of a pool of loans. Strategies utilizing whole loan sales can be structured as follows:

1. Bulk seasoned MBS securitizations using one of the agencies (Fannie Mae or Freddie Mac) as a guarantor
2. Whole loan transactions from one institution to another
3. Participation transactions

Each of these strategies can be accomplished on a servicing-released or servicing-retained basis.

Another “Liquidity” Arrow in Your Quiver
When preparing your Liquidity Contingency Plan, it is important to include the 1-4 family portfolio as a potential source of liquidity along with other more traditional sources. Doing so will provide you with another arrow in your liquidity quiver.

Disclaimer: The views and opinions expressed in this article are those of the author(s) and do not necessarily reflect the official policy or position of the Financial Managers Society.

About the Author

Mark Cary is a senior vice president and loan sales manager for FTN Financial Capital Assets, drawing on his more than 30 years of experience in the financial institution industry to assist clients in developing strategies to better manage their loan portfolios. He is a member of the AICPA and the Tennessee Society of Certified Public Accountants, and is an adjunct professor in the Finance, Insurance and Real Estate Department of the University of Memphis.

September 3, 2018
Manage Vendors to Manage Risk
By Terry Ammons, Systems Partner, Porter Keadle Moore

Banking at its core is the business of managing risk for others. From deposit accounts to payment options and loan products, the entire culture of an institution is centered on identifying, controlling and responding to risk. Despite this, however, one area where the financial services industry is still struggling to succeed is vendor risk management.

Today, third-party vendors are ubiquitous within modern businesses, and financial institutions are no different. Working with technology partners requires institutions to accept a certain level of risk that must be managed both internally and externally. While regulatory, compliance and security issues still resound as top priorities for bankers, when choosing to work with new tech providers, the best approach to risk management is not avoidance, but a deeper understanding that helps the institution identify, prioritize, control and respond to any event that may cause a business interruption.

The hard truth is that responding to risk after a breach or incident has occurred is potentially more expensive – to the bottom line and to reputational brand equity – than implementing the necessary steps to safeguard the institution from the beginning.

Not All Vendors Are Created Equal
While regulatory compliance is not specific to banking, compared to most other industries, banks and credit unions have a much higher bar to reach when developing internal risk management programs. Federal regulators are closely evaluating the institutions they are charged with overseeing, and bankers must be vigilant in holding risk management programs to the highest level of scrutiny. Since a disaster in one area of the bank or credit union can affect the entire institution, risk management is an enterprise-wide concern and should be dealt with as such. This includes incorporating risk management efforts into the institution’s culture, organization, processes, technologies, personnel and physical infrastructure.

The first step toward creating a successful vendor management program is to categorize risk on a sliding scale of priority. Some institutions mistakenly apply the same level of risk to each of their vendors – regardless of the service provided, the level of access granted or the type of data shared. This can be a time-consuming and oftentimes damaging approach, as some vendors pose a larger threat to an institution than others. For example, some vendors will pull more sensitive information from a bank, which naturally necessitates a higher level of scrutiny on the bank’s part. By categorizing vendors based on risks, institutions can help focus their efforts and better ensure that nothing slips through the cracks.
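
The sliding scale described above can be sketched as a simple scoring rule. This is an assumption-laden illustration: the three yes/no factors, their weights and the tier cutoffs are invented for demonstration, and a real program would use a richer questionnaire.

```python
# Illustrative sketch of sliding-scale vendor risk categorization.
# The scoring factors, weights and tier cutoffs are hypothetical assumptions.

def vendor_risk_tier(handles_customer_data, system_access, business_critical):
    """Score a vendor on three yes/no factors and map the score to a tier."""
    score = (3 * handles_customer_data   # sensitive data weighs most
             + 2 * system_access
             + 1 * business_critical)
    if score >= 4:
        return "high"     # full due diligence, recurring audit reports
    if score >= 2:
        return "medium"   # periodic review
    return "low"          # basic monitoring

print(vendor_risk_tier(True, True, True))    # high   - e.g. core processor
print(vendor_risk_tier(False, True, False))  # medium - e.g. network support
print(vendor_risk_tier(False, False, True))  # low    - e.g. critical supplier
```

Tiering vendors this way concentrates the institution's due diligence effort on the relationships that could do the most damage.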

Build Your Safety Net
While an institution may lack direct control over its vendor and their systems, it is nevertheless the institution’s responsibility to ensure that proper safeguards are in place to protect itself, its customers’ information and the integrity of the institution/vendor relationship. After evaluating and determining the risk profile of each vendor, the institution must conduct its own due diligence to ensure that the vendor is upholding its end of the contract.

The vendor bears some responsibility here as well. Regardless of risk assignment, a vendor must provide documentation that demonstrates its security arrangements and controls. While this usually occurs in the beginning of a vendor relationship, institutions should require their partners to provide quarterly and annual reports and analysis of their systems to satisfy not only the institution’s requirements, but its regulators as well.

Ideally, evaluation will be an ongoing effort that does not impede day-to-day operations. After all, even if everything is in place in the beginning of the relationship, those same controls may not necessarily be sufficient in the future. Specialized access to consumer information not only requires protections to be in place, but also to evolve with the changing cybersecurity landscape.

The relationship between vendor and banker needs to be a symbiotic one. For example, banks and vendors alike should work closely to outline the steps necessary to ensure services are restored in the event of an outage, with both organizations assuming responsibility for their part of the equation. To create a comprehensive due diligence program, vendors should provide their own internal and external IT audits to validate the controls they have in place. While this is the ideal, it is too rarely the reality.

Response Tactics
With an extensive range of risk touch points for financial institutions, even seemingly innocuous events such as missing a patch or an employee clicking a malicious email link can lead to enterprise-wide threats. Thus, a bank or credit union’s risk management strategy must also include steps for how to mitigate damage once a breach has occurred. Even with a robust due diligence process and regular audits to ensure compliance, an event can occur – hackers, unfortunately, are still very good at their jobs.

There are a few options to deal with an interruption once it has occurred: remediation, mitigation and acceptance. With an effective risk management and vendor management program in place, these attacks will be limited in scope and occurrence, but still may cause an inconvenience for the institution at the least and a breach of sensitive data in the most severe instances. It is at this point that an institution can learn firsthand where any missteps may have occurred, and if the vulnerability was previously unknown. Of course, every institution wants to avoid this situation, but when and if it does occur, it is certainly better to emerge with more robust controls and an example to assist other institutions in protecting themselves.

There is a balancing act between evolving business requirements and meeting the latest security standards – one that provides little room for error. Integrity of data must be ensured on the vendor’s side, with the institution setting expectations early on in the relationship, and then reevaluating those expectations throughout the life of that partnership. There is no finish line in reaching and maintaining compliance – it is an ever-moving target that requires constant monitoring.

Disclaimer: The views and opinions expressed in this article are those of the authors and do not necessarily reflect the official policy or position of the Financial Managers Society.

About the Author

Terry Ammons, CPA, CISA, CTPRP is Systems Partner at Porter Keadle Moore (PKM), an Atlanta-based accounting and advisory firm serving public and private organizations in the financial services, insurance and technology industries.

August 20, 2018
Model Risk Management: How to Be Prepared in a Data-Driven World
By Meredith Piotti, Internal Audit Senior Manager, Wolf & Company, PC

Financial institutions’ reliance on harnessing the power of data through models and using analytics to create reports continues to increase. As a result, regulatory bodies including the FDIC, OCC and the Federal Reserve have issued guidance and increased criticism within exams regarding proper model oversight.

Reliance on poorly designed models or errors in model output could result in missed opportunities or prevent management from identifying major threats on the horizon. Testing model inputs, calculations and outputs can give an institution’s management confidence that their decisions are being based on reliable information.

Creating a Model Risk Management Program
The first step to a strong program is to have a Model Risk Management Policy that ensures all departments within the institution are applying the same definition and oversight of models. Along with addressing other regulatory requirements, this policy should:

- Identify who is responsible for the oversight and execution of the policy
- Describe the step-by-step process for new model creation
- Classify end-user computations versus models for inclusion in the institution’s model inventory
- Develop a standard model risk assessment framework
- State the frequency and extent of model validation based on risk
- Establish ongoing oversight

Identifying and Assessing Models
Institutions should identify which programs, analytics and end-user computations are in use to compare to the policy’s model definition. An inventory should be created to capture all of these that meet the model definition, with end-user computations catalogued separately. Although end-user computations are not as complex or relied upon to the same degree as models, it is important that they are incorporated in audits to verify the completeness of inputs and the accuracy of calculations.

Each model in the inventory should be risk assessed annually using the institution’s framework. Factors that should be incorporated into this framework include, but may not be limited to:

- Input volatility
- Model use
- Financial impact
- Business decision impact
- Model design

Each model should be given a final risk score that will determine the frequency of required validations.
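
A minimal sketch of such a framework follows. The factor names mirror the list above, but the 1-5 rating scale, the equal weighting and the validation-frequency bands are all assumptions made for illustration; an institution's own framework would set these deliberately.

```python
# Hypothetical sketch: an annual model risk score driving validation frequency.
# Factor names follow the article's list; ratings, weights and bands are assumptions.

FACTORS = ("input_volatility", "model_use", "financial_impact",
           "business_decision_impact", "model_design")

def risk_score(ratings):
    """Average five 1-5 factor ratings into a final model risk score."""
    return sum(ratings[f] for f in FACTORS) / len(FACTORS)

def validation_frequency(score):
    """Map the final risk score to a required validation cadence."""
    if score >= 4.0:
        return "annual"
    if score >= 2.5:
        return "every 2 years"
    return "every 3 years"

# Example: a heavily relied-upon ALM model with volatile inputs.
alm_model = {"input_volatility": 5, "model_use": 4, "financial_impact": 5,
             "business_decision_impact": 4, "model_design": 3}
score = risk_score(alm_model)
print(score, validation_frequency(score))  # 4.2 annual
```

Rerunning the scoring each year, alongside the "annual touch" questionnaire discussed below, keeps the validation schedule aligned with how each model is actually being used.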

Proactively Monitoring Models
In conjunction with the annual risk assessment process, institutions should develop a standard “annual touch” questionnaire. The annual touch should be reviewed with the model owner to determine if there are any changes to the model’s design, oversight and inputs or other additional factors to be considered when determining the model’s validation frequency.

In addition to verbal responses, documented support should be obtained to corroborate responses, including mapping documentation, evidence of model owner review and assumption documentation. The reviewer should also follow up on any prior validation comments to ensure they have been remediated and discuss any user overrides to the model. Significant changes or overrides may result in the need for an earlier model validation.

Model Validation
Historically, regulators have primarily focused on requiring independent validations of automated AML software models only. However, regulatory scrutiny has increased to require that all models have a validation schedule and to verify adherence to that schedule.

Model validations should verify that the model is performing as expected and in accordance with its business use. It is critical that the validation is performed by someone independent of the oversight of that particular model with the appropriate expertise to validate the model. The extent of the validation will depend on the complexity of the model and the potential risks pertaining to the model.

Establishing a comprehensive model risk program can deter future problems and allow management to get back to banking.

Disclaimer: The views and opinions expressed in this article are those of the authors and do not necessarily reflect the official policy or position of the Financial Managers Society.

About the Author

Meredith Piotti is a Senior Manager in the Internal Audit Services group at Wolf & Company, responsible for overseeing the firm’s data analytics group and delivering internal audit services to financial institutions.

August 10, 2018
Securing the Most Favorable Prices for Securities Transactions
By Robert Segal, CEO, Atlantic Capital Strategies, Inc.

Banking regulations require that financial institutions implement robust systems to monitor, manage and control risks related to investment activities. The agencies state further that effective management of the risks associated with securities represents an essential component of safe and sound practices. The FDIC, for example, says it is prudent for management to fully understand all relevant market and transaction risks. Accordingly, management has the responsibility to put systems in place to assure that all reasonable efforts are made to obtain the most favorable price for each securities transaction.

The market for fixed-income securities has evolved significantly in recent years. However, according to the Financial Industry Regulatory Authority (FINRA), the amount of “pre-trade” pricing information (bids and offers) is still relatively limited as compared to equities, and generally not readily accessible by the investing public. While new technology and communications in the fixed-income market have advanced, the market remains decentralized, with much trading still occurring on an over-the-counter basis.

Compared to equities, transaction costs for fixed-income securities remain stubbornly high. Academic studies have shown that transaction costs for even small orders of equities are a few pennies per share, while commissions for corporate and municipal bonds can be several dollars per $100 of bond principal value, or one hundred times higher or more.

Approximately ten years ago, the SEC instituted a “post-trade” reporting system that distributes information about bond transactions. Under the program, dealers are required to report, with a 15-minute delay, the price and quantity of every transaction. Corporate bonds were the first sector in the platform, followed by municipals and agencies, and more recently, Treasuries and mortgage securities. This innovation improved transparency by allowing investors to obtain more current information about market values.

In a recent regulatory notice, FINRA reiterated its commitment to best execution as a key investor protection requirement. The agency noted that in light of the evolving nature of fixed-income markets, brokerage firms need to regularly review their procedures to ensure they are designed to incorporate and reflect best execution principles, as the broker is “under a duty to exercise reasonable care to obtain the most advantageous terms for the customer.”

FINRA requires that brokerage firms establish, maintain and enforce robust supervisory procedures and policies regarding “regular and rigorous reviews” for execution quality. As part of its own regulatory reviews, FINRA conducts statistical analyses, establishing pricing parameters for comparison to other transactions in the same security. In fact, if certain transactions show a meaningful variance, FINRA may deem the firm to be in violation of best execution principles.
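The variance check described above can be illustrated with a minimal sketch: compare one execution price against other reported transactions in the same security and flag it if it deviates by more than a chosen number of standard deviations. This is a hypothetical illustration, not FINRA's actual methodology; the function name, the z-score approach, and the threshold are all assumptions.

```python
from statistics import mean, stdev

def flag_price_outlier(trade_price, peer_prices, z_threshold=2.0):
    """Flag a trade whose price shows a meaningful variance from other
    reported transactions in the same security.

    Hypothetical check loosely modeled on the statistical reviews
    described above; the z-score test and 2.0 threshold are assumptions.
    """
    if len(peer_prices) < 2:
        return False  # not enough comparable trades to judge
    mu = mean(peer_prices)
    sigma = stdev(peer_prices)
    if sigma == 0:
        return trade_price != mu  # peers identical: any deviation stands out
    z = abs(trade_price - mu) / sigma
    return z > z_threshold
```

In practice a real review would also weight trades by size and recency, since a small retail trade and a large institutional block in the same bond can print at legitimately different levels.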

It is important to keep in mind that best execution does not always mean the lowest possible price. In its Trust Examination Manual, the FDIC said management should consider other factors when determining the quality of execution, including quality of research provided, speed of execution and certainty of execution. Regulators also recognize that obtaining quotes from too many sources could adversely affect pricing due to delays in execution and other factors.

Given the regulatory environment and improvements in transaction reporting, bank management may wish to implement a “back-testing” program to assure that the institution is receiving the most favorable prices for securities transactions. This surveillance tool could compare the institution’s pricing to prevailing market prices at the time of the trade, while also analyzing the bid/offer spread to confirm that the transaction “mark-up” was fair and reasonable.
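A back-testing program along these lines reduces to a simple comparison: compute each trade's mark-up over a prevailing reference price (for example, a contemporaneous price from a post-trade reporting feed) and flag trades whose mark-up exceeds a review threshold. The sketch below is illustrative only; the trade-record layout, the basis-point convention, and the 100 bps threshold are assumptions, not a prescribed standard.

```python
def markup_bps(execution_price, reference_price):
    """Mark-up of an execution over a prevailing reference price,
    expressed in basis points of the reference price."""
    return (execution_price - reference_price) / reference_price * 10_000

def backtest(trades, max_markup_bps=100):
    """Return trades whose mark-up exceeds the review threshold.

    Each trade is a (trade_id, execution_price, reference_price) tuple;
    in practice the reference price would come from a post-trade
    reporting feed at the time of the trade. Layout and threshold
    are hypothetical.
    """
    flagged = []
    for trade_id, exec_px, ref_px in trades:
        bps = markup_bps(exec_px, ref_px)
        if abs(bps) > max_markup_bps:
            flagged.append((trade_id, round(bps, 1)))
    return flagged
```

A treasurer could run such a report quarterly and aggregate the results by broker, which feeds directly into the dealer-performance evaluation discussed below.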

A direct benefit is that the financial institution should see improved profitability as it routes business to brokerage firms that provide the lowest overall transaction costs. Corporate governance can be enhanced as risk management policies and procedures continue to be strengthened.

As the programs evolve, bank treasurers can ultimately establish a system for evaluating broker/dealer performance. The FDIC requires that financial institutions develop and approve an effective vendor management program framework. What the FDIC is looking for, according to industry observers, are well-defined documentation processes. The regulators see vendor risk management as needing continual monitoring and ongoing risk assessments.

Finance officers typically scrutinize the P&L in a finely-tuned manner. At the same time, most bankers acknowledge they don’t know what they’re paying in brokerage costs for securities transactions. Transaction costs can vary greatly based on the scope of the transaction and access to the most liquid dealers. A review of individual transactions indicates that investors may be “leaving a lot of money on the table.” Thus, a more diligent approach toward trading efficiencies could help support the bottom line.

Disclaimer: The views and opinions expressed in this article are those of the authors and do not necessarily reflect the official policy or position of the Financial Managers Society.

About the Author

Robert Segal is the founder and CEO of Atlantic Capital Strategies, Inc., which provides investment advisory services for financial institutions. He has over 35 years of experience in the banking industry, having worked in several community banks with roles in mortgage banking, sales and trading and ALM. Bob is also currently a Director-at-Large on the FMS Board of Directors.

Have a topic you want to share? 

We’d love to hear from you:
Mark Loehrke
Editor and Director, Publications and Research
Direct: 312-630-3421