Intelligence for Risk & Control Professionals (1LoD) and our good friends hosted an event earlier this year to share observations specific to the 1st line of defense. Here are my key findings from the event and the report that followed.
The concept of resetting the first line of defense for the risk and control function was a dominant theme. That’s a big transformation: moving those in the field from an evidence-gathering and reporting function to one that proactively anticipates risk, then empowering the 1st line to draft plans and actions to mitigate the potential implications of the risks identified.
Historically, the 1st line function has been relegated to a reactive position. However, risk management is a vital strategy for every organization and should be regarded as an equal partner in all proactive aspects of the business. Most regulators and banks already take a proactive stance on assessing risk and creating mitigation strategies in response to the potential implications of a breach. Yet the experts tasked with upholding the first line of defense are still not welcomed at the table – unless something has gone wrong. In a nutshell, the 1st line desires and deserves to be regarded more highly than it currently is.
- The first line of defense is more than an admin and evidencing function – it’s a critical component of operational risk management
- Nearly half (47%) of bankers surveyed feel the 1st line needs to be overhauled to be inclusive of assessing non-financial risk
- The current risk and control self-assessment (RCSA) process requires revision
- 63% of bankers are concerned that change equates with increased investment
- 46% of banks prioritize control automation
- Organizational challenges and inadequate bank collaboration reduce the effectiveness of the 3-lines model
- 58% of 1st line staff worry about covering a growing scope
Experts lamented being ill-prepared for surprises. The 1st line is interested in leveraging automation and MIS to better inform staff about controls and the operating environment. It’s currently unclear how an automated control could – with accuracy – flag a meaningful change in a risk profile versus mere noise in the system. Perhaps samples of historical data can be analyzed with performance analytics and penetration testing to uncover true signals of risk.
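As a thought experiment on what "meaningful versus noise" could look like in practice, here is a minimal sketch in Python. It is not anything discussed at the event – all names and figures are hypothetical – but it illustrates one common approach: baseline a control's historical exception counts and flag only deviations that are statistically unusual.

```python
import statistics

def flag_signal(history: list[int], latest: int, z_threshold: float = 3.0) -> bool:
    """Return True if the latest control-exception count deviates from the
    historical baseline by more than z_threshold standard deviations,
    i.e. looks like a genuine shift in the risk profile rather than noise."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        # A perfectly flat history: any deviation at all is worth flagging.
        return latest != mean
    return abs(latest - mean) / stdev > z_threshold

# Hypothetical daily exception counts for one automated control.
baseline = [2, 3, 1, 2, 4, 3, 2, 2, 3, 1, 2, 3, 2, 4, 3]

flag_signal(baseline, 3)   # within normal variation: routine noise
flag_signal(baseline, 15)  # far outside the baseline: worth escalating
```

A fixed z-score threshold is deliberately simplistic; real deployments would need to handle seasonality, regime changes, and controls with sparse histories.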
Overhauling the RCSA
Attendees pointed out numerous shortfalls in the RCSA process as it stands today. Too many risks go unrecorded, and a full revision was suggested by many. It’s a delicate subject given that “things have been done this way for so long.” Change is hard for most organizations.
But not everyone supports the move away from episodic risk evaluation toward a combination of annual, daily, hourly, and on-demand assessments of risk and control effectiveness. Yet transforming the 1st line in this way would give firms real-time insight into any potentially heightened risk – and perhaps even give their executive teams the bit of lead time they need, however small, to act on that warning before it’s too late. Risks could also be bucketed with greater precision according to potential severity and implications. Even record-keeping around the RCSA process was called out as needing reform.
One chief control officer said, “It’s not as robust as we might want it to be. It needs a revamp. The business is only going to get more and more complex.”
One of the most challenging aspects of overhauling the RCSA is not knowing what we don’t know. New risks are emerging all the time, but those risks aren’t known until after the fact. Long-tail events are another matter. Paul Ford, of Acin, described the “two horizon model.” He highlighted how risks may be on the horizon of other firms, but unknown to your firm. At a higher level, the industry overall may have limited awareness of some of these risks but has not galvanized itself around them to collectively put those risks on the horizon for all.
Of course, the pandemic offers a classic example of how ill-prepared firms were for business continuity despite having disaster recovery plans and drills. The catch? No firm, as far as anyone knows, was prepared for immediate, overnight disaster recovery across its entire enterprise.
One theme that didn’t surface directly but was implied was the notion of competitive differentiation versus “the collective good of the industry.” When it comes to horizon scanning, there is an industry platform collating control details and risk assessments from more than a dozen unidentified banks. Anonymity is essential to foster candid collaboration – and it’s a way for everyone to learn from each other’s gaps and errors.
Another form of horizon scanning requires creating an internal, specialized risk identification team tasked with translating legal and regulatory risks for every new global regulatory change. But not everyone is willing to share lessons learned or ideas for risk mitigation, and some 1st line experts pushed back on having all the horizon scanning put on their shoulders when that responsibility could – and possibly should – be a 2nd line function.
The Technology Gap
Fewer than half of the attendees believed there to be any automation in their controls. A desire for better data analytics and enhanced data quality was a common theme. Notably, most participants (74%) felt that more effective controls outweigh their cost. The tightrope is that more technology is needed, and more money is needed to get that tech – but the money isn’t flowing.
One concept that emerged from the discussion is a shared need for language or documentation to justify investments in technology. Requesting a budget for new tech is only one side of the equation; the darker side is navigating discussions around the cost of retiring legacy systems into obsolescence while still being able to pull data from them if audited and required to do so. You may be surprised at the seemingly “low-tech” requests: ETL (Extract, Transform, Load) tools, process automation, and intelligent workflow tools. The general consensus was that augmenting existing legacy systems with some of these operational improvement tools (automating scheduling and routine tasks) may offer a good interim step on the path to making legacy systems redundant.
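To make the “low-tech” ETL request concrete, here is a minimal sketch in Python, assuming a hypothetical legacy system that exports control data as CSV with inconsistent whitespace and casing. Every field name and value below is invented for illustration; the point is simply that the extract-and-transform step can be small, auditable code rather than a large platform purchase.

```python
import csv
import io

def extract_transform(raw_csv: str) -> list[dict]:
    """Extract rows from a legacy CSV export and normalize them into
    clean records ready to load into a reporting store."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    records = []
    for row in reader:
        records.append({
            # Strip the stray whitespace the legacy export leaves behind.
            "control_id": row["Control ID"].strip(),
            # Standardize status values to lowercase.
            "status": row["Status"].strip().lower(),
            # Coerce exception counts to integers, defaulting to 0.
            "exceptions": int(row["Exceptions"].strip() or 0),
        })
    return records

# A hypothetical legacy export with messy formatting.
legacy_export = """Control ID,Status,Exceptions
 CTRL-001 ,PASS,0
 CTRL-002 ,Fail,3
"""

clean = extract_transform(legacy_export)
```

Keeping the transform logic in version-controlled code also addresses the audit concern above: the rules that cleaned the legacy data remain inspectable even after the source system is retired.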
Block-and-tackle tech like data cleansing tools seems like an obvious area for investment. However, 1st line experts note that those tools mask the bigger problem: a lack of standardized data. Additionally, the sheer volume and level of “noise,” particularly in messy, rich data like e-communications, exacerbates the challenge of implementing a smarter risk management solution. Attendees advocated for enterprise-level commitment and adherence to better data capture. Designing solutions upfront and regularly refreshing them to keep up with the growing number of data sources (like Reddit, YouTube, RingCentral, and other digital forums where brokers and customers interact) is not as straightforward as it sounds. But all of the above is critical for end-to-end surveillance to be effective.
More Work to Be Done
Fundamentally, things need to change across the industry. When it comes to developing and applying a structured approach to 1st line efforts, 25% of financial firms still have this in progress, over half of the firms surveyed have only a partial or incomplete assessment of the need, and 60% admit they cannot solve 1st line problems due to a lack of clearly defined risk and control responsibilities.
See you at XloD 2022!