Without a consolidated viewpoint on what new risk data requirements mean, firms will be at a loss when it comes to determining best practice.
The deadline for firms to upgrade their risk data aggregation capabilities is fast approaching. The Basel Committee on Banking Supervision’s Principles for Effective Risk Data Aggregation and Risk Reporting are due to be implemented at the start of 2016, but research shows that the industry is well behind schedule. What is needed is a consensus industry view on what ‘good’ looks like, so that firms can determine their own roadmap to implementation.
Separate polls carried out by JWG and the Institute of International Finance, in 2011 and 2012 respectively, showed that more than half of firms had identified significant work still to be done in improving their ability to collect and report risk data. And this was before the regulatory driver was introduced with the issue of the BCBS’ Principles in January 2013.
[accordion][pane title="Known Unknowns"]
- Will firms and regulators come to an agreement on a set of risk data aggregation standards?
- Will regulators respond to calls for further guidance?
- What will the penalties be for firms that fail to make improvements by 2016?
Even now, with 18 months left on the clock, firms are still in the opening stages of implementation. This slow progress is partly due to the high-level nature of the Principles, which focus on qualitative policy aims rather than detailed, quantifiable standards. As a result, the Principles can mean different things to different firms. For instance, even their scope is open to interpretation: The Principles set the threshold for data inclusion (i.e., the tipping point at which data is in or out of scope) at ‘critical’ or ‘material’ data. This will mean something different to every firm, and may even mean different things across departments within the same firm; in other words, there is no such thing as a view from nowhere.
The way to overcome this relativity problem is to establish an objective industry standpoint from which firms’ capabilities can be judged and measured against one another. For this reason, it is necessary to look at the Principles from an operational impact perspective. To achieve this, there are five operational ‘lenses’ through which all of the requirements can be viewed and compared between firms.
The first of these is ‘scope’: What is the scope of application within your firm, and what will the impact area look like? Drawing conclusions here requires three measures:
- The depth of scope: where will the ‘criticality’ and ‘materiality’ thresholds be set within the firm?
- The breadth of explicit requirements: where will your firm apply service level agreements, and what data will have to be reconciled?
- The breadth of implicit requirements: will any of the changes have an effect on the bank’s central reference data, for instance?
The second lens is ‘data quality’: The Principles require firms to measure and improve six aspects of data quality: timeliness, accuracy, completeness, consistency, flexibility and adaptability. Firms will have to decide how they will set objective measures for all of these and then how high to set the bar. The operational implications will then become clearer: How will data reconciliation be incorporated into workflow? Will a systems upgrade be necessary in order to achieve intraday aggregation?
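To illustrate how such measures might be made objective, the minimal Python sketch below computes two of the six aspects, completeness and timeliness, over a handful of records. The record layout, field names, figures and cutoff are all invented for illustration; the Principles prescribe the quality dimensions, not these metrics.

```python
from datetime import datetime

# Hypothetical risk-exposure records; field names and values are illustrative only.
records = [
    {"counterparty": "ACME", "exposure": 1_250_000.0, "as_of": datetime(2014, 6, 2, 8, 30)},
    {"counterparty": "GLOBEX", "exposure": None, "as_of": datetime(2014, 6, 2, 9, 10)},
    {"counterparty": "ACME", "exposure": 1_250_000.0, "as_of": datetime(2014, 6, 1, 17, 0)},
]

def completeness(rows, field):
    """Fraction of records with a populated value for `field`."""
    return sum(r[field] is not None for r in rows) / len(rows)

def timeliness(rows, cutoff):
    """Fraction of records delivered on or after an agreed cutoff."""
    return sum(r["as_of"] >= cutoff for r in rows) / len(rows)

cutoff = datetime(2014, 6, 2, 0, 0)
print(f"completeness(exposure) = {completeness(records, 'exposure'):.2f}")
print(f"timeliness             = {timeliness(records, cutoff):.2f}")
```

Once a firm has chosen metrics like these, ‘setting the bar’ becomes a matter of agreeing threshold values (e.g. completeness of at least 0.99 by a given cutoff) against which breaches can be escalated.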
The third lens is ‘data standards’: What new standards, metadata and linkages will have to be implemented to meet the firm’s target operating model and to ensure compliance? For instance, the Principles ask for unique counterparty identifiers to be implemented. Will this involve full LEI implementation (before we even have an EU mandate)?
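One attraction of the LEI is that part of the standard is already mechanically checkable: under ISO 17442, an LEI is 20 alphanumeric characters whose final two check digits validate under the ISO 7064 MOD 97-10 scheme (the same scheme used for IBANs). A minimal Python sketch of that check follows; the function names and the base identifier in the usage note are our own, not part of the standard.

```python
def _to_number(s: str) -> int:
    """Map A-Z to 10-35 (digits unchanged), concatenate, and read as an integer."""
    return int("".join(str(int(c, 36)) for c in s.upper()))

def append_check_digits(base18: str) -> str:
    """Compute the two ISO 7064 MOD 97-10 check digits for an 18-char LEI base."""
    check = 98 - _to_number(base18 + "00") % 97
    return base18.upper() + f"{check:02d}"

def lei_checksum_valid(lei: str) -> bool:
    """True if a 20-char candidate LEI passes the MOD 97-10 check."""
    return len(lei) == 20 and lei.isalnum() and _to_number(lei) % 97 == 1
```

Note that this only verifies the check digits; whether an identifier has actually been issued, and to whom, still requires a lookup against the LEI reference data itself.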
The fourth lens is ‘infrastructure and controls’: Firms will need to be able to monitor breakdowns in their risk data aggregation chains in order to remedy them effectively. For instance, firms are required to find a balance between automated and manual processes. However, with other requirements asking for accurate data delivered within short time frames, the continued use of manual systems, including spreadsheets, may become untenable without upgrades to processes and end-user controls.
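To make ‘monitoring breakdowns’ concrete, a control over a manual process could be as simple as reconciling a spreadsheet extract against an automated golden source and reporting the breaks. The sketch below assumes two hypothetical exposure feeds; the names and figures are invented for illustration.

```python
# Golden source (automated) vs. a manual spreadsheet extract; values are illustrative.
golden = {"ACME": 1_250_000.0, "GLOBEX": 730_000.0, "INITECH": 90_000.0}
manual = {"ACME": 1_250_000.0, "GLOBEX": 735_000.0}

def reconcile(source_a: dict, source_b: dict, tolerance: float = 0.0) -> dict:
    """Return breaks: keys missing from either side, or values differing beyond tolerance."""
    breaks = {}
    for key in source_a.keys() | source_b.keys():
        a, b = source_a.get(key), source_b.get(key)
        if a is None or b is None or abs(a - b) > tolerance:
            breaks[key] = (a, b)
    return breaks

for name, (a, b) in sorted(reconcile(golden, manual).items()):
    print(f"BREAK {name}: golden={a} manual={b}")
```

The point of such a control is not the arithmetic but the audit trail: each break becomes a recorded, attributable item to be remediated, rather than a silent divergence between systems.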
Finally, the fifth lens is ‘governance and incentives’: Though it has its own section in the Principles, governance cuts across all the requirements as the buck will ultimately stop with the board and senior management. Implementing new technology and procedures is also futile without enforcing the adoption of these new practices across the firm. Therefore, banks will have to examine their incentive frameworks in order to make sure they aren’t rewarding poor data practices.
These lenses offer only an introductory step towards thinking about risk data aggregation in an objective way. To complete the process, and establish an objective and universal view on what ‘good’ looks like, JWG is currently talking to banks and regulators about the possibility of clear industry guidance in relation to the Principles. With this in mind, we encourage banks currently implementing the Principles to get in touch about our preliminary peer assessment.
- New initiatives that bridge Risk and Data are struggling to find sponsorship within firms
- Firms are having difficulty interpreting the operational impact of high-level principles
- Industry guidance is necessary to prevent macro systems changes having to be repeated in the near future
[pane title="Top Alerts"]
- Risk Data Aggregation: Minutes from our successful meeting with the PRA over further guidance now online
- Big data: Can new approaches to data capture, storage and retrieval help firms to meet Basel III requirements?
- FSB Data Gaps Initiative clears Phase 1: Framework to be expanded to bilateral funding dependencies and consolidated balance sheet reporting[/pane]