While I remain doubtful this technology will prove as beneficial as described in this Central Banking article, it is a clever approach to reducing the strain caused by the ever-increasing number of regulations. The idea is to use natural language processing (NLP) tools to understand a regulation and translate it into more actionable, machine-readable formats. If you are unfamiliar with how NLP can accomplish this, the article may help. It identifies a few ways these tools can be used:
“Reporting institutions can often find it difficult to meet these obligations; it requires significant effort to navigate and interpret regulation and there is often a need to rely on external professional services providers to understand what information the regulator needs and when.
Firms then implement and codify these interpretations into their in-house regulatory reporting systems. Each firm does this manually, which creates the risk of different interpretations and inconsistent reporting.
“Whichever way you look at it, there are currently a lot of inefficient processes that try to close the gap between what the handbooks are trying to achieve and what is actually reported,” says PJ Di Giammarino, chief executive of regulatory analysis firm JWG.
Market solutions
A number of regtech firms have attempted to make these processes simpler.
Speaking at an event in London in February this year, Mark Holmes, chief executive of tech firm Waymark, explained how artificial intelligence (AI) can be integrated into existing systems to scan and dissect the reams of regulation sent to firms daily.
“AI can help connect firms to relevant information, and can aggregate data to then break regulation down into a universal language,” he said. Waymark’s solution applies a natural-language processing system that sits within a firm’s current system and parses through the regulation documents, effectively translating them into a marked-up HTML file.
Firms are then able to discern which parts of the regulation are applicable to them and send it to the right part of the business to be implemented in whichever way it sees fit.
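To make the approach concrete, here is a minimal sketch of the kind of mark-up step described above. The topic keywords, HTML attributes, and tagging logic are my own illustrative assumptions; a production system such as Waymark's would rely on trained NLP models rather than simple keyword lists.

```python
# A minimal sketch, assuming a plain-text regulation and a hand-made topic
# keyword map (both hypothetical), of marking up regulatory text so that
# relevant sections can be routed to the right part of the business.
import html

TOPIC_KEYWORDS = {
    "reporting": ["report", "submission", "return"],
    "capital": ["capital", "buffer", "liquidity"],
    "conduct": ["client", "disclosure", "fair treatment"],
}

def tag_paragraph(text: str) -> list[str]:
    """Return the topics whose keywords appear in the paragraph."""
    lowered = text.lower()
    return [topic for topic, words in TOPIC_KEYWORDS.items()
            if any(word in lowered for word in words)]

def to_marked_up_html(paragraphs: list[str]) -> str:
    """Wrap each paragraph in a <p> element annotated with its topics."""
    parts = []
    for para in paragraphs:
        topics = ",".join(tag_paragraph(para)) or "untagged"
        parts.append(f'<p data-topics="{topics}">{html.escape(para)}</p>')
    return "\n".join(parts)

if __name__ == "__main__":
    sample = [
        "Firms must submit a quarterly liquidity report to the regulator.",
        "Client disclosures must be issued before the product is sold.",
    ]
    print(to_marked_up_html(sample))
```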
New start-up Covi Analytics offers a similar solution with its product Cmile. Like Waymark, Cmile dissects the information within regulatory documents and extracts the relevant sections based on a customer’s specific requirements.
The information is then compiled onto a dashboard and colour-coded to allow financial institutions to see whether a piece of regulation has been enacted by the relevant department.
The software can also provide industry benchmark information to highlight to financial institutions where they rank among their peers in terms of compliance. However, chief executive of Covi Analytics Waleed Sarwaar says this is heavily dependent on more institutions using the software to get an accurate reading.
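For illustration, a colour-coded compliance dashboard of this kind can be reduced to a very simple data structure. The sections, departments, statuses, and colour mapping below are hypothetical, not Cmile's actual data model.

```python
# A hypothetical sketch: each extracted regulatory requirement is assigned to
# a department and a status, and the status maps to a red/amber/green flag.
STATUS_COLOURS = {"not started": "red", "in progress": "amber", "enacted": "green"}

requirements = [
    {"section": "Transaction reporting", "department": "Operations", "status": "enacted"},
    {"section": "Client disclosures", "department": "Compliance", "status": "in progress"},
    {"section": "Liquidity buffers", "department": "Treasury", "status": "not started"},
]

# Render a simple text dashboard showing which requirements still need attention.
for req in requirements:
    colour = STATUS_COLOURS[req["status"]]
    print(f'{req["section"]:<22} {req["department"]:<12} {colour.upper()}')
```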
Intelligent regulation
One firm, however, has gone a step further and taken the tech to the regulator.
In September 2017, the FCA became the first regulator to publish an intelligent regulatory handbook. The handbook, which is used by thousands of regulated financial institutions and their advisers daily, is more than 20,000 pages long and contains binding regulatory obligations and guidance for firms. Partnering with Corlytics, the FCA sought to “democratise the handbook”, making it more accessible. In doing so, it hoped to transform the handbook from a legal document to a fully searchable database.
“We put a metadata structure – much like that used by Google – in place, transforming the handbook from a comprehensive legal index to a highly accessible tool for all users,” Corlytics’ Byrne explains.
The software essentially tags words and phrases with a central taxonomy, making it machine-readable; 3,000 metadata tags were added to the original text.
“The teams have gone to different sections of the handbook and machine-learnt them. Then, using a combination of regulatory lawyers and data scientists, they have auto-tagged the rest of the handbook,” Corlytics said in a statement at the time.
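The "tag a subset by hand, then auto-tag the rest" workflow can be sketched with a standard supervised text classifier. Assuming scikit-learn is available, the handbook snippets, taxonomy labels, and model choice below are purely illustrative and are not Corlytics' actual method.

```python
# A minimal sketch of training on hand-tagged handbook sections and then
# auto-tagging the rest. The sections and taxonomy labels are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Sections tagged manually (in practice, by regulatory lawyers).
labelled_sections = [
    ("Firms must hold capital against operational risk.", "prudential"),
    ("Financial promotions must be clear, fair and not misleading.", "conduct"),
    ("Transaction reports must be submitted by close of the next working day.", "reporting"),
    ("A firm must maintain a liquidity buffer of high-quality assets.", "prudential"),
]
texts, tags = zip(*labelled_sections)

# Train a simple text classifier on the hand-tagged sections...
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, tags)

# ...then auto-tag the remaining, untagged sections of the handbook.
untagged = ["Firms must report suspicious transactions without delay."]
print(dict(zip(untagged, model.predict(untagged))))
```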
A similar approach is used in certain sectors of the medical profession – most notably in cancer research. By analysing the text, using machine-learning analytics, oncology research has made great strides in the diagnosis of certain forms of cancer. In one approach, a machine is ‘trained’ using a dataset of sample images of tumours that have been classified by a physician. The computer uses the classification information to develop its own pattern-recognition criteria with which to identify tumour types.
“At Corlytics, we have moved into the same building as a lot of specialist medical data scientists to better understand what they do. Using trained models, we are able to teach them how to understand and interpret the data,” Byrne explains.
“To best do this you need subject experts who can program and understand analytics, working alongside data scientists,” he adds. “Lawyers – in our case – who can code; we have swapped the oncologists with regulatory lawyers. Their training makes for consistent and accurate analytics.”
Corlytics’ solution is the first step towards standardised regulation, an initiative that, if devised on a global scale, could exponentially reduce the regulatory burden.
According to Hardoon, for standardised regulation to be implemented, the industry would require a common understanding of data – a centralised data taxonomy and data model could be one option to achieve this.”
And that last paragraph identifies the core problem: semantics. People assign names to data elements and attempt to describe how each element should be constructed. However, it is almost impossible to provide an unambiguous description that assures another enterprise will construct that data element the same way. Small differences in business process can make such data mappings impossible. Several efforts have attempted to define data elements by specifying the business process model that creates them, but when that isn't the process a company actually uses, it may be extremely difficult to modify that process to become compliant.
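A tiny, hypothetical example shows how this ambiguity bites: two firms both report a field with the same name, but populate it from different points in their business process, so a mapping based on the field name alone silently conflates two different concepts.

```python
# Hypothetical illustration of the semantics problem: both firms report a
# field called "exposure_date", but construct it from different process steps.
from datetime import date

# Firm A records the date the trade was executed.
firm_a_record = {"exposure_date": date(2018, 3, 1), "amount": 1_000_000}

# Firm B records the date the trade settled, two days later.
firm_b_record = {"exposure_date": date(2018, 3, 3), "amount": 1_000_000}

# A regulator mapping both fields by name alone would treat these as the same
# concept, even though the underlying business processes differ.
for name, record in (("Firm A", firm_a_record), ("Firm B", firm_b_record)):
    print(name, record["exposure_date"].isoformat(), record["amount"])
```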
Overview by Tim Sloane, VP, Payments Innovation at Mercator Advisory Group