
AML Transaction Monitoring Data Governance & Controls:
Challenges, Risks and the Approach to Meet Audit & Regulatory Requirements
Contents

Executive summary

AML Transaction Monitoring Overview

AML Structures & process

Transaction Monitoring Framework

Challenges & Upcoming Regulations

Risks

Approach to Transaction Monitoring Governance

Data Governance Policies & Standards

Guiding Principles for Transaction Monitoring Data Quality & Controls

Data Control Framework

Role of Assurance & Independent Testing

Approach to Audit TM Data Governance & Controls


Executive Summary
AML Transaction Monitoring

AML compliance centers on sifting through thousands of transactions and matching them
against risk profiles. The result of this process is a focused examination of transactions and
identification of suspicious transactions.

Financial institutions create a customer risk profile against which they measure specific
transactions as they occur. This customer profile is used as a measuring stick against which
transactions can be tested in order to identify specific transactions for in depth examination.

The basic purpose of having a strong AML transaction monitoring system is to identify and
protect the institution from any transactions that may lead to money laundering and terrorist
financing and result in the institution filing relevant Suspicious Activity Reports (SARs).

Most financial institutions rely on AML technology software to cull the transactions and pick
out the potentially suspect transactions. Some smaller institutions use manually designed
systems. Automated AML solutions include sanctions/black list screenings, customer
profiling, and comprehensive transaction monitoring with reports/alerts.

Transactions are monitored based on a customer profile and specific details relating to that
customer. The monitoring rules can reflect a number of factors relating to that customer (e.g.
aggregate transactions, type, amount, frequency, business).

When a transaction is flagged, a notice has to be generated and a procedure for resolving the
red flag has to be defined and enforced. For flagged transactions, AML staff have to
investigate the specific circumstances surrounding the transaction. High-risk products, areas
of operation, business lines and basic customer information can influence the amount of
transaction testing.

Financial institutions create a centralized investigative unit to follow up on flagged
transactions. A centralized unit can develop standard protocols for investigations and build
databases which consolidate information learned during each investigation.

An investigation can lead to the filing of a SAR, an important source of information for law
enforcement agencies to initiate enforcement actions. Investigators collect all relevant
information, prepare a report and submit the report to a manager for review and a decision on
whether or not to file a SAR.
AML Structures and Processes:

1. Know your customer procedures are the tools that help financial institutions gain a detailed
understanding of their customers, including their identity, citizenship status, occupation,
source of funds, volume and type of expected activity, countries with which they do business,
etc. By collecting this information and keeping it continually updated via transaction
monitoring, companies are able to assign their customers into high-, medium-, and low-risk
categories and apply further due-diligence as appropriate.

2. Surveillance processes allow banks to monitor for money laundering typologies: people
moving money inside and outside the bank very quickly; a pattern of “structuring,” in which
a customer continually makes deposits just below the reporting threshold; a single beneficiary
receiving money from multiple originators; customers who are depositing large sums and
making wire transfers to high-risk countries; and so on. Surveillance also typically includes
Office of Foreign Assets Control (OFAC) screening, in which bank customers’ names are
compared against lists of known terrorists and other high-risk individuals.

3. Investigations and reporting efforts are based on KYC and surveillance data. Once a
customer or transaction has been flagged, it goes through a case management workflow to
manually investigate the cases and file suspicious activity reports (SARs) to the Treasury
Department’s Financial Crimes Enforcement Network (FinCEN).

4. Enterprise foundational and core components underlie the entire AML effort, assuring that
the institution has conducted a risk assessment to identify money-laundering and terrorist
financing exposures across its products, services, customers, and geographic locations;
understands how money-laundering and terrorist financing typologies apply across those
products, services, and geographies; has put the appropriate AML policies, procedures, and
training mechanisms in place; and runs regular audits to test its AML program controls.
Transaction Monitoring Framework

Transaction monitoring should be embedded into a financial institution's integrated AML
program. The appropriateness of an institution's framework should be assessed on the
principle that the framework is aligned to, and focused on, the perceived risks related to the
institution's business model, the products and services it offers and the nature of its
customer base.

The Wolfsberg Group believes that a risk-based approach enhances the effectiveness of
monitoring for unusual and potentially suspicious activity, to the extent that such activity is
distinguishable from legitimate activity. This is why the Wolfsberg Group supports the
introduction of risk-based monitoring models and frameworks that are sufficiently flexible
to meet the needs and nature of individual financial institutions.

If the risk analysis indicates that the use of a dedicated automated system is likely to be
effective as part of a risk-based AML transaction monitoring framework, some or all of
the following functional capabilities may be determined to be appropriate (a simple
illustrative sketch follows this list), including the ability to:

 Compare a client's or account's transaction activity during the reported period against
the relevant transaction history, over a period of time that the institution considers
reasonable and appropriate;
 Compare customer- or transaction-specific data against risk scoring models;
 Issue alerts if unusual and potentially suspicious activity is identified;
 Track those alerts to ensure that they are appropriately managed within the
financial institution and that suspicious activity is reported to the appropriate authorities as
applicable;
 Maintain an audit trail for inspection by the institution's audit function and by the bank's
supervisors;
 Provide appropriate aggregated information and statistics.
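As a purely illustrative sketch of the capabilities listed above, the following Python snippet compares a customer's current activity against their own transaction history and risk score and raises simple alerts. The field names, thresholds and risk-score weighting are assumptions for illustration and do not represent any specific vendor's scenario logic.

```python
# Illustrative only: a minimal rule-based check comparing current activity
# against a customer's historical baseline, scaled by a risk score.
# Field names, thresholds and the weighting are assumptions.
from dataclasses import dataclass
from statistics import mean
from typing import List

@dataclass
class Transaction:
    customer_id: str
    amount: float
    high_risk_country: bool

def generate_alerts(current: List[Transaction],
                    history_monthly_totals: List[float],
                    risk_score: float,
                    multiplier: float = 3.0) -> List[str]:
    """Flag activity that departs sharply from the customer's own history;
    a higher risk score lowers the tolerance before an alert is raised."""
    alerts = []
    baseline = mean(history_monthly_totals) if history_monthly_totals else 0.0
    current_total = sum(t.amount for t in current)

    # Volume spike relative to the customer's historical baseline
    if baseline and current_total > baseline * multiplier / max(risk_score, 1.0):
        alerts.append(f"Volume spike: {current_total:.2f} vs baseline {baseline:.2f}")

    # Any high-risk-country activity for an already high-risk customer
    if risk_score >= 2.0 and any(t.high_risk_country for t in current):
        alerts.append("High-risk customer transacting with a high-risk country")

    return alerts

# Example: a customer whose monthly volume jumps well above history
txns = [Transaction("C001", 45_000.0, False), Transaction("C001", 30_000.0, True)]
print(generate_alerts(txns, history_monthly_totals=[10_000.0, 12_000.0, 9_500.0],
                      risk_score=2.5))
```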

Sources:
http://www.pwc.com/us/en/risk-assurance-services/publications/assets/pwc-avoiding-the-drift.pdf
https://www.pwc.com/us/en/anti-money-laundering/publications/assets/aml-monitoring-system-risks.pdf

Challenges

Lack of understanding of data – As financial institutions continue to grow and acquire and/or
update data sources, the enterprise and AML data governance team may fail to take into
account the downstream impacts to various applications, resulting in ineffective data usage.
Understanding the data requires not only knowledge of the technical lineage of the data, but
also the business knowledge to understand how the data is used within key business processes
and across the organization. This is one of the main reasons the AML compliance department
needs to drive, or at least be a key player in, the effort to understand data.

Data quality gaps – Many front-end systems and business processes capturing data for AML
may not populate key data elements (e.g., country of domicile, ISIN, counterparty) uniformly,
or may capture this data in free-form fields or hard-to-leverage formats. This limits the ability
to use this data for high-volume transaction analysis, leading to potential false positives or
overall misses in the identification process.

Lack of a centralized data dictionary and metadata – Many financial institutions do not have
dedicated resources (people and processes) who can act as data stewards, educate
downstream users on data changes and decide how best to harness the data. Such data
stewardship is a key requirement in getting to KYD (know your data).

Technological gaps and challenges – Financial institutions are already inundated with both
structured and unstructured data, and the data flow is ever-increasing. Without common data
repositories/warehouses to support seamless integration, technology organizations are unable
to meet the business demands to integrate, process and sort this data on a timely basis.
Frequently, businesses attempt a solution through building data processes outside of IT
(“shadow IT” solutions). Unfortunately, this approach often exacerbates the problem. Many
times, these unsanctioned sources lack uniform master or reference data, may be using
outdated, inaccurate information, or may not have data of sufficient granularity.

Management silos – Larger institutions especially are often plagued by communication gaps
among departments. This can make effective data collaboration difficult, and often leads to
data duplication, disparate data processes and multiple versions of data transformation logic.
All of these issues make it difficult to centralize functions for AML compliance and result in
ineffective AML data analyses. KYD is key in integrating these silos by providing the
answers to important questions about the data – where it is stored, how it was created, what is
its definition, what business rules and standards have been applied to it, and how it is used
across the organization.

Other Challenges

 Identifying and mapping the AML risks in scope across the financial institution's
products, customers, accounts, transactions and associated reference data
 Poor data integrity and validation capabilities
 Lack of documentation and knowledge transfer of the key data flow and operational
processes and inadequate training
 Inaccurate and inconsistent KYC and risk assessment data
 Systemic deficiencies in transaction monitoring and customer due diligence processes
 Inadequate system of internal controls, ineffective independent testing
 Lack of central data governance and inconsistent change control process
 Managing Regulatory expectations
Excerpts from the new NYS DFS regulation, Part 504, subsection (c)

The transaction monitoring and filtering programs should identify the data sources and
validate the quality of the data as it flows from its source into the monitoring and filtering
programs.

The specific aspects of this requirement include the identification of all data sources, data
extraction and loading processes to achieve accurate transfer and vendor selection processes,
among others.

Even the largest Regulated Institutions may grapple with challenges as they introduce new
systems or programs and quality assurance processes to meet this requirement. These
challenges will be compounded by the convergence of the different information technology
systems used throughout the various departments of a large organization.

(c) Each Transaction Monitoring and Filtering Program shall, at a minimum, require the
following:

1. Identification of all data sources that contain relevant data;
2. Validation of the integrity, accuracy and quality of data to ensure that accurate and
complete data flows through the Transaction Monitoring and Filtering Program;
3. Data extraction and loading processes to ensure a complete and accurate transfer of
data from its source to automated monitoring and filtering systems, if automated
systems are used;
4. Governance and management oversight, including policies and procedures governing
changes to the Transaction Monitoring and Filtering Program to ensure that changes
are defined, managed, controlled, reported, and audited;
5. Qualified personnel or an outside consultant responsible for the design, planning,
implementation, operation, testing, validation, and on-going analysis of the
Transaction Monitoring and Filtering Program, including automated systems if
applicable, as well as case management, review and decision making with respect to
generated alerts and potential filings; and
6. Periodic training of all stakeholders with respect to the Transaction Monitoring and
Filtering Program.
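A minimal sketch of the kind of pre-load validation that requirements 2 and 3 above describe might look like the following. The mandatory fields, control totals and file layout are assumptions; production implementations would typically sit inside a governed ETL tool rather than standalone code.

```python
# A sketch of pre-load validation for a transaction feed: confirm the extract
# is complete (record count matches the source control total) and that
# mandatory fields are populated before the feed enters the TM system.
# Field names and the control-total mechanism are assumptions.
import csv
from typing import Dict, List

MANDATORY_FIELDS = ["transaction_id", "customer_id", "amount", "currency", "booking_date"]

def load_feed(path: str) -> List[Dict[str, str]]:
    with open(path, newline="") as fh:
        return list(csv.DictReader(fh))

def validate_feed(rows: List[Dict[str, str]], expected_count: int) -> List[str]:
    issues = []
    # Completeness of transfer: row count vs. control total supplied by the source
    if len(rows) != expected_count:
        issues.append(f"Record count mismatch: received {len(rows)}, expected {expected_count}")
    # Integrity and quality: mandatory fields populated on every record
    for i, row in enumerate(rows, start=1):
        missing = [f for f in MANDATORY_FIELDS if not (row.get(f) or "").strip()]
        if missing:
            issues.append(f"Row {i}: missing {', '.join(missing)}")
    return issues

# Usage (hypothetical file name and control total):
# rows = load_feed("txn_feed_20240131.csv")
# problems = validate_feed(rows, expected_count=125_000)
# if problems: quarantine the feed and raise a data quality issue
```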
Risks associated with source data feeds / TM data flow

 Inaccurate extraction logic for the source data file, which could result in lost data
records for products, accounts, customers, transactions and their associated
reference data
 Incomplete coverage of the source data feeds mapped to the TM system, which results in
severe gaps in AML monitoring
 Incorrect transformation and data mapping between the source data and the TM system,
which would result in inconsistent or invalid data being monitored for AML
 Lack of a consistent mechanism for identifying, defining and classifying the data
elements critical to transaction monitoring

Risks associated with Data Governance and processes

The deficiencies below will result in poor data integrity, completeness and validation
capabilities for a financial institution and lead to severe risks in terms of monitoring
coverage:
 Lack of a consistent mechanism to identify new applications or changes to
existing applications that affect TM coverage and monitoring
 Insufficient TM data governance, quality standards, assurance framework and
process sustainability procedures will lead to poor oversight over the usage of data
assets
 Insufficient identification and documentation of technical information about Critical
data Elements and data flows
 Insufficient Oversight of data flows and inadequate controls over data assets

 Lack of an effective mechanism to identify and remediate DQ and data reconciliation
issues and to maintain consistent data quality over time
 Inadequate assurance over reliability of the data loaded into the TM system
 Inadequate identification of risks associated with data quality and reconciliation
issues; resulting in inadequate assessment to determine potential impacts to TM
systems
 Ineffective mechanism for determining impact of changes on data flows and impact of
changes to model sets
 The risk arising from failing to meet internal operations standards or from non-
compliance with external requirements as they affect operations activities
Approach to Transaction Monitoring Data Governance

For AML professionals already stretched and weary of continuing scrutiny, becoming
proficient in data management may sound like a lot of extra work – adding yet another layer
of complexity to an already difficult job. Still, AML departments stand to benefit the most
from knowing their data and from improved data management.

Data governance is the best approach to combining these three components – the
sophisticated software applications, the knowledge of what the customer needs, and the
accurate understanding of data definitions for inputting the appropriate data. Data governance
efforts are viewed favorably by regulators, who increasingly put pressure on financial institutions
to formally document business processes, data controls and source-to-target mappings, and to
defend all activities around data management.

Data governance is a wide set of management and technical disciplines designed to ensure
that an institution has the right data available at the right time and that the data is accurate and
in the correct format required to satisfy specific business needs. Much like AML compliance
generally, technology enables the process, but it is specific business knowledge and context
being applied to a set of information that really adds the value.
While technology platforms are certainly enablers in supporting this governance (e.g., data
quality monitoring, centralized data dictionaries), AML leads must work closely with first-
line process owners to ensure a good definition, ownership and monitoring of key data assets
required for the AML program. Technology components supporting this include the
management of master and reference data, which helps to ensure uniformity and improve
quality across data sets flowing from diverse systems.
From a transaction monitoring process standpoint, a single customer with multiple accounts
and conducting multiple types of transactions will have the customer name, transaction
details and other identifying information appear in multiple records, across multiple systems.
The process of consolidating this information into a single customer record for transaction
purposes (to prevent the same customer from generating duplicate alerts) can be facilitated
through strong reference and master data management.
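As a simplified illustration of that consolidation step, the sketch below groups account-level records under a single resolved customer key. The matching rule (normalised name plus date of birth) is an assumption chosen for brevity; real master data management platforms use far more robust entity-resolution logic.

```python
# A simplified sketch of consolidating account-level records into a single
# customer view so one customer does not generate duplicate alerts.
# The matching key (normalised name + date of birth) is deliberately naive.
from collections import defaultdict
from typing import Dict, List, Tuple

def customer_key(record: Dict[str, str]) -> Tuple[str, str]:
    name = " ".join(record["name"].lower().split())   # normalise spacing and case
    return (name, record["date_of_birth"])

def consolidate(records: List[Dict[str, str]]) -> Dict[Tuple[str, str], List[str]]:
    """Group account numbers under a single resolved customer key."""
    golden: Dict[Tuple[str, str], List[str]] = defaultdict(list)
    for rec in records:
        golden[customer_key(rec)].append(rec["account_no"])
    return dict(golden)

accounts = [
    {"name": "Jane  Doe", "date_of_birth": "1980-02-01", "account_no": "ACC-1"},
    {"name": "jane doe",  "date_of_birth": "1980-02-01", "account_no": "ACC-2"},
]
print(consolidate(accounts))  # one customer key covering both accounts
```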

Transaction Monitoring Data Governance Framework

Having a consistent Transaction Monitoring Data Governance Framework can provide the following
benefits:

1. Efficient monitoring of data quality, ensuring the effectiveness of transaction monitoring data
2. Reliable and sustainable control over data quality and completeness
3. Reduced operational and regulatory risk
4. Facilitated DQ issue management, root cause analysis and prioritisation of DQ issue
remediation
5. Clarity around roles and responsibilities for data, which will result in better data management
The above benefits can be achieved by defining and implementing:
1. Data governance standards, policies, procedures, rules and controls to test for data quality
and data completeness on an ongoing basis
2. Controls to monitor any changes to the business, products, systems and transactions in the
AML landscape
3. Methodology, process, procedures to implement data quality framework & controls
4. Control metrics to provide regular oversight of any issues, discrepancies or enhancements
5. Issue Management and remediation process
6. Embed change control & process sustainability in data governance framework

Data Governance Policies & Standards

Steps to an Effective Data Governance Function: Role of Data Management Office


Institute an enforceable enterprise-wide data governance strategy and processes. The institution
will use this strategy to tear down data silos and create a free flow of data within the
enterprise.

Establish a data governance policy and data councils. The governance policy should be supported
by a set of governance and decision-making bodies that ensure adequate adoption of, and
subsequent compliance with, the mandates in the policy.

Institute and enforce effective master and reference-data management programs. This will
enable the institution to uncover data structure issues and, in the event of data unavailability,
elicit new efforts to source data that downstream applications like AML transaction
monitoring systems can leverage to perform a more refined data analysis.

Be proactive in assigning data ownership, roles and responsibilities. Assigning ownership
and responsibility for key data within the AML processes will help ensure continued
compliance. It is important, for example, to determine who has the responsibility to inform
the AML monitoring team when new products or customer types are added into source
systems.

Define and implement data quality rules and monitoring of data quality and controls. It is also
important to provide models that include minimum standards, types of controls and tools for
continuous monitoring of data quality, and to assign responsibility for any problems that may
arise with the data.

Create a centralized repository for metadata, the data model / dictionary, data lineage, data
flows and golden-source documentation. A centralized repository will help the institution
identify redundant data processes and eliminate them. This will streamline
downstream consumption and lead to a reduction in the total cost of ownership of various data
sourcing applications. It will also allow new data processes to be less time-consuming and
cheaper to implement due to a clearer understanding of the data that is available to support the
processes.
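As a hedged illustration, a centralised dictionary or lineage entry for a critical data element might capture attributes along the lines below. The system names, owners and rule descriptions are hypothetical and not a prescribed metadata standard.

```python
# A hypothetical dictionary / lineage entry for one critical data element.
# System names, owners and rules are illustrative assumptions only.
CDE_DICTIONARY = {
    "counterparty_country": {
        "definition": "ISO 3166-1 alpha-2 country code of the transaction counterparty",
        "golden_source": "PaymentsHub",                      # hypothetical system
        "lineage": ["PaymentsHub", "ETL_Staging", "TM_DataMart", "TM_Engine"],
        "owner": "Payments Operations",
        "dq_rules": ["not null", "valid ISO country code"],
        "used_by": ["sanctions screening", "high-risk geography scenarios"],
    }
}
```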

Support big data initiatives. Financial institutions are deluged with new data daily, and the
ability to incorporate new ways of monitoring the large volume of transactions and extract
value from the data is critical to effectively managing and maturing AML programs. It is
important for institutions to maintain strong data governance, as it allows them to
transition more easily to big data analytical platforms and tools through easier data integration.

Guiding Principles for Transaction Monitoring Data Quality & Controls

Data quality is the degree to which data is accurate, complete, consistent and relevant. The
data needs to be of high quality and fit for its intended use in operations, decision making and
planning.

Data levels where Data Quality Controls could be implemented:

 Data feeds – a discrete data file with a file name, or a data extract from an API
 Data Records – Individual line of data composed of data elements delimited by a
common delimiter
 Data element – Single column or field containing a data point or null value.

Critical Data Elements – CDEs are identified by analysing the fields that have a high impact
on the transaction monitoring system and meet at least one of the criteria below:

 Data element is part of the segmentation process
 Data element is part of the detection process / scenario logic, with an impact on the
alert output
 Data element is part of the investigation analysis / compliance MIS

Data quality dimensions and rules should be adopted and implemented as part of the organisational
data policy; these may include checks along standard dimensions such as completeness, validity,
consistency and timeliness, as sketched below.
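The sketch below illustrates what element-, record- and feed-level checks along those dimensions could look like in practice. The field names, reference lists and rule thresholds are assumptions for illustration only.

```python
# Illustrative element-, record- and feed-level data quality checks along
# common dimensions (completeness, validity, consistency, timeliness).
# Field names, reference data and thresholds are assumptions.
from datetime import date, datetime
from typing import Dict, List

VALID_CURRENCIES = {"USD", "EUR", "GBP"}   # assumed reference data

def record_checks(row: Dict[str, str]) -> List[str]:
    issues = []
    if not row.get("customer_id"):                        # completeness
        issues.append("customer_id is null")
    if row.get("currency") not in VALID_CURRENCIES:       # validity
        issues.append(f"invalid currency '{row.get('currency')}'")
    try:                                                  # consistency / format
        if float(row.get("amount", "")) < 0:
            issues.append("negative amount")
    except ValueError:
        issues.append("amount is not numeric")
    return issues

def feed_checks(rows: List[Dict[str, str]], business_date: date) -> List[str]:
    issues = []
    if not rows:                                          # feed-level completeness
        issues.append("empty feed")
    stale = [r for r in rows                              # timeliness
             if datetime.strptime(r["booking_date"], "%Y-%m-%d").date() > business_date]
    if stale:
        issues.append(f"{len(stale)} records dated after the business date")
    return issues
```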
Data Controls Framework

The financial institution should have guidelines for data controls at an enterprise level for
the definition, implementation, execution and monitoring of data-related controls, to ensure that
data risk is adequately mitigated.

Purpose of controls:

1) Ensure that newly created / introduced data (manual or automated,
internal or external) meets data quality requirements.
2) Ensure requirements are not compromised during the storage of data, including
outsourced storage.
3) Ensure data is transmitted properly via standards for extracting, transforming and
moving data files.
4) Ensure the processing of data is done correctly and is adapted to system and/or
user requirements.
5) Ensure that the data used is correct and true, and that it can be used without
concern.
6) Ensure the production of information for internal and external stakeholders meets
content and DQ requirements.
7) Ensure data is disposed of in compliance with the retention strategy and that business
confidentiality is not compromised.

Data Control Framework: the data control framework must be in line with the overall operational
risk framework; its components are simplified and adapted to be applicable to data
assets, and aligned with the Enterprise Data Governance Policy.

As a part of Data Control Framework at least the following should be included:

1. Identification and assessment of Critical Data Elements (CDEs)
2. Control Model, defining the different control activities that should be in place in each
specific end to end data lifecycle sub-process, as well as those controls applying
across more than one sub-process
3. An inventory of processes and controls that allows formalisation of the scope under control,
the controls defined and their effectiveness; this includes at least:
a. An inventory of end to end data lifecycle processes
b. An inventory of controls
c. An inventory of transformation rules; this might include an inventory of
adjustments / manual interventions
d. An inventory of Data Quality Indicators and measures
e. An inventory of Critical Data Elements.
4. A risk and control assessment process that allows existing risks, and the controls
mitigating them, to be evaluated regularly.
5. A governance and oversight structure, including roles and responsibilities, decision-
making forums that make decisions around data controls, and a body of procedures
and standards to set guidance on how controls should be performed.
6. A Management information and report layer that consolidates the information on
control coverage and performance and reports it to the relevant stakeholders for
decision making.
7. Independent verification and monitoring, understood as independent programmes that
provide assurance and verification of the design and effective implementation of the
data control activities.
8. Issue management and remediation, understood as a standardised way to deal with the
severity assessment, root-cause analysis and business case production of Data Quality
related issues.
Control Model:

Control activities should be classified and implemented across the end-to-end data
lifecycle in order to control data risk. They can be classified as follows.

Based on their function and nature:

o Preventive controls, which attempt to proactively deter undesirable events from
occurring.
o Detective controls, which provide evidence that a loss has occurred but do not
prevent it from occurring.

Based on the level of automation:

o Manual, requiring human intervention and therefore time-consuming and
prone to error.
o Semi-automated, operated partly automatically and partly by hand.
o Automated, performed autonomously, usually by systems, and therefore more
efficient.

Figure: Some examples of controls classified in different categories
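Since the original figure is not reproduced here, the following sketch suggests how controls might be tagged by function and level of automation; the example controls are illustrative and are not taken from the source figure.

```python
# Illustrative tagging of example controls by function and automation level;
# the controls listed are assumptions, not taken from the source figure.
from dataclasses import dataclass
from enum import Enum

class Function(Enum):
    PREVENTIVE = "preventive"
    DETECTIVE = "detective"

class Automation(Enum):
    MANUAL = "manual"
    SEMI_AUTOMATED = "semi-automated"
    AUTOMATED = "automated"

@dataclass
class Control:
    name: str
    function: Function
    automation: Automation

EXAMPLE_CONTROLS = [
    Control("Mandatory-field validation at data entry", Function.PREVENTIVE, Automation.AUTOMATED),
    Control("Four-eyes review of manual adjustments", Function.PREVENTIVE, Automation.MANUAL),
    Control("Daily record-count reconciliation report", Function.DETECTIVE, Automation.SEMI_AUTOMATED),
    Control("Post-load data quality exception alerts", Function.DETECTIVE, Automation.AUTOMATED),
]
```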

Monitor & Control:

At this stage the business-as-usual monitoring of scorecards, outstanding data quality issues,
the issue inventory, root cause analysis, remediation and data quality improvements for the
organisation is performed. During this stage production scorecards and dashboards are built,
ongoing control monitoring processes are put in place, operational owners and escalation routines
are formalised, and long-term planning is conducted.

An effective monitoring and control reporting process is essential for continuous
improvement and for ongoing monitoring, analysis and planning.
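As an illustrative example of the kind of scorecard metric that could feed such dashboards, the sketch below computes a pass rate per data quality rule for a reporting period. The input structure and the 99% threshold are assumptions.

```python
# A sketch of a scorecard metric: pass rate per data quality rule for a
# reporting period. Input structure and the 99% threshold are assumptions.
from typing import Dict

def pass_rate(results: Dict[str, Dict[str, int]]) -> Dict[str, float]:
    """results maps rule name -> {'passed': n, 'failed': m}; returns % passed."""
    return {rule: 100.0 * c["passed"] / (c["passed"] + c["failed"])
            for rule, c in results.items() if (c["passed"] + c["failed"])}

period_results = {
    "customer_id not null": {"passed": 99_920, "failed": 80},
    "valid country code":   {"passed": 98_500, "failed": 1_500},
}
scorecard = pass_rate(period_results)
breaches = {rule: rate for rule, rate in scorecard.items() if rate < 99.0}
print(scorecard)
print("Rules breaching the assumed 99% threshold:", breaches)
```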
Role of the Assurance and Independent Testing Function:

The team is responsible for the ongoing monitoring of the design and operation of controls in the
first line of defence, as well as providing advice and facilitating risk management activities. The
team should constantly and independently review the performance of key TM controls,
irrespective of the group, unit or line of defence that performs them, in order to be reasonably
assured that the controls meant to identify and mitigate TM risks are designed and operating
effectively.

The QA program should obtain a holistic understanding of the effectiveness of key TM
controls, based on centrally reported metrics and results, through rational testing and thematic
assurance reviews.

The QA program should define the minimum data quality standards for the key controls,
covering the associated risks, scope, frequency, sample size, review method, test scenarios,
documentation, reporting and rates of compliance.

The QA program should also include a set of minimum operational standards for the completion
of QA reviews using an informed, independent global process; processes for the proactive
identification of areas of TM risk so that they can be escalated promptly to senior management;
and requirements designed to ensure that effective and timely corrective action plans are in place
to address areas of identified TM risk.

Review Area – Examples of review details

System Interface – A review of the system interface overview diagram or data lineage / control
documentation.
Batch Management – Regular monitoring of inbound and outbound feeds; check whether a
checksum / batch hash total is used to reconcile the number of records sent by the source
system against the number of records received by the target application (see the sketch after
this table).
IT Support Procedures – A clear definition and documentation of feed statuses, e.g. success /
warning / failure.
Data Field Definition & Data Transformation – A clear definition of mandatory fields and data
formats; a mapping of each field between the source data and the target application; automated
data validation checks and regular monitoring of data errors occurring during data
transformation.
Monitoring of Key Calculations – A recording, monitoring and reporting mechanism for errors
occurring while key calculations are run.
Logical Access Control – A clearly defined process for user access management (addition,
modification & removal).
User Access Management – A regular user access review, including system administrator access
and access to key modifiable scripts.
Change Management – A well-defined change management process; user acceptance testing and
sign-off before deployment into the production environment.
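As a hedged sketch of the batch management check described in the table above, the snippet below reconciles a record count and a hash total declared by the source system against what the target application actually received. The control-file contract (declared count and SHA-256 hash total) is an assumption for illustration.

```python
# A sketch of a batch reconciliation control: compare the record count and
# hash total declared by the source system with what was actually received.
# The control-file contract (declared count + SHA-256 total) is an assumption.
import hashlib
from typing import Iterable, List, Tuple

def batch_totals(records: Iterable[str]) -> Tuple[int, str]:
    """Return (record count, SHA-256 digest over the concatenated records)."""
    digest = hashlib.sha256()
    count = 0
    for rec in records:
        digest.update(rec.encode("utf-8"))
        count += 1
    return count, digest.hexdigest()

def reconcile(received: Iterable[str], declared_count: int, declared_hash: str) -> List[str]:
    count, digest = batch_totals(received)
    issues = []
    if count != declared_count:
        issues.append(f"Count mismatch: received {count}, declared {declared_count}")
    if digest != declared_hash:
        issues.append("Hash total mismatch: feed content differs from the source")
    return issues

# Usage: declared values would come from the source system's control file
records = ["TXN1|C001|100.00", "TXN2|C002|250.00"]
declared_count, declared_hash = batch_totals(records)    # simulate a matching source
print(reconcile(records, declared_count, declared_hash))  # -> [] (no issues)
```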
Audit Process requirements and assessment focus on data governance and controls

Controls verification testing and evidence documentation ensure that the
requirements of the independent / internal audit are met with a high degree of
confidence.
