

• All risks must be included in official trading systems, without exception; failure to do so leaves a firm subject to operational risks and possible fraud.
• Risk mitigation/migration devices form part of the management process, and should be used whenever it makes sense to do so.
• Legal processes, based on appropriate documentation, should exist as a key element of the management discipline in order to reduce, or eliminate, legal/operational risk losses.



8 Risk Infrastructure

Risk infrastructure makes possible the identification, quantification, reporting and management of risks. Though infrastructure is often “invisible” to those outside of the risk function — encompassing “behind the scenes” data, technology and internal analytics — it is a vital element of the risk process. Indeed, in the absence of solid infrastructure a firm is unlikely to be able to satisfy its internal/external obligations and fiduciary requirements, or convince shareholders, creditors and regulators that it is operating in a controlled manner. Though the development of an appropriate layer of infrastructure can be complex, time-consuming and expensive, there is often no substitute or alternative. Financial business is so complicated that it is no longer practical to manage risks without proper infrastructure; a solid, committed and well-planned infrastructure investment is therefore a requirement. Reliance on an outdated, manual or inflexible platform may ultimately lead to greater operational risk and financial losses.



8.1 DATA IS THE FUNDAMENTAL COMPONENT OF ANY RISK PROCESS — BAD DATA LEADS TO BAD INFORMATION AND BAD RISK DECISIONS

Information makes the management of risk possible. Data, the basic component of any information process, is the fundamental mechanism used to convey details about the nature, size, location and maturity of a firm’s risks. Large firms that operate many lines of business, with multiple counterparties in various global locations, face a considerable challenge in ensuring that risk data is of the highest possible quality; however, even small firms need to implement processes that ensure data integrity. Bad data will, in many cases, lead to bad (or misinformed) risk decisions. If a firm’s data processes cannot reflect basic information — such as whether a business is long or short, owns $1MM or $10MM of risk, or faces Bank XYZ or Bank ABC as counterparty — then risk decisions cannot be made with confidence. Data must be correct at the position, or trade, level. Since risks are often aggregated into broader portfolios — whether by counterparty, region, entity or risk class — a data error at the position level can have significant consequences; not only will portfolio information be incorrect, but error identification is likely to be time-consuming. A robust data process therefore relies heavily on integrity at the position level. Time and effort must be spent converting trade data into clean and robust form, and processes must exist to maintain the quality of the process.

Though every firm has unique data needs, most share certain common requirements; these may include business unit, desk, trade size, trade type (e.g. swap, put, call, bond, equity, and so forth), security identifier (e.g. a unique tag in order to avoid duplication or mismatch errors), maturity, counterparty, currency, country, industry, settlement time, documentation flags (e.g. for confirmations, ISDAs, guarantees, and so on), and so forth. Such data permits the computation of core credit, market and liquidity exposures. In some cases trading systems apply risk analytics to each position to derive actual exposures; in other cases “raw” data is ported to the independent risk function, where internal risk analytics compute relevant risks. Regardless of where the risk calculations are performed, aggregation and netting rules must then be applied in order to construct actual portfolios of risks.
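
As a rough illustration of how the position-level fields and aggregation rules described above might fit together, the following Python sketch defines a minimal trade record and nets signed notionals by counterparty. The field names, types and the simple netting rule are assumptions made for the example rather than a prescription from the text.

    # Minimal sketch only: field names and the netting rule are illustrative assumptions.
    from dataclasses import dataclass
    from datetime import date
    from collections import defaultdict

    @dataclass
    class Trade:
        trade_id: str        # unique identifier, avoiding duplication/mismatch errors
        business_unit: str
        desk: str
        trade_type: str      # e.g. swap, put, call, bond, equity
        counterparty: str
        currency: str
        country: str
        industry: str
        maturity: date
        notional: float      # signed: positive = long, negative = short
        confirmed: bool      # documentation flag, e.g. confirmation received
        isda_in_place: bool  # documentation flag, e.g. ISDA master agreement signed

    def net_exposure_by_counterparty(trades):
        """Aggregate signed position-level notionals into net counterparty exposures."""
        exposures = defaultdict(float)
        for trade in trades:
            exposures[trade.counterparty] += trade.notional
        return dict(exposures)

    # A single bad record at the position level (wrong sign, wrong counterparty)
    # flows straight through to every portfolio total built on this aggregation.
    book = [
        Trade("T1", "Rates", "Swaps", "swap", "Bank XYZ", "USD", "US", "Banks",
              date(2030, 6, 30), 10_000_000.0, True, True),
        Trade("T2", "Rates", "Swaps", "swap", "Bank XYZ", "USD", "US", "Banks",
              date(2028, 3, 15), -4_000_000.0, True, True),
    ]
    print(net_exposure_by_counterparty(book))  # {'Bank XYZ': 6000000.0}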

The data process developed must be flexible enough to accommodate new products, counterparties, markets and analytics, and an audit cycle should exist to ensure integrity. These are obvious, if sometimes overlooked, elements of the process. Since the pace of change in the financial markets is so rapid, any “data capture” mechanism that is not sufficiently flexible will soon become obsolete — putting a firm back where it started. Likewise, failure to audit the process on an ongoing basis could mean that new products are not captured appropriately, spurious information is produced and left unchecked, and so forth. Earlier in the book we noted that hiring the best, most experienced and qualified risk personnel is a worthwhile investment in a firm’s risk process and overall financial future. Investing in data processes must rank as a close second; spending the time and resources to develop a robust data process leads ultimately to more informed decision-making and safer risk-taking.
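
By way of illustration only, a periodic audit of the data capture process might include simple completeness and product-recognition checks along the lines of the Python sketch below; the required fields and the list of recognized product types are hypothetical, and a real audit cycle would cover far more ground.

    # Illustrative audit sketch, assuming trades arrive as simple dictionaries;
    # the required fields and recognized product types are hypothetical.
    REQUIRED_FIELDS = {"trade_id", "trade_type", "counterparty", "currency",
                       "maturity", "notional"}
    RECOGNIZED_TYPES = {"swap", "put", "call", "bond", "equity"}

    def audit_trade_feed(records):
        """Return exceptions for records with missing fields or unrecognized products."""
        exceptions = []
        for rec in records:
            missing = REQUIRED_FIELDS - rec.keys()
            if missing:
                exceptions.append((rec.get("trade_id", "<unknown>"),
                                   "missing fields: " + ", ".join(sorted(missing))))
            elif rec["trade_type"] not in RECOGNIZED_TYPES:
                # A new product type the platform has not yet been taught to capture.
                exceptions.append((rec["trade_id"],
                                   "unrecognized product type: " + rec["trade_type"]))
        return exceptions

Exceptions flagged in this way can then be routed back to the owning business for correction before portfolio-level reports are produced.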

The availability of high quality information — built on robust data processes — is of such importance that it becomes one of the “cardinal rules.”



8.2 A SINGLE SOURCE OF TRADE DATA SHOULD BE USED WHENEVER POSSIBLE TO ENSURE CONSISTENCY; WHEN THIS IS NOT POSSIBLE, DATA PROCESSES MUST BE PROPERLY RECONCILED AND AUDITED

One of the most common “information problems” comes from the use of multiple sources of data to produce similar, or identical, management and risk reports. When multiple sources are employed for reporting purposes, it is often impossible to create the same end-use information; this is especially true when complex businesses are involved. For reasons of corporate history, a firm may use multiple trade repositories for different aspects of financial processing. Thus, one database might contain a trade population used to produce risk information, a second for P&L generation, a third for settlements, a fourth for collateral valuation, and so on. In addition, for complicated businesses where the standard trade platform cannot handle difficult structures, spreadsheets might be used to manage a portion of the management/reporting task. Under this type of structure, it becomes necessary to reconcile multiple sources of information; since it is of little use to compute P&L explain from the finance database if the risk database does not feature the same trade population, reconciliation is a necessary procedure. The best way of gaining confidence in the quality of information is to obtain all data from the same source. However, many institutions are burdened with legacy systems and cannot realistically obtain their risk information from a single data source. In the short term the only practical solution is to institute as many audit checks as possible; this, however, must only be regarded as a “stop-gap” measure. With the advent of new technologies it has become more realistic to consider and implement a cohesive data platform that contains all business data (risk, finance, operations, settlement, legal, counterparty, management, and so on) and becomes the sole source of information for any control or business function. Firms that have not yet considered such an architecture may be delaying the inevitable and should prioritize their efforts — particularly since information demands are likely to increase in the future.
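
As a sketch of the kind of audit check described above, the Python snippet below reconciles two hypothetical trade populations (a risk repository and a finance repository, each keyed by trade identifier) and reports the breaks that would need to be investigated. The repository names and the use of notional as the compared attribute are assumptions for illustration, not a description of any particular firm's process.

    # Illustrative reconciliation sketch; repository names and keys are assumptions.
    def reconcile(risk_trades, finance_trades):
        """Compare two trade populations keyed by trade id and report breaks."""
        risk_ids, finance_ids = set(risk_trades), set(finance_trades)
        return {
            "missing_from_finance": sorted(risk_ids - finance_ids),
            "missing_from_risk": sorted(finance_ids - risk_ids),
            "value_mismatches": sorted(
                tid for tid in risk_ids & finance_ids
                if risk_trades[tid] != finance_trades[tid]
            ),
        }

    # A trade booked in the risk repository but absent from the finance repository
    # surfaces as a break to resolve before P&L explain or risk reports are trusted.
    risk_db = {"T1": 10_000_000, "T2": -4_000_000}
    finance_db = {"T1": 10_000_000, "T3": 2_500_000}
    print(reconcile(risk_db, finance_db))
    # {'missing_from_finance': ['T2'], 'missing_from_risk': ['T3'], 'value_mismatches': []}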


