FiNETIK – Asia and Latin America – Market News Network

Asia and Latin America News Network focusing on Financial Markets, Energy, Environment, Commodity and Risk, Trading and Data Management

LEI – Dealing with Reality: How to Ensure Data Quality in the Changing Entity Identifier Landscape

“The Global LEI will be a marathon, not a sprint” is a phrase heard more than once during our series of Hot Topic webinars that’s charted the emergence of a standard identifier for entity data. Doubtless, it will be heard again.

But if we’re not exactly sprinting, we are moving pretty swiftly. Every time I think there’s nothing more to say on the topic, there is – well – more to say. With the artifice of the March ‘launch date’ behind us, it’s time to deal with reality. And the reality practitioners are having to deal with is one that’s changing rapidly.
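By way of a concrete example of the data-quality checks involved in working with entity identifiers, the sketch below validates the basic structure of an LEI, assuming the ISO 17442 format (20 alphanumeric characters, with the last two being ISO 7064 MOD 97-10 check digits). It is a minimal illustration only; the sample codes are placeholders, and a real quality process would go well beyond format checks.

```python
# Minimal sketch of a structural LEI check, assuming the ISO 17442 format:
# 20 alphanumeric characters, the last two being ISO 7064 MOD 97-10 check digits.

def lei_is_valid(lei: str) -> bool:
    """Return True if the string is a structurally valid LEI."""
    lei = lei.strip().upper()
    if len(lei) != 20 or not lei.isalnum():
        return False
    # Map letters to numbers (A=10 ... Z=35) and apply the MOD 97-10 rule:
    # the resulting integer modulo 97 must equal 1.
    numeric = "".join(str(int(c, 36)) for c in lei)
    return int(numeric) % 97 == 1

# Usage: screen incoming entity records before they reach the master file.
for code in ["5493001KJTIIGC8Y1R12", "NOT-A-REAL-IDENTIFIER"]:
    print(code, "->", "pass" if lei_is_valid(code) else "reject")
```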

Download the full and detailed report:

LEI-Dealing_with_reality-how_to_ensure_data_quality with Entity Identifiers_06_13.pdf

Source: A-Team, 26.06.2013

Filed under: Data Management, Data Vendor, Library, Reference Data, Standards

Outsourcing Reference Data Management: Cost Reduction and New Revenue Opportunities

The past 12 months has seen the emergence of new players offering Business Process Outsourcing (BPO) services for Reference Data Management. These new arrivals expand the range of options available to financial institutions for addressing the challenges of regulatory compliance, operational cost reduction and scalability.

But BPO has other benefits, and innovative adopters have benefited from using the model to create new value-added services. By catering to clients’ data management needs, these players have been able to transform what’s traditionally been considered a cost centre into a new and significant source of revenue.

This paper – from AIM Software – explores this exciting new trend, and describes how an established financial institution took advantage of BPO to turn its enterprise data management initiative into a new source of revenue and business growth.

Download the White Paper Now to Find Out More

Source: A-Team, March 2013

Filed under: Corporate Action, Data Management, Data Vendor, Library, Market Data, Reference Data, Standards

Capco Proposes the Creation of a Data Culture to Advance Data Management

Financial firms are falling short on data management issues such as calculating the true cost of data, identifying the operational cost savings of improved data management and embracing social media data, but according to research by consultancy Capco, these issues can be resolved with a cross-organisational and practical approach to data management and the development of a data culture.

The business and technology consultancy’s report – ‘Why and how should you stop being an organisation that manages data and become a data management organisation’ – is based on interviews with close to 100 senior executives at European financial institutions. It considers the many approaches to data management across the industry and within individual enterprises, as well as the need to rethink data management. It states: “There is one certainty: data and its effective management can no longer be ignored.”

The report suggests an effective data management culture will include agreed best practices that are known to a whole organisation and leadership provided by a chief data officer (CDO) with a voice at board level and control of data management strategy and operational implementation.

For details on the report click here

Turning the situation around and attaching practical solutions to the data management vision of an all-encompassing data culture, Capco lists regulatory compliance, risk management, revenue increase, innovation and cost reduction as operational areas where good data management can have a measurable and positive effect on profit and loss.

Setting out how an organisation can create an effective data culture, Capco notes the need to change from being an organisation that is obliged to do a certain amount of data management, to a mandated and empowered data management organisation in which data has ongoing recognition as a key primary source. The report concludes: “Every organisation has the potential, as well as the need, to become a true data management organisation. However, the journey needs to begin now.”

Source: Reference Data Review, 24.10.2012

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, Reference Data, Standards

Reference Data: Current Solutions Lacking, Despite Foundational Nature of Reference Data

Reference data management (RDM) is a foundational element of financial enterprises, yet the collection of solutions used to manage reference data in most firms is not satisfactory, according to a report published this week.

The report – Reference Data Management: Unlocking Operational Efficiencies, published by Tabb Group in conjunction with data integration specialist Informatica – describes current sentiment around RDM. It looks at development through four generations of solutions, details the obstacles to RDM success and sets out how firms at different levels of RDM adoption can move forward towards the holy grail of centralised RDM coupled with consistent reference data processing.

Despite huge investments in RDM over the past decade, research carried out among 20 firms – 25% in Europe, 75% in the US, 50% on the buy side and 50% on the sell side – in April 2012 found 86% of respondents dissatisfied with their RDM capabilities. Of these, 48% are being driven to improvement for reasons related to resource optimisation and outcomes, while 35% are responding to specific catalysts such as compliance.

For details on the report click here.

Recommending how to navigate the road ahead, the study suggests firms committed to bolstering existing suites of RDM solutions should focus on wrapping current solutions with technology that enables a consistent enterprise data governance process, while those yet to make a significant commitment to an RDM solution should seek solutions that manage multiple reference data domains in a consistent and integrated enterprise framework.

The report concludes: “There can be no glory without doing the hard work first. Data fluency, a critical precursor to data consumability, simply means that data flows more easily, which in turn means that end users must be able to find it. And, finding data requires meticulous attention to standards, labels and other metadata, however imperfect they may be now or in the future. That way, no matter how big or complex the data gets, end users will have a much better shot at harvesting value from it.”

Source: Reference Data Review, 19.10.2012

Filed under: Data Management, Reference Data

Whitepaper: Bloomberg to embrace emerging LEI

The industry initiative to develop and promote a standard global legal entity identifier (LEI) is expected to significantly reduce the opacity associated with complex financial instruments, widely acknowledged to be a major contributing factor in the 2008 credit crisis.

In this white paper, Bloomberg explains the implications of the emerging LEI for financial institutions, and outlines how it is embracing the new standard to help clients better understand the entities whose instruments they trade and hold (for example, by mapping the LEI to Bloomberg’s numeric BUID).
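As an illustration of the kind of cross-referencing described above – joining LEI-keyed records to a vendor’s own numeric entity identifier – the snippet below uses a hypothetical lookup table; the field names and sample values are ours, not Bloomberg’s actual symbology or schema.

```python
# Hypothetical LEI-to-vendor-ID cross-reference; field names and sample values
# are illustrative only, not Bloomberg's actual symbology or schema.

vendor_xref = {
    "529900EXAMPLELEI0001": {"vendor_entity_id": 100045, "name": "Example Bank AG"},
}

# Positions held by the firm, keyed on the counterparty or issuer LEI.
positions = [
    {"lei": "529900EXAMPLELEI0001", "instrument": "XS0000000000", "qty": 1_000_000},
    {"lei": "529900UNKNOWNLEI0002", "instrument": "XS0000000001", "qty": 250_000},
]

for pos in positions:
    entity = vendor_xref.get(pos["lei"])
    if entity is None:
        # Unmapped LEIs go to the data management team as exceptions.
        print(f"unmapped entity {pos['lei']} on {pos['instrument']}")
    else:
        print(f"{pos['instrument']}: entity = {entity['name']} (vendor id {entity['vendor_entity_id']})")
```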

Download the White Paper Now

Source: A-TEAM 28.06.2012

Filed under: Data Management, Reference Data, Standards

A-TEAM launches Big Data 4 Finance

A-Team Group today launched BigDataForFinance.com, where it will cover the emerging science of big data and how it relates to financial markets applications – such as analysis of time series pricing, management of reference data and determination of sentiment from news archives. A-Team will also cover the evolving technology infrastructure that underpins big data applications, from storage to analytics and business intelligence.

A-TEAM: Let’s start by addressing a working definition for big data, as we see it.  Wikipedia has a pretty good starter: “Datasets that grow so large that they become awkward to work with using on-hand database management tools.”

But here’s our improvement on that: “Datasets whose characteristics – size, data type and frequency – are beyond efficient processing, storage and extraction by traditional database management tools.”

And let’s be clear, the focus is as much on the analysis of data to derive actionable business information as it is on handling different data types and high frequency updates.

To make sure you don’t miss news and contributions that could be valuable, sign up for the weekly email update here.

Source: A-TEAM, 18.01.2012

Filed under: Data Management, Data Vendor, Market Data, Reference Data, Risk Management

SunGard – 10 Historical Market Data Trends for 2012

Oliver Muhr, senior vice president of SunGard’s MarketMap business unit, said, “Economists, equity, fixed income researchers and quant traders need historical data to better understand growth opportunities and validate market positions and trading strategy. This requires not only more data, but more minute and granular information provided in a fast and efficient manner. SunGard offers information management tools that help enterprises filter and deliver accurate data for price discovery, financial modeling, risk management and business intelligence.”

The ten market data trends SunGard has identified for 2012 in historical data management are:

Transparency (Transparency and Evaluation Prices White Paper):

1. Firms need more consistent and timely reporting to meet new regulations and investor demands, creating greater strain on data infrastructures that feed risk reporting
2. Risk reports will be required by regulators and investors almost daily, while on-demand data will be needed to meet more advanced analytics
3. Greater transparency in analyzing the relationships between asset classes, such as complex derivatives, is driving the need for standardized entity and security identifiers, and cross symbology

Efficiency:

4. Larger data sets are required to feed predictive models, as more historical data over longer time periods and increased granularity of data sets power back-tests, forecasts and trading impacts throughout the day
5. Firms are focused on controlling variable data costs by centralizing historical data in one location to assess best price
6. Practitioners such as MBAs and CFAs want more flexible data management solutions that require less IT support so that they can spend more time discovering market opportunities
7. With globalization of markets, historical data brings greater complexity in terms of cross-border currencies, valuations and accounting standards – requiring improved accuracy and more market data coverage across assets and regions

Networks:

8. In order to perform advanced analytics and calculations required to support electronic trading strategies, firms must implement platforms that can store greater quantities of data and quickly retrieve and accurately process historical and time series data.
9. Vector storage, rather than traditional relational databases, will be needed to understand complex trends and scenarios (a brief illustration follows this list)
10. Cleaning and storing historical data is driving firms to seek plug-and-play technology that fits with industry standard infrastructures
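To make item 9 a little more concrete: holding a price history as contiguous per-field arrays (a vector or columnar layout) lets analytics run over whole series at once rather than row by row. A minimal sketch using NumPy with made-up prices – not SunGard’s technology – follows.

```python
import numpy as np

# The same history in two shapes:
#  - row-oriented (relational): one (timestamp, symbol, price) record per observation
#  - vector/columnar: one contiguous array per field, which is what the analytics want
dates = np.arange("2011-01-03", "2011-01-08", dtype="datetime64[D]")
prices = np.array([100.0, 100.8, 99.9, 101.2, 102.0])  # made-up closes

# Operations run over the whole vector at once instead of looping over rows.
log_returns = np.diff(np.log(prices))
rolling_mean = np.convolve(prices, np.ones(3) / 3, mode="valid")  # 3-day average

print("window:", dates[0], "to", dates[-1])
print("log returns:", np.round(log_returns, 4))
print("3-day rolling mean:", np.round(rolling_mean, 2))
```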

Paul Rowady, senior analyst at TABB Group, said, “Data management has been, and always will be, among the most critical components of the quantitative process. It is well known in the quant world that the depth of historical archive – the timeframe of data used for backtesting – is inversely proportional to the turnover of the strategy in question. Therefore, today’s trend toward slower-turnover strategies means that a proportional increase in the scale of the data will be required, as well as the most granular data possible in order to provide maximum flexibility for strategy development today and down the road. In fact, dealing with data at the granular level and in a hands-on environment is paradoxically the most valuable exercise a quant can do to understand subtle market inefficiencies.”
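As a toy illustration of the inverse relationship Rowady describes (this is our arithmetic, not a TABB formula): the less frequently a strategy turns over, the more calendar history is needed to observe the same number of independent trades.

```python
# Toy arithmetic only: the history required scales inversely with turnover.
# The target of 500 observed round trips is an arbitrary illustration.

def history_years_needed(round_trips_per_year: float, target_round_trips: int = 500) -> float:
    """Calendar years of data needed to observe a given number of round trips."""
    return target_round_trips / round_trips_per_year

for turnover in (250.0, 52.0, 12.0, 2.0):  # roughly daily, weekly, monthly, semi-annual
    print(f"{turnover:6.0f} round trips/year -> {history_years_needed(turnover):5.1f} years of history")
```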

Source: SunGard, 09.01.2012

Filed under: Data Management, Data Vendor, Market Data, Reference Data, Risk Management

Special Report: Evaluated Pricing Oct 2011 – A-TEAM

Valuations and pricing teams are facing a much higher degree of scrutiny from both the regulatory community and the investor community in the glare of the post-crisis data transparency spotlight. Fair value price transparency requirements and the gradual move towards a more harmonised accounting standards environment are set within the context of the whole debate about data quality across the financial services business, in light of incoming regulations such as Basel III and the Alternative Investment Fund Managers Directive (AIFMD). Whether it is related to risk management, pricing, trading or reporting, firms need to be able to stand behind their numbers.

The goal of the AIFMD is to create a level playing field and set basic standards for the operation of alternative investment funds in Europe via new reporting and governance requirements. On the pricing and valuations side of things, firms must establish what the directive calls “appropriate and consistent” procedures to allow for the independent valuation of a fund’s assets. In order to achieve this, the valuation must either be performed by an independent third party or by the asset manager, as long as there is functional separation between the pricing and portfolio management functions.

Download free report here

Source: A-Team, 12.10.2011

Filed under: Data Management, Data Vendor, Market Data, Reference Data, Standards

Markets in Financial Instruments Regulation (MiFIR): A New Breed of Data Requirements – A-TEAM

Rather than opting for an all in one directive to herald the second coming of MiFID, the European Commission has split the update into two parts: a regulation and a directive. The Markets in Financial Instruments Regulation (MiFIR) should be of particular interest to the data management community due to its focus on all aspects of data transparency, from trade data through to transaction reporting.

According to the draft of MiFIR, which is available to download at the bottom of the blog, the regulation: “sets out requirements in relation to the disclosure of trade transparency data to the public and transaction data to competent authorities, the authorisation and ongoing obligations applicable to providers of data services, the mandatory trading of derivatives on organised venues, and specific supervisory actions regarding financial instruments and positions in derivatives.” The data transparency requirements have therefore been neatly tied together under one regulatory banner, leaving the directive to deal with aspects such as the provision of investment services and conduct of business requirements for investment firms.

The draft regulation is the culmination of the work of the European Securities and Markets Authority (ESMA) and its predecessor over the last couple of years to gather industry feedback on the implementation of the first version of MiFID and to fill in any gaps, as well as to extend the regulation beyond the equities market. The draft paper notes that the European Commission has focused on assessing the impact of these new requirements including cost effectiveness and transparency; hence it is adopting a defensive stance ahead of any possible industry backlash on the subject.

Much like its predecessor, MiFIR is focused on improving cross border transparency and ensuring a level playing field with regards to data reporting requirements and access. Although the regulation contains a number of important pre-trade data transparency requirements such as equal access to data about trading opportunities, the most important aspects for data managers will likely reside in the post-trade section of MiFIR.

The extension of transparency requirements to OTC derivatives and fixed income instruments and the multilateral trading facility (MTF) and organised trading facility (OTF) contingents in the market is one such development. These markets, however, will not face the same level of transparency requirements as the equity markets, although “equity like” instruments such as depository receipts and exchange traded funds will see the MiFID requirements extended to cover them directly. All trading venues and their related trades will therefore now be subject to the same level of transparency requirements, but these will be tailored to the individual instrument types in question (the level of transparency will be determined by instrument type rather than venue).

On transaction reporting (the area of most relevance with regards to reference data standards), MiFIR aims to improve the quality of the data underlying these reports (a common theme across a lot of recent regulation – see commentary on which here) by being much more prescriptive in the standards that must be used. The idea is for firms to provide “full access to records at all stages in the order execution process” and for trading venues – extending beyond traditional exchanges to encompass MTFs and OTFs – to store relevant data for a period of five years. This data includes legal entity identification data, which the regulation indicates must be reported via approved mechanisms and formatted in a manner that makes it accessible for regulatory oversight purposes across borders.

The exact nature of the legal entity identification (LEI) and instrument identification standards that are to be used by firms in their transaction reports is likely to be impacted by the ongoing work at a global level as part of the systemic risk monitoring effort (see more here). At the moment, a range of identifiers is acceptable, and the regulatory community has been pushing towards the Bank Identifier Code (BIC) for some time (see more on which here), but this may change before MiFIR comes into force.
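Purely as an illustration of what more prescriptive identifier standards could mean in practice, the sketch below defines a minimal transaction report record and rejects it unless the counterparty identifier at least matches the shape of an LEI (20 alphanumeric characters) or a BIC (8 or 11 characters). The field names and checks are assumptions for the example, not the MiFIR technical standards.

```python
import re
from dataclasses import dataclass
from datetime import datetime, timezone

# Accepted identifier shapes for this example: a 20-character LEI or an 8/11-character
# BIC. The patterns and field names are assumptions, not the MiFIR technical standards.
LEI_RE = re.compile(r"^[A-Z0-9]{20}$")
BIC_RE = re.compile(r"^[A-Z]{6}[A-Z0-9]{2}([A-Z0-9]{3})?$")

@dataclass
class TransactionReport:
    executed_at: datetime
    instrument_id: str     # e.g. an ISIN
    counterparty_id: str   # LEI or BIC
    quantity: float
    price: float

def validate(report: TransactionReport) -> list:
    """Return a list of data-quality problems; an empty list means the report can be filed."""
    problems = []
    if not (LEI_RE.match(report.counterparty_id) or BIC_RE.match(report.counterparty_id)):
        problems.append("counterparty identifier is neither a plausible LEI nor a BIC")
    if report.executed_at.tzinfo is None:
        problems.append("execution timestamp must be timezone-aware")
    return problems

report = TransactionReport(datetime.now(timezone.utc), "XS0000000000",
                           "529900EXAMPLELEI0001", 1_000_000, 99.75)
print(validate(report) or "report passes basic format checks")
```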

Another important section of MiFIR is the one devoted to the “increased and more efficient data consolidation” for market data, which necessarily entails a reduction in the cost of this data. A City of London paper published earlier this year addressed this issue directly, noting that the majority of the European firms participating in the study believe poor data quality, high costs of pricing data and a reliance on vendors are the main barriers to post-trade transparency (see more here), and MiFIR appears to be aiming to directly address those issues.

The argument for some form of consolidated tape or tapes is an integral part of that endeavour (see recent industry commentary on this issue here) and MiFIR indicates that the aim is for data to be “reliable, timely and available at a reasonable cost.” On that last point, the regulation also includes a provision that all trading venues must make post-trade information available free of charge 15 minutes after execution, thus enabling data vendors to stay in business but increasing transparency overall (or so the logic goes). Moreover, the regulator is keen for a number of consolidated tape providers to offer market data services and improve access to a comparison of prices and trades across venues, rather than a single utility version.
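The 15-minute provision lends itself to a simple check. The sketch below, with hypothetical names, tests whether a post-trade report may yet be redistributed free of charge under the rule described above.

```python
from datetime import datetime, timedelta, timezone

FREE_AFTER = timedelta(minutes=15)  # per the provision described in the article

def free_distribution_allowed(execution_time: datetime, now: datetime) -> bool:
    """True once at least 15 minutes have elapsed since execution."""
    return now - execution_time >= FREE_AFTER

executed = datetime(2011, 9, 8, 10, 0, tzinfo=timezone.utc)
print(free_distribution_allowed(executed, executed + timedelta(minutes=5)))   # still chargeable
print(free_distribution_allowed(executed, executed + timedelta(minutes=20)))  # may be free
```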

In order to tackle the issue of a lack of data quality for trade reporting, all firms will also be required to publish their trade reports through approved publication arrangements (APAs), thus ensuring certain standards are adhered to.

The full MiFIR draft paper is downloadable here from A-TEAM.

Source: A-Team, Virginie’s Blog, 08.09.2011

Filed under: Data Management, Market Data, News, Reference Data, Risk Management, Standards
