FiNETIK – Asia and Latin America – Market News Network

Asia and Latin America News Network focusing on Financial Markets, Energy, Environment, Commodity and Risk, Trading and Data Management

Reference Data Utilities Offer Cost Savings, but Challenges Remain – DMS Review

Managed services and utilities can cut the cost of reference data, but to be truly effective managed services must be more flexible and utilities must address issues of data access and security.

A panel session led by A-Team Group editor-in-chief Andrew Delaney at the A-Team Group Data Management Summit in London set out to discover the advantages and challenges of managed services and utilities, starting with a definition of these data models.

Martijn Groot, director at Euroclear, said: “A managed service lifts out existing technology and hands it over to the managed service provider, while a utility provides common services for many users.” Tom Dalglish, CTO, group data at UBS, added: “Managed services run data solutions for us and utilities manage data for themselves.”

Based on these definitions, the panellists considered how and why managed services and utilities are developing. Dalglish commented: “We need to move away from all doing the same things with data. Managed business process outsourcing services are well understood, but utilities present more challenges – will they be run as monopolies and make data difficult to access, what is the vendor interest?” Steve Cheng, global head of data management at Rimes Technologies, added: “The market has moved on from lift outs. New technologies mean managed services can be more flexible than outsourcing.”

It is not only the nature of available services that is driving financial firms to third-party providers, but also cost and regulation, both of which are high on the agenda. Jonathan Clark, group head of financial services at Tech Mahindra, explained: “Cost is significant, but regulation is the number one issue. Regulations require more holistic and high quality data and that is high cost for firms, so they are trying to get data quality at a reasonable price point.”

Dalglish focussed on cost, saying: “The business case is about money. Large companies have lost the ability to change, a utility can help to reduce costs. Banks are looking at these data models to regain efficiencies they have lost internally and are difficult to rebuild.”

Cheng described the reference data utility model as being more like the satellite television model than water or electricity models, and noted that Rimes’ experience of customers is that they want to innovate, but not allow their cost base to increase.

While consensus among the panellists was that managed services and utilities can provide cost savings, they also agreed that it is not the cost of data, but the infrastructure, sources, services and people around the data that rack up the cost to an extent that is leading firms to seek lower cost solutions. Firms that opt to use a data utility can convert capital costs to expenditure and chip away at elements such as multiple data sources.

Dalglish commented: “If you can achieve savings of 30% to 35% that is good, but this is a conservative estimate and it should be possible to save more going forward.” Cheng added: “The rule of thumb is that for every £1 spent on data licences, £2 or £3 is spent on infrastructure and staff. The need is to identify those hidden costs so that the use of a managed service or utility can be justified.”

Returning to the pressure of regulation, Delaney asked the panel whether managed reference data services and utilities would be regulated in the same way as banks. While this is not happening at the moment, some panel members expect it to happen and warn that utilities may find a way around regulation by using disclaimers. Cheng said: “Forthcoming regulations are very prescriptive about data models and regulators may look at the whole data chain. This means utilities and managed services may in future be subject to the same regulatory requirements as other market participants.”

The concept of managed services and utilities is not new. Dalglish recalled an effort to set up a utility back in 2005 that did not take off, and said the moment for utilities has now come: the technology stack has improved, data is better understood, and this is a good time for competition and collaboration in the market. Groot added: “Data delivery mechanisms have changed, the bar has been raised on projects and the business case for an internal service is difficult, making external services attractive.” Panellists also noted technologies such as the Internet and cloud that facilitate mass customisation, and the benefit of utilities built for a single purpose.

With so much to offer, Delaney questioned the panel on what type of organisations will benefit from third-party utilities. Panel members said both large and small firms could benefit, with large companies reducing today’s massive data costs and small firms being able to hand off non-core reference data services. Clark added: “Firms that can benefit most are those that find it difficult to discover the cost of data, perhaps because it is managed in different departments or geographic regions. But these firms are also the hardest to convert because they don’t know their costs.”

A question from the audience about defining reference data, making it open and putting it in a utility for all to use, met a consensus response from panel members who said it is a great idea, but will not happen because there are too many vendors with vested interests in the market.

Closing with a blue skies scenario, Delaney asked how far the utility concept could go. Groot concluded: “There is a need for operational procedures and recovery planning, but utilities could go a long way as there is a lot of data in scope.”

Source: Reference Data Review, 08.10.2013.

Filed under: Data Management, Data Vendor, Reference Data, Standards

Reference Data: Tech Mahindra Details Global Data Utility Based on Acquired UBS Platform

Tech Mahindra, a business process outsourcing specialist and parent of London-based investment management technology consultancy Citisoft, has repositioned a reference data platform acquired from UBS Global Asset Management to offer an offshore reference data utility aimed at meeting market demand for lower cost, high quality data that can reduce risk and increase efficiency.

The global data utility has been introduced under the Tech Mahindra Managed Data Services brand and offers securities reference data across all asset types, reference data for corporate actions, tax information and end-of-day and intra-day validated pricing data. The utility handles data cleansing and validation, with clients buying licences to access the data.

Tech Mahindra suggests the utility differs from other offerings in the enterprise data management market as it is owned by the company and can be developed further. It is also agnostic on data feeds, taking some 20 feeds from vendors including SIX, Markit, Bloomberg, Thomson Reuters and DTCC.

The company’s first customer is UBS Fund Services in Luxembourg. Under the terms of a five-year services contract with UBS, Tech Mahindra will create and store golden copy data and provide multiple intra-day golden copies to the asset manager. As part of the acquisition and customer deal, Tech Mahindra, which is headquartered in Hyderabad, India, will take on some staff from UBS Global Asset Management who were working on the platform in Luxembourg, but most staff will be located in India.

As a repositioned platform, Tech Mahindra MDS already covers all time zones, markets and asset types, updates 2.5 million issuers on a daily basis, receives 200,000 customer price requests and validates 75,000 prices. Some 20,000 corporate actions are checked every day, along with 1,800 tax figures. Looking forward, Tech Mahindra plans to extend these metrics and add reference data around indices and benchmarks, legal entity identifiers and clients.

While Tech Mahindra will lead sales of the service to the banking, financial services and insurance sectors, Citisoft will be able to provide consultancy as necessary. Steve Young, CEO of Citisoft, says Tech Mahindra MDS has been designed to improve data quality and drive down the total cost of data ownership, in turn reducing risk and increasing efficiency. To manage clients’ cost issues, the company has built a toolkit into the data management system that allows users to analyse the cost of owning data, including people, processes and technology. Data quality will be underpinned by service level agreements and key performance indicators will be added as more clients sign up for services and data volumes grow.

Reflecting on the data challenges faced by financial firms, Citisoft Group CEO Jonathan Clark concludes: “Outsourcing models have evolved over time and attitudes are changing as firms acknowledge that there is a big difference between outsourcing and offshoring, and that captive outsourcing is not an efficient approach. The need is for a commercial relationship with a centralised data utility that can deliver high-quality, accurate data and a lower total cost of ownership.”

Source: Reference Data Review, 24.07.2013

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, Reference Data, Standards

Chile: Comder to launch Central Counterparty next year with Calypso clearing solution

A consortium of Chilean banks is forming a new central counterparty (CCP) for over-the-counter (OTC) derivatives next year. The Comder CCP has selected Calypso to provide the core clearing platform for the new launch, which will enable compliance with the post-crash rules laid out at the Pittsburgh G20 mandating more transparency and, effectively, ‘on exchange’ clearing.

The new Comder CCP will begin clearing non-deliverable forwards (NDFs) in Q4 2014 and interest rate derivatives (IRD) in Q1 2015. The CCP will be powered by Calypso, with its platform providing legal novation, affirmation, registration, limits, initial and variation margin, collateral management, default management, and centralised trade repository storage and reporting.

According to Felipe Ledermann, the chief executive of Comder, Calypso was chosen for its experience in OTC derivatives central clearing. Comder will receive on-going maintenance and support from the vendor after the platform is rolled out next year.

“We see Calypso as a strategic partner for one of the most important projects in the Chilean banking industry,” continued Ledermann. “This initiative allows us to build a best-in-class CCP with the highest standards and align with BIS-IOSCO principles for market infrastructures.”

Calypso already provides OTC derivatives clearing and processing infrastructure and technology to leading clearing houses, such as the Chicago Mercantile Exchange (CME), Eurex, BM&FBovespa, the Tokyo (TSE) and Singapore exchanges (SGX) and Hong Kong Exchanges and Clearing (HKEX). The Calypso clearing solution provides full cross-asset coverage, manages each step in the clearing process and delivers visibility into risk for cash and OTC derivatives products, claims the vendor. The single platform should also be scalable if Comder attracts significant volumes.

Commenting on the deal, Kishore Bopardikar, president and chief executive of Calypso Technology, said he was excited to provide a solution that will enable the Chilean market to move towards a centrally cleared derivatives environment, adding that “we are pleased to be supporting the development of such an important platform for the country”.

Source: Bobsguide, 23.07.2013

Filed under: Chile, Latin America, Standards

Data Management – a Finance, Risk and Regulatory Perspective – White Paper

Download this white paper – Data Management – a Finance, Risk and Regulatory Perspective – about the challenges facing financial institutions operating across borders with respect to ensuring data consistency and usability across multiple user types.

 The finance, risk and compliance operations of any financial institution need nimble access to business information, for performance measurement, risk management, and client and regulatory reporting. But although the underlying data may be the same, their individual requirements are different, reflecting the group-level view required by senior management and regulators, and the more operational view at the individual business level.

Where in the past risk managers have been left to figure out what’s most appropriate for their particular institutions, regulators today are adopting a more aggressive stance, challenging the assumptions underpinning banks’ approaches to risk management. As a result, today’s challenge is not only to understand the current regulatory, risk or finance requirements, but also to set in place the analytical framework that will help anticipate future requirements as they come on stream. To find out more, download the white paper now.

Source: A-Team, 02.07.2013

Filed under: Corporate Action, Data Management, Library, Reference Data, Risk Management, Standards

LEI – Dealing with Reality – How to Ensure Data Quality in the Changing Entity Identifier Landscape

“The Global LEI will be a marathon, not a sprint” is a phrase heard more than once during our series of Hot Topic webinars that’s charted the emergence of a standard identifier for entity data. Doubtless, it will be heard again.

But if we’re not exactly sprinting, we are moving pretty swiftly. Every time I think there’s nothing more to say on the topic, there is – well – more to say. With the artifice of the March ‘launch date’ behind us, it’s time to deal with reality. And the reality practitioners are having to deal with is one that’s changing rapidly.

Download the full and detailed report.

LEI-Dealing_with_reality-how_to_ensure_data_quality with Entity Identifiers_06_13.pdf

Source: A-Team, 26.06.2013

Filed under: Data Management, Data Vendor, Library, Reference Data, Standards

Thomson Reuters Outlines Plans to Lighten the Burden of Symbology

Thomson Reuters has set out its stall on symbology, saying it does not support the promotion of new identifiers as a means of improving data management, but is keen to support industry standards and plans to offer services such as symbology cross-referencing to ease the burden on data managers.

The company documents the development of symbology, its use and complexity in a white paper authored by Jason du Preez, head of symbology services at Thomson Reuters, and entitled ‘Solving for Symbology Discord, the Identity Challenge’.

Thomson Reuters set up a symbology business last year and published the white paper to acknowledge the importance of symbology and recognise its challenges. Du Preez says: “We don’t believe there is a silver bullet that will answer the problems of symbology. Innovative new products continue to exacerbate the problem and that is not going to change. We can, using our core competencies, create linkages, invest to take on the burden of linking data sets, and maintain code mapping. And we can allow the market to make more use of our intellectual property.”

Du Preez cites licences introduced last summer to extend the use of the company’s proprietary Reuters Instrument Codes (RICs) in non real-time content, as well as its agreement in response to a European Commission antitrust investigation to extend the use of RICs in real-time consolidated data feeds, as moves to open up how RICs are licensed and make them more accessible across all asset classes.

Integration of RICs with Proprietary Identifiers

He says: “As there is no silver bullet, we will invest more in cross-referencing services and tie in quality of information. We will have interesting things to offer over the next 18 months.” Among these he lists the integration of RICs and proprietary identifiers, with firms submitting their codes to Thomson Reuters and the company playing them back as part of its own codes. Other broad cross-referencing services will be tailored to allow clients to access only required cross references and linkages.

“Thomson Reuters doesn’t promote a new code, there are enough out there already. We will continue to use existing codes and extract value from them; the key is linkages between market vendor codes and proprietary structures. While clients face regulatory and cost drivers, we will take care of linkages and cross referencing to improve the breadth and quality of client content.”

Thomson Reuters’ white paper details the development of symbology and notes the company’s intent, as described by du Preez. It starts by mentioning irregular incidents in the market that remind the industry of the challenges involved when an aggregated or consolidated view across positions is needed, including the incompatibility of core data symbols. The paper states: “The core elements: security identification, counterparty identification and price discovery, were never developed to work efficiently and effectively on an enterprise/global scale.”

Looking at the current state of symbology, the paper flags the fragmented identification methods resulting from the market’s approach to symbology, including data providers’ and data aggregators’ different means of identifying the various parts of securities or counterparties, as well as firms’ creation of proprietary identifiers to fill gaps in vendor provision. The paper reports: “[Symbology] is still a ‘cottage industry’ where the identification schemes put in place by one group are locally focused and usually limited to a specific slice of the securities market. This consumes resources: in many cases the task of mapping multiple sets of disjointed or partially overlapping symbols can consume as much (or more) development time and computing resource as programming the business logic itself.”
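To make that mapping task concrete, here is a minimal sketch in Python (ours, not from the paper, and with illustrative identifier values rather than verified mappings) of the two-way cross-reference table a security master typically maintains so that a symbol from one scheme can be translated into another:

    from collections import defaultdict

    # One row per vendor symbol observed for an instrument, keyed to the
    # firm's own internal ID. Values are illustrative, not verified mappings.
    observations = [
        ("RIC", "VOD.L", "INST-001"),
        ("SEDOL", "B16GWD5", "INST-001"),
        ("BBGID", "BBG000C6K6G9", "INST-001"),
    ]

    by_vendor_symbol = {}                # (vendor, symbol) -> internal ID
    by_internal_id = defaultdict(dict)   # internal ID -> {vendor: symbol}
    for vendor, symbol, internal_id in observations:
        by_vendor_symbol[(vendor, symbol)] = internal_id
        by_internal_id[internal_id][vendor] = symbol

    # Translate an inbound RIC into the symbol another system expects.
    internal = by_vendor_symbol[("RIC", "VOD.L")]
    print(by_internal_id[internal].get("SEDOL"))  # B16GWD5

Even this toy version hints at why the paper calls mapping a resource sink: every new feed, renamed symbol or gap in vendor coverage means another round of table maintenance.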

The paper reviews changes in the financial industry since 1993 that have complicated symbology and notes the increasing difficulty, yet increasing need, to integrate information across a firm’s complete range of trading businesses to achieve effective risk management. On the flip side, it points to the parallel need to analyse rapidly growing stores of information and connect increasingly diverse datasets to find relevant information in the quest for alpha. It states: “The sophistication of the methods we employ to aggregate, rationalise and navigate information bears a direct relationship to the size of the lead a firm can have in the financial marketplace.”

How to Unambiguously Identify Information

While the outcome of linking and navigating information can be positive, it presents significant challenges as a lack of consistent and comprehensive global industry standards means firms must maintain symbology cross references, a difficult and often flawed task, particularly in banks with many different trade and compliance-related systems. Du Preez writes: “A popular approach is ‘we can build an adaptor’. Adaptors have become some of the most complex processes in banking technology. That is not data management. It is trying not to get eaten by the alligators.” He goes on to surmise: “Data managers do not want to deal with these problems – they ultimately want services they can reliably use to unambiguously identify information.”

Enter Thomson Reuters with its vision of how to resolve these problems. “We believe that these linkages are the key to enormous untapped value. Being able to enter the data model through any entity identifier (quote, security or legal entity) and easily navigate and explore all the linkages between related entities not only puts a firm in control of its risk position, but also creates a window into opportunities. Industry standards have a significant part to play as they provide a universal start and end point; Thomson Reuters is a strong supporter of symbology standards in the data industry and we will be first in line to adopt and link industry standard identifiers to our content sets.”

The report discusses the challenges propagated by the use of multiple symbologies and the workload associated with the maintenance of cross reference tables in local security master databases. It touches on Thomson Reuters’ plans to provide cross reference services centrally and leverage its core competencies and infrastructure to ease the burden on institutions that have traditionally solved the problems themselves.

It states: “Cross referencing is a reality that cannot be avoided – we aim to make this as accurate and cost-effective as possible for our customers. We also understand that while symbology is an important part of the picture, translation and synchronisation services will also play a critical part. The need for these services is evidenced by the burgeoning desire of the market to offload these onerous data management functions to specialist providers.” The report concludes: “Thomson Reuters is investing now to continue to expose the growing capabilities of its data management infrastructure and ensure that structured and unstructured data come together in a rich tapestry of knowledge with the aim of maximizing utility to trading algorithms, research, analysis and information discovery.”

Source: A-Team Reference Data Review, 26.03.2013

Filed under: Data Management, Data Vendor, Reference Data, Standards

Outsourcing Reference Data Management: Cost Reduction and New Revenue Opportunities

The past 12 months have seen the emergence of new players offering Business Process Outsourcing (BPO) services for Reference Data Management. These new arrivals expand the range of options available to financial institutions for addressing the challenges of regulatory compliance, operational cost reduction and scalability.

But BPO has other benefits, and innovative adopters have benefited from using the model to create new value-added services. By catering to clients’ data management needs, these players have been able to transform what’s traditionally been considered a cost centre into a new and significant source of revenue.

This paper – from AIM Software – explores this exciting new trend, and describes how an established financial institution took advantage of BPO to turn its enterprise data management initiative into a new source of revenue and business growth.

Download the White Paper Now to Find Out More

Source: A-Team, March 2013

Filed under: Corporate Action, Data Management, Data Vendor, Library, Market Data, Reference Data, Standards

Capco Proposes the Creation of a Data Culture to Advance Data Management – RDR

Financial firms are falling short on data management issues such as calculating the true cost of data, identifying the operational cost savings of improved data management and embracing social media data, but according to research by consultancy Capco, these issues can be resolved with a cross-organisational and practical approach to data management and the development of a data culture.

The business and technology consultancy’s report – ‘Why and how should you stop being an organisation that manages data and become a data management organisation’ – is based on interviews with close to 100 senior executives at European financial institutions. It considers the many approaches to data management across the industry and within individual enterprises, as well as the need to rethink data management. It states: “There is one certainty: data and its effective management can no longer be ignored.”

The report suggests an effective data management culture will include agreed best practices that are known to a whole organisation and leadership provided by a chief data officer (CDO) with a voice at board level and control of data management strategy and operational implementation.


Turning the situation around and attaching practical solutions to the data management vision of an all-encompassing data culture, Capco lists regulatory compliance, risk management, revenue increase, innovation and cost reduction as operational areas where good data management can have a measurable and positive effect on profit and loss.

Setting out how an organisation can create an effective data culture, Capco notes the need to change from being an organisation that is obliged to do a certain amount of data management, to a mandated and empowered data management organisation in which data has ongoing recognition as a key primary source. The report concludes: “Every organisation has the potential, as well as the need, to become a true data management organisation. However, the journey needs to begin now.”

Source: Reference Data Review, 24.10.2012

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, Reference Data, Standards

LEI Development Embraces Change and Picks up Speed Ahead of G20 Meeting

The Financial Stability Board’s (FSB) third progress note on the legal entity identifier (LEI) initiative, released last week, has met with a positive response from those involved in shaping the system, potential infrastructure providers and market data vendors, despite changes to some proposals and the collapse of perceptions that have built up during debate on how the final system could shape up.

But while progress is positive, there are still fundamental concerns around corporate hierarchies, as without agreed reference data on legal entity parent and relationship information, the LEI will not fulfil the effective risk aggregation function at the heart of the global system development.

The decisions and to-do lists outlined in the FSB progress note are significant steps forward in developing a global LEI system and come ahead of another major milestone this week when G20 finance ministers and central bank governors meet in Mexico City and will be asked to endorse a draft charter for the system’s Regulatory Oversight Committee (ROC). The charter has been drawn up by the FSB Implementation Group (IG) and is expected to be approved by the G20 meeting, setting in motion the creation of the ROC and the global LEI foundation that will underpin the Central Operating Unit (COU) and secure a governance framework designed to sustain the public good of the system.

One of the late changes identified in the progress note is a shift away from perceptions that entity identifier codes would be 20-character random numbers. Instead, the note describes a part-structured, part-random character string resulting from an ‘urgent’ request made by the FSB IG in September for the FSB LEI Private Sector Preparatory Group (PSPG) to consider how identifiers could best be issued for the purposes of a federated, global LEI system. The PSPG’s views were considered at meetings of the PSPG and IG in Basel earlier this month and a technical specification has been endorsed by the FSB plenary.

The FSB states in the progress note: “The FSB decision is provided now to deliver clarity and certainty to the private sector on the approach to be taken by potential pre-LEI systems that will facilitate the integration of such local precursor solutions into the global LEI system.”

On the basis of the arguments presented and discussed by the PSPG, the FSB has selected a structured number as the best approach for the global LEI system, although it acknowledges that the 20-character code, which complies with the existing ISO 17442 standard, will have no permanent embedded meaning. Instead, the structure is intended to avoid any overlap of random numbers in a federated issuing system by prefixing the numbers with a code for each local operating unit (LOU) assigning LEIs.

The breakdown then looks like this, with a short parsing sketch after the list:

· Characters 1-4: a four character prefix allocated uniquely to each LOU

· Characters 5-6: two reserved characters set to zero

· Characters 7-18: entity-specific part of the code generated and assigned by LOUs

· Characters 19-20: two check digits as described in ISO 17442.
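To make the scheme concrete, here is a minimal validation sketch in Python (ours, not from the progress note). It assumes the check digits follow the ISO 7064 MOD 97-10 calculation specified by ISO 17442, the same scheme used for IBAN validation:

    def lei_is_valid(lei: str) -> bool:
        # lei[0:4]   characters 1-4: prefix allocated uniquely to the LOU
        # lei[4:6]   characters 5-6: reserved, must be "00"
        # lei[6:18]  characters 7-18: entity-specific part assigned by the LOU
        # lei[18:20] characters 19-20: check digits per ISO 17442
        lei = lei.upper()
        alphabet = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ"
        if len(lei) != 20 or any(c not in alphabet for c in lei):
            return False
        if lei[4:6] != "00":
            # Grandfathered, fully random CICIs can fail this structural test
            # even though they are accepted into the system without restatement.
            return False
        # ISO 7064 MOD 97-10: map A..Z to 10..35, then the resulting numeric
        # string modulo 97 must leave remainder 1.
        numeric = "".join(str(int(c, 36)) for c in lei)
        return int(numeric) % 97 == 1

Because the four-character LOU prefix partitions the code space, entity-specific parts generated independently by different LOUs can never collide, which is the point of the structure in a federated issuing system.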

If this information has been a long time coming, the time to organise behind it is short: pre-LEI solutions that want to transition into the global LEI system must adopt the numbering scheme no later than November 30, just a month away. The LEI will be portable within the global LEI system, implying that the LEI code can be transferred from one LOU to another and that each LOU must have capacity to take responsibility for LEIs issued by other LOUs.

Following recommendations on data quality achieved through self-registration of legal entities in the FSB’s June 2012 report, the FSB goes on to decree that pre-LEI services should be based on self-registration, although this can include third-party registration made with the permission of the entity to be registered, and that from November 9 all pre-LEI systems must allow self-registration only.

No specific recommendations are made on how the Commodity Futures Trading Commission’s CFTC Interim Compliant Identifiers, or CICIs, which are entirely random numbers, will integrate with the LEI system, although the 27,000 or so already issued are expected to be grandfathered and accepted into the system without being restated.

Commenting on the LEI number structure, Peter Warms, global head of ID and symbology development at Bloomberg, says: “But for the prefix that identifies where the number was assigned from, the number is still random. This is good for data management practices as the number has no other data dependencies. I would question, however, whether the prefix of an identifier would be changed if it is moved to another LOU as this is not clear.”

Tim Lind, head of legal entity and corporate actions at Thomson Reuters, says: “We must put the debate on intelligent versus dumb numbers behind us and leave it as a milestone. Either solution could work and ongoing argument is not productive. The LEI principles are in place and we need to get on and get the work done.”

Both Warms and Lind applaud the advances made by the FSB and its working groups, but the need for speed remains if deadlines are to be met. And as the complex tasks of developing a legal foundation, ROC and governance framework for the LEI continue, Lind proposes a balance of perfection and pragmatism as the only way forward.

Another outcome of the Basel meetings that deflates earlier perceptions is a clear indication that the COU will not be located in one central place, but will instead be distributed across several locations. This is likely to emanate from the FSB’s hard-fought and firmly held desire to ensure the LEI system is a collective development for the public good, including a governance and operational framework that will encourage all jurisdictions to join in.

On the same basis, it has also become apparent that any suggestion that an LEI system could initially be based on a replica of the DTCC and Swift utility set up for the CFTC’s CICIs has been quashed. Instead, LOUs are expected to make their own technology choices to support the LEI – indeed they may already have systems in place – although they will, necessarily, have to conform with standards set by the COU.

If these are some of the recent gains in the LEI development, there is still much to be done ahead of having an ROC, COU and some LOUs in place by March 2013. Again sustaining a level playing field for the public good on a global basis, the FSB has asked the PSPG to build on initial work and consider the next phase of operational work that will focus on how the system can best address key issues in areas such as data quality, supporting local languages and characters, and drawing effectively on local infrastructure to deliver a truly global federated LEI system. The PSPG’s deadline to make proposals on these issues is the end of the year, generating the need for extremely swift action if the LEI system is to be up and running to any extent in March.

The final issue raised in the FSB’s progress note, and one which has yet to be openly debated and resolved, is ownership and hierarchy data associated with LEIs. The note states: “Addition of information on ownership and corporate hierarchies is essential to support effective risk aggregation, which is a key objective for the global LEI system. The IG is developing proposals for additional reference data on the direct and ultimate parent(s) of legal entities and on relationship (including ownership) data more generally and will prepare initial recommendations by the end of 2012. The IG is working closely with the PSPG to develop the proposals.”

This might be the FSB’s final note, but the issue has to be a top priority. As one observer puts it: “The next big thing is hierarchies. They need to be nailed down and there needs to be transparency. Work is being done on this, but without a good solution there will be no meaning in the LEI.”

Source: Reference Data Review, 29.10.2012

Filed under: Data Management, Reference Data, Standards

News and updates on LEI standard progress and development

As a follow-up on G20 acceptance in Los Cabos in June 2012 and the Financial Stability Board guidelines and recommendations on the legal entity identifier (LEI), we will regularly update this post with news and articles to provide an overview of LEI standard progress and development.

 
First published 13.07.2012, last updated 27.09.2012

Filed under: Data Management, Data Vendor, Reference Data, Standards

Symbology: EDI’s Corporate Actions Service Adopts Bloomberg Open Symbology

Free-use Data Tagging System Reduces Costs and Risks in Trading

Exchange Data International (EDI), a premier back office financial data provider, today announced it has adopted Bloomberg’s Global Securities Identifiers (‘BBGID’) to name and track all equity securities in its Worldwide Corporate Actions service.

EDI is the latest financial data provider to adopt Bloomberg’s Open Symbology (BSYM), an open and free-use system for naming global securities across all asset classes with a BBGID, a 12-character alphanumeric identifier for financial instruments. EDI has implemented BBGID numbers in its equities reference, pricing and corporate actions data feeds. Its Worldwide Corporate Actions service provides detailed information on 50 corporate action event types affecting equities listed on 160 exchanges.
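The article gives only the identifier’s length; as a hedged illustration, assuming the publicly documented BSYM layout (a ‘BBG’ prefix, eight characters drawn from digits and uppercase consonants, and a trailing check digit), a structural test might look like this:

    # Digits plus uppercase consonants: vowels are excluded from the body
    # of the identifier so that codes cannot accidentally spell words.
    ALLOWED = set("0123456789BCDFGHJKLMNPQRSTVWXYZ")

    def looks_like_bbgid(code: str) -> bool:
        # Structural check only; the check-digit calculation is omitted here.
        return (
            len(code) == 12
            and code.startswith("BBG")
            and all(c in ALLOWED for c in code[3:11])
            and code[11].isdigit()  # final character is a check digit
        )

    print(looks_like_bbgid("BBG000000000"))  # True: placeholder with a valid shape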

“EDI decided to integrate Bloomberg’s Open Symbology, as it is easily accessible and has no license fee or restrictions on usage,” said Jonathan Bloch, the Chief Executive Officer of EDI. “Bloomberg’s Symbology also advances straight-through processing of equity orders, which aids reporting and compliance management.”

Peter Warms, Global Head of Bloomberg Open Symbology, said, “Existing identifiers that change due to underlying corporate actions introduce inefficiencies, increase costs and add complexity to the data management process. Bloomberg and EDI recognise the importance of comprehensive, open and unchanging identifiers, like the BBGID, in enabling customers to track unique securities consistently and to process corporate action data seamlessly. As BSYM grows in adoption, interoperability across market systems and software using BSYM will improve steadily and reduce operational costs.”

Source: Bobsguide, 24.09.2012

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, Reference Data, Standards

Whitepaper: Bloomberg to embrace emerging LEI

The industry initiative to develop and promote a standard global legal entity identifier (LEI) is expected to significantly reduce the opacity associated with complex financial instruments, widely acknowledged to be a major contributing factor in the 2008 credit crisis.

In this white paper, Bloomberg explains the implications of the emerging LEI for financial institutions, and outlines how it is embracing the new standard to help clients better understand the entities whose instruments they trade and hold (such as mapping the LEI to Bloomberg’s numeric BUID).

Download the White Paper Now

Source: A-Team, 28.06.2012

Filed under: Data Management, Reference Data, Standards

Thomson Reuters Opens RICs to all with Non-Realtime License

Thomson Reuters is taking a step toward answering client calls for more open access to its Reuters Instrument Code (RIC) symbology. The company is making RICs available for use with non-real-time information in client and non-client financial institutions’ trade processing systems.

According to enterprise content chief Gerry Buggy, who has spearheaded Thomson Reuters’ response to the EC anti-competition complaint, the new facility is the “first step in supporting the financial community’s symbology needs across all parts of the trading life cycle through our evolving symbology services.”

The move comes in the wake of the EC investigation and subsequent complaint into the use of RICs in real-time consolidated data feeds. In response to that complaint, many financial services practitioners have called for more open access to the RIC, which is entrenched in many firms front-, middle- and back-office trading and trade processing systems.

According to Jason du Preez, Global Business Manager, Enterprise Platform, at Thomson Reuters, the latest initiative “has nothing to do with the EC investigation. The EC is focused on use of RICs for accessing real-time information, while the new licences are focused at firms looking to trade with the RIC or use the RIC to access non-real-time information.”

Du Preez says the latest move means that “any market participant can buy a license that will allow them to trade using the RIC. This will allow the use of the RIC for pre- and post-trade activities, and the right to redistribute RICs in this regard.”

The new RICs arrangement will allow market participants to use and cross-reference the RIC symbol for trade activities. As such, it can be used to facilitate the advertisement of liquidity, acceptance of trade flow and execution of post trade activities with the RIC symbol as a consistent identifier throughout the process.

Additionally, the service will allow Thomson Reuters pricing and reference data customers to use RICs to reference and retrieve securities data from their securities master databases and navigate to connected content such as legal entity identifier (LEI) information.

Du Preez says that “Firms that purchase reference data from Thomson Reuters will also be granted the right to use the RIC to access any non-real-time information, essentially allowing them to use the RIC to access any content, including third-party content, held in their securities master databases.”

Thomson Reuters believes the new service will encourage more efficient and reliable capital markets by giving market participants the freedom to use RIC symbols irrespective of whether they use Thomson Reuters enterprise data products.

As part of the latest initiative, the BATS Chi-X Europe exchange has signed up for the service, which will allow it to deploy RICs in the post-trade services it offers.

According to Paul O’Donnell, COO at BATS Chi-X Europe, “Cross-referencing the BATS Chi-X Europe instrument codes with the Thomson Reuters RIC symbols will enable us to reach new market participants as well as improve efficiency and data transparency by facilitating accurate identification of securities on our platform.”

Du Preez says obvious candidates for adopting the new arrangement include “trade hubs, third-party trade/post-trade processing firms or anyone that wants to send, receive or cross reference messages that contain securities identified with a RIC.”

Source: A-Team Reference Data Review 27.06.2012

Filed under: Data Management, Data Vendor, Reference Data, Standards

Reference Data: LEI system Real and Ready for Use…or maybe not?

The morning after the G-20 leaders endorsed the Financial Stability Board’s recommendations for a global system of precisely identifying legal entities, the co-chairwoman of the LEI Trade Association Group said, “I think we have something that is real and ready for use.’’

Robin Doyle, a senior vice president at JPMorgan Chase, noted that 20,000 ready-to-use “legal entity identifiers” have already been generated by a prototype jointly developed by the Depository Trust and Clearing Corporation and the Society for Worldwide Interbank Financial Telecommunication.

The online portal that would allow financial market participants to register and receive 20-character ID codes and to search for the codes of counterparties or other entities was demonstrated Wednesday morning at the 2012 Technology Leaders Forum of the Securities Industry and Financial Markets Association.

That portal can be turned live “within 24 hours” of being needed, said Mark Davies, vice president, business development, at The Depository Trust & Clearing Corporation, during the demonstration.

The LEI Trade Association Group represents a group of firms and financial industry trade associations trying to develop a global and uniform legal entity identifier. The group is supported by the Global Financial Markets Association, which includes SIFMA.

SIFMA and a variety of other trade groups have recommended that DTCC and SWIFT operate a central authority for registering and issuing the codes that the leaders of the G-20 industrial nations Tuesday endorsed.

The G-20 endorsed the 35 recommendations of an international coordinator known as the Financial Stability Board.

The board’s recommendations differed in one significant aspect from the SIFMA and trade association recommendation. Where the trade groups recommended a centralized system for registering and issuing ID codes – a point reinforced Tuesday in opening remarks at SIFMA Tech by SIFMA president T. Timothy Ryan Jr. – the FSB recommended a “federated” registration model. Under that approach, local authorities, aka nations, could and theoretically would act as the agencies for registration, issuing and storing the codes.

The central authority would maintain a database that would be logically managed, but whose contents might be spread around the world, as on servers spread across the Internet.

“We think it can work,” but it has to be set up and maintained properly, Doyle said.

The federated model will only be as good as its adherence to the global standards set by the FSB and the International Organization for Standardization, which defined the 20-character code.

Doyle said a central authority under the FSB approach likely will need to conduct audits of local operating units, to ensure compliance with the overall standards. The challenge will be to make sure the codes are kept correctly and not, in some fashion, duplicated.

The local authorities will need to take on the expense of maintaining high standards. “It is an expensive, difficult process to validate data,” Doyle said.

“A public-facing system like this needs a huge amount of control,” Davies said.

The next shoe to drop on the development of the system will come within the next couple weeks. That’s when Commodity Futures Trading Commission member Scott O’Malia said a decision will be announced on what organization or organizations will handle the registration and issuance of ID codes for the swaps markets it will oversee. O’Malia said at SIFMA Tech Tuesday that the decision among what industry executives say are four competing proposals will come “very soon.”

Srinivas Bangarbale, the CFTC’s Chief Data Officer, said Wednesday that the regulator’s “interim compliant identifier” will support the ISO 17442 standard set out by the FSB and ISO.

He said the regulator’s decision to move ahead “presupposed the standard” and that the chosen implementing group would “adopt the standards as published”. The CFTC will not directly or indirectly create another set of reference data for the industry to keep track of.

“It’s important to use the standard as soon as possible,” he said, however.

O’Malia said the CFTC is likely to begin issuing IDs as early as September. That is so the commission can fulfill its mandate to oversee interest-rate and credit-default swap markets, as mandated by the 2010 Dodd-Frank Wall Street Reform Act.

The FSB’s implementation schedule calls for a functional system to be ready to use by March 2013.

Source: Securities Technology Monitor, 20.06.2012 by Tom Steinert-Threlkeld

Filed under: Data Management, Reference Data, Standards
