FiNETIK – Asia and Latin America – Market News Network

Asia and Latin America News Network focusing on Financial Markets, Energy, Environment, Commodity and Risk, Trading and Data Management

Reference Data Utilities Offer Cost Savings, but Challenges Remain – DMS Review

Managed services and utilities can cut the cost of reference data, but to be truly effective managed services must be more flexible and utilities must address issues of data access and security.

A panel session led by A-Team Group editor-in-chief Andrew Delaney at the A-Team Group Data Management Summit in London set out to discover the advantages and challenges of managed services and utilities, starting with a definition of these data models.

Martijn Groot, director at Euroclear, said: “A managed service lifts out existing technology and hands it over to the managed service provider, while a utility provides common services for many users.” Tom Dalglish, CTO, group data at UBS, added: “Managed services run data solutions for us and utilities manage data for themselves.”

Based on these definitions, the panellists considered how and why managed services and utilities are developing. Dalglish commented: “We need to move away from all doing the same things with data. Managed business process outsourcing services are well understood, but utilities present more challenges – will they be run as monopolies and make data difficult to access, what is the vendor interest?” Steve Cheng, global head of data management at Rimes Technologies, added: “The market has moved on from lift outs. New technologies mean managed services can be more flexible than outsourcing.”

It is not only the nature of available services that is driving financial firms to third-party providers, but also cost and regulation, both of which are high on the agenda. Jonathan Clark, group head of financial services at Tech Mahindra, explained: “Cost is significant, but regulation is the number one issue. Regulations require more holistic and high quality data and that is high cost for firms, so they are trying to get data quality at a reasonable price point.”

Dalglish focussed on cost, saying: “The business case is about money. Large companies have lost the ability to change; a utility can help to reduce costs. Banks are looking at these data models to regain efficiencies that have been lost internally and are difficult to rebuild.”

Cheng described the reference data utility model as being more like the satellite television model than water or electricity models, and noted that Rimes’ experience of customers is that they want to innovate, but not allow their cost base to increase.

While consensus among the panellists was that managed services and utilities can provide cost savings, they also agreed that it is not the cost of data itself, but the infrastructure, sources, services and people around the data that rack up costs to an extent that is leading firms to seek lower cost solutions. Firms that opt to use a data utility can convert capital costs into operating expenditure and chip away at elements such as multiple data sources.

Dalglish commented: “If you can achieve savings of 30% to 35% that is good, but this is a conservative estimate and it should be possible to save more going forward.” Cheng added: “The rule of thumb is that for every £1 spent on data licences, £2 or £3 is spent on infrastructure and staff. The need is to identify those hidden costs so that the use of a managed service or utility can be justified.”
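Taken at face value, Cheng’s rule of thumb makes the hidden cost multiple easy to estimate. Below is a minimal sketch in Python of that arithmetic; the function name and the 2.5x default multiplier are illustrative assumptions, beyond the £2–£3 range the panel quoted.

```python
def estimated_total_cost(licence_spend: float, overhead_multiplier: float = 2.5) -> dict:
    """Rough total-cost-of-ownership split implied by the panel's rule of thumb:
    every £1 spent on data licences carries roughly £2-£3 of infrastructure
    and staff cost (assumed here to be 2.5x by default)."""
    hidden = licence_spend * overhead_multiplier
    return {
        "licences": licence_spend,
        "infrastructure_and_staff": hidden,
        "total": licence_spend + hidden,
    }

# Example: £4m of annual licence spend implies a £14m total cost base.
print(estimated_total_cost(4_000_000))
```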

Returning to the pressure of regulation, Delaney asked the panel whether managed reference data services and utilities would be regulated in the same way as banks. While this is not happening at the moment, some panel members expect it to happen and warn that utilities may find a way around regulation by using disclaimers. Cheng said: “Forthcoming regulations are very prescriptive about data models and regulators may look at the whole data chain. This means utilities and managed services may in future be subject to the same regulatory requirements as other market participants.”

The concept of managed services and utilities is not new. Dalglish recalled an effort to set up a utility back in 2005 that did not take off, and said that the moment has now come for utilities as the technology stack has improved, data is better understood and this is a good time for competition and collaboration in the market. Groot added: “Data delivery mechanisms have changed, the bar has been raised on projects and the business case for an internal service is difficult, making external services attractive.” Panellists also noted that technologies such as the internet and cloud facilitate mass customisation, and pointed to the benefit of utilities that are built for a single purpose.

With so much to offer, Delaney questioned the panel on what type of organisations will benefit from third-party utilities. Panel members said both large and small firms could benefit, with large companies reducing today’s massive data costs and small firms being able to hand off non-core reference data services. Clark added: “Firms that can benefit most are those that find it difficult to discover the cost of data, perhaps because it is managed in different departments or geographic regions. But these firms are also the hardest to convert because they don’t know their costs.”

A question from the audience about defining reference data, making it open and putting it in a utility for all to use, met a consensus response from panel members who said it is a great idea, but will not happen because there are too many vendors with vested interests in the market.

Closing with a blue skies scenario, Delaney asked how far the utility concept could go. Groot concluded: “There is a need for operational procedures and recovery planning, but utilities could go a long way as there is a lot of data in scope.”

Source: Reference Data Review, 08.10.2013.

Filed under: Data Management, Data Vendor, Reference Data, Standards

Reference Data: Tech Mahindra Details Global Data Utility Based on Acquired UBS Platform

Tech Mahindra, a business process outsourcing specialist and parent of London-based investment management technology consultancy Citisoft, has repositioned a reference data platform acquired from UBS Global Asset Management to offer an offshore reference data utility aimed at meeting market demand for lower cost, high quality data that can reduce risk and increase efficiency.

The global data utility has been introduced under the Tech Mahindra Managed Data Services brand and offers securities reference data across all asset types, reference data for corporate actions, tax information and end-of-day and intra-day validated pricing data. The utility handles data cleansing and validation, with clients buying licences to access the data.

Tech Mahindra suggests the utility differs from other offerings in the enterprise data management market as it is owned outright by the company and can be developed further. It is also agnostic on data feeds, taking in some 20 feeds from vendors including SIX, Markit, Bloomberg, Thomson Reuters and DTCC.

The company’s first customer is UBS Fund Services in Luxembourg. Under the terms of a five-year services contract with UBS, Tech Mahindra will create and store golden copy data and provide multiple intra-day golden copies to the asset manager. As part of the acquisition and customer deal, Tech Mahindra, which is headquartered in Hyderabad, India, will take on some staff from UBS Global Asset Management who were working on the platform in Luxembourg, but most staff will be located in India.
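The article does not detail how the golden copies are built, but the underlying technique is standard in data management: merge records from multiple vendor feeds, resolving each field by a source-priority rule. The sketch below illustrates the idea only; the feed names, field names and priority order are hypothetical, not Tech Mahindra’s.

```python
# Minimal golden-copy merge: for each field, take the value from the
# highest-priority vendor feed that supplies it. All names are illustrative.
SOURCE_PRIORITY = ["vendor_a", "vendor_b", "vendor_c"]

def build_golden_copy(records_by_source: dict[str, dict]) -> dict:
    golden = {}
    fields = {f for record in records_by_source.values() for f in record}
    for field in fields:
        for source in SOURCE_PRIORITY:
            value = records_by_source.get(source, {}).get(field)
            if value is not None:
                golden[field] = value
                break  # first (highest-priority) non-missing value wins
    return golden

feeds = {
    "vendor_a": {"isin": "US0378331005", "coupon": None},
    "vendor_b": {"isin": "US0378331005", "coupon": 0.0, "maturity": None},
    "vendor_c": {"maturity": "2030-06-15"},
}
# isin resolves from vendor_a, coupon from vendor_b, maturity from vendor_c.
print(build_golden_copy(feeds))
```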

As a repositioned platform, Tech Mahindra MDS already covers all time zones, markets and asset types, updates 2.5 million issuers on a daily basis, receives 200,000 customer price requests and validates 75,000 prices. Some 20,000 corporate actions are checked every day, along with 1,800 tax figures. Looking forward, Tech Mahindra plans to extend these metrics and add reference data around indices and benchmarks, legal entity identifiers and clients.

While Tech Mahindra will lead sales of the service to the banking, financial services and insurance sectors, Citisoft will be able to provide consultancy as necessary. Steve Young, CEO of Citisoft, says Tech Mahindra MDS has been designed to improve data quality and drive down the total cost of data ownership, in turn reducing risk and increasing efficiency. To manage clients’ cost issues, the company has built a toolkit into the data management system that allows users to analyse the cost of owning data, including people, processes and technology. Data quality will be underpinned by service level agreements and key performance indicators will be added as more clients sign up for services and data volumes grow.

Reflecting on the data challenges faced by financial firms, Citisoft Group CEO Jonathan Clark concludes: “Outsourcing models have evolved over time and attitudes are changing as firms acknowledge that there is a big difference between outsourcing and offshoring, and that captive outsourcing is not an efficient approach. The need is for a commercial relationship with a centralised data utility that can deliver high-quality, accurate data and a lower total cost of ownership.”

Source: Reference Data Review, 24.07.2013

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, Reference Data, Standards

Data Management – a Finance, Risk and Regulatory Perspective – White Paper

Download this white paper – Data Management – a Finance, Risk and Regulatory Perspective – about the challenges facing financial institutions operating across borders with respect to ensuring data consistency and usability across multiple user types.

The finance, risk and compliance operations of any financial institution need nimble access to business information, for performance measurement, risk management, and client and regulatory reporting. But although the underlying data may be the same, their individual requirements are different, reflecting the group-level view required by senior management and regulators, and the more operational view at the individual business level.

Where in the past, risk managers have been left to figure out what’s most appropriate for their particular institutions, regulators today are adopting a more aggressive stance, challenging the assumptions underpinning banks’ approaches to risk management. As a result, today’s challenge is not only to understand the current regulatory, risk or finance requirements, but also to set in place the analytical framework that will help anticipate future requirements as they come on stream. To find out more, download the white paper now

Source: A-Team, 02.07.2013

Filed under: Corporate Action, Data Management, Library, Reference Data, Risk Management, Standards

LEI – Dealing with Reality: How to Ensure Data Quality in the Changing Entity Identifier Landscape

“The Global LEI will be a marathon, not a sprint” is a phrase heard more than once during our series of Hot Topic webinars that has charted the emergence of a standard identifier for entity data. Doubtless, it will be heard again.

But if we’re not exactly sprinting, we are moving pretty swiftly. Every time I think there’s nothing more to say on the topic, there is – well – more to say. With the artifice of the March ‘launch date’ behind us, it’s time to deal with reality. And the reality practitioners are having to deal with is one that’s changing rapidly.

Download the full and detailed report:

LEI-Dealing_with_reality-how_to_ensure_data_quality with Entity Identifiers_06_13.pdf

Source: A-Team, 26.06.2013

Filed under: Data Management, Data Vendor, Library, Reference Data, Standards

Valuations – Toward On-Demand Evaluated Pricing

Risk and regulatory imperatives are demanding access to the latest portfolio information, placing new pressures on the pricing and valuation function. And the front office increasingly wants up-to-date valuations of hard-to-price securities.

These developments are driving a push toward on-demand evaluated pricing capabilities, with pricing teams seeking to provide access to valuations at a higher frequency of update than before.

Download this special report for FREE now! Click the link below.

Source: A-Team, 26.06.2013

Filed under: Data Vendor, Library, Market Data, Reference Data

B-Source (Avaloq Group) Takes Over Wealth Management Operations Back-Office of Deutsche Bank (Switzerland)

Avaloq subsidiary B-Source is taking over operational responsibility for the Wealth Management Operations Back-Office of Deutsche Bank (Switzerland) Ltd with effect from July 1, 2013, supporting the bank in concentrating further on its core business.

With effect from July 1, 2013, B-Source, a member of the Avaloq group, is taking over operational responsibility for the Wealth Management Operations Back-Office of Deutsche Bank (Switzerland) Ltd, including 80 employees in Geneva. Transformation of the core-banking platform to the integrated Avaloq Banking Suite is scheduled for summer 2014.

Markus Gröninger, CEO of B-Source, said: “Our continuously growing community corresponds clearly to our growth path. This confirms that we are on the right track with our strategy of offering highly industrialised services that let banks concentrate on their core business and generate future growth.”

Francisco Fernandez, CEO of Avaloq, is also very pleased with the new deal: “In migrating to the integrated Avaloq Banking Suite, our clients are setting the highest standards in terms of individualising processes and industrialising operations. We love challenges like that. This is how we generate maximum added value for our clients.”

Source: Avaloq, 28.06.2013

Filed under: Data Management, Reference Data, Wealth Management

Thomson Reuters Outlines Plans to Lighten the Burden of Symbology

Thomson Reuters has set out its stall on symbology, saying it does not support the promotion of new identifiers as a means of improving data management, but is keen to support industry standards and has plans to offer services such as symbology cross-referencing to ease the burden on data managers.

The company documents the development of symbology, its use and complexity in a white paper authored by Jason du Preez, head of symbology services at Thomson Reuters, and entitled ‘Solving for Symbology Discord, the Identity Challenge’.

Thomson Reuters set up a symbology business last year and published the white paper to acknowledge the importance of symbology and recognise its challenges. Du Preez says: “We don’t believe there is a silver bullet that will answer the problems of symbology. Innovative new products continue to exacerbate the problem and that is not going to change. We can, using our core competencies, create linkages, invest to take on the burden of linking data sets, and maintain code mapping. And we can allow the market to make more use of our intellectual property.”

Du Preez cites licences introduced last summer to extend the use of the company’s proprietary Reuters Instrument Codes (RICs) in non real-time content, as well as its agreement in response to a European Commission antitrust investigation to extend the use of RICs in real-time consolidated data feeds, as moves to open up how RICs are licensed and make them more accessible across all asset classes.

Integration of RICs with Proprietary Identifiers

He says: “As there is no silver bullet, we will invest more in cross-referencing services and tie in quality of information. We will have interesting things to offer over the next 18 months.” Among these he lists the integration of RICs and proprietary identifiers, with firms submitting their codes to Thomson Reuters and the company playing them back as part of its own codes. Other broad cross-referencing services will be tailored to allow clients to access only required cross references and linkages.

“Thomson Reuters doesn’t promote a new code; there are enough out there already. We will continue to use existing codes and extract value from them; the key is linkages between market vendor codes and proprietary structures. While clients face regulatory and cost drivers, we will take care of linkages and cross referencing to improve the breadth and quality of client content.”

Thomson Reuters’ white paper details the development of symbology and notes the company’s intent, as described by du Preez. It starts by mentioning irregular incidents in the market that remind the industry of the challenges involved when an aggregated or consolidated view across positions is needed, including the incompatibility of core data symbols. The paper states: “The core elements: security identification, counterparty identification and price discovery, were never developed to work efficiently and effectively on an enterprise/global scale.”

Looking at the current state of symbology, the paper flags the fragmented identification methods resulting from the market’s approach to symbology, including data providers’ and data aggregators’ different means of identifying the various parts of securities or counterparties, as well as firms’ creation of proprietary identifiers to fill gaps in vendor provision. The paper reports: “[Symbology] is still a ‘cottage industry’ where the identification schemes put in place by one group are locally focused and usually limited to a specific slice of the securities market. This consumes resources: in many cases the task of mapping multiple sets of disjointed or partially overlapping symbols can consume as much (or more) development time and computing resource as programming the business logic itself.”

The paper reviews changes in the financial industry since 1993 that have complicated symbology and notes the increasing difficulty, yet increasing need, to integrate information across a firm’s complete range of trading businesses to achieve effective risk management. On the flip side, it points to the parallel need to analyse rapidly growing stores of information and connect increasingly diverse datasets to find relevant information in the quest for alpha. It states: “The sophistication of the methods we employ to aggregate, rationalise and navigate information bears a direct relationship to the size of the lead a firm can have in the financial marketplace.”

How to Unambiguously Identify Information

While the outcome of linking and navigating information can be positive, it presents significant challenges as a lack of consistent and comprehensive global industry standards means firms must maintain symbology cross references, a difficult and often flawed task, particularly in banks with many different trade and compliance-related systems. Du Preez writes: “A popular approach is ‘we can build an adaptor’. Adaptors have become some of the most complex processes in banking technology. That is not data management. It is trying not to get eaten by the alligators.” He goes on to surmise: “Data managers do not want to deal with these problems – they ultimately want services they can reliably use to unambiguously identify information.”

Enter Thomson Reuters with its vision of how to resolve these problems. “We believe that these linkages are the key to enormous untapped value. Being able to enter the data model through any entity identifier (quote, security or legal entity) and easily navigate and explore all the linkages between related entities not only puts a firm in control of its risk position, but also creates a window into opportunities. Industry standards have a significant part to play as they provide a universal start and end point; Thomson Reuters is a strong supporter of symbology standards in the data industry and we will be first in line to adopt and link industry standard identifiers to our content sets.”
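The linkage model described here is essentially a graph: identifiers at different levels (quote, security, legal entity) are nodes, and cross-references are edges that let a user enter at any node and walk to everything related. A minimal sketch of that idea follows; the identifiers and links are entirely hypothetical, not Thomson Reuters data.

```python
# Toy linkage graph: enter through any identifier and discover all related
# entities by breadth-first traversal. All nodes and edges are invented.
from collections import deque

LINKS = {
    "QUOTE:XYZ.L":    ["SEC:XYZ-ORD"],
    "QUOTE:XYZ.N":    ["SEC:XYZ-ADR"],
    "SEC:XYZ-ORD":    ["QUOTE:XYZ.L", "ENTITY:XYZ-PLC"],
    "SEC:XYZ-ADR":    ["QUOTE:XYZ.N", "ENTITY:XYZ-PLC"],
    "ENTITY:XYZ-PLC": ["SEC:XYZ-ORD", "SEC:XYZ-ADR"],
}

def related(start: str) -> set[str]:
    """Everything reachable from one entry point into the data model."""
    seen, queue = {start}, deque([start])
    while queue:
        for neighbour in LINKS.get(queue.popleft(), []):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(neighbour)
    return seen - {start}

# Entering at a single London quote surfaces the ADR quote, both securities
# and the issuing legal entity.
print(related("QUOTE:XYZ.L"))
```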

The report discusses the challenges propagated by the use of multiple symbologies and the workload associated with the maintenance of cross reference tables in local security master databases. It touches on Thomson Reuters’ plans to provide cross reference services centrally and leverage its core competencies and infrastructure to ease the burden on institutions that have traditionally solved the problems themselves.

It states: “Cross referencing is a reality that cannot be avoided – we aim to make this as accurate and cost-effective as possible for our customers. We also understand that while symbology is an important part of the picture, translation and synchronisation services will also play a critical part. The need for these services is evidenced by the burgeoning desire of the market to offload these onerous data management functions to specialist providers.” The report concludes: “Thomson Reuters is investing now to continue to expose the growing capabilities of its data management infrastructure and ensure that structured and unstructured data come together in a rich tapestry of knowledge with the aim of maximizing utility to trading algorithms, research, analysis and information discovery.”

Source: A-Team Reference Data Review, 26.03.2013

Filed under: Data Management, Data Vendor, Reference Data, Standards

Outsourcing Reference Data Management: Cost Reduction and New Revenue Opportunities

The past 12 months have seen the emergence of new players offering Business Process Outsourcing (BPO) services for Reference Data Management. These new arrivals expand the range of options available to financial institutions for addressing the challenges of regulatory compliance, operational cost reduction and scalability.

But BPO has other benefits, and innovative adopters have benefited from using the model to create new value-added services. By catering to clients’ data management needs, these players have been able to transform what’s traditionally been considered a cost centre into a new and significant source of revenue.

This paper – from AIM Software – explores this exciting new trend, and describes how an established financial institution took advantage of BPO to turn its enterprise data management initiative into a new source of revenue and business growth.

Download the White Paper Now to Find Out More

Source: A-Team, March 2013

Filed under: Corporate Action, Data Management, Data Vendor, Library, Market Data, Reference Data, Standards

SIX Financial Information and LUZ Engenharia Financeira in strategic cooperation in Brazil

Zurich, Switzerland – SIX Financial Information and LUZ Engenharia Financeira, the largest provider of risk management software and consulting services to buy- and sell-side institutions in Brazil, have established a strategic relationship to meet the growing need for broader and deeper international financial market data in Brazil.

As Brazil’s investment community increasingly turns to foreign markets to achieve superior returns, reliable, high quality pricing and reference data becomes more important every day. SIX Financial Information, a leading provider of global financial information since 1930, will fill that need for clients of LUZ Engenharia Financeira.

Edivar Vilela Queiroz, CEO of LUZ Engenharia Financeira commented, “While the Brazilian financial services community has been well served by local data providers, SIX Financial Information has the breadth and depth of global market data to support current needs as well as the capacity to grow as our local market evolves.” He continued, “And as the world’s markets become ever more connected and transparent, this is an important differentiator that will allow seamless growth into offshore markets.”

“As Brazil becomes a major force in the global financial markets, foreign investors will undoubtedly continue their steadily increasing interest in the Brazilian markets and help foster even more growth,” said Barry Raskin, Managing Director for SIX Financial Information USA. “We are excited to extend our focus to this vibrant market, and very pleased that LUZ-EF has given us their stamp of approval through this strategic partnership.”

On Wednesday, March 13, 2013, SIX Financial Information and LUZ-EF will jointly host a client event in São Paulo where they will formally announce their partnership and describe the international equity and options pricing and reference data available through the LUZ-EF platform.

Source: SIX Financial Information, 12.03.2013

Filed under: Brazil, Data Management, Data Vendor, Reference Data

Thomson Reuters to Open Up RICs for Consolidated Feeds Under EC Settlement

The European Commission has ended its lengthy enquiry into Thomson Reuters’ licensing policies for Reuters Instrument Codes (RICs), accepting commitments from the company that will create a more fluid market for real-time consolidated data feeds. The deal creates a new environment for Thomson Reuters as it finds itself competing in an increasingly open market.

The company welcomed the Commission’s decision – perhaps on the basis of the end of the enquiry rather than the commitments it must stick to – and was quick to point out that its new licensing commitment is consistent with the move it made in June 2012 to make RICs available for use with non-real-time information in client and non-client financial institutions’ trade processing systems. At that time, Thomson Reuters’ then-Enterprise content chief Gerry Buggy described the move as the “first step in supporting the financial community’s symbology needs across all parts of the trading life cycle through our evolving symbology services.”

The Commission made its decision to end the enquiry after accepting commitments put forward by Thomson Reuters in May 2012 that were then market tested with a third-party comment period running until August 2012. The commitments have been made legally binding, with the key outcomes being that Thomson Reuters’ customers can continue to use RICs in real-time applications after they have switched to an alternative real-time consolidated data feed provider and use RICs in combination with the alternative provider’s data.

Commenting on the decision, Commission vice president for competition policy Joaquin Almunia says: “Information plays a key role in ensuring that financial markets operate in a healthy and efficient way. In order to correctly assess investment opportunities, market participants need to access accurate and timely financial data, for example through consolidated real-time data feeds. The commitments offered by Thomson Reuters will enhance competition in this market. Financial institutions that use RICs will now be able to switch to alternative providers more easily.”

Responding to the Commission’s decision to adopt Thomson Reuters’ commitments, David Craig, president of Financial & Risk at the company, says: “Following a detailed examination of the facts, the Commission accepted our proposal without any finding of infringement of EU competition law by Thomson Reuters. We now look forward to continuing to work with our customers to bring world-class, real-time data feed and symbology solutions to market.”

In essence, Thomson Reuters’ commitments allow customers to license additional RICs usage rights for the purpose of switching data vendors and to use RICs for retrieving data from other providers against a monthly licence fee. The company will also provide customers with the necessary information to map RICs to alternative symbology, and allow third parties to develop and maintain switching tools that allow RICs and rival services to interoperate by mapping RICs to the financial identifiers of other data feed providers. Third-party developers can use and keep RICs in their switching tools on payment of a monthly licence.
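At heart, the switching tools the commitments describe are cross-reference tables from RICs to a rival provider’s identifiers, consulted before data is requested from the alternative feed. A minimal sketch of such a mapping layer follows; the symbols on both sides are invented, and a real tool would also have to handle licensing, coverage gaps and symbol changes driven by corporate actions.

```python
# Toy RIC-switching layer: resolve a RIC to an alternative provider's
# symbol before requesting data from that provider. All symbols invented.
RIC_TO_ALT = {
    "VOD.L": "ALT:GB-VOD-LSE",
    "IBM.N": "ALT:US-IBM-NYSE",
}

class StubFeed:
    """Stand-in for a rival consolidated real-time feed client."""
    def get_quote(self, symbol: str) -> dict:
        return {"symbol": symbol, "last": 123.45}

def fetch_via_alternative(ric: str, alt_feed: StubFeed) -> dict:
    alt_symbol = RIC_TO_ALT.get(ric)
    if alt_symbol is None:
        raise KeyError(f"no cross-reference held for RIC {ric}")
    return alt_feed.get_quote(alt_symbol)

print(fetch_via_alternative("VOD.L", StubFeed()))
# {'symbol': 'ALT:GB-VOD-LSE', 'last': 123.45}
```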

If the Commission’s decision is favourable for users of consolidated real-time data feeds, it must also be of great interest to their suppliers, with 2013 promising to be both a battleground and a peace mission as suppliers struggle to maintain market share while responding to market demand for more open symbology solutions.

Source: Reference Data Review, 20.12.2012

Filed under: Data Management, Data Vendor, Reference Data

Capco Proposes the Creation of a Data Culture to Advance Data Management – RDR

Financial firms are falling short on data management issues such as calculating the true cost of data, identifying the operational cost savings of improved data management and embracing social media data, but according to research by consultancy Capco, these issues can be resolved with a cross-organisational and practical approach to data management and the development of a data culture.

The business and technology consultancy’s report – ‘Why and how should you stop being an organisation that manages data and become a data management organisation’ – is based on interviews with close to 100 senior executives at European financial institutions. It considers the many approaches to data management across the industry and within individual enterprises, as well as the need to rethink data management. It states: “There is one certainty: data and its effective management can no longer be ignored.”

The report suggests an effective data management culture will include agreed best practices that are known to a whole organisation and leadership provided by a chief data officer (CDO) with a voice at board level and control of data management strategy and operational implementation.

For details on the report click here

Turning the situation around and attaching practical solutions to the data management vision of an all-encompassing data culture, Capco lists regulatory compliance, risk management, revenue increase, innovation and cost reduction as operational areas where good data management can have a measurable and positive effect on profit and loss.

Setting out how an organisation can create an effective data culture, Capco notes the need to change from being an organisation that is obliged to do a certain amount of data management, to a mandated and empowered data management organisation in which data has ongoing recognition as a key primary source. The report concludes: “Every organisation has the potential, as well as the need, to become a true data management organisation. However, the journey needs to begin now.”

Source: Reference Data Review, 24.10.2012

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, Reference Data, Standards

LEI Development Embraces Change and Picks up Speed Ahead of G20 Meeting

The Financial Stability Board’s (FSB) third progress note on the legal entity identifier (LEI) initiative, released last week, has met with a positive response from those involved in shaping the system, potential infrastructure providers and market data vendors, despite changes to some proposals and the collapse of perceptions that had built up during debate on how the final system could shape up.

But while progress is positive, there are still fundamental concerns around corporate hierarchies, as without agreed reference data on legal entity parent and relationship information, the LEI will not fulfil the effective risk aggregation function at the heart of the global system development.

The decisions and to-do lists outlined in the FSB progress note are significant steps forward in developing a global LEI system and come ahead of another major milestone this week when G20 finance ministers and central bank governors meet in Mexico City and will be asked to endorse a draft charter for the system’s Regulatory Oversight Committee (ROC). The charter has been drawn up by the FSB Implementation Group (IG) and is expected to be approved by the G20 meeting, setting in motion the creation of the ROC and the global LEI foundation that will underpin the Central Operating Unit (COU) and secure a governance framework designed to sustain the public good of the system.

One of the late changes identified in the progress note is a shift away from perceptions that entity identifier codes would be 20-character random numbers. Instead, the note describes a part-structured, part-random character string resulting from an ‘urgent’ request made by the FSB IG in September for the FSB LEI Private Sector Preparatory Group (PSPG) to consider how identifiers could best be issued for the purposes of a federated, global LEI system. The PSPG’s views were considered at meetings of the PSPG and IG in Basel earlier this month and a technical specification has been endorsed by the FSB plenary.

The FSB states in the progress note: “The FSB decision is provided now to deliver clarity and certainty to the private sector on the approach to be taken by potential pre-LEI systems that will facilitate the integration of such local precursor solutions into the global LEI system.”

On the basis of the arguments presented and discussed by the PSPG, the FSB has selected a structured number as the best approach for the global LEI system, although it acknowledges that the 20-character code, which complies with the existing ISO 17442 standard, will have no permanent embedded meaning. Instead, the scheme aims to avoid any overlap of random numbers in a federated issuing system by prefixing the numbers with a code for each local operating unit (LOU) that assigns LEIs.

The breakdown then looks like this:

· Characters 1-4: a four character prefix allocated uniquely to each LOU

· Characters 5-6: two reserved characters set to zero

· Characters 7-18: entity-specific part of the code generated and assigned by LOUs

· Characters 19-20: two check digits as described in ISO 17442.
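ISO 17442 computes those two check digits with the ISO 7064 MOD 97-10 scheme, the same family used for IBANs: letters map to 10–35, and a valid 20-character LEI, read as one large number, leaves a remainder of 1 when divided by 97. A minimal sketch in Python, using a made-up LOU prefix and entity part:

```python
def _as_number(chars: str) -> int:
    # ISO 7064 MOD 97-10: digits stay as-is, letters map to 10..35 (A=10).
    return int("".join(str(int(ch, 36)) for ch in chars.upper()))

def lei_is_valid(lei: str) -> bool:
    """Check length, alphanumeric content and the MOD 97-10 remainder."""
    return len(lei) == 20 and lei.isalnum() and _as_number(lei) % 97 == 1

def append_check_digits(base18: str) -> str:
    """Derive the two check digits for an 18-character LOU+entity base."""
    return base18 + f"{98 - _as_number(base18 + '00') % 97:02d}"

# Hypothetical LOU prefix '9695', reserved '00', invented entity part.
lei = append_check_digits("9695" + "00" + "ABCDEF123456")
print(lei, lei_is_valid(lei))  # the generated code validates as True
```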

If this information has been a long time coming, the time to organise behind it is short: pre-LEI solutions wanting to transition into the global LEI system must adopt the numbering scheme no later than November 30, just a month away. The LEI will be portable within the global LEI system, implying that the LEI code can be transferred from one LOU to another and that each LOU must have the capacity to take responsibility for LEIs issued by other LOUs.

Following recommendations on data quality achieved through self-registration of legal entities in the FSB’s June 2012 report, the FSB goes on to decree that pre-LEI services should be based on self-registration, although this can include third-party registration made with the permission of the entity to be registered, and that from November 9 all pre-LEI systems must allow self-registration only.

No specific recommendations are made on how the Commodity Futures Trading Commission’s (CFTC) Interim Compliant Identifiers, or CICIs, which are entirely random numbers, will integrate with the LEI system, although the 27,000 or so already issued are expected to be grandfathered and accepted into the system without being restated.

Commenting on the LEI number structure, Peter Warms, global head of ID and symbology development at Bloomberg, says: “But for the prefix that identifies where the number was assigned from, the number is still random. This is good for data management practices as the number has no other data dependencies. I would question, however, whether the prefix of an identifier would be changed if it is moved to another LOU as this is not clear.”

Tim Lind, head of legal entity and corporate actions at Thomson Reuters, says: “We must put the debate on intelligent versus dumb numbers behind us and leave it as a milestone. Either solution could work and ongoing argument is not productive. The LEI principles are in place and we need to get on and get the work done.”

Both Warms and Lind applaud the advances made by the FSB and its working groups, but the need for speed remains if deadlines are to be met. And as the complex tasks of developing a legal foundation, ROC and governance framework for the LEI continue, Lind proposes a balance of perfection and pragmatism as the only way forward.

Another outcome of the Basel meetings that deflates earlier perceptions is a clear indication that the COU will not be located in one central place, but will instead be distributed across several locations. This is likely to emanate from the FSB’s hard-fought and firmly held desire to ensure the LEI system is a collective development for the public good, including a governance and operational framework that will encourage all jurisdictions to join in.

On the same basis, it has also become apparent that any suggestion that an LEI system could initially be based on a replica of the DTCC and Swift utility set up for the CFTC’s CICIs has been quashed. Instead, LOUs are expected to make their own technology choices to support the LEI – indeed they may already have systems in place – although they will, necessarily, have to conform with standards set by the COU.

If these are some of the recent gains in the LEI development, there is still much to be done ahead of having an ROC, COU and some LOUs in place by March 2013. Again sustaining a level playing field for the public good on a global basis, the FSB has asked the PSPG to build on initial work and consider the next phase of operational work that will focus on how the system can best address key issues in areas such as data quality, supporting local languages and characters, and drawing effectively on local infrastructure to deliver a truly global federated LEI system. The PSPG’s deadline to make proposals on these issues is the end of the year, generating the need for extremely swift action if the LEI system is to be up and running to any extent in March.

The final issue raised in the FSB’s progress note, and one which has yet to be openly debated and resolved, is ownership and hierarchy data associated with LEIs. The note states: “Addition of information on ownership and corporate hierarchies is essential to support effective risk aggregation, which is a key objective for the global LEI system. The IG is developing proposals for additional reference data on the direct and ultimate parent(s) of legal entities and on relationship (including ownership) data more generally and will prepare initial recommendations by the end of 2012. The IG is working closely with the PSPG to develop the proposals.”

This might be the FSB’s final note, but the issue has to be a top priority. As one observer puts it: “The next big thing is hierarchies. They need to be nailed down and there needs to be transparency. Work is being done on this, but without a good solution there will be no meaning in the LEI.”

Source: Reference Data Review, 29.10.2012

Filed under: Data Management, Reference Data, Standards

Reference Data: Current Solutions Lacking, Despite Foundational Nature of Reference Data – RDR

Reference data management (RDM) is a foundational element of financial enterprises, yet the collection of solutions used to manage reference data in most firms is not satisfactory, according to a report published this week.

The report – Reference Data Management: Unlocking Operational Efficiencies, published by Tabb Group in conjunction with data integration specialist Informatica – describes current sentiment around RDM. It looks at development through four generations of solutions, details the obstacles to RDM success and sets out how firms at different levels of RDM adoption can move forward towards the holy grail of centralised RDM coupled with consistent reference data processing.

Despite huge investments in RDM over the past decade, research carried out among 20 firms – 25% in Europe, 75% in the US, 50% on the buy side and 50% on the sell side – in April 2012 found 86% of respondents dissatisfied with their RDM capabilities. Of these, 48% are being driven to improvement for reasons related to resource optimisation and outcomes, while 35% are responding to specific catalysts such as compliance.

For details on the report click here.

Recommending how to navigate the road ahead, the study suggests firms committed to bolstering existing suites of RDM solutions should focus on wrapping current solutions with technology that enables a consistent enterprise data governance process, while those yet to make a significant commitment to an RDM solution should seek solutions that manage multiple reference data domains in a consistent and integrated enterprise framework.

The report concludes: “There can be no glory without doing the hard work first. Data fluency, a critical precursor to data consumability, simply means that data flows more easily, which in turn means that end users must be able to find it. And, finding data requires meticulous attention to standards, labels and other metadata, however imperfect they may be now or in the future. That way, no matter how big or complex the data gets, end users will have a much better shot at harvesting value from it.”

Source: Reference Data Review, 19.10.2012

Filed under: Data Management, Reference Data

News and updates on LEI standard progress and development

As a follow-up to the G20’s endorsement in Los Cabos in June 2012 and the Financial Stability Board’s guidelines and recommendations on the legal entity identifier (LEI), we will regularly update this post with news and articles to provide an overview of LEI standard progress and development.

 
First published 13.07.2012, last updated 27.09.2012.

Filed under: Data Management, Data Vendor, Reference Data, Standards

Symbology: EDI’s Corporate Actions Service Adopts Bloomberg Open Symbology

Free-use Data Tagging System Reduces Costs and Risks in Trading

Exchange Data International (EDI), a premier back-office financial data provider, today announced that it has adopted Bloomberg’s Global Securities Identifiers (BBGIDs) to name and track all equity securities in its Worldwide Corporate Actions service.

EDI is the latest financial data provider to adopt Bloomberg’s Open Symbology (BSYM), an open and free-use system for naming global securities across all asset classes with a BBGID, a 12-character alphanumeric identifier for financial instruments. EDI has implemented BBGID numbers in its equities reference, pricing and corporate actions data feeds. Its Worldwide Corporate Actions service provides detailed information on 50 corporate action event types affecting equities listed on 160 exchanges.
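For context, the BBGID format is fixed-width: a ‘BBG’ prefix, eight characters drawn from digits and upper-case consonants (vowels are excluded to avoid accidental words), and a trailing check digit. The sketch below is a shape-only check under those assumptions; it does not recompute the check digit’s value.

```python
import re

# Shape-only validation of a Bloomberg Global Securities Identifier:
# 'BBG' prefix + 8 digits/consonants + 1 trailing check digit.
BBGID_RE = re.compile(r"BBG[0-9BCDFGHJKLMNPQRSTVWXYZ]{8}[0-9]")

def looks_like_bbgid(identifier: str) -> bool:
    return bool(BBGID_RE.fullmatch(identifier))

print(looks_like_bbgid("BBG000BLNNH6"))  # a published Bloomberg example: True
print(looks_like_bbgid("BBG0EXAMPLE1"))  # contains vowels, so: False
```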

“EDI decided to integrate Bloomberg’s Open Symbology, as it is easily accessible and has no license fee or restrictions on usage,” said Jonathan Bloch, the Chief Executive Officer of EDI. “Bloomberg’s Symbology also advances straight-through processing of equity orders, which aids reporting and compliance management.”

Peter Warms, Global Head of Bloomberg Open Symbology, said, “Existing identifiers that change due to underlying corporate actions introduce inefficiencies, increase costs and add complexity to the data management process. Bloomberg and EDI recognise the importance of comprehensive, open and unchanging identifiers, like the BBGID, in enabling customers to track unique securities consistently and to process corporate action data seamlessly. As BSYM grows in adoption, interoperability across market systems and software using BSYM will improve steadily and reduce operational costs.”

Source: Bobsguide, 24.09.2012

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, Reference Data, Standards
