FiNETIK – Asia and Latin America – Market News Network

Asia and Latin America News Network focusing on Financial Markets, Energy, Environment, Commodity and Risk, Trading and Data Management

Reference Data: Tech Mahindra Details Global Data Utility Based on Acquired UBS Platform

Tech Mahindra, a business process outsourcing specialist and parent of London-based investment management technology consultancy Citisoft, has repositioned a reference data platform acquired from UBS Global Asset Management to offer an offshore reference data utility aimed at meeting market demand for lower cost, high quality data that can reduce risk and increase efficiency.

The global data utility has been introduced under the Tech Mahindra Managed Data Services brand and offers securities reference data across all asset types, reference data for corporate actions, tax information and end-of-day and intra-day validated pricing data. The utility handles data cleansing and validation, with clients buying licences to access the data.

Tech Mahindra suggests the utility differs from other offerings in the enterprise data management market as it is owned by the company and can be developed further. It is also agnostic about data feeds, taking some 20 feeds from vendors including SIX, Markit, Bloomberg, Thomson Reuters and DTCC.

The company’s first customer is UBS Fund Services in Luxembourg. Under the terms of a five-year services contract with UBS, Tech Mahindra will create and store golden copy data and provide multiple intra-day golden copies to the asset manager. As part of the acquisition and customer deal, Tech Mahindra, which is headquartered in Hyderabad, India, will take on some staff from UBS Global Asset Management who were working on the platform in Luxembourg, but most staff will be located in India.

As a repositioned platform, Tech Mahindra MDS already covers all time zones, markets and asset types, updates 2.5 million issuers on a daily basis, receives 200,000 customer price requests and validates 75,000 prices. Some 20,000 corporate actions are checked every day, along with 1,800 tax figures. Looking forward, Tech Mahindra plans to extend these metrics and add reference data around indices and benchmarks, legal entity identifiers and clients.

While Tech Mahindra will lead sales of the service to the banking, financial services and insurance sectors, Citisoft will be able to provide consultancy as necessary. Steve Young, CEO of Citisoft, says Tech Mahindra MDS has been designed to improve data quality and drive down the total cost of data ownership, in turn reducing risk and increasing efficiency. To manage clients’ cost issues, the company has built a toolkit into the data management system that allows users to analyse the cost of owning data, including people, processes and technology. Data quality will be underpinned by service level agreements, and key performance indicators will be added as more clients sign up for services and data volumes grow.

Reflecting on the data challenges faced by financial firms, Citisoft Group CEO Jonathan Clark concludes: “Outsourcing models have evolved over time and attitudes are changing as firms acknowledge that there is a big difference between outsourcing and offshoring, and that captive outsourcing is not an efficient approach. The need is for a commercial relationship with a centralised data utility that can deliver high-quality, accurate data and a lower total cost of ownership.”

Source: Reference Data Review, 24.07.2013

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, Reference Data, Standards

Data Management – a Finance, Risk and Regulatory Perspective – White Paper

Download this white paper – Data Management – a Finance, Risk and Regulatory Perspective – about the challenges facing financial institutions operating across borders in ensuring data consistency and usability across multiple user types.

The finance, risk and compliance operations of any financial institution need nimble access to business information, for performance measurement, risk management, and client and regulatory reporting. But although the underlying data may be the same, their individual requirements are different, reflecting the group-level view required by senior management and regulators, and the more operational view at the individual business level.

Where in the past, risk managers have been left to figure out what’s most appropriate for their particular institutions, regulators today are adopting a more aggressive stance, challenging the assumptions underpinning banks’ approaches to risk management. As a result, today’s challenge is not only to understand the current regulatory, risk or finance requirements, but also to set in place the analytical framework that will help anticipate future requirements as they come on stream. To find out more, download the white paper now.

Source: A-Team, 02.07.2013

Filed under: Corporate Action, Data Management, Library, Reference Data, Risk Management, Standards

LEI – Dealing with Reality – How to Ensure Data Quality in the Changing Entity Identifier Landscape

“The Global LEI will be a marathon, not a sprint” is a phrase heard more than once during our series of Hot Topic webinars that’s charted the emergence of a standard identifier for entity data. Doubtless, it will be heard again.

But if we’re not exactly sprinting, we are moving pretty swiftly. Every time I think there’s nothing more to say on the topic, there is – well – more to say. With the artifice of the March ‘launch date’ behind us, it’s time to deal with reality. And the reality practitioners are having to deal with is one that’s changing rapidly.

Download the full and detailed report:

LEI-Dealing_with_reality-how_to_ensure_data_quality with Entity Identifiers_06_13.pdf

Source: A-Team, 26.06.2013

Filed under: Data Management, Data Vendor, Library, Reference Data, Standards

Valuations – Toward On-Demand Evaluated Pricing

Risk and regulatory imperatives are demanding access to the latest portfolio information, placing new pressures on the pricing and valuation function. And the front office increasingly wants up-to-date valuations of hard-to-price securities.

These developments are driving a push toward on-demand evaluated pricing capabilities, with pricing teams seeking to provide access to valuations at a higher frequency of update than before.

Download this special report for FREE now! Click the link below.

Source: A-Team, 26.06.2013

Filed under: Data Vendor, Library, Market Data, Reference Data

Thomson Reuters Outlines Plans to Lighten the Burden of Symbology

Thomson Reuters has set out its stall on symbology, saying it does not support the promotion of new identifiers as a means of improving data management, but is keen to support industry standards and has plans to offer services such as symbology cross-referencing to ease the burden on data managers.

The company documents the development of symbology, its use and complexity in a white paper authored by Jason du Preez, head of symbology services at Thomson Reuters, and entitled ‘Solving for Symbology Discord, the Identity Challenge’.

Thomson Reuters set up a symbology business last year and published the white paper to acknowledge the importance of symbology and recognise its challenges. Du Preez says: “We don’t believe there is a silver bullet that will answer the problems of symbology. Innovative new products continue to exacerbate the problem and that is not going to change. We can, using our core competencies, create linkages, invest to take on the burden of linking data sets, and maintain code mapping. And we can allow the market to make more use of our intellectual property.”

Du Preez cites licences introduced last summer to extend the use of the company’s proprietary Reuters Instrument Codes (RICs) in non real-time content, as well as its agreement in response to a European Commission antitrust investigation to extend the use of RICs in real-time consolidated data feeds, as moves to open up how RICs are licensed and make them more accessible across all asset classes.

Integration of RICs with Proprietary Identifiers

He says: “As there is no silver bullet, we will invest more in cross-referencing services and tie in quality of information. We will have interesting things to offer over the next 18 months.” Among these he lists the integration of RICs and proprietary identifiers, with firms submitting their codes to Thomson Reuters and the company playing them back as part of its own codes. Other broad cross-referencing services will be tailored to allow clients to access only required cross references and linkages.

“Thomson Reuters doesn’t promote a new code, there are enough out there already. We will continue to use existing codes and extract value from them; the key is linkages between market vendor codes and proprietary structures. While clients face regulatory and cost drivers, we will take care of linkages and cross referencing to improve the breadth and quality of client content.”
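As a rough illustration of what such cross-referencing amounts to in data terms, the sketch below shows a minimal lookup table keyed by RIC that links each instrument to other identifier schemes, returning only the cross references a client asks for. The field names and identifier values are invented for illustration and do not represent Thomson Reuters’ data model or API.

```python
# Hypothetical sketch of a symbology cross-reference table keyed by RIC.
# Identifier values below are placeholders, not real codes.

SYMBOLOGY_XREF = {
    "VOD.L": {"isin": "GB0000000001", "sedol": "0000001", "client_code": "FIRM-000123"},
    "IBM.N": {"isin": "US0000000002", "cusip": "000000002", "client_code": "FIRM-000456"},
}

def cross_reference(ric, wanted):
    """Return only the requested identifier schemes for a given RIC."""
    record = SYMBOLOGY_XREF.get(ric, {})
    return {scheme: code for scheme, code in record.items() if scheme in wanted}

# A client that only needs ISINs and its own proprietary codes:
print(cross_reference("VOD.L", ["isin", "client_code"]))
# {'isin': 'GB0000000001', 'client_code': 'FIRM-000123'}
```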

Thomson Reuters’ white paper details the development of symbology and notes the company’s intent, as described by du Preez. It starts by mentioning irregular incidents in the market that remind the industry of the challenges involved when an aggregated or consolidated view across positions is needed, including the incompatibility of core data symbols. The paper states: “The core elements: security identification, counterparty identification and price discovery, were never developed to work efficiently and effectively on an enterprise/global scale.”

Looking at the current state of symbology, the paper flags the fragmented identification methods resulting from the market’s approach to symbology, including data providers’ and data aggregators’ different means of identifying the various parts of securities or counterparties, as well as firms’ creation of proprietary identifiers to fill gaps in vendor provision. The paper reports: “[Symbology] is still a ‘cottage industry’ where the identification schemes put in place by one group are locally focused and usually limited to a specific slice of the securities market. This consumes resources: in many cases the task of mapping multiple sets of disjointed or partially overlapping symbols can consume as much (or more) development time and computing resource as programming the business logic itself.”

The paper reviews changes in the financial industry since 1993 that have complicated symbology and notes the increasing difficulty, yet increasing need, to integrate information across a firm’s complete range of trading businesses to achieve effective risk management. On the flip side, it points to the parallel need to analyse rapidly growing stores of information and connect increasingly diverse datasets to find relevant information in the quest for alpha. It states: “The sophistication of the methods we employ to aggregate, rationalise and navigate information bears a direct relationship to the size of the lead a firm can have in the financial marketplace.”

How to Unambiguously Identify Information

While the outcome of linking and navigating information can be positive, it presents significant challenges as a lack of consistent and comprehensive global industry standards means firms must maintain symbology cross references, a difficult and often flawed task, particularly in banks with many different trade and compliance-related systems. Du Preez writes: “A popular approach is ‘we can build an adaptor’. Adaptors have become some of the most complex processes in banking technology. That is not data management. It is trying not to get eaten by the alligators.” He goes on to surmise: “Data managers do not want to deal with these problems – they ultimately want services they can reliably use to unambiguously identify information.”

Enter Thomson Reuters with its vision of how to resolve these problems. “We believe that these linkages are the key to enormous untapped value. Being able to enter the data model through any entity identifier (quote, security or legal entity) and easily navigate and explore all the linkages between related entities not only puts a firm in control of its risk position, but also creates a window into opportunities. Industry standards have a significant part to play as they provide a universal start and end point; Thomson Reuters is a strong supporter of symbology standards in the data industry and we will be first in line to adopt and link industry standard identifiers to our content sets.”

The report discusses the challenges propagated by the use of multiple symbologies and the workload associated with the maintenance of cross reference tables in local security master databases. It touches on Thomson Reuters’ plans to provide cross reference services centrally and leverage its core competencies and infrastructure to ease the burden on institutions that have traditionally solved the problems themselves.

It states: “Cross referencing is a reality that cannot be avoided – we aim to make this as accurate and cost-effective as possible for our customers. We also understand that while symbology is an important part of the picture, translation and synchronisation services will also play a critical part. The need for these services is evidenced by the burgeoning desire of the market to offload these onerous data management functions to specialist providers.” The report concludes: “Thomson Reuters is investing now to continue to expose the growing capabilities of its data management infrastructure and ensure that structured and unstructured data come together in a rich tapestry of knowledge with the aim of maximizing utility to trading algorithms, research, analysis and information discovery.”

Source: A-Team Reference Data Review, 26.03.2013

Filed under: Data Management, Data Vendor, Reference Data, Standards

Outsourcing Reference Data Management: Cost Reduction and New Revenue Opportunities

The past 12 months have seen the emergence of new players offering Business Process Outsourcing (BPO) services for Reference Data Management. These new arrivals expand the range of options available to financial institutions for addressing the challenges of regulatory compliance, operational cost reduction and scalability.

But BPO has other benefits, and innovative adopters have benefited from using the model to create new value-added services. By catering to clients’ data management needs, these players have been able to transform what’s traditionally been considered a cost centre into a new and significant source of revenue.

This paper – from AIM Software – explores this exciting new trend, and describes how an established financial institution took advantage of BPO to turn its enterprise data management initiative into a new source of revenue and business growth.

Download the White Paper Now to Find Out More

Source: A-Team, March 2013

Filed under: Corporate Action, Data Management, Data Vendor, Library, Market Data, Reference Data, Standards

Capco Proposes the Creation of a Data Culture to Advance Data Management

Financial firms are falling short on data management issues such as calculating the true cost of data, identifying the operational cost savings of improved data management and embracing social media data, but according to research by consultancy Capco, these issues can be resolved with a cross-organisational and practical approach to data management and the development of a data culture.

The business and technology consultancy’s report – ‘Why and how should you stop being an organisation that manages data and become a data management organisation’ – is based on interviews with close to 100 senior executives at European financial institutions. It considers the many approaches to data management across the industry and within individual enterprises, as well as the need to rethink data management. It states: “There is one certainty: data and its effective management can no longer be ignored.”

The report suggests an effective data management culture will include agreed best practices that are known to a whole organisation and leadership provided by a chief data officer (CDO) with a voice at board level and control of data management strategy and operational implementation.

For details on the report click here

Turning the situation around and attaching practical solutions to the data management vision of an all-encompassing data culture, Capco lists regulatory compliance, risk management, revenue increase, innovation and cost reduction as operational areas where good data management can have a measurable and positive effect on profit and loss.

Setting out how an organisation can create an effective data culture, Capco notes the need to change from being an organisation that is obliged to do a certain amount of data management, to a mandated and empowered data management organisation in which data has ongoing recognition as a key primary source. The report concludes: “Every organisation has the potential, as well as the need, to become a true data management organisation. However, the journey needs to begin now.”

Source: Reference Data Review, 24.10.2012

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, Reference Data, Standards

LEI Development Embraces Change and Picks up Speed Ahead of G20 Meeting

The Financial Stability Board’s (FSB) third progress note on the legal entity identifier (LEI) initiative, released last week, has met with a positive response from those involved in shaping the system, potential infrastructure providers and market data vendors, despite changes to some proposals and the collapse of perceptions that have built up during debate on how the final system could shape up.

But while progress is positive, there are still fundamental concerns around corporate hierarchies, as without agreed reference data on legal entity parent and relationship information, the LEI will not fulfil the effective risk aggregation function at the heart of the global system development.

The decisions and to-do lists outlined in the FSB progress note are significant steps forward in developing a global LEI system and come ahead of another major milestone this week when G20 finance ministers and central bank governors meet in Mexico City and will be asked to endorse a draft charter for the system’s Regulatory Oversight Committee (ROC). The charter has been drawn up by the FSB Implementation Group (IG) and is expected to be approved by the G20 meeting, setting in motion the creation of the ROC and the global LEI foundation that will underpin the Central Operating Unit (COU) and secure a governance framework designed to sustain the public good of the system.

One of the late changes identified in the progress note is a shift away from perceptions that entity identifier codes would be 20-character random numbers. Instead, the note describes a part-structured, part-random character string resulting from an ‘urgent’ request made by the FSB IG in September for the FSB LEI Private Sector Preparatory Group (PSPG) to consider how identifiers could best be issued for the purposes of a federated, global LEI system. The PSPG’s views were considered at meetings of the PSPG and IG in Basel earlier this month and a technical specification has been endorsed by the FSB plenary.

The FSB states in the progress note: “The FSB decision is provided now to deliver clarity and certainty to the private sector on the approach to be taken by potential pre-LEI systems that will facilitate the integration of such local precursor solutions in to the global LEI system.”

On the basis of the arguments presented and discussed by the PSPG, the FSB has selected a structured number as the best approach for the global LEI system, although it acknowledges that the 20-character code, which complies with the existing ISO 17442 standard, will have no permanent embedded meaning. Instead, the structure is intended to avoid any overlap of random numbers in a federated issuing system by placing a code for each local operating unit (LOU) assigning LEIs in front of the numbers.

The breakdown then looks like this:

· Characters 1-4: a four character prefix allocated uniquely to each LOU

· Characters 5-6: two reserved characters set to zero

· Characters 7-18: entity-specific part of the code generated and assigned by LOUs

· Characters 19-20: two check digits as described in ISO 17442.
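To make the layout concrete, the short Python sketch below splits a 20-character code into the four segments listed above and derives the final two check digits. It assumes the check digits follow the ISO 7064 MOD 97-10 scheme (the same arithmetic used for IBANs), which is how the ISO 17442 check digits are generally described; the sample value is entirely made up, so treat this as an illustrative sketch rather than a definitive implementation of the standard.

```python
# Illustrative sketch only: parse the documented LEI layout and verify check
# digits. Assumes ISO 17442 check digits follow the ISO 7064 MOD 97-10 scheme
# (letters A-Z map to 10-35); the sample value below is hypothetical.

def split_lei(lei):
    """Break a 20-character LEI into the segments described above."""
    if len(lei) != 20 or not lei.isalnum():
        raise ValueError("an LEI is a 20-character alphanumeric string")
    return {
        "lou_prefix": lei[0:4],      # characters 1-4: prefix allocated to the LOU
        "reserved": lei[4:6],        # characters 5-6: reserved, set to zero
        "entity_part": lei[6:18],    # characters 7-18: entity-specific part
        "check_digits": lei[18:20],  # characters 19-20: check digits
    }

def _to_digits(s):
    """Map each character to its base-36 value and concatenate the results."""
    return "".join(str(int(c, 36)) for c in s.upper())

def compute_check_digits(base18):
    """Compute the two check digits for the first 18 characters (MOD 97-10)."""
    return "%02d" % (98 - int(_to_digits(base18 + "00")) % 97)

def lei_is_valid(lei):
    """A structurally valid LEI satisfies numeric(lei) mod 97 == 1."""
    return len(lei) == 20 and int(_to_digits(lei)) % 97 == 1

if __name__ == "__main__":
    base = "XXXX" + "00" + "ABCDEFGHIJKL"   # hypothetical LOU prefix + entity part
    lei = base + compute_check_digits(base)
    print(split_lei(lei))
    print(lei_is_valid(lei))                 # True by construction
```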

If this information has been a long time coming, the time to organise behind it is short: pre-LEI solutions wanting to transition into the global LEI system are required to adopt the numbering scheme no later than November 30, just a month away. The LEI will be portable within the global LEI system, implying that the LEI code can be transferred from one LOU to another and that each LOU must have the capacity to take responsibility for LEIs issued by other LOUs.

Following recommendations on data quality achieved through self-registration of legal entities in the FSB’s June 2012 report, the FSB goes on to decree that pre-LEI services should be based on self-registration, although this can include third-party registration made with the permission of the entity to be registered, and that from November 9 all pre-LEI systems must allow self-registration only.

No specific recommendations are made on how the Commodity Futures Trading Commission’s (CFTC) Interim Compliant Identifiers, or CICIs, which are entirely random numbers, will integrate with the LEI system, although the 27,000 or so already issued are expected to be grandfathered and accepted into the system without being restated.

Commenting on the LEI number structure, Peter Warms, global head of ID and symbology development at Bloomberg, says: “But for the prefix that identifies where the number was assigned from, the number is still random. This is good for data management practices as the number has no other data dependencies. I would question, however, whether the prefix of an identifier would be changed if it is moved to another LOU as this is not clear.”

Tim Lind, head of legal entity and corporate actions at Thomson Reuters, says: “We must put the debate on intelligent versus dumb numbers behind us and leave it as a milestone. Either solution could work and ongoing argument is not productive. The LEI principles are in place and we need to get on and get the work done.”

Both Warms and Lind applaud the advances made by the FSB and its working groups, but the need for speed remains if deadlines are to be met. And as the complex tasks of developing a legal foundation, ROC and governance framework for the LEI continue, Lind proposes a balance of perfection and pragmatism as the only way forward.

Another outcome of the Basel meetings that deflates earlier perceptions is a clear indication that the COU will not be located in one central place, but will instead be distributed across several locations. This is likely to stem from the FSB’s hard-fought and firmly held desire to ensure the LEI system is a collective development for the public good, including a governance and operational framework that will encourage all jurisdictions to join in.

On the same basis, it has also become apparent that any suggestion that an LEI system could initially be based on a replica of the DTCC and Swift utility set up for the CFTC’s CICIs has been quashed. Instead, LOUs are expected to make their own technology choices to support the LEI – indeed they may already have systems in place – although they will, necessarily, have to conform to standards set by the COU.

If these are some of the recent gains in the LEI development, there is still much to be done ahead of having an ROC, COU and some LOUs in place by March 2013. Again sustaining a level playing field for the public good on a global basis, the FSB has asked the PSPG to build on initial work and consider the next phase of operational work that will focus on how the system can best address key issues in areas such as data quality, supporting local languages and characters, and drawing effectively on local infrastructure to deliver a truly global federated LEI system. The PSPG’s deadline to make proposals on these issues is the end of the year, generating the need for extremely swift action if the LEI system is to be up and running to any extent in March.

The final issue raised in the FSB’s progress note and one which has yet to be openly debated and resolved is ownership and hierarchy data associated with LEIs. The note states: “Addition of information on ownership and corporate hierarchies is essential to support effective risk aggregation, which is a key objective for the global LEI system. The IG is developing proposals for additional reference data on the direct and ultimate parent(s) of legal entities and on relationship (including ownership) data more generally and will prepare initial recommendations by the end of 2012. The IG is working closely with the PSPG to develop the proposals.”

This might be the final issue in the FSB’s note, but it has to be a top priority. As one observer puts it: “The next big thing is hierarchies. They need to be nailed down and there needs to be transparency. Work is being done on this, but without a good solution there will be no meaning in the LEI.”

Source: Reference Data Review, 29.10.2012

Filed under: Data Management, Reference Data, Standards

Reference Data: Current Solutions Lacking, Despite Foundational Nature of Reference Data

Reference data management (RDM) is a foundational element of financial enterprises, yet the collection of solutions used to manage reference data in most firms is not satisfactory, according to a report published this week.

The report – Reference Data Management: Unlocking Operational Efficiencies, published by Tabb Group in conjunction with data integration specialist Informatica – describes current sentiment around RDM. It looks at development through four generations of solutions, details the obstacles to RDM success and sets out how firms at different levels of RDM adoption can move forward towards the holy grail of centralised RDM coupled with consistent reference data processing.

Despite huge investments in RDM over the past decade, research carried out among 20 firms – 25% in Europe, 75% in the US, 50% on the buy side and 50% on the sell side – in April 2012 found 86% of respondents dissatisfied with their RDM capabilities. Of these, 48% are being driven to improvement for reasons related to resource optimisation and outcomes, while 35% are responding to specific catalysts such as compliance.

For details on the report click here.

Recommending how to navigate the road ahead, the study suggests firms committed to bolstering existing suites of RDM solutions should focus on wrapping current solutions with technology that enables a consistent enterprise data governance process, while those yet to make a significant commitment to an RDM solution should seek solutions that manage multiple reference data domains in a consistent and integrated enterprise framework.

The report concludes: “There can be no glory without doing the hard work first. Data fluency, a critical precursor to data consumability, simply means that data flows more easily, which in turn means that end users must be able to find it. And, finding data requires meticulous attention to standards, labels and other metadata, however imperfect they may be now or in the future. That way, no matter how big or complex the data gets, end users will have a much better shot at harvesting value from it.”

Source: Reference Data Review, 19.10.2012

Filed under: Data Management, Reference Data

News and updates on LEI standard progress and development

As a follow-up to the G20’s acceptance at Los Cabos in June 2012 and the Financial Stability Board’s guidelines and recommendations on the Legal Entity Identifier (LEI), we will regularly update this post with news and articles to provide an overview of LEI standard progress and development.

 
First Published 13.07.2012, Last Update 27.09.2012

Filed under: Data Management, Data Vendor, Reference Data, Standards

Brazil: Perseus Telecom acquires ETradeLab, Brazilian Trading Tech Co

Perseus Telecom, a global connectivity provider, today announces the acquisition of ETradeLab, a Sao Paulo-based financial technology provider of hosting, managed connectivity, order routing and trade monitoring support. The purchase comes at a time of global demand for efficient trading systems with low-latency connectivity and local support models suited to banks, hedge funds and proprietary trading firms.

The combined company expects to add significant value to its services by tightly integrating them. Combining ETradeLab’s hosting solutions with Perseus’ ultra-low latency networks will provide cost-effective, efficient and valuable network solutions for its customers. The need to anticipate and respond to demand for innovation, while remaining sensitive to pricing, further led to this agreement.

“Our purchase of ETradeLab shines light on the accelerating market growth in Brazil, Peru, Chile, Panama and Colombia where capital markets require ultra-fast, reliable connections to mitigate risk and to provide worldwide reach,” states Dr. Jock Percy, Chief Executive of Perseus Telecom. “Incorporating ETradeLab into our brand was an easy decision given its expertise in the market and trade monitoring services.”

Effective immediately, Marcos Guimarães, founder of ETradeLab, is President of Perseus Telecom, Brazil. “It is a thrill to be part of Perseus Telecom’s top-tier management team. Perseus brings innovation and high performance networks at fair prices due to its strong portfolio that allows customers to increase revenues while reducing operating costs,” states Guimarães. “The LATAM region’s continuing market growth requires such building blocks for optimum time-to-market and even faster development; Perseus has the DNA to deliver them allied to ETradeLab’s local market knowledge. I’m ready for the challenge and can’t wait to start working with our customers.”

The expansion of Perseus’ Brazilian presence follows its recent Global Telecoms Business award for “Best Innovation,” related directly to building the fastest connectivity from London to BM&F BOVESPA, Brazil. Prior to the award, Perseus announced the fastest route to BM&F BOVESPA with its strategic partner GlobeNet.

“Our recent announcements and awards with regards to Brazil and South America have indicated the firm is taking a permanent and local stake in the region and we have done this with the valuable acquisition of ETradeLab,” says Percy. “We welcome Marcos Guimarães as President of Perseus do Brazil and will continue our path of commitment to providing the lowest latency networks globally whilst delivering intelligent and cost efficiencies.”

Source: Low-Latency.com, 12.09.2012

Filed under: BM&FBOVESPA, Brazil, Latin America, News, Trading Technology

Whitepaper: Bloomberg to embrace emerging LEI

The industry initiative to develop and promote a standard global legal entity identifier (LEI) is expected to significantly reduce the opacity associated with complex financial instruments, widely acknowledged to be a major contributing factor in the 2008 credit crisis.

In this white paper, Bloomberg explains the implications of the emerging LEI for financial institutions, and outlines how it is embracing the new standard to help clients better understand the entities whose instruments they trade and hold (for example, by mapping the LEI to Bloomberg’s numeric BUID).

Download the White Paper Now

Source: A-TEAM 28.06.2012

Filed under: Data Management, Reference Data, Standards

Thomson Reuters Opens RICs to All with Non-Real-Time License

Thomson Reuters is taking a step toward answering client calls for more open access to its Reuters Instrument Code (RIC) symbology. The company is making RICs available for use with non-real-time information in client and non-client financial institutions’ trade processing systems.

Enterprise content chief Gerry Buggy, who has spearheaded Thomson Reuters’ response to the EC anti-competition complaint, says the new facility is the “first step in supporting the financial community’s symbology needs across all parts of the trading life cycle through our evolving symbology services.”

The move comes in the wake of the EC investigation and subsequent complaint into the use of RICs in real-time consolidated data feeds. In response to that complaint, many financial services practitioners have called for more open access to the RIC, which is entrenched in many firms’ front-, middle- and back-office trading and trade processing systems.

According to Jason du Preez, Global Business Manager, Enterprise Platform, at Thomson Reuters, the latest initiative “has nothing to do with the EC investigation. The EC is focused on use of RICs for accessing real-time information, while the new licences are focused at firms looking to trade with the RIC or use the RIC to access non-real-time information.”

Du Preez says the latest move means that “any market participant can buy a license that will allow them to trade using the RIC. This will allow the use of the RIC for pre- and post-trade activities, and the right to redistribute RICs in this regard.”

The new RICs arrangement will allow market participants to use and cross-reference the RIC symbol for trade activities. As such, it can be used to facilitate the advertisement of liquidity, acceptance of trade flow and execution of post trade activities with the RIC symbol as a consistent identifier throughout the process.

Additionally, the service will allow Thomson Reuters pricing and reference data customers to use RICs to reference and retrieve securities data from their securities master databases and navigate to connected content such as legal entity identifier (LEI) information.

Du Preez says that “Firms that purchase reference data from Thomson Reuters will also be granted the right to use the RIC to access any non-real-time information, essentially allowing them to use the RIC to access any content, including third-party content, held in their securities master databases.”

Thomson Reuters believes the new service will encourage more efficient and reliable capital markets by giving market participants the freedom to use RIC symbols irrespective of whether they use Thomson Reuters enterprise data products.

As part of the latest initiative, the Bats Chi-X Europe exchange has signed up for the service, which will allow it to deploy RICs in the post-trade services it offers.

According to Paul O’Donnell, COO at BATS Chi-X Europe, “Cross-referencing the BATS Chi-X Europe instrument codes with the Thomson Reuters RIC symbols will enable us to reach new market participants as well as improve efficiency and data transparency by facilitating accurate identification of securities on our platform.”

Du Preez says obvious candidates for adopting the new arrangement include “trade hubs, third-party trade/post-trade processing firms or anyone that wants to send, receive or cross reference messages that contain securities identified with a RIC.”

Source: A-Team Reference Data Review 27.06.2012

Filed under: Data Management, Data Vendor, Reference Data, Standards

Brazilian Markets Still Driving Low-Latency Connectivity

The Brazilian financial markets – with the Sao Paulo-based BM&F BOVESPA securities market in particular – continue to drive activity among connectivity and infrastructure providers looking to support trading firms seeking to take advantage of opportunities in a hot market.

Most recently, GlobeNet – which operates a submarine cable from Nasdaq’s Carteret, NJ data centre to BM&F BOVESPA with 106 milliseconds of round-trip latency – reported that it has built out its own network infrastructure within Sao Paulo, connecting into Florencio de Abreu, 195, a connection point for the exchange. Previously, it relied on third-party telecom providers for the leg within the city. The new link gives it more control and faster implementation of customer connections.

Meanwhile, Sidera Networks is looking to the future by purchasing bandwidth on Seabras-1, a new 32 Tb/second submarine cable being built by Seaborn Networks to link Miami, FL to Sao Paulo. The network – which Seaborn execs say will be the fastest route, though they are not yet saying by how much – is expected to be operational in the fourth quarter of 2014. Seaborn execs say connectivity from Miami to the New York City metro area will also be announced in due course.

Also, back in April, Thomson Reuters opened an Elektron Hosting and Managed Services centre in Sao Paulo, to provide low-latency access to BM&F BOVESPA.

Source: Low-Latency, 12.06.2012, Peter Harris

Filed under: BM&FBOVESPA, Brazil, Exchanges, Trading Technology
