FiNETIK – Asia and Latin America – Market News Network

Asia and Latin America News Network focusing on Financial Markets, Energy, Environment, Commodity and Risk, Trading and Data Management

Reference Data: Tech Mahindra Details Global Data Utility Based on Acquired UBS Platform

Tech Mahindra, a business process outsourcing specialist and parent of London-based investment management technology consultancy Citisoft, has repositioned a reference data platform acquired from UBS Global Asset Management to offer an offshore reference data utility aimed at meeting market demand for lower-cost, high-quality data that can reduce risk and increase efficiency.

The global data utility has been introduced under the Tech Mahindra Managed Data Services brand and offers securities reference data across all asset types, reference data for corporate actions, tax information and end-of-day and intra-day validated pricing data. The utility handles data cleansing and validation, with clients buying licences to access the data.

Tech Mahindra suggests the utility differs from other offerings in the enterprise data management market in that it is owned by the company and can be developed further. It is also agnostic on data feeds, taking in some 20 from vendors including SIX, Markit, Bloomberg, Thomson Reuters and DTCC.

The company’s first customer is UBS Fund Services in Luxembourg. Under the terms of a five-year services contract with UBS, Tech Mahindra will create and store golden copy data and provide multiple intra-day golden copies to the asset manager. As part of the acquisition and customer deal, Tech Mahindra, which is headquartered in Hyderabad, India, will take on some staff from UBS Global Asset Management who were working on the platform in Luxembourg, but most staff will be located in India.

As a repositioned platform, Tech Mahindra MDS already covers all time zones, markets and asset types, updates 2.5 million issuers on a daily basis, receives 200,000 customer price requests and validates 75,000 prices. Some 20,000 corporate actions are checked every day, along with 1,800 tax figures. Looking forward, Tech Mahindra plans to extend these metrics and add reference data covering indices and benchmarks, legal entity identifiers and clients.

While Tech Mahindra will lead sales of the service to the banking, financial services and insurance sectors, Citisoft will be able to provide consultancy as necessary. Steve Young, CEO of Citisoft, says Tech Mahindra MDS has been designed to improve data quality and drive down the total cost of data ownership, in turn reducing risk and increasing efficiency. To manage clients’ cost issues, the company has built a toolkit into the data management system that allows users to analyse the cost of owning data, including people, processes and technology. Data quality will be underpinned by service level agreements and key performance indicators will be added as more clients sign up for services and data volumes grow.

Reflecting on the data challenges faced by financial firms, Citisoft Group CEO Jonathan Clark concludes: “Outsourcing models have evolved over time and attitudes are changing as firms acknowledge that there is a big difference between outsourcing and offshoring, and that captive outsourcing is not an efficient approach. The need is for a commercial relationship with a centralised data utility that can deliver high-quality, accurate data and a lower total cost of ownership.”

Source: Reference Data Review, 24.07.2013

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, Reference Data, Standards

Outsourcing Reference Data Management: Cost Reduction and New Revenue Opportunities

The past 12 months have seen the emergence of new players offering Business Process Outsourcing (BPO) services for Reference Data Management. These new arrivals expand the range of options available to financial institutions for addressing the challenges of regulatory compliance, operational cost reduction and scalability.

But BPO offers other benefits, too: innovative adopters have used the model to create new value-added services. By catering to clients’ data management needs, these players have been able to transform what has traditionally been considered a cost centre into a new and significant source of revenue.

This paper – from AIM Software – explores this exciting new trend, and describes how an established financial institution took advantage of BPO to turn its enterprise data management initiative into a new source of revenue and business growth.

Download the White Paper Now to Find Out More

Source: A-Team, March 2013

Filed under: Corporate Action, Data Management, Data Vendor, Library, Market Data, Reference Data, Standards

ICE to acquire NYSE Euronext for 8.2 billion USD – Background and Analysis

The overall mix of the $8.2 billion of merger consideration being paid by ICE is approximately 67% shares and 33% cash. The transaction value of $33.12 represents a 37.7% premium over NYSE Euronext’s closing share price on Wednesday.  The transaction is expected to close in the second half of 2013, subject to regulatory approvals.
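For context, working backward from the figures quoted above, the implied pre-announcement closing price is $33.12 / 1.377 ≈ $24.05 per share, which puts the premium at roughly $9.07 per share in absolute terms.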

Investors see plenty of upsides in a takeover by ICE, which would create a powerhouse in cross-asset trading and reduce NYSE Euronext’s reliance on stagnating, hyper-competitive equity markets. NYSE’s share of trading in stocks listed on the Big Board has shrunk from 82% to just 21% in a fiercely competitive market. For ICE, a tie-up with NYSE Euronext will give the energy trading bourse a leg-up into the expanding market for over-the-counter derivatives contracts and the geographical reach to take on the Chicago Mercantile Exchange.

The two companies have already inked an agreement for NYSE Liffe to move its clearing operations to ICE Clear Europe. The implications of the deal for NYSE Liffe’s plans to move its clearing from LCH.Clearnet to a newly constructed in-house CCP by June 2013 have not been spelled out.

The combined company is expected to save up to $450 million through cost synergies in the second full year after closing. ICE has successfully integrated more than a dozen acquisitions in the last decade. An earlier bid by ICE to take over NYSE Euronext in tandem with Nasdaq OMX was nixed by the US Justice Department on anti-competitive grounds. Observers see no similar objections being raised to a straight merger, with Nasdaq OMX removed from the equation. (Finextra, 20.12.2012)

NYSE and ICE: Not So Nice for European Equities

Given the sweeping changes hitting exchanges on the back of growing regulation and falling equity volumes in Europe, the combined entity would increase its chance of success: dominating European energy, commodity and short-dated fixed income trading, as well as OTC credit clearing, and leap-frogging Deutsche Boerse to become the world’s third-largest exchange group, with a combined market value of $15.2 billion.

However, not all divisions would benefit. Whilst a tie-up with ICE would enable London-based Liffe to compete more effectively with CME Group in both trading and clearing of OTC products, for Euronext the future looks less certain. According to NYSE’s investor presentation explaining the deal, ICE “intends to explore an IPO of Euronext if market conditions allow and if European policy makers are supportive.” See full article at TABB Forum, 20.12.2012.

ICE and NYSE Euronext Enter Clearing Services Agreement; ICE Clear Europe to Clear NYSE Liffe’s Derivatives Markets

ICE and NYSE Euronext announced that their wholly owned subsidiaries, ICE Clear Europe Limited and LIFFE Administration and Management, have entered into a clearing services agreement pursuant to which ICE Clear Europe will provide clearing services to the London market of NYSE Liffe (“NYSE Liffe”). The clearing services agreement will allow NYSE Liffe to transition seamlessly from its current clearing arrangements. See full article at Bobsguide, 20.12.2012.

Inside the ICE Takeover of NYSE Euronext (TABB Forum Video Interview)

Exchange Consolidation: Getting Over Merger Mania

At this time last year, NYSE Euronext and Deutsche Boerse were more than midway through a year-long merger push that would have resulted in an exchange operator with an estimated $16 billion in combined market capitalisation and a near monopoly on the European exchange-traded derivatives business. Consolidation, it seemed, was the key to competing in the global exchange landscape.

But dreams of consolidation, synergies and economies of scale were quickly dashed. The two biggest cross-border exchange deals — the NYSE/DB merger and the proposed merger of the Singapore Exchange (SGX) and the Australian Securities Exchange (ASX) — were blocked by regulators, and the LSE’s attempt to buy the Toronto Stock Exchange (TMX), which also failed initially due to reluctant regulators, eventually lost out to a domestic bid from Maple Group Acquisition Corp. earlier this year. See full article at TABB Forum, 12.12.2012.

Filed under: Exchanges, News

Integration of Historical Reference Data

Historical data is becoming more crucial to managing risk, but to make it useful, data updates must be reconciled with the moments when the actual changes in the data occurred, says Xenomorph’s Brian Sentance.

There has been much talk recently about integrated data management, as the post-crisis focus on risk management demands a more integrated approach to how the data needed by the business can be managed and accessed within one consistent data framework. Much of the debate has been around how different asset classes are integrated within one system, or how different types of data—such as market and reference data—should be managed together.

However, there has been little discussion on how historical components can be integrated into the data management infrastructure. This will have to change if the needs of regulators, clients, auditors and the business are to be met in the future.

Why are history and historical data becoming more important to data management? There are many reasons. First, data management for risk needs historical data in a way that simply was not necessary for the reference data origins of the industry over a decade ago.

Another reason is the increasing recognition that market data and reference data need to be more integrated, and that having one without the other limits the extent of the data validation that can be performed. For example, how can terms and conditions data for a bond be fully validated if the security is not valued with a model and the resulting prices compared to those observed in the market?
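To make the idea concrete, here is a minimal sketch of such a model-vs-market cross-check in Python; the flat-yield pricing, function names and tolerance are illustrative assumptions, not details from the article:

```python
# Minimal sketch: validate a bond's terms-and-conditions data by pricing it
# from those terms and comparing against an observed market price.

def model_price(face: float, coupon_rate: float, years_to_maturity: int,
                flat_yield: float) -> float:
    """Present value of an annual fixed-coupon bond under a flat yield curve."""
    coupon = face * coupon_rate
    pv = sum(coupon / (1 + flat_yield) ** t
             for t in range(1, years_to_maturity + 1))
    return pv + face / (1 + flat_yield) ** years_to_maturity

def terms_look_consistent(face: float, coupon_rate: float,
                          years_to_maturity: int, flat_yield: float,
                          market_price: float, tolerance: float = 0.05) -> bool:
    """Flag the terms-and-conditions data if model and market prices diverge."""
    theoretical = model_price(face, coupon_rate, years_to_maturity, flat_yield)
    return abs(theoretical - market_price) / market_price <= tolerance

# A 5% coupon bond should price near par when yields are also near 5%;
# a large gap would suggest the stored terms (coupon, maturity) are wrong.
print(terms_look_consistent(100, 0.05, 10, 0.05, 99.8))  # True
```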

As another example, how many data management staff were overloaded by the “false positives” of price movement exceptions during the highly volatile markets of the financial crisis? I would suggest many organizations would have saved hours of manual effort if the price validation thresholds used could have automatically adjusted to follow levels of market volatility derived from historical price data.
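As a rough illustration of that suggestion, the sketch below (hypothetical names and parameters, not any vendor's actual implementation) scales the exception threshold by the standard deviation of recent daily returns, so the tolerance band widens automatically in volatile markets:

```python
import statistics

def dynamic_threshold(price_history: list[float], num_sigmas: float = 4.0) -> float:
    """Exception band as a multiple of the volatility of recent daily returns,
    so the tolerance widens in turbulent markets and tightens in calm ones."""
    returns = [(b - a) / a for a, b in zip(price_history, price_history[1:])]
    return num_sigmas * statistics.stdev(returns)

def is_price_exception(price_history: list[float], new_price: float) -> bool:
    """Raise an exception flag only if the latest move breaches the band."""
    move = abs(new_price - price_history[-1]) / price_history[-1]
    return move > dynamic_threshold(price_history)

# Against a calm history a 3% jump is flagged; the same jump against a
# volatile history passes, cutting down the 'false positives' noted above.
calm = [100.0, 100.1, 99.9, 100.2, 100.0]
print(is_price_exception(calm, 103.0))      # True
volatile = [100.0, 104.0, 97.0, 103.0, 100.0]
print(is_price_exception(volatile, 103.0))  # False
```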

Regulators and other organizations in the financial markets now want to know more of the detail behind the headline risk and valuation reports. The post-crisis need for an increase in the granularity of data should be taken as a given. This is progressing to an extent where external and internal oversight bodies not only want to know what your data is now, but want the ability to see what the data was at the time of market or institutional stress. Put another way, can you easily reproduce all the data used to generate a given report at a specific point in time? Can you also describe how and why this data differs from the data you have today?

“But I already have an audit trail on all my data,” I hear you say. Yes, that is a necessary condition for being able to “rewind the tape” to where you were at a given time, but is it sufficient? An audit trail can be considered a sparse form of historical “time series” storage for data, but as we are all aware, there are few pieces of “static” data that do not change over time (corporate events being the main cause of such changes). The main issue with relying on an audit trail here is that it can only represent the times when a data value was updated in the database, which is not necessarily the same as the time when the data value was valid in the real world.

Take, for example, a sovereign restructuring that forces a change in the maturity dates of its issued bonds. You can only capture when your data management team implemented the change in the database, not necessarily when the change actually took effect in the market. The two times may turn out to be the same if your data management team is efficient and your data suppliers are accurate and timely. But don’t count on it, and don’t be too surprised if a regulator, client or auditor is displeased with your explanation of what the data represents and why it was changed when it was. We are heading into times where not knowing the data detail beneath the headline numbers is no longer acceptable, and historic storage of any kind of data—not just market data—will necessarily become much more prevalent.
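One common answer to this gap is bitemporal storage, where each record carries both the time a fact became true in the market ("valid time") and the time it was recorded in the database ("transaction time"). The sketch below is a minimal Python illustration of the idea; the dummy ISIN and dates are invented, and this is not a description of Xenomorph's product:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class BondMaturityRecord:
    isin: str           # dummy identifier, not a real ISIN
    maturity: date      # the maturity date as reported
    valid_from: date    # when this maturity became effective in the market
    recorded_on: date   # when the data team keyed it into the database

history = [
    # Original terms, loaded when the bond was first set up.
    BondMaturityRecord("XS0000000000", date(2030, 6, 1),
                       valid_from=date(2010, 6, 1),
                       recorded_on=date(2010, 6, 1)),
    # A restructuring extends the maturity; the change is effective in the
    # market on 12 March but the database is only updated three days later.
    BondMaturityRecord("XS0000000000", date(2042, 2, 24),
                       valid_from=date(2012, 3, 12),
                       recorded_on=date(2012, 3, 15)),
]

def as_of(records, market_date: date, known_on: date):
    """Return what the database believed on `known_on` about the state of
    the world at `market_date`, enough to reproduce any past report."""
    candidates = [r for r in records
                  if r.valid_from <= market_date and r.recorded_on <= known_on]
    return max(candidates, key=lambda r: r.valid_from, default=None)

# On 13 March 2012 the restructuring was already effective in the market,
# but a report run that day would still have used the old maturity:
print(as_of(history, date(2012, 3, 13), known_on=date(2012, 3, 13)).maturity)  # 2030-06-01
# Re-running the same report today reproduces the corrected view:
print(as_of(history, date(2012, 3, 13), known_on=date.today()).maturity)       # 2042-02-24
```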

Source: Xenomorph, 13.07.2011

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, Reference Data, Risk Management, Standards
