FiNETIK – Asia and Latin America – Market News Network

Asia and Latin America News Network focusing on Financial Markets, Energy, Environment, Commodity and Risk, Trading and Data Management

Deutsche Börse exclusive licensor of BSE (Bombay Stock Exchange) market data to international clients

Deutsche Börse will be the exclusive licensor of BSE market data to international clients. The new partnership gives market participants easier access to the market data and information products of both exchange groups.

Deutsche Börse Market Data + Services and BSE today announced a partnership under which Deutsche Börse will act as the exclusive licensor of BSE market data and information products to all international clients. The new cooperation will benefit existing and potential customers by giving them access to both exchanges’ market data products under a single license agreement. A signing ceremony was held in Frankfurt on 2 October 2013.

The partnership also allows Deutsche Börse to deepen its client service capabilities in important Asian markets such as India, as well as strengthen the strategic alliance between the two exchanges.

“By partnering with BSE we give customers access to the full suite of real-time, delayed and end-of-day data products offered by both exchanges under a single license agreement. This approach meets clients’ market data needs while reducing their administrative requirements and increasing overall efficiency,” said Georg Gross, Head of Front Office Data + Services, Deutsche Börse.

“BSE is once again happy to partner with Deutsche Börse as this will enhance BSE’s visibility with international clients in the area of market data and information products. BSE will also get access to the innovative product development expertise of Deutsche Börse, which shall help BSE to provide an improved customer experience,” said Balasubramaniam Venkataramani, Chief Business Officer, BSE Ltd.

Under the new cooperation, Deutsche Börse will be responsible for sales and marketing of all BSE market data products to customers outside of India, while BSE continues to serve its domestic clients. Deutsche Börse will also share joint responsibility for product development and innovation, which includes extending its existing market data solutions and infrastructure and creating new ones to support BSE’s product offerings.

Products covered under the cooperation agreement include Real-time, Delayed and End-of-day data for BSE’s Equity and Derivatives markets, corporate data such as Results, Announcements, Shareholding Patterns and Corporate Actions as well as Real-time and Delayed Indices.

This market data agreement also further strengthens the cooperation between Deutsche Börse and BSE that began earlier this year. In March 2013, the two exchanges announced a long-term technology partnership in which BSE will deploy Deutsche Börse Group’s trading infrastructure.

Source: Bobsguide, 07.10.2013

Filed under: Data Management, Data Vendor, Exchanges, India, Market Data

NYSE Launches Data-as-a-Service

NYSE Technologies, the commercial technology division of NYSE Euronext (NYX), and First Derivatives, a provider of software and consulting services to the capital markets industry, are collaborating to create a new suite of historical data ‘as a service’ solutions.

Combining NYSE Technologies’ historical and real-time data expertise covering cash, options, futures and corporate actions with First Derivatives’ products and market expertise, the Tick as a Service offering will grow into a suite of innovative market services giving clients efficient access to large data stores for analytical back-testing and compliance.

“By integrating First Derivatives’ suite of services with our diverse portfolio of technology solutions, including our consolidated feed service, we can offer comprehensive data collection, storage, and analysis ‘as a service’ to our entire global trading community,” said Jon Robson, CEO, NYSE Technologies. “This new service will allow participants to move from a client deployed to managed service for the storage, support and delivery of tick history infrastructure to back-test their algorithms and interrogate their data through a flexible, fully-managed solution.”

Tick as a Service is the first of a number of historical data solutions NYSE will offer.
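The kind of analytical back-testing such a tick store serves can be illustrated with a toy query. The record schema and the VWAP calculation below are illustrative assumptions for the sketch, not NYSE Technologies’ actual API.

```python
from datetime import datetime

# Hypothetical in-memory tick records; field names are invented
# for illustration, not the service's real schema.
ticks = [
    {"ts": datetime(2013, 9, 18, 9, 30, 0), "price": 100.0, "size": 200},
    {"ts": datetime(2013, 9, 18, 9, 30, 5), "price": 100.5, "size": 300},
    {"ts": datetime(2013, 9, 18, 9, 31, 0), "price": 101.0, "size": 500},
]

def vwap(ticks, start, end):
    """Volume-weighted average price over the window [start, end)."""
    window = [t for t in ticks if start <= t["ts"] < end]
    traded = sum(t["price"] * t["size"] for t in window)
    volume = sum(t["size"] for t in window)
    return traded / volume if volume else None

print(vwap(ticks, datetime(2013, 9, 18, 9, 30), datetime(2013, 9, 18, 9, 32)))
```

A back-test would run many such windowed queries against years of history, which is why a managed tick store, rather than a client-deployed one, is attractive.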

“Our collaboration with NYSE Technologies will deliver substantial benefits to clients – improving time to market while efficiently minimizing operational overhead and reducing costs,” said Brian Conlon, CEO, First Derivatives. “I am delighted that First Derivatives is forging a relationship with one of the most capable service providers in the global capital markets community who understand that the community needs managed solutions to address commoditized services and thus release capital for differentiating opportunities.”

NYSE Technologies offers a diverse array of products and services to the buy side including order routing, liquidity discovery and access to a community of over 630 broker-dealers and execution destinations globally; and to the sell side, including high-performance, end-to-end messaging software and market data products; and market venues and exchanges, including multi-asset exchange platform services, managed services and expert consultancy.

NYSE Technologies’ technology portfolio includes a broad array of real-time, historical and reference data alongside the capital markets community cloud, a hosted consolidated feed service (SuperFeed), and one of the world’s largest FIX-based order routing networks (Marketplace), all available across the Secure Financial Transaction Infrastructure (SFTI) network.

First Derivatives’ flagship Delta suite of products includes Delta Flow, Delta Data Factory, Delta Algo, Delta Margin and Delta Stream, which are used in high-volume, low-latency environments. Combining key elements of each company’s product sets and unique functionality, NYSE Technologies and First Derivatives will develop a one-of-a-kind solution delivering an innovative suite of high-performance services that enhance real-time trading, CEP, market data and trading applications.

Source: NYSE Tech, 18.09.2013

Filed under: Data Management, Data Vendor, Market Data

Reference Data: Tech Mahindra Details Global Data Utility Based on Acquired UBS Platform

Tech Mahindra, a business process outsourcing specialist and parent of London-based investment management technology consultancy Citisoft, has repositioned a reference data platform acquired from UBS Global Asset Management to offer an offshore reference data utility aimed at meeting market demand for lower cost, high quality data that can reduce risk and increase efficiency.

The global data utility has been introduced under the Tech Mahindra Managed Data Services brand and offers securities reference data across all asset types, reference data for corporate actions, tax information and end-of-day and intra-day validated pricing data. The utility handles data cleansing and validation, with clients buying licences to access the data.

Tech Mahindra suggests the utility differs from other offerings in the enterprise data management market as it is owned by the company and can be developed further in-house. It is also feed-agnostic, currently taking 20 feeds from vendors including SIX, Markit, Bloomberg, Thomson Reuters and DTCC.

The company’s first customer is UBS Fund Services in Luxembourg. Under the terms of a five-year services contract with UBS, Tech Mahindra will create and store golden copy data and provide multiple intra-day golden copies to the asset manager. As part of the acquisition and customer deal, Tech Mahindra, which is headquartered in Hyderabad, India, will take on some staff from UBS Global Asset Management who were working on the platform in Luxembourg, but most staff will be located in India.
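The golden-copy process described above can be sketched in miniature. The vendor names come from the article; the record schema, field values and simple majority-vote rule are assumptions for illustration (a production utility would apply vendor hierarchies and tolerance checks instead).

```python
from collections import Counter

# Illustrative vendor records for one security; values are invented.
feeds = {
    "Bloomberg":       {"isin": "DE0005810055", "currency": "EUR", "close": 54.10},
    "Thomson Reuters": {"isin": "DE0005810055", "currency": "EUR", "close": 54.10},
    "SIX":             {"isin": "DE0005810055", "currency": "EUR", "close": 54.12},
}

def golden_copy(feeds):
    """Build a consolidated record by majority vote per field across
    vendor feeds; ties fall to the first value seen."""
    golden = {}
    fields = {f for record in feeds.values() for f in record}
    for field in fields:
        values = [record[field] for record in feeds.values() if field in record]
        golden[field] = Counter(values).most_common(1)[0][0]
    return golden

print(golden_copy(feeds))
```

Here the two vendors agreeing on a close of 54.10 outvote the outlier, which is the essence of the cleansing and validation step clients license access to.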

As a repositioned platform, Tech Mahindra MDS already covers all time zones, markets and asset types, updates 2.5 million issuers on a daily basis, receives 200,000 customer price requests and validates 75,000 prices. Some 20,000 corporate actions are checked every day, along with 1,800 tax figures. Looking forward, Tech Mahindra plans to extend these metrics and add reference data around indices and benchmarks, legal entity identifiers and clients.

While Tech Mahindra will lead sales of the service to the banking, financial services and insurance sectors, Citisoft will be able to provide consultancy as necessary. Steve Young, CEO of Citisoft, says Tech Mahindra MDS has been designed to improve data quality and drive down the total cost of data ownership, in turn reducing risk and increasing efficiency. To manage clients’ cost issues, the company has built a toolkit into the data management system that allows users to analyse the cost of owning data, including people, processes and technology. Data quality will be underpinned by service level agreements and key performance indicators will be added as more clients sign up for services and data volumes grow.

Reflecting on the data challenges faced by financial firms, Citisoft Group CEO Jonathan Clark, concludes: “Outsourcing models have evolved over time and attitudes are changing as firms acknowledge that there is a big difference between outsourcing and offshoring, and that captive outsourcing is not an efficient approach. The need is for a commercial relationship with a centralised data utility that can deliver high-quality, accurate data and a lower total cost of ownership.”

Source: Reference Data Review, 24.07.2013

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, Reference Data, Standards

Valuations – Toward On-Demand Evaluated Pricing

Risk and regulatory imperatives are demanding access to the latest portfolio information, placing new pressures on the pricing and valuation function. And the front office increasingly wants up-to-date valuations of hard-to-price securities.

These developments are driving a push toward on-demand evaluated pricing capabilities, with pricing teams seeking to provide access to valuations at a higher update frequency than before.


Source: A-Team, 26.06.2013

Filed under: Data Vendor, Library, Market Data, Reference Data

GFI Group to Supply Market Data to Mexico’s PiP LATAM

GFI Market Data, a division of GFI Group Inc. (NYSE: GFIG), announced today that it has signed an agreement with Mexico’s Proveedor Integral de Precios (PiP) under which GFI Market Data will become an official price contributor to PiP’s eurobond pricing and curve calculations.

PiP started operations in the year 2000 and was the first price vendor company authorized by the Mexican Securities Commission (CNBV) to provide prices for the valuation of financial assets. They currently have operations in Mexico, Peru, Colombia, Panama and Costa Rica.

Francesco Cicero, Head of eTrading at GFI Group said: “We are very happy to be working with PiP and to be able to supply them and their clients with an independent view of the eurobond markets derived from our highly experienced brokers as well as from our premier electronic trading screen for fixed income, GFI CreditMatch®”.

PiP distributes official closing prices via its PiP-Latam© system.

GFI Market Data provides real bid and offer prices and spreads for a broad range of instruments including asset backed securities, corporate bonds, emerging market bonds, floating rate notes, high yield bonds and structured products. Sourced directly from GFI CreditMatch®, our award-winning electronic trading platform for bonds and fixed income derivatives, GFI data reflects market sentiment rather than indications gleaned through aggregated pricing.

Source: Bobsguide 03.01.2013

Filed under: Colombia, Data Vendor, Latin America, Market Data, Mexico, Peru

NYSE Technologies expands SFTI network in Asia

NYSE Technologies, the commercial technology division of NYSE Euronext, today announced the continuing expansion of its Secure Financial Transaction Infrastructure (SFTI) in Asia with the introduction of two access centres located in Hong Kong.

Customers now, for the first time, have direct access to the SFTI network, allowing them to connect from Hong Kong to services offered by NYSE Technologies through SFTI, including access to Hong Kong Exchanges & Clearing (HKEx), all major international trading venues, market data solutions, plus the NYSE Euronext capital markets community.

As part of the expansion of the SFTI network to include Hong Kong, NYSE has also extended SFTI to the new HKEx Data Centre colocation facility, giving customers there access to all the services available on SFTI through a simple cross connect to their colo racks. NYSE Technologies also plans to expand SFTI in the region to connect other markets like Australia and Korea.

NYSE Technologies’ Secure Financial Transaction Infrastructure provides access to a comprehensive range of capital markets products through a single point of access and offers low-latency trading access to the NYSE Liffe and NYSE Euronext markets. SFTI Asia is the most recent extension of the global backbone, enabling Asian firms to receive market data and trade on multiple markets. Designed to be the industry’s most secure and resilient network, SFTI is specifically built for electronic trading and market data traffic thus enabling firms to reduce their time-to-market, improve their performance and significantly lower the cost of their trading infrastructure. Furthermore, the global backbone allows customers to connect to their trading infrastructure distributed in financial centres around the world using a SFTI connection on the other side of the world.

“The addition of these important access centres in Hong Kong is a further step in the expansion of NYSE Technologies’ footprint and reach of the SFTI Asia network and adds to our established presence in Singapore and Tokyo,” commented Daniel Burgin, Head of Asia Pacific, NYSE Technologies. “Offering multiple access centres in the Asia Pacific region allows customers to use SFTI Asia to connect to regional and global exchanges and markets in a cost effective way through a single connection at each of the client’s locations around the region. This eliminates the overheads and costs associated with maintaining separate network connections in each location to multiple trading venues.”

Source: NYSE Technology 06.12.2012

Filed under: Australia, China, Data Management, Data Vendor, Exchanges, Hong Kong, Japan, Korea, Market Data, News, Singapore

Capco Proposes the Creation of a Data Culture to Advance Data Management RDR

Financial firms are falling short on data management issues such as calculating the true cost of data, identifying the operational cost savings of improved data management and embracing social media data, but according to research by consultancy Capco, these issues can be resolved with a cross-organisational and practical approach to data management and the development of a data culture.

The business and technology consultancy’s report – ‘Why and how should you stop being an organisation that manages data and become a data management organisation’ – is based on interviews with nearly 100 senior executives at European financial institutions. It considers the many approaches to data management across the industry and within individual enterprises, as well as the need to rethink data management. It states: “There is one certainty: data and its effective management can no longer be ignored.”

The report suggests an effective data management culture will include agreed best practices that are known to a whole organisation and leadership provided by a chief data officer (CDO) with a voice at board level and control of data management strategy and operational implementation.


Turning the situation around and attaching practical solutions to the data management vision of an all-encompassing data culture, Capco lists regulatory compliance, risk management, revenue increase, innovation and cost reduction as operational areas where good data management can have a measurable and positive effect on profit and loss.

Setting out how an organisation can create an effective data culture, Capco notes the need to change from being an organisation that is obliged to do a certain amount of data management, to a mandated and empowered data management organisation in which data has ongoing recognition as a key primary source. The report concludes: “Every organisation has the potential, as well as the need, to become a true data management organisation. However, the journey needs to begin now.”

Source: Reference Data Review, 24.10.2012

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, Reference Data, Standards

NYSE Technologies, Bolsa Mexicana and ATG build Mexican trading infrastructure

NYSE Technologies, the commercial technology division of NYSE Euronext (NYSE: NYX), today announced that, in collaboration with Bolsa Mexicana de Valores (BMV) and Americas Trading Group (ATG), it has built and deployed a state-of-the-art trading infrastructure complete with global connectivity, risk management functionality and direct market data distribution for customers trading in Mexican markets.

Designed to support the launch of Bolsa Mexicana’s new matching engine and midpoint hidden order book, this solution incorporates advanced technology developed specifically for every part of the trade cycle to provide unprecedented accessibility, performance and risk management for trading on Bolsa Mexicana’s exchanges with the aim of establishing Mexico as a premier Latin American investment destination.

Initially, this collaboration will provide:
• A new co-location model for access to cash and derivatives markets (through ATG directly at the KIO Data Center)
• Global connectivity for buy side, sell side and vendors from the US, Europe, Asia and other Latin American markets such as Brazil and Chile
• Sophisticated risk management functionality for international order routing (solution implemented by NYSE Technologies)
• Low touch order stamping by Bolsa Mexicana’s members to settle orders
• Global Market Data distribution via NYSE Technologies Secure Financial Transaction Infrastructure (SFTI) with direct contracting with BMV

“We are excited to again work with one of Latin America’s leading market operators in Bolsa Mexicana and market participants in ATG to deliver dramatic improvements across critical elements of the trade cycle,” said Dominique Cerruti, NYSE Technologies. “By continuing to improve access to key Latin American exchanges and customers, we continue to realize our vision of creating a global capital markets community with cutting-edge connectivity, performance and risk management.”

“Today’s announcement with NYSE Technologies and ATG demonstrates our ongoing commitment to grow and enhance our markets in Mexico to deliver highly flexible multi-market, multi-asset trading,” said Jorge Alegria, Head of Market Operations, Bolsa Mexicana de Valores. “We look forward to extending our relationship and cooperation with NYSE Technologies in several important areas that will further expand that growth and performance in the near future.”

Source: FinExtra, 18.10.2012

Filed under: Asia, BMV - Mexico, Chile, Colombia, Data Management, Data Vendor, Latin America, Market Data, Mexico, Risk Management, Trading Technology

BM&FBOVESPA Market Data Feeds and Order Routing – News Letter 14

Changes to PUMA UMDF Market Data Feed in Certification Environment
Since September 11, 2012, a new distribution of the PUMA UMDF instrument groups has been available in the certification environment. The change was necessary to adapt this environment to the start of trading in the S&P 500 futures contract. On October 1, 2012, distribution of SELIC instruments in the certification environment is scheduled to start via existing channels 3 and 4.
New Version of MegaDirect Order Entry Interface Available in Certification Environment
Version 4 of the MegaDirect order entry interface for the BOVESPA segment is available in the certification environment. Participants who use versions 2 and 3 must update them in order to maintain compatibility with the BM&FBOVESPA PUMA Trading System matching engine. MegaDirect users must perform the tests mentioned in External Communication 023/2012-DI by no later than September 28, 2012.
EntryPoint Order Entry Interface Available in Production Environment
The EntryPoint trading interface for the BOVESPA segment is now available in the production environment. Although it does not entail significant performance benefits (lower latency) at this time, it mitigates the risk of impacts during BOVESPA segment migration to the BM&FBOVESPA PUMA Trading System.

Customers who will use this new interface must observe the requirement for uniqueness in the ClOrdID (11) tag in the order entry, change and cancellation messages, to prevent any crossed references between orders of the same customer and instrument from generating inconsistencies in the participant’s management. The customer is responsible for correctly filling out the tag, as announced in External Communication 021/2012-DI.
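The ClOrdID (tag 11) uniqueness requirement can be met with a simple session-scoped generator on the customer side. The prefix-plus-date-plus-counter scheme below is one possible approach, not a format mandated by BM&FBOVESPA.

```python
import itertools
from datetime import datetime, timezone

class ClOrdIdGenerator:
    """Issues ClOrdID (tag 11) values that are unique within a session,
    so new-order, change and cancel messages never cross-reference.
    The ID format here is an illustrative assumption."""

    def __init__(self, session_prefix):
        self.prefix = session_prefix
        self.counter = itertools.count(1)  # monotonically increasing

    def next_id(self):
        today = datetime.now(timezone.utc).strftime("%Y%m%d")
        return f"{self.prefix}-{today}-{next(self.counter):08d}"

gen = ClOrdIdGenerator("SESS01")
new_order_id = gen.next_id()  # used on the New Order Single
cancel_id = gen.next_id()     # a cancel request gets a fresh, unique ID
assert new_order_id != cancel_id
```

Because every message for the same customer and instrument carries a distinct ID, the participant’s order management can always tell which order a fill or reject refers to.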

Alterations to ProxyDiff Market Data Feed in Certification Environment
The ProxyDiff market data feed is now available in the certification environment in accordance with the alterations described in External Communication 024/2012-DI. This market data feed includes a large number of test cases (e.g. clash of messages with the same Order ID, and numeric and alphanumeric groups of quotations) that will be implemented in the production environment during and after migration to the BM&FBOVESPA PUMA Trading System in the BOVESPA segment. All customers who use the ProxyDiff market data feed must perform the tests by September 28, 2012.

See the full IT News Letter Nr. 14

Source: BM&FBOVESPA, IT News Letter Nr. 14, 17.09.2012

Filed under: BM&FBOVESPA, Brazil, Data Management, Exchanges, Market Data

Brazil:CMA – The Latin American Market Data and Trading Company offers Direct BM&F BOVESPA Connectivity

July 30, 2012 – New York, NY (USA) and São Paulo (Brazil) – Latin American trading services provider CMA Inc. (http://www.cma.net) has announced a new delivery method for direct BM&F and BOVESPA market data and trading connectivity for international firms.

CMA has been providing BM&F and BOVESPA market data to the trading community of Brazil for over thirty-five years. It now has leading exchange trading software services in Spain, Mexico, Colombia, Peru, Argentina and Chile, with 20,000 subscribers worldwide. Today, CMA’s platforms such as CMA Series 4 have been rolled out on an impressive network called “CMA Redes Digitais.” The Redes Digitais infrastructure is installed and directly connected within the exchange’s datacenter for the lowest possible latency.

Today’s announcement by CMA represents the launch and deployment of a directly connected infrastructure at the BM&F BOVESPA in São Paulo, Brazil, linked with the CMA datacenter in New York. Companies can now co-locate their routers and servers with CMA at the BM&F BOVESPA datacenter or choose to receive the raw market data over CMA’s multi-gig private lines, which terminate at CMA’s datacenter in New York City. The offering was developed to help firms trading with counterparties in São Paulo, or firms going directly to the exchange’s trading systems in a Direct Market Access (DMA) fashion.

Many firms need to bring market data back to the USA and, in return, send trade messages to the exchange in Brazil. In both cases planning, paperwork and relationships are needed to complete the set-up. CMA is a certified exchange vendor able to help participants with the documentation required by the exchange to receive market data and to send trade messages. CMA also provides the relationships and connectivity to Brazilian brokers who can handle orders for foreign firms.

“CMA’s market visibility as a prime vendor of the exchange and to 90% of the exchange’s broker dealers allows for our customers to be installed, up and running, and trading as fast as possible,” commented Mario Chuman, General Manager of CMA. “International firms rely on us to help them with both exchange and broker connectivity, enabling market data and trading right from our switches in São Paulo which are now directly connected to our New York datacenter.”

CMA is utilizing the fastest transatlantic cable systems available, giving connectivity managers the security they require for proper networking, the lowest possible latency for competitiveness, multi-market/asset availability and an array of choices in doing so. Connectivity managers can now expand their market reach with CMA as they look to join both the BM&F BOVESPA equities and futures markets, at roughly 50% lower IT and communications costs than other offerings, which generally provide only one feed stream and one market at a time. CMA’s solution is the most cost-effective, fastest and easiest way to implement an electronic trading solution for Brazilian securities.

Source, CMA, 30.07.2012

Filed under: Brazil, Data Vendor, Exchanges, Market Data, Trading Technology

Fidessa explores the development of electronic trading in Latin America

Fidessa group plc, a provider of high-performance trading, investment management and information solutions for the world’s financial community, has today announced the publication of a white paper, Life in the fast lane: the development of electronic trading in Latin America. The paper explores the current trading landscape in Brazil, Mexico and the Andean region, and how recent technology and regulatory developments will affect domestic and international brokers trying to establish a rewarding position in these fast-paced markets.

White paper looks at market growth and trading technology in Brazil, the Andean region and Mexico

To highlight the unique trading conditions, market challenges, technology and regulatory changes shaping each market, Fidessa’s white paper considers specific regions in Latin America individually: from the extreme growth of Brazil as a strategic trading destination, to upgrades being made to Mexico’s trading infrastructure as well as the Andean region’s efforts to boost liquidity and exploit economies of scale. The paper explores the challenges presented by Latin America’s varying stages of growth as an electronic marketplace and concludes that flexibility, agility and scalability will be key attributes of the technology solution.

Alice Botis, Fidessa’s Head of Business Development in Latin America comments: “Latin America is attracting significant interest from global market participants and this shows no signs of stopping. Brokers are looking at the unique benefits each country has to offer and are taking the necessary steps to gain a presence in multiple locations across the region, in financial centers such as Brazil, Chile, Colombia, Mexico and Peru. Each country retains its unique style of trading, so it is important for buy-side and sell-side firms to understand how the marketplace is evolving in each region within Latin America and how those developments fit in with their local and global trading strategies.”

Source: Bobsguide, Fidessa 12.07.2012

Filed under: Brazil, Chile, Colombia, Latin America, Mexico, News, Peru

Interactive Data Asian ticker plants go live

Interactive Data Corporation, a leading global provider of managed ultra-low latency IT and market data services to facilitate electronic trading, today announced that its ticker plants in Asia are now live. Based in two data centres in Hong Kong, the new ticker plants offer a significant reduction in the latency of PlusFeed, Interactive Data’s low-latency, consolidated global data feed.

With increasing adoption of electronic trading in Asia, market data has become a crucial issue. Firms require high quality data at the desired speed from across the region. The new ticker plants, located with Pacnet at MEGA-iAdvantage and with Equinix in their HK1 facility, provide Interactive Data’s clients with lower latency access to Asian venues covered by PlusFeed, as well as to a wide range of additional international sources.

Interactive Data’s new low-latency co-location facilities in Hong Kong will also offer international clients the option to co-locate their applications alongside the ticker plants. This will enable them to obtain optimised low-latency delivery of Asian data via the Interactive Data sites in Hong Kong.

Emmanuel Doe, president, Trading Solutions Group for Interactive Data, said: “With the growth of electronic trading in Asia and higher data volumes globally, clients in Asian markets have an increasing need for cost-effective, real-time market data and delivery. We continue to expand our electronic trading services in Asia and elsewhere throughout the world to meet these requirements.”

Dan Videtto, managing director for Asia Pacific for Interactive Data, added: “The addition of two new ticker plants within one of the region’s primary trading hubs is a significant development. This is one of many enhancements that we will be delivering to Asian markets as we look to support firms in the region through our low latency data and global trading infrastructure solutions.”

Interactive Data’s PlusFeed delivers low-latency data from more than 450 sources worldwide, covering more than 140 exchanges and including multi-asset class instrument coverage and extensive Level 2 data. The feed is used by financial institutions globally to power algorithmic and electronic trading applications and is now supported by ticker plants located in Europe, the US and Asia.

In addition, clients can use the Interactive Data 7ticks network to gain direct market access (DMA), advanced co-location and proximity hosting to global direct exchange data, consolidated data, as well as reference and corporate actions data.

Direct or cross-connect access to a wide range of global exchanges is also available in Asia through Interactive Data’s Points of Presence (POPs) with leading global providers of data centers and technology services, including Equinix, Inc. (Nasdaq: EQIX), Interxion, Telx and KVH.

Source: Finextra, 27.06.2012

Filed under: Asia, Data Management, Data Vendor, Market Data

Thomson Reuters Opens RICs to all with Non-Realtime License

Thomson Reuters is taking a step toward answering client calls for more open access to its Reuters Instrument Code (RIC) symbology. The company is making RICs available for use with non-real-time information in client and non-client financial institutions’ trade processing systems.

According to enterprise content chief Gerry Buggy, who has spearheaded Thomson Reuters’ response to the EC anti-competition complaint, the new facility is the “first step in supporting the financial community’s symbology needs across all parts of the trading life cycle through our evolving symbology services.”

The move comes in the wake of the EC investigation and subsequent complaint into the use of RICs in real-time consolidated data feeds. In response to that complaint, many financial services practitioners have called for more open access to the RIC, which is entrenched in many firms’ front-, middle- and back-office trading and trade processing systems.

According to Jason du Preez, Global Business Manager, Enterprise Platform, at Thomson Reuters, the latest initiative “has nothing to do with the EC investigation. The EC is focused on use of RICs for accessing real-time information, while the new licences are aimed at firms looking to trade with the RIC or use the RIC to access non-real-time information.”

Du Preez says the latest move means that “any market participant can buy a license that will allow them to trade using the RIC. This will allow the use of the RIC for pre- and post-trade activities, and the right to redistribute RICs in this regard.”

The new RICs arrangement will allow market participants to use and cross-reference the RIC symbol for trade activities. As such, it can be used to facilitate the advertisement of liquidity, acceptance of trade flow and execution of post trade activities with the RIC symbol as a consistent identifier throughout the process.
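In practice, using the RIC as the consistent identifier through the trade life cycle amounts to cross-referencing each venue’s own instrument codes against RIC symbols and keying downstream messages by the RIC. A minimal sketch of that idea follows; the venue codes, mappings and message fields below are invented for illustration, not taken from any Thomson Reuters or venue specification.

```python
# Hypothetical sketch: cross-reference venue instrument codes to RIC symbols
# so the RIC serves as the consistent identifier through pre- and post-trade
# processing. All codes and mappings here are invented for illustration.

VENUE_TO_RIC = {
    "VOD.BS": "VOD.L",    # hypothetical venue code -> RIC
    "BARC.BS": "BARC.L",
}

def to_ric(venue_code: str) -> str:
    """Resolve a venue instrument code to its RIC, or raise if unmapped."""
    try:
        return VENUE_TO_RIC[venue_code]
    except KeyError:
        raise KeyError("no RIC mapping for " + venue_code)

def tag_post_trade_message(venue_code: str, qty: int, price: float) -> dict:
    """Build a post-trade message keyed by RIC rather than the venue code."""
    return {"ric": to_ric(venue_code), "qty": qty, "price": price}
```

With a shared cross-reference like this, an advertisement of liquidity, an accepted order and the resulting post-trade report can all carry the same RIC, whichever venue code the trade originated under.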

Additionally, the service will allow Thomson Reuters pricing and reference data customers to use RICs to reference and retrieve securities data from their securities master databases and navigate to connected content such as legal entity identifier (LEI) information.

Du Preez says that “Firms that purchase reference data from Thomson Reuters will also be granted the right to use the RIC to access any non-real-time information, essentially allowing them to use the RIC to access any content, including third-party content, held in their securities master databases.”

Thomson Reuters believes the new service will encourage more efficient and reliable capital markets by giving market participants the freedom to use RIC symbols irrespective of whether they use Thomson Reuters enterprise data products.

As part of the latest initiative, the BATS Chi-X Europe exchange has signed up for the service, which will allow it to deploy RICs in the post-trade services it offers.

According to Paul O’Donnell, COO at BATS Chi-X Europe, “Cross-referencing the BATS Chi-X Europe instrument codes with the Thomson Reuters RIC symbols will enable us to reach new market participants as well as improve efficiency and data transparency by facilitating accurate identification of securities on our platform.”

Du Preez says obvious candidates for adopting the new arrangement include “trade hubs, third-party trade/post-trade processing firms or anyone that wants to send, receive or cross reference messages that contain securities identified with a RIC.”

Source: A-Team Reference Data Review 27.06.2012

Filed under: Data Management, Data Vendor, Reference Data, Standards

NYSE Data Services to deliver all Market Data via Web Services

NYSE Technologies, the commercial technology division of NYSE Euronext, and Xignite Inc., provider of web-based market data services, have announced their agreement to launch a new service providing access to real-time, historical, and reference market data for all NYSE Euronext markets via the Internet. In extending the benefits offered by the NYSE Technologies Capital Markets Community platform introduced in 2011, NYSE Technologies Market Data Web Services is geared towards non-latency sensitive clients and those in remote locations. The first phase offers real-time retail reference pricing for NYSE, NYSE MKT, and NYSE Arca markets.

NYSE Technologies Market Data Web Services, which is powered by XigniteOnDemand, allows clients the flexibility to access only the content that they need for a wide range of purposes from developing trading solutions for financial web portals to enabling Internet-powered devices. The user interface offers data services from across NYSE Technologies’ full portfolio of market data assets. The second phase scheduled for the third quarter of 2012 will offer NYSE Bonds data, NYSE Liffe Level 1 and Level 2 data, and NYSE and NYSE MKT Order Imbalances.

“Our goal is to connect data consumers directly to our content in multiple ways: via colocation at our Liquidity Centers, direct connection to our SFTI network and now via the web,” said Jennifer Nayar, Head of Global Data Products, NYSE Technologies. “We are pleased to partner with Xignite to address the demand for internet-based delivery of market data and, as a result, further extend our client base to non-latency-sensitive and remote clients.”

Using a standard Internet connection, users can access NYSE Euronext market data and customize it according to their specific trading needs. Customers anywhere in the world, including those in remote locations, can access the data they need and build on it easily for fast time to market.
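A web-services delivery model like the one described above typically means a client requests a quote over plain HTTP and parses a structured response. The sketch below illustrates that pattern only; the endpoint URL, parameter names and response fields are placeholders I have invented, not the actual NYSE Technologies Market Data Web Services API, which defines its own URLs and schemas.

```python
import json
from urllib.parse import urlencode

# Placeholder endpoint -- NOT the real NYSE Technologies / Xignite service.
BASE_URL = "https://api.example.com/v1/quote"

def build_quote_url(symbol: str, api_key: str) -> str:
    """Assemble a hypothetical web-services quote request URL."""
    return BASE_URL + "?" + urlencode({"symbol": symbol, "apikey": api_key})

def parse_quote(payload: str) -> dict:
    """Extract the fields a retail reference-pricing client would typically use
    from a JSON response body (field names are assumed for illustration)."""
    data = json.loads(payload)
    return {"symbol": data["symbol"], "last": float(data["last"])}
```

The appeal of this delivery channel, as the article notes, is that nothing beyond a standard Internet connection and an HTTP client is needed, in contrast to colocation or a dedicated SFTI connection.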

“The delivery of market data content via websites and mobile devices continues to build momentum and we are excited to leverage these applications to help increase access to NYSE Euronext data,” said Stephane Dubois, Xignite’s CEO and founder. “Both NYSE Technologies and Xignite have demonstrated a strong commitment to the electronic delivery of market data and the ability to serve today’s growing, diverse array of applications, especially the mobile market.”

The initiative with Xignite complements NYSE Technologies’ enterprise cloud strategy. NYSE Technologies Capital Markets Community Platform enables a range of industry firms and registered market participants to purchase computing power as needed, freeing them to focus on core business strategy rather than complicated IT infrastructure. NYSE Technologies Market Data Web Services provides clients with another market data delivery option for NYSE Euronext content, supporting current access methods offered by NYSE Technologies where direct connect clients and SuperFeed clients have the choice of collocating in NYSE Technologies’ Liquidity Center or connecting to its Secure Financial Transaction Infrastructure® (SFTI) network.

Source: NYSE Technologies, 06.06.2012

Filed under: Data Management, Data Vendor, Market Data, Trading Technology

Coming to Grips With Big Data Challenges by Dan Watkins

The rate of data growth in financial markets has outstripped firms’ ability to manage it.

Some debates have gone so far as to dismiss Big Data as untamable and uncontrollable in the near term with the computing architectures commonly accepted as adequate today. I agree, but argue that conventional data transport – not management – is the real challenge in handling and utilizing Big Data effectively.

From exchange to trading machine, new ticks and market-depth data arrive only as fast as the delivery infrastructure allows. The market data feeds used in conventional exchange trading are but a fraction of the market information actually available.

Perhaps due to high costs of $100,000 per terabyte, many market participants deem the use of more data as a bit too aggressive. Or they believe that high performance computing (HPC) is the next generation technology solution for any Big Data issue. Firms, therefore, are sluggishly advancing their information technology in a slow cadence in tune with the old adage: “if it ain’t broke don’t fix it.”

Over the last decade, Wall Street business heads have agreed with engineers that the immense perplexity of Big Data is best categorized by the three V’s of Doug Laney’s 2001 META Group report: Big Volume, Big Velocity and Big Variety.

When looking at “Big Volume” 10 years ago, the markets had just defragmented under Regulation ATS. A flurry of new market centers arose in U.S. equities, as did dark liquidity pools, giving rise to a global “electronic trading reformation.” Straight-through processing (STP) advocates evangelized platforms such as BRASS, REDIPlus and Bloomberg order management systems (OMS), resulting in voluminous, fragmented market data streaming to 5,000 NASD/FINRA trading firms and 700,000 professional traders.

Today, the U.S. has 30+ Securities and Exchange Commission-recognized self-regulatory organizations (SROs), commonly known as exchanges and ECNs. For the first time since 2002, full market depth feeds from NASDAQ allow firms to collect, cache, react, store and retrieve feeds on six hours of trading for nearly 300 days a year more transparently than ever. Big Data volume has grown 1,000 percent and has reached three terabytes of market data depth per day.

Billions of dollars are being spent on increasing “Big Velocity.” The pipes that wire exchanges through the STP chain to the trader have become 100 times faster and larger, but still not fast enough to funnel the bulk of information lying idle back in the database. Through “proximity hosting,” the telco is eliminated and latency is lowered. This structure accommodates larger packets but not really more information, as Big Data remains the big, quiet elephant in the corner.

Five years after Reg ATS, markets are bursting at the seams with electronic trading that produces explosive market data that breaks new peak levels seemingly every day. The SEC’s Regulation National Market System (Reg NMS), struck in 2007, requires exchanges and firms to calculate the best price for execution to be compliant. Firms are also now mandated to sweep all exchanges’ market order books and process all of that data for a smart execution.
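The Reg NMS obligation described above – sweeping every venue’s quotes and routing to the best price – reduces, at its simplest, to comparing top-of-book quotes across markets. A toy sketch, with invented venue names and prices (real smart order routers must also weigh depth, fees and latency):

```python
# Hypothetical top-of-book offers (lowest ask) from several venues.
# Venue names and prices are invented for illustration.
offers = {"NYSE": 10.12, "ARCA": 10.11, "BATS": 10.13}

# For a buy order, best execution means routing to the lowest offer.
best_venue = min(offers, key=offers.get)
best_price = offers[best_venue]
```

The data burden the article describes comes from doing this comparison continuously, across every listed instrument, against full order books rather than a single top-of-book snapshot.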

After the execution, traders have to track the “order trail” from price to execution for every trade and store all of that information for seven years in the event of an audit recall of a transaction.

Under Reg NMS, subscribing to the full depth of all 30+ markets in “real time” would require a firm to have a one-terabit pipe for low latency. Since a terabit pipe is not realistic, data moves at one gigabit per second, which is relatively slow with the data in queue 50-100 terabytes deep. Multi-gigabit pipes, as fast as they seem, are still akin to driving five miles an hour on a 55 mph highway.

Analysts typically call data from a database with R (Revolution Analytics) and SAS connectors. The process involves bringing data into an analytical environment in which the user runs models and computations on subsets of a larger store before moving on to the next data-crunching job. The R and SAS connectors between the file servers and the database run at 10/100BASE-T, making moving a 50-terabyte environment like driving one mile per hour in a 55 mph zone.
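The highway analogies above can be made concrete with back-of-the-envelope arithmetic: at 1 Gbit/s, moving 50 terabytes takes more than four days, while a (hypothetical) terabit link would cut the same transfer to minutes. A small helper to check the math, using decimal units and ignoring protocol overhead:

```python
def transfer_hours(data_terabytes: float, link_gigabits_per_s: float) -> float:
    """Hours needed to move a data set over a link.

    Uses decimal units (1 TB = 8e12 bits) and ignores protocol overhead,
    so real transfers would be somewhat slower.
    """
    bits = data_terabytes * 8e12
    seconds = bits / (link_gigabits_per_s * 1e9)
    return seconds / 3600.0

# 50 TB over a 1 Gbit/s link: roughly 111 hours (about 4.6 days).
# The same 50 TB over a 1 Tbit/s link: under 7 minutes.
slow = transfer_hours(50, 1)
fast = transfer_hours(50, 1000)
```

This is the quantitative gap behind the article’s argument that transport, not storage or compute, is the binding constraint.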

We all hear the polemics regarding data formats, the jigsaw puzzle of unstructured data, and the claim that “Big Variety” is the obstacle. Even with the standardization of SQL, under which analysts can pose any ad hoc question, too many sources and too many pipes from analytic servers cause traffic jams. SQL databases are ideal for ad hoc queries but slow at compiling unstructured data. Aggregating market information is where much of the market’s processing technology is being evaluated today, to meet the requirements of regulation, best-execution sweeps and risk management.

Comparing where current prices of stocks are against bids and asks to trade across multiple exchanges, markets, sources, asset classes and clients is essentially the Big Data task of risk management. In addition to managing data changes, firms are also tasked with managing their trading accounts, client portfolios and trading limits such as with the implementation of Credit Valuation Adjustments (CVAs) for counterparty risk.

So why are we still piping data around the enterprise when we just need more compute and memory power? Hardware-accelerated core processing in databases such as XtremeData’s dbX and IBM’s Netezza is powered by FPGAs (field-programmable gate arrays). Processing of massive amounts of data with FPGAs can now occur at “wireless” speed. Along with high-performance computing, high-speed messaging technology from companies like TIBCO, Solace Systems and Informatica has redefined transport times from one database to another in ultra-low-latency terms: single-digit microseconds, sometimes nanoseconds, from memory cache to memory cache.

The colloquial phrase “in-database” analytics describes the approach of running analytics and computations as close as possible to where the data is located, inside the database. Fuzzy Logix, an algorithmic HPC vendor, replaces the need for the SAS and R connector analytics that stretch along the wire from the database to the analyst. With Fuzzy Logix, the need to call a database for small files is eliminated because computations can be done alongside the rest of the database in real time, turning jobs that took days into seconds.

With in-database or in-memory analytics, BI engineers can eliminate transport latency altogether and now compute at server speeds with computations sitting inside the database or in memory for tasks to be completed locally, not on the transport wire.
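The in-database idea can be shown with a minimal sketch: push the computation into the database engine so only the result, not the raw rows, crosses the wire. Here Python’s built-in sqlite3 stands in for a warehouse like dbX or Netezza, and the tick data is invented; this illustrates the pattern, not any vendor’s actual product.

```python
import sqlite3

# In-memory database standing in for a data warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ticks (symbol TEXT, price REAL, qty INTEGER)")
conn.executemany("INSERT INTO ticks VALUES (?, ?, ?)", [
    ("XYZ", 10.0, 100),   # invented tick data
    ("XYZ", 10.2, 300),
    ("XYZ", 10.1, 100),
])

# The aggregation (here a volume-weighted average price) runs where the data
# lives; one summary row crosses the wire instead of every tick.
(vwap,) = conn.execute(
    "SELECT SUM(price * qty) / SUM(qty) FROM ticks WHERE symbol = ?", ("XYZ",)
).fetchone()
```

Contrast this with the connector pattern criticized earlier, where all three rows (or, at market scale, terabytes of ticks) would be shipped to the analyst’s machine before any computation began.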

Wall Street is as risk averse as ever in today’s atmosphere so the adoption of new technology or new vendors continues to present operational risk challenges. ParAccel is a company that appears to be addressing the operational risk of new technology adoption by helping firms utilize the power of parallel processing of Big Data analytics on OEM hardware.

Since ParAccel is software, an IBM, HP or Dell shop could keep relying on its well-known, established database vendor while using next-generation Big Data analytic processing an order of magnitude faster than what is currently in place. ParAccel allows firms to aggregate, load and assimilate different data sets faster than traditional platforms through its “columnar database” nodal system. The columns in a ParAccel environment provide firms with the flexibility to first run analytics in-database or in-memory, then bring massive amounts of data onto a common plane and, finally, aggregate the unstructured data – all at lightning speed.
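The advantage of a columnar layout is easy to see in miniature: when each field is stored as its own contiguous column, an aggregate over one field reads only that field’s values instead of scanning whole rows. A toy contrast (invented data, and only a sketch of the storage idea, not ParAccel’s actual engine):

```python
# Row-oriented layout: each record holds every field.
rows = [
    {"symbol": "A", "price": 10.0, "qty": 100},
    {"symbol": "B", "price": 20.0, "qty": 200},
]
# Summing one field still touches whole records.
total_qty_rows = sum(r["qty"] for r in rows)

# Column-oriented layout: one contiguous list per field.
columns = {
    "symbol": ["A", "B"],
    "price": [10.0, 20.0],
    "qty": [100, 200],
}
# The same aggregate reads only the one column it needs.
total_qty = sum(columns["qty"])
```

At warehouse scale, reading one column instead of every row is what lets analytic queries over a handful of fields avoid scanning terabytes of unrelated data.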

Other companies like NVIDIA have been building graphics processing units (GPUs) for the video game industry for nearly two decades and are now swamped with customer requests to help build parallel computing environments, giving financial firms the ability to run trillions of algorithmic simulations in microseconds for less than $10,000 per card. GPUs can have up to 2,000 processing cores embedded on a single NVIDIA Tesla card. A GPU appliance can be attached to a data warehouse for advanced complex computations. Low-latency processing can also be achieved through minimal movement of data over a short distance, analyzing most of what Wall Street claims is Big Data in seconds compared with the days it takes now.
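The simulations the article has in mind are embarrassingly parallel: each Monte Carlo path is independent, so on a GPU every path maps naturally onto its own thread. The serial CPU sketch below shows the structure of such a workload – a Monte Carlo estimate of a European call under geometric Brownian motion – with the path loop marking the axis a GPU would parallelize. Parameters are illustrative, not market data.

```python
import math
import random

def mc_call_price(s0, k, r, sigma, t, n_paths, seed=42):
    """Monte Carlo price of a European call under geometric Brownian motion.

    Each loop iteration is an independent path: on a GPU, each would run
    as its own thread. This serial version only illustrates the structure.
    """
    rng = random.Random(seed)
    payoff_sum = 0.0
    for _ in range(n_paths):  # the parallel axis on a GPU
        z = rng.gauss(0.0, 1.0)
        st = s0 * math.exp((r - 0.5 * sigma**2) * t + sigma * math.sqrt(t) * z)
        payoff_sum += max(st - k, 0.0)
    return math.exp(-r * t) * payoff_sum / n_paths

# Illustrative parameters: spot 100, strike 100, 5% rate, 20% vol, 1 year.
price = mc_call_price(100.0, 100.0, 0.05, 0.2, 1.0, 50_000)
```

Because the paths share no state, throwing 2,000 GPU cores at the loop divides the wall-clock time almost linearly, which is why this class of risk simulation is the showcase GPU workload.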

The vendors and players are ready to get to work; there just needs to be some consensus that the Big Elephant in the room is there and it’s standing on a straw when it could be surfing a Big Wave!

Source: Tabb Forum, 02.05.2012, by Dan Watkins, President @ CC-Speed, dwatkins@cc-speed.com

Filed under: Data Management, Market Data, Risk Management, Trading Technology
