FiNETIK – Asia and Latin America – Market News Network

Asia and Latin America News Network focusing on Financial Markets, Energy, Environment, Commodity and Risk, Trading and Data Management

NYSE Data Services to deliver all Market Data via Web Services

NYSE Technologies, the commercial technology division of NYSE Euronext, and Xignite Inc., provider of web-based market data services, have announced their agreement to launch a new service providing access to real-time, historical, and reference market data for all NYSE Euronext markets via the Internet. In extending the benefits offered by the NYSE Technologies Capital Markets Community platform introduced in 2011, NYSE Technologies Market Data Web Services is geared towards non-latency sensitive clients and those in remote locations. The first phase offers real-time retail reference pricing for NYSE, NYSE MKT, and NYSE Arca markets.

NYSE Technologies Market Data Web Services, which is powered by XigniteOnDemand, allows clients the flexibility to access only the content they need for a wide range of purposes, from developing trading solutions for financial web portals to enabling Internet-powered devices. The user interface offers data services from across NYSE Technologies’ full portfolio of market data assets. The second phase, scheduled for the third quarter of 2012, will offer NYSE Bonds data, NYSE Liffe Level 1 and Level 2 data, and NYSE and NYSE MKT Order Imbalances.

“Our goal is to connect data consumers directly to our content in multiple ways: via collocation at our Liquidity Centers, direct connection to our SFTI network and now via the web,” said Jennifer Nayar, Head of Global Data Products, NYSE Technologies. “We are pleased to partner with Xignite to address the demand for internet-based delivery of market data and, as a result, further extend our client base to non-latency-sensitive and remote clients.”

Using a standard Internet connection, users can access NYSE Euronext market data and customize it according to their specific trading needs. Customers anywhere in the world, including those in remote locations, are able to access the data they need and build against it with ease for fast time-to-market.
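
As an illustration of the delivery model only (the host, path and parameter names below are hypothetical placeholders, not the published NYSE Technologies/Xignite API), a web-services feed of this kind can be consumed with nothing more than a standard HTTP client:

```python
# Minimal sketch of pulling a quote from a market data web service over
# plain HTTPS. Endpoint, parameters and token handling are illustrative
# assumptions, not the actual NYSE Technologies Market Data Web Services API.
import requests

def get_quote(symbol: str, api_token: str) -> dict:
    url = "https://marketdata.example.com/v1/quotes"  # hypothetical endpoint
    resp = requests.get(
        url,
        params={"symbol": symbol, "market": "NYSE"},
        headers={"Authorization": f"Bearer {api_token}"},
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json()  # e.g. {"symbol": "IBM", "last": ..., "bid": ..., "ask": ...}

if __name__ == "__main__":
    print(get_quote("IBM", api_token="demo-token"))
```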

“The delivery of market data content via websites and mobile devices continues to build momentum and we are excited to leverage these applications to help increase access to NYSE Euronext data,” said Stephane Dubois, Xignite’s CEO and founder. “Both NYSE Technologies and Xignite have demonstrated a strong commitment to the electronic delivery of market data and the ability to serve today’s growing, diverse array of applications, especially the mobile market.”

The initiative with Xignite complements NYSE Technologies’ enterprise cloud strategy. NYSE Technologies Capital Markets Community Platform enables a range of industry firms and registered market participants to purchase computing power as needed, freeing them to focus on core business strategy rather than complicated IT infrastructure. NYSE Technologies Market Data Web Services provides clients with another market data delivery option for NYSE Euronext content, supporting current access methods offered by NYSE Technologies where direct connect clients and SuperFeed clients have the choice of collocating in NYSE Technologies’ Liquidity Center or connecting to its Secure Financial Transaction Infrastructure® (SFTI) network.

Source: NYSE Technologies, 06.06.2012

Filed under: Data Management, Data Vendor, Market Data, Trading Technology

Coming to Grips With Big Data Challenges by Dan Watkins

The rate of data growth in financial markets has scaled beyond the industry’s ability to manage it.

Debates have gone so far as to dismiss the idea that Big Data can be tamed and controlled in the near term with the computing architecture commonly adopted as an acceptable solution today. I agree, but argue that conventional data transport – not management – is the real challenge of handling and utilizing Big Data effectively.

From exchange to trading machine, new ticks and market data depth are delivered only as fast as the delivery channel can sustain. The market data feeds commonly used in conventional exchange trading are but a fraction of the market information actually available.

Perhaps due to the high cost of $100,000 per terabyte, many market participants deem the use of more data a bit too aggressive. Or they believe that high performance computing (HPC) is the next-generation technology solution for any Big Data issue. Firms are therefore advancing their information technology at a slow cadence, in tune with the old adage: “if it ain’t broke, don’t fix it.”

Over the last decade, Wall Street business heads have agreed with engineers that the immense perplexity of Big Data is best categorized by the three V’s of Doug Laney’s 2001 META Group report: Big Volume, Big Velocity and Big Variety.

When looking at “Big Volume” 10 years ago, the markets had just fragmented under Regulation ATS. A flurry of new market centers arose in U.S. equities, as did dark liquidity pools, giving rise to a global “electronic trading reformation.” Straight-through processing (STP) advocates evangelized platforms such as BRASS, REDIPlus and Bloomberg Order Management Systems (OMS), resulting in voluminous, fragmented market data streaming to 5,000 NASD/FINRA trading firms and 700,000 professional traders.

Today, the U.S. has 30+ Securities and Exchange Commission-recognized self-regulatory organizations (SROs), commonly known as exchanges and ECNs. For the first time since 2002, full market depth feeds from NASDAQ allow firms to collect, cache, react to, store and retrieve feeds covering six hours of trading for nearly 300 days a year more transparently than ever. Big Data volume has grown 1,000 percent and has reached three terabytes of market data depth per day.

Billions of dollars are being spent on increasing “Big Velocity.” The pipes that wire exchanges through the STP chain to the trader have become 100 times faster and larger, but still not fast enough to funnel the bulk of information lying idle back in the database. Through “proximity hosting,” the telco is eliminated and latency is lowered. This structure results in adjustments made for larger packets but not really for more information, as Big Data remains the big, quiet elephant in the corner.

Five years after Reg ATS, markets are bursting at the seams with electronic trading that produces explosive market data, breaking new peak levels seemingly every day. The SEC’s Regulation National Market System (Reg NMS), which took effect in 2007, requires exchanges and firms to calculate the best price for execution in order to be compliant. Firms are also now mandated to sweep all exchanges’ market order books and process all of that data for a smart execution.
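
In outline, the sweep that Reg NMS implies is a best-bid-and-offer comparison across every venue’s top of book before an order is routed. A minimal sketch (venue names and quote values are invented sample data, not a production router):

```python
# Illustrative best-bid/best-offer sweep across venues, using fabricated quotes.
from dataclasses import dataclass

@dataclass
class Quote:
    venue: str
    bid: float
    ask: float

def best_bid_and_offer(quotes):
    """Return the venue showing the highest bid and the venue showing the lowest ask."""
    best_bid = max(quotes, key=lambda q: q.bid)
    best_ask = min(quotes, key=lambda q: q.ask)
    return best_bid, best_ask

book = [
    Quote("NYSE", 20.01, 20.03),
    Quote("ARCA", 20.02, 20.04),
    Quote("NASDAQ", 20.00, 20.02),
]
bid, ask = best_bid_and_offer(book)
print(f"best bid {bid.bid} on {bid.venue}, best ask {ask.ask} on {ask.venue}")
```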

After the execution, traders have to track the “order trail” from price to execution for every trade and store all of that information for seven years in the event of an audit recall of a transaction.

Under Reg NMS, subscribing to the full depth of all 30+ markets in “real time” would mean a firm would need a terabyte-scale pipe to keep latency low. Since such a pipe is not realistic, data moves at gigabit speeds, which is relatively slow with the data queue running 50-100 terabytes deep. Multi-gigabit pipes, as fast as they seem, are still similar to driving five miles an hour on a 55 mph highway.
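
The arithmetic behind that analogy, using the figures above, is a simple back-of-the-envelope calculation:

```python
# Time to drain a 50-terabyte backlog over a 1 Gbit/s link (figures from above).
queue_bytes = 50 * 10**12      # 50 terabytes waiting in queue
link_bits_per_s = 1 * 10**9    # 1 gigabit per second
seconds = queue_bytes * 8 / link_bits_per_s
print(f"{seconds / 86400:.1f} days to move the queue")  # roughly 4.6 days
```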

Analysts typically call data from a database with R (Revolution Analytics) and SAS connectors. The process involves bringing data into an analytical environment in which the user runs models and computations on subsets of a larger store before moving on to the next data-crunching job. The R and SAS connectors between the file servers and the database run at 10/100BASE-T, making the movement of a 50-terabyte environment like driving one mile per hour in a 55 mph zone.
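
The pattern being described (pull a subset across the wire, crunch it locally, go back for the next slice) looks roughly like the sketch below, with an in-memory SQLite table standing in for the warehouse and a handful of sample rows standing in for ticks:

```python
# Sketch of the conventional "move the data to the analytics" workflow.
# An in-memory SQLite table stands in for the warehouse; in production the
# transport step runs over a comparatively slow database connector.
import sqlite3
import statistics

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (symbol TEXT, price REAL)")
conn.executemany("INSERT INTO trades VALUES (?, ?)",
                 [("IBM", p) for p in (200.1, 200.4, 199.9, 200.7)])

# The transport step: every selected row crosses the wire to the client...
rows = conn.execute("SELECT price FROM trades WHERE symbol = ?", ("IBM",)).fetchall()

# ...and only then does the analysis begin on the client side.
prices = [row[0] for row in rows]
print("mean", statistics.mean(prices), "stdev", statistics.pstdev(prices))
```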

We all hear the polemics regarding data formats, the jigsaw puzzle of unstructured data and the claim that “Big Variety” is the obstacle. Even after standardization on SQL-based queries, where analysts can ask any “ad hoc” question, too many sources and too many pipes from analytic servers cause traffic jams. SQL databases are ideal for unstructured queries but are slow at compiling unstructured data. Aggregating market information is where much of the market’s processing technology is being evaluated today, to meet the requirements of regulations, sweeping for best execution and risk management.

Comparing where current prices of stocks are against bids and asks to trade across multiple exchanges, markets, sources, asset classes and clients is essentially the Big Data task of risk management. In addition to managing data changes, firms are also tasked with managing their trading accounts, client portfolios and trading limits such as with the implementation of Credit Valuation Adjustments (CVAs) for counterparty risk.

So why are we still piping data around the enterprise when we just need more compute and memory power? Hardware-accelerated core processing in databases such as XtremeData’s dbX and IBM’s Netezza is powered by FPGAs (field programmable gate arrays). Processing of massive amounts of data with FPGAs can now occur at wire speed. Along with high performance computing, high-speed messaging technology provided by companies like TIBCO, Solace Systems and Informatica has redefined transport times from one database to another into ultra-low-latency terms: single microseconds, sometimes nanoseconds, from memory cache to memory cache.

The colloquial phrase “in-database” analytics refers to an approach of running analytics and computations as near as possible to where the data is located, inside the database. Fuzzy Logix, an algorithmic HPC vendor, replaces the need for SAS and R connector analytics, which stretch along the wire from the database to the analyst. With Fuzzy Logix, the need to call a database for small files is eliminated because computations can be done alongside the rest of the database in real time: days-to-seconds faster.

With in-database or in-memory analytics, BI engineers can eliminate transport latency altogether and now compute at server speeds with computations sitting inside the database or in memory for tasks to be completed locally, not on the transport wire.
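
As a toy contrast with the previous sketch (again with SQLite standing in for the engine, an illustrative simplification), in-database analytics pushes the computation to where the rows live, so only the result crosses the wire:

```python
# In-database aggregation: the engine computes the statistic and returns one row,
# instead of shipping every tick to the client for processing.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (symbol TEXT, price REAL)")
conn.executemany("INSERT INTO trades VALUES (?, ?)",
                 [("IBM", p) for p in (200.1, 200.4, 199.9, 200.7)])

mean_price, n = conn.execute(
    "SELECT AVG(price), COUNT(*) FROM trades WHERE symbol = ?", ("IBM",)
).fetchone()
print(f"mean {mean_price:.2f} over {n} trades, computed where the data lives")
```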

Wall Street is as risk averse as ever in today’s atmosphere so the adoption of new technology or new vendors continues to present operational risk challenges. ParAccel is a company that appears to be addressing the operational risk of new technology adoption by helping firms utilize the power of parallel processing of Big Data analytics on OEM hardware.

Since ParAccel is software, an IBM, HP or Dell shop could continue to rely on its well-known, established database vendor while using next-generation Big Data analytic processing an order of magnitude faster than what is currently in place. ParAccel allows firms to aggregate, load and assimilate different data sets faster than traditional platforms through its “columnar database” nodal system. The columns in a ParAccel environment give firms the flexibility to first run analytics in-database or in-memory, then bring massive amounts of data to a common plane and, finally, aggregate the unstructured data, all at lightning speed.

Other companies, like NVIDIA, have been building graphics processing units (GPUs) for the video game industry for roughly two decades and are now swamped with customer requests to help build parallel computing environments, giving financial firms the ability to run trillions of algorithmic simulations in microseconds for less than $10,000 per card. A single NVIDIA Tesla card can carry up to 2,000 processing cores, and a GPU appliance can be attached to a data warehouse for advanced, complex computations. Low-latency processing can also be achieved because data moves only a short distance, allowing most of what Wall Street claims is Big Data to be analyzed in seconds compared with the days it takes now.
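
The workloads being pushed onto GPUs are typically embarrassingly parallel, for example Monte Carlo pricing where millions of independent paths can be evaluated at once. The sketch below uses NumPy on the CPU purely to show the shape of such a computation; on a Tesla-class card the same vectorised pattern is spread across thousands of cores via CUDA or a similar toolkit:

```python
# Vectorised Monte Carlo estimate of a European call option price.
# NumPy on a CPU here; the identical embarrassingly parallel structure is what
# GPU appliances execute across thousands of cores.
import numpy as np

def mc_call_price(s0=100.0, k=100.0, r=0.02, sigma=0.2, t=1.0, n_paths=1_000_000):
    z = np.random.standard_normal(n_paths)  # one independent draw per path
    s_t = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)
    payoff = np.maximum(s_t - k, 0.0)
    return np.exp(-r * t) * payoff.mean()

print(f"estimated call price: {mc_call_price():.2f}")
```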

The vendors and players are ready to get to work; there just needs to be some consensus that the Big Elephant in the room is there and it’s standing on a straw when it could be surfing a Big Wave!

Source: Tabb Forum, 02.05.2012, by Dan Watkins, President, CC-Speed, dwatkins@cc-speed.com

Filed under: Data Management, Market Data, Risk Management, Trading Technology

Latin America: NYSE Technologies & ATG Streamline Trading & Data Access to LatAm

NYSE Technologies, the commercial technology unit of NYSE Euronext, and Americas Trading Group (ATG) are pleased to announce the production use of their high-performance order routing and market-data platform offering the global trading community low-latency access to the key trading venues in Latin America.  Leveraging NYSE Technologies’ Secure Financial Transaction Infrastructure (SFTI), the network connection delivers the lowest possible latency between New York and Sao Paulo.

Now the Global Capital Markets Community can leverage its existing SFTI connectivity to access ATG’s sponsored access gateways for direct order routing to Latin American exchanges and brokers. Market data from key global financial markets is also available to clients in Latin America, while Latin American market data can now be distributed worldwide.

“We are pleased to continue our strong partnership with ATG by working closely to expand our presence in Latin America and to offer faster, simplified access to these highly attractive trading venues,” said Stanley Young, CEO, NYSE Technologies. “This is a key step in increasing access to, and liquidity in, Latin America, and working with ATG we will operate the highest-performing route in the region.”

“Our local expertise, relationships and long-term commitment to the region, combined with the technology and know-how NYSE Technologies brings to this project, create a compelling customer solution for a challenging market,” commented Martin Fernando Cohen, CEO, ATG. “With the emergence of Sao Paulo as one of the world’s financial capitals, the increased access to local markets by global investors will enable local buy-side and sell-side firms to play a significant role in the further emergence of a global capital markets community.”

ATG uses NYSE Technologies’ Managed Transaction Hub to offer access to local and cross-border order flow between exchanges and brokers in Brazil, Mexico, Chile, Colombia and Peru. All SFTI customers will have the ability to directly access Chile’s Bolsa de Comercio de Santiago, Colombia’s Bolsa de Valores de Colombia and Peru’s Bolsa de Valores de Lima using ATG’s Mercado Integrado Latinoamericano (MILA) infrastructure.

Source: Mondovisione, 01.05.2012

Filed under: Brazil, Chile, Colombia, Exchanges, Latin America, Market Data, Mexico, Peru, Trading Technology

Market Data Technology to Hit $3.6B in 2012

Demand for market data acceleration is driving the global investment in sell-side, market-data distribution technology in 2012 to $3.6 billion, according to a report released by the Tabb Group.

The report, Market Data Acceleration: More than Just Speed, also predicts 4.5% compound annual growth in these investments for the next three years based on expected growth in FX, Derivatives and Commodities as well as movement by Asian markets towards automation.
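
Taken together, the report’s two headline numbers imply roughly the following spend trajectory (a simple compounding of the published figures, not additional Tabb data):

```python
# Compounding the 2012 base of $3.6bn at the report's 4.5% annual growth rate.
base_usd_bn = 3.6
growth = 0.045
for year in range(1, 4):
    print(2012 + year, round(base_usd_bn * (1 + growth) ** year, 2), "USD bn")
# 2013: ~3.76, 2014: ~3.93, 2015: ~4.11
```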

The largest segment of this investment, 73%, will come from Europe and North America, but according to Tabb Group, there’s considerable growth potential from the Asian markets.

Moreover, while the equities markets are mature from a growth perspective, driving 45% of the global spend, a strong percentage of growth will come from over-the-counter (OTC) derivatives, FX and commodities.

According to the report, market data is an area where performance can play a crucial role for a host of trading activities. Obtaining, decoding and utilizing market data in a timely and efficient manner are no longer the purview of the ultra-low-latency firms; everyone involved needs to be able to get at market data in as timely a fashion as possible.

“This is not to say that everyone needs to be at the ‘tip of the spear’; however, it does mean that anyone who is actively involved in trading needs to be moving in that direction,” said the report.

However, according to the research firm, firms are struggling with the conflicting pressures of the “need for speed” and the “need to save” as they try to reconcile price with performance.

“Market participants need to ensure that their investment in speed gets them more than just a solitary solution for a single platform,” said Tabb partner and report writer Alexander Tabb in a statement.

Different firms, according to Tabb, have different strategies, thus different needs. Whether a firm is a high frequency trader, an institutional market maker, or an algo-trading desk, the challenge is placing speed into its proper context within the accelerated market data equation.

“Due to the democratization of speed, it’s essential for every buyer to remember to factor in total cost of ownership, price versus performance, operational flexibility, control, scalability and time-to-market,” says the report.

Source: Securities Technology Monitor. 23.04.2012

Filed under: Data Management, Data Vendor, Market Data

Peru: Lima Stock Exchange (Bolsa de Valores de Lima) selects the SunGard Global Network for Market Data Distribution

Bolsa de Valores de Lima S.A. (BVL), the Peruvian stock exchange, has selected the SunGard Global Network (SGN) as a distribution channel for its data.

Peru is expected to be Latin America’s fastest growing economy in 2012, according to its Economy and Finance Minister. U.S. Department of State figures show that its economic growth averaged 7.0% a year for the 7 years up to 2010, due in large part to market-oriented economic reforms and privatization, as well as high international prices for the country’s largest commodity exports.

SGN will help BVL reach a broad international audience by delivering its real-time market data and multi-asset class historical data and analytics to asset managers and brokers worldwide. Those firms will also be able to route orders to the exchange via SGN, contributing to activity on BVL.

Francis Stenning, chief executive officer of the Bolsa de Valores de Lima, said, “Joining the SunGard Global Network will help Bolsa de Valores de Lima increase our visibility internationally and create new trading opportunities for our members.”

Philippe Carré, global head of connectivity of SunGard’s capital markets business, said, “We are seeing increased demand for direct connectivity to Latin America as the debt crisis drives international investors to seek alternatives to western Europe and US markets. The SunGard Global Network makes it easy to route orders and access real-time, high quality data, helping our customers operate more efficiently and make more informed trading decisions.”

Source: BobsGuide, 17.04.2012

Filed under: Exchanges, Latin America, Market Data, Peru

Tullett Prebon to pay out over BGC data misuse

Tullett Prebon has agreed to pay $800,000 to its US rival BGC after misuse of data by some of its brokers, closing one chapter in a long-running legal battle between the two interdealer brokers.

The settlement, which was ordered by a US arbitrator, is smaller than Tullett had expected, and far lower than the sum sought by BGC, which claimed it had suffered damages of “hundreds of millions of dollars”. The arbitrator found that BGC was not the “prevailing party”, meaning it was not entitled to reclaim legal costs from Tullett.

The dispute related to trading data provided by BGC that Tullett packaged with its own data and sold to information providers such as Reuters and Bloomberg. Under the terms of the deal between the two groups, Tullett’s brokers were not allowed to use the BGC data after January 25 last year but some continued to do so.

In Tullett’s latest full-year results statement it reported a provision of £12.4m to cover the anticipated cost of settling the data misuse case and the costs of two other cases it is pursuing against BGC. Those cases relate to BGC’s alleged “poaching” of more than 50 brokers from Tullett’s US division in late 2009. Last year BGC made an out-of-court payment to settle a similar claim relating to Tullett’s UK business.

Source: FT, 22.03.2012 by Simon Mundy

Filed under: Data Vendor, Market Data

S&P Capital IQ acquires QuantHouse, low-latency market data provider

S&P Capital IQ, a business line of The McGraw-Hill Companies (NYSE: MHP) offering global multi-asset class data solutions, market research and portfolio risk analytics to global investors, today announced it has acquired QuantHouse, an independent global provider of market data and end-to-end systematic trading solutions. This includes ultra-low-latency market data technologies, algo-trading development frameworks, proximity hosting and order routing services for hedge funds, market makers, proprietary desks and latency-sensitive sell-side firms.

“The acquisition of QuantHouse will provide our clients with access to exchange pricing globally, including securities valuations and portfolio analytics, throughout all our desktop and enterprise solutions. In addition, the extensive capabilities QuantHouse brings will enable S&P Capital IQ to build our own unique real-time monitors, derived data sets and analytics,” said Lou Eccleston, President of S&P Capital IQ and S&P Indices.  “As the foundation for our growing Enterprise Solutions business, QuantHouse will enable us to offer one integrated low-latency feed for all our data, including fundamental, fixed-income, equity and derivatives.”

“We are very excited to be a part of S&P Capital IQ,” said Pierre-Francois Filet, chairman and co-founder, QuantHouse. “Together, we can focus on developing a new generation of alpha-generation tools, low-latency transaction infrastructure and integrated low-latency data feeds to maximize offerings and strengthen S&P Capital IQ’s competitive positioning.”

This purchase, along with the recently announced acquisition of R2 Financial Technologies and the expected acquisition of CMA later this year, provides S&P Capital IQ with the components necessary to offer its clients the most comprehensive market data and risk analytics platforms in the industry.

Following the acquisition, QuantHouse’s 90 employees, based in Paris, London and New York, will become a critical component to S&P Capital IQ’s global growth strategy as part of the Enterprise Solutions unit. In the short term, its products and services will continue to be sold as standalone feeds and applications, although all S&P Capital IQ and S&P Indices content will gradually be consolidated into QuantHouse feeds.

Source: Mondovisione, 04.04.2012

Filed under: Data Vendor, Market Data

Activ Financial expands offerings through Global Infrastructure ActivNet

Activ Financial, a global provider of fully managed low-latency and enterprise market data solutions, today announced that ActivNet, the firm’s global infrastructure for the transmission of financial data worldwide, is expanding its offerings to include raw feed delivery and trade order routing. Originally developed to provide access to locally aggregated raw direct exchange feeds as well as globally aggregated exchange feeds, the collocated, highly managed and reliable system is now also being utilized for trade/execution transport and other latency sensitive data.

“ActivNet is a fully realized global infrastructure that has a proven track record as a high-performance, stable and low-latency communications platform,” said Antonio Bernard, Chief Network Architect of Activ Financial. “By offering additional capabilities vital to the financial sector, businesses across the world can access high-quality data and execute trades at the fastest speeds available, all without the added costs necessitated by building and maintaining their own global network infrastructure.”

ActivNet is optimized for real-time information services related to price discovery, eliminating transfer issues that often face multi-purpose networks. To support the infrastructure, ACTIV operates more than 20 proximity centers around the world, either in or near exchanges in key market locations, ensuring space, power, cooling, and an extensive, flexible network able to meet demanding requirements.

The architecture behind ActivNet provides the speed advantages of local direct feeds plus fast delivery of content from away markets. By leveraging the concept of an A & B Ring, which provides a fully resilient and dynamic path for delivery of raw exchange data and market data, coupled with strategic points of presence across major international locations, users are granted extremely high levels of traffic engineering capability. This architecture provides best in class latency characteristics while also providing the highest degree of resiliency. In addition, the commitment to carrier neutrality allows ACTIV to leverage the best providers between points where reliability is critical.

“ActivNet was originally designed to drive our core business at Activ Financial, and as such customers can trust that the network is maintained at the highest level,” said Activ Financial President Frank Piasecki. “We are consistently searching for and testing new routes and offerings and investing in our infrastructure, and businesses can take advantage of the quality system that Activ Financial uses for its own data needs every day.”

Source: A-Team, 03.04.2012

Filed under: Data Vendor, Market Data, Trading Technology

HKEx selects NYSE Technologies’ Exchange Data Publisher for Hong Kong Market Data Platform

Hong Kong Exchanges and Clearing Limited (HKEx) has selected NYSE Technologies’ Exchange Data Publisher (XDP)™ to drive the HKEx Orion Market Data Platform.  XDP is an ultra-low latency solution designed to collect, integrate and disseminate real-time market data to local customers and, using regional hubs, to customers around the globe.

The HKEx Orion Market Data Platform will deliver market data for all securities and derivatives traded by HKEx in a common message format.  It will be capable of distributing more than 100,000 messages per second at microsecond latency.  It will be rolled out for HKEx’s securities markets towards the end of the second quarter of 2013, with a remote distribution hub in Mainland China and integration with HKEx’s derivatives markets to follow.

With the establishment of remote distribution hubs under the new market data platform, HKEx will be able to establish points of presence for market data distribution outside of Hong Kong, such as in Mainland China, where information can be relayed to local customers.

“The HKEx Orion Market Data Platform will enable us to improve our customers’ market data experience by providing a suite of market data product feeds with content, market depth and bandwidth requirements tailored to suit the needs of different types of customers,” said Bryan Chan, Head of Market Data at HKEx.  “We selected NYSE Technologies’ XDP solution based on its high performance capabilities as well as the flexibility it offered to meet our customer requirements.”

XDP is based on NYSE Technologies DataFabric 6.0, an industry-leading platform offering high throughput, scalable application messaging and microsecond latency.

“HKEx’s selection of XDP reaffirms our technology expertise and ability to deliver innovative products that operate effectively in markets around the world, particularly in the growing Asian marketplace,” said Stanley Young, CEO NYSE Technologies.  “In XDP we are providing HKEx with the robust features of a proven platform and the advantages of functions tailored to their unique trading environment.  We are pleased to be working with one of the world’s leading markets to deploy a world class market data platform that will serve customers in Hong Kong and Mainland China.”

Source, MondoVisione 29.03.2012

Filed under: Data Management, Exchanges, Hong Kong, Market Data

Thomson Reuters fails RIC licensing market test – Symbology

Thomson Reuters has failed to appease EU antitrust bodies over proposed concessions in the way it licenses the proprietary Reuters Instrument Codes.

In 2009 the European Commission opened antitrust proceedings against Thomson Reuters over possible abuse of its dominant market position in the supply of RICs – codes that identify securities and are used by financial institutions to retrieve data from Thomson Reuters’ real-time feeds. The EC argued that the firm could be abusing its dominant position in the market for these consolidated real-time datafeeds by stopping customers from using RICs to retrieve data from alternative providers and from mapping them to alternative symbols for such a purpose.
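
Conceptually, the “mapping” at issue is a translation table from RICs to the symbology another vendor’s feed expects. A trivial sketch (the sample codes are illustrative only; production mappings cover millions of instruments and are exactly what the licensing terms govern):

```python
# Toy RIC-to-alternative-symbology translation table with illustrative entries.
ric_to_alt = {
    "VOD.L": "VOD LN",    # Vodafone on the London Stock Exchange
    "IBM.N": "IBM UN",    # IBM on the NYSE
    "7203.T": "7203 JT",  # Toyota on the Tokyo Stock Exchange
}

def translate(ric: str) -> str:
    try:
        return ric_to_alt[ric]
    except KeyError:
        raise KeyError(f"no mapping for RIC {ric!r}") from None

print(translate("VOD.L"))
```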

In an attempt to ward off further action by the Commission, the vendor agreed to let customers license RICs for mapping purposes over a five-year period for a monthly fee based on the number of RIC symbols to be used.

However, in a speech in Copenhagen today, EU competition chief Joaquín Almunia, said that a market test of the new measures had failed to deliver a desirable outcome.

“We have now reached a critical stage in this investigation,” said Almunia. “If no effective solution can be agreed upon, then we will have to draw the adequate conclusions.”

The company could face fines of up to 10% of its turnover if it does not offer further concessions to users.

Thomson Reuters’ rival Bloomberg has moved to make its own proprietary symbology available for free to developers and market practitioners.

Filed under: Data Vendor, Market Data, News, Reference Data

LEIs – Increasing Usability & Benefits of the New Standardised Identifier – IDC

The development of the standardised legal entity identifier (LEI) is very much underway, but how can firms and market participants utilise this new identifier to improve internal data flow and risk monitoring processes whilst also meeting regulatory reporting requirements?

Listen to the Podcast here

Moderator/Speakers:
Julia Schieffer – Founder, DerivSource.com
Chris Johnson – Head of Product Management, Market Data Services, HSBC Securities Services
Darren Marsh – European Business Manager, Risk Management and Compliance, Interactive Data

Filed under: Data Vendor, Events, Market Data, Reference Data

Bloomberg boosts network speed and efficiency in Asia

Bloomberg L.P., the financial data, news and analytics provider, has extended the availability of its enterprise data services in markets throughout Asia. An efficient network of data centers across the region enables the delivery of reliable collocation, connectivity and infrastructure services along with reduced latency in data distribution.

The expansion of low-latency services throughout Asia includes Equinix Inc.’s Hong Kong and Sydney-based International Business Exchange (IBX®) data centers, further strengthening the existing Bloomberg deployments in New York, Chicago, Slough and Frankfurt. Current and prospective customers located inside these Equinix IBX data centers can now directly connect to Bloomberg’s real-time data, B-Pipe and Event Driven Trading feeds.

The coordinated service delivery also benefits customers of Bloomberg’s agency broker, Bloomberg Tradebook, in New York, London and Hong Kong.

“Together, Equinix and Bloomberg address a market need for secure and reliable connectivity to multiple partners simultaneously,” said John Knuff, general manager, global financial services at Equinix. “The expansion of services into Asia enables Bloomberg customers to leverage our global footprint of data centers. And Bloomberg’s presence across these markets adds further value to the global financial ecosystem inside Equinix.”

Source: NetworkAsia, 02.03.2012

Filed under: Asia, Australia, Data Vendor, Hong Kong, Japan, Market Data, Singapore

Bloomberg unveils its NEXT terminal

On its 30th anniversary Bloomberg officially launched an updated $100 million version of its core terminal yesterday in London and New York simultaneously. The NEXT platform of the Bloomberg Professional Service is intended to give traders and financial services end users faster, deeper insights into the markets and to enable the market data terminal to answer questions more intuitively in future, not just present research and data, via an enhanced ‘natural language’ search function and ‘give me the answer’ front-end tool.

According to Tom Secunda, co-founder and vice chairman of Bloomberg, speaking at the launch, “this is an evolutionary step” that helps order increasingly complex markets and aids productivity, while continuing the company’s mission to deliver on “Mike Bloomberg’s famous three-legged stool, consisting of news, data and analytics”. The NEXT platform consolidates and, crucially, integrates these feeds better than ever before, the company believes, giving users easier access to the information that exists on the terminal and enhancing the customer experience. “For example, you can ask what was US CPI in 2000 …and bang, there is the answer.” Users can then drill down into the answer for further research, added Jean-Paul Zammitt, global head of core product development at Bloomberg, pointing out that this is the key presentational change in the NEXT platform, one that required every help screen and back-end process to be rewritten and updated.

Bloomberg asserts that 3,000 technologists were involved in the two-year overhaul of its core terminal, which is used by traders, analysts and even some large multinational corporate treasuries looking to hedge their foreign exchange exposure. A select group of existing clients, including OCBC Bank, Credit Agricole CIB and Glenhill Capital, were involved in the development phase, allowing Bloomberg to review common keystrokes and commands across an array of functions in order to improve the customer experience.

More than 100,000 clients have already converted to Bloomberg NEXT, at no extra cost on top of the £20,000-per-year outlay, since its ‘soft launch’ at the end of last year, with less than 1% converting back to their old terminal. The company said that two thirds of them are using the NEXT platform more than their old terminal and that it wants to convert its entire subscriber base of 313,000 for the Bloomberg Professional Service by the end of this year.

“Bloomberg NEXT saves me time by discovering functions and data more quickly,” said Seth Hoenig, head trader at one of the ‘soft launch’ development partners, Glenhill Capital. “The new help menus enable users to find the answer that they need fast. Stumbling upon the hidden gems within Bloomberg has always been revelatory; now it’s easier.”

According to Lars Hansen, senior portfolio manager at Denmark’s DIP, the Danish Pension Fund for Engineers: “Bloomberg NEXT is a major step forward. It is much more intuitive – you can see multiple pieces of information on one screen, which lets you see new interrelationships.”

Bloomberg highlighted what it sees as three key improvements in its updated terminal:

• Better discoverability: Bloomberg NEXT’s new discoverability features allow users to get quick, direct answers to their queries as well as pull together a wide variety of related details such as companies, research and charts. A more powerful search engine means users can type just a few words and go directly to the desired securities, functions, people and news. The streamlined menu listing puts the most relevant information and topics right up front.

• More uniformity: Every screen of the Bloomberg Professional Service has been redesigned to provide a common look and feel. This consistent interface across all asset classes, from FX to commodities and fixed income, and across all functions should allow expert users and generalists alike to more efficiently navigate often-used functions and discover new ones. An educational overview of each market segment for novices is also included in the update.

• Intuitive workflow: The functionality of the Bloomberg Professional service has been re-engineered so that a user should be able to quickly and seamlessly navigate through the series of questions and answers essential to making smart market decisions. The new workflow, with user prompts, in Bloomberg NEXT is intended to allow expert users to drill deeper into the data and to let occasional users discover new functions.

“The complexity and interconnectedness of the global financial marketplace has grown significantly. Business and financial professionals need to synthesize astounding amounts of information to make intelligent investment decisions,” explained co-founder Tom Secunda. The firm is still a big believer in a single-product approach, he stressed at the official launch of NEXT, but this “obviously gives us challenges as markets get more and more complex.”

NEXT is Bloomberg’s response. “The pace of change in financial markets will only accelerate and with it the need for more information,” added Secunda, before concluding that he believes, “Bloomberg is now positioned to quickly answer those evolving questions and ensure that our clients will always have the leading edge in making investment decisions.”

News Analysis 

Bloomberg’s new NEXT platform will go head-to-head against Thomson Reuters in the market data sector, which is increasing in value as financial markets get more and more complex and new post-crash regulations place new information demands upon market participants. Both companies are running neck and neck in terms of market data share, with estimates of 30% for each at present.

One terminal is proprietary, of course, with Bloomberg maintaining its closed market data platform in its NEXT iteration, while Thomson Reuters is now following an open access model with its Eikon terminal, allowing users to add their own data and applications. The relative failure of the Thomson Reuters Eikon platform, which has sold only tens of thousands of units since launch rather than the hoped-for hundreds of thousands, is what prompted the open access model, although it does of course take time to build up a following. It will be interesting to see if Thomson Reuters’ move allows the firm to win back lost market data share or if Bloomberg’s updated terminal can keep it on its recent upward curve. The former is still benefiting from the 2008 merger that united Thomson Financial with Reuters, giving it synergies in data collection and delivery, but the competition between the two has just heated up.

Source: Bobsguide, 28.02.2012

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, News, Reference Data

White Paper: Big Data Solutions in Capital Markets – A Reality Check

Big Data has emerged in recent months as a potential technology solution to the issue of dealing with vast amounts of data within the enterprise. As in other industries, financial services firms of all kinds are drowning in data, both in terms of the sheer volume of information they generate and/or have to deal with, and in terms of the growing and diverse types of data they confront in those efforts.

But the relative immaturity of Big Data solutions, and a widespread lack of understanding of what the term really means, leads some to question whether ‘Big Data’ is no more than a technology solution looking for a Big Problem to solve.

So is Big Data for real? Can so-called Big Data solutions provide relief to the embattled data architects at financial institutions? Or is Big Data a solution looking for a set of problems to solve?

Research conducted by A-Team Group on behalf of Platform Computing suggests that current market sentiment, financial hardships and regulatory scrutiny may be conspiring to create the perfect conditions for Big Data solutions to provide value to financial institutions.

Download the White Paper Now

Source: A-Team, 15.02.2012

Filed under: Data Management, Data Vendor, Library, Market Data, Reference Data
