FiNETIK – Asia and Latin America – Market News Network

Asia and Latin America News Network focusing on Financial Markets, Energy, Environment, Commodity and Risk, Trading and Data Management

Reference Data Utilities Offer Cost Savings, but Challenges Remain – DMS Review

Managed services and utilities can cut the cost of reference data, but to be truly effective managed services must be more flexible and utilities must address issues of data access and security.

A panel session led by A-Team Group editor-in-chief Andrew Delaney at the A-Team Group Data Management Summit in London set out to discover the advantages and challenges of managed services and utilities, starting with a definition of these data models.

Martijn Groot, director at Euroclear, said: “A managed service lifts out existing technology and hands it over to the managed service provider, while a utility provides common services for many users.” Tom Dalglish, CTO, group data at UBS, added: “Managed services run data solutions for us and utilities manage data for themselves.”

Based on these definitions, the panellists considered how and why managed services and utilities are developing. Dalglish commented: “We need to move away from all doing the same things with data. Managed business process outsourcing services are well understood, but utilities present more challenges – will they be run as monopolies and make data difficult to access, what is the vendor interest?” Steve Cheng, global head of data management at Rimes Technologies, added: “The market has moved on from lift outs. New technologies mean managed services can be more flexible than outsourcing.”

It is not only the nature of available services that is driving financial firms to third-party providers, but also cost and regulation, both of which are high on the agenda. Jonathan Clark, group head of financial services at Tech Mahindra, explained: “Cost is significant, but regulation is the number one issue. Regulations require more holistic and high quality data and that is high cost for firms, so they are trying to get data quality at a reasonable price point.”

Dalglish focussed on cost, saying: “The business case is about money. Large companies have lost the ability to change; a utility can help to reduce costs. Banks are looking at these data models to regain efficiencies that have been lost internally and are difficult to rebuild.”

Cheng described the reference data utility model as being more like the satellite television model than the water or electricity model, and noted that Rimes’ experience with customers is that they want to innovate without allowing their cost base to increase.

While the panellists agreed that managed services and utilities can provide cost savings, they also agreed that it is not the cost of the data itself, but the infrastructure, sources, services and people around the data that rack up costs to an extent that is leading firms to seek lower cost solutions. Firms that opt to use a data utility can convert capital expenditure into operating expenditure and chip away at elements such as multiple data sources.

Dalglish commented: “If you can achieve savings of 30% to 35% that is good, but this is a conservative estimate and it should be possible to save more going forward.” Cheng added: “The rule of thumb is that for every £1 spent on data licences, £2 or £3 is spent on infrastructure and staff. The need is to identify those hidden costs so that the use of a managed service or utility can be justified.”

Returning to the pressure of regulation, Delaney asked the panel whether managed reference data services and utilities would be regulated in the same way as banks. While this is not happening at the moment, some panel members expect it to happen and warn that utilities may find a way around regulation by using disclaimers. Cheng said: “Forthcoming regulations are very prescriptive about data models and regulators may look at the whole data chain. This means utilities and managed services may in future be subject to the same regulatory requirements as other market participants.”

The concept of managed services and utilities is not new. Dalglish recalled an effort to set up a utility that did not take off back in 2005 and said that the moment has now come for utilities as the technology stack has improved, data is better understood and this is a good time for competition and collaboration in the market. Groot added: “Data delivery mechanisms have changed, the bar has been raised on projects and the business case for an internal service is difficult, making external services attractive.” Panellists also noted that technologies such as the Internet and cloud facilitate mass customisation, and pointed to the benefit of utilities that are built for a single purpose.

With so much to offer, Delaney questioned the panel on what type of organisations will benefit from third-party utilities. Panel members said both large and small firms could benefit, with large companies reducing today’s massive data costs and small firms being able to hand off non-core reference data services. Clark added: “Firms that can benefit most are those that find it difficult to discover the cost of data, perhaps because it is managed in different departments or geographic regions. But these firms are also the hardest to convert because they don’t know their costs.”

A question from the audience about defining reference data, making it open and putting it in a utility for all to use, met a consensus response from panel members who said it is a great idea, but will not happen because there are too many vendors with vested interests in the market.

Closing with a blue skies scenario, Delaney asked how far the utility concept could go. Groot concluded: “There is a need for operational procedures and recovery planning, but utilities could go a long way as there is a lot of data in scope.”

Source: Reference Data Review, 08.10.2013.

Filed under: BPO Business Process Outsourcing, Data Management, Data Vendor, Reference Data, Standards

Reference Data: Tech Mahindra Details Global Data Utility Based on Acquired UBS Platform

Tech Mahindra, a business process outsourcing specialist and parent of London-based investment management technology consultancy Citisoft, has repositioned a reference data platform acquired from UBS Global Asset Management to offer an offshore reference data utility aimed at meeting market demand for lower cost, high quality data that can reduce risk and increase efficiency.

The global data utility has been introduced under the Tech Mahindra Managed Data Services brand and offers securities reference data across all asset types, reference data for corporate actions, tax information and end-of-day and intra-day validated pricing data. The utility handles data cleansing and validation, with clients buying licences to access the data.

Tech Mahindra suggests the utility differs from other offerings in the enterprise data management market as it is owned by the company and can be developed further. It is also agnostic on data feeds, taking in around 20 from vendors including SIX, Markit, Bloomberg, Thomson Reuters and DTCC.

The company’s first customer is UBS Fund Services in Luxembourg. Under the terms of a five-year services contract with UBS, Tech Mahindra will create and store golden copy data and provide multiple intra-day golden copies to the asset manager. As part of the acquisition and customer deal, Tech Mahindra, which is headquartered in Hyderabad, India, will take on some staff from UBS Global Asset Management who were working on the platform in Luxembourg, but most staff will be located in India.

As a repositioned platform, Tech Mahindra MDS already covers all time zones, markets and asset types, updates 2.5 million issuers on a daily basis, receives 200,000 customer price requests and validates 75,000 prices. Some 20,000 corporate actions are checked every day, along with 1,800 tax figures. Looking forward, Tech Mahindra plans to extend these metrics and add reference data around indices and benchmarks, legal entity identifiers and clients.

While Tech Mahindra will lead sales of the service to the banking, financial services and insurance sectors, Citisoft will be able to provide consultancy as necessary. Steve Young, CEO of Citisoft, says Tech Mahindra MDS has been designed to improve data quality and drive down the total cost of data ownership, in turn reducing risk and increasing efficiency. To manage clients’ cost issues, the company has built a toolkit into the data management system that allows users to analyse the cost of owning data, including people, processes and technology. Data quality will be underpinned by service level agreements and key performance indicators will be added as more clients sign up for services and data volumes grow.

Reflecting on the data challenges faced by financial firms, Citisoft Group CEO Jonathan Clark, concludes: “Outsourcing models have evolved over time and attitudes are changing as firms acknowledge that there is a big difference between outsourcing and offshoring, and that captive outsourcing is not an efficient approach. The need is for a commercial relationship with a centralised data utility that can deliver high-quality, accurate data and a lower total cost of ownership.”

Source: Reference Data Review, 24.07.2013

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, Reference Data, Standards

Data Management – a Finance, Risk and Regulatory Perspective – White Paper

Download this white paper – Data Management – a Finance, Risk and Regulatory Perspective – about the challenges facing financial institutions operating across borders with respect to ensuring data consistency and usability across multiple user types.

 The finance, risk and compliance operations of any financial institution need nimble access to business information, for performance measurement, risk management, and client and regulatory reporting. But although the underlying data may be the same, their individual requirements are different, reflecting the group-level view required by senior management and regulators, and the more operational view at the individual business level.

Where in the past risk managers were left to figure out what was most appropriate for their particular institutions, regulators today are adopting a more aggressive stance, challenging the assumptions underpinning banks’ approaches to risk management. As a result, today’s challenge is not only to understand current regulatory, risk or finance requirements, but also to put in place the analytical framework that will help anticipate future requirements as they come on stream. To find out more, download the white paper now.

Source: A-Team, 02.07.2013

Filed under: Corporate Action, Data Management, Library, Reference Data, Risk Management, Standards

B-Source (Avaloq Group) takes over Wealth Management Operations Back-Office of Deutsche Bank (Switzerland)

Avaloq subsidiary B-Source is taking over operational responsibility for the Wealth Management Operations Back-Office of Deutsche Bank (Switzerland) Ltd with effect from July 1, 2013, supporting the bank in concentrating further on its core business.

With effect from July 1, 2013, B-Source, a member of the Avaloq group, is taking over operational responsibility for the Wealth Management Operations Back-Office of Deutsche Bank (Switzerland) Ltd, including 80 employees in Geneva. Transformation of the core-banking platform to the integrated Avaloq Banking Suite is scheduled for summer 2014.

Markus Gröninger, CEO of B-Source, said: “Our continuously growing community clearly reflects our growth path. This confirms that we are on the right track with our strategy of offering highly industrialised services that let banks concentrate on their core business and generate future growth.”

Francisco Fernandez, CEO of Avaloq, is also very pleased with the new deal: “In migrating to the integrated Avaloq Banking Suite, our clients are setting the highest standards in terms of individualising processes and industrialising operations. We love challenges like that. This is how we generate maximum added value for our clients.”

Source: Avaloq 28.06.2013

Filed under: Data Management, Reference Data, Wealth Management

Falcon Private Bank goes live with the B-Source Wealth Management Outsourcing Solution

B-Source successfully migrated Falcon Private Bank with its global locations to the B-Source Master at the beginning of the year. This will enable the established Swiss private bank to further optimize its processes and concentrate on its strategic expansion.

The successful migration of Falcon Private Bank to the B-Source Master means another Swiss financial institution has put its faith in B-Source’s reliable and innovative banking solution. Falcon Private Bank opted to outsource the operation of its banking platform and migrate it to the B-Source Master, an Avaloq-based banking application landscape using the ASP (application service provisioning) model. All three banking locations in Switzerland, Hong Kong and Singapore were migrated. The work was successfully completed within 15 months, a short period given the differing regional legal regulations. Orbium, a long-standing partner of B-Source, also played a decisive role in the successful project implementation.

By outsourcing its banking platform, Falcon Private Bank has a powerful, efficient and scalable banking solution that will allow it to focus on its strategic expansion in emerging markets. The bank chose B-Source in part due to its extensive expertise and long-standing experience not only in Switzerland but also with locations in other countries.

“The main reason behind our decision was B-Source’s experience in international outsourcing business, as we wanted to migrate several locations to the new banking system at the same time,” explains Tobias Unger, COO of Falcon Private Bank. “The migration of our banking platform to the B-Source Master creates the basis for optimal fulfilment both of our clients’ growing demands for higher quality service and of new regulatory requirements, and for pressing ahead with our global strategy and direction,” adds Unger.

“The migration of Falcon Private Bank to the B-Source Master is a further success for us, and we are proud to count another renowned first-class Swiss private bank among our clients in the shape of Falcon Private Bank. B-Source’s long-standing experience with international private banks enabled us to successfully implement this challenging project in a very short time and to a high level of quality,” says Markus Gröninger, CEO of B-Source AG.

Source: B-Source, 30.01.2013

Filed under: Data Management, News

NYSE Technologies expands SFTI network in Asia

NYSE Technologies, the commercial technology division of NYSE Euronext, today announced the continuing expansion of its Secure Financial Transaction Infrastructure (SFTI) in Asia with the introduction of two access centres located in Hong Kong.

Customers now, for the first time, have direct access to the SFTI network, allowing them to connect from Hong Kong to services offered by NYSE Technologies through SFTI, including access to Hong Kong Exchanges & Clearing (HKEx), all major international trading venues, market data solutions, plus the NYSE Euronext capital markets community.

As part of the expansion of the SFTI network to include Hong Kong, NYSE has also extended SFTI to the new HKEx Data Centre colocation facility, giving customers there access to all the services available on SFTI through a simple cross connect to their colo racks. NYSE Technologies also plans to expand SFTI in the region to connect other markets like Australia and Korea.

NYSE Technologies’ Secure Financial Transaction Infrastructure provides access to a comprehensive range of capital markets products through a single point of access and offers low-latency trading access to the NYSE Liffe and NYSE Euronext markets. SFTI Asia is the most recent extension of the global backbone, enabling Asian firms to receive market data and trade on multiple markets. Designed to be the industry’s most secure and resilient network, SFTI is specifically built for electronic trading and market data traffic, enabling firms to reduce their time-to-market, improve their performance and significantly lower the cost of their trading infrastructure. Furthermore, the global backbone allows customers to connect to their trading infrastructure distributed in financial centres around the world using an SFTI connection on the other side of the world.

“The addition of these important access centres in Hong Kong is a further step in the expansion of NYSE Technologies’ footprint and the reach of the SFTI Asia network, and adds to our established presence in Singapore and Tokyo,” commented Daniel Burgin, Head of Asia Pacific, NYSE Technologies. “Offering multiple access centres in the Asia Pacific region allows customers to use SFTI Asia to connect to regional and global exchanges and markets in a cost effective way through a single connection at each of the client’s locations around the region. This eliminates the overheads and costs associated with maintaining separate network connections in each location to multiple trading venues.”

Source: NYSE Technology 06.12.2012

Filed under: Australia, China, Data Management, Data Vendor, Exchanges, Hong Kong, Japan, Korea, Market Data, News, Singapore

Capco Proposes the Creation of a Data Culture to Advance Data Management – RDR

Financial firms are falling short on data management issues such as calculating the true cost of data, identifying the operational cost savings of improved data management and embracing social media data, but according to research by consultancy Capco, these issues can be resolved with a cross-organisational and practical approach to data management and the development of a data culture.

The business and technology consultancy’s report – ‘Why and how should you stop being an organisation that manages data and become a data management organisation’ – is based on interviews with close to 100 senior executives at European financial institutions. It considers the many approaches to data management across the industry and within individual enterprises, as well as the need to rethink data management. It states: “There is one certainty: data and its effective management can no longer be ignored.”

The report suggests an effective data management culture will include agreed best practices that are known to a whole organisation and leadership provided by a chief data officer (CDO) with a voice at board level and control of data management strategy and operational implementation.

For details on the report click here

Turning the situation around and attaching practical solutions to the data management vision of an all-encompassing data culture, Capco lists regulatory compliance, risk management, revenue increase, innovation and cost reduction as operational areas where good data management can have a measurable and positive effect on profit and loss.

Setting out how an organisation can create an effective data culture, Capco notes the need to change from being an organisation that is obliged to do a certain amount of data management, to a mandated and empowered data management organisation in which data has ongoing recognition as a key primary source. The report concludes: “Every organisation has the potential, as well as the need, to become a true data management organisation. However, the journey needs to begin now.”

Source: Reference Data Review, 24.10.2012

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, Reference Data, Standards

ATG taps NYSE Technologies for ATS Brasil

Americas Trading Group (ATG) today announced the formation of a new company, Americas Trading System Brasil (ATS Brasil), which will develop a liquidity center targeting the Brazilian exchange market.

Utilizing trading solutions developed by NYSE Technologies, the technology unit of NYSE Euronext, ATS Brasil will offer customers a new equities matching platform in Latin America.

ATG will maintain the controlling interest as well as operational management of the company with NYSE Technologies as a minority shareholder and the core technology provider. ATS Brasil plans to begin operations in 2013, subject to approvals by the Central Bank of Brazil and the Brazilian Securities Commission (CVM).

Fernando Cohen, ATG’s President, believes that the entry of ATS Brasil will have a positive effect on the local stock market as it will contribute to expanding the range of products and services offered to investors in the region. Cohen also emphasized the importance of NYSE Technologies’ decision to expand into the Brazilian market by becoming a partner of ATS Brasil.

Cohen stated that initially ATS Brasil intends to operate in a model known as “the organized OTC market”, based on computerization and transparency in order registration and execution, and to adopt rigid mechanisms of self-regulation. He further noted that ATS Brasil was not created to compete with BM&FBovespa, but rather to complement it by improving liquidity and price formation for Brazilian assets.

“The entry of ATS Brasil starts a new cycle in the Brazilian exchange market. Our innovative, high-performance order execution platform will generate more liquidity for the capital markets. This initiative should stimulate cost reduction by offering efficiency gains for investors and create the real possibility of placing the Brazilian market within international standards,” said Fernando Cohen.

Dominique Cerruti, President and Deputy CEO, NYSE Euronext, said, “As a leading operator of global markets and market technology, we have designed and deployed proven, market-tested trading platforms in key market centers around the world. We are pleased to partner with ATG as they expand their business into equities matching with the ATS Brasil initiative. Our technology platform should provide customers and market participants with the same high-quality trading experience, performance and reliability that they’ve come to expect from NYSE Euronext’s own exchanges.”

ATS Brasil will use the Universal Trading Platform (UTP) developed by NYSE Technologies and used by NYSE Euronext’s global markets. UTP has the capacity to process high volumes of messages with very low latency, giving market participants the opportunity to submit thousands of orders per second while also improving market transparency and liquidity. Additionally, ATS Brasil is expected to attract new investors to the Brazilian market, including local and international high frequency traders. ATG will also utilize NYSE Technologies’ Secure Financial Transaction Infrastructure (SFTI) network to provide global access and direct market data distribution for customers trading outside Brazil. 

Source: FinExtra , 06.11.2012

Filed under: BM&FBOVESPA, Brazil, Exchanges, Latin America, Market Data, Mexico, Trading Technology

Reference Data: Current Solutions Lacking, Despite Foundational Nature of Reference Data – RDR

Reference data management (RDM) is a foundational element of financial enterprises, yet the collection of solutions used to manage reference data in most firms is not satisfactory, according to a report published this week.

The report – Reference Data Management: Unlocking Operational Efficiencies, published by Tabb Group in conjunction with data integration specialist Informatica – describes current sentiment around RDM. It looks at development through four generations of solutions, details the obstacles to RDM success and sets out how firms at different levels of RDM adoption can move forward towards the holy grail of centralised RDM coupled with consistent reference data processing.

Despite huge investments in RDM over the past decade, research carried out among 20 firms – 25% in Europe, 75% in the US, 50% on the buy side and 50% on the sell side – in April 2012 found 86% of respondents dissatisfied with their RDM capabilities. Of these, 48% are being driven to improvement for reasons related to resource optimisation and outcomes, while 35% are responding to specific catalysts such as compliance.

For details on the report click here.

Recommending how to navigate the road ahead, the study suggests firms committed to bolstering existing suites of RDM solutions should focus on wrapping current solutions with technology that enables a consistent enterprise data governance process, while those yet to make a significant commitment to an RDM solution should seek solutions that manage multiple reference data domains in a consistent and integrated enterprise framework.

The report concludes: “There can be no glory without doing the hard work first. Data fluency, a critical precursor to data consumability, simply means that data flows more easily, which in turn means that end users must be able to find it. And, finding data requires meticulous attention to standards, labels and other metadata, however imperfect they may be now or in the future. That way, no matter how big or complex the data gets, end users will have a much better shot at harvesting value from it.”

Source: Reference Data Review, 19.10.2012

Filed under: Data Management, Reference Data

BM&FBOVESPA Market Data Feeds and Order Routing – News Letter 14

Changes to PUMA UMDF Market Data Feed in Certification Environment
Since September 11, 2012, a new distribution of the PUMA UMDF instrument groups has been available in the certification environment. The change was necessary to adapt this environment to the start of trading in the S&P 500 futures contract. On October 1, 2012, distribution of SELIC instruments in the certification environment is scheduled to start via existing channels 3 and 4.
New Version of MegaDirect Order Entry Interface Available in Certification Environment
Version 4 of the MegaDirect order entry interface for the BOVESPA segment is available in the certification environment. Participants who use versions 2 and 3 must update them in order to maintain compatibility with the BM&FBOVESPA PUMA Trading System matching engine. MegaDirect users must perform the tests mentioned in External Communication 023/2012-DI by no later than September 28, 2012.
EntryPoint Order Entry Interface Available in Production Environment
The EntryPoint trading interface for the BOVESPA segment is now available in the production environment. Although it does not entail significant performance benefits (lower latency) at this time, it mitigates the risk of impacts during BOVESPA segment migration to the BM&FBOVESPA PUMA Trading System.

Customers who will use this new interface must observe the requirement for uniqueness in the ClOrdID (11) tag in the order entry, change and cancellation messages, to prevent any crossed references between orders of the same customer and instrument from generating inconsistencies in the participant’s management. The customer is responsible for correctly filling out the tag, as announced in External Communication 021/2012-DI.
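
For illustration, the uniqueness requirement can be met by drawing tag 11 values from a session-scoped counter. The sketch below is hypothetical Python, not the EntryPoint specification: the actual message layout and session conventions are defined in External Communication 021/2012-DI, and the class and field names here are invented for the example.

```python
# Hypothetical sketch of session-unique ClOrdID (FIX tag 11) generation.
# The real EntryPoint conventions are defined in External Communication 021/2012-DI.
import itertools
from datetime import date

class ClOrdIdGenerator:
    """Issues ClOrdID values that are never reused within a session, so new
    order, change and cancellation messages cannot cross-reference each other."""

    def __init__(self, session_prefix: str):
        self._prefix = session_prefix
        self._seq = itertools.count(1)
        self._issued = set()

    def next_id(self) -> str:
        clordid = f"{self._prefix}-{next(self._seq):010d}"
        if clordid in self._issued:
            # A duplicate would create crossed references between orders of
            # the same customer and instrument.
            raise ValueError(f"duplicate ClOrdID {clordid}")
        self._issued.add(clordid)
        return clordid

# Example: one prefix per trading session keeps identifiers unique.
gen = ClOrdIdGenerator(date.today().strftime("%Y%m%d"))
tag11_new_order = gen.next_id()   # used on the new order message
tag11_replace = gen.next_id()     # a fresh ID for the change/cancel request
```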

Alterations to ProxyDiff Market Data Feed in Certification Environment
The ProxyDiff market data feed is now available in the certification environment in accordance with the alterations described in External Communication 024/2012-DI. This market data feed includes a large number of test cases (e.g. clash of messages with the same Order ID, and numeric and alphanumeric groups of quotations) that will be implemented in the production environment during and after migration to the BM&FBOVESPA PUMA Trading System in the BOVESPA segment. All customers who use the ProxyDiff market data feed must perform the tests by September 28, 2012.

See the full IT Newsletter Nr. 14

Source: BM&FBOVESPA, IT News Letter Nr. 14, 17.09.2012

Filed under: BM&FBOVESPA, Brazil, Data Management, Exchanges, Market Data

Coming to Grips With Big Data Challenges by Dan Watkins

The rate of data growth in financial markets has scaled beyond the industry’s ability to manage it.

Debates have gone so far as to dismiss Big Data as something tamable and controllable in the near term with the computing architecture commonly adopted today. I agree, but argue that conventional data transport – not management – is the real challenge of handling and utilizing Big Data effectively.

From exchange to trading machine, new ticks and market data depth are delivered only as fast as the delivery channel can carry them. The common market data feeds used in conventional exchange trading are but a fraction of the market information actually available.

Perhaps due to high costs of $100,000 per terabyte, many market participants deem the use of more data a bit too aggressive. Or they believe that high performance computing (HPC) is the next-generation technology solution for any Big Data issue. Firms, therefore, are advancing their information technology at a sluggish cadence, in tune with the old adage: “if it ain’t broke, don’t fix it.”

Over the last decade, Wall Street business heads have agreed with engineers that the immense perplexity of Big Data is best categorized by the three Vs of Doug Laney’s 2001 META Group report: Big Volume, Big Velocity and Big Variety.

When looking at “Big Volume” 10 years ago, the markets had just fragmented under Regulation ATS. A flurry of new market centers arose in U.S. equities, as did dark liquidity pools, giving rise to a global “electronic trading reformation.” Straight-through processing (STP) advocates evangelized platforms such as BRASS, REDIPlus and Bloomberg Order Management Systems (OMS), resulting in voluminous and fragmented market data streaming to 5,000 NASD/FINRA trading firms and 700,000 professional traders.

Today, the U.S. has 30+ Securities and Exchange Commission-recognized self-regulatory organizations (SROs), commonly known as exchanges and ECNs. For the first time since 2002, full market depth feeds from NASDAQ allow firms to collect, cache, react, store and retrieve feeds on six hours of trading for nearly 300 days a year more transparently than ever. Big Data volume has grown 1,000 percent and has reached three terabytes of market data depth per day.

Billions of dollars are being spent on increasing “Big Velocity.” The pipes that wire exchanges through the STP chain to the trader have become 100 times faster and larger, but are still not fast enough to funnel the bulk of the information lying idle back in the database. Through “proximity hosting,” the telco is eliminated and latency is lowered. This structure results in adjustments made for larger packets but not really for more information, as Big Data remains the big, quiet elephant in the corner.

Five years after Reg ATS, markets are bursting at the seams with electronic trading that produces explosive market data, breaking new peak levels seemingly every day. The SEC’s Regulation National Market System (Reg NMS), which took effect in 2007, requires exchanges and firms to calculate the best price for execution in order to be compliant. Firms are also now mandated to sweep all exchanges’ market order books and process all of that data for a smart execution.

After the execution, traders have to track the “order trail” from price to execution for every trade and store all of that information for seven years in the event of an audit recall of a transaction.
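
Combining the figures quoted above – roughly three terabytes of market data depth per day, close to 300 trading days a year and a seven-year retention window – gives a sense of the archive involved. The back-of-envelope sketch below uses only those figures and ignores compression and tiering:

```python
# Back-of-envelope retention arithmetic using the figures quoted in this article.
tb_per_day = 3          # ~3 TB of market data depth per day
days_per_year = 300     # ~300 trading days of collected feeds per year
retention_years = 7     # audit recall window for the order trail

total_tb = tb_per_day * days_per_year * retention_years
print(f"Uncompressed archive: {total_tb:,} TB (~{total_tb / 1000:.1f} PB)")
# -> 6,300 TB, roughly 6.3 PB before any compression
```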

Under Reg NMS, subscribing to the full depth of all 30+ markets in “real time” would mean a firm would need a terabit-scale pipe to keep latency low. Since such a pipe is not realistic, data moves at gigabit speeds, which is relatively slow when the queue of data is 50-100 terabytes deep. Multi-gigabit pipes, as fast as they seem, are still like driving five miles an hour on a 55 mph highway.
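
The mismatch between pipe speed and queue depth is easy to make concrete. A rough calculation (a sketch assuming decimal units and ignoring protocol overhead, retransmission and parallel links):

```python
# Rough transfer-time arithmetic for the link speeds and queue depth quoted above.
def transfer_hours(data_terabytes: float, link_gbps: float) -> float:
    bits = data_terabytes * 8e12          # 1 TB (decimal) = 8e12 bits
    seconds = bits / (link_gbps * 1e9)    # link speed in bits per second
    return seconds / 3600

queue_tb = 50                             # low end of the 50-100 TB queue
for gbps in (1, 10, 1000):                # 1 Gbps, 10 Gbps, 1 Tbps
    print(f"{queue_tb} TB over {gbps:>4} Gbps ≈ {transfer_hours(queue_tb, gbps):6.1f} hours")

# 1 Gbps  -> ~111 hours (about 4.6 days)
# 10 Gbps -> ~11 hours
# 1 Tbps  -> ~0.1 hours (about 7 minutes)
```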

Analysts typically call data from a database with R (for example, Revolution Analytics) and SAS connectors. The process involves bringing data into an analytical environment in which the user runs models and computations on subsets of a larger store before moving on to the next data crunch job. The R and SAS connectors between the file servers and the database run at 10/100BASE-T, making the movement of a 50-terabyte environment like driving one mile per hour in a 55 mph zone.

We all hear the polemics regarding data formats and the jigsaw puzzle of unstructured data, and the claim that “Big Variety” is the obstacle. Even with the standardization of SQL-based queries, where analysts can ask any “ad hoc” question, too many sources and too many pipes from analytic servers cause traffic jams. SQL databases are ideal for ad hoc queries but are slow at compiling unstructured data. Aggregating market information is where much of the market’s processing technology is being evaluated today, to meet the requirements of regulations, sweeping for best execution and risk management.

Comparing where current prices of stocks are against bids and asks to trade across multiple exchanges, markets, sources, asset classes and clients is essentially the Big Data task of risk management. In addition to managing data changes, firms are also tasked with managing their trading accounts, client portfolios and trading limits such as with the implementation of Credit Valuation Adjustments (CVAs) for counterparty risk.

So why are we still piping data around the enterprise when we just need more compute and memory power? Hardware-accelerated core processing in databases such as XtremeData’s dbX and IBM’s Netezza is powered by FPGAs (field programmable gate arrays). Processing of massive amounts of data with FPGAs can now occur at “wireless” speed. Along with high performance computing, high-speed messaging technology provided by companies like TIBCO, Solace Systems and Informatica has redefined transport times from one database to another in ultra-low latency terms – single microseconds, sometimes nanoseconds, from memory cache to memory cache.

“In-database” analytics is the colloquial term for running analytics and computations as near as possible to where the data is located, inside the database. Fuzzy Logix, an algorithmic HPC vendor, replaces the need for the SAS and R connector analytics that stretch along the wire from the database to the analyst. With Fuzzy Logix, the need to call a database for small files is eliminated because computations can be done alongside the rest of the database in real time: days to seconds faster.

With in-database or in-memory analytics, BI engineers can eliminate transport latency altogether and compute at server speeds, with computations sitting inside the database or in memory so that tasks are completed locally, not on the transport wire.
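
The difference between the two patterns – pulling rows across the wire to an analytics tier versus pushing the computation to the data – can be sketched in a few lines. This is a toy illustration using SQLite purely so it runs anywhere; the in-database engines named in the article are MPP platforms, not SQLite, and the table and column names are invented:

```python
# Toy contrast: connector-style pull vs. in-database aggregation.
# SQLite stands in for an analytic database only to keep the sketch runnable.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ticks (symbol TEXT, px REAL)")
conn.executemany("INSERT INTO ticks VALUES (?, ?)",
                 [("ABC", 10.0 + i * 0.01) for i in range(100_000)])

# Pattern 1: pull every row over the wire, then compute in the analytics tier.
rows = conn.execute("SELECT px FROM ticks WHERE symbol = 'ABC'").fetchall()
mean_client_side = sum(px for (px,) in rows) / len(rows)

# Pattern 2: in-database - only the one-row answer crosses the wire.
(mean_in_db,) = conn.execute(
    "SELECT AVG(px) FROM ticks WHERE symbol = 'ABC'").fetchone()

assert abs(mean_client_side - mean_in_db) < 1e-6
```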

Wall Street is as risk averse as ever in today’s atmosphere so the adoption of new technology or new vendors continues to present operational risk challenges. ParAccel is a company that appears to be addressing the operational risk of new technology adoption by helping firms utilize the power of parallel processing of Big Data analytics on OEM hardware.

Since ParAccel is software, an IBM, HP or Dell shop could essentially keep its well-known, established database vendor but use next-generation Big Data analytic processing an order of magnitude faster than what is currently in place. ParAccel allows firms to aggregate, load and assimilate different data sets faster than traditional platforms through its “columnar database” nodal system. The columns in a ParAccel environment provide firms with the flexibility to first run analytics in-database or in-memory, then bring massive amounts of data to a common plane and, finally, aggregate the unstructured data – all at lightning speed.
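
Why column orientation helps this kind of aggregation can be shown with a toy example (a sketch using NumPy, not ParAccel's engine): an aggregate over one field scans a single contiguous array instead of walking every field of every record.

```python
# Toy comparison of row-oriented vs. column-oriented storage for one aggregate.
import numpy as np

n = 100_000
# Row store: every record carries every field, so summing one field
# still means walking all of them.
rows = [{"symbol": "ABC", "px": 10.0, "qty": 100, "venue": "X"} for _ in range(n)]
total_from_rows = sum(r["px"] for r in rows)

# Column store: each field is a contiguous array; the aggregate touches
# only the column it needs, which also vectorises cleanly.
px_column = np.full(n, 10.0)
total_from_column = px_column.sum()

assert abs(total_from_rows - total_from_column) < 1e-6
```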

Other companies like NVIDIA have been building graphics processing units (GPUs) for the video game industry for three decades and are now swamped with customer requests to help build parallel computing environments, giving financial firms the ability to run trillions of algorithmic simulations in microseconds for less than $10,000 per card. A single NVIDIA Tesla card can have up to 2,000 processing cores embedded inside. A GPU appliance can be attached to a data warehouse for advanced complex computations. Low-latency processing can also be achieved because data moves only a minimal distance, allowing most of what Wall Street claims is Big Data to be analyzed in seconds, compared with the days it takes now.
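
The simulation workload described here is embarrassingly parallel, which is why it maps well onto GPUs. The sketch below prices a European call option by Monte Carlo using NumPy; the parameters are invented for illustration, and because CuPy mirrors much of the NumPy array API, largely the same expressions can be pushed onto an NVIDIA card.

```python
# Monte Carlo pricing of a European call - the data-parallel pattern GPUs accelerate.
# Written with NumPy; CuPy's largely NumPy-compatible API lets similar array
# expressions run on an NVIDIA GPU with minimal changes. Parameters are illustrative.
import numpy as np

def mc_call_price(s0=100.0, k=105.0, r=0.01, sigma=0.2, t=1.0, n_paths=1_000_000):
    rng = np.random.default_rng(42)
    z = rng.standard_normal(n_paths)                        # one draw per path
    st = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)
    payoff = np.maximum(st - k, 0.0)                        # call payoff per path
    return float(np.exp(-r * t) * payoff.mean())            # discounted average

print(f"Estimated call price: {mc_call_price():.4f}")
```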

The vendors and players are ready to get to work; there just needs to be some consensus that the Big Elephant in the room is there and it’s standing on a straw when it could be surfing a Big Wave!

Source: Tabb Forum, 02.05.2012, by Dan Watkins, President @ CC-Speed, dwatkins@cc-speed.com

Filed under: Data Management, Market Data, Risk Management, Trading Technology

HKEx selects NYSE Technologies Exchange Data Publisher for Hong Kong Market Data Platform

Hong Kong Exchanges and Clearing Limited (HKEx) has selected NYSE Technologies’ Exchange Data Publisher (XDP)™ to drive the HKEx Orion Market Data Platform.  XDP is an ultra-low latency solution designed to collect, integrate and disseminate real-time market data to local customers and, using regional hubs, to customers around the globe.

The HKEx Orion Market Data Platform will deliver market data for all securities and derivatives traded by HKEx in a common message format.  It will be capable of distributing more than 100,000 messages per second at microsecond latency.  It will be rolled out for HKEx’s securities markets towards the end of the second quarter of 2013, with a remote distribution hub in Mainland China and integration with HKEx’s derivatives markets to follow.

With the establishment of remote distribution hubs under the new market data platform, HKEx will be able to establish points of presence for market data distribution outside of Hong Kong, such as in Mainland China, where information can be relayed to local customers.

“The HKEx Orion Market Data Platform will enable us to improve our customers’ market data experience by providing a suite of market data product feeds with content, market depth and bandwidth requirements tailored to suit the needs of different types of customers,” said Bryan Chan, Head of Market Data at HKEx.  “We selected NYSE Technologies’ XDP solution based on its high performance capabilities as well as the flexibility it offered to meet our customer requirements.”

XDP is based on NYSE Technologies DataFabric 6.0, an industry-leading platform offering high throughput, scalable application messaging and microsecond latency.

“HKEx’s selection of XDP reaffirms our technology expertise and ability to deliver innovative products that operate effectively in markets around the world, particularly in the growing Asian marketplace,” said Stanley Young, CEO NYSE Technologies.  “In XDP we are providing HKEx with the robust features of a proven platform and the advantages of functions tailored to their unique trading environment.  We are pleased to be working with one of the world’s leading markets to deploy a world class market data platform that will serve customers in Hong Kong and Mainland China.”

Source: MondoVisione, 29.03.2012

Filed under: Data Management, Exchanges, Hong Kong, Market Data

White Paper: Big Data Solutions in Capital Markets – A Reality Check

Big Data has emerged in recent months as a potential technology solution to the issue of dealing with vast amounts of data within the enterprise. As in other industries, financial services firms of all kinds are drowning in data, both in terms of the sheer volume of information they generate and / or have to deal with, and in terms of the growing and diverse types of data they confront in those efforts.

But the relative immaturity of Big Data solutions, and a widespread lack of understanding of what the term really means, leads some to question whether ‘Big Data’ is no more than a technology solution looking for a Big Problem to solve.

So is Big Data for real? Can so-called Big Data solutions provide relief to the embattled data architects at financial institutions? Or is Big Data a solution looking for a set of problems to solve?

Research conducted by A-Team Group on behalf of Platform Computing suggests that current market sentiment, financial hardships and regulatory scrutiny may be conspiring to create the perfect conditions for Big Data solutions to provide value to financial institutions.

Download the White Paper Now

Source: A-Team, 15.02.2012

Filed under: Data Management, Data Vendor, Library, Market Data, Reference Data

Deutsche Börse developing new business area – data and IT-related activities pooled under one roof

By way of underpinning its growth strategy, Deutsche Börse is creating a new business area geared to extending its client reach and service offering. In this move, IT – with its system and service development and operating capabilities – Market Data and Analytics, as well as selected external services, are to be pooled under one roof. This includes, for example, the use of trading systems for other exchange companies, the business process offering in its entirety, IT operations for other financial service providers, as well as network services.

The goal is, on the one hand, to tap new, integrated business opportunities and, on the other, to support clients with tailored IT and other services, thereby further enhancing customer loyalty, broadening client reach and meeting the growing demand for outsourcing with an expanded range of services.

Reto Francioni, CEO of Deutsche Börse AG, said: “Our services, particularly in the field of IT, put us firmly in the premier league of global providers. Deploying our combined expertise and capabilities to optimum effect in customer acquisition, competition for market share and regional presence will be increasingly important when it comes to boosting the Group’s international positions. This sends out a clear mandate to our new business area to play a key role as a critical and strategic competitive factor for Deutsche Börse AG going forward, as well as to harness and expand cross-selling potential with our existing business areas.”

In addition to the launch of the new business unit, there will be a change in the top management of the IT business unit. Dr.-Ing. Michael Kuhn (57) and Deutsche Börse AG have agreed, on the best of terms and by mutual consent, that the Executive Board contract of Michael Kuhn, due to run out at the end of 2012, will not be extended. The company is looking for a successor. Michael Kuhn will remain available to the company.

Supervisory Board Chairman Manfred Gentz and CEO Reto Francioni thanked Michael Kuhn for his total of 23 years’ service for the Deutsche Börse Group. He has been a member of the Group Executive Board since 1999. “We wish to express our gratitude to Michael Kuhn and his team. It is thanks to him and his colleagues that Deutsche Börse AG sets global standards with its systems,” said Gentz.

Source: MondoVisione, 14.02.2012

Filed under: Data Management, Data Vendor, Exchanges, Market Data, News, Reference Data, Trading Technology
