June 2010                                                                                 $499.00

Analytics.InformationWeek.com

                                 Analytics Report
                                Accelerating Wall Street 2010
                                Next Stop: Nanoseconds

                                With data latency on its way to being measured in
                                nanoseconds, message volume exploding, and intensified
                                demand for innovative trading products, Wall Street
                                organizations are turning to the fastest and newest
                                technologies to stay ahead. This special Wall Street &
                                Technology digital report examines some of the latest
                                innovations in the low-latency arms race, including
                                hardware acceleration, complex event processing and
                                colocation, and provides exclusive insights from WS&T’s
                                Accelerating Wall Street 2010 conference.

Report ID: S1330610
                 CONTENTS

                 The Low-Latency Imperative: How Fast Is Fast Enough?
                     Figure 1: 5 Factors Influencing Latency
                 What’s All the Fuss About?
                 How Low Can You Go?
                 Silicon: The Next Tool for Low Latency
                 What’s the Best Low-Latency Tool? Try People
                 Data Center Costs, Oversight Challenge the Sell Side
                 Firms Still Not Analyzing Unstructured Real-Time News




                         The Low-Latency Imperative: How Fast Is Fast Enough?
                         Conventional wisdom may assert that everyone on Wall Street wants to be as fast as
                         possible. But latency is a matter of perspective, influenced by trading style, instrument
                         class and even the tools used to measure it.

                         By Daniel Safarik

                         Ask any trader, vendor or marketplace operator in the global securities market, “How fast do
                         you need to be in order to be successful?” and the answer will most likely be, “It depends.”

                         Technological advances move at such a pace, and firms rely on such varying strategies, that
                         the acceptable level of latency—defined as the time it takes for an order to travel to a
                         marketplace and be executed or canceled, and then for a confirmation of that activity to
                         return to its source—varies from one party to the next, though none of the intervals is
                         perceptible to the human eye. For the past few years, hardware and software providers have
                         managed to cut latency dramatically year after year; the conversation has moved from
                         milliseconds to microseconds and even nanoseconds.
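                         That round-trip definition is concrete enough to sketch. Below is a minimal, hypothetical
                         Python illustration that timestamps an order on the way out and its acknowledgement on the
                         way back; the gateway address and message format are invented, and production systems
                         typically timestamp in hardware or at the network card rather than in application code.

                         import socket
                         import time

                         # Minimal sketch of the round-trip definition above: timestamp
                         # when an order leaves, timestamp when the acknowledgement
                         # returns. Endpoint and wire format are hypothetical.
                         GATEWAY = ("127.0.0.1", 9000)

                         def round_trip_ns(order: bytes) -> int:
                             with socket.create_connection(GATEWAY) as sock:
                                 t0 = time.perf_counter_ns()  # order leaves the source
                                 sock.sendall(order)
                                 sock.recv(4096)              # execution/cancel ack returns
                                 t1 = time.perf_counter_ns()
                             return t1 - t0

                         rtt = round_trip_ns(b"NEW|MSFT|100|LIMIT|25.10\n")
                         print(f"{rtt} ns ({rtt / 1000:.1f} microseconds)")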

                         Although it may seem to be accepted wisdom that everyone wants to be “as fast as possible,”
                         that’s not necessarily true. And as the market saw graphically and frighteningly in the
                         May 6 “flash crash,” speed, by itself, is not the ultimate goal. In fact, lacking business
                         rules that acknowledge the full implications of instantaneous transactions, speed is
                         dangerous.

                         According to Adam Honore, senior analyst at Aite Group, the level of latency considered
                         acceptable by market participants depends on several factors, including:

                         • Trading style. If you have an aggressive trading style that relies on opportunistic
                         pricing differentials, you need to be the fastest. If you are a long-only quant fund,
                         speed is not as critical.

                         • Instrument class. Generally speaking, equities are the fastest-moving markets, with futures,
                         foreign exchange and fixed income lagging behind.

                         • Venue characteristics. The capabilities offered by each exchange and marketplace will
                         vary—dark pools, intraday matching, live limit-order books and so on—as will the level of
                         traffic they attract throughout the course of the day.

                         • Instrument characteristics. Trading shares of highly liquid Microsoft would have a vastly
                         lower latency requirement than an illiquid OTC Bulletin Board stock, for example.

                         There are ranges of latencies that can be instructive, according to Steve Rubinow, CIO at
                         NYSE Euronext. But it is important to remember that these numbers vary greatly with market
                         conditions and that the methodologies of measuring latency are not consistent, he adds,
                         stressing that if latencies are quoted out of context with market traffic, they are
                         essentially useless.

                         “Everyone publishes numbers that were generated under the best possible conditions and
                         hopes nobody asks the details, because those details would reveal whether it was comparable
                         or not,” Rubinow says. “Having said all that, to be competitive today, you have to be in the few
                         hundred microseconds of turnaround time.”

                         When it embarked on its Universal Trading Platform (UTP) program last year, NYSE Euronext
                         stated that it was aiming for 150 microseconds to 400 microseconds of latency per round trip
                         for its European Cash Markets. By comparison, on May 14 NYSE rival Nasdaq OMX published
                         an overall average result of 157 microseconds, while noting that 99.9 percent of orders were
                         completed within 757 microseconds, at a rate of 194,205 orders per second.
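                         Published figures like these are summary statistics over very large order samples. A small
                         illustrative sketch of how an overall average and a 99.9th-percentile bound are derived
                         from per-order measurements (the sample values here are made up):

                         import statistics

                         # Derive the kind of figures quoted above -- an overall average
                         # and a 99.9th-percentile bound -- from per-order round-trip
                         # latencies in microseconds. All values are invented.
                         def summarize(latencies_us):
                             ordered = sorted(latencies_us)
                             i = int(0.999 * (len(ordered) - 1))  # 99.9th-percentile index
                             return {"average_us": statistics.fmean(ordered),
                                     "p99_9_us": ordered[i]}

                         sample = [142, 150, 157, 161, 149, 155, 730, 158, 151, 757]
                         print(summarize(sample))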

                         To illustrate how quickly the standard moves, the top class was in the tens of milliseconds
                         a year ago, according to Donal Byrne, CEO and founder of Corvil, a Dublin, Ireland-based
                         vendor of latency measurement technology. [Ed. Note: 1 millisecond = 1,000 microseconds.]

                            Figure 1: 5 Factors Influencing Latency
                            • Trade logic—the code that runs matching engines and algorithms
                            • Speed of calculation hardware
                            • Speed of telecom switch hardware
                            • Quality and number of connections
                            • Distance between network nodes
                            Source: Kevin McPartland, Analyst, TABB Group

                         About 40 percent of the U.S. equities market volume consists of market makers that are
                         trying to match the latency of the marketplace they are using, notes Shawn Melamed, founder,
                         president and CEO of New York-based Correlix, which also sells latency monitoring devices to
                         exchanges, including Nasdaq OMX. That group needs the fastest response times, he says.

                         On the other hand, “For someone who is doing index arbitrage, the average latency they would
                         require is really situational,” Melamed comments. “There is no fixed number there. [Because
                         they need to get information from several marketplaces], they will be tolerant to higher latency,
                         so that you can at least get full price information before you make your decision.”


                         Knowing Latency Is Half the Battle
                         The emergence of companies such as Corvil and Correlix, which did not exist five years ago,
                         illustrates an important consideration about latency: Often, knowing the level of latency at a
                         given marketplace is more important than the number itself. Traders—and the algorithms they
                         deploy—now can make decisions about execution venues using latency data, just as they
                         would use fill rate and price quality as decision factors.


                                                           “To the extent that we have multiple paths to get to a
                                                           venue, we can effectively normalize out the native latency
                                                           within that venue.”
                                                                                                       —Jason Lenzo, Russell Investments


                         At Tacoma, Wash.-based Russell Investments, which operates the Russell 2000 small-cap index,
                         traders rely on this operational transparency to make decisions, bringing the latency data about
                         each execution venue and market-data source right onto trader desktops, relates Jason Lenzo,
                         head of equity and fixed income. “To the extent that we have multiple paths to get to a venue,
                         we can effectively normalize out the native latency within that venue,” he says. “We can then
                         optimize the speed to market across specific optical and telephony links in those networks.”

                         When evaluating latency, it’s vital to consider all the contributing factors, including the
                         trade logic (the code that runs matching engines and algorithms), the speed of calculation
                         hardware, the speed of telecom switch hardware, the quality and number of connections, and
                         the distance between network nodes, according to Kevin McPartland, analyst at TABB Group. The
                         industry as a whole is rapidly approaching the point where, “The code is so tight, hardware
                         improvements are the main thing that will increase the overall efficiency of the operation,”
                         McPartland says.
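                         McPartland’s factor list amounts to a latency budget: the end-to-end number is the sum of
                         per-component contributions, and the dominant term shows where optimization still pays. A
                         sketch of that accounting, with invented component numbers:

                         # Hypothetical per-component budget for one order round trip,
                         # in microseconds, following McPartland's factor list. All
                         # numbers are invented for illustration.
                         budget_us = {
                             "trade logic (matching/algo code)": 45.0,
                             "calculation hardware": 20.0,
                             "telecom switch hardware": 1.4,
                             "connections (NIC, TCP stack)": 25.0,
                             "distance between nodes": 50.0,
                         }

                         total = sum(budget_us.values())
                         for name, us in sorted(budget_us.items(), key=lambda kv: -kv[1]):
                             print(f"{name:35s} {us:6.1f} us  ({us / total:5.1%})")
                         print(f"{'total round trip':35s} {total:6.1f} us")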

                         Enter vendors such as Blade Network Technologies (telecom hardware), Equinix (colocation
                         hosting) and Nvidia (gaming graphics cards re-tasked to calculate derivatives). Each of
                         these technology providers is feverishly trying to reduce the latency of its layer in
                         the stack.

                         Santa Clara, Calif.-based Blade makes a 10 Gigabit Ethernet switch that connects feed
                         handlers, algorithm boxes and matching engines at colocation centers, where firms have
                         increasingly found it useful to situate their machines across the hallway from their
                         counterparts, even if their offices and trading staff are on opposite sides of the globe.

                         By merging routers with switches, Blade has eliminated a layer that previously added precious
                         microseconds to a round trip, explains David Iles, Blade Network Technologies’ director of
                         product management.

                         “We are providing sub-700 nanoseconds of latency, port to port,” Iles asserts. “We are also
                         deterministic. You don’t want stale market data getting to devices. It has to take the same time
                         to get from Port 1 to Port 24 as it does from Port 1 to Port 2.”

                         Technologies such as this tend to live side by side in colocation centers run by companies such
                         as Foster City, Calif.-based Equinix. Here, the issue is bandwidth and energy efficiency, both of
                         which are major cost contributors in the low-latency race. Trading firms are increasingly opting
                         for colocation rather than running expensive, high-throughput dedicated fiber from their
                         offices to the marketplace.

                         “We have a customer in Greenwich, Conn.,” relates John Knuff, general manager, global
                         financial markets, at Equinix. “They were spending about $20,000 a month to get trades to
                         New York. They moved a couple of cabinets in with us. It’s $3,000 to $5,000 a month for a
                         cabinet, and $200 to $300 for the cross-connects. They essentially offset the cost of their
                         colocation by getting rid of the network costs back to their office, which had no economic or
                         competitive advantage.”
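                         The arithmetic behind Knuff’s example is simple enough to make explicit. Using the ranges
                         he quotes, with a hypothetical two-cabinet, four-cross-connect footprint:

                         # Monthly cost comparison from Knuff's example. Fiber and
                         # colocation prices are the ranges quoted in the text; cabinet
                         # and cross-connect counts are assumptions.
                         fiber_to_ny = 20_000                # $/month, dedicated fiber

                         cabinets, cross_connects = 2, 4     # hypothetical footprint
                         lo = cabinets * 3_000 + cross_connects * 200
                         hi = cabinets * 5_000 + cross_connects * 300

                         print(f"dedicated fiber: ${fiber_to_ny:,}/month")
                         print(f"colocation:      ${lo:,} to ${hi:,}/month")
                         print(f"savings:         ${fiber_to_ny - hi:,} to ${fiber_to_ny - lo:,}/month")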




                         Think Fast(er)
                         An equally important question pervades the minds of traders: Once you’re satisfied with the
                         turnaround time to your market, how do you maximize the value of the time between
                         transactions? That question interested Tobias Preis, managing director of Artemis Capital
                         Asset Management of Holzheim, Germany, to such a degree that he became one of the first
                         customers of Nvidia’s graphics processing unit (GPU), a processor with 480 cores, compared
                         with the typical four- to 12-core CPU.

                         The GPU originally was developed to render high-resolution details for computer games. Preis,
                         also a computational physicist, uses the GPU to calculate time series for the DAX-index futures
                         algorithms he deploys on Eurex.

                         “The increases in speed represented by the GPU are many times faster than the reductions in
                         latency by the exchanges,” according to Preis, who says he gets by on 100 milliseconds to 150
                         milliseconds of average latency to Eurex. “We can now perform parallel-computing calculations
                         that used to take one minute in one second.”
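                         Preis’s speedup comes from data parallelism: the same arithmetic applied independently
                         across many points of a time series. The following sketch shows the idea with NumPy
                         vectorization as a CPU stand-in for a GPU; it is illustrative only, not his DAX code.

                         import time
                         import numpy as np

                         # A rolling mean over a long series stands in for a
                         # parallelizable time-series calculation. The vectorized path
                         # computes every window at once, the way a GPU spreads work
                         # across its cores.
                         prices = np.random.default_rng(0).normal(0, 1, 2_000_000).cumsum()
                         w = 500

                         t0 = time.perf_counter()
                         loop = [prices[i - w:i].mean() for i in range(w, w + 5_000)]
                         t_loop = time.perf_counter() - t0   # only 5,000 windows

                         t0 = time.perf_counter()
                         c = np.concatenate(([0.0], np.cumsum(prices)))
                         vec = (c[w:] - c[:-w]) / w          # all ~2M windows at once
                         t_vec = time.perf_counter() - t0

                         print(f"loop: {t_loop:.3f}s (partial)  vectorized: {t_vec:.3f}s")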

                                               “I know we will break the
                                               100-microsecond barrier.”
                                                      —Steve Rubinow, NYSE Euronext

                         Where will low latency be in a year? Many market participants say they won’t be surprised
                         if the discussion is about nanoseconds in a year. “I know we will break the 100-microsecond
                         barrier,” NYSE’s Rubinow says. Beyond that, it becomes enormously expensive to add each
                         zero behind the decimal point, he notes.

                         Despite the excitement over low latency and the extreme competitiveness of financial firms and
                         the vendors that serve them, it’s important to keep a clear head about the need for speed, adds
                         Adam Afshar, president of program trading at Atlanta-based Hyde Park Global Investments,
                         which is 100 percent automated and has no manual traders.

                         “High frequency is just a method for implementing a strategy—it is not the strategy itself,”
                         says Afshar. He notes approvingly that the decreasing cost of technology means that a $10
                         million investment in technology allows a smaller firm to rival the speed of the biggest banks
                         on Wall Street.

                         But the key to success in the marketplace, according to Afshar, is adaptability, and that still
                         comes from human ingenuity. For Afshar, going forward, the more interesting question is not
                         “How fast can you hit the market?” but “What do you do with that speed?”

                         “The bottom line,” he says, “is your adaptability to the nonlinearity of markets.”




                         What’s All the Fuss About?
                         Despite the controversy surrounding high-frequency trading, the trading style is
                         beneficial to long-term investors and to the market at large, argues Arzhang Kamarei,
                         Managing Partner, Tradeworx.

                         By Melanie Rodier

                         High-frequency trading remains mired in controversy, with regulators fearing that
                         unscrupulous traders are taking advantage of individual investors. But what critics don’t
                         realize is that high-frequency trading actually is beneficial to long-term investors and to
                         the market at large, according to Arzhang Kamarei, managing partner at Tradeworx, a
                         quantitative investment management firm with expertise in high-frequency market strategy.

                         “High-frequency trading creates opportunities for long-term investors by providing more
                         liquidity,” asserted Kamarei, who presented the keynote address at Wall Street &
                         Technology’s recent Accelerating Wall Street conference.

                         The extra liquidity that high-frequency trading provides, he explained, narrows spreads for
                         long-term investors, ultimately helping them get better prices.

                         “During the turbulent fourth quarter of 2008, it was high-frequency traders that stepped up
                         and provided liquidity,” Kamarei argued. “High-frequency trading provides U.S. markets with
                         better prices and deeper liquidity than markets in any other country or region. It helps smooth
                         the course of long-term investors.”

                         High-frequency trading is estimated to generate as much as two-thirds of U.S. equities daily
                         trading volume. But as it grows in popularity, it also has attracted the scrutiny of regulators,
                         eager to appease uneasy investors after the financial crisis.

                         Addressing the controversy surrounding high-frequency trading strategies, Kamarei pointed out
                         that high-frequency trading isn’t always profitable. “High-frequency traders make money
                         through spread capture,” he noted. “They optimize adverse selection to match rebates. More
                         volatility increases spreads.”

                         In April, the SEC unanimously approved a new proposal that would track transactions by
                         high-frequency trading firms to improve oversight of their activity. Under the new rule,
                         firms will be given unique identifiers and will be required to report next-day transaction
                         data when requested by regulators. This will allow authorities to keep closer tabs on
                         traders that aren’t registered market makers or broker-dealers without having to follow
                         lengthy audit trails from exchanges when they scrutinize a particular firm or trade.

                                                   “High-frequency trading provides U.S. markets with
                                                   better prices and deeper liquidity than markets in any
                                                   other country or region.”
                                                                                 —Arzhang Kamarei, Tradeworx

                         Next-day access to trading data also could assist investigators in finding manipulative, abusive
                         or otherwise illegal trading activity. The SEC estimates that the rule will apply to the largest
                         400 market participants—firms or individuals whose transactions in exchange-listed securities
                         equal or exceed 2 million shares or $20 million during any calendar day, or 20 million shares
                         or $200 million during any calendar month.
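                         Those thresholds are concrete enough to encode directly. A hypothetical check (the function
                         and its fields are invented for illustration) of whether a participant meets the
                         large-trader definition described above:

                         # Thresholds from the SEC rule described above: 2M shares or
                         # $20M in any calendar day, or 20M shares or $200M in any
                         # calendar month. Interface is hypothetical.
                         def is_large_trader(day_shares, day_dollars,
                                             month_shares, month_dollars):
                             return (day_shares >= 2_000_000
                                     or day_dollars >= 20_000_000
                                     or month_shares >= 20_000_000
                                     or month_dollars >= 200_000_000)

                         print(is_large_trader(2_500_000, 15e6, 9e6, 9e7))     # True
                         print(is_large_trader(100_000, 1.5e6, 1.2e6, 1.8e7))  # False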

                         Regulators also have been scrutinizing flash orders, which let traders briefly expose their
                         orders to others in the market, and “naked access,” which allows firms to buy and sell
                         stocks on exchanges using a broker’s computer code without regulators knowing who is
                         making the trades.

                         Also under fire from regulators is the strategy of colocating at or near exchange data
                         centers, which authorities say gives high-frequency trading firms an unfair advantage over
                         slower traders.


                         Spurring Competition
                         Kamarei told attendees, however, that colocation isn’t unfair to long-term investors, as it helps
                         to create competition among high-frequency traders. Further, both high-frequency traders and
                         long-term investors can colocate, he noted.

                         Meanwhile, Kamarei argued that any attempts to change the market structure will fail to
                         reverse technology advances. Instead, costs will come down for less advanced users, which in
                         turn will drive further adoption of high-frequency trading technology, he said.

                         As for the future of high-speed trading, Kamarei suggested that sell-side broker-dealers will be
                         the main force for spreading the use of high-frequency trading to all market participants.

                         He predicted that high-frequency trading volumes will stay at their current levels on a
                         volatility-adjusted basis, but many high-frequency trading desks will go out of business,
                         even as high-frequency technology grows more ubiquitous.

                         “New high-frequency trading firms will process more dimensional and complex data,”
                         Kamarei said.

                         Meanwhile, he noted, “The fascination with colocation will decrease as the technology
                         becomes commonplace.”




                         How Low Can You Go?
                         Even as firms explore bleeding-edge technologies to lower latency, Lime Brokerage
                         President and CEO Jeffrey Wecker says there’s still plenty of latency that can be
                         eliminated with conventional products and a simple understanding of the nature of latency.

                         By Ivy Schmerken

                         To survive on Wall Street, firms believe they need speed, according to Lime Brokerage
                         president and CEO Jeffrey Wecker, who delivered the closing keynote address at Wall Street
                         & Technology’s Accelerating Wall Street conference in May.

                         Wecker insisted that anyone willing to make the appropriate investments could achieve
                         minimum latency using conventional technology. But with low-latency trading already
                         measured in microseconds, he predicted it would be difficult to reduce latency further.

                         “We use just about every trick available from conventional technology and have squeezed
                         latency out,” said Wecker. Lime Brokerage is an agency brokerage firm that provides
                         high-throughput, low-latency technologies to high-frequency traders and other proprietary
                         trading shops, as well as more traditional buy-side firms, primarily as a managed service.

                                                   “The latency game is nearly over for data delivery, order
                                                   management and matching logic.”
                                                                                —Jeffrey Wecker, Lime Brokerage

                         “Short of addressing greater volumes of data, the top performers have a good handle on what
                         they need to do to reduce latency,” Wecker continued. “Software architecture is reaching
                         canonical perfection in the order management process.”

                         Without more research breakthroughs from computer science, he added, reducing latency
                         further will become more and more difficult.




                         Miles to Go to Eliminate Latency
                         Nonetheless, Wecker acknowledged, latency still exists—from third-party suppliers because of
                         physical distances, in hardware and network equipment, and in software.

                         Physical distance can add latency depending on “how many miles of wires or fiber connect
                         your systems to the matching engine,” Wecker explained. On the network equipment side, he
                         added, latency stems from the number of pieces of equipment and network hops, including
                         the network interface and any switches through which an order must pass before it reaches
                         the market.
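                         The “miles of wires” point is straightforward physics: light in fiber travels at about
                         two-thirds of its vacuum speed, roughly 5 microseconds per kilometer one way, before any
                         switch, NIC or software delay is added. A back-of-the-envelope calculator with arbitrary
                         example distances:

                         # Propagation delay in fiber: roughly 200,000 km/s in glass,
                         # i.e. about 5 us per km one way.
                         US_PER_KM = 5.0

                         def fiber_rtt_us(miles):
                             km = miles * 1.609
                             return 2 * km * US_PER_KM   # out and back

                         for miles in (0.1, 5.0, 50.0):
                             print(f"{miles:5.1f} mi: {fiber_rtt_us(miles):7.1f} us round trip")

                         On these numbers, the “rack five miles away” equivalent Wecker describes later in this
                         article corresponds to roughly 80 microseconds of round-trip delay.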

                         “You can lose 10 microseconds in the authentication before you get to the match,” Wecker said.

                         But, he continued, “The greater sources of latency do not come from the search for better
                         hardware solutions. I still believe the greatest amount of latency is in code architecture
                         and code restructuring.”

                         According to Wecker, software can cause latency from the client side, from the
                         broker-dealer validation and from the TCP/IP stack. But, he said, firms can work with
                         providers to restructure their software code.

                         Wecker cited “asynchronous parallel code” as a programming method for reducing latency, and
                         said he’s spoken with firms that are making the leap to graphics processing units (GPUs) to
                         parallelize their algorithms. But Wecker cautioned firms against going out on “the bleeding
                         edge” with hardware acceleration tools such as field-programmable gate arrays (FPGAs).

                         “This path is potentially full of a lot of pitfalls,” he told the audience. “Suboptimal bus rates
                         have derailed a lot of firms attempting to deploy GPUs.”
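                         The “asynchronous parallel code” Wecker cites is, at bottom, a scheduling idea: run
                         independent pre-trade steps concurrently so the slowest step, rather than the sum of all
                         steps, sets the latency. A minimal sketch, with invented checks and durations (event-loop
                         timers are far coarser than real microsecond paths):

                         import asyncio

                         # Two independent pre-trade checks with invented costs. Run
                         # sequentially they cost ~130 us; overlapped, ~80 us.
                         async def validate(order):
                             await asyncio.sleep(50e-6)   # stand-in 50 us check
                             return order["qty"] > 0

                         async def risk_check(order):
                             await asyncio.sleep(80e-6)   # stand-in 80 us check
                             return order["qty"] * order["px"] < 1e6

                         async def pre_trade(order):
                             ok1, ok2 = await asyncio.gather(validate(order),
                                                             risk_check(order))
                             return ok1 and ok2

                         print(asyncio.run(pre_trade({"qty": 100, "px": 25.10})))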

                         Wecker also noted that firms are exploring application-specific integrated circuits
                         (ASICs)—integrated circuits customized for a particular use rather than a general-purpose
                         use—and fabrication technology. But again, he suggested caution. “Going down that path is
                         very foolish and an expensive venture if you don’t have the experience with software
                         architecture,” he said.




                         Comparing Apples With Oranges
                         But even if firms deploy these technologies successfully, Wecker emphasized, it’s essential to
                         monitor latency, and to sift through all the marketing hype around the term. “There’s evidence
                         of poor latency monitoring and reporting by different firms,” he said, noting that many vendors
                         offer “apples and oranges comparisons.”

                         Though there are industry standards for measuring latency, Wecker said, hype surrounds the word.
                         “Frankly, I think the industry can do a lot better to be intellectually honest about latency,” he said.

                         While customers understand this, Wecker added, they still can get confused. For instance,
                         though many customers are colocated at exchange data centers, “They haven’t made the full
                         investment in either infrastructure, staff or monitoring technology to know if they’re getting the
                         best from the exchange,” he insisted.

                         In one case, Lime found that the client’s “net latency introduced by switches, telecom
                         providers and software architecture had the equivalent of putting their rack five miles
                         away from the market center,” Wecker revealed. “Colocation has been sold to many firms as
                         the be-all, end-all; it’s not. It doesn’t substitute for understanding the contributors to
                         latency. I would argue that just being in the same data center as the exchange doesn’t
                         help by itself.”

                         While there is a benefit to colocation and the edge it brings as compared with locating in
                         another data center, Wecker explained, the benefit could be negated by intraprocess latency—
                         meaning delays in processing from inside the boxes and applications, including the match
                         interaction and message acknowledgement. To truly minimize latency, a firm must isolate the
                         causes of latency and break out the measurements, Wecker advised. “You have a right to ask
                         questions about the methodologies used in report statistics,” he said.

                         In addition, latency must be examined in context. There is a difference between reporting
                         latency for a single message during normal market activity and reporting latency during peak
                         throughput. “Look at the metric itself,” said Wecker, urging firms to question whether the
                         number is a mean or median value, for an individual message, or for peak throughput.
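                         The distinction is easy to demonstrate: the mean, median and maximum of the same sample
                         tell very different stories once peak-throughput bursts enter the data. With made-up
                         numbers:

                         import statistics

                         # Mostly quiet-market messages near 150 us, plus a burst of
                         # peak-throughput messages near 1 ms. All values invented.
                         quiet = [148, 151, 153, 150, 149, 152, 147, 154] * 100
                         burst = [880, 910, 950, 1020] * 10
                         sample = quiet + burst

                         print(f"median {statistics.median(sample):6.1f} us")  # ~150; hides the burst
                         print(f"mean   {statistics.fmean(sample):6.1f} us")   # pulled up by the burst
                         print(f"max    {max(sample):6d} us")                  # the peak-throughput story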

                         “The nature of scientific latency requires that you approach it in a disciplined way, and that
                         means understanding the full environment,” Wecker said. “That sways you one way or another.”




                         Silicon: The Next Tool for Low Latency
                         GPUs and FPGAs can help lower latency, but they often are difficult to program or
                         change, experts caution.

                         By Melanie Rodier

                         Firms are relying on the newest and fastest processors to power their cutting-edge trading
                         infrastructures. For the most innovative firms, the low-latency work is taking place in
                         silicon, with the use of graphics cards (GPUs) and FPGAs—reprogrammable silicon chips that
                         give trading firms ultra-low-latency methods for identifying and responding to specific
                         information from data feeds.

                         According to panelists at Wall Street & Technology’s Accelerating Wall Street conference in May,
                         hardware acceleration can help shave off microseconds, and soon nanoseconds, in latency.
                         FPGA and GPU technology continues to gain momentum.

                         The benefits are clear to engineers: FPGAs are parallel in nature, so different processing
                         operations do not have to compete for the same resources. A GPU can relieve the
                         microprocessor of 2-D or even 3-D graphics rendering and is commonly used in embedded
                         systems, mobile phones and game consoles. Like FPGAs, its highly parallel structure makes
                         it effective for a range of complex algorithms.

                         But before they rush to implement these new toys, financial institutions still need to evaluate
                         the costs and benefits of GPUs and FPGAs carefully, according to experts who participated in a
                         panel discussion, “Silicon: The Next Tool for Low Latency,” at WS&T’s Accelerating Wall Street
                         conference. “You have to look at the cost of new technologies,” said Andrew Athan, managing
                         member, Athan Capital Holdings, “and ask yourself what is it you’re trying to gain.”

                         Not all financial institutions will actually gain a trading advantage by shaving a few
                         microseconds off their execution times, Athan suggested. “Will every order I send be
                         matched ahead of a competitor? I am not sure that for every millisecond I gain, I will
                         gain a matched order,” he related. “When we looked at the relative benefits, it didn’t yet
                         make sense for us.”




                         Panelist Ryan Eavy, associate director, enterprise architecture, CME Group, agreed that FPGAs
                         and GPUs aren’t right for everyone. “You can squeeze milliseconds out of an application—that’s
                         huge,” he said. “But microseconds? It depends if it’s worth it.”

                         Even if firms do decide to adopt hardware acceleration technologies, there are some limitations
                         they must overcome, the panelists noted. For example, “The hardware can be difficult to debug
                         and to troubleshoot,” according to Peter Krey, founder and president, Krey Associates.

                         Finding a good team of application developers who understand both the technology behind
                         hardware acceleration as well as the financial industry’s requirements presents its own set of
                         challenges, the panelists concurred.

                         “The industry is very dynamic, with all the regulations, changes in protocols, algorithms,
                         etc.,” said Athan Capital’s Athan, who is based in San Diego. Being in Southern California
                         has allowed Athan to dip into a large talent pool filled with developers who normally build
                         iPods and cell phones, he noted.

                                                   “You have to look at the cost of new technologies and ask
                                                   yourself, what is it you’re trying to gain for yourself?”
                                                                         —Andrew Athan, Athan Capital Holdings

                         Others are looking beyond the private sector for qualified technologists. With military
                         contractors now using FPGAs for aerospace and defense systems, Andy Bach, SVP, technology,
                         NYSE Euronext, said the exchange has been hiring experienced developers from the defense
                         industry.

                         But if everyone is leveraging the same hardware, won’t this level the playing field?

                         “If you can only compete by buying hardware, you’ll do it,” said Athan. “But if you can
                         compete by hiring smarter people, or building better algorithms, you’ll do that, too.”




                         What’s the Best Low-Latency Tool? Try People
                         While faster chips and 10 gigabit Ethernet connections help, having the right developers
                         and engineers is the key to winning the low-latency race.

                         By Greg MacSweeney

                         Most financial firms equate being the fastest with having the latest technology. But panelists at
                         May’s Accelerating Wall Street conference agreed that the best tool to reduce latency is not a
                         faster server or FPGA; it’s the right people.

                         While the latest servers and networks certainly help in the low-latency race, the topic of hiring
                         the right people came up again and again during the “What’s in Your Low-Latency Toolbox”
                         session at WS&T’s recent conference. “If you look into the entire value chain of where things
                         are slowed during the trade [process], you realize you need to have the right developers and
                         the right engineers,” said panelist Steven Sadoff, EVP and CIO at Knight Capital Group. “The
                         biggest reductions in latency come when we optimize our software.”

                         At Nasdaq OMX, simplifying software has been tremendously beneficial, CTO Mats Andersson
                         said. To do that, he explained, not only must IT and the business be on the same page, but
                         the networking engineers also have to know what the other technology developers are doing.
                         “When we bring our developers and networking people together, we see great results,”
                         Andersson asserted.

                         But having “simple” software is easier said than done. “Our goal is to be as simple as possible,”
                         noted Scott Ignall, CTO at Lightspeed Financial. “We want our systems to run straight and
                         fast.”

                         But in order to do that, developers and engineers need to know how the markets operate and
                         understand the business goals, he added. Essentially, they need a financial market IQ, as well
                         as technical acumen, Ignall said.

                         “It’s very hard to find the right people, the ones who know technology and finance,” he said. “We
                         are a small company and we move quickly. We don’t have time to teach people about the industry.”




                         Panelist Michael Mollemans, head of electronic execution sales at Daiwa Capital Markets
                         America, agreed. “There is no substitute for financial experience,” he commented. “When I am
                         looking [for technologists], I stay within my circle of industry contacts because they know me
                         and they know the business. They won’t recommend someone who isn’t right.”

                         Aside from having the best and brightest working on your systems, the panelists also said
                         that while colocation provides a noticeable improvement, 10 gigabit Ethernet connections
                         also provide large latency improvements. “Today, it’s pretty safe to say that 10 gigabit
                         Ethernet is required in this race,” said Nasdaq’s Andersson. “It is preferred by our
                         customers, and 10 gigabit Ethernet is definitely taking off.”

                         Daiwa’s Mollemans added that 10 gigabit Ethernet is definitely a better option than
                         InfiniBand.

                                                           “Our goal is to be as simple as possible. We want our
                                                           systems to run straight and fast.”
                                                                                                       —Scott Ignall, Lightspeed Financial




                         Finally, scalability to handle spikes in market data is a prerequisite in this market, added
                         Lightspeed’s Ignall. “If you don’t have scalability and uptime during the high watermark for
                         market data, it really doesn’t matter,” he said. “Market data is still the challenge. We test and
                         benchmark constantly.”
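                         Ignall’s “high watermark” test reduces to a capacity question: during a burst, does the
                         handler’s sustained rate exceed the arrival rate, or does a backlog build while every
                         queued message goes stale? A toy model with invented rates:

                         # If arrivals outpace processing during a burst, backlog grows
                         # and queued market data ages. All rates are invented.
                         def backlog(arrive_per_s, serve_per_s, burst_s):
                             return max(0.0, (arrive_per_s - serve_per_s) * burst_s)

                         capacity = 300_000                    # msgs/s sustained
                         for rate in (50_000, 400_000):        # quiet vs. peak feed
                             q = backlog(rate, capacity, 2.0)
                             print(f"{rate:7,} msgs/s for 2s: {q:9,.0f} queued "
                                   f"({q / capacity:.2f}s to drain)")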




                         Data Center Costs, Oversight Challenge the Sell Side
                         Financial services firms spent $1.8 billion on data centers in 2009, and TABB Group
                         expects 2010 spending to go even higher.

                         By Justin Grant

                         Data center oversight and network capacity are the biggest infrastructure-related
                         challenges facing U.S. sell-side equity firms this year, even as spending on the segment
                         continues to rise, TABB Group analyst Kevin McPartland said during his presentation at
                         Wall Street & Technology’s annual Accelerating Wall Street conference in May.

                         Although costs remain a key concern, U.S. equity firms ramped up their investment in data
                         centers within the past year, according to the recent TABB report “U.S. Equity Technology
                         2010: The Sell-Side Perspective,” which noted that the larger players each support nearly five
                         data centers on average.

                         “Clearly, the sell side loves its data centers,” McPartland told attendees. “There’s a lot
                         of horsepower that has to sit behind these equity businesses in the U.S. … It’s getting
                         more and more complex to manage the infrastructure.”

                         And more costly. Equity firms spent $1.8 billion last year on data centers, with half of that total
                         coming from sell-side shops, according to the TABB Group report, which predicts the sell side’s
                         use of data center space will increase slightly in 2010.

                         “There’s a race here to try to compete,” McPartland continued. “Despite the cost-consciousness,
                         spending is still high, with the sell side spending the most.” In general, these sell-side shops are
                         pursuing a technology-driven agenda with an eye on lower latency, sleeker infrastructure and
                         shrewder IT investment in the wake of slow budget growth, according to TABB Group.

                         The report, which was based on conversations with high-ranking executives at 24 sell-side
                         firms, found that proximity hosting has become prominent among all the major
                         broker-dealers, McPartland revealed. “There’s an old mentality where we still need to be
                         close to our equipment,” he said.




                         Still, the vast majority of U.S. hedge funds are not yet colocated, with 76 percent opting to
                         look to their brokers for infrastructure rather than buying it themselves. But even for hedge
                         funds that do not have an ultra-low-latency trading strategy, location does matter. Proximity
                         also is beneficial to smaller hedge funds, which historically have kept their servers in-house,
                         the report said.

                         Sell-side firms, meanwhile, also are aiming to boost their network capabilities. Demand for
                         bandwidth is projected to soar this year on the strength of rising trading volumes, which will
                         result in more data being pumped out by exchanges.

                         McPartland said improved use of bandwidth will be crucial for brokers going forward, since
                         growing data rates and the costs of managing data may slice into margins. This is helping
                         to spark a rush toward server upgrades as well, with most sell-side firms expected to opt
                         for Hewlett-Packard and Intel servers, according to TABB Group, which noted that the sell
                         side already has spent $113.5 million this year on network servers.

                                                   “Clearly, the sell side loves its data centers. There’s a
                                                   lot of horsepower that has to sit behind these equity
                                                   businesses.”
                                                                                —Kevin McPartland, TABB Group

                         “Connectivity is getting cheaper, but the price tag is still high,” McPartland said, while also
                         pointing out that the large firms are increasingly opting for hardware acceleration. “For the
                         bulge bracket, everybody’s either using it or is looking into using it. Smaller firms are priced
                         out—it’s too costly to buy and maintain.”

                         The report also noted that while virtualization at sell-side shops is growing, a completely
                         virtualized and utilized infrastructure is still a long way off for U.S. equities firms.
                         “Even as virtualization improves,” said McPartland, “there will still be some latency.”




                         Firms Still Not Analyzing Unstructured Real-Time News
                         Real-time news isn’t reliable enough to include in automated trading strategies, say
                         industry executives.

                         By Greg MacSweeney

                         Although the ability to analyze news events in real time has been promoted as the next step in
                         the evolution of automated trading, panelists at Wall Street & Technology’s recent Accelerating
                         Wall Street conference are doubtful the capability can provide an edge.

                         “Two years ago, only 2 percent of the market was testing trading strategies with real-time
                         news,” said Adam Honore, research director at analyst firm Aite Group, during the panel
                         session “Using High-Performance Databases & CEP in an Automated Trading Strategy.” “Today,
                         two-thirds of the market is at least testing real-time news. The problem, however, is
                         real-time news is not very reliable.”

                         Panelist Robert Almgren, cofounder of Quantitative Brokers, an agency-only brokerage that
                         provides best execution algorithms and transactional cost measurement for fixed income
                         markets, also said he hasn’t seen much value in analyzing news in an automated strategy.
                         “I have always been a pessimist about news, and I would rather use a quantitative method,
                         such as looking at historical data or economic indicators,” he revealed.

                         “The problem with news,” Almgren continued, “is humans don’t even know how to interpret it.
                         So how could a computer?” For example, he noted, how do you know the news story is
                         accurate, or if you are really the first person to see it?

                         Andrew Haines, CIO at GAIN Capital Group, also has been hesitant to deploy the real-time
                         news analysis capability in a trading strategy. “It’s hard to come up with a defensible trading
                         strategy based on unstructured news,” he told attendees.

                         Haines also decided not to try to analyze Twitter messages for trade ideas. “StreamBase,
                         our CEP [complex event processing] provider, has the ability to analyze Twitter, but we are
                         not using it,” he commented. GAIN Capital, however, is looking at real-time news
                         and Twitter for enhancing risk management, he added, but hasn’t deployed either.

Instead of using CEP to analyze news, Haines noted, GAIN Capital is using the technology on
its foreign exchange ECN. “We selected our vendor CEP tool last year,” he reported. “With a
vendor-provided tool, we can focus on business functionality rather than building the
technology. We have bolted the CEP tool onto our ECN with great success.”
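
The article doesn’t show what such a rule looks like in a vendor’s own language, but the core
idea of CEP (stateful pattern logic evaluated inline on every incoming event, rather than a
query run against a database after the fact) can be sketched in generic Python. The Quote
fields, window length and spread threshold below are hypothetical, for illustration only; this
is not StreamBase’s API or GAIN Capital’s logic:

from collections import deque
from dataclasses import dataclass

@dataclass
class Quote:
    symbol: str   # hypothetical ECN feed fields, for illustration
    bid: float
    ask: float
    ts: float     # event time, in epoch seconds

class SpreadWidenRule:
    # Toy CEP-style rule: keep a short sliding window of spreads and
    # raise an alert when the average spread over that window exceeds
    # a threshold. A CEP engine evaluates this kind of stateful logic
    # inline, on every event, as the quote stream arrives.
    def __init__(self, window_sec=1.0, max_avg_spread=0.0003):
        self.window_sec = window_sec
        self.max_avg_spread = max_avg_spread
        self.window = deque()  # (ts, spread) pairs currently in the window

    def on_quote(self, q):
        self.window.append((q.ts, q.ask - q.bid))
        # Expire quotes that have fallen out of the time window.
        while self.window and q.ts - self.window[0][0] > self.window_sec:
            self.window.popleft()
        avg = sum(s for _, s in self.window) / len(self.window)
        if avg > self.max_avg_spread:
            return "ALERT %s: avg spread %.5f over last %.1fs" % (
                q.symbol, avg, self.window_sec)
        return None

# Drive the rule from a simulated quote stream.
rule = SpreadWidenRule()
for q in [Quote("EUR/USD", 1.2000, 1.2002, 0.0),
          Quote("EUR/USD", 1.1999, 1.2004, 0.5),
          Quote("EUR/USD", 1.1995, 1.2003, 0.9)]:
    alert = rule.on_quote(q)
    if alert:
        print(alert)

The appeal of a vendor CEP tool, as Haines describes it, is that the firm writes only this
sort of rule logic while the engine handles the event plumbing underneath.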

                         But even without adding real-time news or Twitter messages to an automated trading strategy,
                         all the panelists agreed that managing the sheer volume of traditional data is a challenge. “The
                         data volumes are so huge,” said Quantitative Brokers’ Almgren. “And if you want to go back
                         and add historical data to a calculation, the problem with processing and analyzing all the data
                         is tremendous.”
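
The panelists didn’t prescribe a fix for this, but one standard way to keep a live calculation
from re-scanning its entire history on every update is to maintain the statistic
incrementally. Below is a minimal sketch in plain Python using Welford’s online algorithm as a
generic stand-in for whatever measure a strategy actually tracks; it does not reflect
Quantitative Brokers’ systems:

class RunningStats:
    # Welford's online algorithm: update mean and variance in O(1) per
    # tick, so the calculation never re-reads data it has already seen.
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations from the mean

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def variance(self):
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

stats = RunningStats()
for price in [100.0, 100.5, 99.8, 101.2]:
    stats.update(price)
print(stats.mean, stats.variance())  # one pass, constant memory

Incremental updates only help with the live stream, of course; backfilling years of historical
ticks into a new calculation remains the batch-processing burden Almgren describes.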



