Asked at the end of DataContent 2013 what he saw as the event's themes, Russell Perkins, head of InfoCommerce, answered speed and velocity: not only the "push on getting out only top quality information but doing it faster than ever. We knew it was like that in financial but to hear it on all different verticals [is a change]."
In the session Driving Dollars With Data, we saw that the financial sector is not taking the newfound speed of other verticals lying down. In fact, they’re speeding up. “You have to be at least as fast as the other guys,” said Peter Lankford, president of Technology Business Development Corp. “[You have to go] from exchange into trading applications in microseconds or less now.”
Lankford, Mark Alvarez, senior director of reference data structure for Interactive Data Corp., and Adrian Dickson, VP of content for qbeats—all former colleagues at Reuters—discussed the revved-up financial data world and its new parameters. They described a market where robots trade with robots because the information travels so fast, and where making pennies on a trade can be acceptable because of the sheer volume.
Ten years ago, Lankford said, it was all about information and analytics for the trader. Today, the keys are low-latency data, machine-readable everything, ultra-granular historical data, direct exchange access (feeds, co-location), and the software to support it all. The losers are consolidated vendor datafeeds and traditional display apps. Lastly, Lankford put up this quote from William Gibson: "The future is here. It's just not evenly distributed."
Alvarez said that the impact of globalization on capital markets is putting pressure on profit margins. Because markets have localized, things are getting more complicated, be it multiple currencies or multiple asset classes. "Everyone in the industry is having to embrace their inner statistician," he said. The answer is to impose structure on data.
"Coupling the data content with structure and metadata is vital. You want to use and reuse content." Also, support for standard interfaces and applications is not optional. All the information is real-time now, Alvarez said; some of it just updates a little more frequently. And data suppliers must shift to supporting a broader array of distribution and delivery channels, requiring significant investment in infrastructure and capability.
Dickson described qbeats’ model to the group. It’s basically a pay-per-view business of content that they aggregate from willing providers. The original publishers get 70% and qbeats gets 30%. “The prices on our stories will be put next to the headlines,” he said. “The financial services audience is already accustomed to paying for information; we think we understand the needs of our customers.”
qbeats wants to find out the value of each individual piece of content. If something is hot, they figure the price will go up. "We're the E-ZPass of the information industry," Dickson said, only half joking. "You start with a credit of $25." He said that publishers will be able to embed their logo on the web version of wherever the article is used. "We also think we have a real opportunity with the blogging community."
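As a rough illustration of the pay-per-view economics Dickson described, the 70/30 split and the prepaid credit can be sketched in a few lines. This is a hypothetical sketch only: the function names and the drawdown logic are assumptions for illustration, not qbeats' actual system.

```python
# Hypothetical sketch of the pay-per-view model described in the session:
# the original publisher keeps 70% of each sale, the aggregator keeps 30%,
# and readers draw purchases against a prepaid credit (e.g. the $25 start).
# Names and structure are illustrative, not qbeats' real implementation.

PUBLISHER_SHARE = 0.70

def split_revenue(price):
    """Divide one article sale between publisher and aggregator."""
    publisher_cut = round(price * PUBLISHER_SHARE, 2)
    aggregator_cut = round(price - publisher_cut, 2)
    return publisher_cut, aggregator_cut

def charge(balance, price):
    """Deduct an article purchase from a reader's prepaid credit."""
    if price > balance:
        raise ValueError("insufficient credit")
    return round(balance - price, 2)

# A $0.50 story: publisher gets $0.35, aggregator gets $0.15,
# and a new $25 credit drops to $24.50.
print(split_revenue(0.50))   # (0.35, 0.15)
print(charge(25.00, 0.50))   # 24.5
```

The dynamic-pricing idea ("if something is hot, the price will go up") would simply feed a demand signal into `price` before the split; the split itself stays fixed.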
Said moderator Gerry Mintz of Percepta Partners: “I think we’ll walk away from this session with a recognition of how an ignored segment or market, by moving to real-time, may find an extraordinary opportunity. These are technology-driven changes that will happen whether traditional publishers take them or not.”
Ronn Levine began his career as a reporter for The Washington Post and has won numerous writing and publications awards since. Most recently, he spent 12 years at the Newspaper Association of America covering a variety of topics before joining SIPA in 2009 as managing editor. Follow Ronn on Twitter at @SIPAOnline