Financial Services Players Capture Value Via Reference Data Sharing

Standardization and data sharing have historically been strengths of the financial services industry. The networked ATM was an early win, and it remains one of the clearest examples of how data sharing and network effects add value. Computerized trading standards and software (e.g., the Bloomberg trading platform) are another. When it comes to reference data, however, firms have historically built, maintained, and managed their own security and client master databases in isolation from other market participants.

In their classic book, Information Rules, Carl Shapiro and Hal Varian describe how standards and network externalities can add significant value to consumers and organizations (Shapiro and Varian, 1999). First, they argue that standards enhance compatibility, or interoperability, generating greater value for users by making the network larger (e.g., HTTP for internet transport, or VHS for videotape in the ’80s). Second, because data can be shared more easily, more consumers are attracted and market share expands; firms can then begin to capitalize on the associated network effects, or externalities. Shapiro and Varian also describe how a lack of standards can destroy value. Their example is the Great Baltimore Fire of 1904, when firefighters from neighboring cities arrived to help fight the fire but could not, because their hoses did not fit Baltimore’s hydrants. The lesson applies directly to financial services and the technology that underpins it.

Yet despite this history of successful standardization, reference data remains fragmented. The number of data silos across lines of business has grown dramatically, even though most of these data platforms are similar in style and content. They are maintained through a combination of automated feeds from external data providers, complex custom applications, and manual data entry. Problems arise because firms must continue to support aging legacy platforms and disparate, highly decentralized data stores.

Because of these historical and structural issues, many firms continue to struggle to manage their reference data. They are challenged by new securities and growing asset class volumes; expensive manual data cleansing and poor data management drive high aggregate costs; and firms often hold duplicate contracts with the same data vendors. Managing multiple securities masters, multiple repositories, and different asset classes complicates the situation further.
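To make the cost concrete, here is a minimal sketch, in Python, of the reconciliation work that two isolated security masters force on a firm. The record fields and the matching key (ISIN) are assumptions for illustration, not any firm’s actual schema; the point is that every attribute two silos describe differently becomes a conflict someone must resolve.

```python
from dataclasses import dataclass

# Hypothetical, minimal security-master record; real masters carry
# hundreds of attributes per instrument.
@dataclass(frozen=True)
class SecurityRecord:
    isin: str         # assumed shared identifier across both masters
    ticker: str
    issuer: str
    asset_class: str

def find_conflicts(master_a: dict[str, SecurityRecord],
                   master_b: dict[str, SecurityRecord]) -> list[tuple[SecurityRecord, SecurityRecord]]:
    """Pairs of records that share an ISIN but disagree on other attributes."""
    return [(rec_a, master_b[isin])
            for isin, rec_a in master_a.items()
            if isin in master_b and master_b[isin] != rec_a]

# The same instrument, described two ways by two different silos:
a = {"US0378331005": SecurityRecord("US0378331005", "AAPL", "Apple Inc.", "Equity")}
b = {"US0378331005": SecurityRecord("US0378331005", "AAPL UW", "Apple Inc", "Equity")}
print(find_conflicts(a, b))  # one conflict to resolve, by hand or by rule
```

Multiply this by thousands of instruments, several masters, and daily vendor feeds, and the manual cleansing bill described above follows directly.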

To move beyond the current situation, banking executives should look to their industry’s past successes in data standardization and data sharing and make appropriate investments in a reference data management strategy. They should also consider working across corporate boundaries to develop standards that capture the network effects and data quality improvements that come with standardized data sharing. Find your allies, think through the best way to build a consortium, and drive reference data standards. The result could reduce or share costs while adding value for your customers.
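As a rough illustration of what such a consortium standard might define, the sketch below shows a single shared record layout that each member would validate before publishing. Every field name and validation rule here is an assumption for illustration, not an existing industry schema.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical consortium record layout; fields and rules are illustrative.
@dataclass
class SharedSecurityRecord:
    isin: str          # the agreed common key
    name: str
    asset_class: str
    currency: str      # ISO 4217 code
    source_firm: str   # provenance, so consumers can weigh data quality

    def validate(self) -> None:
        # A real standard would pin down rules like these precisely.
        if len(self.isin) != 12:
            raise ValueError(f"ISIN must be 12 characters, got {self.isin!r}")
        if len(self.currency) != 3:
            raise ValueError(f"currency must be a 3-letter code, got {self.currency!r}")

record = SharedSecurityRecord("US0378331005", "Apple Inc.", "Equity", "USD", "firm-a")
record.validate()
print(json.dumps(asdict(record)))  # one wire format every participant shares
```

Agreeing on even a small core layout like this is what lets participants retire duplicate feeds and cleansing pipelines, which is where the shared-cost savings come from.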

Additional questions and considerations drawn from Shapiro and Varian’s research:

For those firms that are already in the process of building a consortium, what should you do (or avoid doing) to be successful? For example, what are the lessons learned from Betamax or HD-DVD?

For financial services companies that are already building consortiums for reference data sharing, there are a couple of key points to consider:

  • Beware of firms in the standards-setting process that, deep down, have no interest in seeing a successful standard emerge. Even allies who welcome a standard may disagree over how extensive or detailed it should be. For example, the HDTV standard was originally stalled by disagreements over scanning formats and line resolution; with DVD, the major players disagreed over the write standard even after agreeing on the read standard.
  • Understand tippy markets: when two firms compete to set a standard in a market with strong positive feedback, only one may emerge as the winner. The video recorder market of the ’80s is the classic example. Positive feedback and network effects favored VHS, producing a winner-take-all outcome in which VHS technology essentially vanquished the Beta format.

Once reference data is shared, standardized, and commoditized, what should individual players do to remain competitive? How do you become a master of the network effects? How do you differentiate your offerings within the network?

To remain competitive and differentiated even with commoditized, standardized data, individual players should still:

  • Assess their level of control over the data, the size of the installed base of customers, and how the data is used to generate value.
  • Determine their intellectual property rights and how to take advantage of additional data or aggregated data.
  • Understand their ability to innovate through data or process.
  • Analyze strengths in complementary data or products.
  • Leverage reputation and brand.

What happens when regulatory bodies, rather than the industry itself, drive the sharing and standardization? Are there pitfalls? How can regulated companies turn that change, that momentum, to their commercial advantage?

Executives in the financial services industry cannot afford to ignore the government or regulatory bodies’ roles in data sharing. Here are a few key guidelines for navigating these waters:

  • Don’t expect the government’s role to diminish. Financial services is subject to large and increasing returns; market outcomes that appear to be concentrated will continue to attract the attention of the government.
  • Every company needs to know the rules of competition. You are better off anticipating antitrust challenges, both from private parties and the government.
  • Don’t be afraid of cooperating with other companies to set reference data standards and develop new technologies, so long as your efforts are designed to bring benefits to consumers. Steer clear of antitrust issues and hot-button areas.
  • If you are fortunate enough to gain a leading share of the market from your practices or data sharing standards, be sure to audit your approach regularly for antitrust exposure.

Contributed by Jace Frey and Kevin Henderson.