How Has The Flood Of Information Changed Wall Street Since 1990?


Wall Street now admits that it increasingly gets its most precious commodity – information – from social networks such as Twitter, Reddit and Facebook. That got us thinking about the changing nature of information flow in finance and how it may be impacting markets.

Conveniently, DataTrek's Nick Colas, a former SAC portfolio manager, takes a big-picture look at just this topic in a recent note. When he started covering stocks in 1991 at Credit Suisse, he writes, there was no Internet, no smartphones, no "Big Data", no quarterly earnings conference calls, and no real regulation around how companies disseminated potentially market-moving information. All of those things exist today, and according to Colas, the fact that the world's financial decision-makers are flooded with instant (and constant) information may well explain part of why US stocks trade at such premiums relative to prior cycles. But, as Colas also notes, more information can make investors overconfident.

Below we excerpt from the DataTrek founder’s latest note about the changing nature of information as it relates to the investment process over the last 30 years.

Too Much Information, by Nicholas Colas

We’ll start in late 1991 when these words first came out of a CFO’s mouth: “We should do a conference call after the quarter.” The speaker was Jerry York, then Chrysler’s chief financial officer. The company had just done a “save the firm” equity issuance to fund production of the then-new Grand Cherokee.

He felt that the institutional buyers of that deal should hear directly from the management team right after Q4 earnings were made public. They had taken a big risk buying Chrysler, which at the time was essentially insolvent. Keeping the lines of communication open with this group of investors was important. After all, the company might need to tap them again if the US economy didn’t continue to rebound.

I was at that first call, which was a hybrid in-person/teleconference held at the old Sky Club on top of what was then the Pan Am building in New York City. Some big investors in the deal traveled to New York to attend, and others dialed in. It did what Jerry wanted. Investors got to ask their questions directly and also hear management’s take on the business.

As effective as that form of shareholder communication was, quarterly earnings conference calls caught on only slowly through the 1990s. For many years, analysts more commonly waited for earnings reports to come through on PR Newswire. We would print those out on a dot-matrix printer, call the company's CFO or investor relations person, and then wait for a call back to ask our questions about the numbers. Sometimes the call came the same day, sometimes the next. And if the company didn't like you, it simply never came.

Other differences between 1991 and now, as far as the investment process goes:

  • No Internet back then, at least as far as its utility to Wall Street goes. No Google, no Wikipedia, no "Big Data".
  • No smartphones. If you were on the road and wanted a price quote or the latest news, you called your trading desk.
  • No email – analysts’ reports were printed and mailed/messengered to clients.
  • No Fed press conferences after FOMC meetings. Only Fed Chair Alan Greenspan spoke on policy, and infrequently at that.
  • No regulations requiring analysts to share their views with all clients at once.
  • No regulations requiring that companies disseminate market-moving information broadly. Most just used their favorite Wall Street analysts to update investors on earnings guidance.

I think about all these differences every time I look at a 1990–present history of the CBOE VIX Index. Has more, and more widely available, information made US stocks less volatile? In theory, it should: volatility is, first and foremost, a function of how much relevant fundamental information is embedded in stock prices.

Here’s that VIX history back to 1990, which shows that the period from 2012 to 2019 did see generally lower volatility than the prior two economic up-cycles. There were other forces at work, certainly: a long expansion makes for more predictable corporate earnings, which should make for lower equity price volatility. But a VIX that reliably traded below 19 (its long-run average) for the better part of a decade is still notable. The truly "different" thing about this period versus previous ones is the change in the quantity and speed of information flow.
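For readers who want to sanity-check that sub-19 claim themselves, here is a minimal sketch of our own (not part of the DataTrek note); it assumes the third-party `yfinance` package and the availability of Yahoo Finance's "^VIX" ticker:

```python
# Illustrative only: compare the 2012-2019 VIX with its long-run average.
# Assumes the `yfinance` package is installed and "^VIX" data is available.
import yfinance as yf

# Daily VIX closes back to 1990; .squeeze() coerces the result to a Series.
vix = yf.download("^VIX", start="1990-01-01", progress=False)["Close"].squeeze()

long_run_avg = vix.mean()
calm = vix.loc["2012":"2019"]

print(f"Long-run average since 1990: {long_run_avg:.1f}")
print(f"2012-2019 average:           {calm.mean():.1f}")
print(f"2012-2019 days below the long-run average: {(calm < long_run_avg).mean():.0%}")
```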

What’s also striking about that chart is that volatility shocks (which always bring lower asset prices) still routinely occur despite the much greater amount of information available to markets and investors. Chalk that up to human nature. Prospect Theory says humans “feel/fear” loss about twice as much as equivalent gains. That asymmetry explains the old trader’s saying that “the market takes the stairs up, but the elevator down” when an unexpected event occurs.
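That roughly two-to-one asymmetry has a standard formalization. The sketch below is our addition, not Colas's: it evaluates the Tversky-Kahneman (1992) prospect-theory value function using their published parameter estimates.

```python
# Illustrative only: the Tversky-Kahneman (1992) prospect-theory value
# function with their published estimates (alpha = beta = 0.88, lam = 2.25).
# It formalizes the "losses loom about twice as large as gains" asymmetry.
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Subjective value of a gain (x >= 0) or loss (x < 0); lam is loss aversion."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** beta

print(prospect_value(100))   # ~57.5: the subjective "feel" of a $100 gain
print(prospect_value(-100))  # ~-129.5: the same-size loss hurts ~2.25x as much
```

That loss-aversion coefficient is what gives the market its stairs-up, elevator-down shape: the same dollar move is felt far more strongly on the way down.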

Now, what does all this mean for current US equity market dynamics? Three thoughts:

  1. Everything else equal, more complete information about company/macro fundamentals should make for higher equity valuations now relative to prior cycles. It’s hard to prove statistically that this is the case, but it makes intuitive sense to me.
  2. More information now should also allow markets to reset more quickly after a shock than prior cycles. Imagine if we’d had the Pandemic Recession in pre-Internet 1990 instead of 2020. Would investors have as much confidence in a global economic recovery if they couldn’t see it forming through data from Google Trends, smartphone-enabled mobility tracking, and other 21st century sources of data? I doubt it.
  3. Greater levels of available information can, however, lead to investor overconfidence.

We’ll close out with a cautionary tale about “too much information” that relates to that last point:

  • Back in the 1970s, US researchers ran a study with eight professional horse-racing handicappers as their subjects.
  • They had the subjects list all the horse-specific datapoints they found most useful in predicting the outcome of a race, ranked from most to least important.
  • The handicappers received their top 10 data choices for the horses in an upcoming race and were asked to predict the winners.
  • For the next race, they saw their top 20 choices and made predictions based on that now-larger base of information.
  • Finally, they got their top 40 choices for relevant predictive data and forecast the outcome of the last race.

The surprising finding: while the handicappers’ confidence about their predictions increased with larger amounts of information, their accuracy in picking winners did not.
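To see how that can happen mechanically, here is a toy simulation of our own (not the original 1970s study): a simple classifier is given ever more features, only a handful of which carry signal. In a typical run, its average stated confidence creeps up as features are added while out-of-sample accuracy does not.

```python
# Illustrative only: a toy version of the "more data, more confidence,
# no more accuracy" result. Only the first 5 features carry signal; the
# rest are noise, yet the model's stated confidence tends to rise anyway.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_data(n, n_features, n_signal=5):
    X = rng.normal(size=(n, n_features))
    y = (X[:, :n_signal].sum(axis=1) + rng.normal(size=n) > 0).astype(int)
    return X, y

for n_features in (10, 20, 40):          # mirrors the 10/20/40 datapoint steps
    X_tr, y_tr = make_data(200, n_features)
    X_te, y_te = make_data(2000, n_features)
    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    confidence = model.predict_proba(X_te).max(axis=1).mean()
    accuracy = model.score(X_te, y_te)
    print(f"{n_features:>2} features: accuracy {accuracy:.2f}, "
          f"mean stated confidence {confidence:.2f}")
```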

The lesson, profoundly relevant to investing: use the wealth of information available in a 21st century world with caution. More is not always better.

Tyler Durden
Sat, 07/03/2021 – 18:30

