Monday, October 21, 2013

DataVis and Public Communication

Data Visualization, or simply DataVis, has seen a huge increase in popularity as the web has moved from 2.0 to 3.0. Data visualization can be a powerful tool for communicating complex ideas to a broad audience, something that is often difficult in financial econometrics.
Today I came across a good example of a data visualization that compactly expresses an idea in financial economics (and econometrics, since this is all about measurement). Of course, there are a number of obvious important caveats to this depiction:

  • These results depend heavily on the choice of subsample, and the past 10 years happen to have been relatively good for passive strategies.  This is not true of all 10-year periods (e.g. those ending in November 2008).
  • Hedge funds may not achieve the same average level of growth, but they may offer diversification, reduced risk or exposure to exotic \(\beta\).  
  • Warren Buffett is only one manager.
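
The subsample sensitivity in the first caveat can be checked directly by comparing cumulative returns over every rolling 10-year window rather than a single one. The sketch below uses purely synthetic monthly return series (the `passive` and `active` names and their parameters are illustrative assumptions, not real fund data):

```python
import numpy as np

def rolling_horizon_returns(returns, window):
    """Cumulative simple return over every rolling window of the given length."""
    log_growth = np.log1p(returns)          # work in logs so sums compound correctly
    cum = np.insert(np.cumsum(log_growth), 0, 0.0)
    return np.expm1(cum[window:] - cum[:-window])

# Synthetic monthly returns for illustration only (not real fund data).
rng = np.random.default_rng(0)
passive = rng.normal(0.006, 0.04, 240)      # 20 years of monthly returns
active = rng.normal(0.005, 0.03, 240)

window = 120                                # 10-year windows
frac_passive_wins = np.mean(
    rolling_horizon_returns(passive, window)
    > rolling_horizon_returns(active, window)
)
```

Looking at the full distribution of `frac_passive_wins` across windows, rather than the single most recent decade, is what guards against the subsample choice driving the conclusion.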


Friday, October 18, 2013

Between Fama and Shiller

This year's Nobel Memorial Prize in Economic Sciences was awarded on Monday to Eugene F. Fama, Lars Peter Hansen and Robert J. Shiller.

Most of the popular press has focused on the obvious dichotomy between the seminal contributions of Fama and Shiller, and has concluded that Shiller was right.

Markets are Gray

This year’s prize has been more controversial than most.  A substantial amount of the criticism of this year's Nobel has centered on the differences in the original contributions of Fama and Shiller.  Justin Wolfers succinctly summarized this difference as:

… financial markets are efficient (Fama), except when they’re not (Shiller), …

One article from The New Yorker was particularly dismissive of efficient markets.

The black-and-white view adopted in the mainstream press is too simplistic for understanding market efficiency.  It is useful to consider a substantially grayer definition of market efficiency, first advanced by another Nobel laureate, Clive W. J. Granger, in a paper with Allan Timmermann.  This extended definition adds two important dimensions to the definition of weak-form efficiency.

The first is the horizon, \(h\).  Actual arbitrage capital operates on frequencies ranging from microseconds to quarters, so it is essential to consider the time scale when asking whether prices are weak-form efficient.
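
One standard way to make the horizon explicit in a weak-form test is a variance ratio, which compares the variance of \(h\)-period returns to \(h\) times the one-period variance; under a random walk the ratio is approximately one at every horizon. A minimal sketch, in the spirit of the Lo-MacKinlay statistic (the function name and interface are my own):

```python
import numpy as np

def variance_ratio(returns, h):
    """Variance ratio at horizon h: approximately 1 under a random walk.

    Uses overlapping h-period returns; values far from 1 suggest
    horizon-h predictability (mean reversion below 1, momentum above 1).
    """
    r = np.asarray(returns, dtype=float)
    rh = np.convolve(r, np.ones(h), mode="valid")   # overlapping h-period sums
    return rh.var(ddof=1) / (h * r.var(ddof=1))
```

Computing `variance_ratio(r, h)` across a grid of horizons, from intraday to quarterly, operationalizes the idea that efficiency is a statement indexed by \(h\) rather than a single yes/no property.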

The second extension is technology, which can be thought of as a combination of actual physical technology, for example the existence of Twitter, and the understanding of econometric and statistical techniques relevant for capturing arbitrage opportunities.   Technology is constantly evolving, and existing arbitrage opportunities disappear as understanding of the risk/reward trade-off evolves.  This has clearly occurred at the shortest horizons, where high-frequency trading has evolved from simple strategies trading the same asset on different markets (e.g. IBM in New York and Toronto) to complex strategies trading hundreds of assets to eliminate arbitrage between futures, ETFs and the underlying components of an index.  Similarly, recent advances allow real-time sentiment analysis constructed from Twitter feeds to be used to detect price trends.
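
The simple cross-listing strategy mentioned above (the same stock trading in New York and Toronto) reduces to comparing the two prices in a common currency and acting only when the gap exceeds trading costs. A minimal sketch, with all names, prices and the cost threshold purely illustrative:

```python
def cross_listing_signal(price_nyse_usd, price_tsx_cad, usd_per_cad, cost):
    """Flag an apparent cross-listing arbitrage when the price gap exceeds costs.

    Illustrative only: a real strategy must also handle latency, fees,
    FX execution, short-sale constraints and the risk the gap widens.
    """
    price_tsx_usd = price_tsx_cad * usd_per_cad
    gap = price_nyse_usd - price_tsx_usd
    if abs(gap) <= cost:
        return None                      # gap too small to exploit after costs
    # Buy on the cheaper venue, sell on the dearer one.
    return "buy_tsx_sell_nyse" if gap > 0 else "buy_nyse_sell_tsx"
```

The point of the technology dimension is that once strategies this simple are widely automated, the gaps they target shrink below `cost` almost instantly, pushing arbitrage capital toward the more complex multi-asset strategies described above.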

Understanding the Risk

In addition to both horizon and technology, it is essential to understand the risk of these price trends.  There is a growing list of examples where strategies that consistently generated profits for years experienced sharp reversals.  In some cases, these reversals were so sharp that a decade or more of accumulated profit was eliminated in a couple of months.  This was the case for simple momentum strategies in 2002 and for statistical arbitrage in August 2007. This type of extremely skewed risk-return relationship substantially complicates the econometric analysis of market efficiency.
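
The "years of steady gains, then a crash" profile is exactly what sample skewness is designed to detect. The sketch below builds a deliberately artificial return series with that shape (all parameters are invented for illustration) and computes its skewness:

```python
import numpy as np

def sample_skewness(returns):
    """Standardized third moment; large negative values indicate crash risk."""
    r = np.asarray(returns, dtype=float)
    z = (r - r.mean()) / r.std(ddof=0)
    return (z ** 3).mean()

# Artificial series: ~10 years of small steady monthly gains, then one crash
# that wipes out most of the accumulated profit. Parameters are illustrative.
rng = np.random.default_rng(1)
gains = rng.normal(0.01, 0.005, 119)
crash = np.array([-0.60])
strategy = np.concatenate([gains, crash])
```

The econometric difficulty described above follows directly: with returns this skewed, sample means estimated over periods that happen to exclude the crash wildly overstate the strategy's true risk-adjusted performance.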

The Grossman-Stiglitz paradox states that the absence of arbitrage requires arbitrage.  This contradiction, combined with a more nuanced view of efficient markets, leads to the relevant question:

Under what conditions are markets efficient?

Wednesday, October 16, 2013

The challenges of high-frequency data

It has been nearly 20 years since the publication of some of the most influential research on trade and quote data.  The past seven years have seen an almost unbelievable growth in the number of quotes and a large increase in the number of transactions on major exchanges.

This video shows 10 seconds of trading in BlackBerry Ltd on October 2.  This flurry of activity was attributed (ex post) to rumors of a second private-equity suitor.   The flying objects are both trades (circles) and quotes (triangles) generated by participants in one particular market, which are then transmitted to the other 10 exchanges pictured.  The NBBO is pictured at the 6 o’clock position.

With a span of 10 seconds, I would suspect that most of the limit orders were completely computer generated. It is also clear that orders are being placed so quickly on different exchanges that the traditional practice of trade signing using the Lee & Ready algorithm cannot be relied upon.
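
For readers unfamiliar with it, the Lee & Ready (1991) classification compares each trade price to the prevailing quote midpoint, falling back to a tick test when the trade occurs exactly at the midpoint. A minimal sketch (the function signature is my own; real implementations must also lag the quotes, which is precisely the step that breaks down at these speeds):

```python
def lee_ready_sign(trade_price, bid, ask, prev_prices):
    """Classify a trade as buyer-initiated (+1) or seller-initiated (-1).

    Quote test first; at the midpoint, fall back to the tick test using
    prev_prices (earlier trade prices, most recent last).
    """
    mid = 0.5 * (bid + ask)
    if trade_price > mid:
        return 1                         # above midpoint: buyer-initiated
    if trade_price < mid:
        return -1                        # below midpoint: seller-initiated
    # At the midpoint: tick test against the last different trade price.
    for p in reversed(prev_prices):
        if p != trade_price:
            return 1 if trade_price > p else -1
    return 0                             # no prior price change to compare against
```

The algorithm presumes the quote prevailing when the order was submitted can be identified; when quotes across 11 venues update within microseconds of the trade, as in the video, that matching is no longer reliable.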

The video was produced by Nanex, a specialist low-latency data provider.