# Price Impact in Efficient Markets

Market prices generally respond to an increase in supply or demand. This phenomenon, called “price impact,” is of central importance in financial markets. Price impact provides feedback between supply and demand, an essential component of the price discovery mechanism. Price impact also accounts for the vast majority of large traders’ execution costs — costs which regulators may seek to reduce by tweaking market structure.

Price impact is a concave function of meta-order [1] size — approximately proportional to the square-root of meta-order size — across every well-measured financial market (e.g. European and U.S. equities, futures, and bitcoin). There are some nice models that help explain this universality, most of which require fine-grained assumptions about market dynamics. [2] But perhaps various financial markets, regardless of their idiosyncrasies, share emergent properties that could explain empirical impact data. In this post, I try to predict price impact using only conjectures about a market’s large-scale statistical properties. In particular, we can translate intuitive market principles into integral equations. Some principles, based on efficiency arguments, imply systems of equations that behave like real markets.

In part I, we’ll start with the simplest principles, which we’ll only assume to hold on average: the “fair pricing condition”, and that market prices efficiently anticipate the total quantity of a meta-order based on its quantity already-executed. In part II, we’ll replace the fair pricing condition with an assumption that traders use price targets successfully, on average. In part III, we’ll return to fair pricing, but remove some efficiency from meta-order anticipation — by assuming that execution information percolates slowly into the marketplace. In part IV, we’ll emulate front-running, by doing the opposite of part III: leaking meta-orders’ short-term execution plans into the market. In parts V and VI, we’ll discuss adding the notion of urgency into meta-orders.

# Definitions and Information-Driven Impact

We can motivate price impact from a supply and demand perspective. During the execution of a large buyer’s meta-order, her order flow usually changes the balance between supply and demand, inducing prices to rise by an amount called the “temporary price impact.” After the buyer is finished, she keeps her newly-acquired assets off the market, until she decides to sell. This semi-permanent reduction in supply causes the price to settle at a new level, which is higher than the asset’s initial price by an amount called the “permanent price impact.” Changes in available inventory cause permanent impact, and changes in flow (as well as inventory) cause temporary impact. [3]

Another view is that informed trading causes permanent impact, and that uncertainty about informedness causes temporary impact. When a trader submits a meta-order, its permanent impact should correspond in some fashion to her information. And its temporary impact should correspond to the market-estimate of permanent impact. In an “efficient” market, the informational view and the supply/demand view should be equivalent.

Before we proceed, we need some more definitions. Define $\alpha(q)$ as the typical permanent price impact associated with a meta-order of quantity $q$. By “typical”, I mean that $\alpha(q)$ is the expectation value of permanent impacts, $\alpha_{s}(q)$, associated with a situation, $s$, in the set of all possible situations and meta-orders, $S$. Specifically, $\alpha(q) = \mathbf{E}_{s \in S}[\alpha_{s}(q)]$. It’s reasonable to associate the colloquial term “alpha” — which describes how well a given trade ($s$) predicts the price — with $\alpha_{s}$.

Also define $\mathcal{I}(q)$ as the typical temporary price impact after a quantity $q$ has been executed. Again, “typical” means $\mathcal{I}(q) = \mathbf{E}_{s \in S}[\mathcal{I}_{s}(q)]$.

These expectations can be passed through the integrals discussed below, so we don’t need to pay attention to them. In the rest of this post, “permanent impact” will refer to the expectation $\alpha(q)=\mathbf{E}_{s \in S}[\alpha_{s}(q)]$ unless otherwise specified (and likewise for “temporary impact” and $\mathcal{I}(q)$).

# I. A Bare-Bones Model

Starting from two assumptions of market efficiency, we can determine the typical price-trajectory of meta-orders. The two conditions are:

I.1) The “fair pricing condition,” which equates traders’ alpha with their execution costs from market-impact (on average):

$\alpha(q) = \frac{1}{q} \int_{0}^{q} \mathcal{I}(q') dq'$

###### The integral denotes the quantity-averaged temporary impact “paid” over the course of an entire meta-order. “Fair pricing” means that, in aggregate, meta-orders of a given size do not earn excess returns or below-benchmark returns.

Temporary price impact (black line) over the course of a meta-order of size $q$. After the execution is finished, the price impact decays (dashed line) to $\alpha(q)$ (red), the quantity-weighted average of the meta-order’s temporary impact trajectory.

I.2) Efficient linkage between temporary and permanent price impact:

$\mathcal{I}(q') = \mathbf{E}_{q}[\alpha(q)|q \geq q'] = \int_{q'}^{\infty} \alpha(q)p[q|q \geq q']dq$

###### A. A trader is buying a lot of IBM stock and has so far bought a million shares. B. The rest of the market sees signs (like higher price and volume) of that purchase and knows roughly that somebody has bought a million shares. C. Once a trader has bought a million shares, there is a 50% chance that she’ll buy 5 million in total, and a 50% chance that she’ll buy 10 million. “The market” knows these probabilities. D. For 5 million share meta-orders, the typical permanent price impact is 1%, and for 10 million share meta-orders it’s 2%. So “the market” expects our trader’s meta-order to have permanent impact of 1.5%. The *typical* temporary impact is determined by this expectation value. This particular meta-order may have temporary impact smaller or larger than 1.5%, but meta-orders sent under similar circumstances will have temporary impact of 1.5% on average.

An illustration of this linkage. The temporary price impact trajectory is the black line. At a given value of $q'$, $\mathcal{I}(q')$ (blue) is equal to the expected value (blue) of the permanent price impact given that the meta-order has size $q'$ or bigger. The probability density of the final meta-order size, $p[q|q \geq q']$, is shown in shaded red. The permanent impact associated with those meta-order sizes is shown in green.
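The caption’s arithmetic can be scripted directly; everything here uses the hypothetical numbers from the illustration above, not empirical values:

```python
# Conditional size distribution: after 1M shares executed, the meta-order's
# total size is 5M or 10M shares with equal probability (hypothetical numbers)
size_probs = {5_000_000: 0.5, 10_000_000: 0.5}
perm_impact = {5_000_000: 0.01, 10_000_000: 0.02}   # alpha(q) for each size

# I.2): typical temporary impact = expected permanent impact given what's executed
temp_impact = sum(p * perm_impact[q] for q, p in size_probs.items())
# -> 1.5%, as in the caption
```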

# Relationship with Efficiency

The fair pricing condition could emerge when the capital management industry is sufficiently competitive. If a money manager uses a trading strategy that’s profitable after impact costs, other managers could copy it and make money. The strategy would continue to attract additional capital, until impact expenses balanced its alpha. (Some managers are protective of their methods, but most strategies probably get replicated eventually.) If a strategy ever became overused, and impact expenses overwhelmed its alpha, then managers would probably scale back or see clients pull their money due to poor performance. Of course these processes take time, so some strategies will earn excess returns post-impact and some strategies may underperform — fair pricing would hold so long as they average out to a wash.

A strictly stronger condition than I.2) should hold in a market where meta-orders are assigned an anonymous ID, and every trade is instantly reported to the public with its meta-order ID disclosed. Farmer, Gerig, Lillo, and Waelbroeck call a similar market structure the “colored print” model. Under this disclosure regime, if intermediary profits are zero, the expected alpha would determine the temporary impact path of individual meta-orders, not just the average $\mathcal{I}(q')$ as in I.2). All meta-orders would have the same impact path: $\mathcal{I}_{s}(q') = \mathbf{E}_{q}[\alpha(q)|q \geq q'] = \int_{q'}^{\infty} \alpha(q)p[q|q \geq q']dq$ for any $s$. [6] Now, the colored print model doesn’t seem very realistic; most markets don’t have anywhere near that level of transparency. Nonetheless, Farmer et al. show preliminary measurements that partly support it. [7]

Even without colored prints, the linkage property I.2) could be due to momentum and mean-reversion traders competing away their profits. As discussed by Bouchaud, Farmer, and Lillo, most price movement is probably caused by changes in supply and demand. That is, if prices move on increased volume, it’s likely that someone activated a large meta-order, especially if there hasn’t been any news. So, if average impact overshot I.2) significantly, a mean-reversion trader could plausibly watch for these signs and profitably trade opposite large meta-orders. Likewise, if average impact undershot I.2), momentum traders might profit by following the price trend.

# Solving the System of Equations

We can combine I.1) and I.2) to get an ODE [8]:

$\alpha''(q) + (\frac{2}{q} - \frac{p[q]}{1-P[q]})\alpha'(q) = 0$

This ODE lets us compute $\alpha(q)$ and $\mathcal{I}(q)$ for a given meta-order size distribution, $p[q]$.

It’s common to approximate $p[q]$ as a $Pareto[q_{min},\beta]$ distribution ($p[q] = \frac{\beta q_{min}^{\beta}}{q^{\beta+1}}$). If we do so, then $\frac{p[q]}{1-P[q]} = \frac{\beta}{q}$, and the ODE has solution $\alpha(q) = c_1 q^{\beta-1}+c_2$. Equation I.1) implies $\mathcal{I}(q) = \alpha(q)+q\alpha'(q)$, so we have that $\mathcal{I}(q) = c_1 \beta q^{\beta-1} + c_2$. Impact should nearly vanish for small $q$, so we can say that $c_2 \approx 0$. The post-execution decay in impact is then given by $\frac{\alpha(q)}{\mathcal{I}(q)} = \frac{1}{\beta}$.

If we choose $\beta = \frac{3}{2}$ (roughly in-line with empirical data), we get the familiar square-root law: $\mathcal{I}(q) \propto \sqrt{q}$. We also get an impact ratio of $\frac{\alpha(q)}{\mathcal{I}(q)} = \frac{2}{3}$, very close to real-world values.
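This solution is easy to sanity-check numerically. The sketch below (with illustrative constants $c_1=1$, $c_2=0$, $\beta=\frac{3}{2}$, and an arbitrary $q_{min}$) verifies that $\alpha(q) = q^{\beta-1}$ and $\mathcal{I}(q) = \beta q^{\beta-1}$ satisfy both I.1) and I.2) for Pareto-distributed sizes, and that the impact ratio is $\frac{1}{\beta} = \frac{2}{3}$:

```python
import numpy as np

beta, qmin = 1.5, 1e-3                    # illustrative Pareto parameters
alpha = lambda q: q**(beta - 1)           # permanent impact, c1 = 1, c2 = 0
impact = lambda q: beta * q**(beta - 1)   # temporary impact I(q) = alpha + q*alpha'

def trap(y, x):
    """Trapezoidal rule."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2)

# I.1) fair pricing: the quantity-average of I over [0, q] should equal alpha(q)
q = 0.5
s = np.linspace(1e-9, q, 200_001)
avg_paid = trap(impact(s), s) / q

# I.2) linkage: E[alpha | size >= q'] under Pareto[qmin, beta] should equal I(q')
qp = 0.02
s = np.geomspace(qp, 2_000.0, 400_001)    # truncate the Pareto tail far out
pdf = beta * qmin**beta / s**(beta + 1)
cond_alpha = trap(alpha(s) * pdf, s) / (qmin / qp)**beta

ratio = alpha(q) / impact(q)              # post-execution decay, 1/beta
```

Both checks agree to within the quadrature error, and `ratio` comes out to $\frac{2}{3}$: prices typically give back a third of their peak impact after execution.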

A similar method from Farmer, Gerig, Lillo, and Waelbroeck gives the same result. They use the fair pricing condition, but combine it with a competitive model of market microstructure. [9] Here, instead of having a specific model of a market, we’re making a broad assumption about efficiency with property I.2). There may be a large class of competitive market structures that have this efficiency property.

# Distribution of Order Sizes Implied by a Given Impact Curve

Under this model, knowing an asset’s price elasticity ($\mathcal{I}(q)$) is equivalent to knowing its equilibrium meta-order size distribution ($p[q]$). [10] If a market impact function $\mathcal{I}(q)$ is assumed, we can calculate the meta-order size distribution. [11] For instance, Zarinelli, Treccani, Farmer, and Lillo are able to better fit their dataset with an impact function of the form $\mathcal{I}(q) = a Log_{10}(1+bq)$ (p17). This impact curve implies a $p[q]$ that’s similar to a power-law, but with a slight bend such that its tail decays slower than its bulk:

Meta-order size distribution implied by the impact curve $\mathcal{I}(q) = 0.03 Log_{10}(1+470q)$, which Zarinelli, Treccani, Farmer, and Lillo fit to their dataset of single-day meta-orders. In this case, $q$ would be analogous to their chosen measure of size, the daily volume fraction $\eta$. The impact function’s fit might be invalid for very large meta-orders ($q \approx 1$), so the lack of a sharp cutoff near $q \approx 1$ in the implied size distribution isn’t problematic.
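Numerically, the inversion in [11] is straightforward: differentiating I.2) gives the hazard rate $\frac{p[q]}{1-P[q]} = \frac{\mathcal{I}'(q)}{\mathcal{I}(q)-\alpha(q)}$, with $\alpha(q)$ recovered from fair pricing I.1), and integrating the hazard rate yields the implied survival function and density. A rough sketch for the logarithmic impact curve above (the grid, cutoffs, and small-$q$ treatment are arbitrary choices):

```python
import numpy as np

a, b = 0.03, 470.0                     # constants fitted by Zarinelli et al.
q = np.geomspace(1e-5, 1.0, 20_000)
I = a * np.log10(1 + b * q)            # assumed temporary impact curve

# alpha(q) from fair pricing I.1): running quantity-average of I
dq = np.diff(q)
seg = (I[1:] + I[:-1]) / 2 * dq
cum = q[0] * I[0] / 2 + np.concatenate(([0.0], np.cumsum(seg)))  # I ~ linear below q[0]
alpha = cum / q

# hazard rate of the size distribution, from differentiating I.2)
hazard = np.gradient(I, q) / (I - alpha)

# survival function 1 - P[q] and density p[q]
surv = np.exp(-np.concatenate(([0.0],
                               np.cumsum((hazard[1:] + hazard[:-1]) / 2 * dq))))
p = hazard * surv

# local power-law exponent d(log p)/d(log q): shallower in the tail than the bulk
slope = np.gradient(np.log(p), np.log(q))
bulk = slope[np.searchsorted(q, 1e-3)]
tail = slope[np.searchsorted(q, 0.3)]
```

The local exponent is roughly $-2.9$ in the bulk and $-2.1$ near $q \approx 0.3$: a power-law-like density whose tail decays more slowly than its bulk, as described above.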

# II. A Replacement Principle for Fair Pricing: Traders’ Effective Use of Price Targets

The two integral equations in part I can be modified to accommodate other market structure principles. There’s some evidence that our markets obey the fair pricing condition, but it’s fun to consider alternatives. One possibility is that traders have price targets, and cease execution of their meta-orders when prices approach those targets. We can try replacing the fair pricing of I.1) with something that embodies this intuition:

II.1) $\alpha(q) = a\mathcal{I}(q) + d$

###### Where $a$ and $d$ are constants. This principle should be true when traders follow price-target rules, and their targets accurately predict the long-term price (on average). If $d=0$ and $a=\frac{5}{4}$, then traders typically stop executing when the price has moved $\frac{4}{5}$ of the way from its starting value to its long-term value. If $a=1$ and $d=0.01$, then traders stop executing when the price is within 1% of its long-term value.

If we keep I.2), this gives the ODE:

$\alpha'(q) + \frac{p[q](a-1)}{1 - P[q]}\alpha(q) + \frac{p[q]d}{1 - P[q]}=0$

It’s readily solved. [12] In particular, if $q \sim Pareto[q_{min},\beta]$ and $a \neq 1$ :

$\alpha(q) = c q^{\beta (1-a)}+\frac{d}{1-a}$ and $\mathcal{I}(q) = \frac{c q^{\beta (1-a)}+\frac{d}{1-a}-d}{a}$.

For typical values of $\beta \approx 1.5$, we can get the usual square-root law by setting $a \approx \frac{2}{3}$. We need $0 < a < 1$ in order for impact to be a concave, increasing function of order size, in agreement with empirical data. This suggests that perhaps traders do employ price targets, only instead of being conservative, their targets are overly aggressive. In other words, this model gives a realistic concave impact function if traders are overconfident and think their information is worth more than it is. [13] More generally, the partial reversion of impact after meta-orders’ completion could be explained with overconfidence. And when the “average” trader is overconfident just enough to balance out her alpha, the market will obey the fair pricing condition. I think there’s more to fair pricing than overconfidence, but this link between human irrationality and market efficiency is intriguing.
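As a quick numerical check (illustrative constants $c=1$, $d=0.01$, $a=\frac{2}{3}$, $\beta=\frac{3}{2}$, arbitrary $q_{min}$): the closed form above satisfies the linkage condition I.2), and with $a<1$ the temporary impact exceeds the permanent impact, so prices partially revert after execution:

```python
import numpy as np

beta, a, d, c, qmin = 1.5, 2 / 3, 0.01, 1.0, 1e-3   # illustrative constants
alpha = lambda q: c * q**(beta * (1 - a)) + d / (1 - a)
impact = lambda q: (alpha(q) - d) / a                # invert II.1): alpha = a*I + d

def trap(y, x):
    """Trapezoidal rule."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2)

# I.2): E[alpha | size >= q'] under Pareto[qmin, beta] should equal I(q')
qp = 0.05
s = np.geomspace(qp, 10_000.0, 400_001)
pdf = beta * qmin**beta / s**(beta + 1)
cond_alpha = trap(alpha(s) * pdf, s) / (qmin / qp)**beta

# beta*(1-a) = 1/2: the square-root law, with partial post-execution reversion
reversion = impact(1.0) / alpha(1.0)   # peak impact relative to where price settles
```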

# III. A Replacement Principle for Efficient Linkage, with Delayed Dissemination of Information

We can also think about alternatives for I.2). In I.2), “the market” could immediately observe the already-executed quantity of a typical meta-order. But markets don’t instantly process new information, so perhaps the market estimate of meta-orders’ already-executed quantity is delayed:

III.2) $\mathcal{I}\left(q'\right) = \mathbf{E}_{q}[\alpha(q)|q \geq (q'-q_d)^+] = \frac{\int_{(q'-q_d)^+}^{\infty } p[q] \alpha (q) \, dq}{1-P[(q'-q_d)^+]}$

###### This condition should be true when the market (on average) is able to observe how much quantity an anonymous trader executed in the past, when her executed quantity was $q_d$ less than it is in the present. This information can be used to estimate the distribution of her meta-order’s total size, and thus an expectation value of its final alpha. The temporary impact is set by this expectation value.

Intuitively, small meta-orders may blend in with background activity, but large ones are too conspicuous. If someone sends two 100-share orders to buy AAPL, other traders won’t know (or care) whether those orders came from one trader or two. But if a large buyer is responsible for a third of the day’s volume, other traders will notice and have a decent estimate of the buyer’s already-executed quantity, even if they don’t know whether the buyer was involved in the most recent trades on the tape. So, it’s very plausible for market participants to have a quantity-lagged, anonymized view of each other’s trading activity.
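Condition III.2) is easy to evaluate for an assumed permanent-impact curve. The sketch below reuses the part-I shape $\alpha(q) = \sqrt{q}$ purely for illustration (the self-consistent $\alpha$ has to come from the delay differential equation below), and shows that temporary impact stays flat until the executed quantity exceeds $q_d$:

```python
import numpy as np

beta, qmin, qd = 1.5, 1e-3, 0.05      # illustrative Pareto parameters and delay
alpha = lambda q: np.sqrt(q)          # hypothetical permanent impact (part-I shape)

def trap(y, x):
    """Trapezoidal rule."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2)

def temp_impact(qprime):
    # III.2): expectation of alpha over sizes >= (q' - qd)^+, under Pareto[qmin, beta]
    t = max(qprime - qd, qmin)        # conditioning on q >= t is vacuous below qmin
    s = np.geomspace(t, 5_000.0, 200_001)
    pdf = beta * qmin**beta / s**(beta + 1)
    return trap(alpha(s) * pdf, s) / (qmin / t)**beta

flat = [temp_impact(qp) for qp in (0.01, 0.03, 0.05)]   # q' <= qd: no signal yet
late = temp_impact(0.5)                                  # q' >> qd: market has caught up
```

For $q' \leq q_d$ the market has observed nothing, so $\mathcal{I}$ sits at the unconditional $\mathbf{E}[\alpha]$; well past $q_d$, it approaches the undelayed part-I value evaluated at $q' - q_d$.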

Combining III.2) with fair pricing I.1) gives the delay differential equation [14]:

$\begin{cases} q \alpha ''(q) + \alpha '(q) \left(2-\frac{q p[q-q_d]}{1-P[q-q_d]}\right)-\left(\alpha (q)-\alpha (q-q_d)\right)\frac{p[q-q_d]}{1-P[q-q_d]}=0, & \mbox{if } q \geq q_d \\ \mathcal{I}(q)=\alpha(q)=constant, & \mbox{if } q < q_d \end{cases}$.

We can solve it numerically [15]:

$\mathcal{I}(q)$ and $\alpha(q)$ when $q$ is Pareto-distributed, for several values of $q_d$. The general behavior for $q \gg q_d$ is similar to that of $q_d=0$, as in I.

The impact ratio $\frac{\alpha(q)}{\mathcal{I}(q)}$ for several values of $q_d$. This ratio is 1 when the price does not revert at all post-execution, and 0 when the price completely reverts.

I gather that fundamental traders don’t like it when the price reverts on them, so some may want this impact ratio to be close to 1. Delayed information dissemination helps accomplish this goal when meta-orders are smaller than what can be executed within the delay period. But traders experience bigger than usual reversions if their meta-orders are larger than $q_d$. This behavior is intuitive: if a meta-order has executed a quantity less than $q_d$, other traders will have zero information about it and can’t react. But as soon as its executed quantity reaches $q_d$, the market is made aware that somebody is working an unusually big meta-order, and so the price moves considerably.

Some bond traders are pushing for a longer delay in trade reporting. One rationale is that asset managers could execute meta-orders during the delay period, before other traders react and move the market. The idea feels superficially like condition III.2), but isn’t a perfect analogy, because counterparties still receive trade confirmations without delay. And counterparties do use this information to trade. [16] So, delaying prints may not significantly slow the percolation of traders’ information into the marketplace, it just concentrates that information into the hands of their counterparties. Counterparties might provide tighter quotes because of this informational advantage, but only if liquidity provision is sufficiently competitive. [17]

In theory, it’s possible for market structure to explicitly alter $q_d$. [18] An exchange could delay both prints and trade confirmations, while operating, on behalf of customers, execution algorithms which do not experience a delay. This was the idea behind IEX’s defunct router, which would have been able to execute aggressive orders against its hidden order book and route out the remainder before informing either counterparty about the trades. The router would’ve increased the equity market’s $q_d$ by the resting size on IEX’s hidden order book, which (I’m guessing) is very rarely above \$100k notional — an amount that doesn’t really move the needle for large fundamental traders, especially since orders larger than $q_d$ experience significant price reversion. Regardless, it’s interesting to think about more creative ways of giving exchange execution algorithms an informational advantage. The general problem with such schemes is that they are anti-competitive; brokers would have to use the advantaged exchange algos, which could command exorbitant fees and suffer from a lack of innovation. [19]

# IV. A Replacement Principle for Efficient Linkage, with Information Leakage from Sloppy Trading or Front-Running

In III., we altered condition I.2) so that market prices responded to meta-orders’ executions in a lagged fashion. We can try the same idea in reverse to see what happens if market prices adjust to meta-orders’ future executed quantity:

IV.2) $\mathcal{I}\left(q_{tot},q_{executed}\right) = \begin{cases} \mathbf{E}_{q}[\alpha(q)|q \geq q_{executed}+q_{FR}] = \frac{\int_{q_{executed}+q_{FR}}^{\infty } p[q] \alpha (q) \, dq}{1-P[q_{executed}+q_{FR}]}, & \mbox{if } q_{executed} < q_{tot}-q_{FR} \\ \alpha(q_{tot}), & \mbox{if } q_{executed} \geq q_{tot}-q_{FR} \end{cases}$

###### Where $\mathcal{I}\left(q_{tot},q_{executed}\right)$ is the temporary impact associated with a meta-order that has an already-executed quantity of $q_{executed}$ and a total quantity of $q_{tot}$. $q_{FR}$ is a constant. On average, a meta-order’s intentions are partly revealed to the market, which “knows” not only the meta-order’s already-executed quantity, but also whether it will execute an additional quantity $q_{FR}$ in the future. If a meta-order will execute, in total, less than $q_{executed}+q_{FR}$, the market knows its total quantity exactly. “The market” uses this quantity information to calculate the meta-order’s expected alpha, which determines the typical temporary impact.

This condition may be an appropriate approximation for several market structure issues:

A. The sloppy execution methods described in “Flash Boys”: If a sub-par router sends orders to multiple exchanges without timing them to splash-down simultaneously, then “the market” may effectively “know” that some of the later orders are in-flight, before they arrive. If most fundamental traders use these sloppy routing methods (as “Flash Boys” claims), then we might be able to describe the market’s behavior with a $q_{FR}$ approximately equal to the typical top-of-book depth.

B. Actual front-running: E.g., if fundamental traders split up their meta-orders into \$10M pieces, and front-running brokers handle those pieces, the market will have a $q_{FR} \approx \$10M$. Though, brokers know their customers’ identities, so they may be able to predict a customer’s permanent impact with better precision than this model allows.

C. Last look: During the last-look-period, a fundamental trader’s counterparty can wait before finalizing the trade. If the fundamental trader sends orders to other exchanges during this period, her counterparty can take those into account when deciding to complete the trade. This is similar to A., except traders can’t avoid the information leakage by synchronizing their orders.

We can examine the solutions of this version of condition 2). Combining it with the fair pricing condition I.1) gives, for meta-orders with $q_{tot}>q_{FR}$: [20]

$\alpha '(q_{tot}) \left(2-\frac{(q_{tot}-q_{FR}) p[q_{tot}]}{1-P[q_{tot}]}\right)+(q_{tot}-q_{FR}) \alpha ''(q_{tot})=0$

If $q_{tot} \sim Pareto[q_{min},\beta]$, this has solution:

$\alpha (q_{tot}) = c_1 + c_2 (q_{tot}-q_{FR})^{\beta-1} \, _2F_1(1-\beta,-\beta;2-\beta;\frac{q_{FR}}{q_{FR}-q_{tot}})$

For $q_{tot} \gg q_{FR}$: the $_2F_1(...) \approx 1$, so $\alpha (q_{tot}) \approx c_1 + c_2 q_{tot}^{\beta-1}$, which is the same behavior we saw in the base model I.
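We can spot-check this closed form. Integrating the ODE above once gives $\alpha'(q_{tot}) \propto q_{tot}^{\beta} \, (q_{tot}-q_{FR})^{-2}$, so the hypergeometric expression’s derivative should be proportional to that. A self-contained sketch, implementing $_2F_1$ by its power series (valid for $|z|<1$, i.e. $q_{tot} > 2 q_{FR}$; the constants are illustrative):

```python
beta, qF = 1.5, 1e-4        # Pareto exponent and leaked quantity (illustrative)

def hyp2f1(a, b, c, z, terms=80):
    """Gauss hypergeometric series; converges for |z| < 1."""
    s = t = 1.0
    for n in range(terms):
        t *= (a + n) * (b + n) * z / ((c + n) * (n + 1))
        s += t
    return s

def alpha(q):
    """Claimed solution with c1 = 0, c2 = 1."""
    z = qF / (qF - q)
    return (q - qF)**(beta - 1) * hyp2f1(1 - beta, -beta, 2 - beta, z)

# alpha'(q) should be proportional to q**beta / (q - qF)**2 at every q
ratios = []
for q in (3e-4, 5e-4, 1e-3):
    h = (q - qF) * 1e-5
    deriv = (alpha(q + h) - alpha(q - h)) / (2 * h)   # central difference
    ratios.append(deriv / (q**beta / (q - qF)**2))
```

The ratio is constant (and equal to $c_2(\beta-1)$), confirming the closed form on this range.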

If we look at the solution’s behavior for $q_{tot} \gtrsim q_{FR}$, the story is quite different:

Permanent Impact and Peak-Temporary Impact when $q_{tot}$ is slightly above $q_{FR} = 10^{-4}$, with constants $c_1=0$ and $c_2=1$. The temporary impact for a meta-order of size $q_{tot}$ reaches its peak just before the meta-order’s end becomes known to the market, at $q_{executed}=q_{tot}-q_{FR}$. Peak-temporary impact goes negative when $q_{tot}$ is sufficiently close to $q_{FR}$, but it’s possible to choose constants so that it stays positive (except at $q_{tot}=q_{FR}$, where it’s complex-valued). $\alpha(q_{tot})$, on the other hand, has a regular singular point at $q_{tot}=q_{FR}$ and it is not possible to choose non-trivial constants such that $\alpha(q_{tot})$ is always positive. Temporary impact is calculated numerically via equation IV.2).

Under this model, meta-orders slightly larger than $q_{FR}$ necessarily have negative long-term alpha. It’s possible that traders would adapt to this situation by never submitting meta-orders of that size, altering the Pareto-distribution of meta-order sizes so that no commonly-used $q_{tot}$ is associated with negative alpha. But, it’s also possible that some traders would continue submitting orders that lose money in expectation. Market participants have diverse priorities, and long-term alpha is not always one of them.

# V. Adding the Notion of Urgency

The model template above gets some general behavior right, but glosses over important phenomena in our markets. It makes no explicit mention of time, ignoring important factors like the urgency and execution rate of a meta-order. It’s not obvious how we could include these using only general arguments about efficiency, but we can imagine possible market principles and see where they lead.

For the sake of argument, say that every informed trading opportunity has a certain urgency, $u$, defined as the amount of time before its information’s value expires. For example, an informed trader may have a proprietary meteorological model which makes predictions 30 minutes before public forecasts are published. If her model predicts abnormal rainfall and she expects an effect on the price of wheat, she’d have 30 minutes to trade before her information becomes suddenly worthless. Of course, in real life she’d have competitors and her information would decay in value gradually over the 30 minutes, perhaps even retaining some value after it’s fully public. But let’s just assume that $u$ is a constant for a given trading opportunity and see where it leads us.

If we try following a strict analogy with the time-independent model, we might write down these equations:

V.1) A “universal-urgency fair pricing condition,” that applies to meta-orders at every level of urgency:

$\alpha(q,u) = \frac{1}{q} \int_{0}^{q} \mathcal{I}(q',u) dq'$

###### This is a much stronger statement than ordinary fair pricing. It says that market-impact expenses equal alpha, on average, for meta-orders grouped by *any* given urgency. There are good reasons to expect this to be a bad approximation of reality — e.g. high-frequency traders probably constitute most short-urgency volume [21] and have large numbers of trades to analyze, so they can successfully tune their order sizes such that their profits are maximized (and positive). Perhaps some traders with long-urgency information submit orders that are larger than the capacity of their strategies, but I doubt HFTs do.

V.2) Efficient linkage between temporary and permanent price impact:

$\mathcal{I}(q',u') = \mathbf{E}_{q,u}[\alpha(q,u)|q \geq q', u \geq u'] =\int_{u'}^{\infty}\int_{q'}^{\infty} \alpha(q,u)p[q,u|q \geq q', u \geq u']dqdu$

###### Where $p[q,u]$ is the PDF of meta-order sizes and urgencies, and $P[q,u]$ is the CDF. $p[q,u|q \geq q',u \geq u']$ is the truncated probability distribution of meta-order sizes and urgencies, $\frac{p[q,u]}{1 - P[q',\infty] - P[\infty,u'] + P[q',u']}$ — which represents the probability distribution of $q$ and $u$ given the knowledge that quantity $q'$ from the meta-order has already executed in time $u'$. This is similar to the time-independent efficient linkage condition I.2). For example, a trader splits her meta-order into chunks, executing 1,000 shares per minute starting at 9:45. If she is still trading at 10:00, “the market,” having observed her order-flow imbalance, will “know” that her meta-order is at least 15,000 shares and has an urgency of at least 15 minutes. “The market” then calculates the expected alpha of the meta-order given these two pieces of information, which determines the average temporary impact.
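The caption’s example can be made concrete with a toy joint distribution; every number below is hypothetical:

```python
# Hypothetical joint pmf over (total size in shares, urgency in minutes),
# and the typical permanent impact alpha(q, u) for each scenario
pmf = {(15_000, 15): 0.4, (30_000, 15): 0.2,
       (15_000, 60): 0.1, (30_000, 60): 0.3}
alpha = {(15_000, 15): 0.001, (30_000, 15): 0.002,
         (15_000, 60): 0.003, (30_000, 60): 0.004}

def temp_impact(q_exec, u_exec):
    # V.2): expected alpha over meta-orders with q >= q_exec and u >= u_exec
    live = {k: p for k, p in pmf.items() if k[0] >= q_exec and k[1] >= u_exec}
    return sum(p * alpha[k] for k, p in live.items()) / sum(live.values())

# 15,000 shares executed over 15 minutes: all four scenarios remain possible
early = temp_impact(15_000, 15)
# 30,000 shares over 60 minutes: only the largest, most patient scenario survives
late = temp_impact(30_000, 60)
```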

We can combine these two equations to get a rather unenticing PDE. [22] As far as I can tell, its solutions are unrealistic. [23] Most solutions have temporary price impact that barely changes with varying levels of urgency. But in the real world, temporary impact should be greater for more urgent orders. The universal-urgency fair pricing here is too strong of a constraint on trader behavior. This condition means that markets don’t discriminate based on information urgency. Its failure suggests that markets do discriminate — and that informed traders, when they specialize in a particular time-sensitivity, face either a headwind or tailwind in their profitability.

# VI. A Weaker Constraint

If we want to replace the universal-urgency of V.1) with something still compatible with ordinary fair pricing, perhaps the weakest constraint would be the following:

VI.1) $\mathbf{E}_{u|q}[\alpha(q,u)] = \mathbf{E}_{u|q}[\frac{1}{q} \int_{0}^{q} \mathcal{I}(q',u) dq']$

###### Which says that, for a given $q$, fair pricing holds on average across all $u$.

Requiring this, along with V.2), gives a large class of solutions. Many solutions have $q$-behavior similar to the time-independent model I, with $u$-behavior that looks something like this:

Stylized plot of permanent ($\alpha$) and temporary ($\mathcal{I}$) price impact vs urgency. Meta-orders of some urgencies pay more (on average) in temporary impact than they make in permanent impact, while meta-orders of other urgencies pay less than they make.

This weaker constraint leaves a great deal of flexibility in the shape of the market impact surface $\mathcal{I}(q,u)$. Some of the solutions seem reasonable, e.g. for large $u$, $\mathcal{I}$ could decay as a power of $u$. But there are plenty of unreasonable solutions too, so perhaps real markets obey a stronger form of fair pricing.

# Conclusion

Price impact has characteristics that are universal across asset classes. This universality suggests that financial markets possess emergent properties that don’t depend too strongly upon their underlying market structure. Here, we consider some possible properties and their connection with impact.

The general approach is to think about a market structure principle, and write down a corresponding equation. Some of these equations, stemming from notions of efficiency, form systems which have behavior evocative of our markets. The simple system in part I combines the “fair pricing condition” with a linkage between expected short-term and long-term price impact. It predicts both impact’s size-dependence and post-execution decay with surprising accuracy. Fair pricing appears to agree with empirical equities data. The linkage condition is also testable. And, as discussed in part III, its form may weakly depend on how much and how quickly a market disseminates data. If we measure this dependence, we might further understand the effects of price-transparency on fundamental traders, and give regulators a better toolbox to evaluate the evolution of markets.

[1] A “meta-order” refers to a collection of orders stemming from a single trading decision. For example, a trader wanting to buy 10,000 lots of crude oil might split this meta-order into 1,000 child orders of 10 lots.

There’s a good review and empirical study by Zarinelli et al. It has a brief overview of several models that can predict concave impact, including the Almgren-Chriss model, the propagator model of Bouchaud et al. and of Lillo and Farmer, the latent order book approach of Toth et al. and its extension by Donier et al., and the fair pricing and martingale approach of Farmer et al.

[3] Recall the “flow versus stock” (“stock” meaning available inventory) debate from the Fed’s Quantitative Easing programs, when people agonized over which of the two had a bigger impact on prices. E.g., Bernanke in 2013:

We do believe — although, you know, there’s room for debate — we do believe that the primary effect of our purchases is through the stock that we hold, because that stock has been withdrawn from markets, and the prices of those assets have to adjust to balance supply and demand. And we’ve taken out some of the supply, and so the prices go up, the yields go down.

For ordinary transactions, the “stock effect” is typically responsible for about two thirds of total impact (see, e.g., Figure 12). Central banks, though, are not ordinary market participants. But there are hints that their impact characteristics may not be so exceptional. Payne and Vitale studied FX interventions by the SNB. Their measurements show that the SNB’s price impact was a concave function of intervention size (Figure 2). The impact of SNB trades also appears to have partially reverted within 15-30 minutes, perhaps by about one third (Figures 1 and 2, Table 2). Though, unlike QE, these interventions were sterilised, so longer-term there shouldn’t have been much of a “stock effect” — and other participants may have known that.

[4] We can assume without loss of generality that the traders in question are buying (i.e. the meta-order sizes are positive). Sell meta-orders would have negative $q$, and the same arguments would apply, but with “$\geq$” replaced by “$\leq$“. Though, the meta-order size distribution for sell orders might not be symmetric to the distribution for buy orders (i.e. $p[q] \neq p[-q]$). Note that this model assumes that traders don’t submit sell orders when their intention is really to buy. There’s some debate over whether doing so would constitute market manipulation and I doubt it happens all that much, but that’s a discussion for another time.

[5] I’m being a little loose with words here. Say a meta-order in situation $s$ has an already-executed quantity of $q_{executed,s}$, and the market-estimate of $q_{executed,s}$ is $\hat{q}_s$. I.2) is not the same as saying that $\mathbf{E}_{s \in S}[\hat{q}_s] = \mathbf{E}_{s \in S}[q_{executed,s}]$. The market-estimate $\hat{q}_s$ could be biased and I.2) might still hold. And I.2) could be wrong even if $\hat{q}_s$ is unbiased.

[6] I’m being imprecise here. Intermediaries could differentiate some market situations from others, so we really should have: $\mathcal{I}_{s_p}(q') = \mathbf{E}_{q}[\alpha_{S_p}|q \geq q'] = \int_{q'}^{\infty} \alpha_{S_p}(q)p[q|q \geq q']dq$, where $\alpha_{S_p} = \mathbf{E}_{s_p \in S_p}[\alpha_{s_p}(q)]$ is the average alpha for possible situations $s_p$ given observable market conditions. E.g. average alpha increases when volatility doubles, and other traders know it — so they adjust their estimates of temporary impact accordingly. In this case, $S_p$ is the set of meta-orders that could be sent when volatility is doubled. For this reason, and because impact is not the only cause of price fluctuations, the stronger “colored print” constraint wouldn’t eliminate empirically measured $\mathbf{Var}_{s}[\mathcal{I}_{s}]$ — though it should dramatically reduce it.

[7] The draft presents some fascinating evidence in support of the colored print hypothesis. Using broker-tagged execution data from the LSE and an estimation method, the authors group trades into meta-orders. They then look at the marginal temporary impact of each successive child order from a given meta-order (call this meta-order $M_{1}$). In keeping with a concave impact-function, they find that $M_{1}$’s child orders have lower impact if they’re sent later in $M_{1}$’s execution. However, if another meta-order ($M_{2}$) is concurrently executing on the same side as $M_{1}$, $M_{2}$’s child orders have nearly the same temporary impact, regardless of whether they occur early or late in the execution of $M_{1}$ (p39-40). This means that “the market” is able to differentiate $M_{1}$’s executions from $M_{2}$’s!

I.2) might seem like a sensible approximation for real markets, but I’d have expected it to be pretty inaccurate when multiple large traders are simultaneously (and independently) active. There should be price movement and excess volume if two traders have bought a million shares each, but how could “the market” differentiate this situation from one where a single trader bought two million shares? It’s surprising, but the (draft) paper offers evidence that this differentiation happens. I don’t know what LSE market structure was like during the relevant period (2000-2002) — maybe it allowed information to leak — but it’s also possible that large meta-orders just aren’t very well camouflaged. A large trader’s orders might be poorly camouflaged, for example, if she has a favorite order size, or submits orders at regular time-intervals. In any case, if a meta-order is sufficiently large, its prints should effectively be “colored” — because it’s unlikely that an independent trading strategy would submit another meta-order of similar size at the same time.

[8]
A. Take a $\frac{d}{dq}$ of I.1): $\mathcal{I}(q)=q \alpha '(q)+\alpha (q)$
B. Set A. equal to the definition of $\mathcal{I}(q')$ in I.2): $q' \alpha '(q')+\alpha (q')=\frac{\int_{q'}^{\infty } p[q] \alpha (q) \, dq}{1-P[q']}$
C. Take a $\frac{d}{dq'}$ of B.: $q' \alpha ''(q')+2 \alpha '(q')=\frac{P'[q'] (\int_{q'}^{\infty } p[q] \alpha (q) \, dq)}{(1-P[q'])^2}-\frac{p[q'] \alpha (q')}{1-P[q']}$
D. Plug B. into C. to eliminate the integral: $q' \alpha ''(q')+2 \alpha '(q')=\frac{P'[q'] (q' \alpha '(q')+\alpha (q'))}{1-P[q']}-\frac{p[q'] \alpha (q')}{1-P[q']}$
E. Use $P'[q']=p[q']$: $\alpha '(q') \left(2-\frac{q' p[q']}{1-P[q']}\right)+q' \alpha ''(q')=0$
F. And for clarity, we can change variables from $q' \rightarrow q$, and divide by $q$ (since we’re not interested in the ODE when $q=0$).
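As a sanity check on this derivation, the ODE can be integrated numerically. A minimal sketch (my own, not from the post): for a Pareto size distribution with tail exponent $\beta$, the hazard rate is $h(q) = \frac{p[q]}{1-P[q]} = \frac{\beta}{q}$, so the ODE reduces to $q\alpha'' + (2-\beta)\alpha' = 0$ and the tail of $\alpha$ should grow like $q^{\beta-1}$, i.e. a square root when $\beta = \frac{3}{2}$.

```python
# Sketch (my own, not the post's code): numerically integrate the ODE from
# step F., alpha'(q) * (2 - q*h(q)) + q*alpha''(q) = 0, where
# h(q) = p[q] / (1 - P[q]) is the hazard rate of the meta-order size
# distribution. For a Pareto distribution with tail exponent beta = 3/2,
# h(q) = beta/q, and the tail of alpha should grow like q^(beta-1) = sqrt(q).
import numpy as np
from scipy.integrate import solve_ivp

beta, q_min = 1.5, 1e-7

def hazard(q):
    # Pareto hazard rate p[q] / (1 - P[q]) for q >= q_min
    return beta / q

def rhs(q, y):
    alpha, dalpha = y
    return [dalpha, -dalpha * (2.0 - q * hazard(q)) / q]

q_grid = np.geomspace(q_min, 1e-2, 200)
# initial conditions borrowed from footnote [15]'s q_d = 0 case
sol = solve_ivp(rhs, (q_min, 1e-2), [1e-5, 1e3], t_eval=q_grid,
                rtol=1e-10, atol=1e-14)

# log-log slope of alpha(q) over the last decade of the grid
tail = slice(150, None)
slope = np.polyfit(np.log(sol.t[tail]), np.log(sol.y[0][tail]), 1)[0]
```

The fitted tail slope comes out close to $1/2$, consistent with square-root impact for $\beta = \frac{3}{2}$.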

[9] There’s a helpful graphic on p20 of this presentation.

[10] This equivalence comes from ODE uniqueness and applies more generally than the model here. Latent liquidity models have a similar feature. In latent liquidity models, traders submit orders when the market approaches a price that appeals to them. In addition to their intuitive appeal, latent liquidity models predict square-root impact under a fairly wide variety of circumstances.

It’s helpful to visualize how price movements change the balance of buy and sell meta-orders. Let’s call $N_{s}(q)$ the number of meta-orders, of size $q$, live in the market at a given situation $s$ (a negative $q$ indicates a sell meta-order). When supply and demand are in balance, we have $\sum_{q=-\infty}^{\infty} qN_{s}(q) = 0$ (buy volume equals sell volume).

Say a new meta-order of size $q'$ enters the market and disrupts the equilibrium. This changes the price by $\delta_{s}(q')$, and morphs $N_{s}(q)$ into a new function $N_{s}(q, \delta_{s}(q'))$, with $\sum_{q=-\infty}^{\infty} qN_{s}(q, \delta_{s}(q')) = -q'$. I.e., a new buy meta-order will fully execute only if the right volume of new sell meta-orders appear and/or buy meta-orders disappear. Here is a stylized illustration:

Pre-impact (blue) and post-impact (orange) distributions of meta-order sizes live in the market, at an arbitrary situation $s$. Before a new buy meta-order (red) enters the market, the volume between buy and sell meta-orders is balanced. After the new meta-order begins trading, the distribution shifts to accommodate it. This shift is facilitated by a change in price, which incentivizes selling and disincentivizes buying.

By definition, $\mathcal{I}(q) = \mathbf{E}_{s \in S}[\delta_{s}(q)]$, where the expectation is over all situations when a meta-order of size $q$ might be submitted. Also by definition, $N_{s}(q)$ — if we assume that meta-orders are i.i.d. (which would preclude correlated trading behavior like herding) — is the empirical distribution function of meta-order sizes. So $N_{s}(q)$ and $p[q]$ have the same shape if there are a large number of meta-orders live.

Donier, Bonart, Mastromatteo, and Bouchaud show that a broad class of latent liquidity models predict similar impact functions. Fitting their impact function to empirical data would give a latent liquidity model’s essential parameters, which describe the equilibrium (or “stationary”) $p[q]$, as well as how it gets warped into $p[q,\delta]$ when the price changes by $\delta$.
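To make the balance condition concrete, here’s a toy numerical sketch. Every modeling choice in it (the Pareto draws, the exponential response $e^{-\kappa\delta}$, the parameter values) is my own assumption for illustration, not something from the latent liquidity literature:

```python
# Toy sketch -- all modeling choices here are my own assumptions. Live
# meta-order sizes are drawn Pareto on each side, and a price change delta
# is assumed to damp buying and stimulate selling exponentially:
#   N(q, delta) = N0(q) * exp(-kappa * delta * sign(q))
# We then solve for the delta that absorbs a new buy meta-order of size q_new.
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(0)
kappa = 5.0  # assumed sensitivity of participation to price

buys = rng.pareto(1.5, 10_000) + 1.0    # sizes of live buy meta-orders
sells = rng.pareto(1.5, 10_000) + 1.0   # sizes of live sell meta-orders
sells *= buys.sum() / sells.sum()       # start balanced: sum of q*N(q) = 0

def imbalance(delta, q_new):
    # residual of the balance condition: sum_q q*N(q, delta) = -q_new
    return (buys.sum() * np.exp(-kappa * delta)
            - sells.sum() * np.exp(kappa * delta) + q_new)

q_new = 0.01 * buys.sum()               # new buy meta-order: 1% of buy volume
delta = brentq(imbalance, 0.0, 1.0, args=(q_new,))  # price rises: delta > 0
```

In this toy, the implied $\delta$ is $\mathrm{asinh}(q'/2V)/\kappa$, nearly linear in $q'$ for small orders; the concavity that latent liquidity models predict comes from how the size distribution itself warps with $\delta$, which the fixed exponential here deliberately ignores.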

[11] From the ODE: $\frac{p[q]}{1 - P[q]} = \frac{\alpha''(q)}{\alpha'(q)} + \frac{2}{q}$. We can use I.1) to get $\alpha(q)$ from $\mathcal{I}(q)$, and thus find $p[q]$ (writing $h(q) = \frac{p[q]}{1 - P[q]}$ for the hazard rate, a continuous probability distribution satisfies $p[q] \propto h(q)\, e^{-\int h(q)\, dq}$).
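A quick numerical check of that reconstruction (my own sketch, using a Pareto distribution where everything is known in closed form):

```python
# Check (my own sketch, assuming a Pareto distribution): with hazard rate
# h(q) = p[q] / (1 - P[q]), the density should satisfy
#   p[q]  proportional to  h(q) * exp(-integral of h dq).
import numpy as np

beta, q_min = 1.5, 1.0
q = np.geomspace(q_min, 100.0, 2000)

p = beta * q_min**beta / q**(beta + 1)   # Pareto density
h = beta / q                             # its hazard rate p[q] / (1 - P[q])

# cumulative integral of h from q_min, via the trapezoid rule
H = np.concatenate([[0.0], np.cumsum(0.5 * (h[1:] + h[:-1]) * np.diff(q))])
p_rec = h * np.exp(-H)

ratio = p_rec / p   # constant ratio (~1.0 here) => proportionality holds
```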

[12] That is, if $a \neq 1$ : $\alpha(q) = \frac{d}{1-a}+K \exp \left(\int_0^q \frac{(1-a) p[q']}{1-P[q']} \, dq'\right)$. And in the case that $a=1$ : $\alpha(q) = d \int_0^q \frac{p[q']}{P[q']-1} \, dq'+K$.

[13] If fund managers knowingly let their AUM grow beyond the capacity of their strategies, then “overconfidence” might not be the right word. Then again, maybe it is. Clients presumably have confidence that their money managers will not overload their strategies.

[14]
A. Take a $\frac{d}{dq}$ of the fair pricing condition I.1): $\mathcal{I}(q)=q \alpha '(q)+\alpha (q)$
B. Set equal to III.2): $q' \alpha '\left(q'\right)+\alpha \left(q'\right)=\frac{\int_{q'-q_d}^{\infty } p[q] \alpha (q) \, dq}{1-P[q'-q_d]}$
C. Take a $\frac{d}{dq'}$ : $q' \alpha ''\left(q'\right)+2 \alpha '\left(q'\right)=\frac{P'[q'-q_d] \left(\int_{q'-q_d}^{\infty } p[q] \alpha (q) \, dq\right)}{\left(1-P[q'-q_d]\right){}^2}-\frac{p[q'-q_d] \alpha \left(q'-q_d\right)}{1-P[q'-q_d]}$
D. Substitute B. into C. to eliminate the integral: $q' \alpha ''\left(q'\right)+2 \alpha '\left(q'\right)=\frac{\left(q' \alpha '\left(q'\right)+\alpha \left(q'\right)\right) P'[q'-q_d]}{1-P[q'-q_d]}-\frac{p[q'-q_d] \alpha \left(q'-q_d\right)}{1-P[q'-q_d]}$
E. Use $P'[q'-q_d]=p[q'-q_d]$ and relabel $q' \rightarrow q$ to get $q \alpha ''(q) + \alpha '(q) \left(2-\frac{q p[q-q_d]}{1-P[q-q_d]}\right)-\left(\alpha (q)-\alpha (q-q_d)\right)\frac{p[q-q_d]}{1-P[q-q_d]}=0$

[15] The solutions were generated with the following assumptions:

$q \sim Pareto[q_{min}=10^{-7},\beta=\frac{3}{2}]$
Initial conditions for $q_d=0$ : $\alpha(q_{min})=10^{-5}, \alpha'(q_{min})=10^{3}$
Initial conditions for $q_d=10^{-5}$ : $\alpha(q_{min})=1.1 \times 10^{-4}, \alpha'(q_{min})=10^{-2}$
Initial conditions for $q_d=10^{-2}$ : $\alpha(q_{min})=1.1 \times 10^{-4}, \alpha'(q_{min})=10^{-3}$
The $q_d=0$ solution was generated from the ODE of I.1).
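For reproducibility, here is a rough sketch (my own, not the post’s code) of one way to integrate the delay-ODE of [14] with these parameters. Note that for $q < q_{min} + q_d$ we have $p[q - q_d] = 0$, so the lagged terms vanish and no history below $q_{min}$ is needed:

```python
# Rough sketch (my own): integrate the delay-ODE from footnote [14] with the
# Pareto parameters and q_d = 1e-5 initial conditions of footnote [15], using
# a naive fixed-step Euler scheme that stores the solution history so that
# alpha(q - q_d) can be looked up by interpolation.
import numpy as np

q_min, beta = 1e-7, 1.5
q_d = 1e-5
a0, da0 = 1.1e-4, 1e-2   # alpha(q_min), alpha'(q_min) from footnote [15]

def hazard(x):
    # Pareto hazard rate p[x] / (1 - P[x]); zero below q_min, where p[x] = 0
    return beta / x if x >= q_min else 0.0

n = 400_000
qs = np.linspace(q_min, 1e-2, n)
dq = qs[1] - qs[0]
alpha, dalpha = np.empty(n), np.empty(n)
alpha[0], dalpha[0] = a0, da0

for i in range(n - 1):
    q = qs[i]
    h = hazard(q - q_d)
    if h > 0.0:
        lag = np.interp(q - q_d, qs[:i + 1], alpha[:i + 1])  # alpha(q - q_d)
    else:
        lag = alpha[i]  # irrelevant: the lagged term is multiplied by h = 0
    # alpha'' from: q*a'' + a'*(2 - q*h) - (alpha(q) - alpha(q - q_d))*h = 0
    dd = (-dalpha[i] * (2.0 - q * h) + (alpha[i] - lag) * h) / q
    alpha[i + 1] = alpha[i] + dq * dalpha[i]
    dalpha[i + 1] = dalpha[i] + dq * dd
```

An adaptive solver would be more accurate through the region just above $q_{min} + q_d$, where the hazard $\frac{p[q-q_d]}{1-P[q-q_d]}$ is large; this fixed-step version just shows the mechanics.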

[16] Here’s Robin Wigglesworth on one reason bank market-makers like trade reporting delays:

These days, bank traders are loath or unable to sit on big positions due to regulatory restrictions. Even if an asset manager is willing to offload his position to a dealer at a deep discount, the price they agree will swiftly go out to the entire market through Trace, hamstringing the trader’s ability to offload it quickly. [Emphasis added]

[17] I don’t know whether bond liquidity provision is sufficiently competitive, but it has notoriously high barriers to entry.

Even for exchange-traded products, subsidizing market-makers with an informational advantage requires great care. E.g., for products that are 1-tick wide with thick order books, it’s possible that market-makers monetize most of the benefit of delayed trade reporting. On these products, market-makers may submit small orders at strategic places in the queue to receive advance warning of large trades. Matt Hurd calls these orders “canaries.” If only a handful of HFTs use canaries, a large aggressor won’t receive meaningful size-improvement, but the HFTs will have a brief window where they can advantageously trade correlated products. To be clear, canaries don’t hurt the aggressor at all (unless she simultaneously and sloppily trades these correlated products), but they don’t help much either. Here’s a hypothetical example:

1. Canary orders make up 5% of the queue for S&P 500 futures (ES).
2. A fundamental trader sweeps ES, and the canaries give her a 5% larger fill.
3. The canary traders learn about the sweep before the broader market, and use that info to trade correlated products (e.g. FX, rates, energy, cash equities).

Most likely, the fundamental trader had no interest in trading those products, so she received 5% size-improvement for free. But, if more HFTs had been using canaries, their profits would’ve been lower and maybe she could’ve received 10% size-improvement. The question is whether the number of HFTs competing over these strategies is large enough to maximize the size-improvement for our fundamental trader. You could argue that 5% size-improvement is better than zero, but delaying public market data does have costs, such as reduced certainty and wider spreads.

[18] If $q_d$ were intentionally changed by altering market structure, there’d probably be corresponding changes in the distribution of $q$ and the initial conditions. These changes could counteract the anticipated effects.

[19] A more competition-friendly version might be for exchange latency-structure to allow canaries. But the loss of transparency from delaying market data may itself be anti-competitive. E.g., if ES immediately transmitted execution reports, and delayed market data by 10ms, then market-makers would only be able to quote competing products (like SPY) when they have canary orders live in ES. Requiring traders on competing venues to also trade on your venue doesn’t sound very competition-friendly.

[20]
A. Since $\mathcal{I}$ is piecewise, split the fair pricing integral I.1) into the relevant two regions: $\alpha(q_{tot}) = \frac{1}{q_{tot}} \left( \int_0^{q_{tot}-q_{FR}} \mathcal{I}(q_{tot},q_{executed}) dq_{executed} + \int_{q_{tot}-q_{FR}}^{q_{tot}} \mathcal{I}(q_{tot},q_{executed}) dq_{executed} \right)$
B. Plugging in IV.2) to A.:
$q_{tot}\alpha(q_{tot}) = \int_0^{q_{tot}-q_{FR}} \frac{\int_{q_{executed}+q_{FR}}^{\infty } p[q] \alpha (q) \, dq}{1-P[q_{executed}+q_{FR}]} \, dq_{executed}+q_{FR} \alpha(q_{tot})$
C. Take a $\frac{\partial}{\partial q_{tot}}$ : $q_{FR} \alpha '(q_{tot})+\frac{\int_{q_{tot}}^{\infty } p[q] \alpha (q) \, dq}{1-P[q_{tot}]}=q_{tot} \alpha '(q_{tot})+\alpha (q_{tot})$
D. Take another $\frac{\partial}{\partial q_{tot}}$ : $q_{FR} \alpha ''(q_{tot})+\frac{P'[q_{tot}] (\int_{q_{tot}}^{\infty } p[q] \alpha (q) \, dq)}{(1-P[q_{tot}])^2}-\frac{p[q_{tot}] \alpha(q_{tot})}{1-P[q_{tot}]}=q_{tot} \alpha ''(q_{tot})+2 \alpha '(q_{tot})$
E. Substitute C. into D. to eliminate the integral, and use $P'[q_{tot}] = p[q_{tot}]$ : $\alpha '\left(q_{\text{tot}}\right) \left(2-\frac{\left(q_{\text{tot}}-q_{\text{FR}}\right) p[q_{\text{tot}}]}{1-P[q_{\text{tot}}]}\right)+\left(q_{\text{tot}}-q_{\text{FR}}\right) \alpha ''\left(q_{\text{tot}}\right)=0$

[21] The value of HFTs’ information will decay in a complex manner over the span of their predicted time period. An HFT might predict 30-second returns and submit orders within 100us of a change in its prediction. If that prediction maintained its value for the entire 30 seconds (becoming valueless at 31 seconds), then the HFT wouldn’t need to react so quickly. High-frequency traders, almost by definition, are characterized by having to compete for profit from their signals. From the instant they obtain their information, it starts decaying in value.

[22] Thanks to Mathematica.
$q g(q,u)^2 \alpha ^{(2,1)}(q,u) = g(q,u) \left(g^{(1,0)}(q,u) \alpha ^{(0,1)}(q,u)+q g^{(1,0)}(q,u) \alpha ^{(1,1)}(q,u) + q g^{(1,1)}(q,u) \alpha ^{(1,0)}(q,u)+g^{(0,1)}(q,u) \left(2 \alpha ^{(1,0)}(q,u)+q \alpha ^{(2,0)}(q,u)\right)+g^{(1,1)}(q,u) \alpha (q,u)\right) - 2 g^{(0,1)}(q,u) g^{(1,0)}(q,u) \left(q \alpha ^{(1,0)}(q,u)+\alpha (q,u)\right)+g(q,u)^3 p(q,u) \alpha (q,u)-2 g(q,u)^2 \alpha ^{(1,1)}(q,u)$

With $g\left(q,u\right) = \frac{1}{1 - P[q,\infty] - P[\infty,u] + P[q,u]}$

The procedure is to plug V.2) into V.1), then take two partial derivatives in $q$ and one in $u$:

A. Inserting V.2) into V.1): $\alpha \left(q,u'\right)=\frac{\int_0^q g\left(q',u'\right) \int_{u'}^{\infty} \left(\int_{q'}^{\infty} p(q'',u) \alpha (q'',u) \, dq''\right) \, du \, dq'}{q}$
B. Take a $\frac{\partial}{\partial q}$ : $\alpha ^{(1,0)}\left(q,u'\right)=\frac{g\left(q,u'\right) \int_{u'}^{\infty} \left(\int_q^{\infty} p(q'',u) \alpha (q'',u) \, dq''\right) \, du}{q}-\frac{\int_0^q g\left(q',u'\right) \int_{u'}^{\infty} \left(\int_{q'}^{\infty} p(q'',u) \alpha (q'',u) \, dq''\right) \, du \, dq'}{q^2}$
C. Substitute A. into B. to eliminate integrals where applicable: $\alpha ^{(1,0)}\left(q,u'\right)=\frac{g\left(q,u'\right) \int_{u'}^{\infty} \left(\int_q^{\infty} p(q'',u) \alpha (q'',u) \, dq''\right) \, du}{q}-\frac{\alpha \left(q,u'\right)}{q}$
D. Take another $\frac{\partial}{\partial q}$ : $\alpha ^{(2,0)}\left(q,u'\right)=\frac{g^{(1,0)}\left(q,u'\right) \int_{u'}^{\infty} \left(\int_q^{\infty} p(q'',u) \alpha (q'',u) \, dq''\right) \, du}{q}-\frac{g\left(q,u'\right) \int_{u'}^{\infty} \left(\int_q^{\infty} p(q'',u) \alpha (q'',u) \, dq''\right) \, du}{q^2}-\frac{g\left(q,u'\right) \int_{u'}^{\infty} p(q,u) \alpha (q,u) \, du}{q}+\frac{\alpha \left(q,u'\right)}{q^2}-\frac{\alpha ^{(1,0)}\left(q,u'\right)}{q}$
E. Substitute C. into D to eliminate integrals where applicable: $\alpha ^{(2,0)}\left(q,u'\right)=\frac{g^{(1,0)}\left(q,u'\right) \left(\alpha ^{(1,0)}\left(q,u'\right)+\frac{\alpha \left(q,u'\right)}{q}\right)}{g\left(q,u'\right)}-\frac{g\left(q,u'\right) \int_{u'}^{\infty} p(q,u) \alpha (q,u) \, du}{q}-\frac{2 \alpha ^{(1,0)}\left(q,u'\right)}{q}$
F. Take a $\frac{\partial}{\partial u'}$ : $\alpha ^{(2,1)}\left(q,u'\right)=\frac{-g^{(0,1)}\left(q,u'\right) \left(\int_{u'}^{\infty} p(q,u) \alpha (q,u) \, du\right)}{q}+\frac{g^{(1,1)}\left(q,u'\right) \left(\alpha ^{(1,0)}\left(q,u'\right)+\frac{\alpha \left(q,u'\right)}{q}\right)}{g\left(q,u'\right)}+\frac{g^{(1,0)}\left(q,u'\right) \left(\frac{\alpha ^{(0,1)}\left(q,u'\right)}{q}+\alpha ^{(1,1)}\left(q,u'\right)\right)}{g\left(q,u'\right)}-\frac{g^{(0,1)}\left(q,u'\right) g^{(1,0)}\left(q,u'\right) \left(\alpha ^{(1,0)}\left(q,u'\right)+\frac{\alpha \left(q,u'\right)}{q}\right)}{g\left(q,u'\right)^2}+\frac{g\left(q,u'\right) p\left(q,u'\right) \alpha \left(q,u'\right)}{q}-\frac{2 \alpha ^{(1,1)}\left(q,u'\right)}{q}$
G. To get the result, substitute E. into F. to eliminate integrals where applicable.

[23] I could be wrong, and it’s hard to define what “reasonable” solutions look like. But I checked this three ways:

1. I tried numerically solving for $\alpha$ (and thus $\mathcal{I}$) assuming various joint probability distributions $p[q,u]$ — where $q$ and $u$ are dependent and generated by functions of Weibull, Pareto, Log-Normal, or Stable random variables. I didn’t see any solutions where $\alpha$ and $\mathcal{I}$ had significant $u$ -dependence without simultaneously having some other ridiculous feature (e.g. an infinity at small $q$).
2. I tried assuming $\alpha(q,u)$ had a few reasonable forms (e.g. $\alpha(q,u) \propto q^x u^{-y}$) and solving numerically for $p[q,u]$. All the solutions I saw were not probability distributions (e.g. had negative probabilities).
3. It’s possible to solve the two integral equations directly if we assume that $p$ and $q$ are independent ($p[q,u]=p_q[q]p_u[u]$) and the solutions are separable ($\alpha(q,u)=\alpha_q(q)\alpha_u(u)$ and $\mathcal{I}(q,u)=\mathcal{I}_q(q)\mathcal{I}_u(u)$). In this case, $\mathcal{I}_q(q)$ and $\alpha_q(q)$ obey the same ODE as the original time-independent system in part I. And $\alpha_u(u)=\mathcal{I}_u(u)= constant$, which isn’t realistic.

# Pershing Square and Information Leakage on IEX

Pershing Square has filed an updated 13D disclosing that they sold 5 million shares of Valeant last week. Pershing almost surely considered this sale to be sensitive information, but I believe that their execution method was quite conspicuous.

In a recent blog post, I noted that some institutional flows may leave a signature in FINRA ATS data. IEX in particular seemed to have some enthusiastic customers, some of which were also their shareholders. [1] In addition to the FINRA data, IEX reports near-realtime volume data on their website, which could make customer flows detectable long before transactions are complete. [2] I noted that the last time Pershing Square traded Valeant common stock, IEX reported an anomalously high market share of Valeant trades that day:

[I]t may be more than coincidence that IEX’s share of VRX volume was anomalously high when Pershing Square recently bought 2 million shares.

It’s (almost) too easy to note the irony if valuable information has leaked because of Greenlight’s or Pershing Square’s support for IEX. Ackman’s paranoia about front-running features prominently in “Flash Boys.”

So, when IEX abruptly began reporting very high market share in Valeant on Dec. 24, it piqued my interest. After a few more days of persistently high volume, it seemed very likely that Pershing Square was trading the stock. On the 30th, I tweeted “I hope Pershing Sq read my post,” along with some screenshots of Valeant volume on IEX.

Now, IEX has other loyal customers — it could have been Greenlight trading Valeant, for instance. But that seemed less likely, given Pershing’s close relationship with the stock. Judging by eye [3], when IEX showed a lot of activity in Valeant, the price seemed to either drift down or stay stable (sometimes appearing “pinned”) — and, when IEX showed a pause in activity, the price tended to rise. That kind of pattern may indicate that the IEX-favoring trader was selling. [4] Short-swing profit rules also mean that Pershing Square would have been more likely to be selling than buying (see Matt Levine’s footnote #5). Altogether, the information at the time was very suggestive that Pershing Square was selling Valeant common stock. [5]

Traders have always maintained relationships with their favorite brokers and market centers. Sometimes these relationships can result in suboptimal execution quality. But hopefully, in exchange for their loyalty, traders receive other benefits. Perhaps Pershing Square has decided that furthering IEX’s “pro-investor” agenda is worth leaking their trading intentions. Or, more cynically, perhaps they’ve decided that it’s more important to support their investment in IEX. Either way, if I were Pershing Square, I’d be giving those decisions another look. [6]

[1] IEX CEO Brad Katsuyama has disclosed that some customers heavily favor IEX (at around 15m:10s):

We have some very close partners that have shifted a lot of their trading towards IEX — a third of their volume is now executed on our market.

[2] IEX doesn’t report trades or volume in their current market data protocol. Perhaps that’s an indication that IEX thinks its trade data is sensitive? It wouldn’t be hard for a machine to read the data from the website though (a tool like PhantomJS might work). That data could probably be matched with off-exchange prints on the consolidated tape to get a more precise view of IEX trades. Regardless, if and when IEX becomes an exchange, its trades will be explicitly identified on the tape.

[3] This is far from rigorous, obviously. I’d love to see an analysis of price impact and IEX volume data with large sample size.

[4] Liquidity providers would obviously know whether their counterparties on IEX had a tendency to be buying or selling Valeant in a given minute/hour/day/week. I have never used information like this to make trading decisions (nor has my company), but I believe it’d be perfectly compliant to do so. Valeant volume (single-counted) on IEX was comparable to the volume Pershing Square reported for each day. So, if Pershing sent 1/3 of their volume to IEX (see [1]), that means that their order flow probably attracted liquidity onto IEX. A lot of that may have been due to execution algos noticing the flow and choosing to interact with it. If execution algos can use information from dark pool fills to make trading decisions, then surely prop traders can too. This is probably a much more important source of information leakage than the type that IEX claims to prevent.

[5] The larger picture could have been more complex, of course. Pershing could have been selling common stock, while simultaneously buying calls or selling puts.

[6] Some crude methods to estimate the potential benefit to Pershing Square of heavily patronizing IEX:

1. Probably around 5 million extra shares traded on IEX as a result of this (likely) routing decision. At 18 mils/share, IEX would’ve made $9k extra revenue. Maybe there’s some momentum effect where, as IEX receives more volume, they attract future business. Let’s be generous then and say this volume is worth $100k to IEX and its project, and that Pershing Square believes any benefit to this project fully accrues to institutional traders like themselves.
2. Alternatively, let’s assume that “success” for IEX means achieving the same market cap as Nasdaq, $10B. 5 million shares is about 0.2% of IEX’s monthly volume of 2-3B shares. Say that Pershing Square typically trades about once per month, so that they can increase IEX’s long-term revenue by 0.2%. Again, let’s be generous and say that 0.2% of revenue increases IEX’s chance of “success” by 1%. So, Pershing Square’s loyalty could improve IEX’s expected value by as much as $10M. It’s hard for me to parse IEX’s cap table in their exchange application, but let’s guess that Pershing Square owns 5% of it. That’d mean that Pershing Square would receive an expected-value benefit of $500k from their routing favoritism.

And estimating the cost to Pershing Square:

1. Say that Pershing’s intention to decrease their Valeant stake leaked before they finished their trading. By what amount would Valeant’s stock move? I don’t know, but a conservative estimate should be at least 1%, right? Given the explosion in volume on IEX, what probability would the market assign to the possibility that Pershing was selling? My personal estimate at the time was over 50%, with a <10% chance that they were buying (neither I nor my company used this information to trade). So, perhaps the stock would (and did) drop by 0.5% because of this leak. If Pershing had $300M left to trade, this one incident would cost them $1.5M. And it’s not hard to imagine this number being several times higher.
2. Instead of Valeant, say that Pershing Square wants to trade a stock that nobody expects them to. Now, people looking at IEX’s reported volume wouldn’t have much idea of the side of the trade, or that it came from Pershing Square. But, the market-makers on IEX would probably know the side. The first hour after Pershing’s meta-order begins trading, maybe these HFTs develop an inkling (10% probability) that the meta-order is large (50% of ADV).

Let’s assume, as IEX seems to, that HFTs are ruthless and that they start moving the price in accordance with the meta-order’s expected market impact. Using the square-root law (e.g. p8), the 50% of ADV meta-order could be expected to move the price by sqrt(0.5) times the stock’s daily volatility. Say the daily volatility is 1%, so the market impact would be around 0.7%. It’s already been an hour though, so perhaps half of this market impact has already occurred. The HFTs have 10% confidence that this further 0.35% move will happen, so they move the price by about 0.035%. If Pershing had $300M left to trade at this point, that move would cost them about $100k. And maybe Pershing trades this size 5 times per year, so the routing preference could cost them $500k/yr. This is a paranoid (and rough) estimate of HFT order anticipation, but paranoia seems to be part of the IEX ethos.
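The arithmetic in that second cost scenario can be sketched directly (a back-of-envelope reproduction; every input is one of the hypothetical numbers above):

```python
# Back-of-envelope sketch reproducing the order-anticipation arithmetic
# above; every input is one of the post's hypothetical numbers.
from math import sqrt

adv_fraction = 0.5          # meta-order is 50% of average daily volume
daily_vol = 0.01            # 1% daily volatility
hft_confidence = 0.10       # HFTs assign 10% odds to the large meta-order
remaining_notional = 300e6  # $300M left to trade
trades_per_year = 5

full_impact = sqrt(adv_fraction) * daily_vol          # square-root law: ~0.7%
remaining_impact = full_impact / 2                    # half already realized
anticipatory_move = hft_confidence * remaining_impact # ~0.035%

cost_per_episode = anticipatory_move * remaining_notional  # ~$106k
annual_cost = cost_per_episode * trades_per_year           # ~$530k
```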

In any case, it seems to me that the cost to Pershing Square of favoring IEX outweighs a generously calculated benefit. But I guess a million here and there might not be a big deal to them.

# Possible Compromises for IEX

The most controversial aspect of IEX’s proposed design seems to be the non-uniform application of their speed bump. [1] Before the community invests too much time debating this issue, I want to discuss why the unfair access proposed by IEX is unnecessary. IEX could accomplish its stated goals without offering an informational advantage to its peg orders or router.

Apparently, IEX doesn’t apply the speed bump to incoming market data used for repricing algorithmic order types, or for communications between the exchange and their router. A lot of people (including me) have explained how these disparities can cause problems. In short, repricing algorithmic order types with market data that counterparties haven’t yet seen is equivalent to “latency arbitrage.” And, it feels anti-competitive for IEX to delay communications between itself and every unaffiliated router. [2][3] In this post, I’ll explain why these two exceptions to the speed bump aren’t needed to prevent “latency arbitrage” and “front-running.”

# Protecting Peg Orders Without Privileging Them

IEX wants to make it impossible for its peg orders to be “picked off” by traders that process market data faster than IEX can. But that doesn’t actually require IEX to reprice its peg orders with fully non-delayed market data. The CME is introducing functionality that timestamps client orders at the moment they are received, then processes them in the order of those timestamps. IEX could do the same, but also timestamp incoming market data. If IEX doesn’t want to subscribe to wireless data feeds, it could subtract the latency difference between wireless and fiber links from its market data timestamps. [4] Once IEX has levelized timestamps for all messages, all it needs to do is process the messages in the correct order. This would accomplish IEX’s goal of “[ensuring] that no market participants can take action on IEX in reaction to changes in market prices before IEX is aware of the same price changes.”

If re-ordering messages with software during the shoebox delay makes the delay appear more “intentional” (which violates Reg. NMS), there are analog options too. [5] IEX could introduce smaller shoeboxes for the direct feeds it processes. For example, if IEX receives market data messages from Nasdaq 200us before any trader can act on them, then it can add a delay coil of 200us to its cable from Nasdaq. And, if it receives market data from NYSE 50us before fast traders do, then it can add a 50us coil to its NYSE feed, etc.
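As a sanity check on the coil sizes involved (my own arithmetic, assuming a typical fiber refractive index of about 1.47):

```python
# Rough arithmetic sketch (my own numbers): the length of coiled fiber needed
# to implement a given delay. Light travels through fiber at about c / 1.47
# (an assumed refractive index for single-mode fiber), i.e. roughly 2e8 m/s --
# which is why IEX's ~350us shoebox amounts to tens of kilometers of coil.
C = 299_792_458   # speed of light in vacuum, m/s
N_FIBER = 1.47    # assumed refractive index of the fiber

def coil_length_km(delay_us: float) -> float:
    """Fiber length (km) needed to delay a signal by delay_us microseconds."""
    return (C / N_FIBER) * delay_us * 1e-6 / 1e3

nasdaq_coil = coil_length_km(200)  # ~41 km for the hypothetical 200us coil
nyse_coil = coil_length_km(50)     # ~10 km for the hypothetical 50us coil
```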

Either of these options would prevent IEX peg orders from being repriced in a “last look”-like manner. Here’s a stylized, bad diagram:

# Preventing Information-Leakage from IEX’s Router Without Privileging It

IEX says that it delays outgoing messages to all subscribers, except their routing broker-dealer (IEXS), “to prevent ‘information leakage’ or ‘liquidity fade’ when IEXS routes to other markets.” Their concern is that, without this asymmetric delay, market-makers could pull their quotes on other exchanges if a trader sends a large order to IEX which partially executes before being routed out. However, IEX could prevent that “front-running” [6] by locating its router outside the speed bump in Secaucus, with clients. The router could then maintain its view of exchanges’ visible order books, including IEX’s, and time the sending of its orders so that they arrive at all exchanges simultaneously.

IEX suggests that competing routers could operate in this way, so IEX should be aware that its router could do the same. [7][8] But there is a drawback. The router would only know the visible quantity posted on IEX, and wouldn’t be able to optimally interact with IEX’s hidden orders. The only way a router can fully access hidden liquidity at a given exchange is by operating sequentially: first sending an order to that exchange, waiting to hear back, then sending any unfilled balance to other markets. The whole point of hidden liquidity is that you only know it’s there after you (or others) trade with it. [9]

By allowing its router to bypass the speed bump, IEX effectively gives it exclusive access to IEX’s hidden order book information. That special access only lasts for the speed bump duration of 350us, but it still seems problematic. Lava was fined for allegedly using information from its hidden order book to help routing decisions at an affiliate. Matt Levine argues that this offense was (mostly) victimless:

What ColorBook did with the hidden orders is route its customers to those hidden orders… Once they submitted an order to buy X shares at Y price, ColorBook would send it toward the hidden orders. That’s exactly what you want when you submit a hidden order!

Which could certainly be true, though those hidden order users might not have liked interacting with flow from the ColorBook router. [10]

The argument to allow IEX to favorably treat its router is pretty much the same as Levine’s point about Lava. Such treatment, if fully disclosed, would probably improve fill rates for users of both the router and IEX hidden orders. It does, however, hurt users of non-IEX routers (and non-IEX resting orders, which miss fills). The question is whether exchanges should be permitted to help their users via any means, or whether they have to consider the broader competitive landscape. Should BATS be permitted to make routing decisions based on special access to Edge’s hidden orders? The same trade-offs apply.

Regardless, IEX is perfectly capable of operating a router immune to “front-running,” without giving it preferential access. This issue is not about “front-running,” it’s about accessing hidden liquidity. [11]

# Compromising

The rhetoric surrounding IEX has always been too hot for reasonable debate. That’s a shame. I think that there’s room for a compromise which allows IEX to accomplish its goals, while also satisfying automated traders and competitors. The “Flash Boys” would just have to admit that, sometimes, people who they hate make good points. Maybe that’s part of growing up. [12]

[1] This post, as always with IEX, is speculative. Their currently posted exchange application doesn’t have much information on the speed bump and when it applies. IEX’s comment letters provide more detail, but there are still some uncertainties in my mind as to what exactly their market model entails.

[2] IEX sort of denies this:

IEXS, the routing broker‐dealer, does not route to IEX and all orders, routable or otherwise, must pass through the POP, so there is no competitive disparity in terms of access to IEX’s trading system.

But also:

Following completion of routing actions, as instructed by the Exchange, any unfilled balance of shares returns to the Exchange for processing in accordance with applicable rules. That message does not traverse the POP

[3] IEX favorably treating its router could prompt other exchanges to create similar arrangements for their own routers, putting brokers’ smart order routers and small exchanges at a competitive disadvantage. I don’t really understand why IEX would want that to be permitted. If a larger exchange like Nasdaq were to introduce a speed bump that doesn’t apply to its router, traders would be strongly incentivized to use Nasdaq’s router, and nobody would use IEX’s. I’d think that a startup exchange would be most supportive of Reg NMS’s spirit of fair competition.

IEX’s peg order treatment could raise questions about fair competition as well. Traders and brokers may be forced to use IEX’s algorithmic order types rather than their own. Citadel expressed concern that IEX could one day “charge more to execute pegged orders… that have an inherent time advantage over other order types.” And perhaps IEX already does — by charging a higher rate for hidden orders. My understanding is that all hidden orders on IEX are effectively midpoint pegs which are repriced using non-speedbumped market data. It’s not unusual for an exchange to charge extra for hidden executions, but providing a latency advantage to hidden orders raises new questions about their fees.

[4] For example, if IEX receives a book update from Nasdaq at 10:00:00.000000 over fiber with a 1-way latency of 200us, but they know the fastest wireless link has a 1-way latency of 100us, then IEX could recalibrate the timestamp of that book update to 9:59:59.999900. That would represent the time that the fastest trader could have received the same market data message (100us earlier than IEX). There are some wrinkles when you consider that wireless links are not always operational, so if IEX were to be completely fair it would not perform this subtraction when the weather is bad. Rather than deal with that issue, it may be easier for IEX to just subscribe to wireless feeds from the most important markets. It’d probably cost a total of around $50k/mo, which doesn’t sound like a big burden.

[5] I don’t see why it should matter whether a delay has software components, but I’m also not a lawyer.

[6] Or whatever they’re calling it these days.

[7] Top of p15.

[8] Though I don’t understand the extent to which an exchange can act in a broker-like capacity. Perhaps locating their router in a different datacenter and offering functionality similar to brokers’ smart order routers (SORs) crosses some line? If so, that still seems better than their proposal to offer systematically-advantaged SOR-like functionality.

[9] IEX seems rather dismissive of sequential routing in a comment letter (p16). But sequential routing does have its advantages. Not every user wants to fully access lit quotes without regard for market impact, price improvement, or fees.

[10] This depends on many factors, including the toxicity of ColorBook routed orders. If the information sharing between the Lava ECN and the ColorBook router had been disclosed to hidden order users, they could have taken that into account before sending those hidden orders. As Levine says:

That wasn’t disclosed in its filings, or consistently disclosed in its advertising. (Though it was sometimes: This is not so much “a fact LavaFlow kept secret” as it is “a fact LavaFlow forgot to tell people.”)

How consistently has IEX disclosed any favorable access it affords its router? I don’t know, but Citadel says:

While it is not explicit in the Application, IEX has explained informally that the IEX Router would not be required to go through the IEX Access Delay to access the IEX trading system or when routing orders from the IEX trading system to other market centers.

[11] There’s also an argument that allowing the IEX router to skip the speed bump guarantees that any unfilled portion of a routable order will be first in the queue when it returns to IEX. Ignoring the issue of whether IEX should be able to offer this benefit only to clients of its router, I don’t think it’s actually true. I don’t know exactly how IEX’s router works. But if it submits orders so that they hit BATS at the same time as NYSE, it should be possible for a trader to react to the sweep on BATS and submit an order to IEX more than 350us before IEX hears back from NYSE.

[12] Matt Levine is responsible for creating the pun. I’m responsible for using it badly.

# Can We Tell Who Trades on Which Dark Pools?

Marketplace transparency ensures that investors receive a fair price and have accurate data to conduct their research. But transparency can also make it harder for traders to conceal their intentions from competitors and counterparties. Exchanges and regulators are tasked with balancing the transparency needs of a market’s customers.

Dark pools, by operating with the minimal amount of transparency permitted, are meant to help institutions hide their order flow. They do this, roughly speaking, in two ways:

1. Lack of pre-trade transparency. Orders are invisible on dark pools until they execute.
2. Reduced post-trade transparency. Dark pools are required to quickly report trades to the consolidated tape, but this process is not instant.
Subscribers to the public tape also don’t know which dark pool (or wholesaler/ELP) reported a given trade.

Market structure is always changing, and there’s a new wrinkle to #2. FINRA Rule 4552 specifies that weekly dark pool volume be published per security.* The data is made public on a 2-week delayed basis, but as we’ll see, it may still have some informational value.

# 13F Holdings Data

Regulation also requires that large asset managers report their end-of-quarter long positions, within 45 days. [1] Many hedge funds wait until the last minute to file their 13Fs, which suggests that they consider the disclosed information to be valuable.

Some hedge funds, like Greenlight Capital, publicly promote the dark pool IEX. Greenlight also owns a stake in IEX, so it may make sense for it to preferentially trade there. We can combine the 13F-reported changes in Greenlight’s long positions with the FINRA 4552 data to get an idea of whether it trades disproportionate volume on IEX. Here’s a density plot of Greenlight’s quarterly trading activity versus IEX’s:

A measure of Greenlight’s volume versus a measure of IEX’s market share, for each stock and quarter. The x-axis is $\log [c + \frac{V_{a,s}V}{V_{a}V_{s}}]$, where $c$ is a small constant ($10^{-15}$), $V$ is the total quarterly volume across all ATSs and stocks, $V_{a}$ is the total quarterly volume on the given ATS $a$ (in this case $a$ is IEX), $V_{s}$ is the quarterly volume on a given stock $s$ (across all ATSs), and $V_{a,s}$ is the quarterly volume on a given ATS (IEX) and stock. The y-axis is $\log [c_{f,0.05} + \frac{V_{f,s}}{V_{s}}]$, where $V_{f,s}$ is the trading volume for the fund $f$ (Greenlight) implied by the change in 13F-reported position for the stock $s$, and $c_{f,0.05}$ is the 5th percentile of the first quarter’s (ending 9-30-2014) 13F-implied volume for the fund $f$. Each data point is from a given quarter and stock.

It does look like something is going on here.
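The x-axis measure above is a log "lift": the ratio of an ATS's share of a stock's volume to its share of overall volume. A minimal sketch (illustrative only; the function name and inputs are my own, not from the original analysis):

```python
import math

def ats_anomalous_volume(v_as, v_a, v_s, v_total, c=1e-15):
    """Log-lift of an ATS's share of a stock's volume.

    v_as:    quarterly volume on ATS a in stock s
    v_a:     quarterly volume on ATS a across all stocks
    v_s:     quarterly volume in stock s across all ATSs
    v_total: total quarterly volume across all ATSs and stocks

    The ratio is 1 (measure ~ 0) when the ATS trades the stock in
    proportion to its overall market share; c guards against log(0).
    """
    return math.log(c + (v_as * v_total) / (v_a * v_s))

# An ATS with 10% overall share doing 10% of a stock's volume is unremarkable:
print(ats_anomalous_volume(v_as=1e6, v_a=1e8, v_s=1e7, v_total=1e9))  # ~0
# The same ATS doing twice its expected share in the stock gives ~log(2):
print(ats_anomalous_volume(v_as=2e6, v_a=1e8, v_s=1e7, v_total=1e9))
```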
The above is for the entire universe of NMS Tier 1 stocks. What if we limit it to stocks that we suspect Greenlight is more likely to trade? Here is a similar plot, restricted to stocks in which Greenlight reported long positions in their previous quarter’s 13F:

Similar to above. Includes a linear regression with shaded 95% confidence intervals.

Obviously, correlation is different from causation, but this relationship indicates that Greenlight may direct a lot of volume to IEX. IEX also reports near-realtime volume on its website, so one could potentially detect when Greenlight is currently trading a stock. Pershing Square, another backer of IEX, trades too infrequently to make a similar analysis worthwhile, but it may be more than coincidence that IEX’s share of VRX volume was anomalously high when Pershing Square recently bought 2 million shares. [2]

It’s (almost) too easy to mention the irony if valued information has leaked because of Greenlight’s or Pershing Square’s support for IEX. Ackman’s paranoia about front-running features prominently in “Flash Boys.” [3] And Greenlight sometimes has felt that even 13F disclosure harms its business. [4]

# A Broader Analysis

It seemed fun to check if any other hedge funds had easily-detected dark pool preferences. I selected the top 100 funds listed on Octafinance and attempted to query Jive Data for their 13F data for the 5 quarters leading up to June 30, 2015. I then did a Lasso regression of the relative volume of each hedge fund on the relative volume of all dark pools (these volume measures are defined in the caption of the first plot), using the first 4 quarters of data. The 5th quarter was used as test data. The regression only includes data points where the given fund was active in a stock that quarter. [5] It’s not anything fancy, but this process hopefully catches some superficial relationships like Greenlight’s with IEX. Here’s the R script used, as well as the plots and tables it outputs.
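The per-fund regression has this shape: one row per (quarter, stock) the fund traded, the fund's log relative volume as the target, and one feature per ATS. The original analysis was an R script; below is a Python sketch on synthetic data (all names and the data-generating process are made up for illustration), including the test $R^2$ computed against the training-set mean as the null prediction, as in the summary table:

```python
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n_ats = 30                                  # number of dark pools (features)
X_train = rng.normal(size=(400, n_ats))     # quarters 1-4: (quarter, stock) rows
X_test = rng.normal(size=(100, n_ats))      # quarter 5 held out as test data
beta = np.zeros(n_ats)
beta[0] = 0.5                               # pretend the fund favors ATS 0
y_train = X_train @ beta + rng.normal(scale=1.0, size=400)
y_test = X_test @ beta + rng.normal(scale=1.0, size=100)

model = LassoCV(cv=5).fit(X_train, y_train)  # L1 penalty chosen by cross-validation

# Test R^2 using the *training* mean as the null prediction:
pred = model.predict(X_test)
null = np.full_like(y_test, y_train.mean())
r2 = 1 - np.sum((y_test - pred) ** 2) / np.sum((y_test - null) ** 2)
print(f"nonzero ATS coefficients: {np.sum(model.coef_ != 0)}, test R^2: {r2:.2f}")
```

The Lasso's sparsity is what makes the coefficient tables readable: most ATS coefficients shrink to exactly zero, leaving only the venues a fund appears to seek out or avoid.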
See “lassoResultsWhenFundTraded_LogHFAnomVol_on_LogAtsAnomVol.csv” (in the second zip file above) for a summary table of the Lasso results. [6] Care is needed when assigning statistical significance to such a large number of regressions, but lots of things stick out.

Mariner Investment Group appears to be one of the more detectable funds [7], with test-set $R^{2}$ not much below 0.5.

Predicted and actual volume measures for Mariner Investment Group’s test data.

It appears as if Mariner likes to trade on Level ATS, and tends to avoid Sigma-X and the UBS dark pool. We can’t disentangle a fund’s routing decisions from other reasons for these correlations — e.g. a fund may be more likely to trade a stock if high retail participation has distorted its price, making the fund’s activity correlated with that of Interactive Brokers’ ATS (IATS), even if the fund doesn’t trade there. [8]

But there appears to be a tendency for hedge funds to route away from UBS’s ATS; Tortoise Capital Advisors is the only fund with a positive coefficient for UBS, and many have large negative coefficients. I don’t know the reason for that; it may be that hedge funds are displeased with the execution quality, or just that they’re not UBS’s clientele. If it’s the former, this analysis raises a sticky dilemma for traders who want to hide their intentions: if you don’t like a certain venue, your information leakage might rise if you avoid it. If that’s really the case, you may want to route there even if you think they do shady stuff. Sometimes, fixing market structure requires collective action, and we need regulators to effect that action on our behalf.

Some highly active funds have a surprisingly large test $R^{2}$. It’s possible that whenever you can make a confident prediction about a fund’s volume, it may turn out to be especially hard to predict the direction of that volume.
I wonder if that’s the case for Citadel Advisors (their prediction has an $R^{2}$ near 0.1), because I really would expect Citadel to be sophisticated enough to cloak their trading. Some highly active funds that appear to have more detectable flows include: Bridgewater ($R^{2}$ ~ 0.07), Millennium ($R^{2}$ ~ 0.05), Royce ($R^{2}$ ~ 0.1, which apparently likes Morgan Stanley’s ATS, and avoids JP Morgan’s), BlueMountain ($R^{2}$ ~ 0.07, possibly likes MS, and avoids UBS), Tudor ($R^{2}$ ~ 0.1, possibly avoids UBS), Carlson ($R^{2}$ ~ 0.13, which may have preferred ITG [9], and traded more volume on stocks less active on Fidelity’s and Interactive Brokers’ ATSs), and Ellington ($R^{2}$ ~ 0.2). Highbridge, Adage, D.E. Shaw, and both Two Sigma entities have very weak detectability ($R^{2}$ ~ 0.03). AQR, Renaissance, and Visium probably leak little or no volume information this way. Plenty of less active funds have sizable $R^{2}$s too.

But I do find it interesting to discuss the example where the prediction arguably fails most. The prediction for Magellan Asset Management does not do well during the test quarter:

Predicted and actual volume measures for Magellan Asset Management’s test data.

The largest component in its regression was an apparent tendency to trade on IEX. This relationship suddenly reversed in the last quarter:

Magellan’s relative volume vs anomalous IEX volume, for stocks that Magellan traded in the given quarter. A linear regression is shown with 95% confidence bands, for each quarter’s relationship.

Was Magellan formerly a big user of IEX, but started avoiding it in Q2? We can’t be sure because the tendency was found via data snooping, but it is suggestive. If so, Magellan may have taken the easiest countermeasure of all: changing their behavior.

# Unpredictable Trading

The key to avoiding this sort of leakage is to trade unpredictably, or at least, to trade in the same manner as the population norm.
Which, in my view, means that Einhorn’s reasoning described in “Flash Boys” could be almost exactly wrong:

After listening to Brad’s pitch, Einhorn asked him a simple question: Why aren’t we all just picking the same exchange? Why don’t investors organize themselves to sponsor a single stock exchange entrusted with guarding their interests and protecting them from Wall Street predators?

Block trading can be a valuable service, but its utility has a limit. To see why, say that 100 high-alpha investors agree to exclusively trade on a single venue, and public documents show that only one of them owns Micron stock. Suddenly, that venue reports an unusual volume of Micron trades. With a bit of ancillary data (perhaps news articles or observed price impact), other traders might ascertain whether that investor is reducing or adding to her position.

I imagine that this type of information leakage can occur on lit exchanges too. The major exchanges have more volume to camouflage institutional executions. But if a hedge fund were to preferentially trade at a minor exchange (or blacklist a major one), their activity may leave a signature. Investors who persistently use the same execution algorithms (or algorithmic order types) could perhaps even leak the side of their transactions. [10]

The premise of Reg. NMS is that competition between exchanges lowers costs and prevents abuses. If an upstart venue is widely seen as superior, it will rapidly attract market share. People dissatisfied with the major exchanges have yet to reach consensus on an alternative. Which means that if they unreservedly support their favorite upstart, their execution quality can suffer. That must be frustrating. It’s understandable, then, if upstarts try schemes that force participants to use their venue. NYSE has suggested that IEX’s design includes anti-competitive routing practices and peg order handling.
Unless traders disaffected by market fragmentation stop being fragmented themselves, their only way forward is to attack the fundamentals of Reg. NMS. I’m not sure it’s the answer [11], but it shouldn’t be a surprise if market critics are wistful for the days when they traded on a single, monolithic exchange.

[1] Particularly large positions have to be reported sooner. Short positions do not have to be reported in the US, though there is a movement to change this. Large short positions in European equities have to be reported quickly, and I’d be curious to see this post’s analysis repeated with the higher resolution European data.

[2] Here’s a screenshot of IEX’s most traded stocks on Oct 21, shortly after the close. A very large chunk of this volume appeared before Pershing Square announced that they had traded (though I didn’t take a screenshot). This is a good opportunity for me to remind you that nothing on this blog is trading or investment advice.

[3] From “Flash Boys” (emphasis added):

Bill Ackman runs a famous hedge fund, Pershing Square, that often buys large chunks of companies. In the two years before Katsuyama turned up in his office to explain what was happening, Ackman had started to suspect that people might be using the information about his trades to trade ahead of him. “I felt that there was a leak every time,” Ackman says. “I thought maybe it was the prime broker. It wasn’t the kind of leak that I thought.”

It never is, is it?

[4] Greenlight has also said:

We believe that the best response for any investors that are worried about fast computers taking advantage of them is to ask that their orders be routed to IEX.

But what about investors worried about slow traders “taking advantage” of them? In that case, maybe they should think twice before sending all of their volume to IEX?
[5] Which means that in order to use this particular method to predict the volume of a fund’s activity in a given stock, you’d need to know whether they’re likely to be trading it at all. Perhaps that’s doable sometimes. But, in any case, it’s not what I’m trying to do here. This post is just to see whether funds might have any detectable preferences, not to determine if those preferences create trading opportunities.

[6] Which contains the coefficients given by the Lasso regression of each hedge fund’s relative volume on ATSs’ anomalous volume. Each (quarter, stock) pair is a data point. Mean-square error is given for the training set (Total_MSE_Train) and test set (Total_MSE_test). A measure of $R^{2}$ is given for the training set and for the test set (R-Squared_MSE_Test) — note that the $R^{2}$ is a bit unusual for the test set, in that it uses the mean from the training set as its “null prediction.” Sample sizes for each set are given by n_Train and n_Test.

[7] Their equity portfolio consists mostly of ETFs and biotech, so this could be an artifact.

[8] In that instance, IATS trading activity could still be a useful predictor of hedge fund volume.

[9] ITG’s volume has collapsed after being fined for prop trading in its own dark pool. I would imagine that they’ve lost many customers since the end of the last quarter in this dataset (June 30), so prediction accuracy may be lower for later quarters.

[10] If there’s demand for it, maybe I can look into whether any market data patterns are correlated with institutional flows.

[11] For one thing, it’s not clear why a movement to make trading infrastructure more utility-like should stop with exchanges. What about brokers, execution algorithms, and intermediaries? I think that similar game-theoretical dilemmas could apply to those groups too. Restructuring a competitive industry into a state-supervised monopoly is partly an admission that there’s no prospect of further value-adding innovation.
As Cliff Asness says:

[I]t’s the argument monopolists always make — that they are really only trying to create efficiencies and eliminate waste for the customer.

* ATS data is provided via http://www.FINRA.org/ATS and is copyrighted by FINRA 2015.

# Could HFTs Benefit from a Cancellation Tax?

You’ve probably seen Hillary Clinton’s proposal for a tax that “would hit HFT strategies involving excessive levels of order cancellations.” I don’t want to discuss politics, but the proposal brings up an interesting thought-experiment: What would markets be like if HFTs couldn’t cancel orders?

Let’s say that there’s an extremely high order cancellation tax that’s designed such that it won’t affect fundamental traders: [1]

1. There’s no tax for traders that cancel fewer than ~10 orders per day, providing they hold positions from those orders for over a week. Otherwise, the tax makes cancelling unfeasible for everybody.
2. Cancel-replace messages are taxed the same way. So are hidden order cancellations and orders cancelled/modified via exchange algorithm (e.g. peg orders).
3. IOC (Immediate-Or-Cancel) orders are not taxed. Orders cancelled automatically at the end of the day are not taxed.
4. There are no ELPs (Electronic Liquidity Providers), wholesalers, internalizers, etc.
5. There are no loopholes or exemptions.

I really doubt this scheme is what Clinton has in mind, and market dynamics are extremely unpredictable, so the below is purely for fun.

Many HFTs might think this tax would put them out of business. I’m not so sure. I think it would transform markets, with the resulting market structure having the same opportunity and need for algorithmic traders. The tax would clear out almost all automated resting liquidity. A few medium-frequency algorithms might thriftily use their 10-order allotment. But most orders resting in the book would belong to fundamental and retail traders. [2]

Now, without market makers, would the market become a utopia where long-term traders seamlessly match with each other? In some sense, that’s the dark pool dream. [3] In that dream, large fundamental traders use minimum quantity orders to hide their intentions and wait for peers to trade with them. But block trading’s current volume suggests that it usually doesn’t offer liquidity as cheaply as intermediaries do. [4]

I’m inclined to think that many long-term traders would still find it cheap and expedient to post orders on ordinary exchanges. And I’d bet that algorithms would fill most of those orders. Programs like to interact with orders that they consider mispriced. To maximize fill probability, sophisticated institutional traders might deliberately price orders more aggressively, but still inside the widened spreads. [5] Unsophisticated traders, having an imprecise view of fair market value because of the wide spreads, would price orders less efficiently than they do now. Aggressive algorithms would also have much more certainty as to the nature of their counterparties, because market-makers would be absent from the order book. The increased certainty and the abundance of inefficiently priced resting orders would make aggressive algorithmic trading dramatically easier. I think the result would be that many electronic market-makers’ current pricing models could be profitably used to remove liquidity with IOC orders. [6] Intermediation would undergo a regime change from passive to aggressive market-making. [7]

Market-restructuring generally benefits sophisticated traders who quickly grasp new dynamics, so I wouldn’t be at all surprised if this shakeup actually increased HFT revenues, at least for the few years it would take other participants to adapt to the new landscape.
# Exceptions to the Hypothetical Tax

My feeling is that the current population of market participants [8] needs professional intermediaries to help determine prices during continuous trading [9]. In the scenario above, that need would be met by aggressive algorithmic traders. I deliberately specified the tax above to offer as few loopholes to traders as possible. What if we loosened some of those constraints? Matt Levine, while answering the question of “Why Do High-Frequency Traders Cancel So Many Orders?”, says that:

Regulating the parts of Wall Street that you don’t like can help out the parts of Wall Street that you do like.

I think the most glaring beneficiaries of a loosened tax could be ELPs, wholesalers, and internalizers. If other market-makers couldn’t realistically submit resting quotes, but ELPs (etc.) could still receive incoming orders and decide whether to fill them, long-term traders seeking liquidity would have no option but to trade with ELPs. The removal of competitors would be a huge boon to the ELP business. [10] Similarly, if large, established market-makers received exemptions from the tax, they would stand to benefit at the expense of upstarts. If cancellations of hidden orders were tax-exempt, then we’d expect to see a surge in the share of trading on dark pools, as well as the further transformation of lit exchanges into dark pools. If traders could cancel 10 orders per day without the holding requirement, we might also get an army of retail traders trying to fill the shoes of market-makers. [11] If the exemption were 10,000 orders instead of 10, capital-rich entities like hedge funds could profitably become market-makers. And if cancel-replace messages were tax-exempt, I think pretty much nothing would change.

Our markets are complex. Occasionally complex enough for the predicted effects of new regulation to be the opposite of the actual effects.
I’m not very confident in my speculation about the consequences of a strict cancellation tax, but I’m skeptical of anybody who is. Eliminating market inefficiencies is not straightforward. And as long as they exist, I suspect algorithmic traders will do just fine. [12] [13]

[1] Assuming that fundamental traders didn’t use execution algorithms and only traded manually (or via manual broker).

[2] There’s no clear line separating “fundamental” and “speculative” traders, and for brevity I’m just going to wrap speculative traders into the fundamental category if they hold positions for at least a week. Some might call anybody exiting their positions after a week a “high-frequency” trader, but the logic in this post wouldn’t really change if the holding period were changed to a year.

[3] In “Flash Boys”, Michael Lewis paraphrases IEX COO John Schwall:

For the first time in Wall Street history, the technology existed that eliminated entirely the need for financial intermediaries.

[4] There are many definitions of “block trades.” One definition by Tabb Group labels any trade over 20% of average daily volume (ADV) a “block.” These “blocks” are about 1/8 of institutional volume (and presumably much less of overall volume). Another Tabb definition calls any transaction over 10,000 shares (a size well within the range of HFTs) a “block.” These “blocks” are about 1/6 of total volume. “Block” sales to banks’ equity desks are in the range of several hundred million dollars, generally proceed at a discount of 3-4%, and are a minute portion of ADV. On the other hand, it’s conceivable that a renaissance in block trading is inhibited by the renaissance in block trading exchanges. Norway’s SWF says that fragmentation in block venues “can increase the search cost for buyside traders” and that monolithic “utility-like block crossing venues” would “increase the fill probability.” So maybe if all institutions agreed to execute at a single venue, we’d see a resurgence in block volumes. It does seem that, in this new market with ultra-wide spreads, negotiating a price for block trades would be harder. There would be less certainty about the fair value of a stock, making block traders more cautious of their counterparties ripping them off.

[5] Both to incentivize counterparties to trade and to disincentivize what John Arnold calls “front-running,” where other traders react to an order by submitting their own orders at slightly more aggressive prices. In this scenario, the “front-runners” would be other long-term traders.

[6] With some level of modification. Many of today’s informative signals would become useless. But I’m sure changes in the market would also create new signals. Creative, flexible firms may succeed at the expense of those that haven’t been investing in talent.

[7] Markets today really exist somewhere in the middle of these two regimes — particularly because marketable, low-alpha flow is typically filled by wholesalers, making the population of low-alpha orders on lit exchanges disproportionately passive. The spirit of some example strategies on this blog is to identify and fill these low-alpha resting orders, saving them the cost of crossing the spread. This is an untraditional use of the term, but I think it’s fair to call the activity of these aggressive strategies “market-making.”

[8] Participants on public markets, that is. Traders in the rapidly growing private market seem to be doing fine without price transparency, so far.

[9] Auctions, particularly those with large volumes, are probably different.

[10] Wider spreads on lit exchanges would allow ELPs (etc.) to fill incoming orders at worse prices than they do now, while still bettering the prices on exchanges. This is absolutely not trading or investment advice: but, since Knight is publicly traded, it may be possible to bet on a tax with ELP exemptions becoming reality.
[11] 100,000 retail traders submitting 10 orders per day could partly fill the shoes of automated market-makers. Though, I doubt they’d do the job as well, and it would be a spectacular waste of labor.

[12] The history of market structure so far has led to cheaper executions and open access. I can, however, imagine a future cycle that’s quite different:

1. The market structure changes, because of technology, greater access, regulation, etc.
2. Many participants continue their old habits, and trade inefficiently.
3. Savvy traders notice distortions created by this inefficient behavior. They trade to correct these incongruities, and profit.
4. The less specialized market participants learn to be more efficient, and the savvy traders compete more fiercely with each other. This drives down their profit margins.
5. People gradually find out how much money savvy traders *used* to make. Some get upset.
6. People clamor for change, and get it. Return to step 1 and repeat.

[13] There are, of course, potential regulations that would eliminate algorithmic trading, such as a high transaction tax.

# IEX Peg Orders: Last Look for Equity Markets?

Matt Levine recently challenged his readers to describe how IEX’s speedbump might be gamed:

It’s hard for me to figure out a way to game it. You all are smart, tell me how to game it. The prize is maybe you get to game it.

I’ve discussed some issues with the IEX platform. In this post, I’ll add detail for a few of those issues. And, while labeling an exchange “gameable” is subjective, IEX peg orders remind me of controversies from other markets. I haven’t seen IEX’s source code or system design, so this post is speculative.

All orders sent to IEX go through the 350us “shoebox” delay, at the time of entry. However, the exchange does not apply the delay to algorithmic order types such as peg orders.
This behavior is designed to prevent nefarious traders who, after seeing a quote change on another exchange, rapidly submit aggressive orders to IEX, hoping to hit an order pegged at the now-stale price. [1] IEX’s intention is a good one. But, if the shoebox delay is not fine-tuned, there can be some undesirable side effects. # Last Look Orders on many spot FX exchanges are subject to what’s called “last look,” where the resting side, after a match, may briefly wait before deciding to proceed with the trade. Last look helps bank liquidity-providers avoid being “picked off,” and gives them option-value by letting them back out of fills if the market goes against them. It may serve legitimate business purposes, but it’s easy to understand why the practice is controversial. BlackRock, for instance, has said that last look causes “phantom liquidity.” IEX peg orders offer something like a ‘conditional’ last look. Instead of becoming non-firm at the trader’s discretion, peg orders opt-out of executions only if the NBBO moves away within 350us of the incoming order’s reception. [2] This restriction makes them less valuable than true last look, but their option-value is still very significant. How significant? I would estimate that it’s worth around half a tick. To give a rough idea, here is a plot of trades on Nasdaq, grouped by whether Nasdaq still had a quote present at the same price 350us later: Top panel: Average market-priced profit or loss per share vs distance in time from trade, from the perspective of the passive side of the trade. Trades are grouped by how their price compared to the (round lot) Nasdaq BBO, 350us post-trade. Roughly speaking, if Nasdaq were to pull its orders in similar circumstances as IEX pulls its peg orders, Nasdaq would prevent all the trades from the “Better than 350us Post-Trade Nasdaq BBO” group. The group that would remain (“Equal or Worse Than 350us Post-Trade Nasdaq BBO”) would be very profitable after receiving the ~30mil rebate. 
Visible execution only. The “market price” is the average price of the most recently traded 100 shares. Chart is over 8 days in August 2014 and excludes fees and rebates. Bottom panel: Shares traded on Nasdaq vs. time from trade (including the fiducial trade).

Here is the same for colocated trades on Nasdaq BX, again grouped by how they compared to the Nasdaq (Inet) BBO 350us post-trade:

Other, non-colocated exchanges (like BatsZ or EdgeX) that I checked are similar. These charts are just hints at the option-value offered by IEX, and are closest in spirit to IEX primary peg orders, which (I think) only trade at the 350us ex-post NBBO. [3] The edge that midpoint pegs and D-Pegs receive from the head start is much harder to estimate, but I expect that it’s sizable.

It might seem like the edge I’ve described is solely due to IEX successfully preventing peg orders from being “picked off.” It isn’t. A peg order is “picked off” when its counterparty has reacted to an event which should have previously caused that order to be repriced. IEX is repricing peg orders using information that counterparties didn’t have at the time of their orders’ submission. Equities markets are decentralized, and partly unsynchronized — IEX claims to have fixed all race conditions, but they have only fixed one, and by doing so they’ve created others. [4]

# Sources of Peg Orders’ Edge

Sophisticated traders might take advantage of the option-value offered by IEX by simply sending peg orders instead of normal, firm orders. If they’re fast enough to be the first peg orders received by IEX, the estimated 50mil edge could make losing strategies wildly profitable. Where does this money come from? I think it’s mostly from two populations:

1) All resting orders on other exchanges:

A) IEX pegs are priced using other exchanges’ quotes. Peg orders that would have been economically unviable will now be profitable, through their use of information about the future state of those quotes.
B) These peg orders will proliferate on IEX.

C) Orders sent to IEX that would otherwise have been routed out to other exchanges will now trade with these proliferated peg orders.

D) That will lead to fewer, more toxic fills for passive orders on every other exchange.

E) Market makers may widen their quotes to compensate for this adverse selection.

2) High-alpha aggressive orders on IEX:

A) Aggressive traders may cause or predict the movement of prices on other exchanges.

B) IEX will see these price movements, which occur *after* it receives aggressive orders, and pull posted peg orders which would have executed. This fading of liquidity could harm the same traders that Michael Lewis wanted to protect.

C) Aggressive traders could send their orders to IEX 350us before sending orders to other exchanges, in the spirit of Thor. That would prevent IEX from using future information to pull peg orders. Delaying orders is not always an option, though; if the aggressive trader is an execution algorithm reacting to a trade, it couldn’t afford to delay any of its orders. If it did, a competing execution algorithm (or HFT) might clear out the available liquidity. Thor-style delays may work for human traders, but are not helpful for the vast majority of volume, which is executed by computers.

# Unintended Usage

Knowing exactly how these orders will behave, sophisticated traders can integrate them into their strategies more effectively than other users. I bet it’s profitable to simply copy quotes posted on other exchanges onto IEX with peg orders. IEX allows traders to mirror liquidity from other exchanges, without the risk of getting run over that this normally entails. And most of this revenue will be earned by high-speed traders. When there’s a structural inefficiency like this one, the fastest orders capture the profit. A 50mil-per-share edge is very enticing to HFTs, and I’d expect that many will soon be competing in a race to be first in the ‘peg order queue’ (if they aren’t already).
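The head start that makes this race worth winning can be shown with a toy replay. Everything here is an assumption for illustration (this is not IEX’s actual matching logic): incoming orders are delayed 350us while NBBO updates reprice the peg immediately, and the latencies are loosely in the spirit of the fiber/wireless figures discussed in the footnotes.

```python
# Toy replay of the selective-delay mechanism. All prices, latencies, and
# matching logic are illustrative assumptions, not IEX's real system.
SPEEDBUMP_US = 350  # applied to incoming orders, but not to peg repricing

def replay(events, peg_bid):
    """Process (time_us, kind, price) events; return fills against the peg."""
    delayed = [(t + SPEEDBUMP_US if kind == "incoming_sell" else t, kind, price)
               for t, kind, price in events]
    fills = []
    for t, kind, price in sorted(delayed):
        if kind == "nbbo_bid":           # peg instantly follows the NBBO bid
            peg_bid = min(peg_bid, price)
        elif kind == "incoming_sell" and price <= peg_bid:
            fills.append((t, peg_bid))   # sell limit crosses the pegged bid
    return fills

# Nasdaq's bid drops at t=0. IEX learns of it via market data at ~180us;
# a fast trader's sell, hoping to hit the stale 10.00 peg, arrives at ~90us.
events = [(180, "nbbo_bid", 9.99), (90, "incoming_sell", 10.00)]
print(replay(events, peg_bid=10.00))  # [] -- the peg is repriced first
```

With `SPEEDBUMP_US = 0` the same event stream fills the peg at the stale 10.00 (the “pick-off” the delay prevents); the flip side, as argued above, is that delayed incoming orders now lose even races they legitimately won.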
I’m sure there are many other examples where conditional executions allowed by the speedbump change the circumstances of trading from win/lose to win/scratch. [5][6]

# Understanding Timescales of Trading

Put a certain way, IEX’s speedbump doesn’t sound very significant; 350us is less than 1% of the time it takes to blink. But, like it or not, computers do the majority of trading these days, including on behalf of fundamental traders. Market professionals know a lot can happen in hundreds of microseconds, and a speed advantage of that magnitude can guarantee profit. IEX knows this too. Cofounder Dan Aisen says that “350 microseconds is an enormous head start.”

Selective application of a speedbump is economically equivalent to an exchange distributing a secret data feed, providing anointed traders advance notice about changes in the order book. A simple system update would resolve this issue. IEX could keep its peg orders from executing at stale prices without using information their counterparties don’t yet have.

FX traders understand the consequences of interacting with last look liquidity, and can route their orders elsewhere. Equity markets are different. Maybe last look would tighten spreads for retail traders. But we should think hard before bringing it to our stock market.

[1] A crude example of what IEX hopes to avoid:

1. Nasdaq has set the NBBO of a stock, which is 10.00/10.01.
2. IEX has a primary peg order on its bid, currently resting at 10.00.
3. Somebody sends a large sell order to Nasdaq, clearing out the bid and leaving some quantity resting at 10.00. The new NBBO is 9.99/10.00.
4. High-speed Trader A sees that and quickly sends an order to IEX to sell at 10.00.

If IEX were to receive Trader A’s sell order before they knew that the NBBO had changed, then they’d execute it against the peg order at 10.00. That’d be bad for the peg order.
So, IEX delays the high-speed trader’s order for 350us, which is more than enough time for them to see that the NBBO has changed and reprice the peg order to 9.99, preventing it from being “picked off.”

[2] Mostly. D-Peg orders adjust their price in response to the number of quotes at the NBBO, 350us after reception of an incoming order. If you’re interested, the mechanics of the D-Peg are now disclosed; they’re on p. 210 of this pdf from IEX’s exchange application.

[3] A few ways the figures differ from IEX primary pegs:

1. The NBBO is different from the Nasdaq BBO. Adding in venues with inverted pricing (Bx, EdgeA, BatsY) should make this edge larger.
2. Different exchanges and different order types have different populations of traders.
3. A market data message takes time to get from Nasdaq to IEX’s system. The details may bore you, but IEX’s speed advantage will vary by exchange. Messages from Carteret to Weehawken on commodity fiber take about 180us one-way. On a wireless network, messages from Carteret to IEX’s POP in Secaucus probably take about 90us. So HFTs may receive Nasdaq messages around 90us before IEX does, which means that IEX arguably has a 260us head start (350us – 90us) for reacting to Nasdaq. For the 4 Bats exchanges in Secaucus, IEX will essentially have the full 350us head start. For the NYSE exchanges, IEX should have a smaller advantage. And IEX may receive market data from CHX (in Chicago) well after high-speed traders do, if any bother to send it wirelessly to NJ. There’s also nothing stopping IEX from getting its market data over wireless, which would give it the full head start for messages from every exchange, and would be a tiny expense by its standards.

[4] For readers who have experience with software, here’s an analogy. Let’s say that you have some multi-threaded software. The software processes a stream of two types of events, A and B.
Sometimes, events of Type A occur slightly before those of Type B, but Type A event processing tends to be slower. Because of that slowness, the software often finishes the Type B events first. That causes events to be handled out of order, and has bad consequences. Your measurements show that Type A’s processing is typically slower than Type B’s by 100us, but never by more than 300us. So you decide to delay all Type B events by 350us, because that will ensure they can never beat any Type A events which occurred first. You’re very proud of yourself, and tell your customers that their synchronization problems are over.

If “Type A” events are NBBO changes that cause you to reprice peg orders, and “Type B” events are all other customer orders, then this analogy is close to the idea of exchange speedbumps. The problem, of course, is that now the “Type B” orders have a speed disadvantage, which means that, if the price moves away shortly after they were received, they can’t match with peg orders. There are methods to properly deal with these situations in software. It’s just that adding a constant delay to select events isn’t one of them.

[5] Here’s another example of a pretty dumb strategy that only an HFT could try:

1. The NBBO for a stock is set by Nasdaq at 10.00/10.05.
2. An HFT observes a hidden trade on Nasdaq at 10.03.
3. The HFT knows that there is probably still hidden liquidity available at 10.03, because resting hidden orders tend to be large.
4. The passive side of the hidden trade isn’t distributed in market data. But the HFT has a model which estimates that there’s a 70% chance that the resting side is the bid.
5. If the HFT were more confident about that estimate, it could submit a midpoint buy order to Bats, which could get filled at 10.025, lower than the price the hidden order just paid.
However, the 30% chance that the estimate is wrong is too high — if somebody sends large sell orders checking for hidden liquidity at Bats, and shortly afterwards sweeps the Nasdaq bid, the HFT will be stuck with a toxic fill.

6. The HFT submits a midpoint buy peg to IEX instead.
7. Now, if their guess is wrong, they’re protected. When IEX receives the same aggressive order checking for hidden liquidity, it holds it for 350us. While holding onto that order, IEX sees the Nasdaq bid swept, and pulls the HFT’s midpoint order.
8. If the HFT’s guess is right, and someone sends large sell orders to IEX and Nasdaq, the hidden order at Nasdaq will trade at 10.03, leaving the displayed bid intact. The HFT’s order will execute at IEX at 10.025, a better price than the hidden order received.

[6] In addition to peg orders backing away from fills after market conditions deteriorate, it’s possible that IEX uses non-delayed data to help peg orders aggressively trade against resting liquidity, potentially “picking off” orders on their own exchange. I previously blogged about this “book recheck” feature. “Book rechecks” could offer conditionality to sophisticated traders wanting to remove liquidity before specific future events.

# A Close Look at the Treasury Flash Rally Report

Flash events, where prices rapidly change and revert to their previous levels, are not well understood. Government reports on these events are immensely helpful, and I was pleased to see a high level of detail in the recent Joint Staff Report on the October 15, 2014 flash event in US Treasuries. It’s hard to see by eye, but many of the charts in the report show important market metrics broken down by trader type, with what appears to be 1-2 second resolution. This kind of data is rarely made public, and is a huge treat for a practitioner like me. In this post I will begin to explore the contents of this ~15-minute dataset.
The analysis required some moderately difficult image parsing, which is not an area of expertise for me, so there could be errors.

# Types of Traders in the Report

The report mentions several types of traders, and since each “employs some level of automated trading,” it’s tough to label just one category ‘high-frequency trading.’ The closest group is probably “Principal Trading Firms” (PTFs), which trade their own capital and do not have customers. [1] But a few algorithmic traders, like Citadel and Renaissance, may be included in the “Hedge Fund” category. Some charts also have “FCM” and “Other” categories, which could contain smaller algorithmic traders that were hard to classify. [2]

# Counterparty-tagged Volume

Given the contents of charts 3.5-3.8, and the large amount of self-trading, it’s a reasonable guess that PTFs were trading mostly with other PTFs during the event. But the data pulled from the charts don’t particularly support this hypothesis. Here is a plot of the net inventory change per second, by trader type and aggressor flag, for seconds when any group’s aggressively or passively accumulated inventory changed by more than 100 million dollars:

Net inventory change per second in 10-year futures. Data is from report Figures 3.6 and 3.8. Assumes that each bar in both figures represents 1 second. There are 928 such bars.

Assuming that the 1-second inventory change is reflective of actual trades [3], this figure shows that, during big seconds, little volume was generated by intra-group trading. Here’s a similar plot for the cash market [4], which appears to show that PTFs traded more with banks than with each other:

Similar to above, from Figures 3.5 and 3.7.
# Volume Between and Within Groups of Traders

Here is the overall share of volume between traders of various types:

| | AssetManager | BankDealer | HedgeFund | Other | PTF |
| --- | --- | --- | --- | --- | --- |
| AssetManager | 0.00033 | 0.01446 | 0.00578 | 0.00551 | 0.02042 |
| BankDealer | 0.01446 | 0.03439 | 0.03037 | 0.03146 | 0.11861 |
| HedgeFund | 0.00578 | 0.03037 | 0.00642 | 0.02045 | 0.05794 |
| Other | 0.00551 | 0.03146 | 0.02045 | 0.01953 | 0.07332 |
| PTF | 0.02042 | 0.11861 | 0.05794 | 0.07332 | 0.18269 |
| Total | 0.0465 | 0.22928 | 0.12096 | 0.15028 | 0.45297 |

Estimated portion of 10-year futures volume during the event window attributable to each pair of groups. For example, 2% of volume was from asset managers trading with PTFs. Net inventory change per second is used as a surrogate for volume. Counterparty-tagging is estimated pro-rata, e.g. if PTFs and banks each passively bought 50 in 1 second, when hedge funds and asset managers each aggressively sold 50, then the estimate is that hedge funds sold 25 to PTFs and 25 to banks (and asset managers did the same). Again, data is from Figures 3.6 and 3.8. Volume is single-counted.

Only 18% of estimated volume was from PTFs trading with other PTFs. Given that total PTF volume was 45% under this estimate, PTFs were slightly less likely to interact with one another than by random chance (which would be 0.45 * 0.45 = 20%). [5][6] Note that there’s a disparity between this estimate of total PTF volume and what’s in the report, which has PTF share at around 60% during both the event window (p. 25) and across the day (Table 2). [7]

It might also be interesting to see how these statistics evolved over time:

Estimated portion of total volume in the preceding 20 seconds traded between PTFs and other groups, for 10-year futures.

The estimated volume share of PTF-PTF is rarely far from the square of total PTF share, which suggests that worries about “PTFs trading almost solely with each other” may be unfounded. [8][9] Plots for the other group-pairs here, here, and here.
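The pro-rata counterparty-tagging described in the table caption can be sketched in a few lines. This is a minimal illustration of the method; the group names and quantities are the caption’s own worked example, not real data.

```python
def pro_rata(aggressive, passive):
    """Attribute each aggressive group's volume to passive groups in
    proportion to the passive groups' share of that second's volume."""
    total = sum(passive.values())
    return {(a, p): qty * passive[p] / total
            for a, qty in aggressive.items() for p in passive}

# The caption's example: PTFs and banks each passively buy 50 in a second
# in which hedge funds and asset managers each aggressively sell 50.
est = pro_rata(aggressive={"HedgeFund": 50, "AssetManager": 50},
               passive={"PTF": 50, "BankDealer": 50})
print(est[("HedgeFund", "PTF")], est[("HedgeFund", "BankDealer")])  # 25.0 25.0
```

Summing these per-second pair estimates over the event window (and normalizing by total volume) yields a table like the one above; intra-group volume, e.g. PTF-PTF, falls out of the same calculation whenever a group appears on both sides in the same second.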
We can also use the same method to estimate the volume between aggressive and passive traders in all of the groups:

| Group | AssetManager Passive | BankDealer Passive | HedgeFund Passive | Other Passive | PTF Passive | Total |
| --- | --- | --- | --- | --- | --- | --- |
| AssetManager Aggressive | 0.00034 | 0.02092 | 0.00585 | 0.00567 | 0.02521 | 0.05799 |
| BankDealer Aggressive | 0.00801 | 0.03439 | 0.02758 | 0.03361 | 0.1216 | 0.22519 |
| HedgeFund Aggressive | 0.00576 | 0.03312 | 0.00643 | 0.01864 | 0.05412 | 0.11807 |
| Other Aggressive | 0.00537 | 0.02937 | 0.02234 | 0.01954 | 0.07505 | 0.15167 |
| PTF Aggressive | 0.01564 | 0.11571 | 0.06172 | 0.07154 | 0.18246 | 0.44707 |
| Total | 0.03512 | 0.23351 | 0.12392 | 0.149 | 0.45844 | 1 |

Estimated portion of 10-year futures volume between passive and aggressive trades from each group.

The estimates show that asset managers tended to trade more aggressively (5.8% of total volume) than passively (3.5%). When trading aggressively, 36% (0.021/0.058) of their volume was executed against a bank-dealer, significantly higher than bank-dealers’ 23% share of overall passive volume. Given that asset managers characteristically have “large directional flows spanning multiple trading sessions,” their tendency to trade with banks may be of interest to people worried about bond market liquidity because “banks now have less risk-warehousing capacity than they did in the past.”

# Group-Identified Book Depth

Large, passive sell orders may have stopped the flash rally. From the report:

> Around 9:39 ET, the sudden visibility of certain sell limit orders in the futures market seemed to have coincided with the reversal in prices… [W]ith prices still moving higher, a number of previously posted large sell orders suddenly became visible in the order book above the current 30-year futures price (as well as in smaller size in 10-year futures).

We don’t know who submitted those orders for the 30-year, but the report may tell us who did for the 10-year.
Here is an estimate of ID-tagged order book depth around this time, using data from Figures 3.19 and 3.22:

Estimated visible book depth in the top 3 price levels for 10-year futures, by type of trader. Hedge Fund and FCM data from Figure 3.22 is merged into data from Figure 3.19. The depth quantity from the “Other” trader category in Figure 3.19 appears to be very close to the sum of the quantity from “FCM” and “Other” traders in Figure 3.22; “Other_322” uses the Figure 3.22 data. Aligning, renormalizing, and merging the data from Figure 3.22 into data from Figure 3.19 required some judgment, so there may be errors (and the x-axis is probably off by a few seconds). The time resolution of the data in Figure 3.19 appears to be about 1.8 seconds, but the similar Figure 3.17 from the cash market has an apparent resolution of 1 second; it’s possible that this disparity is because the authors wanted to protect traders’ privacy.

The origin of these large sell orders could have been traders in the “Other” category of Figure 3.22. I wonder if they may have come from asset managers, which are not separately included in the depth plots.

# Self-Trading

According to the report: “in the 5-year note in the cash market… self-trading accounted for about one-third of net aggressive trade volume between 9:33-9:39 ET.” Levels of self-trading were high on futures markets as well.

Regulators are contemplating new, industry-initiated rules on self-trading. That makes a lot of sense. The usual defense of self-trading argues (correctly) that it can be the accidental by-product of compliant trading, hardly a claim that self-trading is beneficial. Most major exchanges offer self-match prevention, and it seems easy to enable it for all customers. I understand that some trading firms have a siloed business model, where individual groups fiercely compete with one another. In these companies, self-match prevention could allow rival groups to learn each others’ trading strategies.
[10] But that doesn’t strike me as a particularly high price to pay. In contrast, accidental self-trading does impose a cost on market participants — it adds noise to market data. [11] Regardless of whether self-trading had any effect on the flash event, it certainly has fostered suspicion of the industry, which seems like a pretty good reason to eliminate it. [12]

# Potential Causes of Flash Events

Andrew Lo discussed the 3 dimensions of liquidity at the recent CFTC Market Risk Advisory Committee:

> [T]here are three qualities of liquidity that really make up the definition. A security is liquid if it can be traded quickly, if it can be traded in large size and if it can be traded without moving prices.

Lo adds that these attributes can be measured. I think he’s right that “liquidity” has a simple, quantitative definition. But there’s an additional wrinkle that makes it prone to sudden changes, and challenging to measure. Liquidity is also about expectations, and its three components (price, time, and size) evolve in response to any anticipated change in them. This evolution may be especially important in flight-to-safety markets. If you want protection from volatility, and worry that bond market liquidity could dry up, you might accelerate your purchase of treasuries. If others decide the same, then there could be a rapid, cascading deterioration in liquidity. [13]

Many models of liquidity involve a book of “latent orders,” which are orders that exist in traders’ imaginations and are not yet live. A trader with latent orders might think, for example: “X is over-valued by 15%, so if it drops 20% with little change in my outlook, I’ll buy it.” Donier, Bonart, Mastromatteo, and Bouchaud propose a model where traders instantly submit latent orders as real orders when the market price gets close to their desired price. [14] Their model exhibits many properties found in real markets.
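A minimal sketch of the latent-order idea (my own toy illustration with assumed numbers, not the cited model): latent bids sit off-book and convert to real orders only when the market trades down to within reach of their target prices.

```python
# Toy illustration (all parameters assumed): a latent bid becomes a real
# order once the market price falls to within `trigger` of its target.
def revealed_bids(latent_bids, market_price, trigger=0.05):
    """Latent bids whose target is within `trigger` (fractional) of market."""
    return [p for p in latent_bids if market_price <= p * (1 + trigger)]

latent_bids = [80.0, 90.0, 95.0]  # "if it drops to here, I'll buy"
print(revealed_bids(latent_bids, market_price=110.0))  # [] -- all still latent
print(revealed_bids(latent_bids, market_price=99.0))   # [95.0]
```

If conversion is instantaneous, this liquidity materializes the moment price approaches it; the concern discussed below is what happens when conversion takes minutes, hours, or days.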
But, there’s no reason to expect that latent traders watch markets full-time, and as the authors say in a footnote, these traders’ slow reaction time could be a factor in flash events:

> It is very interesting to ask what happens if the conversion speed between latent orders and real orders is not infinitely fast, or when market orders become out-sized compared to the prevailing liquidity. As we discuss in the conclusion, this is a potential mechanism for crashes

I think that our markets tend to have a layer of liquidity provided by professional intermediaries, and a much thicker layer provided by slower latent traders, far from the top of book. On rare occasions that intermediary layer could be exhausted and, if sufficient time isn’t available for latent traders to step in, a flash event may occur.

If so, I’m not sure that there’s an easy remedy. Some people may think that slowing down our markets would prevent these flash events, but I suspect it wouldn’t be that straightforward. Latent traders might check prices once a day (or less), which would mean that our markets would need to be made *a lot* slower. Also, some latent traders may pay attention to the market only after significant volume has transacted at their target price, so slower markets could still have episodes of extreme volatility; they’d just last for days instead of minutes.

Some flash events probably have more rectifiable causes. The August 24, 2015 event was likely exacerbated by temporary changes in market structure from LULD halts, NYSE Rule 48, futures being limit-down, and futures’ price limits changing simultaneously with the equity opening. These measures are intended to give markets time to attract latent liquidity. But because they alter market structure, they may shut down some professional intermediaries, which aren’t set up to trade in one-off conditions.
Increased volatility isn’t surprising when intermediary liquidity is missing and there is still insufficient time for most latent traders to become active.

Many people think that “HFT Hot Potato,” where HFTs panic as their inventory devalues and then dump it on other HFTs, is a factor in flash events. [15] For the October 15 event, that seems pretty unlikely. PTFs do not appear to have preferentially traded with each other. And Figures 3.9-3.12 in the report show that the bulk of aggressive volume from PTFs and bank-dealers consisted of exposure-increasing buy orders. [16] Exposure-decreasing aggressive orders were, for the most part, selling the 10-year. [17]

# The Utility of Fine-Grained Data

I’ve argued in the past that more post-trade disclosure would dispel conspiracy theories and ensure that our markets stay clean. This Joint Staff Report included data with a resolution that surprised me. I hope that trend continues. Even if it doesn’t, there is a possibility that data with this level of precision can be matched with real market data messages to a limited extent. That isn’t an easy problem technically, but I intend to give it a try.

[1] IEX has used a similar definition. From a 2014 blog post by Bradley Hope:

> IEX says that in July 17.7% of trading on its platform is done by proprietary trading firms, which it says are firms that have no clients and trade for their own account. It places HFT firms in this category.

As an aside, this percentage appears to have risen in the last year:

> Brokers trading their own principal—they include both HFT firms and the big banks’ proprietary trading desks—account for 23 percent of IEX’s trades.

Though this second definition may differ from the one given in 2014, since it includes banks’ supposedly shrinking prop-trading desks and also appears to be restricted to broker-dealers.
The Joint Staff Report’s definition says that PTFs “may be registered as broker-dealer[s]” (emphasis added), and certainly not all high-speed traders are broker-dealers.

[2] The report makes it clear that classifying firms was not easy:

> Categorizing the firms requires some judgment, particularly given that they sometimes share certain characteristics or may act in multiple capacities… [S]ome bank-dealer and hedge fund trading patterns exhibit characteristics of PTFs, while many smaller PTFs clearly are not trading rapidly.

[3] This should be close, but not identical, to the aggressive and passive volume of each group. For example, Bank A may aggressively buy from PTF B, then Bank A may aggressively sell to PTF B. If these trades occur in the same second, there would be no net change in bank-dealers’ aggressively accumulated inventory, or PTFs’ passively accumulated inventory. This 15-minute period is exceptional, and I couldn’t say how often that kind of trading occurs even normally, but we have a hint from a nice paper by CFTC staffers Haynes and Roberts. In that paper, Table 8 provides a measure of holding times for different types of traders. It shows that, for the 10-year bond future, 42% of the volume executed by large, automated traders is typically netted with trades on the opposing side within 1 minute. We can crudely estimate the portion of these traders’ volume that is held for under a second by considering the distribution of order resting times, given in Table 7. Summing the appropriate values for the 10-year, about 8.6% of double-counted volume is generated by passive, automated orders that are executed within 1 second, and 23.5% within 1 minute. The ratio of these two numbers is 37%, which may also be reflective of the ratio between trades held for 1 second or less and trades held for 1 minute or less.
So we can (very roughly) estimate that 0.37 * 0.42 ~ 15% of a typical high-speed trader’s volume is turned over in a second. This estimate applies to single traders’ turnover, not the aggregate of their group.

[4] With a 50 million dollar threshold instead of the 100 used for the futures plot, because the cash market is less active than futures.

[5] One of the first things discussed on this blog is that algorithms generally want to avoid trading with one another. Table 4 from the above-linked paper says that total volume for 10-year futures is typically composed of: 43% algorithms trading with algorithms, 41% algorithms trading with humans, and 12% humans trading with humans. These statistics show algorithms interacting with other algorithms about as often as you’d expect by random chance, which surprises me slightly — I’d have expected algos to tend towards interacting more with humans.

[6] If you’re interested, here is the correlation matrix of inventory changes:

| | AssetManager Aggressive | AssetManager Passive | BankDealer Aggressive | BankDealer Passive | HedgeFund Aggressive | HedgeFund Passive | Other Aggressive | Other Passive | PTF Aggressive | PTF Passive |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| AssetManager Aggressive | 1 | 0.099 | -0.028 | -0.3 | -0.033 | -0.04 | -0.021 | -0.013 | -0.012 | -0.12 |
| AssetManager Passive | 0.099 | 1 | -0.11 | 0.017 | -0.083 | 0.000065 | -0.067 | 0.033 | -0.2 | 0.095 |
| BankDealer Aggressive | -0.028 | -0.11 | 1 | -0.17 | 0.0043 | -0.24 | -0.078 | -0.34 | -0.029 | -0.45 |
| BankDealer Passive | -0.3 | 0.017 | -0.17 | 1 | -0.29 | 0.08 | -0.11 | 0.25 | -0.44 | 0.22 |
| HedgeFund Aggressive | -0.033 | -0.083 | 0.0043 | -0.29 | 1 | 0.099 | -0.044 | -0.28 | 0.012 | -0.43 |
| HedgeFund Passive | -0.04 | 0.000065 | -0.24 | 0.08 | 0.099 | 1 | -0.18 | 0.11 | -0.47 | 0.099 |
| Other Aggressive | -0.021 | -0.067 | -0.078 | -0.11 | -0.044 | -0.18 | 1 | -0.14 | 0.096 | -0.34 |
| Other Passive | -0.013 | 0.033 | -0.34 | 0.25 | -0.28 | 0.11 | -0.14 | 1 | -0.41 | 0.4 |
| PTF Aggressive | -0.012 | -0.2 | -0.029 | -0.44 | 0.012 | -0.47 | 0.096 | -0.41 | 1 | -0.33 |
| PTF Passive | -0.12 | 0.095 | -0.45 | 0.22 | -0.43 | 0.099 | -0.34 | 0.4 | -0.33 | 1 |

Correlation between trader groups’ 1-second (aggressor-flagged) inventory changes.
Data again from Figures 3.6 and 3.8. A large positive (negative) number means that the two groups are more likely to be trading on the same (opposite) side during the same second. Nothing immediately struck me about the lagged cross-correlations or auto-correlations; except perhaps that asset managers tend to persistently trade on the same side, which I think we already knew.

[7] The reasons for that disparity could include:

1. Sub-second, group-wide turnover, when it is make-make or take-take (sub-second turnover for individual HFTs was estimated to be roughly 15% in [3]). Sub-second turnover should appear in the charts if it’s make-take or take-make, because net inventory in the charts is split by aggressor flag.
2. The y-axis resolution of the charts. The smallest visible changes in net inventory are $2.4M for the aggressive chart and $1.9M for the passive chart. So small executions may be under-represented. Algorithms are known for sending smaller orders than humans.
3. Self-trades could conceivably have been excluded from these charts.
4. Data omitted from the charts.
5. An error on my part. The total volumes in the aggressive and passive charts differ by about 15%. That may give an idea of the margin of error.

[8] For specific seconds, the estimated level of intra-group trading is higher. As the time resolution increases, intra-group share should become more volatile (at the finest resolution, it will frequently spike to 100%, whenever a single intra-group match occurs). If you’re interested, here’s a table of seconds when more than $30M traded and the intra-group share was above 75% (estimated). This will happen by random chance most often for the largest trader group (PTFs). I won’t pretend that there’s a way to test statistical significance without control data, but there is possibly a cluster of PTF-PTF trading around 9:33:40 (the timestamps could be off by a couple of seconds).

| Time | Group | Intra-Group 1-Second Volume (Million USD) | Intra-Group Share of Total 1-Second Volume |
| --- | --- | --- | --- |
| 09:30:39 | PTF | 25 | 0.79 |
| 09:33:38 | PTF | 25 | 0.76 |
| 09:33:39 | PTF | 45 | 0.87 |
| 09:33:42 | PTF | 32 | 0.94 |
| 09:38:19 | PTF | 42 | 0.88 |
| 09:44:02 | BankDealer | 25 | 0.79 |

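As a quick check on the random-chance baseline used above: if counterparties paired randomly, a group with total volume share s would trade with itself about s² of the time. The shares below are the Total row of the volume table earlier in the post.

```python
def expected_intra_share(total_share):
    """Intra-group volume share expected under random counterparty matching."""
    return total_share ** 2

# Total shares from the 10-year futures volume table above.
for group, share in {"PTF": 0.45297, "BankDealer": 0.22928}.items():
    print(group, round(expected_intra_share(share), 3))
# PTF 0.205
# BankDealer 0.053
```

The PTF baseline (~20.5%) sits slightly above the ~18.3% intra-group share estimated from the report’s figures, consistent with PTFs being marginally less likely than chance to trade with one another.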
[9] Similar statistics published by Eurex show HFTs tending not to trade with each other, during a flash crash in DAX futures. (If videos test your patience, skip a little over halfway through, until the timestamp on the left is 3:38)

[10] It could also create awkwardness in the company cafeteria. If one group has been making money off of another, that might become obvious if self-match prevention were enabled.

[11] Manipulative self-trading imposes a much higher cost on market participants, because the “noise” is specifically designed to deceive. Though, some people think that noise in market data can reduce “front-running” and is beneficial. I don’t agree. If you think transaction costs would be lower with more limited data, paring data feeds makes more sense than corrupting them. I also suspect that, for most markets, realtime order and trade transparency lowers costs.

[12] This is just speculation, but I wouldn’t be surprised if most non-manipulative self-trading in these markets is from just one or two firms. A rumored (and disputed) report on BrokerTec shows that two firms, Jump and Citadel, execute 40% of volume there.

Saijel Kishan and Matt Leising have reported that:

> Jump rents out computers and other infrastructure to its traders, who are organized into independent trading teams. The groups operate as separate cost centers… Jump applies its secrecy ethic within the firm. The teams don’t share information about trading strategies with each other

Citadel also has a reputation for internal secrecy.

[13] It’s easy to see how liquidity anxiety would affect asset prices negatively, especially for flight-to-safety products, which are considered “safe” partly because of their liquidity. Say, hypothetically, that money markets are 100% liquid today, but you suspect that they could freeze up in the next year. You’d probably empty your account immediately, right? If enough people did the same, then liquidity could evaporate in a run.

Less intuitive is the possibility that the very safest assets could increase in value when liquidity is expected to disappear. In such situations, there are probably even worse fears about other markets. Long-term treasury prices actually went up during the 2011 debt ceiling crisis, despite some pessimistic speculation. If this phenomenon contributed to the treasury flash rally, there would presumably have been changes in other assets’ liquidity measures, cross-asset lead-lag relationships, or correlation structure.

[14] A consequence of this model is that the order book will be skewed in the opposite direction of a meta-order, e.g. as someone buys a large block of AAPL, there will usually be more quantity available on AAPL’s offer than its bid (near the top of book). That could be an important detail in the “front-running”/HFT/spoofing debate, because the traders who use skewed order books to predict price may actually be trading on the other side of large meta-orders — offering fundamental traders cheaper fills, rather than pushing the price away from them. Strategies that use order book signals may still compete with other mean-reversion traders, but complaints about that don’t sound very compelling.

[15] Kirilenko, Kyle, Samadi, and Tuzun write that “hot potato” trading contributed to the equity flash crash (of 2010):

After buying 3,000 contracts in a falling market in the first ten minutes of the Flash Crash, some High Frequency Traders began to aggressively hit the bids in the limit order book. Especially in the last minute of the down phase, many of the contracts sold by High Frequency Traders looking to aggressively reduce inventories were executed against other High Frequency Traders, generating a “hot potato” effect and a rapid spike in trading volume.

[16] I don’t know what sort of analysis the authors did to determine whether a given order increased or decreased a trader’s exposure. It seems likely that they considered a trader’s “position” to be a combination of their 10-year cash and futures holdings. That wouldn’t be the only measure of market exposure. For example a trader that is short the 30-year may consider buying the 10-year to be a partial hedge. Likewise for a trader long stocks, or another correlated basket.

[17] With the exception of Bank-Dealers, which aggressively covered short futures positions to the tune of about $200M across 3 minutes, a number that does not sound particularly high.

# IEX, Ideology, and the Role of an Exchange

IEX has raised significant capital, possibly at a valuation well into the hundreds of millions. IEX plans to become a full exchange and continue capturing market share, but I wonder if it might have a unique long-term vision that excites investors. In this post, I will speculate about what that vision might look like. To be absolutely clear, this post is highly speculative, and does not constitute trading or investment advice.

CEO Brad Katsuyama testified that “IEX was founded on the premise of institutionalizing fairness in the market.” Soothing words, and possibly words that tell us something substantive about IEX’s values.

IEX recently introduced the D-Peg, an order type that uses market data to make a prediction about where the price is heading, and transact only at times that are predicted to benefit the user. The D-Peg is a blending of price prediction, traditionally the role of traders, with the matching process itself. Combined with the 350us structural delay built into IEX, it’s easy to see how even crude prediction signals could become incredibly powerful. As cofounder Dan Aisen puts it:

[W]e don’t need to be the single fastest at picking up the signal– as long as we can identify that the market is transitioning within 350 microseconds of the very fastest trader, we can protect our resting discretionary peg orders. It turns out that 350 microseconds is an enormous head start

I imagine that even something as basic as an order book ratio (e.g. [AskQuantity – BidQuantity] / [AskQuantity + BidQuantity]), known 350us in advance, has tremendous economic value.
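To make that concrete, here is a minimal Python sketch of such a ratio, plus a toy “step away” rule for a resting buy order. The threshold and the decision rule are my own illustrative assumptions; IEX’s actual D-Peg formula is proprietary and undisclosed:

```python
def book_ratio(ask_qty, bid_qty):
    """(AskQuantity - BidQuantity) / (AskQuantity + BidQuantity), in [-1, 1].
    Positive values mean resting offers outweigh bids, which is often read
    as near-term downward pressure on the price."""
    return (ask_qty - bid_qty) / (ask_qty + bid_qty)

def should_step_down(ask_qty, bid_qty, threshold=0.6):
    """Toy rule for a resting buy order: step away from the midpoint when the
    book ratio suggests the price is about to tick down. The threshold is an
    arbitrary assumption, not IEX's actual proprietary logic."""
    return book_ratio(ask_qty, bid_qty) > threshold

# 3,000 shares offered vs 1,000 bid at the top of book:
print(book_ratio(3000, 1000))        # 0.5: offers dominate
print(should_step_down(9000, 1000))  # heavy skew: protect the resting buy
```

Even a crude signal like this, evaluated 350us before anyone else can act on it, would rarely be wrong over that horizon.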

This philosophy is interesting to think about, and I can see how it might appeal to certain audiences. If the exchange has a better idea of the market price than its customers, it makes a sort of sense for it to use that information to ensure trades can only occur at that price. But, I think the idea is ultimately a misguided one.

Here are some problems with it:

1) The exchange, in an effort to increasingly prevent adverse selection, may want to make their prediction more sophisticated. If a skewed order book results in ‘bad fills,’ then the same can be said for trades occurring after price moves on correlated instruments. If the price of PBR has just dropped by 1%, then buy orders for PBR.A are surely in danger of being “picked off.” Should the exchange try to prevent that? IEX may well have decided that they’ll always allow this kind of adverse selection. But keep in mind that trading signals do not work forever, especially when they are heavily used — so IEX will likely need to continually revise their prediction methods.

2) As the prediction methods get more complex, they are more liable to be wrong. In the example above, maybe an event occurs that affects the values of the two share classes in different ways. The exchange could erroneously prevent traders clever enough to understand that from executing, impeding price discovery.

3) Sophisticated traders like the PBR/PBR.A specialists could opt out of these order types. But how would they make an informed decision? Right now, we just know that the D-Peg uses a “proprietary assessment of relative quoting activity.” Could that “proprietary assessment” change over time? If so, are those changes announced? Matt Hurd has lamented the D-Peg’s undisclosed nature, and thinks it contradicts IEX’s mission of transparency. [1]

4) An exchange cannot increase the profitability of one group of traders without harming another. Now, maybe the only group harmed here are unsympathetic high-frequency traders who don’t deserve their profits. I’m skeptical of that. Who might some of those evil traders “picking off” quotes on IEX be? The motivation for Thor, and a critical part of the Flash Boys story, is the fading of liquidity when a trader submits large marketable orders. Some of the traders that the D-Peg will stymie may be people like the young Brad Katsuyama, investors or brokers who send liquidity-seeking orders simultaneously to many different exchanges. Say the NBBO for BAC is 10.00/10.01, and an investor wants to sell a large holding, so she sends sell orders to multiple exchanges, including IEX. One of those orders hits Nasdaq right when another gets to IEX, but IEX waits 350us, and, seeing the Nasdaq bid disappear, perhaps decides not to execute any resting D-Peg interest with the incoming order. Had the investor timed her sell orders differently (in a similar spirit to Thor, sending the IEX-bound order early), she’d have gotten a better fill rate. [2]

Another possibly harmed group could be non-D-Peg resting orders on IEX. One fascinating aspect of the IEX speedbump is that they can use it not only to prevent resting orders from executing at inopportune moments, but also to help traders remove liquidity at opportune moments! I was surprised to see that some order types can automatically trade against others upon a change in IEX’s view of the NBBO, through a process called “Book Recheck”. The mechanics of IEX seem complicated, so I could be wrong, but it looks to me like orders eligible to “recheck” the book may initiate a trade at a price determined by the realtime (*not* delayed) NBBO. [3] In contrast, cancel requests for the passive sides of these trades would be subject to the IEX speedbump. Here is a concerning, hypothetical example:

A) The NBBO for a stock is 10.00/10.01
B) A trader has submitted an ordinary (non-peg) limit buy order at 10.00 resting on IEX
C) The NBB at 10.00 is completely executed, making the new NBBO 9.99/10.01
D) The trader, seeing that 10.00 is no longer a favorable price for her purposes, tries to cancel her buy order
E) Her order cancellation goes through the 350us speedbump
F) In the meantime, IEX sees that the new NBBO midpoint is 10.00, and decides that a D-Peg sell order (or midpoint peg) is now eligible to recheck the book.
G) The D-Peg order is matched with the bid at 10.00.

The combination of algorithmic order types and selective use of the speedbump resulted in one trader getting an especially good fill, and another trader getting an especially bad fill. I guess if you’re not careful designing an exchange that supposedly prevents traders from picking each other off, you might do some picking-off yourself. [4]
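The timing asymmetry in steps A–G can be laid out as a toy timeline. The 350us speedbump is real; every other timing and mechanic below is a simplified assumption, not IEX’s actual implementation:

```python
# Toy timeline of the recheck asymmetry: inbound cancels traverse the
# speedbump, while the exchange's own recheck logic sees the NBBO instantly.
SPEEDBUMP_US = 350

nbb_drops = 100          # NBB at 10.00 exhausted elsewhere; NBBO becomes 9.99/10.01
cancel_sent = 150        # trader tries to cancel her stale 10.00 buy
cancel_effective = cancel_sent + SPEEDBUMP_US  # cancel emerges from the speedbump
recheck_fires = 200      # exchange rechecks the book against the fresh NBBO

events = [
    (0, "NBBO 10.00/10.01; non-peg buy resting at 10.00 on IEX"),
    (nbb_drops, "NBB fully executed elsewhere; NBBO now 9.99/10.01"),
    (cancel_sent, "cancel submitted for the 10.00 buy"),
    (recheck_fires, "Book Recheck: D-Peg sell matches the resting 10.00 bid"),
    (cancel_effective, "cancel finally takes effect: too late, already filled"),
]
for t_us, desc in sorted(events):
    print(f"t={t_us:>4}us  {desc}")
```

Under these assumed timings, the recheck fires 300us before the cancel can take effect, so the resting order is filled at a stale price no matter how fast the trader reacts.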

5) Trading that occurs during price movements tends to be more informed, and preventing it could make markets less efficient. This would only be an issue if IEX captured significant market share, but it does sound like permitting trading only during periods of market stasis is part of IEX’s long-term vision. Referring to the D-Peg, Chief Strategy Officer Ronan Ryan says that “[a] core insight behind our market philosophy is that price changes are valuable opportunities, especially for those strategies fast enough to detect signals from price changes.” And also: “The economic benefit is that investors aren’t paying (or selling at) a worse price to a predatory strategy that is aware of quote changes before they are.”

It sounds like the idea is to stop an informed order from trading with an uninformed order, with the exchange deciding which is which. Naturally, the exchange is not an oracle and will misclassify some orders. But, if IEX becomes the dominant marketplace, and its classification is sufficiently good, informed orders will rarely get filled. You might think that wouldn’t happen, because IEX is only targeting ‘short-term’ alpha, but I’d venture a guess that a sizable chunk of order flow with long-term alpha will also have some short-term alpha inseparably folded into it. With information-bearing order flow being blocked, at a certain point, the exchange will be in the position of deciding when the imbalance between supply and demand warrants a price change. I happen to think that generally markets work better when people can freely trade with one another at prices of their choosing [5], and that a vision like this won’t get IEX into the same league as the major exchanges. But, market participants will be the judges of whether this model is a viable one.

6) Even if the exchange is pretty good at determining the market clearing price and balancing supply and demand, it’s not clear that it can do so more cheaply than algorithmic traders and human market participants. Right now, IEX is charging 9 cents per 100 shares traded, significantly greater than estimates of typical HFT profit margins. [6]

7) IEX, by delaying executions, is effectively using market data from 350us in the future, piggy-backing on price discovery from other markets. As Aisen suggests, the speedbump is probably accounting for the vast majority of their prediction algorithm’s edge. [7] This is different from complaints about dark pools’ use of visible order information for price discovery. Dark pools can only use order information from the present, and have to report trades to the public tape “as soon as practicable”. The speedbump might well allow IEX to cheaply discover pricing information from lit market data, potentially starting a new era of speedbumps, with each exchange wanting to have a longer delay than their competitors have. Regulators may want to carefully think through possible end results of this form of competition.

8) We don’t understand how this sort of market structure would hold up under stress. HFTs thoroughly simulate their algorithms; does IEX do the same? In a flash crash situation, IEX might stop D-Peg matching for an extended period, preventing those clients from getting filled at prices they may love, and isolating much-needed liquidity from the rest of the market. Additionally, if IEX is too effective at blocking informed order flow, some traders could panic when they repeatedly try and fail to get executed, damaging market stability.

Most of these issues aren’t especially important to overall market health as long as IEX’s market share stays below a few percent. And I think their market model is a perfectly fine one for a dark pool, although a little more disclosure wouldn’t hurt. The question is whether their target audience of fundamental traders will want to participate in this sort of market. I suspect ultimately that they won’t, though IEX might reach critical mass before participants really have time for reasoned debate.

We may have a glimpse of what “institutionalized fairness in the market” really means. To some, it may mean the relief of relying on a trustworthy institution to equitably determine the timing and pricing of their trades. To others, it may sound like a private company determining the market price via secret, non-competitive algorithms — unaccountably picking winners and losers. Institutional arbiters are part of civilized society, but ideally they’re transparent, receptive to criticism, and reformable when not working. Before we hand over the keys to IEX, we had better make sure that they meet these standards.

[1] Hurd’s complaint seems fair enough, but I’ll mention that competing exchanges aren’t always perfectly transparent either. For instance, Nasdaq Nordic’s documentation seemed to have some noteworthy details about reserve orders that weren’t available on Nasdaq’s US site.

[2] Brad Katsuyama said that “fading liquidity” is one of IEX’s “concerns regarding negative effects of structural inefficiencies” in his testimony to the US Senate:

[D]ue to the construct of the market system certain strategies are able to get out of the way of buy or sell interest as they are accessing the market in aggregate, which calls into question the fairness of the inefficiencies which allow or enable such behavior, and the potential distortion of price discovery and of supply and demand.

[3] Execution Tag “LastLiquidityInd” has a value for “Removed Liquidity on Recheck.” And the Form ATS says:

Upon a change to the Order Book, the NBBO, or as part of the processing of inbound messages, the System may test orders on one or both sides of its market against the contra side of the Order Book to determine if new executions can occur as a consequence of the change in the IEX Book or prevailing market conditions[.] Orders resting on the Order Book at the IEX determined Midpoint, may be eligible to trade against orders in the updated Order Book, which were ineligible, or did not satisfy the order’s conditions, when they were originally booked.

Does that mean the recheck uses the same non-delayed NBBO that IEX uses in the rest of their logic? I don’t know, but more disclosure from IEX seems like a good idea.

[4] Our hypothetical trader who had her buy order “scalped” may also have heard statements from IEX such as “You can not scalp trades, you can not scalp orders that are on IEX.”

[5] Within some reasonable limits of course. Limit-Up-Limit-Down price constraints seem to be appreciated by most participants, though even those aren’t completely free from criticism. Reg. NMS Order Protection also has some passionate opinions on both sides. There is always going to be some tension between letting traders determine prices unencumbered, and protecting them from ‘erroneous’ or ‘unfair’ transactions.

[6] Rosenblatt Securities, which has conducted surveys of HFTs, recently estimated that HFT profit margins in US Equities are around 5 cents per 100 shares. Tabb Group similarly sees shrinking profit margins.

[7] The D-Peg aside, even the simplest formula like the NBBO midpoint will have massive alpha with a 350us “head start.”

# Plain, Old Fraud in the Twitter-Hack Flash-Crash?

Two years ago, hackers took control of the Associated Press’s Twitter account and falsely tweeted that the president was injured due to explosions at the White House. Within 3 minutes, US stock indexes dropped about 1%, but recovered to their pre-tweet values after an additional 4 minutes.

I don’t like to idly speculate [1], but ever since then, I keep wondering if this hack might have been part of a massive manipulation scheme [2]. Even if it was just a prank, it seems like the hackers would have been foolish not to try capitalizing on the market movements that they caused. If they wanted to commit crimes, why not at least make some money?

It would be easy to profit off of such a scheme, and it seems conceivable that a savvy, well-funded group might have cleared an enormous sum. It’s also possible that this hypothetical group could have avoided attracting *too much* attention before wiring out the proceeds, perhaps by splitting up the trades across many accounts, without ever touching an American financial product or bank (markets worldwide were impacted by the tweet). The Syrian Electronic Army claimed responsibility for the hack. I obviously don’t know if that claim is true. But if it is, presumably that group could use the money.

Much like spoofing, the intentional spread of misinformation can harm all sorts of traders. There has been speculation that algorithmic traders were disproportionately deceived by the hack. I imagine that some were, but so were plenty of humans. Here’s Sal Arnuk of Themis Trading:

My initial reaction before I realized it was a fake tweet was the same horrible feeling I had when I worked at the top of the New York stock exchange when planes hit the World Trade Center.

And Arnuk also appears aware of the possibility that it was a profit-making scam:

When I realized it was a fake tweet, I was outraged and ashamed that the market was able to be manipulated so easily.

Regulators take spreading false rumors very seriously, like in today’s suit over false EDGAR filings. I am sure they have been looking into this more significant and complex incident. If and when they complete their investigation, don’t be surprised if it was more than just vandalism.

[1] That means I’m about to. This post is highly speculative.

[2] An SEC information page briefly describes “pump-and-dump” scams:

“Pump-and-dump” schemes involve the touting of a company’s stock (typically small, so-called “microcap” companies) through false and misleading statements to the marketplace. These false claims could be made on social media such as Facebook and Twitter, as well as on bulletin boards and chat rooms.

These scams may also be called “short-and-distort” when the manipulator shorts a financial instrument before spreading negative rumors.

# Are Data Centers that Host Exchanges Utilities?

Swedish regulators are seeking to fine Nasdaq OMX for alleged anti-competitive practices in the Nordic colocation business. This case is fairly limited in scope, but it raises some more general questions about colocation.

HFTs, execution algorithms, and smart order routers rely on exchange colocation to provide cost-effective and fair access to market centers. Locking competitors out of an established data center could easily destroy their businesses. The Swedish Competition Authority alleges that Nasdaq OMX did just that:

In 2009, a Stockholm-based multilateral trading platform called Burgundy was launched. Burgundy was formed by a number of Nordic banks. The Burgundy ownership structure gave the platform a large potential client base, especially with respect to brokers. However, the owners had limited possibilities of moving trade from Nasdaq to Burgundy as long as the trade on Burgundy was not sufficiently liquid to guarantee satisfactory order execution. In order to increase the liquidity on Burgundy, it was vital for Burgundy to get more trading participants…

In order to come into close physical proximity with the customers’ trading equipment in Lunda, Burgundy decided to move its matching engine to the data centre in Lunda…

Burgundy had finalised negotiations with Verizon, via their technology supplier Cinnober, and the parties had agreed that space would be leased in Lunda for Burgundy’s matching engine. When Nasdaq heard of this agreement, they contacted Verizon demanding to be the sole marketplace offering trading services in Nordic equities in Lunda. Nasdaq told Verizon that if Verizon allocated space to the Burgundy matching engine at their data centre in Lunda, Nasdaq would remove their own primary matching engine and their co-location service from that centre. Such an agreement with Burgundy/Cinnober could also have an impact on Verizon’s global collaboration with Nasdaq. Verizon accepted Nasdaq’s demands, and terminated the deal with Burgundy/Cinnober.

# Latency Arbitrage And Traders’ Expenses

In the US, a lot of people are upset that equity exchanges are located all over New Jersey, instead of being in one building. Michael Lewis’s primary complaint about HFT is that it engages in “latency arbitrage” by sending orders between market centers ahead of anticipated institutional trades. I suspect that, in addition to those concerned about latency arbitrage, most HFTs would also be happy if important exchanges moved to one place. That would cut participants’ expenses significantly; there would be no need to host computers (and backups) at multiple locations and no need to procure expensive fiber and wireless links between locations. It would also allow HFTs to trade a given security on multiple exchanges from a single computer, dramatically simplifying risk checks and eliminating accidental self-trading.

If so many market participants want it, why hasn’t it happened? There could be several reasons:

1. It’d be anti-competitive for exchanges to cooperate too much in bargaining with data center providers.
2. Established exchanges want the best deal possible for their hosting, which means they need to consider bids from many competing providers.
3. Under the Reg. NMS Order Protection Rule, there could be some benefit to exchanges if they have a structural delay with their competitors.
4. Some exchanges see hosting, connectivity, and related services as important sources of revenue and want customers to procure those services from them. This especially includes exchanges which require colocated customers to lease rackspace only from the exchanges themselves – and also exchanges that operate their own data centers.
5. Exchanges don’t want competitors in the same data center as them, so they use their considerable leverage with providers to keep them out.

The allegations against Nasdaq OMX are about #5 and seem to be about just one case. But here’s a potentially concerning statement by Andrew Ward in the FT (2010):

People familiar with Verizon said it would be unusual in the exchange industry for more than one operator to share the same data centre.

# A New Model for Colocation

Reg. NMS requires exchanges to communicate with each other, and would work better if delays in that communication were kept to a minimum. Would it be reasonable for updated regulation to require market centers to provide one another with a rapidly updated view of their order books? That would necessitate exchange matching engines being physically close to one another, ideally in the same building. Requiring this would end most types of “latency-arbitrage,” whether real or perceived.

One solution could be for FINRA to solicit bids from providers under the assumption of a long-term contract, with extra space available for new exchanges and traders – keeping costs down for everybody. This proposal is in tension with the concern in #1, but I wonder if, because we have a single national market system, it’s reasonable for that system to negotiate as a single entity, and for the other concerns to override #1. Exchanges would no longer make much money from colocation services, but they could compensate for that by raising trading fees, which would arguably be healthier for markets anyway.

In my mind, what separates data centers from utilities is that, for most of their non-financial customers, there’s very little benefit to being in a certain building versus a nearby one. So long as nearby buildings have good connectivity to the local internet, customers have many options when procuring hosting services. Financial customers are much different. Once a major exchange is located in a certain building, traders, and sometimes competing exchanges, have no choice but to lease space there. That feels to me like a completely different dynamic, and possibly one that justifies data centers, in this specific industry, being classified as utilities.