The Flash Crash was caused by a complex interaction between different IT systems. Some of those systems failed ungracefully, and others could not cope with those failures. I wish that, instead of wringing our hands about computers managing our trading infrastructure, we would take the time to understand that computers manage systems far more mission-critical than our financial markets.
I have worked extensively on trading systems and appreciate the importance of stable markets. Any software connecting to markets should have multiple layers of safety checks that are as independent of each other as possible; in fact, I would argue for much stronger risk checks than we have now. Algorithmic trading companies understand well what can happen if their safety measures fail. But the worst thing that can happen to HFTs and exchanges is generally bankruptcy.
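To make the idea of independent, layered checks concrete, here is a minimal sketch of a pre-trade risk gate. All of the names, limits, and thresholds are hypothetical assumptions for illustration, not any real firm's or exchange's controls:

```python
# Hypothetical layered pre-trade risk gate. Every limit here (quantity cap,
# notional cap, price band) is an illustrative assumption, not a real rule.
from dataclasses import dataclass

@dataclass
class Order:
    symbol: str
    side: str       # "buy" or "sell"
    quantity: int
    price: float    # limit price

MAX_ORDER_QUANTITY = 10_000    # per-order share cap (assumed)
MAX_NOTIONAL = 1_000_000.0     # per-order dollar cap (assumed)
PRICE_BAND = 0.10              # reject prices >10% from reference (assumed)

def check_quantity(order: Order) -> bool:
    return 0 < order.quantity <= MAX_ORDER_QUANTITY

def check_notional(order: Order) -> bool:
    return order.quantity * order.price <= MAX_NOTIONAL

def check_price_band(order: Order, reference_price: float) -> bool:
    return abs(order.price - reference_price) <= PRICE_BAND * reference_price

def risk_gate(order: Order, reference_price: float) -> bool:
    # Each layer is evaluated independently; any single failure blocks
    # the order, so one buggy check cannot silently approve a bad order.
    return all((
        check_quantity(order),
        check_notional(order),
        check_price_band(order, reference_price),
    ))
```

The point of keeping the layers independent is that a fat-finger order of 500,000 shares is rejected by the quantity check even if, say, the price-band logic has a bug. Real systems would also track cumulative position, message rates, and self-trade risk, among other things.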
It’s true that ordinary investors can get caught in IT failures and lose money, though erroneous trades can hopefully be unwound in those instances. It’s also true that traders illegally gaming each other and manipulating markets can defraud innocent investors. Given the spectacular systems failure of the Flash Crash, it’s worth reflecting on the risks of shoddy automation in other areas of life. I can’t begin to detail all of the serious IT failures of the last 50 years, but here are a few (in no particular order) that make the Flash Crash look like a triviality:
- The Northeast Blackout of 2003. A complex interaction of events, including a software bug in an alarm system, interrupted electricity delivery for over 50 million people and contributed to many deaths.
- The Therac-25 radiotherapy machine. Buggy software with inadequate safeguards caused at least six patients to receive massive, in some cases fatal, overdoses of radiation.
- Problems with software that controls airbag deployment in cars, including certain Cadillacs.
- Problems with electronic voting in the 2014 Belgian elections. Only about 2,000 votes seem to have been affected, but I would guess that fewer traders than that were seriously affected by the Flash Crash.
- The infamous Toyota “Unintended Acceleration” issue, which some experts believe may have been caused by faulty software. The issue has allegedly caused dozens of deaths.
- A flaw in a Soviet early-warning satellite system reportedly triggered alarms that the US had launched five ICBMs. The human operators, suspecting a false alarm, fortunately waited for radar confirmation of the launches before reacting.
This is far from a complete list, and often we don’t even know if faulty IT contributed to a fatal accident. I think that many financial professionals suffer from déformation professionnelle.  The reality is that, despite the hullabaloo over the Flash Crash, it had few serious consequences in the grand scheme of things.
The Flash Crash very temporarily erased about $1 trillion in the market value of securities. It also triggered a media firestorm that may have convinced some retail investors to hold cash and miss the post-crisis stock market recovery. For those active traders who lost money that day, there’s no doubt that the Flash Crash was a big deal. But the reality is that the market recovered within minutes, and many of the accidental transactions at absurd prices were cancelled. A Flash Crash is also much less likely today, at least in American equity markets, where circuit breakers now halt trading when it becomes sufficiently volatile.
As computerized systems rightfully take on more responsibility, I hope we can learn some lessons from the Flash Crash. Unlike exchanges, designers of life-critical systems don’t have the luxury of shutting down for a few minutes when problems are detected. It’s not great that financial markets’ electronic infrastructure couldn’t handle a little stress, but it’s lucky that a failure like this attracted media attention without resulting in any loss of life. The anniversary of the Flash Crash is a reminder that all critical systems require intelligent regulation and, most importantly, is an opportunity to thank the engineers who keep us safe.
 It might be worse for an HFT to have some of their traders violate compliance rules and commit crimes. But I wouldn’t really call that an IT glitch, though with compliance being increasingly automated, maybe one day that’ll change.
 I have looked for an English equivalent to this term, and the closest I’ve found is “occupational psychosis,” which sounds a bit more extreme than I’d like.