Includes: A, AA, General Motors Company (GM), T, VZ
High Frequency Trading (HFT) and secret algorithms have become the new competitive strategy in the global financial industry. The faster traders can turn trades around, the faster they can get in and out of fast-moving markets and short-lived pockets of opportunity. The goal is to execute many trades within a day and to exit by the close, holding none of the stocks traded that day.
When you talk to people about HFT, many offer different definitions and assume that there is adequate regulatory coverage of it. After doing some research and reviewing what has been written about it, I found a definite disconnect between how trades are processed and how they can be reviewed later.

After reading The Big Short by Michael Lewis, I started to doubt the accuracy of ratings agencies like S&P and Moody’s. The whole subprime mortgage debacle was helped along by bonds rated AAA that should have been rated B-. Lewis points that out in his book, which is well worth reading.
Going beyond that: are the trading and volume charts that most traders and commentators on Seeking Alpha refer to still accurate? With robotic traders driving volumes and trade counts up geometrically, has their validity changed? One comment I found on a financial blog put it this way:
“The real problem is that the equity market no longer is helping to raise capital and has turned into a micro second casino.” - Themis Trading LLC Blog
Trading is no longer being done on the trading floor so much as in the robotic servers co-located at the data centers. There was an accusation that trading firms gained a timing advantage by co-locating their trading “robots” within the exchange’s own data center: trades can be executed a few milliseconds faster from inside the facility than from servers located farther away. The NYSE has said it has taken care of that problem, but have all the global exchanges addressed this issue?
Having an “edge” of 30-40 milliseconds is a huge advantage. When you take this into consideration, you can see that the average investor, or for that matter the traditional institutional “human” trader, is at a disadvantage to the robotic traders that constantly scan the markets for any trend in upticks or downturns. That is where the firms using “robotic traders” can make millions of dollars a day.
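As a back-of-envelope sketch of why that edge matters (the 5-microsecond trade time is an illustrative assumption, in line with the microsecond trading speeds discussed later in this article):

```python
# Illustration only: how many robotic trades fit inside a co-location edge?
# Both numbers are hypothetical figures drawn from the discussion in the text.

EDGE_MS = 30        # head start of a co-located robot, in milliseconds
TRADE_US = 5        # assumed time per robotic transaction, in microseconds

edge_us = EDGE_MS * 1000              # convert the edge to microseconds
trades_in_edge = edge_us // TRADE_US  # whole trades that fit in the window

print(f"A {EDGE_MS} ms edge leaves room for {trades_in_edge} trades "
      f"at {TRADE_US} microseconds each")
# 30,000 microseconds / 5 microseconds = 6,000 trades completed before
# a slower competitor even sees the same market data
```

Even under these rough assumptions, a co-located robot can complete thousands of round trips before anyone else reacts.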

In technology management courses at Northwestern University, I always professed that when it comes to technology, the creation of technology outpaces the acceptance of technology. The regulation of technology lags behind the acceptance and the enforcement of the regulation lags behind even further.
Within all the exchanges, there is no way to go back and dissect all the micro-trading going on. If trading robots are trading in microseconds, any tools running in milliseconds, let alone seconds, are far too slow to predict, let alone catch, any irregularities.
I have come to the conclusion that ALL ratings agencies are behind, and the regulatory agencies along with them. The reason? Their tracking capabilities are far slower than the robotic trading capabilities in the financial cloud.

Here is an everyday analogy: the more progressive firms in the industry have moved on to electric power saws, and beyond that to laser saws, while the ratings agencies and the regulatory agencies are still using hand saws (or, more aptly put, hacksaws).

The “real problem” is the inability to measure and track all transactions in this trading vortex. As the old saying goes, “you cannot manage what you cannot measure.” HFT is an example of the acceptance and application of a technology outpacing the regulation of that technology, with enforcement of the regulation lagging further still.
When I was working at Bell Telephone Laboratories years ago, whenever a central office crashed, the Labs could determine what was at fault: they had a full-blown simulator as well as a central office that could be configured exactly like the one out in the field. Traffic could be added until the problem was pinpointed.
If we look back at the “Flash Crash” of May 6, 2010, when the Dow dropped nearly 1,000 points, no one ever came back with the real reason the market slid so fast. If regulators had the tools to simulate and reconstruct the exact sequence of events and all the trades in order, they would have come out with an accurate explanation.
They did not, and my conclusion is that they cannot replicate what goes on. Either that, or it was cyber warfare and they do not want to reveal that vulnerability. I will assume it was not cyber warfare.
There is “digital slippage” from a synchronization standpoint. In effect, if ALL trades are not stamped against one specific timing source, how can you put an accurate value on a trade?
The question with HFT is: how accurate is “the Cloud” it is trading in?
What is the timing source, and can its resolution be broken down into finer increments than the velocity of the trading itself? If I can trade at 3-5 microseconds per transaction and have a 30 millisecond jump on my competitors, I am well beyond any scrutiny if the clocks at the exchange and the regulatory agencies can only resolve one hundredth of a second.
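A minimal sketch of that resolution gap, assuming an audit clock that only records hundredths of a second (the function name and the trade times below are hypothetical):

```python
# Sketch: trades executed microseconds apart collapse onto one timestamp
# when the recording clock only resolves hundredths of a second (10 ms).
# Times and function name are illustrative assumptions, not a real API.

def stamp_hundredths(t_us: int) -> int:
    """Truncate a time in microseconds to hundredths of a second."""
    return t_us // 10_000  # 10,000 microseconds = 0.01 second

# Three robotic trades, 5 microseconds apart
trade_times_us = [1_000_000, 1_000_005, 1_000_010]
stamps = [stamp_hundredths(t) for t in trade_times_us]

print(stamps)                  # all three trades carry the same stamp
assert len(set(stamps)) == 1   # the audit trail cannot tell them apart
```

At this resolution, the sequence of those three trades is unrecoverable from the record, which is exactly the dissection problem described above.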
The Chicago Mercantile Exchange had the same problem years ago, when a trader had a choice of several clocks on the floor with which to “timestamp” a trade. If there were known inaccuracies, say one clock ran 20-30 seconds slow and another 50 seconds fast, you could play with the value of the trade based on which clock you took it to. They needed a master clock. The same goes for HFT trading in the financial cloud.
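A sketch of that floor-clock problem, using the skews mentioned above (the clock labels and trade time are hypothetical):

```python
# Sketch of the CME floor-clock problem: with several clocks of known skew,
# a trader could pick whichever clock made a trade look earlier or later.
# All numbers are illustrative, based on the skews mentioned in the text.

true_time_s = 1000                       # actual trade time (seconds, arbitrary epoch)
clocks = {"A": -30, "B": 0, "C": +50}    # skew of each floor clock, in seconds

# The timestamp each clock would put on the same trade
stamped = {name: true_time_s + skew for name, skew in clocks.items()}

# A trader could record the trade as early as 970 or as late as 1050:
window = max(stamped.values()) - min(stamped.values())
print(f"{window}-second window to 'shop' the trade's effective time")
```

With a single master clock, that window collapses to zero and the timestamp stops being a choice.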
Mission-critical financial networks need timing accuracy. If traders are making trades in microseconds, accuracy down to the nanosecond is necessary.