Then I got hold of my FIRST tick-by-tick database: three months of foreign exchange data on the Swissie-dollar rate. I got a tape shipped over and it was a joy to love and behold. 300 megabytes of data; I was drooling over it, in a near orgasmic state. I ran my neural network models on it but couldn't incorporate that into my PhD. Professor Wood, my supervisor, told me not to be a greedy beady and just finish the damn thing. Well.........
I got my revenge by working with data in my job. We had gigantic data cubes: imagine multi-dimensional data with millions of rows and thousands of columns. Oh! fun and games! :) But those were small fun and games, just end-of-day runs to determine the greeks and VaRs. Then I moved into the pure electronic trading area and we ended up with terabytes of data, Godzilla-sized databases with very, very smart statistical models to determine program trading strategies, to do transaction cost analysis, and so on. With the globalisation of the markets and more products being added to the mix (like mixing the fixed income and the equity markets), the electronic trading desks are becoming more data intensive than ever before. But life is going to get a whole load worse.
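(For anyone who hasn't met those end-of-day risk numbers: below is a minimal sketch of a one-day historical-simulation VaR calculation. The P&L series and the function are purely illustrative assumptions of mine, nothing to do with the actual models we ran.)

```python
# A minimal sketch: one-day historical-simulation VaR from a series of
# end-of-day portfolio P&L figures. All numbers here are made up.
import numpy as np

def historical_var(pnl: np.ndarray, confidence: float = 0.99) -> float:
    """Loss threshold exceeded on roughly (1 - confidence) of historical days."""
    losses = -np.asarray(pnl, dtype=float)   # losses are negated P&L
    return float(np.quantile(losses, confidence))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # hypothetical 500 days of daily P&L, in currency units
    daily_pnl = rng.normal(loc=0.0, scale=1_000_000, size=500)
    print(f"99% one-day VaR: {historical_var(daily_pnl):,.0f}")
```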
And then MiFID / Reg NMS is going to make life a whole load more challenging for everybody. As the report below states: "These mandates will require institutions and trading venues to store massive amounts of market data, even as the volume of trade and quote data continues to grow. As a result, TowerGroup believes these regulations will be responsible for not only creating more sources of market data, but by 2012, could cause a 900% increase in the amount of market data being published."
Financial institutions have to store data for a minimum of five years, in case we need to prove to a regulator that we offered best execution to our clients, and for various other reasons. So it is quite possible that more than a terabyte of data can easily be produced on an hourly basis (for scale, that is on the order of the text of an entire library!). The physical storage of this much data is not the problem; we have very, very good technology available to do so.
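To make that hourly figure concrete, here is a back-of-the-envelope sketch. The message rate and record size are assumptions I have picked for illustration; they are not figures from the TowerGroup report.

```python
# Back-of-the-envelope arithmetic for the storage claim above. The message
# rate and record size are illustrative assumptions, not measured figures.
MESSAGES_PER_SECOND = 2_000_000   # assumed peak market-data message rate
BYTES_PER_MESSAGE = 200           # assumed size of a stored quote/trade record

bytes_per_hour = MESSAGES_PER_SECOND * BYTES_PER_MESSAGE * 3600
terabytes_per_hour = bytes_per_hour / 1e12
print(f"~{terabytes_per_hour:.1f} TB per hour")    # ~1.4 TB per hour

# Five years of retention at six trading hours a day, ~250 trading days a year:
tb_retained = terabytes_per_hour * 6 * 250 * 5
print(f"~{tb_retained:,.0f} TB over five years")   # ~10,800 TB, i.e. ~10 PB
```

Even with these deliberately rough assumptions, the retention requirement lands in the petabyte range, which is why the storage itself is the easy part and keeping it readable is the hard part.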
Now let me give you one example of why I think this is a worry. Look around you and look at your technology. Your mobile phone? Your digital camera? Your camcorder? Your PC? Laptop? Think about it: which of these technology platforms is five years old? Comparatively few. Mobile phones would be 1-2 years old, digital cameras about the same, camcorders about the same, a PC perhaps 2-3 years old... So think about the challenge banks will face in maintaining this amount of data and the technology platform to support it. My first dataset was stored on tapes, which I converted to 8-inch floppy disks, then 5.25-inch disks, then 3.5-inch stiffies. My Manchester data sets and dissertation documents were stored on 3.5-inch stiffies, and last year I found that I couldn't read them any more. I didn't even have a bloody floppy drive in my PC! But I have wittered on enough. Big challenges (think about the poor museums of modern art who have to maintain working technology platforms from the 1970s! Who the hell knows how that kind of stuff works???)
TOWERGROUP FINDS MiFID AND REG NMS MAY DRIVE 900 PERCENT INCREASE IN AMOUNT OF MARKET DATA PUBLISHED BY 2012
- Needham, MA - 31 July 2007
- Designed to improve competition and level the playing field for investors, the EU Markets in Financial Instruments Directive (MiFID) and the US Regulation National Market System (Reg NMS) are poised to drastically change the way the global securities industry handles market data. New research from TowerGroup finds that regulatory compliance will ultimately pave the way for firms to completely automate the trading process.
These mandates will require institutions and trading venues to store massive amounts of market data, even as the volume of trade and quote data continues to grow. As a result, TowerGroup believes these regulations will be responsible for not only creating more sources of market data, but by 2012, could cause a 900% increase in the amount of market data being published. TowerGroup illustrates this projected growth using the NYSE as an example.
“Discussion in the media has focused on the fact that these regulations mandate best execution, which misses the bigger picture,” said Tom Price, senior analyst in the Securities & Capital Markets practice at TowerGroup and author of the research. “The ramifications may not be felt immediately, but once Reg NMS and MiFID are final, they will have forever changed the way financial services firms treat data.”
• Regulatory compliance is expensive, but the true challenge will lie in determining where to spend – as global market participants juggle the requirements of bandwidth, capacity, latency, and subsequent storage to support the anticipated flood of market data.
• Trade reconstitution will become crucial “forensic evidence” in proving best execution, and will require gathering and storing every published quote at the time of execution along with any other data points associated with the trade.
• Except for some regional differences between Reg NMS and MiFID, institutions can achieve compliance through an integrated strategy that encompasses both directives. The success of such a strategy depends on combining consistent policies and procedures with improved infrastructure.

“Regulatory compliance will pave the way for firms to completely automate the trading process,” added Price. “Once all the players have done so, the winner in the hunt for liquidity will be whoever can process the data the fastest.”
The new research report titled “Preparing for the Data Flood: Ramifications of Reg NMS and MiFID,” addresses how Reg NMS and MiFID will create more sources of data and drive a greater volume of quote data from each of these sources. The research also analyzes the areas where market participants will have to improve their market data infrastructure and applications.