Today’s financial markets generate a volume of data barely dreamed about in the early days of electronic trading. Every year, exchanges produce a record-breaking number of transactions, and we know that somewhere within all of that data there lies a digital treasure chest. Finding a way to analyze and deliver this valuable data is a multi-headed problem, roughly broken down among charting, historical trade display and research scenarios.
In my role as an engineering manager at TT, I’m part of a team that’s been working to solve this problem in the new TT platform. We think we’ve found the answer by leveraging cutting-edge technologies, Node.js and Amazon Web Services (AWS). I’m excited about our solution, which is now automatically available to all TT platform users. Read on to learn more about our approach and how it can help you overcome the multi-faceted big-data challenges we all encounter today.
One of the new technologies that makes the TT platform possible is WebSockets, which we use to deliver real-time data to both our mobile and desktop users around the world. Writing a WebSocket server with Node.js takes just a few lines of code. Check out this example using the popular Node.js “ws” package. This is literally all you need to run a WebSocket server in Node.js:
Doing something similar with the popular C++ library libwebsockets or a Java API like Jetty can turn out to be hundreds of lines of code. And if you’ve ever used the über-popular cURL library (libcurl) to make web requests from your C++ code, you know it also commonly results in memory management issues when handling multi-part replies.
Amazon Web Services (AWS)
With our current-generation analytics deployment, it was always a guessing game to figure out how much infrastructure we needed. We used to have to ask ourselves whether we had requisitioned enough racks to deliver real-time data at scale and to store everything we would need for the foreseeable future.
But that’s no longer true in the new architecture thanks to cloud technologies—specifically, AWS. By co-locating our historical data and analytics within AWS, we can scale up operations when necessary and have access to effectively unlimited storage. Our analytics server auto-scales via AWS Elastic Beanstalk and runs on data that resides wholly in Amazon Simple Storage Service (S3).
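To give a flavor of what this replaces: with Elastic Beanstalk, capacity planning becomes a few lines of declarative configuration rather than a rack-purchasing decision. The snippet below is an illustrative `.ebextensions` fragment, not our production settings:

```yaml
# .ebextensions/autoscaling.config (illustrative values)
option_settings:
  aws:autoscaling:asg:
    MinSize: 2          # always keep at least two instances running
    MaxSize: 12         # scale out to twelve under load
  aws:autoscaling:trigger:
    MeasureName: CPUUtilization
    Unit: Percent
    LowerThreshold: 20  # scale in when average CPU drops below 20%
    UpperThreshold: 70  # scale out when it exceeds 70%
```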
In the long run, we will allow user-driven analytics to run on our servers within AWS. Customers will still be able to request smaller data sets and analyze them locally, but they will no longer need to transfer or house big data for the bulk of their research needs. As a result, many firms will be able to eliminate the cost and hassle of recording, persisting and running analytics on locally stored data.
With TT, users can chart all contracts that are currently tradeable on our platform, and we plan to add even more markets that will complement what is currently available.
Later this month, we will be rolling out more than 10 years of tick data for some markets. This means that simply by scrolling, you’ll be able to see every trade that occurred in the last 10 years for a specific contract, assuming it has been listed that long.
Of course, the technology powering this is our Node.js analytics server and long-term data store in AWS.
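Conceptually, scrolling back through a decade of ticks is just cursor-based paging: each scroll event asks the server for the next page of older trades. The sketch below illustrates the idea with a stub in-memory tick store standing in for the real AWS-backed service; `fetchTicks`, `makePager` and the page size are hypothetical names, not our actual API:

```javascript
// Stub tick store: one tick per day going back roughly 10 years.
const DAY_MS = 24 * 60 * 60 * 1000;
const EARLIEST = Date.now() - 3650 * DAY_MS;

// Return up to `limit` ticks strictly older than `before`, newest first.
function fetchTicks(before, limit) {
  const ticks = [];
  let t = before - DAY_MS;
  while (ticks.length < limit && t >= EARLIEST) {
    ticks.push({ time: t, price: 100 });
    t -= DAY_MS;
  }
  return ticks;
}

// Chart-side pager: each "scroll back" pulls the next page of older ticks,
// tracking a cursor so pages never overlap.
function makePager(pageSize) {
  let cursor = Date.now();
  return function loadOlder() {
    const page = fetchTicks(cursor, pageSize);
    if (page.length > 0) cursor = page[page.length - 1].time;
    return page;
  };
}
```

A chart would call `loadOlder()` whenever the user nears the left edge of the loaded history, appending each page until the store reports no older data.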
By leveraging these efficient technologies, we are providing a unique, cutting-edge analytics platform backed by a cloud-based data storage solution that can handle volumes of data once only dreamed about.
Going forward, we are continuing to build out TT’s analytics capabilities with things like yield pricing/charting, comparison charts, support for options and the ability to provide research capabilities via API access outside of the TT interface. We are also adding more historical data as we speak, and we’re excited that it will be available soon. Look for an announcement in the coming weeks.