@Xanaphlaxes "And bro, what are you coding that analyzes 1,000 events per second? My smallest timeframe is 15M candles. I know your name is Hyperscalper, but what could you be seeing at that precision? Does LeeLoo show accurate market depth or something?"
Yeah, pretty unbelievable, right? So the Rithmic "premium" Market Data feed has
no "aggregation" and reports every Market Depth (aka "The Book") event that it
can. Instead of crappy feeds that give you maybe 10 Tiers on the Bid and Ask sides
of the market (and are probably slowed down or aggregated), Rithmic gives you
a virtually unlimited number (at least 80 Tiers) of unaggregated Market Depth updates
in real time. They are better known for fast Order routing, but Market Data quality
is even MORE important.
Why? Well, for decades, I've worked on using Market Depth (aka placements of
Size@Price on "The Order Book") to predict where the market will move. It's a bit
too complex to explain fully here, but near the Inside Market, a "wave"
of Size can build on, say, the Bid relative to the Ask/Offer, which literally "pushes" against
the Market Price, thus predicting, in this case, that the market will move up.
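My full model is way too involved to post, but here's a deliberately simplified C# sketch of the general idea: sum the resting Size within a few ticks of the Inside Market on each side and compare. The names, the dictionary representation of the book, and the window width are all illustrative only, not my actual code:

```csharp
using System;
using System.Collections.Generic;

static class BookPressure
{
    // Deliberately simplified illustration -- NOT the full model.
    // Sums resting size within 'ticksNearInside' ticks of the inside
    // market on each side and returns an imbalance in -1..+1.
    // Positive => more size "pushing" from the Bid (upward pressure).
    public static double Imbalance(
        IDictionary<double, long> bids,   // price -> resting size
        IDictionary<double, long> asks,
        double bestBid, double bestAsk,
        double tickSize, int ticksNearInside)
    {
        double window = ticksNearInside * tickSize;
        long bidSize = 0, askSize = 0;

        foreach (var kv in bids)
            if (kv.Key >= bestBid - window) bidSize += kv.Value;
        foreach (var kv in asks)
            if (kv.Key <= bestAsk + window) askSize += kv.Value;

        long total = bidSize + askSize;
        return total == 0 ? 0.0 : (double)(bidSize - askSize) / total;
    }
}
```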
So, yes, especially when the Market first opens, there can be "bursts" of data
with local data rates above 1,000 updates per second. Naturally, you don't want
ALL of the Market Depth, and might reject those which are, say, 40 tiers away from
the current Market Price...
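For example, a small gate like this (a sketch; the names and tick math are illustrative, though the 40-tier cutoff is the figure I mentioned) lets the depth callback drop far-away updates before they ever reach the queue:

```csharp
static class DepthFilter
{
    const int MaxTicksFromInside = 40;  // cutoff quoted above

    // Returns false for updates too far from the inside market,
    // so the callback can skip enqueueing them entirely.
    public static bool IsNearInside(double price, double bestBid,
                                    double bestAsk, double tickSize)
    {
        double distance = price < bestBid ? bestBid - price
                        : price > bestAsk ? price - bestAsk
                        : 0.0;
        return distance <= MaxTicksFromInside * tickSize;
    }
}
```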
To do this, even compiled C# may not be fast enough, given that you need to
do some computations on each data item. So the answer is that you Capture
and Queue each event's essential data, then return from the OnMarketDepth callback
so you're ready for the next one. Meanwhile, your QueueProcessor is working as fast as it can,
processing the queued data. My latencies between capture and processing on
the Queue thread are usually <2 milliseconds. And in "pathological" situations,
the Queue manager is permitted to discard older pieces of data, if it finds that
it simply cannot keep up; but most of the time that's not needed.
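Here's a minimal, self-contained sketch of that capture-and-queue pattern. It's illustrative, not my production code: the struct, the class names, and the discard thresholds are made up, and Capture() is what you'd call from inside OnMarketDepth before returning.

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

// Minimal sketch of the capture-and-queue pattern described above.
// Names (DepthEvent, DepthQueue) and thresholds are illustrative.
readonly struct DepthEvent
{
    public readonly long CaptureTicks;  // Stopwatch timestamp at capture
    public readonly double Price;
    public readonly long Size;
    public readonly bool IsBid;
    public DepthEvent(double price, long size, bool isBid)
    {
        CaptureTicks = System.Diagnostics.Stopwatch.GetTimestamp();
        Price = price; Size = size; IsBid = isBid;
    }
}

sealed class DepthQueue
{
    readonly BlockingCollection<DepthEvent> queue =
        new BlockingCollection<DepthEvent>(new ConcurrentQueue<DepthEvent>());

    // Called from the data-feed callback (e.g. OnMarketDepth): capture the
    // essentials, enqueue, and return immediately so the feed isn't blocked.
    public void Capture(double price, long size, bool isBid) =>
        queue.Add(new DepthEvent(price, size, isBid));

    // Runs on its own thread, draining the queue as fast as it can.
    public void ProcessLoop(CancellationToken ct)
    {
        double msPerTick = 1000.0 / System.Diagnostics.Stopwatch.Frequency;
        foreach (var ev in queue.GetConsumingEnumerable(ct))
        {
            double latencyMs =
                (System.Diagnostics.Stopwatch.GetTimestamp() - ev.CaptureTicks) * msPerTick;

            // "Pathological" backpressure: discard stale events instead of
            // falling further behind (both thresholds are illustrative).
            if (latencyMs > 50 && queue.Count > 10_000) continue;

            // ... per-event computations go here ...
        }
    }
}
```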
And I actually pull in the NQ eMini contract's data and, concurrently,
the MNQ micro contract's data. So that Doubles up the data
rates I have to handle.
With a reasonable amount of effort, anyone can learn how to do this stuff;
you just need persistence, to keep moving toward your goals!
But, hey, it's just another day at the office! LOL
hyperscalper