Hey Neal,

If you're interested, I have a script that scrapes Level 2 DOM data and saves it to a text file. The basic layout is this: incoming Level 2 DOM ticks are captured to a globally accessible dataframe via the TickerUpdate event, and the script's main body runs on a 15-second loop that copies the dataframe and exports the copy. I copy it before exporting because exporting the live dataframe while tickers were still coming in sometimes caused access issues. I should also mention that the dataframe kept growing throughout the day, and trimming it periodically caused the same access issues. I asked ChatGPT for a way around that, and it gave me a custom data class that essentially wraps the dataframe and trims it every time a record is added; anything older than 60 seconds gets dropped. Setting it up as a custom class gave me the self-trimming behavior without the access issues.
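To give you a rough idea of the shape of it, here's a quick sketch. It isn't the actual script; the class name, column names, and the TickerUpdate hookup are placeholders, and the lock is just my stand-in for however the access issues actually get avoided. But it shows the self-trimming dataframe idea and the 15-second export loop:

import threading
import time

import pandas as pd


class SelfTrimmingFrame:
    """Holds Level 2 records in a dataframe and trims anything older
    than max_age_s every time a record is added."""

    def __init__(self, max_age_s=60):
        self.max_age_s = max_age_s
        self._lock = threading.Lock()
        self._df = pd.DataFrame()

    def add(self, record):
        # Stamp the record, append it, then drop rows outside the window.
        now = time.time()
        row = pd.DataFrame([{**record, "timestamp": now}])
        with self._lock:
            self._df = pd.concat([self._df, row], ignore_index=True)
            self._df = self._df[self._df["timestamp"] >= now - self.max_age_s]

    def snapshot(self):
        # Hand back a copy so the export never touches the live frame mid-update.
        with self._lock:
            return self._df.copy()


dom = SelfTrimmingFrame(max_age_s=60)


def on_ticker_update(tick):
    # Wire this to the feed's TickerUpdate event; the tick fields are guesses.
    dom.add({"symbol": tick["symbol"], "price": tick["price"], "size": tick["size"]})


if __name__ == "__main__":
    # Main body: every 15 seconds export a snapshot of the dataframe.
    while True:
        dom.snapshot().to_csv("level2_dom.txt", sep="\t", index=False)
        time.sleep(15)

The snapshot() copy there is basically the same copy-before-export step I described above, just moved inside the class.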

Now that I'm thinking about it, I could probably stop copying the dataframe in the main loop too, since the custom class seems to avoid the access issues on its own. I'll probably leave it as is, though. I find Python can be real finicky and delicate at times. Anywhoozle, the script has some extra things in it that I'd have to clean up, but if you're interested I'll do that and post what I've got.