Hi everyone,
I've been fiddling around with VBT and have gotten to the point where I can test my strategies on around 1,200 tickers over a two-week period.
I use 3 different timeframes to generate signals for each ticker. For example, TSLA yields 3 datasets, ranging from 111 rows (15min timeframe) to 2,023 rows (3min timeframe), which together make up one timeframe set.
I'm testing 4 different timeframe sets. The screenshot below shows the sizes of the different timeframe pickle files, if that helps give a sense of the size of the dataframes.
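For a rough sense of scale, here's a back-of-envelope estimate of one timeframe set held in memory at the densest (3min) resolution. The 6 float64 columns per ticker (OHLCV plus one extra) are my assumption, not from the post:

```python
# Back-of-envelope memory estimate for one timeframe set at 3min resolution.
# The column count and float64 dtype are assumptions for illustration.
tickers = 1200
rows = 2023          # 3min bars over ~2 weeks, per the post
cols = 6             # OHLCV + 1 extra column per ticker (assumed)
bytes_total = tickers * rows * cols * 8  # float64 = 8 bytes each
print(f"{bytes_total / 1e6:.0f} MB")
```

So the raw numeric data for one set is on the order of 100MB; the 15GB+ peak likely comes from intermediate copies made during downloading and resampling, not the final frames themselves.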
I was testing one strategy this afternoon and realized that the vbt.Portfolio.from_signals portion of the code can take up to an hour to process before showing results. I know it's an hour because my code sends a Telegram message when important stages complete.
The screenshot below shows that roughly 1h passed, from 8:11pm to 9:14pm.
The earlier stage of pulling stock data is fairly fast, with my computer maxing out memory (15GB+) and the CPU running at around 30% utilization (I'm using multi-threading to download data; no idea how to use multi-processing yet).
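On the multi-threading vs multi-processing point: the stdlib concurrent.futures API makes the two nearly interchangeable, which is handy if the download stage ever becomes CPU-bound. A minimal sketch with a dummy work function (the names here are illustrative, not from my actual code):

```python
from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

def fetch(ticker):
    # Placeholder for an I/O-bound download; threads suit this well
    # because the worker spends most of its time waiting on the network.
    return ticker.lower()

tickers = ["TSLA", "AAPL", "MSFT"]

# Threads: good for I/O-bound work like downloading data.
with ThreadPoolExecutor(max_workers=8) as ex:
    results = list(ex.map(fetch, tickers))

# For CPU-bound work, swap in ProcessPoolExecutor with the same API;
# note the work function must then be importable (defined at module
# top level) and the pool created under `if __name__ == "__main__":`.
print(results)
```

Since the two executors share one interface, switching is usually a one-line change.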
But when it gets to the vbt.Portfolio.from_signals part, the fans stop whining and CPU utilization drops to around 8%.
This is the relevant bit of my code. Am I missing something, or is it usual to wait this long for the size of the test I'm running?
I'm now running another backtest for 2 strategies at the same time, and I've been waiting 1.5hrs for the results.
Hope there is some way to speed this part up!
1st run (1 strategy):
CPU: Intel Core i5-4670, 16GB RAM, desktop.
2nd run (2 strategies):
CPU: Intel Core i7-1165G7, 32GB RAM, notebook.
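For context on the low CPU utilization: as far as I understand, vectorbt's simulation is Numba-compiled and single-core by default, which would match the ~8% figure (roughly one core on these CPUs). One thing worth checking is that all tickers go into a single wide DataFrame (rows = bars, columns = tickers) in one from_signals call, rather than a per-ticker loop. A sketch of that layout with synthetic data; the vbt call itself is shown as a comment since my real inputs aren't reproduced here:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
tickers = ["TSLA", "AAPL", "MSFT"]  # stand-ins for the ~1200 real symbols
idx = pd.date_range("2024-01-02 09:30", periods=111, freq="15min")

# One wide frame per input: rows are bars, columns are tickers.
close = pd.DataFrame(
    rng.lognormal(size=(len(idx), len(tickers))).cumsum(axis=0),
    index=idx, columns=tickers,
)
entries = pd.DataFrame(
    rng.random((len(idx), len(tickers))) > 0.95,  # sparse random entry signals
    index=idx, columns=tickers,
)
exits = entries.shift(5, fill_value=False)  # exit 5 bars after each entry

# With vectorbt installed, one call simulates all columns at once:
# import vectorbt as vbt
# pf = vbt.Portfolio.from_signals(close, entries, exits, freq="15min")

print(close.shape, entries.shape)
```

Stacking everything into one call keeps the work inside the compiled loop instead of paying Python-level overhead per ticker.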