
Commit 3e01dc9

Merge remote-tracking branch 'me/handler_cache_demo' into handler_cache_demo
you-n-g committed Sep 26, 2021
2 parents 759705b + 4581da3 commit 3e01dc9
Showing 2 changed files with 2 additions and 3 deletions.
1 change: 0 additions & 1 deletion in examples/data_demo/data_cache_demo.py

@@ -5,7 +5,6 @@
 - To show the data modules of Qlib is Serializable, users can dump processed data to disk to avoid duplicated data preprocessing
 """

-from copy import deepcopy
 from copy import deepcopy
 from pathlib import Path
 import pickle
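The single deletion above just removes a duplicated `from copy import deepcopy` import. For context, the point of this demo (per the docstring lines shown) is that Qlib's processed data objects are serializable, so a handler can be dumped to disk once and reloaded later instead of re-running preprocessing. Below is a minimal sketch of that caching idea, not the demo's own code: the `Alpha158` handler config, the cache path, and the use of plain `pickle` are illustrative assumptions (the demo may rely on Qlib's own serialization helpers instead), and `qlib.init()` is assumed to point at locally prepared data.

# Sketch: cache a processed Qlib handler on disk so later runs skip preprocessing.
# The handler config and cache path here are illustrative, not taken from the demo.
import pickle
from pathlib import Path

import qlib
from qlib.utils import init_instance_by_config

qlib.init()  # assumes Qlib's default data have been prepared locally

handler_config = {
    "class": "Alpha158",
    "module_path": "qlib.contrib.data.handler",
    "kwargs": {"instruments": "csi300", "start_time": "2008-01-01", "end_time": "2020-08-01"},
}

cache_path = Path("handler_cache.pkl")
if cache_path.exists():
    # later runs: reload the already-processed handler from disk
    with cache_path.open("rb") as f:
        handler = pickle.load(f)
else:
    # first run: build and preprocess the handler, then dump it for reuse
    handler = init_instance_by_config(handler_config)
    with cache_path.open("wb") as f:
        pickle.dump(handler, f)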
4 changes: 2 additions & 2 deletions in examples/data_demo/data_mem_resuse_demo.py

@@ -50,10 +50,10 @@
 # this will save the time to reload and process data from disk(in `DataHandlerLP`)
 # It still takes a lot of time in the backtest phase
 for i in range(repeat):
-    task_train(new_task["task"], experiment_name=exp_name)
+    task_train(new_task, experiment_name=exp_name)

 # 4) User can change other parts exclude processed data in memory(handler)
 new_task = deepcopy(task_config["task"])
 new_task["dataset"]["kwargs"]["segments"]["train"] = ("20100101", "20131231")
 with TimeInspector.logt("The time with reusing processed data in memory:"):
-    task_train(new_task["task"], experiment_name=exp_name)
+    task_train(new_task, experiment_name=exp_name)
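Both hunks apply the same fix: `new_task` is created as `deepcopy(task_config["task"])` (see the context line above), so it already is the task dict, and indexing it again with `["task"]` would fail; `task_train` should receive `new_task` directly. The sketch below reconstructs the surrounding reuse pattern for readers without the full file. Only the deepcopy, the segments key, the `TimeInspector.logt` timing, and the corrected `task_train` call come from the hunk; the config loading, the handler-substitution step, the experiment name, and the import paths are assumptions that may differ from the actual demo or Qlib version.

# Sketch of reusing an in-memory processed handler across tasks; pieces not
# visible in the diff (config path, handler substitution, exp_name) are assumed.
from copy import deepcopy

import qlib
import yaml
from qlib.log import TimeInspector
from qlib.model.trainer import task_train
from qlib.utils import init_instance_by_config

qlib.init()

config_path = "workflow_config_lightgbm_Alpha158.yaml"  # hypothetical workflow config
with open(config_path) as f:
    task_config = yaml.safe_load(f)

exp_name = "data_mem_reuse_demo"  # illustrative experiment name

# build the handler once, then put the live object back into the task config,
# so every task_train call below reuses the processed data already in memory
handler = init_instance_by_config(task_config["task"]["dataset"]["kwargs"]["handler"])
task_config["task"]["dataset"]["kwargs"]["handler"] = handler

# change parts outside the handler, e.g. the training segment, then retrain
new_task = deepcopy(task_config["task"])
new_task["dataset"]["kwargs"]["segments"]["train"] = ("20100101", "20131231")

with TimeInspector.logt("The time with reusing processed data in memory:"):
    # pass the task dict itself; new_task has no "task" key, which is what the fix corrects
    task_train(new_task, experiment_name=exp_name)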
