DanyeongLee/dream-challenge-gene-expression

Built on PyTorch Lightning with Hydra-based configuration.

Description

  • train
    • Trains on the training data
    • Validates on the validation split (which split is used depends on the 'fold' argument)
    • Logs to wandb
    • Saves the best checkpoint
    • When training ends, tests with HQ_testdata and logs the results to wandb
  • test
    • Tests with HQ_testdata
    • Logs to wandb
  • predict
    • Writes a submission.txt file from the challenge test data
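The fold-dependent validation split described above can be sketched as follows. This is a guess for illustration only: the function name, the 5-fold default, and the seed are assumptions, not the repository's actual implementation.

```python
import random

def split_by_fold(n_samples, fold, n_folds=5, seed=42):
    # Hypothetical sketch: shuffle indices once, then take every
    # n_folds-th index (offset by 'fold') as the validation split.
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    val = idx[fold::n_folds]
    val_set = set(val)
    train = [i for i in idx if i not in val_set]
    return train, val
```

With a fixed seed, each value of `fold` selects a disjoint validation slice, so the five runs together cover the whole training set.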

How to run

View the help message to see all available parameters

python train.py --help

Train a model with a configuration chosen from configs/model/ (you should write the .yaml config file first)

python train.py model=deepfamq_conjoined_adamw
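A model config under configs/model/ might look like the sketch below. Every field name and target path here is illustrative, assumed from the common Lightning-Hydra template convention, not copied from this repository.

```yaml
# configs/model/deepfamq_conjoined_adamw.yaml -- illustrative sketch only
_target_: src.models.deepfamq_module.DeepFamQLitModule  # assumed module path

optimizer:
  _target_: torch.optim.AdamW
  lr: 1e-3
  weight_decay: 0.01

net:
  conv_kernel_size: 15
```

Hydra instantiates whatever `_target_` points to, so any key under it (such as `net.conv_kernel_size`) becomes overridable from the command line.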

You can override any parameter from the command line like this (or simply edit the config files)

python train.py trainer.max_epochs=20 datamodule.batch_size=64 model.net.conv_kernel_size=15

Set the 'name' argument to describe the settings you used (the ckpt directory and wandb group name are set automatically from it)

python train.py model=deepfamq_conjoined_adamw model.net.conv_kernel_size=15 name=deepfamq_conjoined_adamw_conv15

Train models with the 5-fold CV scheme (you can use Snakemake to wrap these runs)

python train.py model=deepfamq_conjoined_adamw trainer.gpus=[0] fold=0

python train.py model=deepfamq_conjoined_adamw trainer.gpus=[0] fold=1

                              ...
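The fold-wise runs above can also be generated programmatically instead of typed by hand. This sketch only builds the command lines, mirroring the manual invocations; actually launching them is left as a comment so the snippet stays side-effect free.

```python
def fold_commands(model="deepfamq_conjoined_adamw", gpus="[0]", n_folds=5):
    # One Hydra command line per fold, matching the invocations above.
    return [
        ["python", "train.py", f"model={model}", f"trainer.gpus={gpus}", f"fold={k}"]
        for k in range(n_folds)
    ]

# To launch them sequentially:
#   import subprocess
#   for cmd in fold_commands():
#       subprocess.run(cmd, check=True)
```

Passing the override as a separate list element also sidesteps shell globbing of the `[0]` bracket.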

Train a model on the whole training set and validate with HQ_testdata (just set fold=None)

python train.py model=deepfamq_conjoined_adamw trainer.gpus=[0] fold=None

Test a model with HQ_testdata (ckpt_path is set automatically from the 'name' and 'fold' arguments)

python test.py model=deepfamq_conjoined_adamw name=deepfamq_conjoined_adamw_conv15 fold=0
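One way ckpt_path could be derived from 'name' and 'fold' is sketched below. The directory layout and file name are hypothetical, chosen for illustration; the template's actual layout may differ.

```python
from pathlib import Path

def resolve_ckpt_path(name, fold, root="logs/checkpoints"):
    # Hypothetical layout: <root>/<name>/fold_<fold>/best.ckpt
    # (a full-data run, fold=None, gets its own "full" directory).
    fold_dir = "full" if fold is None else f"fold_{fold}"
    return Path(root) / name / fold_dir / "best.ckpt"
```

Deriving the path from the same two arguments used at training time is what lets test.py and predict.py find the checkpoint without an explicit ckpt_path flag.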

Make the prediction file to be submitted (ckpt_path is set automatically from the 'name' and 'fold' arguments)

python predict.py model=deepfamq_conjoined_adamw name=deepfamq_conjoined_adamw_conv15 fold=0

About

Predicting gene expression from promoter sequences.
