
Commit

Use default docker paths
simsa-st committed Mar 13, 2023
1 parent e19c817 commit 9b1136e
Showing 2 changed files with 4 additions and 4 deletions.
2 changes: 1 addition & 1 deletion baselines/README.md
@@ -31,7 +31,7 @@ We provide code to reproduce all results that can be found in the [paper](../REA

The code is structured into three subfolders:
* [NER](NER/) contains most of the baselines code, including training code for RoBERTa, LayoutLMv3 and RoBERTa pretraining, and the inference code.
-* [layoutlmv3_pretraing](layoutlmv3_pretrain/) [coming soon] contains code for LayoutLMv3 pretraining
+* [layoutlmv3_pretraing](layoutlmv3_pretrain/) contains code for LayoutLMv3 pretraining.
* [table-transformer](table-transformer/) [coming soon] contains code for DETR used for table and Line Item detection.

## Results on the validation set
6 changes: 3 additions & 3 deletions baselines/layoutlmv3_pretrain/pretrain.py
@@ -568,8 +568,8 @@ def __len__(self):


def main():
-    dataset_path = Path("../../data/docile")
-    split = "unlabeled.json"
+    dataset_path = Path("/app/data/docile")
+    split = "unlabeled"
    model_name = "microsoft/layoutlmv3-base"
    batch_size = 128
    num_workers = 0
@@ -578,7 +578,7 @@ def main():
    weight_decay = 0.5
    gradient_norm_clip_val = 5.0
    optimizer = "AdamW"
-    ckpt_dir = "./saved_models"
+    ckpt_dir = "/app/data/baselines/trainings/layoutlmv3_pretraining"
    num_gpus = 8
    num_nodes = 1
    max_epochs = 30
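The changed constants swap relative, checkout-local paths for absolute paths under `/app/data`. Below is a minimal sketch of the difference, assuming (the commit itself does not state this) that the host data directory is mounted at `/app/data` inside the project's docker container, e.g. via `docker run -v $(pwd)/data:/app/data ...`:

```python
from pathlib import Path

# Illustration only, not part of the repository: the old relative path
# resolves against the current working directory, while the new absolute
# path always points at the directory assumed to be mounted into the
# container at /app/data.
old_dataset_path = Path("../../data/docile")  # depends on where pretrain.py is launched from
new_dataset_path = Path("/app/data/docile")   # fixed location inside the container

print(old_dataset_path.resolve())  # varies with the working directory
print(new_dataset_path)            # always /app/data/docile
```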
