Commit
Fix finetuning batch size
rohan-mehta committed Aug 8, 2023
1 parent 597e634 commit 74c0100
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion llama_finetuning.py
```diff
@@ -180,7 +180,7 @@ def main(**kwargs):
     # Create DataLoaders for the training and validation dataset
     train_dataloader = torch.utils.data.DataLoader(
         dataset_train,
-        batch_size=train_config.batch_size_training,
+        batch_size=train_config.micro_batch_size,
         num_workers=train_config.num_workers_dataloader,
         pin_memory=True,
         sampler=train_sampler if train_sampler else None,
```
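The fix matters when gradient accumulation is in use: the DataLoader should yield micro-batches, and the effective training batch size is then reached by accumulating gradients over several micro-batches before each optimizer step. A minimal sketch of that relationship, assuming the two config names from the diff (the concrete values and the `effective_batch_size` helper are illustrative, not the repo's code):

```python
# Hypothetical config values mirroring train_config in the diff.
batch_size_training = 32  # effective (global) batch size per optimizer step
micro_batch_size = 4      # samples the DataLoader actually yields per iteration

# Gradient accumulation bridges the two: the optimizer steps only after
# enough micro-batches have been processed to cover the training batch size.
gradient_accumulation_steps = batch_size_training // micro_batch_size

def effective_batch_size(micro: int, accum_steps: int) -> int:
    """Number of samples contributing to each optimizer update."""
    return micro * accum_steps

print(gradient_accumulation_steps)                                          # 8
print(effective_batch_size(micro_batch_size, gradient_accumulation_steps))  # 32
```

Passing `batch_size_training` to the DataLoader, as the old code did, would have made each micro-batch as large as the intended global batch, multiplying memory use by the accumulation factor.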
