Commit 3afbe13

s/sentencepiece/tiktoken
1 parent 9e8608f commit 3afbe13

File tree

1 file changed: +1 -1 lines changed

llama/tokenizer.py

+1-1
@@ -82,7 +82,7 @@ def __init__(self, model_path: str):
             mergeable_ranks=mergeable_ranks,
             special_tokens=self.special_tokens,
         )
-        logger.info(f"Reloaded SentencePiece model from {model_path}")
+        logger.info(f"Reloaded tiktoken model from {model_path}")

         # BOS / EOS token IDs
         self.n_words: int = self.model.n_vocab
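For context on why the log message now says "tiktoken": the tokenizer this diff touches builds a tiktoken `Encoding` from a plain-text model file of base64-encoded merge ranks rather than a SentencePiece binary. A minimal, stdlib-only sketch of that rank-file parsing step (function name and sample data are hypothetical, not from the source) might look like:

```python
import base64

def load_ranks(lines):
    # Each line of a tiktoken-style BPE rank file is "<base64-token> <rank>".
    ranks = {}
    for line in lines:
        token_b64, rank = line.split()
        ranks[base64.b64decode(token_b64)] = int(rank)
    return ranks

# Hypothetical two-line rank file: "Hello" and " world" with ranks 0 and 1.
sample = ["SGVsbG8= 0", "IHdvcmxk 1"]
mergeable_ranks = load_ranks(sample)
# mergeable_ranks maps raw token bytes to their merge priority.
```

In the real tokenizer, a dictionary like this is what gets passed as the `mergeable_ranks=` argument shown in the diff above.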

0 commit comments