feat: Add support for JiebaTokenSplitter in tokenizer.py
simonChoi034 committed Jun 27, 2024
1 parent 8ea5cba commit d549637
Showing 1 changed file with 1 addition and 1 deletion: gliner/data_processing/tokenizer.py
@@ -86,7 +86,7 @@ def __init__(self, splitter_type='whitespace'):
        elif splitter_type == 'jieba':
            self.splitter = JiebaTokenSplitter()
        else:
-           raise ValueError(f"{splitter_type} is not implemented, choose between 'whitespace', 'spacy' and 'mecab'")
+           raise ValueError(f"{splitter_type} is not implemented, choose between 'whitespace', 'spacy', 'jieba' and 'mecab'")

    def __call__(self, text):
        for token in self.splitter(text):
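The one-line change above only corrects the error message so that 'jieba' appears in the list of supported splitter types. The dispatch pattern it sits inside can be sketched as follows; note that `WhitespaceTokenSplitter` and `TokenSplitterDispatcher` here are simplified hypothetical stand-ins, not GLiNER's actual classes, and only the whitespace branch is implemented:

```python
import re

class WhitespaceTokenSplitter:
    """Minimal splitter: yields (token, start, end) spans for
    whitespace-delimited tokens. A stand-in for GLiNER's real splitters."""
    def __call__(self, text):
        for match in re.finditer(r"\S+", text):
            yield match.group(), match.start(), match.end()

class TokenSplitterDispatcher:
    """Sketch of the splitter_type dispatch the commit modifies.
    Only 'whitespace' is wired up in this sketch; any other type raises
    the ValueError whose message the commit updates to mention 'jieba'."""
    def __init__(self, splitter_type="whitespace"):
        if splitter_type == "whitespace":
            self.splitter = WhitespaceTokenSplitter()
        else:
            raise ValueError(
                f"{splitter_type} is not implemented, choose between "
                "'whitespace', 'spacy', 'jieba' and 'mecab'"
            )

    def __call__(self, text):
        yield from self.splitter(text)
```

In the real code each branch constructs a backend-specific splitter (spaCy, MeCab, or Jieba for Chinese segmentation), and the `else` branch is the last resort a user hits with an unrecognized `splitter_type`, which is why listing every valid option in the message matters.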
