Fix issue with loading non-transformer LLM models in Extractor/RAG pipeline, closes #734
davidmezzetti committed Jun 18, 2024
1 parent 42583f0 commit 7073084
Showing 1 changed file with 3 additions and 2 deletions.

src/python/txtai/pipeline/text/extractor.py
@@ -6,7 +6,7 @@

 from ..base import Pipeline
 from ..data import Tokenizer
-from ..llm import LLM
+from ..llm import GenerationFactory, LLM

 from .questions import Questions
 from .similarity import Similarity
@@ -159,7 +159,8 @@ def load(self, path, quantize, gpu, model, task, **kwargs):
             return path

         # Attempt to resolve task if not provided
-        task = task if task else Models.task(path, **kwargs)
+        task = GenerationFactory.method(path, task)
+        task = Models.task(path, **kwargs) if task == "transformers" else task

         # Load Questions pipeline
         if task == "question-answering":
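The change resolves the generation backend first and only falls back to transformers-specific task detection when the model actually is a transformers model, so non-transformer LLMs (e.g. GGUF models) are no longer misclassified. A minimal standalone sketch of that control flow, where `detect_method` and `detect_transformers_task` are hypothetical stand-ins for `GenerationFactory.method` and `Models.task`:

```python
def detect_method(path, task=None):
    # Hypothetical stand-in for GenerationFactory.method: returns the
    # generation backend for a model path, honoring an explicit task first.
    if task:
        return task
    return "llama.cpp" if path.endswith(".gguf") else "transformers"

def detect_transformers_task(path):
    # Hypothetical stand-in for Models.task: classifies a transformers
    # model (e.g. "question-answering" vs "language-generation").
    return "language-generation"

def resolve(path, task=None):
    # Mirrors the patched logic: resolve the backend first, then run
    # transformers task detection only for transformers models.
    task = detect_method(path, task)
    return detect_transformers_task(path) if task == "transformers" else task

print(resolve("model.gguf"))         # non-transformer model keeps its backend name
print(resolve("bert-base-uncased"))  # transformers model goes through task detection
```

Under the old logic (`task = task if task else Models.task(path, **kwargs)`), a path like `model.gguf` would have been handed straight to transformers task detection; the new ordering routes it to its own backend instead.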
