
Comments (6)

drazvan commented on July 18, 2024

@swapnil3597: when registering an LLM provider with register_llm_provider, you need to register the class rather than a specific instance. In your case you're passing TgiLlm(), which is an instance. Changing the registration line to the following should fix it:

register_llm_provider("tgi_llm", TgiLlm)

Let me know if this solved your issue.
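For context, here is a toy, self-contained sketch (a hypothetical registry, not the actual NeMo Guardrails internals) of why the class is needed: the framework later instantiates the registered provider itself via provider_cls(**kwargs), as the traceback in the next comment shows, so registering an instance means the framework ends up calling the instance instead of constructing one.

```python
# Toy sketch, NOT the real NeMo Guardrails internals: a hypothetical registry
# illustrating why the class (not an instance) must be registered. The
# framework later does provider_cls(**kwargs), i.e. it expects something it
# can call to construct the LLM.
class TgiLlm:
    def __init__(self, temperature: float = 0.0):
        self.temperature = temperature

_registry = {}

def register_llm_provider(name, provider_cls):
    # Store the class itself; instantiation happens later inside the framework.
    _registry[name] = provider_cls

register_llm_provider("tgi_llm", TgiLlm)     # correct: pass the class
llm = _registry["tgi_llm"](temperature=0.7)  # framework-side: provider_cls(**kwargs)
print(type(llm).__name__)  # TgiLlm
```

If an instance had been registered instead, `_registry["tgi_llm"](temperature=0.7)` would invoke `TgiLlm.__call__` on the instance, which is exactly the `BaseLLM.__call__() missing 1 required positional argument: 'prompt'` error reported below.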

from nemo-guardrails.

swapnil3597 commented on July 18, 2024

@drazvan, I tried wrapping the TGI client's generate call with LangChain's LLM, and I'm getting the following error:

LLM Wrapper Code:

from typing import Any, List, Mapping, Optional
from langchain.callbacks.manager import CallbackManagerForLLMRun
from langchain.llms.base import LLM
from text_generation import Client

from nemoguardrails.llm.helpers import get_llm_instance_wrapper
from nemoguardrails.llm.providers import register_llm_provider


class TgiLlm(LLM):
    # Initializing TGI client
    url = "http://127.0.0.1:8080"
    tgi_client = Client(url, timeout=240)

    # Inference params
    max_new_tokens = 512
    repetition_penalty = 1.1
    temperature = 0

    @property
    def _llm_type(self) -> str:
        return "TGI_CUSTOM"

    def _call(
            self,
            prompt: str,
            stop: Optional[List[str]] = None,  # Same as stop_sequences
            run_manager: Optional[CallbackManagerForLLMRun] = None,
    ) -> str:
        if stop is not None:
            raise ValueError("stop kwargs are not permitted.")

        return self.tgi_client.generate(
            prompt=prompt,
            max_new_tokens=self.max_new_tokens,
            repetition_penalty=self.repetition_penalty,
            temperature=self.temperature,
            stop_sequences=stop
        ).generated_text

    @property
    def _identifying_params(self) -> Mapping[str, Any]:
        """Get the identifying parameters."""
        return {
            "max_new_tokens": self.max_new_tokens,
            "repetition_penalty": self.repetition_penalty,
            "temperature": self.temperature
        }



register_llm_provider("tgi_llm", TgiLlm())

Getting the following error:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[2], line 2
      1 config = RailsConfig.from_path("/home/ubuntu/llm/nemoguardrails_poc/tgi_with_guardrails/")
----> 2 app = LLMRails(config)

File ~/environments/llm_new/lib/python3.10/site-packages/nemoguardrails/rails/llm/llmrails.py:86, in LLMRails.__init__(self, config, llm, verbose)
     83 self.runtime = Runtime(config=config, verbose=verbose)
     85 # Next, we initialize the LLM engines (main engine and action engines if specified).
---> 86 self._init_llms()
     87 # Next, we initialize the LLM Generate actions and register them.
     88 actions = LLMGenerationActions(
     89     config=config,
     90     llm=self.llm,
     91     llm_task_manager=self.runtime.llm_task_manager,
     92     verbose=verbose,
     93 )

File ~/environments/llm_new/lib/python3.10/site-packages/nemoguardrails/rails/llm/llmrails.py:146, in LLMRails._init_llms(self)
    143             kwargs["model"] = llm_config.model
    145 if llm_config.type == "main" or len(self.config.models) == 1:
--> 146     self.llm = provider_cls(**kwargs)
    147     self.runtime.register_action_param("llm", self.llm)
    148 else:

TypeError: BaseLLM.__call__() missing 1 required positional argument: 'prompt'


drazvan commented on July 18, 2024

Hi @sam-h-bean!

This week we'll be pushing a new round of updates to the GitHub repo (they will be published to PyPI at the end of the month). One of the updates improves the ability to connect the guardrails to any LLM. Once that is pushed, you will need to wrap the /generate endpoint in a class that derives from BaseLanguageModel (from LangChain). Happy to assist you with that.
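A minimal, stdlib-only sketch of what such a wrapper could look like. It deliberately omits the LangChain base class so it stays self-contained; in practice you would subclass LLM/BaseLanguageModel as in the snippet above. The class name, the generated_field parameter, and the injectable post callable are illustrative assumptions, not part of any real API:

```python
import json
import urllib.request


class GenerateEndpointLLM:
    """Hypothetical stand-in for a LangChain LLM subclass: shows the shape of
    a _call method that POSTs a prompt to a TGI-style /generate endpoint."""

    def __init__(self, url, generated_field="generated_text", post=None):
        self.url = url
        self.generated_field = generated_field
        # Injectable transport so the sketch is testable without a live server.
        self._post = post or self._http_post

    def _http_post(self, payload):
        # Plain JSON POST using only the standard library.
        req = urllib.request.Request(
            self.url,
            data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())

    def _call(self, prompt, stop=None):
        payload = {"inputs": prompt, "parameters": {"stop": stop or []}}
        return self._post(payload)[self.generated_field]


# Usage with a fake transport standing in for the real endpoint:
fake = lambda payload: {"generated_text": payload["inputs"].upper()}
llm = GenerateEndpointLLM("http://localhost:8080/generate", post=fake)
print(llm._call("hello"))  # HELLO
```

The injectable `post` callable is just a testing convenience; with the real endpoint you would drop it and let `_http_post` do the request.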


sam-h-bean commented on July 18, 2024

@drazvan, for people who are not using LangChain, what options do they have?


drazvan commented on July 18, 2024

They don't need to use LangChain per se; LangChain is already a dependency of nemoguardrails. They would just need to implement the LangChain BaseLanguageModel interface like any other provider.


swapnil3597 commented on July 18, 2024

Thanks @drazvan, this worked.

