
Comments (6)

keenborder786 commented on July 16, 2024

Okay, I am checking it.


keenborder786 commented on July 16, 2024

Can you please share your code and prompt, as well as the output showing that the stop word is not working?


keenborder786 commented on July 16, 2024

I double-checked the code, and everything seems to be functional.


NikitaKlichko commented on July 16, 2024

I double-checked the code, and everything seems to be functional.

When the stop list is specified in the config (at initialization), it does not take effect, because in that case `stop` ends up as None in the generate params (see the code in the issue description above):

from langchain_community.llms import VLLM

generation_config = {
    'temperature': 0.01,
    'top_p': 0.9,
    'top_k': 30,
    'max_tokens': 1024,
    'repetition_penalty': 1.1,
    'stop': ['.'],
}
llm = VLLM(model=llm_name, dtype='float16', **generation_config)
llm.invoke('question')

But if we pass the stop list at invoke time:

llm.invoke('question', stop=['...'])

it works.
For the same reason, the stop list does not work when running, for example, a rag_chain:

rag_chain = (
    {"context": retriever, "question": RunnablePassthrough()}
    | prompt
    | llm
    | StrOutputParser()
)
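The behavior can be reproduced with plain dict merging; a minimal sketch (hypothetical names, mirroring how `_generate` builds its params):

```python
# Simulates: params = {**self._default_params, **kwargs, "stop": stop}
default_params = {"temperature": 0.01, "stop": ["."]}  # stop set at init time


def generate_params(stop=None, **kwargs):
    # "stop": stop is merged last, so it overwrites the default even when None
    return {**default_params, **kwargs, "stop": stop}


print(generate_params())              # {'temperature': 0.01, 'stop': None}
print(generate_params(stop=["..."]))  # call-time stop survives the merge
```

Calling through a chain behaves like the first case: nothing forwards `stop`, so the init-time list is silently discarded.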


keenborder786 commented on July 16, 2024

No, it won't be None: when you pass stop to the VLLM instance, i.e. during initialization, self.stop is passed to the vLLM client via the _default_params attribute. This is passed here, so it cannot be None regardless of whether you pass stop during initialization or during invocation.


NikitaKlichko commented on July 16, 2024

No, it won't be None: when you pass stop to the VLLM instance, i.e. during initialization, self.stop is passed to the vLLM client via the _default_params attribute. This is passed here, so it cannot be None regardless of whether you pass stop during initialization or during invocation.

If you do it like this, the stop token won't work with the current _generate implementation:

from langchain_community.llms import VLLM
from transformers import AutoTokenizer

generation_config = {
    'temperature': 0.01,
    'top_p': 0.9,
    'top_k': 30,
    'max_tokens': 1024,
    'repetition_penalty': 1.1,
    'stop': ['.'],
}
llm = VLLM(model=llm_name, dtype='float16', **generation_config)

SYSTEM = "You are an astronomer scientist. Explain step by step"
question = "How does the Moon differ from Earth?"
template = [{"role": "system", "content": f"{SYSTEM}"},
            {"role": "user", "content": "Question: {}".format(question)}]

llm_tokenizer = AutoTokenizer.from_pretrained(llm_name)
prompt = llm_tokenizer.apply_chat_template(template, tokenize=False, add_generation_prompt=True)
llm.invoke(prompt)
---------------------
ANSWER:
The Moon and Earth are two celestial bodies that are closely related yet distinct in many ways. Here's a step-by-step explanation of the main differences between them:

1. **Origin**: The Moon is thought to have formed about 4.5 billion years ago, shortly after the formation of the Earth. One theory is that the Moon was created when a massive object collided with the Earth, causing debris to be thrown into orbit and eventually coalesce into the Moon.

2. **Size and Mass**: The Moon is significantly smaller than Earth. It has a diameter of about 3,474 kilometers (2,159 miles), which is about one-quarter of Earth's diameter. The Moon's mass is about one-eighteenth of Earth's mass.
...and so on.

Check #15921

"""In the line params = {**self._default_params, **kwargs, "stop": stop}, the "stop" parameter from the "_default_params" method is overwritten by the local "stop" parameter. If no "stop" parameter is passed to the "_generate" method, it defaults to None, effectively ignoring the "stop" parameter set in the VLLM class"""

