RAG Prompt
Hello, I was trying the model on my RAG pipeline recently. I just took the very classic QA prompt from the LangChain hub and changed it to say "write 'no_answer'":
"""You are an assistant for question-answering tasks. Use the following pieces of retrieved context to answer the question. If you don't know the answer, just write 'no_answer'. Use three sentences maximum and keep the answer concise.
Question: {question}
Context: {context}
Answer:"""
and it always replies with 'no_answer'.
To reproduce this, I used the very same vLLM example you put on your model card.
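For reference, this is roughly what I am running (a minimal sketch of my setup; the model name, sampling parameters, and the example question/context are placeholders, not the exact values from your card):

```python
from vllm import LLM, SamplingParams

# The RAG prompt from the LangChain hub, with the 'no_answer' instruction added.
PROMPT = """You are an assistant for question-answering tasks. Use the following pieces of retrieved context to answer the question. If you don't know the answer, just write 'no_answer'. Use three sentences maximum and keep the answer concise.
Question: {question}
Context: {context}
Answer:"""

question = "..."           # placeholder: the user question
retrieved_context = "..."  # placeholder: retrieved chunks, close to the 8192-token budget

# Placeholder model name; in practice I load your model exactly as in the card example.
llm = LLM(model="<your-model>", max_model_len=8192)
sampling_params = SamplingParams(temperature=0.0, max_tokens=256)

prompt = PROMPT.format(question=question, context=retrieved_context)
outputs = llm.generate([prompt], sampling_params)
print(outputs[0].outputs[0].text)  # -> always 'no_answer'
```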
Do you have any idea why it replies with "no_answer" every time?
N.B.: I was previously running Llama 3 8B with this same prompt and the same setup (including the same context length of 8192), and it was working fine.
EDIT: the prompt from your needle-in-a-haystack test is the only one that works: "{context} {query} Don't give information outside the document or repeat your findings. Keep your response short and direct."
But if I add an instruction to that prompt telling the model not to answer when it doesn't know, it always says it doesn't know.
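Concretely, the modification I tried looks something like this (a sketch; the exact wording of the abstention sentence I appended may differ slightly):

```python
# Your needle-in-a-haystack prompt, plus an instruction to abstain when the
# answer is not in the document (the added last sentence is my own wording).
# Reuses llm, sampling_params, retrieved_context, and question from the snippet above.
HAYSTACK_PROMPT = (
    "{context} {query} Don't give information outside the document or repeat "
    "your findings. Keep your response short and direct. "
    "If the document does not contain the answer, just write 'no_answer'."
)

prompt = HAYSTACK_PROMPT.format(context=retrieved_context, query=question)
outputs = llm.generate([prompt], sampling_params)
print(outputs[0].outputs[0].text)
# Without the last sentence the model answers normally; with it, the model
# says it doesn't know / replies 'no_answer' every time.
```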