How to set the prompt correctly

by aristidecittadino

Hello everyone :)
I tried the LLM on https://chat.llamantino.it/ and it is amazing! So I wanted to use it here on Hugging Face through a dedicated endpoint, but the results are not as good as on the dedicated website.
Could you provide an example prompt (with the relevant tags) so I can use the model correctly? I want to build a RAG system on top of it.

Thanks for your time, have a nice day
Alessio

SWAP Research Group@UNIBA org

The template used is the standard Llama-3 one:

```
<|start_header_id|>system<|end_header_id|>

Sei un assistente AI per la lingua Italiana di nome LLaMAntino-3 ANITA (Advanced Natural-based interaction for the ITAlian language). Rispondi nella lingua usata per la domanda in modo chiaro, semplice ed esaustivo.<|eot_id|><|start_header_id|>user<|end_header_id|>

{user_prompt}<|eot_id|><|start_header_id|>assistant<|end_header_id|>

{assistant_prompt}<|eot_id|>
```

(The system prompt reads, in English: "You are an AI assistant for the Italian language named LLaMAntino-3 ANITA (Advanced Natural-based interaction for the ITAlian language). Answer in the language used for the question in a clear, simple and exhaustive way.")

From a computational point of view:

```python
prompt = (
    "<|start_header_id|>system<|end_header_id|>\n\n" + system_prompt + "<|eot_id|>"
    + "<|start_header_id|>user<|end_header_id|>\n\n" + user_prompt + "<|eot_id|>"
    + "<|start_header_id|>assistant<|end_header_id|>\n\n" + assistant_prompt + "<|eot_id|>"
)

I tested the LLM on its own and it works! Thank you :)
Now I'm just missing the last piece: if I want to add the context (retrieved via a vector search over stored documents), in which part of the prompt should I put it?
I tried adding it to {assistant_prompt}, but that didn't work.

SWAP Research Group@UNIBA org

The context must be included in the {user_prompt} section. The {assistant_prompt} refers to text already generated by the model; you include it only when you want the model to reason over the history of the dialogue with the end user. Take a look at the "multiple turn conversation" section here: https://llama.meta.com/docs/model-cards-and-prompt-formats/meta-llama-3/
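For example, a RAG-style user turn could be built like this. This is only a sketch: the helper name and the instruction wording are illustrative, not something the model requires, and `system_prompt`, `question`, and `retrieved_chunks` are placeholders for your own values:

```python
# Hypothetical helper: embed the retrieved passages in the user turn,
# then ask the question. The instruction wording is illustrative.
def build_rag_user_prompt(question: str, retrieved_chunks: list[str]) -> str:
    context = "\n\n".join(retrieved_chunks)
    return (
        "Rispondi alla domanda usando solo il contesto seguente.\n\n"  # "Answer using only the context below."
        f"Contesto:\n{context}\n\n"
        f"Domanda: {question}"
    )

messages = [
    {"role": "system", "content": system_prompt},
    {"role": "user", "content": build_rag_user_prompt(question, retrieved_chunks)},
]
# Earlier assistant replies go back in as "assistant" messages (the
# {assistant_prompt} blocks) only when you need multi-turn dialogue history.
```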

Totally clear, thank you
