app

Functions for Streamlit applications

Functions

call_llm(user_query)

Retrieve the context from the user query, set up the prompt, and call the LLM API to generate the response.

make_rag_prompt(user_query, passages)

Generate the RAG prompt.

app.call_llm(user_query: str) → Generator[str, None, None]

Retrieve the context from the user query, set up the prompt, and call the LLM API to generate the response.

Parameters

user_query : str

The user query.

Returns

llm response : Generator[str]

A generator that yields the LLM response as it streams.
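A minimal sketch of what this function could look like. The retrieval helper (`retrieve_passages`) and streaming client (`stream_llm`) are hypothetical stand-ins; the real module would query its index and call the LLM API with streaming enabled.

```python
from typing import Generator


def retrieve_passages(user_query: str) -> list[str]:
    # Hypothetical: the real app would look up relevant passages in its index.
    return ["Streamlit reruns the script on each user interaction."]


def stream_llm(prompt: str) -> Generator[str, None, None]:
    # Hypothetical: the real app would call the LLM API and stream its output.
    for token in prompt.split():
        yield token + " "


def make_rag_prompt(user_query: str, passages: list[str]) -> str:
    # Simplified prompt assembly; see make_rag_prompt below.
    context = "\n".join(passages)
    return f"Context:\n{context}\n\nQuestion: {user_query}"


def call_llm(user_query: str) -> Generator[str, None, None]:
    """Retrieve context, build the RAG prompt, and stream the LLM response."""
    passages = retrieve_passages(user_query)
    prompt = make_rag_prompt(user_query, passages)
    yield from stream_llm(prompt)
```

Returning a generator fits Streamlit's streaming display helpers, e.g. `st.write_stream(call_llm(user_query))`, which renders each yielded chunk as it arrives.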

app.make_rag_prompt(user_query: str, passages: list[str]) → str

Generate the RAG prompt.

Parameters

user_query : str

The user query.

passages : list[str]

List of relevant passages obtained from the index.

Returns

prompt : str

The prompt to pass to the LLM API.
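A sketch of a typical implementation. The exact prompt template is an assumption; the documented contract is only that the query and the retrieved passages are combined into a single string for the LLM.

```python
def make_rag_prompt(user_query: str, passages: list[str]) -> str:
    """Assemble a RAG prompt from the user query and retrieved passages."""
    # Number the passages so the model can refer back to them.
    context = "\n\n".join(
        f"Passage {i + 1}:\n{p}" for i, p in enumerate(passages)
    )
    return (
        "Answer the question using only the context below.\n\n"
        f"{context}\n\n"
        f"Question: {user_query}\n"
        "Answer:"
    )
```

Keeping prompt assembly in its own function makes the template easy to test and tweak independently of the retrieval and API-call logic.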