Hi! Thanks for your question!
Currently there's no DSL for that, but you can achieve it with the ordinary mockLLMAnswer(String). Just provide the JSON structure of your response as a String, formatted the way you would expect the LLM to generate it, and it should work.

If you check how PromptExecutor.executeStructured is implemented, you'll see the following:

repeat(retries) { attempt ->
    ...
    val response = execute(prompt = prompt, model = mainModel) // normal request; response is a Message.Response
    ...
    val structure = structureParser.parse(structure, response.content) // tries to parse `response.content` from a String
    ...
}

So providing a JSON string in tests should work the same way.
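To make the idea concrete, here's a minimal, dependency-free Kotlin sketch of the same flow. The names `MockExecutor`, `parseWeather`, and `WeatherForecast` are hypothetical stand-ins, not Koog API: `MockExecutor` plays the role of `mockLLMAnswer(String)` (the "LLM" always replies with a canned String), and `parseWeather` plays the role of `structureParser.parse(...)` from the snippet above.

```kotlin
// Hypothetical sketch, not Koog API: illustrates why a canned JSON String
// is enough -- the mocked answer flows through as response.content, and
// the structured parser only ever sees that String.

data class WeatherForecast(val city: String, val temperature: Int)

// Stand-in for mockLLMAnswer(String): always returns the canned answer.
class MockExecutor(private val cannedAnswer: String) {
    fun execute(prompt: String): String = cannedAnswer
}

// Tiny hand-rolled parser so the sketch stays dependency-free;
// in Koog this role is played by structureParser.parse(...).
fun parseWeather(json: String): WeatherForecast {
    val city = Regex("\"city\"\\s*:\\s*\"([^\"]+)\"").find(json)!!.groupValues[1]
    val temp = Regex("\"temperature\"\\s*:\\s*(-?\\d+)").find(json)!!.groupValues[1].toInt()
    return WeatherForecast(city, temp)
}

fun main() {
    // The mocked answer is the JSON you expect the real LLM to produce.
    val executor = MockExecutor("""{"city": "Berlin", "temperature": 21}""")
    val response = executor.execute("What is the weather in Berlin?")
    val structure = parseWeather(response)
    println(structure)
}
```

If the mocked String doesn't match the expected structure, the real implementation's retry loop kicks in; in a test you'd normally just assert that the parsed result equals the structure you encoded in the canned JSON.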

Answer selected by Ololoshechkin