Direct interface to LM Studio's OpenAI-compatible endpoint. Uses the messages array format.

Usage

lms_chat_openai(
  model,
  messages,
  host = "http://localhost:1234",
  logprobs = FALSE,
  simplify = TRUE,
  ...
)

Arguments

model

Character. The loaded model name.

messages

List. A list of messages, each itself a named list with `role` and `content` elements.

host

Character. Base URL of the LM Studio server. Defaults to "http://localhost:1234".

logprobs

Logical. Whether to request logprobs (currently stubbed by LM Studio).

simplify

Logical. If TRUE, the response is parsed and only the generated text is returned; if FALSE, the raw response list is returned.

...

Additional arguments passed through to the API request body.

Value

If simplify = FALSE, returns a list representing the raw JSON response; otherwise, returns a character string containing the generated text. If logprobs = TRUE, returns an lms_chat_result object whose log-probability field is NULL, because logprobs are currently stubbed in LM Studio's OpenAI-compatible endpoint.
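
Examples

A minimal sketch of both call styles, assuming a running LM Studio server on the default host with a model loaded. The model name below is illustrative; substitute the identifier of whichever model you have loaded.

```r
# Messages follow the OpenAI format: a list of role/content pairs
messages <- list(
  list(role = "system", content = "You are a concise assistant."),
  list(role = "user", content = "What is R's Recall() function for?")
)

# simplify = TRUE (the default): returns the generated text as a character string
reply <- lms_chat_openai(
  model = "llama-3.2-1b-instruct",  # illustrative model name
  messages = messages
)

# simplify = FALSE: returns the raw parsed JSON response as a list,
# so the text must be extracted from the choices element manually
raw <- lms_chat_openai(
  model = "llama-3.2-1b-instruct",
  messages = messages,
  simplify = FALSE
)
raw$choices[[1]]$message$content
```

Use simplify = FALSE when you need metadata from the raw response (e.g. token usage or finish reason) in addition to the generated text.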