This Documentation is Being Deprecated
This page is being phased out as part of our documentation reorganization. An updated version with the most current information is available; if you notice any discrepancies or areas needing improvement there, please report an issue on that page.
The LLMModelConfig structure includes the completion_params and mode parameters. You can build this structure manually, or use model-selector type parameters or configuration.
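As a rough sketch of a manually built configuration: only mode and completion_params are named on this page, while the provider, model, and sampling keys below are illustrative assumptions.

```python
# Hypothetical sketch of a manually built LLM model configuration.
# Only "mode" and "completion_params" come from the text above;
# "provider", "model", and the sampling keys are illustrative assumptions.
llm_model_config = {
    "provider": "openai",        # assumed: which model provider to use
    "model": "gpt-4o-mini",      # assumed: which model to invoke
    "mode": "chat",              # invocation mode
    "completion_params": {       # parameters forwarded to the completion call
        "temperature": 0.7,
        "max_tokens": 512,
    },
}
```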
Request LLM
Entry
Endpoint
Example
If you want to request OpenAI’s gpt-4o-mini model in a Tool, refer to the following example code. The query parameter from tool_parameters is passed in the code.
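The original example code did not survive on this page; the following is a framework-free sketch of the idea. `invoke_llm` is a stand-in defined here for illustration, not a real SDK call, and the message shape is an assumption.

```python
# Framework-free sketch: a Tool forwards its "query" tool parameter to
# gpt-4o-mini. `invoke_llm` is a stand-in for the platform's LLM request
# endpoint; a real plugin would call the SDK instead.
def invoke_llm(model_config: dict, prompt_messages: list[dict]) -> str:
    # Stand-in: return a canned reply tagged with the model name.
    return f"[{model_config['model']}] ok"

def tool_invoke(tool_parameters: dict) -> str:
    model_config = {
        "provider": "openai",
        "model": "gpt-4o-mini",
        "mode": "chat",
        "completion_params": {},
    }
    # The query parameter from tool_parameters is passed into the prompt.
    prompt_messages = [{"role": "user", "content": tool_parameters["query"]}]
    return invoke_llm(model_config, prompt_messages)
```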
Best Practices
It’s not recommended to manually build LLMModelConfig. Instead, allow users to select their desired model in the UI. In this case, you can modify the tool’s parameter list by adding a model parameter according to the following configuration:
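The configuration itself was lost on this page; a parameter entry along these lines is a plausible shape, but every field name below is an assumption rather than something confirmed here.

```yaml
# Hypothetical parameter entry adding a user-selectable model;
# field names are assumed, not confirmed by this page.
- name: model
  type: model-selector
  scope: llm
  required: true
  label:
    en_US: Model
```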
With llm type parameters, the tool receives the user’s model selection directly. This allows you to modify the above example code as follows:
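The modified code was also lost; as a sketch under the same assumptions as before, the tool no longer hard-codes provider and model but forwards the configuration the user selected. `invoke_llm` remains a hypothetical stand-in.

```python
# Sketch of the modified tool: the user's model choice arrives pre-built
# in tool_parameters under the "model" parameter, so it is forwarded as-is
# instead of hard-coding provider and model. `invoke_llm` is a stand-in
# for the platform's LLM request endpoint.
def tool_invoke(tool_parameters: dict, invoke_llm) -> str:
    model_config = tool_parameters["model"]  # configuration selected in the UI
    prompt_messages = [{"role": "user", "content": tool_parameters["query"]}]
    return invoke_llm(model_config, prompt_messages)
```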
Request Summary
You can request this endpoint to summarize a text. It will use the system model in your current workspace to summarize the text.

Entry:
- text: The text to be summarized
- instruction: Additional instructions you want to add, allowing you to stylize the summary
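Using the two entry fields described above, a request body might look like the following; the exact wire format is an assumption.

```python
# Hypothetical request body for the summary endpoint, built from the two
# entry fields described above; the exact wire format is an assumption.
summary_request = {
    # the text to be summarized
    "text": "A long passage of text that should be condensed.",
    # optional instructions to stylize the summary
    "instruction": "Summarize in one bullet point.",
}
```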
Request Rerank
Entry
Endpoint
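The entry description for this endpoint did not survive on this page. As an assumption, a rerank request typically pairs a query with a list of candidate documents; the field names below are illustrative only.

```python
# Hypothetical rerank request: field names ("query", "docs") are assumptions,
# since the original entry description was lost from this page.
rerank_request = {
    "query": "best practices for plugins",
    "docs": [
        "Plugins should validate their parameters.",
        "Unrelated text about cooking.",
    ],
}
```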
Request TTS
Entry
Endpoint
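The entry description here was also lost; as an assumption, a text-to-speech request carries the text to synthesize, and the field name below is illustrative only.

```python
# Hypothetical TTS request: the original entry description was lost,
# so the "content_text" field name here is an assumption.
tts_request = {
    "content_text": "Hello, world.",  # the text to synthesize into speech
}
```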
Request Speech2Text
Entry

Request Moderation
Entry

If the result is true, it indicates that the text contains sensitive content.
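Given the description above (a true result flags sensitive content), a minimal sketch of handling the response might be:

```python
# Sketch of interpreting the moderation endpoint's boolean result:
# True means the text contains sensitive content.
def is_sensitive(moderation_result: bool) -> bool:
    return moderation_result

def guard(text: str, moderation_result: bool) -> str:
    # Replace flagged text rather than passing it through.
    if is_sensitive(moderation_result):
        return "[blocked: sensitive content]"
    return text
```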