# Documentation Map

This documentation portal helps you choose and configure the AI **model** that best suits your needs, or one of our **solutions** (ready-to-use tools for specific practical tasks), and integrate it correctly into your code.

<table data-header-hidden data-full-width="false"><thead><tr><th width="281.09991455078125" valign="top"></th><th valign="top"></th></tr></thead><tbody><tr><td valign="top"><p><strong>Start with this code block</strong><br><br><span data-gb-custom-inline data-tag="emoji" data-code="1fa81">🪁</span> Step-by-step example:</p><p><a href="setting-up">Setting Up</a><br><br><span data-gb-custom-inline data-tag="emoji" data-code="1fa81">🪁</span> Choose the SDK to use:</p><p><a href="supported-sdks">Supported SDKs</a></p></td><td valign="top"><pre class="language-python" data-overflow="wrap"><code class="lang-python">from openai import OpenAI

client = OpenAI(
    base_url="https://api.apilaplas.com/v1",
    api_key="&#x3C;YOUR_LAPLASAPI_KEY>",
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Write a one-sentence story about numbers."}],
)

print(response.choices[0].message.content)</code></pre></td></tr></tbody></table>
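If you prefer not to use the OpenAI SDK, the same call can be made over plain HTTP. This is a minimal sketch, assuming the OpenAI-compatible `POST /v1/chat/completions` endpoint implied by the `base_url` in the example above; it only builds the request, so you can inspect it before sending with a real key:

```python
import json
import urllib.request

API_KEY = "<YOUR_LAPLASAPI_KEY>"  # placeholder, as in the SDK example above

def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build (but do not send) a chat completion request."""
    payload = {
        "model": "gpt-4o",
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        "https://api.apilaplas.com/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("Write a one-sentence story about numbers.")
# with urllib.request.urlopen(req) as resp:  # uncomment once API_KEY is set
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Any HTTP client works the same way; only the `Authorization` header and JSON body matter.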

***

## Browse Models

Popular | [View all 200+ models >](https://docs.apilaplas.com/api-references/model-database)

<table data-view="cards"><thead><tr><th></th><th></th><th></th><th data-hidden data-card-target data-type="content-ref"></th></tr></thead><tbody><tr><td><a href="../api-references/text-models-llm/openai">ChatGPT</a></td><td></td><td></td><td><a href="../api-references/text-models-llm/openai">openai</a></td></tr><tr><td><a href="../api-references/text-models-llm/deepseek">DeepSeek</a></td><td></td><td></td><td><a href="../api-references/text-models-llm/deepseek">deepseek</a></td></tr><tr><td><a href="../api-references/image-models/flux">Flux</a></td><td></td><td></td><td><a href="../api-references/image-models/flux">flux</a></td></tr></tbody></table>

Select a model by its **Task**, its **Developer**, or the **Capabilities** it supports:

{% hint style="info" %}
If you've already made your choice and know the model ID, use the [Search panel](https://docs.apilaplas.com/?q=) on your right.
{% endhint %}

{% tabs %}
{% tab title="Models by TASK" %}
{% content-ref url="../api-references/text-models-llm" %}
[text-models-llm](https://docs.apilaplas.com/api-references/text-models-llm)
{% endcontent-ref %}

{% content-ref url="../api-references/image-models" %}
[image-models](https://docs.apilaplas.com/api-references/image-models)
{% endcontent-ref %}

{% content-ref url="../api-references/video-models" %}
[video-models](https://docs.apilaplas.com/api-references/video-models)
{% endcontent-ref %}

{% content-ref url="../api-references/music-models" %}
[music-models](https://docs.apilaplas.com/api-references/music-models)
{% endcontent-ref %}

{% content-ref url="../api-references/speech-models" %}
[speech-models](https://docs.apilaplas.com/api-references/speech-models)
{% endcontent-ref %}

{% content-ref url="../api-references/moderation-safety-models" %}
[moderation-safety-models](https://docs.apilaplas.com/api-references/moderation-safety-models)
{% endcontent-ref %}

{% content-ref url="../api-references/3d-generating-models" %}
[3d-generating-models](https://docs.apilaplas.com/api-references/3d-generating-models)
{% endcontent-ref %}

{% content-ref url="../api-references/vision-models" %}
[vision-models](https://docs.apilaplas.com/api-references/vision-models)
{% endcontent-ref %}

{% content-ref url="../api-references/embedding-models" %}
[embedding-models](https://docs.apilaplas.com/api-references/embedding-models)
{% endcontent-ref %}
{% endtab %}

{% tab title="Models by DEVELOPER" %}
**AI21 Labs**: [Text/Chat](https://docs.apilaplas.com/api-references/text-models-llm/ai21-labs)\
**Alibaba Cloud**: [Text/Chat](https://docs.apilaplas.com/api-references/text-models-llm/alibaba-cloud) [Video](https://github.com/Kleepers/laplas/blob/docs/docs/api-references/video-models/alibaba-cloud/README.md)\
**Anthracite**: [Text/Chat](https://docs.apilaplas.com/api-references/text-models-llm/anthracite)\
<mark style="background-color:green;">**Anthropic**</mark>: [Text/Chat](https://docs.apilaplas.com/api-references/text-models-llm/anthropic) [Embedding](https://docs.apilaplas.com/api-references/embedding-models/anthropic)\
**BAAI**: [Embedding](https://docs.apilaplas.com/api-references/embedding-models/baai)\
**Cohere**: [Text/Chat](https://docs.apilaplas.com/api-references/text-models-llm/cohere)\
<mark style="background-color:green;">**DeepSeek**</mark>: [Text/Chat](https://docs.apilaplas.com/api-references/text-models-llm/deepseek)\
**Deepgram**: [Speech-To-Text](https://docs.apilaplas.com/api-references/speech-models/speech-to-text/deepgram) [Text-to-Speech](https://docs.apilaplas.com/api-references/speech-models/text-to-speech/deepgram)\
**Flux**: [Image](https://docs.apilaplas.com/api-references/image-models/flux)\
**Google**: [Text/Chat](https://docs.apilaplas.com/api-references/text-models-llm/google) [Image](https://docs.apilaplas.com/api-references/image-models/google) [Embedding](https://docs.apilaplas.com/api-references/embedding-models/google) [Video](https://github.com/Kleepers/laplas/blob/docs/docs/api-references/video-models/google/README.md) [Vision(OCR)](https://docs.apilaplas.com/api-references/vision-models/ocr-optical-character-recognition/google)\
**Gryphe**: [Text/Chat](https://docs.apilaplas.com/api-references/text-models-llm/gryphe)\
<mark style="background-color:green;">**Kling AI**</mark>: [Video](https://docs.apilaplas.com/api-references/video-models/kling-ai)\
**Meta**: [Text/Chat](https://docs.apilaplas.com/api-references/text-models-llm/meta)\
<mark style="background-color:green;">**MiniMax**</mark>: [Text/Chat](https://docs.apilaplas.com/api-references/text-models-llm/minimax) [Video](https://docs.apilaplas.com/api-references/video-models/minimax) [Music](https://docs.apilaplas.com/api-references/music-models/minimax)\
**Mistral AI**: [Text/Chat](https://docs.apilaplas.com/api-references/text-models-llm/mistral-ai) [Vision(OCR)](https://docs.apilaplas.com/api-references/vision-models/ocr-optical-character-recognition/mistral-ai)\
**NVIDIA**: [Text/Chat](https://docs.apilaplas.com/api-references/text-models-llm/nvidia)\
**NeverSleep**: [Text/Chat](https://docs.apilaplas.com/api-references/text-models-llm/neversleep)\
**NousResearch**: [Text/Chat](https://docs.apilaplas.com/api-references/text-models-llm/nousresearch)\
<mark style="background-color:green;">**OpenAI**</mark>: [Text/Chat](https://docs.apilaplas.com/api-references/text-models-llm/openai) [Image](https://docs.apilaplas.com/api-references/image-models/openai) [Speech-To-Text](https://docs.apilaplas.com/api-references/speech-models/speech-to-text/openai) [Embedding](https://docs.apilaplas.com/api-references/embedding-models/openai)\
**RecraftAI**: [Image](https://docs.apilaplas.com/api-references/image-models/recraftai)\
**Runway**: [Video](https://docs.apilaplas.com/api-references/video-models/runway)\
<mark style="background-color:green;">**Stability AI**</mark>: [Image](https://docs.apilaplas.com/api-references/image-models/stability-ai) [Music](https://docs.apilaplas.com/api-references/music-models/stability-ai) [3D-Generation](https://docs.apilaplas.com/api-references/3d-generating-models/stability-ai)\
**Together AI**: [Embedding](https://docs.apilaplas.com/api-references/embedding-models/together-ai)\
**xAI**: [Text/Chat](https://docs.apilaplas.com/api-references/text-models-llm/xai)
{% endtab %}

{% tab title="Models by CAPABILITY" %}
{% content-ref url="../capabilities/completion-or-chat-models" %}
[completion-or-chat-models](https://docs.apilaplas.com/capabilities/completion-or-chat-models)
{% endcontent-ref %}

{% content-ref url="../capabilities/code-generation" %}
[code-generation](https://docs.apilaplas.com/capabilities/code-generation)
{% endcontent-ref %}

{% content-ref url="../capabilities/function-calling" %}
[function-calling](https://docs.apilaplas.com/capabilities/function-calling)
{% endcontent-ref %}

{% content-ref url="../capabilities/thinking-reasoning" %}
[thinking-reasoning](https://docs.apilaplas.com/capabilities/thinking-reasoning)
{% endcontent-ref %}

{% content-ref url="../capabilities/image-to-text-vision" %}
[image-to-text-vision](https://docs.apilaplas.com/capabilities/image-to-text-vision)
{% endcontent-ref %}

{% content-ref url="../capabilities/web-search" %}
[web-search](https://docs.apilaplas.com/capabilities/web-search)
{% endcontent-ref %}
{% endtab %}
{% endtabs %}
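As an illustration of one capability above, function calling is requested through the `tools` parameter of a chat completion. This is a minimal sketch of the request body, assuming the OpenAI-compatible function-calling schema; `get_weather` and its parameters are hypothetical names for the example:

```python
import json

# One tool definition in the OpenAI function-calling schema.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool name
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

# Chat completion body: the model may answer directly or emit a
# tool call for get_weather that your code then executes.
request_body = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "What's the weather in Oslo?"}],
    "tools": [weather_tool],
    "tool_choice": "auto",  # let the model decide whether to call the tool
}

print(json.dumps(request_body, indent=2))
```

See [Function Calling](https://docs.apilaplas.com/capabilities/function-calling) for the full flow, including returning tool results to the model.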

## Browse Solutions

* [AI Search Engine](https://docs.apilaplas.com/solutions/bagoodex/ai-search-engine) – use this solution when your project needs to find information on the internet and present it in a structured format.
* [OpenAI Assistants](https://docs.apilaplas.com/solutions/openai/assistants) – use this solution to build tailored AI assistants that can handle customer support, data analysis, content generation, and more.

***

## Going Deeper

<table data-header-hidden data-full-width="false"><thead><tr><th width="409.4000244140625"></th><th valign="top"></th></tr></thead><tbody><tr><td><p><strong>Use more text model capabilities in your project:</strong><br><br><span data-gb-custom-inline data-tag="emoji" data-code="1f4d6">📖</span> <a href="../capabilities/completion-or-chat-models">​Completion and Chat Completion</a></p><p><span data-gb-custom-inline data-tag="emoji" data-code="1f4d6">📖</span> <a href="../capabilities/function-calling">Function Calling</a></p><p><span data-gb-custom-inline data-tag="emoji" data-code="1f4d6">📖</span> <a href="../capabilities/image-to-text-vision">Vision in Text Models (Image-to-Text)</a></p><p><span data-gb-custom-inline data-tag="emoji" data-code="1f4d6">📖</span> <a href="../capabilities/code-generation">Code Generation</a></p><p><span data-gb-custom-inline data-tag="emoji" data-code="1f4d6">📖</span> <a href="../capabilities/thinking-reasoning">Thinking / Reasoning</a></p><p><span data-gb-custom-inline data-tag="emoji" data-code="1f4d6">📖</span> <a href="../capabilities/web-search">Web Search</a><br><br></p></td><td valign="top"><p><strong>Miscellaneous</strong>:<br><br><span data-gb-custom-inline data-tag="emoji" data-code="1f517">🔗</span> <a href="https://github.com/Kleepers/laplas/blob/docs/docs/broken-reference/README.md">Integrations</a></p><p><span data-gb-custom-inline data-tag="emoji" data-code="1f4d7">📗</span> <a href="https://github.com/Kleepers/laplas/blob/docs/docs/broken-reference/README.md">Glossary</a></p><p><span data-gb-custom-inline data-tag="emoji" data-code="26a0">⚠️</span> <a href="https://github.com/Kleepers/laplas/blob/docs/docs/broken-reference/README.md">Errors and Messages</a></p><p><span data-gb-custom-inline data-tag="emoji" data-code="2753">❓</span> <a href="https://github.com/Kleepers/laplas/blob/docs/docs/broken-reference/README.md">FAQ</a> ​</p><p><br></p></td></tr><tr><td><strong>Learn more about developer-specific 
features:</strong><br><br><span data-gb-custom-inline data-tag="emoji" data-code="1f4d6">📖</span> <a href="../capabilities/anthropic">Features of Anthropic Models</a><br></td><td valign="top"></td></tr></tbody></table>

## Have a Minute? Help Make the Docs Better!

We’re currently working on improving our documentation portal, and your feedback would be **incredibly** helpful! Take [**a quick 5-question survey**](https://tally.so/r/w4G9Er) – no personal info required.

You can also rate each individual page using the built-in form on the right side of the screen:

<figure><img src="https://907664505-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FwFXiHXlmmUm0WIL4dfrh%2Fuploads%2Fgit-blob-62017f43d426ea34ff2a6cb09df4076bd12628ee%2Frateform-5.webp?alt=media" alt=""><figcaption></figcaption></figure>


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://docs.apilaplas.com/quickstart/readme.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present in the current page, when you need clarification or additional context, or when you want to retrieve related documentation sections.
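The query above can be sketched in a few lines. This example only builds the URL (the question must be percent-encoded); fetching it requires network access and any HTTP client:

```python
from urllib.parse import urlencode

def ask_docs_url(question: str) -> str:
    """Build the ?ask= query URL for this documentation page."""
    base = "https://docs.apilaplas.com/quickstart/readme.md"
    return f"{base}?{urlencode({'ask': question})}"

url = ask_docs_url("Which models support function calling?")
# Fetch `url` with any HTTP client to receive the answer and sources.
```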
