# Mistral AI SDK

Drop-in replacement for `mistralai.Mistral` with automatic telemetry on chat completions.

```python
from weflayr.sdk.mistralai.client import Mistral
```
## Client

| Class | Replaces | Mode |
|---|---|---|
| `Mistral` | `mistralai.Mistral` | Sync + async |
### Constructor parameters

| Parameter | Env var | Description |
|---|---|---|
| `api_key` | — | Your Mistral API key |
| `intake_url` | `WEFLAYR_INTAKE_URL` | Weflayr intake base URL |
| `client_id` | `WEFLAYR_CLIENT_ID` | Your Flare client ID |
| `bearer_token` | `WEFLAYR_CLIENT_SECRET` | Your Flare client secret |
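Per the table above, the Weflayr-specific parameters can be supplied through environment variables instead of constructor arguments. A minimal sketch, using placeholder values (the URL and credential strings below are illustrative, not real endpoints):

```shell
# Placeholder values — substitute your own Weflayr credentials.
export WEFLAYR_INTAKE_URL="https://intake.example.com"
export WEFLAYR_CLIENT_ID="your-flare-client-id"
export WEFLAYR_CLIENT_SECRET="your-flare-client-secret"
```

With these set, only `api_key` needs to be passed when constructing the client.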
## Coverage

**2** covered · **9** not covered · **11** total endpoints
| Endpoint | Accessor | Status | Sync | Async | Stream | Billing metrics | Notes |
|---|---|---|---|---|---|---|---|
| `complete()` | `client.chat` | covered | ✓ | — | — | `prompt_tokens`, `completion_tokens` | Synchronous chat completion |
| `complete_async()` | `client.chat` | covered | — | ✓ | — | `prompt_tokens`, `completion_tokens` | Async chat completion via `await` |
| `stream()` | `client.chat` | not covered | — | — | ✓ | — | Streaming chat not yet instrumented |
| `stream_async()` | `client.chat` | not covered | — | ✓ | ✓ | — | Async streaming chat not yet instrumented |
| `create()` | `client.embeddings` | not covered | — | — | — | — | Not yet instrumented |
| `create_async()` | `client.embeddings` | not covered | — | — | — | — | Not yet instrumented |
| `complete()` | `client.fim` | not covered | — | — | — | — | Code completion endpoint — not yet instrumented |
| `complete_async()` | `client.fim` | not covered | — | — | — | — | Not yet instrumented |
| `complete()` | `client.agents` | not covered | — | — | — | — | Agentic completion — not yet instrumented |
| `upload` / `list` / `delete` | `client.files` | not covered | — | — | — | — | File management — not yet instrumented |
| `jobs.create()` | `client.fine_tuning` | not covered | — | — | — | — | Not yet instrumented |
## Examples

### Synchronous chat

```python
from weflayr.sdk.mistralai.client import Mistral

client = Mistral(api_key="...")

response = client.chat.complete(
    model="mistral-small-latest",
    messages=[{"role": "user", "content": "Hello!"}],
    tags={"feature": "chat-widget"},
)
print(response.choices[0].message.content)
```
### Async chat

```python
import asyncio

from weflayr.sdk.mistralai.client import Mistral

client = Mistral(api_key="...")

async def main():
    response = await client.chat.complete_async(
        model="mistral-large-latest",
        messages=[{"role": "user", "content": "What is AI?"}],
        tags={"customer_tier": "enterprise"},
    )
    print(response.choices[0].message.content)

asyncio.run(main())
```