AI-powered text generation, chat, and embeddings using Cohere's language models, available across multiple cloud platforms (AWS Bedrock, Azure, GCP, Oracle OCI).
Integrate Cohere's language models into your Python applications for chat, text generation, embeddings, and more.
This skill helps you integrate the Cohere Python SDK into your project for chat, text generation, and embeddings.
Install the Cohere SDK:
```bash
pip install cohere
```
Set your API key as an environment variable (recommended):
```bash
export CO_API_KEY=your_cohere_api_key
```
Or add to your shell profile (~/.zshrc or ~/.bashrc) for persistence.
```python
import cohere

# Reads the API key from the CO_API_KEY environment variable
co = cohere.ClientV2()

response = co.chat(
    model="command-r-plus-08-2024",
    messages=[{"role": "user", "content": "hello world!"}],
)
print(response)
```
For real-time streaming responses:
```python
import cohere

co = cohere.ClientV2()

response = co.chat_stream(
    model="command-r-plus-08-2024",
    messages=[{"role": "user", "content": "hello world!"}],
)
for event in response:
    if event.type == "content-delta":
        print(event.delta.message.content.text, end="")
```
Build conversational experiences with message history:
```python
messages = [
    {"role": "user", "content": "What is Python?"},
    {"role": "assistant", "content": "Python is a high-level programming language..."},
    {"role": "user", "content": "What are its main uses?"},
]
response = co.chat(
    model="command-r-plus-08-2024",
    messages=messages,
)
```
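To carry a conversation forward, append the assistant's reply to the history before sending the next user turn. A minimal sketch of that bookkeeping (the `append_turn` helper is illustrative, not part of the SDK; the SDK simply expects the full list of `{"role", "content"}` dicts on each call):

```python
def append_turn(messages, role, content):
    """Append one chat turn to the running message history.

    Illustrative helper, not part of the Cohere SDK: each co.chat()
    call receives the entire history as its `messages` argument.
    """
    messages.append({"role": role, "content": content})
    return messages

# Build up a multi-turn history turn by turn
history = []
append_turn(history, "user", "What is Python?")
append_turn(history, "assistant", "Python is a high-level programming language...")
append_turn(history, "user", "What are its main uses?")
# `history` is now the list you would pass as messages=history to co.chat(...)
```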
The examples above use `command-r-plus-08-2024`. Refer to [Cohere's documentation](https://docs.cohere.com/) for the full model list and capabilities.
**Option 1: Environment Variable (Recommended)**
```bash
export CO_API_KEY=your_api_key
```
**Option 2: Explicit Initialization**
```python
co = cohere.ClientV2(api_key="your_api_key")
```
The SDK supports multiple platforms. Consult the [SDK support documentation](https://docs.cohere.com/docs/cohere-works-everywhere) for platform-specific configuration (AWS Bedrock, Azure, GCP, Oracle OCI).
1. **Always use environment variables** for API keys - never hardcode them
2. **Use streaming** for interactive applications to reduce perceived latency
3. **Handle rate limits** - implement retry logic with exponential backoff
4. **Monitor token usage** - track API costs by monitoring response metadata
5. **Choose the right model** - balance capability vs. cost/latency for your use case
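The retry pattern from point 3 can be sketched as below. This is a generic pattern, not a Cohere SDK feature: the `call` argument stands in for any API request (e.g. a `lambda: co.chat(...)` closure), and the delay schedule is an assumption to tune for your rate limits.

```python
import time

def retry_with_backoff(call, max_retries=4, base_delay=1.0):
    """Retry `call()` on failure, doubling the wait between attempts.

    Generic sketch: `call` stands in for any API request, e.g.
    lambda: co.chat(model=..., messages=...). Catching bare Exception
    keeps the example SDK-version agnostic; in practice you would
    catch the SDK's rate-limit error specifically.
    """
    for attempt in range(max_retries):
        try:
            return call()
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries; surface the error to the caller
            time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
```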
```python
import cohere
# ApiError is the base error class in the current (v2-capable) SDK;
# the v4 SDK's CohereAPIError no longer exists alongside ClientV2.
from cohere.core import ApiError

co = cohere.ClientV2()

try:
    response = co.chat(
        model="command-r-plus-08-2024",
        messages=[{"role": "user", "content": "Hello"}],
    )
except ApiError as e:
    # Covers rate limits, invalid requests, auth failures, etc.
    print(f"API error: {e}")
```