Official Python library for the Anthropic API with type-safe clients, streaming support, and tool use helpers
Official Python library for the Anthropic API providing convenient access to Claude models from any Python 3.9+ application. Includes type definitions for all request params and response fields, with both synchronous and asynchronous clients powered by httpx.
Install the Anthropic Python SDK from PyPI:
```bash
pip install anthropic
```
For AWS Bedrock support:
```bash
pip install "anthropic[bedrock]"
```
For Google Vertex support:
```bash
pip install "anthropic[vertex]"
```
For improved async performance with aiohttp:
```bash
pip install "anthropic[aiohttp]"
```
Create a client and send a message with the synchronous client:
```python
import os

from anthropic import Anthropic

client = Anthropic(
    # Defaults to os.environ.get("ANTHROPIC_API_KEY"); shown explicitly here
    api_key=os.environ.get("ANTHROPIC_API_KEY"),
)

message = client.messages.create(
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": "Hello, Claude",
        }
    ],
    model="claude-sonnet-4-5-20250929",
)
print(message.content)
```
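Note that `message.content` is a list of content blocks rather than a plain string. A minimal helper to join just the text blocks might look like this (using simple stand-in objects here, since the real blocks come from the API):

```python
from dataclasses import dataclass


@dataclass
class TextBlock:
    """Stand-in for a text content block returned by the API."""
    type: str
    text: str


def extract_text(content) -> str:
    """Concatenate the text of all text blocks, skipping other block types."""
    return "".join(block.text for block in content if block.type == "text")


# Stand-in for message.content
content = [TextBlock(type="text", text="Hello"), TextBlock(type="text", text=", world")]
print(extract_text(content))  # Hello, world
```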
For asynchronous usage, import `AsyncAnthropic` and `await` each call:
```python
import asyncio
import os

from anthropic import AsyncAnthropic

client = AsyncAnthropic(
    api_key=os.environ.get("ANTHROPIC_API_KEY"),
)


async def main() -> None:
    message = await client.messages.create(
        max_tokens=1024,
        messages=[
            {
                "role": "user",
                "content": "Hello, Claude",
            }
        ],
        model="claude-sonnet-4-5-20250929",
    )
    print(message.content)


asyncio.run(main())
```
Pass `stream=True` to receive the response as a stream of server-sent events:
```python
from anthropic import Anthropic

client = Anthropic()

stream = client.messages.create(
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": "Hello, Claude",
        }
    ],
    model="claude-sonnet-4-5-20250929",
    stream=True,
)
for event in stream:
    print(event.type)
```
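The raw stream yields typed events such as `message_start`, `content_block_delta`, and `message_stop`. A sketch of dispatching on `event.type` to accumulate the streamed text (with stand-in events mimicking the real stream's shape; the real events carry typed delta objects):

```python
from types import SimpleNamespace


def accumulate_text(events) -> str:
    """Collect text from content_block_delta events, ignoring other event types."""
    chunks = []
    for event in events:
        if event.type == "content_block_delta" and event.delta.type == "text_delta":
            chunks.append(event.delta.text)
    return "".join(chunks)


# Stand-in event sequence shaped like the Messages streaming API
events = [
    SimpleNamespace(type="message_start"),
    SimpleNamespace(type="content_block_delta",
                    delta=SimpleNamespace(type="text_delta", text="Hel")),
    SimpleNamespace(type="content_block_delta",
                    delta=SimpleNamespace(type="text_delta", text="lo")),
    SimpleNamespace(type="message_stop"),
]
print(accumulate_text(events))  # Hello
```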
The SDK also provides a higher-level streaming helper that accumulates the final message for you:
```python
import asyncio

from anthropic import AsyncAnthropic

client = AsyncAnthropic()


async def main() -> None:
    async with client.messages.stream(
        max_tokens=1024,
        messages=[
            {
                "role": "user",
                "content": "Say hello there!",
            }
        ],
        model="claude-sonnet-4-5-20250929",
    ) as stream:
        async for text in stream.text_stream:
            print(text, end="", flush=True)
        print()

    message = await stream.get_final_message()
    print(message.to_json())


asyncio.run(main())
```
Define and run tools as pure Python functions:
```python
import json

from anthropic import Anthropic, beta_tool

client = Anthropic()


@beta_tool
def get_weather(location: str) -> str:
    """Lookup the weather for a given city

    Args:
        location: The city and state, e.g. San Francisco, CA

    Returns:
        A JSON string containing the location, temperature, and weather condition.
    """
    return json.dumps({
        "location": location,
        "temperature": "68°F",
        "condition": "Sunny",
    })


runner = client.beta.messages.tool_runner(
    max_tokens=1024,
    model="claude-sonnet-4-5-20250929",
    tools=[get_weather],
    messages=[
        {"role": "user", "content": "What is the weather in SF?"},
    ],
)
for message in runner:
    print(message)
```
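Under the hood, `@beta_tool` derives a JSON schema for the tool from the function's signature and docstring. A rough, stdlib-only sketch of that idea (not the SDK's actual implementation):

```python
import inspect

# Map a few Python annotations to JSON schema type names
TYPE_MAP = {str: "string", int: "integer", float: "number", bool: "boolean"}


def tool_schema(fn) -> dict:
    """Build a minimal JSON-schema-style tool description from a function signature."""
    sig = inspect.signature(fn)
    properties = {
        name: {"type": TYPE_MAP.get(param.annotation, "string")}
        for name, param in sig.parameters.items()
    }
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "input_schema": {
            "type": "object",
            "properties": properties,
            "required": list(properties),
        },
    }


def get_weather(location: str) -> str:
    """Lookup the weather for a given city"""
    ...


schema = tool_schema(get_weather)
print(schema["input_schema"]["properties"])  # {'location': {'type': 'string'}}
```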
Count tokens before creating a message:
```python
count = client.messages.count_tokens(
    model="claude-sonnet-4-5-20250929",
    messages=[
        {"role": "user", "content": "Hello, world"}
    ],
)
print(count.input_tokens)  # 10
```
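A token count can feed a rough pre-flight cost estimate. The per-million-token prices below are placeholders for illustration, not real pricing:

```python
def estimate_cost(input_tokens: int, expected_output_tokens: int,
                  input_price_per_mtok: float, output_price_per_mtok: float) -> float:
    """Rough cost estimate in dollars; prices are given per million tokens."""
    return (input_tokens * input_price_per_mtok
            + expected_output_tokens * output_price_per_mtok) / 1_000_000


# Placeholder prices -- check the current pricing page for real numbers
cost = estimate_cost(10, 1024, input_price_per_mtok=3.0, output_price_per_mtok=15.0)
print(f"${cost:.4f}")
```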
Check usage after message creation:
```python
message = client.messages.create(...)
print(message.usage)
```
Process multiple requests in a single batch (shown here with the async client):
```python
import asyncio

from anthropic import AsyncAnthropic

client = AsyncAnthropic()


async def main() -> None:
    batch = await client.messages.batches.create(
        requests=[
            {
                "custom_id": "my-first-request",
                "params": {
                    "model": "claude-sonnet-4-5-20250929",
                    "max_tokens": 1024,
                    "messages": [{"role": "user", "content": "Hello, world"}],
                },
            },
            {
                "custom_id": "my-second-request",
                "params": {
                    "model": "claude-sonnet-4-5-20250929",
                    "max_tokens": 1024,
                    "messages": [{"role": "user", "content": "Hi again, friend"}],
                },
            },
        ]
    )

    # Results can be streamed once the batch has finished processing
    result_stream = await client.messages.batches.results(batch.id)
    async for entry in result_stream:
        if entry.result.type == "succeeded":
            print(entry.result.message.content)


asyncio.run(main())
```
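When submitting many prompts, the `requests` list can be built programmatically. Each `custom_id` must be unique within the batch so results, which may arrive in any order, can be matched back to their inputs. A sketch with hypothetical prompts:

```python
def build_batch_requests(prompts, model="claude-sonnet-4-5-20250929", max_tokens=1024):
    """Turn a list of prompt strings into batch request dicts with unique custom_ids."""
    return [
        {
            "custom_id": f"request-{i}",
            "params": {
                "model": model,
                "max_tokens": max_tokens,
                "messages": [{"role": "user", "content": prompt}],
            },
        }
        for i, prompt in enumerate(prompts)
    ]


requests = build_batch_requests(["Hello, world", "Hi again, friend"])
print([r["custom_id"] for r in requests])  # ['request-0', 'request-1']
```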
To use Claude through AWS Bedrock, create an `AnthropicBedrock` client (requires the `bedrock` extra):
```python
from anthropic import AnthropicBedrock

client = AnthropicBedrock(
    aws_region='us-east-1',
    aws_access_key='...',
    aws_secret_key='...',
)

message = client.messages.create(
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": "Hello!",
        }
    ],
    model="anthropic.claude-sonnet-4-5-20250929-v1:0",
)
print(message)
```
To use Claude through Google Vertex AI, create an `AnthropicVertex` client (requires the `vertex` extra):
```python
from anthropic import AnthropicVertex

client = AnthropicVertex()

message = client.messages.create(
    model="claude-sonnet-4@20250514",
    max_tokens=100,
    messages=[
        {
            "role": "user",
            "content": "Hello!",
        }
    ],
)
print(message)
```
1. Store your API key in environment variables using `ANTHROPIC_API_KEY`
2. Use `python-dotenv` to manage environment variables in development
3. Use async clients for improved concurrency in high-throughput applications
4. Consider `aiohttp` backend for async clients when handling many concurrent requests
5. Use streaming helpers for better user experience with long responses
6. Leverage type hints and TypedDicts for safer request construction
7. Use token counting to estimate costs before making requests
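Point 2 above recommends `python-dotenv`; as a stdlib-only illustration of what it does, a minimal `.env` parser might look like the sketch below (in practice, use the real library and a key named `ANTHROPIC_API_KEY`):

```python
import os


def load_env_text(text: str) -> None:
    """Parse KEY=VALUE lines into os.environ (a tiny stand-in for load_dotenv)."""
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        # Don't overwrite variables already set in the real environment
        os.environ.setdefault(key.strip(), value.strip().strip('"'))


# Hypothetical key name used to avoid clobbering a real ANTHROPIC_API_KEY
load_env_text('MY_APP_KEY="example-value"\n# comment line\n')
print(os.environ["MY_APP_KEY"])  # example-value
```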