Why it matters
- 30M+ monthly PyPI downloads — the most widely used LLM API client library by a large margin.
- Official and maintained by OpenAI — first-party support, same-day updates for new model releases and API features.
- Full API coverage: chat, images, audio, embeddings, fine-tuning, Assistants, Batch API in one consistent library.
- Typed responses (Pydantic models) catch type errors at development time, not runtime — better developer experience than raw HTTP clients.
Key capabilities
- Chat completions: `client.chat.completions.create()` for GPT-4o, GPT-4o-mini, GPT-4-Turbo, and o1 models.
- Streaming: Token-by-token streaming with `stream=True`; iterate over chunks.
- Tool calling: Define function schemas; the model calls functions; handle results in the conversation.
- Structured output: `response_format={"type": "json_schema", ...}` for guaranteed valid JSON.
- Image generation: DALL-E 3 and DALL-E 2 via `client.images.generate()`.
- Speech-to-text: Whisper transcription via `client.audio.transcriptions.create()`.
- Embeddings: text-embedding-3-small/large via `client.embeddings.create()`.
- Async support: `AsyncOpenAI` client for async/await usage.
- Batch API: Submit large batches of requests for a 50% cost reduction.
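The chat-completions and streaming entries above can be sketched roughly as follows. The model name and prompt are illustrative, `build_messages` is a hypothetical helper (not part of the SDK), and the network calls only run when `OPENAI_API_KEY` is set:

```python
import os

try:
    from openai import OpenAI  # pip install openai (1.x)
except ImportError:
    OpenAI = None  # SDK not installed; the sketch below is still readable

def build_messages(prompt):
    """Hypothetical helper: wrap a prompt as a single-turn message list."""
    return [{"role": "user", "content": prompt}]

if OpenAI is not None and os.environ.get("OPENAI_API_KEY"):
    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # One-shot completion: the response is a typed Pydantic model.
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=build_messages("Say hello in five words."),
    )
    print(resp.choices[0].message.content)

    # Streaming: iterate over chunks as tokens arrive.
    stream = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=build_messages("Say hello in five words."),
        stream=True,
    )
    for chunk in stream:
        if chunk.choices and chunk.choices[0].delta.content:
            print(chunk.choices[0].delta.content, end="", flush=True)
```

The streaming loop checks `delta.content` because some chunks (e.g. the final one) carry no text.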
Technical notes
- Install: `pip install openai`
- Version: 1.x+ (breaking change from pre-1.0)
- License: MIT (open source)
- GitHub: github.com/openai/openai-python
- Python: 3.7.1+
- API key: Set via `OPENAI_API_KEY` env var or `OpenAI(api_key=...)`
- Downloads: 30M+ monthly PyPI downloads
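The two key-configuration options and the async client noted above can be sketched as follows; `resolve_api_key` is a hypothetical helper for illustration, and the request only fires when a key is actually present in the environment:

```python
import asyncio
import os

try:
    from openai import AsyncOpenAI  # pip install openai (1.x)
except ImportError:
    AsyncOpenAI = None  # SDK not installed; sketch only

def resolve_api_key(explicit=None):
    """Hypothetical helper: prefer an explicit key, else OPENAI_API_KEY."""
    return explicit or os.environ.get("OPENAI_API_KEY")

async def main():
    # Passing api_key explicitly is equivalent to relying on the env var,
    # which the client reads by default.
    client = AsyncOpenAI(api_key=resolve_api_key())
    resp = await client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "ping"}],
    )
    print(resp.choices[0].message.content)

if AsyncOpenAI is not None and resolve_api_key():
    asyncio.run(main())
```

`AsyncOpenAI` mirrors the synchronous client's surface, so the same `.chat.completions.create(...)` call works with `await`.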
Ideal for
- Python developers building any application that uses OpenAI's models — the correct starting point for GPT-4o integration.
- Teams building chatbots, RAG systems, agents, or content generation pipelines on OpenAI's API.
- Applications requiring structured output, tool calling, or multi-turn conversations via the Assistants API.
Not ideal for
- Non-Python languages — use the `openai` npm package for JavaScript/TypeScript.
- Teams evaluating non-OpenAI models — use Anthropic's Python SDK for Claude, or frameworks like LangChain for multi-provider support.
- Applications where OpenAI's pricing or data handling doesn't meet requirements — consider Anthropic, Groq, or local models.
See also
- Anthropic Python SDK — Claude API client; similar interface, different models.
- Vercel AI SDK — JavaScript/TypeScript multi-provider SDK for web applications.
- LangChain — Orchestration framework that uses the OpenAI SDK under the hood; adds chain/agent abstractions.