llm_chatgpt 0.1.6
llm_chatgpt: ^0.1.6
OpenAI/ChatGPT backend implementation for LLM interactions. Provides streaming chat, embeddings, and tool calling via the OpenAI API.
# Changelog
All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.
## [Unreleased]
## 0.1.6 - 2026-02-10
### Fixed
- Ensured conversion from ChatGPT `tool_calls` to `LLMToolCall` always yields non-null, non-empty `id` values, synthesizing IDs when the OpenAI response omits them (see the sketch below)
- Improved interoperability with `llm_core`'s strict `toolCallId` validation for tool messages in tool-calling workflows
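To show why non-empty IDs matter downstream, here is a minimal sketch of echoing a tool call's `id` back in a tool-result message. The simplified stand-in for `LLMToolCall` and the `toolResultMessage` helper are illustrative assumptions; the real types live in `llm_core` and may differ.

```dart
// Sketch only: the fields on this stand-in LLMToolCall and the tool-result
// message shape are assumptions for illustration, not the verified API.
import 'dart:convert';

class LLMToolCall {
  LLMToolCall({required this.id, required this.name, required this.arguments});

  final String id;        // non-null and non-empty as of 0.1.6
  final String name;      // tool/function name requested by the model
  final String arguments; // raw JSON arguments string from the model
}

/// Builds a tool-result message that echoes the tool call's id, which is
/// what a strict toolCallId validation expects to find.
Map<String, Object?> toolResultMessage(LLMToolCall call, Object? result) {
  assert(call.id.isNotEmpty); // holds even when OpenAI omitted the id
  return {
    'role': 'tool',
    'toolCallId': call.id,
    'content': jsonEncode(result),
  };
}

void main() {
  final call = LLMToolCall(id: 'synth_0', name: 'get_time', arguments: '{}');
  print(toolResultMessage(call, {'time': '12:00'}));
}
```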
## 0.1.5 - 2026-01-26
### Added
- Builder pattern for `ChatGPTChatRepository` via `ChatGPTChatRepositoryBuilder` for complex configurations (see the sketch after this list)
- Support for `StreamChatOptions` in the `streamChat()` method
- Support for a `chatResponse()` method for non-streaming, complete responses
- Support for `RetryConfig` and `TimeoutConfig` for advanced request configuration
- Input validation for model names and messages
- Improved stream parsing with `GptStreamConverter`
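A minimal sketch of how these pieces could fit together. The builder's setter names (`apiKey`, `model`, `retryConfig`, `timeoutConfig`), the fields on `RetryConfig` and `TimeoutConfig`, and the `chatResponse()` signature are assumptions inferred from the names listed above, not the package's verified API.

```dart
// Sketch only: method names on the builder and the chatResponse() signature
// are assumptions inferred from this changelog, not a verified API.
import 'package:llm_chatgpt/llm_chatgpt.dart';

Future<void> main() async {
  final repo = ChatGPTChatRepositoryBuilder()
      .apiKey('sk-...')                                          // assumed setter
      .model('gpt-4o-mini')                                      // assumed setter
      .retryConfig(RetryConfig(maxAttempts: 3))                  // assumed fields
      .timeoutConfig(TimeoutConfig(requestTimeout: Duration(seconds: 30)))
      .build();

  // chatResponse(): non-streaming, returns the complete reply at once.
  final reply = await repo.chatResponse(
    messages: [
      {'role': 'user', 'content': 'Summarize Keep a Changelog in one line.'},
    ],
  );
  print(reply);
}
```

The usual motivation for such a builder is to keep per-client overrides like retries and timeouts out of the repository's constructor, which matches the "complex configurations" wording above.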
### Changed
- `streamChat()` now accepts an optional `StreamChatOptions` parameter (sketch below)
- Improved error handling and retry logic
- Enhanced documentation
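To show what the optional parameter means for call sites, here is a sketch assuming `options` is a named parameter and that the message list is passed positionally; the changelog confirms only that `StreamChatOptions` is now accepted optionally, so existing calls without it keep working.

```dart
// Sketch only: the message type and the `options:` parameter name are
// assumptions; only the optionality itself is stated in this changelog.
import 'package:llm_chatgpt/llm_chatgpt.dart';

Future<void> demo(ChatGPTChatRepository repo, List<Object?> messages) async {
  // Existing style: no options argument at all.
  await for (final chunk in repo.streamChat(messages)) {
    print(chunk);
  }

  // New style: pass StreamChatOptions for finer control of the stream.
  await for (final chunk
      in repo.streamChat(messages, options: StreamChatOptions())) {
    print(chunk);
  }
}
```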