mentortools/libs/: openai-connector-1.2.0+2253692934 metadata and description
OpenAI Connector server
| Metadata | Value |
|---|---|
| author | OpenAPI Generator Community |
| author_email | team@openapitools.org |
| classifiers | |
| description_content_type | text/markdown |
| keywords | OpenAPI,OpenAPI-Generator,OpenAI Connector server |
| license | Unlicense |
| project_urls | |
| requires_dist | |
| requires_python | >=3.8,<4.0 |
| File | Tox results | History |
|---|---|---|
| openai_connector-1.2.0+2253692934-py3-none-any.whl | | |
| openai_connector-1.2.0+2253692934.tar.gz | | |
openai-connector
No description provided (generated by OpenAPI Generator, https://github.com/openapitools/openapi-generator)
This Python package is automatically generated by the OpenAPI Generator project:
- API version: 0.1.0
- Package version: 1.2.0+2253692934
- Generator version: 7.18.0
- Build package: org.openapitools.codegen.languages.PythonPydanticV1ClientCodegen
Requirements
Python 3.8+ (the package metadata declares requires_python >=3.8,<4.0)
Installation & Usage
pip install
If the Python package is hosted on a repository, you can install it directly with:

```shell
pip install git+https://github.com/GIT_USER_ID/GIT_REPO_ID.git
```

(you may need to run `pip` with root permission: `sudo pip install git+https://github.com/GIT_USER_ID/GIT_REPO_ID.git`)

Then import the package:

```python
import openai_connector
```
Setuptools
Install via Setuptools.

```shell
python setup.py install --user
```

(or `sudo python setup.py install` to install the package for all users)

Then import the package:

```python
import openai_connector
```
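As a quick sanity check after either install method, you can print the client version. This is a minimal sketch; it assumes the generated package exposes a `__version__` attribute in its `__init__.py`, as OpenAPI Generator Python clients typically do.

```python
# Sanity check: __version__ is assumed to be set by the generated __init__.py
# to the package version shown in the metadata above.
import openai_connector

print(openai_connector.__version__)  # expected: "1.2.0+2253692934"
```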
Tests
Execute pytest to run the tests.
Getting Started
Please follow the installation procedure and then run the following:
```python
import asyncio
import openai_connector
from openai_connector.rest import ApiException
from pprint import pprint

# Defining the host is optional and defaults to http://localhost
# See configuration.py for a list of all supported configuration parameters.
configuration = openai_connector.Configuration(
    host="http://localhost"
)


async def main():
    # Enter a context with an instance of the API client
    async with openai_connector.ApiClient(configuration) as api_client:
        # Create an instance of the API class
        api_instance = openai_connector.OpenaiApi(api_client)
        prompt_bundle_for_generate_image = openai_connector.PromptBundleForGenerateImage()  # PromptBundleForGenerateImage |

        try:
            # Generate image
            api_response = await api_instance.generate_image_v1_openai_image_generate_post(prompt_bundle_for_generate_image)
            print("The response of OpenaiApi->generate_image_v1_openai_image_generate_post:\n")
            pprint(api_response)
        except ApiException as e:
            print("Exception when calling OpenaiApi->generate_image_v1_openai_image_generate_post: %s\n" % e)


asyncio.run(main())
```
Documentation for API Endpoints
All URIs are relative to http://localhost
| Class | Method | HTTP request | Description |
|---|---|---|---|
| OpenaiApi | generate_image_v1_openai_image_generate_post | POST /v1/openai/image/generate | Generate image |
| OpenaiApi | generate_response_v1_openai_response_generate_post | POST /v1/openai/response/generate | Generate response |
| OpenaiApi | stream_response_v1_openai_response_stream_post | POST /v1/openai/response/stream | Stream response |
| OpenaiConnectorApi | generate_image_v1_openai_image_generate_post | POST /v1/openai/image/generate | Generate image |
| OpenaiConnectorApi | generate_response_v1_openai_response_generate_post | POST /v1/openai/response/generate | Generate response |
| OpenaiConnectorApi | stream_response_v1_openai_response_stream_post | POST /v1/openai/response/stream | Stream response |
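The text-generation endpoint follows the same calling pattern as the image example above. The sketch below is a minimal illustration, not part of the generated README: `PromptBundle` is constructed empty here because its required fields are defined by the server's OpenAPI schema and are not documented on this page, so fill them in for your deployment.

```python
import asyncio
import openai_connector
from openai_connector.rest import ApiException
from pprint import pprint

configuration = openai_connector.Configuration(host="http://localhost")


async def generate_text():
    async with openai_connector.ApiClient(configuration) as api_client:
        api_instance = openai_connector.OpenaiApi(api_client)
        # Populate the fields required by your deployment's PromptBundle schema.
        prompt_bundle = openai_connector.PromptBundle()

        try:
            # Generate response (POST /v1/openai/response/generate)
            api_response = await api_instance.generate_response_v1_openai_response_generate_post(prompt_bundle)
            pprint(api_response)
        except ApiException as e:
            print("Exception when calling generate_response_v1_openai_response_generate_post: %s\n" % e)


asyncio.run(generate_text())
```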
Documentation For Models
- ChatHistoryContent
- ChatHistoryItem
- ChatRole
- FilePayload
- HTTPValidationError
- ImageResponse
- ImageSettings
- InputTokensDetails
- OutputTokensDetails
- PromptBundle
- PromptBundleForGenerateImage
- ResponseChunk
- ResponseChunkType
- ResponseSchemaPayload
- ResponseUsage
- TextResponse
- Tool
- ToolCall
- ToolCallResult
- ToolParameter
- Usage
- UsageDetails
- UsageInputTokensDetails
- ValidationError
- ValidationErrorLocInner
- WrappedResponseImageResponse
- WrappedResponseTextResponse
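Since the client is built with PythonPydanticV1ClientCodegen, the models listed above should be pydantic v1 `BaseModel` subclasses exported from the package root. Under that assumption, a minimal sketch for discovering which fields a request model expects before constructing it:

```python
# Assumes the generated models are pydantic v1 BaseModels
# (as implied by the PythonPydanticV1ClientCodegen build package).
import openai_connector

# List the fields of the image-generation request model.
for name, field in openai_connector.PromptBundleForGenerateImage.__fields__.items():
    print(name, field.outer_type_, "required" if field.required else "optional")
```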
Documentation For Authorization
Endpoints do not require authorization.