DreamGen API
We offer an API for our Opus V1+ models. The API is available on some of our premium subscription plans. Keep in mind that the DreamGen API is strictly for personal use, and we actively prohibit any form of sharing.
You can interact with the API through HTTP requests from any language. Some APIs are also compatible with OpenAI's client libraries.
Authentication
The DreamGen API uses API keys for authentication. You can create and remove API keys on your account page.
To authenticate API requests, you submit your API key through the HTTP Authorization header:
Authorization: Bearer YOUR_DREAMGEN_API_KEY
Keep your API keys secret. Do not share them with other people and do not expose them publicly.
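For example, here is a minimal sketch of an authenticated request using Python and the requests library. The environment variable name is just a convention, and the endpoint used is the OpenAI-compatible List Models endpoint documented below.

```python
# Minimal sketch of an authenticated request with the requests library.
# Assumes your key is stored in the DREAMGEN_API_KEY environment variable;
# the endpoint is the OpenAI-compatible List Models endpoint documented below.
import os
import requests

api_key = os.environ["DREAMGEN_API_KEY"]

response = requests.get(
    "https://dreamgen.com/api/openai/v1/models/list",
    headers={"Authorization": f"Bearer {api_key}"},
)
response.raise_for_status()
print(response.json())
```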
Chat API
This API is currently undocumented.
Text API
POST https://dreamgen.com/api/v1/model/completion
Request
Response
The response is streamed in JSON-lines format, each line containing a JSON object of the type CompletionResponse as defined below.
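As an illustration, the sketch below calls the endpoint and parses the streamed JSON lines. The request body fields used here are illustrative assumptions rather than the documented Request schema.

```python
# Hedged sketch: call the native Text API and read the streamed JSON-lines
# response. The request body fields ("prompt", "max_tokens") are illustrative
# assumptions -- refer to the Request schema in this section for the real fields.
import json
import os
import requests

api_key = os.environ["DREAMGEN_API_KEY"]

with requests.post(
    "https://dreamgen.com/api/v1/model/completion",
    headers={"Authorization": f"Bearer {api_key}"},
    json={"prompt": "Once upon a time", "max_tokens": 128},  # assumed fields
    stream=True,
) as response:
    response.raise_for_status()
    # Each non-empty line is one CompletionResponse JSON object.
    for line in response.iter_lines(decode_unicode=True):
        if line:
            print(json.loads(line))
```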
OpenAI-compatible APIs
We provide several OpenAI-compatible APIs, namely:
- Chat API:
POST https://dreamgen.com/api/openai/v1/chat/completions
- Text API:
POST https://dreamgen.com/api/openai/v1/completions
- Models API:
GET https://dreamgen.com/api/openai/v1/models
OpenAI Chat Completion
NOTE: This is a stateless API, message contents are never logged or stored.
POST https://dreamgen.com/api/openai/v1/chat/completions
This is a version of the native Chat API that is compatible with OpenAI's streaming chat completion specification.
See this Python Colab for an example of how to use it with the official OpenAI Python client library.
Model specification
Many OpenAI clients only support the system, assistant and user roles. In order to support the Opus text role that is necessary for story-writing and role-playing, we encode extra information in the model request field.
Possible model values are:
"opus-v1-{size}/{MODEL_SPEC}"
Where {size} is either sm or lg, and {MODEL_SPEC} is either text, assistant or a JSON object with the following schema:
The assistant and user specifications determine how the OpenAI assistant and user roles are converted to Opus roles.
You can also use the following shorthands:
- opus-v1-{size}/text: stands for {"assistant": {"role": "text", "open": true}}
- opus-v1-{size}/assistant: stands for {"assistant": {"role": "assistant", "open": false}}
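For example, the JSON spec can be embedded directly after the slash. Based on the shorthands above, this sketch builds a model value equivalent to opus-v1-lg/text:

```python
# Sketch: embed the JSON MODEL_SPEC directly in the model field. Per the
# shorthands above, this particular spec is equivalent to "opus-v1-lg/text".
import json

size = "lg"
spec = {"assistant": {"role": "text", "open": True}}
model = f"opus-v1-{size}/{json.dumps(spec)}"
# -> 'opus-v1-lg/{"assistant": {"role": "text", "open": true}}'
```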
Additional request parameters:
These are additional parameters supported by DreamGen's OpenAI API:
- min_p
- top_k
- repetition_penalty
When using OpenAI's Python SDK, you can pass these using the extra_body param.
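For example, here is a hedged sketch of a streaming chat completion with the official OpenAI Python client. The base_url is derived from the endpoint above, while the model value, messages and sampling values are illustrative assumptions:

```python
# Hedged sketch of a streaming chat completion with the official OpenAI
# Python SDK. The base_url is derived from the endpoint above; the model
# value, messages, and sampling values are illustrative assumptions.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://dreamgen.com/api/openai/v1",
    api_key=os.environ["DREAMGEN_API_KEY"],
)

stream = client.chat.completions.create(
    model="opus-v1-sm/assistant",
    messages=[
        {"role": "system", "content": "You are a creative writing assistant."},
        {"role": "user", "content": "Write the opening of a short story."},
    ],
    stream=True,
    extra_body={
        "min_p": 0.05,
        "top_k": 50,
        "repetition_penalty": 1.1,
    },
)

for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
```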
Unsupported (ignored) request parameters:
- logit_bias
- logprobs
- top_logprobs
- n
- response_format
- seed
- stream
- tools
- tool_choice
- user
- function_call
- functions
OpenAI Text Completion
NOTE: This is a stateless API, prompt contents are never logged or stored.
POST https://dreamgen.com/api/openai/v1/completions
This is a version of the native Text API that is compatible with OpenAI's streaming text completion specification.
See this Python Colab for an example of how to use it with the official OpenAI Python client library.
The text prompt should follow the ChatML+Text prompt format as described in the Opus V1 model guide.
The model can be either:
- opus-v1-{size}
- opus-v1-{size}/text -- the model is only allowed to use the text role
- opus-v1-{size}/assistant -- the model is only allowed to use the assistant role
Where {size} is either sm, lg or xl.
Additional request parameters:
These are additional parameters supported by DreamGen's OpenAI API:
- min_p
- top_k
- repetition_penalty
When using OpenAI's Python SDK, you can pass these using the extra_body param.
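For example, a hedged sketch of a streaming text completion with the official OpenAI Python client. The base_url is derived from the endpoint above, and the prompt and sampling values are illustrative placeholders; the prompt must follow the ChatML+Text format, which is not shown here:

```python
# Hedged sketch of a streaming text completion with the official OpenAI
# Python SDK. The base_url is derived from the endpoint above; the prompt
# and sampling values are illustrative assumptions. The prompt must follow
# the ChatML+Text format from the Opus V1 model guide (not shown here).
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://dreamgen.com/api/openai/v1",
    api_key=os.environ["DREAMGEN_API_KEY"],
)

prompt = "..."  # replace with a ChatML+Text formatted prompt

stream = client.completions.create(
    model="opus-v1-sm/text",
    prompt=prompt,
    stream=True,
    extra_body={
        "min_p": 0.05,
        "top_k": 50,
        "repetition_penalty": 1.1,
    },
)

for chunk in stream:
    if chunk.choices and chunk.choices[0].text:
        print(chunk.choices[0].text, end="")
```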
Unsupported (ignored) request parameters:
- best_of
- echo
- logit_bias
- logprobs
- n
- seed
- stream
- suffix
- user
OpenAI List Models
GET https://dreamgen.com/api/openai/v1/models/list
This API is compatible with OpenAI's List Models specification.
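A hedged sketch using the official OpenAI Python client follows; note that the SDK issues its request to {base_url}/models rather than the /models/list path shown above, so this assumes the standard OpenAI path is also served:

```python
# Hedged sketch: list available models via the OpenAI Python SDK. Note that
# the SDK requests {base_url}/models; the URL above uses /models/list, so
# this assumes the standard OpenAI path is also served.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://dreamgen.com/api/openai/v1",
    api_key=os.environ["DREAMGEN_API_KEY"],
)

for model in client.models.list():
    print(model.id)
```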