SynapseML 1.0.4

Synapse.ML.Services.Openai.OpenAIChatCompletion Class Reference

OpenAIChatCompletion implements OpenAIChatCompletion.


Public Member Functions

| Return type | Member | Description |
| --- | --- | --- |
|  | OpenAIChatCompletion () | Creates a OpenAIChatCompletion without any parameters. |
|  | OpenAIChatCompletion (string uid) | Creates a OpenAIChatCompletion with a UID that is used to give the OpenAIChatCompletion a unique ID. |
| OpenAIChatCompletion | SetAADToken (string value) | Sets value for AADToken |
| OpenAIChatCompletion | SetAADTokenCol (string value) | Sets value for AADToken column |
| OpenAIChatCompletion | SetCustomAuthHeader (string value) | Sets value for CustomAuthHeader |
| OpenAIChatCompletion | SetCustomAuthHeaderCol (string value) | Sets value for CustomAuthHeader column |
| OpenAIChatCompletion | SetApiVersion (string value) | Sets value for apiVersion |
| OpenAIChatCompletion | SetApiVersionCol (string value) | Sets value for apiVersion column |
| OpenAIChatCompletion | SetBestOf (int value) | Sets value for bestOf |
| OpenAIChatCompletion | SetBestOfCol (string value) | Sets value for bestOf column |
| OpenAIChatCompletion | SetCacheLevel (int value) | Sets value for cacheLevel |
| OpenAIChatCompletion | SetCacheLevelCol (string value) | Sets value for cacheLevel column |
| OpenAIChatCompletion | SetConcurrency (int value) | Sets value for concurrency |
| OpenAIChatCompletion | SetConcurrentTimeout (double value) | Sets value for concurrentTimeout |
| OpenAIChatCompletion | SetDeploymentName (string value) | Sets value for deploymentName |
| OpenAIChatCompletion | SetDeploymentNameCol (string value) | Sets value for deploymentName column |
| OpenAIChatCompletion | SetEcho (bool value) | Sets value for echo |
| OpenAIChatCompletion | SetEchoCol (string value) | Sets value for echo column |
| OpenAIChatCompletion | SetErrorCol (string value) | Sets value for errorCol |
| OpenAIChatCompletion | SetFrequencyPenalty (double value) | Sets value for frequencyPenalty |
| OpenAIChatCompletion | SetFrequencyPenaltyCol (string value) | Sets value for frequencyPenalty column |
| OpenAIChatCompletion | SetHandler (object value) | Sets value for handler |
| OpenAIChatCompletion | SetLogProbs (int value) | Sets value for logProbs |
| OpenAIChatCompletion | SetLogProbsCol (string value) | Sets value for logProbs column |
| OpenAIChatCompletion | SetMaxTokens (int value) | Sets value for maxTokens |
| OpenAIChatCompletion | SetMaxTokensCol (string value) | Sets value for maxTokens column |
| OpenAIChatCompletion | SetMessagesCol (string value) | Sets value for messagesCol |
| OpenAIChatCompletion | SetN (int value) | Sets value for n |
| OpenAIChatCompletion | SetNCol (string value) | Sets value for n column |
| OpenAIChatCompletion | SetOutputCol (string value) | Sets value for outputCol |
| OpenAIChatCompletion | SetPresencePenalty (double value) | Sets value for presencePenalty |
| OpenAIChatCompletion | SetPresencePenaltyCol (string value) | Sets value for presencePenalty column |
| OpenAIChatCompletion | SetStop (string value) | Sets value for stop |
| OpenAIChatCompletion | SetStopCol (string value) | Sets value for stop column |
| OpenAIChatCompletion | SetSubscriptionKey (string value) | Sets value for subscriptionKey |
| OpenAIChatCompletion | SetSubscriptionKeyCol (string value) | Sets value for subscriptionKey column |
| OpenAIChatCompletion | SetTemperature (double value) | Sets value for temperature |
| OpenAIChatCompletion | SetTemperatureCol (string value) | Sets value for temperature column |
| OpenAIChatCompletion | SetTimeout (double value) | Sets value for timeout |
| OpenAIChatCompletion | SetTopP (double value) | Sets value for topP |
| OpenAIChatCompletion | SetTopPCol (string value) | Sets value for topP column |
| OpenAIChatCompletion | SetUrl (string value) | Sets value for url |
| OpenAIChatCompletion | SetUser (string value) | Sets value for user |
| OpenAIChatCompletion | SetUserCol (string value) | Sets value for user column |
| string | GetAADToken () | Gets AADToken value |
| string | GetCustomAuthHeader () | Gets CustomAuthHeader value |
| string | GetApiVersion () | Gets apiVersion value |
| int | GetBestOf () | Gets bestOf value |
| int | GetCacheLevel () | Gets cacheLevel value |
| int | GetConcurrency () | Gets concurrency value |
| double | GetConcurrentTimeout () | Gets concurrentTimeout value |
| string | GetDeploymentName () | Gets deploymentName value |
| bool | GetEcho () | Gets echo value |
| string | GetErrorCol () | Gets errorCol value |
| double | GetFrequencyPenalty () | Gets frequencyPenalty value |
| object | GetHandler () | Gets handler value |
| int | GetLogProbs () | Gets logProbs value |
| int | GetMaxTokens () | Gets maxTokens value |
| string | GetMessagesCol () | Gets messagesCol value |
| int | GetN () | Gets n value |
| string | GetOutputCol () | Gets outputCol value |
| double | GetPresencePenalty () | Gets presencePenalty value |
| string | GetStop () | Gets stop value |
| string | GetSubscriptionKey () | Gets subscriptionKey value |
| double | GetTemperature () | Gets temperature value |
| double | GetTimeout () | Gets timeout value |
| double | GetTopP () | Gets topP value |
| string | GetUrl () | Gets url value |
| string | GetUser () | Gets user value |
| void | Save (string path) | Saves the object so that it can be loaded later using Load. Note that these objects can be shared with Scala by Loading or Saving in Scala. |
| JavaMLWriter | Write () |  |
| JavaMLReader<OpenAIChatCompletion> | Read () | Get the corresponding JavaMLReader instance. |
| OpenAIChatCompletion | SetCustomServiceName (string value) | Sets value for service name |
| OpenAIChatCompletion | SetEndpoint (string value) | Sets value for endpoint |
Static Public Member Functions

| Return type | Member | Description |
| --- | --- | --- |
| static OpenAIChatCompletion | Load (string path) | Loads the OpenAIChatCompletion that was previously saved using Save(string). |
Detailed Description

OpenAIChatCompletion implements OpenAIChatCompletion.

Constructor & Destructor Documentation

Synapse.ML.Services.Openai.OpenAIChatCompletion.OpenAIChatCompletion ( )  [inline]

Creates an OpenAIChatCompletion without any parameters.

Synapse.ML.Services.Openai.OpenAIChatCompletion.OpenAIChatCompletion (string uid)  [inline]

Creates an OpenAIChatCompletion with a UID that is used to give the OpenAIChatCompletion a unique ID.

| uid | An immutable unique ID for the object and its derivatives. |
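Neither constructor takes any configuration, so an instance is normally built up through the fluent setters documented on this page. A minimal sketch follows; the resource name, deployment name, key, and column names are placeholder values, not defaults.

```csharp
using Synapse.ML.Services.Openai;

// Placeholder resource details; substitute your own Azure OpenAI values.
var chat = new OpenAIChatCompletion()
    .SetCustomServiceName("my-openai-resource")   // hypothetical resource name
    .SetSubscriptionKey("<api-key>")              // or SetAADToken(...) for AAD auth
    .SetDeploymentName("gpt-4o")                  // hypothetical deployment name
    .SetMessagesCol("messages")
    .SetErrorCol("error")
    .SetOutputCol("chatCompletions");
```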
Member Function Documentation

string Synapse.ML.Services.Openai.OpenAIChatCompletion.GetAADToken ( )

Gets AADToken value

string Synapse.ML.Services.Openai.OpenAIChatCompletion.GetApiVersion ( )

Gets apiVersion value

int Synapse.ML.Services.Openai.OpenAIChatCompletion.GetBestOf ( )

Gets bestOf value

int Synapse.ML.Services.Openai.OpenAIChatCompletion.GetCacheLevel ( )

Gets cacheLevel value

int Synapse.ML.Services.Openai.OpenAIChatCompletion.GetConcurrency ( )

Gets concurrency value

double Synapse.ML.Services.Openai.OpenAIChatCompletion.GetConcurrentTimeout ( )

Gets concurrentTimeout value

string Synapse.ML.Services.Openai.OpenAIChatCompletion.GetCustomAuthHeader ( )

Gets CustomAuthHeader value

string Synapse.ML.Services.Openai.OpenAIChatCompletion.GetDeploymentName ( )

Gets deploymentName value

bool Synapse.ML.Services.Openai.OpenAIChatCompletion.GetEcho ( )

Gets echo value

string Synapse.ML.Services.Openai.OpenAIChatCompletion.GetErrorCol ( )

Gets errorCol value

double Synapse.ML.Services.Openai.OpenAIChatCompletion.GetFrequencyPenalty ( )

Gets frequencyPenalty value

object Synapse.ML.Services.Openai.OpenAIChatCompletion.GetHandler ( )

Gets handler value

int Synapse.ML.Services.Openai.OpenAIChatCompletion.GetLogProbs ( )

Gets logProbs value: Include the log probabilities on the logprobs most likely tokens, as well as the chosen tokens. So for example, if logprobs is 10, the API will return a list of the 10 most likely tokens. If logprobs is 0, only the chosen tokens will have logprobs returned. Minimum of 0 and maximum of 100 allowed.

int Synapse.ML.Services.Openai.OpenAIChatCompletion.GetMaxTokens ( )

Gets maxTokens value

string Synapse.ML.Services.Openai.OpenAIChatCompletion.GetMessagesCol ( )

Gets messagesCol value

int Synapse.ML.Services.Openai.OpenAIChatCompletion.GetN ( )

Gets n value

string Synapse.ML.Services.Openai.OpenAIChatCompletion.GetOutputCol ( )

Gets outputCol value

double Synapse.ML.Services.Openai.OpenAIChatCompletion.GetPresencePenalty ( )

Gets presencePenalty value

string Synapse.ML.Services.Openai.OpenAIChatCompletion.GetStop ( )

Gets stop value

string Synapse.ML.Services.Openai.OpenAIChatCompletion.GetSubscriptionKey ( )

Gets subscriptionKey value

double Synapse.ML.Services.Openai.OpenAIChatCompletion.GetTemperature ( )

Gets temperature value: What sampling temperature to use. Higher values mean the model will take more risks. We generally recommend using this or top_p but not both. Minimum of 0 and maximum of 2 allowed.

double Synapse.ML.Services.Openai.OpenAIChatCompletion.GetTimeout ( )

Gets timeout value

double Synapse.ML.Services.Openai.OpenAIChatCompletion.GetTopP ( )

Gets topP value: An alternative to sampling with temperature, where the model considers the results of the tokens with top_p probability mass. We generally recommend using this or temperature but not both. Minimum of 0 and maximum of 1 allowed.

string Synapse.ML.Services.Openai.OpenAIChatCompletion.GetUrl ( )

Gets url value

string Synapse.ML.Services.Openai.OpenAIChatCompletion.GetUser ( )

Gets user value
static OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.Load (string path)

Loads the OpenAIChatCompletion that was previously saved using Save(string).

| path | The path the previous OpenAIChatCompletion was saved to |

JavaMLReader<OpenAIChatCompletion> Synapse.ML.Services.Openai.OpenAIChatCompletion.Read ( )

Get the corresponding JavaMLReader instance.

void Synapse.ML.Services.Openai.OpenAIChatCompletion.Save (string path)

Saves the object so that it can be loaded later using Load. Note that these objects can be shared with Scala by Loading or Saving in Scala.

| path | The path to save the object to |
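A short sketch of round-tripping a configured instance through Save and Load; the path is a placeholder, and chat is the instance from the configuration sketch above.

```csharp
// Persist the configured transformer so it can be reloaded later, including from Scala.
chat.Save("/tmp/openai-chat-completion");

// Restore it in another session.
var restored = OpenAIChatCompletion.Load("/tmp/openai-chat-completion");
```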
OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetAADToken (string value)

Sets value for AADToken

| value | AAD Token used for authentication |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetAADTokenCol (string value)

Sets value for AADToken column

| value | AAD Token used for authentication |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetApiVersion (string value)

Sets value for apiVersion

| value | version of the api |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetApiVersionCol (string value)

Sets value for apiVersion column

| value | version of the api |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetBestOf (int value)

Sets value for bestOf

| value | How many generations to create server side, and display only the best. Will not stream intermediate progress if best_of > 1. Has maximum value of 128. |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetBestOfCol (string value)

Sets value for bestOf column

| value | How many generations to create server side, and display only the best. Will not stream intermediate progress if best_of > 1. Has maximum value of 128. |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetCacheLevel (int value)

Sets value for cacheLevel

| value | can be used to disable any server-side caching, 0=no cache, 1=prompt prefix enabled, 2=full cache |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetCacheLevelCol (string value)

Sets value for cacheLevel column

| value | can be used to disable any server-side caching, 0=no cache, 1=prompt prefix enabled, 2=full cache |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetConcurrency (int value)

Sets value for concurrency

| value | max number of concurrent calls |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetConcurrentTimeout (double value)

Sets value for concurrentTimeout

| value | max number seconds to wait on futures if concurrency >= 1 |
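concurrency and concurrentTimeout work together when tuning throughput; a sketch with illustrative values, reusing the chat instance from the earlier sketch:

```csharp
// Allow up to 4 concurrent calls and wait at most 60 seconds on
// outstanding futures before giving up (illustrative values).
chat.SetConcurrency(4)
    .SetConcurrentTimeout(60.0);
```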
OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetCustomAuthHeader (string value)

Sets value for CustomAuthHeader

| value | A Custom Value for Authorization Header |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetCustomAuthHeaderCol (string value)

Sets value for CustomAuthHeader column

| value | A Custom Value for Authorization Header |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetCustomServiceName (string value)

Sets value for service name

| value | Service name of the cognitive service if it's custom domain |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetDeploymentName (string value)

Sets value for deploymentName

| value | The name of the deployment |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetDeploymentNameCol (string value)

Sets value for deploymentName column

| value | The name of the deployment |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetEcho (bool value)

Sets value for echo

| value | Echo back the prompt in addition to the completion |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetEchoCol (string value)

Sets value for echo column

| value | Echo back the prompt in addition to the completion |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetEndpoint (string value)

Sets value for endpoint

| value | Endpoint of the cognitive service |
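SetCustomServiceName and SetEndpoint are two ways to point the transformer at the same resource. A sketch of the endpoint form, assuming the standard Azure OpenAI domain and a placeholder resource name:

```csharp
// Point directly at the service endpoint instead of supplying a custom service name.
chat.SetEndpoint("https://my-openai-resource.openai.azure.com/");
```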
OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetErrorCol (string value)

Sets value for errorCol

| value | column to hold http errors |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetFrequencyPenalty (double value)

Sets value for frequencyPenalty

| value | How much to penalize new tokens based on whether they appear in the text so far. Increases the likelihood of the model to talk about new topics. |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetFrequencyPenaltyCol (string value)

Sets value for frequencyPenalty column

| value | How much to penalize new tokens based on whether they appear in the text so far. Increases the likelihood of the model to talk about new topics. |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetHandler (object value)

Sets value for handler

| value | Which strategy to use when handling requests |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetLogProbs (int value)

Sets value for logProbs

| value | Include the log probabilities on the logprobs most likely tokens, as well as the chosen tokens. So for example, if logprobs is 10, the API will return a list of the 10 most likely tokens. If logprobs is 0, only the chosen tokens will have logprobs returned. Minimum of 0 and maximum of 100 allowed. |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetLogProbsCol (string value)

Sets value for logProbs column

| value | Include the log probabilities on the logprobs most likely tokens, as well as the chosen tokens. So for example, if logprobs is 10, the API will return a list of the 10 most likely tokens. If logprobs is 0, only the chosen tokens will have logprobs returned. Minimum of 0 and maximum of 100 allowed. |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetMaxTokens (int value)

Sets value for maxTokens

| value | The maximum number of tokens to generate. Has minimum of 0. |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetMaxTokensCol (string value)

Sets value for maxTokens column

| value | The maximum number of tokens to generate. Has minimum of 0. |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetMessagesCol (string value)

Sets value for messagesCol

| value | The column of messages to generate chat completions for, in the chat format. This column should have type Array(Struct(role: String, content: String)). |
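Because messagesCol must contain Array(Struct(role: String, content: String)), the input DataFrame needs a nested schema. The sketch below builds such a column with the Microsoft.Spark API and runs the transformer over it; the column names are illustrative, Transform is inherited from the Spark transformer base class rather than listed on this page, and chat is the instance from the earlier sketch.

```csharp
using System.Collections.Generic;
using Microsoft.Spark.Sql;
using Microsoft.Spark.Sql.Types;

var spark = SparkSession.Builder().GetOrCreate();

// Each chat message is a struct of (role, content); a conversation is an array of them.
var messageType = new StructType(new[]
{
    new StructField("role", new StringType()),
    new StructField("content", new StringType())
});
var schema = new StructType(new[]
{
    new StructField("messages", new ArrayType(messageType))
});

var rows = new List<GenericRow>
{
    new GenericRow(new object[]
    {
        new object[]
        {
            new GenericRow(new object[] { "system", "You are a helpful assistant." }),
            new GenericRow(new object[] { "user", "Summarize SynapseML in one sentence." })
        }
    })
};

DataFrame df = spark.CreateDataFrame(rows, schema);

// Responses land in the output column; HTTP failures land in the error column.
var results = chat.Transform(df);
results.Select("error", "chatCompletions").Show(20, 0);
```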
OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetN (int value)

Sets value for n

| value | How many snippets to generate for each prompt. Minimum of 1 and maximum of 128 allowed. |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetNCol (string value)

Sets value for n column

| value | How many snippets to generate for each prompt. Minimum of 1 and maximum of 128 allowed. |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetOutputCol (string value)

Sets value for outputCol

| value | The name of the output column |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetPresencePenalty (double value)

Sets value for presencePenalty

| value | How much to penalize new tokens based on their existing frequency in the text so far. Decreases the likelihood of the model to repeat the same line verbatim. Has minimum of -2 and maximum of 2. |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetPresencePenaltyCol (string value)

Sets value for presencePenalty column

| value | How much to penalize new tokens based on their existing frequency in the text so far. Decreases the likelihood of the model to repeat the same line verbatim. Has minimum of -2 and maximum of 2. |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetStop (string value)

Sets value for stop

| value | A sequence which indicates the end of the current document. |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetStopCol (string value)

Sets value for stop column

| value | A sequence which indicates the end of the current document. |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetSubscriptionKey (string value)

Sets value for subscriptionKey

| value | the API key to use |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetSubscriptionKeyCol (string value)

Sets value for subscriptionKey column

| value | the API key to use |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetTemperature (double value)

Sets value for temperature

| value | What sampling temperature to use. Higher values mean the model will take more risks. Try 0.9 for more creative applications, and 0 (argmax sampling) for ones with a well-defined answer. We generally recommend using this or top_p but not both. Minimum of 0 and maximum of 2 allowed. |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetTemperatureCol (string value)

Sets value for temperature column

| value | What sampling temperature to use. Higher values mean the model will take more risks. Try 0.9 for more creative applications, and 0 (argmax sampling) for ones with a well-defined answer. We generally recommend using this or top_p but not both. Minimum of 0 and maximum of 2 allowed. |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetTimeout (double value)

Sets value for timeout

| value | number of seconds to wait before closing the connection |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetTopP (double value)

Sets value for topP

| value | An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10 percent probability mass are considered. We generally recommend using this or temperature but not both. Minimum of 0 and maximum of 1 allowed. |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetTopPCol (string value)

Sets value for topP column

| value | An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10 percent probability mass are considered. We generally recommend using this or temperature but not both. Minimum of 0 and maximum of 1 allowed. |
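temperature and topP are alternatives; a sketch choosing one of them within the documented ranges, again using the chat instance from the earlier sketch:

```csharp
// Use either temperature (0-2) or topP (0-1), but not both.
chat.SetTemperature(0.7);
// chat.SetTopP(0.9);   // alternative: nucleus sampling instead of temperature
```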
OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetUrl (string value)

Sets value for url

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetUser (string value)

Sets value for user

| value | The ID of the end-user, for use in tracking and rate-limiting. |

OpenAIChatCompletion Synapse.ML.Services.Openai.OpenAIChatCompletion.SetUserCol (string value)

Sets value for user column

| value | The ID of the end-user, for use in tracking and rate-limiting. |