SynapseML 0.10.0

Synapse.ML.Cognitive.OpenAICompletion

OpenAICompletion implements OpenAICompletion.


Public Member Functions

| Member | Description |
| --- | --- |
| OpenAICompletion() | Creates an OpenAICompletion without any parameters. |
| OpenAICompletion(string uid) | Creates an OpenAICompletion with a UID that is used to give the OpenAICompletion a unique ID. |
| OpenAICompletion SetApiVersion(string value) | Sets value for apiVersion |
| OpenAICompletion SetApiVersionCol(string value) | Sets value for apiVersion column |
| OpenAICompletion SetBatchIndexPrompt(int[][] value) | Sets value for batchIndexPrompt |
| OpenAICompletion SetBatchIndexPromptCol(string value) | Sets value for batchIndexPrompt column |
| OpenAICompletion SetBatchPrompt(string[] value) | Sets value for batchPrompt |
| OpenAICompletion SetBatchPromptCol(string value) | Sets value for batchPrompt column |
| OpenAICompletion SetBestOf(int value) | Sets value for bestOf |
| OpenAICompletion SetBestOfCol(string value) | Sets value for bestOf column |
| OpenAICompletion SetCacheLevel(int value) | Sets value for cacheLevel |
| OpenAICompletion SetCacheLevelCol(string value) | Sets value for cacheLevel column |
| OpenAICompletion SetConcurrency(int value) | Sets value for concurrency |
| OpenAICompletion SetConcurrentTimeout(double value) | Sets value for concurrentTimeout |
| OpenAICompletion SetDeploymentName(string value) | Sets value for deploymentName |
| OpenAICompletion SetDeploymentNameCol(string value) | Sets value for deploymentName column |
| OpenAICompletion SetEcho(bool value) | Sets value for echo |
| OpenAICompletion SetEchoCol(string value) | Sets value for echo column |
| OpenAICompletion SetErrorCol(string value) | Sets value for errorCol |
| OpenAICompletion SetFrequencyPenalty(double value) | Sets value for frequencyPenalty |
| OpenAICompletion SetFrequencyPenaltyCol(string value) | Sets value for frequencyPenalty column |
| OpenAICompletion SetHandler(object value) | Sets value for handler |
| OpenAICompletion SetIndexPrompt(int[] value) | Sets value for indexPrompt |
| OpenAICompletion SetIndexPromptCol(string value) | Sets value for indexPrompt column |
| OpenAICompletion SetLogProbs(int value) | Sets value for logProbs |
| OpenAICompletion SetLogProbsCol(string value) | Sets value for logProbs column |
| OpenAICompletion SetMaxTokens(int value) | Sets value for maxTokens |
| OpenAICompletion SetMaxTokensCol(string value) | Sets value for maxTokens column |
| OpenAICompletion SetModel(string value) | Sets value for model |
| OpenAICompletion SetModelCol(string value) | Sets value for model column |
| OpenAICompletion SetN(int value) | Sets value for n |
| OpenAICompletion SetNCol(string value) | Sets value for n column |
| OpenAICompletion SetOutputCol(string value) | Sets value for outputCol |
| OpenAICompletion SetPresencePenalty(double value) | Sets value for presencePenalty |
| OpenAICompletion SetPresencePenaltyCol(string value) | Sets value for presencePenalty column |
| OpenAICompletion SetPrompt(string value) | Sets value for prompt |
| OpenAICompletion SetPromptCol(string value) | Sets value for prompt column |
| OpenAICompletion SetStop(string value) | Sets value for stop |
| OpenAICompletion SetStopCol(string value) | Sets value for stop column |
| OpenAICompletion SetSubscriptionKey(string value) | Sets value for subscriptionKey |
| OpenAICompletion SetSubscriptionKeyCol(string value) | Sets value for subscriptionKey column |
| OpenAICompletion SetTemperature(double value) | Sets value for temperature |
| OpenAICompletion SetTemperatureCol(string value) | Sets value for temperature column |
| OpenAICompletion SetTimeout(double value) | Sets value for timeout |
| OpenAICompletion SetTopP(double value) | Sets value for topP |
| OpenAICompletion SetTopPCol(string value) | Sets value for topP column |
| OpenAICompletion SetUrl(string value) | Sets value for url |
| OpenAICompletion SetUser(string value) | Sets value for user |
| OpenAICompletion SetUserCol(string value) | Sets value for user column |
| string GetApiVersion() | Gets apiVersion value |
| int[][] GetBatchIndexPrompt() | Gets batchIndexPrompt value |
| string[] GetBatchPrompt() | Gets batchPrompt value |
| int GetBestOf() | Gets bestOf value |
| int GetCacheLevel() | Gets cacheLevel value |
| int GetConcurrency() | Gets concurrency value |
| double GetConcurrentTimeout() | Gets concurrentTimeout value |
| string GetDeploymentName() | Gets deploymentName value |
| bool GetEcho() | Gets echo value |
| string GetErrorCol() | Gets errorCol value |
| double GetFrequencyPenalty() | Gets frequencyPenalty value |
| object GetHandler() | Gets handler value |
| int[] GetIndexPrompt() | Gets indexPrompt value |
| int GetLogProbs() | Gets logProbs value |
| int GetMaxTokens() | Gets maxTokens value |
| string GetModel() | Gets model value |
| int GetN() | Gets n value |
| string GetOutputCol() | Gets outputCol value |
| double GetPresencePenalty() | Gets presencePenalty value |
| string GetPrompt() | Gets prompt value |
| string GetStop() | Gets stop value |
| string GetSubscriptionKey() | Gets subscriptionKey value |
| double GetTemperature() | Gets temperature value |
| double GetTimeout() | Gets timeout value |
| double GetTopP() | Gets topP value |
| string GetUrl() | Gets url value |
| string GetUser() | Gets user value |
| void Save(string path) | Saves the object so that it can be loaded later using Load. Note that these objects can be shared with Scala by Loading or Saving in Scala. |
| JavaMLWriter Write() | |
| JavaMLReader<OpenAICompletion> Read() | Get the corresponding JavaMLReader instance. |
Static Public Member Functions

| Member | Description |
| --- | --- |
| static OpenAICompletion Load(string path) | Loads the OpenAICompletion that was previously saved using Save(string). |
OpenAICompletion implements OpenAICompletion.

Synapse.ML.Cognitive.OpenAICompletion.OpenAICompletion() [inline]

Creates an OpenAICompletion without any parameters.

Synapse.ML.Cognitive.OpenAICompletion.OpenAICompletion(string uid) [inline]

Creates an OpenAICompletion with a UID that is used to give the OpenAICompletion a unique ID.
| uid | An immutable unique ID for the object and its derivatives. |
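As a brief, hedged sketch of the two constructors documented above (assuming only that the class is consumed from the Synapse.ML.Cognitive namespace shown in the qualified member names on this page):

```csharp
using Synapse.ML.Cognitive;

// Sketch of both constructors; "myCompletionStage" is an arbitrary
// placeholder UID, not a value taken from this reference.
var completion = new OpenAICompletion();                         // auto-generated UID
var namedCompletion = new OpenAICompletion("myCompletionStage"); // caller-supplied UID
```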
string Synapse.ML.Cognitive.OpenAICompletion.GetApiVersion()
Gets apiVersion value

int[][] Synapse.ML.Cognitive.OpenAICompletion.GetBatchIndexPrompt()
Gets batchIndexPrompt value

string[] Synapse.ML.Cognitive.OpenAICompletion.GetBatchPrompt()
Gets batchPrompt value

int Synapse.ML.Cognitive.OpenAICompletion.GetBestOf()
Gets bestOf value

int Synapse.ML.Cognitive.OpenAICompletion.GetCacheLevel()
Gets cacheLevel value

int Synapse.ML.Cognitive.OpenAICompletion.GetConcurrency()
Gets concurrency value

double Synapse.ML.Cognitive.OpenAICompletion.GetConcurrentTimeout()
Gets concurrentTimeout value

string Synapse.ML.Cognitive.OpenAICompletion.GetDeploymentName()
Gets deploymentName value

bool Synapse.ML.Cognitive.OpenAICompletion.GetEcho()
Gets echo value

string Synapse.ML.Cognitive.OpenAICompletion.GetErrorCol()
Gets errorCol value

double Synapse.ML.Cognitive.OpenAICompletion.GetFrequencyPenalty()
Gets frequencyPenalty value

object Synapse.ML.Cognitive.OpenAICompletion.GetHandler()
Gets handler value

int[] Synapse.ML.Cognitive.OpenAICompletion.GetIndexPrompt()
Gets indexPrompt value

int Synapse.ML.Cognitive.OpenAICompletion.GetLogProbs()
Gets logProbs value
Include the log probabilities on the logprobs most likely tokens, as well the chosen tokens. So for example, if logprobs is 10, the API will return a list of the 10 most likely tokens. If logprobs is 0, only the chosen tokens will have logprobs returned. Minimum of 0 and maximum of 100 allowed.

int Synapse.ML.Cognitive.OpenAICompletion.GetMaxTokens()
Gets maxTokens value
string Synapse.ML.Cognitive.OpenAICompletion.GetModel()
Gets model value

int Synapse.ML.Cognitive.OpenAICompletion.GetN()
Gets n value

string Synapse.ML.Cognitive.OpenAICompletion.GetOutputCol()
Gets outputCol value

double Synapse.ML.Cognitive.OpenAICompletion.GetPresencePenalty()
Gets presencePenalty value

string Synapse.ML.Cognitive.OpenAICompletion.GetPrompt()
Gets prompt value

string Synapse.ML.Cognitive.OpenAICompletion.GetStop()
Gets stop value

string Synapse.ML.Cognitive.OpenAICompletion.GetSubscriptionKey()
Gets subscriptionKey value

double Synapse.ML.Cognitive.OpenAICompletion.GetTemperature()
Gets temperature value
What sampling temperature to use. Higher values means the model will take more risks. Try 0.9 for more creative applications, and 0 (argmax sampling) for ones with a well-defined answer. We generally recommend using this or top_p but not both. Minimum of 0 and maximum of 2 allowed.

double Synapse.ML.Cognitive.OpenAICompletion.GetTimeout()
Gets timeout value
double Synapse.ML.Cognitive.OpenAICompletion.GetTopP()
Gets topP value
An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. We generally recommend using this or temperature but not both. Minimum of 0 and maximum of 1 allowed.

string Synapse.ML.Cognitive.OpenAICompletion.GetUrl()
Gets url value
string Synapse.ML.Cognitive.OpenAICompletion.GetUser()
Gets user value
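Because every setter above returns the OpenAICompletion itself, configuration can be chained and then inspected with these getters. A minimal sketch (the deployment name and numeric values are placeholders, not values taken from this reference):

```csharp
// Chain a few setters, then read the values back with the corresponding getters.
var completion = new OpenAICompletion()
    .SetDeploymentName("my-deployment")   // placeholder deployment name
    .SetMaxTokens(200)
    .SetTemperature(0.7);

string deployment = completion.GetDeploymentName(); // "my-deployment"
int maxTokens = completion.GetMaxTokens();          // 200
double temperature = completion.GetTemperature();   // 0.7
```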
static OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.Load(string path)
Loads the OpenAICompletion that was previously saved using Save(string).
| path | The path the previous OpenAICompletion was saved to |
JavaMLReader<OpenAICompletion> Synapse.ML.Cognitive.OpenAICompletion.Read()
Get the corresponding JavaMLReader instance.
void Synapse.ML.Cognitive.OpenAICompletion.Save(string path)
Saves the object so that it can be loaded later using Load. Note that these objects can be shared with Scala by Loading or Saving in Scala.
| path | The path to save the object to |
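A sketch of the Save/Load round trip described above; the path is an arbitrary placeholder, and per the note above the saved stage can also be loaded from Scala:

```csharp
var completion = new OpenAICompletion()
    .SetDeploymentName("my-deployment")   // placeholder
    .SetMaxTokens(200);

// Persist the configured stage, then restore it later (or from Scala).
completion.Save("/tmp/openai-completion-stage");
OpenAICompletion restored = OpenAICompletion.Load("/tmp/openai-completion-stage");
```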
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetApiVersion(string value)
Sets value for apiVersion

| value | version of the api |

OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetApiVersionCol(string value)
Sets value for apiVersion column
| value | version of the api |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetBatchIndexPrompt(int[][] value)
Sets value for batchIndexPrompt
| value | Sequence of index sequences to complete |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetBatchIndexPromptCol(string value)
Sets value for batchIndexPrompt column
| value | Sequence of index sequences to complete |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetBatchPrompt(string[] value)
Sets value for batchPrompt
| value | Sequence of prompts to complete |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetBatchPromptCol(string value)
Sets value for batchPrompt column
| value | Sequence of prompts to complete |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetBestOf(int value)
Sets value for bestOf
| value | How many generations to create server side, and display only the best. Will not stream intermediate progress if best_of > 1. Has maximum value of 128. |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetBestOfCol(string value)
Sets value for bestOf column
| value | How many generations to create server side, and display only the best. Will not stream intermediate progress if best_of > 1. Has maximum value of 128. |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetCacheLevel(int value)
Sets value for cacheLevel
| value | can be used to disable any server-side caching, 0=no cache, 1=prompt prefix enabled, 2=full cache |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetCacheLevelCol(string value)
Sets value for cacheLevel column
| value | can be used to disable any server-side caching, 0=no cache, 1=prompt prefix enabled, 2=full cache |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetConcurrency(int value)
Sets value for concurrency
| value | max number of concurrent calls |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetConcurrentTimeout(double value)
Sets value for concurrentTimeout
| value | max number seconds to wait on futures if concurrency >= 1 |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetDeploymentName(string value)
Sets value for deploymentName
| value | The name of the deployment |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetDeploymentNameCol(string value)
Sets value for deploymentName column
| value | The name of the deployment |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetEcho(bool value)
Sets value for echo
| value | Echo back the prompt in addition to the completion |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetEchoCol(string value)
Sets value for echo column
| value | Echo back the prompt in addition to the completion |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetErrorCol(string value)
Sets value for errorCol
| value | column to hold http errors |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetFrequencyPenalty(double value)
Sets value for frequencyPenalty
| value | How much to penalize new tokens based on whether they appear in the text so far. Increases the model's likelihood to talk about new topics. |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetFrequencyPenaltyCol(string value)
Sets value for frequencyPenalty column
| value | How much to penalize new tokens based on whether they appear in the text so far. Increases the model's likelihood to talk about new topics. |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetHandler(object value)
Sets value for handler
| value | Which strategy to use when handling requests |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetIndexPrompt(int[] value)
Sets value for indexPrompt
| value | Sequence of indexes to complete |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetIndexPromptCol(string value)
Sets value for indexPrompt column
| value | Sequence of indexes to complete |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetLogProbs(int value)
Sets value for logProbs
| value | Include the log probabilities on the logprobs most likely tokens, as well the chosen tokens. So for example, if logprobs is 10, the API will return a list of the 10 most likely tokens. If logprobs is 0, only the chosen tokens will have logprobs returned. Minimum of 0 and maximum of 100 allowed. |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetLogProbsCol(string value)
Sets value for logProbs column
| value | Include the log probabilities on the logprobs most likely tokens, as well the chosen tokens. So for example, if logprobs is 10, the API will return a list of the 10 most likely tokens. If logprobs is 0, only the chosen tokens will have logprobs returned. Minimum of 0 and maximum of 100 allowed. |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetMaxTokens(int value)
Sets value for maxTokens
| value | The maximum number of tokens to generate. Has minimum of 0. |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetMaxTokensCol(string value)
Sets value for maxTokens column
| value | The maximum number of tokens to generate. Has minimum of 0. |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetModel(string value)
Sets value for model
| value | The name of the model to use |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetModelCol(string value)
Sets value for model column
| value | The name of the model to use |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetN(int value)
Sets value for n
| value | How many snippets to generate for each prompt. Minimum of 1 and maximum of 128 allowed. |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetNCol(string value)
Sets value for n column
| value | How many snippets to generate for each prompt. Minimum of 1 and maximum of 128 allowed. |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetOutputCol(string value)
Sets value for outputCol
| value | The name of the output column |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetPresencePenalty(double value)
Sets value for presencePenalty
| value | How much to penalize new tokens based on their existing frequency in the text so far. Decreases the model's likelihood to repeat the same line verbatim. Has minimum of -2 and maximum of 2. |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetPresencePenaltyCol(string value)
Sets value for presencePenalty column
| value | How much to penalize new tokens based on their existing frequency in the text so far. Decreases the model's likelihood to repeat the same line verbatim. Has minimum of -2 and maximum of 2. |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetPrompt(string value)
Sets value for prompt

| value | The text to complete |

OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetPromptCol(string value)
Sets value for prompt column
| value | The text to complete |
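As a hedged end-to-end sketch of column-based usage: the transformer reads prompts from the column named by SetPromptCol and writes results to the column named by SetOutputCol. The example assumes a running SparkSession via Microsoft.Spark, a Transform(DataFrame) method inherited from the transformer base class (not listed on this page), and placeholder key, endpoint, and deployment values:

```csharp
using Microsoft.Spark.Sql;
using Microsoft.Spark.Sql.Types;
using Synapse.ML.Cognitive;

SparkSession spark = SparkSession.Builder().GetOrCreate();

// A small DataFrame with one prompt per row.
DataFrame prompts = spark.CreateDataFrame(
    new[]
    {
        new GenericRow(new object[] { "Once upon a time" }),
        new GenericRow(new object[] { "The best way to learn Spark is" })
    },
    new StructType(new[] { new StructField("prompt", new StringType()) }));

var completion = new OpenAICompletion()
    .SetSubscriptionKey("<your-api-key>")                // placeholder key
    .SetUrl("https://<your-resource>.openai.azure.com/") // placeholder endpoint
    .SetDeploymentName("<your-deployment>")              // placeholder deployment
    .SetPromptCol("prompt")
    .SetOutputCol("completions")
    .SetErrorCol("errors");

// Transform is assumed to be inherited from the base transformer class.
DataFrame completed = completion.Transform(prompts);
completed.Show();
```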
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetStop(string value)
Sets value for stop
| value | A sequence which indicates the end of the current document. |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetStopCol(string value)
Sets value for stop column
| value | A sequence which indicates the end of the current document. |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetSubscriptionKey(string value)
Sets value for subscriptionKey
| value | the API key to use |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetSubscriptionKeyCol(string value)
Sets value for subscriptionKey column
| value | the API key to use |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetTemperature(double value)
Sets value for temperature
| value | What sampling temperature to use. Higher values means the model will take more risks. Try 0.9 for more creative applications, and 0 (argmax sampling) for ones with a well-defined answer. We generally recommend using this or top_p but not both. Minimum of 0 and maximum of 2 allowed. |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetTemperatureCol(string value)
Sets value for temperature column
| value | What sampling temperature to use. Higher values means the model will take more risks. Try 0.9 for more creative applications, and 0 (argmax sampling) for ones with a well-defined answer. We generally recommend using this or top_p but not both. Minimum of 0 and maximum of 2 allowed. |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetTimeout(double value)
Sets value for timeout
| value | number of seconds to wait before closing the connection |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetTopP(double value)
Sets value for topP
| value | An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. We generally recommend using this or temperature but not both. Minimum of 0 and maximum of 1 allowed. |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetTopPCol(string value)
Sets value for topP column
| value | An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. We generally recommend using this or temperature but not both. Minimum of 0 and maximum of 1 allowed. |
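Since the descriptions above recommend setting temperature or topP but not both, a short sketch of the two alternatives (values are illustrative only):

```csharp
// Either vary sampling via temperature...
var creative = new OpenAICompletion().SetTemperature(0.9); // more varied completions

// ...or via nucleus sampling with topP, but not both.
var focused = new OpenAICompletion().SetTopP(0.1); // keep only the top 10% probability mass
```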
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetUrl(string value)
Sets value for url

OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetUser(string value)
Sets value for user
| value | The ID of the end-user, for use in tracking and rate-limiting. |
OpenAICompletion Synapse.ML.Cognitive.OpenAICompletion.SetUserCol(string value)
Sets value for user column
| value | The ID of the end-user, for use in tracking and rate-limiting. |