synapse.ml.services.langchain package

Submodules

synapse.ml.services.langchain.LangchainTransform module

This module defines the Langchain transformation in SynapseML. To use it, first define a chain, then set that chain as a parameter of the LangchainTransformer, along with the input and output columns. The transformer then applies the operation defined by the Langchain chain to the input column and writes the result to the output column. Example Usage:

>>> transformer = (
...     LangchainTransformer()
...     .setInputCol("input_column_name")
...     .setOutputCol("output_column_name")
...     .setChain(pre_defined_chain)
...     .setSubscriptionKey(OPENAI_API_KEY)
...     .setUrl(baseURL)
... )
>>> transformer.transform(sentenceDataFrame)
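
The example above assumes a pre_defined_chain created beforehand. Below is a minimal, hypothetical sketch of how such a chain might be built with the legacy langchain API; the deployment name, prompt, and variable names are illustrative placeholders rather than part of this module:

>>> # Hypothetical chain definition (legacy langchain API); point the
>>> # deployment, key, and endpoint at your own Azure OpenAI resource.
>>> from langchain.llms import AzureOpenAI
>>> from langchain.prompts import PromptTemplate
>>> from langchain.chains import LLMChain
>>> llm = AzureOpenAI(
...     deployment_name="text-davinci-003",
...     openai_api_key=OPENAI_API_KEY,
...     openai_api_base=baseURL,
...     openai_api_version="2022-12-01",
... )
>>> prompt = PromptTemplate(
...     input_variables=["technology"],
...     template="Define the following technology: {technology}",
... )
>>> pre_defined_chain = LLMChain(llm=llm, prompt=prompt)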

If the chain does not have memory, you can also save and load the LangchainTransformer. Saving chains with memory is not currently supported in Langchain, so transformers containing such chains cannot be saved. Example Usage:

>>> transformer.save(path)
>>> loaded_transformer = LangchainTransformer.load(path)
class synapse.ml.services.langchain.LangchainTransform.LangchainTransformer(inputCol=None, outputCol=None, chain=None, subscriptionKey=None, url=None, apiVersion='2022-12-01', errorCol='errorCol')[source]

Bases: pyspark.ml.util.MLReadable[pyspark.ml.util.RL]

getApiVersion()[source]
getChain()[source]
getErrorCol()[source]
Returns

The name of the error column

Return type

str

getSubscriptionKey()[source]
getUrl()[source]
classmethod read() → LangchainTransformerParamsReader[RL][source]

Returns a LangchainTransformerParamsReader instance for this class.
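In PySpark's MLReadable pattern, the classmethod load(path) shown in the example above is shorthand for obtaining this reader and calling its load method; for illustration:

>>> loaded_transformer = LangchainTransformer.read().load(path)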

setApiVersion(value: str)[source]
setChain(value)[source]
setErrorCol(value: str)[source]

Sets the value of errorCol.

setInputCol(value: str)[source]

Sets the value of inputCol.

setOutputCol(value: str)[source]

Sets the value of outputCol.

setParams(inputCol=None, outputCol=None, chain=None, subscriptionKey=None, url=None, apiVersion='2022-12-01', errorCol='errorCol')[source]
setSubscriptionKey(value: str)[source]

Set the OpenAI API key.

setUrl(value: str)[source]
write() → synapse.ml.services.langchain.LangchainTransform.LangchainTransformerParamsWriter[source]

Returns a LangchainTransformerParamsWriter instance for this class.
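For illustration, a typical save/load round trip goes through this writer and the reader above via the standard PySpark MLWriter interface (path is a placeholder):

>>> transformer.write().overwrite().save(path)
>>> loaded_transformer = LangchainTransformer.load(path)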

class synapse.ml.services.langchain.LangchainTransform.LangchainTransformerParamsReader(cls: Type[pyspark.ml.util.DefaultParamsReadable[pyspark.ml.util.RL]])[source]

Bases: pyspark.ml.util.MLReader[pyspark.ml.util.RL]

load(path: str) → synapse.ml.services.langchain.LangchainTransform.RL[source]

Load the ML instance from the input path.

class synapse.ml.services.langchain.LangchainTransform.LangchainTransformerParamsWriter(instance: Params)[source]

Bases: pyspark.ml.util.DefaultParamsWriter

saveImpl(path: str) → None[source]

save() handles overwriting and then calls this method. Subclasses should override this method to implement the actual saving of the instance.

Module contents

SynapseML is an ecosystem of tools aimed at expanding the distributed computing framework Apache Spark in several new directions. SynapseML adds many deep learning and data science tools to the Spark ecosystem, including seamless integration of Spark Machine Learning pipelines with the Microsoft Cognitive Toolkit (CNTK), LightGBM, and OpenCV. These tools enable powerful and highly scalable predictive and analytical models for a variety of data sources.

SynapseML also brings new networking capabilities to the Spark ecosystem. With the HTTP on Spark project, users can embed any web service into their SparkML models. In this vein, SynapseML provides easy-to-use SparkML transformers for a wide variety of Microsoft Cognitive Services. For production-grade deployment, the Spark Serving project enables high-throughput, sub-millisecond-latency web services, backed by your Spark cluster.

SynapseML requires Scala 2.12, Spark 3.0+, and Python 3.6+.