synapse.ml.nn package
Submodules
synapse.ml.nn.ConditionalBallTree module
- class synapse.ml.nn.ConditionalBallTree.ConditionalBallTree(keys, values, labels, leafSize, java_obj=None)[source]
Bases:
object
- findMaximumInnerProducts(queryPoint, conditioner, k)[source]
Find the best matches to the queryPoint given the conditioner and k.
- Parameters
queryPoint – array vector used to query for nearest neighbors
conditioner – set of labels that will subset or condition the nearest-neighbor query
k – int representing the maximum number of neighbors to return
- Returns
array of tuples representing the index of each match and its distance
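The query semantics can be sketched in plain Python. This is a hypothetical brute-force illustration of what a conditioned maximum-inner-product query computes, not the real ball-tree implementation (which runs on the JVM): among the keys whose label appears in the conditioner, return the k keys with the largest inner product against the query point.

```python
# Hypothetical brute-force sketch of findMaximumInnerProducts semantics.
# The real ConditionalBallTree prunes the search with a ball-tree index;
# here we simply score every key whose label is in the conditioner.

def find_maximum_inner_products(keys, labels, query_point, conditioner, k):
    """Return up to k (index, inner_product) pairs, best match first."""
    scored = [
        (i, sum(a * b for a, b in zip(key, query_point)))
        for i, key in enumerate(keys)
        if labels[i] in conditioner
    ]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:k]

keys = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
labels = ["a", "b", "a"]
# Only keys labeled "a" are candidates; key 1 is excluded by the conditioner.
matches = find_maximum_inner_products(keys, labels, [1.0, 0.0], {"a"}, 2)
```

The conditioner acts as a filter applied before ranking, which is why querying with a different label set can return entirely different neighbors for the same query point.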
synapse.ml.nn.ConditionalKNN module
- class synapse.ml.nn.ConditionalKNN.ConditionalKNN(java_obj=None, conditionerCol='conditioner', featuresCol='features', k=5, labelCol='labels', leafSize=50, outputCol='ConditionalKNN_83cd80d4f758_output', valuesCol='values')[source]
Bases:
pyspark.ml.util.MLReadable[pyspark.ml.util.RL]
- Parameters
- conditionerCol = Param(parent='undefined', name='conditionerCol', doc='column holding identifiers for features that will be returned when queried')
- featuresCol = Param(parent='undefined', name='featuresCol', doc='The name of the features column')
- getConditionerCol()[source]
- Returns
column holding identifiers for features that will be returned when queried
- Return type
conditionerCol
- getValuesCol()[source]
- Returns
column holding values for each feature (key) that will be returned when queried
- Return type
valuesCol
- k = Param(parent='undefined', name='k', doc='number of matches to return')
- labelCol = Param(parent='undefined', name='labelCol', doc='The name of the label column')
- leafSize = Param(parent='undefined', name='leafSize', doc='max size of the leaves of the tree')
- outputCol = Param(parent='undefined', name='outputCol', doc='The name of the output column')
- setConditionerCol(value)[source]
- Parameters
conditionerCol – column holding identifiers for features that will be returned when queried
- setParams(conditionerCol='conditioner', featuresCol='features', k=5, labelCol='labels', leafSize=50, outputCol='ConditionalKNN_83cd80d4f758_output', valuesCol='values')[source]
Set the (keyword only) parameters
- setValuesCol(value)[source]
- Parameters
valuesCol – column holding values for each feature (key) that will be returned when queried
- valuesCol = Param(parent='undefined', name='valuesCol', doc='column holding values for each feature (key) that will be returned when queried')
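The `leafSize` parameter above bounds how many points a single node of the underlying ball tree may hold before it is split. A minimal, hypothetical pure-Python sketch of that idea (the real index is built on the JVM, and uses a proper ball-shaped split rather than this crude coordinate sort):

```python
# Hypothetical sketch of how leafSize bounds the index: points are split
# recursively until a node holds at most leaf_size points, which then
# becomes a leaf. Smaller leaves mean a deeper tree and tighter pruning.

def build_tree(points, leaf_size):
    if len(points) <= leaf_size:
        return {"leaf": points}
    points = sorted(points)          # crude split along the first coordinate
    mid = len(points) // 2
    return {
        "left": build_tree(points[:mid], leaf_size),
        "right": build_tree(points[mid:], leaf_size),
    }

def leaf_sizes(node):
    """Collect the size of every leaf in the tree."""
    if "leaf" in node:
        return [len(node["leaf"])]
    return leaf_sizes(node["left"]) + leaf_sizes(node["right"])

tree = build_tree([(float(i),) for i in range(10)], leaf_size=3)
```

Tuning `leafSize` trades index depth against per-leaf scan cost: a very large value degenerates toward brute force, while a very small value increases tree-traversal overhead.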
synapse.ml.nn.ConditionalKNNModel module
- class synapse.ml.nn.ConditionalKNNModel.ConditionalKNNModel(java_obj=None, ballTree=None, conditionerCol=None, featuresCol=None, k=None, labelCol=None, leafSize=None, outputCol=None, valuesCol=None)[source]
Bases:
pyspark.ml.util.MLReadable[pyspark.ml.util.RL]
- Parameters
- ballTree = Param(parent='undefined', name='ballTree', doc='the ballTree model used for performing queries')
- conditionerCol = Param(parent='undefined', name='conditionerCol', doc='column holding identifiers for features that will be returned when queried')
- featuresCol = Param(parent='undefined', name='featuresCol', doc='The name of the features column')
- getConditionerCol()[source]
- Returns
column holding identifiers for features that will be returned when queried
- Return type
conditionerCol
- getValuesCol()[source]
- Returns
column holding values for each feature (key) that will be returned when queried
- Return type
valuesCol
- k = Param(parent='undefined', name='k', doc='number of matches to return')
- labelCol = Param(parent='undefined', name='labelCol', doc='The name of the label column')
- leafSize = Param(parent='undefined', name='leafSize', doc='max size of the leaves of the tree')
- outputCol = Param(parent='undefined', name='outputCol', doc='The name of the output column')
- setConditionerCol(value)[source]
- Parameters
conditionerCol – column holding identifiers for features that will be returned when queried
- setParams(ballTree=None, conditionerCol=None, featuresCol=None, k=None, labelCol=None, leafSize=None, outputCol=None, valuesCol=None)[source]
Set the (keyword only) parameters
- setValuesCol(value)[source]
- Parameters
valuesCol – column holding values for each feature (key) that will be returned when queried
- valuesCol = Param(parent='undefined', name='valuesCol', doc='column holding values for each feature (key) that will be returned when queried')
synapse.ml.nn.KNN module
- class synapse.ml.nn.KNN.KNN(java_obj=None, featuresCol='features', k=5, leafSize=50, outputCol='KNN_a0ebf26cba02_output', valuesCol='values')[source]
Bases:
pyspark.ml.util.MLReadable[pyspark.ml.util.RL]
- Parameters
- featuresCol = Param(parent='undefined', name='featuresCol', doc='The name of the features column')
- getValuesCol()[source]
- Returns
column holding values for each feature (key) that will be returned when queried
- Return type
valuesCol
- k = Param(parent='undefined', name='k', doc='number of matches to return')
- leafSize = Param(parent='undefined', name='leafSize', doc='max size of the leaves of the tree')
- outputCol = Param(parent='undefined', name='outputCol', doc='The name of the output column')
- setParams(featuresCol='features', k=5, leafSize=50, outputCol='KNN_a0ebf26cba02_output', valuesCol='values')[source]
Set the (keyword only) parameters
- setValuesCol(value)[source]
- Parameters
valuesCol – column holding values for each feature (key) that will be returned when queried
- valuesCol = Param(parent='undefined', name='valuesCol', doc='column holding values for each feature (key) that will be returned when queried')
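The unconditioned KNN transform can be sketched the same way: for each query row's feature vector, the output column (named by `outputCol`) receives the stored values whose feature vectors score best against it. This is a hypothetical brute-force illustration of the per-row semantics, not the distributed implementation:

```python
# Hypothetical sketch of the KNN transform semantics: for each query row,
# collect the values of the k best-scoring indexed feature vectors.
# Brute force for clarity; the real KNN uses a distributed ball tree.

def knn_transform(index_features, index_values, query_rows, k):
    """Return, per query row, the values of its k best matches."""
    out = []
    for q in query_rows:
        ranked = sorted(
            range(len(index_features)),
            key=lambda i: -sum(a * b for a, b in zip(index_features[i], q)),
        )
        out.append([index_values[i] for i in ranked[:k]])
    return out

# Index three vectors with associated values, then query with one row.
result = knn_transform(
    [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]],
    ["x", "y", "z"],
    [[1.0, 0.0]],
    k=2,
)
```

In the actual transformer, `featuresCol` supplies the query vectors, `valuesCol` the stored values, and `k` caps the number of matches written to `outputCol` per row.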
synapse.ml.nn.KNNModel module
- class synapse.ml.nn.KNNModel.KNNModel(java_obj=None, ballTree=None, featuresCol=None, k=None, leafSize=None, outputCol=None, valuesCol=None)[source]
Bases:
pyspark.ml.util.MLReadable[pyspark.ml.util.RL]
- Parameters
- ballTree = Param(parent='undefined', name='ballTree', doc='the ballTree model used for performing queries')
- featuresCol = Param(parent='undefined', name='featuresCol', doc='The name of the features column')
- getValuesCol()[source]
- Returns
column holding values for each feature (key) that will be returned when queried
- Return type
valuesCol
- k = Param(parent='undefined', name='k', doc='number of matches to return')
- leafSize = Param(parent='undefined', name='leafSize', doc='max size of the leaves of the tree')
- outputCol = Param(parent='undefined', name='outputCol', doc='The name of the output column')
- setParams(ballTree=None, featuresCol=None, k=None, leafSize=None, outputCol=None, valuesCol=None)[source]
Set the (keyword only) parameters
- setValuesCol(value)[source]
- Parameters
valuesCol – column holding values for each feature (key) that will be returned when queried
- valuesCol = Param(parent='undefined', name='valuesCol', doc='column holding values for each feature (key) that will be returned when queried')
Module contents
SynapseML is an ecosystem of tools aimed at expanding the distributed computing framework Apache Spark in several new directions. SynapseML adds many deep learning and data science tools to the Spark ecosystem, including seamless integration of Spark Machine Learning pipelines with the Microsoft Cognitive Toolkit (CNTK), LightGBM, and OpenCV. These tools enable powerful and highly scalable predictive and analytical models for a variety of data sources.
SynapseML also brings new networking capabilities to the Spark ecosystem. With the HTTP on Spark project, users can embed any web service into their SparkML models. In this vein, SynapseML provides easy-to-use SparkML transformers for a wide variety of Microsoft Cognitive Services. For production-grade deployment, the Spark Serving project enables high-throughput, sub-millisecond-latency web services backed by your Spark cluster.
SynapseML requires Scala 2.12, Spark 3.0+, and Python 3.6+.