synapse.ml.image package
Submodules
synapse.ml.image.SuperpixelTransformer module
- class synapse.ml.image.SuperpixelTransformer.SuperpixelTransformer(java_obj=None, cellSize=16.0, inputCol=None, modifier=130.0, outputCol='SuperpixelTransformer_2d74c1de8591_output')[source]
Bases: pyspark.ml.util.MLReadable[pyspark.ml.util.RL]
- Parameters
- cellSize = Param(parent='undefined', name='cellSize', doc='Number that controls the size of the superpixels')
- getCellSize()[source]
- Returns
Number that controls the size of the superpixels
- Return type
cellSize
- getModifier()[source]
- Returns
Controls the trade-off between spatial and color distance
- Return type
modifier
- inputCol = Param(parent='undefined', name='inputCol', doc='The name of the input column')
- modifier = Param(parent='undefined', name='modifier', doc='Controls the trade-off between spatial and color distance')
- outputCol = Param(parent='undefined', name='outputCol', doc='The name of the output column')
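A minimal usage sketch, assuming a running Spark session with the SynapseML package on the classpath and a DataFrame `images` whose `image` column holds decoded Spark images; the setter names mirror the params above, and the column names are illustrative:

```python
# Hypothetical sketch: requires Spark with the SynapseML package
# installed; `images` is assumed to be a DataFrame with decoded
# images in an "image" column.
from synapse.ml.image import SuperpixelTransformer

spt = (SuperpixelTransformer()
       .setInputCol("image")        # column containing the images
       .setCellSize(16.0)           # controls superpixel size
       .setModifier(130.0)          # spatial vs. color trade-off
       .setOutputCol("superpixels"))

clustered = spt.transform(images)   # adds the "superpixels" column
```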
synapse.ml.image.UnrollBinaryImage module
- class synapse.ml.image.UnrollBinaryImage.UnrollBinaryImage(java_obj=None, height=None, inputCol='image', nChannels=None, outputCol='UnrollImage_9ba1c299e0e6_output', width=None)[source]
Bases: pyspark.ml.util.MLReadable[pyspark.ml.util.RL]
- Parameters
- height = Param(parent='undefined', name='height', doc='the height of the image')
- inputCol = Param(parent='undefined', name='inputCol', doc='The name of the input column')
- nChannels = Param(parent='undefined', name='nChannels', doc='the number of channels of the target image')
- outputCol = Param(parent='undefined', name='outputCol', doc='The name of the output column')
- setParams(height=None, inputCol='image', nChannels=None, outputCol='UnrollImage_9ba1c299e0e6_output', width=None)[source]
Set the (keyword only) parameters
- width = Param(parent='undefined', name='width', doc='the width of the image')
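A sketch of configuring the transformer via its constructor keywords (signature above), assuming a Spark session with SynapseML and a DataFrame `binaries` holding binary-encoded images in an `image` column; the dimensions and column names are illustrative:

```python
# Hypothetical sketch: requires Spark with the SynapseML package
# installed; `binaries` is assumed to hold binary-encoded images.
from synapse.ml.image import UnrollBinaryImage

unroll = UnrollBinaryImage(
    inputCol="image",
    width=224,          # target image width
    height=224,         # target image height
    nChannels=3,        # e.g. RGB
    outputCol="features",
)

unrolled = unroll.transform(binaries)  # "features" holds the flat vector
```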
synapse.ml.image.UnrollImage module
- class synapse.ml.image.UnrollImage.UnrollImage(java_obj=None, inputCol='image', outputCol='UnrollImage_b604a7bf85dc_output')[source]
Bases: pyspark.ml.util.MLReadable[pyspark.ml.util.RL]
- Parameters
- inputCol = Param(parent='undefined', name='inputCol', doc='The name of the input column')
- outputCol = Param(parent='undefined', name='outputCol', doc='The name of the output column')
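Conceptually, both unroll transformers flatten each height x width x channels image into a single row-major feature vector that downstream SparkML stages can consume. A plain NumPy sketch of that flattening (an illustration only, not the transformer's actual code path):

```python
import numpy as np

def unroll_image(img: np.ndarray) -> np.ndarray:
    """Flatten a (height, width, channels) image array into a 1-D
    row-major vector, as the Unroll* transformers do conceptually
    for each row of the DataFrame."""
    return np.asarray(img, dtype=np.float64).ravel()

img = np.arange(2 * 3 * 3).reshape(2, 3, 3)  # tiny 2x3 3-channel image
vec = unroll_image(img)
print(vec.shape)  # (18,)
```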
Module contents
SynapseML is an ecosystem of tools aimed at expanding the distributed computing framework Apache Spark in several new directions. SynapseML adds many deep learning and data science tools to the Spark ecosystem, including seamless integration of Spark Machine Learning pipelines with the Microsoft Cognitive Toolkit (CNTK), LightGBM, and OpenCV. These tools enable powerful and highly scalable predictive and analytical models for a variety of data sources.
SynapseML also brings new networking capabilities to the Spark ecosystem. With the HTTP on Spark project, users can embed any web service into their SparkML models. In this vein, SynapseML provides easy-to-use SparkML transformers for a wide variety of Microsoft Cognitive Services. For production-grade deployment, the Spark Serving project enables high-throughput, sub-millisecond-latency web services backed by your Spark cluster.
SynapseML requires Scala 2.12, Spark 3.0+, and Python 3.6+.