synapse.ml.image package

Submodules

synapse.ml.image.SuperpixelTransformer module

class synapse.ml.image.SuperpixelTransformer.SuperpixelTransformer(java_obj=None, cellSize=16.0, inputCol=None, modifier=130.0, outputCol='SuperpixelTransformer_147ebe794bfd_output')[source]

Bases: pyspark.ml.util.MLReadable[pyspark.ml.util.RL]

Parameters
  • cellSize (float) – Number that controls the size of the superpixels

  • inputCol (str) – The name of the input column

  • modifier (float) – Controls the trade-off between spatial and color distance

  • outputCol (str) – The name of the output column

cellSize = Param(parent='undefined', name='cellSize', doc='Number that controls the size of the superpixels')
getCellSize()[source]
Returns

Number that controls the size of the superpixels

Return type

float

getInputCol()[source]
Returns

The name of the input column

Return type

str

static getJavaPackage()[source]

Returns the package name as a string.

getModifier()[source]
Returns

Controls the trade-off between spatial and color distance

Return type

float

getOutputCol()[source]
Returns

The name of the output column

Return type

str

inputCol = Param(parent='undefined', name='inputCol', doc='The name of the input column')
modifier = Param(parent='undefined', name='modifier', doc='Controls the trade-off between spatial and color distance')
outputCol = Param(parent='undefined', name='outputCol', doc='The name of the output column')
classmethod read()[source]

Returns an MLReader instance for this class.

setCellSize(value)[source]
Parameters

cellSize – Number that controls the size of the superpixels

setInputCol(value)[source]
Parameters

inputCol – The name of the input column

setModifier(value)[source]
Parameters

modifier – Controls the trade-off between spatial and color distance

setOutputCol(value)[source]
Parameters

outputCol – The name of the output column

setParams(cellSize=16.0, inputCol=None, modifier=130.0, outputCol='SuperpixelTransformer_147ebe794bfd_output')[source]

Set the (keyword-only) parameters.
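
Example: a minimal usage sketch (not part of the generated reference). It assumes a SparkSession with the SynapseML jars on the classpath and images loaded with Spark's built-in "image" data source; the input path, the output column name, and the parameter values shown are illustrative.

    from pyspark.sql import SparkSession
    from synapse.ml.image.SuperpixelTransformer import SuperpixelTransformer

    spark = SparkSession.builder.getOrCreate()  # SynapseML assumed to be on the classpath

    # Spark's image data source yields a struct column named "image".
    images = spark.read.format("image").load("path/to/images")

    spt = (SuperpixelTransformer()
           .setInputCol("image")
           .setCellSize(16.0)       # superpixel size (default shown in the signature above)
           .setModifier(130.0)      # spatial vs. color distance trade-off
           .setOutputCol("superpixels"))

    segmented = spt.transform(images)
    segmented.select("superpixels").show()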

synapse.ml.image.UnrollBinaryImage module

class synapse.ml.image.UnrollBinaryImage.UnrollBinaryImage(java_obj=None, height=None, inputCol='image', nChannels=None, outputCol='UnrollImage_715ccf14105e_output', width=None)[source]

Bases: pyspark.ml.util.MLReadable[pyspark.ml.util.RL]

Parameters
  • height (int) – the height of the image

  • inputCol (str) – The name of the input column

  • nChannels (int) – the number of channels of the target image

  • outputCol (str) – The name of the output column

  • width (int) – the width of the image

getHeight()[source]
Returns

the height of the image

Return type

int

getInputCol()[source]
Returns

The name of the input column

Return type

str

static getJavaPackage()[source]

Returns the package name as a string.

getNChannels()[source]
Returns

the number of channels of the target image

Return type

int

getOutputCol()[source]
Returns

The name of the output column

Return type

str

getWidth()[source]
Returns

the width of the image

Return type

int

height = Param(parent='undefined', name='height', doc='the height of the image')
inputCol = Param(parent='undefined', name='inputCol', doc='The name of the input column')
nChannels = Param(parent='undefined', name='nChannels', doc='the number of channels of the target image')
outputCol = Param(parent='undefined', name='outputCol', doc='The name of the output column')
classmethod read()[source]

Returns an MLReader instance for this class.

setHeight(value)[source]
Parameters

height – the height of the image

setInputCol(value)[source]
Parameters

inputCol – The name of the input column

setNChannels(value)[source]
Parameters

nChannels – the number of channels of the target image

setOutputCol(value)[source]
Parameters

outputCol – The name of the output column

setParams(height=None, inputCol='image', nChannels=None, outputCol='UnrollImage_715ccf14105e_output', width=None)[source]

Set the (keyword-only) parameters.

setWidth(value)[source]
Parameters

width – the width of the image

width = Param(parent='undefined', name='width', doc='the width of the image')
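
Example: a minimal usage sketch (not part of the generated reference). It assumes raw image bytes read with Spark's "binaryFile" data source, whose bytes appear in a column named "content"; the input path, the 32x32x3 target shape, and the output column name are illustrative assumptions.

    from pyspark.sql import SparkSession
    from synapse.ml.image.UnrollBinaryImage import UnrollBinaryImage

    spark = SparkSession.builder.getOrCreate()  # SynapseML assumed to be on the classpath

    # binaryFile exposes the raw bytes of each file in a "content" column.
    binaries = spark.read.format("binaryFile").load("path/to/images/*.jpg")

    unroller = (UnrollBinaryImage()
                .setInputCol("content")
                .setWidth(32)        # target width (assumed)
                .setHeight(32)       # target height (assumed)
                .setNChannels(3)     # target number of channels (assumed)
                .setOutputCol("features"))

    unrolled = unroller.transform(binaries)
    unrolled.select("features").show()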

synapse.ml.image.UnrollImage module

class synapse.ml.image.UnrollImage.UnrollImage(java_obj=None, inputCol='image', outputCol='UnrollImage_9ca50a5a4431_output')[source]

Bases: pyspark.ml.util.MLReadable[pyspark.ml.util.RL]

Parameters
  • inputCol (str) – The name of the input column

  • outputCol (str) – The name of the output column

getInputCol()[source]
Returns

The name of the input column

Return type

str

static getJavaPackage()[source]

Returns the package name as a string.

getOutputCol()[source]
Returns

The name of the output column

Return type

str

inputCol = Param(parent='undefined', name='inputCol', doc='The name of the input column')
outputCol = Param(parent='undefined', name='outputCol', doc='The name of the output column')
classmethod read()[source]

Returns an MLReader instance for this class.

setInputCol(value)[source]
Parameters

inputCol – The name of the input column

setOutputCol(value)[source]
Parameters

outputCol – The name of the output column

setParams(inputCol='image', outputCol='UnrollImage_9ca50a5a4431_output')[source]

Set the (keyword-only) parameters.
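
Example: a minimal usage sketch (not part of the generated reference). It flattens images loaded with the built-in "image" data source into a single vector column; the input path and the output column name are illustrative.

    from pyspark.sql import SparkSession
    from synapse.ml.image.UnrollImage import UnrollImage

    spark = SparkSession.builder.getOrCreate()  # SynapseML assumed to be on the classpath

    images = spark.read.format("image").load("path/to/images")

    unroller = (UnrollImage()
                .setInputCol("image")
                .setOutputCol("unrolled"))

    features = unroller.transform(images)
    features.select("unrolled").show()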

Module contents

SynapseML is an ecosystem of tools aimed at expanding the distributed computing framework Apache Spark in several new directions. SynapseML adds many deep learning and data science tools to the Spark ecosystem, including seamless integration of Spark Machine Learning pipelines with the Microsoft Cognitive Toolkit (CNTK), LightGBM, and OpenCV. These tools enable powerful and highly scalable predictive and analytical models for a variety of data sources.

SynapseML also brings new networking capabilities to the Spark ecosystem. With the HTTP on Spark project, users can embed any web service into their SparkML models. In this vein, SynapseML provides easy-to-use SparkML transformers for a wide variety of Microsoft Cognitive Services. For production-grade deployment, the Spark Serving project enables high-throughput, sub-millisecond-latency web services backed by your Spark cluster.

SynapseML requires Scala 2.12, Spark 3.0+, and Python 3.6+.
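
A minimal sketch (with assumed coordinates) of attaching SynapseML to a local PySpark session via spark.jars.packages; the version shown is only an example, so pin it to the release you actually use, and depending on that release an additional repository configured through spark.jars.repositories may also be needed.

    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("synapseml-example")
             # Example Maven coordinate; substitute the SynapseML version matching your cluster.
             .config("spark.jars.packages",
                     "com.microsoft.azure:synapseml_2.12:0.9.5")
             .getOrCreate())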