MLContext
@Deprecated public class MLContext extends Object
Typical usage of MLContext is as follows:
scala> import org.apache.sysml.api.MLContext
Create input DataFrames from CSV files and optionally perform feature transformations:
scala> val W = sparkSession.read.format("com.databricks.spark.csv").option("header", "false").load("W.csv")
scala> val H = sparkSession.read.format("com.databricks.spark.csv").option("header", "false").load("H.csv")
scala> val V = sparkSession.read.format("com.databricks.spark.csv").option("header", "false").load("V.csv")
Create MLContext
scala> val ml = new MLContext(sc)
Register input and output DataFrames/RDDs:
scala> ml.registerInput("V", V)
scala> ml.registerInput("W", W)
scala> ml.registerInput("H", H)
scala> ml.registerOutput("H")
scala> ml.registerOutput("W")
Call script with default arguments:
scala> val outputs = ml.execute("GNMF.dml")
Also supported: calling the script with positional arguments (args) or named arguments (nargs):
scala> val args = Array("V.mtx", "W.mtx", "H.mtx", "2000", "1500", "50", "1", "WOut.mtx", "HOut.mtx")
scala> val nargs = Map("maxIter"->"1", "V" -> "")
scala> val outputs = ml.execute("GNMF.dml", args) // or ml.execute("GNMF_namedArgs.dml", nargs)
To run the script again with different (or even the same) arguments while reusing the registered inputs and outputs:
scala> val new_outputs = ml.execute("GNMF.dml", new_args)
However, to register new inputs/outputs, you must first reset the MLContext:
scala> ml.reset()
scala> ml.registerInput("V", newV)
Experimental API: to monitor performance (supported only on Spark 1.4.0 or higher), create the MLContext with monitoring enabled:
scala> val ml = new MLContext(sc, true)
If performance monitoring is enabled, you can inspect the collected information:
scala> print(ml.getMonitoringUtil().getExplainOutput())
scala> ml.getMonitoringUtil().getRuntimeInfoInHTML("runtime.html")
Note: The execute(...) methods do not support parallel calls from the same or different MLContext instances, because the current SystemML engine does not allow multiple invocations in the same JVM. If you plan to build a system that may create multiple MLContext instances, it is recommended to guard each execute(...) call using:
synchronized(MLContext.class) { ml.execute(...); }
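To illustrate why the class-level guard above works, here is a minimal, self-contained sketch in plain Java (no SystemML or Spark dependency). `SingleInvocationEngine` and `runGuarded` are hypothetical stand-ins, not part of the MLContext API: the engine rejects overlapping invocations in the same JVM, just as the SystemML engine does, and wrapping every call in a `synchronized` block on a shared class object serializes the calls so all of them succeed.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.concurrent.atomic.AtomicInteger;

public class GuardedEngineDemo {
    // Hypothetical stand-in for an engine (like SystemML's) that tolerates
    // only one in-flight invocation per JVM and fails on overlapping calls.
    static class SingleInvocationEngine {
        private final AtomicBoolean busy = new AtomicBoolean(false);

        int execute() throws InterruptedException {
            if (!busy.compareAndSet(false, true))
                throw new IllegalStateException("parallel invocation in same JVM");
            try {
                Thread.sleep(5); // simulate script execution work
                return 1;
            } finally {
                busy.set(false);
            }
        }
    }

    // Guard every execute() call with a class-level lock, mirroring the
    // recommended pattern: synchronized(MLContext.class) { ml.execute(...); }
    static int runGuarded(int threads) throws InterruptedException {
        SingleInvocationEngine engine = new SingleInvocationEngine();
        AtomicInteger succeeded = new AtomicInteger(0);
        List<Thread> workers = new ArrayList<>();
        for (int i = 0; i < threads; i++) {
            Thread t = new Thread(() -> {
                // Class-level guard: at most one execute() runs at a time.
                synchronized (SingleInvocationEngine.class) {
                    try {
                        succeeded.addAndGet(engine.execute());
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }
            });
            workers.add(t);
            t.start();
        }
        for (Thread t : workers) t.join();
        return succeeded.get();
    }

    public static void main(String[] args) throws InterruptedException {
        // All calls are serialized by the guard, so every one succeeds.
        System.out.println(runGuarded(8));
    }
}
```

Without the `synchronized` block, concurrent callers would intermittently hit the `IllegalStateException`; with it, the calls simply queue up.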
All constructors and methods of this class are deprecated.

| Constructor and Description |
|---|
| `MLContext(org.apache.spark.api.java.JavaSparkContext sc)` Creates an MLContext associated with the given JavaSparkContext. |
| `MLContext(org.apache.spark.SparkContext sc)` Creates an MLContext associated with the given SparkContext. |
| Modifier and Type | Method and Description |
|---|---|
| `MLOutput` | `execute(String dmlScriptFilePath)` Execute a DML script without arguments using the default configuration. |
| `MLOutput` | `execute(String dmlScriptFilePath, ArrayList<String> args)` Execute a DML script with positional arguments using the default configuration. |
| `MLOutput` | `execute(String dmlScriptFilePath, ArrayList<String> argsName, ArrayList<String> argsValues)` Execute a DML script with argument name/value lists using the default configuration. |
| `MLOutput` | `execute(String dmlScriptFilePath, ArrayList<String> argsName, ArrayList<String> argsValues, String configFilePath)` Execute a DML script with argument name/value lists using the specified config file. |
| `MLOutput` | `execute(String dmlScriptFilePath, ArrayList<String> args, boolean parsePyDML)` Experimental: execute the script (as PyDML if parsePyDML=true) with positional arguments using the default configuration. |
| `MLOutput` | `execute(String dmlScriptFilePath, ArrayList<String> args, boolean parsePyDML, String configFilePath)` Experimental: execute the script (as PyDML if parsePyDML=true) with positional arguments using the specified config file. |
| `MLOutput` | `execute(String dmlScriptFilePath, ArrayList<String> args, String configFilePath)` Execute a DML script with positional arguments using the specified config file. Implemented for compatibility with the Python MLContext. |
| `MLOutput` | `execute(String dmlScriptFilePath, boolean parsePyDML)` Experimental: execute the script (as PyDML if parsePyDML=true) without arguments using the default configuration. |
| `MLOutput` | `execute(String dmlScriptFilePath, boolean parsePyDML, String configFilePath)` Experimental: execute the script (as PyDML if parsePyDML=true) without arguments using the specified config file. |
| `MLOutput` | `execute(String dmlScriptFilePath, Map<String,String> namedArgs)` Execute a DML script with named arguments using the default configuration. |
| `MLOutput` | `execute(String dmlScriptFilePath, scala.collection.immutable.Map<String,String> namedArgs)` Execute a DML script with named arguments. |
| `MLOutput` | `execute(String dmlScriptFilePath, Map<String,String> namedArgs, boolean parsePyDML)` Experimental: execute the script (as PyDML if parsePyDML=true) with named arguments. |
| `MLOutput` | `execute(String dmlScriptFilePath, scala.collection.immutable.Map<String,String> namedArgs, boolean parsePyDML)` Experimental: execute the script (as PyDML if parsePyDML=true) with named arguments. |
| `MLOutput` | `execute(String dmlScriptFilePath, Map<String,String> namedArgs, boolean parsePyDML, String configFilePath)` Execute a DML script with named arguments using the specified config file. |
| `MLOutput` | `execute(String dmlScriptFilePath, Map<String,String> namedArgs, String configFilePath)` Execute a DML script with named arguments using the specified config file. |
| `MLOutput` | `execute(String dmlScriptFilePath, String configFilePath)` Execute a DML script without arguments using the specified config file. |
| `MLOutput` | `execute(String dmlScriptFilePath, String[] args)` Execute a DML script with positional arguments using the default configuration. |
| `MLOutput` | `execute(String dmlScriptFilePath, String[] args, boolean parsePyDML)` Experimental: execute the script (as PyDML if parsePyDML=true) with positional arguments using the default configuration. |
| `MLOutput` | `execute(String dmlScriptFilePath, String[] args, boolean parsePyDML, String configFilePath)` Experimental: execute the script (as PyDML if parsePyDML=true) with positional arguments using the specified config file. |
| `MLOutput` | `execute(String dmlScriptFilePath, String[] args, String configFilePath)` Execute a DML script with positional arguments using the specified config file. |
| `MLOutput` | `executeScript(String dmlScript)` Execute a script stored in a string. |
| `MLOutput` | `executeScript(String dmlScript, ArrayList<String> argsName, ArrayList<String> argsValues)` |
| `MLOutput` | `executeScript(String dmlScript, ArrayList<String> argsName, ArrayList<String> argsValues, String configFilePath)` |
| `MLOutput` | `executeScript(String dmlScript, boolean isPyDML)` |
| `MLOutput` | `executeScript(String dmlScript, boolean isPyDML, String configFilePath)` |
| `MLOutput` | `executeScript(String dmlScript, scala.collection.immutable.Map<String,String> namedArgs)` |
| `MLOutput` | `executeScript(String dmlScript, Map<String,String> namedArgs)` |
| `MLOutput` | `executeScript(String dmlScript, scala.collection.immutable.Map<String,String> namedArgs, boolean isPyDML)` |
| `MLOutput` | `executeScript(String dmlScript, Map<String,String> namedArgs, boolean isPyDML)` |
| `MLOutput` | `executeScript(String dmlScript, scala.collection.immutable.Map<String,String> namedArgs, boolean isPyDML, String configFilePath)` |
| `MLOutput` | `executeScript(String dmlScript, Map<String,String> namedArgs, boolean isPyDML, String configFilePath)` |
| `MLOutput` | `executeScript(String dmlScript, scala.collection.immutable.Map<String,String> namedArgs, String configFilePath)` |
| `MLOutput` | `executeScript(String dmlScript, Map<String,String> namedArgs, String configFilePath)` |
| `MLOutput` | `executeScript(String dmlScript, String configFilePath)` |
| `org.apache.spark.SparkContext` | `getSparkContext()` |
| `MLMatrix` | `read(org.apache.spark.sql.SparkSession sparkSession, String filePath, String format)` Experimental API: may be discontinued in a future release. |
| `MLMatrix` | `read(org.apache.spark.sql.SQLContext sqlContext, String filePath, String format)` Experimental API: may be discontinued in a future release. |
| `void` | `registerFrameInput(String varName, org.apache.spark.sql.Dataset<org.apache.spark.sql.Row> df)` Register a DataFrame as input. |
| `void` | `registerFrameInput(String varName, org.apache.spark.sql.Dataset<org.apache.spark.sql.Row> df, boolean containsID)` Register a DataFrame as input. |
| `void` | `registerInput(String varName, org.apache.spark.sql.Dataset<org.apache.spark.sql.Row> df)` Register a DataFrame as input. |
| `void` | `registerInput(String varName, org.apache.spark.sql.Dataset<org.apache.spark.sql.Row> df, boolean containsID)` Register a DataFrame as input. |
| `void` | `registerInput(String varName, org.apache.spark.api.java.JavaPairRDD<MatrixIndexes,MatrixBlock> rdd, long rlen, long clen)` Register a binary-blocked RDD with given dimensions, default block sizes, and no nnz. |
| `void` | `registerInput(String varName, org.apache.spark.api.java.JavaPairRDD<MatrixIndexes,MatrixBlock> rdd, long rlen, long clen, int brlen, int bclen)` Register a binary-blocked RDD with given dimensions and block sizes, and no nnz. |
| `void` | `registerInput(String varName, org.apache.spark.api.java.JavaPairRDD<MatrixIndexes,MatrixBlock> rdd, long rlen, long clen, int brlen, int bclen, long nnz)` Register a binary-blocked RDD with given dimensions, block sizes, and nnz (preferred). |
| `void` | `registerInput(String varName, org.apache.spark.api.java.JavaPairRDD<MatrixIndexes,MatrixBlock> rdd, MatrixCharacteristics mc)` |
| `void` | `registerInput(String varName, org.apache.spark.api.java.JavaRDD<String> rdd, String format)` Register CSV/text input: convenience method without dimensions or nnz. |
| `void` | `registerInput(String varName, org.apache.spark.api.java.JavaRDD<String> rdd, String format, boolean hasHeader, String delim, boolean fill, double fillValue)` Register CSV/text input with CSV format properties, but without dimensions or nnz. |
| `void` | `registerInput(String varName, org.apache.spark.api.java.JavaRDD<String> rdd, String format, boolean hasHeader, String delim, boolean fill, double fillValue, long rlen, long clen, long nnz)` Register CSV/text input with CSV format properties, dimensions, and nnz. |
| `void` | `registerInput(String varName, org.apache.spark.api.java.JavaRDD<String> rdd, String format, long rlen, long clen)` Register CSV/text input: convenience method with dimensions but no nnz. |
| `void` | `registerInput(String varName, org.apache.spark.api.java.JavaRDD<String> rddIn, String format, long rlen, long clen, FileFormatProperties props, List<org.apache.sysml.parser.Expression.ValueType> schema)` Register a frame with CSV/text input, with dimensions. |
| `void` | `registerInput(String varName, org.apache.spark.api.java.JavaRDD<String> rdd, String format, long rlen, long clen, long nnz)` Register CSV/text input with dimensions and nnz. |
| `void` | `registerInput(String varName, MatrixBlock mb)` |
| `void` | `registerInput(String varName, MatrixBlock mb, MatrixCharacteristics mc)` |
| `void` | `registerInput(String varName, MLMatrix df)` Experimental API. |
| `void` | `registerInput(String varName, org.apache.spark.rdd.RDD<String> rdd, String format)` Register CSV/text input: convenience method without dimensions or nnz. |
| `void` | `registerInput(String varName, org.apache.spark.rdd.RDD<String> rdd, String format, boolean hasHeader, String delim, boolean fill, double fillValue)` Register CSV/text input with CSV format properties, but without dimensions or nnz. |
| `void` | `registerInput(String varName, org.apache.spark.rdd.RDD<String> rdd, String format, boolean hasHeader, String delim, boolean fill, double fillValue, long rlen, long clen, long nnz)` Register CSV/text input with CSV format properties, dimensions, and nnz. |
| `void` | `registerInput(String varName, org.apache.spark.rdd.RDD<String> rdd, String format, long rlen, long clen)` Register CSV/text input: convenience method with dimensions but no nnz. |
| `void` | `registerInput(String varName, org.apache.spark.rdd.RDD<String> rdd, String format, long rlen, long clen, long nnz)` Register CSV/text input with dimensions and nnz. |
| `void` | `registerOutput(String varName)` Marks the variable in the DML script as an output variable. |
| `void` | `reset()` Clears any RDDs set via registerInput and outputs set via registerOutput. |
| `void` | `reset(boolean cleanupConfig)` |
| `void` | `setConfig(String paramName, String paramVal)` Allows users to provide a custom name/value configuration parameter. |
public MLContext(org.apache.spark.SparkContext sc) throws DMLRuntimeException
Creates an MLContext associated with the given SparkContext.
Parameters: sc - SparkContext
Throws: DMLRuntimeException - if DMLRuntimeException occurs

public MLContext(org.apache.spark.api.java.JavaSparkContext sc) throws DMLRuntimeException
Creates an MLContext associated with the given JavaSparkContext.
Parameters: sc - JavaSparkContext
Throws: DMLRuntimeException - if DMLRuntimeException occurs

public org.apache.spark.SparkContext getSparkContext()
public void setConfig(String paramName, String paramVal)
Allows users to provide a custom name/value configuration parameter.
Parameters: paramName - parameter name; paramVal - parameter value

public void registerInput(String varName, org.apache.spark.sql.Dataset<org.apache.spark.sql.Row> df) throws DMLRuntimeException
Marks the variable in the DML script as an input variable. Note that this expects a "varName = read(...)" statement in the DML script which, in a non-MLContext invocation, would have been satisfied by reading an HDFS file.
Parameters: varName - variable name; df - the DataFrame
Throws: DMLRuntimeException - if DMLRuntimeException occurs

The remaining registerInput and registerFrameInput overloads below behave the same way: each marks varName as an input variable (expecting a corresponding "varName = read(...)" statement in the script) and throws DMLRuntimeException if a runtime error occurs. Common parameters: varName - variable name; df - the DataFrame or MLMatrix; containsID - whether the DataFrame has an ID column that denotes the row ID; rdd/rddIn - the RDD or JavaRDD of input data; format - the input format; hasHeader - whether the CSV input has a header; delim - the delimiter; fill - if true, fill missing values, otherwise don't; fillValue - the fill value; rlen - rows; clen - columns; nnz - non-zeros; brlen/bclen - block rows/columns; props - file format properties; schema - list of column value types; mc - matrix characteristics.

public void registerFrameInput(String varName, org.apache.spark.sql.Dataset<org.apache.spark.sql.Row> df) throws DMLRuntimeException
public void registerInput(String varName, org.apache.spark.sql.Dataset<org.apache.spark.sql.Row> df, boolean containsID) throws DMLRuntimeException
public void registerFrameInput(String varName, org.apache.spark.sql.Dataset<org.apache.spark.sql.Row> df, boolean containsID) throws DMLRuntimeException
public void registerInput(String varName, MLMatrix df) throws DMLRuntimeException
public void registerInput(String varName, org.apache.spark.api.java.JavaRDD<String> rdd, String format, boolean hasHeader, String delim, boolean fill, double fillValue) throws DMLRuntimeException
public void registerInput(String varName, org.apache.spark.rdd.RDD<String> rdd, String format, boolean hasHeader, String delim, boolean fill, double fillValue) throws DMLRuntimeException
public void registerInput(String varName, org.apache.spark.rdd.RDD<String> rdd, String format, boolean hasHeader, String delim, boolean fill, double fillValue, long rlen, long clen, long nnz) throws DMLRuntimeException
public void registerInput(String varName, org.apache.spark.api.java.JavaRDD<String> rdd, String format, boolean hasHeader, String delim, boolean fill, double fillValue, long rlen, long clen, long nnz) throws DMLRuntimeException
public void registerInput(String varName, org.apache.spark.rdd.RDD<String> rdd, String format) throws DMLRuntimeException
public void registerInput(String varName, org.apache.spark.api.java.JavaRDD<String> rdd, String format) throws DMLRuntimeException
public void registerInput(String varName, org.apache.spark.api.java.JavaRDD<String> rdd, String format, long rlen, long clen) throws DMLRuntimeException
public void registerInput(String varName, org.apache.spark.rdd.RDD<String> rdd, String format, long rlen, long clen) throws DMLRuntimeException
public void registerInput(String varName, org.apache.spark.api.java.JavaRDD<String> rdd, String format, long rlen, long clen, long nnz) throws DMLRuntimeException
public void registerInput(String varName, org.apache.spark.rdd.RDD<String> rdd, String format, long rlen, long clen, long nnz) throws DMLRuntimeException
public void registerInput(String varName, org.apache.spark.api.java.JavaRDD<String> rddIn, String format, long rlen, long clen, FileFormatProperties props, List<org.apache.sysml.parser.Expression.ValueType> schema) throws DMLRuntimeException
public void registerInput(String varName, org.apache.spark.api.java.JavaPairRDD<MatrixIndexes,MatrixBlock> rdd, long rlen, long clen) throws DMLRuntimeException
public void registerInput(String varName, org.apache.spark.api.java.JavaPairRDD<MatrixIndexes,MatrixBlock> rdd, long rlen, long clen, int brlen, int bclen) throws DMLRuntimeException
public void registerInput(String varName, org.apache.spark.api.java.JavaPairRDD<MatrixIndexes,MatrixBlock> rdd, long rlen, long clen, int brlen, int bclen, long nnz) throws DMLRuntimeException
public void registerInput(String varName, org.apache.spark.api.java.JavaPairRDD<MatrixIndexes,MatrixBlock> rdd, MatrixCharacteristics mc) throws DMLRuntimeException
public void registerInput(String varName, MatrixBlock mb) throws DMLRuntimeException
Throws: DMLRuntimeException

public void registerInput(String varName, MatrixBlock mb, MatrixCharacteristics mc) throws DMLRuntimeException
Throws: DMLRuntimeException
public void registerOutput(String varName) throws DMLRuntimeException
Marks the variable in the DML script as an output variable.
Parameters: varName - variable name
Throws: DMLRuntimeException - if DMLRuntimeException occurs

public MLOutput execute(String dmlScriptFilePath, Map<String,String> namedArgs, boolean parsePyDML, String configFilePath) throws IOException, DMLException, org.apache.sysml.parser.ParseException
Parameters: dmlScriptFilePath - path to the DML script (local file system or HDFS); namedArgs - named arguments; parsePyDML - true if PyDML, false otherwise; configFilePath - path to config file
Throws: IOException, DMLException, org.apache.sysml.parser.ParseException

The remaining execute(...) overloads below throw the same exceptions and share the same parameter semantics (args - positional arguments; argsName/argsValues - argument names and values):

public MLOutput execute(String dmlScriptFilePath, Map<String,String> namedArgs, String configFilePath)
public MLOutput execute(String dmlScriptFilePath, Map<String,String> namedArgs)
public MLOutput execute(String dmlScriptFilePath, scala.collection.immutable.Map<String,String> namedArgs)
public MLOutput execute(String dmlScriptFilePath, Map<String,String> namedArgs, boolean parsePyDML)
public MLOutput execute(String dmlScriptFilePath, scala.collection.immutable.Map<String,String> namedArgs, boolean parsePyDML)
public MLOutput execute(String dmlScriptFilePath, String[] args, String configFilePath)
public MLOutput execute(String dmlScriptFilePath, ArrayList<String> args, String configFilePath)
public MLOutput execute(String dmlScriptFilePath, String[] args)
public MLOutput execute(String dmlScriptFilePath, ArrayList<String> args)
public MLOutput execute(String dmlScriptFilePath, ArrayList<String> args, boolean parsePyDML)
public MLOutput execute(String dmlScriptFilePath, ArrayList<String> args, boolean parsePyDML, String configFilePath)
public MLOutput execute(String dmlScriptFilePath, ArrayList<String> argsName, ArrayList<String> argsValues, String configFilePath)
public MLOutput execute(String dmlScriptFilePath, ArrayList<String> argsName, ArrayList<String> argsValues)
public MLOutput execute(String dmlScriptFilePath, String[] args, boolean parsePyDML, String configFilePath)
public MLOutput execute(String dmlScriptFilePath, String[] args, boolean parsePyDML)
public MLOutput execute(String dmlScriptFilePath, String configFilePath)
public MLOutput execute(String dmlScriptFilePath)
public MLOutput execute(String dmlScriptFilePath, boolean parsePyDML, String configFilePath)
public MLOutput execute(String dmlScriptFilePath, boolean parsePyDML)

public void reset() throws DMLRuntimeException
Throws: DMLRuntimeException

public void reset(boolean cleanupConfig) throws DMLRuntimeException
Throws: DMLRuntimeException
public MLOutput executeScript(String dmlScript) throws IOException, DMLException
Execute a script stored in a string.
Parameters: dmlScript - the script
Throws: IOException, DMLException

public MLOutput executeScript(String dmlScript, boolean isPyDML) throws IOException, DMLException
Throws: IOException, DMLException
public MLOutput executeScript(String dmlScript, String configFilePath) throws IOException, DMLException
public MLOutput executeScript(String dmlScript, boolean isPyDML, String configFilePath) throws IOException, DMLException
public MLOutput executeScript(String dmlScript, ArrayList<String> argsName, ArrayList<String> argsValues, String configFilePath) throws IOException, DMLException, org.apache.sysml.parser.ParseException
public MLOutput executeScript(String dmlScript, ArrayList<String> argsName, ArrayList<String> argsValues) throws IOException, DMLException, org.apache.sysml.parser.ParseException
public MLOutput executeScript(String dmlScript, scala.collection.immutable.Map<String,String> namedArgs) throws IOException, DMLException
public MLOutput executeScript(String dmlScript, scala.collection.immutable.Map<String,String> namedArgs, boolean isPyDML) throws IOException, DMLException
public MLOutput executeScript(String dmlScript, scala.collection.immutable.Map<String,String> namedArgs, String configFilePath) throws IOException, DMLException
public MLOutput executeScript(String dmlScript, scala.collection.immutable.Map<String,String> namedArgs, boolean isPyDML, String configFilePath) throws IOException, DMLException
public MLOutput executeScript(String dmlScript, Map<String,String> namedArgs) throws IOException, DMLException
public MLOutput executeScript(String dmlScript, Map<String,String> namedArgs, boolean isPyDML) throws IOException, DMLException
public MLOutput executeScript(String dmlScript, Map<String,String> namedArgs, String configFilePath) throws IOException, DMLException
public MLOutput executeScript(String dmlScript, Map<String,String> namedArgs, boolean isPyDML, String configFilePath) throws IOException, DMLException

public MLMatrix read(org.apache.spark.sql.SparkSession sparkSession, String filePath, String format) throws IOException, DMLException, org.apache.sysml.parser.ParseException
Parameters: sparkSession - the SparkSession; filePath - the file path; format - the format
Throws: IOException, DMLException, org.apache.sysml.parser.ParseException

public MLMatrix read(org.apache.spark.sql.SQLContext sqlContext, String filePath, String format) throws IOException, DMLException, org.apache.sysml.parser.ParseException
Parameters: sqlContext - the SQLContext; filePath - the file path; format - the format
Throws: IOException, DMLException, org.apache.sysml.parser.ParseException

Copyright © 2017 The Apache Software Foundation. All rights reserved.