Spark MLlib is a new component under active development; it was first released with Spark 0.8.0. It contains common machine learning algorithms and utilities, including classification, regression, clustering, collaborative filtering, and dimensionality reduction, as well as some underlying optimization primitives. A detailed list of the available algorithms can be found in the official MLlib documentation.
To add Spark MLlib to a Play Scala application, follow these steps:
1). Add the following dependencies to the build.sbt file.
The dependency "org.apache.spark" %% "spark-mllib" % "1.0.1" is the one specific to Spark MLlib.
As you can see, we have upgraded to Spark 1.0.1, the latest release of Apache Spark at the time of writing.
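Based on the coordinates named above, the build.sbt addition might look like the following sketch. The spark-core line is an assumption: MLlib depends on Spark core, which a Play project would not otherwise pull in.

```scala
// build.sbt (sketch): Spark MLlib plus its Spark core dependency.
// The %% operator appends the project's Scala binary version to the artifact name.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "1.0.1",
  "org.apache.spark" %% "spark-mllib" % "1.0.1"
)
```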
2). Create a file app/utils/SparkMLLibUtility.scala and add the following code to it.
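The body of that file is not included in this excerpt, so as an illustration, here is a minimal sketch of what such a utility object could contain: a KMeans clustering example using the MLlib 1.0.1 API. The input path and the choice of algorithm are assumptions, not the original author's code.

```scala
package utils

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.clustering.KMeans
import org.apache.spark.mllib.linalg.Vectors

object SparkMLLibUtility {

  def example(): Unit = {
    // Run Spark locally inside the Play application (a standalone
    // cluster master URL could be used here instead).
    val conf = new SparkConf()
      .setMaster("local")
      .setAppName("SparkMLLibExample")
    val sc = new SparkContext(conf)

    try {
      // Hypothetical input file: one space-separated feature vector per line.
      val data = sc.textFile("public/data/kmeans_data.txt")
      val parsedData =
        data.map(line => Vectors.dense(line.split(' ').map(_.toDouble)))

      // Cluster the data into two groups over at most 20 iterations.
      val numClusters = 2
      val numIterations = 20
      val model = KMeans.train(parsedData, numClusters, numIterations)

      // Evaluate the clustering via the Within Set Sum of Squared Errors.
      val wssse = model.computeCost(parsedData)
      println(s"Within Set Sum of Squared Errors = $wssse")
    } finally {
      sc.stop()
    }
  }
}
```

Note the `sc.stop()` in a finally block: a Play application keeps running after the job, so the SparkContext should be released explicitly.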