Spark is an open-source, in-memory cluster computing system that enables fast iterative and interactive analytics. Spark utilizes Scala, a type-safe, object-oriented language with functional properties that is fully interoperable with Java. For more information about Spark, please refer to http://spark-project.org. To test out Spark, you can install the standalone version on Mac OSX.
This is a follow-up to my previous blog post on the topic, Installing Spark 0.6.1 Standalone on OSX Mountain Lion (10.8). Since that post, Spark has added some interesting features, including:
- An associated in-memory file system called Tachyon; more info at AmpLab Tachyon and Shark Update.
- An associated graph analytics framework on top of Spark called GraphX; more info at AmpLab GraphX: Graph Analytics on Spark.
Install Scala 2.9.3
The first thing you will need to do is install Scala 2.9.3, as Spark 0.7.2 depends on it. As of this posting, the current version of Scala is 2.10, but there are some issues with Spark 0.7.2 and Scala 2.10.
1) A handy way to install Scala is to use Homebrew; please reference Installing Hadoop on OSX Lion (10.7) for more information on how to use Homebrew as well as how to install Hadoop on Mac OSX. It may be handy to install Hadoop so that you can use Spark against HDFS as well.
2) The current Homebrew scala formula installs Scala 2.10, but you will need Scala 2.9.3. A quick way to do this is to modify the scala.rb formula (/usr/local/Library/Formula/scala.rb) to install Scala 2.9.3.
3) Install Scala via Homebrew by typing the following command in a bash terminal:
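Assuming the scala.rb formula has been modified as in step 2, the install command looks like this:

```shell
# Install Scala using the (modified) Homebrew formula
brew install scala
```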
Upon running this command, Scala will be located in /usr/local/Cellar/scala.
Ensure you have set the JAVA_HOME and SCALA_HOME environment variables
In my case, I have configured my .profile with the following:
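A minimal sketch of such a .profile follows; the Scala path below is an assumption for a typical Homebrew install, so adjust the version and location to match your machine:

```shell
# ~/.profile -- example settings; paths are assumptions, adjust to your setup
export JAVA_HOME=$(/usr/libexec/java_home)
export SCALA_HOME=/usr/local/Cellar/scala/2.9.3
export PATH=$PATH:$SCALA_HOME/bin
```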
Installing Spark 0.7.2
1) Download the Spark 0.7.2 sources (tgz file) from the Spark project site (http://spark-project.org).
2) Unpack the tgz file and place the contents into the folder where you will install Spark. For example, I placed mine in the Homebrew Cellar location, i.e. /usr/local/Cellar/spark-0.7.2.
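The unpack step can be sketched as follows (the tgz filename is an assumption; match it to the name of the file you downloaded):

```shell
# Extract the downloaded Spark sources into the Homebrew Cellar
# (filename is an assumption -- use the name of your downloaded file)
tar -xzf spark-0.7.2-sources.tgz -C /usr/local/Cellar/
```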
Configure and Build Spark 0.7.2
Follow the instructions in the README.md in /usr/local/Cellar/spark-0.7.2 or at http://spark-project.org/docs/latest/.
1) Run the Simple Build Tool (SBT) package from /usr/local/Cellar/spark-0.7.2
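From the Spark directory, this amounts to running the bundled sbt launcher, which on Spark 0.7.x looks like:

```shell
cd /usr/local/Cellar/spark-0.7.2
# Build Spark with the bundled Simple Build Tool
sbt/sbt package
```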
2) Modify the conf/spark-env.sh
Ensure that the SCALA_HOME variable has been set:
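For example, conf/spark-env.sh might contain the line below; the Scala path is an assumption, so point it at your Scala 2.9.3 install:

```shell
# conf/spark-env.sh -- path is an assumption, adjust to your install
export SCALA_HOME=/usr/local/Cellar/scala/2.9.3
```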
Running Spark 0.7.2
From here, you can now run the Spark examples as noted in the Spark Quick Start. Note, if you are running the standalone job samples (e.g. A Standalone Job in Scala), make sure you have installed sbt first (via Homebrew, the command is `brew install sbt`).
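As a quick sanity check of the build, you can launch the interactive shell or one of the bundled examples from the Spark directory; the example class name below matches the Spark 0.7.x examples, and `local` tells Spark to run on a single local thread:

```shell
cd /usr/local/Cellar/spark-0.7.2
# Interactive Scala shell with a SparkContext pre-defined as 'sc'
./spark-shell
# Or run a bundled example locally ('local' = one local thread)
./run spark.examples.SparkPi local
```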