


Update Your Spark/Scala Development Environment in IntelliJ using Maven

A previous tutorial discussed the steps in setting up an original environment and running a simple Spark application from scratch. This tutorial will assist in updating that same environment with the current versions, which, as of this writing, are Spark 2.4.3 and Scala 2.11.12. True, there are later versions of Scala, but Spark 2.4.3 is compatible with Scala 2.11.12. It assumes you have IntelliJ and Maven installed.

Update Spark:
1. Download Spark: spark-2.4.3-bin-hadoop2.7.tgz into /usr/local/spark
2. Choose a package type: Prebuilt for Apache Hadoop 2.7 and later
3. Verify this release using the checksum (compare to the download site)
4. Untar and place the contents into /usr/local/spark

Update Scala:
Untar and place the contents into /usr/local/share/scala
Export SCALA_HOME=/usr/local/share/scala-2.11.12

Update the IntelliJ project:
Choose: File > New > Project > Maven
In the "Maven projects need to be imported" dialog box, select Enable Auto Import
Open the pom.xml file and paste the following under the groupId, artifactId, and version:
Right-click on the src/main/scala folder > New > Package: com.machinecreek
Right-click src/main/scala/com.machinecreek > New > Scala Class
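The exact snippet to paste into pom.xml is not reproduced in this excerpt. A typical dependency block for this Spark/Scala pairing looks like the following — the coordinates (`spark-core_2.11`, `spark-sql_2.11`, `scala-library`) are real Maven Central artifacts for these versions, but which of them the original tutorial used is an assumption:

```xml
<!-- Illustrative dependencies for Spark 2.4.3 on Scala 2.11.12; -->
<!-- the tutorial's actual snippet may differ. -->
<properties>
  <scala.version>2.11.12</scala.version>
  <spark.version>2.4.3</spark.version>
</properties>
<dependencies>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>${spark.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>${spark.version}</version>
  </dependency>
</dependencies>
```

Note the `_2.11` suffix on the Spark artifacts: Spark jars are published per Scala binary version, which is why the Scala 2.11.12 / Spark 2.4.3 pairing matters.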
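The download, verify, and untar steps above can be sketched as shell commands. The tutorial itself describes manual steps, so everything here is an assumption: the Apache archive URL, the Scala tarball name, and the `shasum` invocation (on Linux, use `sha512sum` instead) may need adjusting for your mirror and OS.

```shell
# Sketch only: Spark install steps, assuming the Apache archive layout.
cd /usr/local/spark
wget https://archive.apache.org/dist/spark/spark-2.4.3/spark-2.4.3-bin-hadoop2.7.tgz
# Verify: compare this digest against the .sha512 value on the download site
shasum -a 512 spark-2.4.3-bin-hadoop2.7.tgz
# Untar in place
tar -xzf spark-2.4.3-bin-hadoop2.7.tgz

# Scala: untar under /usr/local/share and point SCALA_HOME at the result
tar -xzf scala-2.11.12.tgz -C /usr/local/share
export SCALA_HOME=/usr/local/share/scala-2.11.12
export PATH=$PATH:$SCALA_HOME/bin
```

To make SCALA_HOME survive new shells, the two `export` lines would typically go in your `~/.bash_profile` or `~/.bashrc`.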

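For the final "New > Scala Class" step, a minimal starter object like the one below can confirm the project compiles and runs. The name `EnvironmentCheck` is hypothetical, and the `package com.machinecreek` declaration it would carry inside the project is omitted here so the snippet runs standalone:

```scala
// Hypothetical starter object for the new Scala class; in the project it would
// declare `package com.machinecreek`. Plain Scala only, so it runs even before
// the Spark jars are on the classpath.
object EnvironmentCheck {
  def main(args: Array[String]): Unit =
    // Print the Scala version the project actually compiled against.
    println(s"Scala version: ${scala.util.Properties.versionNumberString}")
}
```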