The primary focus of our May updates is to make Spark development easier for you in IntelliJ! In this release, your Spark remote debugging experience is significantly improved, and Scala SDK installation, Scala project creation, and Spark job submission are all simplified. You can now use IntelliJ “run as” or “debug as” for Spark job submission and debugging, and you can load a Spark job configuration from a local property file. The key updates are summarized below.
Simplified Scala project creation
The installation and Scala project creation processes are greatly simplified through on-demand Scala plugin installation and automatic Scala SDK download during project creation. To learn more, please watch our demo, Create Spark Scala Applications.
1. The Scala project creation wizard now checks whether the Scala plugin is installed, and helps you search for and install it the first time.
2. Previously, you had to specify the Spark version, find the corresponding Scala SDK version, and download the Scala SDK manually. Now you only need to specify the Spark version during project creation; the corresponding Scala SDK and libraries are downloaded automatically. Please note that the Spark version you choose here should match the version of your Spark cluster. A sketch of a typical application for such a project follows this list.
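To make the starting point concrete, here is a minimal sketch of the kind of Spark Scala application such a project might contain. The object name and the input path are illustrative placeholders, not something the wizard generates for you.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// A minimal word-count application. "MainApp" and the wasb:// input
// path below are hypothetical; substitute your own names and storage.
object MainApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("MainApp")
    val sc = new SparkContext(conf)

    // Read a text file from cluster storage and count word occurrences.
    val counts = sc.textFile("wasb:///example/data/sample.log")
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    counts.take(10).foreach(println)
    sc.stop()
  }
}
```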
Improved Spark remote debugging
We have developed a super cool remote debugging feature that allows you to run and debug a Spark application remotely on an HDInsight cluster at any time. To learn more, please watch our demo, HDInsight Spark Remote Debugging.
The initial configuration to connect to your HDInsight Spark cluster for remote debugging takes only a few clicks, outlined below:
- To configure Spark remote debugging, go to the IntelliJ “Run” menu -> “Edit Configurations” -> new “Submit Spark Job” configuration. You are asked to enter information including the Spark cluster, artifact, and main class name.
- By clicking “Advanced configuration”, you can select “Enable Spark remote debug” and specify an SSH user name along with a password or private key file.
IntelliJ “run as” / “debug as” integration
- To submit your Spark job, either go to the IntelliJ “Run” menu -> “Edit Configurations” -> click new “Submit Spark Job”, or right-click the project and choose “Submit Spark Job”.
- You can customize the configuration, such as the cluster, main class, artifact, and so on.
You can use “run” or “debug” from the IntelliJ menu, or click the run or debug icon in the toolbar, to start a Spark remote debugging session.
While remote debugging, you can also set breakpoints, edit the application, step through the code, and resume execution; the sketch below shows a natural place to try this.
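For example, you might set a breakpoint inside a small parsing function like the one in this sketch and inspect intermediate values as the job runs on the cluster. The object name, input path, and record format here are hypothetical.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object DebugExample {
  // Set a breakpoint inside parseLine: during a remote debugging session,
  // execution pauses here and you can inspect `fields`, step, and resume.
  def parseLine(line: String): (String, Int) = {
    val fields = line.split(",")
    (fields(0), fields(1).trim.toInt)
  }

  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("DebugExample"))

    // The input path and comma-separated record format are illustrative.
    val parsed = sc.textFile("wasb:///example/data/records.csv").map(parseLine)
    parsed.take(5).foreach(println)
    sc.stop()
  }
}
```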
Load Job Configuration
In the Spark submission window, you can now load the job configuration from a local property file by clicking the “browse” button beside “Job configuration”; a sample property file is sketched below.
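The exact keys depend on your job, but assuming the file uses standard key=value properties syntax and standard Spark configuration properties, it might look something like this (the file name and values are illustrative):

```properties
# spark-job.properties -- illustrative name and values
spark.executor.memory=4g
spark.executor.cores=2
spark.driver.memory=2g
spark.yarn.executor.memoryOverhead=512
```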
How to install/update
You can get the latest bits by going to the IntelliJ plugin repository and searching for “Azure Toolkit”. IntelliJ will also prompt you to update if you have already installed the plugin.
For more information, check out the following:
- IntelliJ User Guide: Use HDInsight Tools in Azure Toolkit for IntelliJ to create Spark applications for HDInsight Spark Linux cluster
- IntelliJ HDInsight Spark Local Run: Use HDInsight Tools for IntelliJ with Hortonworks Sandbox
- Create Scala Project (Video): Create Spark Scala Applications
- Remote Debug (Video): Use Azure Toolkit for IntelliJ to debug Spark applications remotely on HDInsight Cluster
Learn more about today’s announcements on the Azure blog and Big Data blog.
Discover more Azure service updates.
If you have questions, feedback, comments, or bug reports, please use the comments below or send a note to email@example.com.