The primary focus of our May updates is to make Spark development easier for you in IntelliJ! In this release, your Spark remote debugging experience is significantly improved, and Scala SDK installation, Scala project creation, and Spark job submission are all simplified. You can now use IntelliJ "run as" or "debug as" for Spark job submission and debugging, and you can load Spark job configuration from a local properties file. The key updates are summarized below.

Simplified Scala project creation

The installation and Scala project creation processes are greatly simplified through on-demand Scala plugin installation and automatic Scala SDK download during project creation. To learn more, please watch our demo Create Spark Scala Applications.

1. The Scala project creation wizard now checks whether the Scala plugin is installed and, if it is not, helps you search for and install it the first time.

[Screenshot: prompt to search for and install the Scala plugin]

2. Previously you needed to specify the Spark version, find the corresponding Scala SDK version, and download the Scala SDK manually. Now you only need to specify the Spark version during project creation; the corresponding Scala SDK and libraries are downloaded automatically. Please note that the Spark version you choose here should match the version running on your Spark cluster.

[Screenshot: selecting the Spark version during project creation]
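For reference, the pairing the wizard now resolves for you looks roughly like the following in an sbt build definition. This is an illustrative sketch, not something the toolkit generates; the version numbers are assumptions and should be replaced with the ones matching your cluster.

    // Minimal sbt build sketch showing the Spark/Scala version pairing
    // that the project wizard now handles automatically.
    // Version numbers below are illustrative assumptions.
    name := "spark-sample"
    version := "1.0"
    scalaVersion := "2.11.8" // Spark 2.x is built against Scala 2.11

    // "provided" because the cluster supplies Spark at runtime
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.2" % "provided"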

Improved Spark remote debugging

We have developed a super cool remote debugging feature which allows you to run and debug a Spark application remotely on an HDInsight cluster at any time. To learn more, please watch our demo HDInsight Spark Remote Debugging.

The initial configuration to connect to your HDInsight Spark cluster for remote debugging takes only a few clicks, as outlined below:

  • To configure Spark remote debugging, go to the IntelliJ "Run" menu -> "Edit Configurations" -> create a new "Submit Spark Job" configuration. You are asked to enter information including the Spark cluster, artifact, and main class name, as shown below.

[Screenshot: "Submit Spark Job" run/debug configuration]

  • By clicking "Advanced configuration", you can select "Enable Spark remote debug" and specify the SSH user name and password, or a private key file, as shown below.

[Screenshot: advanced configuration with "Enable Spark remote debug" and SSH settings]

IntelliJ “run as” / “debug as” integration

  • You can either go to the IntelliJ "Run" menu -> "Edit Configurations" -> click new "Submit Spark Job", or right-click the project and choose "Submit Spark Job", to submit your Spark job.

[Screenshot: "Submit Spark Job" options in the Run menu and project context menu]

  • You can customize the configuration, such as the cluster, main class, and artifact.

[Screenshot: customizing the cluster, main class, and artifact in the job configuration]

  • You can use "Run" or "Debug" from the IntelliJ menu, or click the run or debug icon in the toolbar (as shown below), to start a Spark remote debugging session.

[Screenshot: run and debug icons in the toolbar]

  • You can also set breakpoints, edit the application, step through the code, and resume execution while remote debugging; a minimal sample application you can debug this way is sketched after the screenshot below.

[Screenshot: remote debugging session with a breakpoint]
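To try this out, any small Spark application will do. Below is a minimal word count sketch (the object name and the wasb:// input path are placeholders); set a breakpoint inside the map function, then start the job with "Debug" to step through it remotely.

    // Minimal Spark application for trying remote debugging.
    // WordCount and the input path are hypothetical placeholders.
    import org.apache.spark.{SparkConf, SparkContext}

    object WordCount {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("WordCount"))

        val counts = sc.textFile("wasb:///example/data/gutenberg/davinci.txt")
          .flatMap(_.split("\\s+"))
          .map(word => (word, 1)) // set a breakpoint here, then step through
          .reduceByKey(_ + _)

        counts.take(10).foreach(println)
        sc.stop()
      }
    }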

Load Job Configuration

In the Spark submission window, you can now load job configuration from a local properties file by clicking the "browse" button beside "Job configuration", as shown below.

[Screenshot: loading job configuration from a local properties file]
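As a rough illustration, such a properties file might carry the usual Spark job submission settings. The key names below are assumptions for this sketch; check the toolkit's documentation for the exact names it accepts.

    # Hypothetical job configuration properties file (key names assumed)
    driverMemory=4G
    executorMemory=4G
    executorCores=2
    numExecutors=5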

How to install/update

You can get the latest bits by going to the IntelliJ plugin repository and searching for "Azure Toolkit". IntelliJ will also prompt you for the latest update if you have already installed the plugin.

[Screenshot: Azure Toolkit in the IntelliJ plugin repository]

For more information, check out the following:

Learn more about today’s announcements on the Azure blog and Big Data blog.

Discover more Azure service updates.

If you have questions, feedback, comments, or bug reports, please use the comments below or send a note to hdivstool@microsoft.com.
