We are pleased to announce the release of the Spark Interactive Console in Azure Toolkit for IntelliJ. This new component simplifies Spark job authoring and lets you run code interactively in a shell-like environment within IntelliJ.
The Spark console includes the Spark Local Console and the Spark Livy Interactive Session. When you run the Spark console, instances of SparkSession and SparkContext are instantiated automatically, just as in the Spark shell: use 'spark' to access the SparkSession and 'sc' to access the SparkContext.

The Spark Local Console lets you run your code interactively and validate its logic locally. You can also inspect your program's variables and perform other scripting operations locally before submitting to the cluster. The Spark Livy Interactive Session establishes an interactive communication channel with your cluster, so you can check file schemas, preview data, and run ad-hoc queries while you are developing your Spark job. You can also easily point the Livy interactive session at a different Spark cluster.
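For a sense of the workflow, here is a minimal sketch of the kind of code you might type into the console. The pre-bound 'spark' and 'sc' variables come from the console itself; the sample data and the "id" column name are hypothetical:

```scala
// The console pre-binds `spark` (SparkSession) and `sc` (SparkContext),
// so no setup code is needed before these lines.
val nums = sc.parallelize(1 to 100)        // build an RDD with the pre-created SparkContext
println(nums.filter(_ % 2 == 0).count())   // validate logic locally: prints 50

val df = spark.range(0, 10).toDF("id")     // build a DataFrame with the pre-created SparkSession
df.show()                                  // preview the data before submitting a full job
```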
The Spark console has a built-in language service for Scala. You can leverage language service features such as IntelliSense and autocomplete to look up the properties of Spark objects (for example, the Spark context and Spark session), query Hive metadata, and check function signatures.
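As an illustration, these are the sorts of ad-hoc metadata queries where autocomplete on 'spark' helps; the "default" database and "sales" table names below are hypothetical:

```scala
// Browse Hive metadata through the SparkSession's catalog API.
spark.catalog.listTables("default").show()   // list tables in a (hypothetical) database
spark.sql("DESCRIBE default.sales").show()   // check the schema of a (hypothetical) table
println(spark.version)                       // inspect a SparkSession property
```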
A new feature, Send Selection to Spark Console (Ctrl + Shift + S), has been added to make the Spark console easier to reach. You can send a highlighted line or block of code from your main Scala project to the console. This feature lets you switch smoothly between coding in your project and validating or testing that code in the Spark console, as in the sketch below.
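For example, you might highlight a block like the following in a project file and send it to the console for immediate evaluation. The snippet is a hypothetical word count, assuming a running console with 'sc' bound:

```scala
// Highlight these lines in the editor, then press Ctrl + Shift + S
// to evaluate them in the Spark console without leaving your project.
val words = sc.parallelize(Seq("spark", "console", "spark"))
val counts = words.map(w => (w, 1)).reduceByKey(_ + _)   // count occurrences per word
counts.collect().foreach(println)                        // e.g. (spark,2) and (console,1)
```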
Summary of new features
- Run Spark local console
- Run Spark Livy interactive session console
- Language service for Scala enabled in the console
- Send selected code to console
The addition of the Spark console is an important step forward for the Azure Toolkit because it expands the toolkit's capabilities beyond batch job processing. This update also adds support for interactive querying across local and dev/test clusters.
To run your code, press Ctrl + Enter; use the up and down arrows to browse the history of previously run code.
How to access
You can start the Spark console either from the Tools menu or by right-clicking in a Scala file and choosing it from the context menu.
For more information, check out the following:
- Get started by reading the “User manual – HDInsight IntelliJ plugin” on the Docs page.
- Learn how to “Use Azure Toolkit for IntelliJ to debug Spark applications” in our documentation.
- Watch the Channel 9 demo, “Use Azure Toolkit for IntelliJ to debug Spark applications remotely on an HDInsight cluster.”
Feedback
We look forward to your comments and feedback. If you have feature requests or suggestions, please send us a note at hdivstool@microsoft.com. For bug submissions, please open a new ticket using the template.