merge #1 (Merged)

Merged 6 commits on Mar 7, 2019
@@ -1,7 +1,7 @@
<idea-plugin url="https://github.com/Microsoft/azure-tools-for-java">
<id>com.microsoft.tooling.msservices.intellij.azure</id>
<name>Azure Toolkit for IntelliJ</name>
-<version>3.18.0</version>
+<version>3.19.0</version>
<vendor email="[email protected]" url="http://www.microsoft.com">Microsoft</vendor>

<description><![CDATA[
@@ -11,14 +11,34 @@
<li>Azure Web App Workflow: You can run your web applications on Azure Web App with One-Click experience using Azure Toolkits for IntelliJ.</li>
<li>Azure Container Workflow: You can dockerize and run your web application on Azure Web App (Linux) via Azure Container Registry.</li>
<li>Azure Explorer: You can view and manage your cloud resources on Azure with Azure Explorer in Azure Toolkits for IntelliJ.</li>
-<li>Azure HDInsight: Use Azure HDInsight tool to submit Spark jobs to HDInsight cluster, monitor and debug Spark or Hadoop Hive jobs easily.</li>
+<li>Azure HDInsight: Create a Spark project, author and submit Spark jobs to an HDInsight cluster; monitor and debug Spark jobs easily.</li>
+<li>SQL Server Big Data Cluster: Link to SQL Server Big Data Cluster; create a Spark project, author and submit Spark jobs to the cluster; monitor and debug Spark jobs easily.</li>
</ul>
</html>
]]></description>

<change-notes>
<![CDATA[
<html>
+<h3>[3.19.0]</h3>
+<h4>Added</h4>
+<ul>
+<li>Support opening the browser after Web App deployment.</li>
+<li>Support linking to SQL Server Big Data Cluster and submitting Spark jobs.</li>
+<li>Support the WebHDFS storage type for submitting jobs to HDInsight clusters with an ADLS Gen 1 storage account.</li>
+</ul>
+<h4>Changed</h4>
+<ul>
+<li>Updated the UI of Web App creation and deployment.</li>
+<li>Subscription ID needs to be specified for the ADLS Gen 1 storage type.</li>
+</ul>
+<h4>Fixed</h4>
+<ul>
+<li><a href="https://github.com/Microsoft/azure-tools-for-java/issues/2840" rel="nofollow">#2840</a> Job submission succeeded with an invalid path for the WebHDFS storage type.</li>
+<li><a href="https://github.com/Microsoft/azure-tools-for-java/issues/2747" rel="nofollow">#2747</a>, <a href="https://github.com/Microsoft/azure-tools-for-java/issues/2801" rel="nofollow">#2801</a> Error loading HDInsight node.</li>
+<li><a href="https://github.com/Microsoft/azure-tools-for-java/issues/2714" rel="nofollow">#2714</a>, <a href="https://github.com/Microsoft/azure-tools-for-java/issues/2688" rel="nofollow">#2688</a>, <a href="https://github.com/Microsoft/azure-tools-for-java/issues/2669" rel="nofollow">#2669</a>, <a href="https://github.com/Microsoft/azure-tools-for-java/issues/2728" rel="nofollow">#2728</a>, <a href="https://github.com/Microsoft/azure-tools-for-java/issues/2807" rel="nofollow">#2807</a>, <a href="https://github.com/Microsoft/azure-tools-for-java/issues/2808" rel="nofollow">#2808</a>, <a href="https://github.com/Microsoft/azure-tools-for-java/issues/2811" rel="nofollow">#2811</a>, <a href="https://github.com/Microsoft/azure-tools-for-java/issues/2831" rel="nofollow">#2831</a> Spark Run Configuration validation issues.</li>
+<li><a href="https://github.com/Microsoft/azure-tools-for-java/issues/2810" rel="nofollow">#2810</a>, <a href="https://github.com/Microsoft/azure-tools-for-java/issues/2760" rel="nofollow">#2760</a> Spark Run Configuration issues when created from the context menu.</li>
+</ul>
<h3>[3.18.0]</h3>
<h4>Added</h4>
<ul>
@@ -237,43 +257,43 @@
<li>Supported fixing Spark job configuration in the run configuration before Spark job submission.</li>
<li>Updated Application Insights library to v2.1.2.</li>
<li>Fixed some bugs.</li>
</ul>
<h3>[3.9.0]</h3>
<ul>
<li>Added Spark 2.3 support.</li>
<li>Spark in Azure Data Lake private preview: refresh and bug fixes.</li>
<li>Fixed some bugs.</li>
</ul>
<h3>[3.8.0]</h3>
<ul>
<li>Supported running Spark jobs in an Azure Data Lake cluster (in private preview).</li>
<li>Fixed some bugs.</li>
</ul>
<h3>[3.7.0]</h3>
<ul>
<li>Users do not need to log in again in interactive login mode if the Azure refresh token is still valid.</li>
<li>Updated ApplicationInsights version to v2.1.0.</li>
<li>Fixed some bugs.</li>
</ul>
<h3>[3.6.0]</h3>
<ul>
<li>Updated ApplicationInsights version to v2.0.2.</li>
<li>Added Spark 2.2 templates for HDInsight.</li>
<li>Added SSH password expiration check.</li>
<li>Fixed some bugs.</li>
</ul>
<h3>[3.5.0]</h3>
<ul>
<li>Added opening Azure Storage Explorer for exploring data in an HDInsight cluster (Blob or ADLS).</li>
<li>Improved Spark remote debugging.</li>
<li>Improved Spark job submission correctness check.</li>
<li>Fixed a login issue.</li>
</ul>
<h3>[3.4.0]</h3>
<ul>
<li>Users can use an Ambari username/password to submit Spark jobs to an HDInsight cluster, in addition to Azure subscription-based authentication. This means users without Azure subscription permission can still use Ambari credentials to submit/debug their Spark jobs in HDInsight clusters.</li>
<li>The dependency on storage permission is removed, and users no longer need to provide storage credentials for Spark job submission (storage credentials are still needed to use the storage explorer).</li>
</ul>
<h3>[3.3.0]</h3>
<ul>
<li>Added support for Enterprise Security Package HDInsight Spark clusters.</li>
@@ -44,13 +44,18 @@ class SparkSubmitJobUploadStorageModel: ILogger {

@get:Transient @set:Transient var containersModel: DefaultComboBoxModel<String> = DefaultComboBoxModel()

+@get:Transient @set:Transient var subscriptionsModel: DefaultComboBoxModel<String> = DefaultComboBoxModel()

@Attribute("upload_path")
var uploadPath: String? = null

// selectedContainer is saved to reconstruct a containersModel when we reopen the project
@Attribute("selected_container")
var selectedContainer: String? = null

@Attribute("selected_subscription")
var selectedSubscription: String? = null

@Attribute("storage_account_type")
var storageAccountType: SparkSubmitStorageType? = null

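The new selected_subscription attribute persists with the run configuration, while subscriptionsModel is @Transient and must be rebuilt when the project is reopened, the same pattern the existing comment describes for selected_container and containersModel. A minimal sketch of that rebuild step, assuming a hypothetical onProjectReopen hook (the real plugin drives this from its settings editor):

    import javax.swing.DefaultComboBoxModel

    // Rebuild the transient combo-box model around the persisted selection.
    // onProjectReopen is an illustrative hook, not an API of the plugin.
    fun onProjectReopen(model: SparkSubmitJobUploadStorageModel) {
        model.selectedSubscription?.let { saved ->
            model.subscriptionsModel = DefaultComboBoxModel(arrayOf(saved))
            model.subscriptionsModel.selectedItem = saved
        }
    }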
@@ -56,6 +56,7 @@
import com.microsoft.azure.hdinsight.spark.run.configuration.LivySparkBatchJobRunConfiguration;
import com.microsoft.azure.hdinsight.spark.ui.SparkJobLogConsoleView;
import com.microsoft.azure.sqlbigdata.sdk.cluster.SqlBigDataLivyLinkClusterDetail;
+import com.microsoft.azuretools.authmanage.AuthMethodManager;
import com.microsoft.azuretools.authmanage.models.SubscriptionDetail;
import com.microsoft.intellij.rxjava.IdeaSchedulers;
import org.apache.commons.lang3.StringUtils;
@@ -69,6 +70,7 @@
import java.util.AbstractMap.SimpleImmutableEntry;
import java.util.ArrayList;
import java.util.List;
+import java.util.Optional;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

@@ -101,6 +103,7 @@ public ISparkBatchJob buildSparkBatchJob(@NotNull SparkSubmitModel submitModel,
.orElseThrow(() -> new ExecutionException("Can't find cluster named " + clusterName));

SparkSubmitStorageType storageAccountType = submitModel.getJobUploadStorageModel().getStorageAccountType();
+String subscription = submitModel.getJobUploadStorageModel().getSelectedSubscription();
switch (storageAccountType) {
case BLOB:
String storageAccountName = submitModel.getJobUploadStorageModel().getStorageAccount();
@@ -137,15 +140,24 @@ public ISparkBatchJob buildSparkBatchJob(@NotNull SparkSubmitModel submitModel,
destinationRootPath = rawRootPath.endsWith("/") ? rawRootPath : rawRootPath + "/";
// e.g. for adl://john.azuredatalakestore.net/root/path, adlsAccountName is john
String adlsAccountName = destinationRootPath.split("\\.")[0].split("//")[1];
-SubscriptionDetail subscriptionDetail =
-        AzureSparkClusterManager.getInstance().getSubscriptionDetailByStoreAccountName(adlsAccountName)
-                .toBlocking().singleOrDefault(null);
-if (subscriptionDetail == null) {
-    throw new ExecutionException(String.format("Error getting subscription info by ADLS root path. Please check if the ADLS account is %s's storage account", submitModel.getClusterName()));
+Optional<SubscriptionDetail> subscriptionDetail = Optional.empty();
+try {
+    subscriptionDetail = AuthMethodManager.getInstance().getAzureManager().getSubscriptionManager()
+            .getSelectedSubscriptionDetails()
+            .stream()
+            .filter(detail -> detail.getSubscriptionName().equals(subscription))
+            .findFirst();
+} catch (Exception ignore) {
+}
+
+if (!subscriptionDetail.isPresent()) {
+    throw new ExecutionException("Error getting subscription info. Please select correct subscription");
}
// get Access Token
try {
-    accessToken = AzureSparkClusterManager.getInstance().getAccessToken(subscriptionDetail.getTenantId());
+    accessToken = AzureSparkClusterManager.getInstance().getAccessToken(subscriptionDetail.get().getTenantId());
} catch (IOException ex) {
    log().warn("Error getting access token based on the given ADLS root path. " + ExceptionUtils.getStackTrace(ex));
    throw new ExecutionException("Error getting access token based on the given ADLS root path");
Expand All @@ -154,6 +166,9 @@ public ISparkBatchJob buildSparkBatchJob(@NotNull SparkSubmitModel submitModel,
break;
case WEBHDFS:
destinationRootPath = submitModel.getJobUploadStorageModel().getUploadPath();
+if (StringUtils.isBlank(destinationRootPath) || !destinationRootPath.matches(SparkBatchJob.WebHDFSPathPattern)) {
+    throw new ExecutionException("Invalid webhdfs root path input");
+}

// create HttpObservable and jobDeploy
try {
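The WEBHDFS branch above rejects a blank or non-matching upload path before any deployment work starts. The diff references SparkBatchJob.WebHDFSPathPattern without showing it, so the regex below is only an illustrative stand-in for the real constant; a minimal Kotlin sketch of the same guard:

    // Sketch of the WebHDFS root-path guard. The regex is an assumption for
    // illustration; the actual pattern lives in SparkBatchJob.WebHDFSPathPattern.
    val webHdfsPathPattern = Regex("^https?://[^/]+(:\\d+)?/webhdfs/v1/.*$")

    fun checkWebHdfsRootPath(rawRootPath: String?): String {
        if (rawRootPath.isNullOrBlank() || !webHdfsPathPattern.matches(rawRootPath)) {
            throw IllegalArgumentException("Invalid webhdfs root path input")
        }
        return rawRootPath
    }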
@@ -29,4 +29,8 @@ class ArisSparkConfiguration(name: String, val module: ArisSparkConfigurationMod
override fun getConfigurationEditor(): SettingsEditor<out RunConfiguration> {
return LivySparkRunConfigurationSettingsEditor(ArisSparkConfigurable(module.project))
}

+override fun getSuggestedNamePrefix(): String {
+    return "[Spark on SQL]"
+}
}
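getSuggestedNamePrefix lets run configurations created for SQL Server Big Data (Aris) clusters be distinguished from plain Spark ones in the run-configuration list. How the prefix might combine with a main class into a display name; the composition itself is presumably done by the run-configuration base class and is not shown in this diff:

    // Illustrative only: shows the intended shape of a suggested name.
    fun suggestedName(prefix: String, mainClass: String): String = "$prefix $mainClass"

    // suggestedName("[Spark on SQL]", "com.example.SparkApp") == "[Spark on SQL] com.example.SparkApp"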
@@ -27,6 +27,7 @@ import com.intellij.openapi.ui.TextFieldWithBrowseButton
import com.intellij.openapi.vfs.impl.jar.JarFileSystemImpl
import com.intellij.packaging.impl.elements.ManifestFileUtil
import com.intellij.uiDesigner.core.GridConstraints
+import com.microsoft.azure.hdinsight.common.DarkThemeManager
import com.microsoft.intellij.forms.dsl.panel
import com.microsoft.intellij.helpers.ManifestFileUtilsEx
import javax.swing.JComponent
@@ -55,7 +56,13 @@ class SparkCommonRunParametersPanel(private val myProject: Project, private val
}
}

-private val submissionPanel : JPanel by lazy {
+private val errorMessageLabel = JLabel("")
+        .apply {
+            foreground = DarkThemeManager.getInstance().errorMessageColor
+            isVisible = true
+        }
+
+private val submissionPanel: JPanel by lazy {
val formBuilder = panel {
columnTemplate {
col {
@@ -68,28 +75,30 @@
fill = GridConstraints.FILL_HORIZONTAL
}
}
row { c(mainClassPrompt); c(mainClassTextField) }
+row { c(); c(errorMessageLabel) }
}

formBuilder.buildPanel()
}

-open val component: JComponent
+val component: JComponent
get() = submissionPanel

fun setMainClassName(mainClassName: String) {
mainClassTextField.text = mainClassName
}

-fun getMainClassName() : String {
+fun getMainClassName(): String {
return mainClassTextField.text
}

@Throws(ConfigurationException::class)
fun validateInputs() {
// Check that the main class name is provided
-if (mainClassTextField.text.isNullOrBlank()) {
-    throw ConfigurationException("Main Class Name should not be null")
+if (this.mainClassTextField.text.isNullOrBlank()) {
+    this.errorMessageLabel.text = "Main class name could not be null."
+} else {
+    this.errorMessageLabel.text = ""
}
}
}
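The validation change above swaps a hard ConfigurationException for an inline error label, so the form keeps rendering while feedback is shown next to the field. The same pattern as a self-contained sketch (class and field names are illustrative, not the plugin's):

    import javax.swing.JLabel
    import javax.swing.JTextField

    // Inline-validation pattern: write the message into an always-visible
    // error label instead of throwing out of the validator.
    class MainClassForm {
        val mainClassField = JTextField()
        val errorLabel = JLabel("")

        fun validateInputs() {
            errorLabel.text =
                    if (mainClassField.text.isNullOrBlank()) "Main class name could not be null."
                    else ""
        }
    }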
@@ -23,22 +23,31 @@
package com.microsoft.azure.hdinsight.spark.ui

import com.intellij.openapi.ui.ComboBox
+import com.intellij.ui.ComboboxWithBrowseButton
import com.intellij.uiDesigner.core.GridConstraints
import com.intellij.uiDesigner.core.GridConstraints.ANCHOR_WEST
import com.microsoft.azure.hdinsight.spark.common.SparkSubmitStorageType
import com.microsoft.azuretools.ijidea.ui.HintTextField
+import com.microsoft.azure.hdinsight.common.StreamUtil
import com.microsoft.intellij.forms.dsl.panel
import java.awt.CardLayout
import javax.swing.JLabel
import javax.swing.JPanel

class SparkSubmissionJobUploadStorageAdlsCard: SparkSubmissionJobUploadStorageBasicCard() {
+private val refreshButtonIconPath = "/icons/refresh.png"
override val title: String = SparkSubmitStorageType.ADLS_GEN1.description
private val adlsRootPathTip = "e.g. adl://myaccount.azuredatalakestore.net/root/path"
private val adlsRootPathLabel = JLabel("ADLS Root Path").apply { toolTipText = adlsRootPathTip }
val adlsRootPathField = HintTextField(adlsRootPathTip)
private val authMethodLabel = JLabel("Authentication Method")
private val authMethodComboBox = ComboBox<String>(arrayOf("Azure Account"))
+private val subscriptionsLabel = JLabel("Subscription List")
+val subscriptionsComboBox = ComboboxWithBrowseButton().apply {
+    button.toolTipText = "Refresh"
+    button.icon = StreamUtil.getImageResourceFile(refreshButtonIconPath)
+}

val signInCard = SparkSubmissionJobUploadStorageAdlsSignInCard()
val signOutCard = SparkSubmissionJobUploadStorageAdlsSignOutCard()
val azureAccountCards = JPanel(CardLayout()).apply {
@@ -67,6 +76,9 @@ class SparkSubmissionJobUploadStorageAdlsCard: SparkSubmissionJobUploadStorageBasicCard() {
row {
c(); c(azureAccountCards)
}
+row {
+    c(subscriptionsLabel); c(subscriptionsComboBox)
+}
}
layout = formBuilder.createGridLayoutManager()
formBuilder.allComponentConstraints.forEach { (component, gridConstrains) -> add(component, gridConstrains) }
@@ -31,12 +31,14 @@ import com.intellij.ui.DocumentAdapter
import com.microsoft.azure.hdinsight.common.ClusterManagerEx
import com.microsoft.azure.hdinsight.common.logger.ILogger
import com.microsoft.azure.hdinsight.sdk.cluster.IClusterDetail
+import com.microsoft.azure.hdinsight.sdk.common.AzureSparkClusterManager
import com.microsoft.azure.hdinsight.sdk.common.azure.serverless.AzureSparkCosmosCluster
import com.microsoft.azure.hdinsight.sdk.storage.ADLSStorageAccount
import com.microsoft.azure.hdinsight.sdk.storage.HDStorageAccount
import com.microsoft.azure.hdinsight.sdk.storage.IHDIStorageAccount
import com.microsoft.azure.hdinsight.spark.common.SparkSubmitJobUploadStorageModel
import com.microsoft.azure.storage.blob.BlobRequestOptions
+import com.microsoft.azuretools.authmanage.AuthMethodManager
import com.microsoft.tooling.msservices.helpers.azure.sdk.StorageClientSDKManager
import com.microsoft.tooling.msservices.model.storage.BlobContainer
import com.microsoft.tooling.msservices.model.storage.ClientStorageAccount
@@ -47,6 +49,7 @@ import rx.schedulers.Schedulers
import java.awt.event.FocusAdapter
import java.awt.event.FocusEvent
import java.awt.event.ItemEvent
+import java.util.stream.Collectors
import javax.swing.DefaultComboBoxModel
import javax.swing.event.DocumentEvent

@@ -105,6 +108,60 @@ class SparkSubmissionJobUploadStorageCtrl(val view: SparkSubmissionJobUploadStor
{ err -> log().warn(ExceptionUtils.getStackTrace(err)) })
}
}

+// refresh subscriptions after the refresh button is clicked
+view.storagePanel.adlsCard.subscriptionsComboBox.button.addActionListener {
+    if (view.storagePanel.adlsCard.subscriptionsComboBox.button.isEnabled) {
+        view.storagePanel.adlsCard.subscriptionsComboBox.button.isEnabled = false
+        refreshSubscriptions()
+                .doOnEach { view.storagePanel.adlsCard.subscriptionsComboBox.button.isEnabled = true }
+                .subscribe(
+                        { },
+                        { err -> log().warn(ExceptionUtils.getStackTrace(err)) })
+    }
+}
}

+private fun refreshSubscriptions(): Observable<SparkSubmitJobUploadStorageModel> {
+    return Observable.just(SparkSubmitJobUploadStorageModel())
+            .doOnNext(view::getData)
+            // set an error message to prevent the user from applying the change before refreshing completes
+            .map { it.apply { errorMsg = "refreshing subscriptions is not completed" } }
+            .doOnNext(view::setData)
+            .observeOn(Schedulers.io())
+            .map { toUpdate ->
+                toUpdate.apply {
+                    if (!AzureSparkClusterManager.getInstance().isSignedIn) {
+                        errorMsg = "ADLS Gen 1 storage type requires user to sign in first"
+                    } else {
+                        try {
+                            val subscriptionManager = AuthMethodManager.getInstance().azureManager.subscriptionManager
+                            val subscriptionNameList = subscriptionManager.selectedSubscriptionDetails
+                                    .stream()
+                                    .map { subDetail -> subDetail.subscriptionName }
+                                    .collect(Collectors.toList<String>())
+
+                            if (subscriptionNameList.size > 0) {
+                                subscriptionsModel = DefaultComboBoxModel(subscriptionNameList.toTypedArray())
+                                subscriptionsModel.selectedItem = subscriptionsModel.getElementAt(0)
+                                selectedSubscription = subscriptionsModel.getElementAt(0)
+                                errorMsg = null
+                            } else {
+                                errorMsg = "No subscriptions found in this storage account"
+                            }
+                        } catch (ex: Exception) {
+                            log().info("Refresh subscriptions error. " + ExceptionUtils.getStackTrace(ex))
+                            errorMsg = "Can't get subscriptions, check if subscriptions selected"
+                        }
+                    }
+                }
+            }
+            .doOnNext { data ->
+                if (data.errorMsg != null) {
+                    log().info("Refresh subscriptions error: " + data.errorMsg)
+                }
+                view.setData(data)
+            }
+}

private fun refreshContainers(): Observable<SparkSubmitJobUploadStorageModel> {
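Both refresh buttons, the existing containers one and the new subscriptions one, follow the same guard pattern: disable the button to block re-entry, run the refresh, re-enable on every Rx notification, and carry failures through the model's errorMsg instead of an exception. A stripped-down sketch of that pattern (RxJava 1, as imported by this file; wireRefreshButton and the refresh lambda are illustrative, not plugin API):

    import rx.Observable
    import javax.swing.JButton

    // Guarded refresh: disable to block re-entry, then re-enable on every
    // notification (onNext / onError / onCompleted) via doOnEach.
    fun wireRefreshButton(button: JButton, refresh: () -> Observable<Unit>) {
        button.addActionListener {
            if (!button.isEnabled) return@addActionListener
            button.isEnabled = false
            refresh()
                    .doOnEach { button.isEnabled = true }
                    .subscribe({ }, { err -> err.printStackTrace() })
        }
    }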