How to run SonarQube Analysis in Visual Studio Console

To generate a SonarQube token (required for authentication when running analyses from the command line or CI/CD pipelines), follow these steps:


Steps to Generate a SonarQube Token

  1. Log in to your SonarQube server (e.g., http://localhost:9000 for local setups).
  2. Click your profile icon (top-right corner) → “My Account”.
  3. Go to the “Security” tab.
  4. Under “Tokens”, enter a name for your token (e.g., vs-console-token).
  5. Click “Generate”.
  6. Copy the token immediately (it won’t be shown again!).
    Example token format: sqp_1234567890abcdef

How to Use the Token

  • In dotnet-sonarscanner commands, pass the token via:

dotnet sonarscanner begin /k:"your-project-key" /d:sonar.host.url="http://localhost:9000" /d:sonar.login="sqp_1234567890abcdef"
  • For security, never hardcode the token in scripts. Use:
    • Environment variables (e.g., SONAR_TOKEN).
    • Secret management tools (e.g., Azure Key Vault, GitHub Secrets).

Important Notes

  • 🔒 Treat tokens like passwords (they grant access to your SonarQube projects).
  • 🔄 Regenerate tokens periodically or revoke old ones (under “Security”).
  • 🚫 No token? You’ll get errors like Not authorized or Authentication failed.
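
If you want to confirm a token works before wiring it into a pipeline, one quick check is SonarQube's authentication validation endpoint. Below is a minimal Python sketch, assuming the requests package, a local server at http://localhost:9000, and the placeholder token from above; SonarQube accepts the token as the Basic auth username with an empty password:

# Validate a SonarQube token against a local server (placeholders throughout).
import requests

SONAR_URL = "http://localhost:9000"
TOKEN = "sqp_1234567890abcdef"  # replace with your real token

resp = requests.get(f"{SONAR_URL}/api/authentication/validate", auth=(TOKEN, ""))
print(resp.json())  # {'valid': True} when the token is accepted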

Example Workflow

# Set token as an environment variable (optional)
set SONAR_TOKEN=sqp_1234567890abcdef

# Run analysis (Windows CMD)
dotnet sonarscanner begin /k:"my-project" /d:sonar.host.url="http://localhost:9000" /d:sonar.login="%SONAR_TOKEN%"
dotnet build
dotnet sonarscanner end /d:sonar.login="%SONAR_TOKEN%"

Get the SonarQube Project URL

The project URL is the web address of your project in SonarQube. It typically follows this format:

http://<sonarqube-server-url>/dashboard?id=<project-key>
  • <sonarqube-server-url>: The host where SonarQube is running (e.g., http://localhost:9000 if running locally).
  • <project-key>: The unique key assigned to your project in SonarQube.

How to Find the Project Key?

  1. Log in to your SonarQube server.
  2. Navigate to your project.
  3. Check the URL in the browser’s address bar (e.g., http://localhost:9000/dashboard?id=my-project-key).
  4. Alternatively, go to Project Settings → General Settings → Key.

Run SonarQube Analysis in Visual Studio Console

To analyze a .NET project in Visual Studio Developer Command Prompt (or terminal), use the SonarScanner for .NET (dotnet-sonarscanner).

Prerequisites

  • Install Java (required for SonarQube scanner).
  • Install SonarScanner for .NET:

dotnet tool install --global dotnet-sonarscanner

Steps to Run Analysis

  1. Start the SonarQube Analysis:

dotnet sonarscanner begin /k:"<project-key>" /d:sonar.host.url="<sonarqube-server-url>" /d:sonar.login="<token>"
    • Replace:
      • <project-key> with your SonarQube project key.
      • <sonarqube-server-url> with your SonarQube server URL (e.g., http://localhost:9000).
      • <token> with a SonarQube user token.
  2. Build Your Project:

dotnet build
  3. Complete & Publish Results to SonarQube:

dotnet sonarscanner end /d:sonar.login="<token>"
  4. Check Results:
    • Open the SonarQube project URL (e.g., http://localhost:9000/dashboard?id=my-project-key) in a browser.

Example

# Start analysis
dotnet sonarscanner begin /k:"my-dotnet-app" /d:sonar.host.url="http://localhost:9000" /d:sonar.login="sqp_1234567890abcdef"

# Build the project
dotnet build

# End analysis & upload results
dotnet sonarscanner end /d:sonar.login="sqp_1234567890abcdef"

After running these commands, your analysis results will appear in the SonarQube dashboard.

Azure AppInsights integration in Blazor WASM

Integrating Azure Application Insights in a Blazor WebAssembly (WASM) app is possible, though it requires special handling since Blazor WASM runs entirely in the browser, and you can’t use the full .NET SDK for Application Insights like you can in server-side apps.

Here’s how you can set it up using the JavaScript SDK (since Blazor WASM ultimately runs in the browser):


✅ Step-by-Step Guide

1. Create Application Insights Resource (if not done)

  • Go to Azure Portal → Create a resource → Application Insights.
  • Choose General → Application Insights, select region, etc.
  • After creation, copy the Instrumentation Key or Connection String.

2. Add the Application Insights JavaScript SDK

In your Blazor WebAssembly project:

Modify wwwroot/index.html (for standalone Blazor WASM)

<!-- Application Insights JavaScript SDK -->
<script type="text/javascript">
  var appInsights = window.appInsights || function (config) {
    function r(config) {
      t[config] = function () {
        var i = arguments;
        t.queue.push(function () { t[config].apply(t, i); })
      }
    }
    var t = { config: config }, u = document, e = window, o = "script", s = u.createElement(o), i, f;
    for (s.src = config.url || "https://az416426.vo.msecnd.net/scripts/a/ai.0.js", u.getElementsByTagName(o)[0].parentNode.appendChild(s), t.cookie = u.cookie, t.queue = [], i = ["Event", "Exception", "Metric", "PageView", "Trace", "Dependency"]; i.length;)
      r("track" + i.pop());
    return r("setAuthenticatedUserContext"), r("clearAuthenticatedUserContext"), r("flush"), config.disableExceptionTracking || (i = "onerror", r("_" + i), f = e[i], e[i] = function (config, r, u, e, o) {
      var s = f && f(config, r, u, e, o);
      return s !== !0 && t["_" + i](config, r, u, e, o), s
    }), t
  }({
    instrumentationKey: "YOUR_INSTRUMENTATION_KEY"
  });

  window.appInsights = appInsights;
  appInsights.trackPageView();
</script>

3. Call App Insights from C# Code

You can invoke JavaScript from your Blazor C# code like this:

Create a service to interact with JS (e.g., AppInsightsService.cs):

using Microsoft.JSInterop;
using System.Threading.Tasks;

public class AppInsightsService
{
    private readonly IJSRuntime _jsRuntime;

    public AppInsightsService(IJSRuntime jsRuntime)
    {
        _jsRuntime = jsRuntime;
    }

    public async Task TrackEventAsync(string eventName)
    {
        await _jsRuntime.InvokeVoidAsync("appInsights.trackEvent", new { name = eventName });
    }

    public async Task TrackExceptionAsync(string errorMessage)
    {
        await _jsRuntime.InvokeVoidAsync("appInsights.trackException", new
        {
            exception = new { message = errorMessage }
        });
    }

    public async Task TrackPageViewAsync(string pageName)
    {
        await _jsRuntime.InvokeVoidAsync("appInsights.trackPageView", new { name = pageName });
    }
}

4. Register the Service

In Program.cs:

builder.Services.AddScoped<AppInsightsService>();

5. Use in Your Components

@inject AppInsightsService AppInsights

<button @onclick="TrackEvent">Track Event</button>

@code {
    private async Task TrackEvent()
    {
        await AppInsights.TrackEventAsync("ButtonClicked");
    }
}

🧠 Notes

  • Only client-side telemetry will be captured (JS-side): no automatic dependency tracking, for example.
  • If you need full telemetry, consider the hosted Blazor WASM model and use the Application Insights server SDK in the backend.

Supervised and Unsupervised Learning

Supervised Learning

Definition:
The model learns from labeled data, meaning each input has a corresponding correct output.

Goal:
Predict an output (label) from input data.

Examples:

  • Email spam detection (Spam / Not Spam)
  • Predicting house prices (Price in $)
  • Handwriting recognition (0–9 digits)

Types:

  • Classification (output is a category): e.g., cat vs dog
  • Regression (output is a number): e.g., predicting temperature

Requires Labels? ✅ Yes

Example Dataset:

Input Features | Label
“Free offer now” (email text) | Spam
3 bedrooms, 2 baths, 1500 sq ft | $350,000
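
To make this concrete, here is a minimal supervised-learning sketch using scikit-learn (assumed installed); the features, labels, and model choice are illustrative and not part of the original example:

# Supervised learning: fit on inputs paired with known labels, then predict.
from sklearn.tree import DecisionTreeClassifier

# Toy data (made up): [bedrooms, baths, square feet] -> property type label
X = [[3, 2, 1500], [1, 1, 600], [4, 3, 2200], [2, 1, 900]]
y = ["house", "apartment", "house", "apartment"]

model = DecisionTreeClassifier(random_state=0).fit(X, y)
print(model.predict([[3, 2, 1400]]))  # likely ['house']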

πŸ” Unsupervised Learning

Definition:
The model learns patterns from unlabeled data; it finds structure or groupings on its own.

Goal:
Explore data and find hidden patterns or groupings.

Examples:

  • Customer segmentation (group customers by behavior)
  • Anomaly detection (detect fraud)
  • Topic modeling (find topics in articles)

Types:

  • Clustering: Group similar data points (e.g., K-Means)
  • Dimensionality Reduction: Simplify data (e.g., PCA)

Requires Labels? ❌ No

Example Dataset:

Input Features
Age: 25, Spent: $200
Age: 40, Spent: $800

(The model might discover two customer groups: low-spenders vs high-spenders)
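
A minimal K-Means sketch of this grouping, again assuming scikit-learn; the customer rows below extend the toy table above:

# Unsupervised learning: K-Means groups the rows without any labels.
from sklearn.cluster import KMeans

# Toy data (made up): [age, amount spent]
X = [[25, 200], [40, 800], [30, 250], [45, 900]]

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(kmeans.labels_)  # e.g. [0 1 0 1]: low spenders vs high spenders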


✅ Quick Comparison

Feature | Supervised Learning | Unsupervised Learning
Labels | Required | Not required
Goal | Predict outputs | Discover patterns
Output | Known | Unknown
Examples | Classification, Regression | Clustering, Dimensionality Reduction
Algorithms | Linear Regression, SVM, Random Forest | K-Means, PCA, DBSCAN

Supervised Learning Use Cases

1. Email Spam Detection

  • ✅ Label: Spam or Not Spam
  • 📍 Tech companies like Google use supervised models to filter email inboxes.

2. Fraud Detection in Banking

  • ✅ Label: Fraudulent or Legitimate transaction
  • 🏦 Banks use models trained on historical transactions to flag fraud in real time.

3. Loan Approval Prediction

  • ✅ Label: Approved / Rejected
  • 📊 Based on income, credit history, and employment data, banks decide whether to approve loans.

4. Disease Diagnosis

  • ✅ Label: Disease present / not present
  • 🏥 Healthcare systems train models to detect diseases like cancer using medical images or lab reports.

5. Customer Churn Prediction

  • ✅ Label: Will churn / Won’t churn
  • 📞 Telecom companies predict if a customer is likely to cancel a subscription based on usage data.

πŸ” Unsupervised Learning Use Cases

1. Customer Segmentation

  • ❌ No labels; the model groups customers by behavior or demographics.
  • 🛒 E-commerce platforms use this for targeted marketing (e.g., Amazon, Shopify).

2. Anomaly Detection

  • ❌ No labeled “anomalies”; the model detects outliers.
  • 🛑 Used in cybersecurity to detect network intrusions or malware.

3. Market Basket Analysis

  • ❌ No prior labels; finds item combinations frequently bought together.
  • 🛍️ Supermarkets like Walmart use this to optimize product placement.

4. Topic Modeling in Text Data

  • ❌ No labels; the model finds topics in documents or articles.
  • 📚 News agencies use it to auto-categorize stories or summarize themes.

5. Image Compression (PCA)

  • ❌ No labels; the model reduces dimensionality.
  • 📷 Used in storing or transmitting large image datasets efficiently.

🚀 In Summary:

Industry | Supervised Example | Unsupervised Example
Finance | Loan approval | Fraud pattern detection
Healthcare | Diagnosing diseases from scans | Grouping patient records
E-commerce | Predicting purchase behavior | Customer segmentation
Cybersecurity | Predicting malicious URLs | Anomaly detection in traffic logs
Retail | Forecasting sales | Market basket analysis

Training, Validation and Test Data in Machine Learning

Training Data

  • Purpose: Used to teach (train) the model.
  • Contents: Contains both input features and corresponding output labels (in supervised learning).
  • Usage: The model learns patterns, relationships, and parameters from this data.
  • Size: Typically the largest portion of the dataset (e.g., 70–80%).

Example:
If you’re training a model to recognize handwritten digits:

  • Input: Images of digits
  • Label: The digit (0–9)

Test Data

  • Purpose: Used to evaluate how well the model performs on unseen data.
  • Contents: Same format as training data (features + labels), but not used during training.
  • Usage: Helps assess model accuracy, generalization, and potential overfitting.
  • Size: Smaller portion of the dataset (e.g., 20–30%).

Key Point: It simulates real-world data the model will encounter in production.

Validation Data

  • Purpose: Used to tune the model’s hyperparameters and monitor performance during training.
  • Contents: Same format as training/test data; includes input features and labels.
  • Usage:
    • Helps choose the best version of the model (e.g., best number of layers, learning rate).
    • Detects overfitting early by evaluating on data not seen during weight updates.
  • Not used to directly train the model (no weight updates from validation data).

Summary Table

Aspect | Training Data | Validation Data | Test Data
Used for | Training model | Tuning model | Final evaluation
Used during | Model training | Model training | After model training
Updates model? | Yes | No | No
Known to model | Yes | Seen during training | Never seen before
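
In code, the three sets are often produced with two chained splits. Below is a minimal scikit-learn sketch using illustrative 70/15/15 proportions on made-up data:

# Carve off the test set first, then split the remainder into
# training and validation sets (70/15/15 here, as an illustration).
from sklearn.model_selection import train_test_split

X = list(range(100))             # toy inputs
y = [i % 2 for i in range(100)]  # toy labels

# An integer test_size is an absolute sample count.
X_rest, X_test, y_rest, y_test = train_test_split(X, y, test_size=15, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X_rest, y_rest, test_size=15, random_state=0)

print(len(X_train), len(X_val), len(X_test))  # 70 15 15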

Tip:

In practice, for small datasets, we often use cross-validation, where the validation set rotates among the data to make the most of limited samples.
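
For example, 5-fold cross-validation in scikit-learn rotates the validation fold automatically; a minimal sketch on a built-in toy dataset:

# 5-fold cross-validation: every sample is used for validation exactly once.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
print(scores.mean())  # average accuracy across the five folds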

Typical Size Ranges for Small Datasets

Dataset Type | Number of Samples (Roughly)
Very Small | < 500 samples
Small | 500–10,000 samples
Medium | 10,000–100,000 samples
Large | 100,000+ samples

Why Size Matters

  • Small datasets are more prone to:
    • Overfitting – model memorizes data instead of learning general patterns.
    • High variance in performance depending on the data split.
  • Big models (e.g., deep neural networks) usually need large datasets to perform well.

💡 Common Examples

  • Medical diagnosis: Often < 5,000 patient records → small dataset.
  • NLP for niche domains: < 10,000 labeled texts → small.
  • Handwritten digit dataset (MNIST): 60,000 training images → medium-sized.

πŸ” Tip for Small Datasets

If your dataset is small:

  1. Use cross-validation (like 5-fold or 10-fold).
  2. Consider simpler models (e.g., logistic regression, decision trees).
  3. Use data augmentation (e.g., rotate/scale images, reword texts).
  4. Apply transfer learning if using deep learning (e.g., pre-trained models like BERT, ResNet).

Export SonarQube issues (community edition) in Excel File

SonarQube Community Edition has no direct way to export issues to an Excel file. Here are the steps to export them:

  1. Install Python from https://www.python.org/downloads. Go through the custom installation: specify a manual path (e.g., C:\Python313) and check all checkboxes.
  2. Verify the installation by running python --version in a command prompt.
  3. Clone the following repository from GitHub: https://github.com/talha2k/sonarqube-issues-export-to-excel
  4. Open sonar-export.py from the cloned repository in a Python IDE (IDLE) and edit the SonarQube URL, Project_Key, and Token. Save the file.
  5. Run the script:

python sonar-export.py

The script will fetch the issues and save them to an Excel file named sonarqube_issues.xlsx.
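
If you would rather not clone the repository, the core of such a script is small. Here is a hedged sketch using SonarQube's standard /api/issues/search Web API endpoint plus requests and pandas (openpyxl is needed for the Excel output); the URL, project key, and token are placeholders, and pagination beyond the first 500 issues is omitted:

# Fetch issues for one project and write them to an Excel file.
import requests
import pandas as pd  # pandas + openpyxl handle the .xlsx output

SONAR_URL = "http://localhost:9000"
PROJECT_KEY = "my-project-key"
TOKEN = "sqp_1234567890abcdef"

resp = requests.get(
    f"{SONAR_URL}/api/issues/search",
    params={"componentKeys": PROJECT_KEY, "ps": 500},  # ps = page size (max 500)
    auth=(TOKEN, ""),  # token as Basic auth username, empty password
)
issues = resp.json()["issues"]

# Keep a few useful columns; reindex tolerates fields missing on some issues.
columns = ["key", "severity", "type", "component", "line", "message"]
df = pd.DataFrame(issues).reindex(columns=columns)
df.to_excel("sonarqube_issues.xlsx", index=False)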

Hope this will help someone.