What does it mean that Azure Cosmos DB is multi-model?

Cosmos DB is a single NoSQL data engine, an evolution of DocumentDB. When you create a Cosmos DB account you choose the most relevant API for your use case, which determines how you interact with the underlying data store and how the data is persisted into that store.

So, depending on the API chosen, Cosmos DB projects the desired model (graph, column-family, key-value or document) onto the underlying store.

You can only use one API against an account; using several against the same data is not possible because of the way the data is stored and retrieved. The API dictates the data model exposed (graph, key-value, column-family and so on), but they all map back onto the same technology under the hood.

Multi-model, multi-API support

Azure Cosmos DB natively supports multiple data models including documents, key-value, graph, and column-family. The core content-model of Cosmos DB’s database engine is based on atom-record-sequence (ARS). Atoms consist of a small set of primitive types like string, bool, and number. Records are structs composed of these types. Sequences are arrays consisting of atoms, records, or sequences. The database engine can efficiently translate and project different data models onto the ARS-based data model. The core data model of Cosmos DB is natively accessible from dynamically typed programming languages and can be exposed as-is as JSON.
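As a rough illustration, a single JSON document can be read as a tree of these ARS primitives. The sketch below uses a C# anonymous object with made-up field names, purely to mark which parts would be atoms, records and sequences.

// A minimal sketch of how one document decomposes into ARS primitives.
// The field names (id, customer, items, ...) are invented for illustration.
var order = new
{
    id = "1001",            // atom (string)
    total = 42.50,          // atom (number)
    shipped = true,         // atom (bool)
    customer = new          // record: a struct composed of atoms
    {
        name = "Alice",
        city = "Leeds"
    },
    items = new[]           // sequence: an array of records
    {
        new { sku = "A1", qty = 2 },
        new { sku = "B7", qty = 1 }
    }
};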

The service also supports popular database APIs for data access and querying. Cosmos DB’s database engine currently supports DocumentDB SQL, MongoDB, Azure Tables (preview), and Gremlin (preview). You can continue to build applications using popular OSS APIs and get all the benefits of a battle-tested and fully managed, globally distributed database service.
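For example, with the DocumentDB SQL API the same data is exposed and queried as JSON documents. The snippet below is only a sketch using the newer Microsoft.Azure.Cosmos .NET SDK; the endpoint, key, database, container and property names are placeholders rather than anything from this article.

using System;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

public static async Task PrintShippedOrdersAsync()
{
    // Placeholder endpoint and key - substitute your own account values.
    var client = new CosmosClient("https://myaccount.documents.azure.com:443/", "<account-key>");
    Container container = client.GetContainer("mydatabase", "orders");

    // Query the document projection of the data with the SQL API.
    var query = new QueryDefinition("SELECT c.id, c.total FROM c WHERE c.shipped = true");
    FeedIterator<dynamic> iterator = container.GetItemQueryIterator<dynamic>(query);

    while (iterator.HasMoreResults)
    {
        foreach (var item in await iterator.ReadNextAsync())
        {
            Console.WriteLine(item);
        }
    }
}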

Reference:

https://stackoverflow.com/questions/44304947/what-does-it-mean-that-azure-cosmos-db-is-multi-model

Difference between SSIS, Azure Data Factory and Azure Databricks

For data engineering workloads within the Microsoft landscape, there are multiple options for carrying out data engineering tasks and extracting data from a myriad of data sources. Currently three main options are available:

  • SQL Server Integration Services (SSIS): Part of the Microsoft SQL Server suite, SSIS is a well-known and popular ETL tool for data integration with rich built-in transformations such as aggregations, splits and joins. Introduced in 2005 and aimed mainly at on-premises workloads, although packages can now also be run in the cloud via the Azure-SSIS Integration Runtime in Azure Data Factory.
  • Azure Data Factory (ADF): Unlike SSIS, ADF is an ELT and data orchestration tool for building pipelines that move data across different layers, from on-premises to the cloud and within the cloud landscape. ADF itself focuses on movement and orchestration rather than performing transformations. Its main capabilities are:
    • Data movement and orchestration
    • Extract, Load and Transform (ELT)
    • Transformation activities, which delegate the actual transformation work to compute services such as Azure Databricks or HDInsight

People familiar with SSIS can continue to use it, and existing SSIS packages can also be migrated and run from ADF.

Azure Databricks: Azure Databricks is the latest entry in this space. Unlike SSIS and ADF, which are more Extract Transform Load (ETL), Extract Load Transform (ELT) and data orchestration tools, Azure Databricks can handle both data engineering and data science workloads.

In a nutshell, although you can compare and contrast these tools, they actually complement each other. For example, you can call existing SSIS packages from Azure Data Factory and trigger Azure Databricks notebooks from Azure Data Factory, as sketched below.
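As a sketch of that last point, the pipeline below is created with the Azure Data Factory .NET management SDK (Microsoft.Azure.Management.DataFactory) and triggers a Databricks notebook. The resource group, factory, linked service and notebook path are assumed placeholder values, not anything from this article.

using System.Collections.Generic;
using Microsoft.Azure.Management.DataFactory;
using Microsoft.Azure.Management.DataFactory.Models;

public static void CreateNotebookPipeline(DataFactoryManagementClient client)
{
    // Assumes an authenticated client and an existing Databricks linked service.
    var pipeline = new PipelineResource
    {
        Activities = new List<Activity>
        {
            new DatabricksNotebookActivity
            {
                Name = "TransformWithDatabricks",
                NotebookPath = "/Shared/transform",    // placeholder notebook path
                LinkedServiceName = new LinkedServiceReference("AzureDatabricksLinkedService")
            }
        }
    };

    // Publish the pipeline; it can then be triggered on a schedule or on demand.
    client.Pipelines.CreateOrUpdate("my-resource-group", "my-data-factory", "RunDatabricksNotebook", pipeline);
}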

How to read a blob file from Microsoft Azure Storage with .NET Core

In order to read a blob file from Microsoft Azure Blob Storage, you need to know the following:

  • The storage account connection string. This is the long string that looks like this:
    DefaultEndpointsProtocol=https;
    AccountName=someaccountname;
    AccountKey=AVeryLongCrypticalStringThatContainsALotOfChars==
  • The blob storage container name. This is the name shown in the “Blobs” list of the storage account.
  • The blob file name. This is the name of the blob inside the container. A file name can take the form of a path, as blobs are structured like a file system inside the container. For example: folder/folder/file.extension

You also need this NuGet package:

WindowsAzure.Storage

The code is pretty simple:

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
 
public string GetBlob(string containerName, string fileName)
{
  string connectionString = "yourconnectionstring";
 
  // Setup the connection to the storage account
  CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);
   
  // Connect to the blob storage
  CloudBlobClient serviceClient = storageAccount.CreateCloudBlobClient();
  // Connect to the blob container
  CloudBlobContainer container = serviceClient.GetContainerReference(containerName);
  // Connect to the blob file
  CloudBlockBlob blob = container.GetBlockBlobReference(fileName);
  // Get the blob file as text
  string contents = blob.DownloadTextAsync().Result;
   
  return contents;
}

The usage is equally easy:

GetBlob("containername", "my/file.json");