Working with Node.js (Express.js) and Cosmos DB

Toshi Honda
May 15, 2023


Here is my story of building a Node.js web app with Cosmos DB. Although there are many guides available on the web, I still had to figure out and troubleshoot issues during development. Mainly, I followed the guide ‘Quickstart — Azure Cosmos DB for NoSQL client library for Node.js’ on the Microsoft Learn website. However, I added a user-assigned managed identity and a view engine along the way. I hope my story helps Node.js + Cosmos DB developers around the world.

Here is a summary of my story.

View Engine

This is something I just wanted to mention. Express.js is my web application framework of choice. Note that I am a casual developer. I used express-generator this time. Voilà! A view engine. View engines create HTML pages dynamically using templates. They also make the code a lot cleaner and more reusable. Pug is one of them, and I got to like it.
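
For reference, here is a minimal sketch of wiring up Pug as the view engine. It assumes the default express-generator layout with a views/ folder containing index.pug; the names are just examples.

// app.mjs (sketch) — tell Express where templates live and which engine renders them
import express from 'express';

const app = express();
app.set('views', './views');
app.set('view engine', 'pug');

// Renders views/index.pug with the given template variables,
// e.g. a template containing: h1= title
app.get('/', (req, res) => {
  res.render('index', { title: 'Express' });
});

app.listen(3000);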

Use of ES6 modules

According to the Microsoft Cosmos DB guide, a “type”: “module” entry needs to be added to package.json to enable ES6 modules. That was easy. However, the guide did not mention anything about ES6 beyond that. After some quick research (the keyword was ES6), I learned that every require() call needs to be rewritten as an import statement. For example,

// var express = require('express');
import express from 'express';

Another thing was __dirname. A quick search revealed I needed to rewrite it as

// var path = require('path');
import path from 'path';
import { fileURLToPath } from 'url';
import { dirname } from 'path';
const __filename = fileURLToPath(import.meta.url);
const __dirname = dirname(__filename);

So, that was fun.

DefaultAzureCredential

In order to access Cosmos DB, some form of authentication is required. One option is passwords or secret keys, but I did not want to expose those anywhere in my code, and Microsoft says to use DefaultAzureCredential. The guide explains it as follows:

“This approach enables your app to use different authentication methods in different environments (local vs. production) without implementing environment-specific code.”

It sounds good! I am mentioning the class because some older guides used a secret key in the code. I got to like the new way of doing authentication.

One thing to note: don’t forget to set the COSMOS_ENDPOINT environment variable ($env:COSMOS_ENDPOINT on Windows) locally, or in the application settings (App Services -> Configuration -> Application settings). Otherwise, it will error out.
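
Putting it together, constructing the client with DefaultAzureCredential looks something like the sketch below (assuming the @azure/cosmos and @azure/identity packages are installed):

import { CosmosClient } from '@azure/cosmos';
import { DefaultAzureCredential } from '@azure/identity';

// COSMOS_ENDPOINT must be set locally or in the App Service application settings.
const endpoint = process.env.COSMOS_ENDPOINT;

// DefaultAzureCredential picks an authentication method based on the environment
// (developer sign-in locally, managed identity in the cloud).
const credential = new DefaultAzureCredential();
const cosmosClient = new CosmosClient({ endpoint, aadCredentials: credential });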

Container contributor role (built-in)

Then confusion came along when the guide mentioned creating a custom role. The guide explains how to create a custom “PasswordlessReadWrite” role, but I found it is not necessary because an equivalent role already exists under the name “Cosmos DB Built-in Data Contributor.” In Azure CLI,

az cosmosdb sql role definition list --account-name accountname --resource-group resourcename

This will list the role definitions, and you will find the SQL role definition ID quickly. Then you can assign the role to your accounts. In my local development environment (Visual Studio Code), I assigned the role to the Azure user account I develop with. I also assigned the role to the user-assigned managed identity for the app after deployment.

I did not expect to use Azure CLI to assign the role. Initially, I was navigating through the Azure Portal. There is a “Contributor” role in the portal, but it carries many permissions beyond container access. After a few hours of research, I concluded that I needed to work with Azure CLI. Maybe that is why the guide shows the example of creating and assigning the role in Azure CLI.
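
For reference, the assignment itself can be done with a command along these lines (the principal and role definition IDs are placeholders):

az cosmosdb sql role assignment create --account-name accountname --resource-group resourcename --scope "/" --principal-id <PRINCIPAL_OBJECT_ID> --role-definition-id <SQL_ROLE_DEFINITION_ID>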

After signing in with Connect-AzAccount, you should be able to create an item in your database from your local development environment.
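
For example, a quick sanity check could look something like this (the database, container, and item names are just placeholders, and both the database and container are assumed to already exist):

// Reuse the cosmosClient created with DefaultAzureCredential above.
const container = cosmosClient.database('mydatabase').container('mycontainer');

// Create a sample item; the id property is required by Cosmos DB.
const { resource: createdItem } = await container.items.create({
  id: '1',
  category: 'test',
  name: 'hello'
});
console.log(createdItem.id);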

Authenticating with a user-assigned managed identity

Now that I could create an item in the local environment, I deployed the app to the cloud. However, I could not open my app’s website at all. I needed to enable diagnostic logging to see what was going on. It uses a storage account and costs very little. Among the errors I saw were:

  • TypeError [ERR_INVALID_URL]: Invalid URL
  • CredentialUnavailableError: EnvironmentCredential is unavailable.

The first error tells you to make sure COSMOS_ENDPOINT is set in the application settings I mentioned above. For the second error, I needed to add an option to DefaultAzureCredential like the one below:

const cosmosClient = new CosmosClient({
  endpoint,
  aadCredentials: new DefaultAzureCredential({
    managedIdentityClientId: "<MANAGED_IDENTITY_CLIENT_ID>"
  })
});

It was explained in this GitHub sample, which shows how to use DefaultAzureCredential in various situations with different options. It took me a while to figure it out.

Querying items returned nothing

The last piece was querying items. Although I was following the guide, I did not copy and paste the whole content. This is what I wrote to query items:

const { resource } = await this.container.items.query(querySpec).fetchAll();

It looks good, but it did not return anything at all. I tried items.readAll().fetchAll(), but nothing. I found that item(id, partition_key).read() did work; I could fetch items that way but not with a query statement. After a long search, I found the following post on Stack Overflow, saying

Also, the query returns the data in resources key and not result key like you are using.

So I switched from { resource } to { resources }, and the query started returning items!
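
In other words, the corrected line is simply:

const { resources } = await this.container.items.query(querySpec).fetchAll();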

Conclusion

I hope you find my story helpful in developing your Node.js + Cosmos DB Web app. Happy coding!
