Managing Users & Permissions in CosmosDB with PowerShell

If you’re just getting started with CosmosDB, you might not have come across users and permissions in a CosmosDB database. However, there are certain use cases where managing users and permissions is necessary. For example, you might want to limit access to a particular resource (e.g. a collection, document or stored procedure) by user.

The most common usage scenario for users and permissions is when you’re implementing a Resource Token Broker type pattern that allows client applications to access the CosmosDB database directly.

Side note: The CosmosDB implementation of users and permissions only provides authorization – it does not provide authentication. It would be up to your own implementation to manage the authentication. In most cases you’d use something like Azure Active Directory to provide an authentication layer.

But if you go hunting through the Azure Management Portal CosmosDB data explorer (or Azure Storage Explorer) you won’t find any way to configure or even view users and permissions.

ss_cdb_cosmosdbdataexplorer

To manage users and permissions you need to use the CosmosDB API directly or one of the SDKs.

But to make CosmosDB users and permissions easier to manage from PowerShell, I created the CosmosDB PowerShell module. This is an open source project hosted on GitHub. The CosmosDB module allows you to manage much more than just users and permissions, but for this post I just wanted to start with these.

Requirements

This module works on PowerShell 5.x and PowerShell Core 6.0.0. It probably works on PowerShell 3 and 4, but I don’t have any machines running those versions to test on.

The CosmosDB module does not have any dependencies, unless you call the New-CosmosDbContext function with the ResourceGroup parameter specified, as this uses the AzureRM PowerShell modules to read the Master Key for the connection directly from your CosmosDB account. So I’d recommend installing the Azure PowerShell modules, or if you’re using PowerShell Core 6.0, the AzureRM.NetCore modules.

Installing the Module

The best way to install the CosmosDB PowerShell module is from the PowerShell Gallery. To install it for only your user account execute this PowerShell command:

Install-Module -Name CosmosDB -Scope CurrentUser

ss_cdb_cosmosdbinstallmodulecurrentuser

Or to install it for all users on the machine (requires administrator permissions):

Install-Module -Name CosmosDB

ss_cdb_cosmosdbinstallmoduleallusers

Context Variable

Update 2018-03-06

As of CosmosDB module v2.0.1, the connection parameter has been renamed to context and the New-CosmosDbConnection function has been renamed New-CosmosDbContext. This was to be more in line with the naming adopted by the Azure PowerShell project. The old connection parameter and the New-CosmosDbConnection function are still available as aliases, so older scripts won’t break. But these should be changed to use the new naming if possible, as I plan to deprecate the connection versions at some point in the future.

This post was updated to specify the new naming, but screenshots still show the Connection aliases.

Before you get down to the process of working with CosmosDB resources, you’ll need to create a context variable containing the information required to connect. This requires the following information:

  1. The CosmosDB Account name
  2. The CosmosDB Database name
  3. The Master Key for the account (you can have the CosmosDB PowerShell module get this directly from your Azure account if you wish).

To create the context variable we just use the New-CosmosDbContext function:

ss_cdb_cosmosdbnewconnection
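
For reference, the command in the screenshot looks something like this (the account and database names are placeholders, and the Key parameter expects a SecureString – check the module help if the parameter names differ in your version):

# Convert the Master Key into a SecureString and create the context
$primaryKey = ConvertTo-SecureString -String '<your CosmosDB account primary key>' -AsPlainText -Force
$context = New-CosmosDbContext -Account 'MyAzureCosmosDB' -Database 'MyDatabase' -Key $primaryKey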

If you do not wish to specify your master key, you can have the New-CosmosDbContext function pull your master key directly from your Azure account:

ss_cdb_cosmosdbnewconnectionviaportal
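
Roughly what the screenshot shows (the account, database and resource group names are placeholders):

$context = New-CosmosDbContext -Account 'MyAzureCosmosDB' -Database 'MyDatabase' -ResourceGroup 'MyResourceGroup'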

Note: This requires the AzureRM.Profile and AzureRM.Resources modules on Windows PowerShell 5.x, or the AzureRM.Profile.NetCore and AzureRM.Resources.NetCore modules on PowerShell Core 6.0.0.

Managing Users

To add a user to the CosmosDB database, use the New-CosmosDbUser function:

New-CosmosDbUser -Context $context -Id 'daniel'

ss_cdb_cosmosdbnewuser

To get a list of users in the database:

Get-CosmosDbUser -Context $context

ss_cdb_cosmosdbgetusers

To get a specific user:

Get-CosmosDbUser -Context $context -Id 'daniel'

ss_cdb_cosmosdbgetuser

To remove a user (this will also remove all permissions assigned to the user):

Remove-CosmosDbUser -Context $context -Id 'daniel'

ss_cdb_cosmosdbremoveuser

Managing Permissions

Permissions in CosmosDB are granted to a user for a specific resource. For example, you could grant a user access to just a single document, an entire collection or to a stored procedure.

To grant a permission you need to provide four pieces of information:

  1. The Id of the user to grant the permission to.
  2. An Id for the permission to create. This is just a string to uniquely identify the permission.
  3. The permission mode of the permission: All or Read.
  4. The Id of the resource to grant access to. This can be generated using one of the Get-CosmosDb*ResourcePath functions in the CosmosDB PowerShell module.

In the following example, we’ll grant the user daniel all access to the TestCollection:

ss_cdb_cosmosdbnewpermission
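
The command in the screenshot is along these lines (the database name is a placeholder – use whichever database your context was created for, and check Get-Help New-CosmosDbPermission if the parameter names differ in your module version):

# Generate the resource path for the collection, then grant the 'All' permission mode to the user
$collectionResourcePath = Get-CosmosDbCollectionResourcePath -Database 'MyDatabase' -Id 'TestCollection'
New-CosmosDbPermission -Context $context -UserId 'daniel' -Id 'AccessTestCollection' -Resource $collectionResourcePath -PermissionMode All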

Once a permission has been granted, you can use the Get-CosmosDbPermission function to retrieve the permission and with it the Resource Token that can be used to access the resource for a limited amount of time (between 10 minutes and 5 hours).

Note: as you have the Master Key already, using the Resource Token isn’t required.

For example, to retrieve all permissions for the user with Id daniel and a resource token expiration of 600 seconds:

Get-CosmosDbPermission -Context $context -UserId 'daniel' -TokenExpiry '600' |
    Format-List *

ss_cdb_cosmosdbgetpermission

You can, as expected, delete a permission by using the Remove-CosmosDbPermission function:

Remove-CosmosDbPermission -Context $context -UserId 'daniel' -Id 'AccessTestCollection'

ss_cdb_cosmosdbremovepermission

Final Thoughts

So this is pretty much all there is to managing users and permissions using the CosmosDB PowerShell module. This module can also be used to manage the following CosmosDB resources:

 

  • Attachments
  • Collections
  • Databases
  • Documents
  • Offers
  • Stored procedures
  • Triggers
  • User Defined Functions

You can find additional documentation and examples of how to manage these resources over in the CosmosDB PowerShell module readme file on GitHub.

Hopefully this will help you in any CosmosDB automation tasks you might need to implement.

 


Configure Azure SQL Server Automatic Tuning with PowerShell

One thing I’ve found with configuring Azure services using automation (e.g. the Azure PowerShell modules or Azure Resource Manager templates) is that the automation support is often a little bit behind the feature set. For example, the Azure PowerShell modules may not yet implement settings for new or preview features. This can be an issue if you’re strictly deploying everything via code (e.g. infrastructure as code). But if you run into a problem like this, all is not lost. So read on for an example of how to solve this issue.

Azure REST APIs

One of the great things about Azure is that everything is configurable by making direct requests to the Azure REST APIs, even if it is not available in ARM templates or Azure PowerShell.

Depending on the feature/configuration, you can sometimes use the Set-AzureRmResource cmdlet to make calls to the REST APIs. But this cmdlet is limited to using an HTTP method of POST. So if you need to use PATCH, you’ll need to find an alternate way to make the call.

So, what you need is to use the Invoke-RestMethod cmdlet to create a custom call to the REST API. This is the process I needed to use to configure the Azure SQL Server Automatic Tuning settings, and it’s what I’ll show in the script below.
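
Stripped down to its essentials, the custom call looks something like this. This is just a sketch: it assumes the Microsoft.Sql automaticTuning/current endpoint and the 2017-03-01-preview API version that were current at the time of writing, the token acquisition pattern depends on your AzureRM.Profile version, and the $SubscriptionId, $ResourceGroupName and $ServerName variables correspond to the script parameters described below.

# Acquire a bearer token for the signed-in AzureRM account (pattern varies by AzureRM.Profile version)
$azureRmProfile = [Microsoft.Azure.Commands.Common.Authentication.Abstractions.AzureRmProfileProvider]::Instance.Profile
$profileClient = New-Object -TypeName Microsoft.Azure.Commands.ResourceManager.Common.RMProfileClient -ArgumentList $azureRmProfile
$token = $profileClient.AcquireAccessToken((Get-AzureRmContext).Tenant.Id).AccessToken

# Build the automatic tuning PATCH request for the server
# (for a database, the resource path is .../servers/{server}/databases/{database}/automaticTuning/current)
$uri = ('https://management.azure.com/subscriptions/{0}/resourceGroups/{1}/providers/Microsoft.Sql/servers/{2}/automaticTuning/current?api-version=2017-03-01-preview' -f
    $SubscriptionId, $ResourceGroupName, $ServerName)

$body = @{
    properties = @{
        desiredState = 'Custom'
        options      = @{
            createIndex       = @{ desiredState = 'On' }
            dropIndex         = @{ desiredState = 'On' }
            forceLastGoodPlan = @{ desiredState = 'Off' }
        }
    }
} | ConvertTo-Json -Depth 5

Invoke-RestMethod -Method Patch -Uri $uri -Body $body -ContentType 'application/json' -Headers @{ Authorization = "Bearer $token" }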

The Script

The following script can be executed in PowerShell (of course) and requires a number of parameters to be passed to it:

  • SubscriptionId – The subscription Id of the Azure subscription that contains the Azure SQL Server.
  • ResourceGroupName – The name of the resource group containing the SQL Server or database.
  • ServerName – The name of the Azure SQL Server to set the automatic tuning options on.
  • DatabaseName – The name of the Azure SQL Database to set the automatic tuning options on. If you pass this parameter then the automatic tuning settings are applied to the Azure SQL Database, not the server.
  • Mode – Defines where the settings for the automatic tuning are obtained from. Inherit is only valid if the DatabaseName is specified.
  • CreateIndex – Enable automatic tuning for creating an index.
  • DropIndex – Enable automatic tuning for dropping an index.
  • ForceLastGoodPlan – Enable automatic tuning for forcing the last good plan.

Requirements: You need to have installed the AzureRM.Profile PowerShell module (part of the AzureRM PowerShell modules) to use this script. The script also requires you to have logged into your Azure subscription using Add-AzureRmAccount (as a user or Service Principal).

Example Usage

To apply custom automatic tuning to an Azure SQL Server:

.\Set-AzureRMSqlServerAutotuning.ps1 -SubscriptionId '<Subscription Id>' -ResourceGroupName '<Resource Group name>' -ServerName '<Azure SQL server name>' -Mode Custom -CreateIndex On -DropIndex On -ForceLastGoodPlan Off

ss_sqlserver_serverautotuning

To apply custom automatic tuning to an Azure SQL Database:

.\Set-AzureRMSqlServerAutotuning.ps1 -SubscriptionId '<Subscription Id>' -ResourceGroupName '<Resource Group name>' -ServerName '<Azure SQL server name>' -DatabaseName '<Azure SQL database name>' -Mode Custom -CreateIndex On -DropIndex On -ForceLastGoodPlan Off

ss_sqlserver_databaseautotuning

Conclusion

I’ve not yet encountered something in Azure that I can’t configure via the Azure REST APIs. This is because the Azure Management Portal uses the same APIs – so if it is available in the portal then you can do it via the Azure REST APIs. The biggest challenge is determining the body, header and methods available if the APIs are not yet documented.

If the API you need is not documented then you can raise a question in the Microsoft Azure Forums or on Stack Overflow. Failing that, you can use the developer tools in your browser of choice to watch the API calls being made by the portal – I’ve had to resort to this many times, but documenting that process is something I’ll save for another day.

 

Create a Scheduled Task with unlimited Execution Time Limit in PowerShell

When creating a scheduled task in PowerShell you may wish to set the Execution Time Limit of the task to be unlimited (no time limit).

ss_scheduledtask_executiontimelimit

This will prevent the task from being terminated if it is still running after a specific period of time.

Creating scheduled tasks using PowerShell is pretty easy using the *-ScheduledTask* cmdlets in Windows Server 2012 and above.

However, after working on this issue in the xScheduledTask DSC resource in the Microsoft DSC Resource Kit I found that there are some differences in how to do this between Windows Server 2012 R2 and Windows Server 2016.

So in this post I’m going to show how to create a scheduled task with no Execution Time Limit that will work on both Windows Server 2012 R2 (and Windows 8/8.1) and Windows Server 2016 (and Windows 10).

I’ll also show the method that works only on Windows Server 2016.

All Versions of Windows Server

To create a scheduled task with unlimited Execution Time Limit on Windows Server 2012 R2 and Windows Server 2016:
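
Here’s a sketch of the approach (the task name, action and trigger are just examples): create the settings set first, then overwrite its ExecutionTimeLimit property with the ISO 8601 duration 'PT0S', which the Task Scheduler treats as no time limit.

$action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -Command "Get-Date"'
$trigger = New-ScheduledTaskTrigger -Daily -At '03:00'

# Create a default settings set, then set the Execution Time Limit to unlimited
$settings = New-ScheduledTaskSettingsSet
$settings.ExecutionTimeLimit = 'PT0S'

Register-ScheduledTask -TaskName 'MyUnlimitedTask' -Action $action -Trigger $trigger -Settings $settings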

This should also work on Windows Server 2012, but I have not confirmed this. It will NOT work on Windows Server 2008 R2. It should also work on Windows 8/8.1/10.

Windows Server 2016 Only

To create a scheduled task with unlimited Execution Time Limit on Windows Server 2016 only:
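
Again a sketch with example task details – on Windows Server 2016/Windows 10 a zero TimeSpan passed to the ExecutionTimeLimit parameter is interpreted as no time limit:

$action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -Command "Get-Date"'
$trigger = New-ScheduledTaskTrigger -Daily -At '03:00'
$settings = New-ScheduledTaskSettingsSet -ExecutionTimeLimit (New-TimeSpan -Seconds 0)

Register-ScheduledTask -TaskName 'MyUnlimitedTask' -Action $action -Trigger $trigger -Settings $settings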

This method is a more elegant approach and is arguably how the Scheduled Task cmdlets are intended to be used. But you would only use this method if your task does not need to be created on an operating system earlier than Windows Server 2016/Windows 10.

So, hopefully this will help anyone else out there who has struggled with this.

Auto Formatting PowerShell in Visual Studio Code

One of the features I’m most fond of in Visual Studio Code is the built-in Format Document feature.

ss_vscode_formatdocument

Side Note: If you’re writing PowerShell scripts or modules then you should be using Visual Studio Code. You should only be using PowerShell ISE if you don’t have the ability to install Visual Studio Code.

The Format Document feature can be used in many different document types in Visual Studio Code to correct the layout based on your user settings or the workspace settings for the project you’re working on.

ss_vscode_settings

This enables me to configure Visual Studio Code to auto format PowerShell code the way I like it for my own projects, but still adhere to the code formatting standards of any other projects I work on without having to remember what they are for each. This saves so much time and hassle.

Tip: If you’re contributing code to an Open Source project, the project maintainers may have included a .vscode\settings.json in the project folder. This may contain workspace specific formatting settings that you should apply before submitting code back to the project.

But even if you don’t define any code formatting settings, Visual Studio Code will still do a great job of formatting your PowerShell code. Having nicely formatted code really is not a requirement for being awesome at writing PowerShell, but it does make it easier for not so awesome PowerShell people to read, understand and potentially maintain your work.

Formatting a PowerShell Script

Here are the simple instructions for auto formatting a document in Visual Studio Code:

  1. Download and install Visual Studio Code.
  2. Once Visual Studio Code is installed, add the PowerShell extension:
    ss_vscode_powershellextension
  3. In Visual Studio Code, open the folder of the project containing the file you want to format or open an individual PowerShell file (PS1, PSD1, PSM1).
    ss_vscode_powershellbadcodeformat

    Note: Workspace formatting settings are only used if you’ve opened the folder rather than an individual file.

  4. Press SHIFT+ALT+F (or press F1 and type Format and select Format Document).
    ss_vscode_powershellgoodcodeformat
  5. The code is now all well formatted, so save the document.

It really is as easy as that.

Happy formatting.

Stop, Start or Restart all Web Apps in Azure using PowerShell

Here is a short (and sometimes handy) single line of PowerShell code that can be used to restart all the Azure Web Apps in a subscription:

ss_azurecloudshell_restartallwebapps
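
The one-liner in the screenshot is essentially this (a sketch assuming the AzureRM.Websites module, where each Web App object exposes Name and ResourceGroup properties):

(Get-AzureRmWebApp).GetEnumerator() |
    ForEach-Object { Restart-AzureRmWebApp -ResourceGroupName $_.ResourceGroup -Name $_.Name }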

Note: Use this with care if you’re working with production systems because this _will_ restart these Web Apps without confirming first.

This would be a handy snippet to be able to run in the Azure Cloud Shell. It could also be adjusted to perform different actions on other types of resources.

To stop all Web Apps in a subscription use:
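
Same pattern as above, just swapping in Stop-AzureRmWebApp:

(Get-AzureRmWebApp).GetEnumerator() |
    ForEach-Object { Stop-AzureRmWebApp -ResourceGroupName $_.ResourceGroup -Name $_.Name }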

To start them all again:
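
And with Start-AzureRmWebApp:

(Get-AzureRmWebApp).GetEnumerator() |
    ForEach-Object { Start-AzureRmWebApp -ResourceGroupName $_.ResourceGroup -Name $_.Name }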

The key part of this command is the GetEnumerator() method, because most Azure cmdlets don’t return an array of individual objects into the pipeline like typical PowerShell cmdlets do. Instead they return a System.Collections.Generic.List object, which requires a slight adjustment to the code. This approach can be used with most Azure cmdlets to allow the results to be iterated through.

ss_azurecloudshell_systemcollections

Thanks for reading.

Get Azure API Management Git Credentials using PowerShell

One of the many great features of Azure API Management is the fact that it has a built in Git repository for storing the current configuration as well as publishing new configurations.

ss_apim_gitrepository

This allows you to push updated Azure API Management configurations to this internal Git repository as a new branch and then Deploy the configuration to API Management.

The internal Git repository in Azure API Management is not intended to be used for a normal development workflow. You’ll still want to develop and store your Azure API management configuration in an external Git repository such as GitHub or TFS/VSTS and then copy configuration updates to the internal Git repository in Azure API Management using some sort of automated process (e.g. Continuous Integration/Continuous Delivery could be adopted for this).

The Internal Git Repository

To access the internal Git repository you need to generate short-lived Git credentials (30 days maximum). This is fairly easy through the Azure API Management portal:

ss_apim_gitrepositorygeneratecreds

Unfortunately using the portal to get these credentials is a manual process and so would not be so good for an automated delivery process (e.g. CI/CD). You’d need to update these Git credentials in your CI/CD automation system every time they expired (every 30 days).

Get Git Credentials

A better approach to generating the Git Credentials is to use Azure PowerShell API Management cmdlets connected with a Service Principal to generate the Git credentials whenever you need them in your CI/CD pipeline.

This is not a completely straightforward process right now (which is unusual for the Azure PowerShell team), so I’ve created a simple PowerShell script that will take care of the nuts and bolts for you.

Requirements

To run this script you’ll need:

  1. PowerShell 5 (WMF 5.0) or greater.
  2. Azure PowerShell Modules installed (make sure you’ve got the latest versions – 4.0.3 at the time of writing this).

You’ll also need to supply the following parameters to the script:

  1. The Azure Subscription Id of the subscription containing the API Management instance.
  2. The name of the Resource Group where the API Management instance is installed to.
  3. The service name of the API Management instance.

You can also optionally supply which of the two internal API Management keys, primary or secondary, to use to generate the credential and also the length of time that the Git credential will be valid for (up to 30 days).

Steps

Download the Script

  1. Download the script Get-AzureRMApiManagementGitCredential.ps1 using the PowerShell command:
  2. Unblock the script using the PowerShell command:
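
The two commands referred to in the steps above would look something like this (the download URL is a placeholder – substitute the raw Gist URL of the script):

# Step 1: download the script
Invoke-WebRequest -Uri '<raw Gist URL of Get-AzureRMApiManagementGitCredential.ps1>' `
    -OutFile .\Get-AzureRMApiManagementGitCredential.ps1

# Step 2: unblock the downloaded script so it can be executed
Unblock-File -Path .\Get-AzureRMApiManagementGitCredential.ps1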

Using the Script

  1. Use the Login-AzureRMAccount cmdlet to authenticate to Azure. This would normally be done using a Service Principal if using an automated process, but could be done interactively when testing.
  2. Execute the script providing the SubscriptionId, ResourceGroup and ServiceName parameters (and optionally the KeyType and ExpiryTimespan) using the following PowerShell command:

ss_apim_gitrepositoryinvoke
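
The invocation in the screenshot looks roughly like this (the parameter names match the description above; whether ExpiryTimespan accepts a TimeSpan may differ in your copy of the script, so check its help):

$gitCredential = .\Get-AzureRMApiManagementGitCredential.ps1 `
    -SubscriptionId '<Subscription Id>' `
    -ResourceGroup '<Resource Group name>' `
    -ServiceName '<API Management service name>' `
    -KeyType Primary `
    -ExpiryTimespan (New-TimeSpan -Hours 4)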

The script will return an object containing the properties GitUsername and GitPassword that can be provided to Git when cloning the internal Git repository.

The GitPassword is not escaped, so it cannot be used directly within a Git clone URL without replacing any / or @ characters with %2F and %40 respectively.

In the example above I generated an internal Git Credential using the Primary Secret Key that will expire in 4 hours.

Typically you’d assign the output of this script to a variable and use the properties to generate the URL to pass into the Git Clone. For example:

ss_apim_gitrepositoryclone
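
Something along these lines (a sketch that URL-encodes the password so any / or @ characters become %2F and %40):

$gitUsername = $gitCredential.GitUsername
$gitPassword = [Uri]::EscapeDataString($gitCredential.GitPassword)

git clone ("https://{0}:{1}@myapimanagementinstance.scm.azure-api.net" -f $gitUsername, $gitPassword)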

Tips

  • When cloning the internal Git Repository you’ll need the clone URL of the repository. This is always the name of your Azure API Management instance with scm.azure-api.net appended to it, e.g. https://myapimanagementinstance.scm.azure-api.net
  • Once you’ve uploaded a new Git branch containing a new or updated Azure API Management configuration you’ll need to use the Publish-AzureRmApiManagementTenantGitConfiguration cmdlet to tell Azure API Management to publish the configuration contained in the branch. I have not detailed this process here, but if there is interest I can cover the entire end-to-end process.
  • The Primary and Secondary Secret Keys that are used to generate the internal Git Credential can be re-generated (rolled) individually if a Git credential is compromised. However, this will invalidate all Git Credentials generated using that Secret Key.

The Script

If you wish to review the script itself, here it is:

So, hopefully that will be enough information to get anyone else started on building a CI/CD pipeline for deploying Azure API Management configurations.

 

Publish an Azure RM Web App using a Service Principal in PowerShell

Introduction

Deploying an Azure Web App is almost stupidly simple. If I were to list all the methods and tools I’d still be typing next week. The problem with many of these tools and processes is that they do a whole lot of magic under the hood, which makes the process difficult to manage in source control.

I’m a big believer that all code (including deployment code) should be in the application source repository so it can be run by any tool or release pipeline – including manually by development teams. This ensures that whatever deployment process is used, it is the same no matter who or what runs it – and we end up continuously testing the deployment code and process.

So I decided to go and find out how to deploy an Azure Web App using PowerShell and a Service Principal.

Where is Publish-AzureRMWebsiteProject?

If you look through the Azure PowerShell cmdlets you’ll find an older Service Management one called Publish-AzureWebsiteProject. This cmdlet looks like it should do the trick, but it isn’t suitable because it requires authentication by a user account instead of a service principal.

Only service principal accounts can be authenticated using automation. Therefore, using Publish-AzureWebsiteProject would only work if a development team member logged in interactively – which would prevent the same process being used for automation or our continuous delivery pipeline. The newer Azure Resource Manager cmdlets (*-AzureRM*) all support logging in with a service principal, but the problem is that there is no Publish-AzureRMWebsiteProject cmdlet.

So, to work around this limitation I determined I had to use Web Deploy/MSDeploy. The purpose of this post is to share the PowerShell function/code and process I used to do this. This will work with and without Web App deployment slots.

Note: in my case our teams put all deployment code into a PowerShell PSake task in the application source code repository to make it trivial for anyone to run the deployment. The continuous delivery pipeline was also able to call the exact same task to perform the deployment. There is no requirement to use PowerShell PSake – just a simple PowerShell script will do.

The Code

So, I’ll start by just pasting the function that performs the task:
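
In outline, the function looks something like this – a simplified sketch only: the parameter names are illustrative, and the publishing profile is pulled using the AzureRM.Websites cmdlets, so adjust it to suit your environment.

function Publish-AzureRMWebappProject
{
    [CmdletBinding()]
    param
    (
        [Parameter(Mandatory = $true)] [String] $SubscriptionId,
        [Parameter(Mandatory = $true)] [String] $TenantId,
        [Parameter(Mandatory = $true)] [PSCredential] $Credential,   # Application Id + password of the Service Principal
        [Parameter(Mandatory = $true)] [String] $ResourceGroupName,
        [Parameter(Mandatory = $true)] [String] $WebAppName,
        [String] $SlotName,
        [Parameter(Mandatory = $true)] [String] $WebAppPackagePath,  # path to the Web Deploy (zip) package to publish
        [String] $MSDeployPath = "$env:ProgramFiles\IIS\Microsoft Web Deploy V3\msdeploy.exe"
    )

    # Authenticate to Azure using the Service Principal
    $null = Add-AzureRmAccount -ServicePrincipal -Credential $Credential `
        -TenantId $TenantId -SubscriptionId $SubscriptionId

    # Download the Web Deploy publishing profile for the Web App (or slot)
    $profilePath = Join-Path -Path $env:TEMP -ChildPath "$WebAppName.publishsettings"
    if ($SlotName)
    {
        $null = Get-AzureRmWebAppSlotPublishingProfile -ResourceGroupName $ResourceGroupName `
            -Name $WebAppName -Slot $SlotName -OutputFile $profilePath -Format WebDeploy
    }
    else
    {
        $null = Get-AzureRmWebAppPublishingProfile -ResourceGroupName $ResourceGroupName `
            -Name $WebAppName -OutputFile $profilePath -Format WebDeploy
    }

    # Extract the MSDeploy endpoint and credentials from the publishing profile
    $publishProfile = ([Xml] (Get-Content -Path $profilePath -Raw)).publishData.publishProfile |
        Where-Object -FilterScript { $_.publishMethod -eq 'MSDeploy' }

    # Call MSDeploy.exe to sync the package to the Web App
    & $MSDeployPath @(
        '-verb:sync'
        "-source:package='$WebAppPackagePath'"
        "-dest:auto,ComputerName='https://$($publishProfile.publishUrl)/msdeploy.axd?site=$($publishProfile.msdeploySite)',UserName='$($publishProfile.userName)',Password='$($publishProfile.userPWD)',AuthType='Basic'"
    )
}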

Just save this file as Publish-AzureRMWebappProject.ps1 and you’re ready to start publishing (almost).

Before you can use this function you’ll need to get a few things sorted:

  1. Create a Service Principal with a password to use to deploy the web app using the instructions on this page.
  2. Make sure you have got the latest version of the Azure PowerShell Modules installed (I used v4.0.0). See this page for instructions.
  3. Make sure you’ve got MSDeploy.exe installed on your computer – see this page for instructions. You can pass the path to MSDeploy.exe into the Publish-AzureRMWebappProject.ps1 using the MSDeployPath parameter.
  4. Gather the following things (there are many ways of doing that – but I’ll leave it up to you to figure out what works for you):
    1. the Subscription Id of the subscription you’ll be deploying to.
    2. the Tenant Id of the Azure Active Directory containing your Service Principal.
    3. the Application Id that was displayed to you when you created the Service Principal.
    4. the Password you assigned when you created the Service Principal.

Once you have got all this information you can call the script above like this:
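
For example, using the parameter names from the sketch above (your copy of the script may use different names):

# Dot-source the file containing the function first
. .\Publish-AzureRMWebappProject.ps1

# Replace these with the values for your Azure Subscription, Tenancy and Service Principal
$SubscriptionId = '<Subscription Id>'
$TenantId       = '<Tenant Id>'
$Username       = '<Application Id of the Service Principal>'
$Password       = ConvertTo-SecureString -String '<Service Principal password>' -AsPlainText -Force
$credential     = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $Username, $Password

Publish-AzureRMWebappProject `
    -SubscriptionId $SubscriptionId `
    -TenantId $TenantId `
    -Credential $credential `
    -ResourceGroupName '<Resource Group name>' `
    -WebAppName '<Web App name>' `
    -SlotName 'offline' `
    -WebAppPackagePath '.\publish\MyWebApp.zip' `
    -Verbose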

Note: You’ll need to make sure to replace the variables $SubscriptionId, $TenantId, $Password and $Username with the values for your Azure Subscription, Tenancy and Service Principal.

When everything is done correctly this is what happens when you run it (with -Verbose enabled):

ss_webappdeploy_publishazurermwebappproject

Note: in the case above I was installing to a deployment staging slot called offline, so the new version of my website wouldn’t have been visible in my production slot until I called the Swap-AzureRmWebAppSlot cmdlet to swap the offline slot with my production slot.

All in all, this is fairly robust and allows our development teams and our automation and continuous delivery pipeline to all use the exact same deployment code which reduces deployment failures.

If you’re interested in more details about the code/process, please feel free to ask questions.

Thanks for reading.