November 04, 2023

Calendar Tables

Calendar tables, or functions, give us the ability to measure and aggregate over time, and they solve the problem of days that have no matching rows being silently omitted from a result set.

Calendar objects are fairly simple to create; in their basic form they're little more than a tally or sequence of dates, dressed up with useful columns such as the day of the week, the month of the year, or the quarter.

I use a table-valued function that accepts two integer parameters controlling how many years before and after the current date to generate. It returns the following columns (a minimal sketch of such a function follows the list):

  • Date `DATE` '2023-01-31'
  • DateEndTime `DATETIME` '2023-01-31 23:59:59.997' - The last representable moment of the day (DATETIME has a precision of roughly 3 milliseconds).
  • Year `INT` '2023'
  • Month `INT` '1'
  • Day `INT` '31'
  • Quarter `INT` '1'
  • WeekNumber `INT` '5' - The ISO week the date belongs to.
  • MonthName `NVARCHAR(20)` 'January' - The month name, per the database's language setting.
  • DayName `NVARCHAR(20)` 'Tuesday' - The name of the day, per the database's language setting.
  • WeekStartDate `DATE` '2023-01-29' - The date on which the week starts, based on the current DATEFIRST setting.
  • WeekEndDate `DATE` '2023-02-04' - The date on which the week ends, based on the current DATEFIRST setting.
  • MonthStartDate `DATE` '2023-01-01'
  • MonthEndDate `DATE` '2023-01-31'
  • QuarterStartDate `DATE` '2023-01-01'
  • QuarterEndDate `DATE` '2023-03-31'
  • YearStartDate `DATE` '2023-01-01'
  • YearEndDate `DATE` '2023-12-31'
  • WeekStartTime `DATETIME` '2023-01-29 00:00:00.000' - The date and time the week starts, based on the current DATEFIRST setting.
  • WeekEndTime `DATETIME` '2023-02-04 23:59:59.997' - The date and time the week ends, to the last millisecond, with a precision of 3 milliseconds.
  • MonthStartTime `DATETIME` '2023-01-01 00:00:00.000' - The date and time the month starts.
  • MonthEndTime `DATETIME` '2023-01-31 23:59:59.997' - The date and time the month ends, to the last millisecond, with a precision of 3 milliseconds.
  • QuarterStartTime `DATETIME` '2023-01-01 00:00:00.000' - The date and time the quarter starts.
  • QuarterEndTime `DATETIME` '2023-03-31 23:59:59.997' - The date and time the quarter ends, to the last millisecond, with a precision of 3 milliseconds.
  • IsWeekDay `BIT` '1' - True if the day is a week day (Monday-Friday).
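A minimal sketch of what such a function can look like is below, assuming SQL Server 2016 or later. The name dbo.CalendarSketch and the row-generation trick are illustrative only, and it produces just a handful of the columns above; the full version is linked at the end of this post.

CREATE OR ALTER FUNCTION dbo.CalendarSketch (@YearsBack INT, @YearsForward INT)
RETURNS TABLE
AS
RETURN
WITH n AS (
    -- One row per day from 1 Jan (current year - @YearsBack) to 31 Dec (current year + @YearsForward).
    SELECT TOP (DATEDIFF(DAY,
                         DATEFROMPARTS(YEAR(GETDATE()) - @YearsBack, 1, 1),
                         DATEFROMPARTS(YEAR(GETDATE()) + @YearsForward, 12, 31)) + 1)
           ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) - 1 AS i
      FROM sys.all_objects a CROSS JOIN sys.all_objects b   -- plenty of rows to count against
), d AS (
    SELECT DATEADD(DAY, n.i, DATEFROMPARTS(YEAR(GETDATE()) - @YearsBack, 1, 1)) AS [Date]
      FROM n
)
SELECT d.[Date],
       DATEADD(MILLISECOND, -3, DATEADD(DAY, 1, CAST(d.[Date] AS DATETIME))) AS DateEndTime,
       YEAR(d.[Date])              AS [Year],
       MONTH(d.[Date])             AS [Month],
       DAY(d.[Date])               AS [Day],
       DATEPART(QUARTER, d.[Date]) AS [Quarter],
       DATEPART(ISO_WEEK, d.[Date]) AS WeekNumber,
       DATENAME(MONTH, d.[Date])   AS [MonthName],
       DATENAME(WEEKDAY, d.[Date]) AS [DayName],
       -- Day 0 (1900-01-01) is a Monday, so remainders 5 and 6 are Saturday and Sunday.
       CAST(CASE WHEN DATEDIFF(DAY, 0, d.[Date]) % 7 IN (5, 6) THEN 0 ELSE 1 END AS BIT) AS IsWeekDay
  FROM d;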
Using the calendar table in a query makes date aggregation trivial. In this example I want to collect the number of invoices and their total amount for each day in the second half of May 2023.


SELECT c.Date,
       COUNT(DISTINCT i.InvoiceID) AS Invoices,
       COALESCE(SUM(ii.Quantity * (ii.Price - ii.Cost)), 0) AS InvoiceTotal
  FROM dbo.Calendar(0,0) c
    LEFT OUTER JOIN dbo.Invoices i
      ON i.InvoiceDateTimeUTC BETWEEN c.Date AND c.DateEndTime
    LEFT OUTER JOIN dbo.InvoiceItems ii
      ON i.InvoiceID = ii.InvoiceID
 WHERE c.Date BETWEEN '2023-05-15' AND '2023-05-31'
 GROUP BY c.Date
 ORDER BY c.Date;
Invoices between May 15th and May 31st

Date        Invoices  InvoiceTotal
2023-05-15       137       2792.10
2023-05-16       132       2513.80
2023-05-17       131       2476.60
2023-05-18       129       2611.40
2023-05-19       128       2570.10
2023-05-20       146       3062.80
2023-05-21       120       2454.50
2023-05-22       128       2483.00
2023-05-23        83       1634.40
2023-05-24         0          0.00
2023-05-25        40        717.50
2023-05-26         0          0.00
2023-05-27         0          0.00
2023-05-28         0          0.00
2023-05-29         0          0.00
2023-05-30         0          0.00
2023-05-31         0          0.00

The days which did not have any invoices are still included in the result set, with 0 invoices and a total of 0.

We can also easily use the calendar table to compare one period to another. Here we're comparing the invoice counts for several week numbers across two years and producing a delta:
;WITH ThisPeriod AS (
SELECT c.WeekNumber, c.WeekStartDate, c.Year, COUNT(i.InvoiceID) AS Invoices
  FROM dbo.Calendar(1,1) c
    LEFT OUTER JOIN Invoices i
	  ON i.InvoiceDateTimeUTC BETWEEN c.Date AND c.DateEndTime
 WHERE c.WeekNumber BETWEEN 20 AND 22
 GROUP BY c.WeekNumber, c.WeekStartDate, c.Year
)

SELECT tp.WeekNumber, tp.WeekStartDate, tp.Invoices,
       lp.WeekStartDate AS LastYearWeekStartDate, lp.Invoices AS LastYearInvoices,
       CAST(CAST(tp.Invoices AS DECIMAL(10,2)) / CAST(lp.Invoices AS DECIMAL(10,2)) AS DECIMAL(3,2)) - 1 AS DeltaPercent
  FROM ThisPeriod tp
    INNER JOIN ThisPeriod lp
	  ON tp.WeekNumber = lp.WeekNumber
	  AND tp.Year = lp.Year + 1
 WHERE tp.year = 2023;
Invoices in weeks 20, 21 and 22 for 2022 and 2023

WeekNumber  WeekStartDate  Invoices  LastYearWeekStartDate  LastYearInvoices  DeltaPercent
20          2023-05-14     935       2022-05-08             927               0.01
21          2023-05-21     371       2022-05-15             972              -0.62
22          2023-05-28     0         2022-05-22             937              -1.00

We can also use the calendar table to perform duration calculations. Perhaps we want to know the number of week days in January 2023:
SELECT COUNT(Date) AS Days
  FROM dbo.Calendar(1,1)
 WHERE Month = 1
   AND Year = 2023
   AND IsWeekDay = 1
Week days in January 2023

Days
22
Calendar tables are very useful to have on hand, particularly in reporting-type environments. If you'd like to implement the TVF used in these examples, you can find the dbo.Calendar.sql script in my Azure DevOps repo.



July 26, 2022

Using Encrypted columns with certificates in SQL Server

This article is going to walk through all of the steps required to secure columns in a SQL Server database. We’re going to create a basic table to hold the information, plus a certificate and a key to protect some of those columns, and grant the appropriate rights so that permissioned users can access the values.

The first thing we’re going to need is a database. For the sake of easy clean up, I’m going to create a brand new one for this. We’ll call it CaveOfWonders.
CREATE DATABASE CaveOfWonders;
GO
USE CaveOfWonders;
Magic. We know we’re going to be creating a certificate and a key, so let’s get those set up now. We’ll choose a password for the database master key; make a note of it for later use.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '4wholeNewWorld!';
GO
CREATE CERTIFICATE magicLamp WITH SUBJECT = 'To obfuscate Client Secrets';
GO
CREATE SYMMETRIC KEY aladdin WITH ALGORITHM = AES_256 ENCRYPTION BY CERTIFICATE magicLamp;
GO 
OPEN SYMMETRIC KEY aladdin DECRYPTION BY CERTIFICATE magicLamp;
We just need one more thing: a table to hold the information we’ll be storing. Let’s create it and put some data into it.
CREATE TABLE ClientSecrets (ToolName NVARCHAR(128), SecretName NVARCHAR(128), EncryptedSecret VARBINARY(128));
INSERT INTO ClientSecrets(ToolName, SecretName, EncryptedSecret) VALUES
('Key Vault', 'Client ID', EncryptByKey(Key_GUID('aladdin'), N'super secret ID' )),
('Key Vault', 'Client Secret', EncryptByKey(Key_GUID('aladdin'), N'super secret password' ));
And that’s it! We now have a table containing sensitive values which we can select and use in any other application that we can query SQL Server from, right?
USE CaveOfWonders;
SELECT *
 FROM dbo.ClientSecrets;


Not quite. We did not provide any instruction on how we should attempt to decrypt the values, so SQL Server just didn’t decrypt them. OK, so let’s have Aladdin use his lamp!
OPEN SYMMETRIC KEY aladdin DECRYPTION BY CERTIFICATE magicLamp; 
SELECT *
  FROM ClientSecrets;


Nope! We’re now using the key and certificate, but we need to provide explicit instructions to decrypt the values, as shown in the code below.
OPEN SYMMETRIC KEY aladdin DECRYPTION BY CERTIFICATE magicLamp; 
SELECT *, CONVERT(NVARCHAR(128), DecryptByKey(EncryptedSecret)) AS DecryptedSecret
  FROM ClientSecrets;


Excellent, we can now store, protect and retrieve secrets in this table. Any user who has been granted permission to use the certificate and the key can do the same to retrieve the secrets.
To grant the required permissions, run the following for a user who already has read access on the table:
GRANT CONTROL ON CERTIFICATE::magicLamp TO SecretReaderUser;
GRANT VIEW DEFINITION ON SYMMETRIC KEY::aladdin TO SecretReaderUser;
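Finally, it’s good practice to close the key once you’ve finished decrypting, so it isn’t left open for the rest of the session:

CLOSE SYMMETRIC KEY aladdin;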

Using Azure Key Vault and Blob Store to Automate SAS token rotation

To make use of Azure Blob Storage’s lifecycle management, you need to create your backups as block blobs, and to create block blobs you need to take the backup using a SAS token credential. Creating a SAS token is pretty straightforward. Let’s assume the blob store is called myblobstore.

A SAS credential is always named as the full path:

https://myblobstore.blob.core.windows.net/

Let’s also assume we have generated a SAS token for the blob store and its value is:

sv=2020-08-03&ss=bfqt&srt=sco&sp=rwdlacupitfx&se=2021-05-09T06:30:19Z&st=2022-04-17T22:30:19Z&spr=https&sig=w9wu0E5z93Yiq6oIkZSzfQIm%2B3uVB9mEWWnmdVd4e%2FI%3D

We can use these two pieces of sensitive information to create a SQL Server Credential using either the GUI or T-SQL:

CREATE CREDENTIAL [https://myblobstore.blob.core.windows.net/my-serverName] 
WITH IDENTITY = N'SHARED ACCESS SIGNATURE', 
SECRET = N'sv=2020-08-03&ss=bfqt&srt=sco&sp=rwdlacupitfx&se=2021-05-09T06:30:19Z&st=2022-04-17T22:30:19Z&spr=https&sig=w9wu0E5z93Yiq6oIkZSzfQIm%2B3uVB9mEWWnmdVd4e%2FI%3D'
GO

Notice that the identity is SHARED ACCESS SIGNATURE. This is always true for SAS credentials. The Credential name should have the name of the container you want to use appended to it. In this example we’re using a container called my-servername (which should be created ahead of time).

Great, we can now use this credential to back up to the container my-servername on myblobstore by creating a backup to URL:

BACKUP DATABASE [myDatabaseName] TO URL = 'https://myblobstore.blob.core.windows.net/my-serverName/mydatabaseName/FULL_backup_20220427_201112.bak'

Awesome, we can now have lifecycle management automatically move this backup around according to the rules we configure, move it between tiers, and ultimately delete it.

So, what’s so hard about that? Nothing really! The problem comes when you need to swap that SAS token out, either because it’s suspected of being compromised, or because you know the importance of rotating sensitive information periodically, you know your security team will come knocking if you don’t, and/or you’d rather not fail a security audit.

If you wanted to, you could just drop and recreate the credentials on all the servers using a SAS token for a blob store by hand, each time you retire a SAS token.

That might work well if you only have a couple of servers and a single blob store to worry about, and you consider that kind of toil job security. Usually, though, we want a way to automate this kind of task.

So, on to the magic!

We can store the SAS token and URI in Azure Key Vault. Chances are you won’t be changing the URI often, but if you wanted to point at a different blob store (for whatever reason) you could just alter the stored URI. The SAS token can be stored in a secret for a set duration, using activation and expiration dates.



But how do we retrieve this and make use of it from within SQL Server? We can use a SQL Agent job step to execute some PowerShell that calls the Azure APIs to collect the token. There are multiple steps to consider when doing this, and this article will walk through each of them.
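For illustration, a job step that runs the script via the PowerShell subsystem could be created with something like the following; the job name, step name and script path are hypothetical, and assume the PowerShell below has been saved to a file on the server.

-- Hypothetical example: attach a PowerShell step to an existing Agent job.
EXEC msdb.dbo.sp_add_jobstep
     @job_name          = N'Database Backups',                       -- assumed existing job
     @step_name         = N'Refresh SAS credential from Key Vault',
     @subsystem         = N'PowerShell',
     @command           = N'& "C:\Jobs\Refresh-SasCredential.ps1"',  -- assumed script location
     @on_success_action = 3;  -- 3 = go to the next step (the backup itself)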

The first step is going to be authenticating with the APIs and collecting a token.

To do so you’re going to need an Azure App Registration. You can create one of these from the Azure Portal: navigate to App Registrations and click New Registration. Make the appropriate selection from the Supported Account Types; you likely want Default Directory Only, but you should be sure, so check with your network engineer(s).

Once you have your registration, navigate to API permissions and grant it permission to access your key vault. Think about the permissions you grant with respect to the minimum you’ll actually need; if you intend to use the registration for other purposes later, you may want to grant additional or elevated permissions.

Under Certificates and Secrets you can then create a secret; consider how long you want to leave it usable. We will use the secret and the client ID from the overview page to authenticate with the API, and then use the token it returns to access the Key Vault APIs and retrieve the values needed to create a brand-new credential each time we run the job step.

# Assumes $tenantID, $client_id and $client_secret have already been populated.
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("Content-Type", "application/x-www-form-urlencoded")
$headers.Add("cache-control", "no-cache")
$body = @{
    grant_type    = "client_credentials"
    client_id     = "$client_id"
    client_secret = "$client_secret"
    resource      = "https://vault.azure.net"
}
$params = @{
    Method = "Post"
    Uri    = "https://login.microsoftonline.com/"+$tenantID+"/oauth2/token"
    Body   = $body
}
 
try{
    $auth = Invoke-RestMethod @params
    $auth.access_token
} catch {
    return [pscustomobject]@{statusCode = [int]$_.Exception.Response.StatusCode
                             returnString = $_.Exception.Response.StatusCode+""+$_.error.message}
}

This will reach out to the API and, if successful, return the token in $auth.access_token.

# List all versions of the secret, using the bearer token from the previous step.
$headers = $null
$params = $null
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("Content-Type", "application/x-www-form-urlencoded")
$headers.Add("cache-control", "no-cache")
$headers.Add("Authorization", "Bearer "+$auth.access_token)
$params = @{
    Method  = "Get"
    Uri     = "https://"+$keyvaultBaseURI+"/secrets/"+$secretName+"/versions?api-version="+$apiVersion+""
    Body    = $body
    Headers = $headers
}
 
$versionsJsonSorted = [ordered] @{}
try{
    $versions = Invoke-RestMethod @params
} catch {
    return [pscustomobject]@{statusCode = [int]$_.Exception.Response.StatusCode
                             returnString = $_.Exception.Response.StatusCode+""+$_.error.message}
}

Similarly, this PowerShell needs $auth.access_token passed in, along with the name of the secret ($secretName) and the base URL of the key vault ($keyvaultBaseURI).

It’s important to note at this point that the Key Vault API returns all of the versions of a secret, ordered alpha-numerically by the secret’s version id, which is a unique identifier (an interesting choice). Because of this you’ll need to order the results yourself in order to collect the correct one, but have no fear:

# Load the versions into a DataTable so they can be sorted sensibly.
# Assigning to $i simply suppresses the return values some of these .Add() calls would emit.
$i  = $versionDataTable = New-Object System.Data.DataTable
$i  = $versionDataTable.Columns.Add("id", "System.String")
$i  = $versionDataTable.Columns.Add("created", "System.DateTime")
$i  = $versionDataTable.Columns.Add("updated", "System.DateTime")
$i  = $versionDataTable.Columns.Add("exp", "System.DateTime")
$i  = $versionDataTable.Columns.Add("nbf", "System.DateTime")
$i  = $versionDataTable.Columns.Add("enabled", "System.String")
 
# Key Vault returns Unix timestamps, so convert from the epoch.
[DateTime] $bd = New-Object DateTime 1970, 1, 1, 0, 0, 0
 
foreach($a in $versions.value)
{
    # Keep only versions that are enabled and currently within their validity window.
    if ($a.attributes.enabled -eq "true" -and [DateTime]::UtcNow -gt $bd.AddSeconds($a.attributes.nbf) -and [DateTime]::UtcNow -lt $bd.AddSeconds($a.attributes.exp))
    {
        $i = $row = $versionDataTable.NewRow()
        $i = $row.id      = $a.id
        $i = $row.created = $bd.AddSeconds($a.attributes.created)
        $i = $row.updated = $bd.AddSeconds($a.attributes.updated)
        $i = $row.exp     = $bd.AddSeconds($a.attributes.exp)
        $i = $row.nbf     = $bd.AddSeconds($a.attributes.nbf)
        $i = $row.enabled = $a.attributes.enabled
        $i = $versionDataTable.Rows.Add($row)
    }
}
$dw = New-Object System.Data.DataView($versionDataTable)
$dw.Sort="enabled DESC, updated DESC"

This PowerShell creates a DataTable and brings all of the secret versions into it, then sorts them into an appropriate DataView. Now that we know which version of the secret we need, we can at last call for it and get the value:

# If no usable version was found, bail out with an error object.
if($dw.Count -lt 1)
{
    return [pscustomobject]@{statusCode = 500
                             returnString = "Secret not found"}
}
 
# Fetch the newest valid version of the secret by its id.
$headers = $null
$params = $null
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("Content-Type", "application/x-www-form-urlencoded")
$headers.Add("cache-control", "no-cache")
$headers.Add("Authorization", "Bearer "+$auth.access_token)
$params = @{
    Method  = "Get"
    Uri     = $dw[0].id+"?api-version="+$apiVersion+""
    Body    = $body
    Headers = $headers
}
$dw[0].id  # echo the version id we're about to fetch
 
try{
    $secret = Invoke-RestMethod @params
} catch {
    return [pscustomobject]@{statusCode = [int]$_.Exception.Response.StatusCode
                             returnString = $_.Exception.Response.StatusCode+">>"+$_.error.message}
}

Since we’re going to be doing this all twice, once for the URI and once again for the SAS token, it makes sense to wrap it all up into a function to make it easy to re-use. Here’s the whole function.

function getSecret {
param($secretName, $tenantID, $apiVersion, $keyvaultBaseURI, $toolsServer, $toolsDatabase)
 
$response = ""
$body = ""
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
 
$getCredentials = Invoke-Sqlcmd -Server $toolsServer -database $toolsDatabase -query "OPEN SYMMETRIC KEY settingsSecretsKey DECRYPTION BY CERTIFICATE settingsSecretsCertificate; SELECT secretName, CONVERT(NVARCHAR(MAX), DecryptByKey(EncryptedSecret)) AS credential FROM settings.clientSecrets WHERE toolName = 'Keyvault';";
foreach($row in $getCredentials)
{
if($row.secretName -eq "client_id") { $client_id = $row.credential }
if($row.secretName -eq "client_secret") { $client_secret = $row.credential }
}
 
$headers = $null
$params = $null
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("Content-Type", "application/x-www-form-urlencoded")
$headers.Add("cache-control", "no-cache")
$body = @{
grant_type="client_credentials"
client_id="$client_id"
client_secret="$client_secret"
resource="https://vault.azure.net"
}
$params = @{
Method = "Post"
Uri = "https://login.microsoftonline.com/"+$tenantID+"/oauth2/token"
Body = $body
}
 
try{
$auth = Invoke-RestMethod @params
# (don't emit $auth.access_token here; inside the function it would be mixed into the return value)
} catch {
return [pscustomobject]@{statusCode = [int]$_.Exception.Response.StatusCode
returnString = $_.Exception.Response.StatusCode+""+$_.error.message}
}
 
$headers = $null
$params = $null
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("Content-Type", "application/x-www-form-urlencoded")
$headers.Add("cache-control", "no-cache")
$headers.Add("Authorization", "Bearer "+$auth.access_token)
$params = @{
Method = "Get"
Uri = "https://"+$keyvaultBaseURI+"/secrets/"+$secretName+"/versions?api-version="+$apiVersion+""
Body = $body
Headers = $headers
}
 
$versionsJsonSorted = [ordered] @{}
try{
$versions = Invoke-RestMethod @params
} catch {
return [pscustomobject]@{statusCode = [int]$_.Exception.Response.StatusCode
returnString = $_.Exception.Response.StatusCode+""+$_.error.message}
}
 
$i  = $versionDataTable = New-Object System.Data.DataTable
$i  = $versionDataTable.Columns.Add("id", "System.String") 
$i  = $versionDataTable.Columns.Add("created", "System.DateTime") 
$i  = $versionDataTable.Columns.Add("updated", "System.DateTime") 
$i  = $versionDataTable.Columns.Add("exp", "System.DateTime")
$i  = $versionDataTable.Columns.Add("nbf", "System.DateTime")
$i  = $versionDataTable.Columns.Add("enabled", "System.String")
 
[DateTime] $bd = New-Object DateTime 1970, 1, 1, 0, 0, 0
 
foreach($a in $versions.value)
{
if ($a.attributes.enabled -eq "true" -and [DateTime]::UtcNow -gt $bd.AddSeconds($a.attributes.nbf) -and [DateTime]::UtcNow -lt $bd.AddSeconds($a.attributes.exp))
 {
  $i = $row = $versionDataTable.NewRow()
  $i = $row.id      = $a.id 
  $i = $row.created = $bd.AddSeconds($a.attributes.created)
  $i = $row.updated = $bd.AddSeconds($a.attributes.updated)
  $i = $row.exp     = $bd.AddSeconds($a.attributes.exp) 
  $i = $row.nbf     = $bd.AddSeconds($a.attributes.nbf)
  $i = $row.enabled = $a.attributes.enabled
  $i = $versionDataTable.Rows.Add($row)
 }
}
 
$dw = New-Object System.Data.DataView($versionDataTable)
$dw.Sort="enabled DESC, updated DESC"
 
if($dw.Count -lt 1)
{
return [pscustomobject]@{statusCode = 500
returnString = "Secret not found"}
}
 
$headers = $null
$params = $null
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("Content-Type", "application/x-www-form-urlencoded")
$headers.Add("cache-control", "no-cache")
$headers.Add("Authorization", "Bearer "+$auth.access_token)
$params = @{
Method = "Get"
Uri = $dw[0].id+"?api-version="+$apiVersion+""
Body = $body
Headers = $headers
}
# (again, avoid emitting $dw[0].id inside the function so the return value stays clean)
 
try{
$secret = Invoke-RestMethod @params
} catch {
return [pscustomobject]@{statusCode = [int]$_.Exception.Response.StatusCode
returnString = $_.Exception.Response.StatusCode+">>"+$_.error.message}
}
 
return [pscustomobject]@{statusCode = 200
returnString = $secret.value}
} 

Now we can call it using a little more PowerShell:

$sas = getSecret "MySecretSAS" "00000000-0000-0000-0000-000000000000" "2016-10-01" "mykeyvault.vault.azure.net" "my-toolsServer.domain.local" "MyToolsDatabase"
$uri = getSecret "MySecretURI" "00000000-0000-0000-0000-000000000000" "2016-10-01" "mykeyvault.vault.azure.net" "my-toolsServer.domain.local" "MyToolsDatabase"

This assumes the following:

  • MySecretSAS is the name of your secret holding the SAS token.
  • 00000000-0000-0000-0000-000000000000 is your Azure tenant ID, which you can find in the Azure Portal by navigating to the overview page in Azure Active Directory and inspecting the basic information section.
  • mykeyvault is the name of your Key Vault which hosts these secrets.
  • that you have a server at my-toolsServer.domain.local with a database called MyToolsDatabase, containing a table called settings.clientSecrets with the columns toolName, secretName and EncryptedSecret, the last of which is encrypted with a symmetric key called settingsSecretsKey that is protected by the certificate settingsSecretsCertificate.

That last point is a topic unto itself, which I’ll cover in detail in a future article. If you’d rather use another method to expose the client ID and client secret to the function, simply set the $client_id and $client_secret variables however you’d like.
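For context, here is a rough sketch of the objects that query expects, using the names it references; the column sizes are illustrative only.

-- Assumes a database master key already exists, as in the encryption walkthrough above.
CREATE SCHEMA settings;
GO
CREATE CERTIFICATE settingsSecretsCertificate WITH SUBJECT = 'Protects settings.clientSecrets';
CREATE SYMMETRIC KEY settingsSecretsKey
    WITH ALGORITHM = AES_256
    ENCRYPTION BY CERTIFICATE settingsSecretsCertificate;
GO
CREATE TABLE settings.clientSecrets (
    toolName        NVARCHAR(128),
    secretName      NVARCHAR(128),
    EncryptedSecret VARBINARY(256)  -- illustrative size
);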

Now that we have the secrets, we can use them to build some T-SQL:

if($sas.statusCode -eq 200 -and $uri.statusCode -eq 200)
{
    # Build dynamic T-SQL that drops the credential if it already exists and recreates it,
    # naming it <blob store URI><lower-cased server name> to match the backup URL.
    $query = "DECLARE @dsql NVARCHAR(MAX) = 'IF EXISTS (SELECT * FROM sys.credentials WHERE name = ''" + $uri.returnString + "' + LOWER(@@SERVERNAME) + ''') DROP CREDENTIAL [" + $uri.returnString + "' + LOWER(@@SERVERNAME) + ']
CREATE CREDENTIAL [" + $uri.returnString + "' + LOWER(@@SERVERNAME) + '] WITH IDENTITY = ''SHARED ACCESS SIGNATURE'', SECRET = ''" + $sas.returnString + "'';'
EXEC (@dsql);"
    $refreshCred = Invoke-Sqlcmd -Server $serverUrl -database "master" -query $query
} else {
    $sasMessage = $sas.returnString
    if($sas.statusCode -eq 200) { $sasMessage = "Obfuscated" }
    Throw "Unexpected error when refreshing SAS token:
SAS: "+$sas.statusCode +": " + $sasMessage +"
URI: "+$uri.statusCode +": " + $uri.returnString
}

This will drop the credential (if it exists) and recreate it on the server $serverUrl with the values from the Key Vault secrets.
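If you want to confirm the refresh took effect on a server, the credential and its last modified time are visible in sys.credentials:

SELECT name, credential_identity, create_date, modify_date
  FROM sys.credentials
 WHERE name LIKE 'https://%';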

Using this as a step before your backup job should ensure you always get the latest secrets from your Key Vault where the secrets are currently valid and enabled, and use them to refresh the credentials on any SQL Server you run the job on. It’s easy to configure different servers to use different blob stores and secrets. All you need to do is make sure you have available and valid SAS tokens in your vault for the current date, or have them created and stored ahead of time, if that is allowed under your organization’s security protocols.

It’s not strictly necessary to run the PowerShell as a SQL Agent step that is part of your backup process; it could be run by any other scheduler, such as Windows Scheduled Tasks, separately from the backup process. As long as the process is triggered to coincide with, or pre-empt, the use of a credential which may have expired, you will still benefit from the process.

And there you have it, SAS tokens are now yours to command, and automatically keep updated!

(This post has been migrated from a previous blog. It was originally posted July 26, 2022)