Calendar tables, or functions, give us the ability to measure and aggregate over time, and they solve a common problem: a day with no rows in the source data being omitted from the result set entirely.
Calendar objects are fairly simple to create. In their basic form they're little more than a tally or sequence represented as useful columns, such as the day of the week, the month of the year, or the quarter.
I use a table-valued function which accepts two integer parameters controlling the number of years before and after the current date. It returns the following columns (a simplified sketch of the function follows the list):
Date `DATE` '2023-01-31'
DateEndTime `DATETIME` '2023-01-31 23:59:59.997' - The last representable moment of the day (`DATETIME` is accurate to roughly 3 milliseconds).
Year `INT` '2023'
Month `INT` '1'
Day `INT` '31'
Quarter `INT` '1'
WeekNumber `INT` '5' - The ISO week the date belongs to.
MonthName `NVARCHAR(20)` 'January' - The month name, per the database's language setting.
DayName `NVARCHAR(20)` 'Tuesday' - The name of the day, per the database's language setting.
WeekStartDate `DATE` '2023-01-29' - The date on which the week starts, based on the current DATEFIRST setting.
WeekEndDate `DATE` '2023-02-04' - The date on which the week ends, based on the current DATEFIRST setting.
MonthStartDate `DATE` '2023-01-01'
MonthEndDate `DATE` '2023-01-31'
QuarterStartDate `DATE` '2023-01-01'
QuarterEndDate `DATE` '2023-03-31'
YearStartDate `DATE` '2023-01-01'
YearEndDate `DATE` '2023-12-31'
WeekStartTime `DATETIME` '2023-01-29 00:00:00.000' - The date and time the week starts, based on the current DATEFIRST setting.
WeekEndTime `DATETIME` '2023-02-04 23:59:59.997' - The date and time the week ends, to the last representable millisecond.
MonthStartTime `DATETIME` '2023-01-01 00:00:00.000' - The date and time the month starts.
MonthEndTime `DATETIME` '2023-01-31 23:59:59.997' - The date and time the month ends, to the last representable millisecond.
QuarterStartTime `DATETIME` '2023-01-01 00:00:00.000' - The date and time the quarter starts.
QuarterEndTime `DATETIME` '2023-03-31 23:59:59.997' - The date and time the quarter ends, to the last representable millisecond.
IsWeekDay `BIT` '1' - True if the day is a week day (Monday-Friday).
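A minimal sketch of what such a function can look like, abridged to a handful of the columns above (the full version lives in the repo linked at the end of this section):

-- Abridged sketch of the calendar TVF described above.
CREATE OR ALTER FUNCTION dbo.Calendar (@YearsBack INT, @YearsForward INT)
RETURNS TABLE
AS
RETURN
WITH Tally AS (
    -- One row per day from Jan 1, @YearsBack years ago, to Dec 31, @YearsForward years ahead.
    SELECT TOP (DATEDIFF(DAY,
               DATEFROMPARTS(YEAR(GETDATE()) - @YearsBack, 1, 1),
               DATEFROMPARTS(YEAR(GETDATE()) + @YearsForward, 12, 31)) + 1)
           ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) - 1 AS n
    FROM sys.all_objects a
    CROSS JOIN sys.all_objects b
), Dates AS (
    SELECT DATEADD(DAY, n, DATEFROMPARTS(YEAR(GETDATE()) - @YearsBack, 1, 1)) AS [Date]
    FROM Tally
)
SELECT [Date],
       DATEADD(MILLISECOND, -3, CAST(DATEADD(DAY, 1, [Date]) AS DATETIME)) AS DateEndTime,
       YEAR([Date]) AS [Year],
       MONTH([Date]) AS [Month],
       DAY([Date]) AS [Day],
       DATEPART(QUARTER, [Date]) AS [Quarter],
       DATEPART(ISO_WEEK, [Date]) AS WeekNumber,
       DATENAME(MONTH, [Date]) AS [MonthName],
       DATENAME(WEEKDAY, [Date]) AS [DayName],
       -- DATENAME follows the session language, so compare with care in non-English sessions.
       CASE WHEN DATENAME(WEEKDAY, [Date]) IN ('Saturday', 'Sunday') THEN 0 ELSE 1 END AS IsWeekDay
FROM Dates;

Because it's an inline TVF, the optimizer can expand it directly into the calling query.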
Using the calendar table in a query makes date aggregation trivial. In this example I wanted to collect the number and total amount of invoices for each day in the second half of May 2023.
SELECT c.Date,
       COUNT(DISTINCT i.InvoiceID) AS Invoices,
       COALESCE(SUM(ii.Quantity * (ii.Price - ii.Cost)), 0) AS InvoiceTotal
FROM dbo.Calendar(0,0) c
LEFT OUTER JOIN dbo.Invoices i
    ON i.InvoiceDateTimeUTC BETWEEN c.Date AND c.DateEndTime
LEFT OUTER JOIN dbo.InvoiceItems ii
    ON i.InvoiceID = ii.InvoiceID
WHERE c.Date BETWEEN '2023-05-15' AND '2023-05-31'
GROUP BY c.Date
ORDER BY c.Date;
Invoices between May 15 and May 31
| Date       | Invoices | InvoiceTotal |
|------------|----------|--------------|
| 2023-05-15 | 137      | 2792.10      |
| 2023-05-16 | 132      | 2513.80      |
| 2023-05-17 | 131      | 2476.60      |
| 2023-05-18 | 129      | 2611.40      |
| 2023-05-19 | 128      | 2570.10      |
| 2023-05-20 | 146      | 3062.80      |
| 2023-05-21 | 120      | 2454.50      |
| 2023-05-22 | 128      | 2483.00      |
| 2023-05-23 | 83       | 1634.40      |
| 2023-05-24 | 0        | 0.00         |
| 2023-05-25 | 40       | 717.50       |
| 2023-05-26 | 0        | 0.00         |
| 2023-05-27 | 0        | 0.00         |
| 2023-05-28 | 0        | 0.00         |
| 2023-05-29 | 0        | 0.00         |
| 2023-05-30 | 0        | 0.00         |
| 2023-05-31 | 0        | 0.00         |
The days which did not have any invoices are still included in the result set, with 0 invoices and a total of 0.
We can also easily use the calendar table to compare one period to another. Here we compare the invoice counts for several week numbers across two years and produce a delta:
;WITH PeriodInvoices AS (
    SELECT c.WeekNumber, c.WeekStartDate, c.Year, COUNT(i.InvoiceID) AS Invoices
    FROM dbo.Calendar(1,1) c
    LEFT OUTER JOIN dbo.Invoices i
        ON i.InvoiceDateTimeUTC BETWEEN c.Date AND c.DateEndTime
    WHERE c.WeekNumber BETWEEN 20 AND 22
    GROUP BY c.WeekNumber, c.WeekStartDate, c.Year
)
SELECT tp.WeekNumber,
       tp.WeekStartDate,
       tp.Invoices,
       lp.WeekStartDate AS PriorWeekStartDate,
       lp.Invoices AS PriorInvoices,
       CAST(CAST(tp.Invoices AS DECIMAL(10,2)) / NULLIF(CAST(lp.Invoices AS DECIMAL(10,2)), 0) AS DECIMAL(3,2)) - 1 AS DeltaPercent
FROM PeriodInvoices tp
INNER JOIN PeriodInvoices lp
    ON tp.WeekNumber = lp.WeekNumber
    AND tp.Year = lp.Year + 1
WHERE tp.Year = 2023;
Invoices in weeks 20, 21 and 22 for 2022 and 2023
| WeekNumber | WeekStartDate | Invoices | PriorWeekStartDate | PriorInvoices | DeltaPercent |
|------------|---------------|----------|--------------------|---------------|--------------|
| 20         | 2023-05-14    | 935      | 2022-05-08         | 927           | 0.01         |
| 21         | 2023-05-21    | 371      | 2022-05-15         | 972           | -0.62        |
| 22         | 2023-05-28    | 0        | 2022-05-22         | 937           | -1.00        |
We can also use the calendar table to perform duration calculations. Perhaps we want to know the number of week days in January 2023:
SELECT COUNT(Date) AS Days
FROM dbo.Calendar(1,1)
WHERE Month = 1
  AND Year = 2023
  AND IsWeekDay = 1;
Week days in January 2023

| Days |
|------|
| 22   |
Calendar tables are very useful to have on hand, particularly in reporting-type environments. If you'd like to implement the TVF used in these examples, you can find dbo.Calendar.sql in my Azure DevOps repo.
This article walks through the steps required to secure columns in a SQL Server database.
We’re going to create a basic table to hold the information, a certificate, and a key to protect some of those columns, then grant the appropriate rights so permissioned users can access the values.
The first thing we’re going to need is a database. For the sake of easy cleanup, I’m going to create a brand new one for this. We’ll call it CaveOfWonders.
CREATE DATABASE CaveOfWonders;
GO
USE CaveOfWonders;
GO
Magic. We know we’re going to be creating a certificate and a key, so let’s get those set up now. We’ll choose a master password for the key; be sure to make a note of it for later use.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '4wholeNewWorld!';
GO
CREATE CERTIFICATE magicLamp WITH SUBJECT = 'To obfuscate Client Secrets';
GO
CREATE SYMMETRIC KEY aladdin WITH ALGORITHM = AES_256 ENCRYPTION BY CERTIFICATE magicLamp;
GO
OPEN SYMMETRIC KEY aladdin DECRYPTION BY CERTIFICATE magicLamp;
We just need one more thing: a table to hold the information we’ll be storing. Let’s create it and put some data into it.
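The queries below reference a table called dbo.ClientSecrets with an EncryptedSecret column; here's a minimal sketch of it (the ClientName column and the sample rows are purely illustrative):

-- Minimal sketch: a table for the secrets, plus a couple of sample rows.
CREATE TABLE dbo.ClientSecrets (
    ClientSecretID INT IDENTITY(1,1) PRIMARY KEY,
    ClientName NVARCHAR(128) NOT NULL,        -- illustrative
    EncryptedSecret VARBINARY(256) NOT NULL   -- ciphertext from EncryptByKey
);
GO
-- The key must be open to encrypt values on the way in.
OPEN SYMMETRIC KEY aladdin DECRYPTION BY CERTIFICATE magicLamp;
INSERT INTO dbo.ClientSecrets (ClientName, EncryptedSecret)
VALUES (N'Agrabah Trading Co.', EncryptByKey(Key_GUID('aladdin'), N'open-sesame-123')),
       (N'Iago Imports', EncryptByKey(Key_GUID('aladdin'), N'polly-wants-a-secret'));
CLOSE SYMMETRIC KEY aladdin;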
And that’s it! We now have a table containing sensitive values which we can select and use from any application that can query SQL Server, right?
USE CaveOfWonders;
SELECT *
FROM dbo.ClientSecrets;
Not quite. We did not provide any instruction on how we should attempt to decrypt the values, so SQL Server just didn’t decrypt them. OK, so let’s have Aladdin use his lamp!
OPEN SYMMETRIC KEY aladdin DECRYPTION BY CERTIFICATE magicLamp;
SELECT *
FROM dbo.ClientSecrets;
Nope! We’re now using the key and certificate, but we need to provide explicit instructions to decrypt the values, as shown in the code below.
OPEN SYMMETRIC KEY aladdin DECRYPTION BY CERTIFICATE magicLamp;
SELECT *, CONVERT(NVARCHAR(128), DecryptByKey(EncryptedSecret)) AS DecryptedSecret
FROM dbo.ClientSecrets;
Excellent, we can now store, protect, and retrieve secrets in this table. Any user who has been granted the permissions to use the certificate and the key can retrieve the secrets.
To grant the required permissions to a user who already has read access on the table (opening the key requires VIEW DEFINITION on the key and CONTROL on the certificate):
GRANT CONTROL ON CERTIFICATE :: magicLamp TO SecretReaderUser;
GRANT VIEW DEFINITION ON SYMMETRIC KEY :: aladdin TO SecretReaderUser;
In order to make use of Azure Blob Storage's lifecycle management, you need to create your backups as block blobs, and to create block blobs you need to take the backup using a SAS token credential. Creating a SAS token is pretty straightforward. Let’s assume the blob store is called myblobstore.
A SAS credential is always named as the full path:
https://myblobstore.blob.core.windows.net/
Let’s also assume we have generated a SAS token for the blob store, and we create a credential with the token as its secret.
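A sketch of the credential creation (the SECRET value here is a placeholder; paste in the token you generated, without the leading '?'):

CREATE CREDENTIAL [https://myblobstore.blob.core.windows.net/my-servername]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sv=...&ss=...&sig=...';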
Notice that the identity is SHARED ACCESS SIGNATURE. This is always true for SAS credentials. The credential name should have the name of the container you want to use appended to it. In this example we’re using a container called my-servername (which should be created ahead of time).
Great, we can now use this credential to back up to the container my-servername on myblobstore by creating a backup to URL:
BACKUP DATABASE [myDatabaseName] TO URL = 'https://myblobstore.blob.core.windows.net/my-servername/myDatabaseName/FULL_backup_20220427_201112.bak';
Awesome, we can now have lifecycle management automatically move this backup around according to the rules we configure, move it between tiers, and ultimately delete it.
So, what’s so hard about that? Nothing, really! The problem comes when you need to swap that SAS token out: either because it’s suspected of being compromised, or because you know the importance of rotating sensitive information periodically, you know your security team will come knocking if you don’t, and/or you’d rather not fail a security audit.
If you wanted to, you could just drop and recreate the credentials by hand on every server that uses a SAS token for a blob store, each time you retire a token.
That might work if you only have a couple of servers and a single blob store to worry about, and you consider that kind of toil job security. Usually, though, we want a way to automate this kind of task.
So, on to the magic!
We can store the SAS token and URI in Azure Key Vault. Chances are you won’t be changing the URI often, but if you ever wanted to point at a different blob store you could just alter the stored URI. The SAS token can be stored in a secret for a set duration, using activation and expiration dates.
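As a sketch, storing the token with the Az PowerShell module's Set-AzKeyVaultSecret might look like this (the vault and secret names match the ones used later in this article; the token value is a placeholder):

# Store the SAS token as a Key Vault secret with activation and expiration dates.
$token = ConvertTo-SecureString 'sv=...&sig=...' -AsPlainText -Force
Set-AzKeyVaultSecret -VaultName 'mykeyvault' -Name 'MySecretSAS' `
    -SecretValue $token `
    -NotBefore (Get-Date) `
    -Expires (Get-Date).AddDays(30)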
But how do we retrieve this and make use of it from within SQL Server? We can use a SQL Agent job step to execute some PowerShell that calls the Azure APIs to collect the token. There are multiple steps to consider when doing this, but this article will walk through each of them.
The first step is going to be authenticating with the APIs and collecting a token.
To do so you’re going to need an Azure App Registration. You can create one of these from the Azure Portal: navigate to App Registrations and click New Registration. Make the appropriate selection from the Supported Account Types; you likely want Default Directory only, but check with your network engineer(s) to be sure.
Once you have your registration, navigate to the API permissions and grant it permission to access your key vault. Grant only the minimum permissions you'll need; if you intend to use the registration for other purposes later, you may want to grant additional or elevated permissions.
Under Certificates and Secrets you can then create a secret. Consider the duration you want to leave it usable. We will use the secret and the client id from the overview page to authenticate with the API, and then use the token returned by it to access the key vault APIs and retrieve the values needed to create a brand new credential each time we run the job step.
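Here's a sketch of that first authentication call, assuming $client_id and $client_secret already hold the values from the App Registration (how to store and retrieve those is discussed below):

# Authenticate using the client credentials flow; the returned token is used
# for all subsequent Key Vault calls as $auth.access_token.
$tenant_id = '00000000-0000-0000-0000-000000000000'
$body = @{
    grant_type    = 'client_credentials'
    client_id     = $client_id
    client_secret = $client_secret
    resource      = 'https://vault.azure.net'
}
$auth = Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenant_id/oauth2/token" `
    -Body $body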
Similarly, the next piece of PowerShell needs the $auth.access_token from above, the name of the secret, and the base URL of the key vault.
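A sketch of that call (the vault and secret names are explained below):

# Ask the Key Vault REST API for every version of the secret.
# Note: the response is paged for secrets with many versions; not handled here.
$headers  = @{ Authorization = "Bearer $($auth.access_token)" }
$vaultUrl = 'https://mykeyvault.vault.azure.net'
$versions = Invoke-RestMethod -Method Get -Headers $headers `
    -Uri "$vaultUrl/secrets/MySecretSAS/versions?api-version=7.3"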
It’s important to note at this point that the key vault API returns all of the versions of a secret, ordered alpha-numerically by the secret id (a unique identifier), which is an interesting choice. Because of this you’ll need to sort the results yourself in order to collect the correct one, but have no fear:
This PowerShell creates a DataTable and brings all of the secret versions into it, then sorts them into an appropriate DataView. Once we know which version of the secret we need, we can at last call for it and get the value:
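A sketch of both steps:

# Load every version into a DataTable so a DataView can filter and sort it.
$dt = New-Object System.Data.DataTable
[void]$dt.Columns.Add('id', [string])
[void]$dt.Columns.Add('created', [int64])
[void]$dt.Columns.Add('enabled', [bool])
foreach ($v in $versions.value) {
    [void]$dt.Rows.Add($v.id, $v.attributes.created, $v.attributes.enabled)
}

# Keep only enabled versions, newest first by the created timestamp.
$dv = New-Object System.Data.DataView($dt)
$dv.RowFilter = 'enabled = true'
$dv.Sort = 'created DESC'

# The id is the full URL of that secret version; request it to get the value.
$secret = Invoke-RestMethod -Method Get -Headers $headers `
    -Uri "$($dv[0].id)?api-version=7.3"
$secretValue = $secret.value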
Since we’re going to be doing this twice, once for the URI and once for the SAS token, it makes sense to wrap it all up into a function to make it easy to re-use. Here’s the whole function:
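This is a sketch assembled from the pieces above; the function name and parameter names are my own illustrative choices.

function Get-KeyVaultSecret {
    param (
        [Parameter(Mandatory)][string]$VaultName,
        [Parameter(Mandatory)][string]$SecretName,
        [Parameter(Mandatory)][string]$AccessToken
    )

    $headers = @{ Authorization = "Bearer $AccessToken" }
    $baseUrl = "https://$VaultName.vault.azure.net"

    # Collect every version of the secret.
    $versions = Invoke-RestMethod -Method Get -Headers $headers `
        -Uri "$baseUrl/secrets/$SecretName/versions?api-version=7.3"

    # Sort the versions so the newest enabled one comes first.
    $dt = New-Object System.Data.DataTable
    [void]$dt.Columns.Add('id', [string])
    [void]$dt.Columns.Add('created', [int64])
    [void]$dt.Columns.Add('enabled', [bool])
    foreach ($v in $versions.value) {
        [void]$dt.Rows.Add($v.id, $v.attributes.created, $v.attributes.enabled)
    }
    $dv = New-Object System.Data.DataView($dt)
    $dv.RowFilter = 'enabled = true'
    $dv.Sort = 'created DESC'

    # Fetch and return the value of that version.
    $secret = Invoke-RestMethod -Method Get -Headers $headers `
        -Uri "$($dv[0].id)?api-version=7.3"
    return $secret.value
}

Calling it twice gets us both values (MySecretURI is an illustrative name for the secret holding the URI):

$sas_uri   = Get-KeyVaultSecret -VaultName 'mykeyvault' -SecretName 'MySecretURI' -AccessToken $auth.access_token
$sas_token = Get-KeyVaultSecret -VaultName 'mykeyvault' -SecretName 'MySecretSAS' -AccessToken $auth.access_token

In this sketch: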
MySecretSAS is the name of your secret holding the SAS token.
00000000-0000-0000-0000-000000000000 is your Azure tenant ID, which you can find in the Azure Portal by navigating to the overview page in Azure Active Directory and inspecting the basic information section.
mykeyvault is the name of the Key Vault which hosts these secrets.
The example also assumes you have a server at my-toolsServer.domain.local with a database called MyToolsDatabase, containing a table called settings.clientSecrets with the columns secretName and EncryptedSecret, the latter encrypted using a certificate secured by a key called settingsSecretKey; that's where the client ID and client secret come from.
That latter point is a topic unto itself, which I’ll cover in detail in a future article. If you’d rather use another method to expose the client ID and client secret to the function, simply set the variables $client_id and $client_secret however you’d like.
Now that we have the secrets, we can use them to build some T-SQL:
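A sketch of that step, assuming the SqlServer module's Invoke-Sqlcmd is available (the container name is illustrative, appended as described earlier):

# Build the credential name from the stored URI plus the container name,
# then drop and recreate the credential with the fresh SAS token.
$serverUrl = "$($sas_uri)my-servername"
$tsql = @"
IF EXISTS (SELECT 1 FROM sys.credentials WHERE name = '$serverUrl')
    DROP CREDENTIAL [$serverUrl];
CREATE CREDENTIAL [$serverUrl]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '$sas_token';
"@
# Run it against the server whose credential you're refreshing.
Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'master' -Query $tsql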
This will drop the credential (if it exists) and recreate it with the values from the Key Vault secrets, using $serverUrl as the credential name.
Using this as a step before your backup job ensures you always get the latest valid, enabled secrets from your Key Vault and use them to refresh the credentials on any SQL Server you run the job on. It’s easy to configure different servers to use different blob stores and secrets. All you need to do is make sure valid SAS tokens exist in your vault for the current date, or have them created and stored ahead of time, if that’s allowed under your organization’s security protocols.
It’s not strictly necessary to run the PowerShell as a SQL Agent step within your backup process. It could be run by any other scheduler, such as Windows Task Scheduler, separately from the backups. As long as the process is triggered to coincide with, or pre-empt, the use of a credential which may have expired, you’ll still get the benefit.
And there you have it, SAS tokens are now yours to command, and automatically keep updated!
(This post has been migrated from a previous blog. It was originally posted July 26, 2022)