<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>David Barbarin &#187; Powershell</title>
	<atom:link href="https://blog.developpez.com/mikedavem/ptag/powershell-2/feed" rel="self" type="application/rss+xml" />
	<link>https://blog.developpez.com/mikedavem</link>
	<description>MVP DataPlatform - MCM SQL Server</description>
	<lastBuildDate>Thu, 09 Sep 2021 21:19:50 +0000</lastBuildDate>
	<language>fr-FR</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=4.1.42</generator>
	<item>
		<title>Extending SQL Server monitoring with Raspberry PI and Lametric</title>
		<link>https://blog.developpez.com/mikedavem/p13204/sql-server-2005/extending-sql-server-monitoring-with-raspberry-pi-and-lametric</link>
		<comments>https://blog.developpez.com/mikedavem/p13204/sql-server-2005/extending-sql-server-monitoring-with-raspberry-pi-and-lametric#comments</comments>
		<pubDate>Thu, 07 Jan 2021 21:59:25 +0000</pubDate>
		<dc:creator><![CDATA[mikedavem]]></dc:creator>
				<category><![CDATA[DevOps]]></category>
		<category><![CDATA[Docker]]></category>
		<category><![CDATA[K8s]]></category>
		<category><![CDATA[SQL Azure]]></category>
		<category><![CDATA[SQL Server 2005]]></category>
		<category><![CDATA[SQL Server 2008]]></category>
		<category><![CDATA[SQL Server 2008 R2]]></category>
		<category><![CDATA[SQL Server 2014]]></category>
		<category><![CDATA[SQL Server 2016]]></category>
		<category><![CDATA[SQL Server 2017]]></category>
		<category><![CDATA[SQL Server 2019]]></category>
		<category><![CDATA[Lametric]]></category>
		<category><![CDATA[monitoring]]></category>
		<category><![CDATA[Powershell]]></category>
		<category><![CDATA[Raspberry]]></category>
		<category><![CDATA[sqlserver]]></category>

		<guid isPermaLink="false">http://blog.developpez.com/mikedavem/?p=1742</guid>
		<description><![CDATA[First blog post of this new year 2021, and I will start with a fancy, How-To-Geek-style topic. In my last blog post, I discussed monitoring and how it should help to quickly address a situation that is going &#8230; <a href="https://blog.developpez.com/mikedavem/p13204/sql-server-2005/extending-sql-server-monitoring-with-raspberry-pi-and-lametric">Lire la suite <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p>First blog post of this new year 2021, and I will start with a fancy, How-To-Geek-style topic. </p>
<p>In my <a href="https://blog.developpez.com/mikedavem/p13203/sql-server-2014/why-we-moved-sql-server-monitoring-on-prometheus-and-grafana" rel="noopener" target="_blank">last blog post</a>, I discussed monitoring and how it should help to quickly address a degrading situation. Alerts are probably the first thing to catch your attention and, in my case, they often arrive as emails in a dedicated folder. That remains a good thing, at least as long as you are not absorbed too long by other daily tasks or projects. At the office, I know I would probably notice new alerts faster, but as I said previously, teleworking has definitely changed the game.  </p>
<p><span id="more-1742"></span></p>
<p>I wanted to find a way to address this concern, at least for the main critical SQL Server alerts, and I thought about relying on my existing home lab infrastructure to do it. The reason is that it is always a good opportunity to learn something and to improve my skills on a real-world scenario. </p>
<p>My home lab infrastructure includes a cluster of <a href="https://www.raspberrypi.org/products/raspberry-pi-4-model-b/" rel="noopener" target="_blank">Raspberry PI 4</a> nodes. I initially used it to improve my skills on K8s and to study some IoT topics. It is a good candidate for developing and deploying a new app that detects new incoming alerts in my mailbox and sends notifications to my Lametric accordingly. </p>
<p><a href="https://lametric.com/" rel="noopener" target="_blank">Lametric</a> is basically a connected clock, but it also works as a highly visible display showing notifications from devices or apps via REST APIs. The first time I saw such a device in action was at a DevOps meetup in 2018 about Docker and Jenkins deployments with <a href="https://www.linkedin.com/in/duquesnoyeric/" rel="noopener" target="_blank">Eric Duquesnoy</a> and Tim Izzo (<a href="https://twitter.com/5ika_" rel="noopener" target="_blank">@5ika_</a>). In addition, one of my previous customers also had one in his office and we had some discussions about cool customization through Lametric apps. </p>
<p>A VPN connection to my company network is mandatory to work from home, and unfortunately the Lametric device doesn’t support this scenario because its communication is limited to the local network only. So, I need an app that runs on my local (home) network and is able to connect to my mailbox, fetch new incoming emails and finally send notifications to my Lametric device. </p>
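<p>The core loop of such an app can be sketched as follows. This is a minimal, hypothetical sketch only: the EWS assembly path, endpoint, credential environment variables, subject convention and the Send-LametricNotification helper are illustrative placeholders, not the actual project code.</p>

```powershell
# Sketch of the polling loop: connect to the mailbox via the EWS Managed API,
# look for unread alert mails, notify the Lametric device, mark mails as read.
# All names below (DLL path, env vars, helper) are illustrative placeholders.
Add-Type -Path '/opt/ews/Microsoft.Exchange.WebServices.dll'

# Hypothetical convention: alert mails carry a recognizable subject prefix.
function Test-IsSqlAlert {
    param([string]$Subject)
    $Subject -match '^\[SQL ALERT\]'
}

$service = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService
$service.Credentials = New-Object Microsoft.Exchange.WebServices.Data.WebCredentials($env:MAIL_USER, $env:MAIL_PWD)
$service.Url = [Uri]'https://mail.example.com/EWS/Exchange.asmx'

# Only unread mails are of interest on each pass.
$unreadOnly = New-Object 'Microsoft.Exchange.WebServices.Data.SearchFilter+IsEqualTo' ([Microsoft.Exchange.WebServices.Data.EmailMessageSchema]::IsRead, $false)
$view = New-Object Microsoft.Exchange.WebServices.Data.ItemView(25)

while ($true) {
    $found = $service.FindItems([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Inbox, $unreadOnly, $view)
    foreach ($mail in @($found.Items) | Where-Object { Test-IsSqlAlert $_.Subject }) {
        Send-LametricNotification -Text $mail.Subject   # placeholder helper
        $mail.IsRead = $true
        $mail.Update([Microsoft.Exchange.WebServices.Data.ConflictResolutionMode]::AutoResolve)
    }
    Start-Sleep -Seconds 60
}
```

Marking processed mails as read keeps the loop idempotent: a restart of the pod will not replay old alerts.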
<p>Here is my setup:</p>
<p><a href="http://blog.developpez.com/mikedavem/files/2021/01/171-0-lametric_infra.jpg"><img src="http://blog.developpez.com/mikedavem/files/2021/01/171-0-lametric_infra-1024x711.jpg" alt="171 - 0 - lametric_infra" width="584" height="405" class="alignnone size-large wp-image-1743" /></a></p>
<p>There are plenty of good blog posts on the internet about building a Raspberry Pi cluster, and I would suggest reading <a href="https://dbafromthecold.com/2020/11/30/building-a-raspberry-pi-cluster-to-run-azure-sql-edge-on-kubernetes/" rel="noopener" target="_blank">the one</a> by Andrew Pruski (<a href="https://twitter.com/dbafromthecold" rel="noopener" target="_blank">@dbafromthecold</a>). </p>
<p>As shown above, SQL alerts follow different paths depending on our infrastructure (on-prem and Azure SQL databases), but all of them are sent to a dedicated DBA distribution list. </p>
<p>The app is a simple PowerShell script that relies on the Exchange Web Services (EWS) API to connect to the mailbox and fetch new emails. Sending notifications to my Lametric device is achieved by a simple REST API call with a well-formatted body. Details can be found in the <a href="https://lametric-documentation.readthedocs.io/en/latest/reference-docs/device-notifications.html" rel="noopener" target="_blank">Lametric documentation</a>. As a prerequisite, you need to create a notification app on the Lametric Developer site as follows:</p>
<p><a href="http://blog.developpez.com/mikedavem/files/2021/01/171-3-lametric-app-token.jpg"><img src="http://blog.developpez.com/mikedavem/files/2021/01/171-3-lametric-app-token-1024x364.jpg" alt="171 - 3 - lametric app token" width="584" height="208" class="alignnone size-large wp-image-1744" /></a></p>
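<p>With the device IP address and the API key obtained above, pushing a notification boils down to a single REST call. The sketch below is hedged: the device address and key are placeholders read from environment variables, and the frame format follows the Lametric device notifications documentation linked above.</p>

```powershell
# Sketch: push a text notification to the Lametric device on the local network.
# Per the Lametric device API, notifications are POSTed to
# /api/v2/device/notifications with HTTP basic auth (user "dev",
# password = the device API key). Device IP and key are placeholders.
function New-LametricBody {
    param([string]$Text, [string]$Icon = 'i555')   # 'i555' is an illustrative icon id
    @{ model = @{ frames = @(@{ icon = $Icon; text = $Text }) } } | ConvertTo-Json -Depth 5
}

$deviceIp = $env:LAMETRIC_IP     # e.g. '192.168.1.50' (placeholder)
$apiKey   = $env:LAMETRIC_KEY    # device API key from the Lametric app (placeholder)
$auth     = 'Basic ' + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("dev:$apiKey"))

# Only attempt the call when the device settings are provided.
if ($deviceIp -and $apiKey) {
    Invoke-RestMethod -Method Post `
        -Uri "http://$($deviceIp):8080/api/v2/device/notifications" `
        -Headers @{ Authorization = $auth } `
        -ContentType 'application/json' `
        -Body (New-LametricBody -Text 'SQL alert received')
}
```

In the real app, the environment variables would be populated from the K8s secret holding the token.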
<p>As said previously, I used PowerShell for this app. It makes it easier to find documentation and tutorials when it comes to Microsoft products. But if you are more comfortable with Python, the APIs are also available in a <a href="https://pypi.org/project/py-ews/" rel="noopener" target="_blank">dedicated package</a>. Let’s also point out that using PowerShell doesn’t necessarily mean using a Windows-based container: instead, I relied on a Linux-based image with PowerShell Core for the ARM architecture, provided by Microsoft on <a href="https://hub.docker.com/_/microsoft-powershell" rel="noopener" target="_blank">Docker Hub</a>. Finally, sensitive information like the Lametric token or the mailbox credentials is stored in a K8s secret for security reasons. My app project is available on my <a href="https://github.com/mikedavem/lametric" rel="noopener" target="_blank">GitHub</a>. Feel free to use it.</p>
<p>Here are some results:</p>
<p>&#8211; After deploying my pod:</p>
<p><a href="http://blog.developpez.com/mikedavem/files/2021/01/171-1-lametric-pod.jpg"><img src="http://blog.developpez.com/mikedavem/files/2021/01/171-1-lametric-pod.jpg" alt="171 - 1 - lametric pod" width="483" height="82" class="alignnone size-full wp-image-1745" /></a></p>
<p>&#8211; The app is running and checking for new incoming emails (kubectl logs command):</p>
<p><a href="http://blog.developpez.com/mikedavem/files/2021/01/171-2-lametric-pod-logs.jpg"><img src="http://blog.developpez.com/mikedavem/files/2021/01/171-2-lametric-pod-logs.jpg" alt="171 - 2 - lametric pod logs" width="828" height="438" class="alignnone size-full wp-image-1747" /></a></p>
<p>When an email is detected, a <a href="https://youtu.be/EcdSFziNc3U" title="Notification" rel="noopener" target="_blank">notification</a> is sent to the Lametric device accordingly.</p>
<p>A geeky, fun, good (bad?) idea to start this new year 2021 <img src="https://blog.developpez.com/mikedavem/wp-includes/images/smilies/icon_smile.gif" alt=":-)" class="wp-smiley" /></p>
]]></content:encoded>
			<wfw:commentRss></wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>AAD user creation on behalf AAD Service Principal with Azure SQL DB</title>
		<link>https://blog.developpez.com/mikedavem/p13197/sql-azure/aad-user-creation-on-behalf-aad-service-principal-with-azure-sql-db</link>
		<comments>https://blog.developpez.com/mikedavem/p13197/sql-azure/aad-user-creation-on-behalf-aad-service-principal-with-azure-sql-db#comments</comments>
		<pubDate>Sun, 02 Aug 2020 22:28:06 +0000</pubDate>
		<dc:creator><![CDATA[mikedavem]]></dc:creator>
				<category><![CDATA[DevOps]]></category>
		<category><![CDATA[PowerShell]]></category>
		<category><![CDATA[SQL Azure]]></category>
		<category><![CDATA[Authentication]]></category>
		<category><![CDATA[Azure Automation]]></category>
		<category><![CDATA[Azure SQL Database]]></category>
		<category><![CDATA[Azure SQL DB]]></category>
		<category><![CDATA[Powershell]]></category>
		<category><![CDATA[Runbook]]></category>
		<category><![CDATA[Security]]></category>
		<category><![CDATA[Service Principal]]></category>
		<category><![CDATA[SQL Server]]></category>
		<category><![CDATA[System managed identity]]></category>

		<guid isPermaLink="false">http://blog.developpez.com/mikedavem/?p=1643</guid>
		<description><![CDATA[An interesting improvement was announced by the SQL AAD team on Monday 27th July 2020: the support for Azure AD user creation on behalf of Azure AD applications for Azure SQL, as mentioned in this Microsoft blog post. &#8230; <a href="https://blog.developpez.com/mikedavem/p13197/sql-azure/aad-user-creation-on-behalf-aad-service-principal-with-azure-sql-db">Lire la suite <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p>An interesting improvement was announced by the SQL AAD team on Monday 27th July 2020: the support for Azure AD user creation on behalf of Azure AD applications for Azure SQL, as mentioned in this <a href="https://techcommunity.microsoft.com/t5/azure-sql-database/support-for-azure-ad-user-creation-on-behalf-of-azure-ad/ba-p/1491121" rel="noopener" target="_blank">Microsoft blog post</a>. </p>
<p><span id="more-1643"></span></p>
<p>In my company, this is something we had been looking for for a while for our database refresh process in Azure. Before talking about this new feature, let me share a brief history of the different approaches we went through for this DB refresh process over time. First, let’s note that a DB refresh usually includes at least two steps: restoring a backup / copying the database (you have both options in Azure SQL Database) and realigning the security context with the specific users of your target environment (ACC / INT &#8230;). But the latter is not as trivial as you may expect if you opted to use either a SQL login / user or a service principal to carry out this operation in your process. Indeed, in both cases creating an Azure AD user or group is not supported, and if you try you will face this error message:</p>
<blockquote><p>‘’ is not a valid login or you do not have permission. </p></blockquote>
<p>Everything done so far (either Azure Automation runbooks or on-prem PowerShell modules), described below, follows the same process:</p>
<p><a href="http://blog.developpez.com/mikedavem/files/2020/08/164-1-DB-Refresh-process-e1596406580306.jpg"><img src="http://blog.developpez.com/mikedavem/files/2020/08/164-1-DB-Refresh-process-e1596406580306.jpg" alt="164 - 1 - DB Refresh process" width="800" height="566" class="alignnone size-full wp-image-1645" /></a></p>
<p>First, we used Invoke-Sqlcmd in an Azure Automation runbook with a T-SQL query to create a copy of a source database on the target server. T-SQL is mandatory in this case, as <a href="https://docs.microsoft.com/en-us/azure/azure-sql/database/database-copy?tabs=azure-powershell" rel="noopener" target="_blank">documented</a> in the Microsoft BOL, because the PROD and ACC or INT servers are not in the same subscription. Here is a simplified code sample:</p>
<div class="codecolorer-container text default" style="overflow:auto;white-space:nowrap;border:1px solid #9F9F9F;width:650px;"><div class="text codecolorer" style="padding:5px;font:normal 12px/1.4em Monaco, Lucida Console, monospace;white-space:nowrap">...<br />
$CopyDBCMD = @{<br />
&nbsp; &nbsp; 'Database' = 'master'<br />
&nbsp; &nbsp; 'ServerInstance' = $TargetServerName<br />
&nbsp; &nbsp; 'Username' = $SQLUser<br />
&nbsp; &nbsp; 'Password' = $SQLPWD<br />
&nbsp; &nbsp; 'Query' = 'CREATE DATABASE '+ '[' + $DatabaseName + '] ' + 'AS COPY OF ' + '[' + $SourceServerName + '].[' + $DatabaseName + ']'<br />
} <br />
<br />
Invoke-Sqlcmd @CopyDBCMD <br />
...</div></div>
<p>But as you likely know, Invoke-Sqlcmd doesn’t support AAD authentication, and because SQL login authentication was the only option here, it left us with an annoying issue in the security configuration step with AAD users or groups, as you may imagine. </p>
<p>Then, because our authentication is mainly based on a trust architecture and our security rules require using it, including for apps with managed identities or service principals, we also wanted to introduce this concept into our database refresh process. Fortunately, service principals have been supported with <a href="https://techcommunity.microsoft.com/t5/azure-sql-database/token-based-authentication-support-for-azure-sql-db-using-azure/ba-p/386091" rel="noopener" target="_blank">Azure SQL DB since v12</a> through access-token authentication with ADALSQL. The corresponding DLL is required on your server; if, like us, you use it from Azure Automation, you can add the ADAL.PS module, but be aware it is now deprecated, and I strongly advise you to invest in moving to MSAL. Here is a sample we used:</p>
<div class="codecolorer-container text default" style="overflow:auto;white-space:nowrap;border:1px solid #9F9F9F;width:650px;height:450px;"><div class="text codecolorer" style="padding:5px;font:normal 12px/1.4em Monaco, Lucida Console, monospace;white-space:nowrap">...<br />
$response = Get-ADALToken `<br />
&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; -ClientId $clientId `<br />
&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; -ClientSecret $clientSecret `<br />
&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; -Resource $resourceUri `<br />
&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; -Authority $authorityUri `<br />
&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; -TenantId $tenantName<br />
<br />
...<br />
<br />
$connectionString = &quot;Server=tcp:$SqlInstanceFQDN,1433;Initial Catalog=master;Persist Security Info=False;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;&quot;<br />
# Create the connection object<br />
$connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)<br />
# Set AAD generated token to SQL connection token<br />
$connection.AccessToken = $response.AccessToken<br />
<br />
Try {<br />
&nbsp; &nbsp; $connection.Open()<br />
&nbsp; &nbsp; ...<br />
} <br />
...</div></div>
<p>But again, even if the copy or restore step is well managed, we were still stuck with the security reconfiguration, because until now service principals were not supported for creating AAD users or groups &#8230;</p>
<p>In the meantime, we found an interesting temporary solution based on the <a href="https://dbatools.io/" rel="noopener" target="_blank">dbatools framework</a> and the <a href="https://docs.dbatools.io/#Invoke-DbaQuery" rel="noopener" target="_blank">Invoke-DbaQuery command</a>, which supports AAD authentication (login + password). As we could not rely on a service principal in this case, using a dedicated AAD account was an acceptable tradeoff to manage all the database refresh steps. But this approach comes with some disadvantages, because running Invoke-DbaQuery in a fully Azure Automation mode is not possible with the missing ADALsql.dll. A workaround may be to use a hybrid worker, but we didn’t want to add complexity to our current architecture only for this special case. Instead, we decided to move the logic of the Azure Automation runbook into our on-prem PowerShell framework, which already includes DB refresh logic for on-prem SQL Server instances. </p>
<p>Here is a simplified sample of the code we are using:</p>
<div class="codecolorer-container text default" style="overflow:auto;white-space:nowrap;border:1px solid #9F9F9F;width:650px;height:450px;"><div class="text codecolorer" style="padding:5px;font:normal 12px/1.4em Monaco, Lucida Console, monospace;white-space:nowrap">...<br />
Try {<br />
&nbsp; &nbsp; # Connect to get access to Key Vault info<br />
&nbsp; &nbsp; Connect-AzAccount | Out-Null<br />
<br />
&nbsp; &nbsp; [String]$user = (Get-AzKeyVaultSecret -VaultName $KeyvaultName -Name &quot;AZSQL-SQLBCKUSER&quot;).SecretValueText<br />
&nbsp; &nbsp; [System.Security.SecureString]$pwd = &nbsp;ConvertTo-SecureString (Get-AzKeyVaultSecret -VaultName $KeyvaultName -Name &quot;AZSQL-SQLBCKPWD&quot;).SecretValueText -AsPlainText -Force<br />
&nbsp; &nbsp; [String]$SourceServerName = (Get-AzKeyVaultSecret -VaultName $KeyvaultName -Name &quot;AZSQL-NAME&quot;).SecretValueText<br />
&nbsp; &nbsp; [String]$TargetServerName = (Get-AzKeyVaultSecret -VaultName $KeyvaultName -Name &quot;AZSQL-TARGETNAME&quot;).SecretValueText + '.database.windows.net'<br />
<br />
&nbsp; &nbsp; # DB Restore will be performed in the context of dedicated AAD account <br />
&nbsp; &nbsp; $pscredential = New-Object -TypeName System.Management.Automation.PSCredential($user, $pwd)<br />
<br />
&nbsp; &nbsp; Write-Host &quot;Restoring DB:$DatabaseName from Source Server: $SourceServerName to Target Server: $TargetServerName&quot;<br />
&nbsp; &nbsp; <br />
&nbsp; &nbsp; $Query = &quot;CREATE DATABASE [$DatabaseName] AS COPY OF [$SourceServerName].[$DatabaseName]&quot;<br />
&nbsp; &nbsp; Invoke-DbaQuery `<br />
&nbsp; &nbsp; &nbsp; &nbsp; -SqlInstance $TargetServerName `<br />
&nbsp; &nbsp; &nbsp; &nbsp; -Database master `<br />
&nbsp; &nbsp; &nbsp; &nbsp; -SqlCredential $pscredential `<br />
&nbsp; &nbsp; &nbsp; &nbsp; -Query $Query `<br />
&nbsp; &nbsp; &nbsp; &nbsp; -EnableException <br />
<br />
&nbsp; &nbsp; # Wait for DB online and ready ... <br />
&nbsp; &nbsp; # Code should be implemented for this check <br />
<br />
<br />
&nbsp; &nbsp; Write-Output &quot;Applying security configuration to DB: $DatabaseName on Server:$TargetServerName&quot;<br />
<br />
&nbsp; &nbsp; $Query = &quot;<br />
&nbsp; &nbsp; &nbsp; &nbsp; DROP USER [az_sql_ro];CREATE USER [az_sql_ro] FROM EXTERNAL PROVIDER;<br />
&nbsp; &nbsp; &quot;<br />
&nbsp; &nbsp; Invoke-DbaQuery `<br />
&nbsp; &nbsp; &nbsp; &nbsp; -SqlInstance $TargetServerName `<br />
&nbsp; &nbsp; &nbsp; &nbsp; -Database $DatabaseName `<br />
&nbsp; &nbsp; &nbsp; &nbsp; -SqlCredential $pscredential `<br />
&nbsp; &nbsp; &nbsp; &nbsp; -Query $Query `<br />
&nbsp; &nbsp; &nbsp; &nbsp; -EnableException<br />
<br />
}<br />
Catch {<br />
&nbsp; &nbsp; Write-Host &quot;Error encountered: $($_.Exception.Message)&quot;<br />
} <br />
...</div></div>
<p>Referring to the PowerShell code above, in the second step we create the AAD group [az_sql_ro] on behalf of the dedicated AAD account with the FROM EXTERNAL PROVIDER clause. </p>
<p>Finally, with the latest news published by the SQL AAD team, we will likely consider going back to a service principal instead of a dedicated AAD account. <a href="https://techcommunity.microsoft.com/t5/azure-sql-database/support-for-azure-ad-user-creation-on-behalf-of-azure-ad/ba-p/1491121" rel="noopener" target="_blank">This Microsoft blog post</a> explains in detail how it works and what you have to set up to make it work correctly. I don’t want to duplicate what is already explained, so I will just apply the new feature to my context. </p>
<p>Referring to the above blog post, you first need to set up a server identity for your Azure SQL server as below:</p>
<div class="codecolorer-container text default" style="overflow:auto;white-space:nowrap;border:1px solid #9F9F9F;width:650px;"><div class="text codecolorer" style="padding:5px;font:normal 12px/1.4em Monaco, Lucida Console, monospace;white-space:nowrap">Set-AzSqlServer `<br />
&nbsp; &nbsp; -ResourceGroupName sandox-rg `<br />
&nbsp; &nbsp; -ServerName a-s-sql02 `<br />
&nbsp; &nbsp; -AssignIdentity<br />
<br />
# Check server identity<br />
Get-AzSqlServer `<br />
&nbsp; &nbsp; -ResourceGroupName sandox-rg `<br />
&nbsp; &nbsp; -ServerName a-s-sql02 | `<br />
&nbsp; &nbsp; Select-Object ServerName, Identity</div></div>
<div class="codecolorer-container text default" style="overflow:auto;white-space:nowrap;border:1px solid #9F9F9F;width:650px;"><div class="text codecolorer" style="padding:5px;font:normal 12px/1.4em Monaco, Lucida Console, monospace;white-space:nowrap">ServerName Identity &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;<br />
---------- -------- &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;<br />
a-s-sql02 &nbsp;Microsoft.Azure.Management.Sql.Models.ResourceIdentity</div></div>
<p>Let&rsquo;s have a look at the server identity</p>
<div class="codecolorer-container text default" style="overflow:auto;white-space:nowrap;border:1px solid #9F9F9F;width:650px;"><div class="text codecolorer" style="padding:5px;font:normal 12px/1.4em Monaco, Lucida Console, monospace;white-space:nowrap"># Get identity details<br />
$identity = Get-AzSqlServer `<br />
&nbsp; &nbsp; &nbsp; &nbsp; -ResourceGroupName sandox-rg `<br />
&nbsp; &nbsp; &nbsp; &nbsp; -ServerName a-s-sql02<br />
<br />
$identity.identity</div></div>
<div class="codecolorer-container text default" style="overflow:auto;white-space:nowrap;border:1px solid #9F9F9F;width:650px;"><div class="text codecolorer" style="padding:5px;font:normal 12px/1.4em Monaco, Lucida Console, monospace;white-space:nowrap">PrincipalId &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;Type &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; TenantId &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;<br />
----------- &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;---- &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; -------- &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;<br />
7f0d16f7-b172-4c97-94d3-34f0f7ed93cf SystemAssigned 2fcd19a7-ab24-4aef-802b-6851ef5d1ed5</div></div>
<p>In fact, assigning a server identity means creating a system-assigned managed identity in the Azure AD tenant trusted by the subscription of the instance. To keep things simple, let’s say that a system-assigned managed identity in Azure is similar to a Managed Service Account or Group Managed Service Account on-prem: those identities are self-managed by the system. Then you need to grant this identity the Azure AD &#8220;Directory Readers&#8221; permission so it gets the rights to create AAD users or groups. A PowerShell script is provided by Microsoft <a href="https://docs.microsoft.com/en-us/azure/azure-sql/database/authentication-aad-service-principal-tutorial" rel="noopener" target="_blank">here</a>; below is a sample of code I applied in my context for testing:</p>
<div class="codecolorer-container text default" style="overflow:auto;white-space:nowrap;border:1px solid #9F9F9F;width:650px;height:450px;"><div class="text codecolorer" style="padding:5px;font:normal 12px/1.4em Monaco, Lucida Console, monospace;white-space:nowrap">...<br />
Try {<br />
&nbsp; &nbsp; $DatabaseName = &quot;test-DBA&quot; &nbsp; <br />
&nbsp; &nbsp; &nbsp; <br />
&nbsp; &nbsp; # Connect to get access to Key Vault info<br />
&nbsp; &nbsp; Connect-AzAccount | Out-Null<br />
<br />
&nbsp; &nbsp; [String]$user = (Get-AzKeyVaultSecret -VaultName $KeyvaultName -Name &quot;AZSQL-SQLBCKAPPID&quot;).SecretValueText<br />
&nbsp; &nbsp; [System.Security.SecureString]$pwd = &nbsp;ConvertTo-SecureString (Get-AzKeyVaultSecret -VaultName $KeyvaultName -Name &quot;AZSQL-SQLBCKAPPSECRET&quot;).SecretValueText -AsPlainText -Force<br />
&nbsp; &nbsp; [String]$SourceServerName = (Get-AzKeyVaultSecret -VaultName $KeyvaultName -Name &quot;AZSQL-NAME&quot;).SecretValueText<br />
&nbsp; &nbsp; [String]$TargetServerName = (Get-AzKeyVaultSecret -VaultName $KeyvaultName -Name &quot;AZSQL-TARGETNAME&quot;).SecretValueText + '.database.windows.net'<br />
<br />
&nbsp; &nbsp; # DB Restore will be performed in the context of dedicated AAD account <br />
&nbsp; &nbsp; $pscredential = New-Object -TypeName System.Management.Automation.PSCredential($user, $pwd)<br />
<br />
&nbsp; &nbsp; $adalPath &nbsp;= &quot;${env:ProgramFiles}\WindowsPowerShell\Modules\Az.Profile.7.0\PreloadAssemblies&quot;<br />
&nbsp; &nbsp; # The ADAL assemblies are preloaded with the Az.Profile module (Install-Module -Name Az.Profile)<br />
&nbsp; &nbsp; $adal &nbsp; &nbsp; &nbsp;= &quot;$adalPath\Microsoft.IdentityModel.Clients.ActiveDirectory.dll&quot;<br />
&nbsp; &nbsp; $adalforms = &quot;$adalPath\Microsoft.IdentityModel.Clients.ActiveDirectory.WindowsForms.dll&quot;<br />
&nbsp; &nbsp; [System.Reflection.Assembly]::LoadFrom($adal) | Out-Null<br />
&nbsp; &nbsp; $resourceAppIdURI = 'https://database.windows.net/'<br />
<br />
&nbsp; &nbsp; # Set Authority to Azure AD Tenant<br />
&nbsp; &nbsp; $authority = 'https://login.windows.net/' + $tenantId<br />
<br />
&nbsp; &nbsp; $ClientCred = [Microsoft.IdentityModel.Clients.ActiveDirectory.ClientCredential]::new($clientId, $clientSecret)<br />
&nbsp; &nbsp; $authContext = [Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext]::new($authority)<br />
&nbsp; &nbsp; $authResult = $authContext.AcquireTokenAsync($resourceAppIdURI,$ClientCred)<br />
&nbsp; &nbsp; $Tok = $authResult.Result.CreateAuthorizationHeader()<br />
&nbsp; &nbsp; $Tok=$Tok.Replace(&quot;Bearer &quot;,&quot;&quot;)<br />
&nbsp; &nbsp; <br />
&nbsp; &nbsp; Write-host &quot;Token generated is ...&quot;<br />
&nbsp; &nbsp; $Tok<br />
&nbsp; &nbsp; Write-host &nbsp;&quot;&quot;<br />
<br />
&nbsp; &nbsp; Write-Host &quot;Create SQL connectionstring&quot;<br />
&nbsp; &nbsp; $conn = New-Object System.Data.SqlClient.SQLConnection <br />
&nbsp; &nbsp; <br />
&nbsp; &nbsp; $conn.ConnectionString = &quot;Data Source=$TargetServerName;Initial Catalog=master;Connect Timeout=30&quot;<br />
&nbsp; &nbsp; $conn.AccessToken = $Tok<br />
<br />
&nbsp; &nbsp; Write-host &quot;Connect to database and execute SQL script&quot;<br />
&nbsp; &nbsp; $conn.Open() <br />
<br />
&nbsp; &nbsp; Write-Host &quot;Check connected user ...&quot;<br />
&nbsp; &nbsp; $Query = &quot;SELECT USER_NAME() AS [user_name];&quot;<br />
&nbsp; &nbsp; $command = New-Object -TypeName System.Data.SqlClient.SqlCommand($Query, $conn)<br />
&nbsp; &nbsp; $Command.ExecuteScalar()<br />
&nbsp; &nbsp; $conn.Close()<br />
<br />
&nbsp; &nbsp; Write-Host &quot;Restoring DB:$DatabaseName from Source Server: $SourceServerName to Target Server: $TargetServerName&quot;<br />
<br />
&nbsp; &nbsp; $conn.ConnectionString = &quot;Data Source=$TargetServerName;Initial Catalog=master;Connect Timeout=30&quot;<br />
&nbsp; &nbsp; $conn.AccessToken = $Tok<br />
&nbsp; &nbsp; $conn.Open()<br />
&nbsp; &nbsp; $Query = &quot;DROP DATABASE IF EXISTS [$DatabaseName]; CREATE DATABASE [$DatabaseName] AS COPY OF [$SourceServerName].[$DatabaseName]&quot;<br />
&nbsp; &nbsp; $command = New-Object -TypeName System.Data.SqlClient.SqlCommand($Query, $conn)<br />
&nbsp; &nbsp; $command.CommandTimeout = 1200<br />
&nbsp; &nbsp; $command.ExecuteNonQuery()<br />
&nbsp; &nbsp; $conn.Close()<br />
<br />
&nbsp; &nbsp; # Wait for DB online and ready ... <br />
&nbsp; &nbsp; # Code should be implemented for this check <br />
<br />
&nbsp; &nbsp; <br />
&nbsp; &nbsp; Write-Output &quot;Applying security configuration to DB: $DatabaseName on Server:$TargetServerName&quot;<br />
<br />
&nbsp; &nbsp; $conn.ConnectionString = &quot;Data Source=$TargetServerName;Initial Catalog=$DatabaseName;Connect Timeout=30&quot;<br />
&nbsp; &nbsp; $conn.AccessToken = $Tok<br />
&nbsp; &nbsp; $conn.Open() <br />
&nbsp; &nbsp; $Query = 'CREATE USER [az_sql_ro] FROM EXTERNAL PROVIDER;'<br />
&nbsp; &nbsp; $command = New-Object -TypeName System.Data.SqlClient.SqlCommand($Query, $conn) &nbsp; &nbsp; &nbsp; <br />
&nbsp; &nbsp; $command.ExecuteNonQuery()<br />
&nbsp; &nbsp; $conn.Close()<br />
<br />
}<br />
Catch {<br />
&nbsp; &nbsp; Write-Output &quot;Error encountered: $($_.Exception.Message)&quot;<br />
} <br />
...</div></div>
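<p>For reference, granting the Directory Readers role to the server identity can be sketched with the AzureAD module as below. This is a hedged sketch following the Microsoft tutorial linked above; the server name matches my examples, and an Azure AD admin session is assumed.</p>

```powershell
# Sketch: add the SQL server's system-assigned identity to the
# "Directory Readers" Azure AD role (requires an Azure AD admin session;
# the server name is illustrative).
Connect-AzureAD

# The service principal of a system-assigned identity bears the server name.
$identity = Get-AzureADServicePrincipal -SearchString 'a-s-sql02'

$role = Get-AzureADDirectoryRole | Where-Object DisplayName -eq 'Directory Readers'
if (-not $role) {
    # The role must be activated from its template the first time it is used.
    $template = Get-AzureADDirectoryRoleTemplate | Where-Object DisplayName -eq 'Directory Readers'
    Enable-AzureADDirectoryRole -RoleTemplateId $template.ObjectId | Out-Null
    $role = Get-AzureADDirectoryRole | Where-Object DisplayName -eq 'Directory Readers'
}

Add-AzureADDirectoryRoleMember -ObjectId $role.ObjectId -RefObjectId $identity.ObjectId
```

Once the role membership is in place, the CREATE USER ... FROM EXTERNAL PROVIDER statement executed under the service principal token succeeds.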
<p>Using a service principal required a few changes in my case. I now get the service principal credentials (client ID and secret) from Azure Key Vault instead of those of the dedicated AAD account used in the previous example. I also changed the way I connect to SQL Server, relying on ADALSQL to get the access token instead of using dbatools commands. Indeed, as far as I know, dbatools doesn’t support this authentication method (yet?). </p>
<p>The authentication process becomes as follows:</p>
<p><a href="http://blog.developpez.com/mikedavem/files/2020/08/164-3-new-auth-process-e1596407082747.jpg"><img src="http://blog.developpez.com/mikedavem/files/2020/08/164-3-new-auth-process-e1596407082747.jpg" alt="164 - 3 - new auth process" width="800" height="610" class="alignnone size-full wp-image-1647" /></a></p>
<p>My first test looks conclusive:</p>
<p><a href="http://blog.developpez.com/mikedavem/files/2020/08/164-4-test-with-SP-e1596407153885.jpg"><img src="http://blog.developpez.com/mikedavem/files/2020/08/164-4-test-with-SP-e1596407153885.jpg" alt="164 - 4 - test with SP" width="800" height="301" class="alignnone size-full wp-image-1648" /></a></p>
<p>This improvement looks promising and may cover scenarios broader than the one I described in this blog post. The feature is in preview at the time of this write-up, and I hope to see it reach GA soon, as well as potential support in my preferred PowerShell framework, dbatools <img src="https://blog.developpez.com/mikedavem/wp-includes/images/smilies/icon_smile.gif" alt=":)" class="wp-smiley" /></p>
<p>See you!</p>
]]></content:encoded>
			<wfw:commentRss></wfw:commentRss>
		<slash:comments>2</slash:comments>
		</item>
		<item>
		<title>dbachecks and AlwaysOn availability group checks</title>
		<link>https://blog.developpez.com/mikedavem/p13194/sql-server-2012/dbachecks-and-alwayson-availability-group-checks</link>
		<comments>https://blog.developpez.com/mikedavem/p13194/sql-server-2012/dbachecks-and-alwayson-availability-group-checks#comments</comments>
		<pubDate>Mon, 20 Apr 2020 19:57:31 +0000</pubDate>
		<dc:creator><![CDATA[mikedavem]]></dc:creator>
				<category><![CDATA[DevOps]]></category>
		<category><![CDATA[SQL Server 2012]]></category>
		<category><![CDATA[SQL Server 2014]]></category>
		<category><![CDATA[SQL Server 2016]]></category>
		<category><![CDATA[SQL Server 2017]]></category>
		<category><![CDATA[SQL Server 2019]]></category>
		<category><![CDATA[automation]]></category>
		<category><![CDATA[dbachecks]]></category>
		<category><![CDATA[dbatools]]></category>
		<category><![CDATA[monitoring]]></category>
		<category><![CDATA[open source]]></category>
		<category><![CDATA[Powershell]]></category>
		<category><![CDATA[sqlserver]]></category>

		<guid isPermaLink="false">http://blog.developpez.com/mikedavem/?p=1591</guid>
		<description><![CDATA[When I started my DBA position in my new company, I was looking for a tool that was able to check periodically the SQL Server database environments for several reasons. First, as DBA one of my main concern is about &#8230; <a href="https://blog.developpez.com/mikedavem/p13194/sql-server-2012/dbachecks-and-alwayson-availability-group-checks">Lire la suite <span class="meta-nav">&#8594;</span></a>]]></description>
<content:encoded><![CDATA[<p>When I started my DBA position in my new company, I was looking for a tool able to check the SQL Server database environments periodically, for several reasons. First, as a DBA one of my main concerns is maintaining and keeping the different mssql environments well-configured against an initial standard. It is also worth noting that I’m not the only person who interacts with the databases: anyone in my team, being a member of the sysadmin server role as well, is able to change any server-level configuration setting at any moment. In this case, chances are that environments shift from our initial standard over time, and my team and I need to stay confident by checking the current mssql environment configurations periodically, being alerted if configuration drift exists, and obviously fixing it as fast as possible.  </p>
<p><span id="more-1591"></span></p>
<p>A while ago, I relied on the SQL Server Policy Based Management (PBM) feature to carry out this task at one of my former customers and I have to say it did the job, but with some limitations. Indeed, PBM is an instance-scoped feature and doesn’t allow checking configuration settings outside the SQL Server instance, for example. During my investigation, the <a href="https://dbachecks.readthedocs.io/en/latest/" rel="noopener" target="_blank">dbachecks</a> framework drew my attention for several reasons:</p>
<p>&#8211;	It allows checking different settings at different scopes, including Operating System and SQL Server instance items<br />
&#8211;	It is an open source project and keeps evolving with SQL / PowerShell community contributions.<br />
&#8211;	It is extensible, and we may add custom checks to the list of predefined checks shipped with the targeted version.<br />
&#8211;	It is based on PowerShell and the Pester framework, and fits well with the existing automation and GitOps processes in my company</p>
<p><a href="http://blog.developpez.com/mikedavem/files/2020/04/161-0-dbachecks-process.jpg"><img src="http://blog.developpez.com/mikedavem/files/2020/04/161-0-dbachecks-process.jpg" alt="161 - 0 - dbachecks process" width="1003" height="395" class="alignnone size-full wp-image-1592" /></a></p>
<p>The first dbachecks version we deployed in production a couple of months ago was 1.2.24, and unfortunately it didn’t include reliable tests for availability groups. It was the starting point of my first contributions to open source projects, and I felt proud and honored when I noticed my 2 PRs accepted for the dbachecks tool, including the Test Disk Allocation Unit and Availability Group checks:</p>
<p><a href="http://blog.developpez.com/mikedavem/files/2020/04/161-1-release-note.jpg"><img src="http://blog.developpez.com/mikedavem/files/2020/04/161-1-release-note.jpg" alt="161 - 1 - release note" width="798" height="348" class="alignnone size-full wp-image-1593" /></a></p>
<p>Obviously, this is just a humble contribution and, to be clear, I didn’t write the existing tests for AGs, but I spent some time applying fixes for a better detection of all AG environments, including their replicas, in simple and complex topologies (several replicas on the same server and non-default ports, for example). </p>
<p>So, here is the current list of AG checks in version 1.2.29 at the time of this write-up:</p>
<p>&#8211;	Cluster node should be up<br />
&#8211;	AG resource + IP Address in the cluster should be online<br />
&#8211;	Cluster private and public network should be up<br />
&#8211;	HADR should be enabled on each AG replica<br />
&#8211;	AG Listener + AG replicas should be pingable and reachable from client connections<br />
&#8211;	AG replica should be in the correct domain name<br />
&#8211;	AG replica port number should be equal to the port specified in your standard<br />
&#8211;	AG availability mode should not be in unknown state and should be in synchronized or synchronizing state regarding the replication type<br />
&#8211;	Each high available database (member of an AG) should be in synchronized / synchronizing state, ready for failover, joined to the AG and not in suspended state<br />
&#8211;	Each AG replica should have an extended event session called AlwaysOn_health which is in running state and configured in auto start mode</p>
<p>Mandatory parameters are <strong>app.cluster</strong> and <strong>domain.name</strong>.</p>
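Both can be set up front with dbachecks’ Set-DbcConfig; the values below are obviously illustrative:

```powershell
# Illustrative values: point dbachecks at your cluster and AD domain.
# Set-DbcConfig persists the setting for subsequent check runs.
Set-DbcConfig -Name app.cluster -Value 'mycluster.mydomain.local'
Set-DbcConfig -Name domain.name -Value 'mydomain.local'
```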
<div class="codecolorer-container text default" style="overflow:auto;white-space:nowrap;border:1px solid #9F9F9F;width:650px;"><div class="text codecolorer" style="padding:5px;font:normal 12px/1.4em Monaco, Lucida Console, monospace;white-space:nowrap">Get-DbcCheck -Tag HADR | ft Group, Type, AllTags, Config -AutoSize<br />
<br />
Group Type &nbsp; &nbsp; &nbsp; &nbsp;AllTags &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; Config<br />
----- ---- &nbsp; &nbsp; &nbsp; &nbsp;------- &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; ------<br />
HADR &nbsp;ClusterNode ClusterHealth, HADR app.sqlinstance app.cluster skip.hadr.listener.pingcheck domain.name policy...</div></div>
<p>The starting point of the HADR checks is the Windows Failover Cluster component and hierarchically other tests are performed on each sub component including availability group, AG replicas and AG databases. </p>
<p>Then you may change the behavior on the HADR check process according to your context by using the following parameters:</p>
<p>&#8211;	skip.hadr.listener.pingcheck =&gt; Skip the ping check of the HADR listener<br />
&#8211;	skip.hadr.listener.tcpport   =&gt; Skip the check of the standard TCP port for AG listeners<br />
&#8211;	skip.hadr.replica.tcpport    =&gt; Skip the check of the standard TCP port for AG replicas</p>
<p>For instance, in my context, I configured the <strong>skip.hadr.replica.tcpport</strong> parameter to skip checks on replica ports, because we own different environments that include several replicas on the same server listening on non-default ports.</p>
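Changing such a parameter is a one-liner, and the value is persisted for subsequent runs:

```powershell
# Exclude the replica TCP port check from subsequent HADR check runs
Set-DbcConfig -Name skip.hadr.replica.tcpport -Value $true
```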
<div class="codecolorer-container text default" style="overflow:auto;white-space:nowrap;border:1px solid #9F9F9F;width:650px;"><div class="text codecolorer" style="padding:5px;font:normal 12px/1.4em Monaco, Lucida Console, monospace;white-space:nowrap">Get-DbcConfig skip.hadr.*<br />
Name &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; Value Description<br />
---- &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; ----- -----------<br />
skip.hadr.listener.pingcheck False Skip the HADR listener ping test (especially useful for Azure and AWS)<br />
skip.hadr.listener.tcpport &nbsp; False Skip the HADR AG Listener TCP port number (If port number is not standard acro...<br />
skip.hadr.replica.tcpport &nbsp; &nbsp; True Skip the HADR Replica TCP port number (If port number is not standard across t...</div></div>
<p>The HADR checks can be run simply by using the HADR tag, as follows:</p>
<div class="codecolorer-container text default" style="overflow:auto;white-space:nowrap;border:1px solid #9F9F9F;width:650px;"><div class="text codecolorer" style="padding:5px;font:normal 12px/1.4em Monaco, Lucida Console, monospace;white-space:nowrap">Invoke-DbcCheck -Tag HADR<br />
Pester v4.10.1 &nbsp;Executing all tests in 'C:\Program Files\WindowsPowerShell\Modules\dbachecks\1.2.29\checks\HADR.Tests.ps1' with Tags HADR<br />
...</div></div>
<p><a href="http://blog.developpez.com/mikedavem/files/2020/04/161-2-hadr-checks-e1587413256656.jpg"><img src="http://blog.developpez.com/mikedavem/files/2020/04/161-2-hadr-checks-e1587413256656.jpg" alt="161 - 2 - hadr checks" width="1000" height="426" class="alignnone size-full wp-image-1599" /></a>                               </p>
<p>Well, this is a good start, but I think almost all of these checks are state-oriented and some configuration checks are missing. I’m already willing to add some of them in the near future, and/or feel free to add your own contribution as well <img src="https://blog.developpez.com/mikedavem/wp-includes/images/smilies/icon_smile.gif" alt=":)" class="wp-smiley" /> </p>
<p>Stay tuned! </p>
]]></content:encoded>
			<wfw:commentRss></wfw:commentRss>
		<slash:comments>1</slash:comments>
		</item>
		<item>
		<title>Database maintenance thoughts with Azure SQL databases</title>
		<link>https://blog.developpez.com/mikedavem/p13192/sql-azure/database-maintenance-concerns-with-azure-sql-databases</link>
		<comments>https://blog.developpez.com/mikedavem/p13192/sql-azure/database-maintenance-concerns-with-azure-sql-databases#comments</comments>
		<pubDate>Sun, 29 Mar 2020 20:55:33 +0000</pubDate>
		<dc:creator><![CDATA[mikedavem]]></dc:creator>
				<category><![CDATA[DevOps]]></category>
		<category><![CDATA[SQL Azure]]></category>
		<category><![CDATA[Azure Automation]]></category>
		<category><![CDATA[Azure SQL Scheduling]]></category>
		<category><![CDATA[devops]]></category>
		<category><![CDATA[Git]]></category>
		<category><![CDATA[Powershell]]></category>
		<category><![CDATA[Runbook]]></category>

		<guid isPermaLink="false">http://blog.developpez.com/mikedavem/?p=1552</guid>
		<description><![CDATA[As DBA, your priority is to ensure your data are consistent, safely backed up and you get steady performance of your database. In on-prem environments, these tasks are generally performed through scheduled jobs including backups, check integrity and index / &#8230; <a href="https://blog.developpez.com/mikedavem/p13192/sql-azure/database-maintenance-concerns-with-azure-sql-databases">Lire la suite <span class="meta-nav">&#8594;</span></a>]]></description>
<content:encoded><![CDATA[<p>As a DBA, your priority is to ensure your data are consistent and safely backed up, and that you get steady performance from your databases. In on-prem environments, these tasks are generally performed through scheduled jobs including backups, integrity checks and index / statistics maintenance tasks. </p>
<p>But moving databases to the cloud in Azure (and others) tells a different story. Indeed, even if the same concerns and tasks remain, some of them are under the responsibility of the cloud provider and some others are not. If you’re working with Azure SQL databases – like me – some questions arise very quickly on this topic, and that was my motivation for this write-up. I would like to share some new experiences with you by digging into the different maintenance items. If you have a different story to tell, please feel free to comment and share your own experience!</p>
<p><span id="more-1552"></span></p>
<p><strong>Database backups</strong></p>
<p>Microsoft takes over the database backups with a strategy based on FULL (every week), DIFF (every 12 hours) and LOG (every 5 to 10 min) backups, with cross-datacenter replication of the backup data. As far as I know, we cannot change this strategy, but we may change the retention period and extend it with an archiving period of up to 10 years by enabling Long-term retention (LTR). The latter assumes it is supported by your database service level and the options that come with it. For instance, we are using some Azure SQL databases in serverless mode, which doesn’t support LTR. This strategy provides different methods to restore an Azure database, including PITR, Geo-Restore, or the ability to restore a deleted database. We are using some of them for our database refreshes between Azure SQL servers, or sometimes to restore previous database states for testing. However, just be aware that even if restoring a database may be a trivial operation in Azure, the operation may take a long time depending on your context and the factors described <a href="https://docs.microsoft.com/en-us/azure/sql-database/sql-database-recovery-using-backups" rel="noopener" target="_blank">here</a>. In our context, a restore operation may take up to 2.5h (600GB of data to restore on GEN5).</p>
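As an illustration of PITR, a restore to a side-by-side copy can be sketched with the Az.Sql module as follows. This is a sketch rather than our exact procedure, and the resource names are illustrative:

```powershell
# Sketch of a point-in-time restore to a new database (illustrative names).
# Assumes the Az.Sql module and an authenticated Az session.
$db = Get-AzSqlDatabase -ResourceGroupName 'rg-data' -ServerName 'myserver' -DatabaseName 'mydb'

# Restore the state of two hours ago into a new, side-by-side database
Restore-AzSqlDatabase -FromPointInTimeBackup `
    -PointInTime (Get-Date).AddHours(-2) `
    -ResourceGroupName $db.ResourceGroupName `
    -ServerName $db.ServerName `
    -ResourceId $db.ResourceId `
    -TargetDatabaseName 'mydb-restored'
```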
<p>In addition, it is worth noting that there is no free lunch here and you will pay for storing your backups, probably more than you initially expect. The cost is obviously tied to your backup size for FULL, DIFF and LOG backups and to the retention period, making the budget sometimes hard to predict. According to discussions with some colleagues and other MVPs, it seems we are not alone in this case, and my advice is to keep an eye on your costs. Here is a quick, real picture of the cost ratio between compute + database storage versus backup storage (PITR + LTR), with a PITR retention of 35 days and LTR (max retention of one year):</p>
<p><a href="http://blog.developpez.com/mikedavem/files/2020/03/159-3-DB-retention-policies-PITR-LTR-1.jpg"><img src="http://blog.developpez.com/mikedavem/files/2020/03/159-3-DB-retention-policies-PITR-LTR-1.jpg" alt="159 - 3 - DB retention policies PITR LTR" width="1492" height="376" class="alignnone size-full wp-image-1565" /></a></p>
<p>&#8230;</p>
<p><a href="http://blog.developpez.com/mikedavem/files/2020/03/159-2-Cost-ratio-compute-storage-backup-.jpg"><img src="http://blog.developpez.com/mikedavem/files/2020/03/159-2-Cost-ratio-compute-storage-backup-.jpg" alt="159 - 2 - Cost ratio compute - storage - backup" width="819" height="263" class="alignnone size-full wp-image-1555" /></a></p>
<p>As you may notice, half of the total fees for the Azure SQL database may concern only the backup storage. On our side, we are working hard on reducing this ratio, but this is another topic, out of the scope of this blog post.</p>
<p><strong>Database integrity check</strong></p>
<p>Should we continue to use the famous DBCC CHECKDB command? Well, the response is no: the Azure SQL Database engineering team takes responsibility for managing data integrity. During internal team discussions we wondered what the process would be to recover corrupt data and how fast corruptions are treated by the Azure team. All these questions seem to be addressed in a Microsoft blog post on the topic, and for us it was important to know the Microsoft response time in case of database corruption because it may impact the retention policy. The faster Microsoft warns you about an integrity issue, the shorter the retention needed to rewind to the last consistent point (within a reasonable order of magnitude, obviously). </p>
<p><strong>Database maintenance (statistics and indexes)</strong></p>
<p>Something that is often misunderstood with Azure SQL Database is that the maintenance of indexes and statistics would no longer be under the responsibility of the DBA. Referring to some discussions around me, it seems to be a misconception, and automatic index tuning was often mentioned in these discussions. Automatic tuning aims to adapt the database dynamically to a changing workload by applying tuning recommendations: creating new indexes, dropping redundant and duplicate indexes, or forcing the last good plan for queries. Even if this feature (not enabled by default) certainly helps improve performance, it substitutes neither updating statistics nor rebuilding fragmented indexes. Concerning statistics, it is true that some improvements have been shipped with SQL Server over time, like TF2371, which makes the update threshold formula for large tables more dynamic (the default behavior since SQL Server 2016+), but we may arguably say that there remain situations where updating statistics should be done manually, and as a database administrator it is still your own responsibility to maintain them.</p>
<p><strong>Database maintenance and scheduling in Azure</strong></p>
<p>As said at the beginning of this write-up, with Azure SQL DB database maintenance is a different story, and the same applies when it comes to scheduling. Indeed, you quickly notice the lack of built-in job scheduler capabilities like the traditional SQL Server Agent that comes with on-premises installations, but it doesn’t mean we were not able to schedule any job at all. In fact, there exist different options to replace the traditional SQL Server Agent for database maintenance in Azure that we had to look at: </p>
<p>1) SQL Agent jobs still exist but are only available for SQL Managed Instances. In our context, we use Azure Single Database with the GP_S_Gen5 SKU, so definitely not an option for us.</p>
<p>2) Elastic database jobs can run across multiple servers and allow writing DB maintenance tasks in T-SQL or PowerShell. But this feature has some limitations which excluded it from the equation:<br />
&#8211; It’s still in preview and we cannot rely on it for production scenarios<br />
&#8211; Serverless and the auto-pausing / auto-resuming used with our GP_S_Gen5 SKU databases are not supported </p>
<p>3) Data Factory could be an option because it is already part of the Azure services consumed in our context, but we wanted to stay decoupled from the ETL / business workflows. </p>
<p>4) Finally, what interested us in Data Factory, especially the integration with Git and Azure DevOps, is also shipped with Azure Automation. Another important decision factor was the cost, because Azure Automation runs for free up to 500 minutes of job execution per month. In our context, we have a weekly-based schedule for our maintenance plan and we estimated one hour per runbook execution. Thus, we stay under the limit with no additional fees.</p>
<p>Azure Automation brings good control over credentials, but we already use Azure Key Vault to protect sensitive information. We found that using the Azure Automation native capabilities alongside Azure Key Vault would be a duplication that could decentralize our secret management and make it more complex. Here is a big picture of the process to perform the maintenance of our Azure databases from a scheduled runbook in Azure Automation:</p>
<p><a href="http://blog.developpez.com/mikedavem/files/2020/03/159-1-Azure-automation-DB-maintenance-process--e1585510197141.jpg"><img src="http://blog.developpez.com/mikedavem/files/2020/03/159-1-Azure-automation-DB-maintenance-process--e1585510197141.jpg" alt="159 - 1 - Azure automation DB maintenance process" width="1000" height="373" class="alignnone size-full wp-image-1557" /></a></p>
<p>Firstly, we use a PowerShell-based runbook which in turn calls different stored procedures on the target Azure database to perform the database maintenance. To be compliant with our DevOps processes, the runbook is stored in a source control repository (Git) and published to Azure Automation through the built-in sync process. The runbook runs with the “Run As Account” option to get access to Azure Key Vault and the AppID for using the dedicated application identity. Finally, this identity is used to connect to the Azure SQL DB and to perform the database maintenance based on the corresponding authentication token and the permissions granted on the DB side. Token-based authentication, available since Azure SQL DB v12, helped us to meet our security policy that prevents using SQL logins whenever possible. To generate the token, we still use the old ADAL.PS module. This is something we need to update in the future.</p>
<p>Here is a sample of the interesting parts of the PowerShell code to authenticate to the Azure database:</p>
<div class="codecolorer-container text default" style="overflow:auto;white-space:nowrap;border:1px solid #9F9F9F;width:650px;"><div class="text codecolorer" style="padding:5px;font:normal 12px/1.4em Monaco, Lucida Console, monospace;white-space:nowrap"># Run runbook as special account to get access the Azure Key Vault<br />
$AzureAutomationConnectionName = &quot;xxxx&quot;<br />
$ServicePrincipalConnection = Get-AutomationConnection -Name $AzureAutomationConnectionName<br />
<br />
…<br />
<br />
$clientId = (Get-AzKeyVaultSecret -VaultName $KeyvaultName -Name &quot;xxxxx&quot;).SecretValueText<br />
$response = Get-ADALToken -ClientId $clientId -ClientSecret $clientSecret -Resource $resourceUri -Authority $authorityUri -TenantId $tenantName<br />
<br />
# Connection String<br />
$connectionString = &quot;Server=tcp:$SqlInstance,1433;Initial Catalog=$Database;Persist Security Info=False;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;&quot;<br />
<br />
# Create the connection object<br />
$connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)<br />
<br />
# Set identity by using the corresponding token to connect to the Azure DB<br />
$connection.AccessToken = $response.AccessToken<br />
<br />
...</div></div>
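Once authenticated, the runbook simply executes the maintenance stored procedures over that connection. A continuation sketch (the procedure name is hypothetical):

```powershell
# Continuation sketch: run a maintenance procedure with the token-authenticated connection.
# dbo.usp_DatabaseMaintenance is a hypothetical procedure name.
$connection.Open()
$command = $connection.CreateCommand()
$command.CommandText = 'EXEC dbo.usp_DatabaseMaintenance'
$command.CommandTimeout = 3600   # index / statistics maintenance can run long
$command.ExecuteNonQuery() | Out-Null
$connection.Close()
```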
<p>Yes, Azure is a different beast (like other clouds) and requires DBAs to review their habits. It may be very confusing at the beginning, but everything you did in the past is still possible, or at least can be achieved in a different way, in Azure. Just think differently would be my best advice in this case! </p>
]]></content:encoded>
			<wfw:commentRss></wfw:commentRss>
		<slash:comments>1</slash:comments>
		</item>
		<item>
<title>PowerShell: scripting the creation of SQL Server aliases</title>
		<link>https://blog.developpez.com/mikedavem/p11586/sql-server-2008/powershell-scripter-la-cration-dalias-sql-server</link>
		<comments>https://blog.developpez.com/mikedavem/p11586/sql-server-2008/powershell-scripter-la-cration-dalias-sql-server#comments</comments>
		<pubDate>Tue, 18 Dec 2012 17:20:37 +0000</pubDate>
		<dc:creator><![CDATA[mikedavem]]></dc:creator>
				<category><![CDATA[SQL Server 2008]]></category>
		<category><![CDATA[SQL Server 2008 R2]]></category>
		<category><![CDATA[SQL Server 2012]]></category>
		<category><![CDATA[alias]]></category>
		<category><![CDATA[Powershell]]></category>

		<guid isPermaLink="false">http://blog.developpez.com/mikedavem/?p=284</guid>
<description><![CDATA[I recently had to define an installation procedure for SQL Server linked servers for an application. This procedure must of course be tested in the quality and production environments.&#160; Moreover, at my client, a second quality environment &#8230; <a href="https://blog.developpez.com/mikedavem/p11586/sql-server-2008/powershell-scripter-la-cration-dalias-sql-server">Read more <span class="meta-nav">&#8594;</span></a>]]></description>
<content:encoded><![CDATA[<p>I recently had to define an installation procedure for SQL Server linked servers for an application. This procedure must of course be tested in the quality and production environments.&#160; Moreover, at my client, a second quality environment is planned in parallel to install the application concerned. The main issue here is that the linked server names will change, and that this may have an impact on the application code. We could not afford to update the whole application T-SQL code at every environment change. To address this problem, we chose to use SQL Server aliases. Using aliases is much more flexible with linked servers: you just change these names without any impact on the code. </p>
<p>But let’s get back to our initial problem: how do we script the creation of SQL Server aliases? There is no T-SQL system procedure that can do this. The solution here is to use the corresponding WMI classes (here I am using SQL Server 2008). Since we are in the PowerShell era, here is the script used to create aliases for SQL Server:</p>
<p><div class="codecolorer-container powershell default" style="overflow:auto;white-space:nowrap;border:1px solid #9F9F9F;width:650px;"><div class="powershell codecolorer" style="padding:5px;font:normal 12px/1.4em Monaco, Lucida Console, monospace;white-space:nowrap"><span style="color: #0000FF;">param</span> <br />
<span style="color: #000000;">&#40;</span> <br />
<span style="color: #800080;">$server</span><span style="color: pink;">=</span>$<span style="color: #000000;">&#40;</span><span style="color: #0000FF;">throw</span> <span style="color: #800000;">&quot;Mandatory parameter -server not supplied&quot;</span><span style="color: #000000;">&#41;</span> <br />
<span style="color: #000000;">&#41;</span> <br />
<br />
<br />
<span style="color: #008000;"># Example script to create an alias </span><br />
<span style="color: #800080;">$alias</span> <span style="color: pink;">=</span> <span style="color: #000000;">&#40;</span><span style="color: #000000;">&#91;</span><span style="color: #008080;">wmiclass</span><span style="color: #000000;">&#93;</span> <span style="color: #800000;">'\\.\root\Microsoft\SqlServer\ComputerManagement10:SqlServerAlias'</span><span style="color: #000000;">&#41;</span>.CreateInstance<span style="color: #000000;">&#40;</span><span style="color: #000000;">&#41;</span> <br />
<span style="color: #800080;">$alias</span>.AliasName <span style="color: pink;">=</span> <span style="color: #800000;">'SERVER_ALIAS'</span> <br />
<span style="color: #800080;">$alias</span>.ConnectionString <span style="color: pink;">=</span> <span style="color: #800000;">'1433'</span> <span style="color: #008000;">#connection specific parameters depending on the protocol </span><br />
<span style="color: #800080;">$alias</span>.ProtocolName <span style="color: pink;">=</span> <span style="color: #800000;">'tcp'</span> <br />
<span style="color: #800080;">$alias</span>.ServerName <span style="color: pink;">=</span> <span style="color: #800080;">$server</span> <br />
<span style="color: #800080;">$alias</span>.Put<span style="color: #000000;">&#40;</span><span style="color: #000000;">&#41;</span> <span style="color: pink;">|</span> <span style="color: #008080; font-weight: bold;">Out-Null</span>; <br />
<br />
<span style="color: #008000;"># List existing aliases </span><br />
<span style="color: #008080; font-weight: bold;">Get-WmiObject</span> <span style="color: #008080; font-style: italic;">-Namespace</span> <span style="color: #800000;">'root\Microsoft\SqlServer\ComputerManagement10'</span> <span style="color: #008080; font-style: italic;">-Class</span> <span style="color: #800000;">'SqlServerAlias'</span> <span style="color: pink;">|</span> <br />
&nbsp; &nbsp; <span style="color: #008080; font-weight: bold;">Format-Table</span> <span style="color: #008080; font-style: italic;">-Property</span> <span style="color: #800000;">'AliasName'</span><span style="color: pink;">,</span> <span style="color: #800000;">'ServerName'</span><span style="color: pink;">,</span> <span style="color: #800000;">'ProtocolName'</span><span style="color: pink;">,</span> <span style="color: #800000;">'ConnectionString'</span></div></div>
</p>
<p>&#160;</p>
<p>… to be used as follows:</p>
<p><div class="codecolorer-container powershell default" style="overflow:auto;white-space:nowrap;border:1px solid #9F9F9F;width:650px;"><div class="powershell codecolorer" style="padding:5px;font:normal 12px/1.4em Monaco, Lucida Console, monospace;white-space:nowrap">&nbsp;<span style="color: pink;">&amp;</span> <span style="color: #800000;">'.\deployalias'</span> -server servername</div></div>
</p>
<p>&#160;</p>
<p><a href="http://blog.developpez.com/mikedavem/files/2012/12/image.png"><img style="border-left-width: 0px;border-right-width: 0px;border-bottom-width: 0px;padding-top: 0px;padding-left: 0px;padding-right: 0px;border-top-width: 0px" border="0" alt="image" src="http://blog.developpez.com/mikedavem/files/2012/12/image_thumb.png" width="747" height="56" /></a></p>
<p>&#160;</p>
<p>You can of course modify this script and make it more parameterizable (alias name, port number, etc.). One more thing: depending on your needs, you will probably have to install the aliases for both x86 and x64 architectures. To do so, simply run the script with both Windows PowerShell (x64) and Windows PowerShell (x86). </p>
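On 64-bit Windows, running both can itself be scripted by calling each PowerShell host explicitly (the script file name below assumes the script shown above was saved as deployalias.ps1):

```powershell
# Create the alias in both WMI namespaces on 64-bit Windows:
# System32 hosts the 64-bit PowerShell, SysWOW64 hosts the 32-bit one.
& "$env:windir\System32\WindowsPowerShell\v1.0\powershell.exe" -File .\deployalias.ps1 -server SERVERNAME
& "$env:windir\SysWOW64\WindowsPowerShell\v1.0\powershell.exe" -File .\deployalias.ps1 -server SERVERNAME
```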
<p>&#160;</p>
<p>Happy deploying</p>
<p>David BARBARIN (Mikedavem)    <br />MVP SQL Server</p>
]]></content:encoded>
			<wfw:commentRss></wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
	</channel>
</rss>
