<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>SQL Server Archives - DBAduck</title>
	<atom:link href="https://www.dbaduck.com/category/sql-server/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.dbaduck.com/category/sql-server/</link>
	<description>DBA</description>
	<lastBuildDate>Tue, 10 Mar 2026 18:29:55 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://www.dbaduck.com/wp-content/uploads/2013/12/cropped-dbaducklogo-2-32x32.png</url>
	<title>SQL Server Archives - DBAduck</title>
	<link>https://www.dbaduck.com/category/sql-server/</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>PowerShell for SQL Server DBAs Course is Finally Here!</title>
		<link>https://www.dbaduck.com/powershell-for-sql-server-dbas-course-is-finally-here/</link>
					<comments>https://www.dbaduck.com/powershell-for-sql-server-dbas-course-is-finally-here/#respond</comments>
		
		<dc:creator><![CDATA[dbaduck]]></dc:creator>
		<pubDate>Fri, 29 Nov 2024 06:02:24 +0000</pubDate>
				<category><![CDATA[Powershell]]></category>
		<category><![CDATA[SQL Server]]></category>
		<guid isPermaLink="false">https://www.dbaduck.com/?p=4769</guid>

					<description><![CDATA[<p>I have finally finished preparing my Fundamentals of PowerShell for SQL Server DBAs Course. You can see the curriculum at this link PowerShell for DBAs Fundamentals &#124; DBADuck Training School. The class is scheduled to be delivered live and the recordings will be available to you for a year as part of the class. The [&#8230;]</p>
<p>The post <a href="https://www.dbaduck.com/powershell-for-sql-server-dbas-course-is-finally-here/">PowerShell for SQL Server DBAs Course is Finally Here!</a> appeared first on <a href="https://www.dbaduck.com">DBAduck</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>I have finally finished preparing my Fundamentals of PowerShell for SQL Server DBAs Course.</p>



<p>You can see the curriculum at this link <a href="https://training.dbaduck.com/p/powershell-for-dbas-fundamentals">PowerShell for DBAs Fundamentals | DBADuck Training School</a>. The class is scheduled to be delivered live and the recordings will be available to you for a year as part of the class.  The enrollment link is live and will allow you to enroll, but make sure you use the discount link below.</p>



<p>BLACK FRIDAY special! For two weeks, the coupon <strong><em>BLACKFRIDAY</em></strong> will get you $2,000.00 off the regular price of $2,995.00. Don&#8217;t miss out on this deal before the end of the year. If you have any questions, feel free to use the Contact link.</p>
<p>The post <a href="https://www.dbaduck.com/powershell-for-sql-server-dbas-course-is-finally-here/">PowerShell for SQL Server DBAs Course is Finally Here!</a> appeared first on <a href="https://www.dbaduck.com">DBAduck</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.dbaduck.com/powershell-for-sql-server-dbas-course-is-finally-here/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Contained Availability Groups using DbMail Query</title>
		<link>https://www.dbaduck.com/contained-availability-groups-using-dbmail-query/</link>
					<comments>https://www.dbaduck.com/contained-availability-groups-using-dbmail-query/#comments</comments>
		
		<dc:creator><![CDATA[dbaduck]]></dc:creator>
		<pubDate>Sun, 13 Oct 2024 18:03:44 +0000</pubDate>
				<category><![CDATA[Community]]></category>
		<category><![CDATA[SQL Server]]></category>
		<category><![CDATA[Availability Groups]]></category>
		<category><![CDATA[Fixes]]></category>
		<guid isPermaLink="false">https://www.dbaduck.com/?p=4767</guid>

					<description><![CDATA[<p>Call to action for Microsoft. Contained Availability Groups came out in SQL 2022 and they definitely have their use. But there were some artifacts left behind that need some fixing. Namely when you use DBMail while in the Availability Group jobs or operations. Let&#8217;s see what there is left. First, here is the link to [&#8230;]</p>
<p>The post <a href="https://www.dbaduck.com/contained-availability-groups-using-dbmail-query/">Contained Availability Groups using DbMail Query</a> appeared first on <a href="https://www.dbaduck.com">DBAduck</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Call to action for Microsoft. Contained Availability Groups came out in SQL 2022, and they definitely have their use. But some artifacts were left behind that need fixing, namely when you use DBMail from jobs or operations inside the Availability Group. Let&#8217;s see what is left.</p>



<p>First, here is the link to the <a href="https://feedback.azure.com/d365community/idea/e2454e20-3d80-ef11-a4e5-000d3a01397d">Feedback Item</a> that is out there for voting, to get Microsoft to fix this issue. One issue has already been fixed in the msdb proc that activates dbmail in a Contained AG ([dbo].[sp_sysmail_activate]).</p>



<h3 class="wp-block-heading" id="h-what-is-the-issue">What is the Issue?</h3>



<p>The issue is that within a Contained AG, master and msdb are actually named AGNAME_master and AGNAME_msdb. The procedures mentioned above use the following code, which returns the original msdb database name instead of AGNAME_msdb.</p>



<pre class="wp-block-code"><code>-- sp_RunMailQuery is used to run a query result into the mail message
SET @mailDbName = db_name() -- which will return msdb instead of AGNAME_msdb</code></pre>



<p>The fix is to use the code below instead, so that it picks up the real database name. In a contained availability group, AGNAME_msdb looks like msdb when you are connected to the AG Listener. See below.</p>



<pre class="wp-block-code"><code>-- Note: we cannot use DB_NAME() here since this SP could be running in the context
    -- of a contained AG: in that case, we really want to fetch the physical name of the DB
    SELECT @mailDbName = physical_database_name FROM sys.databases WHERE database_id = DB_ID()
    SET @mailDbId   = DB_ID()</code></pre>



<p>Notice how, instead of just using DB_NAME(), the code takes the physical_database_name from sys.databases for this DB_ID()? This is the fix. You can make the change yourself until <a href="https://www.microsoft.com">Microsoft</a> gets to it via the feedback item.</p>
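<p>A quick way to see this on your own instance is to compare the two names directly. The sketch below assumes you run it while connected through the AG Listener of a Contained AG; outside of one, both columns return the same name.</p>



<pre class="wp-block-code"><code>-- DB_NAME() returns the logical name; physical_database_name shows the real one
SELECT DB_NAME()              AS logical_name,   -- msdb when connected via the listener
       physical_database_name AS physical_name   -- AGNAME_msdb inside a Contained AG
FROM sys.databases
WHERE database_id = DB_ID();</code></pre>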



<p>If you have not applied the <a href="https://www.microsoft.com/en-us/download/details.aspx?id=105013">latest SQL 2022 CU</a> to your servers and you run a Contained Availability Group, you should update so that you have the most recent fixes in SQL Server. So far this fix is not in a CU, but the earlier one for dbo.sp_sysmail_activate has shipped in a previous CU.</p>
<p>The post <a href="https://www.dbaduck.com/contained-availability-groups-using-dbmail-query/">Contained Availability Groups using DbMail Query</a> appeared first on <a href="https://www.dbaduck.com">DBAduck</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.dbaduck.com/contained-availability-groups-using-dbmail-query/feed/</wfw:commentRss>
			<slash:comments>1</slash:comments>
		
		
			</item>
		<item>
		<title>TSQL Tuesday &#8211; One of my favorite Extended Events</title>
		<link>https://www.dbaduck.com/tsql-tuesday-one-of-my-favorite-extended-events/</link>
					<comments>https://www.dbaduck.com/tsql-tuesday-one-of-my-favorite-extended-events/#comments</comments>
		
		<dc:creator><![CDATA[dbaduck]]></dc:creator>
		<pubDate>Tue, 12 Sep 2023 14:41:21 +0000</pubDate>
				<category><![CDATA[Extended Events]]></category>
		<category><![CDATA[SQL Server]]></category>
		<category><![CDATA[ExtendedEvents]]></category>
		<category><![CDATA[SQLServer]]></category>
		<category><![CDATA[tsqltuesday]]></category>
		<guid isPermaLink="false">https://www.dbaduck.com/?p=4736</guid>

					<description><![CDATA[<p>In honor of TSQL Tuesday for September, the mother post is with Grant Fritchey &#8211; TSQL Tuesday 166. I wanted to share one of the Extended Events I always put on a server when I am in charge of it. It has to do with File growths and captures some important things for me. Before [&#8230;]</p>
<p>The post <a href="https://www.dbaduck.com/tsql-tuesday-one-of-my-favorite-extended-events/">TSQL Tuesday &#8211; One of my favorite Extended Events</a> appeared first on <a href="https://www.dbaduck.com">DBAduck</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>In honor of TSQL Tuesday for September, the mother post is with <a href="https://www.scarydba.com/">Grant Fritchey</a>  &#8211; <a href="https://www.scarydba.com/2023/09/11/t-sql-tuesday-166-extended-events/">TSQL Tuesday 166.</a></p>



<p>I wanted to share one of the Extended Events I always put on a server when I am in charge of it. It has to do with file growths and captures some important things for me. Before you say that it is in the system_health extended events session: I know it is there. But I have had system_health sessions cycle pretty fast, and there are a lot of other events in that trace. So I made a dedicated session for just this one thing, so that I can archive the sessions, keep the disk clean, and pull the information into a table to analyze it in a tabular way instead of mining XE files.</p>



<p>Here it is for those that would like to use it as well.</p>



<pre class="wp-block-preformatted">CREATE EVENT SESSION [FileSizeAutogrow] ON SERVER 
ADD EVENT sqlserver.database_file_size_change(SET collect_database_name=(1)
    ACTION(package0.collect_system_time,sqlos.task_time,sqlserver.client_app_name,sqlserver.client_hostname,sqlserver.client_pid,sqlserver.database_id,sqlserver.database_name,sqlserver.server_instance_name,sqlserver.session_id,sqlserver.sql_text,sqlserver.username)),
ADD EVENT sqlserver.databases_log_file_size_changed(
    ACTION(package0.collect_system_time,sqlos.task_time,sqlserver.client_app_name,sqlserver.client_hostname,sqlserver.client_pid,sqlserver.database_id,sqlserver.database_name,sqlserver.server_instance_name,sqlserver.session_id,sqlserver.sql_text,sqlserver.username))
ADD TARGET package0.event_file(SET filename=N'FileSizeAutogrow.xel',max_file_size=(500),max_rollover_files=(10))
WITH (MAX_MEMORY=4096 KB,EVENT_RETENTION_MODE=ALLOW_SINGLE_EVENT_LOSS,MAX_DISPATCH_LATENCY=30 SECONDS,MAX_EVENT_SIZE=0 KB,MEMORY_PARTITION_MODE=NONE,TRACK_CAUSALITY=OFF,STARTUP_STATE=ON)
GO

</pre>



<p>You will notice that the captured events are sqlserver.database_file_size_change and sqlserver.databases_log_file_size_changed (yes, it is odd that the names differ: the data-file event uses the singular database and the present-tense change, while the log-file event uses the plural databases and the past-tense changed). Not sure what the significance of that is, but it is not intuitive to find in XE. These events are available from SQL 2012 onwards; they do not exist in SQL 2008 R2.</p>



<h2 class="wp-block-heading" id="h-what-do-i-get-out-of-this"><strong>What do I get out of this?</strong></h2>



<figure class="wp-block-image size-full"><img fetchpriority="high" decoding="async" width="394" height="427" src="https://www.dbaduck.com/wp-content/uploads/2023/09/LogFileAutogrow1.png" alt="" class="wp-image-4737" srcset="https://www.dbaduck.com/wp-content/uploads/2023/09/LogFileAutogrow1.png 394w, https://www.dbaduck.com/wp-content/uploads/2023/09/LogFileAutogrow1-277x300.png 277w" sizes="(max-width: 394px) 100vw, 394px" /></figure>



<figure class="wp-block-image size-full"><img decoding="async" width="414" height="433" src="https://www.dbaduck.com/wp-content/uploads/2023/09/DataFileAutogrow1-1.png" alt="" class="wp-image-4739" srcset="https://www.dbaduck.com/wp-content/uploads/2023/09/DataFileAutogrow1-1.png 414w, https://www.dbaduck.com/wp-content/uploads/2023/09/DataFileAutogrow1-1-287x300.png 287w" sizes="(max-width: 414px) 100vw, 414px" /></figure>



<p>Notice that in the first graphic I have marked it up. You get a lot of things out of this data.<br>* Database Name / Filename<br>* File Type<br>* Duration of the growth<br>* Size change in KB</p>
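<p>Pulling those fields into a table, as mentioned above, can be done by reading the file target with sys.fn_xe_file_target_read_file and shredding the XML. The sketch below is an example only: the file path is a placeholder, and the data field names (duration, size_change_kb, file_name) are assumptions based on the columns in the screenshots, so verify them with the first query before relying on the second.</p>



<pre class="wp-block-preformatted">-- List the data fields the event actually exposes on your build
SELECT name
FROM sys.dm_xe_object_columns
WHERE object_name = 'database_file_size_change' AND column_type = 'data';

-- Shred the .xel files into rows (adjust the path to your XE log directory)
;WITH raw AS (
    SELECT CAST(event_data AS xml) AS x
    FROM sys.fn_xe_file_target_read_file(N'FileSizeAutogrow*.xel', NULL, NULL, NULL)
)
SELECT
    x.value('(event/@timestamp)[1]', 'datetime2')                            AS event_time,
    x.value('(event/data[@name="database_name"]/value)[1]', 'nvarchar(128)') AS database_name,
    x.value('(event/data[@name="file_name"]/value)[1]', 'nvarchar(260)')     AS [file_name],
    x.value('(event/data[@name="duration"]/value)[1]', 'bigint')             AS duration,
    x.value('(event/data[@name="size_change_kb"]/value)[1]', 'bigint')       AS size_change_kb
FROM raw;</pre>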



<p>I carry a mantra: if you keep files from auto-growing (proactive maintenance of files), you can still grow them in an automated way, but at a time that is conducive to growth and not in the middle of a great big transaction. Automation is how you get the most out of your day without the heavy lifting that many DBAs still have as part of their careers.</p>



<p>Happy TSQL Tuesday from the Ducks on my side of the pond.</p>
<p>The post <a href="https://www.dbaduck.com/tsql-tuesday-one-of-my-favorite-extended-events/">TSQL Tuesday &#8211; One of my favorite Extended Events</a> appeared first on <a href="https://www.dbaduck.com">DBAduck</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.dbaduck.com/tsql-tuesday-one-of-my-favorite-extended-events/feed/</wfw:commentRss>
			<slash:comments>1</slash:comments>
		
		
			</item>
		<item>
		<title>Ola Hallengren Maintenance part 1</title>
		<link>https://www.dbaduck.com/ola-hallengren-maintenance-part-1/</link>
					<comments>https://www.dbaduck.com/ola-hallengren-maintenance-part-1/#comments</comments>
		
		<dc:creator><![CDATA[dbaduck]]></dc:creator>
		<pubDate>Tue, 31 Jan 2023 20:40:35 +0000</pubDate>
				<category><![CDATA[Community]]></category>
		<category><![CDATA[SQL Server]]></category>
		<category><![CDATA[DBA]]></category>
		<category><![CDATA[SQL]]></category>
		<guid isPermaLink="false">https://www.dbaduck.com/?p=4708</guid>

					<description><![CDATA[<p>Many of us are using Ola Hallengren&#8217;s maintenance solution. This post will assist you in configuring this solution when you add it to your SQL Servers. Jobs are created with this solution. You download it at the link above. I recommended creating a database to use for this solution or even installing it into an [&#8230;]</p>
<p>The post <a href="https://www.dbaduck.com/ola-hallengren-maintenance-part-1/">Ola Hallengren Maintenance part 1</a> appeared first on <a href="https://www.dbaduck.com">DBAduck</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Many of us are using <a href="https://ola.hallengren.com">Ola Hallengren&#8217;s maintenance solution</a>.</p>



<p>This post will assist you in configuring the solution when you add it to your SQL Servers. You can download it at the link above. The following jobs are created:</p>



<ul class="wp-block-list">
<li>Backup Jobs for System and User databases</li>



<li>Database Integrity jobs for System and User Databases</li>



<li>Cleanup jobs for Command Log and Output Files</li>
</ul>



<p>I recommend creating a database to use for this solution, or even installing it into an existing DBA function database. I usually create a DBA database and use it for this purpose and others as well. With this database in place, you configure the header of the maintenance solution SQL file: the database, whether to create jobs, the retention time, and the backup directory for the jobs.</p>



<div class="wp-block-urvanov-syntax-highlighter-code-block"><pre class="urvanov-syntax-highlighter-plain-tag">USE [master] -- Specify the database in which the objects will be created.

SET NOCOUNT ON

DECLARE @CreateJobs nvarchar(max)          = 'Y'         -- Specify whether jobs should be created.
DECLARE @BackupDirectory nvarchar(max)     = NULL        -- Specify the backup root directory. If no directory is specified, the default backup directory is used.
DECLARE @CleanupTime int                   = NULL        -- Time in hours, after which backup files are deleted. If no time is specified, then no backup files are deleted.
DECLARE @OutputFileDirectory nvarchar(max) = NULL        -- Specify the output file directory. If no directory is specified, then the SQL Server error log directory is used.
DECLARE @LogToTable nvarchar(max)          = 'Y'         -- Log commands to a table.</pre></div>



<h3 class="wp-block-heading" id="h-parameters-to-fill-out">Parameters to Fill Out</h3>



<p>Pay close attention to the highlighted lines:<br>The database name you created is put in the USE statement and the following items are created in that database:</p>



<ul class="wp-block-list">
<li>CommandLog table</li>



<li>Stored Procedures that are in the solution</li>
</ul>



<p>I use the DBA database here, which keeps it out of System databases and in an isolated place.</p>



<p>The @CreateJobs variable specifies whether or not to create the jobs. If you already have the solution installed, turning job creation off is easy: change the value to &#8216;N&#8217; and the script will merely update the code so that you have the latest version.</p>



<p>Next is @BackupDirectory which is where you specify the database backup location you want to use for the backups. If you do not specify this one, then the default is used. The default location is in the Database Settings tab of the Server Properties in SSMS.</p>



<p>Next is @CleanupTime and it is specified in hours. If you want to clean up the backup files after 1 week, you specify 168 as the hours for cleanup time. This number is put in all the Backup jobs that are created.</p>



<p>Last but not least is @OutputFileDirectory, which specifies the folder for the output files on each step in the Ola jobs. If not specified, it defaults to the LOG directory specified at SQL Server startup, where the SQL Server Errorlogs are kept. I usually leave this parameter NULL to keep the output files there.</p>



<p>The @LogToTable parameter is configured as &#8216;Y&#8217; because this is going to be put in the jobs as well to ensure that the work that is done in this solution is logged in the dbo.CommandLog table in the specified database.</p>



<p>In the code block below, the data is filled out so you can see what it looks like and the jobs get created with these settings as their defaults. The stored procedures are created in the specified database as well.</p>



<div class="wp-block-urvanov-syntax-highlighter-code-block"><pre class="urvanov-syntax-highlighter-plain-tag">USE [DBA] -- Specify the database in which the objects will be created.

SET NOCOUNT ON

DECLARE @CreateJobs nvarchar(max)          = 'Y'         -- Specify whether jobs should be created.
DECLARE @BackupDirectory nvarchar(max)     = '\\server\share'    -- Specify the backup root directory. If no directory is specified, the default backup directory is used.
DECLARE @CleanupTime int                   = 168        -- Time in hours, after which backup files are deleted. If no time is specified, then no backup files are deleted.
DECLARE @OutputFileDirectory nvarchar(max) = NULL        -- Specify the output file directory. If no directory is specified, then the SQL Server error log directory is used.
DECLARE @LogToTable nvarchar(max)          = 'Y'         -- Log commands to a table.</pre></div>



<h3 class="wp-block-heading" id="h-conclusion">Conclusion</h3>



<p>It is important to remember that this solution creates jobs with a defined structure, and the documentation at <a href="https://ola.hallengren.com">https://ola.hallengren.com</a> has a full complement of examples of how to configure these jobs with parameters. The items that you need to ensure are configured include @LogToTable = &#8216;Y&#8217; and @Compress = &#8216;Y&#8217;, so that the work is logged to the table and the database backups are compressed. If you omit the @Compress parameter, the jobs rely on the server-level setting for backup compression. Never rely on a setting that can be changed with or without your knowledge.</p>
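<p>As an example of the job structure, a full backup of user databases with the parameters called out above might look like the command below. This is a sketch based on the documented DatabaseBackup parameters; substitute your own directory and retention.</p>



<div class="wp-block-urvanov-syntax-highlighter-code-block"><pre class="urvanov-syntax-highlighter-plain-tag">EXECUTE [dbo].[DatabaseBackup]
    @Databases = 'USER_DATABASES',   -- all user databases
    @Directory = '\\server\share',   -- backup root; omit to use the default
    @BackupType = 'FULL',
    @Compress = 'Y',                 -- do not rely on the server-level default
    @CleanupTime = 168,              -- hours; one week of retention
    @LogToTable = 'Y'                -- log the commands to dbo.CommandLog</pre></div>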



<p>This part was not meant to be 100% comprehensive, but to ensure that you have the information you need to install the jobs and solution. Later parts will go into more depth so that you can see how to use it in the real world.</p>
<p>The post <a href="https://www.dbaduck.com/ola-hallengren-maintenance-part-1/">Ola Hallengren Maintenance part 1</a> appeared first on <a href="https://www.dbaduck.com">DBAduck</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.dbaduck.com/ola-hallengren-maintenance-part-1/feed/</wfw:commentRss>
			<slash:comments>2</slash:comments>
		
		
			</item>
		<item>
		<title>SQL Server Agent PowerShell Job Step with #NOSQLPS</title>
		<link>https://www.dbaduck.com/sql-server-agent-powershell-job-step-with-nosqlps/</link>
					<comments>https://www.dbaduck.com/sql-server-agent-powershell-job-step-with-nosqlps/#comments</comments>
		
		<dc:creator><![CDATA[dbaduck]]></dc:creator>
		<pubDate>Wed, 06 Apr 2022 04:36:20 +0000</pubDate>
				<category><![CDATA[Powershell]]></category>
		<category><![CDATA[SQL Server]]></category>
		<category><![CDATA[PowerShell]]></category>
		<category><![CDATA[SQL]]></category>
		<guid isPermaLink="false">https://www.dbaduck.com/?p=4625</guid>

					<description><![CDATA[<p>In researching some things for a presentation, I was to give at Intersections, Spring 2022, I came across this documentation page for SQL Server PowerShell that in the SQL Server Agent section it indicates that there is a way to get the Agent Job Step to skip loading the SQLPS module and you get to [&#8230;]</p>
<p>The post <a href="https://www.dbaduck.com/sql-server-agent-powershell-job-step-with-nosqlps/">SQL Server Agent PowerShell Job Step with #NOSQLPS</a> appeared first on <a href="https://www.dbaduck.com">DBAduck</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>In researching some things for a presentation I was to give at Intersections in Spring 2022, I came across this documentation page for <a href="https://docs.microsoft.com/en-us/sql/powershell/sql-server-powershell?msclkid=8f06e1c7b56011ecb42db58762054e7f&amp;view=sql-server-ver15" target="_blank" rel="noreferrer noopener">SQL Server PowerShell</a>. In the SQL Server Agent section, it indicates that there is a way to have the Agent job step skip loading the SQLPS module, so you get to use PowerShell natively and can load whichever modules you would like.</p>



<p>What?  Huge News!</p>



<h2 class="wp-block-heading" id="h-how-is-this-done">How is this done?</h2>



<p>According to the documentation page, you simply put #NOSQLPS on the first line of the SQL Agent PowerShell job step, as in the image below, and when the job runs it will NOT load the SQLPS module. <strong><em>This starts in SQL Server 2019</em></strong>; it has not been back-ported to previous versions of SQL Server.</p>



<p>If you want to test this, put #NOSQLPS at the top of the step, put Get-Module on the second line, and run the job. When you look at the job output, you will see nothing. If you remove #NOSQLPS from the job step, leave Get-Module in place, and run the job again, the output will show the SQLPS module.</p>
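<p>For reference, a minimal job step body to try this out could look like the following. The dbatools import is just an example of loading a module of your choice, and it assumes dbatools is installed on the server.</p>



<pre class="wp-block-preformatted">#NOSQLPS
# SQLPS is skipped, so nothing is pre-loaded; import whatever you need
Import-Module dbatools
Get-Module</pre>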



<figure class="wp-block-image size-full"><img decoding="async" width="696" height="663" src="https://www.dbaduck.com/wp-content/uploads/2022/04/NOSQLPS_PowerShell_JobStep.png" alt="" class="wp-image-4627" srcset="https://www.dbaduck.com/wp-content/uploads/2022/04/NOSQLPS_PowerShell_JobStep.png 696w, https://www.dbaduck.com/wp-content/uploads/2022/04/NOSQLPS_PowerShell_JobStep-300x286.png 300w" sizes="(max-width: 696px) 100vw, 696px" /><figcaption>SQL Server Agent PowerShell Job Step</figcaption></figure>



<p>Hopefully you found this useful.</p>
<p>The post <a href="https://www.dbaduck.com/sql-server-agent-powershell-job-step-with-nosqlps/">SQL Server Agent PowerShell Job Step with #NOSQLPS</a> appeared first on <a href="https://www.dbaduck.com">DBAduck</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.dbaduck.com/sql-server-agent-powershell-job-step-with-nosqlps/feed/</wfw:commentRss>
			<slash:comments>4</slash:comments>
		
		
			</item>
		<item>
		<title>UpdateHadronTruncationLsn messages in SQL 2019 Errorlog</title>
		<link>https://www.dbaduck.com/updatehadrontruncationlsn-messages-in-sql-2019-errorlog/</link>
					<comments>https://www.dbaduck.com/updatehadrontruncationlsn-messages-in-sql-2019-errorlog/#comments</comments>
		
		<dc:creator><![CDATA[dbaduck]]></dc:creator>
		<pubDate>Wed, 22 Jul 2020 22:16:59 +0000</pubDate>
				<category><![CDATA[SQL Server]]></category>
		<category><![CDATA[Errorlog]]></category>
		<category><![CDATA[SQL]]></category>
		<guid isPermaLink="false">https://dbaduck.com/?p=4166</guid>

					<description><![CDATA[<p>If you have SQL Server 2019 installed and are using Availability Groups you may incur messages that look like the below message in your SQL Server Errorlog. UpdateHadronTruncationLsn(6) (force=0): Primary: 0:0:0 Secondary: 892:4720:2 Partial Quorum : 0:0:0 Aggregation: 892:4720:2 Persistent: 892:382:4 According to Microsoft (unofficially as I did not hear this personally), there will be [&#8230;]</p>
<p>The post <a href="https://www.dbaduck.com/updatehadrontruncationlsn-messages-in-sql-2019-errorlog/">UpdateHadronTruncationLsn messages in SQL 2019 Errorlog</a> appeared first on <a href="https://www.dbaduck.com">DBAduck</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>If you have SQL Server 2019 installed and are using Availability Groups you may incur messages that look like the below message in your SQL Server Errorlog.</p>



<pre class="wp-block-preformatted">UpdateHadronTruncationLsn(6) (force=0): Primary: 0:0:0 Secondary: 892:4720:2 Partial Quorum : 0:0:0 Aggregation: 892:4720:2 Persistent: 892:382:4</pre>



<p>According to Microsoft (unofficially, as I did not hear this personally), there will be a fix in CU6 for SQL 2019 that removes these messages by default. If you enable Trace Flag 3605, they will be logged again, as below.</p>



<pre class="wp-block-syntaxhighlighter-code">DBCC TRACEON(3605, -1)</pre>



<p>If this changes, I will come and update this post.</p>



<p>Thanks for listening. Have a great day and stay safe and happy!</p>
<p>The post <a href="https://www.dbaduck.com/updatehadrontruncationlsn-messages-in-sql-2019-errorlog/">UpdateHadronTruncationLsn messages in SQL 2019 Errorlog</a> appeared first on <a href="https://www.dbaduck.com">DBAduck</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.dbaduck.com/updatehadrontruncationlsn-messages-in-sql-2019-errorlog/feed/</wfw:commentRss>
			<slash:comments>1</slash:comments>
		
		
			</item>
		<item>
		<title>Insights from your Database using PowerShell and TSQL</title>
		<link>https://www.dbaduck.com/insights-from-your-database-using-powershell-and-tsql/</link>
					<comments>https://www.dbaduck.com/insights-from-your-database-using-powershell-and-tsql/#comments</comments>
		
		<dc:creator><![CDATA[dbaduck]]></dc:creator>
		<pubDate>Thu, 02 Jul 2020 02:51:04 +0000</pubDate>
				<category><![CDATA[SQL Server]]></category>
		<guid isPermaLink="false">https://dbaduck.com/?p=4131</guid>

					<description><![CDATA[<p>Hello All. I have been looking for ways for quite sometime to understand my databases better. There are a lot of DMVs and even some DMFs that help us get some information from the SQL Server engine that can be very insightful. Many shops use monitoring tools and they are great because they watch things [&#8230;]</p>
<p>The post <a href="https://www.dbaduck.com/insights-from-your-database-using-powershell-and-tsql/">Insights from your Database using PowerShell and TSQL</a> appeared first on <a href="https://www.dbaduck.com">DBAduck</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<div class="wp-block-group"><div class="wp-block-group__inner-container is-layout-flow wp-block-group-is-layout-flow">
<p>Hello All. </p>



<p>I have been looking for ways, for quite some time, to understand my databases better. There are a lot of DMVs, and even some DMFs, that surface very insightful information from the SQL Server engine. Many shops use monitoring tools, and they are great because they watch things all the time. I used to rely solely on these tools for information about my SQL Server environment. That has shifted since I have been managing a SaaS platform built on top of SQL Server. With hundreds of TB of data and many databases being the same, it becomes pretty daunting to know which databases to care about and how much to care.</p>



<p>This article is about the start of things you can do to get ready for a better maintenance strategy or understanding how your indexes are used. </p>



<p>Let’s begin.</p>



<h3 class="wp-block-heading" id="h-problem">Problem</h3>



<p>You have many tables in your databases, and you want to know how they are used. There are DMVs like <a href="https://docs.microsoft.com/en-us/sql/relational-databases/system-dynamic-management-views/sys-dm-db-index-usage-stats-transact-sql">sys.dm_db_index_usage_stats</a> that will tell you about index usage, and querying them is insightful, but how do the stats change over time? These stats are reset when the instance is restarted, and it is good to know that you have 2000 seeks and 500 scans of an index, but when did they happen? Was it on a common day? A common hour?</p>



<h3 class="wp-block-heading" id="h-solution">Solution</h3>



<p>First the elements of the solution should be defined:</p>



<ul class="wp-block-list"><li>PowerShell at least 4.0 (preferably 5.1)</li><li>SQL Server at least 2008 R2 (it could work on earlier versions, but this is the earliest I have run it against)</li><li>A way to run a job/task</li><li><a href="http://dbatools.io">dbatools.io</a> &#8211; get the dbatools PowerShell module</li><li>Content listed below &#8211; it will be on my <a href="https://github.com/dbaduck">Github Repository</a>.</li></ul>



<p>I say PowerShell because it is what I love to use for automation tasks. I will illustrate this with SQL Server Agent, but it will also work with Task Scheduler. Let’s define the solution so that we can put the pieces together for use in your own environment. It will give you a good framework for gathering any type of information, whether with TSQL, which this solution uses, or straight PowerShell with SMO or other sources (like WMI).<br>First, we need to create a table to hold what I call Iterations, which put lines in the sand and let you use timeframes in your statistics gathering. We will also need a SCHEMA called “stat” for holding the Index Usage data.<br></p>



<pre class="lang:SQL">CREATE SCHEMA stat AUTHORIZATION dbo</pre>



<p>Now the table&#8230;</p>



<pre class="lang:SQL">/****** Object: Table [dbo].[Iterations] Script Date: 2/15/2020 3:40:49 PM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[Iterations] (
    [IterationId] [int] IDENTITY(1,1) NOT NULL,
    [GatherDate] [datetime2](3) NOT NULL,
    CONSTRAINT [PK_Iterations_ID] PRIMARY KEY CLUSTERED (
        [IterationId] ASC
    )  WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 100) ON [PRIMARY]) 
ON [PRIMARY]
GO
ALTER TABLE [dbo].[Iterations] ADD DEFAULT (getdate()) FOR [GatherDate]
GO</pre>
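<p>With the table in place, starting a new gather run is a single insert; the IDENTITY column and the GatherDate default fill in both values (a minimal sketch):</p>



<pre class="lang:SQL">-- Start a new iteration and capture its id for the detail rows
INSERT INTO dbo.Iterations DEFAULT VALUES;
SELECT SCOPE_IDENTITY() AS IterationId;</pre>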



<p>Notice that the IDENTITY column could easily be replaced by a SEQUENCE via a DEFAULT on the IterationId column. Also notice that the only other column is GatherDate, a datetime2 column. This lets related tables carry an INT (or BIGINT) column to join on instead of dates.<br>Next up is the table to hold the statistics.</p>



<pre class="lang:SQL">CREATE SEQUENCE stat.IndexUsage_Seq as bigint
START WITH 1 INCREMENT BY 1
GO
/****** Object:  Table [stat].[IndexUsage]    Script Date: 4/20/2020 10:59:19 PM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [stat].[IndexUsage](
	[server_name] [nvarchar](128) NULL,
	[database_name] [nvarchar](128) NULL,
	[database_id] [smallint] NULL,
	[object_id] [int] NOT NULL,
	[schema_name] [sysname] NOT NULL,
	[table_name] [sysname] NOT NULL,
	[index_name] [sysname] NULL,
	[index_id] [int] NOT NULL,
	[user_seeks] [bigint] NULL,
	[user_scans] [bigint] NULL,
	[user_lookups] [bigint] NULL,
	[user_updates] [bigint] NULL,
	[last_user_seek] [datetime] NULL,
	[last_user_scan] [datetime] NULL,
	[iterationid] [int] NOT NULL,
	[IndexUsageId] [bigint] NOT NULL,
        CONSTRAINT [PK_stat_IndexUsage__IndexUsageId] PRIMARY KEY CLUSTERED 
        (
	    [IndexUsageId] ASC
        ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, OPTIMIZE_FOR_SEQUENTIAL_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]
GO
ALTER TABLE [stat].[IndexUsage] ADD  CONSTRAINT [DF_IndexUsage_IndexUsageSeq]  DEFAULT (NEXT VALUE FOR [stat].[IndexUsage_Seq]) FOR [IndexUsageId]
GO</pre>



<p>Next up is the query to get the information, encapsulated in a PowerShell function called Get-BmaIndexUsage (the Bma prefix keeps things separate from other functions; Bma is Ben Miller &amp; Associates). There are parameters for the SQL instance and credential, plus the reporting instance and database to put the data in. This example will give you an idea of how easy it is to get the data and put it in the table.</p>



<h4 class="wp-block-heading" id="h-index-usage-stats">Index Usage Stats</h4>



<pre class="lang:SQL">SELECT
    @@servername as [server_name],
    db_name() as [database_name],
    db_id() as [database_id],
    t.object_id,
    s.name as [schema_name],
    t.name as [table_name],
    ix.name as [index_name],
    ix.index_id,
    ius.user_seeks,
    ius.user_scans,
    ius.user_lookups,
    ius.user_updates,
    ius.last_user_seek,
    ius.last_user_scan,
    $IterationId AS [iterationid]
from sys.tables t
inner join sys.indexes ix on t.object_id = ix.object_id
inner join sys.schemas s on t.schema_id = s.schema_id
left join sys.dm_db_index_usage_stats ius on ius.object_id = ix.object_id and ius.index_id = ix.index_id 
        and ius.database_id = DB_ID()
WHERE t.is_ms_shipped = 0;
</pre>



<p>Let’s tie this together before we get into why we would even do this activity.</p>



<ul class="wp-block-list"><li>CREATE DATABASE DBA</li><li>Inside DBA database<ul><li>CREATE SCHEMA stat AUTHORIZATION dbo</li><li>CREATE SEQUENCE stat.IndexUsage_Seq</li><li>CREATE TABLE dbo.Iterations</li><li>CREATE TABLE stat.IndexUsage</li></ul></li><li>Command File with code to call PowerShell<ul><li>GetIndexUsage.cmd</li></ul></li><li>PowerShell file to run all the things to get the data with TSQL and put it in the table above<ul><li>GetIndexUsage.ps1</li></ul></li><li>Job in SQL Agent to get the stats, scheduled for every hour</li><li>When all of this is put into place, the next step is to understand how to use the information that has been gathered. With the IterationId in place, the data can be sliced by iteration or by timeframe.</li></ul>



<pre class="lang:shell">REM This would go in the GetIndexUsage.cmd that would be called from the Agent Job in a CmdExec step.
@powershell c:\bin\GetIndexUsage.ps1
</pre>
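

<p>The SQL Agent job is then just a CmdExec step that calls the command file on a schedule. Here is a sketch of creating it with the msdb stored procedures (the job name and the hourly schedule are illustrative; adjust to your standards):</p>



<pre class="lang:SQL">USE msdb;
GO
EXEC dbo.sp_add_job @job_name = N'Gather Index Usage';
EXEC dbo.sp_add_jobstep @job_name = N'Gather Index Usage',
    @step_name = N'Run GetIndexUsage',
    @subsystem = N'CmdExec',
    @command = N'c:\bin\GetIndexUsage.cmd';
EXEC dbo.sp_add_jobschedule @job_name = N'Gather Index Usage',
    @name = N'Hourly',
    @freq_type = 4,              -- daily
    @freq_interval = 1,
    @freq_subday_type = 8,       -- repeat in units of hours
    @freq_subday_interval = 1;   -- every 1 hour
EXEC dbo.sp_add_jobserver @job_name = N'Gather Index Usage';
GO</pre>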



<pre class="lang:powershell">Import-Module dbatools
function Get-BmaIndexUsage
{
	param (
		[string]$ServerInstance,
		$SqlCredential,
		[int]$IterationId,
		[string]$ReportSqlInstance = "localhost",
		[string]$ReportSqlDb = "DBA"
	)
	
	# Scheduled Per Day
	# Run per Database
	
	$query = @"
select 
	@@servername as [server_name],
	db_name() as [database_name],
	db_id() as [database_id],
	t.object_id,
 	s.name as [schema_name],
	t.name as [table_name],
	ix.name as [index_name],
	ix.index_id,
	ius.user_seeks,
	ius.user_scans,
	ius.user_lookups,
	ius.user_updates,
	ius.last_user_seek,
	ius.last_user_scan,
	$IterationId AS [iterationid]
--INTO stat.IndexUsage
from sys.tables t 
inner join sys.indexes ix on t.object_id = ix.object_id
inner join sys.schemas s on t.schema_id = s.schema_id
left join sys.dm_db_index_usage_stats ius on ius.object_id = ix.object_id and ius.index_id = ix.index_id and ius.database_id = DB_ID()
WHERE t.is_ms_shipped = 0
ORDER BY [schema_name], [table_name], [index_id];
"@
	if($SqlCredential) {
		$s = Connect-DbaInstance -SqlInstance $ServerInstance -SqlCredential $SqlCredential
	}
	else {
		$s = Connect-DbaInstance -SqlInstance $ServerInstance 
	}	
	$dblist = $s.Databases.Name | Where-Object { $_ -notin @("master","model","msdb","tempdb") }
	
	foreach ($db in $dblist)
	{
		try
		{
			if($SqlCredential) {
				$dt = Invoke-DbaQuery -SqlInstance $ServerInstance -Database $db -Query $query -As DataTable -SqlCredential $SqlCredential
			}
			else {
				$dt = Invoke-DbaQuery -SqlInstance $ServerInstance -Database $db -Query $query -As DataTable
			}
			$dt | Write-DbaDbTableData -SqlInstance $ReportSqlInstance -Database $ReportSqlDb -Schema stat -Table IndexUsage 
			
		}
		catch
		{
			Write-Host "Error occurred in $($s.Name) - $($db) - IndexUsage: $($_.Exception.Message)"
		}
		
	}
}
Get-BmaIndexUsage -ServerInstance "localhost" -IterationId 1 </pre>



<p>The above PowerShell code would be run from the CMD file. It connects to the server specified at the bottom of the script, in this case &#8220;localhost&#8221; (change it to the real one). Specifying an IterationId of 1 gathers the data and stores it in the table with an IterationId of 1; the next run would use 2, and so forth. If you execute the following TSQL you can get a new IterationId from the dbo.Iterations table.</p>



<pre class="lang:sql">INSERT INTO dbo.Iterations (GatherDate)  OUTPUT inserted.IterationId VALUES (GetDate())</pre>
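

<p>In practice you would let the script create the iteration row each run instead of hardcoding the id. A sketch, assuming the DBA database and dbo.Iterations table from above:</p>



<pre class="lang:powershell"># Create a new iteration row, capture the generated id, and pass it to the gather function
$iteration = Invoke-DbaQuery -SqlInstance "localhost" -Database "DBA" -Query "
    INSERT INTO dbo.Iterations (GatherDate) OUTPUT inserted.IterationId VALUES (GETDATE());"
Get-BmaIndexUsage -ServerInstance "localhost" -IterationId $iteration.IterationId</pre>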



<p>A simple query to the sys.dm_db_index_usage_stats DMV will show what the statistics look like for an index. Key elements:</p>



<ul class="wp-block-list"><li>DB_ID() in the WHERE clause to look only at the database you are in. This DMV covers the entire instance, so narrow it down with database_id = DB_ID()</li><li>object_id of the table; you can even use OBJECT_ID(&#8216;tablename&#8217;)</li><li>Optionally, narrow it down further with the index_id as well</li></ul>
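

<p>Putting those key elements together, an ad-hoc look at a single table&#8217;s indexes might be (the table name is a placeholder):</p>



<pre class="lang:SQL">SELECT ix.name AS index_name,
    ius.user_seeks, ius.user_scans, ius.user_lookups, ius.user_updates
FROM sys.indexes ix
LEFT JOIN sys.dm_db_index_usage_stats ius
    ON ius.object_id = ix.object_id
    AND ius.index_id = ix.index_id
    AND ius.database_id = DB_ID()
WHERE ix.object_id = OBJECT_ID('dbo.MyTable');</pre>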



<p>Say that at 12:00 AM my indexes show 2 seeks, 1 scan, and 4 updates, as shown in the graphic. Then at 1:00 AM there are 10 seeks, 1 scan, and 40 updates. At 2:00 AM there are 10 seeks, 1 scan, and 100 updates, and the following hour at 3:00 AM there are 20 seeks, 4 scans, and 100 updates total. At 8:00 AM you get on the scene and find the final stats to be 100 seeks, 8 scans, and 200 updates.<br>Let&#8217;s digest this: we did not watch every hour, and when we finally got a chance to look at 8:00 AM the final stats were 100 seeks, 8 scans, and 200 updates. How do we know what happened in between? Because we have these statistics captured every hour by IterationId and datetime, I can know how many seeks, scans, and updates took place between 4:00 AM and 5:00 AM. Most monitoring software packages do not capture these stats at all, let alone capture them by hour, to give you the insights you need to understand how your index is being used, and even potentially when you may want to maintain it.</p>
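

<p>Because every capture writes a row to dbo.Iterations, a time window translates directly into iteration ids that you can plug into a comparison query. A sketch for the 4:00 AM to 5:00 AM question (the date is a placeholder; assumes hourly captures):</p>



<pre class="lang:SQL">-- Returns the two iterations bounding that hour; feed the ids into the diff query
SELECT IterationId, GatherDate
FROM dbo.Iterations
WHERE GatherDate &gt;= '2020-07-15 04:00' AND GatherDate &lt;= '2020-07-15 05:00'
ORDER BY GatherDate;</pre>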



<h3 class="wp-block-heading" id="h-scenario">Scenario</h3>



<p>Here is an example of a scenario that you can consider: I created 14 indexes on a table and I want to know a few things.</p>



<ul class="wp-block-list"><li>Are they being used at all?</li><li>Are they seeked or scanned?</li><li>How often are they being seeked or scanned?</li></ul>



<pre class="lang:SQL">SELECT
    I1.server_name,
    I1.database_name,
    I1.schema_name,
    I1.table_name,
    I1.index_name,
    I1.index_id,
    ISNULL(I1.user_seeks,0) as user_seeks_1,
    ISNULL(I2.user_seeks,0) as user_seeks_2,
    ISNULL(I2.user_seeks,0) - ISNULL(I1.user_seeks,0) as user_seeks_diff,
    ISNULL(I1.user_scans,0) as user_scans_1,
    ISNULL(I2.user_scans,0) as user_scans_2,
    ISNULL(I2.user_scans,0) - ISNULL(I1.user_scans,0) as user_scans_diff,
    ISNULL(I1.user_lookups,0) as user_lookups_1,
    ISNULL(I2.user_lookups,0) as user_lookups_2,
    ISNULL(I2.user_lookups,0) - ISNULL(I1.user_lookups,0) as user_lookups_diff,
    ISNULL(I1.user_updates,0) as user_updates_1,
    ISNULL(I2.user_updates,0) as user_updates_2,
    ISNULL(I2.user_updates,0) - ISNULL(I1.user_updates,0) as user_updates_diff
FROM stat.IndexUsage I1
INNER JOIN stat.IndexUsage I2 ON I1.server_name = I2.server_name 
        and I1.database_name = I2.database_name
        and I1.object_id = I2.object_id
        and I1.index_id = I2.index_id
WHERE 
    I1.IterationId = 4 
    AND I2.IterationId = 5
    AND I1.table_name = 'MyTable'
    AND I2.table_name = 'MyTable'
</pre>



<p>The output from the query is below. Since the iterations are sequential, this gives you insight into how the indexes on a table have behaved over the last hour. If your capture interval is longer than 1 hour, the diff simply represents one interval of whatever length you chose.</p>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="261" src="https://dbaduck.com/wp-content/uploads/2020/07/IndexUsageStats-1024x261.png" alt="" class="wp-image-4161" srcset="https://www.dbaduck.com/wp-content/uploads/2020/07/IndexUsageStats-1024x261.png 1024w, https://www.dbaduck.com/wp-content/uploads/2020/07/IndexUsageStats-300x77.png 300w, https://www.dbaduck.com/wp-content/uploads/2020/07/IndexUsageStats-768x196.png 768w, https://www.dbaduck.com/wp-content/uploads/2020/07/IndexUsageStats-1200x306.png 1200w, https://www.dbaduck.com/wp-content/uploads/2020/07/IndexUsageStats.png 1262w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /><figcaption>Index Usage trend difference</figcaption></figure>
</div></div>



<div class="wp-block-group"><div class="wp-block-group__inner-container is-layout-flow wp-block-group-is-layout-flow">
<p>As you can see, the query returns the _1 and _2 values and then a diff. If it is 1 hour of data, then you know that index 6 had 101 seeks in the last hour. Over time you can trend this data and see correlations with certain queries. Or, where you see lots of 0&#8217;s like above, you will know which indexes are actually used, and the heavily updated ones may be the ones needing maintenance more often.</p>



<p>Hopefully this was a good start to understanding how you can get some insights into your database with a little automation. Stay safe and Happy.</p>
</div></div>



<p>The post <a href="https://www.dbaduck.com/insights-from-your-database-using-powershell-and-tsql/">Insights from your Database using PowerShell and TSQL</a> appeared first on <a href="https://www.dbaduck.com">DBAduck</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.dbaduck.com/insights-from-your-database-using-powershell-and-tsql/feed/</wfw:commentRss>
			<slash:comments>3</slash:comments>
		
		
			</item>
		<item>
		<title>Microsoft has been confirmed as the Premium Sponsor for PASS Virtual Summit</title>
		<link>https://www.dbaduck.com/microsoft-has-been-confirmed-as-the-premium-sponsor-for-pass-virtual-summit/</link>
					<comments>https://www.dbaduck.com/microsoft-has-been-confirmed-as-the-premium-sponsor-for-pass-virtual-summit/#respond</comments>
		
		<dc:creator><![CDATA[dbaduck]]></dc:creator>
		<pubDate>Tue, 23 Jun 2020 20:06:32 +0000</pubDate>
				<category><![CDATA[SQL Server]]></category>
		<guid isPermaLink="false">https://dbaduck.com/?p=4159</guid>

					<description><![CDATA[<p>The formal announcement is here (Announcement). What does this mean to you? This means that you will have 1 on 1 time with Microsoft Engineers and highly sought after content from Microsoft for the Summit. If you have not had a chance to interact with the engineers at Microsoft with yours or your company&#8217;s issues, [&#8230;]</p>
<p>The post <a href="https://www.dbaduck.com/microsoft-has-been-confirmed-as-the-premium-sponsor-for-pass-virtual-summit/">Microsoft has been confirmed as the Premium Sponsor for PASS Virtual Summit</a> appeared first on <a href="https://www.dbaduck.com">DBAduck</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>The formal announcement is here (<a href="https://www.pass.org/AboutPASS/PASSNews/TabId/15340/ArtMID/23897/ArticleID/830/Microsoft-at-PASS-Virtual-Summit.aspx">Announcement</a>).</p>
<p>What does this mean to you? This means that you will have 1 on 1 time with Microsoft Engineers and highly sought after content from Microsoft for the Summit.</p>
<p>If you have not had a chance to interact with the engineers at Microsoft about your or your company&#8217;s issues, you are missing out and should consider <a href="https://www.pass.org/summit/2020/RegisterNow.aspx?utm_source=Partner&amp;utm_medium=Blogger&amp;utm_campaign=Microsoft_At_Virtual_Summit">registering</a> for the Summit this year in November. With registration you get this opportunity, as well as the chance to hear from MVPs and MCMs and to enjoy highly sought-after content from the SQL community of speakers and knowledge experts.</p>
<p>I am excited for this year&#8217;s Summit, and as you have heard out on the web, this is your opportunity to contribute to the betterment of PASS from a budget and community perspective. I will be there, and I hope to see you there. Have a great day!</p>
<p>The post <a href="https://www.dbaduck.com/microsoft-has-been-confirmed-as-the-premium-sponsor-for-pass-virtual-summit/">Microsoft has been confirmed as the Premium Sponsor for PASS Virtual Summit</a> appeared first on <a href="https://www.dbaduck.com">DBAduck</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.dbaduck.com/microsoft-has-been-confirmed-as-the-premium-sponsor-for-pass-virtual-summit/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Using Clipmate for great lookback ability  #tsql2sday</title>
		<link>https://www.dbaduck.com/using-clipmate-for-great-lookback-ability-tsql2sday/</link>
					<comments>https://www.dbaduck.com/using-clipmate-for-great-lookback-ability-tsql2sday/#respond</comments>
		
		<dc:creator><![CDATA[dbaduck]]></dc:creator>
		<pubDate>Tue, 09 Jun 2020 18:46:26 +0000</pubDate>
				<category><![CDATA[SQL Server]]></category>
		<guid isPermaLink="false">https://dbaduck.com/?p=4153</guid>

					<description><![CDATA[<p>#TSQL2sday is today. The theme is Tips and Tricks NOT related to SQL Server or RDBMS systems. My tip today is about a utility that I have been using for a long time thanks to Lars Rasmussen (b &#124; t) long time very good friend. The utility is called Clipmate and it is a clipboard [&#8230;]</p>
<p>The post <a href="https://www.dbaduck.com/using-clipmate-for-great-lookback-ability-tsql2sday/">Using Clipmate for great lookback ability  #tsql2sday</a> appeared first on <a href="https://www.dbaduck.com">DBAduck</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>#TSQL2sday is today. The theme is Tips and Tricks NOT related to SQL Server or RDBMS systems.</p>
<p>My tip today is about a utility that I have been using for a long time thanks to Lars Rasmussen (<a href="https://larsrasmussen.blogspot.com/">b</a> | <a href="https://twitter.com/nanodba">t</a>), a long-time and very good friend. The utility is called <a href="http://www.clipmate.com/">Clipmate</a> and it is a clipboard manager.&nbsp; It does cost money to use after the trial, but it is so useful that I could easily justify the price.</p>
<p>Here is how it works.&nbsp; Whenever you copy something (right-click Copy, or Ctrl+C) it ends up in the clipboard and <a href="http://www.clipmate.com">Clipmate</a> keeps track of it.&nbsp; Imagine going through your day, whether it is programming, or TSQLing, or just being on the web, without marking everything as a favorite in your browser. Imagine being able to just copy the URLs and move on and then, after you <a href="https://sqlstudies.com/2020/06/02/tsql-tuesday-127-invite-non-sql-tips-and-tricks/"><img loading="lazy" decoding="async" class="alignright wp-image-4154 size-full" src="https://dbaduck.com/wp-content/uploads/2020/06/t-sql-tuesday-250x250-1.png" alt="" width="250" height="250" srcset="https://www.dbaduck.com/wp-content/uploads/2020/06/t-sql-tuesday-250x250-1.png 250w, https://www.dbaduck.com/wp-content/uploads/2020/06/t-sql-tuesday-250x250-1-150x150.png 150w" sizes="auto, (max-width: 250px) 100vw, 250px" /></a></p>
<p>&nbsp;</p>
<p>are done, go into the <a href="http://www.clipmate.com">Clipmate</a> interface, grab them from the list, and drag them into a folder in <a href="http://www.clipmate.com">Clipmate</a>.&nbsp; This utility has its own store of where it puts things, and you can create your own folder structure in there too. You can then back up your store and restore it on another machine, or do other creative things.</p>
<p>The benefit is that everything you copy goes into this utility and is retrievable, so if you copied some code or something from a website and didn&#8217;t paste it then, you still have access to it in your <a href="http://www.clipmate.com">Clipmate</a> store.</p>
<p><a href="https://dbaduck.com/wp-content/uploads/2020/06/Clipmate_Explorer.png"><img loading="lazy" decoding="async" class="alignnone size-medium wp-image-4157" src="https://dbaduck.com/wp-content/uploads/2020/06/Clipmate_Explorer-300x52.png" alt="" width="300" height="52" srcset="https://www.dbaduck.com/wp-content/uploads/2020/06/Clipmate_Explorer-300x52.png 300w, https://www.dbaduck.com/wp-content/uploads/2020/06/Clipmate_Explorer-1024x177.png 1024w, https://www.dbaduck.com/wp-content/uploads/2020/06/Clipmate_Explorer-768x133.png 768w, https://www.dbaduck.com/wp-content/uploads/2020/06/Clipmate_Explorer-1200x207.png 1200w, https://www.dbaduck.com/wp-content/uploads/2020/06/Clipmate_Explorer.png 1210w" sizes="auto, (max-width: 300px) 100vw, 300px" /></a></p>
<p>You will see an Inbox; everything will land in the Inbox, and you can either let items pile up or create a folder structure on the left, inside of the Inbox or outside. You can see that the explorer indicates where each clip came from along with a line of the clip so that you can recognize it.&nbsp; The trick here is that if you click on one of the items in the explorer, it becomes the current item in the Clipboard at that moment, so you can go out into an application and Paste it and that will be the thing pasted.&nbsp; You will also see a graphic item in the list. I copied an image (like a Print Screen) into the clipboard, and I can click on that and then go paste it into Paint.</p>
<p>Hopefully this information can help you get more out of your clipboard. Lars has much better discipline than I do; he has a wealth of scripts he has saved into folders that do X or Y, and he uses them wherever he can. They are literally saved for posterity.&nbsp; He has also shared pieces of them with me in the past, which is super cool and helpful when you are trying to help others with things you have accumulated.</p>
<p>Enjoy your #TSQL2SDAY!</p>
<p>The post <a href="https://www.dbaduck.com/using-clipmate-for-great-lookback-ability-tsql2sday/">Using Clipmate for great lookback ability  #tsql2sday</a> appeared first on <a href="https://www.dbaduck.com">DBAduck</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.dbaduck.com/using-clipmate-for-great-lookback-ability-tsql2sday/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
