Missing Biometric devices Node in Device Manager

If you have a Windows Hello compatible device (face or fingerprint) and you are unable to use it, make sure Device Manager has a node for Biometric devices. I have a Windows Hello Face capable laptop and lost the ability to log in with my face; that node was no longer there. After some back and forth with tech support, they were finally able to fix it. I’m blogging this since I could never find a good answer searching on my own and want to document the fix.

First, make sure your camera still works using your preferred app; if not, you’ll want to fix that first. Next, browse to the folder “C:\Windows\System32\WinBioPlugIns\FaceDriver”, right-click each of the two .inf files (Setup Information), and choose Install. Go back into Settings -> Accounts -> Sign-in Options and Windows Hello Face should be available again.
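If you prefer to script the driver install rather than right-click each file, the same thing can be sketched with pnputil. This is an assumption on my part, not what tech support ran: the /add-driver syntax requires Windows 10 version 1607 or later (older builds use pnputil -i -a instead), and you need an elevated prompt.

```powershell
# Re-register the Windows Hello Face driver packages (run elevated).
# Assumes Windows 10 1607+ for the /add-driver syntax.
Get-ChildItem -Path 'C:\Windows\System32\WinBioPlugIns\FaceDriver' -Filter '*.inf' |
    ForEach-Object {
        pnputil.exe /add-driver $_.FullName /install
    }
```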

Which Secure Protocols are Allowed on Your Servers

Here is a script to keep handy that tells you which Schannel protocols are enabled or disabled, useful for PCI or certificate compliance checks.

This one works well for local servers. If you want to query a remote server, look below.

$path = 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols'
Get-ChildItem -Path $path |
  Select-Object @{Name='Protocol'; Expression={Split-Path $_.PSPath -Leaf}},
                @{Name='Client Enabled'; Expression={(Get-ItemProperty -Path (Join-Path $_.PSPath 'Client') -ErrorAction SilentlyContinue).Enabled}},
                @{Name='Client DisabledByDefault'; Expression={(Get-ItemProperty -Path (Join-Path $_.PSPath 'Client') -ErrorAction SilentlyContinue).DisabledByDefault}},
                @{Name='Server Enabled'; Expression={(Get-ItemProperty -Path (Join-Path $_.PSPath 'Server') -ErrorAction SilentlyContinue).Enabled}},
                @{Name='Server DisabledByDefault'; Expression={(Get-ItemProperty -Path (Join-Path $_.PSPath 'Server') -ErrorAction SilentlyContinue).DisabledByDefault}} |
  Format-Table -AutoSize
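Once you know what is enabled, turning a protocol off is just a registry write. Here is a minimal sketch that disables TLS 1.0 for both the client and server roles; the key and value names follow the standard Schannel convention, and a reboot is required before Schannel picks up the change:

```powershell
# Disable TLS 1.0 for both roles (run elevated; reboot to apply).
$base = 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0'
foreach ($role in 'Client', 'Server') {
    $key = Join-Path $base $role
    if (-not (Test-Path $key)) { New-Item -Path $key -Force | Out-Null }
    New-ItemProperty -Path $key -Name 'Enabled' -Value 0 -PropertyType DWord -Force | Out-Null
    New-ItemProperty -Path $key -Name 'DisabledByDefault' -Value 1 -PropertyType DWord -Force | Out-Null
}
```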

This version works against remote servers; set $ComputerName to the server you want to query.

$ComputerName = 'RemoteServer'  #the server to query
$path = 'SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols'
$Reg = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('LocalMachine', $ComputerName)
$Reg.OpenSubKey($path).GetSubKeyNames() | ForEach-Object {
    $protocol = $_
    $cEnabled = $cDisabledByDefault = $sEnabled = $sDisabledByDefault = $null
    try {
        $cEnabled = $Reg.OpenSubKey("$path\$_\Client").GetValue('Enabled')
        $cDisabledByDefault = $Reg.OpenSubKey("$path\$_\Client").GetValue('DisabledByDefault')
    } catch {}
    try {
        $sEnabled = $Reg.OpenSubKey("$path\$_\Server").GetValue('Enabled')
        $sDisabledByDefault = $Reg.OpenSubKey("$path\$_\Server").GetValue('DisabledByDefault')
    } catch {}
    New-Object -TypeName PSObject -Property @{
        "Protocol" = $protocol
        "Client Enabled" = $cEnabled
        "Client DisabledByDefault" = $cDisabledByDefault
        "Server Enabled" = $sEnabled
        "Server DisabledByDefault" = $sDisabledByDefault
    }
} | Format-Table -Property "Protocol", "Client Enabled", "Client DisabledByDefault", "Server Enabled", "Server DisabledByDefault" -AutoSize
$Reg.Close()
Remove-Variable Reg

Display All SQL DateTime Conversions

If you want a quick way to determine which format, or style, to specify when converting a datetime value in SQL to a character data type, you can use this simple PowerShell script. Or you can look at the MSDN page.

0..200 | %{
    Invoke-Sqlcmd -ServerInstance . -Query ("SELECT num = {0}, CONVERT(varchar(100), GETDATE(), {0})" -f $_)
} | ft -AutoSize

An Amusing New Chapter of My Career

My New Office

Starting next week, that’s the first week of April, this will be my new office. Well, not really, but I’ll be working for the company that hosts this coaster, Six Flags. I’ll be overseeing all the SQL Server stuff for the entire organization. That means the back office stuff, plus what it takes to run all 19 parks. This will offer an interesting set of challenges that I haven’t had to face before. One example is uptime: during the day, availability must be 100%, as many times we won’t get a second chance to work with a customer; once they’ve gone home, they could be in a different state. Even though we must be up 100% of the time the parks are open, they don’t usually stay open 24 hours, which helps with maintenance windows.

This will be an exciting change of pace for me and working with an organization whose purpose is to entertain and amuse. I look forward to what this change will bring. Also, if I ever have a stressful day, I can go to the park that is nearby and ride a coaster to chill out. I’m sure I’ll be taking advantage of this a lot.

T-SQL Tuesday #028 – Jack of All Trades ie Integration Specialist

Argenis Fernandez (blog | twitter) is hosting the latest T-SQL Tuesday and asking about specialization.

For the earlier part of my career I was doing application development while really wanting to do database work. I also learned plenty about systems administration while in college. The variety of the tasks made me a Jack of All Trades with a lean toward databases. I wanted to specialize as a DBA or DB developer, but the jobs just weren’t there until I moved to a bigger market.

When I was able to specialize, I was happy I could focus on one thing and get really deep with it. Now I am able to get deep with SQL Server, but I find myself constantly recalling information I learned in my previous positions to help with current issues. CLR, PowerShell, and little custom written utilities all draw from my .NET experience, which I am grateful for. I run into other DBAs that have no clue what goes on outside of their world. Having multiple skills allows me to create better solutions as I don’t just consider what I’m responsible for, but all the connections to my piece as well.

Being a generalist early in my career has given me a better understanding of the whole stack of any given solution. It is easier for me to spot an integration problem or see a problem with an application’s implementation than it would have been without that variety of experience. Jacks of All Trades tend to be better at seeing the big picture as they don’t focus on just one area. They can be the ones that help everything fit together better. Specialists are an important part of our ecosystem, but without someone to put the pieces together, you would have perfect components in a flawed system. Solutions really are greater than the sum of their parts.

Extract CLR Assemblies from SQL Server

I’ve run into a few situations that required examining the existing CLR assemblies on a server. Whether I needed to do a comparison between two versions to make sure they are the same or confirm something in the assembly itself, this script has come in handy. Point it at a database, give it an output path and it will save all the assemblies in that database to dlls in the given folder. You can then use a .NET disassembler to confirm suspicions or a binary comparison to make sure the dll matches what it should.

<#
    .SYNOPSIS
        Extracts CLR Assemblies from a SQL 2005+ database.
    .DESCRIPTION
        Extracts CLR Assemblies from a SQL 2005+ database.
    .PARAMETER  ServerInstance
        The Server\Instance to connect to
    .PARAMETER  Database
        The Database to extract assemblies from
    .PARAMETER  OutPath
        The path to output the assemblies
    .EXAMPLE
        PS C:\> .\Get-SqlAssemblies.ps1 -ServerInstance 'MyServer\MyInstance' -Database 'MyDatabase'
        This example shows how to call Get-SqlAssemblies with named parameters.
#>
param (
    [string]$ServerInstance = 'LOCALHOST',
    [string]$Database,
    [string]$OutPath = '.'
)
#Correct for variations of incoming ServerInstance names
if(-not $ServerInstance.Contains('\')) {$ServerInstance += '\DEFAULT'}
if($ServerInstance.Contains(',') -and -not $ServerInstance.Contains('`,')) {$ServerInstance = $ServerInstance.Replace(',', '`,')}
Get-ChildItem SQLSERVER:\SQL\$ServerInstance\Databases\$Database\Assemblies | ForEach-Object {
    $_.SqlAssemblyFiles | ForEach-Object {
        $str = $_.Name
        $path = Join-Path $OutPath ($str.Substring($str.LastIndexOf('\')+1))
        Set-Content -Path $path -Value $_.GetFileBytes() -Encoding Byte
    }
}
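Once the assemblies are on disk, the binary comparison is a one-liner. A hypothetical usage, where the database, paths, and DLL names are placeholders for your own:

```powershell
# Extract the assemblies, then binary-compare one against a known-good build.
.\Get-SqlAssemblies.ps1 -ServerInstance 'MyServer\MyInstance' -Database 'MyDatabase' -OutPath 'C:\Temp\Assemblies'
fc.exe /b 'C:\Temp\Assemblies\MyAssembly.dll' 'C:\Builds\MyAssembly.dll'
```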

Split Typeperf Output into More Manageable Chunks

I really like using typeperf for perfmon counter collection. It allows me to save a collection of counters to monitor and store those readings in a csv file for later analysis. Sometimes I end up running the output through the PAL tool. Unfortunately, the PAL tool generates graphs that are fairly narrow. Monitoring sessions of long duration causes these graphs to be really cramped. I wanted a way to split the typeperf output to get a reasonable amount of data points in these graphs. A side benefit is the processing per file is a lot quicker.
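For context, a collection run looks something like this; the counter file name, interval, and output name are just examples, not what I use in production:

```powershell
# Collect the counters listed in counters.txt every 15 seconds into a CSV
# for later analysis (typeperf writes CSV by default with -o).
typeperf -cf counters.txt -si 15 -o SqlPerf.csv
```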

The script takes a filepath as its only argument. It splits by the hour and copies the header row to each new file.

param (
	[string]$filepath #incoming file
)
#Does the file exist
if (Test-Path $filepath) {
	$infile = Get-Item $filepath
	$data = Get-Content $infile
}
#if not, exit the script
else {
	Write-Warning "Failed to find $filepath"
	return
}
#Get the header to be able to repeat it in each file
$header = $data | Select-Object -First 1
$lastHour = $null
$outFile = $null
#Loop through the data, skipping the header line.
$data | Select-Object -Skip 1 | ForEach-Object {
	$date = [DateTime]::Parse([string]$_.Substring(1, [string]$_.IndexOf('"',1)-1))
	if($lastHour -eq $null -or $date.Hour -ne $lastHour.Hour -or $outFile -eq $null) {
		$lastHour = $date.AddMinutes(-$date.Minute).AddSeconds(-$date.Second)
		$outFile = Join-Path $infile.Directory ("{0}_{1}{2}" -f $infile.BaseName, $lastHour.ToString('yyyyMMdd_HHmmss'), $infile.Extension)
		$header | Out-File $outFile -Encoding UTF8
	}
	$_ | Out-File $outFile -Encoding UTF8 -Append
}

Update: I cleaned up the script a little based on Jeff’s comments below.

Automate CPU-Z Capture to Check for Throttled Processors

Several professionals have posted about checking whether or not your processors are running at full speed by using CPU-Z. Some recommendations are to check your servers every couple of months or, if virtual, every time your guest moves. Me being lazy efficient, I’d rather automate having these servers send me their info on a scheduled basis.

First things first. Be sure to copy CPU-Z out to all your servers, preferably to a consistent folder. Edit the cpuz.ini file to turn off the extra scanning (we just need the CPU info).
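The idea is to zero out everything except the CPU detection. A sketch of what that cpuz.ini might look like; the key names here are from memory and vary a little between CPU-Z versions, so check the ini that ships with your copy:

```ini
; Minimal cpuz.ini sketch: skip the slower scans, keep CPU detection.
[CPU-Z]
ACPI=1
PCI=1
DMI=0
Sensor=0
SMBus=0
Display=0
```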


Save the following as a script and schedule it to run on a monthly or so basis using your favorite scheduler. The email portion was taken from this post.

& '.\cpuz64.exe' "-txt=$env:COMPUTERNAME"
Start-Sleep -Seconds 15 #Give CPUZ enough time to generate output
$smtpServer = ""
$msg = New-Object Net.Mail.MailMessage
$smtp = New-Object Net.Mail.SmtpClient($smtpServer)
$msg.From = "emailadmin@test.com"
$msg.To.Add("dba@test.com") #set your recipient here
$msg.Subject = "[CPU-Z] $env:COMPUTERNAME"
$msg.Body = (gc "$env:COMPUTERNAME.txt" | ?{$_ -match "(Specification|Core Speed|Stock frequency)"}) -join "`r`n"
$smtp.Send($msg)

Script level upgrade for database ‘master’ failed

A few weeks ago one of our clustered server nodes blue-screened and when it came back on SQL refused to start. I started digging in and this is what I found in the Event Log: “Script level upgrade for database ‘master’ failed because upgrade step ‘sqlagent100_msdb_upgrade.sql’ encountered error 200, state 7, severity 25.” Immediately I started to suspect a corrupt master database. And I found this in the SQL Error Log:

Creating procedure sp_sqlagent_get_perf_counters…
Error: 468, Severity: 16, State: 9.
Cannot resolve the collation conflict between “SQL_Latin1_General_CP1_CI_AS” and “Latin1_General_CI_AS” in the equal to operation.
Error: 912, Severity: 21, State: 2.
Script level upgrade for database ‘master’ failed because upgrade step ‘sqlagent100_msdb_upgrade.sql’ encountered error 200, state 7, severity 25. This is a serious error condition which might interfere with regular operation and the database will be taken offline. If the error happened during upgrade of the ‘master’ database, it will prevent the entire SQL Server instance from starting. Examine the previous errorlog entries for errors, take the appropriate corrective actions and re-start the database so that the script upgrade steps run to completion.
Error: 3417, Severity: 21, State: 3.
Cannot recover the master database. SQL Server is unable to run. Restore master from a full backup, repair it, or rebuild it. For more information about how to rebuild the master database, see SQL Server Books Online.
SQL Trace was stopped due to server shutdown. Trace ID = ‘1’. This is an informational message only; no user action is required.

It turns out that the collation of msdb did not match master. Master was using SQL_Latin1_General_CP1_CI_AS while only msdb and one other database were using Latin1_General_CI_AS. The upgrade script that SQL Server ran at startup performed a join across those two databases, resulting in the error. I’m still not sure why this script ran, as no service packs or cumulative updates had been applied to this server recently and it was originally installed as 2008, not upgraded from 2005. With the upgrade script interrupted, the master database was left unusable.

I started going through the steps to restore master. I was able to start the server in single user mode, yet I could never get in as the single user. Something seemed to always be taking the connection before I could get in.

I had a long, long conversation with the #sqlhelp folks on Twitter about this. All of these guys and gals were tossing out help: @sqlinsaneo @kbriankelly @darrelllandrum @rusanu @DBArgenis @Kendra_Little @mrdenny @Bugboi @SQLSoldier. We stopped all services that were known to connect, stopped the browser service, changed the port, only allow IP connections, paused the cluster node I was working on, and still I was getting an error that only one admin was allowed at a time in single user mode.

Since I was getting nowhere fast, I decided on a different tack, namely to rebuild master and then recover from backup. Rebuilding master was a fairly quick process. An important note about rebuilding master is that it also rebuilds msdb, so now I had two databases to recover from backup. Once the rebuild was complete, I was able to start the service in single user mode and connect (yay!!!). Now I could restore a backup of master, restart SQL normally, and restore msdb. I finally had a running instance 2.5 hours later.
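For reference, the rebuild runs through SQL Server setup from the installation media. The shape of the command is below; the media path, instance name, sysadmin account, password, and collation are placeholders, not what I actually used. The net start line afterward restricts single user mode connections to sqlcmd, which helps when something else keeps stealing the one admin connection:

```powershell
# Rebuild the system databases from the SQL Server setup media (2008 syntax).
& 'D:\setup.exe' /QUIET /ACTION=REBUILDDATABASE /INSTANCENAME=MSSQLSERVER `
    /SQLSYSADMINACCOUNTS='DOMAIN\DBAGroup' /SAPWD='StrongPasswordHere' `
    /SQLCOLLATION=SQL_Latin1_General_CP1_CI_AS

# Then start in single user mode, limited to sqlcmd connections only.
net start MSSQLSERVER /m"SQLCMD"
```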

Important lessons learned from this experience:

  • If single user mode isn’t working and you have a good backup of msdb, rebuild then recover. Rebuilding might fix some issues and get you on the road to recovery quicker.
  • Collations of all the system databases really should match. This whole issue was because of a collation conflict between master and msdb.
Thanks again to everyone above who jumped in to help.

Sync SSMS Tools Pack Across Computers Using SyncToy

My favorite and most used feature of the SSMS Tools Pack is the query history. This feature has saved me a lot of time and effort. DBAs where I work are assigned both a desktop for normal day to day operations and a laptop for when we’re at home, in meetings, traveling, etc. Today I ran into a scenario that made me want to keep the query history of both machines in sync. I asked Mladen Prajdic, the creator, about syncing across computers and he directed me to a blog post written by REAL SQL Guy that does just that. Unfortunately, his method uses Dropbox, which is blocked at work, so I needed an alternative. I use SyncToy from Microsoft regularly to keep other folders in sync, so I decided to do the same for this project. Using SyncToy manually means that both machines need to be powered on and able to see each other on the network, since there is no intermediary.

The first thing I did was to share the C:\SSMSTools folder on each machine so I can trigger the sync from whichever computer I’m sitting at. I then created a folder pair on each machine specifying the “Left” folder as the remote shared folder and the “Right” folder as the local one. I made sure to use the “Synchronize” setting so that changes can go both ways. Give it a name, click Preview and Run. Now I have my entire history from both machines available from either one.
