### Archive

Archive for the ‘powershell’ Category

## SQL Saturday #57 Slides, Code & Video

This past weekend I spoke at SQL Saturday #57 in Houston, TX.

I want to thank the local PASS chapter HASSUG, the organizers Nancy Wilson (blog | twitter), Jonathan Gardner (blog | twitter), Malik Al-Amin (twitter), the volunteers, the other speakers, and the attendees. It was a great event with a great venue.

I especially want to thank those attendees who chose my sessions out of the many other great ones.

### Slides, Code & Video

So What Can I Do With PowerShell?

ORMs for the DBA

*Thanks to the MidnightDBAs for lending me their mic for this recording.


Those of us admins who download each SQL Cumulative Update when it comes out may find this script handy. When you request a CU from Microsoft, you get a plain-text email with links and passwords. Each link is to a different self-extracting zip file, with the corresponding password below it.

When a CU has many files, downloading them all becomes a long series of point-and-click. Then you have to manually enter the password for each file in order to extract it. I wanted an easier solution.
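My actual script isn't reproduced here, but the general approach can be sketched roughly like this. Everything in it is an illustrative assumption, not the original script: it assumes the CU email has been saved as `cu-email.txt` with each download URL followed by its password on the next non-blank line, and that 7-Zip (`7z.exe`) is on the PATH to handle the password-protected extraction.

```powershell
# Rough sketch only -- file names and email layout are assumptions.
Import-Module BitsTransfer

$lines = Get-Content "cu-email.txt" | Where-Object { $_.Trim().Length -gt 0 }
for ($i = 0; $i -lt $lines.Count - 1; $i++) {
    if ($lines[$i] -match 'https?://\S+\.exe') {
        $url      = $Matches[0]
        $password = $lines[$i + 1].Trim()
        $file     = Join-Path (Get-Location) ($url.Split('/')[-1])

        Start-BitsTransfer -Source $url -Destination $file   # download via BITS
        & 7z x $file "-p$password" -y                        # extract with the emailed password
    }
}
```

The pairing logic (URL line followed by password line) is the fragile part; if Microsoft changes the email format, the regex and offset would need adjusting.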


January 3rd, 2011

As a follow-up to my previous posts: I now present a script to get the SqlBits7 Videos using PowerShell and the power of BITS. I will upload the script as a file later tonight.

*** Edit: At the request of Simon Sabin, I have removed my script. I don’t want to unfairly drain their bandwidth. ***


I was playing around with BITS transferring backups between servers and discovered that BITS has PowerShell cmdlets. I was previously using the BITSAdmin Tool which was easy, but I wanted to find out more about the PoSh cmdlets. After reading the docs, I decided to use these cmdlets in my MCM video download script.

BITS stands for Background Intelligent Transfer Service. It is a service built into Windows since XP, designed to transfer files in the background without interrupting normal user network usage. Windows Update uses BITS to download all the patches that need to be applied to your computer. You can use BITS to transfer your own files around as well. If you regularly transfer large files over the network and get annoyed by how it affects your computer experience, look into replacing the normal copy, xcopy, and robocopy commands with it.

My first attempt queued all the files into one BITS transfer. The upside is having only one BITS job to manage. The downside is that the filenames aren't written until the whole job completes. Long download jobs often get cancelled, and the user would want to be able to listen to or watch the files that had already finished. I wanted that per-file behavior, so I went with one job per file.

The BITS cmdlets require PowerShell v2.0.
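The one-job-per-file pattern described above can be sketched like this (the `urls.txt` list and `C:\Downloads` folder are placeholders, not part of my original script):

```powershell
Import-Module BitsTransfer

# One asynchronous BITS job per file, so each file becomes usable
# as soon as its own job finishes.
$urls = Get-Content "urls.txt"
foreach ($url in $urls) {
    $dest = Join-Path "C:\Downloads" ($url.Split('/')[-1])
    Start-BitsTransfer -Source $url -Destination $dest -Asynchronous | Out-Null
}

# Poll for finished jobs and finalize them. BITS only renames the temp
# file to its real filename when Complete-BitsTransfer runs on the job.
# (Sketch only: jobs stuck in an Error state would loop forever here.)
while (Get-BitsTransfer) {
    Get-BitsTransfer | Where-Object { $_.JobState -eq "Transferred" } |
        Complete-BitsTransfer
    Start-Sleep -Seconds 5
}
```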


## Splitting SQL Server Logs by ProcessInfo

I wanted a way to parse through my SQL logs and look at one particular spid or ProcessInfo class such as Server, Backup, Logon, etc. I came up with the following PowerShell script.

*You must have trace flag 1204 and/or 1222 turned on. (More info)

```powershell
param($path)

if ($path -eq $null) { exit }

if (Test-Path $path) {
    "Parsing $path"
    $spid = $null
    gc $path | % {
        #if ($_.Length -gt 35 -and $_.Substring(24, 12).StartsWith("spid")) {
        if ($_ -match "^\d{4}") {
            $spid = $_.Substring(24, 12).Trim().PadRight(12, '_')
        } else {
            if ($_.Trim().Length -lt 1 -and $spid -ne $null) {}
            else { $spid = "Default".PadRight(12, '_') }
        }
        $outfile = $path.Substring(0, $path.LastIndexOf(".")) + [string]::Format(".{0}.txt", $spid)
        $_ | Out-File -FilePath $outfile -Encoding "ASCII" -Append
    }
} else {
    "Bad path given"
}
```

This works really well if you dump your log file via the following command:

```
sqlcmd -S MyServer\Inst1 -Q "xp_ReadErrorLog" -o myserver.log
```

Once you have it, just call the script and specify the -Path parameter like so:

```
& '.\Split SQL Log by ProcessInfo.ps1' -Path C:\path\to\myserver.log
```

The script looks at each line, figures out either the spid or the ProcessInfo class, and puts the line into a file with that string in its name. If it sees a blank line, it appends it to the previous file. WARNING: This can result in many files.

Yes, you could dump this to SQL or parse it some other way, but I wanted to split it into files so I can run it through another process to get all the deadlock information, which I'll be posting later.

Download script

## Download the MCM Readiness Videos with PowerShell

December 15th, 2010

Microsoft has changed their SQL MCM program to allow more people access to the course and certification. In doing so, they are recording much of the classroom training that was previously required and putting it up on TechNet. We're talking close to 60 videos already. Sure, you could stream them one at a time, but I wanted to be able to download them all and watch them offline without a bunch of clicking. So I built the PowerShell scripts below. They're really simple to use, and they have only one external requirement, which makes dealing with web pages much easier.
You'll need to download the Html Agility Pack library from here.

Get the script

Usage:

Get URLs of all WMV files:

```
& '.\Download MCM Videos.ps1'
```

Download all WMV files:

```
& '.\Download MCM Videos.ps1' -download
```

Download all iPod files:

```
& '.\Download MCM Videos.ps1' -download -type ipod
```

## Getting VLF stats for multiple databases

December 9th, 2010

After watching Paul Randal's (blog | twitter) MCM video on log files, I wanted a quick way to get VLF (Virtual Log File) statistics across multiple databases and servers. I went two routes: PowerShell and T-SQL. Our internal monitoring already uses T-SQL, so I wanted to be able to work easily within the existing framework. I also wanted a PowerShell version, since others might find it useful and I just like working with PowerShell.

### T-SQL Version

I wound up making my table structure the same as David Levy's in this blog post. Michelle Ufford has a very similar script at her blog post.

### PowerShell Version

Thanks to Aaron Nelson (blog | twitter) for giving me an idea to tweak this.

More on VLFs:

## Active / Active Cluster Server Memory Allocation

December 6th, 2010

We have several active/active clusters with multiple instances per node in our environment. We want to ensure the instances are spread as evenly as possible across the nodes. I wrote a quick PowerShell script that goes through a text file and gets the physical node where each instance is running, how much physical memory the node has, and how much memory the instance is configured to use. Just create a text file "servers.txt" with the list of instances on a cluster and run the following.
```powershell
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null

$servers = Get-Content "servers.txt"
foreach ($server in $servers) {
    $sqlServer = New-Object Microsoft.SqlServer.Management.Smo.Server $server
    [String]::Format("{0} uses {1}MB RAM. Resides on {2}.", $server,
        $sqlServer.Configuration.MaxServerMemory.ConfigValue,
        $sqlServer.ComputerNamePhysicalNetBIOS)

    $objCS = Get-WmiObject Win32_ComputerSystem -ComputerName $server.Substring(0, $server.IndexOf("\"))
    [String]::Format("{0}\{1}: {2}MB", $objCS.Domain, $objCS.Name,
        [Math]::Round($objCS.TotalPhysicalMemory / 1mb))
}
```
The formatting leaves much to be desired, but it’s a quick check and that was what I was interested in. Excel can do all the formatting and calculating later.
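One way to make that later Excel step easier (a sketch, not part of my original script; the output filename is a placeholder) is to emit objects instead of formatted strings and export them to CSV:

```powershell
# Sketch: assumes the SMO assemblies are already loaded as in the script above.
# Emits one object per instance, then Export-Csv produces a file Excel opens directly.
$results = foreach ($server in Get-Content "servers.txt") {
    $sqlServer = New-Object Microsoft.SqlServer.Management.Smo.Server $server
    $node      = $server.Substring(0, $server.IndexOf("\"))
    $objCS     = Get-WmiObject Win32_ComputerSystem -ComputerName $node

    New-Object PSObject -Property @{
        Instance    = $server
        Node        = $sqlServer.ComputerNamePhysicalNetBIOS
        MaxMemoryMB = $sqlServer.Configuration.MaxServerMemory.ConfigValue
        NodeRamMB   = [Math]::Round($objCS.TotalPhysicalMemory / 1mb)
    }
}
$results | Export-Csv "cluster-memory.csv" -NoTypeInformation
```

With objects, sorting and summing memory per node also becomes a one-liner with `Group-Object` and `Measure-Object` instead of a spreadsheet exercise.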