There are a lot of different answers to this question, most of which work for older versions of Windows but perhaps not Windows 10.

Once you’ve got the method down, though, it really is straightforward:

  1. Open an admin PowerShell window
  2. Create a self-signed cert for your domain. Here, I’m making a wildcard cert for *.test.com:
    New-SelfSignedCertificate -certstorelocation cert:\localmachine\my -dnsname *.test.com
  3. Create a password using ConvertTo-SecureString and export the cert. Replace the thumbprint and path with values for your environment:
    $CertPwd = ConvertTo-SecureString -String "foobarpassword" -Force -AsPlainText
    Export-PfxCertificate -cert cert:\localMachine\my\<thumbprint from new cert command above> -FilePath c:\dev\cert.pfx -Password $CertPwd
  4. Open Certificate Manager and import the certificate under Trusted Root Certification Authorities.
  5. Open IIS manager and create an HTTPS binding for the site using your new cert (if you'd rather script steps 4 and 5, see the sketch after this list).
  6. Reboot (to clear Chrome cache etc.)
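
If you'd rather script steps 4 and 5 instead of clicking through the GUIs, here's a rough sketch. It assumes the PKI and WebAdministration modules that ship with Windows 10 / IIS, plus a site named "Default Web Site" (adjust for your environment):

# Step 4: import the exported cert into Trusted Root Certification Authorities
Import-PfxCertificate -FilePath c:\dev\cert.pfx -CertStoreLocation Cert:\LocalMachine\Root -Password $CertPwd

# Step 5: add an HTTPS binding to the site and attach the new cert to it
Import-Module WebAdministration
New-WebBinding -Name "Default Web Site" -Protocol https -Port 443
$cert = Get-Item cert:\localMachine\my\<thumbprint from new cert command above>
$cert | New-Item "IIS:\SslBindings\0.0.0.0!443"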

I do a lot of work with SQL Server, and I’m often bouncing in and out of domains, using VPN to remote sites, etc. I always thought that you couldn’t connect using Windows Authentication to a SQL server in a different domain, so I’d remote desktop in and use SSMS on that machine.

But it turns out there’s a way!

  1. Open Administrative Tools > Credential Manager (Control Panel > Stored Usernames and Passwords in older versions).
  2. Add a Windows credential
    • Use the fully qualified domain name for the server, followed by the SQL Server port, for example: server.domain.com:1433
    • Enter your fully qualified domain user, for example: domain\user
    • Your domain password

Now you can connect to that server in SSMS! Use the fully qualified domain name there too (no port this time): server.domain.com. SSMS will still show your local user in the greyed-out user name field, but it will actually pick up the credentials you entered above.

Of course, the proper ports will have to be open, etc. but if you’re connecting through VPN that should be handled.
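
If you'd rather do step 2 from the command line than click through Credential Manager, cmdkey adds the same stored credential (hypothetical values shown; swap in your own server, user, and password):

cmdkey /add:server.domain.com:1433 /user:domain\user /pass:YourPasswordHere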

Yesterday, I wrote about how to grab Git commit statistics by author and date. I also work on some projects that use Subversion. It’s a bit tougher to grab insertion/deletion data with SVN, but I came up with a way: grab the diffs for a day and parse them.

Of course, because SVN has to go and be different, the date filter also doesn’t work how you expect – it includes the first rev before that day. There’s even a note on it in the docs: http://svnbook.red-bean.com/en/1.7/svn.tour.revs.specifiers.html (the “Is Subversion a Day Early” box at the bottom).

So we’ve got to skip the first revision. Unfortunately, PowerShell processes output line by line, so we’ve got to join all the lines together, split them into revisions with a regex, and then split the lines back out again.

This works, even if it isn’t elegant. As Blaise Pascal said, “I have only made this ... longer because I have not had the time to make it shorter.” If the PowerShell pros know a better way to do this, hit me up.

Just like the git version, swap out my name for yours and replace the dates with today's date:

((((svn log --search chris -r '{2016-01-25T00:00:00}:{2016-01-25T23:59:59}' --diff) -join "`n" -split "-+\nr\d+", 0, "multiline") | ? { $_ }) | select -skip 1) -split "\n" | # grab the day's log with diffs, join into one string, split into revisions, and skip the stray revision from the day before
Select-String '^([+-]) ' -AllMatches | # find the added/removed diff lines and capture the +/- marker
ForEach-Object { $_.Matches } | # pull in the regex matches
ForEach-Object { New-Object psobject -Property @{ Adds = if ($_.Groups[1].Value -eq "+") { 1 } else { 0 }; Deletes = if ($_.Groups[1].Value -eq "-") { 1 } else { 0 } } } | # tally each line as an add or a delete
Measure-Object -Property @("Adds","Deletes") -Sum # sum up the values

And the output:

Count    : 504
Average  :
Sum      : 402
Maximum  :
Minimum  :
Property : Adds

Count    : 504
Average  :
Sum      : 102
Maximum  :
Minimum  :
Property : Deletes

I read this post the other day: How To Compete Against Yourself: Don’t Do Your Best, Do Better Than Your Personal Average. It’s got some great points on how to establish better habits and be more productive.

One of the big points was to find measurable stats, track them, and try to beat your average (not your best, because constantly putting out that high a level of effort will burn you out).

I thought to myself, “self, what’s a good stat for coding?” It doesn’t have to be a perfect model, just something simple and traceable. I thought of grabbing lines changed per day.

I did a bunch of research, and there’s no easy way to get that data: not from the git tools, and not from GitHub’s stats.

So I wrote the following PowerShell snippet that grabs the entire git log by author and date, parses the insertion/deletion counts, and sums it all up:

git log --pretty=oneline --shortstat --no-notes -C --after 2016-01-22 --before 2016-01-23 --author=chris | # grab git log. replace dates and author
Select-String '(\d+) insertion.*?(\d+) deletion' -AllMatches | # find insertion/deletion line and capture values
ForEach-Object { $_.Matches } | # pull in the regex
ForEach-Object { New-Object psobject -Property @{ Adds = $_.Groups[1].Value; Deletes = $_.Groups[2].Value } } | # for each match, capture adds/deletes into a ps object
Measure-Object -Property @("Adds","Deletes") -Sum # sum up the values

Replace my name with yours and use the dates you want (after yesterday and before today) and you’ll get some useful output like:

Count    : 7
Average  :
Sum      : 704
Maximum  :
Minimum  :
Property : Adds

Count    : 7
Average  :
Sum      : 95
Maximum  :
Minimum  :
Property : Deletes

I’m sure someone with some better PowerShell-fu could simplify the snippet. If that’s you, let me know!

How do you get a numeric-only string, ignoring all non-numeric characters, using just standard SQL functions? There are REPLACE and pattern functions in SQL, but they only work on a string once. So barring 53 nested REPLACE calls, it’s tough. I’ve run into this issue many times before while writing import scripts and the like.
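
To see why it's painful with plain REPLACE, here's the naive approach for stripping just three specific characters (using the same placeholder table and column names as the query below); you'd need one more nested REPLACE for every additional character you want to remove:

SELECT REPLACE(REPLACE(REPLACE(FieldToStrip, 'A', ''), 'B', ''), '-', '')
FROM SourceTable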

Today, I had a stroke of brilliance: what if it could be done using FOR XML and spt_values to parse the entire string? A bit of googling and stack overflowing later, and some gluing of different parts together, and I came up with the following:

SELECT 
    (SELECT CAST(CAST((
        SELECT SUBSTRING(FieldToStrip, Number, 1)
        FROM master..spt_values
        WHERE Type='p' AND Number <= LEN(FieldToStrip) AND
            SUBSTRING(FieldToStrip, Number, 1) LIKE '[0-9]' FOR XML Path(''))
    AS xml) AS varchar(MAX)))
FROM
    SourceTable

Replace SourceTable with your table and FieldToStrip with your field, and away you go. If you want to include other allowed characters, change the pattern in the LIKE -- that specifies which characters to retain. All other characters will be removed.
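
To try it out without a real table, point it at a made-up one-row derived table. With the hypothetical phone number below, the query should return 5551234567:

SELECT
    (SELECT CAST(CAST((
        SELECT SUBSTRING(FieldToStrip, Number, 1)
        FROM master..spt_values
        WHERE Type='p' AND Number <= LEN(FieldToStrip) AND
            SUBSTRING(FieldToStrip, Number, 1) LIKE '[0-9]' FOR XML Path(''))
    AS xml) AS varchar(MAX)))
FROM
    (SELECT '(555) 123-4567' AS FieldToStrip) AS SourceTable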