PSA

Facilitating KMS Activations

Occasionally we run into situations where Windows or the Office Suite fails to activate.

The solution: Use the built-in product-specific scripts to activate.

The problem: Many people in IT are not aware of these scripts or how to use them, which results in a guaranteed call or email to the appropriate team.

Since I’m all about empowering people, I figured I’d put together a little script to help facilitate all this.

Enter: KMSActivate-MicrosoftProducts

I probably could have just left it as ‘Activate-MicrosoftProducts’ but I wanted to make sure potential users knew this was specifically for KMS scenarios, not MAK, which has its own procedure.


[cmdletbinding()]
Param
    (
        [Parameter(Mandatory=$false)]
            [Switch]$ActivateOS = $true,

        [Parameter(Mandatory=$false)]
            [Switch]$ActivateOffice = $true
    )

Function Check-IfRunningWithAdministratorRights { If (!([Security.Principal.WindowsPrincipal] [Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole]"Administrator")) { Write-Warning "ERROR: You DO NOT have Administrator rights to run this script!`nPlease re-run this script as an Administrator!"; Break } }

Function Get-KMSHost { try { return (nslookup -type=srv _vlmcs._tcp | ? { $_ -like '*svr hostname*' }).Replace('svr hostname','').Replace('=','').Trim() } catch { throw $_ } }

Function Get-KMSClientSetupKey
    {
        # https://technet.microsoft.com/en-us/library/jj612867(v=ws.11).aspx
        [cmdletbinding()]
        Param
            (
                [Parameter(Mandatory=$false)]
                    [string]$OperatingSystemCaption = (Get-CimInstance -ClassName win32_operatingsystem).caption
            )

        Switch -Wildcard ($OperatingSystemCaption)
            {
                '*Windows Server 2016 Datacenter*' { return 'CB7KF-BWN84-R7R2Y-793K2-8XDDG' }

                '*Windows Server 2016 Standard*' { return 'WC2BQ-8NRM3-FDDYY-2BFGV-KHKQY' }

                '*Windows Server 2016 Essentials*' { return 'JCKRF-N37P4-C2D82-9YXRT-4M63B' }

                '*Windows 10 Enterprise*' { return 'NPPR9-FWDCX-D2C8J-H872K-2YT43' }

                '*Windows 7 Enterprise*' { return '33PXH-7Y6KF-2VJC9-XBBR8-HVTHH' }

                default { write-host "ERROR INVALID OPERATING SYSTEM CAPTION: $_"; throw }
            }
    }

Function KMSActivate-OperatingSystem
    {
        [cmdletbinding()]
        Param
            (
                [Parameter(Mandatory=$false)]
                    [string]$KMSHost = $(Get-KMSHost),

                [Parameter(Mandatory=$false)]
                    [string]$KMSClientSetupKey = $(Get-KMSClientSetupKey)
            )

        Start-Process -FilePath 'cscript' -ArgumentList "/nologo `"$env:windir\System32\slmgr.vbs`" -dlv" -Wait

        Start-Process -FilePath 'cscript' -ArgumentList "/nologo `"$env:windir\System32\slmgr.vbs`" -skms $KMSHost" -Wait

        Start-Process -FilePath 'cscript' -ArgumentList "/nologo `"$env:windir\System32\slmgr.vbs`" -ipk $KMSClientSetupKey" -Wait

        Start-Process -FilePath 'cscript' -ArgumentList "/nologo `"$env:windir\System32\slmgr.vbs`" -ato" -Wait

        Start-Process -FilePath 'cscript' -ArgumentList "/nologo `"$env:windir\System32\slmgr.vbs`" -dlv" -Wait
    }

Function KMSActivate-OfficeSuite
    {
        [cmdletbinding()]
        Param
            (
                [Parameter(Mandatory=$false)]
                    [string]$KMSHost = $(Get-KMSHost)
            )

        #write-host "KMSHost [$KMSHost]"

        [System.Collections.ArrayList]$OfficeInstallationDirs = @()
        foreach($ProgFilePath in $env:ProgramFiles,${env:ProgramFiles(x86)})
            {
                if(!(Test-Path -Path $ProgFilePath -PathType Container)) { continue }
                foreach($OfficeVersion in (gci "$ProgFilePath\Microsoft Office" -Filter Office* -ErrorAction SilentlyContinue)) { $OfficeInstallationDirs += $OfficeVersion.Fullname }
            }

        foreach($OfficeInstallationDir in $OfficeInstallationDirs)
            {
                if(!(Test-Path -Path "$OfficeInstallationDir\ospp.vbs" -PathType Leaf)) { continue }

                Start-Process -FilePath 'cscript' -ArgumentList "/nologo `"$OfficeInstallationDir\ospp.vbs`" /dstatusall" -Wait

                Start-Process -FilePath 'cscript' -ArgumentList "/nologo `"$OfficeInstallationDir\ospp.vbs`" /sethst:$KMSHost" -Wait

                Start-Process -FilePath 'cscript' -ArgumentList "/nologo `"$OfficeInstallationDir\ospp.vbs`" /act" -Wait

                Start-Process -FilePath 'cscript' -ArgumentList "/nologo `"$OfficeInstallationDir\ospp.vbs`" /dstatusall" -Wait
            }
    }

Check-IfRunningWithAdministratorRights

if($ActivateOS -eq $true) { KMSActivate-OperatingSystem }

if($ActivateOffice -eq $true) { KMSActivate-OfficeSuite }

 

The script will attempt to:

  • Confirm it’s running elevated, otherwise it’ll quit
  • Determine the KMS server (or use the one supplied)
  • Determine the OS to set the correct client setup key
  • Determine the version(s) of Office installed (if any)
  • Activate Windows 7, Windows 10 and a few flavors of Server 2016
  • Activate any versions of Office it finds

Not saying this is the best way – just a way.  I merely wanted a turn-key solution for our IT staff, and this is what made sense as it solves some of the infrequent issues that occasionally arise.
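Saved as KMSActivate-MicrosoftProducts.ps1 (the file name is just my assumption), running it from an elevated PowerShell prompt looks something like this:

# Run with the defaults - activates both Windows and Office
.\KMSActivate-MicrosoftProducts.ps1

# Office only - explicitly turn off the OS portion (both switches default to $true)
.\KMSActivate-MicrosoftProducts.ps1 -ActivateOS:$false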

Good Providence to you!


Backing up Recovery Keys to MBAM and AD During OSD

Scenario

As we prepared for our Windows 10 roll out, we had MBAM all set up and ready to go when a wise man suggested we back up the keys to AD too.  I was a little perplexed: In my mind this is redundant since that’s what MBAM is supposed to do.  Can’t we just trust MBAM to do its thing?  But then the same wise man dropped a statement that I totally agreed with:

“I don’t want to be the one to have to explain to our CIO that we have no way of unlocking some VIP’s machine.”

Neither did I.

Here’s a high level overview of how we setup MBAM during OSD.
It’s not the best way and it’s not the only way.  It’s just a way.

Prerequisites:

  1. Export your BitLocker registry settings from a properly configured machine
  2. Edit the export, set the ‘ClientWakeupFrequency‘ to something low like 5 minutes
  3. Edit the export, set the ‘StatusReportingFrequency‘ to something low like 10 minutes
  4. Package up the .REG file as part of your MBAM client installation
    • This could either be a true Package, but I would recommend an Application that runs a wrapper to import the registry configuration; or create an MST; or add it to the original MSI.

Task Sequence Setup

  1. Wait until the machine is in real Windows, not WinPE
  2. Install the MBAM client (obviously!)
  3. Reboot
  4. Stop the MBAM service – We need to do this so that the settings we make below take effect (steps 4 through 8 are sketched in PowerShell after this list)
  5. Set the MBAM service to start automatically without delay – Want to make sure it fires as soon as possible.
  6. Import your BitLocker registry settings you exported & edited
    • This is the real meat: Since GPO’s are not applied during OSD, your GPO policies won’t reach the machine during the imaging process.  This will ensure your policies are in play as soon as possible.
    • Most places don’t set the  ‘ClientWakeupFrequency‘ and/or ‘StatusReportingFrequency‘ values to something insanely low via GPO which is why we manually edited the .REG file.  If you left them at the default values, the keys wouldn’t get escrowed for a few hours due to the way the MBAM client works.
  7. Optional but Recommended: Switch to AES-XTS-256 by setting ‘EncryptionMethodWithXtsOs’ in ‘HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\FVE’ to ‘7’
  8. Start the MBAM service
  9. Enable BitLocker using the MBAM Deployment Scripts
  10. Reboot the machine
  11. Continue with your normal imaging process
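For reference, here’s a rough PowerShell sketch of steps 4 through 8 as a single ‘Run PowerShell Script’ step.  The service name, .REG path and deployment script name are assumptions on my part – verify them against your MBAM client install:

# Steps 4 & 5: stop the MBAM agent and set it to start automatically (no delay)
# 'MBAMAgent' is the service name I'd expect - confirm it on a machine with the client installed
Stop-Service -Name 'MBAMAgent' -Force
Set-Service -Name 'MBAMAgent' -StartupType Automatic

# Step 6: import the BitLocker registry settings exported & edited earlier (path is hypothetical)
Start-Process -FilePath 'reg.exe' -ArgumentList 'import "C:\Temp\MBAM-BitLockerSettings.reg"' -Wait

# Step 7: switch to AES-XTS-256
New-Item -Path 'HKLM:\SOFTWARE\Policies\Microsoft\FVE' -Force | Out-Null
Set-ItemProperty -Path 'HKLM:\SOFTWARE\Policies\Microsoft\FVE' -Name 'EncryptionMethodWithXtsOs' -Value 7 -Type DWord

# Step 8: start the MBAM agent back up
Start-Service -Name 'MBAMAgent'

# Step 9 would then call the MBAM deployment script (e.g. Invoke-MbamClientDeployment.ps1) to enable BitLocker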

The Good

  • We’ve not run into machines with improper configurations.
  • Every machine is encrypted using
    • full disk encryption versus used space
    • leverages AES-XTS-256
  • Keys are quickly escrowed to both AD and MBAM.
  • It just works: Deployed with 1511, we’re moving to 1607 and IT is testing 1703.

The Bad

  • I couldn’t figure out how to perform full disk AES-XTS-256 encryption in WinPE so this has to happen when we’re in a real OS.
    • I tried setting the keys via the registry but didn’t bother editing WSF files or trying to reverse engineer what goes on in that step to see if I could make it work.
  • Encryption does NOT begin until after someone logs on.
  • Encryption takes a while (but not too long) on SSDs.

In Closing

I would really like to hear from others on this one.  Because it works – and has been working for over a year now – we really couldn’t justify dedicating bandwidth to exploring this further.  So we left it as-is.  My brain would like to see it work ‘properly’ one day, but that’ll have to wait.

Good Providence to you!

PSA: Disabling Bluetooth via PowerShell in Response to BlueBorne

Please note that nearly all of the machines in our environment are on Windows 10 so this is written with that in mind.

With the recent news about BlueBorne:

It’s been an interesting day!

Since Bluetooth is generally enabled in our environment, one school of thought is to reduce the attack surface by disabling Bluetooth across the board, then re-enable where necessary as some users have Bluetooth keyboards, mice and other peripherals.

There were two approaches:

  • Disable it in the BIOS (more on that later)
  • Disable it in Device Manager

Since the latter was more universal, and I’ll explain my perspective on that, I slapped together something basic to disable Bluetooth devices on the system.  I haven’t tested it extensively but on all of my Lenovo ThinkPad laptops, it’s worked without issue.

Disable-Bluetoothv1


Function Disable-Bluetoothv1
    {
        foreach($BTDevice in $(Get-PnpDevice -FriendlyName '*bluetooth*' -Status OK -ErrorAction SilentlyContinue))
            {
                if(Get-PnpDevice -FriendlyName $BTDevice.FriendlyName -Status OK -ErrorAction SilentlyContinue)
                    {
                        try { Disable-PnpDevice -InstanceId $BTDevice.InstanceId -Verbose -Confirm:$false -WhatIf }
                        catch { Write-Output "ERROR DISABLING [$($BTDevice.FriendlyName)] @ [$($BTDevice.InstanceId)]:`r`n$_" }
                    }
            }
    }

Disable-Bluetoothv2


Function Disable-Bluetoothv2
    {
        foreach($BTDevice in $(gwmi -Class Win32_PnPEntity | ? { (($_.Caption -like '*bluetooth*') -or ($_.Description -like '*bluetooth*')) -and $_.Status -eq 'OK' }))
            {
                if(gwmi -Class Win32_Pnpentity -Filter "Caption='$($BTDevice.Caption)' AND Status='OK'")
                    {
                        try { Invoke-WmiMethod -InputObject $BTDevice -Name 'Disable' -Verbose -WhatIf }
                        catch { Write-Output "ERROR DISABLING [$($BTDevice.Caption)][$($BTDevice.Description)] @ [$($BTDevice.DeviceID)]:`r`n$_" }
                    }
            }
    }

Both work and I personally don’t have a preference.  It’s really just a tomato tomahtoe / potaytoh potato / six of one half dozen of another type situation.

Also, be sure to remove the -WhatIf parameter if you decide to use it!
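And since some folks really do need their Bluetooth keyboards and mice, re-enabling is just the reverse.  Here’s a hedged sketch in the same style (I haven’t tested this nearly as much as the disable path, so it keeps -WhatIf too):

Function Enable-Bluetoothv1
    {
        foreach($BTDevice in $(Get-PnpDevice -FriendlyName '*bluetooth*' -ErrorAction SilentlyContinue | ? { $_.Status -ne 'OK' }))
            {
                try { Enable-PnpDevice -InstanceId $BTDevice.InstanceId -Verbose -Confirm:$false -WhatIf }
                catch { Write-Output "ERROR ENABLING [$($BTDevice.FriendlyName)] @ [$($BTDevice.InstanceId)]:`r`n$_" }
            }
    }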

So here’s what my Lenovo laptop looked like before:

BTBefore

And here’s what it looks like after:

BTAfter

Good Providence and be safe out there!

PSA: Locating Bad/Corrupt Registry.POL Files

For several months – a good chunk of 2017 in fact – we’ve encountered machines where Group Policy failed as evidenced by the following on affected machines:

  • Seeing errors like the following when running gpupdate /force:

    Computer policy could not be updated successfully.   The following errors were encountered:
    The processing of Group Policy failed.  Windows could not apply the registry-based settings for the Group Policy object LocalGPO.  Group Policy settings will not be resolved until this event is resolved.  View the event details for more information on the file name and path that caused the failure.
    To diagnose the failure, review the event log or run GPRESULT /H GPReport.html from the command line to access information about Group Policy results.

  • Event ID 1096 which itself would show ErrorCode 13 with an ErrorDescription of ‘The data is invalid.’ along with the problematic Registry.POL file.

This is a well-known and well-documented problem.

The cause: Unknown but we have some suspects.

The fix is easy: Delete or rename the problematic Registry.pol file, which so far is always in %SystemRoot%\System32\GroupPolicy\Machine

But that’s reactionary and we want to be as proactive as possible.

Enter: Test-IsRegistryPOLGood

I spent a bunch of time trying to figure out an intelligent manner of doing this, and after a lot of trial & error I was seeing inconsistent results.  After really getting into the weeds I wondered whether or not the structure/format of the file was documented.  Turns out it is: https://msdn.microsoft.com/en-us/library/aa374407(v=vs.85).aspx

The header contains the REGFILE_SIGNATURE expressed as 0x67655250 which is 80, 82, 101, 103 in bytes:


[System.BitConverter]::GetBytes(0x67655250)

The header occupies the first 4 bytes of the file, so we read those and evaluate them:


Get-Content -Encoding Byte -Path 'path\to\someRegistry.pol' -TotalCount 4

On a good file, this returns an array consisting of 80, 82, 101, 103 which is exactly what we want.

On a bad file – or at least all the ones I had access to – it returned all zeros.

To examine the file via PowerShell:


Function Test-IsRegistryPOLGood
    {
        [cmdletbinding()]
        Param
            (
                [Parameter(Mandatory=$false)]
                    [string[]]$PathToRegistryPOLFile = $(Join-Path $env:windir 'System32\GroupPolicy\Machine\Registry.pol')
            )

        if(!(Test-Path -Path $PathToRegistryPOLFile -PathType Leaf)) { return $null }

        [Byte[]]$FileHeader = Get-Content -Encoding Byte -Path $PathToRegistryPOLFile -TotalCount 4

        if(($FileHeader -join '') -eq '8082101103') { return $true } else { return $false }
    }

 

However, I wasn’t sure how we were going to implement this, so I explored doing this via VBScript and found ADO to be the ideal (only?) way:


Option Explicit
Dim arrRegistryPolFiles
if(WScript.Arguments.Count > 0) then
    arrRegistryPolFiles = Array(WScript.Arguments(0))
else
    arrRegistryPolFiles  = Array(CreateObject("WScript.Shell").ExpandEnvironmentStrings("%WINDIR%") & "\System32\GroupPolicy\Machine\Registry.pol")
end if

Dim POLFile
For each POLFile in arrRegistryPolFiles
    if (CreateObject("Scripting.FileSystemObject").FileExists(POLFile)) then

        if(Join(ReadBinaryData(POLFile,3),"") = "8082101103") then
            wscript.echo True & vbtab & POLFile
        else
            wscript.echo False & vbtab & POLFile
        end if
    end if
next

wscript.quit

Function ReadBinaryData(Required_File,Int_Byte_Count)
' https://docs.microsoft.com/en-us/sql/ado/reference/ado-api/stream-object-ado
' https://docs.microsoft.com/en-us/sql/ado/reference/ado-api/stream-object-properties-methods-and-events

Const adTypeBinary = 1

' Requires Read/Write, otherwise it fails with Operation is not allowed in this context.
Const adModeReadWrite = 3

Dim arrByteArray : arrByteArray = Array(-1)
With CreateObject("ADODB.Stream")
    .Mode = adModeReadWrite
    .Type = adTypeBinary
    .Open
    .LoadFromFile Required_File

    Dim i
    For i=0 To Int_Byte_Count
        arrByteArray(i) = AscB(.Read(1))
        ReDim Preserve arrByteArray(UBound(arrByteArray)+1)
    Next
    .Close
End With

ReadBinaryData = arrByteArray
end function

 

After testing this out on our known good and known bad registry.pol files, we cast our rod (the code) into a special script all machines run to see if we’d catch any fish and sure enough we did!

From there it was just a matter of deciding how to handle the bad files: Alert IT for manual remediation or fix it during execution.
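If you go the self-healing route, a minimal sketch using the function above might look like this (we rename rather than delete so there’s something to look at afterwards):

$PolFile = Join-Path $env:windir 'System32\GroupPolicy\Machine\Registry.pol'
if((Test-IsRegistryPOLGood -PathToRegistryPOLFile $PolFile) -eq $false)
    {
        # Rename the corrupt file so Group Policy can rebuild it, then force a refresh
        Rename-Item -Path $PolFile -NewName "Registry.pol.BAD.$(Get-Date -Format 'yyyyMMdd_HHmmss')" -Force
        Start-Process -FilePath 'gpupdate.exe' -ArgumentList '/force' -Wait
    }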

Here’s something ‘odd’ to me: When I convert the bytes 80, 82, 101, 103 into HEX, I get 50, 52, 65, 67 which is the reverse of 67655250.  Does anyone know why that is?  I believe it has to do with Intel processors being ‘Little Endian’: the least significant byte is stored first in memory (and on disk), so the byte order looks reversed compared to the hex literal.  But that’s more than a little beyond me!
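A quick way to see it for yourself:

[System.BitConverter]::IsLittleEndian            # True on x86/x64
[System.BitConverter]::GetBytes(0x67655250)      # 80 82 101 103 - least significant byte first
'{0:X2} {1:X2} {2:X2} {3:X2}' -f 80,82,101,103   # 50 52 65 67 - the same bytes shown as hex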

In any event, hopefully someone will find this as useful as we did!

Good Providence to you!

Practical Use: Manipulating XML Files With PowerShell

In my previous post I talked about getting values from XML files.  Today we’re going to update a value in an XML file.

As before, I’m not a PowerShell or XML guru, but what I’m going to cover below works for me and I can’t think of a reason it wouldn’t work for you.

We’re going to continue using the sample XML file provided by Microsoft.  Copy the XML content & paste into a new file, books.xml, & save it some place convenient.

Getting the Current Value

We want to change an author’s name from Stefan to Stephane.  First, let’s find the entries that we need to change:

[xml]$XML = Get-Content "C:\Users\Julius\Downloads\books.xml"
$XML.catalog.book | ? { $_.author -like '*stefan' }

XML-SetValue-001

Great, only one result, so we don’t need to loop!

Setting the New Value

We know the property we want is author so we can simply do something like this to set the new value:


($XML.catalog.book | ? { $_.author -like "*stefan" }).author = "Knorr, Stephane"

But I find it helpful to do something like this instead:

$Node = $XML.catalog.book | ? { $_.author -like "*stefan" }

"Old Author: {0}" -f $Node.author

$Node.author = $Node.author.Replace('Stefan','Stephane')

"New Author: {0}" -f $Node.author

XML-SetValue-002.PNG

If you had multiple entries, as is the case with author Eva Corets, you could do this:

foreach($Result in ($XML.catalog.book | ? { $_.author -like "Corets*" }))
    {
        write-host "Book #$($Result.id) Written by $($Result.author)"
        $Result.author = 'Mendez, Eva'
        write-host "Book #$(Result.id) Updated to $($Result.author)`r`n
    }

XML-SetValue-003.PNG

Saving the Updated XML

The save operation is super easy:

$XML.Save("C:\Users\Julius\Downloads\books.xml")

Done!
But I’m a big fan of having backup copies, so I opt for something like:


[string]$XMLFile = "C:\Users\Julius\Downloads\books.xml"
[xml]$XML = Get-Content $XMLFile
foreach($Result in ($XML.catalog.book | ? { $_.author -like "Corets*" }))
    {
        "Book #{0} Written by {1}" -f $Result.id,$Result.author
        $Result.author = 'Mendez, Eva'
        "Book #{0} Updated to {1}`r`n" -f $Result.id,$Result.author
    }
Copy-Item -Path $XMLFile -Destination "$XMLFile.ORIG.$(Get-Date -Format 'yyyyMMdd_HHmmss')"
$XML.Save($XMLFile)

 

Function Set-XMLValue

Try to think of this function a more of a framework than a complete solution.  I recently had to update a few different XML files, some of which were on a few hundred machines and used this to do the heavy lifting.

Function Set-XMLValue
    {
        Param
            (
                [Parameter(Mandatory=$true)]
                    [string]$XMLFile,

                [Parameter(Mandatory=$true)]
                    [string]$XMLTreePath,

                [Parameter(Mandatory=$true)]
                    [string]$Property,

                [Parameter(Mandatory=$false)]
                    $OldValue,

                [Parameter(Mandatory=$true)]
                    $NewValue,

                [Parameter(Mandatory=$false)]
                    [string]$NewFile = $null
            )

        if(!(Test-Path -Path $XMLFile -PathType Leaf)) { return 2 }

        [bool]$DoUpdate = $false

        Try
            {
                [xml]$XML = Get-Content $XMLFile

                $XMLPath = '$XML.' + $XMLTreePath

                Foreach($Node in (Invoke-Expression $XMLPath | ? { $_.$Property -ieq "$OldValue" }))
                    {
                        # Check to confirm that particular property exists
                        if([string]::IsNullOrEmpty($Node)) { Write-host "ERROR: NO PROPERTY [$Property] FOUND CONTAINING ORIGINAL VALUE [$OldValue]"; [int]$Return = 2 }

                        # Get current value from XML
                        $CurrValue = $Node.$Property

                        # Phase 1: Analysis of parameters and values

                        # Check if the old value was specified
                        if($OldValue)
                            {
                                # When the old value is specified, the script check if the current value matches the old value.

                                # If the current value matches old value and will need to be updated
                                if($CurrValue -eq $OldValue) { $DoUpdate = $true }

                                # If the current value doesn't match the old value but matches the new, its already up to date
                                Elseif($CurrValue -eq $NewValue) { [int]$Return = 0 }

                                # If the current value doesn't match the old or new value we won't change anything but return the current value
                                Else { [string]$Return = "WARNING: The current value [$CurrValue] did not match the specified [$OldValue] so NO changes were made." }

                            }
                        # If an old value was not specified, we'll update regardless of the current value
                        Else
                            {
                                # If the current value doesn't match the new value it will need to be updated
                                if($CurrValue -ne $NewValue) { $DoUpdate = $true }

                                # If the current value matches the new value its already up to date
                                elseif($CurrValue -eq $NewValue) { [int]$Return = 0 }
                            }

                        # Phase 2: Performing the update if deemed necessary
                        If($DoUpdate -eq $true)
                            {
                                # Update value
                                $Node.$Property = [string]$NewValue

                                # If we're writing to a new file, we don't need to back up the original file.
                                if(![string]::IsNullOrEmpty($NewFile)) { $XML.Save($NewFile) }
                                Else
                                    {
                                        # Backup existing XML
                                        Copy-Item -Path $XMLFile -Destination "$XMLFile.ORIG.$(Get-Date -Format 'yyyyMMdd_HHmmss')" -Force -ErrorAction Stop

                                        # Save new/updated XML (overwrite existing)
                                        $XML.Save($XMLFile)
                                    }

                                # Success!
                                [int]$Return = 0
                            }
                    }
            }
        Catch { Write-Warning "ERROR DOING XML UPDATE OPERATION for [$XMLFile]: $_"; [int]$Return = 1 }
        return $Return
    }

$UpdateResult = Set-XMLValue -XMLFile "C:\Users\Julius\Downloads\books.xml" -XMLTreePath catalog.book -Property author -OldValue 'Knorr, Stefan' -NewValue 'Knorr, Stephane' -NewFile "C:\Users\Julius\Downloads\newbooks.xml"

write-host "UpdateResult [$UpdateResult]"

This may not be fancy – and it’s arguably more complicated than it needs to be – but if you’re dealing with multiple XML files, PowerShell can be a huge timesaver.

 

Good Providence!

Practical Use: Getting Values from XML Files

A number of applications we use in the organization have XML-based configuration files and occasionally the need arises to validate the settings from the configuration.  For example, we might want to verify that a user is pointed to the right server, or that a particular setting is set to X.  We could treat XML files the same way we would a typical text file, as seen here, but where’s the fun in that?

I’m not a PowerShell or XML guru, but what I’m going to cover below works for me and I can’t think of a reason it wouldn’t work for you.

To keep things simple, we’ll be working with a sample XML file provided by Microsoft.  Copy the XML content & paste into a new file, books.xml, & save it some place convenient.

Ingest An XML File

In order to get started, you need to pull in the XML into a variable for further manipulation as such:

[xml]$XML = Get-Content "C:\Users\Julius\Downloads\books.xml"
# OR
$XML = [xml](Get-Content "C:\Users\Julius\Downloads\books.xml")

Six of one, half a dozen of another; whatever works for you.

If you type $XML you should see something to the effect of:

XML-GetValues-001

Browsing the Structure

Once your XML variable is populated, it’s time to walk the XML tree structure to find the key that contains the data you’re looking for; And in order to do that you have to know the tree structure.

Without going too deep into this

  • The first line is the XML prolog containing the version information
  • The second is the <catalog> and that is our root element or node
  • All of the <book> nodes are children of the root
  • Each <book> node has a number of children: author, title, genre, price, publish_date & description.

Keep in mind this is a pretty simple & basic XML example, so the structure doesn’t go too deep.  You may find that your application XMLs are several layers deep.
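The dot-walking works the same way no matter how deep the file goes.  Here’s a quick made-up illustration (the element names are purely hypothetical):

[xml]$AppConfig = @"
<configuration>
  <appSettings>
    <server name="Primary" url="https://app.contoso.local" />
  </appSettings>
</configuration>
"@

# Walk down to the attribute you care about
$AppConfig.configuration.appSettings.server.url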

If I wanted to see all the books, I would use

$XML.catalog.book

Which would then list the books:

XML-GetValues-002.PNG

 

Locating the Right Data

If I wanted to find all books written by authors Stefan Knorr and Eva Corets, I would use

$XML.catalog.book | ? { $_.author -like "*stefan" }

$XML.catalog.book | ? { $_.author -like "Corets*" }

That would quickly narrow the scope:

XML-GetValues-003.PNG

Getting Specific Values

To grab key bits of details from Stefan Knorr’s book I could do

$XML.catalog.book | ? { $_.author -like &quot;*stefan&quot;} | select id,price,publish_date

# OR

$Node = $XML.catalog.book | ? { $_.author -like &quot;*stefan&quot; }

$Node.id

$Node.price

$Node.publish_date

XML-GetValues-004.PNG

I try to be as specific as possible when searching for data.

 

Function Get-XMLValue

This function is more of a framework as it depends squarely on the XML you’re working with and what you want to retrieve.  I recently had to verify a setting on a few hundred machines, so I used a modified version of this function to do the heavy lifting.

The function below is geared to return the book description when supplied the ID.

Function Get-XMLValue
    {
        Param
            (
                [Parameter(Mandatory=$true)]
                    [string]$XMLFile,

                [Parameter(Mandatory=$true)]
                    [string]$XMLTreePath,

                [Parameter(Mandatory=$true)]
                    [string]$AnchorProperty,

                [Parameter(Mandatory=$true)]
                    [string]$AnchorValue,

                [Parameter(Mandatory=$true)]
                    [string]$Property
            )

        if(!(Test-Path -Path $XMLFile -PathType Leaf)) { return 2 }

        Try
            {
                [xml]$XML = Get-Content $XMLFile

                $XMLPath = '$XML.' + $XMLTreePath
                $Return = @()
                Foreach($Node in (Invoke-Expression $XMLPath | ? { $_.$AnchorProperty -ieq "$AnchorValue" })) { $Return += $Node.$Property }

                if($Return.Count -eq 0) { Write-host "ERROR: NO PROPERTY [$Property] FOUND BASED ON QUERY: [$AnchorProperty] = [$AnchorValue]"; [int]$Return = 2 }
            }
        Catch { Write-Warning &quot;ERROR RETRIEVING VALUE OR PROPERTY [$Property] BASED ON QUERY: [$AnchorProperty] = [$AnchorValue] FROM [$XMLFile]: $_&quot;; [int]$Return = $_ }
        return $Return
    }

I couldn’t think of an elegant way to retrieve the values I wanted without specifying some qualifiers.  For instance, if I wanted the price of all Eva Corets books:

  • The AnchorProperty would be: author
  • The AnchorValue would be: Corets, Eva
  • The Property would be: price

This way I’m certain to get the results I’m looking for.
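Which, plugged into the function above, looks like this:

Get-XMLValue -XMLFile "C:\Users\Julius\Downloads\books.xml" -XMLTreePath catalog.book -AnchorProperty author -AnchorValue 'Corets, Eva' -Property price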

Is there a better way?  Maybe.  This is what I came up with that met my need.

Usage:

# These two work
Get-XMLValue -XMLFile "C:\Users\Julius\Downloads\books.xml" -XMLTreePath catalog.book -AnchorProperty id -AnchorValue bk108 -Property description
Get-XMLValue -XMLFile "C:\Users\Julius\Downloads\books.xml" -XMLTreePath catalog.book -AnchorProperty id -AnchorValue bk107 -Property description

# This doesn't (there is no bk100 in the sample file)
Get-XMLValue -XMLFile "C:\Users\Julius\Downloads\books.xml" -XMLTreePath catalog.book -AnchorProperty id -AnchorValue bk100 -Property description

It may not be the most elegant solution but I’m hoping it’ll at least point you in the right direction should the need arise to get data from XML files.

 

Good Providence to you!

Practical Use: Find & Replace in a Text File

A number of applications rely on simple text-based configuration files versus some proprietary format.  This makes files – ones that can’t simply be replaced via GPO/GPP or a login script – really easy to update.

Back in my VBScript days I would likely

  1. Ingest the file via ReadAll()
  2. Check if it contains the value via InStr
  3. Replace(old_value,new_value)
  4. Write out the file with the updated content

I figured Get-Content was going to behave similarly, but I discovered each line is its own separate object, which means iterating through an array rather than working with one big string.  No big deal, but something new and good to know.
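A quick way to see this (the log path is just the example file I use below; -Raw needs PowerShell 3.0 or later):

$Content = Get-Content -Path 'C:\windows\Temp\ASPNETSetup_00000.log'
$Content.GetType().Name      # Object[] - an array with one string per line
$Content.Count               # number of lines
$Content[0]                  # the first line

# Closer to the old ReadAll() behavior - one big string
$Raw = Get-Content -Path 'C:\windows\Temp\ASPNETSetup_00000.log' -Raw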

The method for locating your text is important depending on what you’re searching for.

Locating X File

If it’s simple text, you can get away with either operator: -match or -like.

  • Match is geared towards regular-expression based searches and, against a single string, simply returns true or false.
  • Like uses simple wildcards and, when applied to the whole array, returns the actual lines that match.

So if you just want to know whether or not the file contains X, you could go either way.

[string]$File = 'C:\windows\Temp\ASPNETSetup_00000.log'
$OriginalContent = Get-Content -Path $File
[string]$Find = 'Vista'

# Search via Match
($OriginalContent | % { $_ -match $Find }) -contains $true

# Search via Like (similar concept)
($OriginalContent | % { $_ -like "*$Find*" }) -contains $true

But if you want to locate X and do something with it, -like is your friend.

#Like
[string]$File = 'C:\windows\Temp\ASPNETSetup_00000.log'
$OriginalContent = Get-Content -Path $File
[string]$Find = 'Vista'
$Results = $OriginalContent | ? { $_ -like "*$Find*" }

 

I don’t want to get too deep into this, because it’s well documented elsewhere, but I just want to mention that if you’re searching for something that contains special characters some additional care is necessary.

If you’re using -like, you should be fine:

# Like
[string]$File = &quot;C:\WINDOWS\temp\ASPNETSetup_00000.log&quot;
$OrigContent = Get-Content -Path $File
[string]$Find = '\/\/I/\/D0WZ'

# Return true/false
($OrigContent | % { $_ -like $Find }) -contains $true

# Get the lines that match
$Results = $OriginalContent | % { $_ -like &quot;*$Find*&quot; }

 

But if you’re using -match, you’ll need to either escape those characters manually:

# Match
[string]$File = &quot;C:\WINDOWS\temp\ASPNETSetup_00000.log&quot;
$OriginalContent = Get-Content -Path $File
[string]$Find = '\\\/\\\/I\/\\\/D0WZ'
($OriginalContent | % { $_ -match $Find }) -contains $true

Or rely on the Escape() method of the Regex class:

# Match
[string]$File = &quot;C:\WINDOWS\temp\ASPNETSetup_00000.log&quot;
$OriginalContent = Get-Content -Path $File
[string]$Find = '\/\/I/\/D0WZ'

($OriginalContent | % { $_ -match [regex]::Escape($Find) }) -contains $true

Replacing the Content

Now that you’ve confirmed file X contains Y, its time to replace it.  Since humans are prone to making mistakes, I always like to have a way of backing out of programmatic changes like, so the steps below include a backup process.

# Replace&amp;nbsp;X with Y and store it in a new variable
$NewContent = $OriginalContent | % { $_ -replace $Find,$Replace }

# Create a new file that will ultimately replace the existing file.
#     If you want a UTF-8 file with BOM use this
#$NewContent | Out-File -FilePath "$File.NEW" -Encoding utf8 -Force

#     Note: -Encoding default actually writes ANSI (no BOM); for true UTF-8 without a BOM you'd need something like [System.IO.File]::WriteAllLines()
$NewContent | Out-File -FilePath "$File.NEW" -Encoding default -Force

# Backup the existing file
Copy-Item -Path $File -Destination &quot;$File.ORIG.$(Get-date -Format 'yyyymmdd_hhmmss')&quot; -Force

# Move the new file that we staged to overwrite the orignal
Move-Item -Path &quot;$File.NEW&quot; -Destination $File -Force

To the experts, this is really simple and basic stuff.  To those less seasoned, this is practical. 🙂

Good Providence!

Preparing for Windows 10: Upgrading to Internet Explorer 11 on Windows 7/8[.1]

To most, this is really old news.  But some organizations on Windows 7 are still running Internet Explorer 8/9/10 due to [potential] compatibility issues.  This is bad because these organizations are in an unsupported configuration:

Beginning January 12, 2016, only the most current version of Internet Explorer available for a supported operating system will receive technical support and security updates. Please visit the Internet Explorer Support Lifecycle Policy FAQ here http://support.microsoft.com/gp/Microsoft-Internet-Explorer for list of supported operating systems and browser combinations.

In the legal vertical, so much relies on IE add-ons, ActiveX controls and just general compatibility.  Most external sites by now support IE11 – or are getting there – but there are some stragglers.  However, the real problem is the myriad of internal sites, and it’s not uncommon to run into one or more key legacy web-based applications still in play that are either not upgradable or require a significant amount of effort to do so.  This makes people uneasy about upgrading to IE11, which is probably the largest hurdle for getting to Windows 10.

Hopefully this is just enough detail to help get you on your way.

Internet Explorer Upgrade Testing Strategy

Dive right in.

  • Get IE11 setup on a machine
  • Expose the ‘Enterprise Mode’ option under the Tools menu by creating an empty ‘Enable’ string value under ‘HKCU\Software\Microsoft\Internet Explorer\Main\EnterpriseMode’.
  • Start testing

Testing Document Modes

Internet Explorer supports the following Document Modes:

  • Internet Explorer 11 (Edge)
  • Internet Explorer 10
  • Internet Explorer 9
  • Internet Explorer 8
  • Internet Explorer 7 (Compatibility View)
    Also falls back to IE5 for sites without a DOCTYPE tag
  • Internet Explorer 5 (Quirks)

In addition, Microsoft also added support for:

  • Interoperable Quirks, primarily for public facing websites that were designed to use the quirks mode of other browsers.
  • IE8 Enterprise Mode which provides higher fidelity emulation for IE8.
  • IE7 Enterprise Mode which is essentially Enterprise Mode running in high fidelity emulation BUT running in either IE7 Document Mode if there is an explicit DOCTYPE tag or in IE5 Document Mode if there is not.
    It’s an additive version of Enterprise Mode running in Compatibility View.

Hacking a Combination Lock

Launch IE, go to your first site and test.  If all is well, your job is done and you’re off to the next one.  But if text isn’t lining up correctly, images aren’t loading, or functions aren’t working, then you have to go deeper.

Open the Developer Tools (F12) and start by matching both Document Mode and User Agent String in order, leaving the Browser Profile set to ‘Desktop’.

  • You already know Document Mode ‘IE11 (Default)’ & User Agent String ‘Internet Explorer 11 (Default)’ doesn’t work, so move on
  • Next try ‘IE 10’ & ‘Internet Explorer 10’
  • Then IE9
  • Wash, rinse, repeat
  • Document the winning combination.

I’m guessing that 99% of your sites will work with minor to no manipulation.

Testing Enterprise Mode

If none of the Document Modes work, then you fall back on Enterprise Mode because it provides higher fidelity emulation for those older versions of IE.

  • Start by setting the Browser Profile to Enterprise
  • This will default Document Mode to IE8
  • If IE8 does not work, then use IE7 and IE5 doc modes for IE7 Enterprise Mode.

You should know that there’s a little bit of a ‘cost’ with Enterprise Mode:

  • Performance because of its high fidelity capabilities.  However keep in mind:
    • IE11 in Enterprise Mode is an order of magnitude faster than running IE8 natively.
    • Running in IE11 in Native Mode (Standards Mode) is significantly faster than IE11 in Enterprise Mode.
  • Risk – potentially – because deprecated functions have been brought back.

Deploying the Right Configuration

Great, you’ve got a list of sites and their required configurations; the hard part is mostly done.  You’ll need to put those configurations into an XML format that IE can understand using the Enterprise Mode Site List Manager.  Find an existing webserver (or share) where you can serve up this tiny XML file, install the Site List Manager & generate your Site List XML file.

In terms of setting this up from scratch, I happen to like Nystrom’s approach, but you can follow the Microsoft process to get this set up with minimal effort.  Once it’s up and running you’re all set to pilot with a larger audience.

As much as I was interested in trying out Enterprise Site Discovery, it wasn’t something we felt we needed.  I’m mentioning it here as it could be of significant value to some.

I recommend creating a new GPO to set:

  • ‘Let users turn on and use Enterprise Mode from the Tools menu’
  • ‘Use the Enterprise Mode IE website list’

If you’re in a rush just put together a quick .reg file your testers can use

  • HKCU\Software\Microsoft\Internet Explorer\Main\EnterpriseMode
    • Enable the Tools menu:  “Enable” = “”
      • Or if you want feedback (and I think you do): “Enable” = “{URL}{:port}”
    • Enable the XML site list:  “SiteList” = “{File Path or URL}”

Note:  In case you don’t already know, you can put it in HKLM vs HKCU so all users of the same machine get the settings.  Alternatively you can put it in HKLM\Software\Policies\ or the HKCU equivalent.  Just depends on your environment.
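For the PowerShell-inclined, here’s a minimal sketch of the per-user version of that quick .reg file (the URLs are placeholders; swap HKCU: for HKLM:, or use the Policies path, per the note above):

$EMKey = 'HKCU:\Software\Microsoft\Internet Explorer\Main\EnterpriseMode'
New-Item -Path $EMKey -Force | Out-Null

# Expose the Tools menu entry; use '' for no feedback or a URL:port if you want feedback
New-ItemProperty -Path $EMKey -Name 'Enable' -Value 'http://emie.reporting.local:8080' -PropertyType String -Force | Out-Null

# Point IE at the Enterprise Mode site list (file path or URL - placeholder below)
New-ItemProperty -Path $EMKey -Name 'SiteList' -Value 'http://webserver/EnterpriseModeSiteList.xml' -PropertyType String -Force | Out-Null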

When to use Document Mode vs. Enterprise Mode

Document Mode

While the original <emie> functionality provided great compatibility for enterprises on Internet Explorer 8, the new <docMode> capabilities can help enterprises stay up-to-date regardless of which versions of Internet Explorer are running in their environment. Because of this, Microsoft recommends starting the testing process like this:

  • If your enterprise primarily uses Internet Explorer 8, start testing using Enterprise Mode.
  • If your enterprise primarily uses Internet Explorer 9 or Internet Explorer 10, start testing using the various document modes.

Because you might have multiple versions of Internet Explorer deployed, you might need to use both Enterprise Mode and document modes to effectively move to Internet Explorer 11.

The <docMode> section:

  • only sets the Document Mode for a particular page/website and sends the User Agent String
  • will override what the site itself is asking for.

Enterprise Mode

Enterprise Mode is a compatibility mode that lets websites render using a modified browser configuration that’s designed to emulate Internet Explorer 8, avoiding the common compatibility problems associated with web apps written and tested on older versions of Internet Explorer.

Through improved emulation, Enterprise Mode lets many legacy web apps run unmodified on Internet Explorer 11, supporting a number of site patterns that aren’t currently supported by existing document modes.

The <emie> section is higher fidelity emulation of IE8, focused on the compatibility issues reported over the years:

  • User Agent String – it’s a faithful replication of the original IE8 user agent string
    • this includes the versions of .NET on the machine
    • and whether the machine is a media center or not
  • ActiveX Controls – telling the site you’re using IE8 which allows most ActiveX controls to work correctly.  Although you should note that some ActiveX controls query the OS version & browser and as far as I know, you can’t do anything about that.
  • Deprecated functionality, like CSS Expressions, has been brought back
  • Turned off some performance improvements to favor compatibility.
  • Fixed things for vertical languages (Japanese, Chinese, Korean etc.)

IE7 Enterprise Mode is effectively this higher fidelity emulation of IE8 running with Compatibility View.  So a site will get IE7 Document Mode if it has a DOCTYPE tag, or IE5 Document Mode if it doesn’t.  This is useful for some sites and helps organizations as they wean themselves away from displaying all Intranet Sites in Compatibility View, because they now have the granular controls they need!

So you can either use:

  • IE7 document mode in the <docMode> section, because IE7 will fall back to IE5 if there isn’t a DOCTYPE tag, which is effectively Compatibility View
  • But if that doesn’t work, you have the higher fidelity emulation within Enterprise Mode to be able to use Enterprise Mode plus Compatibility View.

Once you get a handle on things, you can turn off the ‘Display All Intranet Sites in Compatibility View’ setting allowing your Intranet sites to default to modern standards not old standards.

What Exactly is Compatibility View?

Compatibility View is basically a switch that says:

  • If you have a webpage that has a DOCTYPE tag, it will be rendered in IE7 document mode.
  • If there’s no [explicit] DOCTYPE you end up in IE5 document mode.

Enterprise Mode Site List

This is what the Site List XML file looks like

IE11EnterpriseModeSiteListXML

The XML formatting of the Site List file is fairly easy to understand and the true/false exclude syntax allows for fine-grained control:

<rules version="3">
 <emie>
    <domain exclude="false">crm
      <path exclude="true">/NewModule</path>
    </domain>
  </emie>
  <docMode>
      <domain docMode="9">webtool</domain>
  </docMode>
</rules>

 

I bid you Good Providence in your endeavor to get up to IE11


Non Sequitur: Creating Dynamic Variables on the Fly in PowerShell

I’m only mentioning this because of something that came up recently and I thought it was kind of neat.

To keep things simple, I’m going to break this up into a few posts:

  1. Batch
  2. VBScript
  3. PowerShell

PowerShell

In the last 3 years or so I’ve tried to focus exclusively on PowerShell.  I’m no Jeff Snover, Dan Cunningham or any other PoSh guru you might know.  It’s still very new to me but I’m slowly working at it – so bear with me!

If you read my last two posts on this subject, you’ll know that I was all about creating variables dynamically for various reasons. I started each post the same:

I’m only mentioning this because of something that came up recently…

So here’s that something.

I had a repetitive task that involved doing something in a specific order so I challenged myself to create jobs and have the script wait for those jobs to finish before moving on.  I wanted a way to create a unique enough variable name for the jobs to help keep track of what was going on.  This is what really sparked that whole trip down memory lane of creating variables dynamically, on the fly, and I wanted to see if I could do it in PowerShell.

This is what I ended up with:

[array]$Colors = @('Blue','Black','Silver','Yellow','Orange')
Foreach($Color in $Colors)
    {
        $varName = '$Paint_' + $Color
        write-host "varName [$varName]"

        $tmpVar = $varName + ' = "Paint this car [' + $Color + ']"'
        write-host "tmpVar [$tmpVar]"

        Invoke-Expression $tmpVar
        Invoke-Expression $varName
        Remove-Variable varName,tmpVar -force -ea silentlycontinue
    }
Get-Variable Paint_*

This allowed me to retrieve the individual jobs via their $Paint_<Color> variables, which was really useful for the task at hand as it had about 6 different moving parts, some of which were dependencies.
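Applied to the jobs scenario it looked roughly like this (the script block is made up for illustration, and New-Variable works just as well as Invoke-Expression here):

[array]$Colors = @('Blue','Black','Silver','Yellow','Orange')
Foreach($Color in $Colors)
    {
        # Kick off a job and capture it in a dynamically named variable, e.g. $Job_Blue
        New-Variable -Name "Job_$Color" -Value (Start-Job -Name "Paint_$Color" -ScriptBlock { param($c) "Painting the car [$c]"; Start-Sleep -Seconds 5 } -ArgumentList $Color) -Force
    }

# Wait on one specific job by its dynamic variable...
Get-Variable -Name 'Job_Silver' -ValueOnly | Wait-Job | Receive-Job

# ...or on all of them at once
Get-Variable Job_* -ValueOnly | Wait-Job | Receive-Job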

Anyway, maybe you’ll find an application for this in your environment and if you do, I’d really like to hear about it.

 

Good Providence!

Non Sequitur: Creating Dynamic Variables on the Fly in Batch

I’m only mentioning this because of something that came up recently and I thought it was kind of neat.

To keep things simple, I’m going to break this up into a few posts:

  1. Batch
  2. VBScript
  3. PowerShell

 

Batch

Maybe 5 years ago, before I dunked my head into the vat of PowerShell Kool-Aid™ (or is it ­®?), I was doing some work in the registry on various machines to get some specific data.  I thought it would be neat to create a function that would allow me to reference the data by calling on the value name.

I came up with this little number:

@echo off
setlocal enabledelayedexpansion

rem This happens to be Microsoft Visual C++ 2010 x64 Redistributable - 10.0.40219 - everyone has that ! 😉
Set _Key=HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\{1D8E6291-B0D5-35EC-8441-6616F567A0F7}
Set _QueryValues=Publisher DisplayName DisplayVersion InstallSource

rem execute to generate variables dynamically
For %%b in (!_QueryValues!) do ( for /F "usebackq tokens=2*" %%c in (`reg query "!_Key!" /v "%%b"`) do (set %%b=%%d) )

rem validate dynamicallly generated variables
For %%e in (!_QueryValues!) do ( echo [%%e] is [!%%e!] )

:end
rem clear dynamically generated variables
For %%e in (!_QueryValues!) do (Set %%e=)
set _QueryValues=
set _Key=
endlocal
pause

Which allowed me to do things like this:

echo Publisher is [!Publisher!]
echo DisplayName is [!DisplayName!]
echo DisplayVersion is [!DisplayVersion!]
echo InstallSource is [!InstallSource!]

And that worked throughout the script, wherever I needed to reference one of those variables.

Worked like a charm for my needs then and I still think it’s pretty slick.

 

Good Providence!

If You’re Paranoid, Remove TeamViewer

So, naturally, this is in response to the recent allegations that TeamViewer has been hacked…

While TeamViewer hasn’t admitted to having been breached, and although what they’ve suggested is completely plausible, one thing is clear: What has been reported thus far doesn’t give me the warm and fuzzy … so I’m going to play it safe for now.

I put together a script to remove TeamViewer from not only my own machines, but also from the machines of friends and family I often support.  I’ve run this on Windows 7+ and so far it works as expected.  If you run into an issue, let me know and I’ll do what I can to troubleshoot asap.

Also

  1. If you’re not using a password manager or are still using easy to remember passwords or are recycling/reusing passwords across multiple sites;
  2. If you’re not using two-factor authentication (2FA)

You really should reconsider.  Check yourself out on https://haveibeenpwned.com/ to see what accounts may have been compromised in a data breach and take the necessary precautions.

This needs to be run from an elevated PowerShell console or ISE.

# Define TeamViewer Installation directory array for use below
$arrTVInstallDirs = @()

# Define TeamViewer Uninstaller EXE's for use below
$arrTVUninstallers = @()

# Get TeamViewer Install Directories for both architectures
$arrTVInstallDirs += gci $env:ProgramFiles *TeamViewer*
if($env:PROCESSOR_ARCHITECTURE -eq 'AMD64') { $arrTVInstallDirs += gci ${env:ProgramFiles(x86)} *TeamViewer* }

# Loop through each 'TeamViewer' directory for EXE's and kill those processes
foreach($TVInstallDir in $arrTVInstallDirs)
    {
        write-host "Processing TVInstallDir [$($TVInstallDir.FullName)]"
        Foreach($TVEXE in $(gci -Path $($TVInstallDir.FullName) -Recurse *.exe))
            {
                if($TVEXE.Name -eq 'uninstall.exe') { $arrTVUninstallers += $TVEXE }
                write-host "Killing Process [$($TVEXE.Name)]"
                Stop-Process -Name $($TVEXE.BaseName) -Force -ErrorAction SilentlyContinue   # -Name expects the process name without the .exe extension
            }
    }

# Stop Team Viewer services
Foreach($TVService in $(Get-WmiObject -Class Win32_Service -Filter "Name like '%TeamViewer%'"))
    {
        # Stop Service
        write-host "Stopping Service [$($TVService.Name)]"
        $TVService.StopService() | Out-Null

        # Disable Service
        write-host "Disabling Service [$($TVService.Name)]"
        If($TVService.StartMode -ne 'Disabled') { Set-Service -Name $TVService.Name -StartupType Disabled | Out-Null }

        # Delete Service
        write-host "Deleting Service [$($TVService.Name)]"
        $TVService.Delete() | Out-Null
    }

# Loop through the uninstallers
Foreach($TVUninstaller in $arrTVUninstallers)
    {
        $PSI = New-Object -TypeName 'System.Diagnostics.ProcessStartInfo' -ErrorAction 'Stop'
        $PSI.Arguments = '/S'
        $PSI.CreateNoWindow = $false
        $PSI.FileName = $TVUninstaller.FullName
        $PSI.UseShellExecute = $false
        $PSI.WindowStyle = 'Normal'
        $PSI.Verb = 'runas'

        $Proc = New-Object -TypeName 'System.Diagnostics.Process' -ErrorAction 'Stop'
        $Proc.StartInfo = $PSI

        write-host "Uninstalling TeamViewer [$($TVUninstaller.FullName)]"
        if($Proc.Start() -eq $true)
            {
                write-host "Uninstall started - waiting for it to finish..."
                $Proc.WaitForExit()
                Do { $Proc.Refresh(); Start-Sleep -Seconds 3 } while($Proc.HasExited -ne $true)
                if($Proc.ExitCode -eq 0) { write-host "Uninstall completed successfully! [$($Proc.ExitCode)]" -ForegroundColor Green }
                else { write-host "ERROR: Uninstall completed WITH ERRORS [$($Proc.ExitCode)]" -ForegroundColor Red }
            }
            else { write-host "ERROR Failed to start uninstall [$($TVUninstaller.FullName)] [$($Proc.ExitCode)]" -ForegroundColor Yellow }
    }

 

Good Providence and be safe!