Scripting Games Event 4 Commentary

How Random is Random

I am going to switch gears and talk about a logical issue instead of a syntax issue. In Event 4 you are required to retrieve 20 random users from your Active Directory environment for the purpose of an audit. I have seen numerous submissions where users have limited the records returned from either ADSI or the PowerShell AD cmdlets. In every case they mention the performance impact of returning all of the users in their AD environment.

While I agree in principle, the problem is that you are limiting the randomness of the pool. I have no idea in what order entries are returned, but my guess is that it's not truly random. If you were just pulling 20 random users for your own records that might be acceptable. However, since you are talking about an audit, I don't think that limiting the record set is a good thing to do.
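If performance is the worry, a better pattern is to pull the full user set once and let Get-Random do the sampling, so every user has an equal chance of selection. A minimal sketch, assuming the Microsoft ActiveDirectory module (the same idea applies to an ADSI searcher once you have the full collection):

```powershell
# Retrieve the ENTIRE user set first so the sample is drawn from the
# whole population, then pick 20 at random.
Import-Module ActiveDirectory

$allUsers = Get-ADUser -Filter *
$auditSample = $allUsers | Get-Random -Count 20

$auditSample | Select-Object Name, SamAccountName
```

Get-Random -Count draws without replacement, so you get 20 distinct users regardless of the order in which AD returned them.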

Regular Expressions

I have seen many submissions use regular expressions to match the output file. Kudos to all of you that are embracing the power of regular expressions.
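As a quick, purely hypothetical illustration of the idea (the actual file format in the event differs, and this pattern and file name are invented), named capture groups keep a match readable:

```powershell
# Match lines shaped like "Server01 : 10.0.0.5 : OK" and pull out the pieces.
Get-Content .\output.txt |
    Where-Object { $_ -match '^(?<Server>\S+)\s*:\s*(?<IP>\d{1,3}(\.\d{1,3}){3})\s*:\s*(?<Status>\w+)$' } |
    ForEach-Object { "$($Matches.Server) is $($Matches.Status)" }
```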

Loading Modules

There is no need to check whether a module is loaded before you load it. Just load the module. PowerShell is smart enough to know whether the module is loaded and will only load it if it's not already loaded.
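In other words, the guard clause below buys you nothing; using the ActiveDirectory module purely as an example, the plain Import-Module call is all you need:

```powershell
# Redundant - PowerShell already performs this check internally:
# if (-not (Get-Module ActiveDirectory)) { Import-Module ActiveDirectory }

# Sufficient - effectively a no-op if the module is already loaded:
Import-Module ActiveDirectory
```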

Scripting Games Event 2 Commentary

Put your comments at the top of your functions!

According to the about_Comment_Based_Help you can place your comment based help in any of three locations:

  • At the beginning of the function body.
  • At the end of the function body.
  • Before the function keyword. (There cannot be more than one blank line between the last line of the function help and the function keyword.)

The third option is used most of the time. However, during the games I saw a lot of people put the comment-based help at the end of the function. While technically a valid place to put the help, I dislike it very much.

Sure, comment-based help is technically used for displaying help from within PowerShell, but it also has a second purpose: it serves as inline documentation for the function. When you are reading a function to understand it, comment-based help provides documentation and insight into what the function does. Putting it at the end doesn't make it very useful for this purpose. Put it at the top.
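As a sketch (the function and help text are invented for illustration), the preferred placement looks like this, with the help immediately before the function keyword so it is the first thing a reader encounters:

```powershell
<#
.SYNOPSIS
    Returns file system drive information. (Hypothetical example.)
.DESCRIPTION
    Demonstrates comment-based help placed before the function keyword,
    where it doubles as inline documentation for anyone reading the code.
.EXAMPLE
    Get-DriveInfo
#>
function Get-DriveInfo {
    [CmdletBinding()]
    param()

    Get-PSDrive -PSProvider FileSystem
}
```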

The Good The Bad and The Ugly – Scripting Games 2013 Event 1

The submission deadline of the 2013 Scripting Games has passed and now all of the scripts are open for judging. For the most part the community has been doing a pretty good job of judging the scripts. In my opinion the ratings are on par with what I would give the entries. I have had some time to review the submissions and prepare some notes.


I am not a big fan of the whole one-liner syndrome, that is, the need to compact everything into a single line of code. Sure, it's great if the code naturally fits into a single line, but for God's sake don't force it. If the line contains a ";" then it's not a one-liner! The following code was submitted for an event:

[codesyntax lang="powershell"]
$Source="C:\Application\Log";$Destination="\\NASServer\Archives";Get-ChildItem $Source -recurse -Filter *.log|%{If ($_.LastWriteTime -lt (Get-Date).AddDays(-90)){md -Force ($Destination +"\" + $_.Directory.Name)|Out-Null;Move-Item $_.FullName ($Destination + "\" + $_.Directory.Name + "\" + $_.Name) -Force}}

This code is very hard to read even without the word wrap. If the code were broken up at every ";" then we would have this:

[codesyntax lang="powershell"]
$Source="C:\Application\Log"
$Destination="\\NASServer\Archives"
Get-ChildItem $Source -recurse -Filter *.log|%{If ($_.LastWriteTime -lt (Get-Date).AddDays(-90)){md -Force ($Destination +"\" + $_.Directory.Name)|Out-Null
Move-Item $_.FullName ($Destination + "\" + $_.Directory.Name + "\" + $_.Name) -Force}}

Just splitting it up like that makes it much easier to read. Now, if you add some additional line breaks and indenting, you are left with this:

[codesyntax lang="powershell"]
Get-ChildItem $Source -recurse -Filter *.log |
    ForEach-Object {
        If ($_.LastWriteTime -lt (Get-Date).AddDays(-90)) {
            md -Force ($Destination + "\" + $_.Directory.Name) | Out-Null
            Move-Item $_.FullName ($Destination + "\" + $_.Directory.Name + "\" + $_.Name) -Force
        }
    }

Make your scripts readable because you never know who might need to view your script and understand how it works. But even more importantly, you will find that you make far fewer mistakes when you are able to read your own code.


I had bookmarked a couple of scripts that used backticks that I wanted to comment on. However, Richard Siddaway beat me to it by saying "Backticks are baaaaaad; don't do backticks". Take a look at his article and realize that

99.9% recurring of the time you don’t need backticks as line continuation markers
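For the curious, here is the usual pair of alternatives. PowerShell continues a line naturally after a pipe or an opening brace, and splatting handles long parameter lists; neither needs a backtick (paths invented for illustration):

```powershell
# Fragile - a single trailing space after a backtick breaks the script:
# Get-ChildItem C:\Logs `
#     -Filter *.log `
#     -Recurse

# Better: break naturally at the pipe...
Get-ChildItem C:\Logs -Filter *.log -Recurse |
    Sort-Object LastWriteTime

# ...or splat the parameters from a hashtable:
$params = @{
    Path    = 'C:\Logs'
    Filter  = '*.log'
    Recurse = $true
}
Get-ChildItem @params
```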


Aliases are great when you are entering PowerShell commands in the shell. But when you are writing scripts, especially when you are sharing those scripts, do not use aliases. Aliases in scripts make it harder to understand what a script does and you can’t always guarantee that a system will have the exact same aliases.
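A side-by-side sketch of the same pipeline makes the point:

```powershell
# Fine at the prompt, opaque in a script:
# gci | ? { $_.Length -gt 1MB } | % { $_.Name }

# Spelled out for scripts - every cmdlet is unambiguous:
Get-ChildItem |
    Where-Object { $_.Length -gt 1MB } |
    ForEach-Object { $_.Name }
```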


One area where I did notice a number of scripts doing very well was documentation. Documenting your script is never a bad thing. In the advanced event the documentation was in the form of advanced help. Help was not a specific requirement for the event, but most of the submissions that I reviewed included it. That is awesome; keep up the good work. The beginners did a pretty good job as well by inserting basic comments into their entries.

2013 Scripting Games

In a little over a week, on Thursday, April 25, 2013, the much anticipated 2013 Scripting Games will begin. The Scripting Guy, Ed Wilson, has managed the event himself until now. This year he gets some help, with the games being managed by

I am honored to resume my duties as a judge. This is a great event and I am always impressed with the submissions that I review. During each event I will review the entries and then blog about their technique, while offering my own critique.

This is a truly unique and rewarding event for all involved, from voters to scripters to judges. It’s not too late to join the fun. You can find everything you need to know at the Scripting Games page here.

Locating All of the Unlinked GPOs in Your Domain

Have you ever wanted to identify all of the Group Policy Objects (GPOs) in your domain that are not currently linked to any Organizational Units (OUs)? This was the task I was given today by my project manager. He wanted me to identify any of our GPOs that were unlinked. We follow a certain naming scheme, so that is how we identify ours. Well, PowerShell made it so easy that I obtained ALL of the unlinked GPOs in the domain and sent the list to our Active Directory (AD) team lead. PowerShell rocks!

I knew there was a GPO module in Windows so I started my search there. After looking at the module I focused on the Get-GPO cmdlet as my starting point. Using its -All parameter I was able to obtain all of the GPOs. Unfortunately, the cmdlet didn't help me find out whether a GPO was linked. So I focused on Get-GPOReport, which did report on the details of the GPO. I ran it on a couple of GPOs, one linked and one unlinked, to get a better idea of what I was looking for in the report. What I noticed was that there was a 'LinksTo' node in the XML that indicates where the GPO is linked. For the GPO that wasn't linked, that node was not present. Sure, I could have done XML mojo, but sometimes the simplest way is the best way. In my case I just decided to test whether that string was present in the report.

[codesyntax lang="powershell" lines="normal"]

Import-Module GroupPolicy
Get-GPO -All | ForEach-Object {
    If ( $_ | Get-GPOReport -ReportType XML | Select-String -NotMatch "<LinksTo>" )
    {
        Write-Host $_.DisplayName
    }
}



Using PowerShell Default Parameter Values to Prevent Unfortunate Accidents

If you have ever encountered the unfortunate circumstance of accidentally shutting down or rebooting your computer with PowerShell, I am here to give you a tip you will definitely want to remember. First of all, this solution requires PowerShell version 3, so if you are still on version 2 you will need to stash this one away.

The PowerShell cmdlets for shutting down and rebooting a computer are Stop-Computer and Restart-Computer, respectively. By default these cmdlets don't prompt for confirmation, so if you accidentally enter them your system is going to go down. To prevent this, you can use default parameter values to make these cmdlets always prompt for confirmation.

I am not going to explain how default parameter values work in PowerShell. Instead I am going to point you to the very descriptive help topic built in to PowerShell. Read all about it by executing help about_Parameters_Default_Values.

[codesyntax lang="powershell"]

$PSDefaultParameterValues = @{
    "Restart-Computer:Confirm" = $True
    "Stop-Computer:Confirm" = $True
}
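One caveat worth noting: $PSDefaultParameterValues lasts only for the current session. To make the safety net permanent, put the same assignments in your profile script, for example:

```powershell
# Append the defaults to your PowerShell profile so every new
# session picks them up automatically.
Add-Content -Path $PROFILE -Value @'
$PSDefaultParameterValues["Restart-Computer:Confirm"] = $True
$PSDefaultParameterValues["Stop-Computer:Confirm"] = $True
'@
```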

Exploring the AppSense PersInfo Tool

The PersInfo utility from AppSense presents a gold mine of information about your AppSense Personalization environment and activities from a user's perspective. Last week I was going through the AppSense Exchange on MyAppSense, browsing through the configs and tools they have. I came across the Personalization Information Tool and thought it sounded really interesting, so I downloaded it.

PersInfo Menu ScreenShot

After downloading the zip file and extracting it I noticed that it was just a single executable. So I logged in to my XenApp environment and launched the Persinfo.exe executable. I noticed a green sphere in my systray and right-clicked the icon. I was presented with the menu shown to the right. There are many options and I won't go through all of them; I will let you explore the rest.

I clicked on the Session Information option and was presented with the following window. Unfortunately I had to blur a lot of the information because it was captured on my work system, so I will explain the various sections.

PersInfo Session Information

In the top left you have the username and SID of the user who is running the utility. In the top right you get the version of the AppSense agent as well as the Environment Manager config and its version, useful for making sure you have the correct configuration and version. In the bottom left you get information on the Personalization configuration. Cache Container is a clickable link to the filesystem location where your cache files are stored. Information about the Personalization group is also included. The Last Active Link field is very interesting because it tells you the last Personalization server that was connected to. This is great for identifying a troubled Personalization server in a load-balanced rotation. The Link List lists all of the possible Personalization servers that can be contacted to retrieve information. On the right side you can see the Whitelisted Applications and Network statistics. This is just the Session Information screen, and I encourage you to check out the other options in the menu.

Link State graphic

Perhaps the best feature for my environment is the information presented in info bubbles in the systray itself. When you first connect it lets you know the agent connected to the Personalization environment and which site is currently active. Now let's open Microsoft Outlook and see what happens. The PersInfo utility indicates that Outlook has been opened and that data is being downloaded. After Outlook has started you will notice something different with it as well: it is surrounded with a colored border.

Now what is that about? Well, the PersInfo utility actually puts borders around applications depending on their configuration within AppSense. In this case the green border means that the application is configured within the Personalization environment and its settings are stored. There are colors for several different situations, including Passive (unmanaged) and masquerading. This is helpful for identifying whether applications are configured.


The last piece of information we will look at is when you close the application. As shown below you will see a notice that the application’s data was successfully uploaded to the Personalization server as well as the amount of data and the throughput of the upload.





This was a really brief look at the PersInfo tool from AppSense, but I hope it piques your interest enough to download it and run it in your environment. There are many features that I didn't touch on, like being able to control the debugging.


AppSense Personalization API: Getting Started and Loading the Proxy DLL

As you may know, last week AppSense released the 8.3 version of their AppSense Environment Manager product. It has several impressive features added, like user self-service. The feature that I found the most thrilling is the inclusion of the AppSense Personalization API. Now, I am not sure if this is a new API or just the documentation of existing functionality. Nevertheless, it is a huge step toward automation of AppSense Personalization. I have on multiple occasions stated that if AppSense were to at least document their API then I would be more than happy to create a PowerShell module. It's time to keep up my end of the promise, so I will be working on creating a PowerShell module for the AppSense API.

You don't need a special module to interact with the AppSense Personalization API as it is a Windows Communication Foundation (WCF) service. However, a module will encapsulate the functions and make it easier to operate against. The documentation for the AppSense Personalization API even includes a Windows PowerShell example for calling the API. A small snippet is included below.

[codesyntax lang="powershell" highlight_lines="2,4"]

# Load service model
[Reflection.Assembly]::LoadWithPartialName("System.ServiceModel") > $null
# Load proxy dll
[Reflection.Assembly]::LoadFrom("$home\ProfileManagement.dll") > $null
# Create binding
$wsHttpBinding = new-object System.ServiceModel.WSHttpBinding
$wshttpBinding.MaxReceivedMessageSize = 67108864
# Create endpoint
$endpoint = new-object System.ServiceModel.EndpointAddress("http://localhost/PersonalizationServer/ProfileManagementService.svc")

# And return client
new-object ProfileManagementClient($wsHttpBinding,$endpoint)


In the snippet, lines 2 and 4 load the WCF framework and the proxy DLL for the Personalization service, respectively. These lines use static methods from the .NET class System.Reflection.Assembly.

If you are using PowerShell version 2, and you should be by now, there is a better way: use the Add-Type cmdlet. In the snippet below the respective lines have been replaced with Add-Type. This is a much cleaner implementation.

[codesyntax lang="powershell" highlight_lines="2,4"]

# Load service model
Add-Type -Assembly "System.ServiceModel"
# Load proxy dll
Add-Type -Path "$home\ProfileManagement.dll"
# Create binding
$wsHttpBinding = new-object System.ServiceModel.WSHttpBinding
$wshttpBinding.MaxReceivedMessageSize = 67108864
# Create endpoint
$endpoint = new-object System.ServiceModel.EndpointAddress("http://localhost/PersonalizationServer/ProfileManagementService.svc")

# And return client
new-object ProfileManagementClient($wsHttpBinding,$endpoint)



Announcing Atlanta TechStravaganza 2012

I have been working hard with the leaders of some of the other Atlanta user groups, as well as some very helpful Microsoft employees, to organize this year's Atlanta TechStravaganza. I am happy to announce that this year's event will be held on Friday, June 1st, 2012. Last year's event was such a great hit that we decided to keep the same format. There will be three tracks with four sessions in each track. And of course there is a track dedicated to Windows PowerShell.

Back again this year with two sessions is fan favorite Ed Wilson, Mr. Scripting Guy himself. This year we introduce Glen Gordon, Developer Evangelist for Microsoft, to talk about what's new in Windows 8 and Server 2012. And finally we have Hal Rottenberg and Jonathan Walz with PowerScripting Live, a live PowerShell roundtable that will be featured on an upcoming PowerScripting Podcast.

Don't delay. Attendance is limited, so make sure to reserve your seat now. I look forward to seeing you on June 1st. For more information visit

TechStravaganza 2012


Track 1 – System Center (Centennial Park)
Track 2 – Windows Server (Grant Park)
Track 3 – PowerShell (Piedmont Park)

8:00 – 8:50
Registration, Breakfast, Announcements, (sponsor) Presentation

Keynote – Jeremy Moskowitz: Managing Desktops from the Cloud: Inside Microsoft's Windows Intune Service

Session 1
  • Track 1: Get "AMP"ed with SCOM – Greg Cameron
  • Track 2: VDI Implementation, Scalability and Performance Metrics – Jeff Stokes
  • Track 3: Use PowerShell to Manage the Remote Windows 7 Workstation – Ed Wilson

Session 2
  • Track 1: System Center 2012 Licensing – Melissa Seeker
  • Track 2: Group Policy: Where It Rocks (and Where It Needs a Boost) – Jeremy Moskowitz
  • Track 3: What's New in Windows 8 – Glen Gordon

Lunch, Prize Giveaway, 1E Presentation

Session 3
  • Track 1: Configuration Manager Servicing and Tools – Brian Huneycutt
  • Track 2: Active Directory Forest Disasters: How They Occur and How to Recover – Gary Olsen
  • Track 3: PowerShell Best Practices – Ed Wilson

Session 4
  • Track 1: Common Migration Blockers for Configuration Manager 2012 – Rodney Jackson
  • Track 2: Overview of VMware vSphere Editions and vSphere Active Directory Integration – Cindy Manderson
  • Track 3: PowerScripting Live! – Jon Walz and Hal Rottenberg

Closing Comments and Grand Prize Giveaway

TechStravaganza 2012 – Save the Date

The 3rd annual TechStravaganza 2012 will be held Friday, June 1, 8am–3pm at the Microsoft office in Alpharetta. The past TechStravaganzas have been huge successes, requiring us to limit attendance within about 2 weeks of opening registration.

More details will come in the announcement to be sent Friday April 27, but we can tell you:

  • Well known keynote speaker
  • 12 technical sessions in 3 tracks: Cloud/Deployment, Windows Infrastructure, PowerShell
  • Breakfast/Lunch served
  • Prize giveaway
  • Best of all, it's FREE!


Mark it on your calendar and watch for the announcement on Friday.
