gshaw0 Origins – the PC where it all began…

If I looked at my career like a movie franchise the prequel would surely involve the subject of today’s post.

Seeing as it turned 21 this year I thought it apt to go back in time to feature the machine which got me started on the path to working in IT.

So step back in time and relive some (now) retro memories of days gone by when computers were beige and floppy disks were more than just a Save icon…

Looking back I can trace my interest in computing further back into childhood, starting in primary school with memories of pressing black and orange keys on what I now know was the BBC Micro. As I progressed through school that moved on to the Acorn Risc PC, of which I have very fond memories. However by 1997, when setting out to buy a new computer, Windows was the main game in town.

Setting the scene

We start the journey in late 1997. Having convinced my parents that a computer would help at secondary school and make the perfect Christmas present, we ventured down to our local Comet (remember them?) to take a look at what was available and see if we could get something under £1000. That was far from a given in those days! After initially asking the salesman whether they sold Acorn Risc PCs (spoiler: they obviously didn’t!) a PC was eventually found for £999 with a printer thrown in to sweeten the deal.

This is what I came home with, printer and speakers out of shot…

Siemens Nixdorf Xpert

The machine itself was a Siemens Nixdorf Xpert, the consumer brand of the well-known German tech company. Those Germanic roots formed quite a large part of my early IT experiences, more on that later…

  • Pentium 166MHz with MMX Technology
  • 32MB SDRAM
  • 1.5GB Fujitsu HDD
  • 3.5″ floppy drive
  • CD-ROM drive
  • 14″ CRT monitor
  • Microsoft PS/2 Mouse
  • Windows 95 OSR2
  • Microsoft Works 4.0
  • Microsoft Money

Early experiences

Starting off with Windows 95 OSR2 and Microsoft Works 4.0 was a pretty gentle introduction to computing but after only a few weeks things turned a lot more technical. The PC failed to boot and started throwing Scandisk (remember that?) errors. Initially a factory restore was tried, which wasn’t quite as straightforward as one might expect given half the process was in German, including the config.sys and autoexec.bat files (!) but I got there in the end.

That first Windows install has since turned into thousands; I dread to think what the actual number is!

  
Screenshots taken from Windows 98 SE, which is what the machine has spent most of its life running, much nicer than 95. Random trivia – you could obtain much of the UI functionality of Windows 98 by installing IE4 onto Windows 95, but it still wasn’t quite as pretty

The original hard drive soldiered on for a while longer but eventually succumbed to Bad Sectors and was replaced by a 6.4GB Samsung unit. I never trusted Fujitsu HDDs after that…

It also turned out that the bundled Lexmark (shudder) printer didn’t work either, so troubleshooting skills were quickly picked up before it went back and was replaced with an Epson Stylus Color 300.

The Microsoft Mouse that came with the PC probably shaped the almost-claw grip I have, which now only seems to fit Microsoft mice comfortably – even for gaming I can’t find anything that fits my hand better (the “new” example in the photo is now 13 years old itself!)

 

Upgrades

The machine gained a fair few upgrades along the way as I used it throughout my school years, fitted at the time by a computer repair tech we found in the local paper. I remember being inspired by his proficiency working on the (then expensive) tech and thinking “that’s what I want to do as a job”. That’s where it all began, I think.

The first upgrade was memory, as I soon exhausted the stock 32MB. It was initially going to get a boost all the way to 128MB, but a huge earthquake struck Taiwan, the price of RAM doubled overnight and I ended up with one stick instead of two, for a total of 96MB.

After the 6.4GB Samsung drive filled up another Samsung 10.2GB HDD was added alongside. I’ve still got the receipt for that one and dug it out of my spares pile recently to re-fit at some point 🙂

The OS was upgraded to Windows 98 SE and a USB card was also added to take advantage of that fancy new connector which didn’t need a reboot to detect a new device… magic!

The other significant upgrade of note was adding a CD-RW drive in place of the existing CD-ROM. That 4x LG unit cost a cool £99.99 when it came out but proved well worth the investment as USB sticks hadn’t yet gone mass-market and transferring multiple files via floppy disk was a rather painful experience to say the least.

One upgrade this PC never had was a modem, indeed it’s never been on the Internet – ever! By the time I got my first 56k connection I’d obtained \ rescued a well-worn 486 machine which was used for web duties, mainly because it was easier to run a phone line cable to (!)

Wi-Fi was also just a pipe dream back then, so if it didn’t have a cable, it wasn’t getting connected. These days I do have an Intel NIC that could go in, but the phrase “just because you can, doesn’t mean you should” comes to mind…

Classic software

It’s not all about the hardware; this machine also introduced me to Microsoft Office 97, with many tips and tricks still relevant 21 years later, even in these Office 365 times. Of course documents were filled with ClipArt and WordArt, as was almost obligatory for school homework projects 🙂

  


I remember this being much nicer than first-gen Windows Media Player, until I discovered WinAmp

Image editing was done in a cut-down, Photoshop-lite product called Adobe PhotoDeluxe, which also taught me never to trust v1.0 software as it crashed regularly, usually right in the middle of a large edit. My CTRL+S reflex was well honed by that program, plus Windows 9x’s general tendency to do this…


that’s a genuine Windows 98 crash from yesterday, no VM necessary here folks…

What struck me at the time was the number of system and inventory tools that Siemens Nixdorf bundled with a home machine; it felt more like something you’d get in a corporate environment than for your average home user. Similarly the machine came with a large suite of manuals, including a technical reference for the motherboard and BIOS. How often do you see that?

The software itself was called DeskView and DeskInfo. Recently we bought a Fujitsu Primergy server at work, a descendant of the Siemens Nixdorf computer business that was eventually bought out by Fujitsu. Sure enough the kit was built in Germany and comes with management tools called… ServerView. It felt like seeing an old friend again all those years later.

This machine also introduced me to a PC game that changed things forever, you could almost say it had Unforeseen Consequences…


I only got this disc because I was looking for backup and recovery tools at the time after the HDD incident…

I remember installing the demo of this new “Half-Life” game and being initially impressed by the design and atmosphere, even if I did have to run it at the very lowest setting for it to run anything like acceptably on the 2MB (!) internal Matrox graphics chip.

I nearly gave up on it when I couldn’t get past the army grunts armed with only a crowbar and a low-ammo handgun… then I beat that particular bit of the level and 21 years later I’m still just as hooked.

If, like me, you’re a massive Half-Life fan you need to see the next two links…

  • Unforeseen Consequences: A Half-Life Documentary – https://www.youtube.com/watch?v=BQLEW1c-69c
    A brilliantly researched and produced feature documentary on the background story of Half-Life
  • Half-Life: Echoes – https://www.moddb.com/mods/half-life-echoes
    This is simply stunning, 20+ years after the original Half-Life was released comes a mod that’s right up there with Valve’s releases. Download Half-Life on Steam, install this, set difficulty to Hard and put yourself right back where the story started!

Restoration

Fast forward some years to around 2002 and it had got to the point where I needed to move on to a faster AMD Athlon CPU and graphics card, so with regret I had to put the Siemens Nixdorf into retirement. Unfortunately around the same time the motherboard failed and it seemed that might be it, as the floppy drive rang out one last time before the boot screen went dark. I couldn’t face throwing out the machine so kept it tucked away, hoping I could fix it somehow.

It took until 2007, but one day I was browsing eBay and thought to randomly search for “Siemens Nixdorf Xpert” and by pure chance there it was: an identical machine listed for spares and repairs, and even better, starting at 99p. One catch: it was in Germany, of course it had to be! My knowledge of German only goes about as far as how to enable the mouse driver in DOS mode so I put my faith in Google Translate and waited…

The motherboard, manuals and RAM cost the grand total of €3 plus shipping, what a lucky find.

One replacement motherboard fitted and the machine sprang back into life, 10 years after it first started up 🙂 I’d always wanted to put the fastest available CPU in, so it was a nice bonus when the replacement board came with a Pentium 233MHz installed. I also found a brand new keyboard in a surplus store on eBay so added that as part of the rebuild.

Now repaired it was kept nearby for any retro pangs and recently moved with me into its new home, where it was time to emerge from hibernation once again. This time the hard drive needed a bit of… persuasion (yes, a well-placed thump) to free the disk heads and one Scandisk later it was back on the familiar desktop from all those years ago.

One item left on the list is the original CRT monitor, which stopped powering on some time back. I suspect a power board issue but need to find someone who’s happy working around CRTs as that’s one area I’m happy to leave well alone due to the high voltages present.

All that is old is new again

It seems that Windows 98 and associated machines have now hit the “retro” phase, so I’m glad I kept it all for that nostalgia blast; I never thought the sound of a floppy drive seek at boot would be so comforting!

Now it sits on the other side of the desk to my day-to-day gaming machine – the past and present side by side, just a monitor input change away…


Learn something new every day – troubleshooting DMX lighting

One thing I really enjoy about working in education is the wide range of tech that we get to work with day-to-day. Yesterday was no exception, with something a bit more exotic than the usual networking queries.

Due to our work setting up streaming media for our Creative Arts department I was asked by one of the technicians if I could help resolve an issue they were having with remotely managed lighting in our TV studio. With trusty ThinkPad in hand I wandered up to take a look.

The lighting itself was controlled by a small USB LimeLIGHT DX2 controller, which hooked into lighting dimmer units that in turn feed the individual lights themselves.

The software launched correctly but the unit itself was unresponsive, as were the lights when attempting to change brightness. We initially tried power cycling the dimmers, reconnecting the interface cables, reinstalling drivers and so on, but with no joy. We even tried a different brand of digital DMX controller with no luck.

I then read that the DMX cables need to form a continuous daisy chain and that one faulty fitting could cause the kind of issues we were experiencing. With that in mind we tried disconnecting sections of the chain until eventually realising it was some LED light fittings at the end that were causing the issue.

Upon closer inspection it was found that they’d been switched from DMX mode to manual; a quick reconfiguration by the media tech later we were back in business!

  

There’s something about AV cables that makes them rather satisfying to work with. Heavy-duty cables with big mechanical connectors definitely feel the part when connecting everything up!

 

Deploying an Azure AD Joined machine to existing hardware with MDT and Windows Autopilot

We’ve recently started a refresh of our staff laptops to provide a better remote working experience, as well as baselining a common standard for their configuration (Windows 10, Office 365, BitLocker etc.). At this point we were also faced with a decision on how best to deploy them:

  • domain-joined with DirectAccess
  • domain-joined with VPN
  • Azure AD Joined

https://techcommunity.microsoft.com/t5/Azure-Active-Directory-Identity/Azure-AD-Join-on-Windows-10-devices/ba-p/244005

Given that we’re heavy users of Office 365 I decided to give Azure AD Join a try and go for a cloud-native solution, rather than extending the reach of internal services outwards. One flaw in the plan is that I’m still trying to make a case for adding InTune to our EES agreement so have had to get a bit creative in terms of deployment rather than using MDM to do the heavy lifting.

Windows Autopilot

Whilst at Future Decoded last year I attended a demo of Windows Autopilot, which sounded like a very easy way to assign Windows 10 devices and get them up and running quickly.

Ref: https://docs.microsoft.com/en-us/windows/deployment/windows-autopilot/windows-10-autopilot

However, looking a bit more closely, it wouldn’t do here: without InTune we can’t install any additional software, and on top of that the devices we’re using need upgrading to Windows 10, rather than being nice fresh kit with the latest OS already installed. That said it still has a part to play in this deployment, more on that later…

MDT time

So with that initial thought discounted we turn back to trusty MDT. Having already gained its Windows 10 deployment stripes over summer, I wondered if there was a way to make a Task Sequence that would give a similar Autopilot experience but with a bit more flexibility around apps and re-using existing kit.

A fresh Task Sequence was created to handle the usual driver packs, generic Applications and machine naming. Now for the fun part: integrating Office 365 ProPlus and Azure AD Join!

Deploying Office 365 ProPlus

Before we get to the Azure AD Join we need to deploy some basic software to the machine such as Chrome, VLC and, of course, the Office apps. Internally we use Office 2016 ProPlus, but for these Azure AD Joined devices Office 365 ProPlus is a better bet to ensure smooth SSO with the Azure AD account.

Deployment of Office 365 ProPlus looks a lot simpler now than it was previously thanks to Microsoft creating a handy web app for generating the Configuration XML file you need for deployment.

First download the Office Deployment Tool
https://www.microsoft.com/en-us/download/details.aspx?id=49117

Then create the XML file using the Office Customization Tool, configuring your desired options
https://config.office.com/

Place all the files in a folder on your MDT server, along with a new file called install.cmd using the code from Rens Hollanders’ instructions (you don’t need to use his config.xml though as the one from the Customization Tool does the same job and is a bit more granular in terms of installation options)
http://renshollanders.nl/2015/08/office-365-updated-deployment-guide/
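For reference, install.cmd is essentially just a wrapper that calls the Office Deployment Tool with your configuration file; a minimal sketch, assuming setup.exe and config.xml sit alongside the script (Rens Hollanders’ version may differ slightly):

@echo off
REM Run the Office Deployment Tool in configure mode against the XML in this folder
"%~dp0setup.exe" /configure "%~dp0config.xml"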

Finally create an MDT Application (with source files pointing to the folder above) that runs install.cmd

In your Task Sequence add this and any further Applications you wish to install in the usual place under State Restore.

If using a Volume Licensing version of Windows I also create an Application to install the MAK product key at this point.

Preparing for Azure AD Join

Now at this point you have a pretty bog standard Task Sequence and might be wondering why we need to get back to the first-run wizard. The OOBE is our only chance to properly join Azure AD if we want to log into the machine using an Office 365 account; if you join later on you end up with a local account connected to Office 365 instead, which we don’t want.

The process of getting back to that OOBE wizard is simpler than you might think and just requires one command at the end of your Task Sequence

cmd /c c:\windows\system32\sysprep\sysprep.exe /oobe /quiet /quit

This does assume a couple of things though:

  • your Deployment Share is configured with SkipDomainMembership=YES and JoinWorkgroup=WORKGROUP
    these are already set on my Deployment Share as I prefer to Join Domain manually via a TS step; that way I can control exactly when domain policies come down to the machine during deployment, or in this case not join a Domain at all
  • your FinishAction in MDT is set to SHUTDOWN
    you can either set this at Deployment Share level or override it (as I do) for a single TS by adding this step early on… (see the CustomSettings.ini sketch below)
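For reference, at Deployment Share level those properties would look something like this in CustomSettings.ini (a sketch; adjust to taste, and note FinishAction can equally be set via a Set Task Sequence Variable step as described above):

[Default]
SkipDomainMembership=YES
JoinWorkgroup=WORKGROUP
FinishAction=SHUTDOWN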

With this configured the machine will automatically run sysprep and return to OOBE state, ready for the user (or admin) to join the machine to Azure AD via the first-run wizard.

Provisioning Packages and customisation

Now what we have so far is good, but we can go a bit further and add some InTune-esque customisation of the deployed system via the MDM method of Provisioning Packages. This will allow you to prepare an identical base set of machines that you can quickly customise by plugging a USB stick into them at the OOBE screen for any additional changes. It’s also a good insight into how the policies, or should I say CSPs (Configuration Service Providers) work.

To create a Provisioning Package you need to open the Windows ICD, which in turn is part of the ADK (don’t you just love acronyms!)
https://docs.microsoft.com/en-us/windows-hardware/get-started/adk-install

Windows Configuration Designer provisioning settings (reference)
https://docs.microsoft.com/en-us/windows/configuration/wcd/wcd

I initially started with this side of things via the Set up School PCs app but ended up manually opening up the package it built to take a look at exactly what it did. Not all the settings were required so I decided to build a new one from scratch, but it does give a good idea of what’s going on “under the hood” 🙂

Ref: https://docs.microsoft.com/en-us/education/windows/use-set-up-school-pcs-app

Applying the package file from a USB when OOBE appears works fine but I couldn’t resist the automated approach outlined in the post below to do it via MDT. If you’re using one of the latest builds of Windows 10 note the comment at the bottom that you don’t appear to need to sign the packages for 1803 onwards.

Ref: http://chrisreinking.com/apply-a-provisioning-package-with-mdt/

However I found that in MDT running the Add-ProvisioningPackage PowerShell cmdlet with a UNC path didn’t work, giving me “path not supported” errors.

There’s not much documentation online about using this particular cmdlet (most people seem to be using DISM instead) but I found that if you map a drive letter to the Application path it works fine. My code for this is below (also includes the nifty transcript logging wrapper from deploymentresearch.com so you get full visibility of the process in your MDT Logs folder)

# Determine where to do the logging 
# https://deploymentresearch.com/Research/Post/318/Using-PowerShell-scripts-with-MDT-2013

$tsenv = New-Object -COMObject Microsoft.SMS.TSEnvironment 
$logPath = $tsenv.Value("LogPath") 
$logFile = "$logPath\$($myInvocation.MyCommand).log"
$DeployRoot = $tsenv.Value("DeployRoot")

# Start the logging 
Start-Transcript $logFile
Write-Output "Logging to $logFile"

net use P: "$DeployRoot\Applications\Provisioning Package - Your Name"

Write-Output "Adding TrustedProvisioners Registry Key"
Start-Process -filepath "C:\windows\regedit.exe" -argumentlist "/s P:\TrustedProvisioners.reg"

Write-Output "Adding Provisioning Package from folder: $DeployRoot\Applications\Provisioning Package - Your Name mapped to P:"
Add-ProvisioningPackage -Path "P:\Provisioning Package - Your Name.ppkg" -ForceInstall

# Stop logging
Stop-Transcript

Note: you need to copy the entire contents of the folder where the signed exported Provisioning Package is created for the silent install to work correctly. Thanks to Dhanraj B for pointing this out in the Technet forums otherwise I may well have given up on it…

https://social.technet.microsoft.com/Forums/en-US/a01ad169-7aaa-42be-937a-e82169f88d4f/provisioning-and-code-signing?forum=win10itprosetup

If you followed the Chris Reinking instructions to the letter you should see a successful import in your logs, which looks something like this…

IsInstalled : False
PackageID : d69c654b-3546-4b77-abcd-93f09285c123
PackageName : Provisioning Package - Your Name
PackagePath : P:\Provisioning Package - Your Name.ppkg
Description :
Rank : 1
Altitude : 5001
Version : 1.17
OwnerType : ITAdmin
Notes :
LastInstallTime : 22/11/2018 15:51:52
Result : 0

0__Personalization_DeployDesktopImage.provxml
Category:Content
LastResult:Success
Message:Provisioning succeeded
NumberOfFailures:0 (0x0)

1__Personalization_DeployLockScreenImage.provxml
Category:Content
LastResult:Success
Message:Provisioning succeeded
NumberOfFailures:0 (0x0)

2__Personalization_DesktopImageUrl.provxml
Category:UxLockdown
LastResult:Success
Message:Provisioning succeeded
NumberOfFailures:0 (0x0)

3__Personalization_LockScreenImageUrl.provxml
Category:UxLockdown
LastResult:Success
Message:Provisioning succeeded
NumberOfFailures:0 (0x0)

Start Menu layout

One part of ICD that I found didn’t seem to work at all was Start Menu layout deployment. In Active Directory land it’s a simple GPO, but despite following the MS documentation to the letter it never applied, even though the rest of the Provisioning Package worked.

Instead of using the ICD method I took an easy way out and created another Application in MDT, which copies the desired LayoutModification.xml to the Default User AppData folder:

cscript.exe CopyFiles.vbs C:\Users\Default\AppData\Local\Microsoft\Windows\Shell

Get CopyFiles.vbs from here https://mdtguy.wordpress.com/2014/07/30/how-to-copy-folders-in-mdt-like-a-boss-the-easy-way/
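If you need to generate LayoutModification.xml in the first place, the usual approach is to arrange the Start Menu on a reference machine and export it with PowerShell (a quick sketch; the output path is just an example):

# Run on a reference machine with the Start Menu arranged as desired
Export-StartLayout -Path "C:\Temp\LayoutModification.xml"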

The final piece of the puzzle – automatic Azure AD Join

At this point we’re very close to having a hands-off deployment but we need a method to join the machine to Azure AD itself. Of course you can do this manually if you wish; the machine can simply be handed to a user to complete the OOBE wizard and they’ll be able to log in with their Office 365 credentials.

Be mindful that the user who performs the Azure AD Join becomes local admin on that device. If you don’t want this you’ll need to use the method below to get a bit more control.

Initially I thought the Azure AD Join would be simple as there’s an option sitting right there in the ICD wizard
https://docs.microsoft.com/en-us/intune/windows-bulk-enroll

However the URL gives away what seems to be the crux of the issue: you need an InTune license to get the Bulk Token 😦 When I tried it, it failed miserably with the error “Bulk token retrieval failed”.

Again documentation on this isn’t exactly forthcoming but there’s one particular Technet post by a Microsoft staffer that suggests it is limited by licensing.

“By the way, the error message “Bulk token retrieval failed” might be caused if there are no more licenses available for Azure AD Premium and Microsoft Intune”

Interestingly the Set up School PCs app is able to join to Azure AD in bulk, which suggests Microsoft can enable this functionality for non-InTune users when they feel like it, but seemingly not for ICD use.

Windows Autopilot returns

Remember I said I’d return to Autopilot… well, “later” in the post has now arrived and it’s time to make use of it.
https://docs.microsoft.com/en-us/windows/deployment/windows-autopilot/windows-autopilot

Without InTune we can still configure Autopilot using the Microsoft Store for Business (or Microsoft Store for Education if you’re an edu user). You’ll need to set it up if you haven’t already…
https://docs.microsoft.com/en-gb/microsoft-store/windows-store-for-business-overview

and then access it via one of the links below

https://businessstore.microsoft.com/en-gb/store
https://educationstore.microsoft.com/en-gb/store

Now register your devices: upload the hardware hashes gathered from each machine into the Store portal, then create an Autopilot deployment profile and assign it to the imported devices.

At this point once your freshly imaged device hits the OOBE screen it’ll connect to Autopilot, apply the profile settings and skip all screens apart from the bare minimum input required from the user for keyboard layout and login info. Once they log in Office 365 will already be installed and visible on the (customised) Start Menu, along with any other apps you provisioned via the Task Sequence, and the device will be branded up as per your organisation’s design (provided you’ve configured this in the Azure Portal).

Note: during my testing Autopilot didn’t seem to work so well with a Hyper-V VM. The gather script managed to obtain 3 different sets of hardware hashes for the same VM on 3 separate image attempts, whereas on physical laptops the data was gathered consistently. One to keep an eye on, but it was third time lucky here, which allowed for some nice screenshots of the end product…
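For completeness, the gather script mentioned above is along the lines of the Get-WindowsAutoPilotInfo script from the PowerShell Gallery (a sketch; the exact script and output path used here may differ):

# Grab the script from the PowerShell Gallery, then export this machine's hardware hash
Install-Script -Name Get-WindowsAutoPilotInfo -Force
Get-WindowsAutoPilotInfo.ps1 -OutputFile C:\Temp\AutopilotHashes.csv -Append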

Multiple user access

An interesting observation during this process was the mysterious appearance of the “Other user” button, which I’d been chasing on Azure AD Joined machines for a while before this but without much joy. As you can imagine I was quite pleased when it popped up after running the first couple of test Task Sequences.

I’m not sure if it’s a recent Windows Update, enabling some of the “Shared PC” settings in ICD or (more likely) the addition of Azure AD Premium P1 licenses on our Office 365 tenant, but it makes an Azure AD Joined machine much more usable for our use case, where machines need to be loaned out to multiple members of staff.

If anyone can clear up this quandary it’d be great to hear from you!


Note the use of the branded login screen to add some additional instructions for the user to help them log in for the first time 🙂


And here’s one we made earlier…

Not bad eh 😉

Troubleshooting

If Autopilot doesn’t work check network connectivity first, particularly if using a proxy server internally

https://docs.microsoft.com/en-us/windows/deployment/windows-autopilot/windows-autopilot-requirements-network
https://docs.microsoft.com/en-us/windows/deployment/windows-autopilot/troubleshooting

 

image credit: blickpixel – https://pixabay.com/en/gift-made-surprise-loop-christmas-548290/

MDT – Windows 10 deployment watch

With our MDT environment up and running we’ve been refining our Windows 10 build over the past couple of months, sending out pilot builds to specific areas so we’re confident in the process when it comes to large-scale deployment over summer.

This post focuses on a few Windows 10-specific tweaks that we’ve made to the Task Sequence that may be of interest…

Thin image approach

In the past I was a fan of what could be called a Hybrid image model, in as much as I’d create a “Base” Reference image in a VM, usually comprising Windows + Office + Updates. That would get captured and become the WIM file that goes into the Task Sequence.

However with Windows 10 I’ve decided to go down the completely thin approach that’s best represented as either a sandwich or hamburger depending on your culinary preference (!) Effectively the deployment gets built from its component parts, starting from an unaltered source Windows 10 WIM file extracted from its parent ISO image.

In our case we’ve settled on Education 1709 x64 as the build to deploy, due to some useful features such as OneDrive Files on Demand and Windows Defender Exploit Guard. Along the way we’ve also used the 1607 and 1703 builds. The advantage of the Thin image method is that we can swap the OS out at will with two clicks, rather than having to go through a Capture process that seems to have the potential for error.

Secure Boot validation

Windows 10 1709 brought in some new security features which benefit from machines being converted to UEFI rather than BIOS mode, and in some cases (Windows Defender Credential Guard) need Secure Boot too. Seeing as we need to convert older machines from BIOS to UEFI anyway it made sense to enable Secure Boot at the same time.

Ref: https://docs.microsoft.com/en-us/windows/whats-new/whats-new-windows-10-version-1709

The question was how to ensure that a machine is correctly configured before starting the imaging process (as converting later on is far from ideal).

The answer is to run cmd.exe so that it returns a non-zero exit code if either (or both) of the following is true:

  1. the Task Sequence variable IsUEFI is false
  2. the UEFISecureBootEnabled registry value is 0

If the machine is configured incorrectly the Task Sequence will fail before it even starts to pull down the image. To ensure you catch it early enough add the step here:

Putting the two together looks like this:
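The screenshots of the actual steps aren’t reproduced here, but as a sketch the Secure Boot side of the check can be a Run Command Line step along these lines, which returns a non-zero code when the UEFISecureBootEnabled value isn’t 1 (the IsUEFI side is handled by a condition on the MDT-supplied IsUEFI task sequence variable):

cmd.exe /c reg query HKLM\SYSTEM\CurrentControlSet\Control\SecureBoot\State /v UEFISecureBootEnabled | find "0x1"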

  

Removing the cruft

Sadly despite Microsoft giving Education our very own specific build of Windows they didn’t extend the effort into cleaning up the junk that gets pushed down with a standard Windows 10 installation. Seriously who wants Candy Crush on their business machines?!

Fortunately scripts exist to assist with cleaning up the junk shipped with the OS so it’s suitable for deployment. We could do this with DISM at image level, but again my aim is to avoid tinkering with the Microsoft media if possible so I prefer the following PowerShell method…

PowerShell: Removing UWP apps from Windows 10 1607/1703/1709
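As a minimal sketch of the general PowerShell approach (the linked script is far more thorough), stripping a single provisioned app looks something like this:

# Remove a provisioned app so it isn't installed for each new user profile
Get-AppxProvisionedPackage -Online |
    Where-Object { $_.DisplayName -like "*CandyCrush*" } |
    Remove-AppxProvisionedPackage -Online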

Disable Refresh \ Reset

Another Windows 10-specific tweak is to disable the Refresh \ Reset menu that users can access either by using the Settings app or by holding shift while a machine reboots. In our case we don’t want users to wipe their machine clean of provisioned applications and it appears that this functionality will work even without local admin rights (!)

The solution to this one came via the EduGeek forums courtesy of ErVaDy using bcdedit commands:

Ref: http://www.edugeek.net/forums/windows-10/164236-preventing-shift-restart-into-recovery-mode-5.html

Place the commands below into a batch file and run as an Application or Task Sequence step:

reagentc /disable
bcdedit /deletevalue {current} recoverysequence
bcdedit /set {bootmgr} bootems off
bcdedit /set {bootmgr} advancedoptions off
bcdedit /set {bootmgr} optionsedit off
bcdedit /set {bootmgr} recoveryenabled off
bcdedit /set {current} bootems off
bcdedit /set {current} advancedoptions off
bcdedit /set {current} optionsedit off
bcdedit /set {current} bootstatuspolicy IgnoreAllFailures
bcdedit /set {current} recoveryenabled off

Updating OneDrive Files on Demand Client

In a way that only Microsoft can manage, Windows 10 1709 shipped with an old version of the OneDrive client that doesn’t support the much-anticipated Files on Demand feature straight out of the box 😦

Although the client does auto-update we didn’t want any automatic sync starting without the placeholder functionality being in place so I’ve scripted an Application in the MDT Task Sequence to take ownership of the file on the newly deployed image, copy the latest version of the client over and then set everything back as it was.

For more details and the script itself please see my previous post OneDrive Files on Demand – update!

Pre-staging printer drivers

During our Windows 10 deployment we’re also migrating to a new set of Windows Print Servers, along with new GPOs to map them. However in initial testing I noted the first user to log in had a long wait whilst drivers were copied down from the server and installed.

Although subsequent logins won’t get this issue it doesn’t give a good first impression to the initial user so I wanted to find a way around it.

Step forward the very useful printui.dll 🙂

Ref: https://larslohmann.blogspot.co.uk/2013/12/install-printer-driver.html

Because we’ve rationalised our print fleet over the past few years in a move towards MFDs I only have 3 drivers to cover the entire range of hardware. By using a script method I can then pre-stage the drivers onto the machine at image time and speed up that first logon significantly!

Again paste this into a batch file and call it as an Application (use an Application step rather than a Run Command Line, as you want the driver files copied into the Deployment Share)

cscript "prndrvr.vbs" -a -m "HP Universal Printing PCL 6" -i "%CD%\HP Universal Print Driver\pcl6-x64-6.4.x.xxxxx\hpcu196u.inf"

Note the use of %CD% to ensure the path to the driver file is resolved correctly!

WSUS resources

Although there’s nothing special about running Windows Updates in MDT (use the built-in Task Sequence steps) we noticed that our WSUS server was struggling and sometimes hung the “Install Updates” step of the Sequence. The WSUS console then became unresponsive on the server end too.

After further research it turned out our increasing number of machines needs more than the default 2GB memory limit on the WSUS IIS Application Pool to handle all the connections. After making the change below it’s back to being stable again.

Ref: https://sysadminplus.blogspot.co.uk/2016/11/wsus-console-crashed-after-running-some.html

Ref: https://www.saotn.org/wsuspool-keeps-crashing-stops

Ref: https://blogs.technet.microsoft.com/configurationmgr/2017/08/18/high-cpuhigh-memory-in-wsus-following-update-tuesdays
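The change itself is simply raising (or removing) the private memory limit on the WSUS application pool; a sketch in PowerShell, assuming the default pool name of WsusPool:

Import-Module WebAdministration

# 0 removes the private memory limit entirely; a fixed value in KB also works (e.g. 4194304 for 4GB)
Set-ItemProperty IIS:\AppPools\WsusPool -Name recycling.periodicRestart.privateMemory -Value 0
Restart-WebAppPool -Name WsusPool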

Run WinSAT

An oldie-but-goodie; running the WinSAT assessment tool at the end of setup will make sure your machine is properly benchmarked and appropriate performance tuning is performed by Windows. It doesn’t take long so I thought it worth continuing with:

Ref: https://deploymentresearch.com/Research/Post/624/Why-adding-WinSAT-formal-to-your-task-sequence-can-be-a-shiny-thing-to-do

Just add a Run Command Line step with the following in the box:

winsat.exe formal

Adobe CC and the case of the vanishing Explorer

As far as blog titles go this one has ended up sounding more like an adventure book than a technical post! Unfortunately not that exciting but useful nonetheless.

My colleague Tristan Revell has recently been building new installer packages for our Adobe CC apps but ran into an odd-yet-irritating bug where explorer.exe would disappear during the install process and not be restarted at the end, leaving the user stranded on a blank desktop.

It didn’t happen every time either, so the behaviour looked to be rather unpredictable. Everything else about the install on the Adobe side went through fine.

Upon reading around the support forums it seems to be an issue Adobe have been aware of for some time but still not fixed (first post in 2013 and still being reported 5 years later!)

Ref: https://forums.adobe.com/thread/1351913?start=40&tstart=0

Whilst trying to find a solution I remembered a useful script from a while back that checked if a process was running and then took action based on the result. It looked perfect to use here so we tried adding it as a post-install action on our ZCM Bundle.

' Relaunch explorer.exe if it isn't running (e.g. after the Adobe CC installer has killed it)
Dim strComputer, FindProc, oShell
Dim objWMIService, colProcessList

Set oShell = WScript.CreateObject("WScript.Shell")

strComputer = "."
FindProc = "explorer.exe"

' Query WMI for any running process whose name matches explorer.exe
Set objWMIService = GetObject("winmgmts:" _
    & "{impersonationLevel=impersonate}!\\" & strComputer & "\root\cimv2")
Set colProcessList = objWMIService.ExecQuery _
    ("Select Name from Win32_Process WHERE Name LIKE '" & FindProc & "%'")

If colProcessList.Count > 0 Then
    'wscript.echo FindProc & " is running"
Else
    'wscript.echo FindProc & " is not running"
    oShell.Run "explorer.exe"
End If

Set oShell = Nothing
Set objWMIService = Nothing
Set colProcessList = Nothing

Et voila, if explorer disappears during install it comes back neatly at the end, and if it’s already running no action is taken. Sorted.

Here’s our new ZCM 2017 “My Apps” window taking shape with the new Adobe CC packages ready to go, doesn’t it look pretty 🙂

Image credit: Photo by Stefan Stefancik from Pexels https://www.pexels.com/photo/light-mountains-sky-night-42148/

MDT imaging megapost – part 2 (database automation)

With MDT installed we initially used some basic out-the-box Task Sequences to get up and running. Deployment worked as expected but it was quite a manual process (entering the machine name, selecting Applications to install and so on).

On our old ZCM \ Windows 7 imaging project we were starting from scratch to some extent with a lot of new hardware, so entering certain information manually at image time was actually desired behaviour. Not so much now with a fairly settled estate and ever increasing time pressures – automation is the name of the game.

As such the database-driven model now makes a lot more sense as we were able to export a list of machines and roles from ZENWorks so MDT could “know” what it needs to do with a machine rather than anyone needing to tell it.

SQL Installation

Nice and simple (free too) with SQL Express as per the previous post. One thing you need to watch out for is to ensure Named Pipes are enabled in SQL Server Configuration Manager or you’ll get errors when trying to connect to the database remotely.

Ref: http://www.vkernel.ro/blog/creating-and-configuring-the-mdt-database

Now go ahead and create the database itself…

Ref: https://docs.microsoft.com/en-us/windows/deployment/deploy-windows-mdt/use-the-mdt-database-to-stage-windows-10-deployment-information

Managing the database

The MDT console is functional when it comes to managing the database but it’s not the ideal interface, especially if you need to make a lot of changes as MMC can be somewhat clunky at times. Although you can use the MDT Workbench remotely it’s not perhaps something you’d want to give everyone access to.

However, there is a better way 🙂

Whilst browsing forums I came across a link to a brilliant little tool called MDT Administrator, currently hosted on the soon-to-be-defunct CodePlex site. Although Microsoft say an archive will be kept running, how long for is anyone’s guess, so keep a copy saved somewhere safe!

Ref: https://mdtadmin.codeplex.com

It’s a nifty HTA-based front-end that provides a much slicker way to manage your database. Adding and removing Roles is much quicker in particular, which is something we use a lot (more on that later).

One additional tweak to the setup was to create a new group of MDT Database Admins who were granted write access against their SQL login. This meant we could delegate management of the computer records in the database to technicians without needing to open up access to the full Deployment Workbench interface. Perfect for on-the-go updates as machines are moved around and replaced.

Restarting deployment

Sometimes we’ll come across a machine that isn’t in the database, usually something that’s been on the shelf for a while or a laptop that’s been “off the grid” and come back for reimaging. In those cases you only find out that there’s no record after the deployment wizard has started and you get offered a randomly-generated name starting with MININT.

You can also check the ZTIGather.log file to see what information was found about the machine and whether any matching records were returned from the database. This step can be handy to troubleshoot unexpected behaviours caused by something a bit out of the ordinary, e.g. DMI information entered into the BIOS incorrectly by the manufacturer, which has happened to us a few times.

To save yourself an unwanted reboot after amending a record in the database, hit F8 whilst at the deployment wizard (assuming you’re in a PXE boot environment) then type in the magic command

wpeinit

Deployment will now restart with a fresh “Gather” phase and query the database again to pick up your new record; you should then see the correct name appear in the deployment wizard.

Bulk operations

Picture the situation… you’ve had a batch of 100 new laptops arrive, who gets the painful job of entering them into the database? Answer: PowerShell!

If manual data entry leaves you cold you’ll love the next set of scripts, allowing you to create a CSV of import data and then run one command; et voila, lots of effort (and fingers) saved.

First though you need to do a little fix on the database:

Ref: https://syscenramblings.wordpress.com/2016/01/15/mdt-database-the-powershell-module-fix/

The package comes in two parts:

  1. PowerShell cmdlets: https://blogs.technet.microsoft.com/mniehaus/2009/05/14/manipulating-the-microsoft-deployment-toolkit-database-using-powershell/
  2. Import Check script: https://deploymentbunny.com/2016/04/22/os-deployment-using-the-powershell-to-work-with-the-mdt-database-module-sample-1/

The check script is rather important, as without it MDT will quite happily create duplicate records and you don’t want that! If you don’t want to do the additional checks in Active Directory you can disable those sections by commenting them out.

I then made some changes to the Import Check script so it would process a CSV file to do all the work in one go. One big change was to replace the BREAK sections with CONTINUE as I didn’t want one duplicate record error to prevent the rest of the import from running. It seems to work for me but I’d advise testing that yourself before doing the same.

Ref: http://www.computerperformance.co.uk/powershell/powershell_continue.htm
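For the curious, the CSV-driven part boils down to something like the sketch below, assuming the MDTDB PowerShell module from the first link (the CSV columns and server name are made up for illustration, and the duplicate \ AD checks from the Import Check script are omitted for brevity):

Import-Module .\MDTDB.psm1
Connect-MDTDatabase -sqlServer "YOURSQLSERVER" -database "MDT"

# computers.csv with columns: Name,MAC (an example layout)
Import-Csv .\computers.csv | ForEach-Object {
    New-MDTComputer -macAddress $_.MAC -description $_.Name -settings @{ OSDComputerName = $_.Name }
}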

Roles

Another part of the database that comes in really useful is Roles. In our case we install different software for machines deployed in classrooms than for those that go in offices. On our previous ZCM imaging system I made a custom script for the technician to select the machine type, but now we can automate that via the database.

Once a Role is assigned to a machine specific Applications can be assigned. That’s neat in itself but for added flexibility you can also then query the Roles during Task Sequence execution to take specific actions based on what type of machine you’re dealing with.

Ref: https://docs.microsoft.com/en-us/windows/deployment/deploy-windows-mdt/assign-applications-using-roles-in-mdt

At the moment I’ve stuck to only using one Role per machine in the database to make life easy for myself in the Task Sequence, the reason being that I know when I query the TS variable “Role001” it will always return the data I’m looking for, i.e. is this a classroom machine or one in an office? In an ideal world I’d test with multiple Roles per machine to see what order they’re returned in and split things out a bit, but I’m short on time and this method works for what we need.

During the Task Sequence I can then use WMI queries to get the granularity required to deploy software for specific machines, more on that in a later post…