Adobe CC and the case of the vanishing Explorer

As far as blog titles go this one sounds more like an adventure book than a technical post! Unfortunately it's not that exciting, but it's useful nonetheless.

My colleague Tristan Revell has recently been building new installer packages for our Adobe CC apps but ran into an odd-yet-irritating bug where explorer.exe would disappear during the install process and not be restarted at the end, leaving the user stranded on a blank desktop.

It didn’t happen every time either, so the behaviour looked rather unpredictable. Everything else on the Adobe side of the install went through fine.

Reading around the support forums, it seems to be an issue Adobe have been aware of for some time but still haven't fixed (first post in 2013 and still being reported 5 years later!).

Ref: https://forums.adobe.com/thread/1351913?start=40&tstart=0

Whilst trying to find a solution I remembered a useful script from a while back that checked if a process was running and then took action based on the result. It looked perfect for use here so we tried adding it as a post-install action on our ZCM Bundle.

' Check whether a given process is running and restart it if not
Dim strComputer, FindProc, oShell, objWMIService, colProcessList

Set oShell = WScript.CreateObject("WScript.Shell")

strComputer = "."
FindProc = "explorer.exe"

' Query WMI on the local machine for any process matching the name
Set objWMIService = GetObject("winmgmts:" _
    & "{impersonationLevel=impersonate}!\\" & strComputer & "\root\cimv2")
Set colProcessList = objWMIService.ExecQuery _
    ("Select Name from Win32_Process WHERE Name LIKE '" & FindProc & "%'")

If colProcessList.Count > 0 Then
    'wscript.echo FindProc & " is running"
Else
    'wscript.echo FindProc & " is not running"
    oShell.Run "explorer.exe"
End If

Set oShell = Nothing
Set objWMIService = Nothing
Set colProcessList = Nothing
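
If you'd rather do the same check in PowerShell, the equivalent logic is only a couple of lines; a minimal sketch rather than what we actually deployed:

# Restart Explorer if it isn't already running
if (-not (Get-Process -Name explorer -ErrorAction SilentlyContinue)) {
    Start-Process explorer.exe
}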

Et voila, if explorer disappears during the install it comes back neatly at the end; if it’s already running no action is taken. Sorted.

Here’s our new ZCM 2017 “My Apps” window taking shape with the new Adobe CC packages ready to go – doesn’t it look pretty 🙂

Image credit: Photo by Stefan Stefancik from Pexels https://www.pexels.com/photo/light-mountains-sky-night-42148/

MDT imaging megapost – part 2 (database automation)

With MDT installed we initially used some basic out-the-box Task Sequences to get up and running. Deployment worked as expected but it was quite a manual process (entering the machine name, selecting Applications to install and so on).

On our old ZCM \ Windows 7 imaging project we were starting from scratch to some extent with a lot of new hardware, so entering certain information manually at image time was actually desired behaviour. Not so much now, with a fairly settled estate and ever-increasing time pressures – automation is the name of the game.

As such the database-driven model now makes a lot more sense: we were able to export a list of machines and roles from ZENworks so MDT can “know” what it needs to do with a machine rather than anyone needing to tell it.

SQL Installation

Nice and simple (free too) with SQL Express as per the previous post. One thing to watch out for: ensure Named Pipes are enabled in SQL Server Configuration Manager or you’ll get errors when trying to connect to the database remotely.

Ref: http://www.vkernel.ro/blog/creating-and-configuring-the-mdt-database
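
A quick way to prove remote connectivity before pointing anything at the database is a plain .NET connection test from another machine; the server name MDT01, instance SQLEXPRESS and database name MDT below are all assumptions – substitute your own:

# Quick remote connectivity test against the MDT SQL instance
$connString = "Server=MDT01\SQLEXPRESS;Database=MDT;Integrated Security=True"
$conn = New-Object System.Data.SqlClient.SqlConnection -ArgumentList $connString
try {
    $conn.Open()
    "Connection OK"
} catch {
    "Connection failed: $($_.Exception.Message)"
} finally {
    $conn.Close()
}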

Now go ahead and create the database itself…

Ref: https://docs.microsoft.com/en-us/windows/deployment/deploy-windows-mdt/use-the-mdt-database-to-stage-windows-10-deployment-information

Managing the database

The MDT console is functional when it comes to managing the database but it’s not the ideal interface, especially if you need to make a lot of changes, as MMC can be somewhat clunky at times. Although you can use the MDT Workbench remotely, it’s not something you’d necessarily want to give everyone access to.

However, there is a better way 🙂

Whilst browsing the forums I came across a link to a brilliant little tool called MDT Administrator, currently hosted on the soon-to-be-defunct CodePlex site. Although Microsoft say an archive will be kept running, how long for is anyone’s guess, so keep a copy saved somewhere safe!

Ref: https://mdtadmin.codeplex.com

It’s a nifty HTA-based front-end that provides a much slicker way to manage your database. Adding and removing Roles is much quicker in particular, which is something we use a lot (more on that later).

One additional tweak to the setup was to create a new group of MDT Database Admins and grant it write access via a SQL login. This meant we could delegate management of the computer records in the database to technicians without needing to open up access to the full Deployment Workbench interface – perfect for on-the-go updates as machines are moved around and replaced.
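
In T-SQL terms the grant looks something like the sketch below, run via Invoke-Sqlcmd from the SQL Server PowerShell tools; the group name DOMAIN\MDT-DB-Admins, server MDT01\SQLEXPRESS and database name MDT are made up for illustration:

# Create a login for the AD group and grant read/write on the MDT database
Invoke-Sqlcmd -ServerInstance 'MDT01\SQLEXPRESS' -Query @"
CREATE LOGIN [DOMAIN\MDT-DB-Admins] FROM WINDOWS;
USE MDT;
CREATE USER [DOMAIN\MDT-DB-Admins] FOR LOGIN [DOMAIN\MDT-DB-Admins];
ALTER ROLE db_datareader ADD MEMBER [DOMAIN\MDT-DB-Admins];
ALTER ROLE db_datawriter ADD MEMBER [DOMAIN\MDT-DB-Admins];
"@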

Restarting deployment

Sometimes we’ll come across a machine that isn’t in the database, usually something that’s been on the shelf for a while or a laptop that’s been “off the grid” and come back for reimaging. In those cases you only find out that there’s no record after the deployment wizard has started and you get offered a randomly-generated name starting with MININT.

You can also check the ZTIGather.log file to see what information was found about the machine and whether any matching records were returned from the database. This step can be handy for troubleshooting unexpected behaviours caused by something a bit out of the ordinary, e.g. DMI information entered into the BIOS incorrectly by the manufacturer, which has happened to us a few times.

To save yourself an unwanted reboot after amending a record in the database, hit F8 whilst at the deployment wizard (assuming you’re in the PXE environment) then type in the magic command:

wpeinit

Deployment will now restart with a fresh “Gather” phase and query the database again to pick up your new record; you should then see the correct name appear in the deployment wizard.

Bulk operations

Picture the situation… you’ve had a batch of 100 new laptops arrive, who gets the painful job of entering them into the database? Answer: PowerShell!

If manual data entry leaves you cold you’ll love the next set of scripts, which allow you to create a CSV of import data then run one command – et voila, lots of effort and fingers saved.

First though you need to do a little fix on the database:

Ref: https://syscenramblings.wordpress.com/2016/01/15/mdt-database-the-powershell-module-fix/

The package comes in two parts:

  1. PowerShell cmdlets: https://blogs.technet.microsoft.com/mniehaus/2009/05/14/manipulating-the-microsoft-deployment-toolkit-database-using-powershell/
  2. Import Check script: https://deploymentbunny.com/2016/04/22/os-deployment-using-the-powershell-to-work-with-the-mdt-database-module-sample-1/

The check script is rather important as without it MDT will quite happily create duplicate records, and you don’t want that! If you don’t want the additional checks against Active Directory you can disable those sections by commenting them out.

I then made some changes to the Import Check script so it processes a CSV file and does all the work in one go. One big change was to replace the BREAK statements with CONTINUE as I didn’t want one duplicate-record error to stop the rest of the import. It seems to work for me but I’d advise testing that yourself before doing the same.

Ref: http://www.computerperformance.co.uk/powershell/powershell_continue.htm
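
To give a flavour of the end result (this is an illustration, not our exact script), a CSV-driven import using the cmdlets from Niehaus’s MDT database module might look like this; the module path, CSV columns and server details are all assumptions:

# Load the MDT database module and connect to the database
Import-Module C:\Scripts\MDTDB.psm1
Connect-MDTDatabase -sqlServer 'MDT01' -instance 'SQLEXPRESS' -database 'MDT'

# CSV columns assumed: Name,MacAddress
foreach ($pc in Import-Csv C:\Scripts\NewLaptops.csv) {
    # Skip existing records rather than create duplicates
    if (Get-MDTComputer -macAddress $pc.MacAddress) {
        Write-Warning "$($pc.Name) is already in the database - skipping"
        continue
    }
    New-MDTComputer -macAddress $pc.MacAddress -description $pc.Name `
        -settings @{ OSDComputerName = $pc.Name; OSInstall = 'YES' }
}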

Roles

Another part of the database that comes in really useful is Roles. In our case we install different software on machines deployed in classrooms to those that go in offices. On our previous ZCM imaging system I made a custom script for the technician to select the machine type, but now we can automate that via the database.

Once a Role is assigned to a machine, specific Applications can be assigned to it. That’s neat in itself, but for added flexibility you can also query the Roles during Task Sequence execution to take specific actions based on the type of machine you’re dealing with.

Ref: https://docs.microsoft.com/en-us/windows/deployment/deploy-windows-mdt/assign-applications-using-roles-in-mdt

At the moment I’ve stuck to one Role per machine in the database to make life easy for myself in the Task Sequence. That way I know that when I query the TS variable “Role001” it will always return the data I’m looking for, i.e. is this a classroom machine or one in an office? In an ideal world I’d test with multiple Roles per machine to see what order they’re returned in and split things out a bit, but I’m short on time and this method works for what we need.
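
For reference, reading that variable from a PowerShell script step looks something like the sketch below; the Microsoft.SMS.TSEnvironment COM object is available while a Lite Touch Task Sequence is running, and the role name is made up:

# Read the first assigned Role from the running Task Sequence
$tsenv = New-Object -ComObject Microsoft.SMS.TSEnvironment
$role = $tsenv.Value('Role001')

if ($role -eq 'Classroom') {
    # e.g. trigger classroom-specific actions here
    Write-Output 'Classroom machine detected'
}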

During the Task Sequence I can then use WMI queries to get the granularity required to deploy software for specific machines, more on that in a later post…

OneDrive Files on Demand – update!

After our initial post on getting the new Windows 10 1709 OneDrive client up and running with Files on Demand we had one or two little snags left to fix. Both are now resolved, so I thought I’d make a quick ICYMI post to cover the final pieces of the puzzle and get everything up and running perfectly 🙂

Outdated client on the image

In true MS fashion the 1709 ISO ships with the old OneDrive client (epic fail), which means users face an annoying wait while it updates. There’s also the possibility of starting off with the wrong client and therefore syncing files down by mistake.

I was trying out an updater script that would copy over the new client but didn’t have much success in MDT. After looking more closely at the logs with CMTrace I could see it failing on the copy operation, so I added a Suspend action and tried each step manually. That flagged up an access denied error.

I then realised that MDT runs its scripts as the local Administrator user rather than SYSTEM as SCCM would, so the script’s permissions needed tweaking for MDT use:

rem Take ownership of the bundled installer and grant Administrator full control
%SYSTEMROOT%\system32\takeown /f %SYSTEMROOT%\SysWOW64\OneDriveSetup.exe >> %SYSTEMROOT%\logs\Onedrive.log
%SYSTEMROOT%\system32\icacls %SYSTEMROOT%\SysWOW64\OneDriveSetup.exe /Grant Administrator:(F) >> %SYSTEMROOT%\logs\Onedrive.log
rem Overwrite the outdated client with the freshly downloaded copy
copy /Y OneDriveSetup.exe %SYSTEMROOT%\SysWOW64\OneDriveSetup.exe >> %SYSTEMROOT%\logs\Onedrive.log
rem Remove the extra permission again afterwards
%SYSTEMROOT%\system32\icacls %SYSTEMROOT%\SysWOW64\OneDriveSetup.exe /Remove Administrator:(F) >> %SYSTEMROOT%\logs\Onedrive.log

This works like a charm! The updated client is installed during the Task Sequence and the first run as a user now begins with the 2017 client.

I’m also thinking of setting up a scheduled task on the MDT server to pull down the latest OneDrive client at regular intervals so the Task Sequence always deploys the latest version. That should do the trick until Microsoft see sense and push it out properly via WSUS.
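
Something like the sketch below would be the starting point; the download URL is a placeholder (grab the current link from the OneDrive release notes page) and the destination path is an assumption:

# Fetch the latest OneDrive client into the MDT application source folder each week
$url  = 'https://example.com/OneDriveSetup.exe'  # placeholder - use the real download link
$dest = 'D:\DeploymentShare\Applications\OneDrive\OneDriveSetup.exe'
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument "-NoProfile -Command Invoke-WebRequest -Uri '$url' -OutFile '$dest'"
$trigger = New-ScheduledTaskTrigger -Weekly -DaysOfWeek Sunday -At 3am
Register-ScheduledTask -TaskName 'Update OneDrive client' -Action $action -Trigger $trigger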

Silently configure OneDrive using the primary Windows account

The final piece of the puzzle is to make the client log in via SSO so users get a fully configured OneDrive without any additional login prompts. I was puzzled when this didn’t work initially, as the GPO looks straightforward but didn’t seem to do anything.

I’d read that the SSO relies on ADAL (aka modern authentication) so I initially wondered if our SSO provider hadn’t implemented it yet. That didn’t seem to make much sense as ADAL has been out for a while now, so I hit Google a bit harder to try and find some further detail.

Soon I came to this page, which I’m sure I’ve seen before:

Ref: https://support.office.com/en-gb/article/Use-Group-Policy-to-control-OneDrive-sync-client-settings-0ecb2cf5-8882-42b3-a6e9-be6bda30899c#silentconfig

The key (pun not intended, honest!) is the EnableADAL.reg file that’s squirrelled away at the bottom of the page. Deploy that via GPP et voila, one perfect blue OneDrive icon without any user interaction 🙂
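
If you’d rather not push the .reg file around, the same value can be set from PowerShell in the user’s context – the file simply sets EnableADAL to 1 under the per-user OneDrive key:

# Equivalent of EnableADAL.reg - enable modern authentication for the sync client
New-ItemProperty -Path 'HKCU:\SOFTWARE\Microsoft\OneDrive' -Name 'EnableADAL' `
    -PropertyType DWord -Value 1 -Force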

What next?

Having got Files on Demand working how we want with minimal cache, SSO and the latest client we can now move onto piloting it with our users. I’ve been tweaking Windows 10 GPOs today for some of the newer features such as Windows Defender Security Center, Exploit Protection etc. so the configuration is looking good enough for some early adoption!

OneDrive Files on Demand – first steps

After much anticipation and playing with Windows Insider previews, OneDrive Files on Demand finally hit general release alongside Windows 10 1709 (Fall Creators Update) the other week. I’ve been giving it a test drive over the past week or two along with fellow Network tech Matt Stevens – here are a few of our observations so far, along with workarounds for a couple of teething issues.

Windows 10 build

There is one pretty important requirement to bear in mind with the new Files on Demand feature; it’s only available in build 1709 and above. That means you need to be on the semi-annual (aka CB) branch rather than the LTSB route that some people have taken.

Ref: https://blog.juriba.com/windows-10-branching-timeline

It’s new features like Files on Demand that make the additional work of staying up-to-date worthwhile; so far we have a couple of hundred laptops running 1703 without too much fuss, so 1709 should slot in fairly smoothly as we now build our images layer-by-layer using only the pure Microsoft WIM as a starting point.

We tamed (nuked) the built-in apps via a very handy PowerShell script we found online (also see the alternative version here) that runs during MDT deployment, and the Start Menu default tiles are cleaned up via a GPO layout file. Configure your Windows Store for Business (or Education, as the case would be), tweak a few more policies for Cortana, Telemetry etc. and Windows 10 becomes much more manageable even on the latest build.
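
We use the linked script as-is, but the heart of the approach is just the provisioned-package cmdlets; a minimal sketch with a made-up removal list:

# Remove selected built-in apps from the image so new users never receive them
$unwanted = @('Microsoft.ZuneMusic', 'Microsoft.XboxApp', 'Microsoft.SkypeApp')
Get-AppxProvisionedPackage -Online |
    Where-Object { $unwanted -contains $_.DisplayName } |
    Remove-AppxProvisionedPackage -Online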

Why Files on Demand?

If you don’t know what all the fuss is about check out the initial Insider announcement:

Ref: https://blogs.windows.com/windowsexperience/2017/06/13/onedrive-files-demand-now-available-windows-insiders/#kwLbqguOTefId6pv.97

Ref: https://blogs.office.com/en-us/2017/05/11/introducing-onedrive-files-on-demand-and-additional-features-making-it-easier-to-access-and-share-files/?eu=true

What it basically means is that we can finally integrate (huge amounts of) cloud storage much more tightly with our on-premises desktops and dispense with (unsupported) scripts or (expensive) third-party tools to access OneDrive in File Explorer. It also means not having to deal with WebDAV, which always felt like a horribly dated and clunky protocol for accessing cloud storage.

As soon as the 1709 ISO hit VLSC I grabbed it from Microsoft, slotted the new WIM into one of my MDT Task Sequences and deployed a VM to give the production version a try. It shows much promise, but as always there are some gotchas that mean nothing is ever quite straightforward.

Client version

Microsoft being Microsoft, there’s always one shoot-self-in-foot moment whenever a new product comes out and this release was no exception. Despite having the freshly downloaded 1709 ISO I noticed that on first launch the client was showing up as 2016 and not the latest 2017 release (17.3.7076.1026) that brings in Files on Demand:

Ref: https://support.office.com/en-gb/article/New-OneDrive-sync-client-release-notes-845dcf18-f921-435e-bf28-4e24b95e5fc0 (that’s the one that you want…)

There’s a useful summary of the client install \ update process below. It does strike me as odd that the client self-updates and installs from appdata rather than being managed by WSUS.

Ref: http://deploynovellas.com/2016/05/25/install-onedrive-ngsc-update-windows-10-osd

Similarly it also takes a while to update when deployed on a clean 1709 build due to the initial client being out of date. This also means that if a user is a bit too quick off the mark they can end up with an old-school full sync rather than Files on Demand.

I’ve been trying to replace the client during the deployment Task Sequence but more testing is required as my initial attempt failed with “Application Microsoft OneDrive 17.3.7073.1013 returned an unexpected return code: 1”.

Ref: http://model-technology.com/next-gen-onedrive-deployment-during-sccm-osd

I’ve added a Suspend action to the Task Sequence and will examine the logs to see what’s going on as the script tries to run…

Group Policy

To get more control over how the client is used grab the updated Group Policy templates from the local installation folder %localappdata%\Microsoft\OneDrive\BuildNumber\adm\

Ref: https://support.office.com/en-gb/article/Use-Group-Policy-to-control-OneDrive-sync-client-settings-0ecb2cf5-8882-42b3-a6e9-be6bda30899c
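
If you manage GPOs from a Central Store, copying the templates over is a one-off job; the build number and domain below are hypothetical:

# Copy the OneDrive ADMX/ADML into the Group Policy Central Store
$adm = "$env:LOCALAPPDATA\Microsoft\OneDrive\17.3.7076.1026\adm"
Copy-Item "$adm\OneDrive.admx" '\\example.local\SYSVOL\example.local\Policies\PolicyDefinitions\'
Copy-Item "$adm\OneDrive.adml" '\\example.local\SYSVOL\example.local\Policies\PolicyDefinitions\en-US\'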

We force Files on Demand to be enabled as we don’t want sync cache eating up drive space on machines. We also configure our tenant ID (found via the Azure AD portal) so only Office 365 accounts can be used.

Configure these under Computer Configuration > Administrative Templates > OneDrive

  • Allow syncing OneDrive accounts for only specific organizations > Enabled (using Tenant ID)
  • Enable OneDrive Files On-Demand > Enabled
  • Silently configure OneDrive using the primary Windows account > Enabled

I need to check if our third-party identity provider supports ADAL to make sure that last GPO setting works correctly. In the future we may well move to Azure AD Connect Passthrough authentication instead.

Clearing local cache (Free up space)

One important thing to remember about using Files on Demand is that when a file is either downloaded from the cloud or freshly uploaded to it, a cached copy will be kept on the local machine.

Over time (or with a large upload) this cache could grow and cause the very issues we were trying to avoid, especially on a shared machine with a large volume of users (pretty much the case for all our classroom machines).

At present it seems that no policies exist to force the “Free up space” option that removes the cached copies of files. However, the article below suggests that the new file attributes brought in with 1709 can be used to automate the process.

“Attrib.exe enables 2 core scenarios.  “attrib -U +P /s”, makes a set of files or folders always available and “attrib +U -P /s”, makes a set of files or folders online only.”

Ref: https://techcommunity.microsoft.com/t5/OneDrive-Blog/OneDrive-Files-On-Demand-For-The-Enterprise/ba-p/117234

We tried a script that runs on the root OneDrive folder and sure enough it resets all files back to Online only and reduces the space used down to a megabyte or so 🙂

cd "%userprofile%\Onedrive - Name of your Organisation"
attrib +U -P /s

Running this script at logoff should in theory keep the cached files down to the bare minimum.

Disclaimer: we only just figured this one out today so again caveat emptor if you go and run this in production without testing it first!!!

Future Decoded 2017 highlights

Today I took a trip down to ExCeL London for Microsoft’s annual Future Decoded conference. As always it proved an interesting showcase of their future vision and a chance to gain technical insights into current and future projects. Here’s a few of my take-aways from the day…

Deploying Windows 10 with Autopilot

Although I’d read a bit about this a while back, it was useful to see the Windows 10 Autopilot deployment process in action and the rationale behind using it. Given that we’ve been deploying some pilot Windows 10 devices to staff, it should in theory help speed up that initial out-of-box process for devices that we predominantly see as cloud-managed and want to hand out without too much fuss.

Future Decoded slides: https://www.futuredecoded.com/session/fd76e051-a6a9-e711-80c2-000d3a2269dd

Ref: https://docs.microsoft.com/en-us/windows/deployment/windows-10-auto-pilot

For me this method will apply to devices that spend more time off the main AD network than on it and have fairly simple requirements for pre-installed software. My colleagues in the office will also be pleased to hear Autopilot helps skip the initial talking Cortana screen that’s been heard many a time so far during testing (!)

However the next part, and the real power of the “Modern” deployment method being showcased, requires Intune in order to set up full profiles with customisable apps, settings etc. Although an MDM solution is on my wish list to get more control over roaming mobile devices, it’s another software subscription bolt-on, so making it an almost-necessary part of the Modern deployment experience sits a bit uneasily with me.

Another useful piece of advice was to check out Windows Analytics to help prepare for our Win10 migration project, which I need to have a proper look at tomorrow.

Ref: https://www.microsoft.com/en-us/WindowsForBusiness/windows-analytics

Microsoft Hands On labs

During the break-out sessions there were plenty of Surfaces out on the 3rd floor running “Hands On” lab training materials. These looked like they’d be perfect for students on IT courses to try out Azure etc. rather than needing access to a physical lab or trial accounts in a live environment.

The content covers Windows 10, Office 365 and Azure so it’s perfect for either keeping your own skills up to date or providing students with a good few hours’ worth of e-learning material, which is interactive because you actually configure VMs rather than just watching videos.

Check them out at https://www.microsoft.com/handsonlabs

All you need is some form of Microsoft account to log in with and away you go 🙂



Security & ATP

One thing 2017 will certainly be remembered for in the tech world is the high profile ransomware attacks that have brought home the realities of modern malware threats to a much broader audience than perhaps ever before. As such the session on Advanced Threat Protection was particularly interesting.

Future Decoded slides: https://www.futuredecoded.com/session/f6204a3e-e5a8-e711-80c2-000d3a2269dd

We were also recommended to check out the NCSC presentation from yesterday, another one for tomorrow’s reading list:

NCSC slides: https://www.futuredecoded.com/session/e1382eb1-01a9-e711-80c2-000d3a2269dd

The ATP offering now covers email, endpoint and Azure-based analytics. Moving to Windows 10 (1709) brings additional security and exploit protection such as:

  • Windows Defender Application Guard
  • Windows Defender Exploit Guard (aka EMET for those who remember it from Windows 7 days)

Ref: https://www.microsoft.com/en-us/windowsforbusiness/windows-atp

All of this sounds great until the dreaded “l” word comes around… yup, it’s licensing. None of these services grow on trees and there’s only so far budgets can stretch, particularly for us Education users. One thing that’s a real problem for Education in particular is that all the new cloud-first offerings are sold solely on a per-user basis rather than the fairer per-FTE staff method used for our on-prem EES-licensed products. Costs can soon spiral upwards and make some of these offerings (Azure AD Premium, I’m looking at you!) almost unobtainium.

A small plea to the powers that be…

If someone from Microsoft happens to end up reading this, just think of it this way… in Edu we want to make use of these new solutions and embrace the tech on offer to help provide the best environment we can for our users.

I’m not saying we expect Microsoft to give it all away for free (although we’d be more than happy if you’re feeling generous!) but realise that we need to protect student accounts and machines as much as we do staff, and paying for a 5000-seat EMS or ATP setup is just impossible. The end result: everyone loses (well, perhaps not if you’re Google, who are working hard to take the Edu market if Microsoft don’t want it for some reason). So please rethink these pricing models and help make them work for non-profits as well.

Windows Mixed Reality

Towards the end of the day I went to the Mixed Reality stand to try out the new headsets, which sit in a much more affordable price range than the incredibly-cool-but-very-pricey HoloLens. We’re currently building a new campus for construction and engineering so I was interested to see if Mixed Reality could fit in there.

Ref: https://www.microsoft.com/en-us/store/collections/vrandmixedrealityheadsets

Having tried a Lenovo headset with its associated controllers I’m impressed! Whilst VR headsets \ Google Cardboard made that first step, there was still a disconnect in terms of interacting with the world you were immersed in; the hand-held controllers take this a step further and bring you more into the 3D virtual environment.

The out-the-box demo of walking around a house picking up and manipulating objects showed potential for me as I can imagine students being able to design in 3D using something like Maya then showcase those objects in a virtual environment using Mixed Reality.

The idea of pinning multiple virtual screens, opening Windows apps and working through the headset is also intriguing, although I suspect longer periods of use will need 4K lenses rather than the 2K ones being fitted into the kit at present.

The demo finished off with a rather addictive Space Invaders-style game using the VR controllers. Anyone with a PlayStation VR or similar has no doubt already experienced something similar and more, but it’s good to see an attempt to bring the technology into productivity tools as well. One of the opening keynotes focused heavily on HoloLens and Mixed Reality so it does seem Microsoft are really going for this area of the market.

It’s also another reason to go down the Windows 10 (1709) route as these features are only available on the new Fall Creators Update.

Fail of the day

However Microsoft wouldn’t be Microsoft if they didn’t shoot themselves in the foot from time to time. At the first Future Decoded it was the irony of queuing at a tech event to collect a piece of paper, but today’s award moves the bar up a notch… step forward the Future Decoded app!

At an event where you spend the whole day watching cutting-edge Azure cloud technology Microsoft hired an external company to make possibly the worst conference app I’ve ever used…

  • slow to load and required registration to view even basic content – why MS would need that data is beyond me as they spend all day scanning your badge as you move between rooms
  • website scraping to populate the app content – if I wanted a web page I’d open it directly
  • a seminar sessions list that had to be manually filtered per day (looks like a GETDATE function was too difficult to implement?)
  • but the worst & most irritating was the “My Agenda” planner that didn’t generate a personal agenda at all and just scraped the keynote details from the website… hopeless

Maybe next year get some of your in-house people to showcase those cutting-edge Azure technologies via the app, but whatever you do don’t bring this one back!

Save yourself from insanity: Black Magic software installer

After 3 years of longingly looking at Blackmagic Design’s stand at various AV events our wishes have been granted and we’re now proud owners of an ATEM Production Studio 4K, HyperDeck Studio Minis and a DeckLink card 😀

In preparation for the new kit I’ve rebuilt our main streaming machine (which runs vMix HD) as it needed a bit of freshening up. It now runs Windows 10 LTSB with some added local storage and Google Drive File Stream for longer-term video archives (may as well make use of that unlimited Google Drive!)

Software install

Installing the DeckLink card looked pretty straightforward; find the PCI-E slot with 8x support, pop some software on et voila. But (you know what’s coming next)… nothing is ever as easy as it seems.

Running the software installer bombed out shortly after “trying” to install, with this:

“Blackmagic Design Desktop Video Setup Wizard ended prematurely because of an error”

The fix

Having had a look around, there are a few reports of the error on the Blackmagic forums but no solutions listed.

Having noticed the installer was an MSI I thought I’d give it a go via the command line instead:

msiexec /i "Desktop Video Installer v10.9.3.msi" /qb

Quelle surprise, it installed perfectly! I’m not sure what the installer GUI is trying to do that makes the process fail, but everything is there using the msiexec method – software and drivers all looking good.
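
If the silent install ever misbehaves too, a verbose MSI log will usually show which action is falling over; /l*v is a standard msiexec switch:

msiexec /i "Desktop Video Installer v10.9.3.msi" /qb /l*v "%TEMP%\DesktopVideo.log"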

MDT imaging megapost – part 1 (our first server)

The great thing about working in the tech field is that it keeps moving on, ever changing, always evolving. That means sometimes you have to let go of a system that was once the bright shining light of progress when it becomes apparent something better has taken its place. Now is that time for my trusty ZCM 11 custom imaging system, built back in 2013 and star of a 6-part thread series I look back on now and think “wow, I actually did that”.

Until I moved imaging onto a Satellite, the stats say the original Primary server pushed out over 5000 images. Given the length of time the Satellite has been in place, plus the stats from our other sites, that figure can easily be doubled: over the course of 4 years around 10,000 image cycles have been completed.

Compared to the previous process a huge amount of time was saved, allowing us to complete a large-scale Windows 7 migration with relative ease. Add to that a 4-year saving on ENGL licence costs and my motley crew of Bash and PowerShell scripts can retire with the satisfied feeling of a job well done 🙂

The future calls, and it’s shaped like the number 10…

However we need to move on; funnily enough it’s another OS migration knocking on the door that prompted the change, along with a shift in hardware and environment that meant the Linux-based PXE environment was starting to hold us back.

Windows 10 support from ZCM seemed patchy at best, as was timely support for new hardware such as Surfaces and their ilk. Reading the forums and email groups didn’t inspire much confidence either so we decided to start looking elsewhere.

SCCM was the natural direction of travel but having made a substantial investment of time creating ZCM Bundles we weren’t necessarily ready to move all that just yet. Similarly ZCM Patch Management works pretty well these days for covering our 3rd-party apps. With that in mind the Microsoft Deployment Toolkit was the obvious choice.

A nice GUI-based managed scripting environment with Windows PE as the underlying OS ticked all the boxes. Oh and did I mention it’s free!

It’s time for my own MDT… Massive Deployment Thread!

What originally started as a small side-project to push Windows 10 out to a couple of trial tablets has now expanded into a core system that’s been at the heart of our summer works. With that in mind it’s time to write up the journey and the numerous tips, tricks and tools used along the way.

Many of those ideas come from some of the best deployment pros in the business such as Johan Arwidmark, Michael Niehaus and Mikael Nystrom so a big shout out for all the knowledge they share. Hopefully this post will give an idea of how we put those pieces together in a live environment.

The beginning, our first server

Initially we started out deploying MDT for the sole purpose of imaging a batch of demo Surface 3 devices, so the first thing was to spool up a new VM with all the required software and roles installed – essentially the Windows ADK, MDT itself and the WDS role for PXE booting.

Early fixes and customisations

After getting the basic Deployment Share running we hit a few minor issues that needed resolving, which are worth bearing in mind:

Multiple DNS namespaces

We have two domains that are in use internally, one of which usually gets appended as part of the domain join process and the other via DHCP.

In the PE environment the machine isn’t domain-joined, and as such the default setting in Bootstrap.ini wouldn’t connect to the deployment share as it didn’t know the correct DNS suffix to append.

Ref: https://scottisageek.wordpress.com/2011/12/22/mdt-2010-and-multiple-dns-namespaces/

…we found it quicker in our case to change the DeployRoot setting to the MDT server’s FQDN rather than the short name… problem solved 🙂
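
In Bootstrap.ini terms that just means something like the following (server name made up):

[Settings]
Priority=Default

[Default]
DeployRoot=\\mdt01.example.internal\DeploymentShare$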

Share permissions

The default permissions applied to the Deployment Share by the installation wizard weren’t set up as we liked. I can’t remember the exact reason now, but looking back at documentation on other sites I think the share needed locking down to prevent users viewing the Deployment Share content or (even worse) making unauthorised changes to it (!)

We now have specific AD groups and a service account set up so nominated MDT Administrators can read \ write to the share to upload Application install files etc. but the imaging account (more on that later) can only read and all other users are denied access by virtue of having no rights.

Set UK Locale

A quick and easy tweak sets up the keyboard settings for UK users in Bootstrap.ini

Ref: http://kabri.uk/2010/01/20/sample-bootstrap-ini-for-uk-deployments/

Similarly, set them in CustomSettings.ini too:

Ref: https://scriptimus.wordpress.com/2011/06/23/mdt-2010-sample-customsettings-ini-for-fully-automated-deployments/
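
The relevant properties look something like this – the same values work in both files, as per the linked samples:

KeyboardLocale=en-GB
UserLocale=en-GB
UILanguage=en-GB
TimeZoneName=GMT Standard Time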

There are quite a few other settings you’ll want to add in CustomSettings.ini but more detail on those will follow in relevant posts so keep your eyes peeled!

Update the Deployment Share

This is one action you’ll soon need to get into the habit of! If you make changes to the settings in any of the .ini files or add drivers that you’ll need in the PE environment (basically network and storage) then you need to update the Deployment Share.

This recompiles the Boot Images to include your changes; otherwise you’ll find all those nice new additions above make no difference whatsoever!
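
As an aside, if you’d rather script this than click through the Workbench, the MDT PowerShell snap-in exposes the same action; the drive name and share path below are assumptions:

# Recompile the boot images from PowerShell instead of the Workbench GUI
Add-PSSnapin Microsoft.BDD.PSSnapIn
New-PSDrive -Name 'DS001' -PSProvider MDTProvider -Root 'D:\DeploymentShare'
Update-MDTDeploymentShare -Path 'DS001:'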

Think of this as step 1 of 2 in completely updating the Boot Images though. If the MDT wizard says that the Boot Images have changed you also need to copy the new WIMs over to WDS so PXE boot is using the latest images.

In WDS browse to your server, select Boot Images, then right-click the image and click Replace Image. Browse to your Deployment Share’s Boot folder and select the correct image for each architecture.

Windows Deployment Services service won’t start

At an early point in our testing WDS decided it didn’t want to start after a server reboot and was spewing error code 0x906. We weren’t sure why and were on the verge of reinstalling from scratch when I spotted this:

Ref: https://social.technet.microsoft.com/Forums/windows/en-US/265b4b53-63ac-491f-817c-6030daa39b81/cant-start-windows-deployment-services-service?forum=itprovistadeployment

As per Aaron Tyler’s advice in the link above, run the wdsutil commands below to uninitialize WDS, then reinitialize it manually, pointing at the RemoteInstall folder WDS creates.

wdsutil /uninitialize-server
wdsutil /initialize-server /reminst:[PATH_TO_REMOTEINSTALL_DIRECTORY]

Next time…

That should be enough to get your first server up and running. For the second post in the series we’ll look at the MDT Database and how it turns MDT from a good imaging solution into a great one 🙂