eduroam on Aruba and Microsoft NPS – an end-to-end guide

After a meeting with our Jisc account manager a few months back we decided to join the eduroam service. This provides RADIUS based Wi-Fi access for both our students and any educational visitors who have an eligible account via their own institutions. This post outlines some of the infrastructure changes we put in place to provide the service using our Aruba controller and Microsoft NPS.


To find out more about the eduroam service and how to get started visit:

External DNS record & IP address

Create an external DNS record and assign to an external-facing IP address that will be used by eduroam to contact your RADIUS server.

Firewall access

Ensure that your firewall rules are tight and locked down to the specific eduroam NRPS servers by their IP addresses, and only allow inbound connections on the RADIUS authentication port (UDP 1812).

Note: you must allow ping (ICMP echo) to the eduroam external IP, otherwise you will get server-down errors in the support portal.
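As a sketch of the lockdown described above, an iptables-style rule set might look like this. The addresses below are documentation placeholders, not the real NRPS IPs; take those from the eduroam support portal. The script prints the rules rather than applying them, so you can review before piping to a root shell:

```shell
#!/bin/sh
# Sketch only: emit iptables rules that lock RADIUS down to the NRPS.
# NRPS_SERVERS holds placeholder addresses -- substitute the real ones
# from the eduroam support portal.
NRPS_SERVERS="198.51.100.10 198.51.100.11 198.51.100.12"

emit_rules() {
  for ip in $NRPS_SERVERS; do
    # RADIUS authentication (UDP 1812) from each NRPS only
    echo "iptables -A INPUT -p udp -s $ip --dport 1812 -j ACCEPT"
    # ICMP echo so eduroam monitoring doesn't flag the server as down
    echo "iptables -A INPUT -p icmp --icmp-type echo-request -s $ip -j ACCEPT"
  done
  # Drop anything else aimed at the RADIUS port
  echo "iptables -A INPUT -p udp --dport 1812 -j DROP"
}

emit_rules
```

Review the printed rules, then apply them with your firewall tooling of choice.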

Also note the status page has a 24 hour refresh period so if you need to resolve a configuration issue don’t expect everything to go green straight away 🙂

NPS configuration

You’ll refer to the following two links a lot during this process:

Jon Agland has written an excellent step-by-step guide to configuring Microsoft NPS for eduroam. Follow the steps precisely and you won’t go wrong! You’ll also set up a CA along the way, again no drama as long as you follow the guide:

RADIUS attributes and roles

In order for the Aruba controller to be able to assign eduroam Home and Visitor traffic to specific VLANs you’ll need to send RADIUS Vendor-Specific Attributes across during authentication on the NPS server. For more background see these two very handy links:

Note the requirement to assign the attributes for VLAN assignment; this also acts as a filter in case any incompatible attributes come down from visiting organisations.

Aruba configuration

(replace XX with a short name for your organisation, or an identifier of your own choosing)

  • Add RADIUS Server your-nps-server and configure with shared secret etc.
  • Add to Server Group XX_eduroam
  • Add L2 Authentication > 802.1X Authentication profile XX_eduroam
  • Add AAA Profile eduroam_AAA
  • Add User Roles eduroam-logon, eduroam-home and eduroam-visitor
  • SSID Profile eduroam_SSID
  • Virtual AP eduroam_VAP
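As a rough sketch, the steps above map onto the ArubaOS CLI along these lines (syntax as per ArubaOS 6.x; the host IP, shared secret and ESSID settings are placeholders, so check against your own controller version and the guides linked above):

```
aaa authentication-server radius "your-nps-server"
   host 10.0.0.10
   key "your-shared-secret"
aaa server-group "XX_eduroam"
   auth-server "your-nps-server"
aaa authentication dot1x "XX_eduroam"
aaa profile "eduroam_AAA"
   authentication-dot1x "XX_eduroam"
   dot1x-server-group "XX_eduroam"
wlan ssid-profile "eduroam_SSID"
   essid "eduroam"
   opmode wpa2-aes
wlan virtual-ap "eduroam_VAP"
   ssid-profile "eduroam_SSID"
   aaa-profile "eduroam_AAA"
```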


By sending RADIUS attributes across after matching a rule on NPS you can set additional rules on eduroam traffic within the controller. For example we provide a “pot” of bandwidth for all users in a particular role to share, ensuring that our Internet connection doesn’t get saturated by BYOD traffic. We have separate bandwidth contracts for staff, students and visitors.

Configure the Attributes within your NPS Network Policies (under the Settings > Vendor Specific tab)
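For reference, the attribute generally used for Aruba role assignment looks like this (14823 is Aruba's registered vendor code; the role values are the ones created on the controller in the next section):

```
Vendor:    Aruba (vendor code 14823)
Attribute: Aruba-User-Role (attribute number 1, string)
Value:     eduroam-home      (in the Home users Network Policy)
Value:     eduroam-visitor   (in the Visitors Network Policy)
```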

For a list of all supported Aruba VSAs visit

Aruba PEF rules (firewall policies)

After ensuring that the mandatory eduroam applications are allowed to connect outbound we also ensure Guest Isolation is enabled (so devices can’t contact each other) and that eduroam users can only contact specific internal services (such as Moodle) on defined ports using Aruba role-based firewall ACLs.

If you’re not using a tunnelled \ controller setup then the security rules will need to be done with switch ACLs instead.

Logging and troubleshooting

Part of the eduroam specification requires you to retain DHCP and RADIUS logs for 3 months. Use the following to make the process easy on yourself…

NPS logs

NPS logs aren’t particularly human-friendly by default but with the help of this rather handy tool you can use PowerShell to search through them for particular usernames. Very handy if you’re experiencing authentication issues.

You need to change the Log File format to IAS (Legacy) for the viewer tool to work correctly. Set the options as per screenshot below:

Once done, browse the LogFiles folder, check the current file name and use the script as per below, substituting “AUser” for either a username or MAC address to search for. Very handy to diagnose connection errors or in the event you need to investigate activity on the eduroam network…

C:\Windows\System32\LogFiles\NAP_Logs_Interpreter.ps1 -filename C:\Windows\System32\LogFiles\IN1805.log -SearchData AUser -SearchDays 5

(use of the SearchDays switch is optional)

DHCP logs

eduroam also requires that you keep 3 months of DHCP logs to identify users and computers that have connected to the network. Fortunately there’s a handy DHCP log file retention script available that can help (as the standard Windows functionality is rather basic to say the least).

Credit to Jason Carter for the original script and Patrick Hoban for keeping a copy alive when the original blog post went down 🙂

I made a couple of tweaks to make the script easier to edit and added some logging using the handy PowerShell Transcript method

Download my edited script from OneDrive and replace the server name variable with the name of your backup target then follow these steps to implement it.

  1. create a shared folder on your central logging server (or file server if you prefer)
  2. create a service account user to run the Scheduled Task that will archive the logs.
    I recommend a standard Domain User account with a complex password.
  3. add the account to the Backup Operators group on each DHCP server
    (this allows the script to access the files it needs)
  4. create a Scheduled Task to run the script weekly

DNS server

I decided to run a separate DNS server for the eduroam clients. That way they only resolve the internal server names we want to expose (e.g. web server, VLE etc.) and it reduces any load on our main AD infrastructure.

I’m a fan of the CentOS distro so set up a basic server and added BIND

yum install bind bind-utils -y

Then configure BIND via /etc/named.conf using as a template. We use the IBM Quad9 DNS resolver (set as a forwarder) to ensure clients don’t connect to known malicious domains.
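The forwarder section of /etc/named.conf ends up looking something like this (9.9.9.9 and 149.112.112.112 are the published Quad9 resolvers; the rest of the options are left to the standard BIND template):

```
options {
    // Forward everything to Quad9 so known-malicious domains are blocked
    forwarders {
        9.9.9.9;
        149.112.112.112;
    };
    forward only;
};
```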

Walled Garden for CAT

If you don’t have an onboarding tool such as Aruba ClearPass, Ruckus Cloudpath etc. then the eduroam CAT tool will be your friend. Initially you’ll need to configure CAT with your eduroam certificate, organisation logo etc. to create the profiles used for configuration. Follow the guide below to set up your organisational profile:

Once done the site used for configuring clients is available at the dedicated URL

Unfortunately Android users need to download an additional app to install a configuration profile on their device. Because of this I’ve personally found it easier to tell users to download the eduroam CAT app from the Play Store as their first action then find the Havering College profile in there, rather than bouncing back and forward from browser > app > browser.


Onboarding SSID

I tried a few ways to get the eduroam CAT site to automatically open when users connect to a setup \ guest SSID, so that we can easily onboard them when they first arrive at the college.

However sometimes client devices get too clever for their own good: if they see a Captive Portal with a redirect they try to open the CAT tool within their own Captive Network Assistant mini-browser. The problem with this is that the CNA browser doesn’t support certain features & scripts, leading to the CAT page appearing but not doing very much, as the profile doesn’t download or install as it normally would.

Although you may be able to do some funky URL redirection with roles on your Wi-Fi system, also bear in mind some users may just want to connect to wireless to use apps and won’t touch the browser at all. At this point it seems the old-fashioned methods may work best, and clear signage telling users to visit the CAT site may be necessary (perhaps cat memes may well be a valid tactic?!)

In Aruba you can use the “Walled Garden” feature to set up an SSID that only allows access to the CAT website

However note that you will need to add a series of Google domains to the whitelist to ensure Google Play access is also allowed for Android users to get the eduroam CAT app.

Check the About > About eduroam CAT page for the most up-to-date domain list, as the Google URLs in particular change from time to time.


  • (the service itself)
  •, (the CRL Distribution Points for the site certificate), also TCP/80
  • (the OCSP Responder for the site certificate), also TCP/80
  • (Google Play access for Android App)
  • (Google Play access for Android App)
  • (Google Play access for Android App)
  • (Google Play access for Android App)

RECOMMENDED for full Google Play functionality (otherwise, Play Store will look broken to users and/or some non-vital functionality will not be available)

  • *


A mandatory requirement of joining eduroam is that you provide a support page for users looking to connect to the service.

2.5. eduroam Service Information Website

2.5.1. Requirements

  1. Participants MUST publish an eduroam service information website which MUST be generally accessible from the Internet and, if applicable, within the organisation to allow visitors to access it easily on site.

Creating a Moodle course to house documentation and guides seemed a good fit for this seeing as our Student Intranet is hosted there.

If you’d like to use our course as a template for your own site get in contact and I’ll send a copy over.


Mac deployment with Jamf, DEP and more

Thanks to a funding bid from the Mayor of London we were recently able to upgrade our key Mac teaching rooms with the latest hardware for our Media students. With it came a foray into the brave new world of Jamf Pro for management and DEP \ MDM deployment rather than the trusty DeployStudio imaging method the college has used for many years.

This post outlines some of the new things my colleague Tristan Revell and I have learnt along the way.

Most of the setup process during our Jamf Jumpstart session went smoothly but watch out when configuring your Active Directory bind settings. The service account we used initially wouldn’t connect for love nor money and in desperation I tried making a new account, avoiding any extra characters (-, _ etc) in the username and no special characters in the password… voila it worked first time. Whether that’s a bug or standard behaviour I’m unsure as it didn’t seem to be documented anywhere as a specific requirement.

DEP deployment process

Order from a well-known reseller to ensure they add your purchases into DEP correctly. Sometimes it can take a few days (or longer) to get sorted out so plan this in early if you’re on a deadline!

Once assigned add the Macs to a PreStage Enrollment group in Jamf. Remember to select all the Macs you want to activate with DEP and then hit Save when ready to deploy.

Firewall rules

In order for DEP & MDM Enrollment to work correctly you need to ensure the Macs can contact Apple’s servers for various applications, such as

  • Apple Push Notifications
  • Apple Maps (to set time and date correctly during out-of-box setup)

We allow these apps out to Apple’s IP range as per their documentation

Internet recovery and OS upgrade

We needed to upgrade the Macs straight out the box as they shipped with High Sierra but we wanted to start from the latest OS Mojave. Rather than install the older version and upgrade we decided to wipe and use Internet Recovery to start from a fresh copy of Mojave:

The downside of Internet Recovery is that it not only uses Apple’s published URLs as found at but also a bunch of Akamai IPs as well 😦

At present we’re monitoring the firewall for traffic blocks when we do Internet Recovery and so far the IP-based whitelist entries we added have worked for a couple of weeks without change. However I don’t expect that to stay the case long-term and no doubt will be repeating the process at some point.

Naming machines

For reasons only known to the Jamf developers there’s no built-in functionality to name a machine when it runs through the DEP Enrollment process. The automatically-generated name created by DEP is useless to us as we rely on room-based naming for classroom management.

Fortunately there’s a workaround using a CSV file (or Google Docs if you prefer) that can do a lookup based on serial number and name the Mac to something of our choosing. We host the file internally and provide access to technicians via a network share so they can update the list as required.
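The lookup itself boils down to something like this sketch (the two-column CSV layout and the file location are our own convention, not a Jamf standard, and the commented commands below are illustrative):

```shell
#!/bin/sh
# Hedged sketch of the rename-from-CSV idea: look this Mac's serial number
# up in a two-column CSV (serial,name) and use the match as the new name.
lookup_name() {
  # $1 = serial number, $2 = path to the CSV file
  awk -F, -v s="$1" '$1 == s { print $2 }' "$2"
}

# On the Mac itself (run early from a Jamf policy, as root) you would then:
#   serial=$(system_profiler SPHardwareDataType | awk '/Serial Number/ {print $4}')
#   name=$(lookup_name "$serial" /path/to/macnames.csv)
#   [ -n "$name" ] && scutil --set ComputerName "$name"
```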

Just make sure you run this early on in the deployment process so the correct name gets used when binding to Active Directory (more on that later)


By default the DEP Enrollment process (and subsequent Policies) run silently in the background, not giving the best first-run impression to the user, especially if something like Adobe CC is installed as part of the process! Again the community has stepped in and written the excellent SplashBuddy tool (also check out DEPNotify if your requirements aren’t as complex)

Sachin Parmar’s blog is a fantastic guide to getting the whole process up and running

In practice the only thing we’ve found different to his steps is the order in which we created the components; due to the naming and selection order during deployment we found it easier to do it like this:

  1. create DEP Policies in Jamf
  2. create Packages in Jamf Composer
  3. edit the io.fti.SplashBuddy.plist file to reference the Packages named above

Apart from that the process works like a charm and looks spot on too!

I’ve taken a few liberties with the screenshot below showing SplashBuddy doing its thing as I could never get a photo of a Mac looking as good as Thomas Quaritsch has with the ones he’s kindly uploaded for free on Unsplash 😎

Thomas Quaritsch

We wanted to skip as many of the Setup Assistant screens as possible so ticked all the options within Computers > PreStage Enrollments but the “add user” screen still appeared. After some help via the MacAdmins Slack channel it turns out the option to skip user creation is split out into the Account Settings tab…

Dock management

One area Jamf definitely need to improve upon is management of the Dock. The built-in functionality only provides configuration of the default Apple apps, which definitely aren’t the ones we’re concerned about presenting to the user (think Adobe CC, Microsoft Office etc. instead)

We found two third-party tools that work better than the Jamf functionality, try both and see which one works for you:

Dock Master

Dock Master has an online version and a more recent local utility to help build a custom Dock layout:

Profile Creator

In the end Tristan came across another utility that he found more flexible when creating the Dock profile. It’s called Profile Creator and does exactly what it says on the tin; not just for the Dock but for numerous other settings too, such as Firefox, Chrome and many more. It’s still in beta but working well for us:

Mounting DFS shares

We’ve used Windows DFS shares under a new dedicated Namespace to store the home and shared drives for the new Macs and wanted them to auto-mount on login. Although Jamf includes built-in functionality to do something along those lines (“Use UNC path from AD to get network home folder”) there are a couple of reasons we didn’t want to use it:

  • we’ve been burnt by issues relating to network homes in the past and found local homes provide a more reliable experience
  • we don’t currently populate the Home folder attribute in Active Directory (Folder Redirection works much better) and didn’t want to change that configuration in case of any unforeseen consequences by doing so

So instead we looked for what I thought would be a simple script to mount a share on login. That turned out to be quite a voyage of discovery!

Putting things in context

The first thing we learnt was that Jamf scripts that run at login run as root, so any scripts need to use a Jamf-defined parameter called $3 to obtain the currently logged-in user. We also found another setting that determines whether scripts run before the Finder loads or whether they can run asynchronously in the background. The latter setting helped with getting speedy login and the environment set as we wanted it.
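A minimal skeleton of a policy script using those parameters might look like this (the parameter positions are Jamf's standard ones; the echo is just an illustration of acting on the console user from a root context):

```shell
#!/bin/sh
# Jamf passes fixed parameters to every policy script:
#   $1 = mount point, $2 = computer name, $3 = logged-in username.
# The script itself runs as root, so per-user work must use $3 explicitly.
main() {
  mountPoint="$1"
  computerName="$2"
  loggedInUser="$3"
  echo "root-side script acting for user: ${loggedInUser:-<unknown>}"
}

main "$@"
```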

Searching for the one

The first script we tried was created by Jamf themselves

However it seemed to struggle with reading the DFS namespace as well as mounting a dynamic path relating to the currently logged in user; for example we wanted to mount a path something like \\domain.fqdn\Mac\Home\$3 but the script either took the path very literally if entered as a Parameter or not at all if hardcoded in.

Other scripts we tried worked fine from Self Service (provided the user logs in first in order to get $3 populated by Jamf) but not at the Login Trigger. It was beginning to get a bit disheartening, but then we found the miracle cure!

Reading through various comments in the Jamf Nation forums turned up this gem from Samuel Look (scroll to second post from the bottom)

Key things it does differently to the Jamf script include:

  1. it uses AppleScript rather than Bash to do the actual mount
  2. it waits for CoreServicesUIAgent to start (i.e. desktop ready) before attempting to proceed
  3. it accepts a path that can be concatenated into a variable, so we can build our dynamic DFS path for the user folder

We have up to four instances of this script running, depending on the combination of user groups and shared drives we wish to mount and it’s working nicely in our environment. Within Jamf we have two copies uploaded into the Scripts repository. One is the unaltered version, which uses the first user-defined Parameter (aka $4) to define the path to mount. The other script is our adjusted version, which includes a couple of extra lines to calculate the path to the user’s DFS folder path.

The snippet is below; the hashed section should give you an idea where it sits within the script:

##### START #####

# Share_Path=$4
Share_Path="smb://domain.fqdn/Mac/Home/$3"

Replace the example smb:// path with your own share path, placing $3 wherever your username-specific section lies.

Configure defaults

The standard behaviour in macOS 10.12 and above is to warn the user before connecting to unknown network shares with a dialog box saying “enter your name and password for the server”.

This warning (and requirement to manually click the Connect button) prevents mapped drives from loading up silently. To fix it we configured the AllowUnknownServers option using defaults write as outlined in the Apple KB below:
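The one-liner from the KB, for reference (a single preference write; run as root on each Mac, e.g. via a Jamf policy, on macOS 10.12 or later):

```shell
# Suppress the "unknown server" prompt so login-time mounts run silently.
# From the Apple KB referenced above; requires root.
defaults write /Library/Preferences/com.apple.NetworkAuthorization AllowUnknownServers -bool YES
```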

Kerberos SSO

Another item to tick off the list was getting SSO to work correctly across all browsers. Safari worked out the box, however Firefox and Chrome need a few tweaks:


You need to configure the following items using Profile Creator (or your tool of choice)

  • network.negotiate-auth.trusted-uris
  • network.automatic-ntlm-auth.trusted-uris

The syntax for the above is just the domain, with multiple entries separated by commas e.g.
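For example, with a placeholder internal domain of domain.fqdn:

```
network.negotiate-auth.trusted-uris      = domain.fqdn,otherdomain.fqdn
network.automatic-ntlm-auth.trusted-uris = domain.fqdn,otherdomain.fqdn
```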


You may also need to set this to true depending on how your server names are configured on the sites you want to SSO with

  • network.negotiate-auth.allow-non-fqdn


It’s a similar process for Chrome but slightly different settings

  • AuthNegotiateDelegateWhitelist
  • AuthServerWhitelist

Syntax for these is *domain.fqdn, same again with commas to separate multiple entries


Oddly our Centrify \ Office 365 SSO isn’t fully working at present in Chrome and presents an authentication screen, whereas Firefox and Safari SSO correctly; however other Kerberos SSO sites work fine. I need to investigate further to see if it’s just a missing domain that needs adding for Chrome or if there’s a technical limitation somewhere.

I have seen a support page from another hosted SSO provider saying their service doesn’t work on Chrome for Mac, which does make me wonder if there’s a bug that needs fixing for this to work smoothly. Will test further when time permits.

Printing with PaperCut

Previously we used a bash script to map printers based on the name of the machine but wanted to try and stick to the native Jamf functionality going forward for ease of management. Initially that seemed simple enough; add the printer manually on a Mac in the office then use Jamf Admin to upload the printer to the server and create a Policy to map it.

However this didn’t seem to work and kept popping up authentication dialogs. Not good.

A swift bit of research brought up the fact that although you can set Kerberos Negotiate auth on the printer it doesn’t get carried over in the upload to Jamf. Therefore each printer needs the authentication configured e.g.

sudo lpadmin -p <printername> -o auth-info-required=negotiate

Fortunately there’s a handy script that can be used to set this for all printers mapped on a machine:
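As a rough sketch of what that script does under the hood (our own illustration, not the script itself): parse the queue names out of `lpstat -p` and emit one `lpadmin` command per queue.

```shell
#!/bin/sh
# Turn `lpstat -p` output into lpadmin commands that set Kerberos auth on
# every configured print queue. Prints the commands so you can review
# them before piping to a root shell to apply.
set_negotiate_auth() {
  awk '/^printer/ { print "lpadmin -p " $2 " -o auth-info-required=negotiate" }'
}

# On a Mac you would run:
#   lpstat -p | set_negotiate_auth | sudo sh
```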

Although no-one on the forums seems to have implemented it this way, it works fine running the script as an “After” action within the print Policy: the printers map, then get the correct authentication settings configured immediately afterwards. Checking PaperCut on a couple of test jobs shows the user authenticated correctly and jobs are running fine.

Initially we had some issues with older HP printers e.g. Color LaserJet 4600n and it seemed they may be too old to be supported in Mojave. That seems to have settled down a bit now after updating firmware to the latest (2014!) version so we’ll see how they fare and replace if need be.

Know your Limitations

Fortunately I’m not talking about the start of a self-help speech (!) Many of our environment customisations need to be applied to particular groups; most commonly staff and students. Initially we wondered how to achieve this when Scoping Policies as it seemed to require populating JSS Groups rather than using the ones in LDAP that we use for everything else.

The trick (learnt after reading the Jamf Nation forums) is to Scope using All Computers \ All Users as required then use the Limitations tab to restrict the policy to the LDAP group(s) you desire. That method works a treat 🙂

The finished product

With some assistance from our Marketing team we have a nice fresh background to go with the new hardware and software. The subtle grill effect is a nod to the days of the Mac G5 tower and the notice board area on the left contains useful information to remind students about backing up their work and general housekeeping.

We also rebrand Jamf Self Service to match the same “My Apps” terminology we use on our Windows 10 machines with ZENworks. Same concept, different platforms but keeps the user experience consistent.

Help and Advice

After reading Sachin’s blog post I joined up on the Mac Admins Slack channel, an excellent source of community advice with lots of members online at any time and all willing to help out. Was also good to get some experience with using Slack as a collaboration platform

Jamf Nation has lots of reference material in their forums and add-ons

Lastly a shout-out for our Jamf Success Managers who have been proactive at tracking us down on LinkedIn and offering assistance to make sure we get up and running, a nice touch and good customer service.

Much unboxing later (thanks go out to our work experience students for their work here) we now have all our shiny new Mac suites up and running in time for students returning from half-term 😎

gshaw0 Origins – the PC where it all began…

If I looked at my career like a movie franchise the prequel would surely involve the subject of today’s post.

Seeing as it turned 21 this year I thought it apt to go back in time to feature the machine which got me started on the path to working in IT.

So step back in time and relive some (now) retro memories of days gone by when computers were beige and floppy disks were more than just a Save icon…

Looking back I can trace my interests in computing further into childhood, starting in primary school with memories of pressing black and orange keys on what I now know was the BBC Micro. During my journey through early years education that then progressed to the Acorn Risc PC, of which I have very fond memories. However, by 1997, when we set out to buy a new computer, Windows was the main game in town.

Setting the scene

We start the journey in late 1997. Having convinced my parents that a computer was going to help at secondary school and would make the perfect Christmas present, we ventured down to our local Comet (remember them?) to take a look at what was available and see if we could get something under £1000. That was far from a given in those days! After initially asking the salesman whether they sold Acorn Risc PCs (spoiler: they obviously didn’t!) eventually a PC was found for £999 with a printer thrown in to sweeten the deal.

This is what I came home with, printer and speakers out of shot…

Siemens Nixdorf Xpert

The machine itself was a Siemens Nixdorf Xpert, the consumer brand of the well-known German tech company. Those Germanic roots formed quite a large part of my early IT experiences, more on that later…

  • Pentium 166MHz with MMX Technology
  • 32MB SDRAM
  • 1.5GB Fujitsu HDD
  • 3.5″ floppy drive
  • CD-ROM drive
  • 14″ CRT monitor
  • Microsoft PS/2 Mouse
  • Windows 95 OSR2
  • Microsoft Works 4.0
  • Microsoft Money

Early experiences

Starting off with Windows 95 OSR2 and Microsoft Works 4.0 was a pretty gentle introduction to computing but after only a few weeks things turned a lot more technical. The PC failed to boot and started throwing Scandisk (remember that?) errors. Initially a factory restore was tried, which wasn’t quite as straightforward as one might expect given half the process was in German, including the config.sys and autoexec.bat files (!) but I got there in the end.

That first Windows install has now turned into thousands, I dread to wonder what the actual number is!

Screenshots taken from Windows 98 SE, which is what the machine has spent most of its life running, much nicer than 95. Random trivia: you could obtain much of the UI functionality of Windows 98 by installing IE4 onto Windows 95, but it still wasn’t quite as pretty.

The original hard drive soldiered on for a while longer but eventually succumbed to Bad Sectors and was replaced by a 6.4GB Samsung unit. I never trusted Fujitsu HDDs after that…

It also turned out that the Lexmark (shudder) printer that was bundled in didn’t work either so troubleshooting skills were also quickly picked up before that went back and was replaced with an Epson Stylus Color 300.

The Microsoft Mouse that came with the PC probably shaped the almost claw grip I have that now only seems to comfortably fit Microsoft mice – even for gaming I can’t find anything that fits my hand better (the “new” example in the photo is now 13 years old itself!)



The machine gained a fair few upgrades along the way as I used it throughout my school years, at the time fitted by a local computer repair tech we found in the paper. I remember being inspired by his proficiency working on the (then expensive) tech and thinking “that’s what I want to do as a job”. That’s where it all began, I think.

The first upgrade was memory, as I soon exhausted the stock 32MB. Initially it was getting a boost all the way to 128MB, but a huge earthquake struck Taiwan and the price of RAM doubled overnight, so I ended up with one stick instead of two, for a total of 96MB.

After the 6.4GB Samsung drive filled up another Samsung 10.2GB HDD was added alongside. I’ve still got the receipt for that one and dug it out of my spares pile recently to re-fit at some point 🙂

The OS was upgraded to Windows 98 SE and a USB card was also added to take advantage of that fancy new connector which didn’t need a reboot to detect a new device… magic!

The other significant upgrade of note was adding a CD-RW drive in place of the existing CD-ROM. That 4x LG unit cost a cool £99.99 when it came out but proved well worth the investment as USB sticks hadn’t yet gone mass-market and transferring multiple files via floppy disk was a rather painful experience to say the least.

One upgrade this PC never had was a modem, indeed it’s never been on the Internet – ever! By the time I got my first 56k connection I’d obtained \ rescued a well-worn 486 machine which was used for web duties, mainly because it was easier to run a phone line cable to (!)

Wi-Fi was also just a pipe dream back then so if it didn’t have a cable, it wasn’t getting connected. These days, although I have an Intel NIC that could go in, the phrase “just because you can, doesn’t mean you should” comes to mind…

Classic software

It’s not all about the hardware and this machine also introduced me to Microsoft Office 97, many tips and tricks still relevant 21 years later even in these Office 365 times. Of course documents were filled with ClipArt and WordArt as was almost obligatory for school homework projects 🙂


I remember this being much nicer than first-gen Windows Media Player, until I discovered WinAmp

Image editing was done in a cut-down Photoshop-lite product called Adobe PhotoDeluxe, which also taught me never to trust v1.0 software as it crashed regularly, usually right in the middle of a large edit. My CTRL+S reflex was well honed by that program, plus Windows 9x’s general tendency to do this…

that’s a genuine Windows 98 crash from yesterday, no VM necessary here folks…

What struck me at the time was the number of system and inventory tools that Siemens Nixdorf bundled with a home machine; it felt like something you’d get in a corporate environment rather than for your average home user. Similarly the machine came with a large suite of manuals, including a technical reference for the motherboard and BIOS. How often do you see that?

The software itself was called DeskView and DeskInfo. Recently we bought a Fujitsu Primergy server at work, which is a descendant of the Siemens Nixdorf computer business that was eventually bought out by Fujitsu. Sure enough the kit was built in Germany and comes with management tools called… ServerView. It felt like seeing an old friend again all those years later.

This machine also introduced me to a PC game that changed things forever; you could almost say it had Unforeseen Consequences…

I only got this disc because I was looking for backup and recovery tools at the time after the HDD incident…

I remember installing the demo of this new “Half-Life” game and being initially impressed by the design and atmosphere, even if I did have to run it at the very lowest setting for it to run anything like acceptably on the 2MB (!) internal Matrox graphics chip.

I nearly gave up on it when I couldn’t get past the army grunts armed with only a crowbar and low-ammo handgun… then I beat that particular bit of level and 21 years later I’m still just as hooked.

If, like me you’re a massive Half-Life fan you need to see the next two links…

  • Unforeseen Consequences: A Half-Life Documentary –
    A brilliantly researched and produced feature documentary on the background story of Half-Life
  • Half-Life: Echoes –
    This is simply stunning, 20+ years after the original Half-Life was released comes a mod that’s right up there with Valve’s releases. Download Half-Life on Steam, install this, set difficulty to Hard and put yourself right back where the story started!


Fast forward some years to around 2002 and it had got to a point where I needed to move onto a faster AMD Athlon CPU and graphics card so with regret I had to put the Siemens Nixdorf into retirement. Unfortunately around the same time the motherboard failed and it seemed that may be it as the floppy drive rang out one last time before the boot screen went dark. I couldn’t face throwing out the machine so kept it tucked away hoping I could fix it somehow.

It took until 2007, but one day I was browsing eBay and had a thought to randomly search for “Siemens Nixdorf Xpert” and by pure chance there it was: an identical machine listed for spares and repairs, and even better, starting at 99p. One catch: it was in Germany, of course it had to be! My knowledge of German only goes about as far as how to enable the mouse driver in DOS mode, so I put my faith in Google Translate and waited…

The motherboard, manuals and RAM cost the grand total of €3 plus shipping – what a lucky find.

One replacement motherboard fitted and the machine sprang back into life, 10 years after it first started up 🙂 I’d always wanted to put the fastest available CPU in, so it was a nice bonus when the replacement board came with a Pentium 233MHz installed. I also found a brand new keyboard in a surplus store on eBay so added that in as well as part of the rebuild.

Now repaired it was kept nearby for any retro pangs and recently moved with me into its new home, where it was time to emerge from hibernation once again. This time the hard drive needed a bit of… persuasion (yes, a well-placed thump) to free the disk heads and one Scandisk later it was back on the familiar desktop from all those years ago.

One item left on the list is the original CRT monitor, which stopped powering on some time back. I suspect a power board issue but need to find someone who’s happy working around CRTs as that’s one area I’m happy to leave well alone due to the high voltages present.

All that is old is new again

It seems that Windows 98 and associated machines have now hit the “retro” phase so I’m glad I kept it all for that nostalgia blast, never thought the sound of a floppy drive seek at boot would be so comforting!

Now it sits on the other side of the desk to my day-to-day gaming machine – the past and present side by side, just a monitor input change away…

Learn something new every day – troubleshooting DMX lighting

One thing I really enjoy about working in education is the wide range of tech that we get to work with day-to-day. Yesterday was no exception and something a bit more exotic than the usual networking queries.

Due to our work with setting up streaming media for our Creative Arts department I was asked by one of the technicians if I could try and help resolve an issue they were having with remote managed lighting in our TV studio. With trusty ThinkPad in hand I wandered up to take a look.

The lighting itself was controlled by a small USB LimeLIGHT DX2 controller. This then hooked into lighting dimmer units, which feed off to the individual lights themselves.

The software launched correctly but the unit itself was unresponsive, as were the lights when attempting to change brightness. We initially tried power cycling the dimmers, reconnecting the interface cables, reinstalling drivers and so on, but with no joy. We even tried a different brand of digital DMX controller with no luck.

I then read that the DMX cables need to form a continuous daisy chain and one faulty fitting could cause the kind of issues we were experiencing. With that in mind we tried disconnecting sections of the chain until eventually realising it was some LED light fittings at the end that were causing the issue.

Upon closer inspection it was found that they’d been switched from DMX mode to manual; a quick reconfiguration by the media tech later we were back in business!


There’s something about AV cables that makes them rather satisfying to work with. Heavy-duty cables with big mechanical connectors definitely feel the part when connecting everything up!


Deploying an Azure AD Joined machine to existing hardware with MDT and Windows Autopilot

We’ve recently started a refresh of our staff laptops to provide a better remote working experience, as well as baselining a common standard to their configuration (Windows 10, Office 365, BitLocker etc.). At this point we were also faced with a decision on how best to deploy this:

  • domained with DirectAccess
  • domained with VPN
  • Azure AD Joined

Given that we’re heavy users of Office 365 I decided to give Azure AD Join a try and go for a cloud-native solution, rather than extending the reach of internal services outwards. One flaw in the plan is that I’m still trying to make a case for adding Intune to our EES agreement, so I’ve had to get a bit creative in terms of deployment rather than using MDM to do the heavy lifting.

Windows Autopilot

Whilst at Future Decoded last year I attended a demo of Windows Autopilot, which sounded like a very easy way to assign Windows 10 devices and get them up and running quickly.


However, looking a bit more closely it wouldn’t do here: without Intune we can’t install any additional software, and on top of that the devices we’re using need upgrading to Windows 10, rather than being nice fresh kit with the latest OS already installed. That said, it still has a part to play in this deployment – more on that later…

MDT time

So with that initial thought discounted we turn back to trusty MDT. Having already gained its Windows 10 deployment stripes over summer, I wondered if there was a way to make a Task Sequence that would give a similar Autopilot experience but with a bit more flexibility around apps and re-using existing kit.

A fresh Task Sequence was created to handle the usual driver packs, generic Applications and machine naming. Now time for the fun part of integrating Office 365 ProPlus and Azure AD Join!

Deploying Office 365 ProPlus

Before we get to the Azure AD Join we need to deploy some basic software to the machine such as Chrome, VLC and, of course, Office apps. Internally we use Office 2016 ProPlus, but for these Azure AD Joined devices Office 365 ProPlus is a better bet in order to ensure smooth SSO from the Azure AD account.

Deployment of Office 365 ProPlus looks a lot simpler now than it was previously thanks to Microsoft creating a handy web app for generating the Configuration XML file you need for deployment.

First download the Office Deployment Tool

Then create the XML file using the Office Customization Tool, configuring your desired options

Place all the files in a folder on your MDT server, along with a new file called install.cmd using the code from Rens Hollanders’ instructions (you don’t need to use his config.xml though as the one from the Customization Tool does the same job and is a bit more granular in terms of installation options)
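For reference, install.cmd only needs to call the Deployment Tool’s setup.exe against your XML. A minimal sketch (not Rens Hollanders’ exact script; it assumes setup.exe from the ODT and a configuration.xml exported from the Customization Tool sit in the same folder as the script) might look like this:

```cmd
REM install.cmd - minimal sketch; assumes setup.exe (ODT) and
REM configuration.xml (Customization Tool export) sit alongside this script
pushd "%~dp0"
setup.exe /configure configuration.xml
popd
```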

Finally create an MDT Application (with source files pointing to the folder above) that runs install.cmd

In your Task Sequence add this and any further Applications you wish to install in the usual place under State Restore.

If using a Volume Licensing version of Windows I also create an Application to install the MAK product key at this point.

Preparing for Azure AD Join

Now at this point you have a pretty bog-standard Task Sequence and might be wondering how we get back to the first-run wizard. This is because the first-run (OOBE) wizard is our only chance to properly join Azure AD if we want to log into the machine using an Office 365 account; join later on and you end up with a local account connected to Office 365, which we don’t want.

The process of getting back to that OOBE wizard is simpler than you might think and just requires one command at the end of your Task Sequence

cmd /c c:\windows\system32\sysprep\sysprep.exe /oobe /quiet /quit

This does assume a couple of things though:

  • your Deployment Share is configured with SkipDomainMembership=YES and JoinWorkgroup=WORKGROUP
    these are already set on my Deployment Share as I prefer to Join Domain manually via a TS step; that way I can control exactly when domain policies come down to the machine during deployment, or in this case not join a Domain at all
  • your FinishAction in MDT is set to SHUTDOWN
    you can either set this at Deployment Share level or override it (as I do) for a single TS by adding this step in early on…
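Taken together, the relevant CustomSettings.ini lines would look something like the fragment below – merge them into your existing [Default] section rather than replacing it (and remember FinishAction can instead be overridden per-Task Sequence):

```ini
[Default]
SkipDomainMembership=YES
JoinWorkgroup=WORKGROUP
FinishAction=SHUTDOWN
```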

With this configured the machine will automatically run sysprep and return to OOBE state, ready for the user (or admin) to join the machine to Azure AD via the first-run wizard.

Provisioning Packages and customisation

Now what we have so far is good, but we can go a bit further and add some Intune-esque customisation of the deployed system via the MDM method of Provisioning Packages. This will allow you to prepare an identical base set of machines that you can quickly customise by plugging a USB stick into them at the OOBE screen for any additional changes. It’s also a good insight into how the policies, or should I say CSPs (Configuration Service Providers), work.

To create a Provisioning Package you need to open the Windows ICD, which in turn is part of the ADK (don’t you just love acronyms!)

Windows Configuration Designer provisioning settings (reference)

I initially started with this side of things via the Set up School PCs app but ended up opening up the package it built to take a look at exactly what it did. Not all the settings were required, so I decided to build a new one from scratch. It still gives a good idea of what’s going on “under the hood” though 🙂


Applying the package file from a USB when OOBE appears works fine but I couldn’t resist the automated approach outlined in the post below to do it via MDT. If you’re using one of the latest builds of Windows 10 note the comment at the bottom that you don’t appear to need to sign the packages for 1803 onwards.


However I found that in MDT running the Add-ProvisioningPackage PowerShell cmdlet with a UNC path didn’t work, giving me “path not supported” errors.

There’s not much documentation online about using this particular cmdlet (most people seem to be using DISM instead) but I found that if you map a drive letter to the Application path it works fine. My code for this is below (it also includes a nifty transcript logging wrapper so you get full visibility of the process in your MDT Logs folder):

# Determine where to do the logging
$tsenv = New-Object -COMObject Microsoft.SMS.TSEnvironment
$logPath = $tsenv.Value("LogPath")
$logFile = "$logPath\$($myInvocation.MyCommand).log"
$DeployRoot = $tsenv.Value("DeployRoot")

# Start the logging
Start-Transcript $logFile
Write-Output "Logging to $logFile"

# Map a drive letter to the Application folder (Add-ProvisioningPackage rejects UNC paths)
net use P: "$DeployRoot\Applications\Provisioning Package - Your Name"

Write-Output "Adding TrustedProvisioners Registry Key"
Start-Process -FilePath "C:\windows\regedit.exe" -ArgumentList "/s P:\TrustedProvisioners.reg" -Wait

Write-Output "Adding Provisioning Package from folder: $DeployRoot\Applications\Provisioning Package - Your Name mapped to P:"
Add-ProvisioningPackage -Path "P:\Provisioning Package - Your Name.ppkg" -ForceInstall

# Tidy up the mapped drive and stop logging
net use P: /delete
Stop-Transcript

Note: you need to copy the entire contents of the folder where the signed exported Provisioning Package is created for the silent install to work correctly. Thanks to Dhanraj B for pointing this out in the Technet forums otherwise I may well have given up on it…

If you followed the Chris Reinking instructions to the letter you should see a successful import in your logs, which looks something like this…

IsInstalled : False
PackageID : d69c654b-3546-4b77-abcd-93f09285c123
PackageName : Provisioning Package - Your Name
PackagePath : P:\Provisioning Package - Your Name.ppkg
Description :
Rank : 1
Altitude : 5001
Version : 1.17
OwnerType : ITAdmin
Notes :
LastInstallTime : 22/11/2018 15:51:52
Result : 0

_Personalization_DeployDesktopImage.provxml
Message:Provisioning succeeded
NumberOfFailures:0 (0x0)

Message:Provisioning succeeded
NumberOfFailures:0 (0x0)

Message:Provisioning succeeded
NumberOfFailures:0 (0x0)

Message:Provisioning succeeded
NumberOfFailures:0 (0x0)

Start Menu layout

One part of ICD that didn’t seem to work at all for me was Start Menu layout deployment. In Active Directory land it’s a simple GPO, but despite following the MS documentation to the letter the layout never applied, even though the rest of the Provisioning Package worked.

Instead of using the ICD method I took an easy way out and created another Application in MDT, which copies the desired LayoutModification.xml to the Default User AppData folder:

cscript.exe CopyFiles.vbs C:\Users\Default\AppData\Local\Microsoft\Windows\Shell

Get CopyFiles.vbs from here
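If you’d rather avoid the VBScript dependency altogether, a PowerShell equivalent (a sketch, assuming LayoutModification.xml sits alongside the script in the Application folder) would be:

```powershell
# Copy the Start Menu layout into the Default User profile
# so every newly created account picks it up at first logon
Copy-Item -Path "$PSScriptRoot\LayoutModification.xml" `
    -Destination "C:\Users\Default\AppData\Local\Microsoft\Windows\Shell" -Force
```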

The final piece of the puzzle – automatic Azure AD Join

At this point we’re very close to having a hands-off deployment but we need a method to join the machine to Azure AD itself. Of course you can do this manually if you wish; the machine at this point can be handed to a user to complete the OOBE wizard and they’ll be able to log in with their Office 365 credentials.

Be mindful that the user who performs the Azure AD Join becomes local admin on that device. If you don’t want this you’ll need to use the method below to get a bit more control.

Initially I thought the Azure AD Join would be simple as there’s an option sitting right there in the ICD wizard

However the URL gives away what seems to be the crux of the issue, in that you need an Intune license to get the Bulk Token 😦 When I try it fails miserably with the error “Bulk token retrieval failed”

Again documentation on this isn’t exactly forthcoming but there’s one particular Technet post by a Microsoft staffer that suggests it is limited by licensing.

“By the way, the error message “Bulk token retrieval failed” might be caused if there are no more licenses available for Azure AD Premium and Microsoft Intune”

Interestingly the Set up School PCs app is able to join Azure AD in bulk, which suggests Microsoft can enable this functionality for non-Intune users when they feel like it, but seemingly not for ICD use.

Windows Autopilot returns

Remember I said I’d return to Autopilot… well, “later” in the post has now arrived and it’s time to make use of it.

Without Intune we can still configure Autopilot using the Microsoft Store for Business (or Microsoft Store for Education if you’re an edu user). You’ll need to set it up if you haven’t already…

and then access it via one of the links below

Now follow the steps below:

At this point once your freshly imaged device hits the OOBE screen it’ll connect to Autopilot, apply the profile settings and skip all screens apart from bare minimum input required from the user for keyboard layout and login info. Once they log in Office 365 will already be installed, along with any other apps you provisioned via the Task Sequence and the device will be branded up as per your organisation’s design (provided you’ve configured this in the Azure Portal)

Note: during my testing Autopilot didn’t seem to work so well with a Hyper-V VM. The gather script obtained 3 different sets of hardware hashes for the same VM on 3 separate image attempts, whereas on physical laptops the data was gathered consistently. One to keep an eye on, but it was a case of third time lucky, which allowed for some nice screenshots of the end product…
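As an aside, if you’re gathering hashes by hand, Michael Niehaus’ Get-WindowsAutoPilotInfo script from the PowerShell Gallery is the usual route – a sketch, assuming the machine has internet access and the output path is writable:

```powershell
# Fetch the hash-gathering script from the PowerShell Gallery, then export
# this machine's hardware hash to a CSV ready for upload to the Store portal
Install-Script -Name Get-WindowsAutoPilotInfo -Force
Get-WindowsAutoPilotInfo.ps1 -OutputFile C:\AutoPilotHWID.csv
```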

Multiple user access

An interesting observation during this process was the mysterious appearance of the “Other user” button, which I’d been chasing on Azure AD Joined machines for a while before this but without much joy. As you can imagine I was quite pleased when it popped up after running the first couple of test Task Sequences.

I’m not sure if it’s a recent Windows Update, enabling some of the “Shared PC” settings in ICD or (more likely) the addition of Azure AD Premium P1 licenses on our Office 365 tenant, but it makes an Azure AD Joined machine much more usable for our use case, where machines need to be loaned out to multiple members of staff.

If anyone can clear up this quandary it’d be great to hear from you!

Note the use of the branded login screen to add some additional instructions for the user to help them log in for the first time 🙂


And here’s one we made earlier…

Not bad eh 😉


If Autopilot doesn’t work, check network connectivity first, particularly if using a proxy server internally.


image credit: blickpixel –

MDT – Windows 10 deployment watch

With our MDT environment up and running we’ve been refining our Windows 10 build over the past couple of months, sending out pilot builds to specific areas so we’re confident in the process when it comes to large-scale deployment over summer.

This post focuses on a few Windows 10-specific tweaks that we’ve made to the Task Sequence that may be of interest…

Thin image approach

In the past I was a fan of what could be called a Hybrid image model, in as much as I’d create a “Base” Reference image in a VM, usually comprised of Windows + Office + Updates. That would get captured and become the WIM file that goes into the Task Sequence.

However with Windows 10 I’ve decided to go down the completely thin approach that’s best represented as either a sandwich or hamburger depending on your culinary preference (!) Effectively the deployment gets built from its component parts, starting from an unaltered source Windows 10 WIM file extracted from its parent ISO image.

In our case we’ve settled on Education 1709 x64 as the build to deploy, due to some useful features such as OneDrive Files on Demand and Windows Defender Exploit Prevention. Along the way we’ve also used the 1607 and 1703 builds. The advantage of using the Thin image method is that we can swap the OS out at will with two clicks, rather than having to go through a Capture process that seems to have the potential for error.

Secure Boot validation

Windows 10 1709 brought in some new security features which benefit from machines being converted to UEFI rather than BIOS mode, and in some cases (Windows Defender Credential Guard) need Secure Boot too. Seeing as we needed to convert BIOS > UEFI on older machines anyway, it made sense to enable Secure Boot at the same time.


The question was how to ensure that a machine is correctly configured before starting the imaging process (as converting later on is far from ideal).

The answer is to run cmd.exe to return a non-zero code if the machine is misconfigured, i.e. if either of the following is true:

  1. Task Sequence variable isUEFI is false, and/or
  2. UEFISecureBootEnabled registry key is 0

If the machine is configured incorrectly the Task Sequence will fail before it even starts to pull down the image. To ensure you catch it early enough add the step here:

Putting the two together looks like this:


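One possible shape for the Secure Boot half of that check – a sketch only, not necessarily the exact step used here – is a Run Command Line step that queries the standard SecureBoot state key and returns non-zero when the value isn’t 1:

```cmd
REM Fails (errorlevel 1) unless UEFISecureBootEnabled reads as 1
cmd.exe /c reg query HKLM\SYSTEM\CurrentControlSet\Control\SecureBoot\State /v UEFISecureBootEnabled | find "0x1"
```

Pair it with a Task Sequence condition on the isUEFI variable to cover the BIOS/UEFI side of the check.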
Removing the cruft

Sadly, despite Microsoft giving Education our very own specific build of Windows, they didn’t extend the effort to cleaning up the junk that gets pushed down with a standard Windows 10 installation. Seriously, who wants Candy Crush on their business machines?!

Fortunately scripts exist to assist with cleaning up the junk shipped with the OS so it’s suitable for deployment. Now we can do this with DISM at image level but again my aim is to avoid tinkering with the Microsoft media if possible so I prefer the following PowerShell method…

PowerShell: Removing UWP apps from Windows 10 1607/1703/1709
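The linked script is the thorough route, but the core of any such cleanup boils down to a loop like this (a sketch only – the app names are examples, adjust the list to taste):

```powershell
# Remove unwanted UWP apps: both copies installed for existing users and the
# provisioned packages that would otherwise reinstall for every new profile
$apps = @("king.com.CandyCrushSodaSaga", "Microsoft.XboxApp")   # example targets
foreach ($app in $apps) {
    Get-AppxPackage -AllUsers -Name $app | Remove-AppxPackage -ErrorAction SilentlyContinue
    Get-AppxProvisionedPackage -Online |
        Where-Object { $_.DisplayName -eq $app } |
        Remove-AppxProvisionedPackage -Online
}
```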

Disable Refresh \ Reset

Another Windows 10-specific tweak is to disable the Refresh \ Reset menu that users can access either by using the Settings app or by holding shift while a machine reboots. In our case we don’t want users to wipe their machine clean of provisioned applications and it appears that this functionality will work even without local admin rights (!)

The solution to this one came via the EduGeek forums courtesy of ErVaDy using bcdedit commands:


Place the commands below into a batch file and run as an Application or Task Sequence step:

reagentc /disable
bcdedit /deletevalue {current} recoverysequence
bcdedit /set {bootmgr} bootems off
bcdedit /set {bootmgr} advancedoptions off
bcdedit /set {bootmgr} optionsedit off
bcdedit /set {bootmgr} recoveryenabled off
bcdedit /set {current} bootems off
bcdedit /set {current} advancedoptions off
bcdedit /set {current} optionsedit off
bcdedit /set {current} bootstatuspolicy IgnoreAllFailures
bcdedit /set {current} recoveryenabled off

Updating OneDrive Files on Demand Client

In a way that only Microsoft can, Windows 10 1709 shipped with an old version of the OneDrive client that doesn’t work with the much-anticipated Files on Demand feature straight out of the box 😦

Although the client does auto-update we didn’t want any automatic sync starting without the placeholder functionality being in place so I’ve scripted an Application in the MDT Task Sequence to take ownership of the file on the newly deployed image, copy the latest version of the client over and then set everything back as it was.

For more details and the script itself please see my previous post OneDrive Files on Demand – update!

Pre-staging printer drivers

During our Windows 10 deployment we’re also migrating to a new set of Windows Print Servers, along with new GPOs to map them. However in initial testing I noted the first user to log in had a long wait whilst drivers were copied down from the server and installed.

Although subsequent logins won’t get this issue it doesn’t give a good first impression to the initial user so I wanted to find a way around it.

Step forward the very useful printui.dll 🙂


Because we’ve rationalised our print fleet over the past few years in a move towards MFDs I only have 3 drivers to cover the entire range of hardware. By using a script method I can then pre-stage the drivers onto the machine at image time and speed up that first logon significantly!

Again, paste this into a batch file and call it as an Application (use an Application step instead of Run Command Line, as you want the driver files copied into the Deployment Share)

cscript "prndrvr.vbs" -a -m "HP Universal Printing PCL 6" -i "%CD%\HP Universal Print Driver\pcl6-x64-6.4.x.xxxxx\hpcu196u.inf"

Note the use of %CD% to ensure the path to the driver file is resolved correctly!

WSUS resources

Although there’s nothing special about running Windows Updates in MDT (use the built-in Task Sequence steps), we noticed that our WSUS server was struggling and sometimes hung at the “Install Updates” step of the Sequence. The WSUS console then became unresponsive on the server end too.

After further research it turns out our increasing number of machines needs more resources than the default WSUS limit of 2GB in the IIS Application Pool to handle the connections. Upon making the change below it’s back to being stable again.
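The change in question is the Private Memory Limit on the WSUS application pool, which you can make in IIS Manager or script with appcmd – a sketch assuming the default pool name of WsusPool (the value is in KB; 0 removes the limit entirely, or pick a higher fixed cap if you prefer):

```cmd
REM Lift the WsusPool private memory recycling limit (KB; 0 = unlimited)
%windir%\system32\inetsrv\appcmd.exe set apppool "WsusPool" -recycling.periodicRestart.privateMemory:0
```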




Run WinSAT

An oldie-but-goodie; running the WinSAT assessment tool at the end of setup will make sure your machine is properly benchmarked and appropriate performance tuning is performed by Windows. It doesn’t take long so I thought it worth continuing with:


Just add a Run Command Line step with the following in the box:

winsat.exe formal