OneDrive Files on Demand – update!

After our initial post on getting the new Windows 10 1709 OneDrive client up and running with Files on Demand we had one or two little snags left to fix. Both are now resolved, so I thought I’d make a quick ICYMI post to cover the final pieces of the puzzle and get everything running perfectly 🙂

Outdated client on the image

In true MS fashion the 1709 ISO ships with the old OneDrive client (epic fail), which means users face an annoying wait while it updates. There’s also the risk of starting off with the wrong client and therefore syncing files down by mistake.

I was trying out an updater script that would copy over the new client but didn’t have much success in MDT. After looking more closely at the logs with CMTrace I could see it failing on the copy operation so I added a Suspend action and tried each step manually. That flagged up an access denied error.

I then realised that MDT runs its scripts as the local Administrator user rather than SYSTEM as SCCM would, therefore the script’s permissions need tweaking for MDT use:

REM Take ownership of the bundled (outdated) OneDrive installer
%SYSTEMROOT%\system32\takeown /f %SYSTEMROOT%\SysWOW64\OneDriveSetup.exe >> %SYSTEMROOT%\logs\Onedrive.log
REM Grant the local Administrator account full control (MDT runs as Administrator, not SYSTEM)
%SYSTEMROOT%\system32\icacls %SYSTEMROOT%\SysWOW64\OneDriveSetup.exe /Grant Administrator:(F) >> %SYSTEMROOT%\logs\Onedrive.log
REM Overwrite the stock installer with the current OneDrive client
Copy OneDriveSetup.exe %SYSTEMROOT%\SysWOW64\OneDriveSetup.exe >> %SYSTEMROOT%\logs\Onedrive.log /Y
REM Remove the temporary permission again once the copy is done
%SYSTEMROOT%\system32\icacls %SYSTEMROOT%\SysWOW64\OneDriveSetup.exe /Remove Administrator:(F) >> %SYSTEMROOT%\logs\Onedrive.log

This works like a charm! The updated client is installed during the Task Sequence and the first run as a user now begins with the 2017 client.

I’m also thinking of setting up a scheduled task on the MDT server to pull down the latest OneDrive client at regular intervals so the Task Sequence always deploys the latest version. That should do the trick until Microsoft see sense and push it out properly via WSUS.
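
To give an idea of what that scheduled task could run, here’s a minimal PowerShell sketch. The download link and deployment share path are assumptions (point them at the current OneDrive installer URL and your own share), so treat it as a starting point rather than a finished script:

# Grab the latest OneDrive sync client and drop it into the deployment share
# NOTE: the fwlink below is a placeholder - replace it with the current OneDrive installer link
$source      = "https://go.microsoft.com/fwlink/?linkid=844652"
$destination = "D:\DeploymentShare\Applications\Microsoft OneDrive\OneDriveSetup.exe"
Invoke-WebRequest -Uri $source -OutFile $destination -UseBasicParsing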

Silently configure OneDrive using the primary Windows account

The final piece of the puzzle is to make the client log in via SSO so users have a fully configured OneDrive without any additional login prompts. I was puzzled by this not working initially as the GPO looks straightforward but it didn’t seem to do anything.

I’d read that the SSO relies on ADAL (aka modern authentication) so I initially wondered if our SSO provider hadn’t implemented that yet. That didn’t seem to make much sense as ADAL has been out for a while now so I hit Google a bit more deeply to try and find some further detail.

Soon came to this page, which I’m sure I’ve seen before:

Ref: https://support.office.com/en-gb/article/Use-Group-Policy-to-control-OneDrive-sync-client-settings-0ecb2cf5-8882-42b3-a6e9-be6bda30899c#silentconfig

The key (pun not intended, honest!) is the EnableADAL.reg file that’s squirrelled away at the bottom of the page. Deploy that via GPP et voila, one perfect blue OneDrive icon without any user interaction 🙂
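
For reference, the EnableADAL.reg file is nothing more than a single per-user value; this is from memory, so double-check it against the copy on the Microsoft page before deploying via GPP:

Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\SOFTWARE\Microsoft\OneDrive]
"EnableADAL"=dword:00000001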

What next?

Having got Files on Demand working how we want with minimal cache, SSO and the latest client we can now move onto piloting it with our users. I’ve been tweaking Windows 10 GPOs today for some of the newer features such as Windows Defender Security Center, Exploit Protection etc. so the configuration is looking good enough for some early adoption!

OneDrive Files on Demand – first steps

After much anticipation and playing with Windows Insider previews, OneDrive Files on Demand finally hit general release alongside Windows 10 1709 (Fall Creators Update) the other week. I’ve been giving it a test drive over the past week or two along with fellow Network tech Matt Stevens – here are a few of our observations so far, along with workarounds for a couple of teething issues.

Windows 10 build

There is one pretty important requirement to bear in mind with the new Files on Demand feature; it’s only available in build 1709 and above. That means you need to be on the semi-annual (aka CB) branch rather than the LTSB route that some people have taken.

Ref: https://blog.juriba.com/windows-10-branching-timeline

It’s new features like Files on Demand that make the additional work of staying up to date worthwhile. So far we have a couple of hundred laptops running 1703 without too much fuss, so 1709 should slot in fairly smoothly given that we now build our images layer-by-layer using only the pure Microsoft WIM as a starting point.

We tamed (nuked) the built-in apps via a very handy PowerShell script we found online (also see the alternative version here) that runs during MDT deployment, and the Start Menu default tiles are cleaned up via a GPO layout file. Configure your Windows Store for Business (or Education, as the case would have it), tweak a few more policies for Cortana, telemetry etc. and Windows 10 becomes much more manageable even on the latest build.
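
The linked scripts do a fair bit more (logging, whitelisting and so on), but the core of the approach is just a loop over the provisioned app packages; the keep-list below is purely illustrative:

# Keep-list is illustrative only - adjust to suit your environment
$keep = "WindowsStore|WindowsCalculator|Photos|StickyNotes"

# Remove unwanted provisioned (built-in) apps so new user profiles never receive them
Get-AppxProvisionedPackage -Online |
    Where-Object { $_.DisplayName -notmatch $keep } |
    Remove-AppxProvisionedPackage -Online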

Why Files on Demand?

If you don’t know what all the fuss is about check out the initial Insider announcement:

Ref: https://blogs.windows.com/windowsexperience/2017/06/13/onedrive-files-demand-now-available-windows-insiders/#kwLbqguOTefId6pv.97

Ref: https://blogs.office.com/en-us/2017/05/11/introducing-onedrive-files-on-demand-and-additional-features-making-it-easier-to-access-and-share-files/?eu=true

What it basically means is that we can finally integrate (huge amounts of) cloud storage with our on-premises desktops in a much tighter fashion, and dispense with the (unsupported) scripts or (expensive) third-party tools previously needed to access OneDrive in File Explorer. It also means not having to deal with WebDAV, which always felt like a horribly dated and clunky protocol for accessing cloud storage.

As soon as the 1709 ISO hit VLSC I grabbed it from Microsoft, slotted the new WIM into one of my MDT Task Sequences and deployed a VM to give the production version a try. It shows much promise but as always there’s some gotchas that mean nothing is ever quite straightforward.

Client version

Microsoft being Microsoft, there’s always one shoot-self-in-foot moment whenever a new product comes out and this release was no exception. Despite having the freshly downloaded 1709 ISO I noticed that on first launch the client was showing up as 2016 and not the latest 2017 release (17.3.7076.1026) that brings in Files on Demand.

https://support.office.com/en-gb/article/New-OneDrive-sync-client-release-notes-845dcf18-f921-435e-bf28-4e24b95e5fc0


There’s a useful summary of the client install \ update process below. It does strike me as odd that the client self-updates and installs from appdata rather than being managed by WSUS.

Ref: http://deploynovellas.com/2016/05/25/install-onedrive-ngsc-update-windows-10-osd

It also takes a while to update when deployed on a clean 1709 build due to the initial client being out of date. This means that if a user is a bit too quick off the mark they can end up with an old-school full sync rather than Files on Demand.

I’ve been trying to replace the client during the deployment Task Sequence but more testing is required as my initial attempt failed with “Application Microsoft OneDrive 17.3.7073.1013 returned an unexpected return code: 1”.

Ref: http://model-technology.com/next-gen-onedrive-deployment-during-sccm-osd

I’ve added a Suspend action to the Task Sequence and will examine the logs to see what’s going on as the script tries to run…

Group Policy

To get more control over how the client is used, grab the updated Group Policy templates from the local installation folder %localappdata%\Microsoft\OneDrive\BuildNumber\adm\

Ref: https://support.office.com/en-gb/article/Use-Group-Policy-to-control-OneDrive-sync-client-settings-0ecb2cf5-8882-42b3-a6e9-be6bda30899c

We force Files on Demand to be enabled as we don’t want sync cache eating up drive space on machines. We also configure our tenant ID (found via the Azure AD portal) so only Office 365 accounts can be used.

Configure these under Computer Configuration > Administrative Templates > OneDrive (registry equivalents are sketched after the list):

  • Allow syncing OneDrive accounts for only specific organizations > Enabled (using Tenant ID)
  • Enable OneDrive Files On-Demand > Enabled
  • Silently configure OneDrive using the primary Windows account > Enabled
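
If you want to verify the result on a client, these settings end up as values under HKLM\SOFTWARE\Policies\Microsoft\OneDrive. The sketch below is from memory (the GUID is a placeholder for your own tenant ID), so check it against the ADMX before relying on it:

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\OneDrive]
"FilesOnDemandEnabled"=dword:00000001
"SilentAccountConfig"=dword:00000001

[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\OneDrive\AllowTenantList]
"1111aaaa-2222-bbbb-3333-cccc4444dddd"="1111aaaa-2222-bbbb-3333-cccc4444dddd"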

I need to check if our third-party identity provider supports ADAL to make sure that last GPO setting works correctly. In the future we may well move to Azure AD Connect Passthrough authentication instead.

Clearing local cache (Free up space)

One important thing to remember about Files on Demand is that when a file is either downloaded from the cloud or freshly uploaded to it, a cached copy is kept on the local machine.

Over time (or with a large upload) this cache could grow and cause exactly the issues we were trying to avoid, especially on a shared machine with large numbers of users (pretty much the case for all our classroom machines).

At present no policies seem to exist to force the “Free up space” option that removes the cached copies of files. However, the article below suggests that the new file attributes brought in with 1709 can be used to automate the process.

“Attrib.exe enables 2 core scenarios.  “attrib -U +P /s”, makes a set of files or folders always available and “attrib +U -P /s”, makes a set of files or folders online only.”

https://techcommunity.microsoft.com/t5/OneDrive-Blog/OneDrive-Files-On-Demand-For-The-Enterprise/ba-p/117234

We tried a script that runs on the root OneDrive folder and sure enough it resets all files back to Online only and reduces the space used down to a megabyte or so 🙂

cd "%userprofile%\Onedrive - Name of your Organisation"
attrib +U -P /s

Running this script on Logoff should in theory keep the cache files down to the bare minimum.

Disclaimer: we only just figured this one out today so again caveat emptor if you go and run this in production without testing it first!!!

Future Decoded 2017 highlights

Today I took a trip down to ExCeL London for Microsoft’s annual Future Decoded conference. As always it proved an interesting showcase of Microsoft’s future vision and a chance to gain technical insights into current and upcoming projects. Here are a few of my take-aways from the day…

Deploying Windows 10 with Autopilot

Although I’d read a bit about this a while back it was useful to see the Windows 10 Autopilot deployment process in action and the rationale behind using it. Given that we have been deploying some pilot Windows 10 devices to staff it does in theory help speed up that initial out-of-box process for devices that we predominantly see as cloud-managed and want to hand out without too much fuss.

Future Decoded slides: https://www.futuredecoded.com/session/fd76e051-a6a9-e711-80c2-000d3a2269dd

Ref: https://docs.microsoft.com/en-us/windows/deployment/windows-10-auto-pilot

For me this method will be applied to devices that will spend more time off the main AD network than on it and likely have fairly simple requirements for pre-installed software. My colleagues in the office will also be pleased to hear Autopilot helps skip the initial talking Cortana screen that’s been heard many a time so far during testing (!)

However the next part, and the real power of the “Modern” deployment method being showcased, requires Intune in order to set up full profiles with customisable apps, settings etc. Although an MDM solution is on my wish list to get more control over roaming mobile devices, it’s another software subscription bolt-on, so making it an almost-necessary part of the Modern deployment experience sits a bit uneasily with me.

Another useful piece of advice was to check out Windows Analytics to help prepare for our Win10 migration project, which I need to have a proper look at tomorrow.

Ref: https://www.microsoft.com/en-us/WindowsForBusiness/windows-analytics

Microsoft Hands On labs

During the break out sessions there were plenty of Surfaces put out on the 3rd floor running “Hands On” lab training materials. These looked like they’d be perfect for students in IT courses to use for trying out Azure etc. rather than needing access to a physical lab or trial accounts in a live environment.

The content covers Windows 10, Office 365 and Azure so it’s perfect for either keeping your own skills up to date or providing students with a good few hours’ worth of e-learning material, which is interactive because you actually configure VMs rather than just watching videos.

Check them out at https://www.microsoft.com/handsonlabs

All you need is some form of Microsoft account to log in with and away you go 🙂


Security & ATP

One thing 2017 will certainly be remembered for in the tech world is the high profile ransomware attacks that have brought home the realities of modern malware threats to a much broader audience than perhaps ever before. As such the session on Advanced Threat Protection was particularly interesting.

Future Decoded slides: https://www.futuredecoded.com/session/f6204a3e-e5a8-e711-80c2-000d3a2269dd

We were also recommended to check out the NCSC presentation from yesterday, another one for tomorrow’s reading list:

NCSC slides: https://www.futuredecoded.com/session/e1382eb1-01a9-e711-80c2-000d3a2269dd

The ATP offering now covers email, endpoint and Azure-based analytics. Moving to Windows 10 (1709) brings additional security and exploit protection such as:

  • Windows Defender Application Guard
  • Windows Defender Exploit Guard (aka EMET for those who remember it from Windows 7 days)

Ref: https://www.microsoft.com/en-us/windowsforbusiness/windows-atp

All of this sounds great until the dreaded “l” word comes around… yup, it’s licensing. Although none of these services grow on trees, there’s only so far budgets can stretch, particularly for us Education users. A real problem for Education in particular is that all the new cloud-first offerings are sold solely on a per-user basis rather than the fairer per-FTE staff method used for our on-prem EES-licensed products. Costs can soon spiral upwards and make some of these offerings (Azure AD Premium, I’m looking at you!) almost unobtainium.

A small plea to the powers that be…

If someone from Microsoft happens to end up reading this just think of it this way… in Edu we want to make use of these new solutions and embrace the tech that’s on offer to help provide the best environment we can for users.

I’m not saying we expect Microsoft to give it all away for free (although we’d be more than happy if you’re feeling generous!) but realise that we need to protect student accounts and machines just as much as staff, and paying for a 5000-seat EMS or ATP setup is simply impossible. The end result is that everyone loses (well, perhaps not if you’re Google, who are working hard to take that Edu market if Microsoft don’t want it for some reason), so please rethink these pricing models and help make them work for non-profits as well.

Windows Mixed Reality

Towards the end of the day I went to the Mixed Reality stand to try out the new headsets, which sit in a much more affordable price range than the incredibly-cool-but-very-pricey HoloLens. We’re currently building a new campus for construction and engineering so I was interested to see if Mixed Reality could fit in there.

https://www.microsoft.com/en-us/store/collections/vrandmixedrealityheadsets

Having tried a Lenovo headset with its associated controllers I’m impressed! Whilst VR headsets \ Google Cardboard made that first step, there was still a disconnect in terms of interacting with the world you were immersed in; the hand-held controllers take this a step further and bring you more into the 3D virtual environment.

The out-the-box demo of walking around a house picking up and manipulating objects showed potential for me as I can imagine students being able to design in 3D using something like Maya then showcase those objects in a virtual environment using Mixed Reality.

The idea of pinning multiple virtual screens, opening Windows apps and working through the headset is also intriguing, although I suspect it needs 4K lenses for longer periods of use than the 2K ones being fitted into the kit at present.

The demo finished off with a rather addictive space invaders-style game using the VR controllers. Anyone with a Playstation VR or similar has no doubt already experienced something similar and more but it’s good to see an attempt to bring the technology into productivity tools as well. One of the opening keynotes focused heavily on HoloLens and Mixed Reality so it does seem Microsoft are really going for this area of the market.

It’s also another reason to go down the Windows 10 (1709) route as these features are only available on the new Fall Creators Update.

Fail of the day

However Microsoft wouldn’t be Microsoft if they didn’t shoot themselves in the foot from time to time. At the first Future Decoded it was the irony of queuing at a tech event to collect a piece of paper, but today’s award moves the bar up a notch… step forward the Future Decoded app!

At an event where you spend the whole day watching cutting-edge Azure cloud technology Microsoft hired an external company to make possibly the worst conference app I’ve ever used…

  • slow to load and required registration to view even basic content, why MS would need that data is beyond me as they spend all day scanning your badge as you move between rooms
  • website scraping to populate the app content, if I wanted a web page I’d open it directly
  • seminar sessions list that had to be manually filtered per day (looks like a GETDATE function was too difficult to implement?)
  • but the worst & most irritating was the “My Agenda” planner that didn’t generate a personal agenda at all and just scraped the keynote details from the website… hopeless

Maybe next year get some of your in-house people to showcase some of those cutting-edge Azure technologies via the app, but whatever you do don’t bring this one back!

MDT imaging megapost – part 1 (our first server)

The great thing about working in the tech field is that it keeps moving on, ever changing, always evolving. That means sometimes you have to let go of systems that were once the bright shining light of progress when it becomes apparent something better has taken their place. Now is that time for my trusty ZCM 11 custom imaging system, built back in 2013 and star of a 6-part thread series that I look back on now and think “wow, I actually did that”.

Up until I moved imaging onto a Satellite, the stats say the original Primary server pushed out over 5000 images. Given the length of time the Satellite has been in place, plus the stats from our other sites, that figure can easily be doubled, so over the course of 4 years around 10,000 image cycles have been completed.

Compared to the previous process a huge amount of time was saved, allowing us to complete a large-scale Windows 7 migration with relative ease. Add to that a 4-year saving on ENGL license costs and my motley crew of Bash and PowerShell scripts can retire with the satisfied feeling of a job well done 🙂

The future calls, and it’s shaped like the number 10…

However we need to move on. Funnily enough it’s another OS migration knocking on the door that prompted the change, along with a shift in hardware and environment that meant the Linux-based PXE setup was starting to hold us back.

Windows 10 support from ZCM seemed patchy at best, as was timely support for new hardware such as Surfaces and their ilk. Reading the forums and email groups didn’t inspire much confidence either so we decided to start looking elsewhere.

SCCM was the natural direction of travel but having made a substantial investment of time creating ZCM Bundles we weren’t necessarily ready to move all that just yet. Similarly ZCM Patch Management works pretty well these days for covering our 3rd-party apps. With that in mind the Microsoft Deployment Toolkit was the obvious choice.

A nice GUI-based managed scripting environment with Windows PE as the underlying OS ticked all the boxes. Oh and did I mention it’s free!

It’s time for my own MDT… Massive Deployment Thread!

What originally started as a small side-project to push Windows 10 out to a couple of trial tablets has now expanded into a core system that’s been at the heart of our summer works. With that in mind it’s time to write up the journey and the numerous tips, tricks and tools used along the way.

Many of those ideas come from some of the best deployment pros in the business such as Johan Arwidmark, Michael Niehaus and Mikael Nystrom so a big shout out for all the knowledge they share. Hopefully this post will give an idea of how we put those pieces together in a live environment.

The beginning, our first server

Initially we started out deploying MDT for the sole purpose of imaging up a batch of demo Surface 3 devices so the first thing was to spool up a new VM with all the required software and roles installed. Links can be found below to save you some time:

Early fixes and customisations

After getting the basic Deployment Share running we hit a few minor issues that needed resolving and are worth bearing in mind:

Multiple DNS namespaces

We have two domains that are in use internally, one of which usually gets appended as part of the domain join process and the other via DHCP.

In the PE environment the machine isn’t domain joined and as such the default setting in Bootstrap.ini wouldn’t connect to the deployment share as it didn’t know the correct DNS suffix to append.

Ref: https://scottisageek.wordpress.com/2011/12/22/mdt-2010-and-multiple-dns-namespaces/

…we found it quicker in our case to change the DeployRoot setting to the MDT server’s FQDN rather than the short name… problem solved 🙂
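
In Bootstrap.ini that just means something along these lines (server and share names made up for illustration):

[Settings]
Priority=Default

[Default]
DeployRoot=\\mdt01.subdomain.domain.ac.uk\DeploymentShare$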

Share permissions

The default permissions applied to the Deployment Share by the installation wizard weren’t set up quite as we liked. I can’t remember the exact reason now, but looking back at documentation on other sites I think the share needed locking down to prevent users viewing the Deployment Share content or (even worse) making unauthorised changes to it (!)

We now have specific AD groups and a service account set up so nominated MDT Administrators can read \ write to the share to upload Application install files etc. but the imaging account (more on that later) can only read and all other users are denied access by virtue of having no rights.
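
As a rough illustration of the share-level side of that (NTFS permissions need tightening to match), the SmbShare cmdlets can set it up in a few lines; the group, account and share names here are made up:

# MDT admins get full control, the imaging service account read-only
Grant-SmbShareAccess -Name "DeploymentShare$" -AccountName "DOMAIN\MDT Administrators" -AccessRight Full -Force
Grant-SmbShareAccess -Name "DeploymentShare$" -AccountName "DOMAIN\svc-mdt-imaging" -AccessRight Read -Force

# Everyone else gets nothing
Revoke-SmbShareAccess -Name "DeploymentShare$" -AccountName "Everyone" -Force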

Set UK Locale

A quick and easy tweak sets up the keyboard settings for UK users in Bootstrap.ini:

Ref: http://kabri.uk/2010/01/20/sample-bootstrap-ini-for-uk-deployments/

Similarly, set them in CustomSettings.ini too:

Ref: https://scriptimus.wordpress.com/2011/06/23/mdt-2010-sample-customsettings-ini-for-fully-automated-deployments/
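
The locale-related lines from those samples look roughly like this (UILanguage should match the language of your install media):

KeyboardLocale=en-GB
UserLocale=en-GB
UILanguage=en-GB
TimeZoneName=GMT Standard Time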

There are quite a few other settings you’ll want to add in CustomSettings.ini but more detail on those will follow in relevant posts so keep your eyes peeled!

Update the Deployment Share

This is one action you’ll soon need to get into the habit of! If you make changes to the settings in any of the .ini files or add drivers that you’ll need in the PE environment (basically network and storage) then you need to update the Deployment Share.

This recompiles the Boot Images to include your changes, otherwise you’ll find all those nice new additions above make no difference whatsoever!
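
You can do this from the Deployment Workbench (right-click the share and choose Update Deployment Share) or script it with the MDT PowerShell module; a quick sketch, with the install and share paths assumed:

Import-Module "C:\Program Files\Microsoft Deployment Toolkit\bin\MicrosoftDeploymentToolkit.psd1"
New-PSDrive -Name "DS001" -PSProvider MDTProvider -Root "D:\DeploymentShare"
# -Force rebuilds the boot images completely rather than just updating them
Update-MDTDeploymentShare -Path "DS001:" -Force -Verbose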

Think of this as step 1 / 2 to completely updating the Boot Images though. If the MDT wizard says that the Boot Images have changed you also need to copy the new WIMs over to WDS so PXE boot is using the latest images.

In WDS browse to your server, select Boot Images, right-click the existing image and click Replace Image. Browse to your Deployment Share’s Boot folder and select the correct image for each architecture.

Windows Deployment Services service won’t start

At an early point in our testing WDS decided it didn’t want to start after a server reboot and was spewing error code 0x906. We weren’t sure why and were on the verge of reinstalling from scratch when I spotted this:

Ref: https://social.technet.microsoft.com/Forums/windows/en-US/265b4b53-63ac-491f-817c-6030daa39b81/cant-start-windows-deployment-services-service?forum=itprovistadeployment

As per Aaron Tyler’s advice in the link above, run the wdsutil commands to uninitialize then reinitialize the server, manually pointing it at the RemoteInstall folder WDS creates.

wdsutil /uninitialize-server
wdsutil /initialize-server /reminst:[PATH_TO_REMOTEINSTALL_DIRECTORY]

Next time…

That should be enough to get your first server up and running. For the second post in the series we’ll look at the MDT Database and how it turns MDT from a good imaging solution into a great one 🙂

Build your own Thin-ish client with Windows 10 LTSB

After some positive user feedback from the launch of our new Server 2016-powered RDS setup I started wondering if it could have a wider use than just the remote access concept we initially wanted to address. One thought was making use of old \ low-spec devices that would be a bit too clunky running a modern OS but where the physical hardware itself was still in good condition.

ChromeOS-esque distributions such as CloudReady sound nice but come at a cost, so I set up a little side-project to see what could be done with what we already have on our licensing agreement or with something in the open-source space.

Looking around there do seem to be various thin-client “converter” products but again they all seem to be commercial e.g. https://www.igel.com/desktop-converter-udc/

The only other option I found was ThinStation, which may be worth a look when I have more time, but it seems a bit more involved to set up and I wanted to stick with the Microsoft RDP client for now for maximum compatibility.

Windows options

Going back some time I remember Microsoft releasing cut-down versions of Windows for RDS-type scenarios; in the XP days it was called Windows Fundamentals for Legacy PCs, which morphed into Windows 7 Thin PC in its next incarnation. Effectively all I want the OS to do is boot up, log in quickly then pass the credentials to a pre-configured RDP file using the standard mstsc.exe application.

However building any solutions on a Windows 7 base going forward seems to be a false economy so I decided to have a look around to see what was available on the Windows 10 codebase – the results were interesting…

IoT is the name of the day

Going forward it seems Microsoft have changed the branding for this kind of cut-down device to Windows IoT. In fact there’s a free edition which sounds ideal, but it only runs on certain devices and isn’t really geared for UI use:

Ref: https://www.theregister.co.uk/2015/05/21/first_look_windows_10_iot_core_on_raspberry_pi_2/
Ref: http://blogs.perficient.com/microsoft/2016/01/windows-10-iot-editions-explained/

Reading a bit further it appears Microsoft license an edition called Windows 10 IoT Enterprise for new thin client devices. Now it gets interesting… it seems that the OS itself is Windows 10 Enterprise LTSB but with some special OEM licensing. It just so happens that edu customers get Enterprise LTSB on EES licensing, so it’s time to take a closer look!

What this does mean is that Windows 10 Enterprise LTSB gets features from the old Windows Embedded products such as Unified Write Filter, perfect for a locked down device that shouldn’t need to experience configuration changes to the base OS.

Ref: https://msdn.microsoft.com/en-us/windows/hardware/commercialize/customize/enterprise/unified-write-filter

All these features are available in Enterprise LTSB simply by going into the Add \ Remove Windows Features window; look for the Device Lockdown section and add whichever ones meet your needs (more on this later).
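
If you’d rather script that step, the same features can be listed and enabled with PowerShell. The wildcard below is an assumption about the feature naming, so check the output of the first command against your own build:

# List the embedded / Device Lockdown optional features and their state
Get-WindowsOptionalFeature -Online | Where-Object { $_.FeatureName -like "Client-*" } | Select-Object FeatureName, State

# Example: enable the Unified Write Filter (used later in this post)
Enable-WindowsOptionalFeature -Online -FeatureName Client-UnifiedWriteFilter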

Image & GPOs

After downloading the latest ISO the LTSB 2016 WIM was imported into MDT. I made a quick task sequence to get it up and running and deployed the OS to a Hyper-V VM.

Boot and logon speeds are very quick given the lack of any Modern Apps which usually need to be provisioned at each new login. The performance gain explains why quite a few people within education have used LTSB for their desktop builds against MS’ wishes; however they’ll miss out on new features such as the much-needed OneDrive Files on Demand that will only be provided to the Current Branch release.

In theory setting up a Mandatory Profile could speed up login even further, but I haven’t got round to trying that yet.

RDS domain SSO

Upon logging in with domain credentials the next aim is to seamlessly drop users into the RDS farm without any further prompts. After doing a bit of research this can be achieved by setting a couple of GPOs:

  • allow credential delegation
  • trust SHA1 signature of signed RDP file

The need to allow delegation of credentials is fairly commonly mentioned but a lot of the articles are old and don’t mention where this needs to be set in a 2016 farm. In fact you only need to allow the delegation on the FQDN of the Connection Broker based on the results of my testing so far.

Computer Configuration > Administrative Templates > System > Credentials Delegation

To avoid any unwanted prompts about trusting the signature of a signed RDP file populate the GPO mentioned above and copy \ paste the signature from the RDP file that is provided by RDWeb for whatever RDS Collection you want to connect to.

User Configuration > Administrative Templates > Windows Components > Remote Desktop Services > Remote Desktop Connection Client > Specify SHA1 thumbprints of certificates representing trusted .rdp Publishers
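
As a concrete (made-up) example, the two settings end up populated something like this; the TERMSRV/ prefix is the usual format for the credential delegation entry, and the thumbprint comes straight from the signed .rdp file:

Allow delegation of default credentials > Add servers to the list:
    TERMSRV/rdbroker.subdomain.domain.ac.uk

Specify SHA1 thumbprints of certificates representing trusted .rdp publishers:
    a1b2c3d4e5f6a7b8c9d0e1f2a3b4c5d6e7f8a9b0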

Custom shell

With the credentials side sorted out, the final piece of the puzzle was to cleanly launch the session and (here’s the tricky bit) make a seamless logout once the RDS connection is closed. There are a few ways to achieve the first part:

  • use the IoT Embedded Shell Launcher feature \ enable Kiosk Mode via System Image Manager
  • use the Custom User Interface User GPO

Ref: https://social.technet.microsoft.com/Forums/en-US/b4552957-45c2-4cc4-a13d-6397f06ee62e/windows-10-kiosk-build-embedded-shell-launcher-vs-custom-user-interface?forum=win10itprosetup

Ref: https://docs.microsoft.com/en-us/windows/configuration/set-up-a-kiosk-for-windows-10-for-desktop-editions

One thing to bear in mind with Shell Launcher is what happens when the shell (i.e. mstsc.exe) closes; you only have the choice of:

  • Restart the shell.
  • Restart the device.
  • Shut down the device.
  • Do nothing

For the sake of speed, logging off would be better, so I decided to go with the Custom User Interface GPO; seeing as the Windows 10 device would be domain-joined anyway it also seemed a quicker, more efficient way to configure multiple clients.

Seeing as the Custom User Interface is a User GPO it goes without saying that Loopback Policy Processing needs to be enabled for the OU where the client resides. That also comes in handy for a few additional personalisation settings later on too.

The key User GPO setting is the Custom User Interface policy (User Configuration > Administrative Templates > System > Custom User Interface); you can add more lock-down policies as you see fit.

Auto log-out on disconnect

Seeing as I wanted to automate the process as much as possible and all the devices would be domain managed anyway, the GPO method seemed the quickest way to achieve what I wanted. It also avoids needing to do an Add \ Remove Features step on each endpoint device.

Another important point is that the Shell Launcher method only provides options to relaunch the program, shut down or restart the machine. For speed I was aiming to log off the “client” when the RDS session is done, so I definitely went down the GPO route as a result.

In the GPO settings I initially tried the standard string you’d expect to launch a Remote Desktop session i.e. mstsc.exe C:\Default.rdp but noticed some strange behaviour:

  • Windows logs in
  • RDP file launched
  • connection starts
  • before the green bar completes i.e. handshake still in progress
  • host session logs out

This looked like behaviour I’ve seen with some other programs in the past, where the launching process appears to terminate mid-way through the actions actually occurring. To check, I tried manually with the “start” command and got the same result. It appears mstsc.exe doesn’t play nicely when launched this way, so we need another approach…

Plan b) was to monitor the mstsc.exe process then log out from the client once RDS disconnected and therefore the process was no longer running. After looking around and trying a few scripts out I settled on one I found here:

Ref: https://www.experts-exchange.com/questions/24218998/Check-if-a-process-is-running-in-vbs.html

Just add the logoff command as the action to run when the monitored process terminates and we have the behaviour we want. It takes a second or two to react to the process closing, but there doesn’t seem to be a way to speed that up as far as I can see.
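
For anyone who’d rather avoid VBScript, a rough PowerShell equivalent of the same idea (launch the RDP file, poll until mstsc.exe has gone, then log off; the RDP file path is an assumption) looks like this:

# Launch the pre-configured RDP session
Start-Process -FilePath "$env:SystemRoot\System32\mstsc.exe" -ArgumentList "C:\Default.rdp"

# Give the client a few seconds to appear before we start polling
Start-Sleep -Seconds 5

# Wait for the RDS session window to close...
while (Get-Process -Name "mstsc" -ErrorAction SilentlyContinue) {
    Start-Sleep -Seconds 2
}

# ...then log the thin client session off
logoff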

Final steps

Now just some finishing touches required to give the solution a bit of polish 🙂

  • set logon and desktop wallpaper
  • disable Task Manager and related lockdown settings

When the machine boots users see the login screen, which is easily customised via GPO with your own branding and wallpaper…

After login connection to RDS is pretty much immediate and no further credential \ security prompts appear…

UWF

The final piece of the puzzle is tidying up after the client has been in use for a while. That’s where the Unified Write Filter from earlier comes in handy:

Enable-WindowsOptionalFeature -Online -FeatureName Client-UnifiedWriteFilter

Then enable the filter:

uwfmgr.exe filter enable
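
You’ll also need to tell UWF which volume to protect and reboot for the filter to take effect; from memory the commands are along these lines, so check them against the docs linked below:

REM protect the system volume so changes are discarded at reboot
uwfmgr.exe volume protect c:
REM review the current and next-session configuration
uwfmgr.exe get-config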

Ref: https://docs.microsoft.com/en-us/windows-hardware/customize/enterprise/unified-write-filter
Ref: https://developer.microsoft.com/en-us/windows/iot/docs/uwf
Ref: https://deploymentresearch.com/Research/Post/632/Using-the-Unified-Write-Filter-UWF-feature-in-Windows-10

And there you have it: a locked-down RDS client that will run on older hardware (Windows 10 works on pretty much anything from the last 10 years), can be managed through your standard AD infrastructure, and uses only stuff you already have access to via your Campus agreement… enjoy!

Server 2016 RDS via Azure AD Application Proxy end-to-end guide

One of our priorities for this year was to improve our remote access offering to staff to enable more flexible working whilst outside of college. Office 365 helps greatly and has already improved functionality in many ways, but there are still some legacy applications and classic file shares that need to be provided remotely too. If at all possible we prefer the files not to leave the network, so some form of virtual desktop looked to be the way to go.

After discounting VMware and Citrix offerings on cost grounds the improvements to Microsoft’s RDS offering in Server 2016 seemed to come at a perfect time.

Even more so now we’ve implemented Azure AD Application Proxy (more on that shortly!). We’ve also recently decommissioned some services, which freed up a bit of physical hardware to “play” with, so away we went!

Server installation

The physical hardware for now is some reclaimed Dell PowerEdge R610 servers; 64GB RAM, dual CPU and 6 x 15k disks in RAID10. That should be plenty for the RDS roles, eventually split across two hosts; for now we’re running on just the one, but even that is more than enough to get started.

We installed Server 2016 Core running the Hyper-V role, which was simple enough. The Core install looks to be a tad more polished in Server 2016; although not new, the sconfig tool got the main settings entered with minimal fuss.

Getting the OS to update correctly wasn’t so simple due to Microsoft doing something silly to the update mechanism in the initial release of Windows 10 1607 and its equivalent Server 2016 release. Update status was stuck on “Downloading”, showing no signs of progressing. In the end manually installing the latest cumulative update from the Microsoft Update Catalog did the trick, e.g.

wusa.exe windows10.0-kb3213986-x64_a1f5adacc28b56d7728c92e318d6596d9072aec4.msu /quiet /norestart

Server roles

With Hyper-V up and running the next stage was to install our guests. We went with 3 VMs set up as follows:

  • Connection Broker \ RD Licensing
  • RD Web Access \ RD Gateway
  • RD Session Host

The original plan was to try and embrace the Server Core concept and only install the GUI where absolutely necessary. With that in mind we made the first two servers with Core and only the Session Host with a GUI. More on that soon… (!)

RDS deployment wizard Role Services

Running the deployment through Server Manager on my desktop was easy going; Microsoft have done good work here and the process doesn’t seem too far removed from the 2012 R2 guides I’ve been looking at online. We assigned each server to its roles as above, got to the final screen, hit the magic Deploy button and then…

"Unable to install RD Web Access role service on server"

Role service... Failed
Deployment... Cancelled

Well that didn’t go to plan! We had a look online for reasons for the failure and went through some initial troubleshooting to make sure all recent updates were installed and each server’s patch level matched exactly; we also enabled PowerShell remoting…

Enable-PSRemoting -force

…still no joy until we found this little nugget of information…

Ref: https://social.technet.microsoft.com/Forums/Sharepoint/en-US/b5c2bae3-0e3b-4d22-b64d-a51d27f0b0e4/deploying-rds-2012-r2-unable-to-install-rd-web-access-role-service-on-server?forum=winserverTS

So it appears the RD Gateway \ RD Web Access role isn’t supported on Server Core. Of course we wouldn’t want the web-facing part of the deployment running on a server with a reduced attack surface, would we Microsoft… not impressed!

Ref: https://technet.microsoft.com/en-us/library/jj574158(v=ws.11).aspx

To confirm the hypothesis, running Get-WindowsFeature on Server 2016 Core gives this…

Server Core

and on Server 2016 with GUI gives this…

Server with GUI

Published names & certificate fun and games

After begrudgingly re-installing one of the VMs with a GUI (seemed quicker than trying to convert the Core install) we managed to get past the final Deploy page with 3 success bars 🙂

The first key setting we were asked for was the external FQDN for the RD Gateway, which was added to our ISP-hosted DNS records. We use a wildcard certificate to cover our external-facing SSL needs (nothing out of the ordinary there) and went on to apply it to each of the four roles specified by the RDS deployment wizard. A Session Collection was created for a test group and pointed at the new Session Host. All looking promising.

The RD Gateway FQDN naming in itself wasn’t a problem but led us to an interesting part of the setup relating to SSL certificates and domains. Once we had the RDS services accessible from outside the network (see below) I fired up my 4G tethering to give it a test.

The connection worked but threw up a certificate warning and it was obvious to see why. Our wildcard certificate is for *.domain.ac.uk but the Connection Broker’s published FQDN is servername.subdomain.domain.ac.uk and therefore isn’t covered.

Fortunately a Powershell script called Set-RDPublishedName exists to change this published name and works a treat! Grab it from https://gallery.technet.microsoft.com/Change-published-FQDN-for-2a029b80
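
Usage is about as simple as it gets; from memory the script just takes the new client access name as a parameter, so something along these lines (FQDN made up):

.\Set-RDPublishedName.ps1 -ClientAccessName "rds.domain.ac.uk"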

You’ll also need to ensure that you can access the new published name internally; depending on how your internal domain compares to your external one, you may need to do a bit of DNS trickery with zones to get the records you need. More on that can be found at:

Ref: https://msfreaks.wordpress.com/2013/12/09/windows-2012-r2-remote-desktop-services-part-1
Ref: https://msfreaks.wordpress.com/2013/12/23/windows-2012-r2-remote-desktop-services-part-2

Set-RDPublishedName script in action

External access via Azure AD Application Proxy

We published the RD Gateway and RD Web Access via our new shiny Azure AD Application Proxy for a few reasons…

  • simplicity, no firewall rules or DMZ required
  • security, leverages Azure to provide the secure tunnel
  • SSO, use Kerberos Delegation to sign into RD Web Access as part of the user’s Office 365 login

I followed the excellent guides from Arjan Vroege’s blog for this, in particular the section regarding how to edit the RD Web Access webpage files… nice work Arjan!

Publish your RDS Environment with Azure and AD Proxy – Part 1 – http://www.vroege.biz/?p=2462
Publish your RDS Environment with Azure and AD Proxy – Part 2 – http://www.vroege.biz/?p=2563
Publish your RDS Environment with Azure and AD Proxy – Part 3 – http://www.vroege.biz/?p=2647

As per my previous post on Azure AD Application Proxy & Kerberos delegation use the command below to add the SPN record (replace the FQDN and server name as appropriate)

setspn -s HTTP/servername.subdomain.domain.ac.uk servername

When done the end result is a seamless login to RD Web Access via the Azure AD login page. In our case the link will eventually end up as a button on our Office 365-based Staff Intranet, therefore not requiring any further logins to get to the RDWeb app selection screen.

I particularly wanted to avoid the RDWeb login screen which, amazingly, in 2017 still requires DIY hacks to avoid having to log in with the DOMAIN\username format. I thought Microsoft would’ve improved that in the Server 2016 release but evidently not.

One more gotcha

So having done all the hard work above preparing the login all that was left was to click the Remote Desktop icon and enjoy, right? Wrong.

After running the Set-RDPublishedName script the certificate warning went away and I could see the change to the new wildcard-friendly name; however the connection attempt now failed with the error “Remote Desktop can’t connect to the remote computer *connectionbrokername* for one of these reasons”.

connection failure after changing Published Name

Neither explanation made any sense, as the connection was working perfectly fine until we changed the Published Name. Indeed, changing it back to the original FQDN of the Connection Broker restored service, so it had to be something to do with that. After being stumped initially I came back after food (always helps!) and after a bit more research found this very helpful post:

Ref: https://social.technet.microsoft.com/Forums/windowsserver/en-US/4fa952bc-6842-437f-8394-281823b0e7ad/change-published-fqdn-for-2012-r2-rds?forum=winserverTS

It turns out the new FQDN we added when changing the Published Name needs to be added to RDG_RDAllConnectionBrokers Local Computer Group.

This group is used to approve connections in the Resource Authorization Policies (RD-RAP) section of RD Gateway Manager. By default only the server’s domain FQDN is present in the list (as you’d expect) so it appears unless you add the new Published Name in there the connection attempt gets denied.

To add your external published name follow these steps:

  • Server Manager > Tools > Remote Desktop Services > Remote Desktop Gateway Manager
  • expand your RD Gateway server > Policies > Resource Authorization Policies
  • Click Manage Local Computer Groups on the right hand pane
  • Select RDG_RDConnectionBrokers > Properties
  • Click the Network Resources tab
  • type the FQDN of the Published Name you supplied to the Powershell script earlier then click Add
  • OK all the way out then try your connection again

RD Gateway Manager

The example below replaces the real server names with dummy entries but should illustrate the concept. The same scenario applies if your servers exist in a .local Active Directory domain (which will be the top entry) and your external domain is something different (again, remember to sort out internal DNS zone entries to suit).

Manage RDG_RDCBComputers group

Finishing touches

Once all the above is done you should get a connection. There is one seemingly unavoidable credential prompt due to Microsoft persisting with an ActiveX control to start the RDP session, but perhaps one day they’ll update it (we live in hope). It seems you can use the UPN-style format here, which is handy as it keeps things consistent, and in a way it’s a bit of an extra security measure so not the end of the world.

Now the connection itself is sorted out all that’s left is to tweak the Session Host to our requirements. This guide gives some nice pointers on locking down the server via GPO:

Ref: http://www.it.ltsoy.com/windows/lock-down-remote-desktop-services-server-2012

We also push out a custom Start Menu using the newer Windows 10 1607 GPO settings along with the Export-StartLayout command. Finally, install any programs required, remembering to change the mode of the server first:

Ref: https://technet.microsoft.com/en-us/library/ff432698.aspx

change user /install

Then once done

change user /execute

Now enjoy 🙂

Connection to Server 2016 RDS Session Based desktop via RD Web Access \ RD Gateway

Azure Active Directory Application Proxy installation and troubleshooting

Recently we decided to migrate away from our legacy reverse-proxy product to something that would integrate better with our AD \ Office 365 systems. I’ve wanted to try out Azure AD Application Proxy for a while since seeing it in beta last year so this seemed a good time to get to grips with it. This post outlines a few gotchas to watch out for and some useful background reading.

Let’s start off with the initial Microsoft documentation available here

https://docs.microsoft.com/en-us/azure/active-directory/active-directory-application-proxy-get-started

Education freebies

Although Microsoft’s recent price hikes haven’t come at a good time for us in education, we do get a lot of extras thrown into our Microsoft licensing agreement. One of the lesser-known ones is Azure AD Basic, which is the minimum requirement to use Azure AD Application Proxy – see the comparison chart at https://www.microsoft.com/en-cy/cloud-platform/azure-active-directory-features for more info.

To get your free licenses you’ll need to get in contact with your EES reseller and they’ll get them added to your tenant in a similar way to Office 365.

Applying the Azure AD Basic license is nice and simple: go to your Azure Management portal at https://manage.windowsazure.com, select your Azure AD directory, then assign suitable groups to the license. What’s handy is that if you’re using Azure AD Connect to sync from your on-prem directory, any new users will get automatically licensed as they come on board.

Installation

Next step in the documentation list is here:

https://docs.microsoft.com/en-us/azure/active-directory/active-directory-application-proxy-enable

I used two dedicated Server 2012 R2 VMs for our install; the connector will be installed on each so we have failover should it be required at some point. Enabling the Application Proxy in Azure is nothing more than one click in the portal.

Now in theory the installation should be straightforward: nothing more than downloading the installer from the link, signing in with admin credentials and job done. However, if everything went that smoothly this blog wouldn’t exist (!)

Troubleshooting 403 Forbidden errors

At the end of the installation the wizard helpfully offers to run a troubleshooter to check all is well but in fact all was far from well…

Checking Event Viewer threw up the following errors:

  • Event ID 32012
    The Connector update using the update service failed: ‘The remote server returned an error: (403) Forbidden.’. Check your firewall settings.
  • Event ID 12020
    The Connector was unable to connect to the service due to networking issues. The Connector tried to access the following URL: ‘https://***GUID***.bootstrap.msappproxy.net:8080/’

Outbound firewall settings were already configured to allow all the ports asked for in the documentation, the proxy was disabled in Connection Settings and the firewall didn’t register any outbound traffic being blocked, so what was going on here? The mystery deepens…

Although the wizard only offers to run the troubleshooter once you can run it again manually by launching it from:

C:\Program Files\Microsoft AAD App Proxy Connector\ConnectorTroubleshooterLauncher.exe

Troubleshooting the troubleshooter

Although there’s a fair bit of documentation in the troubleshooting section on Microsoft’s pages none of it referred to this particular error. Google didn’t have much to go on either but did throw up some useful and detailed slides from the Ignite conference that are well worth a read:

Ref: https://channel9.msdn.com/Events/Ignite/2015/BRK3864
Ref: https://techcommunity.microsoft.com/t5/Microsoft-Ignite-Content/BRK3139-Throw-away-your-DMZ-Azure-Active-Directory-Application/td-p/10675

The second link references another useful document aimed purely at troubleshooting:

Ref: http://aka.ms/proxytshootpaper

Whilst searching I stumbled across an email contact for the Microsoft Azure AD Application Proxy team

aadapfeedback@microsoft.com 

so I dropped them a message with the errors I was encountering. The team replied almost instantly and initially suggested ensuring that the following updates were applied on the server:

https://support.microsoft.com/en-us/kb/2973337
https://support.microsoft.com/en-us/kb/2975719

Proxy proxy proxy!

However still no joy even with everything present as it should be. The next recommendation was to check if I was using a proxy server for outbound connections. We do have one but it’s not used for server VLANs and is the first thing I disable on a new VM build.

However, I was intrigued enough to check the outbound traffic via TCPView… lo and behold, there was the proxy server trying to take the outbound connections and failing miserably. It seems that despite everything in the operating system suggesting traffic should be going out directly, the Connector was still trying to use the proxy route instead.

Ref: https://blogs.technet.microsoft.com/applicationproxyblog/2016/03/07/working-with-existing-on-prem-proxy-servers-configuration-considerations-for-your-connectors/

The solution is in this document under the section “Bypassing outbound proxies”, which basically involves adding these lines to the .config files for both the Connector and Updater services:

<system.net>
  <defaultProxy enabled="false"></defaultProxy>
</system.net>

Checking Event Viewer and the Azure Portal afterwards showed success; my Connectors were now up and running with nice green icons. Much better 🙂

Note: even though this fix resolves the issue the current version of the Troubleshooter doesn’t seem to follow the settings in the .config files and will still report connection failures. The Azure AD Application Proxy team are aware of this and are aiming to have a new version out soon.

Additional considerations

There are a few other points to bear in mind when you’re completing the configuration of the application proxy. None of them are major issues but it’s good to have everything ready before you start…

Certificates

Once the Connectors were up and running the rest of the process went smoothly, although note that you will need a wildcard certificate if you want to publish your applications via a “vanity” URL, i.e. your own domain rather than “msappproxy.net”.

Using the vanity domain and some DNS CNAME records means that if you use Office 365 SharePoint for your Intranet your internal applications can work from the same URL both inside and outside.

Setting SPNs for Kerberos SSO

Even better, those internal apps can SSO based on the Office 365 initial sign-on for a suitably slick user experience! This does require a bit more configuration with Kerberos delegation but it’s not too bad.

When setting the SPN records I remembered the gotcha from when I worked on Dynamics CRM: type the command in manually rather than pasting it… bizarre as it is, the same still applies!

Using the -S switch worked well for me:

setspn -s HTTP/yourserver yourserver

Ref: https://blogs.msdn.microsoft.com/saurabh_singh/2009/01/08/new-features-in-setspn-exe-on-windows-server-2008/

Nested groups

Finally, bear in mind that if you’re using groups created natively in Azure AD you can’t nest memberships when creating application assignments, which is a shame. As a workaround, create any nested groups in your local AD instead and sync them up via Azure AD Connect, or just create flat groups in Azure AD if you prefer to work solely up there.

Ref: https://docs.microsoft.com/en-us/azure/active-directory/active-directory-accessmanagement-manage-groups

Application links

You can either publish your application links via your Intranet or users can browse them via the portal (I’ve linked to the new makeover version as it looks much better than the previous one in my opinion)

https://account.activedirectory.windowsazure.com/r#/applications
