OneDrive Files on Demand – first steps

After much anticipation (and plenty of playing with Windows Insider previews), OneDrive Files on Demand finally hit general release alongside Windows 10 1709 (Fall Creators Update) the other week. I’ve been giving it a test drive over the past week or two along with fellow network tech Matt Stevens – here are a few of our observations so far, along with workarounds for a couple of teething issues.

Windows 10 build

There is one pretty important requirement to bear in mind with the new Files on Demand feature: it’s only available in build 1709 and above. That means you need to be on the Semi-Annual Channel (aka CB) branch rather than the LTSB route that some people have taken.

Ref: https://blog.juriba.com/windows-10-branching-timeline
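
A quick way to confirm which branch build a machine is actually running – the ReleaseId value comes back as 1703, 1709 and so on:

(Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion').ReleaseId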

It’s new features like Files on Demand that make the additional work of staying up-to-date worthwhile. So far we have a couple of hundred laptops running 1703 without too much fuss, so 1709 should slot in fairly smoothly given that we now build our images layer-by-layer using only the pure Microsoft WIM as a starting point.

We tamed (nuked) the built-in apps via a very handy PowerShell script we found online (also see the alternative version here) that runs during MDT deployment, and the Start Menu default tiles are cleaned up via a GPO layout file. Configure your Windows Store for Business (or Education, as the case may be), tweak a few more policies for Cortana, telemetry etc. and Windows 10 becomes much more manageable, even on the latest build.
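
If you’re curious what those scripts actually do under the hood, here’s a minimal sketch of the approach: remove the provisioned packages so new profiles never receive them. The package names are purely illustrative – dump the full list on your build first and make your own hit list:

# Illustrative names only – list everything first with: Get-AppxProvisionedPackage -Online
$unwanted = 'Microsoft.ZuneMusic', 'Microsoft.ZuneVideo', 'Microsoft.BingWeather'
Get-AppxProvisionedPackage -Online |
    Where-Object { $unwanted -contains $_.DisplayName } |
    Remove-AppxProvisionedPackage -Online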

Why Files on Demand?

If you don’t know what all the fuss is about check out the initial Insider announcement:

Ref: https://blogs.windows.com/windowsexperience/2017/06/13/onedrive-files-demand-now-available-windows-insiders/#kwLbqguOTefId6pv.97

Ref: https://blogs.office.com/en-us/2017/05/11/introducing-onedrive-files-on-demand-and-additional-features-making-it-easier-to-access-and-share-files/?eu=true

What it basically means is that we can finally integrate (huge amounts of) cloud storage with our on-premises desktops in a much tighter fashion and dispense with (unsupported) scripts or (expensive) third-party tools to access OneDrive on a Windows desktop using File Explorer. It also means not having to deal with WebDAV, which always felt a horribly dated and clunky protocol for accessing cloud storage.

As soon as the 1709 ISO hit VLSC I grabbed it from Microsoft, slotted the new WIM into one of my MDT Task Sequences and deployed a VM to give the production version a try. It shows much promise but, as always, there are gotchas that mean nothing is ever quite straightforward.

Client version

Microsoft being Microsoft, there’s always one shoot-self-in-foot moment whenever a new product comes out, and this release was no exception. Despite having the freshly downloaded 1709 ISO, I noticed that on first launch the client was showing up as the 2016 version and not the latest 2017 build (17.3.7076.1026) that brings in Files on Demand:

Ref: https://support.office.com/en-gb/article/New-OneDrive-sync-client-release-notes-845dcf18-f921-435e-bf28-4e24b95e5fc0


that’s the one that you want…

There’s a useful summary of the client install \ update process below. It does strike me as odd that the client self-updates and installs from AppData rather than being managed by WSUS.

Ref: http://deploynovellas.com/2016/05/25/install-onedrive-ngsc-update-windows-10-osd

It also takes a while to update when deployed on a clean 1709 build because the bundled client is out of date. This means that if a user is a bit too quick off the mark they can end up with an old-school full sync rather than Files on Demand.
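
To see which client version a machine has actually ended up with, check the per-user install location mentioned above (a one-liner sketch assuming the standard per-user install path):

(Get-Item "$env:LOCALAPPDATA\Microsoft\OneDrive\OneDrive.exe").VersionInfo.ProductVersion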

I’ve been trying to replace the client during the deployment Task Sequence but more testing is required as my initial attempt failed with “Application Microsoft OneDrive 17.3.7073.1013 returned an unexpected return code: 1”.

Ref: http://model-technology.com/next-gen-onedrive-deployment-during-sccm-osd

I’ve added a Suspend action to the Task Sequence and will examine the logs to see what’s going on as the script tries to run…

Group Policy

To get more control over how the client is used, grab the updated Group Policy templates from the local installation folder %localappdata%\Microsoft\OneDrive\BuildNumber\adm\

Ref: https://support.office.com/en-gb/article/Use-Group-Policy-to-control-OneDrive-sync-client-settings-0ecb2cf5-8882-42b3-a6e9-be6bda30899c
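
To make the templates available domain-wide, copy them into your Group Policy central store. A rough sketch below – the versioned build folder changes with each client update and the SYSVOL path is an example, so adjust both to suit (file names are as shipped at the time of writing):

# Grab the adm folder from the newest versioned OneDrive build directory
$adm = Get-ChildItem "$env:LOCALAPPDATA\Microsoft\OneDrive" -Directory |
    Where-Object { $_.Name -match '^\d' } | Sort-Object Name -Descending | Select-Object -First 1
$store = '\\yourdomain.local\SYSVOL\yourdomain.local\Policies\PolicyDefinitions'
Copy-Item "$($adm.FullName)\adm\OneDrive.admx" $store
Copy-Item "$($adm.FullName)\adm\OneDrive.adml" "$store\en-US"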

We force Files on Demand to be enabled as we don’t want sync cache eating up drive space on machines. We also configure our tenant ID (found via the Azure AD portal) so only Office 365 accounts can be used.

Configure these under Computer Configuration > Administrative Templates > OneDrive

  • Allow syncing OneDrive accounts for only specific organizations > Enabled (using Tenant ID)
  • Enable OneDrive Files On-Demand > Enabled
  • Silently configure OneDrive using the primary Windows account > Enabled
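
If you want to test on a single machine before the GPO lands, the settings map to registry values under HKLM\SOFTWARE\Policies\Microsoft\OneDrive. The value names below are as I read them from the ADMX, so verify against your template version (and swap in your real tenant GUID):

$key = 'HKLM:\SOFTWARE\Policies\Microsoft\OneDrive'
New-Item -Path $key -Force | Out-Null
Set-ItemProperty -Path $key -Name 'FilesOnDemandEnabled' -Value 1 -Type DWord
Set-ItemProperty -Path $key -Name 'SilentAccountConfig' -Value 1 -Type DWord
# Allowed tenants live in a subkey; name and data are both the tenant GUID (placeholder below)
New-Item -Path "$key\AllowTenantList" -Force | Out-Null
Set-ItemProperty -Path "$key\AllowTenantList" -Name '00000000-0000-0000-0000-000000000000' -Value '00000000-0000-0000-0000-000000000000'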

I need to check that our third-party identity provider supports ADAL to make sure that last GPO setting works correctly. In the future we may well move to Azure AD Connect Pass-through Authentication instead.

Clearing local cache (Free up space)

One important thing to remember about Files on Demand is that when a file is either downloaded from the cloud or freshly uploaded to it, a cached copy is kept on the local machine.

Over time (or after a large upload) this cache could grow and cause exactly the issues we were trying to avoid, especially on a shared machine with large numbers of users (pretty much the case for all our classroom machines).

At present no policies seem to exist to force the “Free up space” option that removes the cached copies of files. However, the article below suggests that the new file attributes introduced with 1709 can be used to automate the process:

“Attrib.exe enables 2 core scenarios.  “attrib -U +P /s”, makes a set of files or folders always available and “attrib +U -P /s”, makes a set of files or folders online only.”

Ref: https://techcommunity.microsoft.com/t5/OneDrive-Blog/OneDrive-Files-On-Demand-For-The-Enterprise/ba-p/117234

We tried a script that runs on the root OneDrive folder and sure enough it resets all files back to Online only and reduces the space used down to a megabyte or so 🙂

cd "%userprofile%\Onedrive - Name of your Organisation"
attrib +U -P /s

Running this script at logoff should, in theory, keep the cached files down to the bare minimum.

Disclaimer: we only just figured this one out today, so again caveat emptor if you go and run this in production without testing it first!

Future Decoded 2017 highlights

Today I took a trip down to ExCeL London for Microsoft’s annual Future Decoded conference. As always it proved an interesting showcase of their future vision and a chance to gain technical insights into current and upcoming projects. Here are a few of my take-aways from the day…

Deploying Windows 10 with Autopilot

Although I’d read a bit about this a while back, it was useful to see the Windows 10 Autopilot deployment process in action and the rationale behind using it. Given that we have been deploying some pilot Windows 10 devices to staff, it should in theory help speed up the initial out-of-box process for devices that we predominantly see as cloud-managed and want to hand out without too much fuss.

Future Decoded slides: https://www.futuredecoded.com/session/fd76e051-a6a9-e711-80c2-000d3a2269dd

Ref: https://docs.microsoft.com/en-us/windows/deployment/windows-10-auto-pilot

For me this method will be applied to devices that will spend more time off the main AD network than on it and will likely have fairly simple requirements for pre-installed software. My colleagues in the office will also be pleased to hear Autopilot helps to skip the initial talking Cortana screen that’s been heard many a time so far during testing (!)

However the next part, and the real power of the “Modern” deployment method being showcased, requires Intune in order to set up full profiles with customisable apps, settings etc. Although an MDM solution is on my wish list to get more control over roaming mobile devices, it’s another software subscription bolt-on, so making it an almost-necessary part of the Modern deployment experience sits a bit uneasily with me.

Another useful piece of advice was to check out Windows Analytics to help prepare for our Win10 migration project, which I need to have a proper look at tomorrow.

Ref: https://www.microsoft.com/en-us/WindowsForBusiness/windows-analytics

Microsoft Hands On Labs

During the breakout sessions there were plenty of Surfaces set out on the 3rd floor running “Hands On” lab training materials. These looked like they’d be perfect for students on IT courses to use for trying out Azure etc. rather than needing access to a physical lab or trial accounts in a live environment.

The content covers Windows 10, Office 365 and Azure, so it’s perfect either for keeping your own skills up to date or for providing students with a good few hours’ worth of e-learning material – interactive too, because you actually configure VMs rather than just watching videos.

Check them out at https://www.microsoft.com/handsonlabs

All you need is some form of Microsoft account to log in with and away you go 🙂


here’s one I made earlier…

Security & ATP

One thing 2017 will certainly be remembered for in the tech world is the high profile ransomware attacks that have brought home the realities of modern malware threats to a much broader audience than perhaps ever before. As such the session on Advanced Threat Protection was particularly interesting.

Future Decoded slides: https://www.futuredecoded.com/session/f6204a3e-e5a8-e711-80c2-000d3a2269dd

We were also recommended to check out the NCSC presentation from yesterday, another one for tomorrow’s reading list:

NCSC slides: https://www.futuredecoded.com/session/e1382eb1-01a9-e711-80c2-000d3a2269dd

The ATP offering now covers email, endpoint and Azure-based analytics. Moving to Windows 10 (1709) brings additional security and exploit protection such as:

  • Windows Defender Application Guard
  • Windows Defender Exploit Guard (the successor to EMET, for those who remember it from the Windows 7 days)

Ref: https://www.microsoft.com/en-us/windowsforbusiness/windows-atp
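
As a taster of what 1709 brings, the Exploit Guard mitigations can be inspected and set via PowerShell. A quick sketch (the cmdlets ship with 1709; notepad.exe is just an illustrative target):

# View the system-wide exploit mitigation defaults
Get-ProcessMitigation -System
# Example: enforce DEP and SEHOP for a specific executable
Set-ProcessMitigation -Name notepad.exe -Enable DEP,SEHOP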

All of this sounds great until the dreaded “l” word comes around… yup, it’s licensing. None of these services grow on trees, but there’s only so far budgets can stretch, particularly for us Education users. One thing that’s a real problem for Education in particular is that all the new cloud-first offerings are sold solely on a per-user basis rather than the fairer per-FTE-staff method used for our on-prem EES-licensed products. Costs can soon spiral upwards and make some of these offerings (Azure AD Premium, I’m looking at you!) almost unobtainium.

A small plea to the powers that be…

If someone from Microsoft happens to end up reading this just think of it this way… in Edu we want to make use of these new solutions and embrace the tech that’s on offer to help provide the best environment we can for users.

I’m not saying we expect Microsoft to give it all away for free (although we’d be more than happy if you’re feeling generous!) but realise that we need to protect student accounts and machines just as much as staff ones, and paying for a 5000-seat EMS or ATP setup is simply impossible. The end result: everyone loses (well, perhaps not if you’re Google, who are working hard to take the Edu market if Microsoft don’t want it for some reason). So please rethink these pricing models and help make them work for non-profits as well.

Windows Mixed Reality

Towards the end of the day I went to the Mixed Reality stand to try out the new headsets, which sit in a much more affordable price range than the incredibly-cool-but-very-pricey HoloLens. We’re currently building a new campus for construction and engineering so I was interested to see if Mixed Reality could fit in there.

https://www.microsoft.com/en-us/store/collections/vrandmixedrealityheadsets

Having tried a Lenovo headset with its associated controllers I’m impressed! Whilst VR headsets \ Google Cardboard made that first step, there was still a disconnect in terms of interacting with the world you were immersed in; the hand-held controllers take this a step further and bring you more into the 3D virtual environment.

The out-the-box demo of walking around a house picking up and manipulating objects showed potential for me, as I can imagine students being able to design in 3D using something like Maya and then showcase those objects in a virtual environment using Mixed Reality.

The idea of pinning multiple virtual screens, opening Windows apps and working through the headset is also intriguing, although I suspect it needs 4K lenses for longer periods of use than the 2K ones being fitted into the kit at present.

The demo finished off with a rather addictive Space Invaders-style game using the VR controllers. Anyone with a PlayStation VR or similar has no doubt already experienced something similar and more, but it’s good to see an attempt to bring the technology into productivity tools as well. One of the opening keynotes focused heavily on HoloLens and Mixed Reality, so it does seem Microsoft are really going for this area of the market.

It’s also another reason to go down the Windows 10 (1709) route as these features are only available on the new Fall Creators Update.

Fail of the day

However Microsoft wouldn’t be Microsoft if they didn’t shoot themselves in the foot from time to time. At the first Future Decoded it was the irony of queuing at a tech event to collect a piece of paper, but today’s award moves the bar up a notch… step forward the Future Decoded app!

At an event where you spend the whole day watching cutting-edge Azure cloud technology Microsoft hired an external company to make possibly the worst conference app I’ve ever used…

  • slow to load and required registration to view even basic content – why MS would need that data is beyond me, as they spend all day scanning your badge as you move between rooms
  • website scraping to populate the app content – if I wanted a web page I’d open it directly
  • a seminar sessions list that had to be manually filtered per day (looks like a GETDATE function was too difficult to implement?)
  • but the worst and most irritating was the “My Agenda” planner, which didn’t generate a personal agenda at all and just scraped the keynote details from the website… hopeless

Maybe next year get some of your in-house people to showcase those cutting-edge Azure technologies via the app, but whatever you do, don’t bring this one back!

Save yourself from insanity: Blackmagic software installer

After 3 years of longingly looking at Blackmagic’s stand at various AV events our wishes have been granted and we’re now proud owners of an ATEM Production Studio 4K, HyperDeck Studio Minis and a DeckLink card 😀

In preparation for the new kit I’ve rebuilt our main streaming machine, which runs vMix HD, as it needed a bit of freshening up. It now runs Windows 10 LTSB with some added local storage and Google Drive File Stream for longer-term video archives (may as well make use of that unlimited Google Drive!)

Software install

Installing the DeckLink card looked pretty straightforward: find a PCI-E slot with x8 support, pop some software on et voilà. But (you know what’s coming next)… nothing is ever as easy as it seems.

The software installer bombed out shortly after it started “trying” to install, with this:

“Blackmagic Design Desktop Video Setup Wizard ended prematurely because of an error”

The fix

Having had a look around there are a few reports of the error on the Blackmagic forums but no solutions listed.

Having noticed the installer was an MSI, I thought I’d give it a go via the command line instead:

msiexec /i "Desktop Video Installer v10.9.3.msi" /qb

Quelle surprise, it installed perfectly! I’m not sure what the installer GUI is trying to do that makes the process fail, but everything is there using the msiexec method – software and drivers all looking good.
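
If the silent install still fails for you, msiexec’s verbose logging will usually show which action falls over (the log path here is just an example):

msiexec /i "Desktop Video Installer v10.9.3.msi" /qb /l*v "%TEMP%\DesktopVideo.log"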

MDT imaging megapost – part 1 (our first server)

The great thing about working in the tech field is that it keeps moving on, ever changing, always evolving. That means sometimes you have to let go of systems that were once the bright shining light of progress when it becomes apparent something better has taken their place. Now is that time for my trusty ZCM 11 custom imaging system; built back in 2013 and star of a 6-part thread series that I now look back on and think “wow, I actually did that”.

Until I moved imaging onto a Satellite, the stats say the original Primary server pushed out over 5000 images. Given the length of time the Satellite has been in place, plus the stats from our other sites, that figure can easily be doubled – over the course of 4 years around 10,000 image cycles have been completed.

Compared to the previous process a huge amount of time was saved, allowing us to complete a large-scale Windows 7 migration with relative ease. Add to that a 4-year saving on ENGL licence costs and my motley crew of Bash and PowerShell scripts can retire with the satisfied feeling of a job well done 🙂

The future calls, and it’s shaped like the number 10…

However we need to move on; funnily enough it’s another OS migration knocking on the door that prompted the change, along with a shift in hardware and environment that meant the Linux-based PXE environment was starting to hold us back.

Windows 10 support in ZCM seemed patchy at best, as was timely support for new hardware such as Surfaces and their ilk. Reading the forums and email groups didn’t inspire much confidence either, so we decided to start looking elsewhere.

SCCM was the natural direction of travel but, having made a substantial investment of time creating ZCM Bundles, we weren’t necessarily ready to move all that just yet. Similarly, ZCM Patch Management works pretty well these days for covering our third-party apps. With that in mind the Microsoft Deployment Toolkit was the obvious choice.

A nice GUI-based managed scripting environment with Windows PE as the underlying OS ticked all the boxes. Oh and did I mention it’s free!

It’s time for my own MDT… Massive Deployment Thread!

What originally started as a small side-project to push Windows 10 out to a couple of trial tablets has now expanded into a core system that’s been at the heart of our summer works. With that in mind it’s time to write up the journey and the numerous tips, tricks and tools used along the way.

Many of those ideas come from some of the best deployment pros in the business such as Johan Arwidmark, Michael Niehaus and Mikael Nystrom so a big shout out for all the knowledge they share. Hopefully this post will give an idea of how we put those pieces together in a live environment.

The beginning, our first server

Initially we started out deploying MDT for the sole purpose of imaging a batch of demo Surface 3 devices, so the first job was to spool up a new VM with all the required software and roles installed. Links can be found below to save you some time:

Early fixes and customisations

After getting the basic Deployment Share running we hit a few minor issues that needed resolving, which are worth bearing in mind:

Multiple DNS namespaces

We have two domains in use internally, one of which usually gets appended as part of the domain join process and the other via DHCP.

In the PE environment the machine isn’t domain-joined, and as such the default setting in Bootstrap.ini wouldn’t connect to the deployment share as it didn’t know the correct DNS suffix to append.

Ref: https://scottisageek.wordpress.com/2011/12/22/mdt-2010-and-multiple-dns-namespaces/

…we found it quicker in our case to change the DeployRoot setting to use the MDT server’s FQDN rather than the short name… problem solved 🙂
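
In practice that’s a one-line change in Bootstrap.ini – the server and share names below are placeholders for your own:

[Default]
DeployRoot=\\mdt01.yourdomain.internal\DeploymentShare$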

Share permissions

The default permissions applied to the Deployment Share by the installation wizard weren’t set up as we liked. I can’t remember the exact reason now, but looking back at documentation on other sites I think the share needed locking down to prevent users viewing the Deployment Share content or (even worse) making unauthorised changes to it (!)

We now have specific AD groups and a service account set up so that nominated MDT Administrators can read \ write to the share to upload application install files etc., the imaging account (more on that later) can only read, and all other users are denied access by virtue of having no rights.
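
If you’d rather script the share side of it than click through the GUI, something along these lines works (group and account names are hypothetical, and remember the NTFS permissions need setting to match):

Revoke-SmbShareAccess -Name 'DeploymentShare$' -AccountName 'Everyone' -Force
Grant-SmbShareAccess -Name 'DeploymentShare$' -AccountName 'DOMAIN\MDT-Admins' -AccessRight Full -Force
Grant-SmbShareAccess -Name 'DeploymentShare$' -AccountName 'DOMAIN\svc-mdt-imaging' -AccessRight Read -Force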

Set UK Locale

A quick and easy tweak sets up the keyboard settings for UK users in Bootstrap.ini

Ref: http://kabri.uk/2010/01/20/sample-bootstrap-ini-for-uk-deployments/

Similarly, set them in CustomSettings.ini too

Ref: https://scriptimus.wordpress.com/2011/06/23/mdt-2010-sample-customsettings-ini-for-fully-automated-deployments/
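
For reference, the UK-specific lines from those samples boil down to something like this in CustomSettings.ini (property names as per the MDT documentation – double-check against your MDT version):

[Default]
UILanguage=en-GB
UserLocale=en-GB
KeyboardLocale=en-GB
TimeZoneName=GMT Standard Time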

There are quite a few other settings you’ll want to add in CustomSettings.ini but more detail on those will follow in relevant posts so keep your eyes peeled!

Update the Deployment Share

This is one action you’ll soon need to get into the habit of! If you make changes to the settings in any of the .ini files, or add drivers that you’ll need in the PE environment (basically network and storage), then you need to update the Deployment Share.

This recompiles the Boot Images to include your changes, otherwise you’ll find all those nice new additions above make no difference whatsoever!
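
The update can be scripted too – MDT ships a PowerShell module, which is handy if you find yourself doing this often. The paths and drive name below are the usual defaults, so adjust to suit:

Import-Module 'C:\Program Files\Microsoft Deployment Toolkit\bin\MicrosoftDeploymentToolkit.psd1'
New-PSDrive -Name 'DS001' -PSProvider MDTProvider -Root 'D:\DeploymentShare'
# -Force regenerates the boot images completely rather than optimising
Update-MDTDeploymentShare -Path 'DS001:' -Force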

Think of this as step 1 of 2 in completely updating the Boot Images, though. If the MDT wizard says the Boot Images have changed you also need to copy the new WIMs over to WDS so PXE boot is using the latest images.

In WDS, browse to your server, select Boot Images, then right-click and click Replace Image. Browse to your Deployment Share’s Boot folder and select the correct image for each architecture.

Windows Deployment Services service won’t start

At an early point in our testing WDS decided it didn’t want to start after a server reboot and was spewing error code 0x906. We weren’t sure why and were on the verge of reinstalling from scratch when I spotted this:

Ref: https://social.technet.microsoft.com/Forums/windows/en-US/265b4b53-63ac-491f-817c-6030daa39b81/cant-start-windows-deployment-services-service?forum=itprovistadeployment

As per Aaron Tyler’s advice in the link above, run the wdsutil commands below to uninitialize the server, then reinitialize it manually, pointing at the RemoteInstall folder WDS creates.

wdsutil /uninitialize-server
wdsutil /initialize-server /reminst:[PATH_TO_REMOTEINSTALL_DIRECTORY]

Next time…

That should be enough to get your first server up and running. For the second post in the series we’ll look at the MDT Database and how it turns MDT from a good imaging solution into a great one 🙂

Joomla adventures – rebuilding a community (part 2)

This post in the web development mini-series focuses on the products and tools used in building and refining the new Metropower site http://www.metropower.info, along with some lessons learned along the way.

Managing the project and resources

Early on in the build process it became apparent that our Facebook chat group wasn’t going to be sufficient to keep track of all the work that needed to be done to get the new site live.

To solve this I took another tip from my professional life, using Trello to manage tasks amongst the admin team and track the progress of content being migrated from the old site to the new. For small projects it’s free – perfect!

We only needed one board to manage the website project, although I have started another one post-launch to keep track of bugs and website improvements. I initially wanted to use a bug tracker such as http://www.flyspray.org/ but ran out of databases on our hosting plan, so that’s on hold for the time being.

We also decided to set up a centralised Google account to store archive data, purchased plugins and so on. A Google account made sense as it would also serve as the account for Analytics, and Drive has been handy as a secondary backup location too.

Social media and analytics

A few years ago one of the admin team set up a Facebook group as a first foray into social media; now we have coverage across a wide range of platforms:

  • Facebook Page – allows us to post “officially” as Metropower
  • Facebook Group – social discussion board that sees a lot of traffic that the forum used to serve. Excellent for quick responses but not so good for reference topics.
  • Facebook For Sale \ Wanted Group – saves our members paying eBay fees when trading parts between members (!)
  • Instagram – this used to be really powerful but its appeal is a lot more limited now that the TOU have changed and we can’t embed a hashtag gallery on our website
  • YouTube – event videos etc. but needs a bit of work for suitable content and branding

I also use Google Analytics to track site usage having completed a very well-timed training session at work on how to track campaigns and analyse user interaction.

We purchased a couple of Joomla plugins to pull dynamic content from social media onto the website. For example all events are managed via Facebook then embedded into the website so we only need to update content in one place. Using social media on the front page helps to keep it fresh but does come at a cost, more on that below…

Monitoring

If the site goes down for some reason I need to know about it, and being used to tools like PRTG Network Monitor at work I wanted something similar for the site. Fortunately there are lots of high-quality, business-grade tools out there that are free for personal use – I use two of them to make sure we’re covered:

After some issues with registration emails not arriving for Outlook.com users, due to another user on the shared server being IP-blacklisted, we also set up an account with HetrixTools to keep an eye out for any similar occurrences in future: https://hetrixtools.com

Website tuning and troubleshooting

With the site up and running the next stage was to tune its performance, as initial page load speeds were somewhat slower than I was hoping for. After doing some research via the Joomla documentation and third-party sites I found some tools to benchmark the site and see what could be improved.

The main ones I use (in no particular order) are:

Immediately I could see issues such as content not being cached and CSS and JS files not being minified \ compressed. Some could be fixed manually by adjusting settings on the server, but it seemed the easiest way to fix the others was to purchase an optimisation plugin for Joomla.

After browsing the JED I chose JCH Optimize and have been suitably impressed by the performance improvements since. We jumped from an F grade all the way up to A by following the recommendations from the tests above in addition to enabling JCH Optimize.

To check that your server supports the necessary GZip compression settings test it with https://checkgzipcompression.com

The only way we could speed things up further would be to move from shared to dedicated hosting (cost being the only reason we haven’t done so already) and to use a CDN to deliver content (a bit overkill really in this case).

One decision I have had to wrestle with was the choice between raw speed and community content. Running the tests above on the home page, where we integrate social media content, drags the score right down into the red due to a combination of multiple redirects (Facebook API), uncompressed images (Instagram thumbnails) and JavaScript parsing (YouTube embedded player).

The moral of this story seems to be that if you want a fast-loading home page keep social media integrations well away from it.

Whilst writing this post I just spotted a potential workaround for the YouTube embeds https://webdesign.tutsplus.com/tutorials/how-to-lazy-load-embedded-youtube-videos–cms-26743

After embedding the script in my template’s header there was a definite increase in page load speed, and the YouTube scripts no longer appeared in the GTmetrix “Defer parsing of JavaScript” section of the report – a nice easy win there!

Next up

In the third post in this series I’ll go over some of the plugins used and the tweaks I made to get them integrated neatly in the site 🙂

 

Image credits:

Icons made by Webalys Freebies from www.flaticon.com are licensed under CC BY 3.0

Joomla adventures – rebuilding a community (part 1)

Sometimes work skills cross over into personal life and being the resident IT geek can come in rather handy. Cars are one of my passions outside IT and I’ve been a member of an owners group called Metropower for well over 10 years now.

When we came close to losing our old website due to issues with the previous web host, I decided to take on the challenge of building a new one – the first major revamp since the site started in 2003 (!)

Outside of my usual network infrastructure work I enjoy indulging a more creative side, so it seemed a good opportunity to combine two skill sets. In the end I’ve surprised myself with what’s come out of the project and picked up some very useful tips and tricks along the way, so that can only mean one thing… time for a new post series 🙂

Getting up and running… crowdfunding

Metropower has always been a free-to-join community, and as such we didn’t have any funds to draw on for the new website. With the vision I had in mind I knew we’d need some money to get the site up and running, so we turned to crowdfunding… the £300 goal was reached easily and gave us the investment we needed to purchase hosting and some very useful extras (more on that soon!)

Ref: https://www.justgiving.com/crowdfunding/metropower

It was great to see how much our members value the site and the community that’s been built around it; our aim with the new site was to build on that and go even bigger and better!

CMS platform… decisions decisions

When our site first started back in the early 2000s web technologies were very different to how they are today. Our users were pretty much all accessing it from desktop PCs and the portal software itself was pretty basic… to all intents and purposes a series of static web pages. Most interaction happened via forum threads, and at the time users would visit multiple times a day to check for new posts.

Via the magic of the Internet Wayback Machine, here are some screenshots of the old site:

Moving forward to the present day, a constant message was that the site needed to be mobile-friendly and, from my personal perspective, needed to present content more effectively. As such, getting the platform right early on was going to be important. Social media has also taken over much of the forum’s role, although it turns out this form of communication still has a place, albeit in a less starring role.

A few years back I tried out WordPress as a potential replacement, but although I love it as a blog platform it didn’t seem quite as convincing for managing more structured content.

On the other hand I’d seen a fair bit of Joomla via work and as a CMS it seemed to fit the bill. Add in a wide range of extensions and the decision was made… Joomla it was.

With that in mind we chose a UK-based web host so our data was local to us. Quality of support was high on the list and after shopping around we were recommended https://www.webhosting.uk.com/

Their live chat facility looked great and prices were very competitive with what we’d been offered elsewhere.

Theme

Although I wanted to retain the core branding from the old site I felt it needed to be merged with a more modern style such as Google’s Material Design. As such a commercial theme was high up the shopping list to give us a quality base to build from. After searching across many theme sites the TechNews template from GavickPro caught my eye.

Ref: https://demo.gavick.com/joomla3/technews/

It ticked all the boxes for being clean, responsive and also included a few neat additional extensions such as NewsShowPro, again fitting the bill for presenting fresh content in a simple way.

The reviews section with its animated score sections was perfect for our “How-To Guides”, giving a modern twist to the well-known Haynes manual “spanner ratings” – better known by that famous phrase “refitting is the reverse of removal” (!)

Other standard features you’d expect from a modern website such as social media sharing, print friendly view etc. are provided by a rather neat radial menu in the corner of the article’s cover image.

Branding & customisation

With the Technews base in place it was time to customise it to merge some core branding into the new site.

Using my favourite colour swatch tool ColorPix, a few core colours were extracted from the old website and I set about customising the TechNews base to suit the Metropower brand. The built-in Chrome Developer Tools come in very handy for this; the element inspector and Computed CSS sections in particular are worth their weight in gold!

Taking feedback from our admin team on board, the colours and styling continued to be tweaked for some time to get to a place where they looked “right”. It’s one of those things that’s really hard to quantify upfront, especially when working without a strict design brief, but there’s that moment when you realise the project is coming together and it’s a great feeling 🙂

Images were an area we had to put a fair bit of effort into, as our old photo stocks were far too low-res (2006 vintage) and all needed to be retaken afresh. Our members proved very helpful on this front and supplied images via social media and our new dedicated website Gmail address (one of many platforms this project ended up utilising – more on that later).

In advance of the site launch we sent out a teaser image on social media to give our members an idea of what was to come and the feedback was very positive:

Further customisations included:

  • additional navigation using Breadcrumbs Advanced module
  • textured header background to add contrast and depth to the theme
  • social media icons added to main nav menu

The screenshots below show stages of the build process:


standard TechNews theme with minimal customisation


adding sidebar modules, colour scheme CSS modifications and social media menu icons


breadcrumb nav module installed, textured header background and feature banners added

Documentation is king

As the design progressed I started to find I needed to dig deeper into the theme to achieve what I wanted. For example, in our Store category I wanted a 2-column layout that wasn’t a standard feature of the theme, so I had to write a new PHP layout file and CSS styles to suit.

Given that I was doing this in my spare time I knew I’d have to revisit some elements later on, so quite early on I decided to write documentation for the customisations as I went along, just as I would for a project at work. Looking back 6 months later when adding new content, I’m very grateful to my past self for making that decision!

A useful tip I learnt from our web designer at work was to place a custom Administrator module into the Joomla back-end as the first thing other admins see when they log in. We use it to remind people of the colour scheme hex codes, layout recommendations (image sizes etc.) and common locations for admin tasks. Again it’s something I’d recommend and have done on sites I’ve made since.

To do this go to Extensions > Modules, change the dropdown on the left from Site to Administrator, then add a new Custom module in the cpanel position.

The next post will focus on the Extensions used to bring additional functionality to the site, as well as the various tools and platforms for optimisation, monitoring and management that have proved incredibly useful so far!