Category Archives: Opinion

Rise of the Full Stack Vendors

In a recent Datanauts podcast Chris Wahl was discussing Azure and Azure Stack with fellow Rubrikan Mike Nelson and Microsoft’s Jeffrey Snover (if you haven’t already, you can check out the podcast for yourself- Datanauts #148). Jeffrey made some interesting observations about the changing alignment of some of the major IT vendors over time (the discussion runs from 25 to 29 minutes into the podcast).

He detailed how the big players (DEC, IBM, etc.) had started with a “vertical” alignment by building their own chips, boards, operating systems, and applications. This was followed by a dis-integration where the industry shifted to a “horizontal” alignment- chips from Intel/Motorola, operating systems from Microsoft/Sun, and applications and services coming from a wide range of vendors. He goes on to posit that cloud vendors are turning the industry back towards a vertical alignment, giving the example of how Microsoft are designing their own chips (FPGAs, NICs, servers, the new “Brainwave” chip to accelerate AI, etc.) right through to software; all to create the Azure Cloud.

This idea got me thinking about where else in the industry this is happening, and what the future might hold.

This realignment can be seen across the major IT manufacturers- in recent years Dell- traditionally just a client and server PC vendor- has formed Dell Technologies, picking up tech such as Force10’s networking, EMC’s storage, and VMware’s hypervisor. This now puts them in that vertical alignment of controlling their own enterprise stack, from the client device, through the network, to the server hardware and the hypervisor sitting on it. In an on-premises setup Dell can provide the infrastructure from the end of the user’s fingers to the start of the Operating System or Container.

Amazon have started from the other direction- AWS as a cloud provider owning their own chipsets, servers, storage, and networking. They own the datacentre end of their customers today, but how long will it be before we see the successors to the Kindle Fire devices and Alexa-connected displays being pushed as the end-user device of choice? Everything between the user and the application would then be in their single vertical.

We see similar activity from Google. Their cloud platform stretches down to their Android and ChromeOS operating systems, the Chrome browser, and even into hardware. Although (similarly to Amazon) the endpoint devices are today largely aimed at the consumer market, as the commoditisation of IT continues there’s nothing stopping this leaking into the enterprise.

However, these vertical orientations are not to the exclusion of horizontal partnerships, and we’ve seen a lot more of those over recent years. For example, VMware partnering with AWS, IBM, Microsoft, and Google for cloud provision; Dell EMC powering the on-premises Microsoft Azure Stack; or IBM providing their software on Azure.

So will this continue, and what does the distant future hold? Looking far into the tech future is always guesswork, but if I had to bet I’d suggest that this alignment model will eventually swing back, as these sorts of things always seem to go in cycles. The verticalisation (new word?) will carry on for the next few years, but over time customers will demand more choice and (in enterprise at least) less of the perceived risk of “vendor lock-in”. Eventually this will lead to a tipping point, fragmentation of the stack, and a turn back towards the horizontal alignment we are moving away from today.

Thanks Datanauts for the inspiration behind this, and #Blogtober2018 for convincing me to do more long-form opinion posts.

Happy 18th

October 2018 marks my 18-year anniversary of working in Higher Education IT (so yes, about as long as this year’s Freshers have been alive). It’s been a long ride, and things have changed dramatically from technology, personal, and industry perspectives in that time. In this post I’ll be discussing a few of those differences, so gather round and imagine me sat in a rocking chair holding a pipe and talking about the olden times.

October 2000 was a time of change in technology- the perils of the millennium bug were nearly ten months behind us, Napster had gone legal, the last major release on LaserDisc hit the shelves, Sony released the PlayStation 2, and Amazon was best known for selling books online.

I arrived fresh-faced at the University department, and one of the first tasks in my new role was to order some parts for my new computer. There was little budget for IT and we scraped things together from what was around. If memory serves, I ordered a motherboard, memory, and an AMD K6 processor, and coupled these with an existing beige case, power supply, 14″ CRT monitor, and an old hard disk from the recycling pile.

These days we order laptops and desktops from (insert major manufacturer here), and my office desk has a 15″ 8th-Gen i7 hooked up to a pair of 29″ widescreen displays. As well as the advances in technology, this is one of the most apparent signs of the professionalisation (and some might say commercialisation) of IT within Higher Education. There’s less scrabbling to recycle outdated components and squeeze assets for decades, and a lot more focus on allowing IT to spend its time fixing and improving things.

Behind the scenes, the server infrastructure consisted of tower cases on a desk in the corner of my office (a sneaky way for a junior employee to get an office to themselves); a small UPS sat on the floor under the table, and the entire lot ran off a single wall outlet. Windows NT 4 was the platform of choice, upgraded about a year later to Windows 2000 and Active Directory. Fast forward and we saw the proliferation of rackmount servers and disk arrays in purpose-built datacentres. Then came virtualisation- VMware Server, and then ESX, providing the opportunity to run multiple servers on one piece of tin. These days we’re putting some of these servers “out in the cloud” on the other end of an internet connection, something we wouldn’t have considered 18 years ago.

The network joining all these things together has changed as well. Gone are the days of 10Base2, crimping BNC connectors on cables we’d threaded through the suspended ceilings, and troubleshooting T-pieces and terminators.

[Instagram photo: CentreCOM 3012SL Hub- a post shared by Chris Bradshaw (@startmenu)]

Today Gigabit ethernet to the desktop is the norm, the datacentres run on fibre and 10G copper, and you can sit outside by the campus lake and get a Wi-Fi connection.

As with the network, storage capacity has increased dramatically. On my first day in the office I had a 15 MB quota on my network home drive. In addition to storing all my personal files and settings, this also had to hold my POP mailbox, which I accessed with Eudora. Jump to 2018 and I’m working at a University where staff get a 1 TB OneDrive account and a separate 100 GB for their email- for the home drive alone, that’s roughly a 70,000-fold increase.

Personally, whilst staying in the HE sector I’ve developed from “Generic IT Support bod #7” to a more senior role, while keeping myself technical. I still retain some of that generalist approach, but my day-to-day work has become much more focused- particularly around virtualisation, servers, and automation.

In conclusion, as everywhere else, technology has moved on dramatically in the past 18 years. Network, Storage, and Compute have all grown incredibly, and this has allowed us to do things we wouldn’t have considered back in 2000. As well as that though, I believe the UK Higher Education industry has also changed, and its IT departments have worked hard to adapt. We now take on many more of the processes and technologies you’d expect from our colleagues in more commercial backgrounds, in a bid to provide a modern, up-to-date IT environment for the teaching and research activities of today’s Universities.

As I finish writing this post, someone has just brought in a laptop from 1992 which they’ve just decided is no longer required. Please ignore the text above about how things have changed.


IT in Higher Education

After over 15 years working in IT within the H.E. vertical I’ve spoken publicly a few times about our corner of the tech industry, with talks at VMworld in 2016 and a recent TechUG meeting, and chairing a roundtable at a UK VMUG UserCon. This post covers some of the highlights of those sessions; it contains themes that I’ve seen myself at various institutions and that have struck a common chord in discussions with colleagues from other Universities.

The HE IT Environment

TechUG Talk, November 2017

There are 17,000 IT Professionals* working in the UK Higher Education industry, spread across 160 Universities the length and breadth of the nation- that’s a sizable number, and it doesn’t include those working in IT within Schools and Further Education Colleges. These staff support some amazing research and teaching, and have the opportunity to work with some really awesome people and kit in a wide variety of disciplines.

How many IT departments in other environments can support racing teams, particle accelerators, gene sequencers, dance studios, silver-service restaurants, sports centres, and farms, whilst looking after residential internet customers, Nobel Prize winners, rocket scientists, and brain surgeons, all in a normal day? Dealing with the cutting edge presents unique challenges- for example, in most environments the team looking after the wireless LAN doesn’t have to worry about the people in the office next door experimenting with next-gen wireless tech in the same airspace. As well as the cutting edge, there’s also IT supporting the more generic activities, most of which are found in any large enterprise organisation. There is still the need for a projector in the boardroom, a website for marketing, the EPOS in the coffee shop, payroll systems, and so on.

State of the Art vs State of the Ark

Probably the most obvious challenge to someone dropped into the HE environment is the age range of supported equipment. There’s plenty of the latest and greatest- if you look round the vendors at any tech conference I’d be surprised if any of them didn’t have product in at least one University. But alongside this there’s usually a plethora of kit that’s perhaps past its best-before date but has to be kept running- this is partly down to the traditional grant-based funding model, where “services” are funded once but then expected to stay on forever.

Thankfully server virtualisation came along and helped to keep some of the old operating systems running when the hardware they relied on died, and the advances in software-defined networking have provided the opportunity to secure some manufacturer-unsupported workloads and protect the rest of the infrastructure.

Headcount

In higher education (and education in general) the employee headcount is much smaller than student numbers- UK Higher-Ed has about 400,000 staff and roughly 2.2 million students. Compared to a normal corporate environment there is a high turnover of these users, because in addition to the regular comings and goings of employees, roughly a quarter of the “headcount” leaves every year as students graduate. This leads to obvious potential difficulties in handling services such as user accounts- difficulties that most Universities addressed some time ago with automation and integration with payroll and student record services.
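
As a rough sketch of what that joiner/leaver automation looks like- the feed format, field names, and in-memory “directory” here are all invented for illustration; a real implementation would talk to the student records system and a real directory service such as Active Directory or LDAP:

```python
# Toy joiner/leaver sync driven by a student-records feed.
# The CSV columns and the dict-based "directory" are stand-ins for
# a real SIS/payroll feed and a real directory service.
import csv

def sync_accounts(feed_path, directory):
    """Create accounts for new arrivals, disable them for leavers."""
    with open(feed_path, newline="") as feed:
        for person in csv.DictReader(feed):  # expects columns: username,status
            username, status = person["username"], person["status"]
            if status == "enrolled" and username not in directory:
                directory[username] = {"enabled": True}   # joiner
            elif status == "left" and username in directory:
                directory[username]["enabled"] = False    # leaver

directory = {}
sync_accounts("student_feed.csv", directory)  # hypothetical feed file
```

The point is less the code than the principle: run the feed on a schedule and the quarter-of-the-headcount churn becomes routine rather than a yearly crisis.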

It also presents some problems with software licensing- if site-licensed software is priced on the number of actual users on a site rather than the number of staff, this can get quite costly. Most establishments also operate student computer labs- essentially a large-scale hotdesking environment. If a software license is per-seat (and not on a flexible concurrent basis), then licensing enough seats for students to use the software in any lab (rather than having to be timetabled into just one for that application) can run up similarly high fees.
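
To put some entirely invented numbers on that per-seat vs concurrent difference, here’s a quick back-of-envelope sketch (the seat price, PC count, and concurrency figures are all made up):

```python
# Back-of-envelope comparison of per-seat vs concurrent licensing for a
# lab application. All figures are invented for illustration only.
seat_cost = 100        # annual cost per licensed seat (GBP)
lab_pcs = 2000         # open-access lab PCs the app could run on
peak_concurrent = 150  # most copies ever in use at once

per_seat_total = seat_cost * lab_pcs            # license every possible seat
concurrent_total = seat_cost * peak_concurrent  # license peak simultaneous use

print(f"Per-seat licensing:   £{per_seat_total:,}")    # £200,000
print(f"Concurrent licensing: £{concurrent_total:,}")  # £15,000
```

With numbers anything like these, the flexibility of a concurrent license is worth a lot more than the headline per-seat price suggests.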

Ownership

One of the more bizarre things a newcomer to the world of Higher Ed will come across is the issue of ownership. Often a researcher can leave to join another institution and take their in-progress grants with them. This means hardware and data can sometimes leave the company when staff do, and on the flip side, unexpected computer equipment and large amounts of data can arrive with new starters. Imagine, in a more traditional corporate setting, a developer or salesperson leaving and not only taking their MacBook with them, but also all the code or customer data they had been working on.

It’s an unusual situation, and one that IT departments in Higher Education need to deal with on a regular basis. They need to ensure they have sufficient storage capacity that if terabytes of data arrive unexpectedly tomorrow they can be safely stored- requiring a flexible infrastructure. They also need to ensure that software licenses and hardware assets that are owned by the company, and not part of any mobile grant, are retained. VDI and Application Virtualisation technologies can help with the software ownership, and a rigorous asset management system and process is required to keep track of physical devices.

BYOD and UDDC

Staff arriving with computers from their previous employer is only one part of the “Bring Your Own Device” experience. BYOD is, and always has been, the norm at Universities for both students and staff. Thousands of students arrive each year with their own devices, coupled with staff who have personal budgets and requirements sometimes choosing what to buy themselves. I’ve joked before that in Higher Education IT we were “doing” BYOD before we knew it was a thing.

But BYOD is not just about personal devices- it extends to the server environment as well, with staff and research students running servers in cupboards or under their desks. The “UDDC” (Under Desk DataCentre) can be commonplace. Add to this the “Bring Your Own Storage” problem everyone in the tech industry has seen following the proliferation of large, cheap, portable USB disks, and IT has a real challenge on its hands to provide the security and resilience that the institution, the business, requires.

Again, VDI and App Virtualisation can help to deliver and maintain software on the plethora of endpoint devices. For the server side, P2V (physical-to-virtual) migration of the Under Desk DataCentres is an option- IT can easily show the benefits of a proper server environment and the ability to provide scaling and resilience that’s just not possible with one of these foot-warming server deployments.

Software

I’ve touched on application virtualisation (and written in more depth on the subject), and there’s a lot of software used in Higher Education, a noticeable proportion of which presents a challenge to deploy and manage in the enterprise environment. IT are dealing with thousands of devices, but the individual researcher just wants to download an app and get on with their job.

In Higher Ed (and research in general) there are a lot of little applications out there that another researcher has popped on the web (possibly back in 1994). Today’s academics just want to download and use them, often with the expectation that everything will just work. However, accompanying the download there are often no installation instructions, or instructions that remind you of the cover of a Led Zeppelin album- there are so many steps. If anyone reading this ever finds themselves writing a manual, don’t presume that just because someone has a Nobel Prize in Quantum Chemistry they are adept at editing the Windows Registry.

There are also a lot of scientific applications just not designed to work in an enterprise environment. IT try to live in a world where users don’t share a login, and don’t require full administrator rights on their local workstation just to use it. It’s not just the freeware downloads that fall foul of these expectations- similar issues can often be found in expensive commercial research applications.

To aid this, IT can invest in deployment methods- packaging through platforms such as SCCM, virtualising the package (using ThinApp, XenApp, App-V, Cloudpaging, etc.), or presenting the app through a virtual desktop infrastructure- minimising the number of times an awkward installation process needs to be repeated, and potentially allowing some flexibility in the end-user device. User Environment Management plugs in here too, letting users escalate permissions without blanket issuing of admin rights across the estate.

Summary

So, to summarise, the big difference between a University environment and a traditional corporate one is the great variety of disciplines and activities, almost all of which require some form of IT. IT has become more and more central to almost every workplace over the past few decades, and Higher Education institutions- themselves large enterprises- have at the same time adopted more and more of the practices and processes of the commercial sector. The Information Technology departments at Universities today face many challenges common to their corporate counterparts, in addition to some unique to the sector. Thankfully, modern technology is helping IT Pros rise to these challenges.

 

*The HESA (Higher Education Statistics Agency) report for 2014/15 shows 16,900 staff categorised as “Information Technology Technicians” or “Information Technology and Telecommunications Professionals”.

2016, a year of industry friendliness

You may have seen various posts on blogs and social media over the past few days about VMware staff accounts being blocked from joining the Nutanix community website, and the VMware User Group- VMUG- blocking Nutanix staff from leadership committees. I’m not party to the detail or the reasons behind these moves, but I’m surprised at the developments against the backdrop of 2016’s collaborative direction. As an industry we managed to stay friendly in 2016 despite the divisive world landscape of the US Election and Brexit- so what happened over the Christmas break to mess this up? Here are a few things I picked up on in the past year which paint a picture of much more inter-vendor friendliness; hopefully the issues in this particular case will be ironed out quickly and we can revert to business as usual.

VMware (and Amazon Web Services)

VMware’s 2016 announcement that you will soon be able to run their hypervisor on AWS may have rubbed a few of the vCloud Air vendors the wrong way, picking a collaboration with their biggest competitor. However, look at the positives- VMware are creating a standard platform whereby customers can take the workloads they run on AWS and port them to one of the smaller vendors if it makes sense to do so. This could even be automated- if AWS is more expensive in a particular month than another provider, some or all of the customer’s workloads could be migrated across.
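
A minimal sketch of how that month-by-month decision might look- the providers, prices, and saving threshold are all invented, and a real tool would pull costs from billing APIs and weigh in egress charges and migration effort:

```python
# Toy cost-driven placement check: flag a migration when another
# provider undercuts the current one by a worthwhile margin.
monthly_cost = {"aws": 1200.0, "provider_b": 950.0, "provider_c": 1100.0}
current = "aws"
min_saving = 0.15  # only move for a saving of at least 15%

cheapest = min(monthly_cost, key=monthly_cost.get)
saving = 1 - monthly_cost[cheapest] / monthly_cost[current]

if cheapest != current and saving >= min_saving:
    print(f"Migrate workloads: {current} -> {cheapest} (saving {saving:.0%})")
else:
    print("Stay put this month")
```

The common hypervisor layer is what makes this plausible- the decision logic is trivial once the workloads themselves are portable.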

The Dell purchase of EMC (and therefore VMware) had a few people worried that the hardware side of the VMware ecosystem would be destroyed- that Dell EMC would push their own traditional storage and compute tin, and their hyperconverged platforms, at the expense of the competition. Both Michael Dell and Pat Gelsinger have been consistent in their message that this won’t happen.

There are also other good signs from VMware, such as the VM encryption feature in vSphere. Rather than providing a VMware Key Management System, or insisting on an application provided elsewhere under the Dell Technologies umbrella, the requirement is just for a KMIP-compliant service.

Microsoft Loves Everything

Microsoft have also surprised a few people with their friendly approach to former competition recently- even to the extent that Steve Jobs and Amazon’s Alexa featured prominently in a keynote at a recent Microsoft event I attended.

We’ve seen for some time that Microsoft Loves Linux and Open Source. And these days they get on pretty well with Apple and Google too, focusing on their flagship applications on Android, iOS, and macOS, and sometimes adding features there ahead of their own OS.

#VMUGgate

So, I hope this current grumbling between Nutanix and VMware either turns out to be nothing, or everyone turns around and agrees to just get on. The London VMUG team sound like they agree:

IT Pro, Developer, Or Both?- The Results

For the past month I’ve been running a survey using Straw Poll to try and discover how my fellow IT Professionals and Developers see themselves. There’s a not-very-hidden DevOps elephant standing in the room somewhere, but I was careful not to use that term, or even mention “Ops”, anywhere in the poll or the related blog post.

Here are the results

Interpret them as you see fit, but here are some conclusions I’m drawing:

  • Many people working in computing see themselves as both an IT Pro and a Developer (a third of the respondents). I take this to mean that in essence they are practising DevOps (at least in its most literal form), or otherwise see “Developer” as a subset of a more general “IT Professional” occupation.
  • “IT Professional” was possibly not the best term to use- one friend mentioned on Twitter that he saw himself as a “Professional Developer”, the “Developer” vs “IT Professional” framing implying the non-professionalism of the Dev. I suggested that in this context perhaps the terms “IT Professional” and “Professional Coder” might be more suitable. Perhaps this is something to follow up in the future.
  • This was my first experience using the Straw Poll application. It’s very straightforward to use, but I found I needed to work on advertising the poll if I was to get any respondents.