VMworld 2012 – Cloud ops

After every VMworld I like to take a look at some of the announcements that surface, mainly once the marketing fairy dust has settled. One of the most notable this year was VMware’s new Cloud Ops announcement. VMware’s headline for this was:

VMware Defines New Operating Model for the Cloud Era.

New Cloud Ops education, transformation and advisory services help unleash value of cloud through people, process and measurement.

As part of this, and to help build and contribute to the model, VMware will also be forming the Cloud Ops Forum:

VMware is also introducing the Cloud Ops Forum, a group of consulting and integration partners that will collaborate on further definition of this new operating model.

So, simplified, this reads that several of the top tier outsourcers and consultancies will collaborate, working in harmony as a forum, to build an operating model that you and I can use to great effect to unlock business value and innovation (and I assume this means any user of VMware products).

I might be reading too much into this, but it is certainly an interesting strategy, and I’m asking myself what it means for the customer. Looking into what this could mean, I find this gem interesting for two reasons: firstly, it sounds a bit too good to be true, and also classic EMC strategy; secondly, I have asked myself why and how VMware have got multiple top tier outsourcers to actually collaborate and combine to build an operating model, when they are in effect in marketplace competition.

It also carries a few confusing messages for customers. In my past experience of tendering and general consultancy engagements, Outsourcer A has always been “better” than Outsourcer B because Outsourcer A added additional “value” for the customer and delivered the same focal point of the RFP at less cost than Outsourcer B. That is never entirely true, but it is how the game has mainly worked, so how will a level playing field within the Cloud Ops Forum change this whole process?

Are VMware now saying that Cloud Ops will make outsourcers equal in the value they add to any proposal for the use and implementation of VMware technologies (in which case, why bother with the RFP?)? Or are they introducing this and rallying key outsourcers because they are sick of endless escalations and complaints from not-so-happy customers over botched implementations and capability promises that couldn’t be met? Or do they want to build a new model for delivering services more indirectly?

Time will tell on this, but it certainly looks to me as if they have a few intentions for this one; we have not seen this type of approach occur among outsourcers in the past. One last pondering thought: it may be a tactic to take on the latest developments we are seeing in open source alternatives such as Openstack and Cloudstack.

In future less will be more

What backs this up?

  • Emergence of platform as a service offerings that are providing the flexibility to move, migrate and be portable between cloud providers,
  • Collective wave movements of thousands of knowledgeable people in highly connected and collaborative communities that can be tapped into for FREE, replacing time-consuming consultancy engagements at the embryonic stage of projects,
  • Community driven development and design tools and methods reducing time to market
  • More successful agile software delivery and change methods, reducing release management processes
  • Proprietary vendor marketing expenditure has risen over the last 2-3 years; when was the last time you saw EC2 on a billboard? :)

Maybe I’m wrong…?

New kids about

Reading this article definitely shows we have a new architectural strategy kid on the block. Migrating to an open source technology like Openflow would appear to the previous generation of IT architects to be risky business. Look at other big players in the new breed of computing, such as Instagram’s engineering blog, and you can see yet more adoption of non-proprietary technologies, so it quite obviously isn’t risky, and the financial results are showing it isn’t.

It’s no wonder this could be perceived as risky business, in contrast to the strategies adopted in the previous proprietary-dominated worlds of Mainframe and Client/Server computing. However, with the early adopters of open, extensible technologies now being the biggest fish in the pond rather than the smallest, it is becoming quite clear that this architectural strategy will continue to dominate and become the de facto strategy for any new breed player in the world of consumer IT today, and then likely merge into enterprise strategy.

Compare the older strategies with the new generation and the technologies used could hardly be further apart. In the past, proprietary technologies have been used to ensure:

  • Assurance of external vendor support when things go, well… tits up
  • A throat to choke on the end of a phone line when things go wrong, with the risk transferred to the vendor
  • Product integration and support within an ecosystem which is capable of partial compatibility with other toolsets and services

To add to this, the previous adoption of proprietary technology has also made building a support capability to manage and grow that infrastructure much easier, with educational programmes allowing both old and new dogs to be taught tricks… but again at a cost.

Granted, the difference here is that the likes of Google, who are adopting new strategies, have a completely different business model and risk profile in certain areas of business operation, but the majority adopting the new breed strategy still serve enterprises, or indirectly serve enterprise customers. So, in my limited wisdom, here are some thoughts on this new generation of architectural strategy and where it will go (or not):

How long will it last?

Personally I think this type of strategy is unlikely ever to stall, as the momentum grows and the new generations “growing up” in a cloud world are oblivious to past strategies; let’s face it, the Client/Server world of computing is never going to leave the inheritance that the Mainframe has.

What dependencies are there?

A big one is the community that keeps it ticking and innovative, and I’m sure this will never be an issue for as long as proprietary technologies are around and people want the freedom of openness. To add to this, the likes of Facebook and Google are even building communities themselves that will be respected and enduring, both for the organisations’ own business and for the benefit of the open community.

Where does this leave proprietary tech vendors?

In a vulnerable position, I think. For years organisations have shelled out massive volumes of investment into technology solutions with very little revolutionary development in product sets. We’ve of course seen new technologies arise (at a cost), but they are still lumbered with the original technology, which is depreciating, meaning less competitive edge and less ability to manoeuvre.

Will it fail?

As I’ve said, we will see yet more emergence of the type of strategy Google are adopting, and failure is going to be in the hands of the adopter, not the hands of a vendor’s R&D programme. Openness will provide more choice over what is and isn’t adopted and, more importantly, at a cheaper cost. If this cost saving against risk is accepted by the key stakeholders then I can’t see it failing.

Do I need to change?

Well, I think you and I will need to change the way we approach architecture. I am, and I expect you are, still caught in an enterprise world of Client/Server legacy, but I see this legacy dissolving in the 3-5 year time frame, with more emphasis on knowing how to utilise the inner workings of an open platform, not just knowing how to read a vendor’s Readme or PDF in order to architect the solution.

New horizons in 2012

Well, what a year it has been. Looking forward to 2012: last year I posted some 2011 predictions, and after rereading them, it certainly was a slow year, with the economy stalling any massive growth, but some of them were near enough correct.

So, on to 2012. For me this is a year of change from the very off. In January 2012 I start at a new company in their Architecture team. This was a tough decision to make, but I felt it was time to move on to pastures new; I crave new challenges and hurdles to jump, and lastly, if I want to reach my eventual goals and aspirations, I need to broaden my experience in a different industry and diversify.

As for the industry outlook in 2012, here are some small potential predictions and scenarios that may arise in a few areas of datacentre related infrastructure and technologies:

Infrastructure technologies

When it comes to storage, we’ve got a hard disk crisis: hard disk prices have increased by 10-15% due to natural disasters in the far east, and with the increased cost most likely being fronted by the customer, the following scenarios may arise;

  • The price premium on new disks may mean it is actually more cost effective to utilise tiered storage archiving technologies. This means the old school on-premise archiving and optimisation toolsets in the marketplace will get a reprieve for another year in what is seen as a dying marketplace,
  • Price increases due to disk costs will have the knock-on effect of a slowdown in new array procurement (or maybe pigs might fly and the markup on implementation and design of arrays will be reduced to factor in the increased pricing of disks),
  • Dependent on the chosen vendor, the “Big data” strategy in organisations will slow down and potentially be put on hold,

There is lots to look out for in the area of virtualisation. We will see more push from the alternative hypervisor vendors; this may only be exploratory for larger committed VMware shops, but 2012 will mark the beginning of new horizons at least;

  • Microsoft finally get a bigger foot in the door against VMware; it is only a matter of time before cost becomes the biggest driver for change to alternative “just enough” hypervisors,
  • Adoption of alternative hypervisors such as KVM may well become more realistic due to 2011 price increases from VMware,
  • VMware finally release a mobile hypervisor platform. Whether it will be any good only time can tell, but I’d say it will arrive in 2012,

On to the desktop, which has apparently been dying for some time now. Will it be dead in 2012? I doubt it very much, but expect to see some of the following;

  • A marketing/PR ambush over the course of the year when Microsoft release early versions of Windows 8. It will certainly stir the pot that was filled in 2011, that’s for sure, with the likes of Apple and Google pushing for market share.
  • As with any Microsoft implementation or upgrade, Windows shops will wake up from the hangover of Windows 7 implementations wondering “why did I do that?” when Windows 8 is about. W8 has a bit more to it, certainly with the next gen Metro apps; was all that investment in Windows 7 upgrades really worth it?

In the area of IaaS we will certainly see more and more uptake of cloud based service delivery in organisations. This will be driven more by board level mandates to save cost (as per usual), and also by the desire to implement the same levels of service that public cloud offerings are already providing. This will mean;

  • More feasibility studies on the use of open source IaaS alternatives such as Joyent, Opennebula and Openstack,
  • Self service increases in adoption with more and more organisations wanting IT to be run as a business.
And on a lighter note
  • Cloud will fail (again),
  • A Storagebeers might actually happen in full force,
  • There will be a surplus of thin clients, monitors, printers and dodgy low end servers available for purchase from the London 2012 Olympic committee,

That’s all folks

Embrace + acceptance = Vision

Whilst driving home I listened to part of an interview on Radio 4 with film director Martin Scorsese.

Discussing his new film, which is in full 3D, he talked about what fascinated him about the technology. As a child, in an era of black and white screenplay, he saw a moving image that appeared to be 3D, and he wanted to create that in a screenplay himself. It gripped him so much that he wanted to change the way films were made, and so, at nearly 70, he has created a new breed of 3D film, one likely to be a box office hit. Lastly, when discussing the emergence of 3D, he said that history has shown that adding colour to films was what viewers demanded and that it would become the norm in years to come, and so the industry changed and adapted.

This got me thinking (selfishly) about my own future. Most people, and people like yourselves who read my blog, should realise that in the IT industry we work in an ever evolving world of technology, and that benefiting from it and providing value to your organisation requires vision, persistence and an embrace of change. If you fail to embrace new trends and you are not prepared, it is safe to say you may well be limiting your own future and, more importantly, you may limit the potential benefits your business could gain.

Remember, there is a new generation of computing arriving in the shape and form of cloud computing. Forget the marketing campaigns on cloud, forget the fancy toolsets used to provide or run a cloud; this is a generation of computing that, as with film’s move from black and white to colour production, will be used by a generation who really don’t care what’s running underneath or, in fact, where it is located. What they will care about, however, is whether the service meets the same criteria and de facto standard as the competition.

So, to quickly summarise: embrace the change and move forward. It will be the best thing you can do to give yourself the vision to spot the trend that may well replace cloud computing in many years to come.