
Storage Intelligence - about time!

I recently read an article about Backblaze releasing its storage pod design: a 180TB NAS device in 4U. Absolutely huge! At ten 4U pods per 42U rack, that is around 1.8 petabytes in a single rack.


When thinking about petabytes, one thinks of the big players in storage - EMC, NetApp, HDS - selling tens of storage racks that cover substantial parts of the datacenter floor space while offering a fraction of this capacity.


Clearly, the storage profile of the large monolithic enterprise arrays is different. However, Backblaze highlights how easily, and at what low cost, conventional "dumb" storage can be had! Packing some flash cache or SSD in front would already bring these boxes to comparable I/O capacity ;-)

This makes the case that storage capacity per se is not really a challenge anymore. Making storage contribute to the overall performance equation - ensuring that storage accelerates specific workloads - is what will be critical going forward. Basically: intelligent storage!

Questioning "Accepted Wisdom"

Many IT shops still think of storage as a separate part of their estate: it should simply store data and hand it back rapidly when asked - politely. The continued stifling of datacenter innovation by a single answer to every question - namely VMware/hypervisors and server virtualization - tends to stop the innovative thinking that could actually help an organisation accelerate the revenue-generating parts of its application landscape.

Some questions that came to mind, and that are also echoed by clients, are:

  • Disk is cheap now, and SSD answers my performance needs for storage access. Is there something that, together with software, actually increases the efficiency of how I do things in the business?

  • For whole classes of typical applications - structured data persistence pools, web servers, etc. - what would "intelligent" storage do for the physical estate and for the business consumers of this resource?

  • How can enterprise architecture concepts be overlaid onto intelligent storage? What will this mean for how future change programmes or business initiatives are structured and architected?

  • How can today's concepts of intelligent storage be applied in the current datacenter landscape?

We are seeing the first impact of this type of thinking in the structured data / database world. By combining the database workload with storage, and through software enablement, we get intelligent acceleration of store/retrieve operations. This is very akin to having map-reduce concepts within the relational database world.

Further combining storage processing with CPU/RAM/networking offload of workload-specific storage requests facilitates unprecedented scale-out, performance and data compression capabilities.

Oracle's Engineered Systems - the Exadata Database Machine in particular - embody this intelligent storage concept, amongst other innovations, for accelerating Oracle database workloads.
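
Oracle does not publish Smart Scan's internals, but the offload principle itself is easy to sketch. Below is a minimal, purely illustrative Python sketch (the StorageCell class and its data are invented for this example) of predicate pushdown: the storage tier filters and projects rows locally, so only matching data crosses the wire to the database tier.

    # Minimal sketch of storage-side offload (predicate pushdown).
    # 'StorageCell' and its rows are hypothetical, for illustration only.

    class StorageCell:
        def __init__(self, rows):
            self.rows = rows  # each row: dict of column -> value

        def smart_scan(self, predicate, columns):
            # Filter AND project locally: only matching columns travel back.
            return [{c: r[c] for c in columns} for r in self.rows if predicate(r)]

    cells = [
        StorageCell([{"id": 1, "region": "EU", "amount": 100},
                     {"id": 2, "region": "US", "amount": 250}]),
        StorageCell([{"id": 3, "region": "EU", "amount": 75}]),
    ]

    # The database tier ships the predicate down instead of reading every block.
    eu_sales = [row for cell in cells
                for row in cell.smart_scan(lambda r: r["region"] == "EU",
                                           ["id", "amount"])]
    print(eu_sales)  # [{'id': 1, 'amount': 100}, {'id': 3, 'amount': 75}]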

These workload-specific constructs foster the security of precious data assets, physically and logically. This is increasingly important when one considers that organisations are using shared "dumb" storage for virtual machines, general data assets and application working data sets.

In the general marketplace, other vendors (IBM PureSystems + DB2, Teradata, SAP HANA, etc.) are starting to use variations of these technologies for intelligent storage. The level of maturity varies dramatically, with Oracle having a substantial time advantage as first mover.

2013-2015 will see more workload-focused solutions materializing, replacing substantial swathes of datacenter assets built on the traditional view of storage.

Why is this important for the CIO, CTO & CFO?

Intelligent, workload-focused storage solutions allow CIOs/CTOs to do things that were not easily implemented with solutions based on server virtualization over shared monolithic storage arrays - dumb storage - such as the VMware-enabled VCE Vblock and HP CloudSystem Matrix, which are effectively only IaaS solutions.

Workload-specific storage solutions allow much greater consolidation ratios. Forget the 20-30-40 virtual machines per physical server; think hundreds of workloads per intelligent construct - an improvement of hundreds of percent over the current situation!

It is important to verify how intelligent storage solutions can become part of the CIO/CTO's product mix, supporting the business's aspirations as well as simplifying the IT landscape. Financing options are also vastly simplified, with a direct link between business performance and physical asset procurement/leasing:

  • Intelligent storage removes architectural storage bottlenecks and really shares the compute/IO/networking load more fully.

  • Intelligent storage ensures that the workloads supporting revenue-generating activities are accelerated. Acceleration is linked to the cost of the underlying storage assets: as the costs of NAND flash, SSDs and rotating disks drop, more is automatically brought into the storage mix to reduce overall costs without disrupting the IT landscape.

  • Greater volumes of historic data are accessible thanks to the huge level of context-sensitive, workload-specific data compression. Big data analytics can be powered from here, as well as enterprise data warehouse needs. This goes beyond simple static storage tiering and deduplication technologies that are unaware of WHAT they are storing (see the toy sketch after this list)!

  • Workload-specific stacking supports much higher levels of consolidation than simple server virtualization. The positive side effects of technologies such as Exadata include the rationalization of the datacenter workload estate: less variety, fewer operating systems, and a net-net healthier estate overall. This means big savings for the CFO!
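
As a toy illustration of why knowing WHAT you store matters (all names and figures here are invented): a store that knows a column is sorted can run-length encode it down to a handful of entries, whereas a generic block store sees only opaque bytes and must fall back on generic compression.

    from itertools import groupby

    # Hypothetical sorted column, as a workload-aware store would hold it.
    region_column = ["DE"] * 4000 + ["FR"] * 2500 + ["UK"] * 3500

    # Workload-aware: run-length encode, exploiting the known sort order.
    rle = [(value, len(list(group))) for value, group in groupby(region_column)]
    print(rle)       # [('DE', 4000), ('FR', 2500), ('UK', 3500)]
    print(len(rle))  # 3 tuples instead of 10,000 stored values

    # A "dumb" store sees only bytes in blocks, with no idea that these
    # blocks hold one sorted column - the context is what enables the win.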

Intelligent storage within vertically engineered, workload-specific constructs - what Gartner calls Fabric-Based Infrastructure - presents a more cogent vision of optimizing an organization's IT capability. It provides a clearer understanding of how precious funding from CFOs is invested in the programmes necessary for the welfare of the concern.

CIO/CTOs still talking about x86 and server virtualization as the means to tackle every Business IT challenge would be well advised to keep an eye on this development.

Intelligent storage will be a fundamental part of the IT landscape allowing effective competition with hyperscale Cloud Providers such as Google/Amazon and curtailing the funding leakage from the business to external providers.

Disclaimer

The opinions expressed here are my personal opinions. Content published here is not read or approved in advance by my current employer and does not necessarily reflect the views and opinions of my employer.

The Shape of Things to Come!

A lot of what I do involves talking with thought leaders from organizations keen to transform how they do business. In some cases they espouse thoughts along general industry or marketing lines. In other cases, however, real innovative thought is taking place. I firmly believe innovation starts with questioning the status quo.

We are bombarded on one side by Intel x86 as the ultimate commodity processor, offering everything one could possibly imagine, and on the other by the public cloud as the doom of in-house IT centers. It is incumbent on all in this industry to think beyond even the cloud as we know it today.

Questioning "Datacenter Wisdom"

This blog entry is entitled "The Shape of Things to Come" with a clear series of ideas in mind:

  • Systems-on-a-Chip (SoCs) are getting very powerful indeed. At what point do these become so powerful that one represents the same order of magnitude as an entire hyperscale datacenter from Google or Amazon, with a million machines inside?

  • Why does in-house IT have to move out to the cloud? Why could hyperscale clouds not be built up from capacity that organizations are already putting in place? This would be akin to the electricity grid acting as the transport for capacity created by multiple providers. Capacity could be borrowed in-industry or across industries.

  • Why is there disaggregation of all components at the physical datacenter level (CPU, RAM, storage, networking, etc.) rather than assembly lines with appliances/constructs hyper-efficient at particular tasks within the enterprise portfolio of services and applications?

  • Why are servers still in the same form factor of compute, memory, networking and power supply? Indeed, why are racks still square, and datacenter space management an almost 2-dimensional activity? When too many people live in a limited space, we tend to build upwards, with lifts and stairs to transport them. Why not the same for the datacenter?

I'm not the only one asking these questions. Indeed, the industry's next wave of physical manifestations of new concepts is taking place, albeit slowly. I wanted to share some industry insights as examples to whet the appetite.

  • At Cornell University, a great whitepaper on cylindrical racks using 60GHz wireless transceivers for intra-rack interconnects shows a massively efficient model for ultrascale computing.

  • Potentially the server container would be based on a wheel, with servers as cake-slice wedges plugged into a central tube core, and wheels stacked vertically. Although they suggest wireless connectivity, there is no reason why the central core of the tube could not also carry power, networking and indeed coolant. Indeed, the entire tube could be made to move up and down - think of tubes kept in fridge-like housings (like in the film Minority Report!).

  • One client suggested that CPUs should be placed into ultracooled trays that use the material of the racks as conductors to carry signals to other trays full of RAM. We already do this with hard disks in enclosures, and indeed Intel does 3D chip stacking already!
    • Take the Intel 22nm Xeons with 10 cores, or indeed Oracle's own SPARC T5 at 28nm and 16 cores, as building blocks:
    • A 2U CPU tray would allow, say, 200 such processor packages - an enormous capability! For the SPARC T5 this would be 3,200 cores, 25,600 threads and over 11 THz of aggregate clock (see the back-of-the-envelope check after this list)!
    • Effectively, you could provide capacity on the side to Google!
    • A RAM tray would basically allow you to provide 20TB+, depending on how it is implemented (based on current PCIe-based SSD cards).
  • Fit-for-purpose components for particular workloads as general assembly lines within an organization would fit in well with the mass-scale concepts that the industrial and indeed digital revolutions promoted.
    • If we know that we will be persisting structured data within some form of relational database, then why not use the best construct for that? Oracle's Engineered Systems paved the way forward for this construct.
    • Others are following with their own engineered stacks.
    • The tuning of all components and the software to do a specific task that will be used for years to come is the key point!
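
A quick back-of-the-envelope check of the tray numbers above, in Python. The 200-package tray is the client's hypothetical; the core, thread and 3.6 GHz clock figures are the published SPARC T5 specifications.

    # Hypothetical 2U tray: 200 SPARC T5 packages (16 cores, 8 threads/core, 3.6 GHz).
    packages, cores_per_pkg, threads_per_core, clock_ghz = 200, 16, 8, 3.6

    cores = packages * cores_per_pkg           # 3,200 cores
    threads = cores * threads_per_core         # 25,600 hardware threads
    aggregate_thz = cores * clock_ghz / 1000   # ~11.5 THz of aggregate clock
    print(cores, threads, round(aggregate_thz, 1))  # 3200 25600 11.5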

So the technical components in this radical shake-up of the datacenter are materializing. We haven't even started to talk about some of the work happening in material science, promising unparalleled changes in CPUs (up to 300GHz at room temperature) or non-volatile RAM totally replacing spinning disk and possibly SSD and DRAM.


Why is this important for the CIO, CTO & CFO?

Customers typically ask whether they should move everything out to cloud providers such as Google/Amazon or to private cloud hosters such as CSC/ATOS/T-Systems. Looking at the nexus of technological change that is almost upon us, I would say that at some level it makes sense to evaluate the mix of on-premise and off-premise resources.

The Cloud is effectively a delivery model. Some applications, such as email, can clearly live in the public cloud - bearing privacy issues in mind. However, the capabilities an organization needs in order to thrive and exploit market forces, as expressed in its enterprise architecture, can be delivered in other ways.

  • Server virtualization relies on workloads not consuming all the resources of a physical server. You should be questioning why the software - the most expensive component - is not being used to its maximum. Solving server acquisition costs alone does not reduce your costs in a meaningful way.

  • Entertain the idea that, with acceleration at every level of the stack, information requests may be serviced in near-realtime! The business should be asking what it would do with that capability. What would you do differently?

  • Datacenter infrastructure may change radically. It may well be that the entire datacenter is replaced by a vertically stacked tube that can do the job of today's football-field-sized facility. How can you exploit assembly-line strategies that will already start to radically reduce the physical datacenter estate? Oracle's Engineered Systems are one approach to this for certain workloads, replacing huge swathes of racks, storage arrays and network switches.

  • Verify whether current notions of the desktop are still valid. If everything is accessible with web-based technologies, including interactive applications such as Microsoft Office, then why not proactively make virtual desktops obsolete and simply provide viewing/input devices for those interactive web pages?

  • Middleware may well represent a vastly unexplored ecosystem for reducing physical datacenter footprints and drastically reducing costs.
    • Networking at 100+Gbps already makes it possible to deliver application/web-powered effective desktops, with full interaction, to users' viewing devices wherever they are.
    • Use intra-application constructs to insulate applications from the technical capability below. Java applications have this built in, being cross-platform by nature. This is a more relevant level of virtualization than virtualizing the entire physical server.

  • Security should be enabled at all layers, not left to some magic from switch vendors in the form of firewalls. It should live in the middleware platforms, supporting application encapsulation techniques, as well as within the pools of data persistence (databases, filesystems, etc.).

Enterprise architecture is fueling a new examination of how the business defines the IT capabilities it needs to thrive and power growth. This architecture shows a greater reliance on data integration technologies, speed to market and indeed the need to persist greater volumes of data for longer periods of time.

It may well be incumbent on the CIO/CTO/CFO to pave the way for this brave new world! They need to be ensuring, already now, that people understand that what is impossible today, technically or financially, will sort itself out. The business needs to be challenged: what would it do in a world without frontiers or computational/storage limitations?

If millions of users can be serviced per square (or rather, round) meter of datacenter space using cylindrical server tube wedges/slices - why not do it? This is not the time for datacenter fanatics who railroad discussions toward whatever they are currently using, or who offer the universal answer: "server virtualization from VMware is the answer - now what was the question?"

Brave thinking is required. Be prepared to know what to do when the power is in your hands. The competitive challenges of our time require drastic changes. Witness what is happening in the financial services world, with traders being replaced by automated programs. This requires serious resources. Changes in technology will allow it to be done effortlessly, with the entire stock market's data kept in memory and a billion risk simulations run per second!

Disclaimer

The opinions expressed here are my personal opinions. Content published here is not read or approved in advance by my current employer and does not necessarily reflect the views and opinions of my employer.

Journey to the Cloud – the Instrumental Role of the Public Sector

Imported from http://consultingblogs.emc.com/, published December 3, 2010

All too often in publications we hear of the US administration's desire to move towards Cloud technologies, with obvious references to Google. However, in Europe there is also 'some' activity in this direction. Indeed, from the feedback we have been getting from the Public Sector here in Germany, and also from European Institutions such as the European Commission, Cloud is firmly on the agenda.

Within the European Commission there are many working groups, supported by academia, looking into the Cloud phenomenon. They have produced an excellent document, available for download, outlining the state of play. One of the areas the Commission has picked up on really well, I feel, is the preservation of individual rights and privacy. Neelie Kroes, the European Commission Vice-President for the Digital Agenda, clearly states that:

“Cloud computing is more than simply a technical challenge. By putting our personal data on remote servers, we risk losing control over that data. Because the right to the protection of personal data is a fundamental right in the EU, this demands several actions. Fundamentally, the Commission believes that we need further research to enhance the security features of these technologies. And indeed we are funding such research at European level – which looks at "privacy-by-design" and "privacy-enhancing technologies".”

With such a public privacy agenda in mind, the Public Sector has an instrumental role to play in evolving the state of play of the Cloud paradigm on a global scale.

Indeed, groups like ENISA, the European Network and Information Security Agency, are tackling such themes head-on by collaborating with the private sector to determine best practices.

Across Europe, the Commission tries to create consensus, and in doing so takes considerable time to reach conclusions that can be acted upon. However, with its vast resources, things can move quicker in Cloud adoption - once it is finally ready. At the local country level there are also some clear statements being made. For example, the German state of Schleswig-Holstein raises the following points:

  • Applicable law issues
  • The legal basis for cloud computing and related processor and controller issues
  • Problems associated with the possibility of third-party access
  • The minimum requirements for data processor relationships and service provider contracts under the new German data protection law
  • Technical and organizational security measures
  • The legal landscape for clouds located outside the European Union

“Clouds located outside the European Union are per se unlawful, even if the EU Commission has issued an adequacy decision in favor of the foreign country in question (for example, Switzerland, Canada or Argentina).”

Strong words indeed!

Well, it is not all doom and gloom. The mere fact that Cloud is recognized and being seriously examined at this level is encouraging: future implementations of Clouds for the Public Sector – serving, essentially, you and me and the other citizens of the land/zone concerned – will take personal privacy very seriously.

The Public Sector, I believe, has a serious role to play in enabling Cloud technologies to safely pervade the everyday fabric of society. I am talking about the citizen's interactions with the State. The idea that we still need to troop off to some bricks-and-mortar building to show our papers, and in return get... you guessed it... another piece of paper - sounds crazy, right? The ability of the individual to instantly browse their data records held by all areas of Government would surely be interesting to you and me.

However, it goes further: the ability to securely interact with your data when visiting, say, a doctor, ensuring that whichever doctor you go to, the latest medical details in your file are available when you, and you alone, provide the 'key' to that data. The doctor also needs to provide a 'key' indicating that he/she is a registered doctor allowed to look at patient records with the patient's consent.

By linking information at least within the various administrations, and instituting the regimes necessary for highly efficient IT operations, the taxpayer is already better served :-) When you think through the processes and linkage points a citizen has with the various Public Sector departments, eliminating the need to physically transfer bits of paper is a service worth striving for.

I agree that there are security concerns, but there are also security solutions for protecting data. Every day, I use a biometric device to log in to my laptop, or, with an additional RSA SecurID two-factor token, create a secure encrypted connection to the corporate network.

This is used throughout the IT industry (including the Public Sector). So why can the everyday consumer not have access to such tokens? Why not combine this with a biometric device and provide an ultra-secure link between the citizen and their data held by the Public Sector? The technology is there today!
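
RSA SecurID's token algorithm is proprietary, but the openly specified TOTP scheme (RFC 6238) illustrates how such time-based second factors work. Here is a minimal sketch using only Python's standard library; the shared secret is purely illustrative.

    import base64, hashlib, hmac, struct, time

    def totp(secret_b32: str, time_step: int = 30, digits: int = 6) -> str:
        """Derive a time-based one-time password (RFC 6238) from a shared secret."""
        key = base64.b32decode(secret_b32)
        counter = int(time.time()) // time_step           # moving factor: 30 s windows
        msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
        digest = hmac.new(key, msg, hashlib.sha1).digest()
        offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    # Illustrative only: the citizen's token and the service share this secret.
    print(totp("JBSWY3DPEHPK3PXP"))

A service holding the same secret recomputes the code for the current 30-second window and compares it with what the citizen's token displays; combined with a biometric factor, that gives exactly the ultra-secure two-factor link described above.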

Even the DHL delivery service uses a tablet device to take a signature electronically as receipt of delivery! We, the citizens, don't think twice about that!

User experience teams, designers of web systems such as those at EMC Consulting, and indeed the Private Sector as a whole can be instrumental in rapidly getting Cloud infrastructure built and running. The Public Sector always strives for the perfect solution, hence the long deliberations. However, "to err is human; perfection is divinity", and so the current state of the art can be used directly. We learn along the way, and we evolve!

The Public Sector building out such infrastructures for its citizens - the people it is supposed to serve - would provide a valuable jumpstart: citizens get to use the Cloud, access services easily, get near-instant results and generally get in touch with their Cloud roots ;-)

Such a contribution would be akin to the giant strides that road, rail and indeed telephone networks made in closing the last-mile loop to consumers and providing universal mobility. The eCommission/eGovernment initiatives, and the huge cost-reduction programmes underway everywhere, would be well served by the Cloud. Public Sector Cloud-enabled services supporting eGovernment, academia and public health (to name a few) would be radically transformed.

Luckily, there is ongoing dialogue. However, the reluctance to move forward, and the fear of data security being breached - a risk just as real in the physical world - can be dealt with.

Small, service-focused initiatives that can be comfortably moved into the Public Sector Cloud would allow everyone to move up the learning curve.

Indeed, here in Berlin, Germany, we have the old Tempelhof airport, the site of the Berlin Airlift, with its atypical period architecture sitting abandoned. The Cloud could even drive inner city rejuvenation schemes and green initiatives.

Why not transform a small part of this mainly unused airport into a Cloud infrastructure that would buttress the City of Berlin (what I like to call CIT - City IT) and indeed reinforce the heart of government itself? This could be funded jointly by the Public and Private sectors. That would be an effective use of taxpayers' money - instead of building a new datacenter someplace else.

Every major city has some abandoned part that is simply looking for a new lease of life. Containerized, pre-assembled Cloud infrastructure units could literally be rolled in and be up and running in days. There are so many people looking for secure places to put their data assets that there would be instant demand.

Backups could be kept there, and only you, with your biometric-enabled two-factor authentication 'key', would be able to access them over the excellent last-mile infrastructure to your home, or on the move from anywhere in the world. If that is not allowed, policies can be set that prevent data going to specific IP addresses.

SMBs could also use this infrastructure as part of a state-run scheme promoting and helping businesses get on their feet, and, once stable, have the choice of moving their data to a public cloud provider or even back in-house if desired. Set up a business's IT needs in seconds and massively reduce the cost of entrepreneurship!

All this points to the instrumental role that the Public Sector and its 'CIOs' have in transforming the information-centric relationship they have with their citizens - their customers! Set the agenda; drive the information-centric future that lies at the heart of a healthy social and economic Europe!

Disclaimer

The opinions expressed here are my personal opinions. Content published here is not read or approved in advance by EMC and does not necessarily reflect the views and opinions of EMC.