
Tuesday, 12 January 2016 15:46

2016: the year of hyper-converged

VCE’s Nigel Moulton puts the case for 2016 being the year converged and hyper-converged infrastructure become mainstream

Enterprise cloud migration is creating double-digit growth in the infrastructure-as-a-service market, finds Frost & Sullivan 

Tom Homer argues that cloud needs to be designed to be customer-centric from the get-go, otherwise it’s destined to fail.

City Cloud allows businesses to construct bare metal server infrastructure as a service solutions across multiple data centres throughout Europe

Wednesday, 21 January 2015 00:12

Datapipe acquires big data cloud pioneer GoGrid

GoGrid’s orchestration and automation technologies will be integrated into Datapipe’s managed hosting solutions  

More than half of IT decision makers confirmed that they expect to have moved to a fully cloud-based IT infrastructure within the next five years.

Tom Homer explains why not all service providers are created equal, and why businesses should look for a service provider that meets their infrastructure needs now and into the future.

Powered by Oracle's latest generation SPARC hardware platform, CloudSigma brings its infrastructure-as-a-service approach to Solaris

Businesses are realising the potential of big data but issues around technology skills and resourcing are holding them back from adopting big data in the timescale they would like.

Jeffrey Lyon looks at some of the ways to combat DDoS attacks on your cloud real estate

Businesses should dump their software licenses and their support contracts and move to the cloud, says Kevin Linsell

Figures from a new Citrix worldwide survey of Desktop-as-a-Service (DaaS) and other hosted mobile workspace providers show a move away from public cloud delivery towards more low-level delivery models.


New benchmarking studies reveal how businesses running NoSQL databases can get up to 60% more performance from any existing IT infrastructure with a few minor tweaks

This month sees the opening of the first UK SoftLayer data centre to supply infrastructure as a service solutions to the UK and European market

Bare metal servers are the hot topic at the moment when you need raw performance and complete flexibility. Adam Weissmuller looks at why businesses are considering infrastructure as a service when a bare metal approach can give five times the performance and twice the efficiency.

There are staggering failure rates across 'tech giant' IaaS implementations, and the support and functionality required for an effective solution just aren’t there yet

Figures show that IaaS customers are regularly over-specifying servers that are less than 50% utilised, and consequently are overpaying by around a billion pounds a year

A new API-driven server solution from Rackspace will deliver single tenant bare-metal servers that can be spun up as quickly as VMs with all the speed and flexibility of a dedicated hosted server.

Why businesses need a clear and thorough strategy if they want rewards and a return on investment rather than cloud chaos.

Businesses and service providers can now rapidly deliver cloud infrastructure with a new orchestration and virtualisation solution from Flexiant and Parallels

One challenge that IT decision makers face is choosing a cloud solution that can meet workload and performance requirements in the most cost-efficient manner. Even though industry hype suggests that enterprises should move everything into the cloud, the best solution is often a hybrid one.

Brinkster selects Flexiant Cloud Orchestrator to expand existing cloud business and open new revenue opportunities

Government is making good progress on its cloud-first policy, with strong take-up of Infrastructure as a Service; however, the levels of service supplied are significantly under-performing, with the majority of users disappointed with the results.

Tuesday, 18 March 2014 14:57

Putting business analytics in the Cloud

Nuno Godinho, Director of Cloud Services, Europe for Aditi Technologies discusses how cloud technologies can enable business analytics and big data.

Ever heard of data scientists?  Well, the Harvard Business Review named it the sexiest job of the 21st century, and it has quickly risen to prominence in a number of industries including retail, oil and gas, telecommunications and financial services.  So what has this got to do with cloud?  Should we all be packing up our RESTful APIs and retraining?  Not at all.  Before looking at what the cloud has to offer data scientists and the IT departments that work with them, let's have a quick dash through history and an explanation of the role.

Analysing data with computers has gone through a number of significant changes over the years, from looking at raw numbers by hand, to spreadsheets, Business Intelligence and, more recently, visualisations and complex real-time predictive analytics.  One thing that has been largely consistent throughout that period is that the data and the analysis stayed on-premise.  For a long time this was down to technical reasons, but those have now been overcome, with many vendors offering PaaS and SaaS versions of their tools and platforms.  That means most of the remaining barrier comes down to the commercial sensitivity of data going outside the firewall: will the service be reliable, safe from hackers, and adequately protected with encryption?

Extracting deep meaning from your cloud data

In terms of the data scientists themselves, in truth this role has been around for a long time.  Essentially it boils down to extracting deep meaning from data.  However, the key to doing it is understanding the data itself, not just the results: the raw data, its sources, formats and relationships, and then combining this with strategies to analyse the data from descriptive, prescriptive and predictive perspectives.  The data scientist has to have a mind that combines analytic, business and IT skills.  Previously, representatives from each of these departments might have worked together to generate the required analysis, normally leveraging some kind of data warehouse tool.

Traditional business intelligence reviews and presents historic data, whether that is two seconds or two years old.  For years analysts have been taking those same datasets and using them to build prescriptive models that describe the relationships between the data elements and how the numbers interact.  But the latest frontier is real-time predictive analytics: using those models to predict (for example) an action, value or preference.  Usually an event, such as an attempted credit card transaction, triggers a model to be run against the transaction details, to determine the chances that it is fraudulent, or to make recommendations for other products an online shopper might like.  In financial services, trading-floor systems make thousands of these predictions a second to assess stock movements.
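The event-triggered scoring described above can be sketched in a few lines of Python. Everything here is illustrative: the feature names, weights and bias merely stand in for a real pre-trained fraud model.

```python
import math

# Illustrative weights and bias, standing in for a pre-trained logistic
# fraud model; the features are invented for this sketch.
WEIGHTS = {"amount": 0.004, "distance_km": 0.01, "hour": -0.05}
BIAS = -3.0

def fraud_probability(transaction):
    """Score one transaction event against the (illustrative) model."""
    z = BIAS + sum(WEIGHTS[f] * transaction[f] for f in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))  # logistic link -> probability in (0, 1)

# An attempted card transaction triggers the model in real time.
event = {"amount": 950.0, "distance_km": 420.0, "hour": 3}
score = fraud_probability(event)
print(f"fraud probability: {score:.2f}")  # flag for review above some threshold
```

In production the same call would sit behind the transaction pipeline and run thousands of times a second, which is exactly where the latency questions discussed later come in.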

Using raw cloud compute for big data analytics

So where can the cloud fit into all this, and should it? The answer is, the cloud can be of benefit throughout the process of collecting and getting data to the point that it can be used in predictive analytics.  Firstly, it can be an aggregation point.  If your data sources are sensors distributed across an oil field, or even mobile such as truck geo-location data, the cloud can be the point where all those resources are brought together into a single data source for further processing.
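As a sketch of that aggregation point, assuming invented sensor ids and pressure values, the roll-up into a single data source might look like this:

```python
from collections import defaultdict
from statistics import mean

# Illustrative readings as they might arrive at a cloud aggregation point
# from sensors scattered across an oil field (ids and values invented).
readings = [
    {"sensor": "well-01", "pressure": 101.2},
    {"sensor": "well-01", "pressure": 101.9},
    {"sensor": "well-02", "pressure": 98.4},
]

def aggregate(records):
    """Roll the distributed feeds up into one per-sensor summary."""
    by_sensor = defaultdict(list)
    for record in records:
        by_sensor[record["sensor"]].append(record["pressure"])
    return {sensor: mean(values) for sensor, values in by_sensor.items()}

summary = aggregate(readings)
print(summary)  # one consolidated data source, ready for further processing
```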

Raw compute power from the cloud can also be used to process the big data associated with predictive analytics.  Creating models can be an intensive task, depending on the size of the data sets; if you don’t need to do this often, why make the capital investment when you can simply buy the machine time?

The cloud can also be used to enrich your data sets, by providing additional data sources for your models.  There are hundreds if not thousands of sources that can enhance your data, whether you need traffic data, government information, or simply temperature data.  These sources are validated, reliable and can substantially improve the quality of your models, whilst reducing the costs.
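At its simplest, that enrichment step is a join between your internal records and the external feed. The sales records and temperature values below are invented for illustration:

```python
# Invented internal records and an external, cloud-hosted temperature
# feed keyed by date; the join below is the whole enrichment step.
sales = [
    {"date": "2014-03-01", "units": 120},
    {"date": "2014-03-02", "units": 95},
]
external_temps = {"2014-03-01": 4.5, "2014-03-02": 11.0}

def enrich(records, temps):
    """Attach the external reading to each internal record."""
    return [{**record, "temp_c": temps.get(record["date"])} for record in records]

enriched = enrich(sales, external_temps)
print(enriched[0])
```

The enriched records then feed straight into model building, giving the model an extra explanatory variable at negligible cost.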

Does your cloud infrastructure match your needs?

The cloud can, of course, be responsible for the predictive analysis itself, and it is at this point more than any other that you have to consider how quickly you need the results, and whether speed and reliability demand that you have the infrastructure on-site.  For example, if you are making thousands of transactions a second that rely on predictive analytics and the internet connection to your cloud provider is lost, what happens?  You may be able to switch to a back-up line, but how long does that take and what is the impact?

Data scientists and IT departments alike should not ignore the role that the cloud can play in any analytics scenario.  That is not to say that it is right for all of them, but as we have explained above, there are a number of ways that cloud computing can play a part; it doesn’t have to be all or nothing.  It can enhance models, lower costs and give smaller companies access to intelligence that they would otherwise not be able to afford.  Basically, these kinds of activities and analysis require huge amounts of compute power and storage space, which is why the cloud is the perfect partner for big data.

So, to say ‘No’ outright is to deny yourself the possibility of improving or simplifying the way in which analytics is executed in your company.

About Nuno

Nuno Godinho @NunoGodinho is a Windows Azure MVP and Director of Cloud Services, Europe for Aditi Technologies.

He has been an MVP for the last six years, first as an MVP in ASP.NET and for the last three years as a Windows Azure MVP.  He is also a speaker at some of Microsoft's key events, such as TechEd North America, TechEd Europe, Tech Days Worldwide Online and TechDays Netherlands, and at community events such as GASP - Grupo de Arquitectura de Software Português, the Windows Azure UK User Group and Azure BE UG.  He is a prolific blogger and community creator.

Tuesday, 18 February 2014 11:50

Why Cloud 2.0 will be everything-as-a-service

David Grimes, Chief Technology Officer at NaviSite, explains what the next generation of cloud services will be and how it will differ from current infrastructure-as-a-service solutions.

The cloud computing market has matured significantly in the past two to three years, becoming almost synonymous with infrastructure-as-a-service (IaaS). But if the rapid adoption of IaaS was Cloud 1.0, what does the future hold for cloud beyond IaaS, and what services will we see being enabled by cloud in the future?

Primarily, cloud adoption was driven by cost. Not always in the sense that switching to cloud would reduce spend immediately, but the benefit of moving from a CAPEX to an OPEX model was, and continues to be, a compelling proposition for businesses. Other factors have also helped to shape today’s cloud industry. These include the maturation of virtualisation technologies, the ever-increasing capabilities of the underlying hardware platforms, and the expertise that service providers have developed in delivering robust, secure solutions using a shared infrastructure model. It is this attraction to ‘as-a-service’ delivery that will continue to fuel the next evolution in cloud services, or Cloud 2.0. Over the coming years we will see many existing offerings adapting to an as-a-service model, enabling greater levels of flexibility and operational efficiency.

Creating the virtualised desktop as a service

One emerging opportunity in the as-a-service suite is Desktop-as-a-Service (DaaS). Previously a model that had limited success in the market, it is now experiencing increasing demand. This trend can be explained by businesses’ move towards ‘bring your own device’ (BYOD) policies, which is itself a trend that has resulted from a combination of different elements. Firstly, the current generation of employees has expectations of personal choice when it comes to devices in the office. A company-supplied mobile phone simply doesn’t cut it for many workers. They want the freedom to choose their own, often more advanced, smartphones or tablets. Also, the trend towards a more global, distributed workforce means people need to work where they want, how they want. The flexible nature of Desktop-as-a-Service helps address many of these issues.

For many companies their intellectual property (IP) is the lifeblood of their business, so protecting access to it is critical. DaaS brings with it a host of information security benefits, and many companies now use it to help mitigate IP control concerns. Some of the businesses that benefit most from DaaS use large numbers of overseas contractors to work on specific projects for limited periods of time. The business wants to ensure that contractors have access to the source code but cannot copy it for use outside the organisation, onto a USB stick or printer for example. By using a hosted DaaS model for their contracted developers, they are able to provide the tools developers need and at the same time retain more control, restricting and remotely cutting off access as soon as needed. Other drivers of DaaS are the advances in protocols and networks that have made the hosted desktop experience more acceptable, and in some cases superior, to the traditional desktop.

For enterprises, DaaS represents an opportunity to satisfy their employees’ BYOD desires, replace CAPEX-oriented desktop refresh cycles with a more predictable OPEX model, reduce overall IT support costs, and more readily address security and compliance needs.

Storage is defining the cloud future

Another area that will form a significant part of the Cloud 2.0 as-a-service future is storage. Exponential growth in storage needs will be driven by regulatory and compliance requirements, an increase in mobile devices, and the unprecedented growth of unstructured data. It would be difficult to have a discussion about storage today without using the phrase ‘Big Data’. Almost everything we do creates data, and as a result storage requirements are growing rapidly. This rate of growth cannot be met using traditional procurement methods. Storage-as-a-service (STaaS), where storage can be provisioned via the cloud, enables companies to add capacity at the click of a mouse rather than the weeks and months it previously took to get new servers up and running. The clear business advantages in terms of speed of delivery and flexibility that STaaS provides mean that this area has the potential to grow significantly over the coming years.
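That ‘click of a mouse’ provisioning typically reduces to a single API call. The endpoint, path and payload below are entirely hypothetical; real STaaS providers each define their own API, and this only illustrates the shape of the request.

```python
import json
import urllib.request

# Hypothetical STaaS endpoint and payload -- purely for illustration.
def provision_volume(size_gb, region, api_base="https://api.example-staas.invalid"):
    """Build the provisioning request for a new storage volume."""
    payload = json.dumps({"size_gb": size_gb, "region": region}).encode()
    return urllib.request.Request(
        f"{api_base}/v1/volumes",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )  # the caller would urlopen() this, then poll until the volume is ready

req = provision_volume(500, "eu-west")
print(req.method, req.full_url)
```

Compare that one call with raising a purchase order, racking hardware and configuring arrays: the speed-of-delivery argument makes itself.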

The (r)evolution that is Cloud 2.0

Aside from DaaS and STaaS, there is huge opportunity to transform other traditional methods of service delivery into an as-a-service model. All of these services will embody the essential characteristics of cloud and will likely be delivered from the common platform that defined the original Cloud 1.0. For example, many applications will benefit, from both a cost-model and an operational-efficiency perspective, through database-as-a-service offerings. The new generation of NoSQL databases are much better suited to cloud applications, as they are designed to deal with multiple small requests and work in a distributed, horizontally scalable model: several databases spread across different geographies in the cloud. We could also soon see increased adoption of platform-as-a-service (PaaS), but that will be dependent on an evolution in applications, which is clearly underway but likely to be a little slower in the immediate future. Disaster recovery is yet another area where the move away from traditional hardware-based solutions will enable the infrastructure to provide additional capabilities. While any complete disaster recovery plan will include considerations at the application layer, the infrastructure itself can do more today than ever before.

In this context of Cloud 2.0, it is best to think of cloud as a set of guiding principles, as opposed to a term that defines a specific offering or capability. Our vision for Cloud 2.0 will see the evolutionary move from pure IaaS to a comprehensive suite of as-a-service offerings. To that end, the specific offerings identified here should be considered representative, not exhaustive. The market is continuing to accelerate, and we will likely see additional opportunities to apply the as-a-service mindset to legacy and new offerings.

About the author

David Grimes is the CTO of NaviSite, with responsibility for the overall technology vision and direction of the company, and leads its research and development. Grimes moved to NaviSite in 2002 through the acquisition of Clear Blue, where he was responsible for overseeing all internal and operational support systems. Prior to that, he was the lead software engineer at AppliedTheory.


Global research business Frost & Sullivan has named Interoute as European Infrastructure-as-a-Service (IaaS) Telecoms Provider of the Year


A new service from Hitachi Data Systems and Verizon Terremark will deliver a software-defined storage solution to provide secure, manageable file synchronisation and sharing on the new Verizon Cloud

Wednesday, 15 January 2014 09:57

Cloud security issues maturing


Security is still a concern when procuring cloud services, but as the technology matures, so do the attitudes of businesses towards it

Abiquo are making it easier to create fast, secure connections between private cloud, hosted virtual data centres, and public cloud infrastructures with the addition of CohesiveFT’s VNS3 Software Defined Networking into their infrastructure as a service product.


