Big Blue suits fight over redundancy

Suits in Biggish Blue’s Systems Middleware division are fighting over the right to flee the company and collect a nice redundancy.

Some 110 people want to be paid to leave the company, way more than the ten per cent of the division’s 736-strong workforce that IBM wanted to shed.

IBM has said that if too many people applied for redundancy then it would choose from the list of volunteers.

The voluntary redundancy process is “coming to an end” and some applicants will be offered redundancy. But the sheer number of people who want out looks bad for IBM. It shows staff no longer have much confidence in the company and would rather take the money and run.

IBM has also brought in spending and travel restrictions to manage costs, and it is investigating property portfolio projects aimed at reducing overall occupancy costs across IBM UK.
IBM staffers asking for redundancy will leave on 5 April, and compulsory lay-offs are not expected – at least, not by employees.

Big Blue has restructured internal divisions and placed a big bet on cloud systems. It is also cutting costs by reducing its worldwide headcount, following eleven straight quarters of revenue decline.
IBM said it would take a $600m restructuring charge to expunge several thousand people this year, although the number of leavers depends on their seniority and pay scale.

 

Google’s Nearline could melt Glacier

Google is offering a new kind of data storage service which should go a long way to melting Amazon’s Glacier.

Nearline is for non-essential data, similar to Glacier, but Google is offering it at a cent per gigabyte per month. That is less than half the price of the cheapest rival offering, Microsoft’s 2.4 cents a gigabyte.

Glacier storage has a retrieval time of several hours, while Nearline data will be available in about three seconds.

While three seconds is an eternity for something like serving a web page, it is fine for data analysis as well as long-term storage.

This could be Google’s cunning plan – positioning itself as the cloud computing company for all kinds of data analysis.

Tom Kershaw, director of product management for the Google Cloud Platform, said that it is not about storage, stupid; it is about what you do with analytics. Set-ups like Nearline mean you never have to delete anything, and you can always use your data.

Google announced plans with several storage providers, including Veritas/Symantec and NetApp, to encrypt and transport data from their systems onto Nearline.

On the consumer front, Dropbox charges about $10 a month to store a terabyte of data, which works out at the same price as Nearline and Glacier. However, those businesses count on most of their customers storing well below their limit.
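
For the arithmetically inclined, those prices are easy to sanity check. A quick back-of-the-envelope sketch in Python, using only the figures quoted above:

```python
# Back-of-the-envelope check on the storage prices quoted above.
# All figures are quoted list prices in cents per gigabyte per month.
prices = {
    "Google Nearline": 1.0,
    "Amazon Glacier": 1.0,                    # roughly a cent, per the comparison above
    "Microsoft (previous cheapest)": 2.4,
    "Dropbox consumer ($10/TB)": 10.0 * 100 / 1000,   # $10 for 1,000 GB = 1 cent/GB
}

for name, cents in prices.items():
    dollars_per_tb = cents / 100 * 1000       # what a terabyte costs per month
    print(f"{name}: {cents:.1f}c/GB, ${dollars_per_tb:.2f}/TB per month")
```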

Either way, it looks like things are hotting up in the cloud, with costs being driven down. Scattered showers must be expected.

 

 

HP creates cloud server line

The maker of expensive printer ink, HP, said that it is creating a server family for cloud providers.

The project is a joint venture with Foxconn, a partnership announced last year to create cloud-optimised servers. HP has been selling Foxconn-built servers for a year, but is now giving its server line a name: Cloudline.

According to HP, the systems are built on open standards and use rack-scale computing.

With rack-scale systems, functions that were previously located in the server, such as cooling and power, may be part of the rack. The systems will likely be deployed in multi-vendor environments, although users want uniformity in controls.

HP will use the Intelligent Platform Management Interface (IPMI), an open management standard, and other systems that help provide a uniform way of managing hardware.
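
Because IPMI is an open standard, the same management calls work whoever built the box. A rough illustration, assuming the stock open-source ipmitool client and a hypothetical host and credentials:

```python
# Minimal sketch: querying IPMI-managed servers with the standard
# open-source ipmitool client. Host and credentials here are hypothetical.
import subprocess

def ipmi(host, user, password, *command):
    """Run an ipmitool command against a remote BMC over the lanplus interface."""
    result = subprocess.run(
        ["ipmitool", "-I", "lanplus", "-H", host, "-U", user, "-P", password,
         *command],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

# The same calls work regardless of which vendor built the rack hardware.
print(ipmi("10.0.0.42", "admin", "secret", "chassis", "power", "status"))
print(ipmi("10.0.0.42", "admin", "secret", "sdr", "list"))
```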

The hyperscale x86 server market has been growing fast, and this has led to increasing numbers of original design manufacturers (ODMs), such as Taiwan’s Quanta, entering the game.

HP is announcing these OpenStack systems at the Open Compute Summit and will begin taking orders on some of the systems at the end of this month. The systems use Intel Xeon E5 v3 processors and come in five configurations, including a two-socket (2P) server sled configuration and 1U configurations. No word on pricing yet.

Intel and Huawei snuggle up

Intel and Huawei Technologies are getting closer even as their rival governments fall out over trade barriers.

According to Huawei, the pair are getting closer and will share technology and adopt Huawei branding behind the bamboo curtain to make Intel products more palatable to local buyers and the Chinese government.

The technology involved focuses on the cloud, with the pair working on a project to create new servers, a data centre, software and cyber security for a global cloud-computing network.

China’s government has been openly pushing for the use of more Chinese and less foreign-made technology, both to grow its own tech sector and as a response to Edward Snowden’s leaks about widespread US cyber surveillance.

Intel and Huawei have collaborated previously, including a server and cloud product team-up in 2012 and an agreement to cooperate on data storage last April.

Although the announcement is mostly China-focused, it is likely that the Intel side of the deal will result in products seen worldwide, with Intel taking the lead in nations where Huawei is not trusted and Huawei stepping forward in countries worried about US surveillance.

Open saucy Microsoft puts Azure on Ubuntu

Microsoft has released a version of its Azure HDInsight hosted service that runs on Linux.

Microsoft showed off a preview of Azure HDInsight running on Ubuntu, and Canonical, maker of the open saucy gear, claims this is recognition that Ubuntu is great for running Big Data solutions.

For those who came in late, Azure HDInsight is Microsoft’s Apache Hadoop-based service in the Azure cloud. It is designed to make it easy for customers to analyse petabytes of all types of data with fast, cost-effective scale on demand, and it offers programming extensions so developers can use their favourite languages.

The big idea is that people who already run Hadoop on Linux on-premises, for example on the Hortonworks Data Platform, can keep using common Linux tools, documentation and templates, and can now extend their deployments to Azure with hybrid cloud connections.
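
That portability is easier to see with an example. The classic Hadoop Streaming word count is just two small scripts reading stdin and writing stdout, with nothing Azure- or HDInsight-specific about them (a minimal sketch, not Microsoft’s code):

```python
# mapper.py - a plain Hadoop Streaming mapper: emit (word, 1) for every word.
import sys

for line in sys.stdin:
    for word in line.split():
        print(f"{word}\t1")
```

```python
# reducer.py - sum the counts; Hadoop Streaming delivers keys already sorted.
import sys

current, count = None, 0
for line in sys.stdin:
    word, n = line.rsplit("\t", 1)
    if word != current:
        if current is not None:
            print(f"{current}\t{count}")
        current, count = word, 0
    count += int(n)
if current is not None:
    print(f"{current}\t{count}")
```

The same pair runs under the stock hadoop-streaming jar whether the cluster is an on-premises Hortonworks install or an Ubuntu-based HDInsight one.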

It is not all one-way traffic. Canonical has Juju, a cloud orchestration tool that is the result of years of effort to optimise Big Data workloads on Ubuntu, and Azure will effectively gain access to it.

Hitachi Data Systems buys oXya

The IT division of Hitachi said it is to buy oXya.

The company is a provider of services for cloud and SAP products. The acquisition will be completed by the end of March, and the company, with its 500 employees, will become a wholly owned subsidiary of Hitachi Data Systems (HDS).

HDS did not say how much it paid for the company, which has over 200,000 people using its SAP services.

HDS said it will now be able to offer an extended portfolio of cloud and managed services to its customers, and the acquisition will help it deliver and manage large environments.

The reason for the acquisition, according to VP Hicham Abdessamad, was that customers demand “as a service” options that let them keep up with the fast pace of cloud-based systems.

He said oXya offered an expanded set of application-as-a-service offerings for both hybrid and public clouds.

 

Insurance industry drags feet on big data


The insurance industry is in danger of falling behind other industries because it is not interested in the latest digital technology.

Reuters reported that while some insurers are using developments such as telematics and social media sources to increase the amount of information they have about customers, reducing claims and making insurance cheaper for all, most are Luddite laggards.

Famously, we will probably all need “black boxes” in our cars so that we can be rewarded with lower insurance premiums if we drive carefully.

But apparently when it comes to Big Data, insurance companies are saying a big “no.”

This is because the insurance industry is still locked in the early 20th century, when pen and paper were mightier even than the typewriter.

Staff at Lloyd’s, home to more than 90 trading syndicates in London’s financial district, still trundle suitcases of claim forms for complex insurance transactions.

Lloyd’s Chief Executive Inga Beale has said the industry needs to take technology on board to maintain its role in global business. The firm recently appointed a Chief Data Officer and Beale said the sector needs to attract new, tech-savvy talent.

Part of the difficulty is that there is a mass of different systems out there, and the fact that firms are often swallowed up by bigger insurers makes it hard to streamline technology.

Firms might like the idea of technology, but cannot be bothered spending because they are having trouble balancing their books with bond yields at record lows.

This is despite the fact that a report from Morgan Stanley and Boston Consulting Group says the first movers will clean up.

They say a full transformation to becoming a digital company could cut an insurer’s combined ratio by 21 percentage points, in other words making the firm more profitable. Expenses could fall by 10 percent of premiums and claims by 8 percent.
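
For those who do not speak actuary, the combined ratio is claims plus expenses as a percentage of premium income; anything under 100 percent means underwriting turns a profit. A rough worked example using the report’s reductions (the starting cost split is hypothetical):

```python
# Combined ratio = (claims + expenses) / premiums; under 100% = underwriting profit.
# The starting split is hypothetical; the reductions are the report's figures.
premiums = 100.0    # index premium income to 100
claims   = 70.0     # hypothetical claims cost
expenses = 30.0     # hypothetical expense load, so the ratio starts at 100%

before = (claims + expenses) / premiums * 100
after  = ((claims - 8.0) + (expenses - 10.0)) / premiums * 100   # report's savings

print(before, after)   # 100.0 -> 82.0: those two lines alone deliver 18 of the 21 points
```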

 

Personal storage market was flat last year

Some 75.7 million personal and entry-level storage products shipped in 2014, which means the market was essentially flat.

IDC estimated that annual shipment values fell 1.5 percent compared to 2013, with a value of $6.6 billion.

Personal storage suffered from competition from public cloud providers, and people started using online streaming more, said IDC.

The entry-level market is largely dominated by vendors that don’t make hard drives, but their share fell by as much as 17.6 percent compared to the year before.

USB continues to be the interface of choice for personal storage, while Ethernet is preferred for the entry-level market. Thunderbolt-based devices fell by 5.7 percent in the fourth quarter of 2014, the first time they had shown a decline.

Devices with over four terabytes of storage now account for a third of all shipments in the quarter.

Microsoft offers start-ups Azure credits

Microsoft has launched a package to lure start-ups and SMEs to its Azure platform by offering them $500,000 in Azure credits.

The deal, announced by partner Y Combinator, is only available to Y Combinator-backed companies and will be offered to the 2015 Winter and future batches.

It seems that Microsoft is following Google, AWS and IBM which already offer incentives for start-ups to join them.

Microsoft is giving Y Combinator start-ups a three-year Office 365 subscription, access to Microsoft developer staff, and one year of free CloudFlare and DataStax enterprise services.

It is starting to look like Microsoft is getting more aggressive in its competition with Amazon Web Services and Google, both of whom already offer credits and freebies.

Amazon offers $25,000 in AWS credits and other freebies, while Google offers $100,000 in Google platform credits and IBM offers $120,000 in credit for SoftLayer infrastructure or its Bluemix PaaS.

Writing in his company’s bog, Sam Altman said that this brings the total value of special offers extended to each YC company to well over $1,000,000. “The relentless nagging from partners to grow faster we throw in for free,” he said.

It is likely that the YC deal is the first of many which will be rolled out worldwide to Microsoft’s partners.

 

AT&T gets out of its cloud

While everyone seems to be rushing to get on the cloud, AT&T is downsizing its data centre operations.

The telco is apparently selling some of its data centres worth about $2 billion as it continues its streak of asset sales.

Apparently, AT&T is keen to get its debt load down and pay off its credit card bill from last Christmas. It is all a rumour, of course, and the story is based on leaks to The Wall Street Journal.

Part of AT&T’s debt problems came because it had to bid high prices for spectrum.  The company said it had spent close to half of the total bids in the record-setting $44.9 billion spectrum sale that concluded last week.

AT&T bagged 251 licences in the  AWS-3 spectrum auction worth $18.2 billion. The company has also been investing to expand its footprint in Mexico to grow its business, as the US wireless market reaches saturation. It said last month it would buy bankrupt NII Holdings wireless business in Mexico for $1.875 billion.

Big Data analytics are not up to snuff

Companies relying on Big Data analytics might be disappointed to discover that they are not so good at finding a needle in a haystack after all.

Currently the best way to sort large databases of unstructured text is a technique called latent Dirichlet allocation (LDA), a modelling technique that identifies text within documents as belonging to a limited number of still-unknown topics.

According to analysis published in the American Physical Society’s journal Physical Review X, LDA had become one of the most common ways to accomplish the computationally difficult problem of classifying specific parts of human language automatically into a context-appropriate category.

According to Luis Amaral, a physicist whose specialty is the mathematical analysis of complex systems and who wrote the paper, LDA is inaccurate.

The team tested LDA-based analysis with repeated analyses of the same set of unstructured data – 23,000 scientific papers and 1.2 million Wikipedia articles written in several different languages.

Not only was LDA inaccurate, its analyses were inconsistent, returning the same results only 80 percent of the time even when using the same data and the same analytic configuration.
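
The instability is easy to reproduce in miniature. A minimal sketch using scikit-learn’s stock LDA implementation rather than the paper’s setup: fit the same model twice on identical data, changing only the random seed, and see whether the documents group the same way.

```python
# Fit LDA twice on identical data with identical settings except the random
# seed, then compare the dominant topic per document. Topic labels are
# arbitrary, so what matters is whether the grouping of documents matches.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "stars orbit the galaxy core",
    "the galaxy contains billions of stars",
    "insurance premiums rose again this year",
    "claims and premiums drive insurer profits",
]
X = CountVectorizer(stop_words="english").fit_transform(docs)

def dominant_topics(seed):
    lda = LatentDirichletAllocation(n_components=2, random_state=seed)
    return lda.fit_transform(X).argmax(axis=1)

print(dominant_topics(0))
print(dominant_topics(1))   # on real corpora the groupings frequently disagree
```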

Amaral said that accuracy of 90 percent with 80 percent consistency sounds good, but the scores are “actually poor, since they are for an easy case.”

For the larger bases of data that big data is often praised for its ability to manage, the results would be far less accurate and far less reproducible, according to the paper.

The team created an alternative method called TopicMapping, which first breaks words down into bases (treating “stars” and “star” as the same word), then eliminates conjunctions, pronouns and other “stop words” that modify the meaning but not the topic, using a standardized list.
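
Those first two steps look roughly like this (a sketch using NLTK’s stock stemmer and stop-word list, not the authors’ actual TopicMapping code):

```python
# Sketch of the preprocessing described above: stem words to a common base,
# then drop stop words from a standardised list. Illustrative only.
from nltk.corpus import stopwords      # needs a one-off nltk.download("stopwords")
from nltk.stem import PorterStemmer

stem = PorterStemmer().stem
stop = set(stopwords.words("english"))

def preprocess(text):
    return [stem(w) for w in text.lower().split() if w not in stop]

print(preprocess("The stars and the star cluster"))
# ['star', 'star', 'cluster'] - "stars" and "star" now count as the same word
```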

This approach delivered results that were 92 percent accurate and 98 percent reproducible, though, according to the paper, it only moderately improved the likelihood that any given result would be accurate.

The paper’s point was not that LDA should be replaced with TopicMapping, but that the topic-analysis method which has become one of the most commonly used in big data analysis is far less accurate and far less consistent than previously believed.

 

SAP wants its software on the cloud

SAP, the maker of expensive, esoteric business software which no one really understands, wants to deliver its products via the cloud.

This means you can be completely baffled by the product, without having to store it on your local servers.

SAP chief financial officer Luka Mucic told the Euro am Sonntag business weekly that cloud contracts become profitable over time and, in the long term, can definitely become more profitable than classic licence sales.

SAP said last week its push to deliver cloud-based products via the internet would “dampen profitability” until at least 2018, even if it attempts to blow dry its profitability with a hair-dryer or makes it stand in the sun for a few hours.

This is because, unlike the packaged software SAP has been selling for decades, for which clients pay an immediate licence fee, cloud-based software is generally paid for by subscription over time, while most of the costs for the software provider are upfront.

Mucic said contracts were loss-making for their first year of operation.

SAP agreed in September to buy cloud-based travel and expenses software maker Concur for $7.3 billion in cash, its biggest takeover ever, but about what you can expect to pay for a single SAP business consultant.

Mucic said SAP might add another, smaller tranche of financing, perhaps as soon as the first half of this year, but added that otherwise the company had no need for further capital. He did not say why SAP needed the money.

Worldwide IT spending still to grow

IT spending worldwide will reach $3.8 trillion in 2015 – that’s up 2.4 percent from last year.
But market intelligence company Gartner has warned that its earlier prediction of 3.9 percent will be affected by the rise in the price of the US dollar as well as conservative sentiment about services and devices.
But Gartner research VP John-David Lovelock sought to play down the reduction.  He said it “is less dramatic than it might at first seem.  The rising US dollar is chiefly responsible for the change.  Stripping out the impact of exchange rate movements, the corresponding growth figure is 3.7 percent.”
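
The currency arithmetic is simple enough to sketch using only Gartner’s own figures:

```python
# Gartner's two growth figures for 2015 IT spending: 3.7 percent with
# exchange rates held constant, 2.4 percent once non-dollar spending is
# translated into a stronger dollar.
forecast_2015 = 3.8e12        # dollars

nominal = 0.024               # reported growth, in dollars
constant_currency = 0.037     # growth with exchange-rate moves stripped out

drag = constant_currency - nominal
print(f"FX drag: {drag:.1%} of growth, about ${forecast_2015 * drag / 1e9:.0f}bn")
# -> FX drag: 1.3% of growth, about $49bn
```
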
Gartner breaks down the spending by categories as follows:

Datacentre systems will be worth $143 billion in 2015, while enterprise software will total $335 billion.
There will be a price war over cloud price-per-seat during 2015, with drops of as much as 25 percent right through until 2018. Vendors are discounting cloud offerings heavily, said Gartner.

 

Only 10 percent of cloud apps are secure

New research has found that only one in ten cloud apps is secure enough for enterprise use.

According to a report from cloud experts Netskope, organisations are employing an average of over 600 business cloud apps, despite the majority of the software posing a high risk of data leaks.

More than 15 percent of logins for business cloud apps used by organisations had been breached by hackers.

One in five businesses in the Netskope cloud actively used more than 1,000 cloud apps, and over eight percent of files in corporate-sanctioned cloud storage apps were in violation of data loss prevention (DLP) policies covering source code and other confidential and sensitive data.

A quarter of all files are shared with one or more people outside of the organisation, and of external users with links to shared content, almost 12 percent have access to 100 or more files.

Netskope CEO Sanjay Beri said that 2014 left an indelible mark on security – between ongoing high-profile breaches and the onslaught of vulnerabilities like Shellshock and Heartbleed, CSOs and CISOs had more on their plate than ever.

“These events underscore the sobering reality that many in the workforce have been impacted by data breaches and will subsequently use compromised accounts in their work lives, putting sensitive information at risk,” he added.

The research also found that the most insecure apps were primarily linked with marketing, finance and human resource software, while cloud storage, social and IT/app management programmes had the lowest proportion of insecure apps.

“Employees today have shifted from thinking of apps as a nice-to-have to a must-have, and CISOs must continue to adapt to that trend to secure their sensitive corporate and customer data across all cloud apps, including those unsanctioned by IT,” Beri continued.

Google Drive, Facebook, YouTube, Twitter and Gmail were among the apps investigated.

How Snowden put the brakes on Amazon’s cloud

While the industry is telling the world+dog that 2015 is the year of the cloud, one has to wonder what it would have been like if Edward Snowden had not revealed high-level snooping on off-site data centres.

This year Taser discovered some of the problems first hand. It won a high-profile contract to supply body cameras to the London police, but the deal nearly collapsed because the video footage would be stored on Amazon’s cloud.

The deal survived only after Taser dropped Amazon.com, which did not have a data centre in Britain. The UK coppers did not want their data going overseas where it could be snooped upon by the US.

Larger companies are getting worried about relying too heavily on Amazon’s public cloud servers, preferring to store data on their own premises or work with cloud providers that can offer them the option of dedicated servers.

It has opened the door for Microsoft, which has flogged the private cloud over the public one and offered companies more direct oversight of their data in the cloud.

Steve Herrod, the former chief technology officer of VMware and now a venture capitalist at General Catalyst Partners, said Edward Snowden did more to create a future with many clouds in many locations than any tech company has managed.

A web of new laws restricting how data can move across national borders creates another hurdle for Amazon and has led to calls for it to build more localised clouds.

SAP has ruled out working with Amazon on many upcoming projects due partly to data-location issues.

Amazon insists that demand for AWS, including in Europe and Asia, has never been stronger, and that any contracts lost to rivals are the extreme exception. It said that it will build data centres in every large country over time, but that will cost a bomb.

However, it is having to face the fact that the model it pioneered in 2006 is slowing down because it is US-centric – at least for now.

AWS has five times the computing capacity of its next 14 rivals combined, including Microsoft, Google and IBM, according to Gartner, and analysts are predicting that AWS revenue will more than double from 2014 levels to $10.5 billion in 2017, growing faster than the market overall.

But Synergy Research Group said that it could have been a lot different. AWS held a 27 percent market share in the third quarter of 2014, compared to 10 percent for Microsoft’s Azure cloud business. Azure, however, grew 136 percent on a rolling annualised basis in the quarter, while AWS grew 56 percent.
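
To see why those growth rates matter more than the raw shares, compound them forward for a few years (purely illustrative arithmetic which assumes, implausibly, that both rates hold steady):

```python
# Compound the quoted annualised growth rates to show how fast a 136 percent
# grower closes on a 56 percent grower. Real growth rates decay, so treat
# this as arithmetic, not a forecast.
aws, azure = 27.0, 10.0       # Q3 2014 market shares, used as relative sizes
for year in range(1, 5):
    aws *= 1.56               # AWS: 56 percent annualised growth
    azure *= 2.36             # Azure: 136 percent annualised growth
    print(f"Year {year}: AWS is {aws / azure:.1f}x Azure's size")
# At these rates the gap shrinks by about a third every year.
```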

Part of the reason Azure did so well is that Microsoft is willing to work with third-party data centre managers, such as Fujitsu, when clients are required to keep data within a country’s borders.

 

Vole is helping companies add cloud capabilities to their existing data centres and create a “hybrid” model that Amazon has only just started to offer.

Six months ago, Barclays chose Azure over AWS to power some development and testing work because of its private-cloud option, along with Barclays’ existing familiarity with Microsoft’s data-centre software.

Vole has the advantage that it knows a few people in corporate and government circles and is using them to peddle Azure. AWS has only just started to build such ties.

It would have been different if it had not been for Snowden making those corporates and governments very nervous about allowing their data out of their sight.