Tag: big data

Actian completes Pervasive Software deal

Actian has signed on the dotted line and completed its purchase of Pervasive Software.

The big data management company has said that having the cloud-based and on-premises data business on board means it will be able to deliver a portfolio of highly scalable, elastic and performant products that drive positive business outcomes in the Age of Data.

Steve Shine, chief executive officer of Actian, said that big data could and would impact every industry, as organisations struggled to take action on their data because of legacy technology that is too rigid or expensive to scale.

Robin Bloor, chief analyst and co-founder, The Bloor Group, described the merger as “powerful”. He said the combination of the technologies would provide Actian with a performance capability for BI and Data Analytics which no other company could “currently equal”.

HP chucks Moonshine at non-x86 SECCs P.I.E

HP has announced the latest in Project Moonshot, which CEO Meg Whitman said in a web conference represents a shift in the way servers handle data. It may also be a shift away from x86.

If nothing is done to address core infrastructure problems, Whitman said, infrastructure could become something that actually holds back the development of the web instead of enabling it. “It’s not just about cellphones and tablets connected to the internet but millions of sensors collecting data,” she said, with machines talking to machines and generating not petabytes but brontobytes of data.

Project Moonshot, Whitman promised, would not be jailhouse toilet booze but a “multiyear” and “multi-phased” programme to shape the future of data centres – as the current path we’re on is “not sustainable from a space, energy and cost perspective”. Using years of HP Labs research, Whitman said, Moonshot will help create “the foundation for the next 20 billion devices”.

In a webcast, HP’s Dave Donatelli mentioned the proof of concept for Moonshot which was unveiled in 2011; since then HP has roped in 50 beta customers to thoroughly develop and test its various iterations. Now HP has given the world the second-generation Moonshot servers, which it claims are based on the concept of the ‘software defined server’ – that is, built specifically with internet-scale workloads in mind and designed for the software that needs to run on them.

Donatelli said the servers address Space, Energy, Cost, and Complexity (SECC). By which he means there’s less of all of the above.

The Moonshot 1500 enclosure, Donatelli points out, can hold 45 Moonshot servers, and compared to a traditional ProLiant server it uses up to 80 percent less energy, takes up 80 percent less space, and is 77 percent cheaper. Customers, then, will be able to build better revenues from a smaller footprint for less cash. These servers run on the Intel Atom S1200, though partners like AMD, Applied Micro, Texas Instruments and Calxeda are all bringing in new chipsets – which HP hopes will provoke market competition and more innovation.
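
If the claimed figures hold, the savings are easy to put into numbers. Here is a back-of-envelope sketch in Python; the baseline per-server cost and power draw are invented placeholders for illustration, not HP figures:

```python
# Back-of-envelope sketch of HP's claimed Moonshot savings versus a
# traditional ProLiant: up to 80 percent less energy, 80 percent less
# space, 77 percent cheaper. The two baseline figures below are
# invented placeholders, not HP numbers.
BASELINE_COST_USD = 1500.0   # assumed cost of one traditional server
BASELINE_WATTS = 250.0       # assumed power draw of one traditional server

SERVERS_PER_ENCLOSURE = 45   # a Moonshot 1500 enclosure holds 45 servers


def moonshot_estimate(n_servers):
    """Estimate enclosures, cost and power for n_servers under the claims."""
    enclosures = -(-n_servers // SERVERS_PER_ENCLOSURE)  # ceiling division
    cost = n_servers * BASELINE_COST_USD * (1 - 0.77)    # 77 percent cheaper
    watts = n_servers * BASELINE_WATTS * (1 - 0.80)      # 80 percent less energy
    return enclosures, cost, watts


enclosures, cost, watts = moonshot_estimate(450)
print(enclosures, round(cost, 2), round(watts, 2))
```

On those assumptions, 450 servers fit in ten enclosures at roughly a quarter of the baseline cost and a fifth of the baseline power.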

Targeting big data, high performance computing, gaming, financial services, facial recognition, video analysis and other stuff, Donatelli promised that the portfolio of servers will grow – and at a quicker rate, thanks to competition between its partners, as HP is not tied to an 18 to 24 month chip cycle.

Partners, including operating system developers and software vendors, will be able, and encouraged, to join the Pathfinder Innovation Ecosystem, or P.I.E.

Donatelli said this announcement is not an “incremental change” but a “new class of servers designed for the data centre”.

When asked whether these will replace x86 servers, an HP spokesperson said that PCs were once the high volume product, whereas today the things people buy in high volume are smartphones and tablets. The transition from Unix to x86 took time, and HP believes a transition from x86 to Moonshot will take time too. “x86 will be here for a very long time, but Moonshot will be here for a long time,” the spokesperson said.

Analyst Patrick Moorhead said the developments are positive because the servers of today aren’t ready for the explosion in data driven by future trends such as the all-singing, all-dancing, totally connected internet of things.

The first Moonshot server is shipping today in the US and Canada, and will be available to channel partners around the world next month.

Big Blue launches Customer Experience Lab

IBM has launched an initiative aimed at helping its staff improve their interaction with customers and other staff. The IBM Customer Experience Lab is supposed to help both IBM and other businesses by allowing them to gather more feedback from social networks, the target audience and the workforce.

Mahmoud Naghshineh, IBM vice president of services research, believes emerging technologies, including social media and mobile tech, are changing the way organisations get feedback from their customers. With that in mind, there is a need to tap into them as soon as possible.

“Today, businesses have a completely different way of engaging customers,” he said. “There are all these new ways of reaching out to people [but] you need to know when the right time is to engage.”

The new lab will give clients access to IBM researchers and consultants, who will deliver systems that learn and personalise the experience of each individual customer, identify patterns and preferences, create context from Big Data, and drive scale economics. The whole idea is similar to IBM’s Services Lab, launched a couple of years ago.

IBM says the new customer-oriented lab will focus its efforts on helping customers obtain more insight into their user base. IBM will use machine learning and visual analytics to predict differences between individual customers, thereby customising services to a much greater extent.

The lab will be staffed by more than a hundred IBM researchers from across the world and it will also offer clients a number of workshops for generating new ideas. Although the lab will be headquartered in New York, it will feature researchers from twelve IBM labs around the globe, from Africa, Brazil, Israel, India and Japan, to the United States.

VMware needs luck as it sticks its head in the clouds

VMware has given up waiting for its partners to help it become an important name in the cloud space and has decided to do it itself.

Yesterday the outfit unveiled vCloud Hybrid Service to investors. Well, we say unveiled; really it told the world that it was intending to set up a public cloud service. But it caught everyone on the hop, because it was only a couple of months ago that VMware’s Pat Gelsinger sounded so dead set against the public cloud.

Speaking at VMware’s Partner Exchange Conference in Las Vegas, Gelsinger said that VMware needed to own the corporate workload, and that the company would lose if those workloads ended up in commodity public clouds.

With comments like that, suddenly coming out and launching your own public cloud seems a little silly. However, what Gelsinger appeared to be saying was that he did not want corporate data on other people’s public clouds.

“We want to extend our franchise from the private cloud into the public cloud and uniquely enable our customers with the benefits of both. Own the corporate workload now and forever.”

But Gelsinger’s plans might be a little tricky to pull off.

When it comes to the public cloud there is a lot of top notch competition, including Amazon, IBM and HP, which don’t take too kindly to strangers in the market. To make matters worse, VMware’s offering will not be around until at least the second quarter.

VMware has chucked a bit of money at getting the idea off the ground. Former Savvis Cloud president Bill Fathers will run the vCloud business, and has said it will get a level of investment appropriate to that priority, to capitalise on a $14 billion market opportunity.

One of the crucial differences about what VMware is offering is that the service is “hybrid”, so enterprises should see it as part of VMware’s existing packages. The software vCloud is based on is called Director. It provides an IaaS environment and lets workloads be managed the same way whether they are in the cloud or in the office.

But all this is being set up because VMware could not interest its partners in building something similar. VMware had a crack at offering similar products through its ISP partners, but these were a little spooked that a vCloud implementation would commoditise their products. There were mutterings from ISPs who did not want to pay VMware licensing costs when they had cheaper open source alternatives.

VMware has a job on its hands to prove to VMware Certified Professionals that the public cloud is an extension of the data centre while at the same time convincing them that there are some advantages over the “non-cloud” environments they use now.

The public cloud will be aimed at its existing customer base and sold through its existing VAR and SI channel.

However, most of VMware’s channel partners don’t have the skills to help their I&O clients transition from static virtualisation to cloud. So somehow VMware is going to have to give its channel the consulting skills and hope they can bluster their way through conversations where real cloud expertise is needed.

Either way the company has a long way to go before it can sit comfortably among other cloud players. It might just pull it off, but it will take a bit of time and a lot of luck.

Differentia expands into big data with Actian deal

QlikTech partner Differentia Consulting has signed a reseller agreement with Actian which will give it access to Big Data markets.

Actian wants the software under the bonnet of its enterprise big data analytics offering, which it dubs Actian Vectorwise.

Actian was particularly interested in using QlikView Direct Discovery functionality with Vectorwise.

Differentia provides consulting, solutions, resourcing, support and training services to its clients. It is also a key QlikView provider for Europe.

There are some mutual benefits to the partnership. Differentia has Qlikview clients who need to analyse and report on bigger, more complex data sets.

Adrian Parker, vice-president of strategy and marketing at Differentia, said that QlikView could not scale up to handle big data pressure. In high data volume scenarios, customers had to build and manage numerous linked QlikView documents supported by QVD files. This was inflexible and limited the data analysis.

Parker said that QlikView with Direct Discovery uses the high performance of the Vectorwise database to calculate aggregates on the fly over large datasets. This simplifies the process.
Parker sees Vectorwise as the enabler of Big Data access, as it gives him an analytic database product for situations where large data volumes made deploying QlikView impossible.
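
The pattern Parker describes, leaving the detail rows in the analytic database and fetching only aggregates on demand, can be sketched roughly like this. Vectorwise itself isn’t shown; Python’s built-in sqlite3 stands in for the database, and the table and column names are invented for illustration:

```python
import sqlite3

# Stand-in for an analytic database such as Vectorwise; the sales table
# and its columns are invented example data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("EMEA", 120.0), ("EMEA", 80.0), ("APAC", 50.0)],
)

# Instead of pulling every detail row into the BI tool (the old linked-
# documents approach), push the aggregation down to the database and
# fetch only the summarised result on the fly.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('APAC', 50.0), ('EMEA', 200.0)]
```

The point of the pattern is that only the small aggregated result set crosses the wire, so the dataset behind the query can grow far beyond what the front-end tool could hold in memory.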