
Hadoop makes the enterprise grade

An IDC survey commissioned by Red Hat indicates Hadoop is reaching critical mass in the business world.

According to IDC, 32 percent of those surveyed already deploy Hadoop; 31 percent will deploy it in the next 12 months and 36 percent indicated they would deploy it in the future.

And the use of Hadoop is not just for analysing big data.

IDC said that 39 percent of the respondents use NoSQL databases such as HBase, Cassandra and MongoDB, and 36 percent said they use MPP databases such as Greenplum and Vertica.

While businesses use Hadoop to analyse raw data, 39 percent of the respondents also use it for “if-then” modelling of products and services.
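For readers wondering what Hadoop-style analysis of raw data actually looks like, the core idea is MapReduce: a mapper emits key-value pairs from raw records and a reducer aggregates them by key. A minimal sketch in plain Python (no Hadoop cluster required; the records and names are invented for illustration):

```python
from collections import defaultdict

# Hypothetical raw records: (user, product) purchase pairs
records = [
    ("alice", "ice cream"), ("bob", "shorts"),
    ("alice", "shorts"), ("carol", "ice cream"),
]

def mapper(record):
    # Emit (product, 1) for every purchase, as a Hadoop mapper would
    user, product = record
    yield (product, 1)

def reducer(pairs):
    # Sum the counts for each product key
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

counts = reducer(pair for record in records for pair in mapper(record))
print(counts)  # {'ice cream': 2, 'shorts': 2}
```

On a real cluster the mapper and reducer run in parallel across many machines, which is the whole point; the pattern itself is this simple.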

The IDC survey also showed that many businesses use alternatives to HDFS, such as IBM’s General Parallel File System (GPFS), EMC’s Isilon OneFS, and Red Hat Storage, which is based on GlusterFS.

Big Data is a waste of space

Beancounters here and on the other side of the pond say that companies are struggling to cope with the enormous amount of data they are gathering. And that could cost them dear.

The Chartered Institute of Management Accountants (CIMA) and the American Institute of CPAs (AICPA) surveyed 2,000 finance professionals – including CEOs and chief financial officers – and found to their dismay that 86 percent of organisations struggle to make sense of the data they’re storing.

And nearly half (44 percent) of those surveyed said their organisations don’t have the right tools in place to understand the trends in their data. Only 53 percent are investing in technology to harvest the data and gain insight from it.

The main reasons organisations are struggling are that they can’t bring the data together; aren’t sure the data is of good quality; are unable to extract information from non-financial data; and have difficulty identifying trends and insights.

Nevertheless, according to Peter Simons, an analyst at CIMA, companies do understand that big data is a real business asset. He said as many as 93 percent of the respondents think finance has an important part to play in helping their organisations benefit from data.

Big Blue’s Big Data Lab reveals the Big Unknowns

Ever had the feeling there were things afoot that were unknown to you? You’re not alone. But fear not, for the good folk of IBM have pulled a Big Blue Rabbit out of the Big Data Hat for you.

The world’s favourite international business machines have opened a new lab called the Accelerated Discovery Lab. One of the most remarkable things about it might even be its name, which – unlike so much of what goes on in the technology sector – seems related to what it does.

The lab will offer “diverse data sources, unique research capabilities for analytics such as domain models, text analytics and natural language processing capabilities derived from Watson, a powerful hardware and software infrastructure, and broad domain expertise including biology, medicine, finance, weather modeling, mathematics, computer science and information technology,” said IBM, presumably just before it passed out. Don’t forget to breathe, dear.

By letting organisations mix their own data with these vast and disparate data sources, the Accelerated Discovery Lab aims to identify hitherto unknown relationships in the data.

That could mean finding seasonal patterns in purchasing behaviour that go beyond the obvious fact that people buy ice cream and shorts in summer. Or it could be combining social media insights with psychology data in an attempt to create meaningful customer profiling. Or it might be finding statistically robust segmentation that takes you further than ‘our target market is men in the 35-50 age bracket.’
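At toy scale, that sort of pattern-hunting is just aggregation: count events along some dimension and look for the peaks. A minimal sketch in Python, with invented data and names, of finding each product’s peak sales month:

```python
from collections import Counter

# Hypothetical (month, product) purchase records -- invented for illustration
purchases = [
    (6, "ice cream"), (7, "ice cream"), (7, "ice cream"),
    (12, "scarf"), (1, "scarf"), (1, "scarf"), (6, "shorts"),
]

# Count purchases per (product, month) to expose seasonal peaks
by_product_month = Counter((product, month) for month, product in purchases)

def peak_month(product):
    """Return the month in which this product sells most."""
    months = {m: c for (p, m), c in by_product_month.items() if p == product}
    return max(months, key=months.get)

print(peak_month("ice cream"))  # 7
print(peak_month("scarf"))      # 1
```

The hard part at real scale is not the arithmetic but doing this across billions of records and thousands of candidate dimensions, which is where platforms like the one IBM describes come in.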

At the moment, analysing Big Data can mean relying on a fairly manual approach to the massive amounts of data, gathered from a broad variety of channels. Whether you’re a business or a researcher, this is a testing and expensive process with precious little in the way of meaning or value waiting for you at the end.

There is obvious appeal in being able to accelerate that process.

“If we think about Big Data today, we mostly use it to find answers and correlations to ideas that are already known. Increasingly what we need to do is figure out ways to find things that aren’t known within that data,” said Jeff Welser, Director, Strategy and Program Development, IBM Research Accelerated Discovery Lab.

And to think how people laughed at Donald Rumsfeld when he said something not too dissimilar.

Big Data is underused

Companies aren’t using Big Data for their own competitive advantage.

That’s according to Stibo Systems, which surveyed 200 “senior business decision makers” responsible for IT.

And despite companies having zillions of bytes of data, 34 percent said they didn’t know what their firm does with the information. Another 15 percent said their organisations keep too much data.

But there’s another problem. Stibo’s survey revealed that pan-European businesses have trouble managing their data. Some get third parties to look after it while some don’t make any use of it at all.

Companies managing their own data centrally can use it better than firms which have “silos” all over the place.

Simon Walker, a director at Stibo, said: “With so much of a company’s data being used for marketing purposes to inform financial decisions, it begs the question of why it’s largely owned by IT departments and not those departments that are using it. There is a large number of enterprises being left behind in big data adoption simply due to the lack of effective data management processes.”

Bull flies red rag with fast data analytics

Bull Information Systems has put together a new big data analytics tool called “bullion fast data analytics”, designed to look at data from the digital economy in real time.

It has been built using Pivotal-based technologies in combination with Bull’s bullion servers.

Bull points out that this year there are roughly three zettabytes of data floating around – 400 gigabytes for everyone on the planet – a figure set to rise to as much as 40 zettabytes by 2020. So it’s very useful for organisations to be able to sift through this data and extract relevant information, whether that means managing crises or building customer loyalty. Of course, we have all heard about “big data” this year.

Fast data analytics is, Bull asserts, the “first platform to integrate new data fabrics, modern programming frameworks, cloud portability and support for legacy systems”. The architecture has been designed on top of Pivotal Greenplum Database and Gemfire, and the company promises the end product makes analytics less complex, shifting the focus from software tinkering to applying the actual information.

The company says its technology is highly flexible and can “significantly” reduce total cost of ownership, and that it has been validated with Pivotal and VMware at Bull’s R&D labs. It runs in a virtualised environment and promises lower latency and cost savings.

VP of Bull’s enterprise service business, Jacqueline Moussa, said the company offers a “unified and robust platform”.

“Organisations can take advantage of lower implementation and operations costs and quick real-time analysis of the huge amounts of data being produced each hour,” Moussa said.

IBM buys into Irish Big Data

American behemoth IBM said it has bought Dublin-based company the Now Factory, but didn’t say how much it paid.

The privately held firm makes analytics software targeted at communication service providers (CSPs).

IBM said that the software complements its own IBM MobileFirst range of products, which are intended to help organisations analyse mobile device usage.

The Now Factory’s stuff analyses large quantities of business and network data, allowing better management of outages.

IBM claims that people create 2.5 quintillion bytes of data a day, whether it’s healthcare, social media, climatic information or the rest.

Big Blue wants to position itself as the leader in the Big Data field. The buy will be completed in the fourth quarter, and the company borged into the IBM mother ship.

FusionExperience gets SAP silver status

SAP has awarded business and tech company FusionExperience silver partner status, granting it access to the usual resources, services and benefits to help it build a customer base through SAP.

The company will gain access to a wider range of software for clients, and FusionExperience believes the partnership will expand the portfolio it can offer them.

FusionExperience has brought SAP Sybase IQ and SAP HANA into its own Big Data and Visualisation offerings, which the company promises extends the range of services it can offer to clients.

FusionExperience’s CEO, Steve Edkins, said that now that the company is both a VAR and service partner with SAP, it can act as a single supply source for all of SAP’s database and technology kit, including advice, training, implementation, support and licensing.

“The benefits for our clients are the access to a single source of experts that are constantly kept abreast of the latest SAP developments and reduction in total cost of ownership for SAP technology,” Edkins said.

FusionExperience said it plans to take an active role in supporting, customising, and deploying SAP systems.

SAP wants VARs to cash in on big data

SAP is telling its partners that it is time to cash in on big data. The company estimates that its global partner base could earn up to $220 billion by selling its big data and analytics products.

So it sees a huge opportunity for partners and resellers, who could provide more services and products in addition to SAP software.

A recent IDC report revealed that SAP partners could be in for a lot of growth over the next five years. IDC’s Worldwide Ecosystem Analytics and Big Data: Growth Opportunities for SAP Partners found that EMEA partners could earn $70 billion by 2018, dabbling in big data and analytics. Asia Pacific and Japan should climb to $40 billion, while North America will lead the way with $102 billion.

One of the more curious factoids from the report claims that the digital landscape will grow more than 30 thousand percent between 2005 and 2020, from 130 exabytes to 40,000 exabytes. It’s not called big data for nothing.

“SAP and its partners make a significant impact on the global economy,” said Darren Bibby, vice president for IDC Channels and Alliances Research. “SAP does an excellent job delivering great products for partners to work with, as well as effective sales, marketing and training resources. The result is that the SAP ecosystem is well-positioned for the future and customers will benefit from these additional skills and resources.”

Interestingly, the IDC report concluded that 68 percent of the companies don’t have a business intelligence or analytics strategy, while a whopping 63 percent don’t even know what big data is. However, 69 percent said they are looking for staff who can handle analytics.

As it grows, the industry will change. IDC believes 90 percent of industry growth will come through third-platform technologies: cloud, mobile and social.

EU firms complacent on data risk

Businesses, overwhelmed by an ever-increasing surge of data to deal with, are in danger of becoming complacent about data loss.

A survey from Iron Mountain and PwC found an increasing awareness of information risk, but many SMEs just don’t have the tools in place to deal with reams of data in multiple formats. Security threats are also growing more sophisticated, and information management needs to be treated as essential to the business.

Under half of the businesses surveyed in the 2013 Risk Maturity Index said they had a strategy in place for measuring and combating information risk – even as the average number of data breaches increases by 50 percent each year.

Of those asked, over half were so overwhelmed by the threat of data breaches that they acknowledged they’ll never be able to keep up, while 41 percent said data loss is an “inevitable part of daily business”.

The index evaluated 600 European SMEs with between 250 and 2,500 employees across the legal, financial, pharma, insurance, manufacturing and engineering sectors, and found some improvement on last year in the understanding of information risk. Using a set of metrics based on the data protection measures in place, it rates companies against a target score of 100. This year European companies scored an average of 56.8, up from 40.6 last year, but clearly there is a long way to go.

PwC risk partner Claire Reid said that businesses will have to embrace a “new way of thinking” – one where data security is a top priority and also a way to create value.

Internet of Everything becomes something

According to a report from Cisco, the latest buzzword on the world wide wibble, the Internet of Everything, will become a major market earner by the end of the year.

The Internet of Everything is the networked connection of people, process, data and things so that “everything” joins the network.

Cloud computing is one of the early examples of the Internet of Things along with the boom in the mobility market.

According to the Internet of Everything Value Index study released by Cisco, global private-sector businesses will generate at least $613 billion from it this year.

Companies that optimise the connections among people, process, data and things will generate the largest profits, the report said.

Rob Lloyd, Cisco President of Development and Sales, said that the study of 7,500 global business and IT leaders in 12 countries reports that the United States, China and Germany will earn the most.

They will be chasing the promise of nearly doubling their profits by adopting business practices, customer approaches and technologies that use Internet of Everything ideas.

He said that the Internet of Everything is already driving private-sector corporate profits, and that an additional $544 billion could be realised if companies adjusted their strategies.

“The Internet of Everything has the potential to significantly reshape our economy and transform key industries. The question is who will come out on top and win in this new economy. This study shows us that success won’t be based on geography or company size but on who can adapt fastest,” Lloyd said.

SmartThings CTO Jeff Hagins said that the study confirms the potential for the Internet of Everything.
“With the SmartThings platform and open community, we believe that more developers and inventors will be able to participate in the value chain and ultimately bring the physical graph to life,” he said.

Global businesses can pursue as much as $14.4 trillion over the next decade by using the Internet of Everything to improve operations and customer service.


HP expands its big data products

HP has expanded its big data product portfolio so that partners can tailor products that let clients squeeze more out of their business information.

There is a lot of money in these sorts of products. According to HP research, nearly 60 percent of companies surveyed will spend at least 10 percent of their innovation budget on big data this year.

However, the study also found that one in three organisations have failed with a big data initiative and are wary of getting their fingers burnt again.

HP thinks its newly enhanced portfolio delivers big data out of the box, enabling enterprises to handle the growing volume, variety, velocity and vulnerability of data that can cause these initiatives to fail.
The new product range is based around HAVEn, a big data analytics platform that combines HP’s analytics software, hardware and services.

George Kadifa, executive vice president of Software, said that big data enables organisations to take advantage of the totality of their information, both internal and external, in real time.

It produces extremely fast decision making, resulting in unique and innovative ways to serve customers and society.

HAVEn combines proven technologies from HP Autonomy, HP Vertica, HP ArcSight and HP Operations Management, as well as key industry initiatives such as Hadoop.

It avoids vendor lock-in with an open architecture that supports a broad range of analytics tools, and protects investments with support for multiple virtualisation technologies.

HAVEn uses all the information collected – structured, semi-structured and unstructured data – via HP’s portfolio of more than 700 connectors.

It means that organisations can consume, manage and analyse massive streams of IT operational data from a variety of HP products, including HP ArcSight Logger and the HP Business Service Management portfolio, as well as third-party sources.

In addition, HP announced its Vertica Community Edition. This is free, downloadable software that delivers the same functionality as the HP Vertica Analytics Platform Enterprise Edition, with no commitments or time limits. Clients can analyse up to a terabyte of data before spending more cash on an enterprise-wide solution.

There is also the HP Autonomy Legacy Data Cleanup, an information governance package. According to HP, this helps clients analyse legacy data, lower costs and reduce risks while squeezing value from big data.

Enterprise software driven by Cloud, Big Data

A report from IDC said the market for enterprise software worldwide showed conservative growth during 2012.

It estimated that the worldwide software market grew 3.6 percent year on year – half the growth rate of 2010 and 2011.

However, some market segments grew by between six and seven percent, including data access, analysis, CRM applications, security software and collaborative software.

IDC said that the management of information for competitive purposes is pushing along applications associated with Big Data and analytics.

From the vendor standpoint, Microsoft led the applications primary market in 2012 with a 13.7 percent market share, followed by SAP, Oracle, IBM and Adobe. Of these vendors, IBM showed the highest growth rate.

System infrastructure software made up 27 percent of total software revenues, but grew only 3.3 percent during 2012 compared to the previous year.

Actian completes Pervasive Software deal

Actian has signed on the dotted line and completed its purchase of Pervasive Software.

The big data management company has said that having the cloud-based and on-premises data business on board means it will be able to deliver a portfolio of highly scalable, elastic and performant products that drive positive business outcomes in the Age of Data.

Steve Shine, chief executive officer of Actian, said that big data could and would impact every industry, with organisations struggling to take action on their data because legacy technology is too rigid or expensive to scale.

Robin Bloor, chief analyst and co-founder, The Bloor Group, described the merger as “powerful”. He said the combination of the technologies would provide Actian with a performance capability for BI and Data Analytics which no other company could “currently equal”.

HP chucks Moonshine at non-x86 SECCs P.I.E

HP has announced the latest in Project Moonshine, which CEO Meg Whitman said in a web conference should mark a shift in the way servers handle data. It may also be a shift away from x86.

If nothing is done to address core infrastructure problems, Whitman said, infrastructure could be something that actually holds back the development of the web instead of enabling it. “It’s not just about cellphones and tablets connected to the internet but millions of sensors collecting data,” she said, machines talking to machines, and generating not petabytes but brontobytes of data.

Project Moonshine, Whitman promised, would not be jailhouse toilet booze but a “multiyear” and “multi-phased” programme to shape the future of data centres – as the current path we’re on is “not sustainable from a space, energy and cost perspective”. Drawing on years of HP Labs research, Whitman said, Moonshine will help create “the foundation for the next 20 billion devices”.

In a webcast, HP’s Dave Donatelli mentioned the proof of concept for Moonshine, which was unveiled in 2011; since then HP has roped in 50 beta customers to thoroughly develop and test its various iterations. Now HP has given the world the second-generation Moonshine servers, which it claims are based on the concept of the ‘software defined server’ – that is, designed specifically with internet-scale workloads in mind, and for the software that needs to run on them.

Donatelli said the servers address Space, Energy, Cost and Complexity (SECC). By which he means there’s less of all of the above.

The Moonshot 1500 enclosure, Donatelli points out, can hold 45 Moonshot servers; compared to a traditional ProLiant server, it uses up to 80 percent less energy, takes up 80 percent less space, and is 77 percent cheaper. Customers, then, will be able to build better revenues from a smaller footprint for less cash. These servers run on the Intel Atom S1200, though partners like AMD, Applied Micro, Texas Instruments and Calxeda are all bringing in new chipsets – which HP hopes will provoke market competition and more innovation.

Targeting big data, high performance computing, gaming, financial services, facial recognition, video analysis and other stuff, Donatelli promised that the portfolio of servers will grow – and at a quicker rate, thanks to competition between its partners, as it’s not tied to an 18 to 24 month chip cycle.

Partners will be able, and encouraged, to join the Pathfinder Innovation Ecosystem, or P.I.E., which includes operating system developers and software vendors.

Donatelli said this announcement is not an “incremental change” but a “new class of servers designed for the data centre”.

When asked if these will replace x86 servers, an HP spokesperson said that PCs were once the high-volume product; today the things people buy in high volume are smartphones and tablets. The transition from Unix to x86 took time, and HP believes a transition from x86 to Moonshot will take time too. “X86 will be here for a very long time, but Moonshot will be here for a long time,” the spokesperson said.

Analyst Patrick Moorhead said that the developments are positive because the servers of today aren’t ready for the explosion in data driven by future trends such as the all-singing, all-dancing, totally connected internet of things.

The first Moonshot server is shipping today in the US and Canada and will be available to channel partners around the world next month.

Big Blue launches Customer Experience Lab

IBM has launched an initiative aimed at helping its staff improve their interaction with customers and other staff. The IBM Customer Experience Lab is supposed to help both IBM and other businesses by allowing them to gather more feedback from social networks, the target audience and the workforce.

Mahmoud Naghshineh, IBM vice president of services research, believes emerging technologies, including social media and mobile tech, are changing the way organisations get feedback from their customers. With that in mind, there is a need to tap them as soon as possible.

“Today, businesses have a completely different way of engaging customers,” he said. “There are all these new ways of reaching out to people [but] you need to know when the right time is to engage.”

The new lab will give clients access to IBM researchers and consultants, who will deliver systems that learn from and personalise the experiences of individual customers, identify patterns and preferences, create context from Big Data, and drive scale economics. The whole idea is similar to IBM’s Services Lab, launched a couple of years ago.

IBM says the new customer-oriented lab will focus its efforts on helping customers gain more insight into their user base. IBM will use machine learning and visual analytics to predict differences between individual customers, thereby customising services to a much greater extent.

The lab will be staffed by more than a hundred IBM researchers from across the world and it will also offer clients a number of workshops for generating new ideas. Although the lab will be headquartered in New York, it will feature researchers from twelve IBM labs around the globe, from Africa, Brazil, Israel, India and Japan, to the United States.