Tag: big data

Governments need to go cloud busting

A report from Gartner said that by 2017 public cloud offerings will account for over 25 percent of government business services, not counting defence and security.

But CIOs need to get themselves into the debate on public cloud sourcing, and kick off sourcing strategies with politicos.

By 2017, predicted Gartner, 35 percent of government shared service organisations will be managed by private sector companies. Public private partnerships are already embracing infrastructure as a service but governments will move to integration and software as a service.

And, Gartner predicts, by 2017 more than 60 percent of government open data programmes that do not use open data internally will be discontinued.

And if you’ve a job in government software development, mind your back, because at least 25 percent of such jobs will be axed while governments hire data analysts from outside. Data analysis is now a high priority.

Where big data is headed

As predicted here, now is the time for predictions for 2014. And the latest seer to gaze into his dark, John Dee-style speculum is Mario Faria of ServiceSource.

Faria says many companies experimented with big data during 2013 and tried pilots. The crunch will come when these pilots start to show a return on investment.

And next year, he says, the phrase big data will yield to analytics, because it’s not about collecting data but about capturing, analysing and acting in the here and now to stay competitive.

Faria’s job title is Chief Data Officer, and he explains that this function is the janitor of data but also an evangelist, and will engage in 2014 with the board of directors. No doubt the board of directors will engage with chief data officers too.

Unlike the Harris poll we published earlier, which shows a degree of insouciance about wearable stuff, Faria thinks that Fitbit and Nike devices and Google Glass will be part of our everyday life.

But ultimately, everything is down to the quality of data, because if it’s not reliable, no amount of analysis will deliver results. He says that data quality is a money maker.

IBM takes plastic quantum route

Big Blue said it has made a breakthrough that could lead to ultra-fast optical switches suitable for future big data computer systems.

The company said its scientists have demonstrated a quantum mechanical process known as Bose-Einstein Condensation (BEC).

It uses a luminescent polymer similar to the light emitting displays in smartphones of today.

The phenomenon demonstrated by the scientists is named after Satyendra Nath Bose and Albert Einstein, who predicted it in the mid-1920s, said IBM. A BEC is a state of matter in which a dilute gas of particles (bosons) is cooled to close to absolute zero (-273°C).

However, IBM has achieved the same state at room temperature in a 35 nanometre thin plastic film, with bosonic particles created through the interaction of the polymer material and light. The phenomenon lasts for only a few picoseconds, but the IBM scientists think that is long enough to create a source of laser-like light or an optical switch.

Hadoop makes the enterprise grade

An IDC survey commissioned by Red Hat indicates Hadoop is reaching critical mass in the business world.

According to IDC, 32 percent of those surveyed already deploy Hadoop; 31 percent will deploy it in the next 12 months and 36 percent indicated they would deploy it in the future.

And the use of Hadoop is not just for analysing big data.

IDC said that 39 percent of the respondents use NoSQL databases such as HBase, Cassandra and MongoDB, and 36 percent said they use MPP databases such as Greenplum and Vertica.

While businesses use Hadoop for analysis of raw data, 39 percent of the respondents use it for “if-then” modelling of products and services.
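The survey doesn’t spell out what that “if-then” modelling looks like in practice, but as a rough illustration, a rule of this kind applied to raw usage records might be sketched like this (the customer records, field names and threshold are all invented for the example):

```python
# A hypothetical "if-then" rule over raw usage data: if a customer's
# monthly usage exceeds a threshold, flag them for an upgrade offer.
records = [
    {"customer": "A", "monthly_gb": 120},
    {"customer": "B", "monthly_gb": 35},
    {"customer": "C", "monthly_gb": 480},
]

UPGRADE_THRESHOLD_GB = 100  # invented threshold for illustration

def upgrade_candidates(records, threshold=UPGRADE_THRESHOLD_GB):
    """Return customers whose usage triggers the if-then rule."""
    return [r["customer"] for r in records if r["monthly_gb"] > threshold]

print(upgrade_candidates(records))  # ['A', 'C']
```

In a Hadoop deployment the same rule would typically run as a map step over far larger record sets, but the logic is no more exotic than this.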

The IDC survey also showed that many businesses use alternatives to HDFS, such as Big Blue’s General Parallel File System (GPFS), EMC’s Isilon OneFS and Red Hat Storage, which is based on GlusterFS.

Big Data is a waste of space

Beancounters here and on the other side of the pond say that companies are struggling to cope with the enormous amount of data they are gathering. And that could cost them dear.

The Chartered Institute of Management Accountants (CIMA) and the American Institute of CPAs (AICPA) surveyed 2,000 finance professionals – including CEOs and chief financial officers – and found to their dismay that 86 percent of organisations struggle to make sense of the data they’re storing.

And nearly half (44 percent) of those surveyed said their organisations don’t have the right tools in place to understand the data. Only 53 percent are investing in tech to harvest the data and gain insight into it.

The main reasons organisations are struggling are that they can’t bring the data together, aren’t sure if the data is of good quality, are unable to get information from non-financial data, and have trouble identifying trends and insights.

Nevertheless, according to Peter Simons, an analyst at CIMA, companies do understand that big data is a real business asset. He said as many as 93 percent of the respondents think finance has an important part to play in helping their organisations benefit from data.

Big Blue’s Big Data Lab reveals the Big Unknowns

Ever had the feeling there were things afoot that were unknown to you? You’re not alone. But fear not, for the good folk of IBM have pulled a Big Blue Rabbit out of the Big Data Hat for you.

The world’s favourite international business machines have opened a new lab called the Accelerated Discovery Lab. One of the most remarkable things about it might even be its name, which – unlike so much of what goes on in the technology sector – seems related to what it does.

The lab will offer “diverse data sources, unique research capabilities for analytics such as domain models, text analytics and natural language processing capabilities derived from Watson, a powerful hardware and software infrastructure, and broad domain expertise including biology, medicine, finance, weather modeling, mathematics, computer science and information technology,” said IBM, presumably just before it passed out. Don’t forget to breathe, dear.

By making it possible for organisations to take their data and mix it with these vast and disparate data sources, the Accelerated Discovery Lab will make it possible to start to identify hitherto unknown relationships among the data.

That could be finding seasonal patterns in purchasing behaviour that go beyond the obvious, such as people buying ice cream and shorts in summer. Or it could be combining social media insights with psychology data in an attempt to create meaningful customer profiling. Or it might be finding statistically robust segmentation that takes you further than ‘our target market is men in the 35-50 age bracket.’

At the moment, analysing Big Data can mean relying on a fairly manual approach to the massive amounts of data, gathered from a broad variety of channels. Whether you’re a business or a researcher, this is a testing and expensive process with precious little in the way of meaning or value waiting for you at the end.

There is obvious appeal in being able to accelerate that process.

“If we think about Big Data today, we mostly use it to find answers and correlations to ideas that are already known. Increasingly what we need to do is figure out ways to find things that aren’t known within that data,” said Jeff Welser, Director, Strategy and Program Development, IBM Research Accelerated Discovery Lab.

And to think how people laughed at Donald Rumsfeld when he said something not too dissimilar.

Big Data is underused

Companies aren’t using Big Data for their own competitive advantage.

That’s according to a survey by Stibo Systems, which surveyed 200 “senior business decision makers” looking after IT.

And despite companies having zillions of bytes of data, 34 percent said they didn’t know what their firm does with the information. Another 15 percent said their organisations keep too much data.

But there’s another problem. Stibo’s survey revealed that pan-European businesses have trouble managing their data. Some get third parties to look after it while some don’t make any use of it at all.

Companies managing their own data centrally can use it better than firms which have “siloes” all over the place.

Simon Walker, a director at Stibo, said: “With so much of a company’s data being used for marketing purposes to inform financial decisions, it begs the question of why it’s largely owned by IT departments and not those departments that are using it. There is a large number of enterprises being left behind in big data adoption simply due to the lack of effective data management processes.”

Bull flies red rag with fast data analytics

Bull Information Systems has put together a new big data analytics tool called “bullion fast data analytics”, designed to look at data from the digital economy in real time.

It has been built using Pivotal-based technologies in combination with Bull’s bullion servers.

Bull points out that this year there are roughly 3 zettabytes of data floating around, or 400 gigabytes for everyone on the planet, and this figure is set to rise to as much as 40 zettabytes by 2020. So it’s very useful for organisations to be able to sift through this data and extract relevant information, whether that is for managing crises or building customer loyalty. Of course, we have all heard about “big data” this year.
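Bull’s per-person figure is easy to sanity-check. The sketch below assumes decimal units and a 2013 world population of around 7.1 billion, neither of which Bull states:

```python
# Back-of-the-envelope check of Bull's figures, assuming decimal units
# (1 ZB = 1e21 bytes, 1 GB = 1e9 bytes) and a 2013 world population
# of roughly 7.1 billion -- both assumptions, not Bull's stated inputs.
total_bytes = 3e21        # 3 zettabytes
population = 7.1e9        # approximate world population, 2013
per_person_gb = total_bytes / population / 1e9
print(round(per_person_gb), "GB per person")  # close to Bull's 400 GB figure
```

The answer lands a shade over 400 GB, so the company’s round number holds up.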

Fast data analytics is, Bull asserts, the “first platform to integrate new data fabrics, modern programming frameworks, cloud portability and support for legacy systems”. The architecture has been designed on top of Pivotal Greenplum Database and GemFire, and the company promises the end product makes analytics less complex, shifting the focus from software tinkering to applying the actual information.

The company says its technology is highly flexible and can “significantly” reduce total cost of ownership, and that it has been validated with Pivotal and VMware at Bull’s R&D labs. It runs in a virtualised environment and promises lower latency and cost savings.

VP of Bull’s enterprise service business, Jacqueline Moussa, said the company offers a “unified and robust platform”.

“Organisations can take advantage of lower implementation and operations costs and quick real-time analysis of the huge amounts of data being produced each hour,” Moussa said.

IBM buys into Irish Big Data

American behemoth IBM said it has bought Dublin-based company The Now Factory, but didn’t say how much it paid for it.

The privately held firm makes analytics software targeted at communication service providers (CSPs).

IBM said that the software complements its own IBM MobileFirst range of products, which are intended to help organisations analyse mobile device usage.

The Now Factory’s stuff analyses large quantities of business and network data, which provides for better management of outages.

IBM claims that people create 2.5 quintillion bytes of data a day, whether it’s healthcare, social media, climatic information or the rest.

Big Blue wants to position itself as the leader in the Big Data field. The buy will be completed in the fourth quarter, when the company will be borged into the IBM mothership.

FusionExperience gets SAP silver status

SAP has awarded business and tech company FusionExperience silver partner status, granting it access to the usual resources, services and benefits to help it build a customer base through SAP.

The company will gain access to a wider range of software for clients, and FusionExperience believes the partnership will build on the portfolio it can offer to clients.

FusionExperience has brought SAP Sybase IQ and SAP Hana into its own Big Data and Visualisation offerings, which the company promises extends the range of services it can offer to clients.

FusionExperience’s CEO, Steve Edkins, said that now the company is both a VAR and service partner with SAP, his company can act as a single supply source for all of SAP’s database and technology kit, and that includes advice, training, implementation, support, and providing licences.

“The benefits for our clients are the access to a single source of experts that are constantly kept abreast of the latest SAP developments and reduction in total cost of ownership for SAP technology,” Edkins said.

FusionExperience said it plans to take an active role in supporting, customising, and deploying SAP systems.

SAP wants VARs to cash in on big data

SAP is telling its partners that it is time to cash in on big data. The company estimates that its global partner base will earn up to $220 billion by selling its big data and analytics products.

So it sees a huge opportunity for partners and resellers, who could provide more services and products in addition to SAP software.

A recent IDC report revealed that SAP partners could be in for a lot of growth over the next five years. IDC’s Worldwide Ecosystem Analytics and Big Data: Growth Opportunities for SAP Partners found that EMEA partners could earn $70 billion by 2018, dabbling in big data and analytics. Asia Pacific and Japan should climb to $40 billion, while North America will lead the way with $102 billion.

One of the more curious factoids from the report claims that the digital landscape will grow more than 30 thousand percent between 2005 and 2020, from 130 exabytes to 40,000 exabytes. It’s not called big data for nothing.
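The factoid checks out against IDC’s own exabyte estimates, as a quick calculation shows:

```python
# Checking IDC's growth claim: 130 exabytes in 2005 to 40,000 exabytes in 2020.
start_eb = 130
end_eb = 40_000
growth_pct = (end_eb - start_eb) / start_eb * 100
print(round(growth_pct), "percent growth")  # a little over 30,000 percent
```

That is roughly a 300-fold increase over the 15 years, consistent with the report’s “more than 30 thousand percent”.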

“SAP and its partners make a significant impact on the global economy,” said Darren Bibby, vice president for IDC Channels and Alliances Research. “SAP does an excellent job delivering great products for partners to work with, as well as effective sales, marketing and training resources. The result is that the SAP ecosystem is well-positioned for the future and customers will benefit from these additional skills and resources.”

Interestingly, the IDC report concluded that 68 percent of the companies don’t have a business intelligence or analytics strategy, while a whopping 63 percent don’t even know what big data is. However, 69 percent said they are looking for staff who can handle analytics.

As it grows, the industry will change. IDC believes 90 percent of industry growth will come through third-platform technologies: cloud, mobile and social.

EU firms complacent on data risk

Businesses, overwhelmed by an ever increasing surge of data to deal with, are in danger of becoming complacent about data loss.

A survey from Iron Mountain and PwC has determined that awareness of information risk is increasing, but many SMEs just don’t have the tools in place to deal with the reams of data they hold in multiple formats. They also face increasingly sophisticated security threats, and need to treat information management as essential to business.

Under half of the businesses surveyed in the 2013 Risk Maturity Index said they had a strategy in place for measuring and combating information risk – even as the average number of data breaches increases by 50 percent each year.

Of those asked, over half were so overwhelmed by the threat of data breaches that they acknowledged they’ll never be able to keep up, while 41 percent said data loss is an “inevitable part of daily business”.

The index evaluated 600 European SMEs with between 250 and 2,500 employees, across the legal, financial, pharma, insurance, manufacturing and engineering sectors, and found some improvement compared to last year in understanding information risk. Using a set of metrics based on the amount of data protection in place, it rates companies against a target score of 100. This year European companies scored an average of 56.8, compared to 40.6 last year, but clearly there is a long way to go.

PwC Risk partner Claire Reid said that businesses will have to embrace a “new way of thinking” – one where data security will be a top priority and also a way to create value.

Internet of Everything becomes something

According to a report from Cisco, the latest buzzword on the world wide wibble, the Internet of Everything, will become a major market earner by the end of the year.

The Internet of Everything is the networked connection of people, process, data and things so that “everything” joins the network.

Cloud computing is one of the early examples of the Internet of Everything, along with the boom in the mobility market.

According to the Internet of Everything Value Index study released by Cisco, global private-sector businesses will generate at least $613 billion from it this year.

Companies that optimise the connections among people, process, data and things will generate the largest profits, the report said.

Rob Lloyd, Cisco President of Development and Sales, said that the study of 7,500 global business and IT leaders in 12 countries reports that the United States, China and Germany will earn the most.

They will be chasing the promise of nearly doubling their profits by adopting business practices, customer approaches and technologies that use Internet of Everything ideas.

He said that while the Internet of Everything is already driving private-sector corporate profits, it is estimated that an additional $544 billion could be realised if companies adjusted their strategies.

“The Internet of Everything has the potential to significantly reshape our economy and transform key industries. The question is who will come out on top and win in this new economy. This study shows us that success won’t be based on geography or company size but on who can adapt fastest,” Lloyd said.

SmartThings CTO Jeff Hagins said that the study confirms the potential for the Internet of Everything.
“With the SmartThings platform and open community, we believe that more developers and inventors will be able to participate in the value chain and ultimately bring the physical graph to life,” he said.

Global businesses can pursue as much as $14.4 trillion over the next decade by using the Internet of Everything to improve operations and customer service.

 

HP expands its big data products

HP has expanded its big data product portfolio so that partners can tailor products that let clients squeeze more out of their business information.

There is a lot of money in these sorts of products. According to HP research, nearly 60 percent of companies surveyed will spend at least 10 percent of their innovation budget on big data this year.

However the study also found that one in three organisations have failed with a big data initiative and are wary of getting their fingers burnt again.

HP thinks its new enhanced portfolio delivers big data out of the box so that it can enable enterprises to handle the growing volume, variety, velocity and vulnerability of data that can cause these initiatives to fail.
The new product range is based around HAVEn, a big data analytics platform that uses HP’s analytics software, hardware and services.

George Kadifa, executive vice president of Software, said that big data enables organisations to take advantage of the totality of their information – both internal and external – in real time.

It produces extremely fast decision making, resulting in unique and innovative ways to serve customers and society.

HAVEn combines proven technologies from HP Autonomy, HP Vertica, HP ArcSight and HP Operations Management, as well as key industry initiatives such as Hadoop.

It avoids vendor lock-in with an open architecture that supports a broad range of analytics tools, and protects investments with support for multiple virtualisation technologies.

HAVEn ingests all collected information, including structured, semi-structured and unstructured data, via HP’s portfolio of more than 700 connectors.

It means that organisations can consume, manage and analyse massive streams of IT operational data from a variety of HP products, including HP ArcSight Logger and the HP Business Service Management portfolio, as well as third-party sources.

In addition to this, HP announced its Vertica Community Edition. This is free, downloadable software that delivers the same functionality as the HP Vertica Analytics Platform Enterprise Edition with no commitments or time limits. Clients can analyse up to a terabyte of data before spending more cash on an enterprise-wide solution.

There is also the HP Autonomy Legacy Data Cleanup information governance package. According to HP, this helps clients analyse legacy data, lower costs and reduce risks while squeezing value from big data.

Enterprise software driven by Cloud, Big Data

A report from IDC said the market for enterprise software worldwide showed conservative growth during 2012.

It estimated that the worldwide software market grew 3.6 percent year on year – half the growth rate of 2010 and 2011.

However, some market segments grew by between six and seven percent, including data access, analysis, CRM applications, security software and collaborative software.

IDC said that the management of information for competitive purposes is pushing along applications associated with Big Data and analytics.

From the vendor standpoint, Microsoft led the applications primary market in 2012 with a 13.7 percent market share, followed by SAP, Oracle, IBM and Adobe. Of these vendors, IBM showed the highest growth rate.

System infrastructure software made up 27 percent of total software revenues but that only grew 3.3 percent during 2012, compared to the previous year.