Tag: big data

Big Data demand increasing

IDC expects spending on big data and business analytics (BDA) to increase 10.1 percent over 2020.

The beancounters reckon that, as the global economy picks up, spending on BDA will reach $215.7 billion this year, an increase of 10.1 percent over 2020.

In an update to its Worldwide Big Data and Analytics Spending Guide, the research firm has tipped BDA spending to gain strength over the next five years as the global economy recovers from the COVID-19 pandemic. The compound annual growth rate (CAGR) over the 2021-2025 period will be 12.8 percent, IDC said.

IDC Vice president Jessica Goepfert said: “As executives seek solutions to enable better, faster decisions, we’re seeing relatively healthy BDA spending across all industries.”

IDC cites businesses in the professional services industry that are utilising big data and analytics to support their 360-degree customer and client management efforts, as well as project management initiatives.

Big Data will see big investments

Frost & Sullivan predicts that the global Big Data Analytics (BDA) market will see double-digit growth in the post-COVID-19 era.

In its report, Post-pandemic Growth Opportunity Analysis of the Big Data Analytics Market, Frost & Sullivan said that if COVID-19 is contained by August 2020 and global markets recover by the end of the year, the market is expected to expand at a compound annual growth rate (CAGR) of 28.9 percent, reaching $68.09 billion by 2025 from $14.85 billion in 2019.

Under the conservative forecast, the market is likely to reach $41.84 billion by 2025, at a CAGR of 18.8 percent. The conservative scenario assumes a market slowdown and a recovery period of 18 to 24 months, depending on the development and availability of a vaccine.
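Those headline numbers hang together as plain compound growth. As a rough sanity check (assuming the CAGR runs over the six years from the 2019 base to 2025, which the report does not spell out), the sketch below compounds the 2019 figure at each quoted rate:

```python
# Sanity check of the Frost & Sullivan figures quoted above.
# Assumption: the CAGR is applied over the six years from 2019 to 2025.

def project(base_billion, cagr, years):
    """Compound a base market size at a constant annual growth rate."""
    return base_billion * (1 + cagr) ** years

base_2019 = 14.85  # market size in $ billion, 2019

print(round(project(base_2019, 0.289, 6), 2))  # optimistic: ~68.1, vs the quoted $68.09bn
print(round(project(base_2019, 0.188, 6), 2))  # conservative: ~41.8, vs the quoted $41.84bn
```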

Deviki Gupta, Information & Communication Technologies Senior Industry Analyst at Frost & Sullivan, said: “Considering the benefits of BDA solutions in both the government and intelligence (G&I) and non-governmental organization (NGO) sectors, there will be an increase in demand for analytics as it has promising features, such as mitigating risk in business planning, improving operations, and better serving customer needs.”  

Big Data vendors are mostly American

Beancounters at Globaldata have been looking at the biggest number crunchers in the business and have found all but one are American.

The company’s report, ‘Big data – Thematic Research’, details how companies that fail to derive actionable insights from the wealth of data they possess struggle to compete with rivals that use diverse data sets to gain a deeper understanding of their customers and marketplace.

OpenPOWER reveals hardware plans

The OpenPOWER Foundation – a group backed by Google, IBM, Nvidia, Mellanox, Tyan and others – has revealed its hardware plans to capture data centre business.

OpenPOWER has over 100 members worldwide and IBM claims, for example, that Power 8 microprocessors offer something close to 60 percent better price performance than the competition. The competition, by the way, is mainly Intel.

IBM claims the Power 8 microprocessor is the first CPU designed specifically for Big Data and analytics workloads.

OpenPOWER members showed a number of hardware elements in their plan to grab data centre business.

IBM and Wistron showed off a prototype of a high performance server using tech from Nvidia and Mellanox. IBM will deliver two systems to Lawrence Livermore and Oak Ridge National Laboratories, with a throughput five to 10 times faster than existing supercomputers.

In the second quarter of this year, Tyan will release its TN71-BP012, based on an OpenPOWER customer reference system and aimed at large scale cloud projects.

Nvidia, Tyan and Cirrascale have developed the Cirrascale RM4950, a GPU accelerated developer platform that will be available in volume in the second quarter of this year. It is aimed at big data analytics and scientific computing applications.

Smart city connections rise over a billion

Even though there’s little in the way of standards for the internet of things (IoT), the revolution is already here, according to research published by Gartner.

In a report released today, Gartner said that 1.1 billion connected things will be used by smart cities this year but that figure will soar to 9.7 billion by 2020.

But a significant number of connected things this year will be down to so-called smart homes and smart commercial buildings – right now the share is 45 percent, but it will reach 81 percent by 2020.

Gartner said most of the money will be spent by the private sector. It released figures showing that public services, and in particular healthcare, are lagging behind other sectors including transport and utilities.

For the home, connected devices include smart LED lighting, such as Philips Hue lights, healthcare monitoring, smart locks, and sensors that detect things as diverse as motion and carbon monoxide. The highest growth will be in smart lighting – in 2015 there will be only six million units shipped but that will grow to 570 million units by 2020.

Major applications in cities include IoT deployments for parking, traffic and traffic flow. And the UK is leading the way in the field.

Commercial IoT applications will span multiple industries, and firms specialising in analytics will see a rise in revenue as the big data generated by billions of devices presents challenges for the rest of the industry.

IBM teams up with Twitter

Big Blue is very busy with its cloud data services and data analytics, and today it has penned an agreement with Twitter aimed at enterprises and developers.

Under the deal, IBM will deliver cloud data services with Twitter built in, meaning that companies can use analytics to mine meaningful data from the flood of tweets that hits cyberspace every day.

IBM described Twitter as unlike any other data source in the world because it happens in real time, and is public and conversational.

IBM claims it can separate the signal from the noise by analysing tweets alongside millions of data points from other public sources.

The deal means that developers can search, explore and examine data using IBM’s Insights for Twitter service on Bluemix.

The company said developers can also analyse Twitter data by configuring BigInsights on Cloud, combining the tweets with IBM’s enterprise Hadoop-as-a-service.
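To give a flavour of what “search, explore and examine” looks like in practice, here is a minimal sketch of hitting a hosted tweet-search service over REST from Python. The host name, path and parameter names are placeholders rather than the documented Insights for Twitter API, and the credentials stand in for whatever the Bluemix service binding provides.

```python
# Hypothetical sketch of querying a hosted tweet-analytics service over REST.
# The URL, path and parameter names are placeholders, NOT the documented IBM API.
import requests

BASE_URL = "https://insights-for-twitter.example.bluemix.net/api/v1"  # placeholder host

def search_tweets(query, username, password, size=50):
    """Run a keyword search against the (hypothetical) tweet search endpoint."""
    resp = requests.get(
        f"{BASE_URL}/messages/search",
        params={"q": query, "size": size},
        auth=(username, password),  # service credentials from the Bluemix binding
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

# e.g. pull recent tweets mentioning a brand for downstream analytics:
# tweets = search_tweets("MyBrand", "service-user", "service-password")
```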

It has already given 4,000 of its own staff access to Twitter data.

 

Hitachi Data Systems targets telco big data

The IT division of Hitachi said it has started to sell an analytics package aimed at telecom service providers.

Hitachi Data Systems (HDS) said that its Live Insight for Telecom is aimed at giving providers real-time visibility into networks, services and application-level performance.

This, HDS claims, will let providers predict network activity using both real-time and historical data in parallel.

Analytics is big business now – for example IBM is betting the farm on big data and the cloud.

So companies like HDS are claiming their products will reduce subscriber “churn”, lower operational costs and open up new sources of revenue.

HDS claims there are close to seven billion mobile subscribers worldwide, with 78 percent of households in the developed world connected to the web.

But, it continues, even though telco providers can access tens o

Dell sells web scale converged devices

Dell said it has introduced the second series of its XC Series web scale converged devices.

The units are aimed at data centre customers, and Dell claimed they now have over 50 percent more storage capacity and twice the rack density.

They’re intended to support many different kinds of workloads including private cloud, big data and virtual desktop infrastructures.

The appliances are based on Dell PowerEdge server technology with Nutanix software and bundled with Dell global services and support.

The appliances now offer additional drive options, including both flash and conventional hard drives. Each rack unit can support up to 16 terabytes of storage, along with a number of options for drives, memory and microprocessors.

Dell is now offering a compact 1U form factor with the XC630 model, while the XC730xd will support up to 32 terabytes of storage.

The units will be up for sale in early March.

 

IBM strikes further deals with Juniper

Juniper and IBM have decided to work together in a bid to provide customers with improved mobile facilities, look at Internet of Things (IoT) applications and plumb the world of big data.

IBM said that the two companies will work together to deliver high performance network analytics to speed up enterprises, reduce costs, and provide better end user applications.

IBM and Juniper have worked together for a while, but are now working on integrating Juniper’s MX Router Service Control Gateway with IBM Now Factory analytics.

Other future developments will include providing visibility of subscribers and the ability for communications service providers (CSPs) to offer automated services based on data. Juniper will use IBM Analytics to understand data flows and to self-configure and optimise network operations.

Juniper will also integrate IBM Analytics features into its own Cloud Analytic Engine.

Bob Picciano, a senior VP at IBM, said: “Integrating predictive analytics directly into the stream of data processing – and embedding into the network of CSPs – will help to ensure the reliability of the network.”

 

ITC intros high speed analytics

ITC Infotech said that it has introduced an enterprise analytics system that lets users more easily access high speed data analytics.

The product, called ZEAS (Z Enterprise Analytics Solution), uses a graphical user interface to analyse big data with the minimum of coding.

The product supports open source Hadoop technology, and ITC claims it will let enterprises analyse big data five times faster than its competitors’ offerings.

It also claimed that data analysis projects that would have taken months for experienced Hadoop developers to implement can now be done in weeks.

ZEAS also includes a data operation centre that provides enterprise-grade access controls, monitoring and alerting mechanisms for data management.

The company introduced the offering at the Strata+Hadoop World conference held in San Jose this week.

ITC Infotech is a subsidiary of $7 billion company ITC that provides services to global customers. It targets the banking, financial services and insurance sectors.

IBM makes big data push

Big Blue said it has stepped up its data analytics push with the introduction of IBM BigInsights for Apache Hadoop.

The offering provides machine learning, R, and other features that can tackle big data.

IBM claimed that while many think Apache Hadoop is powerful for collecting and storing large sets of variable data, companies are failing to realise its potential.

Its offering has a broad data science toolset for querying data, visualisation, and scalable distributed machine learning.

The offering includes Analyst, which bundles IBM’s SQL engine, and Data Scientist, which provides a machine learning engine that ranges over big data to find patterns.

Enterprise Management includes tools to optimise workflows, and management software to give faster results.

IBM also said it has joined the Open Data Platform (ODP) association which is aiming to provide standardisation over Hadoop and big data technologies.

Insurance industry drags feet on big data


The insurance industry is in danger of falling behind other industries because it is not interested in the latest digital technology.

Reuters reported that while some insurers are using developments such as telematics or social media sources to increase the amount of information they have about customers, reduce claims and make insurance cheaper for all, most are luddite laggards.

Famously, we will probably need “black boxes” in our cars so that we can be rewarded with lower insurance premiums if we drive carefully.

But apparently when it comes to Big Data, insurance companies are saying a big “no.”

This is because the insurance industry is still locked in the early 20th century, where pen and paper were mightier even than the typewriter.

Staff at Lloyd’s, home to more than 90 trading syndicates in London’s financial district, still trundle suitcases of claim forms for complex insurance transactions.

Lloyd’s Chief Executive Inga Beale has said the industry needs to take technology on board to maintain its role in global business. The firm recently appointed a Chief Data Officer and Beale said the sector needs to attract new, tech-savvy talent.

Part of the difficulty is that there is a mass of different systems out there, and firms are often swallowed up by bigger insurers, which makes it hard to streamline technology.

Firms might like the idea of technology, but cannot be bothered to spend because they are having trouble balancing their books with bond yields at record lows.

This is despite the fact that a report from Morgan Stanley and Boston Consulting Group says the first movers will clean up.

They say a full transformation to becoming a digital company could cut an insurer’s combined ratio by 21 percentage points, in other words making the firm more profitable. Expenses could fall by 10 percent of premiums and claims by 8 percent.
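For readers unfamiliar with the jargon: the combined ratio is claims plus expenses divided by premiums earned, so anything above 100 percent is an underwriting loss. A toy calculation (the starting figures below are invented; only the 10-point and 8-point cuts come from the report) shows why those savings matter:

```python
# Toy combined-ratio calculation. Starting figures are invented; the cuts of
# 10 points (expenses) and 8 points (claims) are the report's quoted savings.
premiums = 100.0   # premiums earned
claims = 70.0      # claims paid out
expenses = 32.0    # operating expenses

combined_ratio = (claims + expenses) / premiums * 100
print(combined_ratio)  # 102.0 -> above 100, an underwriting loss

# After a digital transformation, per the report's headline figures:
claims_after = claims - 0.08 * premiums      # claims fall by 8% of premiums
expenses_after = expenses - 0.10 * premiums  # expenses fall by 10% of premiums

combined_ratio_after = (claims_after + expenses_after) / premiums * 100
print(combined_ratio_after)  # 84.0 -> comfortably profitable underwriting
```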

 

IBM steps up educational push

Massive services giant International Business Machines (IBM) said it has now enrolled over 300 colleges and universities around the world in its Power Systems Academic Initiative (PSAI).
IBM said that the push is to help students learn skills related to big data, cloud computing, mobile and social networking.

That, said IBM, is important in today’s job market.

The initiative, which started in October 2012, has grown by 152 percent over the last two years, IBM claimed.

Schools and universities hooked up to IBM include New York University’s Polytechnic School of Engineering, Virginia Tech, the UK’s University of Greenwich, the University of Ulster, and Glasgow Caledonian University.

Of course, IBM’s move is not all altruism – it is pinning its future on cloud computing, big data, analytics and security.

Several of the academic bodies offer courses related to IBM-specific operations, and the company said it recruits from universities and business schools.

Obama has support for “big data” bill

It is looking like President Barack Obama’s “Big Data” privacy plans might get through the Republican-controlled Congress.

He has proposed a series of laws to address “Big Data” concerns, but most have not gone anywhere because many corporations want to collect data to sell products, and are telling their paid politicians to vote them down.

This was the reason that a proposal to update the outdated Electronic Communications Privacy Act to protect email and other data stored in the cloud died.

However, that is starting to change as public concerns over privacy and cybersecurity have been amplified by high-profile hacking of credit card data at companies such as Target and Home Depot.

First up is a law being put through by Indiana Congressman Luke Messer, the chairman of the House of Representatives Republican Policy Committee, and Democrat Jared Polis of Colorado, an Internet entrepreneur who founded a network of charter schools.

The pair is pushing a student privacy bill which will stop big corporates collecting data on kids. The lawmakers have worked on the issue with privacy advocates and more than 100 companies including Microsoft, Google and News Corp subsidiary Amplify to develop a privacy pledge to prevent misuse of data collected in classrooms.

The law will make sure that data collected from students is used only for educational and legitimate research purposes.

Obama wants to go further and has proposed a new national standard requiring companies to tell consumers, within 30 days of discovering a data breach, that their personal information has been compromised.

However, there is a patchwork of differing state regulations, which might put a spanner in the works.

Obama is also worried about how Big Data could be used to discriminate against people based on race or where they live for housing or jobs.

On Thursday, the White House will release a report on how companies use Big Data to offer different prices to different consumers, saying that Big Data techniques have “turbocharged” price discrimination. Those sorts of laws will hack off the corporate sponsors of the US political system, and might also die. But US reports are optimistic that Obama might win that one.

 

Big Data analytics are not up to snuff

Companies relying on Big Data analytics might be disappointed to discover that they are not so good at finding a needle in a haystack after all.

Currently the best way to sort large databases of unstructured text is a technique called Latent Dirichlet allocation (LDA), a modelling approach that identifies text within documents as belonging to a limited number of still-unknown topics.

According to analysis published in the American Physical Society’s journal Physical Review X, LDA had become one of the most common ways to accomplish the computationally difficult problem of classifying specific parts of human language automatically into a context-appropriate category.
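For the uninitiated, the standard LDA workflow is short: turn documents into word counts, fit a model with a fixed number of topics, then read back the top words per topic. The snippet below uses scikit-learn purely as an illustration; it is not the tooling used in the paper.

```python
# Minimal LDA workflow (illustrative; not the tooling used in the paper).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "stars orbit the galaxy and emit light",
    "the galaxy contains billions of stars",
    "banks set interest rates for loans",
    "loans and interest drive bank profits",
]

# LDA works on word counts, not raw text.
vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(docs)

# n_components is the "limited number of still-unknown topics".
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(counts)  # per-document topic mixtures

# Print the top words for each latent topic the model found.
words = vectorizer.get_feature_names_out()
for topic_id, weights in enumerate(lda.components_):
    top = [words[i] for i in weights.argsort()[-4:][::-1]]
    print(f"topic {topic_id}: {top}")
```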

According to Luis Amaral, a physicist whose specialty is the mathematical analysis of complex systems and who wrote the paper, LDA is inaccurate.

The team tested LDA-based analysis with repeated analyses of the same set of unstructured data – 23,000 scientific papers and 1.2 million Wikipedia articles written in several different languages.

Not only was LDA inaccurate, its analyses were inconsistent, returning the same results only 80 percent of the time even when using the same data and the same analytic configuration.
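A stripped-down version of that consistency test is easy to reproduce: fit the same model on the same data with different random seeds and see how often documents land in the same topic. The snippet below is a drastic simplification of the paper’s experiment, using a toy corpus and scikit-learn.

```python
# Crude reproducibility check: same corpus, same settings, different random seeds.
# A drastic simplification of the paper's experiment, for illustration only.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "stars orbit the galaxy and emit light",
    "the galaxy contains billions of stars",
    "banks set interest rates for loans",
    "loans and interest drive bank profits",
]
counts = CountVectorizer(stop_words="english").fit_transform(docs)

def dominant_topics(seed, n_topics=2):
    """Fit LDA with a given seed and return each document's strongest topic."""
    lda = LatentDirichletAllocation(n_components=n_topics, random_state=seed)
    return lda.fit_transform(counts).argmax(axis=1)

run_a = dominant_topics(seed=0)
run_b = dominant_topics(seed=1)

# Topic labels are arbitrary between runs, so with two topics check both matchings.
agreement = max(np.mean(run_a == run_b), np.mean(run_a == 1 - run_b))
print(f"documents given the same topic in both runs: {agreement:.0%}")
```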

Amaral said that accuracy of 90 percent with 80 percent consistency sounds good, but the scores are “actually poor, since they are for an easy case.”

For the far larger bases of data that big data systems are often praised for their ability to manage, the results would be far less accurate and far less reproducible, according to the paper.

The team created an alternative method called TopicMapping, which first breaks words down into bases (treating “stars” and “star” as the same word), then eliminates conjunctions, pronouns and other “stop words” that modify the meaning but not the topic, using a standardized list.
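That first preprocessing step (collapsing inflected forms to a common base and dropping stop words) looks roughly like the following; NLTK is used here as a stand-in, since the paper does not say which tools the team used.

```python
# Illustrative preprocessing in the spirit of TopicMapping's first step:
# stem words to a common base ("stars" -> "star") and drop stop words.
# NLTK is a stand-in here; the paper does not specify the tooling.
import nltk
from nltk.corpus import stopwords
from nltk.stem import PorterStemmer

nltk.download("stopwords", quiet=True)  # fetch the standardised stop-word list

stemmer = PorterStemmer()
stop_words = set(stopwords.words("english"))

def preprocess(text):
    """Lowercase the text, drop stop words, and reduce remaining words to stems."""
    tokens = text.lower().split()
    return [stemmer.stem(t) for t in tokens if t not in stop_words]

print(preprocess("The stars and the star clusters are shining"))
# -> ['star', 'star', 'cluster', 'shine']
```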

This approach delivered results that were 92 percent accurate and 98 percent reproducible, though, according to the paper, it only moderately improved the likelihood that any given result would be accurate.

The paper’s point was that it was not important to replace LDA with TopicMapping, but to demonstrate that the topic-analysis method that has become one of the most commonly used in big data analysis is far less accurate and far less consistent than previously believed.