Tag: big data

Internet of Things “is already here”

A report commissioned by Verizon suggests that rather than being science fiction, the Internet of Things (IoT) is already here and producing business results.

Verizon commissioned Harvard Business Review to conduct the report, which suggests the IoT is already here in the shape of connectivity, cloud computing and the miniaturisation of sensors “making it possible” for over 10 billion devices to be networked.

Nevertheless, HBR’s Analytic Services surveyed 269 business leaders and says the number of deployments is still relatively small.

While estimates say the IoT could add tens of trillions of dollars to GDP over the next 10 years, HBR says the definition goes well beyond wearable devices, smart meters and connected cars.

The survey, conducted last September among early IoT adopters, concludes that those using it were doing so to improve customer service, increase revenues from services and products, make better use of assets in the field, and pick up additional data for analytics.

Applications include asset tracking, security, fleet management, field force management, energy data management and “condition based monitoring”.

There are challenges to adopting the IoT, including privacy and regulatory compliance.  HBR said most legislation and industry regulation predates the use of the IoT.  Managing the sheer amount of data will also be a problem, as will finding people with the skills to use IoT data.

In healthcare, the report said, medical device manufacturer Varian says the IoT meant a 50 percent reduction in the time to repair connected devices.  Pirelli is using the IoT to manage data from sensors embedded in its Cyber Tyre range.  And Ford’s Connected Car Dashboards programme collects and analyses data from cars to better understand driving patterns and vehicle performance.

Intel announces 3D NAND-Flash

Rob Crooke, VP and GM of Intel’s Non-Volatile Memory (NVM) Solutions Group, was last up at the company’s day-long Investor Meeting today in Santa Clara.

Though last, he had the most newsworthy announcement about the company’s future memory intentions.

Intel announced it is back in the memory business – 3D NAND-Flash, that is (though in-house mass production is conditional).

Crooke’s revelation ends any rumination on Intel-Micron Flash Technologies’ 3D flash development – it also includes SK Hynix when the device goes into production in 2Q 2015. Evidently those who have been nice have early sample devices, according to sources.

The specifics:

  • 4G hole array, 32 layers deep | (2^16 x 2^16)(array) x 32(layers) x 2(MLC) = 256 Gbits (arithmetic checked below)
  • 1TB in 2 mm package
  • SSDs: 10TB and up planned
  • Production 2H 2015 – IMFT (Lehi, Utah facility mentioned) & SK Hynix
  • Intel can also produce internally
  • Replacement of HDD with SSD in all PC and Mobile devices
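
To sanity-check that density figure, here is a minimal back-of-envelope sketch in Python, assuming a 2^16 x 2^16 hole array per layer, 32 layers and two bits per cell for MLC, as in the first bullet:

    # Back-of-envelope check of the quoted 256 Gbit die capacity.
    # Assumptions (from the spec list above): 2^16 x 2^16 holes per layer,
    # 32 layers, and 2 bits per cell (MLC).
    holes_per_layer = 2**16 * 2**16        # 4,294,967,296 holes ("4G hole array")
    layers = 32
    bits_per_cell = 2                      # MLC stores two bits per cell

    capacity_bits = holes_per_layer * layers * bits_per_cell
    print(capacity_bits / 2**30, "Gbits")  # -> 256.0 Gbits per die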

Crooke allowed that the devices will not use Intel’s cutting-edge 14nm technology but a slightly relaxed geometry – Micron is on record at 16nm geometries for 3D NAND. It is an open secret that vendors prevaricate about flash geometries, so a hefty dose of caveat emptor is recommended.

The announcement coincides with reports that Intel and Micron are involved in a project with EMC2-DSSD – an effort to produce the first NAND-Flash In-Memory Database appliance.  The proffered memory type may be a custom type expressly tailored for the application and may be produced in-house by Intel – more on this as roll-out time nears.

IBM brings in the clouds

Big Blue said it has released, or is just about to release, a slew of cloud and Big Data analytics offerings to the IT party.

It said that Cognos Business Intelligence, SPSS predictive analytics and Watson Analytics will soon be available on its Cloud marketplace. Currently the Cognos offering is in beta, and won’t be ready for action until the first quarter of next year.  And SPSS Modeller won’t be available for another 30 days.

What’s the Cloud marketplace?  It’s one place you can go to, or in IBM speak it’s “the digital front door to cloud innovation”.

Big Blue said that 25 percent of new business analytic installations will be as subscriptions to cloud analytic or application services by next year.

IBM wants a slice of that lucrative cake.

The giant said that it has five answers to five common problems for businesses including understanding customers, understanding operations, security, compliance and data warehouse modernisation.

Berners-Lee speaks up for people

The inventor of the world wide web said today that data should belong to each of us.

Sir Tim Berners-Lee was delivering a keynote speech at IPExpo Europe in London.

He hit out at the notion that data belongs to corporations like Facebook and Google who collect it and then use it to make money out of everyone.

He said that using big data for advertising purposes gave him a queasy sensation and rather than big data we should be interested in rich data.

He told the conference that big companies are, essentially, spying on us all and this is a real threat.

Collecting your own data from different gizmos you use and different transactions you make gives you a perspective on yourself that is much more valuable than feeding that data to large corporations.

Berners-Lee described what would become known as the world wide web 25 years ago.

IBM wants to dam big data deluge

Big Blue says it has created a new model for enterprise data storage intended to work across a large number of IT solutions.

Jamie Thomas, general manager of storage at IBM, said it is time the “traditional” storage model changed. That’s because 2.5 billion gigabytes (2.5 exabytes) of data are churned out every day.

Enterprises need to make real-time decisions based on this data, and storage and data centres are the foundation for a model that uses analytical tools.

She said IBM has introduced something called the Elastic Storage Server, a software storage appliance that works in conjunction with IBM Power 8 servers.

She said that software defined storage is changing the entire industry and IBM can now sell products to customers that want to manage, organise, and use data as a competitive tool.

IBM will offer its Software Defined Storage products through Elastic Storage, SAN Volume Controller and the Virtual Storage Centre.

Big data is riddled with myths

Market research company Gartner enumerated what it described as five big data myths.

The company said that companies tended to believe that their competitors were ahead of them in the adoption of big data.  But its survey showed that while 73 percent of organisations have invested or plan to invest in big data, most are still in the “very early” stages of adoption.

Only 13 percent of the companies it surveyed had actually deployed anything. And companies face a challenge in how to obtain value from big data.

The second myth is that many IT folk believe the large volume of data held means individual flaws in the data don’t matter.  But, said Ted Friedman, a VP at Gartner: “Although each individual flaw has a much smaller impact on the whole dataset than it did when there was less data, there are more flaws than before because there is more data.”  The impact of poor quality data remains the same, he said.
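
A minimal sketch of Friedman’s point, assuming a constant error rate as a dataset grows (the numbers are purely illustrative, not Gartner’s):

    # Illustration: with a fixed error rate, the number of flaws grows with
    # the data, but the *proportion* of flawed records stays the same,
    # so the impact of poor data quality on the whole dataset is unchanged.
    error_rate = 0.01  # assume 1 percent of records are flawed

    for rows in (1_000_000, 100_000_000, 10_000_000_000):
        flaws = rows * error_rate
        print(f"{rows:>14,} rows -> {flaws:>14,.0f} flawed records "
              f"({flaws / rows:.1%} of the dataset)")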

Myth three is that big data technology removes the need for data integration.  But most information users rely heavily on “schema on write” – meaning data is described, content is prescribed and there is agreement about the integrity of the data.
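
To make the distinction concrete, here is a minimal sketch (the table and field names are hypothetical) contrasting schema on write, where structure is agreed before data lands, with the schema-on-read style common in big data stores, where the integration work is deferred rather than removed:

    import json
    import sqlite3

    # Schema on write: the structure is declared up front and enforced
    # when data is loaded (hypothetical "sales" table for illustration).
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE sales (order_id INTEGER NOT NULL, amount REAL NOT NULL)")
    db.execute("INSERT INTO sales VALUES (?, ?)", (1, 99.50))   # conforms to the schema
    # db.execute("INSERT INTO sales VALUES (?, ?)", (2, None))  # would be rejected (NOT NULL)

    # Schema on read: raw records are stored as-is and only interpreted
    # (and validated) at query time, so integration still has to happen somewhere.
    raw = ['{"order_id": 1, "amount": 99.5}', '{"order_id": 2}']
    for line in raw:
        record = json.loads(line)
        print(record["order_id"], record.get("amount"))  # "amount" may simply be missing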

The fourth myth, according to Gartner, is that you don’t need a data warehouse for advanced analytics.  But that’s not necessarily true – many advanced analytics projects use a data warehouse during an analysis.

And, finally, so-called “data lakes” aren’t going to replace data warehouses.  A data lake is defined as enterprise wide data management that analyses different sources of data in their native file formats.  Data lakes lack the maturity and breadth of features in established data warehouse technologies.

Big Data market growing exponentially

Another survey on the growth of big data technology and services underlines the growth in this sector of the IT market.

Market research company IDC predicts that the western European big data market will grow between now and 2018 at a compound annual growth rate of 24.6 percent.

IDC said that western European organisations are rapidly catching up with the USA, despite a combination of smaller datasets, challenging economies and privacy concerns.

The market sector is segmented into infrastructure, such as servers, storage and networking; software; and services.  Storage was worth $536 million in 2013, while the server market was worth $314 million.  But the largest segment was software, worth an estimated $698 million last year, followed by services at $593 million.

IDC said the UK, Benelux and the Nordic countries are showing higher initial adoption, but Germany and France are fast catching up.

But Alys Woodward, research director at IDC, warned that getting value from investments in big data is far from guaranteed. Vendors need to clearly demonstrate to their customers how their organisations can benefit from adoption.

Big data ready for the big time

Enterprises have got off the fence about adopting big data technologies, with 73 percent of those surveyed saying they either have invested or will invest in big data in the next 24 months.

That’s according to some data from Gartner, which says the pack is being led by North America, with 47 percent of organisations saying they’d invested in 2014.

But while these organisations might be ready for the big data big time, Gartner says that most work is in strategy and in starting pilots and experimental projects.

Lisa Kart, a research director at Gartner, said: “The most dramatic changes are in enhancing customer experience, especially in transportation, healthcare, insurance, media and communications, retail, and banking. Another area where we see an increase is using big data to develop information products, where organisations are looking to monetise their data. This is especially true among IT vendors, government and manufacturing.”

What is big data, though? It appears some are still trying to understand it.  Gartner says the volume part is easy to grasp – it’s just a massive amount of data – and relatively easy to deal with, because you just add storage and computing capacity.

Getting value is more difficult because of the variety of data and sources, including social media feeds, machine and sensor data, and free-form text, which all require analysis.

Big Data gets very big indeed

Revenues for Big Data technology and services will be worth $41.5 billion by 2018, growing at a 26.4 percent compound annual growth rate (CAGR).

That’s an estimate by market research company IDC. Ashish Nadkarni, research director at the company, said the hype was simmering down.  “This is a sign that the technologies are maturing and making their way into the fabric of how organisations operate and firms conduct their business,” he said.
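
For a sense of what that growth rate implies, here is a minimal sketch, assuming the 26.4 percent CAGR compounds over the four years from 2014 to 2018; the base-year figure is derived from the forecast rather than taken from IDC:

    # Back-calculation: if the market reaches $41.5bn in 2018 after four
    # years of 26.4% compound annual growth, what 2014 base does that imply?
    # (Assumed compounding period; the derived figures are not IDC's own.)
    cagr = 0.264
    value_2018 = 41.5          # billions of dollars
    years = 4                  # assumed span: 2014 -> 2018

    value = value_2018 / (1 + cagr) ** years
    print(f"Implied 2014 market: ${value:.1f}bn")   # roughly $16bn
    for year in range(2015, 2019):
        value *= 1 + cagr
        print(year, f"${value:.1f}bn")              # climbs back to $41.5bn in 2018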

This year, infrastructure has a large share of the entire market, with a 48.2 percent slice of the Big Data pie.

While America leads the way in Big Data investment, it isn’t going to stay that way. EMEA and Asia Pacific have nearly 45 percent market share in infrastructure, software, and services.

IDC predicts there will be a mergers and acquisitions boom in the sphere.

IT ready for Big Data

A survey of 100 IT decision makers from top dollar firms has revealed that enterprises are more than dabbling their toes in the ocean of Big Data.

Syncsort, which is in the Big Data business itself, said that 62 percent of its respondents will optimise their enterprise data warehouses by sending data and batch workloads to Hadoop.

And 69 percent of the people it polled said they expect to make their enterprise wide data available in Hadoop.

Meanwhile just over half of the respondents are likely to spend between five and 10 percent of their budgets on Big Data projects.

Over 70 percent of the respondents work for companies with turnovers of more than $50 million.

It seems that the IT guys don’t have problems proving the benefits of Big Data to the senior suits that authorise the buys.  It appears from the survey that less than 25 percent of those polled have problems allocating budgets to their Big Data plans.

Teradata snaps up Think Big Analytics

Analytic data company Teradata has bought Think Big Analytics.

It bought the company for its Hadoop and big data consulting capabilities, it said in a statement.

Teradata didn’t say how much it paid for the firm, but said Think Big’s team will stay in place.  It will continue to use the Think Big brand.

CEO Mike Koehler said it is Teradata’s third buy in six weeks. All, he said, will help to achieve its goal of being the market leader.

“Think Big’s consulting expertise enhances Teradata’s capability to advise customers on the best way to leverage diverse, open source big data technologies to grow their businesses,” he said.

Think Big, said Teradata, has heaps of experience with a number of Hadoop distributions, including Hortonworks, Cloudera and MapR.

Big Data will bring bonanza

The market for Big Data tech and services is expected to be worth $32.4 billion by 2017, growing between now and then at a CAGR of 27 percent.

That’s what market research company IDC is projecting, in a report that says that growth is about six times the growth rate of the ICT market.

Dan Vesset, a VP at IDC, said that the Big Data market is growing fast as startups and large IT companies attempt to take market share and fold in customers.

Cloud infrastructure has the highest CAGR of the individual segments, at 49 percent through 2017.  And automation based on Big Data tech is set to affect knowledge worker roles.

And datacentres are likely to suffer too, because data will either be discarded or archived to the cloud, meaning the storage market will be affected.

Governments need to go cloud busting

A report from Gartner said that by 2017 public cloud offerings will account for over 25 percent of government business services, not counting defence and security.

But CIOs need to get themselves into the debate on public cloud sourcing, and kick off sourcing strategies with politicos.

By 2017, predicted Gartner, 35 percent of government shared service organisations will be managed by private sector companies. Public private partnerships are already embracing infrastructure as a service but governments will move to integration and software as a service.

And, Gartner predicts, by 2017 more than 60 percent of government open data programmes that do not use open data internally will be discontinued.

And if you’ve a job in government software development, mind your back, because at least 25 percent of such jobs will be axed while governments hire data analysts from outside.  Data analysis is now a high priority.

Where big data is headed

As predicted here, now is the time for predictions for 2014. And the latest seer to gaze into his dark John Dee style speculum is Mario Faria, from ServiceSource.

Faria says many companies have experimented with big data stuff during 2013 and tried pilots. The crunch time will be when these pilots start to show a return on investment.

And next year, he says, the phrase big data will yield to analytics, because it’s not about collecting data but about capturing, analysing and acting on it in the here and now to stay competitive.

Faria’s job title is Chief Data Officer and he explains that this function is the janitor of data but also an evangelist, and will engage in 2014 with the board of directors. No doubt the board of directors will engage with chief data officers too.

Unlike the Harris poll we published earlier, which shows a degree of insouciance about wearable stuff, Faria thinks that Fitbit, Nike devices and Google Glass will be part of our everyday life.

But ultimately, everything is down to the quality of data, because if it’s not reliable, no amount of analysis will deliver results. He says that data quality is a money maker.

IBM takes plastic quantum route

Big Blue said it has made a breakthrough that will provide the potential to create ultra fast optical switches, suitable for future big data computer systems.

The company said its scientists have demonstrated a quantum mechanical process known as Bose-Einstein Condensation (BEC).

It uses a luminescent polymer similar to the light emitting displays in smartphones of today.

The phenomenon demonstrated by the scientists is named after Satyendranath Bose and Albert Einstein – they predicted it in the mid 1920s, said IBM.  A BEC is a state of matter that occurs when a dilute gas of particles (bosons) is cooled to close to absolute zero (-273°C).

However, IBM has achieved the same state at room temperature in a plastic film just 35 nanometres thick, with bosonic particles created through the interaction of the polymer material and light.  The phenomenon lasts for a few picoseconds, but the IBM scientists think that is long enough to create a source of laser-like light or an optical switch.