Tag: big data
MEMS market galvanised by the internet of things
Growth in the internet of things (IoT) means demand for microelectromechanical systems (MEMS) has risen steeply this year.
The MEMS category of semiconductors includes accelerometers, pressure sensors, timing components and microphones.
MEMS are used in asset tracking, smart grids, building automation and other sectors. Market research company IHS said that revenues in this area were $16 million last year but will be worth $120 million a year by 2019.
But MEMS will also be widely used in datacentres, and that sector of the market will be worth $214 million in 2018.
By 2025, shipments of MEMS for industrial IoT equipment will amount to 7.3 billion units. Last year 1.8 billion units shipped.
Datacentres will want optical MEMS, used for wavelength selective switches and optical cross connects.
Fashion designers get fingered by big data
The use of analytics and big data has demonstrated how style trends surge through the industry, according to researchers at Penn State University.
Heng Xu, a professor of information sciences and technology at Penn State, said a team of researchers analysed a large number of words and phrases from fashion reviews.
Xu said: “Data analytics are becoming more available for finding patterns, establishing correlations and identifying trends. It is being applied to many industries and fields, from health care to politics, but what we wanted to see is if data analytics could be used in the fashion industry.”
Her team extracted keywords and phrases describing silhouettes, colours, fabrics and other data from designers’ collections, then created algorithms to rank the designers and gauge their influence.
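As a rough idea of how such a pipeline might work, here is a minimal sketch assuming scikit-learn is available – it is not the Penn State team’s actual algorithm, and the designer names and review snippets are invented for illustration. Keywords are weighted with TF-IDF, and a crude influence score counts how often one designer’s signature terms surface in reviews of the others:

from sklearn.feature_extraction.text import TfidfVectorizer

# Hypothetical review snippets, keyed by designer.
reviews = {
    "Designer A": "sharp tailored silhouettes in monochrome wool",
    "Designer B": "flowing pastel silk with soft silhouettes and sheer fabrics",
    "Designer C": "monochrome wool tailoring with sharp shoulders",
}

vec = TfidfVectorizer(stop_words="english")
weights = vec.fit_transform(reviews.values()).toarray()
terms = vec.get_feature_names_out()

# Signature keywords: each designer's three highest-weighted terms.
signatures = {
    designer: [t for t, _ in sorted(zip(terms, row), key=lambda p: -p[1])[:3]]
    for designer, row in zip(reviews, weights)
}

# Crude influence score: how often a designer's signature terms
# appear in everyone else's reviews.
for designer, sig in signatures.items():
    others = " ".join(text for d, text in reviews.items() if d != designer)
    print(designer, sig, sum(others.count(term) for term in sig))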
Fashion designers can be sceptical about big data and analytics because they consider themselves to be artists. But the team said it had found fingerprints that could be related to individual designers.
“Buying from leading designers is expensive, but if you had information on what design elements were beginning to trend, it might help you buy the latest fashion more inexpensively,” said Xu.
The team believes the analytics will help them to discover who the next big fashion designer could be.
Internet of Things “is already here”
A report commissioned by Verizon suggests that rather than being science fiction, the Internet of Things (IoT) is already here and producing business results.
Verizon commissioned the Harvard Business Review to produce the report, which suggested that connectivity, cloud computing and the miniaturisation of sensors are “making it possible” for more than 10 billion devices to be networked.
Nevertheless, HBR’s Analytic Services surveyed 269 business leaders and says the number of deployments is still relatively small.
While estimates say that the IoT could add tens of trillions of dollars to GDP in the next 10 years, HBR says defining it goes way beyond wearable devices, smart meters and connected cars.
The survey, conducted last September among early IoT adopters, concluded that those using it were doing so to improve customer service, increase revenue from services and products, make better use of assets in the field, and gather additional data for analytics.
Applications include asset tracking, security, fleet management, field force management, energy data management and “condition based monitoring”.
There are challenges to adopting the IoT, including privacy and regulatory compliance; HBR said most legislation and industry regulation predates the use of the IoT. Managing the sheer amount of data will also be a problem, as will finding people with the skill sets to use IoT data.
The report said that in healthcare, Varian, a manufacturer of medical devices, credits the IoT with a 50 percent reduction in the time to repair connected devices. Pirelli is using the IoT to manage data from sensors embedded in its Cyber Tyre range. And Ford’s Connected Car Dashboards programme collects and analyses data from cars to better understand driving patterns and vehicle performance.
Intel announces 3D NAND-Flash
Rob Crooke, VP and GM of Intel’s Non-Volatile Memory (NVM) Solutions Group, was last up at the company’s day-long Investor Meeting today in Santa Clara.
Though last, he had the most newsworthy announcement about the company’s future memory intentions.
Intel announced it is back in the memory business – 3D NAND-Flash, that is (though in-house mass production is conditional).
Crooke’s revelation ends any rumination about Intel-Micron Flash Technologies’ 3D flash development – the effort will also include SK Hynix when the device goes into production in 2Q 2015. Evidently those who have been nice already have early sample devices, according to sources.
The specifics:
- 4G hole array, 32 layers deep | (2^16 x 2^16)(array) x 2^5(layers) x 2(MLC bits per cell) = 256 Gbits (see the arithmetic sketch after this list)
- 1TB in 2 mm package
- SSDs: 10TB and up planned
- Production 2H 2015 – IMFT (Lehi, Utah facility mentioned) & SK Hynix
- Intel can also produce internally
- Replacement of HDD with SSD in all PC and Mobile devices
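The quoted die capacity follows directly from that geometry. A quick back-of-the-envelope check in Python – the per-die figures come from the slide, while the dice-per-package count is our own inference, not Intel’s:

# Per-die capacity from the array geometry on the slide.
cells_per_layer = 2**16 * 2**16      # the 4G hole array, per layer
layers = 2**5                        # 32 layers deep
bits_per_cell = 2                    # MLC stores two bits per cell
die_bits = cells_per_layer * layers * bits_per_cell
assert die_bits == 256 * 2**30       # 256 Gbits per die, as claimed

# Our inference: a 1TB (binary) package implies 32 such dice stacked.
package_bits = 8 * 2**40             # 1 TB expressed in bits
print(package_bits // die_bits)      # 32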
Crooke allowed that the devices will not use Intel’s cutting-edge 14nm technology but a slightly relaxed geometry – Micron is on record at 16nm geometries for 3D NAND. It is openly known that vendors prevaricate about flash geometries, so a hefty dose of caveat emptor is recommended.
The announcement coincides with reports that Intel and Micron are involved in a project with EMC2-DSSD – an effort to produce the first NAND-Flash In-Memory Database appliance. The proffered memory type may be a custom type expressly tailored for the application and may be produced in-house by Intel – more on this as roll-out time nears.
IBM brings in the clouds
Big Blue said it has released, or is just about to release, a slew of cloud and Big Data analytics products to the IT party.
It said that Cognos Business Intelligence, SPSS predictive analytics and Watson Analytics will soon be available on its Cloud marketplace. Currently the Cognos offering is in beta and won’t be ready for action until the first quarter of next year, while SPSS Modeler won’t be available for another 30 days.
What’s the Cloud marketplace? It’s one place you can go to, or in IBM speak it’s “the digital front door to cloud innovation”.
Big Blue said that 25 percent of new business analytic installations will be as subscriptions to cloud analytic or application services by next year.
IBM wants a slice of that lucrative cake.
The giant said that it has five answers to five common problems for businesses: understanding customers, understanding operations, security, compliance and data warehouse modernisation.
Berners-Lee speaks up for people
The inventor of the world wide web said today that data should belong to each of us.
Sir Tim Berners-Lee was giving a keynote speech at IPExpo Europe in London.
He hit out at the notion that data belongs to corporations like Facebook and Google, which collect it and then use it to make money out of everyone.
He said that using big data for advertising purposes gave him a queasy sensation, and that rather than big data we should be interested in rich data.
He told the conference that big companies are, essentially, spying on us all and this is a real threat.
Collecting your own data from the different gizmos you use and the different transactions you make gives you a perspective on yourself that is much more valuable than anything gained by feeding that data to large corporations, he said.
Berners-Lee described what would become known as the world wide web 25 years ago.
IBM wants to dam big data deluge
Big Blue says it has created a new model for enterprise data storage intended to work across a large number of IT solutions.
Jamie Thomas, general manager of storage at IBM, said the “traditional” storage model must change, because 2.5 billion gigabytes of data are churned out every day.
Enterprises need to make real-time decisions based on this data, and storage and datacentres, working with analytical tools, are the foundation for the model.
She said IBM has introduced something called the Elastic Storage Server, a software storage appliance that works in conjunction with IBM Power 8 servers.
She said that software defined storage is changing the entire industry and IBM can now sell products to customers that want to manage, organise, and use data as a competitive tool.
IBM will offer its Software Defined Storage products through Elastic Storage, SAN Volume Controller and the Virtual Storage Centre.
Big data is riddled with myths
Market research company Gartner enumerated what it described as five big data myths.
The company said that businesses tended to believe their competitors were ahead of them in the adoption of big data. But its survey showed that while 73 percent of organisations are investing or planning to invest in big data, most are still in the “very early” stages of adoption.
Only 13 percent of the companies it surveyed had actually deployed anything, and companies still face the challenge of how to obtain value from big data.
The second myth is that the sheer volume of data held means individual flaws in the data don’t matter. But, said Ted Friedman, a VP at Gartner: “Although each individual flaw has a much smaller impact on the whole dataset than it did when there was less data, there are more flaws than before because there is more data.” The impact of poor quality data remains the same, he said.
Myth three is that big data technology removes the need for data integration. But most information users rely heavily on schema on write – meaning data is described, content is prescribed and there’s agreement about the integrity of data.
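As a toy illustration of what schema on write means in practice – a generic Python sketch, not tied to any vendor’s product, with hypothetical field names – the schema is agreed up front and every record is validated and typed before it is stored:

# The schema is described and agreed before any data is written.
SCHEMA = {"customer_id": int, "amount": float}

def write_record(store, record):
    # Schema on write: coerce and validate at write time, so every
    # reader can rely on the integrity of what is stored.
    store.append({field: cast(record[field]) for field, cast in SCHEMA.items()})

warehouse = []
write_record(warehouse, {"customer_id": "42", "amount": "19.99"})
print(warehouse)  # [{'customer_id': 42, 'amount': 19.99}]

# "Schema on read" would instead store the raw record untouched and
# leave each consumer to impose structure at query time.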
The fourth myth, according to Gartner, is that you don’t need a data warehouse for advanced analytics. But that’s not necessarily true – many advanced analytics projects use a data warehouse during an analysis.
And, finally, so-called “data lakes” aren’t going to replace data warehouses. A data lake is defined as enterprise-wide data management that analyses different sources of data in their native file formats. Data lakes lack the maturity and breadth of features of established data warehouse technologies.
Big Data market growing exponentially
Another survey of big data technology and services underlines the growth of this sector of the IT market.
Market research company IDC predicts that the western European big data market will grow between now and 2018 at a compound annual growth rate of 24.6 percent.
IDC said that western European organisations are rapidly catching up with the US, despite a combination of smaller datasets, challenging economies and privacy concerns.
The market sector is segmented into infrastructure, such as servers, storage and networking; software; and services. Storage was worth $536 million in 2013, while the server market was worth $314 million. But the largest segment was software, worth an estimated $698 million last year, followed by services, worth $593 million.
IDC said the UK, Benelux and the Nordic countries are showing higher initial adoption, but Germany and France are fast catching up.
But Alys Woodward, research director at IDC, warned that getting value from investments in big data is far from guaranteed. Vendors need to clearly demonstrate to their customers how their organisations can benefit from adoption.
Big data ready for the big time
Enterprises have got off the fence about adopting big data technologies, with 73 percent of those surveyed saying they either have invested or will invest in big data in the next 24 months.
That’s according to some data from Gartner, which says the pack is being led by North America, with 47 percent of organisations saying they’d invested in 2014.
But while these organisations might be ready for the big data big time, Gartner says that most work is in strategy and in starting pilots and experimental projects.
Lisa Kart, a research director at Gartner, said: “The most dramatic changes are in enhancing customer experience, especially in transportation, healthcare, insurance, media and communications, retail, and banking. Another area where we see an increase is using big data to develop information products, where organisations are looking to monetise their data. This is especially true among IT vendors, government and manufacturing.”
What is big data, though? It appears that some are still trying to understand it. Gartner says volume is the easiest part to grasp – and the easiest to handle, because you can simply add storage and computing capacity.
Getting value is more difficult because of the variety of data and sources – including social media feeds, machine and sensor data, and free-form text – all of which require analysis.
Big Data gets very big indeed
Revenues for Big Data technology and services will be worth $41.5 billion by 2018, and the market is currently growing at a 26.4 percent compound annual growth rate (CAGR).
That’s an estimate by market research company IDC. Ashish Nadkarni, research director at the company, said the hype was simmering down. “This is a sign that the technologies are maturing and making their way into the fabric of how organisations operate and firms conduct their business,” he said.
This year, infrastructure has a large share of the entire market, with a 48.2 percent slice of the Big Data pie.
While America leads the way in Big Data investment, it isn’t going to stay that way: EMEA and Asia Pacific together already have nearly 45 percent market share in infrastructure, software and services.
IDC predicts there will be a mergers and acquisitions boom in the sphere.
IT ready for Big Data
A survey of 100 IT decision makers from top dollar firms has revealed that enterprises are more than dabbling their toes in the ocean of Big Data.
Syncsort, which is in the Big Data business itself, said that 62 percent of its respondents will optimise their enterprise data warehouses by sending data and batch workloads to Hadoop.
And 69 percent of the people it polled said they expect to make their enterprise wide data available in Hadoop.
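The batch workloads being offloaded are typically simple aggregations. As a minimal sketch of the idea – a Hadoop Streaming mapper/reducer pair in Python, with a hypothetical date,store,amount input format that has nothing to do with the Syncsort survey itself – a sales roll-up per store might look like this:

#!/usr/bin/env python3
# Run via Hadoop Streaming, e.g. with -mapper "rollup.py map" and
# -reducer "rollup.py reduce"; the file name is hypothetical.
import sys

def mapper():
    # Emit "store<TAB>amount" for every date,store,amount input line.
    for line in sys.stdin:
        _date, store, amount = line.strip().split(",")
        print(f"{store}\t{amount}")

def reducer():
    # Hadoop sorts mapper output by key, so totals can be streamed.
    current, total = None, 0.0
    for line in sys.stdin:
        store, amount = line.strip().split("\t")
        if store != current:
            if current is not None:
                print(f"{current}\t{total}")
            current, total = store, 0.0
        total += float(amount)
    if current is not None:
        print(f"{current}\t{total}")

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()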
Meanwhile, just over half of the respondents are likely to spend between five and 10 percent of their budgets on Big Data projects.
Over 70 percent of the respondents work for companies with turnovers of more than $50 million.
It seems that the IT guys don’t have problems proving the benefits of Big Data to the senior suits that authorise the buys. It appears from the survey that less than 25 percent of those polled have problems allocating budgets to their Big Data plans.
Teradata snaps up Think Big Analytics
Analytic data company Teradata has bought Think Big Analytics.
It bought the company for its Hadoop and big data consulting capabilities, it said in a statement.
Teradata didn’t say how much it paid for the firm, but said Think Big’s team will stay in place. It will continue to use the Think Big brand.
CEO Mike Koehler said it is Teradata’s third buy in six weeks. All, he said, will help to achieve its goal of being the market leader.
“Think Big’s consulting expertise enhances Teradata’s capability to advise customers on the best way to leverage diverse, open source big data technologies to grow their businesses,” he said.
Think Big, said Teradata, has heaps of experience with a number of Hadoop distributions, including Hortonworks, Cloudera and MapR.
Big Data will bring bonanza
The market for Big Data tech and services is expected to be worth $32.4 billion by 2017, growing between now and then at a CAGR of 27 percent.
That’s what market research company IDC is projecting, in a report that says that growth is about six times the growth rate of the ICT market.
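For anyone wanting to sanity-check a forecast like this, CAGR is plain compound growth. The back-calculation below is our own arithmetic using only the figures quoted above, not an IDC number:

# value_2017 = base_2013 * (1 + r)^4, so the forecast implies a base of:
value_2017 = 32.4                      # $ billion, as forecast
cagr = 0.27                            # 27 percent a year
base_2013 = value_2017 / (1 + cagr) ** 4
print(round(base_2013, 1))             # ~12.5 ($ billion)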
Dan Vesset, a VP at IDC, said that the Big Data market is growing fast as startups and large IT companies attempt to take market share and bring in customers.
Cloud infrastructure has the highest CAGR of the individual segments, at 49 percent until 2017. And automation based on Big Data tech is set to affect knowledge worker roles.
And datacentres are likely to suffer too, because data will either be discarded or archived to the cloud, meaning the storage market will be affected.