Oracle, SAP and big data standards

Monday 26th March 2012
Image: Big Data (courtesy m.enterprisecioforum.com)

Database and cloud software supplier Oracle has scooped up two human resources software companies and is to run Fusion applications from its data centres. SAP promises a unified data management portfolio for April, with HANA for general business use. Calpont is a finalist in the CODiE Awards. But hopes for a big data standard similar to LAMP are reportedly slim at present.

Database software creator Oracle reports an 18% profit jump for its fiscal third quarter thanks to new licence sales, as it sold more databases and business applications, gaining a 3% revenue rise to $2.37bn. But hardware systems sales declined 16% to $869m. Oracle bought Sun Microsystems two years ago, but its sales of large data-processing systems come at the expense of Sun's less expensive products.

Oracle is the second-largest maker of business applications used to manage financials, operations and human resources, after SAP AG. Recent acquisitions have broadened its offerings in cloud computing software delivered over the Web. Last month it declared it was to buy Taleo Corp, which makes online human resources software, for $1.9bn, its second Web software acquisition in less than four months. Oracle has acquired RightNow Technologies Inc. for $1.5bn to gain online customer service software, and has announced that the Oracle Public Cloud is to run Fusion applications in Oracle's data centres.

Oracle's biggest competitor, SAP, acquired Sybase, the world's No. 4 maker of database software, back in July 2010 and is to unveil a unified data management portfolio in April. Last year SAP launched the High Performance Analytic Appliance (HANA), a specialised in-memory database that earned €160m in its first two quarters on the market, well ahead of SAP's €100m target.

The technology has been sold to date to handle a series of niche applications that help companies analyse large data sets. But SAP plans to make it available as a database for general business management applications by the end of the year.

In other database news, Calpont Corporation, maker of scalable, high-performance column-oriented analytic databases for ultra-fast, deep analysis of massive data sets, has had its InfiniDB software selected as a finalist for Best Database Innovation in the Software and Information Industry Association (SIIA) CODiE Awards.
 

HOW TO STANDARDISE ON BIG DATA STACKS

But, reports PC Advisor, despite strong interest in a standardised big data software stack, akin to what LAMP achieved with Linux, the Apache Web server, the MySQL database and a set of programming languages (Perl, Python and PHP), there appears to be no easy answer.

Large Web service companies that use Hadoop, such as eBay and Twitter, are running in a "continuous beta," and "hire a lot of technically competent staff to handle the pace of rapid change," says Mark Staimer, president of Dragon Slayer Consulting.

Arriving at such a stack may be difficult, given the variety of technologies available, and the degrees of difficulty inherent in connecting them together in various configurations.

"Now we have loads of different pieces out there … MongoDB, Cassandra, HBase," says research director Jo Maitland, who covers cloud technology for GigaOM Pro. All this choice "makes it more difficult for people. We're in a mashup situation with all these different components."

Variety emerged to address differing user needs, says Mark Baker, Canonical Ubuntu server product manager. "MySQL, for instance, is really fast at reading data, but the Cassandra data store, on the other hand, can write data more quickly." The production company behind the U.K. television show 'Britain's Got Talent' used a Cassandra database to log the votes of viewers choosing their favourite performer, because it could ingest a high number of simultaneous writes.
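The trade-off Baker describes comes from the write path. Stores like Cassandra use a log-structured design: writes just append to an in-memory buffer that is periodically flushed as a sorted run, so ingesting a vote burst is cheap, while reads may have to search several runs. A toy sketch of that idea in plain Python (the class and keys are illustrative, not Cassandra's actual API):

```python
import bisect

class ToyLSMStore:
    """Toy log-structured store: writes buffer in a memtable and
    flush to sorted runs; reads may have to check every run."""

    def __init__(self, memtable_limit=4):
        self.memtable = {}        # recent writes, unsorted
        self.runs = []            # flushed, sorted (key, value) lists
        self.memtable_limit = memtable_limit

    def put(self, key, value):
        # Writes never touch the sorted runs -- they just buffer,
        # which is what makes the write path so cheap.
        self.memtable[key] = value
        if len(self.memtable) >= self.memtable_limit:
            self.runs.append(sorted(self.memtable.items()))
            self.memtable = {}

    def get(self, key):
        # Reads are the expensive side: newest data first, then
        # every flushed run from newest to oldest.
        if key in self.memtable:
            return self.memtable[key]
        for run in reversed(self.runs):
            keys = [k for k, _ in run]
            i = bisect.bisect_left(keys, key)
            if i < len(keys) and keys[i] == key:
                return run[i][1]
        return None

# Log a burst of viewer votes: every put is a cheap buffered append.
store = ToyLSMStore()
for n, act in enumerate(["act_a", "act_b", "act_a", "act_c", "act_b"]):
    store.put(f"vote:{n}", act)

print(store.get("vote:3"))  # -> act_c
```

A read-optimised store such as MySQL with B-tree indexes makes the opposite bet: each write pays to keep the index ordered so that reads stay fast.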

A number of companies, such as Cloudera, MapR and Hortonworks, have released commercial Hadoop distributions in which all the software components are integrated. But even Hadoop itself is unsuited to all jobs. It processes data as batch jobs, meaning the full data set must be written to a file before it can be analysed. Many jobs, however, involve the analysis of continually updated data, such as click streams or Twitter messages.

"If there is going to be a stack, it needs to be [managed by] an open source organization and not necessarily managed by a specific company," Maitland said. "Not having a standardised stack ... drives up the cost of hiring experts to manage and use such systems. Right now the competition for experts is fierce."

"Trying to build [a big data system] takes knowledge and skill. To plug those into your infrastructure can take time and money," Baker said. "There is no standard roadmap -- it is a feeling along process. Putting it all together is not a simple task."

"You can't have the explosive growth in an industry with so much specialised knowledge that is required as of right now. The average business analyst can't write queries against Hadoop," added Staimer. 
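Staimer's point about the skill gap is concrete: querying raw Hadoop means writing a MapReduce job rather than SQL. The sketch below hand-rolls the two halves of such a job in the style of Hadoop Streaming (which feeds mappers and reducers raw lines over stdin); the function names and sample data are illustrative, and the shuffle/sort between the phases is simulated by simply chaining the functions.

```python
from collections import defaultdict

# Hadoop Streaming convention (hand-rolled sketch, not the real
# framework): a mapper turns raw input lines into key/value pairs,
# the framework shuffles and sorts them, and a reducer aggregates
# the values for each key.

def map_lines(lines):
    """Mapper: emit (word, 1) for every word in every input line."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_pairs(pairs):
    """Reducer: sum the counts emitted for each word."""
    totals = defaultdict(int)
    for word, count in pairs:
        totals[word] += count
    return dict(totals)

# In a real job the two functions run as separate processes reading
# stdin and writing tab-separated pairs; here we chain them directly.
log_lines = ["big data big stacks", "data stacks"]
counts = reduce_pairs(map_lines(log_lines))
print(counts["big"])   # -> 2
```

The same question in a SQL-on-Hadoop layer such as Hive is roughly one statement (`SELECT word, COUNT(*) ... GROUP BY word`), which is exactly the gap such layers emerged to close for business analysts.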


