Oracle's answer for large-scale data warehousing looks promising. But as this in-depth review reveals, production deployments and clearer insight into cost and administrative overhead are still wanting.
Introduced in September 2008, Oracle Exadata is Oracle's first major offering in fully parallel data warehousing. As such, it fixes the biggest gap in Oracle's data warehouse product line. Indeed, Exadata is something of an architectural leapfrog, establishing a "storage tier" separate from the "database tier" in a way no other commercial product previously has. Combined with the vast array of features already present in the Oracle database management system (DBMS) -- including strong capabilities to handle high-concurrency workloads -- Exadata has the potential to become the most capable data warehouse offering on the market. And of course it offers the ultimate in "Oracle compatibility," which alternative vendors generally lack.
- Architecture looks effective (in theory)
- Architectural tradeoffs and inherent complexity
- Lack of proven reference implementations in the field
Data warehouse DBMS architecture is all about tradeoffs, however, and so Oracle still lacks features other vendors do offer. This deficiency is most obvious when compared to column-oriented DBMSs; if your needs are better met by columnar than row-based data warehousing, Oracle Exadata probably isn't for you. But other row-based vendors -- Teradata and upstarts alike -- also sport features that wouldn't make sense in the Exadata design.
What's more, any favorable view of Exadata assumes that it actually works more or less as advertised. That assumption has yet to be widely borne out. Although beta customers have been named, Oracle hasn't provided any true production references. Oracle is even reluctant to run Exadata proofs of concept at customer sites, preferring instead to do tests at its own facilities. Oracle's demos and other released information seem sufficient to establish that Exadata can provide a significant speed-up over Exadata-less Oracle DBMS deployments. Less clear is how Oracle Exadata compares with non-Oracle alternatives.
The last unknown is price -- and, more generally, total cost of ownership (TCO). Oracle's all-in list pricing is in the $50-$60 thousand/terabyte range using slower disks and in the $110-$130 thousand/terabyte range using faster ones, depending on system size. That is at the high end of the market. These ranges make sense, given Oracle's traditional software pricing posture and the fact that it isn't a leader on hardware/cost-saving metrics in areas such as raw analytic query performance or data compression. (For more detailed pricing information, visit "Oracle Database Machine and Exadata Pricing" and this list of posts on competitor pricing.) But realistically, these fees are all just first approximations, as each enterprise's negotiating situation is different.
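As a rough first approximation, those per-terabyte list figures can be multiplied out against the rated configurations discussed below. The sketch assumes the 92 TB rating corresponds to the slower disks and the 42 TB rating to the faster ones (the article does not state the mapping explicitly), and uses list prices only, not negotiated prices:

```python
# First-approximation all-in list cost for an Exadata configuration,
# using the published $/terabyte ranges. Which disk type maps to
# which rated capacity is an assumption here; all figures are list
# prices, and real negotiated prices will differ.

def list_cost_range(user_tb, low_per_tb, high_per_tb):
    """Return (low, high) total list cost in dollars for a rated capacity."""
    return (user_tb * low_per_tb, user_tb * high_per_tb)

# Slower disks: ~92 TB rated, $50K-$60K per terabyte of user data
slow = list_cost_range(92, 50_000, 60_000)
print(f"92 TB, slower disks: ${slow[0]:,} - ${slow[1]:,}")

# Faster disks: ~42 TB rated, $110K-$130K per terabyte
fast = list_cost_range(42, 110_000, 130_000)
print(f"42 TB, faster disks: ${fast[0]:,} - ${fast[1]:,}")
```

Under that assumed mapping, both standard configurations land in roughly the same $4.6-5.5 million total range, which is consistent with the two disk options being alternative builds of the same system.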
Exadata is designed for data warehouses in the multi-terabyte range (user data). The smallest configuration you can buy is rated at 6 terabytes with two storage cells (the minimum allowed). The two most standard configurations have 14 storage cells each and are rated at 42 or 92 terabytes, depending on the type of disk used. Note that all such figures are approximate, because the user data to disk ratio is heavily affected by the nature of the raw data.
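The per-cell ratings implied by those configurations follow from simple division (the disk-type mapping is an assumption, and Oracle's ratings are approximate to begin with):

```python
# Rated user-data capacity per storage cell, derived from the
# configuration figures above. All ratings are approximate, since
# the user-data-to-disk ratio depends heavily on the raw data.

def per_cell_tb(rated_tb, cells):
    """Rated terabytes of user data per storage cell."""
    return rated_tb / cells

print(per_cell_tb(6, 2))    # minimum config: 2 cells, 6 TB rated
print(per_cell_tb(42, 14))  # standard config, faster disks (assumed)
print(per_cell_tb(92, 14))  # standard config, slower disks (assumed)
```

The minimum 6 TB/two-cell configuration and the 42 TB/14-cell configuration both work out to about 3 TB of rated user data per cell, suggesting the minimum system simply uses fewer cells of the same type.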