Data-integration and data-transformation steps taken before loading the database are sometimes the best antidote to high-volume storage and scaling challenges.
When it comes to making sense of big data, the glory is hogged by database platforms such as EMC Greenplum, IBM Netezza, Oracle Exadata, and Teradata.
But sometimes simple data processing work done outside of the database can help you scale and eliminate hours or even days of processing on expensive database platforms.
Marketing data provider comScore, for example, uses data-sorting techniques to improve compression and aggregate data before it even gets to the warehouse. As a result it's saving on storage, reducing processing times, and, most importantly, speeding information to its data-hungry customers.
As I detailed late last year, comScore has been a leading source of online marketing data for more than a decade. As such it was a pioneer of big-data computing. At 150 terabytes, the company's latest Sybase IQ warehousing platform sounds big, but it would have to be many times larger if not for the company's skill at compressing and aggregating data.
The big-data leagues span from the tens of terabytes into the petabytes. That's when it becomes essential to add the power of massively parallel processing (MPP), used by most of the leading platforms, or the compression advantages of column-oriented databases (such as Sybase IQ and HP Vertica). But organizations playing at this scale also have to manage big data before it gets into the database.
To give you some idea, comScore tracks the daily Internet surfing (and mobile-access) habits of about 2 million consumer panelists who have registered and supplied their demographic and psychographic profiles. The company also takes a daily census of activity across the Internet so it can report on and compare Internet-wide behavior to that of targeted segments tracked through the panel data. As a result, comScore collects about 2 billion new rows of panel data and more than 18 billion new rows of census data each day.
That means more than 20 billion rows of new data are loaded into the data warehouse each day. Of course, most every organization will apply compression to reduce storage demands. But comScore also uses Syncsort DMExpress data integration software to sort and bring alphanumeric order to the data before it's loaded into the warehouse. This improves compression ratios.
Where 10 bytes of unsorted data can be compressed to three or four bytes, says Michael Brown, comScore's chief technology officer, 10 bytes of sorted data can typically be crunched down to one byte. "That makes a huge difference in the volume of data we have to store, and it streamlines our processes and reduces our capital costs," Brown says.
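The effect is easy to demonstrate in miniature. The sketch below is a toy Python illustration, not comScore's Syncsort pipeline: it generates synthetic alphanumeric records, then compresses a shuffled copy and a sorted copy. The sorted copy compresses markedly better because duplicate and near-duplicate records end up adjacent, giving the compressor long runs of shared prefixes to exploit.

```python
import random
import zlib

# Toy data: 100,000 short alphanumeric records drawn from a limited
# value space, so duplicates and shared prefixes are common (as they
# would be in clickstream logs). Names and sizes here are illustrative.
random.seed(0)
records = [
    f"user{random.randint(0, 999):03d},page{random.randint(0, 49):02d}"
    for _ in range(100_000)
]

unsorted_blob = "\n".join(records).encode()          # load order
sorted_blob = "\n".join(sorted(records)).encode()    # pre-sorted before load

unsorted_size = len(zlib.compress(unsorted_blob, 9))
sorted_size = len(zlib.compress(sorted_blob, 9))

print(f"raw size:            {len(unsorted_blob)} bytes")
print(f"compressed unsorted: {unsorted_size} bytes")
print(f"compressed sorted:   {sorted_size} bytes")

# Sorting groups identical and similar rows together, so the
# compressor sees far more local redundancy.
assert sorted_size < unsorted_size
```

The exact ratios depend on the data and the compressor, but the direction of the result holds broadly: the same principle is why column-oriented stores such as Sybase IQ benefit so much from ordered, low-cardinality data.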