Release 1.6 of the MongoDB open source project includes a major new feature: auto-sharding, the ability to subdivide a very large data set and spread it across many servers while continuing to perform high-speed updates and retrievals.
MongoDB is one of several big data systems sometimes referred to as key-value stores or NoSQL systems. They have emerged to meet very large-scale data management requirements on the web and at e-commerce sites such as eBay.
In addition to MongoDB, systems such as CouchDB and Cassandra are in use at social networking and game sites, including Farmville, and at online retailers such as Amazon.com. MongoDB is used by Foursquare, a location-based social networking service whose users earn points for "checking in" from certain locations. The user who checks in at a place the most times becomes the "mayor" of that location and is eligible for special deals. Foursquare passed the 100 million check-in mark in mid-July.
Sharding is a way for a database system to expand its data handling capabilities. It allows MongoDB to scale horizontally across more servers as the size of a job increases. Automated sharding allows the system to set up a routing map in the database based on a key to each shard. When a query or update arrives at the MongoDB system, the router component in the database detects what it is looking for and knows where to find the data.
Like other large-cluster software, such as Hadoop, MongoDB generates two replicas of the original data set within the server cluster so that it can tolerate a hardware failure. If such a failure occurs, MongoDB retrieves a copy of the failed server's data from a backup source, generates a third copy, and continues working.
10gen, the company behind the MongoDB open source code project, added automated sharding in Release 1.6 on Aug. 5 and has been promoting the feature in interviews and webinars since.
A MongoDB system could split an extra-large address book into four alphabetical segments, spreading the segments across servers in a cluster. When a query sought a particular name, the system would respond quickly, using the first letter of the last name as a key to route the query to the correct shard, said Roger Bodamer, senior VP of products and engineering for 10gen, in an interview.
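The address-book example above can be sketched as a simple routing table. This is an illustrative toy, not MongoDB's actual internals: the shard names and alphabetical ranges are assumptions made up for the example.

```python
# Toy sketch of range-based shard routing, in the spirit of the
# address-book example: four alphabetical segments, each assigned
# to one shard. Names and ranges are illustrative only.

SHARD_RANGES = {
    "shard0": ("A", "F"),  # last names A-F
    "shard1": ("G", "L"),  # last names G-L
    "shard2": ("M", "R"),  # last names M-R
    "shard3": ("S", "Z"),  # last names S-Z
}

def route(last_name: str) -> str:
    """Return the shard responsible for a given last name,
    using the first letter as the shard key."""
    first = last_name[0].upper()
    for shard, (lo, hi) in SHARD_RANGES.items():
        if lo <= first <= hi:
            return shard
    raise ValueError(f"No shard covers {last_name!r}")
```

In a real deployment the router consults a routing map maintained by the cluster rather than a hard-coded table, but the principle is the same: the shard key alone determines which server receives the query.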
Bodamer also heads a new office for 10gen in Redwood City, Calif. The company and its largest development group are headquartered in New York. 10gen sells training and consulting around the MongoDB system as it contributes to its ongoing development.
The Foursquare social networking system makes use of MongoDB's geospatial indexing, the ability to use geographic position as a data element.
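The kind of query a geospatial index accelerates is "find the nearest venue to this position." Below is a minimal, unindexed sketch of the underlying distance computation; a real geospatial index avoids the linear scan shown here. The venue records and function names are assumptions for illustration.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def nearest(venues, lat, lon):
    """Linear scan for the closest venue; a geospatial index
    answers the same question without scanning every record."""
    return min(venues, key=lambda v: haversine_km(lat, lon, v["lat"], v["lon"]))
```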
A second feature added in MongoDB 1.6 is replica sets, Bodamer said. Replica sets establish a hierarchy among the three nodes handling a data set: one node is the primary handler of the data, and the other two receive replicated copies. If the primary node is lost, the two remaining nodes "vote on which is to become the next primary node," Bodamer said. Previously established criteria determine the outcome of the vote, and the new primary node establishes a third copy of the data to replace the failed one.
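The failover behavior described above can be modeled as a simple election among surviving nodes. This is a toy model: real MongoDB elections weigh factors such as data freshness, while here a single made-up "priority" field stands in for those previously established criteria.

```python
# Toy model of replica-set failover: when the primary is lost,
# the surviving nodes elect a new primary. The "priority" field
# is an illustrative stand-in for the election criteria.

def elect_primary(nodes, failed):
    """Pick the highest-priority surviving node as the new primary."""
    survivors = [n for n in nodes if n["name"] != failed]
    if not survivors:
        raise RuntimeError("replica set has no surviving members")
    return max(survivors, key=lambda n: n["priority"])["name"]
```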
Data reads, or responses to queries, are sped up by the auto-sharding feature. Data writes, or updates to the database, are sped up by replica sets, Bodamer said.