When fall comes, leaves drop from trees, nights last longer than days, and SnapLogic updates its Elastic Integration Platform.
Following recent trends in big data integration, SnapLogic is adding new connectors, or "Snaps," to enable tie-ins with its integration-platform-as-a-service (iPaaS) offering.
The SparkSnap will let users create their own Spark-based pipelines; the Sparkplex data-processing platform executes those pipelines at the cluster level; and a Snap for Apache Cassandra lets customers choose the big data store best suited to their needs.
All this brings Spark code into the dataflow, allowing users to do their processing at the Spark layer, explained Darren Cunningham, SnapLogic's VP of marketing.
What it all boils down to is reducing processing overhead in big data systems that are only now making greater use of Hadoop. A recent AtScale study found ETL (extract, transform, load) to be a major reason enterprises are turning to Hadoop to handle big data.
It is the "transform" part of ETL that SnapLogic focuses on. "Transformation is done in the core ETL server itself," Cunningham said. There is no need for an army of technicians to build integrations between different parts of the in-house IT system.
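To make the "transform" step concrete, here is a minimal sketch in plain Python of what the "T" in ETL does; all record names and fields are invented for illustration, and an actual SnapLogic or Spark pipeline would express the same idea through its own interfaces rather than hand-written code:

```python
# Minimal ETL sketch. The "transform" step normalizes raw records
# in one place, so downstream systems see consistent types/formats.
# All field names and values here are hypothetical examples.

RAW_ORDERS = [
    {"id": "1", "amount": "19.99", "region": " us-east "},
    {"id": "2", "amount": "5.00", "region": "EU-WEST"},
]

def transform(record):
    """The 'T' in ETL: coerce types and normalize string formats."""
    return {
        "id": int(record["id"]),
        "amount": float(record["amount"]),
        "region": record["region"].strip().lower(),
    }

def run_pipeline(raw_rows):
    # Extract -> Transform -> Load; "load" here is just returning
    # the cleaned list instead of writing to a real data store.
    return [transform(r) for r in raw_rows]

cleaned = run_pipeline(RAW_ORDERS)
print(cleaned[0])  # {'id': 1, 'amount': 19.99, 'region': 'us-east'}
```

The point the article is making is about where this logic runs: pushing the transform into the processing engine (the Spark layer) instead of a separate staging system is what cuts the overhead.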
Use of a graphical user interface speeds the process, cutting data integration projects from days or weeks down to hours or days, he continued.
SnapLogic sticks to a pretty tight development schedule to keep pace with change. "We do four releases per year of the core platform, delivered as a cloud-based service," he said. New snaps are added monthly. Sometimes a new snap may be crafted for a single customer, only to be released later in the year for the rest of the client base.
"We can move quickly because we are a cloud service," Cunningham said. Development priorities are guided by SnapLogic's monitoring of how customers actually use the service, which also lets the company plan capacity and upgrades.
Lately, SnapLogic has been paying more attention to providing APIs and tools that focus on data engineering problems, he explained. It's about "respecting data gravity": integration takes place where the data resides.
As users acquire greater experience handling Hadoop, they are looking for ways to move their clusters from on-premises to the cloud. "Pricing options are becoming attractive," he said. "As enterprise customers dig in, we are seeing just as much focus on governance and security." Those are "classic IT requirements."

William Terdoslavich is an experienced writer with a working understanding of business, information technology, airlines, politics, government, and history, having worked at Mobile Computing & Communications, Computer Reseller News, Tour and Travel News, and Computer Systems ...