TRAC Intermodal's Journey to Data, Analytics, Cloud
The IT team at this provider of chassis for metal shipping containers has transformed its data infrastructure from spreadsheets and reports to a Hadoop data lake and dashboards.
When you were eating those chips and guacamole in front of your flat screen TV during the recent World Cup matches, you probably weren't thinking about how that television and the avocados for that dip arrived on US shores in shipping containers. While not many people spend a lot of time thinking about shipping containers and the equipment used to move them around, those containers and that equipment play a large role in enabling the US and global supply chain and delivering many of the goods we buy and consume on a daily basis.
The innovation of standard-sized or "intermodal" shipping containers dates back at least a half century, so it may be no surprise that a business that supplies the equipment needed for moving individual shipping containers operates with a traditional culture -- one that may not easily welcome digital transformation and its benefits.
Indeed, John Ackerman, senior director of enterprise architecture and innovation at TRAC Intermodal, says that the company had been a "pen and paper" business for years. TRAC Intermodal supplies chassis for intermodal shipping containers to marine shipping companies and trucking companies. The term intermodal refers to standardized shipping containers that can move between ship, rail, and truck. A chassis is the wheeled metal frame onto which the industry mounts those metal containers in order to transport them over the road. The company is the largest in the marine sector of the intermodal equipment provider market and was previously a leader in the domestic market as well, Ackerman told InformationWeek in an interview.
But while it may have been at the top in terms of providing equipment for shipping containers, the company was still relying on Microsoft Excel spreadsheets on shared drives to work with its data. When it had to share data with its customers -- marine shipping lines or independent truckers -- it often did so by emailing spreadsheets.
Soon after he joined the company two years ago, Ackerman set out to bring the company's technology up to date. The project began by trying to match IT capabilities to the goals of the organization, and that rapidly evolved into a plan to figure out how to deliver analytics to an organization that had never had them.
"They didn't understand what analytics meant," Ackerman said. The company had been working with operational-based reporting, not analytics. The goal became "how do we innovate and how do we push data as an enterprise asset -- a strategic asset throughout the organization?"
TRAC Intermodal's data is largely made up of "movement" data -- for instance, when did a piece of equipment move out of the terminal, where did it go, and when did it come back? From that data, TRAC Intermodal constructs a segment, or cycle, that is the basis of its billing to customers and also the basis for providing customer service.
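The article doesn't describe TRAC Intermodal's actual schema, but the idea of pairing "out" and "in" movement events into a billable cycle can be sketched roughly as follows. All field names, terminal names, and the pairing logic here are illustrative assumptions, not the company's implementation:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Movement:
    """A single hypothetical movement event for one chassis."""
    chassis_id: str
    event: str          # "out" (left the terminal) or "in" (returned)
    terminal: str
    timestamp: datetime

def build_cycles(movements):
    """Pair each 'out' event with the next 'in' event per chassis,
    producing the segments/cycles a billing process might start from."""
    open_out = {}   # chassis currently out, keyed by chassis_id
    cycles = []
    for m in sorted(movements, key=lambda m: m.timestamp):
        if m.event == "out":
            open_out[m.chassis_id] = m
        elif m.event == "in" and m.chassis_id in open_out:
            start = open_out.pop(m.chassis_id)
            cycles.append({
                "chassis_id": m.chassis_id,
                "out_terminal": start.terminal,
                "in_terminal": m.terminal,
                "days_out": (m.timestamp - start.timestamp).days,
            })
    return cycles

events = [
    Movement("CH123", "out", "Port Newark", datetime(2018, 3, 1, 8, 0)),
    Movement("CH123", "in", "Port Newark", datetime(2018, 3, 4, 17, 0)),
]
print(build_cycles(events))
```

In a real system the pairing would have to handle unmatched events, repositioning moves, and per-contract rates, but the core unit of analysis remains this out-to-in cycle.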
After a shaky attempt at a traditional data warehouse several years back, TRAC Intermodal decided a data lake would be a better fit for the company, and potentially a first step toward a successful data warehouse. The company evaluated the three major Hadoop suppliers and chose Hortonworks, which it has implemented in the Microsoft Azure cloud; it currently has about 2 TB of data there. AtScale provides the layer that lets more traditional users access data in the data lake with the tools they already know. In a separate effort around the same time, the company recruited a couple of Hortonworks experts from Hewlett Packard. Ackerman explained that the group chose Hadoop as the foundation because the company planned to load both its structured and unstructured data into the data lake.
The infrastructure is also designed to support TRAC Intermodal's plans to ingest IoT data from its "smart" GPS-enabled chassis, he said.
The setup is a big improvement, according to Gina Petito, manager of data analytics for the enterprise analytics team at TRAC Intermodal. She should know. She joined IT after spending the first 12 years of her career doing financial analysis on the business side.
"I knew what the business users were going through," she said. "The amount of heroics that had to happen on the business side to produce any sort of analytical report was a lot."
Reports would take four hours to run, Petito said, and many people used unsanctioned workarounds to get the job done. The data lake now provides a certified and sanctioned place for anyone in the company to get data.
But in an old industry such as shipping, not all innovations are welcomed. When Shauna Davis interviewed for the position of technical manager of big data architecture at TRAC Intermodal, Ackerman told her that one of the big challenges for the big data project would be the old-school culture. Users may be suspicious of change, even if it saves them time.
Davis is one of the Hortonworks experts who joined TRAC Intermodal from HP.
"Our biggest challenge has been the culture change, getting the business onboard with the capabilities of these big data technologies and platforms," Davis said. "People are scared. They need to understand, they need to be trained, they need knowledge transfer. That's what we've tried to do here."
To meet the cultural resistance, Davis said, the group targeted very specific use cases. Petito spearheaded efforts to go out to business users and assess their needs, and then the team paired streams of real-time data with flashy dashboards, Davis said. Those dashboards meant that users wouldn't have to wait four hours or two days for the insights they needed.
Still, not all users are convinced.
"It's a curse of the legacy system that it is very tabular in nature," Davis said. "The business pulls reports that gives them 150 columns and thousands and thousands of rows down. The business loves to see this information." And users are reluctant to give up that view of everything in favor of a dashboard that offers top-level views of key metrics and lets them drill down into the detail. It's the opposite of how they have traditionally viewed the data.
Petito said they still want to see the giant spreadsheet views first.
"But what are they really doing with that? They are filtering, they are putting it into a pivot table," she said. "…The business isn't used to seeing the information they can act upon instantaneously."
Davis said that the disconnect comes down to trust.
"In order for them to trust us they want to see the data at the most detailed level," Davis said. But nine months after introducing the dashboards, Petito said she thinks the group may be turning a corner in gaining user acceptance.
But the team is not waiting for widescale acceptance before embarking on other pieces of the larger transformation project. The company is working to implement a new ERP system -- Oracle E-Business Suite. Other items on the 2018 agenda include loading the company's historical data into the data lake and taking initial steps toward applying machine learning to that data.
Jessica Davis has spent a career covering the intersection of business and technology at titles including IDG's InfoWorld, Ziff Davis Enterprise's eWeek and Channel Insider, and Penton Technology's MSPmentor. She's passionate about the practical use of business intelligence.