At the Amazon Summit in Chicago, the CTO of GE Oil & Gas said his unit is exploring ways to better understand its workloads and cut costs on the AWS platform.

Charles Babcock, Editor at Large, Cloud

April 21, 2016

5 Min Read


Use of the Amazon cloud for running enterprise applications will reduce an app's total cost of ownership by an average of 52%, according to Ben Wilson, CTO of GE's Oil & Gas unit.

Wilson was the first Amazon Web Services (AWS) customer to take the stage April 19 as the Amazon Summit got underway this week in Chicago. He was introduced by Matt Wood, general manager for product strategy at AWS, who opened the session.

Wilson told the audience that GE Oil & Gas moved 350 applications onto Amazon over a 30-month period. The simplest, least critical applications, such as the blogging application WordPress, were moved first as GE IT staff learned the ins and outs of Amazon operations. As the staff grew more confident in its ability to manage the results, it migrated increasingly complex workloads, Wilson told InformationWeek in a follow-up interview.

One application running there now is GE Oil & Gas's customer billing application, the one that produces a single consolidated invoice rather than many invoices per customer. The oil and gas industry "is in a bit of a downturn. We want to get paid," he observed wryly.

Another mission-critical app it moved was his unit's analytics engine for data collected from "pig" operations, in which an MRI-like inspection device is hauled through miles of pipeline to check the line's structural integrity. The device carries multiple sensors and generates hundreds of terabytes of data through continual use.

GE's Epsilon custom application analyzes that data, searching for anomalies and indicators of pipeline deficiency or weakness. GE is able to upload 50TB at a time through the use of Amazon's Snowball data transporter appliance, a physical device that's loaded with encrypted data that can be unlocked only upon reaching its destination at an AWS data center.
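For a sense of how such a transfer is driven, a Snowball appliance exposes an S3-compatible endpoint on the local network, so the same client libraries used for S3 can copy data onto the appliance. The Python sketch below is purely illustrative; the endpoint address, bucket name, and directory are assumptions, not details GE disclosed.

```python
# Hypothetical sketch: staging pipeline-inspection data onto a Snowball
# appliance through its local S3-compatible adapter. The endpoint
# address, bucket name, and directory are illustrative assumptions.
import os
import boto3

# The Snowball S3 adapter listens on the appliance's local network address.
snowball = boto3.client(
    "s3",
    endpoint_url="http://192.0.2.10:8080",  # assumed appliance address
)

DATA_DIR = "/data/pig-runs"      # assumed local staging directory
BUCKET = "pig-inspection-data"   # assumed bucket tied to the Snowball job

for root, _dirs, files in os.walk(DATA_DIR):
    for name in files:
        path = os.path.join(root, name)
        key = os.path.relpath(path, DATA_DIR)
        # Data is encrypted on the appliance and can be unlocked only
        # after it arrives at an AWS data center.
        snowball.upload_file(path, BUCKET, key)
        print(f"queued {key}")
```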


AWS's flexible compute platform allows "Epsilon to pull the static out of the pig data," Wilson said in the interview. GE Oil & Gas has been steadily moving the 750TB of pig data it holds onto Amazon in 50TB segments. Not only does the analysis platform work better there, but the timeliness of the data available is improving. In the past, it took GE an average of 100 days to move 50TB of data over its 1-gigabit connection to Amazon. Now it's assured of getting access to the data within two weeks, he said.
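The arithmetic behind those figures shows why a shipped appliance beats the wire. Even at the full theoretical rate of a 1-gigabit link, a 50TB batch would take nearly five days; GE's observed 100-day average implies effective throughput of under 5% of line rate. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope check on the transfer times quoted above.
TB = 10**12  # bytes

data_bytes = 50 * TB   # one 50TB segment of pig data
link_bps = 10**9       # 1-gigabit connection

ideal_days = data_bytes * 8 / link_bps / 86_400
print(f"50TB at full 1Gb/s line rate: {ideal_days:.1f} days")  # ~4.6 days

observed_days = 100    # GE's reported average over the wire
effective_bps = data_bytes * 8 / (observed_days * 86_400)
print(f"implied effective throughput: {effective_bps / 1e6:.0f} Mb/s")  # ~46 Mb/s
```

By contrast, a Snowball round trip of roughly two weeks works out to an effective rate several times higher, with no load on the production link.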

Another point Wilson made at the Summit was that GE Oil & Gas is evaluating which database systems should remain in GE data centers and which are candidates to move into Aurora, Amazon's scalable, MySQL-compatible database service, generally available since last July. Asked whether Oracle, SQL Server, or existing MySQL databases inside GE get first consideration, Wilson said that wasn't the issue.
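For a sense of what the target looks like, standing up an Aurora cluster comes down to two API calls: one for the storage cluster and one for a compute instance to serve queries. The sketch below is illustrative only; the identifiers, region, and credentials are assumptions, not anything GE described.

```python
# Hypothetical sketch: provisioning an Aurora (MySQL-compatible)
# cluster as a migration target. All names and credentials are
# illustrative placeholders.
import boto3

rds = boto3.client("rds", region_name="us-east-1")  # assumed region

rds.create_db_cluster(
    DBClusterIdentifier="billing-aurora",  # assumed cluster name
    Engine="aurora",                       # Aurora's MySQL-compatible engine
    MasterUsername="admin",
    MasterUserPassword="change-me",        # placeholder credential
    DatabaseName="billing",
)

# Aurora separates the cluster (shared storage) from its instances
# (compute); at least one instance is needed to serve queries.
rds.create_db_instance(
    DBInstanceIdentifier="billing-aurora-1",
    DBInstanceClass="db.r3.large",
    Engine="aurora",
    DBClusterIdentifier="billing-aurora",
)
```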

[Want to learn more about AWS's Aurora database? Read Amazon Urges Proprietary Database Customers to Migrate.]

"We've become more technical over the last few years" in terms of cloud computing, he said of the GE Oil & Gas IT staff managing the migration to Amazon. For one thing, it has launched a "bot army" to monitor and measure the use of the workloads already there.

The bot army taught Wilson and his staff that database systems and applications that are used infrequently, or only for limited periods each week, offer a special opportunity. They could become "metered applications," charged for use during the time they're needed, then shut down.

The Bot Army
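What might one of those bots look like? Purely as an illustration -- GE hasn't published its tooling -- the following Python sketch scans EC2 instances and flags any whose average CPU over the past week marks it as a candidate for metering; the 5% threshold is an assumption.

```python
# Hypothetical usage-monitoring "bot": flag EC2 instances whose
# week-long average CPU suggests they're only lightly used. The
# 5% threshold is an assumed cutoff, not GE's.
from datetime import datetime, timedelta
import boto3

ec2 = boto3.client("ec2")
cloudwatch = boto3.client("cloudwatch")

IDLE_THRESHOLD = 5.0  # average CPU percent (assumed)
now = datetime.utcnow()

for reservation in ec2.describe_instances()["Reservations"]:
    for instance in reservation["Instances"]:
        instance_id = instance["InstanceId"]
        stats = cloudwatch.get_metric_statistics(
            Namespace="AWS/EC2",
            MetricName="CPUUtilization",
            Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
            StartTime=now - timedelta(days=7),
            EndTime=now,
            Period=3600,  # hourly datapoints
            Statistics=["Average"],
        )
        points = stats["Datapoints"]
        if not points:
            continue
        avg = sum(p["Average"] for p in points) / len(points)
        if avg < IDLE_THRESHOLD:
            print(f"{instance_id}: avg CPU {avg:.1f}% -- metered-app candidate")
```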

Database systems in particular show a savings. On AWS, their owners pay only for the time they're being used, rather than paying for a lifetime license and annual maintenance that assumes they'll run 24 hours a day, seven days a week. But it's the ability to measure when they're used -- and how much they're used -- that allows GE operations to set Amazon Elastic Beanstalk to launch them when they're needed and shut them down when they're done.
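A minimal sketch of that launch-and-shut-down cycle, assuming an Elastic Beanstalk environment and using the service's terminate and rebuild operations (the environment ID is illustrative):

```python
# Hypothetical sketch of the metered launch/shut-down cycle -- not
# GE's actual tooling. The environment ID is an assumption.
import boto3

eb = boto3.client("elasticbeanstalk")

REPORTING_ENV_ID = "e-abcd1234"  # assumed Elastic Beanstalk environment ID

def shut_down(env_id: str) -> None:
    # Tears down the environment's EC2 instances, load balancer, and
    # related resources, so the app stops accruing compute charges.
    eb.terminate_environment(EnvironmentId=env_id)

def launch(env_id: str) -> None:
    # Recreates the resources for a recently terminated environment.
    eb.rebuild_environment(EnvironmentId=env_id)

# Called from a scheduler around, say, a weekly reporting window:
launch(REPORTING_ENV_ID)
# ... run the metered workload ...
shut_down(REPORTING_ENV_ID)
```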

For all applications moved to the cloud, "it's their metered aspect that's appealing," he said. "That's what the world is changing into, a metered opportunity. That's a real change in the IT environment."

The metered approach was one of the reasons Wilson could conclude that applications moved into the cloud showed, on average, a 52% lower total cost of ownership than on-premises applications.

Now GE Oil & Gas is asking itself what its next big Amazon initiative should be. "Do we want to put the CAD engine and project lifecycle management systems on AWS?" he asked. A strong argument to do so is that CAD and project files are huge and not easily moved around the world to the locations where GE houses its engineering teams. GE has "thousands" of engineers in such locations as Oklahoma City, Houston, and Shanghai. It also has teams in Florence, Italy; Aberdeen, Scotland; Perth, Australia; and dozens of other locations.

Moving CAD into the cloud would allow engineering teams to more readily view, share, and work jointly on projects in progress. It would also provide instant feedback from a knowledgeable engineering staff if anything doesn't perform the way they think it should. On the other hand, GE would be moving important, proprietary data out the door.

The migration to the cloud isn't the only IT initiative afoot in the Oil & Gas unit. Wilson said the IT staff is also figuring out which existing outsourced services should be brought back in house. "Insourcing" will give IT further cost and operational control over those aspects of IT now outsourced.

The opportunity for change is being pursued in both directions. Both insourcing and metered applications in the cloud allow GE IT to better understand how its infrastructure works, or doesn't work, and that will lead to a more streamlined, cost-efficient IT in the future, he said.

About the Author

Charles Babcock

Editor at Large, Cloud

Charles Babcock is an editor-at-large for InformationWeek and author of Management Strategies for the Cloud Revolution, a McGraw-Hill book. He is the former editor-in-chief of Digital News, former software editor of Computerworld, and former technology editor of Interactive Week. He is a graduate of Syracuse University, where he obtained a bachelor's degree in journalism. He joined the publication in 2003.

