4 Trends In 2017 That Every Developer Needs To Understand

Developers need to be ready for the opportunities that quantum computing, big data, and mixed reality might bring across the industry, including commercial computing, in 2017.

Guest Commentary

February 3, 2017

5 Min Read
Martin Puryear, Coding Dojo

2017 takes us one step closer to a future articulated in classic and influential science fiction (such as The Jetsons). Artificial intelligence, virtual reality, and quantum computing teeter on the brink of the mainstream. As a coding instructor and curriculum designer, I spend a lot of time thinking about where the tech industry is headed so I can prepare my students for that new world. Here are several trends I think will dominate software development in 2017.

Increase in client-server hybrid systems. In 2017 we will see more software systems that blend local and cloud computing in varying proportions. In traditional web programming, a browser connects to a backend server, which does all the actual processing. In traditional application programming, programs run locally (e.g., on a phone or laptop) and do all their work on the device itself. Apps such as OneNote or Firefox run locally, whereas web services such as Amazon or Gmail run on servers in the cloud.

Some systems are hybrids: neither pure applications nor pure services. Some computing is done locally, some in the cloud. For example, games written for the Xbox One can tap into a large amount of local processing power in the console, while still incorporating a rich multiplayer component through Xbox Live. This kind of system isn't new; we see it in any app that has a "connected" mode and an "offline" mode, such as Google Maps or Outlook. The cloud-versus-local divide is becoming less distinct. As computing power grows in both devices and services (along with growing broadband availability), companies are increasingly creating integrated hybrid systems.

In 2017 we will continue to see software built in this way, and I predict that the distinction between cloud and local computing will blur further, to the point that the split can sit anywhere on the spectrum, based on each system's needs. For features that must work on every device and browser, developers will continue to build web apps, leveraging cross-platform components. For extremely responsive or real-time features, work must be done locally to avoid network latency. As always, feature requirements (in concert with team expertise) will suggest the best architecture, and going forward most systems will be blended in this way.
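To make the hybrid pattern concrete, here is a minimal sketch of a feature that prefers the cloud for heavy lifting but degrades gracefully to local computation when the network is unavailable. The endpoint and function names are hypothetical, not any particular product's API.

```python
import json
import urllib.request

# Hypothetical cloud endpoint; stands in for a real backend service.
CLOUD_ENDPOINT = "https://api.example.com/summarize"

def summarize_locally(records):
    """'Offline' mode: do the work on the device itself."""
    total = sum(r["value"] for r in records)
    return {"count": len(records), "total": total}

def summarize(records, timeout=2.0):
    """'Connected' mode: send the work to the cloud, but fall back to
    local computation if the network is slow or unreachable."""
    payload = json.dumps(records).encode("utf-8")
    request = urllib.request.Request(
        CLOUD_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return json.loads(response.read())
    except OSError:
        # Covers connection failures and timeouts: keep the feature
        # responsive even without the network.
        return summarize_locally(records)
```

Where the split sits on the spectrum is then a tuning decision: a latency-sensitive feature would flip the priority and only consult the cloud opportunistically.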

Big data gets even bigger. The amount of information available for big data calculations is increasing, and powerful cloud computing tools and machine learning algorithms will allow developers to take more nuanced and valuable actions with that data.

It’s not news that we generate more and more data nowadays. Wearable devices emit biometric data; websites record each user click for A/B testing; companies track customer behavior down to the smallest detail. Big data tools (for Python, or perhaps functional programming languages such as Scala) equip us to meet these challenges.
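As a small illustration of the kind of aggregation these tools perform, the hypothetical snippet below uses pandas to compute click-through rates from an A/B test log. A production pipeline would read billions of rows from a warehouse or event stream rather than an inline table.

```python
import pandas as pd

# Hypothetical click log from an A/B test.
clicks = pd.DataFrame({
    "user_id": [1, 2, 3, 4, 5, 6],
    "variant": ["A", "A", "A", "B", "B", "B"],
    "clicked": [0, 1, 0, 1, 1, 0],
})

# Click-through rate per variant: a tiny stand-in for the aggregations
# that big data tools run over billions of events.
ctr = clicks.groupby("variant")["clicked"].mean()
print(ctr)
```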

In particular, there is significant growth in the use of machine learning systems for data analysis. Far beyond yesterday’s techniques of merely proving or disproving specific cause-and-effect hypotheses, these systems return unexpected findings by picking out trends that look like random noise to human eyes. We’ll see more useful, actionable results from big data analysis this year, and along with it, more need for skilled data engineers and analysts.
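A toy sketch of that hypothesis-free approach: rather than testing a specific cause-and-effect claim, a clustering algorithm such as k-means (here via scikit-learn) is simply asked to find structure in the data. The synthetic features below are a stand-in for real behavioral logs.

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic stand-in for behavioral data: two hidden groups of users.
rng = np.random.default_rng(0)
features = np.vstack([
    rng.normal(loc=0.0, scale=1.0, size=(100, 2)),
    rng.normal(loc=5.0, scale=1.0, size=(100, 2)),
])

# No hypothesis up front: let the algorithm surface groupings that
# would look like noise to someone scanning the raw numbers.
model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(features)
print(model.cluster_centers_)
```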

VR goes mainstream. 2017 might be the year that virtual reality makes the jump to mainstream use, as VR devices continue to grow in capability and shrink in size. With augmented reality (AR) also making rapid strides, this means more demand for VR/AR-specific skill sets in both design and development.

When VR goes mainstream, a critical aspect to watch will be the transitional interval during which consumers adopt the new technology while still relying on apps and services built around today’s “flat” UI. The best products will be powered by talented designers who can create user interfaces that work when experienced in either 3D or 2D. For VR to thrive in the short term, most 3D apps will need to map back to 2D when necessary; otherwise they risk splitting their user base. Consider an app like Skype: Will it split into two products (one for full VR users and one for everyone else), or remain a single product that serves both? Designing interfaces that work for both VR and non-VR users is a serious design challenge that I’m fascinated to see solved.

Growth in VR also means that we’ll see skills from the game development world move into other kinds of software. Aspects like lighting and camera movement that are not applicable to most apps become paramount in VR. One can actually use a game engine like Unreal Engine or Unity to write software for VR devices such as Oculus Rift, HTC Vive, Google Daydream and Cardboard. For most apps, the software development process will need to change, since the first step toward recasting an app in VR is to place it in 3D. Having said that, once an app is translated into a 3D world, the amount of device-specific code can be relatively small (despite the lack of VR hardware standards). If the software industry adapts appropriately, we’ll see a jump in VR that is integrated into everyday life.
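As a rough illustration of that first step, the hypothetical sketch below maps a flat, screen-space UI panel onto a plane floating in front of the viewer. It is not tied to any engine's API; a real project would do this inside Unity or Unreal, where the engine handles rendering, input, and the device specifics.

```python
from dataclasses import dataclass

@dataclass
class Panel2D:
    # Normalized screen coordinates (0..1) of a flat UI panel.
    x: float
    y: float
    width: float
    height: float

def place_in_3d(panel, distance=2.0, world_width=1.6, world_height=0.9):
    """Map a flat panel onto a virtual plane floating `distance` meters
    in front of the viewer. Returns the four corners as (x, y, z) points
    in a simple right-handed world space (negative z is forward)."""
    left = (panel.x - 0.5) * world_width
    right = (panel.x + panel.width - 0.5) * world_width
    top = (0.5 - panel.y) * world_height
    bottom = (0.5 - panel.y - panel.height) * world_height
    return [
        (left, top, -distance),
        (right, top, -distance),
        (right, bottom, -distance),
        (left, bottom, -distance),
    ]

# A hypothetical chat window occupying the upper-left quarter of the screen.
print(place_in_3d(Panel2D(x=0.0, y=0.0, width=0.5, height=0.5)))
```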

Quantum computing makes devices even smaller. Some lament that Moore’s Law is slowing down as transistor dimensions shrink toward nanometers and individual atoms. However, Google and Microsoft are hiring quantum computing experts to work on engineering projects, and they may soon produce working quantum computers. A quantum computer could carry out certain calculations exponentially faster than transistor-based machines, leading to smaller, more powerful devices. Devices will likely continue to shrink, and we will likely see a few companies (e.g., Apple and Microsoft) try to create a single wearable super-device that can completely replace a phone, tablet, and laptop.

As I said, it’s exciting to see so many technologies move closer to mainstream use. I encourage my fellow developers to prepare for the opportunities that quantum computing, big data, and mixed reality might bring in 2017. I believe our field will become less specialized this year, so branch out from what you know. Pick up a new language, brush up on your distributed systems skills, or start reading about 3D design and learn something new!

Martin Puryear is a Stanford computer science graduate with over two decades of engineering experience, including many years at Microsoft. He currently is Head of Curriculum and Technology at Coding Dojo, a 14-week coding bootcamp that teaches full stack development. This position constantly exposes him to new technologies, methods, and trends in programming. 

