7 Emerging Technologies IT Should Study Now - InformationWeek


News | 2/23/2015 06:56 PM


Staying on top of current technologies means anticipating future ones. Here, we look at seven technologies IT should be studying right now. One (or more) of these may well be the next big trend in the industry.

(Image: Benjamin Nelan via Pixabay)


Staying on top of technology trends is serious business for those of us in the field. Skill sets come and go at a breakneck pace, and it's important to stay ahead of the curve in an attempt to anticipate the next hot trend to keep oneself relevant in an ever-changing world.

Here's a look at seven trending technologies that IT professionals should be studying right now. Wherever these technologies lead, IT jobs are certain to follow to help design, implement, and support each one.

Our goal for this list was to identify not only hardware and software technologies, but also ideologies and legislative movements that can dramatically influence how and when a particular technology will reach a critical mass in terms of impact on our lives. You'll find that our list contains not only technologies that are attempting to solve problems we see today, but also ways to move beyond what we have today and push us into uncharted territories.

To help understand the importance of studying emerging technologies, simply look back at the past decade and contemplate the disruptive technologies that have revolutionized the way IT infrastructure works today. Topics such as server virtualization, big data, and cloud computing were once merely high-level concepts and ideas. Yet, those of us who investigated and learned about these technologies early on had a dramatic advantage over our peers in the workplace, once these technologies came to fruition.

As our technologies increase in complexity, it takes more and more time for technologists to start to comprehend new technologies, let alone learn how to implement and support them. So it's in our best interest to start our education as early as possible by first identifying the technology trends likely to shake up the IT landscape in the years ahead.

The worldwide proliferation of mobile computing makes it clear that future wireless technologies belong on our list. The same goes for IT's hottest topic of the past few years: IT security. Other technologies, such as three-dimensional imagery and robotics, will advance areas of our lives that have remained stagnant for years.

Click on the following pages to see our top seven picks for technologies IT needs to watch and study now. Then, let us know what you think about the list and tell us about the technologies you think we're missing out on. Share your thoughts in the comments section below.

Andrew has well over a decade of enterprise networking experience through his consulting practice, which specializes in enterprise network architectures and datacenter build-outs, and prior experience at organizations such as State Farm Insurance, United Airlines, and the ...

Comments
Thomas Claburn, User Rank: Author
2/23/2015 | 7:13:12 PM
Fog Computing?
Why not move straight to smoke-and-mirrors computing?

We need more technical precision, not Cisco muddying the waters with annoying jargon.
Charlie Babcock, User Rank: Author
2/23/2015 | 7:46:24 PM
Don't we need an edge protocol to do that?
Fog Computing to me sounds a lot like the AllSeen Alliance's AllJoyn protocol, where devices find each other on a Bluetooth network and discover each other's capabilities.
pcharles09, User Rank: Ninja
2/23/2015 | 10:11:23 PM
Re: Fog Computing?
@Thomas Claburn,

As an old IT manager once told me, that's called 'job security'. Coming from a behemoth like Cisco, I guess it's industry security.
shamika, User Rank: Ninja
2/24/2015 | 12:50:37 AM
Re: Fog Computing?
This is an interesting article. I like the concept of 3-D displays. They will add huge value to industries such as mechanical design, engineering, and advertising, and help people get a real feel for what they are looking at.
shamika, User Rank: Ninja
2/24/2015 | 12:51:00 AM
Re: Fog Computing?
"Biometric authentication methods" -- this is a good idea. It will help create a more secure environment.
progman2000, User Rank: Ninja
2/24/2015 | 9:39:14 AM
Biometrics seems to make the most sense to me...
Still seems like there are a lot of barriers, but man how I would love a world without a bloated password safe...
Aroper-VEC, User Rank: Strategist
2/24/2015 | 10:59:47 AM
Re: Fog Computing?
Really? Sounds like a method for adding complexity for complexity's sake. Or, just another marketing term like "cloud" computing.

Still waiting for the "transformational paradigm shifters" to come out.
Aroper-VEC, User Rank: Strategist
2/24/2015 | 11:18:26 AM
Re: Biometrics seems to make the most sense to me...
I don't understand why Biometrics gets such a bad rap. When I read about how a certain type of biometric technology gets defeated and then I read the process that it took to defeat the technology, it really makes me wonder. Does this really mean it is so insecure? The amount of effort that has to take place is onerous in a lot of situations and I can't see somebody having that kind of time or access to pull it off outside of a lab environment. For highly secure locations, I can understand having an extremely minimal attack footprint but for the vast majority of us who want to eliminate passwords "good enough" is, well, good enough.

Would I ever rely on biometrics alone? That all depends on what I am trying to secure. Again, it is all about the application. The fingerprint sensor on my iPhone works just fine for unlocking it but I haven't saved my CC information in my phone.
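The application-dependent trust described above can be sketched as a simple authorization policy: a plain biometric match is "good enough" for low-stakes actions, while high-stakes actions demand a stronger match plus a second factor. This is a minimal illustration only; the function names, thresholds, and score scale are made up, not part of any real biometric API.

```python
# Illustrative policy sketch: how much to trust a biometric match
# depends on what it unlocks. All names and thresholds are hypothetical.

def authorize(action, biometric_score, has_second_factor=False):
    """Return True if the action may proceed.

    biometric_score: 0.0-1.0 similarity from a hypothetical matcher.
    """
    if action == "unlock_phone":
        # Low-stakes: a decent fingerprint match is good enough.
        return biometric_score >= 0.8
    if action == "payment":
        # High-stakes: require a stronger match AND a second factor.
        return biometric_score >= 0.95 and has_second_factor
    # Unknown actions are denied by default.
    return False
```

Under this sketch, the same fingerprint reading that unlocks the phone would not, on its own, authorize a payment.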
Andrew Froehlich, User Rank: Moderator
2/24/2015 | 12:14:32 PM
Re: Fog Computing?
@Thomas - I think that the term Fog Computing is actually a good one. If you understand the concept of cloud computing, fog computing is nothing more than taking that concept and pushing it down to the edge, where it surrounds us. Communications and computing happen closer to the edge device, as opposed to inside a data center way up in the clouds.
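The edge-versus-cloud split described above can be sketched in a few lines: a node near the devices makes latency-sensitive decisions locally and forwards only compact summaries to the distant data center. This is a conceptual illustration under assumed names (`FogNode`, `cloud_store`), not a real fog-computing framework.

```python
# Sketch of the fog idea: latency-sensitive work happens at the edge;
# only aggregates travel to the cloud. All names are illustrative.

CLOUD_LOG = []  # stands in for a remote data-center service

def cloud_store(summary):
    """Pretend round-trip to a far-away data center."""
    CLOUD_LOG.append(summary)

class FogNode:
    """A node at the network edge that pre-processes sensor readings."""

    def __init__(self, threshold=50.0, batch_size=3):
        self.threshold = threshold    # local alerting threshold
        self.batch_size = batch_size  # readings per cloud summary
        self.buffer = []
        self.alerts = []

    def ingest(self, reading):
        # The latency-sensitive decision is made locally, near the device.
        if reading > self.threshold:
            self.alerts.append(reading)
        self.buffer.append(reading)
        # Only a compact summary of each batch is sent to the cloud.
        if len(self.buffer) >= self.batch_size:
            cloud_store({
                "count": len(self.buffer),
                "mean": sum(self.buffer) / len(self.buffer),
            })
            self.buffer = []

node = FogNode()
for r in [10.0, 60.0, 20.0, 30.0, 40.0, 80.0]:
    node.ingest(r)
```

Six raw readings produce two immediate local alerts but only two small summary records upstream, which is the bandwidth-and-latency trade the fog model is after.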
mak63, User Rank: Ninja
2/24/2015 | 12:47:12 PM
Re: Fog Computing?
"...Sounds like a method for adding complexity for complexity's sake. Or, just another marketing term like "cloud" computing"

Sounds like a P2P network to me.