10 tech terms everyone needs to know for 2014

Information Technology (IT) is getting more pervasive and complicated every day. Although most of us experience IT in terms of personal computing devices (smartphones, tablets, or laptops), office productivity tools (word processors or spreadsheets), or infrastructure functionality (routers and servers), IT is steadily becoming more sophisticated and critical to everything we do.

At the Institute for Software Integrated Systems (ISIS) at Vanderbilt University, we conduct basic and applied research on the science and engineering of complex software-reliant IT systems to help industry, academia, and, ultimately, the general public.

The terms below suggest some of the key areas of technology development in the near future. Some of these may seem a bit more esoteric than others, but our future is clearly headed in these directions. To make intelligent decisions about our lives, our safety, our computers and more, here are the Ten Technical Terms Everyone Should Know for 2014.

1. Cyber-physical Systems (CPS) are integrated sets of hardware and software that control physical things, with or without humans in the loop. Classic examples of CPSs include anti-lock brakes and automated mass-transit systems, like the subway. More sophisticated emerging CPSs (such as driverless cars) are adaptive and intelligent, often solving problems as they occur in real time without direct human input. Among the hardest problems facing engineers is deciding how robust and secure a CPS must be to do what it is intended to do. More ‘robust and secure’ usually means more complex, expensive, and apt to fail (ever had to reboot your car?), potentially costing time, money, or even lives.
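
To make the feedback at the heart of a CPS concrete, here is a deliberately simplified Python sketch of an anti-lock-brake-style control loop. The sensor reading, threshold, and actuator call are invented placeholders, not real automotive code; a production controller would run continuously, on dedicated hardware, under hard real-time constraints.

    # A deliberately simplified feedback loop in the spirit of anti-lock brakes.
    # The sensor reading, threshold, and actuator call are invented placeholders.

    def read_wheel_slip():
        """Stand-in for a wheel-speed sensor: fraction of wheel slip (0.0 to 1.0)."""
        return 0.25

    def set_brake_pressure(level):
        """Stand-in for an actuator command sent to the brake hardware."""
        print(f"brake pressure set to {level:.2f}")

    SLIP_LIMIT = 0.2   # hypothetical point at which the wheels start to lock
    pressure = 1.0     # begin with full braking force

    for _ in range(5):                             # a real controller loops continuously
        if read_wheel_slip() > SLIP_LIMIT:
            pressure = max(0.0, pressure - 0.1)    # ease off to regain traction
        else:
            pressure = min(1.0, pressure + 0.1)    # reapply braking force
        set_brake_pressure(pressure)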

2. Cloud Storage has become a ubiquitous term for managing one’s growing cache of information, media and other data. The idea here is that your data is hosted by a third party, presumably securely, and accessible anywhere you have an internet connection. The concept of a ‘cloud’ means many different resources connected together and acting as one, increasing redundancy (and conceivably reliability) by creating many copies of data and storing them in many places. More copies in more places, however, create a potential security issue. If I store my file cabinet in your office, anyone with access to your office can get to my file cabinet. How good is your office door lock? Are you telling me the truth? Which files am I now comfortable storing in that file cabinet? These are the issues facing popular cloud storage services like Dropbox, Google Drive and iCloud.
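
The redundancy idea is easy to see in miniature. The Python sketch below copies a file into several in-memory ‘regions’ that stand in for real data centers and uses a checksum to verify whichever copy comes back; the region names and helper functions are purely illustrative.

    import hashlib

    # Three in-memory dictionaries stand in for data centers in different regions.
    regions = {"us-east": {}, "eu-west": {}, "asia-pacific": {}}

    def put(name, data):
        """Replicate the data to every region and return its checksum."""
        for store in regions.values():
            store[name] = data
        return hashlib.sha256(data).hexdigest()

    def get(name, checksum):
        """Return the first intact copy, skipping stores that are missing or corrupt."""
        for store in regions.values():
            data = store.get(name)
            if data is not None and hashlib.sha256(data).hexdigest() == checksum:
                return data
        raise FileNotFoundError(name)

    digest = put("notes.txt", b"my private file")
    print(get("notes.txt", digest))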

3. Industrial Internet is an emerging communication infrastructure that connects people, data, and machines to enable access and control of mechanical devices in unprecedented ways. The Industrial Internet leverages the power of Cloud Storage and Computing to connect machines embedded with sensors and sophisticated software to other machines (and end users) so we can extract data, make sense of it, and find meaning where it did not exist before. Machines—from jet engines to gas turbines to medical scanners—connected via the Industrial Internet have the analytical intelligence to self-diagnose and self-correct, so they can deliver the right information to the right people at the right time (and in real-time).
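
Here is a toy-scale illustration of that self-diagnosis idea in Python: a handful of invented turbine temperature readings are checked against an assumed limit, and the machine reports its own status. Real industrial analytics involve far richer models and vastly larger data volumes.

    from statistics import mean

    def diagnose(temps_c, limit_c=650.0):
        """Turn a stream of turbine temperatures into a human-readable status."""
        if max(temps_c) > limit_c:
            return "ALERT: temperature spike, schedule an inspection"
        if mean(temps_c) > 0.9 * limit_c:
            return "WARNING: running hot, monitor closely"
        return "OK"

    readings = [612.0, 618.5, 624.1, 671.3]   # invented telemetry from a gas turbine
    print(diagnose(readings))                 # ALERT: temperature spike, ...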

4. 3G / 4G / 5G – The G stands for Generation, and the speed of data transmission over wireless networks typically increases with each generation. U.S. wireless providers are far into the process of converting their networks from 3G to 4G, as are the device makers (Apple's iPhone 5 was its first 4G smartphone). Until recently, two competing 4G platforms were in use by various wireless telecom companies. For many reasons, LTE (Long-Term Evolution) won out over WiMAX for North American cellular phone markets in 2012, moving all of us closer to a common broadband platform for the world. You can expect to see 5G roll out within the next decade.

5. Advanced Manufacturing involves the integration of IT-based systems and processes in the creation of products (fit, form, and function) to high levels of quality and in compliance with industry-specific certification standards. Products are increasingly complex, and users demand more performance and reliability from them. With complexity come cost and time, so methods like rapid prototyping and computer modeling are essential to keep costs and manufacturing time economical. For example, GE Aviation is applying Advanced Manufacturing technologies to develop new types of ceramics that outperform the most advanced metallic alloys within a gas turbine and jet engine environment. Paramount to advanced manufacturing is a highly skilled workforce operating in lean and continuous-improvement cultures.

6. Big Data refers to the massive amounts of data collected over time that are hard to analyze and handle using conventional database management tools. Big Data analytics operate upon a wide range of datasets, from organized to seemingly random, including business transactions, e-mail messages, photos, surveillance videos, and cyber incident activity logs. Scientific data from sensors can reach mammoth proportions over time, and Big Data also includes text posted on the Web, such as blogs and social media. Big Data analytics has traditionally focused on offline processing (download the data and process it locally somewhere). However, advances in computing clouds, analytics, programs, and automation for cyber-physical systems are broadening the applicability of Big Data techniques to both the conventional Internet and the emerging Industrial Internet.
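
At full scale this kind of analysis requires distributed platforms, but the underlying pattern is simple aggregation. The Python sketch below tallies event types in a tiny, made-up log; tools such as Hadoop or Spark apply the same map-and-reduce idea across thousands of machines and far larger datasets.

    from collections import Counter

    log_lines = [
        "2014-01-02 login user=alice",
        "2014-01-02 error service=billing",
        "2014-01-03 login user=bob",
        "2014-01-03 error service=billing",
        "2014-01-03 error service=search",
    ]

    # "Map": pull the event type out of each record; "reduce": tally the results.
    events = Counter(line.split()[1] for line in log_lines)
    print(events.most_common())   # [('error', 3), ('login', 2)]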

7. Cybersecurity involves preventive methods to protect information and machines connected to networks from being compromised or attacked. As we migrate more of our personal and business data to cloud storage—and as cyber-physical systems connected via the Industrial Internet and next-generation wireless networks become more integrated and essential to our health, economy, society, and homeland defense—we need better methods and tools for identifying and neutralizing potential cyber threats, such as viruses and other malicious code, as well as human vulnerabilities, such as insider threats. A cybersecurity plan is critical when company information is highly sensitive, such as medical records, financial information, and other personal information. There is also ongoing debate over the intentional access of private information in the name of cybersecurity and national security.
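
One small but representative preventive measure is storing passwords as salted, slow hashes rather than plain text, so a stolen database does not immediately expose every account. The Python sketch below illustrates the idea; the iteration count and other parameters are illustrative choices, not a security recommendation.

    import hashlib, hmac, os

    def hash_password(password, salt=None):
        """Derive a slow, salted hash; store the salt and digest, never the password."""
        salt = salt or os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return salt, digest

    def verify(password, salt, digest):
        """Check a login attempt against the stored salt and digest."""
        return hmac.compare_digest(hash_password(password, salt)[1], digest)

    salt, digest = hash_password("correct horse battery staple")
    print(verify("correct horse battery staple", salt, digest))   # True
    print(verify("guess123", salt, digest))                       # False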

8. Augmented Reality is the superimposition of graphics, audio and other enhancements over a real-world environment, displayed in real time. A key challenge in cyber-physical systems is that users often can’t see the cyber information they need in the real-world setting. For example, as construction workers walk around a site, they can’t see the 3D building plan for the project directly overlaid on the walls in front of them to determine if they are built as planned. Augmented reality technologies enable these workers to reduce costly mistakes by visualizing what they are building atop what actually exists in the physical world. Retailers are currently experimenting with augmented reality to get more customers into their stores by allowing shoppers to ‘see’ clothes on themselves without having to actually try them on. Some of these apps are a bit gimmicky now, but they have the potential to change how we shop, train for new skills, game, build, and make other important decisions.
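
Stripped to its core, augmented reality is about drawing computer-generated annotations on top of an image of the physical world. The Python sketch below stamps a hypothetical construction note onto a stand-in camera frame using the Pillow imaging library (an assumed dependency; any graphics library would do); a real AR system would also track the camera's position so the overlay stays aligned with physical objects.

    from PIL import Image, ImageDraw   # Pillow imaging library (an assumed dependency)

    frame = Image.new("RGB", (640, 480), "gray")   # stand-in for a live camera frame
    overlay = ImageDraw.Draw(frame)

    # Superimpose a planned wall outline and a note from the (hypothetical) 3D plan.
    overlay.rectangle([200, 150, 440, 330], outline="red", width=3)
    overlay.text((210, 160), "Wall A-3: 0.5 m off plan", fill="red")

    frame.save("augmented_frame.png")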

9. Agile Development Methods are a principled means of anticipating the need for flexibility in creating IT solutions. Agile software development focuses on keeping code simple, testing often, and delivering functional bits of the application as soon as they're ready. The goal of agile methods is to build upon small client-approved parts as the project progresses, as opposed to delivering one large integrated solution only at the end of the project. Now that agile methods are well-established throughout the commercial IT industry, the challenge is to scale them up so they are suited to larger-scale mission-critical and life-critical environments, such as the Industrial Internet, automotive and avionics systems, space exploration, etc., which require balancing agility and discipline with large teams and long lifecycles.
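
The ‘test often, ship small working pieces’ habit looks like this in miniature: each increment of functionality arrives with an automated test that can be re-run on every change. The feature below (splitting a restaurant bill) and its tests are invented purely to illustrate the practice.

    import unittest

    def split_bill(total_cents, people):
        """Divide a bill as evenly as possible, spreading any leftover cents."""
        base, remainder = divmod(total_cents, people)
        return [base + (1 if i < remainder else 0) for i in range(people)]

    class TestSplitBill(unittest.TestCase):
        def test_even_split(self):
            self.assertEqual(split_bill(900, 3), [300, 300, 300])

        def test_remainder_is_spread(self):
            self.assertEqual(split_bill(1000, 3), [334, 333, 333])

    if __name__ == "__main__":
        unittest.main()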

10. Massive Open Online Course (MOOC) is a web-based class environment aimed at large-scale global participation and open access via the Internet. MOOCs have been dubbed a potentially disruptive technology trend that poses many challenges for traditional higher education. They are particularly relevant to the discussion of the other Tech Terms presented above because it’s likely that future researchers and practitioners of these topics will receive a significant portion of their education through MOOCs and associated digital learning methods and tools. I recently taught one of the first four MOOCs offered by Vanderbilt, “Pattern-Oriented Software Architecture for Concurrent and Networked Software,” to 30,000+ students from across the U.S. and scores of other countries. My experiences—both pro and con—teaching a MOOC underscored the point that in the rapidly changing and globally competitive environment in which we live, learn, and work, we need to continue to clarify and refine the value of—and affordable access to—high quality education.

Douglas C. Schmidt is the Associate Chair of Computer Science and Engineering and Professor of Computer Science at Vanderbilt University, where he works at ISIS in Nashville.