Foundations of Information Systems

10.1 Defining Emerging Technologies

Learning Objectives

By the end of this section, you will be able to:

  • Define emerging technology and provide examples
  • Identify real-world applications of emerging technologies
  • Identify the opportunities, challenges, and risks of emerging technologies

When you hear the phrase “emerging technology,” what comes to mind? Have you ever driven an electric car or ridden in a self-driving car? What do you think are the factors that determine if a technology is emerging? Would you consider the latest smartphone an emerging technology? Technically, the first commercial smartphone was released three decades ago. If technology created over thirty years ago was emerging then, is it still emerging technology today? What about Henry Ford’s historical introduction of the moving assembly line? Both of these technologies were built on existing technologies, and both have been revised and advanced over the years to their current forms—both are examples of the important influences of emerging technologies.

What Makes an Emerging Technology

Any software or hardware that enhances the user experience by obtaining or using information and data in new and compelling ways can be considered an emerging technology. The term can be used to describe new technologies or the continuing development of existing technologies. Emerging technology can be found in all areas of our society, including education, information technology, nanotechnology, biotechnology, robotics, and artificial intelligence. If we look back over the last century, there have been many technological advancements: the automated teller machine, the hard disk drive, the magnetic stripe card, mobile telephony, desktop computers, the computer mouse, and more. Each of these was considered an emerging technology at the time and is now a familiar part of our lives.

The progressive nature of emerging technologies allows companies that embrace them to gain a competitive advantage and creates the potential for synergies with other technologies that have the same or similar goals. The convergence of technologies can create efficiencies that did not previously exist. Consider the convergence of video, voice, and data. All these technologies were new at one time, and as their capabilities became apparent, opportunities arose to combine them in one product. Having video, voice, and data on the same network allows multiple forms of communication that are not possible with separate infrastructures. Sending email or text messages while holding a video chat conversation on the same device used to be hard to imagine, but these capabilities are now widely used together.

There are many other forms of technology that are considered to be emerging because of their rapid rate of change. The branch of engineering and computer science called robotics involves the conception, design, building, and operation of robots, creating intelligent machines that can assist humans with a variety of tasks. Robotics is considered emerging because it is being used in new and exciting ways every day. For example, many surgeries today are being performed laparoscopically with the assistance of robotic technology. Self-driving and electric vehicles are increasing their presence on roads, making waves in the automotive industry. The Internet of Things (IoT) has introduced biometric scanners and wearable devices that are changing the way we communicate and interact with each other.

Blockchain is considered an emerging technology due to its ability to improve efficiencies and streamline processes across many different industries. A blockchain is a shared, immutable ledger that facilitates the process of recording transactions and tracking assets. Blockchains are used for secured, transaction-based actions and information sharing within business networks. Blockchain uses cryptography, the process of hiding or coding information so that only the intended recipient can read it. Cryptographic protocols provide secure connections, enabling two parties to communicate with privacy and data integrity while adding layers of security. We see blockchain in use in digital currencies such as Bitcoin, one type of cryptocurrency. Cryptocurrency uses blockchain technology to allow transactions over the internet without backing from a monetary authority such as the Federal Reserve System. Blockchain ensures that the cryptocurrency is successfully transferred from the sender to the recipient, arrives at its intended location, and that the financial transaction occurs properly.
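The "shared, immutable ledger" idea can be sketched in a few lines of Python. This is a minimal illustration, not a production blockchain: the block fields and helper names (`block_hash`, `add_block`, `verify`) are invented for this example, and real blockchains add consensus, digital signatures, and networking on top of this hash-chaining core.

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents deterministically with SHA-256."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, transactions):
    """Append a block that commits to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64  # genesis has no parent
    chain.append({"index": len(chain), "prev_hash": prev,
                  "transactions": transactions})
    return chain

def verify(chain):
    """The ledger is valid only if every stored prev_hash still matches."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
add_block(chain, [{"from": "alice", "to": "bob", "amount": 5}])
add_block(chain, [{"from": "bob", "to": "carol", "amount": 2}])
print(verify(chain))                         # True: ledger intact
chain[0]["transactions"][0]["amount"] = 500  # tamper with history
print(verify(chain))                         # False: tampering breaks the chain
```

Because each block stores the hash of the one before it, altering any historical transaction invalidates every later link, which is what makes the ledger effectively immutable.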

In 1977, the U.S. Department of Energy (DOE) was created and charged with shaping broader energy policy, promoting energy conservation, and finding alternative sources of energy. With these goals in mind, the DOE has become one of the largest federal organizations driving innovation in power plants, solar panels, and renewable energies, all considered emerging technologies because they influence the way we interact with sustainable resources.

Educational institutions have also enabled researchers to foster innovation and develop technology by providing research labs and direct and indirect support, such as funding, research assistants, and faculty with subject matter expertise in the research area and in statistics. Ivan Sutherland, a professor at Harvard University, along with his student Bob Sproull, created the first virtual reality (VR) device in 1968 (Figure 10.2). Sutherland is also credited with the development of augmented reality (AR) that same year.

Photo of first VR prototype on a fake head. A large screen is visible over the eyes with long tubes coming off right side of unit and leading to a long cord.
Figure 10.2 Known as the “Sword of Damocles,” Ivan Sutherland and his research team created this first head-mounted virtual reality device in 1968. (credit: modification of work “Virtual Reality Headset Prototype” by “Pargon”/Flickr, CC BY 2.0)

Emerging technologies usually introduce novel approaches, concepts, or applications that may not have been previously considered. They also show rapid advancement, developing and changing quickly, often due to an organization’s financial investments and research efforts. Other characteristics of emerging technologies include their prominent impact, volatility, complexity, and uncertainty. Additionally, emerging technologies may be characterized by the type of technology used, the industry in which they are used, or the uniqueness of their attributes.

Emerging technologies are also known for their disruptive or transformative potential. They can introduce significant change and challenge traditional norms. For example, the introduction of self-checkout kiosks in grocery stores has reduced the number of cashiers needed to assist customers with their purchases. Following are some other cutting-edge emerging technologies that merit special attention for their transformative potential:

  • Quantum computing harnesses quantum mechanics principles to perform complex calculations exponentially faster than traditional computers. It shows promising breakthroughs in cryptography, drug discovery, and financial modeling.
  • Edge computing brings data processing closer to where data are created, reducing latency and enabling real-time applications like autonomous vehicles and smart manufacturing.
  • Green computing practices focus on environmentally sustainable computing through energy-efficient hardware, smart power management, and eco-friendly data center design.

The integration of these technologies creates new possibilities. For instance, edge computing can reduce energy consumption by processing data locally, while quantum computing could optimize power grids for better energy distribution. Meanwhile, cross-platform integration allows these technologies to work together. A self-driving car might use edge computing for immediate decisions, quantum algorithms for complex route optimization, and green computing principles to maximize battery life.
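The edge-computing pattern described above (process data where it is created, and send only what matters upstream) can be sketched in Python. This is an illustrative sketch under invented assumptions: the `edge_summarize` helper, the sensor readings, and the alert threshold are all made up for this example, while real edge deployments involve device runtimes and messaging protocols.

```python
import statistics

def edge_summarize(readings, threshold=90.0):
    """Process raw sensor readings locally at the edge, keeping only
    a compact summary plus any anomalous values worth escalating."""
    return {
        "count": len(readings),
        "mean": round(statistics.fmean(readings), 2),
        "max": max(readings),
        "alerts": [r for r in readings if r > threshold],
    }

# 1,000 raw temperature readings stay on the edge device...
raw = [70 + (i % 30) for i in range(1000)]
summary = edge_summarize(raw)
# ...and only this small summary crosses the network to the cloud.
print(summary["count"], summary["mean"], summary["max"])
```

Shipping a summary instead of every reading is what cuts both latency and network (and therefore energy) costs, which is the synergy with green computing noted above.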

Real-World Applications of Emerging Technologies

Real-world applications of emerging technologies are boundless, and you are likely familiar with many of them. Augmented reality (AR) overlays digital information onto a user's environment in real time, often using hardware such as headsets or smartphones. Slightly different, virtual reality (VR) is a computer-generated environment that simulates reality and allows users to interact with three-dimensional environments. AR and VR technologies are advancing (Figure 10.3), particularly in fields like education, health care, and entertainment. New immersive experiences are being created with better hardware, more realistic simulations, and applications in virtual collaboration.

(a) Photo of individual participating in a reality tour of an Air Force Space Command using a hand-held small screen. (b) Photo of individual using a VR headset.
Figure 10.3 (a) Augmented reality and (b) virtual reality create immersive experiences that are being used across organizations and fields to help teach and train employees. (credit a: modification of work “Command Center Alpha” by Dale Eckroth, U.S. Air Force/Air Education and Training Command, Public Domain; credit b: modification of work “Razer OSVR Open-Source Virtual Reality for Gaming (16241057474)” by Maurizio Pesce/Wikimedia Commons, CC BY 2.0)

The biggest technological change over the last three decades has been the introduction of cell phones and smartphones, which are almost as powerful as desktop computers. In response to consumer feedback, smartphone manufacturers continue to develop and create more powerful devices with new features, improving screen size, data storage, battery life, camera quality, and processing power.

Another emerging technology, AI, is the branch of computer science focused on creating intelligent machines capable of performing tasks that typically require human intelligence, such as visual perception, speech recognition, decision-making, and language translation. While AI was originally developed in the 1950s, the power of modern computers provides new uses for AI. Artificial intelligence applications are transforming everyday business operations across industries:

  • In retail, AI powers recommendation systems that suggest products based on past purchases and browsing history.
  • In health care, AI assists radiologists by flagging potential abnormalities in medical images for review.
  • Manufacturing plants use AI for predictive maintenance, analyzing sensor data to identify when machines might fail before they break down.
  • Financial institutions employ AI to detect fraudulent transactions by spotting unusual patterns in real time.

These practical applications are examples of how AI moves beyond theory to solve real business problems. AI is not just complex algorithms; it is the use of technology to make processes more efficient, decisions more informed, and services more personalized.

A type of AI called generative AI creates new content or ideas in the form of text, images, videos, music, audio, and other forms of data, and its supporting tools are being used to create mounds of content for different professions. Educators, students, lawyers, project managers, and even publishers are using generative AI to shorten the time spent on tasks. Information gathered from internet sources and web applications can be pulled together quickly through generative AI.

Ethics in IS

Trust Issues in Artificial Intelligence Cause Ethical Concerns

Generative AI, such as ChatGPT, will change the way we approach learning and all forms of technology. Generative AI can facilitate our learning, help with industry research, and diagnose many different problems. But there is one major flaw: it cannot be trusted all of the time. When gathering information—such as images, audio, and text—from the internet, generative AI tools often use copyrighted material without obtaining permission from the intellectual property owners. While generative AI software can do a phenomenal job creating text and images, it lacks this ethical safeguard.

Additionally, if AI is unable to find the information you request in your prompt, it may create that information itself, which can result in fake articles, photographs, events, and people. For example, following hurricanes on the East Coast during fall 2024, several fake AI-generated images were used to highlight how the hurricanes affected the area.1 In 2023, The Guardian determined that ChatGPT cited fake, unpublished journal articles in its responses to prompts.2 ChatGPT may fabricate content that mimics real articles, highlighting generative AI’s inability to verify factual accuracy. This is a real problem in a virtual world. If generative AI software fabricates information, how can individuals distinguish real information from fake? Material generated by AI looks and reads just like real articles, which diminishes trust in what is found on the internet. Without any type of regulation, fake information can be referenced and even cited on the internet, leading to more misinformation and potentially disinformation.

Quantum computing represents another real-world application of emerging technologies. Rensselaer Polytechnic Institute in Troy, New York, became the first university to house an IBM quantum computer, an investment of more than $150 million.3 This acquisition allows students at the college to propose research projects; if a project has merit, time is allotted for it to run on the quantum computer, providing students with an opportunity to gain hands-on experience in quantum computing. Quantum computers use quantum mechanics principles to perform complex calculations exponentially faster than traditional computers. With this ability, they can model problems and simulations that are extremely hard for researchers to conceptualize, such as complex weather forecasting, large-scale financial modeling, and advanced pharmaceutical formulas for new drugs. These and other complex problems tend to involve multiple variables with elaborate interactions, requiring sophisticated technology to analyze and understand the problems and potential solutions. Quantum computing can provide that sophistication. However, the most significant challenge in quantum computing is considered to be quantum error correction: effectively managing the noise and errors that occur within quantum systems, which is crucial for achieving reliable and large-scale quantum computations.
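The superposition behind quantum computing's power can be glimpsed with a tiny classical simulation of one qubit. This is a toy sketch under stated assumptions: a qubit is represented here by two complex amplitudes, the function names are invented for this example, and real quantum programming uses frameworks such as Qiskit running on actual or simulated quantum hardware.

```python
import math

# A qubit's state is a pair of complex amplitudes (alpha, beta):
# measuring it yields 0 with probability |alpha|^2 and 1 with |beta|^2.

def hadamard(state):
    """Apply the Hadamard gate, which turns a definite basis state
    into an equal superposition of 0 and 1."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Measurement probabilities for outcomes 0 and 1."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

zero = (1 + 0j, 0 + 0j)               # the definite |0> state
plus = hadamard(zero)                 # equal superposition
print(probabilities(plus))            # roughly (0.5, 0.5)
print(probabilities(hadamard(plus)))  # roughly (1.0, 0.0): H undoes itself
```

A classical simulation like this needs storage that doubles with every added qubit, which hints at why genuinely quantum hardware, rather than simulation, is required at scale.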

Opportunities, Challenges, and Risks of Emerging Technologies

Emerging technologies offer a multitude of opportunities to enhance our lives in every way. For example, businesses can use AI to generate valuable data about customer experiences and satisfaction, and these data can be the impetus to make changes needed to improve customer service. Artificial intelligence can help students learn by providing educational materials like customized flash cards and quizzes. At the same time, the use of emerging technologies can pose challenges and risks that must be managed in balance with the benefits that these technologies offer.

Opportunities

Emerging technologies provide boundless opportunities for businesses to evolve and increase their competitive advantage. For example, enterprise modeling and integration (EMI) is a process that uses computer-based tools to model the business structure and facilitate the connection of its technology, work, and information flow across an organization. It has increasingly been considered a value-add for businesses because it allows a quicker response to business challenges and improves efficiencies. EMI connects functionality and communication between information systems, including applications, data, clouds, application programming interfaces, processes, and devices. It combines multiple integration approaches into one combined effort, with one governance model. Incorporating AI into this process would add further value, as AI capabilities can be integrated directly into products and systems to enhance performance in all system areas.

Augmented reality and VR also provide opportunities for business growth and improved performance. AR and VR technologies allow users to access animated three-dimensional experiences, videos, and targeted detection directly from their personal devices, leveraging components within the device such as the camera, magnetometer, orientation, and other functions. An example of this functionality is the use of AR-enabled applications to enhance user shopping experiences (Figure 10.4).

Photo of individuals taking pictures of a dressed mannequin using augmented reality on their devices to picture how a dress would look on different bodies.
Figure 10.4 Augmented reality is used with many online retailers to help shoppers visualize how that item would fit in their environment. We are now able to see how a couch fits in our living room or how a dress looks on our body prior to purchase. (credit: modification of work “Augmented reality fashion” by “sndrv”/Flickr, CC BY 2.0)

Additionally, emerging technologies continue to influence areas such as information technology, integrated manufacturing, medical informatics, digital libraries, and electronic commerce, supporting efficiencies in manufacturing, health care, e-commerce, and other facets of business. Another area impacted by emerging technologies is information economics, which is a branch of microeconomics that analyzes how economic decisions and consumer behaviors are influenced by knowledge and power. It focuses on how information is produced, distributed, and used in economic systems. It is an important field of study to provide businesses and other organizations with the data and knowledge they need to be competitive in the marketplace.

Another example is Bitcoin, whose underlying blockchain functionality provides specific opportunities, including the following:

  • Data sharing between businesses is enabled in a decentralized structure where no single entity is exclusively in charge.
  • Security and privacy are improved wherein transactions have end-to-end encryption protections from unauthorized activity.
  • Costs are reduced as a result of efficiencies in transaction and business processes.
  • Speed is increased as compared to manual processes and other technologies with similar functions.

Blockchain technology continues to evolve, finding new applications in areas like decentralized finance, supply chain management, and secure data sharing. Blockchain technologies offer touted benefits and opportunities in several industries, including financial institutions, health-care organizations, and nonprofit and government agencies. Customers in these industries have experienced faster and less costly clearing and settlement of financial transactions, increased security of patient privacy, and transparent supply chains that maximize social impact. Specific to health-care organizations, patient- and organizational-related benefits can be attributed to the use of blockchain technologies (Figure 10.5).

Benefits diagram: Patient-related benefits (Security and authorization, Personalized health care, Monitoring patient health status, Tracking patient health data) and Organizational-related benefits (Pharmaceutical supply chain, Clinical trials, Managing medical insurance, Health information exchange).
Figure 10.5 Health care’s use of blockchain technology has benefits for both the health-care organization and patients. (credit: modification of work “Fig. 3. Benefits of blockchain technology” by Israa Abu-elezz, Asma Hassan, Anjanarani Nazeemudeen, Mowafa Househ, Alaa Abd-alrazaq/International Journal of Medical Informatics, Volume 142, October 2020, 104246. https://doi.org/10.1016/j.ijmedinf.2020.104246, CC BY 4.0)

Challenges and Risks

The opportunities derived from emerging technologies are indeed exciting; however, with their expected growth and expansion come challenges and associated risks. Security and data privacy will continue to be ongoing concerns as cybercriminals are increasingly sophisticated at circumventing the security protocols of networked and cloud-based systems. Broader access to AI tools has equipped adversaries with the means to exploit these technologies, generating misleading or incorrect information. For example, in fall 2023, fake videos generated by AI featured Taylor Swift promoting Le Creuset cookware and Tom Hanks promoting a dental plan. Both celebrities decried the videos as fake content produced without their input or permission.4

Artificial intelligence can be used to analyze patterns and detect vulnerabilities faster than security teams can respond, and the impact may become increasingly widespread as more of our services become reliant on AI. Users can be tricked into responding to impostor prompts asking for identifying information. These challenges continue to threaten the security and data privacy protections needed to safeguard the personal information of users and customers.

There are also security and data privacy concerns with the use of AR and VR. Unauthenticated data content is sometimes used by AR browsers that facilitate the augmentation process; therefore, people can be misled by false information provided on these sites. Aside from the cybersecurity challenges, the biggest VR danger is its ability to interfere with one’s visual and auditory connection to the outside world. When users are immersed in VR, they may experience a sensory conflict between what their bodies are experiencing in the real world and the visuals of the virtual world. This can lead to cybersickness, which may include disorientation, dizziness, and even nausea as users lose spatial awareness. It is crucial to maintain awareness of one’s surroundings when immersed in these environments.5

Early implementations of blockchain technology have exposed some of the technology’s challenges and risks, including ongoing threats to security and data privacy of its users. Additional challenges include the scalability and performance, interoperability, regulatory and legal concerns, and adoption and integration of blockchain technology. Energy consumption is also a significant challenge as blockchain technology requires high-powered computing equipment to create new blocks and verify transactions. The energy needs to power this equipment are so great that blockchain technology’s energy consumption is causing substantial greenhouse gas emissions and contributing to climate change.

Footnotes

  • 1“Fake Images Generated by AI Are Spreading on Social Media, Compounding Misinformation Surrounding Hurricane Recovery Efforts,” ABC News, October 15, 2024, https://abcnews.go.com/US/video/fake-images-generated-ai-spreading-social-media-compounding-114824660
  • 2Chris Moran, “ChatGPT Is Making Up Fake Guardian Articles: Here’s How We’re Responding,” The Guardian, April 6, 2023, https://www.theguardian.com/commentisfree/2023/apr/06/ai-chatgpt-guardian-technology-risks-fake-article
  • 3“Rensselaer Polytechnic Institute Plans to Deploy First IBM Quantum System One on a University Campus,” IBM, June 28, 2023, https://newsroom.ibm.com/2023-06-28-Rensselaer-Polytechnic-Institute-Plans-to-Deploy-First-IBM-Quantum-System-One-on-a-University-Campus
  • 4Megan Cerullo, “AI-Generated Ads Using Taylor Swift’s Likeness Dupe Fans with Fake Le Creuset Giveaway,” ed. Anne Marie Lee, CBS News, updated January 16, 2024, https://www.cbsnews.com/news/taylor-swift-le-creuset-ai-generated-ads/
  • 5Ann Pietrangelo, “All About Cybersickness,” Healthline, February 4, 2021, https://www.healthline.com/health/cybersickness
Citation information

© Mar 11, 2025 OpenStax. Textbook content produced by OpenStax is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike License. The OpenStax name, OpenStax logo, OpenStax book covers, OpenStax CNX name, and OpenStax CNX logo are not subject to the Creative Commons license and may not be reproduced without the prior and express written consent of Rice University.