Learning Objectives
By the end of this section, you will be able to:
- Discuss the frontiers of information systems
- Explain the convergence of information systems frontiers and emerging technologies
- Explain the opportunities and challenges related to information systems frontiers
Information systems are constantly evolving as communities of researchers, developers, think tanks, and others expand their thinking about where information systems may take us next, often beyond what many of us can imagine. We have seen the evolution of systems from concept through iterations of change as the technology supporting these systems has evolved. Just look back on the evolution of the Apple iPhone, first introduced in 2007. From its initial functionalities, which included mobile phone calling, personal computing, music, and a camera, to more contemporary features such as extended battery life, assistive touch, AI features, and improved camera resolution, iPhone functionality continues to be developed and enhanced with each new release. What do you think the next step is for the iPhone? How will future versions of these types of technologies continue to impact our lives?
The Frontiers of Information Systems
The concept of information systems frontiers refers to the latest developments in the field of information systems. Frontiers explore new research areas, innovative applications, and emerging technologies that have the potential to significantly impact the field. These frontiers encompass a wide range of topics including data analytics, AI, cybersecurity, cloud computing, mobile computing, and social media.
Data Analytics
Data analytics has been identified as the future of information systems and innovation. With the growing number of systems available and the breadth of data collected, organizations are increasingly looking to put this information to use. Walmart, for example, collects substantial amounts of data from its website (such as purchase histories and products sold and returned) and in store (such as customer demographics, store details, and in-store sales and returns). These data inform many of its business practices.
Data analytics, as you learned in Chapter 8 Data Analytics and Modeling, is the process of examining datasets to draw conclusions and insights, typically using statistical and computational methods to inform decision-making or solve problems (Figure 10.6). The insights generated from data analytics give businesses the foundational information needed to increase performance and operational efficiencies. Along with better decision-making and operational efficiencies, data analytics can lead to the simplification of data, increasing the organization’s ability to make sense of the raw data collected and share those data as needed. Refer to 8.3 Analytics to Improve Decision-Making for data analytics tools and techniques that are used to process and examine data, allowing insight into business challenges and future trends and leading to more informed business decisions.
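To make this concrete, the following Python sketch summarizes a handful of hypothetical purchase records in the spirit of the retail example above. The column names and values are invented for illustration; a real analysis would draw on far larger datasets and more sophisticated methods.

```python
import pandas as pd

# Hypothetical purchase records, loosely modeled on the retail example above
purchases = pd.DataFrame({
    "category": ["grocery", "grocery", "electronics", "electronics", "apparel"],
    "amount":   [42.50, 18.75, 299.99, 89.00, 35.25],
    "returned": [False, False, True, False, False],
})

# Descriptive analytics: number of purchases, total spend, and average spend per category
summary = purchases.groupby("category")["amount"].agg(["count", "sum", "mean"])
print(summary)

# Return rate by category can flag operational issues worth a closer look
return_rate = purchases.groupby("category")["returned"].mean()
print(return_rate)
```

Even a small summary like this illustrates the basic move in data analytics: turning raw records into aggregated views that support a decision.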
Careers in IS
Tech Market Researcher
Tech market research is becoming increasingly popular as organizations attempt to gauge how well technology-related products are being received. Tech market researchers study market trends, consumer behaviors, emerging technologies, competitor information, and other factors to determine the viability of a new technological product or service. It is a systematic process of gathering, analyzing, and interpreting data related to the technology sector. Specific roles in this area may include analyst, specialist, recruiter, and general researcher, with focus areas spanning a wide range of technologies, including cloud computing, big data management, and emerging technologies.
Artificial Intelligence
Artificial intelligence is an important innovation in information systems. Artificial intelligence is increasingly being used in many industries, including health care, logistics, manufacturing, automotive, and publishing, as well as in daily lifestyle applications. Artificial intelligence–enabled computer systems are able to process large amounts of data, identify patterns and trends, and make decisions, tasks that generally require human intelligence and a great deal of time and resources. Artificial intelligence uses reasoning, learning, problem-solving, and perception as it processes the data.
It is important to recognize that to function optimally, AI must be grounded in data that are valid and reliable. Without robust data, AI may produce inaccurate data analyses and biased algorithms. In addition, since AI doesn’t have the reasoning capabilities of humans, the technology is poorly suited for situations that require adaptation to change, such as using AI to operate machinery safely.
Artificial intelligence has several subfields, each focusing on a different aspect of the field (a brief machine learning sketch follows this list):
- The field of machine learning involves the creation of algorithms and models that enable machines to learn from or make decisions about the data without specific programming.
- A neural network is a method of AI that uses algorithms to teach computers to process data much like the human brain does, supporting tasks such as image and speech recognition.
- Deep learning uses multiple layers of neural networks to address deeper, more complex decision-making and is often considered a subset of machine learning.
- Cognitive computing simulates human thought processes via reasoning and learning.
- Computer vision teaches machines how to see and interpret information from images or videos, supporting applications such as facial recognition, object identification, and segmentation.
- The field of natural language processing teaches machines to understand and generate human language and involves tasks such as speech recognition, text analysis, and language translation.
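As a brief illustration of machine learning, the following Python sketch trains a small feed-forward neural network on a well-known sample dataset using the scikit-learn library. The network size and parameters are arbitrary choices for demonstration, not a recommended configuration.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

# Load a small, well-known dataset of flower measurements and split it
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# A small feed-forward neural network: the model learns patterns from the
# training data rather than being explicitly programmed with rules
model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X_train, y_train)

# Evaluate how well the learned model generalizes to unseen examples
predictions = model.predict(X_test)
print("accuracy:", accuracy_score(y_test, predictions))
```

The key point is that the model’s behavior comes from patterns in the training data rather than from hand-written rules, which is what distinguishes machine learning from conventional programming.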
Cybersecurity
Cybersecurity is the practice of protecting internet-connected systems, along with their technologies and networks, from internal and external threats of unauthorized access, attack, or damage. It combines the people, processes, policies, systems, and technology needed to thwart cyber risks and protect assets. Cybersecurity has become increasingly vital to business because so much of today’s information is managed electronically. The exposure caused by a breach can compromise personal information, leading to a loss of trust and potential financial liabilities. Additionally, cybersecurity allows organizations to remain compliant with regulations, safeguard against identity theft, and protect intellectual property, finances, and people’s personal information.
Currently, cybersecurity is a critical aspect of information systems, and this will continue to be true as new technologies emerge. The work of the National Institute of Standards and Technology (NIST) will continue to be important for emerging technologies. The NIST-developed Cybersecurity Framework (CSF) serves as an essential approach for organizations to create and manage their cybersecurity strategy. Refer to Chapter 5 Information Systems Security Risk Management and Chapter 6 Enterprise Security, Data Privacy, and Risk Management for more information on cybersecurity.
As new technologies emerge, there remain several pivotal layers of cybersecurity necessary to guard against ever-evolving cyber threats:
- Application security is a principal component of cybersecurity, adding security inside an application to shield it from attacks on vulnerabilities within its code. Various tools (such as firewalls, antivirus software, encryption techniques, and web application firewalls) combined with various types of application security (authentication, authorization, encryption, logging, and application security testing) assist in keeping applications secure (see the authentication sketch after this list).
- Data security requires companies to protect customer, client, employee, and user information from unapproved access, use, modification, loss, or deletion. This component is dedicated to protecting the integrity of the data residing within a system.
- Network security protects the network from unapproved access and potential dangers. Firewalls and antivirus software are examples of strategies used to support network security efforts.
- Disaster recovery planning helps businesses identify the mission-critical applications vital to maintaining the operations of the organization and spells out how these plans will be implemented in the event of a cyberattack.
- Operational security encourages management to step inside the role of a hacker to identify areas of vulnerability within the organization.
- End-user security focuses on safeguarding the individual devices connected to a network, such as computers, tablets, printers, and smartphones. Coupled with end-user education and training, these controls help to mitigate security threats that may be caused by human error.
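As a small example of the authentication piece of application security, the sketch below shows salted password hashing using only Python’s standard library. The function names and iteration count are illustrative choices; production systems typically rely on vetted authentication frameworks rather than hand-rolled code.

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes | None = None) -> tuple[bytes, bytes]:
    """Return (salt, hash) for a password using a salted key-derivation function."""
    salt = salt or os.urandom(16)  # a unique salt per user defeats precomputed tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, expected: bytes) -> bool:
    """Constant-time comparison avoids leaking information through timing."""
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, expected)

# Store only the salt and hash, never the plaintext password
salt, stored_hash = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored_hash))  # True
print(verify_password("wrong guess", salt, stored_hash))                   # False
```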
Biometrics are increasingly used to authenticate a person’s identity, such as fingerprints to access smartphones, or the use of facial recognition technology at airport smart-gates. Some examples of biometrics that could be used in 5G network security include fingerprint scanning, iris recognition, and voice recognition.
Cloud Computing
Cloud computing is an emerging technology defined as the use of hosted services, such as data storage, servers, databases, networking, and software, that run over the internet (or an intranet) rather than on private servers and hard drives. Cloud computing services are available via public, private, or hybrid means and are generally owned by a third party, allowing customers to choose, and pay for, how they want their infrastructure to be managed and supported. Review Chapter 7 Cloud Computing and Managing the Cloud Infrastructure for more information on cloud computing.
Careers in IS
Cloud Engineering
Cloud engineers are increasingly needed to support the design, development, maintenance, security, and management of cloud infrastructures. Cloud engineers ensure the security of the network. Additionally, they assist with the planning and design of cloud computing applications and services for the business, the deployment of cloud-based infrastructure, and programming code in various languages such as Java, Python, and C++. They also work with organizations on disaster planning, preparedness, and recovery. Experience with coding languages, experience as a systems administrator or network engineer, and excellent written and verbal communication skills are necessary to succeed in this position.
Mobile Computing
The emerging technology of mobile computing involves the strategies, technologies, products, and services that enable users to access information without being restricted to a single geographic location. Together, mobile computing technologies support the use of portable, wireless devices that can transmit data, voice, and video communications. The convenience of mobile computing allows people to access network services anywhere and anytime. Most could not have imagined a few decades ago a future where you could call or text a relative in another country from an underground subway train or connect with a long-lost classmate through a social media application.
Mobile computing combines infrastructure (technical pieces that enable communication such as a wireless network), hardware (physical devices such as laptops), and software (applications and operating systems) technologies. Characteristics of mobile computing technologies include portability, connectivity, social interactivity, context sensitivity, and individualization. These are all applicable to the types of mobile devices consumers enjoy using, such as tablets, mobile phones, and laptop computers.
In addition to being able to privately connect, interact, and collaborate with people through different applications, there are several other advantages to mobile computing. For example, studies have shown that mobile computing increases productivity. With the move toward working remotely, organizations have realized that the cost of an office location may not make sense when employees can work from any location and be just as productive.6 Mobile computing has also enabled a plethora of entertainment options with applications that provide movies (like Netflix and YouTube), games (such as Wordle), lifestyle content (for example, HGTV and Amazon), and more. Additionally, mobile computing now supports and connects to the cloud and cloud computing services, allowing data such as photos, videos, and documents to be stored securely for future retrieval.
Mobile computing does have limitations. For example, the range and bandwidth (the capacity at which a network can transmit data) of some devices are limited, leading to transmission interference or unwanted disruptions while communicating. This can severely degrade the quality of the sound or picture displayed on the device. Security standards that govern mobile computing technologies also remain an issue, as industry regulations can lag behind the rate of innovation. Additionally, mobile computing technologies present power consumption and battery charging challenges. For example, batteries can be negatively impacted by temperature changes, making it difficult to recharge them and maintain battery performance.
The Convergence of Information Systems Frontiers and Emerging Technologies
In the context of computing and technology, convergence is the joining of two or more different entities in a single device or system. The convergence of emerging technologies and IS frontiers can create new opportunities for innovation and growth, as the research and development processes in new frontiers help foster and promote emerging technologies as they develop and evolve. For example, the convergence of AI and data analytics can help organizations make better decisions by analyzing vast amounts of data in real time. It can enable organizations to gain a competitive edge, optimize operations, and drive business value by providing insights into data that a data analyst may not be able to uncover. Data analysts will still be needed to interpret the data in a business sense, as these technologies do not yet have the capacity to accomplish such tasks.
The IoT can connect devices and sensors to create smart systems that can optimize operations and enhance user experiences. These technologies can be leveraged to create smart homes, where internet-enabled appliances and devices can be managed remotely via a connected network. For example, IoT smart devices can support the needs of people who are hard of hearing or deaf by providing real-time alerts, such as a smoke detector that activates non-sound-based alarms. Overall, the intersection of emerging technologies and IS frontiers is an exciting area that has the potential to transform various industries and improve people’s lives.
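The sketch below illustrates, in simplified form, the publish-and-subscribe pattern that many IoT systems use (for example, those built on messaging protocols such as MQTT) to route sensor events to connected devices. The hub, sensor, and device names here are hypothetical stand-ins rather than a real smart-home API.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class SensorEvent:
    sensor: str
    reading: str

class AlertHub:
    """A toy event hub: sensors publish events, registered devices react."""

    def __init__(self) -> None:
        self._subscribers: list[Callable[[SensorEvent], None]] = []

    def subscribe(self, handler: Callable[[SensorEvent], None]) -> None:
        self._subscribers.append(handler)

    def publish(self, event: SensorEvent) -> None:
        for handler in self._subscribers:
            handler(event)

# Non-sound-based alerts, as in the smoke detector example above
def flash_lights(event: SensorEvent) -> None:
    print(f"[lights] flashing: {event.sensor} reported {event.reading}")

def vibrate_wearable(event: SensorEvent) -> None:
    print(f"[wearable] vibrating: {event.sensor} reported {event.reading}")

hub = AlertHub()
hub.subscribe(flash_lights)
hub.subscribe(vibrate_wearable)
hub.publish(SensorEvent(sensor="smoke detector", reading="smoke detected"))
```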
Opportunities, Challenges, and Risks of Information Systems Frontiers
The rapidly evolving field of information systems presents both significant opportunities and complex challenges for organizations navigating the digital landscape. Frontiers of information systems, such as data analytics, AI, cloud computing, mobile computing, and social media, provide opportunities, challenges, and even risks to people and organizations as they continue to evolve.
Opportunities
Businesses can expect IS frontiers to expand and grow, becoming more advanced and mature in their functions. For example, natural language processing enhancements will further the abilities of machines to understand and generate human language, making it easier for users to interact with information systems. This expected growth may provide increased employment opportunities to develop and manage such systems, as well as a growth in educational and training opportunities in these areas.
Another opportunity afforded by these systems will be an overall improvement in networking infrastructure, allowing increased compatibility between networked systems. Problems with voice, data, and image transmission will be reduced, improving overall communication quality and delivery. This will also drive down hardware and software costs, reducing the overall cost of processing data. Over time, the cost savings should make information systems more affordable, allowing businesses to become more competitive.
We have already seen exponential growth in mobile computing in the variations of devices, their functionality, and their processing power. Mobile computing will continue to exhibit improved functioning, making it easier to use and maintain, and possibly become more affordable in the future.
Challenges and Risks
Data analytics faces a significant challenge with big data (large amounts of complex data), which cannot be stored, processed, or analyzed using traditional data storage formats. However, analyzing this data in a timely manner can help decrease risks to society, nature, or the ecosystem. For example, hospitals, pharmaceutical companies, and other medical and health organizations store large amounts of medical data. When completed in a timely manner, data analytics can provide trend analysis and identify potential health-related threats to different communities, improving and even saving lives. Another risk occurs when, even if data are analyzed in a timely manner, the analysis uses bad data, such as data distorted by outdated records, inaccurate data integration processes, and data entry errors.
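One way to picture the bad-data problem is as a screening pass over incoming records. The sketch below uses the pandas library on a few invented patient rows to flag stale dates, duplicate entries, and implausible readings; the thresholds and column names are illustrative only.

```python
import pandas as pd

# Hypothetical patient records illustrating common data-quality problems
records = pd.DataFrame({
    "patient_id":  [101, 102, 102, 103, 104],
    "last_visit":  ["2024-11-02", "2019-01-15", "2019-01-15", "2025-03-30", "not recorded"],
    "systolic_bp": [118, 560, 560, 121, 95],  # 560 is a likely data entry error
})

# Parse dates; unparseable values ("not recorded") become NaT and can be reviewed
records["last_visit"] = pd.to_datetime(records["last_visit"], errors="coerce")

stale = records["last_visit"] < pd.Timestamp("2022-01-01")          # outdated records
duplicates = records.duplicated(subset="patient_id", keep="first")  # duplicate rows
implausible = records["systolic_bp"] > 250                          # likely entry errors

# Rows matching any rule get routed to a human for review before analysis
print(records[stale | duplicates | implausible])
```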
Data breaches are an increasing concern as hackers are becoming more sophisticated in breaking through networks. In health care, which shows much promise in the convergence of IS frontiers and emerging technologies, 45.9 million U.S. health-care records were breached in 2021, 51.9 million breaches occurred in 2022, and this increased to 133 million records exposed, stolen, or otherwise impermissibly disclosed in 2023.7 Care needs to be taken to protect data as it is processed in new ways.
Businesses are also challenged to maintain regulatory compliance as the increased use of these technologies continues to push the boundaries of regulatory bodies. It is becoming more difficult and expensive to ensure adherence to these regulations, and violations may result in substantial penalties, data breaches, and reputational risk to the business.
Challenges and risks for cloud computing include the misconfiguration of security settings, a common vulnerability in which default configurations, improper access controls, insufficient firewall protections, and similar oversights leave systems exposed. The data itself may also pose quality issues, as duplicate data, data corrupted by human error, or mixed data types create challenges when gathering data for analysis.
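The following sketch shows the idea of a configuration audit that scans for the kinds of misconfigurations described above. The setting names and checks are hypothetical and are not tied to any particular cloud provider’s API.

```python
# Hypothetical cloud settings; real audits would pull these from a provider's tooling
cloud_settings = {
    "storage_bucket_public": True,        # data exposed to anyone on the internet
    "admin_password": "admin",            # default credential left unchanged
    "firewall_open_ports": [22, 3389, 8080],
    "encryption_at_rest": False,
}

def audit(settings: dict) -> list[str]:
    """Return a list of findings describing risky configuration choices."""
    findings = []
    if settings.get("storage_bucket_public"):
        findings.append("Storage bucket is publicly readable; restrict access.")
    if settings.get("admin_password") in {"admin", "password", ""}:
        findings.append("Default or weak admin credential detected; rotate it.")
    risky_ports = {22, 3389} & set(settings.get("firewall_open_ports", []))
    if risky_ports:
        findings.append(f"Remote-access ports open to the internet: {sorted(risky_ports)}")
    if not settings.get("encryption_at_rest"):
        findings.append("Encryption at rest is disabled; enable it.")
    return findings

for finding in audit(cloud_settings):
    print("-", finding)
```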
Link to Learning
The publication Information Systems Frontiers: A Journal of Research and Innovation explores topics in areas of emerging technologies, including research developments in EMI, medical informatics, mobile computing, and e-commerce.
Footnotes
- 6Jane Thier, “Bosses, You’re Wrong: Remote Workers Are More Productive than Your In-Office Employees,” Fortune, October 20, 2022, https://fortune.com/2022/10/20/remote-hybrid-workers-are-more-productive-slack-future-forum/
- 7Steve Alder, “Healthcare Data Breach Statistics,” The HIPAA Journal, January 15, 2025, https://www.hipaajournal.com/healthcare-data-breach-statistics/