World History Volume 2, from 1400

15.3 Science and Technology for Today’s World


Learning Objectives

By the end of this section, you will be able to:

  • Discuss the development of complex digital computers and their effects on human society
  • Analyze the effects of the internet and social media on society
  • Describe important medical developments of the last fifty years and current medical challenges

World War II brought about a massive technological transformation as countries like Germany and the United States rapidly innovated to avoid destruction and defeat their enemies. In the decades after the war, medical technology made major progress, including the creation of new vaccines and the elimination of deadly diseases. All these achievements had profound effects on the way people lived, traveled, and worked. Underlying them were major advancements in information technology, above all the computer. Once the war was over, the development of increasingly powerful computers ushered in a revolution as transformative as the nineteenth century’s Industrial Revolution, and with it a digital age.

The Digital Computer Revolution

Many of the technological advancements of the 1940s and 1950s came in the form of increasingly powerful analog computers, which process a continuous stream of information, much like that recorded on vinyl records. Analog computers worked well for solving big mathematical problems, such as the calculations related to electrical power delivery systems or the study of nuclear physics. However, they were inefficient at managing large amounts of data. Digital computers, which translate information into a series of ones and zeros, were far more capable of managing bulk data. Just a few years after the war, digital computing received a huge boost with the invention of the transistor, a device with far more computing potential than its predecessor, the vacuum tube. Scientists could amplify this computing capacity even further by wiring multiple transistors together in increasingly complex ways.
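To see what translating information into ones and zeros means in practice, consider the minimal sketch below. It is a modern illustration written in Python, not anything from the period: it samples a sine wave, standing in for a continuous analog signal, and stores each sample as a fixed-width binary string. The function name and the eight-bit width are assumptions chosen for the example.

```python
import math

def digitize(sample: float, bits: int = 8) -> str:
    """Map a sample in the range [-1.0, 1.0] to a string of ones and zeros."""
    levels = 2 ** bits                     # number of discrete digital steps
    scaled = round((sample + 1.0) / 2.0 * (levels - 1))
    return format(scaled, f"0{bits}b")     # e.g. '10110010'

# Sample one cycle of a sine wave (standing in for an analog signal).
signal = [math.sin(2 * math.pi * t / 8) for t in range(8)]
for s in signal:
    print(f"{s:+.3f} -> {digitize(s)}")
```

Once information is in this form, a digital machine can copy, store, and process it without the degradation that affects continuous analog signals, which is one reason digital computers handle bulk data so well.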

The use of multiple transistors for computing purposes was an important step, but it had obvious drawbacks. Making machines capable of processing a great deal of information required connecting many transistors, which took up a great deal of space. Then, in the late 1950s, inventors in the United States developed an innovative solution: using silicon, they could integrate transistors and capacitors on a single chip, doing away with the clumsy wiring. The silicon-based integrated circuit freed computer technology from size constraints and opened the door to additional advancements in computing power.

Even so, digital computers remained large, expensive, and complicated to operate, and their use was largely confined to universities and the military. Only gradually over the 1970s did computing technology become more widely available, largely thanks to mass-produced general-purpose computers, sometimes called minicomputers, designed by IBM and the Digital Equipment Corporation. These served a variety of government and private purposes, such as tabulating the census, managing the flow of tax monies, and processing calculations related to creditworthiness (Figure 15.15). But despite being somewhat cheaper, minicomputers remained out of reach for average users.

Figure 15.15 Computers, 1950s. In the 1950s, computers required complex data management systems to operate. The U.S. Census Bureau used this device to transfer data from paper questionnaires to microfilm to allow for rapid processing by its computers. (credit: modification of work “Woman inspecting FOSDIC film” by U. S. Census Bureau/Wikimedia Commons, Public Domain)

The journey from minicomputers to personal computers began with the Intel Corporation, established in 1968 in Mountain View, California, in a region now commonly called Silicon Valley. During the 1970s, Intel developed a line of integrated circuits that were not only more powerful than their predecessors but also programmable. These became known as microprocessors, and they revolutionized computing by holding all of a computer’s processing power in a single integrated circuit. In 1975, a company in New Mexico released the first marketed personal computer, the Altair 8800. This used an Intel microprocessor and was promoted to computer hobbyists eager to wield a level of computing power once available to only a few. The Altair’s popularity inspired competing products like the Apple, the Commodore, and the Tandy Radio Shack computer (Figure 15.16). These personal computer systems were far easier to use and appealed to a much larger market than just hobbyists.

Figure 15.16 Personal Computer, 1980s. The Tandy Color Computer 3 shown here, released in 1986 and nicknamed the CoCo 3, was one of many personal computers released in the 1980s that average consumers were able to buy for their homes. (credit: “CoCo3system” by Unknown/Wikimedia Commons, Public Domain)

By 1982, there were 5.5 million personal computers in the United States, and over the next decade, their number and computing power rose exponentially. Computers proliferated in government offices, private firms, and family homes. Then, in 1984, Apple introduced the world to the Macintosh computer, which not only used a mouse but also replaced the standard text-based command-line interface with one built on graphics and icons. Recognizing the user-friendly possibilities of this graphical interface, competitors followed suit. Before long, the design popularized by Apple had become the norm.

By the end of the 1980s, not only had personal computers become common, but the microprocessor itself could be found everywhere. Microprocessors were incorporated into automobiles, cash registers, televisions, and household appliances and made possible a variety of other electronic devices like videocassette recorders and video game systems (Figure 15.17). Computer systems were created to store and manage financial, educational, and health-care information. In one form or another and whether they realized it or not, by the 1990s, almost everyone in the developed world was interacting with computers.

Figure 15.17 The Atari Video Computer System, 1977. Released in 1977, the Atari Video Computer System could be connected to almost any television, allowing users to play video games at home. The software was stored on small plastic cartridges that were plugged directly into the machine. (credit: “Atari 2600 mit Joystick” by “joho345”/Wikimedia Commons, Public Domain)

Modems were hardly new in the 1990s, but they became much faster and more common with the rise of the internet. The origins of the internet date back to the 1960s and the efforts of government researchers in the United States to use computers to share information. These efforts were especially important to the U.S. Department of Defense during the Cold War and resulted in the Advanced Research Projects Agency Network (ARPANET). In creating ARPANET, researchers developed many of the technologies that over the next few decades formed the basis of the internet we know today.

The Internet and Social Media

The process of globalization has been accelerated by the rise of the internet and the social media platforms, such as Instagram, Facebook, and Twitter, that operate on it. Many people were introduced to the potential of computer networks for sharing information and creating small social networks in the 1980s, when individual users became able to connect their computers to others by using modems and telephone networks. This connectivity gave rise to regional bulletin board systems (BBSs), in which one person’s computer served as a host for those of other users (Figure 15.18); a minimal modern sketch of this host-and-caller pattern appears after the figure. BBSs functioned much like today’s websites. Though they ran far more slowly and had limited capabilities, they allowed users to share computer files like games and images, post messages for others to read, participate in virtual discussions and debates, and play text-based online games. Because BBSs used phone networks to communicate and long-distance calls were then expensive, their users tended to be local.

Figure 15.18 A Bulletin Board System, 1980s. Bulletin board systems like this one relied on colorful text and simple graphics to make them appealing. They appear very limited compared to today’s websites, but in the 1980s, they were revolutionary and opened new possibilities for the future of communication. (credit: “Screenshot of OpenTG’s Group Permissions Editor” by Chris Tusa/Wikimedia Commons, Public Domain)
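The host-and-caller pattern described above can be sketched in a few lines of modern Python. This is a loose analogue for illustration, not a reconstruction of real BBS software: a plain TCP socket stands in for the modem and phone line, and the port number, prompts, and function name are all assumptions made for the example.

```python
import socket

HOST, PORT = "127.0.0.1", 2323        # placeholder address and port
board = ["Welcome to the board!"]     # messages live only in host memory

def serve_one_caller(server: socket.socket) -> None:
    conn, _addr = server.accept()     # one caller at a time,
    with conn:                        # much as a single modem line allowed
        listing = "\n".join(f"{i}: {msg}" for i, msg in enumerate(board))
        conn.sendall((listing + "\nPost a message:\n").encode())
        post = conn.recv(1024).decode(errors="replace").strip()
        if post:
            board.append(post)        # visible to the next caller
        conn.sendall(b"Saved. Goodbye.\n")

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
    server.bind((HOST, PORT))
    server.listen(1)                  # a single "phone line"
    while True:
        serve_one_caller(server)
```

A caller could connect with any plain-text client (for example, telnet 127.0.0.1 2323), read the board, and leave a message for the next caller, which is essentially what dialing into a 1980s BBS amounted to.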

Throughout the 1980s, BBSs continued to be popular with computer hobbyists and those intrigued by the idea of unique virtual communities, while networking technology improved steadily behind the scenes. The United States, Europe, and other developed countries were busy adopting a uniform set of protocols, the Transmission Control Protocol/Internet Protocol (TCP/IP), that allowed computers around the world to communicate with one another easily. Once this protocol suite had been established, the commercial internet as we currently understand it was born.

As early as 1987, about thirty thousand hosts resided on the burgeoning internet. Soon telecommunications and software companies began to exploit this new network by creating online service providers like America Online (AOL) to act as gateways to the internet. Initially, they used standard phone lines and modems to connect, much as BBSs had. But as the volume of information on the internet increased exponentially, service providers turned to more expensive broadband connections running over cable television lines and even dedicated data lines. During the 1990s, the first websites, the first internet search engines, and the first commercial internet platforms were established.
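What the shared protocols made possible can be suggested with a short sketch. The Python below is a present-day illustration, not 1990s-era code: it requests a page over HTTP, which itself runs on top of TCP/IP, the same layering that let the first websites be reached from any connected machine. The URL is a placeholder.

```python
from urllib.request import urlopen

# HTTP rides on top of TCP/IP; any machine speaking these shared
# protocols can serve or fetch the page, regardless of its hardware.
with urlopen("http://example.com/") as response:
    html = response.read().decode("utf-8", errors="replace")

print(html[:200])  # the first few hundred characters of the page
```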

By 2005, more than one billion people worldwide were using the internet regularly. They were able to shop online, make phone calls around the world, and even create their own websites with almost no technical training. Never before had the world been so connected. In 2004, Facebook was launched. Originally a networking tool for Harvard students, it quickly expanded globally to become a giant in the new world of social media. By 2010, nearly half a billion Facebook users around the world were sharing images and messages, creating communities, and linking to news stories. By 2022, the number of Facebook users had reached nearly three billion.

Before 2007, almost all internet users gained access to the network via a personal computer, either at home or at work. That year, however, Apple Inc. released the first iPhone, not only a powerful cell phone but also a portable computer capable of performing tasks that once required a desktop computer. Even more revolutionary, it connected to the internet wirelessly through cell-phone infrastructure. While the iPhone was not the first phone to connect to the internet, its touch-screen interface was far superior to earlier systems. Within just a few years, other cell-phone manufacturers were imitating its design and putting smartphones, and thus internet access, in the pockets of users around the world.

Smartphones have transformed life in developing countries, where they have helped bypass some of the traditional stages of infrastructure creation. In Africa, for example, people living where no landlines existed can now communicate with others using cell phones. Small farmers and traders can use cell phones for banking and to connect with potential suppliers and customers. In communities without libraries, schoolchildren can access the internet’s resources to study.

Smartphones have also democratized the internet, serving as powerful tools for organizing and promoting political change. The large pro-democracy movement in Cairo’s Tahrir Square captured the world’s attention in 2011, for example. But it began with twenty-five-year-old activist Asmaa Mahfouz’s YouTube video of January 18, 2011, in which she spoke directly to the camera and urged young Egyptians to protest at the square as part of the larger Arab Spring, a call for government reform and democracy that echoed across the Arab world.

The Arab Spring was touched off in December 2010 when Muhammad Bouazizi, a young college graduate, set himself on fire in Tunisia after government officials there tried to interfere with the fruit cart that was his only source of income. Other young Tunisians took to the streets in protest, and demonstrations continued into January 2011. As people died in confrontations with government forces, President Zine al-Abidine Ben Ali fled the country, and Tunisia’s prime minister resigned shortly thereafter.

The Tunisian protests led to similar demonstrations in Egypt. On January 17, 2011, an Egyptian set himself on fire near the nation’s Parliament to protest the lack of economic opportunities. Crowds of mostly young people responded with massive demonstrations that lasted weeks (Figure 15.19). These demonstrations were fueled by and broadcast to the world through text messages, photos, tweets, videos, and Facebook posts sent by thousands of mobile phones, including that of Mahfouz. The devices amplified the calls for democracy and showed the world the Egyptian government’s use of violence to try to silence the protestors. Egyptian president Hosni Mubarak resigned on February 11, 2011. He was later convicted for his role in ordering government forces to harm and kill protestors.

Figure description: (a) A photograph shows crowds filling Tahrir Square around large tented encampments, with rows of people kneeling in prayer. (b) A map of North Africa and the Middle East shows Arab Spring outcomes: government overthrown multiple times in Egypt and Yemen; government overthrown in Tunisia; civil war in Libya and Syria; protests and governmental changes in Morocco, Western Sahara, Oman, Jordan, and Kuwait; major protests in Algeria, Sudan, and Iraq; minor protests in Mauritania, Somalia, and Saudi Arabia; and other protests and militant action outside the Arab world.
Figure 15.19 The Egyptian Revolution. (a) Internet-connected cell phones using social media applications like Facebook were a common sight at the large 2011 protests at Tahrir Square, Cairo. (b) The map shows the results of other uprisings in Africa and the Middle East that were part of the Arab Spring of 2010–2012. (credit a: modification of work “Tahrir Square during 8 February 2011” by “Mona”/Flickr, CC BY 2.0; credit b: modification of work “Arab Spring and Regional Conflict Map” by Ian Remsen/Wikimedia Commons, Public Domain)

In the wake of the Egyptian protests, activists in Libya, Yemen, Syria, Morocco, Lebanon, Jordan, and other countries coordinated their activities using computers and smartphones to access social media, video, and mobile phone messaging. These efforts resulted in protests, changes to laws, and even the toppling of governments, as in Egypt and Tunisia. They also contributed to civil wars in Syria, Iraq, and Libya that caused thousands of deaths and a refugee crisis in the Mediterranean. While Twitter and Facebook were useful for scaling up protests, the movements to which they gave birth often struggled to find direction in countries without a well-established resistance movement.

Since 2011, governments around the world have come to recognize the power of social media to bring about change, and many authoritarian and even ostensibly democratic leaders have moved to limit or block social media use in their countries. China has blocked Facebook and Twitter since 2009 and encourages its citizens to instead use the state-authorized app WeChat, which shares information with the government. In 2020, India banned the social media app TikTok, claiming it threatened state security and public order. In March 2022, following its February invasion of Ukraine, Russia banned Instagram and Facebook because, the government alleged, the platforms carried messages calling for violence against Russian troops and against Russian president Vladimir Putin. Turkmenistan has gone further than China, India, or Russia. It not only bans Facebook and Twitter, but it also requires citizens applying for internet access to swear they will not try to evade state censorship.

In the United States, lawmakers have recognized that social media platforms like Facebook and Twitter can both promote and endanger democracy. Social media gives extremist groups the ability to attract followers from across the nation and incite violence. Groups can use the platforms to spread fake news, and a report by the U.S. Senate concluded that Russian intelligence operatives used Facebook, Twitter, and Instagram to manipulate voters. Legislators have called on social media companies to moderate the content on their platforms more actively and to limit or block access by groups or persons spreading hate speech or disinformation. The potential for misuse of technology is heightened by advances that enable the creation of deepfakes, computer-generated images that closely resemble real people.

Medical Miracles and Ongoing Health Challenges

Advances in computer technology were not the only technological success stories of the post–World War II world. In 1947, scientists perfected an artificial kidney, and just five years later, the first successful kidney transplant was performed. In the 1950s, antipsychotic drugs were developed and used to treat psychiatric disorders that had once consigned patients to a lifetime of difficult treatment in a psychiatric hospital. In the same decade, scientists discovered the double-helix structure of DNA, a finding that was crucial for later advancements such as the ability to use DNA to diagnose and treat genetic diseases. In 1962, a surgical team successfully reattached a severed limb for the first time, and in 1967, the first human heart transplant took place. Over the next decade and a half, medical advances made it possible to conduct telemedicine, view and monitor internal organs without performing surgery, and monitor the heartbeat of a fetus during pregnancy.

Medical science also made enormous gains in eradicating diseases that had been common for centuries. For example, polio epidemics had caused paralysis and even death since the late nineteenth century, but in 1950, the first successful polio vaccine, developed by the Polish-born virologist Hilary Koprowski, was demonstrated to be effective in children. This was an orally ingested live vaccine, a weakened form of the virus designed to help the immune system develop antibodies. In the meantime, researcher Jonas Salk at the University of Pittsburgh was developing an injectable vaccine that used a killed form of the virus (Figure 15.20). Though inactivated, the virus still triggered the body to produce antibodies. In 1955, Salk’s vaccine was licensed for use in the United States, and mass distribution began there. Other vaccines were developed in the United States and other countries over the next several years. Their use has nearly eradicated polio, whose cases once numbered in the hundreds of thousands. When polio was detected in an adult in New York in July 2022, it was the first case in the United States since 2013.

Figure 15.20 Jonas Salk. After the polio vaccine he developed proved successful, Dr. Jonas Salk chose not to patent it to ensure it would be used freely around the world. (credit: “Dr Jonas Edward Salk, creator of Salk polio vaccine, at Copenhagen Airport” by SAS Scandinavian Airlines/Wikimedia Commons, Public Domain)

The eradication of smallpox is another important success story. Centuries ago, smallpox devastated communities around the world, especially Native American groups, which had no immunity to the disease when Europeans brought it to their shores. Early vaccines based on the cowpox virus were deployed in the United States and Europe beginning in the late eighteenth century, with great effect. In the twentieth century, advancements made the vaccine safer and easier to administer. Even so, in the 1950s, much of the world remained unvaccinated and susceptible. Beginning in 1959, the World Health Organization (WHO) worked to eradicate smallpox through mass vaccination, redoubling its efforts in 1967 through the Intensified Eradication Program. During the 1970s, smallpox was eliminated in South America, Asia, and Africa. In 1980, the WHO declared it had been eradicated globally.

The WHO’s smallpox program is considered the most effective disease-eradication initiative in history, but it was an aggressive campaign not easily replicated. And without a vaccine, the problems of controlling transmissible diseases can be immense. A novel disease was first reported among Los Angeles’s gay community in 1981, and by 1982 it had become known as AIDS (acquired immunodeficiency syndrome). Researchers realized it was commonly transmitted through sexual intercourse but could also be passed by shared needles and blood transfusions. At that time, the U.S. Centers for Disease Control explained that AIDS was not transmitted through casual contact, but the information did little to calm rising concerns about this still largely mysterious and deadly disease. By 1987, more than 60,000 people in the world had died of AIDS, yet the U.S. government was slow to fund research to develop treatments or find a cure. That year, activists at the Lesbian and Gay Community Services Center in New York City formed the AIDS Coalition to Unleash Power (ACT UP). They were alarmed by the toll AIDS was taking on the gay community and by the government’s seeming lack of concern about a disease the media depicted as affecting primarily gay men, an already stigmatized group. ACT UP engaged in nonviolent protest to bring attention to its cause and worked to correct misinformation about the disease and those infected with it.

By the year 2000, scientists in the developed world had acquired a sophisticated understanding of AIDS and the human immunodeficiency virus (HIV) that causes it, and treatments had emerged that made it a manageable rather than a lethal disease, at least in the developed world. But in parts of the developing world, like sub-Saharan Africa, infection rates were still rising. One difficulty was that HIV infection and AIDS had become associated with homosexuality, which carried stigma and, in some places, even legal penalties that made those infected reluctant to seek help. Addressing transmission with the general public also meant broaching sometimes culturally sensitive topics like sexual intercourse. Those attempting to control the spread of the disease often found themselves trying to influence social and cultural practices, a complicated task fraught with pitfalls.

This does not mean there were no successes. Wider condom use, circumcision, and public information campaigns, along with the declining cost of treatment, have greatly reduced the extent of the epidemic in Africa. But AIDS is still an enormous and devastating reality for Africans today. Sub-Saharan Africa is home to nearly 70 percent of the world’s HIV-positive cases. Women and children are particularly affected; Africa accounts for 92 percent of all cases of infected pregnant women and 90 percent of all infected children.

Ebola virus has also threatened the health of Africans. The first known outbreak of Ebola, a hemorrhagic fever, took place in Central Africa in 1976. Since then, there have been several other outbreaks. In 2013–2016, an outbreak in West Africa quickly spread across national borders and threatened to become a global epidemic. Approximately ten thousand people fell ill in Liberia alone, and nearly half of those infected died.

The most recent challenge to world health, the COVID-19 pandemic, demonstrates the effects of both globalization and technological developments. The coronavirus SARS-CoV-2 appeared in Wuhan, China, an industrial and commercial hub, in December 2019. Airplane and cruise ship passengers soon unwittingly spread it throughout the world; the first confirmed case in the United States appeared in January 2020. As every continent reported infections, offices, stores, and schools closed and travel bans appeared. Despite these restrictions, middle-class and wealthy people in the developed world carried on almost as normal. Many worked, studied, shopped, visited friends and family, and consulted doctors online from their homes.

Low-paid workers in service industries often lost their jobs, however, as restaurants and hotels closed, and children without access to computers or stable internet connections struggled to keep up with their classes. Even the more fortunate in the developed world confronted shortages of goods from toilet paper to medicines to infant formula when global supply chains stalled as farm laborers, factory workers, dock hands, and railroad employees fell ill or workplaces closed to prevent the spread of infection. Developing countries lacked funds to support their citizens through prolonged periods of unemployment. Although vaccines were developed in several countries, they were available primarily to people in wealthier nations. As of March 2022, only 1 percent of all vaccine doses administered worldwide had been given to people in low-income countries.

Beyond the Book

Public Art and Modern Pandemics

The fight against dangerous diseases like HIV/AIDS can energize more people than just the doctors working in laboratories and the global leaders publishing reports. During the early years of the HIV/AIDS crisis, grassroots organizers from around the world strove to focus attention on the problem. Their actions were necessary because governments often did little to prevent the spread of the disease or provide treatment for those infected. The AIDS Coalition to Unleash Power (ACT UP) became known for staging loud protests in public and sometimes private places to raise awareness of the disease. In the United States, the publicity generated by groups like ACT UP forced the government to pay greater attention and to budget more money for the search for a cure. Some artists responded to this movement with murals in well-known locations like the Berlin Wall (Figure 15.21).

Figure 15.21 AIDS Mural. Artists often used the western face of the Berlin Wall to create provocative murals. This one, painted in the 1980s, featured the name of the HIV/AIDS awareness group ACT UP. (The Berlin Wall came down in 1989.) (credit: “Act Up! From the West side of the Berlin Wall” by Rory Finneren/Flickr, CC BY 2.0)

While some murals about disease, especially HIV/AIDS, were calls to action, others aimed to educate the public. A mural painted on a wall in Kenya for World Malaria Day 2014 showed viewers the proper use of bed nets to help lower the rate of infection (Figure 15.22).

Figure description: The mural shows a mother and child sleeping safely under a treated bed net while a man sleeps unprotected on the floor nearby, surrounded by a diagram of the mosquito life cycle labeled eggs, larva, pupa, and adult.
Figure 15.22 Malaria Mural. This 2014 mural in Kenya illustrates how to avoid malaria infection with mosquito nets. The caption reads, “Sleep in a treated net, every night, every season.” (credit: “Final mural at Nangina Primary School, Busia County” by U.S. President’s Malaria Initiative/Flickr, Public Domain)

During the COVID-19 pandemic, artists again took to the streets. Some of the murals they painted demanded action or celebrated health workers. Others called attention to the rising number of elderly people dying of the disease (Figure 15.23).

Figure 15.23 COVID-19 Mural. In this German mural about the COVID-19 pandemic, the artist shows how the highly communicable disease separated generations while also highlighting the vulnerability of the elderly. The caption reads, “Family is everything.” (credit: “Mural Familie ist alles by Rick Riojas 2020” by Hubert Berberich/Wikimedia Commons, Public Domain)
  • What makes art a powerful medium for conveying messages about awareness? What aspects of these murals seem especially powerful to you?
  • Do you recall seeing artwork from the COVID-19 pandemic or any other disease outbreak? What stood out in it?
  • What other art forms might an artist use to communicate political or social messages? How are these methods effective?