World-renowned primatologist Dian Fossey is found murdered in Rwanda

On December 26, 1985, primatologist and conservationist Dr. Dian Fossey is found murdered in her cabin at Karisoke, a research site in the mountains of Rwanda. It is widely believed that she was killed in connection with her lifelong crusade against poaching.

An animal lover from a young age, Fossey began her career as an occupational therapist. She would later credit her work with children for helping her earn the trust of the mountain gorillas she studied. In 1963, she borrowed money in order to finance an extended trip to Africa. Her travels brought her into contact with the archaeologists Louis and Mary Leakey and wildlife photographers Alan and Joan Root and introduced her to the work of primatologist Jane Goodall. She published several articles about her travels and returned to the United States, but in 1966 the Leakeys helped her secure funding to study gorillas in the Congo.

Political unrest in the Congo led Fossey to flee the country and set up her camp, Karisoke, in the Rwandan foothills of the Virunga Mountains. There, she studied and interacted extensively with the native gorillas. Fossey eventually received a Ph.D. in zoology from Cambridge University and lectured for several years at Cornell. Her research on gorilla societies greatly enhanced mankind’s understanding of one of its closest evolutionary relatives. Fossey is best known, however, as a fierce opponent of poaching. Park rangers were known to accept bribes, allowing poachers to set up traps and routinely kill gorillas in the national park where Fossey worked. After poachers brutally killed her favorite gorilla, Digit, in 1977, Fossey launched a public and somewhat obsessive crusade to protect gorillas and punish poachers. Fossey destroyed traps and was even known to detain poachers, sometimes physically beating them. She cultivated a reputation among the locals as a practitioner of dark magic in an effort to keep people from harming her gorilla friends.

Her efforts drew worldwide attention to the anti-poaching cause, but may have led to her death. Though an allegedly jealous fellow researcher was convicted in absentia for her murder in Rwanda, many believe that her killing was revenge for her treatment of poachers. She was buried in a cemetery at Karisoke, alongside Digit and other gorillas killed by poachers. Though she had become reclusive and bitter toward the end of her life, the final entry in her journal was a hopeful one: “When you realize the value of all life, you learn to dwell less on what is past and concentrate more on the preservation of the future.” The fund she founded, the Dian Fossey Gorilla Fund International, carries on her efforts to protect gorillas to this day.


First American “test-tube baby” is born

On December 28, 1981, the first American “test-tube baby,” a child born as a result of in-vitro fertilization, is born in Norfolk, Virginia. Considered a miracle at the time, births like that of Elizabeth Jordan Carr are now common.

In-vitro fertilization is a process in which doctors fertilize an egg outside of a woman’s body and implant the developing embryo in the womb. In this way, women with damaged or missing Fallopian tubes, which carry fertilized eggs from ovaries to the uterus, are able to become pregnant. Doctors carried out the first successful in-vitro fertilization of a rabbit in 1959, and the first human test-tube baby was born in England in 1978. One of the doctors responsible, Dr. Robert Edwards, was awarded the Nobel Prize in Physiology or Medicine for this work in 2010.

A number of successful IVF-induced pregnancies followed, leading the husband-and-wife team of Drs. Howard and Georgeanna Jones to open an IVF clinic at Eastern Virginia Medical School in 1980. “I think this is a day of hope,” Howard Jones said after Carr and her mother were declared to be in perfect health, citing the roughly 600,000 American women who could theoretically give birth thanks to the procedure.

IVF was not without its critics. Many in the medical community were cautious about “playing God.” IVF drew condemnation from figures like Rev. Jerry Falwell and others in the “Moral Majority,” a socially conservative movement that was in its ascendancy in the early 1980s. The Roman Catholic Church opposes IVF on the grounds that it separates marital sex from the act of conception, while others continue to criticize what they perceive as an industry built around selling IVF to couples with fertility issues. Nonetheless, the procedure has been refined over several decades and is now fairly common: a 2012 study estimated that some 5 million babies had been born through IVF, and the procedure is now estimated to account for over one percent of American births every year.


“Gangnam Style” becomes the first YouTube video to reach one billion views

On December 21, 2012, the music video for “Gangnam Style,” a song by the South Korean rapper Psy, becomes the first YouTube video to garner one billion views. The video’s global popularity is a case study in the power and unpredictability of viral internet content.

Psy had been well-known in South Korea for a decade, earning awards and acclaim as well as a reputation for controversy. Though Korean pop music, or K-pop, was increasingly popular outside of South Korea, Psy was not an international star until “Gangnam Style.” Released on July 15, 2012, as the lead single from his album Psy 6 (Six Rules), Part 1, the song and its video would make him a global sensation.

“Gangnam Style” is a send-up of the “posers and wannabes” Psy observed in Seoul’s fashionable Gangnam District. Though the lyrics are humorous, it was the video that made the song a sensation beyond Korea. Psy and others perform the “invisible horse” dance, in which the singer pretends to ride a horse and occasionally tosses a lasso, in locations around Seoul including a stable, a bus and a tennis court. The iconic dance, the memorable chorus of “Hey sexy lady!” and the general over-the-top nature of the video caught the attention of a global audience.

The likes of T-Pain, Britney Spears and Katy Perry noticed the video and drew attention to it on social media. By the end of August, it was garnering over 3 million YouTube views a day, and in December it reached its unprecedented 1 billionth view.

Like other viral videos, “Gangnam Style” inspired countless parodies, reaction videos, and flash mobs. Athletes, television personalities and even politicians—U.S. Representative John Lewis recorded a video of himself doing the dance, and then-Prime Minister of the United Kingdom David Cameron reportedly performed it along with future PM Boris Johnson at a conference—joined in the viral craze. Though no longer the most-watched video on YouTube, “Gangnam Style” was an inescapable cultural phenomenon, serving as an introduction to K-pop for millions around the world and as a lasting example of internet virality.


Blockbuster sci-fi film “Avatar” opens in U.S. theaters

December 16, 2009, sees the U.S. release of the blockbuster science fiction film Avatar. One of the most expensive films ever made, it was also one of the most successful, holding the title of highest-grossing film of all time for nearly a decade.

Director James Cameron was no stranger to massive, ambitious projects, having achieved acclaim and enormous box office success with films like The Terminator and Titanic. A lifelong science fiction fan, he wrote a treatment for Avatar in 1994 but delayed the project because he felt the technology required did not yet exist. Finally, in 2006, the project began to take shape.

Avatar is the story of a human soldier who takes an alien form in order to explore and infiltrate the Na’vi, the native race of Pandora, a moon that humans intend to exploit for its natural resources. Like Hayao Miyazaki’s Princess Mononoke—one of its major influences—Avatar is an action-adventure movie with heavy environmentalist and anti-imperialist overtones. Cameron stated that, in addition to warning about environmental degradation, the film was also a critique of the Iraq War, then in its sixth year.

The film made use—in many cases, the first use—of a number of advances in motion-capture technology and computer-generated imagery. Over 900 people worked on the digital effects, and the film officially cost $237 million, although there is speculation that the actual budget ran as high as $310 million. Avatar was not the first major 3D movie, but it contributed greatly to the mainstream release of films in 3D.

Avatar was an immediate hit, supplanting Titanic as the new highest-grossing film of all time. Reviews were largely positive, although some felt the film was heavy-handed or derivative of other stories, most obviously Pocahontas. It was nominated for nine Academy Awards, winning Oscars for Best Art Direction, Best Cinematography and Best Visual Effects. Though many of its innovations are commonplace or even outdated today, Avatar is remembered for ushering in a new age of CGI-heavy blockbusters. It was one such film, Avengers: Endgame, that finally surpassed Avatar as the highest-grossing film of all time in April 2019.


U.S. declares an end to the War in Iraq

In a ceremony held in Baghdad on December 15, 2011, the war that began in 2003 with the American-led invasion of Iraq officially comes to an end. Though this marked the official end of the Iraq War, violence continued and in fact worsened over the subsequent years. The withdrawal of American troops had been a priority of President Barack Obama, but by the time he left office the United States would again be conducting military operations in Iraq.

Five days after the 9/11 attacks, President George W. Bush announced the “War on Terror,” an umbrella term for a series of preemptive military strikes meant to reduce the threat terrorism posed to the American homeland. The first such strike was the invasion of Afghanistan in October 2001, which began a war that continues to this day.

Throughout 2002, the Bush Administration argued that Iraqi president Saddam Hussein was allied with terrorists and developing “weapons of mass destruction.” By all accounts, Hussein was responsible for many atrocities, but there was scant evidence that he was developing nuclear or chemical weapons. Behind closed doors, intelligence officials warned that the case for war was based on conjecture—a British inquiry later revealed that one report’s description of Iraqi chemical weapons had actually come from the Michael Bay-directed action movie The Rock. The governments of the U.S. and the U.K., however, were resolute in their public assertions that Hussein posed a threat to their homelands, and went ahead with the invasion.

The invasion was an immediate success insofar as the coalition had toppled Hussein’s government and occupied most of Iraq by mid-April. What followed, however, was eight years of insurgency and sectarian violence. American expectations that Iraqis would “greet them as liberators” and quickly form a stable, pluralistic democracy proved wildly unrealistic. Though the coalition did install a new government, which took office in 2006, it never came close to pacifying the country. Guerrilla attacks, suicide bombings and improvised explosive devices continued to take the lives of soldiers and civilians, and militias on both sides of the Sunni-Shia divide carried out campaigns of ethnic cleansing.

The American public remained skeptical of the war, and many were horrified at reports of atrocities carried out by the military and CIA. Leaked photos proved that Americans had committed human rights abuses at the Abu Ghraib prison, and in 2007 American military contractors killed 17 civilians in Baghdad’s Nisour Square. Opposition to the war became an important talking point in Obama’s bid for the presidency.

On New Year’s Day 2009, shortly before Obama took office, the U.S. handed control of the Green Zone—the Baghdad district that served as coalition headquarters—to the Iraqi government. Congress formally ended its authorization for the war in November 2011, and the last combat troops left the following month. Even by the lowest estimates, the Iraq War claimed over 100,000 lives; other estimates suggest that the number is several times greater, with over 205,000 civilian deaths alone.

Over the next three years, ongoing sectarian violence escalated into a full-scale civil war. Many of the militias formed during the Iraq War merged or partnered with extremist groups in neighboring Syria, itself experiencing a bloody civil war. By 2014 the Islamic State of Iraq and the Levant, which absorbed many of these groups, controlled much of Syria and Iraq. The shocking rise of ISIL led Obama to launch fresh military actions in the region beginning in June of 2014. Though ISIL has now been driven out of Iraq and appears to be very much diminished, American troops are still on active duty in Iraq, 16 years after the initial invasion and eight years after the official end of the Iraq War.


Smallpox is officially declared eradicated

On December 9, 1979, a commission of scientists declares that smallpox has been eradicated. The disease, which carries around a 30 percent chance of death for those who contract it, is the only infectious disease afflicting humans that has officially been eradicated.

Something similar to smallpox had ravaged humanity for thousands of years, with the earliest known description appearing in Indian accounts from the 2nd century BCE. It was believed that the Egyptian Pharaoh Ramses V died of smallpox in 1145 BCE; however, recent research indicates that the actual smallpox virus may have evolved as late as 1580 CE. A type of inoculation—introducing a small amount of the disease in order to bring on a mild case that results in immunity—was widespread in China by the 16th century.

There is no record of a smallpox-like illness in the Americas before European contact, and the fact that Europeans brought the disease with them was a major factor in their conquest and near-eradication of many of the indigenous peoples of North, South and Central America. Smallpox was the leading cause of death in 18th-century Europe, leading to many experiments with inoculation. In 1796 the English scientist Edward Jenner developed a vaccine. Unlike other types of inoculation, Jenner’s vaccine, made from cowpox, a closely related disease that affects cows, carried no risk of transmitting smallpox itself.

Many European countries and American states made the vaccination of infants mandatory, and the incidence of smallpox declined over the 19th and early 20th centuries. Compared to other epidemic diseases, such as polio or malaria, smallpox eradication was relatively simple because the disease lives only in humans, making human vaccination highly effective at stopping its spread, and its symptoms appear quickly, making it easy to identify and isolate outbreaks.

Starting in 1967, the World Health Organization undertook a worldwide effort to identify and stamp out the last remaining outbreaks of the disease. By the mid-1970s, smallpox was only present in the Horn of Africa and parts of the Indian subcontinent. The last naturally occurring case was diagnosed in Somalia in 1977. Two years later, doctors proclaimed its eradication. The elimination of smallpox is one of the major successes in the history of science and medicine.


Kenya declares independence from Britain

On December 12, 1963, Kenya declares its independence from Britain. The East African nation is freed from its colonial oppressors, but its struggle for democracy is far from over.

A decade before, in 1952, a rebellion known as the Mau Mau Uprising had shaken the British colony. Not only did the British spend an estimated £55 million suppressing the uprising, they also carried out massacres of civilians, forced several hundred thousand Kenyans into concentration camps, and suspended civil liberties in some cities. The war ended in the imprisonment and execution of many of the rebels, but the British also understood that things had permanently changed. The colonial government introduced reforms making it easier for Kenyans to own land and grow coffee, a major cash crop previously reserved for European settlers. Kenyans were allowed to be elected to the Legislative Council beginning in 1957. With nationalist movements sweeping across the continent and with Britain no longer financially or militarily capable of sustaining its empire, the British government and representatives from the Kenyan independence movement met in 1960 to negotiate independence.

The agreement called for a 66-seat Legislative Council, with 33 seats reserved for black Kenyans and 20 for other ethnic groups. Jomo Kenyatta, leader of the Kenya African National Union (KANU), whom the British had imprisoned on false charges during the Mau Mau Uprising, was sworn in as Kenya’s Prime Minister on June 1, 1963, in preparation for the transition to independence. The new nation’s flag was modeled on that of KANU and featured a Maasai shield at its center.

Kenya’s problems did not end with independence. Fighting with ethnic Somali rebels in the north continued from the time of independence until 1969, and Kenyatta instituted one-party rule, leading a corrupt and autocratic government until his death in 1978. Questions about the fairness of its elections continue to plague the country, which instituted a new constitution in 2010. Kenyatta’s son, Uhuru, has been president since 2013.


Plantation Middle School and Western High School Selected by the U.S. Department of State to Host Teachers from China

Teacher of Mandarin Chinese to Lead Classes for the 2019/20 School Year

Congratulations to Plantation Middle School and Western High School for being selected to host teachers from China as part of the Teachers of Critical Language Program by the U.S. Department of State’s Bureau of Educational and Cultural Affairs. The schools are two of only 28 that have won a nationwide competition to host Chinese language teachers to strengthen their language and culture programs. This is the second year that Plantation Middle has been selected to host a teacher from China. 

In addition, students, teachers and community members will have the opportunity to learn about the visiting teachers’ home country of China and expand their understanding of the world and the global workforce. In return, exchange teachers gain first-hand knowledge of the United States to share with students and fellow teachers in their home country.

“We are incredibly proud that Plantation Middle School and Western High School were selected for the Teachers of Critical Languages Program,” said Broward County Public Schools (BCPS) Superintendent Robert W. Runcie. “It’s important for our students to become global learners and to build international competencies. Mandarin Chinese is one of the critical languages of the world and we’re thrilled to provide this cultural and academic experience for our students.”  

In addition to Plantation Middle and Western High, BCPS offers Mandarin Chinese at Crystal Lake Middle School, Falcon Cove Middle School and Parkway Middle School; and Cypress Bay High School, Millennium 6-12 Collegiate Academy, Pompano Beach High School and South Broward High School. 

The Teachers of Critical Languages Program is supported and fully funded by the U.S. Department of State’s Bureau of Educational and Cultural Affairs, and implemented by American Councils for International Education. The program is designed to increase the study and acquisition of important world languages and to help U.S. schools and districts expand and reinforce existing world languages programs.

###

ABOUT BROWARD COUNTY PUBLIC SCHOOLS

“Committed to educating all students to reach their highest potential.”

Broward County Public Schools (BCPS) is the sixth-largest school district in the nation and the second-largest in the state of Florida. BCPS, Florida’s first fully accredited school system, has held that distinction since 1962. BCPS has more than 271,500 students and approximately 175,000 adult students in 234 schools, centers and technical colleges, and 89 charter schools. BCPS serves a diverse student population, with students representing 204 different countries and 191 different languages. Connect with BCPS: visit the website at browardschools.com, follow BCPS on Twitter @browardschools and Facebook at facebook.com/browardschools, and download the free BCPS mobile app.


The United Arab Emirates is formed

On December 2, 1971, the United Arab Emirates is formed. The union of six small Gulf kingdoms—to which a seventh was soon added—created a small state with an outsized role in the global economy.

A number of kingdoms along the southern coast of the Persian Gulf came under British protection through a series of treaties beginning in 1820. Concerned with protecting trade routes to its prized colony of India, Britain extended naval protection to what became known as the Trucial States in exchange for their cooperation with British interests. During this period of British protection, the region’s vast oil reserves were discovered. As the Trucial States and nearby kingdoms like Bahrain and Qatar became major suppliers of oil, the British Empire’s influence receded due to a number of factors, the two World Wars chief among them. In 1968, the British government declared that it would end the protectorate, withdrawing its military and leaving the people of the region to their own devices.

Dwarfed by their neighbors in terms of size, population and military capabilities, the small kingdoms of the region attempted to organize themselves into a single political unit. The negotiations proved difficult, and Bahrain and Qatar elected to declare independence unilaterally. With the British treaty due to expire and both Iran and Saudi Arabia eyeing their territory and resources, the kingdoms of Abu Dhabi, Ajman, Fujairah, Sharjah, Dubai and Umm al-Quwain became the independent United Arab Emirates on December 2, 1971. Ras al-Khaimah joined two months later.

Since then, the UAE has been a sovereign nation, enjoying the profits of its natural resources—its reserves of oil and natural gas are the seventh-largest in the world, and it has the seventh-highest GDP per capita. This wealth has turned the Emirates into a major hub of trade, travel, tourism and finance. Dubai’s Burj Khalifa, the tallest structure in the world, is emblematic of the Emirates’ dramatic construction boom and rise to global prominence. Though its cities are some of the most modern in the world, the nation remains a monarchy governed by religious law—its president and prime minister are the absolute monarchs of Abu Dhabi and Dubai, respectively, and apostasy, homosexuality and even kissing in public are punishable by law.

Source

Grand Central Terminal opens in New York City

On February 2, 1913, New York City’s Grand Central Terminal opens for the first time. Construction of the transportation hub as we know it today began in 1903, but before that, 89 E. 42nd Street was home to an older steam-train station built in 1879. Even though that station had been updated to handle an increased volume of commuters coming from suburbs outside the city, a collision between outdated steam trains in 1902 killed 15 people and made it clear that a more substantial renovation was needed.

That same year, engineer William Wilgus and railroad tycoon Cornelius Vanderbilt began planning the landmark that Grand Central is today. They proposed a station with new electric trains that would not emit exhaust fumes and could, for the first time, operate underground. Planning officials also changed the station’s name: since trains went no farther south than 42nd Street, the hub was technically a terminal, and it was accordingly renamed Grand Central Terminal. While these renovations and improvements had practical value, the more significant impact that both Wilgus and Vanderbilt hoped to create was cultural.

Grand Central was designed to usher New York into the dynamic 20th century. As the world around it grew increasingly interconnected, Vanderbilt wanted Grand Central to overtake its rival Penn Station as the palatial gateway to the heart of a rapidly growing country. That ambition was manifested in the form of a towering white marble facade and a ceiling mural depicting God’s view of the sky. After almost 10 years of construction and more than $4 billion in today’s money, New York’s architectural marvel opened to the world.

Despite initial success, Grand Central eventually fell into severe disrepair due to an increase in highway use and gradual neglect. Even the ceiling blackened due to cigarette smoke. As early as 1945, there were calls to tear down the building. However, the destruction of the original Penn Station between 1963 and 1966 sparked a movement to preserve architecturally significant buildings in New York, including Grand Central. Several high-profile New Yorkers, including former first lady Jackie Kennedy Onassis and architect Philip Johnson, formed The Committee to Save Grand Central. The committee fought to preserve Grand Central’s status as a landmark building, ensuring it could never be torn down. A $100 million restoration beginning in 1980 reestablished Grand Central as a bustling monument to the power and grandeur of New York City.
