The Chorus Effect: Part 1 — Soundtrack To A Famine
On the 40th anniversary of Live Aid, a look back at how three interlinked moments in popular music helped reframe global conscience.
Prologue
Between November 1984 and July 1985, three events reshaped the landscape of global humanitarian action — not through policy or diplomacy, but through the unlikeliest of instruments: popular music. In the span of eight months, Do They Know It’s Christmas?, We Are the World, and Live Aid emerged as a trilogy of pop-driven interventions that came to define a new standard for celebrity humanitarianism. Each was born of moral urgency, media savvy, and cultural capital, and each sought to address the same unfolding catastrophe: widespread famine in Ethiopia, and its cascading public health consequences including, most notably, epidemic levels of severe acute malnutrition. Together, they offered the world a new playbook — one in which popular music became both the message and the medium for global engagement.
But despite the monumental visibility and fundraising success of these campaigns, they faced intense criticism — not for what they failed to do, but for how they chose to do it. The challenges they encountered can be broadly understood on two levels: the narratives used to represent the crisis, and the mechanics of the aid efforts they helped set in motion. On the first level, the lyrics and imagery used across the three campaigns, while emotionally powerful, were often accused of reinforcing reductive stereotypes, privileging Western voices, and flattening the complexity of African realities into a singular narrative of helplessness. On the second level, the operational frameworks set up by the Band Aid Trust and United Support of Artists for Africa — the two entities collectively responsible for the three campaigns — often clashed with established NGO protocols, leading to logistical inefficiencies, accusations of amateurism, and institutional friction with the entrenched aid establishment. In both realms — the symbolic and the structural — the very qualities that made these campaigns groundbreaking also exposed them to deeper scrutiny, revealing enduring tensions around narrative authority and operational legitimacy.
In this inaugural installment of The Chorus Effect — a six-part series — Sound Alive explores this landmark trilogy of musical humanitarianism: how each campaign came into being, who was involved, how the projects were executed, how they were received, how the funds were disbursed, and what legacies remain. But to establish a tradition for the series, we begin not with the pop stars or the pop songs, but with the public health issue at hand.
Severe Acute Malnutrition
Severe acute malnutrition (SAM) is a life-threatening condition marked by a rapid deterioration in nutritional status, characterized by extreme wasting — dangerously low body weight for one’s height — and reflecting a significant imbalance between the body’s nutritional needs and intake [1]. As acute malnutrition worsens in an individual, normal physiological responses to reduced food intake become increasingly pronounced [2, 3]. These responses — known as “reductive adaptations” — affect every physiological function in the body by drawing on internal energy and nutrient reserves, while simultaneously suppressing energy and nutrient demands, thus helping to preserve homeostasis under extreme nutritional stress [4]. However, once nutritional deprivation passes a critical threshold, these same adaptations begin to undermine the body’s ability to respond to additional stresses such as infection, further compromising overall health [5, 6].
In children between the ages of 6 and 59 months, SAM is diagnosed on the basis of three primary criteria, any one of which is sufficient: a very low weight-for-height ratio, the presence of bilateral pitting oedema (visible swelling in both legs caused by fluid retention), or a very low mid-upper arm circumference, typically less than 11.5 centimetres, or about the circumference of a medium-sized banana [7]. Globally, SAM affects an estimated 19 million children under the age of 5 and is thought to be responsible for approximately 400,000 child deaths each year [7]. The threat is even broader when considering children who are at risk: severe malnutrition, especially in the form of wasting, is estimated to endanger the survival of 47 million children under 5 years of age across low- and middle-income countries (LMICs) [8]. Taken together, all forms of malnutrition are implicated in roughly 45% of mortality among children under 5 worldwide [9].
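For readers who find a rule clearer than prose, the diagnostic logic above can be sketched in a few lines of code. This is a minimal illustration only, not a clinical tool: the function and parameter names are our own invention, and the thresholds follow the commonly cited WHO cut-offs (weight-for-height z-score below −3, MUAC below 11.5 cm, or bilateral pitting oedema).

```python
# Illustrative sketch of the SAM diagnostic rule described above.
# NOT a clinical tool: function and parameter names are hypothetical;
# thresholds follow the commonly cited WHO cut-offs.

def meets_sam_criteria(weight_for_height_z: float,
                       muac_cm: float,
                       bilateral_pitting_oedema: bool) -> bool:
    """True if any one of the three SAM criteria is met
    (children aged 6-59 months)."""
    return (weight_for_height_z < -3.0    # extreme wasting
            or muac_cm < 11.5             # very low mid-upper arm circumference
            or bilateral_pitting_oedema)  # nutritional oedema

# A child with a MUAC of 10.8 cm meets the criteria even though the
# weight-for-height z-score alone would not trigger a diagnosis:
print(meets_sam_criteria(-2.1, 10.8, False))  # prints: True
```

The point the sketch makes is that the criteria are independent screens, not a composite score: satisfying any single one is enough, which is why community screening programmes can rely on a MUAC tape alone.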
For those children fortunate enough to survive, the long-term consequences can remain significant well into adulthood, with a growing body of evidence suggesting that early exposure to SAM leaves a persistent physiological imprint. In a cohort study of 122 adult SAM survivors, researchers found reduced beta-oxidation — the process by which the body breaks down fatty acids for energy — alongside a heightened risk of type 2 diabetes, suggesting that early malnutrition can lead to lasting metabolic disruption [10]. Cardiovascular outcomes have also been a focus: one study involving 116 adult SAM survivors reported diminished cardiac output and increased circulatory strain, a combination linked to greater vulnerability to high blood pressure, particularly when later compounded by obesity [11]. These outcomes mirror broader findings on early-life famine exposure, which has been associated with elevated risks of heart disease, insulin resistance, and hypertension in adulthood [8]. A systematic review of famine studies further confirmed consistent links between childhood deprivation and adult-onset non-communicable diseases (NCDs), with 2 out of 15 studies identifying increased risk of hyperglycemia, and 7 indicating a higher incidence of diabetes [8]. Four additional studies found higher rates of obesity among individuals exposed to famine between the ages of 0 and 9 [12–15]. Taken together, the evidence underscores the long-term cardiometabolic consequences of early SAM — especially its role in predisposing survivors to chronic conditions later in life [8].
Treating SAM has always required balancing clinical care with public health realities — a reflection of its multifactorial causes, which include poverty, social exclusion, inadequate public health infrastructure, and the loss of entitlements such as food security or access to essential services [4]. Historically, the response to SAM was rooted in a hospital-based model: centralized, clinically intensive, and resource-heavy. But this approach soon revealed several critical shortcomings. Hospitals in many low-resource settings lacked the inpatient capacity and skilled personnel required to treat the large numbers of children affected [16, 17]. Moreover, the structure of hospital-based care promoted delayed admissions and imposed high opportunity costs on families — especially mothers and caregivers who had to remain at the facilities for weeks at a time. Immunosuppressed children placed in shared wards also faced elevated risk of center-acquired infections, and high rates of mortality both before and after discharge remained persistent concerns [18–21].
In response, efforts began in the 1970s to demedicalize SAM treatment and relocate care away from hospitals and into the community — via simple nutrition rehabilitation centers, existing primary health care clinics, or even the homes of those affected [18, 22]. A subsequent breakthrough came with the development of ready-to-use therapeutic foods (RUTF). These shelf-stable, bacteria-resistant, nutrient-dense pastes allowed children to complete most of their recovery at home, reducing the average hospital stay from 30 days to just 5–10. By enabling home-based treatment after brief inpatient stabilization, RUTF dramatically improved the cost-effectiveness and scalability of SAM interventions [4]. Yet despite these innovations, many health institutions remained slow to adopt proven clinical management protocols. Throughout the 1970s, 1980s, and 1990s, case-fatality rates for hospital-based SAM treatment in many settings continued to hover at 20–30% — figures that had barely shifted since the 1950s, despite the availability of strategies capable of reducing them to just 1–5% [23]. This chronic failure to translate scientific knowledge into effective practice was publicly denounced in 1992 as “nutrition malpractice” [24]. Well into the 21st century, however, this implementation gap persisted — and in some cases, even widened [4].
Although SAM can arise from a range of contributing factors — including poverty, infection, and systemic neglect — its most acute and widespread expression is often found in the context of famine. And few modern famines have shaped global awareness of SAM more viscerally than the crisis that gripped one Soviet-aligned republic in the Horn of Africa.
The Crisis Unfolds: Ethiopia, 1984–85
Ethiopia experienced more devastating famines in the 1970s and 1980s than any other country in the world [25]. Historical records of drought and famine in the region date as far back as Egyptian accounts of low Nile floods between 253–242 B.C. and again from 1066–1072 A.D. [25]. Although multiple famines occurred in the country during the 20th century — notably in 1958–59, 1966–67, and the early 1970s — the famine of 1984–85 was the most severe in terms of mortality, with approximately one million people reported to have died [25].
The crisis began to unfold in late 1983, following a period when the country’s Relief and Rehabilitation Commission (RRC) had issued food aid appeals between 1981 and 1983 that were largely ignored, both internally and externally. Neither the Ethiopian politburo nor international donors responded — in part due to allegations that the RRC had previously exaggerated the country’s food aid needs, and the unpopularity of Mengistu Haile Mariam’s regime in the Western world [26]. Mengistu, a military officer who rose to power following the 1974 overthrow of Emperor Haile Selassie, had emerged as the dominant figure within the Derg — a Marxist-Leninist junta that ruled Ethiopia throughout the 1970s and 1980s — and went on to lead a government that launched the brutal Red Terror campaign to suppress dissent, executing tens of thousands and instilling widespread fear. Against this political backdrop, the famine was not solely the result of natural phenomena; the economic, agricultural, and development policies of the Mengistu regime, coupled with civil war, caused major shortfalls in food production and disrupted distribution systems [25]. These state-led policies — including forced collectivization, villagization, and scorched-earth military campaigns — undermined traditional coping mechanisms and exacerbated rural poverty. In this context, socialist development priorities, coercive governance, and armed conflict proved more damaging than drought, pests, or locusts [27].
Ethiopia’s inability to build a national food reserve was further compounded by its low levels of development aid — the lowest in Sub-Saharan Africa — and its allocation of scarce resources to the ongoing civil war [26, 28]. Meanwhile, political considerations influenced donor behaviour: countries like Sudan, considered friendly by the West, received significantly more food aid than Ethiopia in 1984 [25]. Donors were also reluctant to channel aid through the Ethiopian state, fearing it would be diverted to loyal militias in the country’s Tigray and Welo regions [29].
The practice of using humanitarian assistance for political ends — a longstanding tradition in Ethiopia — became especially pronounced under Mengistu’s leadership. This was due in part to the failure of Ethiopia’s principal ally, the Soviet Union, to provide sufficient food aid [25]. Ironically, the famine ultimately consolidated state control, as the regime expanded its authority over both resources and people [30].
On the ground, affected populations employed various survival strategies. These included reducing food intake, changing diets, and consuming wild or “famine foods” such as roots, leaves, grass seeds, and wild berries, which in some regions occasionally led to accidental poisoning, as well as selling off personal property and livestock and altering migration patterns, especially among pastoralist communities [25]. Many elderly or infirm individuals were left behind during these migrations [31, 32]. At one shelter alone, between October 1984 and September 1985, 7,844 deaths and only 765 births were recorded — a grim indicator of the famine’s toll [25].
In response to the escalating crisis, humanitarian organizations mobilized. Save the Children launched a supplementary feeding program in March 1984, supported by the European Economic Community, while Médecins Sans Frontières provided critical medical care [33]. Western media outlets also began highlighting the Ethiopian government’s apparent hypocrisy — notably its decision to purchase half a million bottles of Scottish whisky to mark ten years of Marxist rule, even as it allowed mass starvation to unfold [33].
The Broadcast That Changed Everything
In July 1984, almost by accident, BBC foreign correspondent Michael Buerk grasped the scale of the Ethiopian famine during a visit to a refugee center in the country’s north [34]. He returned three months later to the town of Korem with Kenyan photojournalist Mohamed Amin to produce what would become a landmark exposé: a pair of BBC news reports that shocked the world and triggered what many now recognize as the first consumer-driven global aid movement [34]. While in Korem, the two men pushed the limits of their battery-powered equipment to shoot extended news segments — each over seven minutes long — capturing haunting footage of skeletal children, emaciated families, and mass displacement [34]. Their reporting was made possible with the assistance of Oxfam and World Vision [33].
The first broadcast stunned viewers with scenes that included the death of a three-year-old girl — captured on camera — and images of starving families waiting helplessly for food shipments. The emotional power of the footage led British news producers to air the entire seven-minute segment without cuts, a rare move for television news [34]. Buerk opened the segment with a line that would become etched into the memory of British viewers: “Dawn, and as the sun breaks through the piercing chill of night on the plains outside Korem, it lights up a biblical famine, now, in the twentieth century. This place, say workers here, is the closest thing to hell on earth” [35].
Although NBC shortened the piece to two minutes for American audiences, it still made an impact. American news anchor Tom Brokaw titled the U.S. version “Faces of Death” [34]. The report was shown by 425 broadcasting stations around the world — a staggering level of syndication compared to earlier BBC famine coverage that had reached far fewer outlets [33]. Initial viewership was estimated at 470 million, though the true number can no longer be confirmed given the BBC’s decision to re-air the footage at the beginning of the global Live Aid broadcast in July 1985 [34]. The second report by Buerk and Amin — aired on October 24, 1984 — delved deeper into the civil war and its implications for famine relief [33].
Together, the two reports birthed a media event that not only informed but also galvanized public consciousness in ways that earlier reports had failed to do [33]. The Buerk-Amin collaboration would come to define the aesthetic and emotional grammar of televised humanitarianism — provoking not only donations and policy debates, but the mobilization of an entire generation of pop stars, citizens, and global donors into action [34].
Framing the Famine: Media, Messaging, and Morality
It is difficult to overstate the effect of the Buerk-Amin BBC reports on both the media and fundraising landscapes. Prior to their broadcast, public engagement with the Ethiopian famine had been limited, partly due to inadequate press and television coverage. A UK Disasters Emergency Committee appeal had raised around £9 million but was on the verge of being closed down by October 1984 due to waning attention [33]. Many aid agencies had become reliant on the 6 o’clock news for fundraising visibility [33], but earlier coverage had failed to resonate. That changed with Mohamed Amin’s unflinching footage and Michael Buerk’s stark narration. Set against the backdrop of a bountiful European harvest, the images aired globally and provoked a seismic public response [33]. Oxfam, for example, had raised a modest £51,149 for Ethiopian famine relief through September, but in just five days after the broadcast, it received £600,000 in unsolicited donations [33]. In the UK, tabloid newspapers, which had previously given the famine scant attention, dramatically expanded their coverage: the number of column inches devoted to the crisis surged from just 50 in the first three weeks of October to over 1,200 in the final ten days of the month [33].
At a time when charitable appeals on television and radio were tightly regulated, the collaboration between aid organisations and broadcasters created new pathways to reach mass audiences [33]. The messaging strategy that emerged emphasized both scale and singularity. Advertisements leaned on hyperbolic language — World Vision called it “the most devastating human crisis of our time,” while the American Red Cross described it as “the worst drought in history” [33]. Most followed the apolitical framing set by Buerk’s “biblical famine” narrative, focusing on failed rains rather than Ethiopia’s civil war or Cold War dynamics [33]. However, exceptions existed. War on Want, for instance, directly confronted the political context in a campaign showing Derg military jets — rendered as locusts — over a ravaged landscape, under the slogan: “Crops are being destroyed by another plague” [33].
As the initial shock of the famine footage began to recede, humanitarian messaging evolved to sustain donor engagement. Save the Children’s advertisements leaned into emotional provocation, suggesting that it would be far less painful for readers to reach for their check books than to witness children starving on television [33]. By the following autumn, Oxfam acknowledged a new kind of viewer fatigue, asking donors whether they were “fed up with pictures of famine on television?” — an appeal framed not just to empathy, but to the weary spectator [33]. In contrast, War on Want challenged the dominant visual narrative. Rather than recycling images of helplessness, the organisation published uplifting scenes from Eritrea in its newsletter, critiquing the proliferation of what it called “helpless and powerless” portrayals [33].
These debates over imagery were intertwined with broader questions about the role of the donor. The marketing strategies employed during the famine crisis largely positioned the donor — not the recipient — as the central figure. As political scientist David Williams put it: “Aid to Africa is about ‘us’, not ‘them’” [33]. This was reflected in the persistent celebration of donor identity: advertisements repeatedly referred to the generous and compassionate publics of Britain and the United States, thanking them for past giving and encouraging future generosity [33]. The American Red Cross described Americans as “the most generous people in the history of mankind,” while CARE’s direct mail campaign lauded “caring Americans” who “share a belief in the value of human beings, whoever and wherever they are” [33]. For many U.S.-based relief organisations, the BBC broadcast marked a turning point: it legitimised Ethiopia’s claim on American attention, reframing the famine not just as a distant tragedy, but as a shared humanitarian crisis [33].
Britain’s Pop Response: Do They Know It’s Christmas?
Among the millions of viewers who watched the second Buerk-Amin BBC report on October 24th was Bob Geldof, an Irish rock singer best known at the time as the frontman of The Boomtown Rats. Of the many haunting images captured in the broadcast — skeletal children, overwhelmed clinics, and sweeping shots of desolate feeding camps — one moment struck Geldof more than any other: the image of a young English Red Cross worker faced with an impossible choice, forced to decide which of the starving children before her might be saved with the limited supplies at her disposal [35]. It was, for Geldof, not just a portrait of tragedy, but a call to action.
In the weeks that followed, Geldof struggled to translate his sense of urgency into meaningful engagement. At an event in early November — attended by London’s cultural elite and filmed as part of the BBC’s Arena arts programme for a segment titled Ligmalion — the disconnect became painfully clear. Caught on camera recounting the horrific images he had seen on the BBC broadcast, Geldof found himself surrounded by party chatter and apparent indifference. That night, he resolved to act: he would spearhead the recording of a charity single to raise funds for famine relief. Speaking to the camera at the event, he remarked: “To die of want in a world of surplus is not only intellectually absurd, it is morally repulsive” [35].
Shortly thereafter, Geldof called Midge Ure — the Scottish frontman of the New Wave synth-pop band Ultravox — to see if he’d be keen on collaborating on the project. Ure quickly agreed, and they soon began working together. Geldof provided the initial lyrics, while Ure crafted the musical theme, laying the foundation for what would become Do They Know It’s Christmas? Geldof began contacting prominent musicians to participate in the recording. Simon Le Bon of Duran Duran and Sting were among the first to be approached and agreed to join the project, followed by other artists including George Michael of Wham!, Bono of U2, and Phil Collins. The outreach continued rapidly and, in the end, more than 40 artists signed on to the project [35].
With the song taking shape, the next step was to lock in a recording date. They settled on Sunday, November 24th, 1984 — exactly one month after Geldof had seen the second Buerk-Amin report. Securing a suitable studio was the next hurdle. Trevor Horn — the influential producer known for his work with Frankie Goes to Hollywood — was approached and agreed to join the project. Horn’s contribution would prove critical: he volunteered not only the use of his facility, SARM Studios in Notting Hill — one of London’s most advanced recording spaces at the time — but also his time and skill, taking on the mixing of the track and giving Do They Know It’s Christmas? the polished, radio-ready sound that would help propel it to success [36].
On the day of the recording session, the musicians arrived to find a banner that read “Feed The World”. With so many high-profile personalities in one space, there was little time for formal speeches. Instead, Geldof made a brief but direct address, reminding everyone why they were there: not for chart positions or egos, but to help prevent mass starvation. Ure then walked the artists through the arrangement, and the session began in earnest. The vocal sequence was largely determined in advance by Geldof and Ure, who matched artists to lyrics based on vocal texture and public resonance. In order of vocal appearance, solo and duetted lines were sung by Paul Young, Boy George, George Michael, Simon Le Bon, Sting, Bono, Marilyn, and Glenn Gregory [36].
A key creative decision is credited to Trevor Horn. After a brief conversation with Ure and Bono, Horn suggested that all the artists should gather together to sing the post-chorus refrain: “Feed the world, let them know it’s Christmastime.” It was a simple but powerful idea, turning the line into a collective chant that emphasized unity and purpose. Horn gathered the artists in the studio to explain the change, and they responded immediately, applauding, clearly energized by the chance to contribute to this unified, anthemic moment. In addition to the featured solo and duetted vocalists, the full ensemble performing the refrain featured a cross-section of pop acts, including John Keeble, Steve Norman and Gary Kemp of Spandau Ballet; Sara Dallin and Siobhan Fahey of Bananarama; James “J.T.” Taylor, Robert “Kool” Bell, and Dennis Thomas of Kool & the Gang; Andy Taylor, Nick Rhodes, Roger Taylor, and John Taylor of Duran Duran; Simon Crowe, Pete Briquette, and Johnnie Fingers of The Boomtown Rats; and Rick Parfitt and Francis Rossi of Status Quo. They were joined by Phil Collins, Adam Clayton of U2, Paul Weller of The Style Council, Martyn Ware of Heaven 17, Chris Cross of Ultravox, Jody Watley of Shalamar, Jon Moss of Culture Club, and Geldof himself [36].
After all the artists had left the studio, Ure reflected on the sheer intensity of the day. It had taken around 14 hours to record all the vocals and performances — a whirlwind effort involving dozens of high-profile musicians. Yet for Ure, the real work wasn’t over: he and the remaining production team had to stay through the night to finalize the production. The entire project had been designed to unfold — and be completed — within a single 24-hour window. That constraint wasn’t just logistical; it was symbolic, reinforcing the urgency of the cause and the exceptional nature of the collaboration [36].
The single was released in the UK on Monday, December 3, 1984, with the featured ensemble performing under the collective moniker Band Aid. It debuted at number one on the UK Singles Chart and remained there for five consecutive weeks, selling over one million copies in its first week — a UK record at the time. By the end of 1984, it had sold approximately 3.7 million copies in the UK and would go on to sell more than 11.7 million copies worldwide. It became the fastest-selling single in UK chart history at that point and held the title of the UK’s best-selling single until 1997, when it was surpassed by Elton John’s Candle in the Wind.
An American Refrain: We Are The World
On the morning of December 23rd, 1984, staff at the Los Angeles office of music manager Ken Kragen were caught off guard by the unscheduled arrival of Harry Belafonte. By then, Belafonte had long transcended the label of entertainer, widely regarded instead as a statesman of American culture — a figure whose decades of civil rights work and global advocacy endowed him with a singular moral authority. With the Band Aid single having been released just three weeks earlier, Belafonte was frustrated by the optics of the moment: “We have White folks saving Black folks. We don’t have Black folks saving Black folks.” His meeting with Kragen was to explore a different route — one that might bring attention to the famine in Ethiopia through a large-scale, live concert event. Kragen, while supportive of the impulse to act, offered a more pragmatic alternative: they could follow the Band Aid blueprint, but this time gather the biggest names in American music to create a charity single of their own — one that would reflect a broader range of voices and cultural authority from across the American musical landscape [37].
Kragen’s first call after meeting with Belafonte was to Lionel Richie, one of his most successful artist-clients at the time. Richie immediately supported the idea and suggested bringing in music producer Quincy Jones to help shape the musical direction of the project, an invitation Jones accepted. During a limousine ride soon after, Richie and Kragen attempted to get in touch with Stevie Wonder, hoping he might serve as a co-writing partner, but were unable to reach him. Richie then called Jones, who mentioned, almost offhandedly, that he would be seeing Michael Jackson the following day, and offered to run the idea by him. Jackson — at the time, the biggest-selling artist on the planet — expressed his interest to Jones and, with Stevie Wonder still not having returned Richie’s calls, Richie and Jackson officially became the songwriting duo for the project. With Kragen, Richie, Jones, and Jackson spearheading the effort, it was decided that the recording would take place on January 28th, 1985, to coincide with the night of the American Music Awards — which Richie was hosting that year — and the work of assembling a list of superstar artists began [37].
By January 18th, just ten days before the scheduled recording, most of the star-studded lineup had been secured. Jackson and Richie, however, had yet to crack the code of the song itself. Their first task was to decide what kind of anthem they wanted to create. After exploring a range of musical ideas, they eventually landed on “Rule, Britannia!” — a British patriotic song known for its grand, declarative style — as a structural reference, drawn to its steady tempo and stately pacing. The next breakthrough came when Jackson offered up the line “we are the world” — a simple, resonant phrase that instantly anchored the song’s message and gave the project its title. As the writing proceeded, a separate but equally urgent decision had to be made on January 19th: selecting a recording studio. They ultimately secured A&M Studios, but discretion was paramount. If word leaked, the inevitable swarm of paparazzi and fans could have derailed the entire operation before a single note was sung. By January 20th, the song was finally complete, and Richie handed the finished composition to Jones, who was both ecstatic and relieved that the heart of the project had come together just in time. When demo tapes were sent to the participating artists on January 23rd, they were accompanied by a formal invitation letter with one detail blacked out in marker: the name of the recording studio [37].
On the night of January 28th, as the American Music Awards came to a close, the participating artists began leaving the Shrine Auditorium to make their way over to A&M Studios on La Brea Avenue. Inside the converted film studio, where final preparations were underway, Jones — now the evening’s conductor — taped a handwritten sign to the entrance: “Check Your Ego At The Door.” It was both instruction and aspiration — a reminder that the night’s purpose went beyond fame, beyond music, and even beyond America [37].
Once the artists were assembled, Jones introduced Bob Geldof, the figure whose Band Aid project had sparked the American response. Fresh off a harrowing trip to Ethiopia, Geldof was met with warm applause — but his tone quickly grounded the room. He reminded the artists why they were there: to help millions facing starvation. He described the brutal reality he had just witnessed — camps with almost no food, outbreaks of typhoid and malaria, and rows of bodies that had simply been left behind. He emphasized that the value of a life, in that context, had been reduced to something as small and disposable as a seven-inch plastic record. Geldof’s words cut through the glamour of the evening and reframed the session as a moral obligation rather than a musical endeavour [37].
After the rehearsal — which involved running through vocal arrangements, matching singers to their lines, and fine-tuning harmonies under Jones’ direction — the session kicked off. In order of vocal appearance, solo and duetted lines were sung by Lionel Richie, Stevie Wonder, Paul Simon, Kenny Rogers, James Ingram, Tina Turner, Billy Joel, Michael Jackson, Diana Ross, Dionne Warwick, Willie Nelson, Al Jarreau, Bruce Springsteen, Kenny Loggins, Steve Perry, Daryl Hall, Huey Lewis, Cyndi Lauper, Kim Carnes, Bob Dylan, and Ray Charles. The chorus included all the other vocalists present, with supporting vocal contributions coming from Dan Aykroyd, Harry Belafonte, Lindsey Buckingham, Sheila E., Bob Geldof, Jackie Jackson, LaToya Jackson, Marlon Jackson, Randy Jackson, Tito Jackson, Waylon Jennings, Bette Midler, John Oates, Jeffrey Osborne, Smokey Robinson, and The Pointer Sisters — Anita, June, and Ruth. By 8:00 AM, the session was complete and the artists dispersed, exhausted but emotionally charged [37].
We Are the World was officially released on March 7th, 1985. On April 5th — Good Friday of that year — in a carefully coordinated global launch, the single was broadcast simultaneously on 8,000 radio stations across the globe at 10:50 AM, Eastern Time. This unprecedented rollout meant that millions of people heard the song at the exact same moment — an intentional move to reinforce the spirit of global unity behind the project. The impact was immediate: the song reached No. 1 on charts in multiple countries and went on to sell over 20 million copies, making it one of the best-selling singles of all time and, to that point, the fastest-selling single in U.S. history, dethroned only by Elton John’s Candle In The Wind in 1997. In recognition of its impact, We Are the World won four Grammy Awards in 1986, including ‘Record of the Year’ and ‘Song of the Year’.
Two Studios, Two Worlds: The Contrasts of Creation
While Do They Know It’s Christmas? and We Are the World shared a unified mission — harnessing the star power of popular music to respond to the Ethiopian famine — the tone, process, and cultural context of each recording session revealed striking contrasts.
The recording session for Do They Know It’s Christmas? carried an atmosphere of urgency, tension, and layered visibility. While the project was grounded in humanitarian intent, the energy in the room often reflected the competitive nature of the British pop landscape at the time. As both Geldof and Boy George would later acknowledge, many of the artists weren’t exactly arriving as friends. Geldof wryly observed that many of them “couldn’t stand each other” — but that, for one day, it didn’t matter. Boy George, never one to shy away from a sharper edge, put it even more bluntly: “Every band that ever slagged each other off is here today” [36].
Throughout the day, artists were often seen rehearsing their individual lines alone, mouthing lyrics under their breath, or quietly retreating into corners of the studio to go over their parts. The technical setup reflected the logistical pressure: many singers were sharing microphones, which only heightened the stakes of each line. According to accounts from the studio, engineers at one point had to alert producer Trevor Horn that some vocalists were coming through too powerfully — a clear sign that certain performers were pushing themselves forward, trying to be more prominently heard in the final mix [36].
This subtle jockeying for sonic space mirrored the broader competitive dynamic of the session — and layered on top of that was the constant presence of the press. Multiple artists gave individual interviews during the day, with journalists milling about in the studio. Their access wasn’t incidental — it was widely understood to be a quid pro quo arrangement, in exchange for the unprecedented editorial space and promotional coverage being donated by UK media outlets. The result was a recording session that often doubled as a publicity engine, with the lines between music, cause, and media spectacle increasingly blurred [36].
Despite a spirit of collaboration under pressure, Do They Know It’s Christmas? faced early and enduring criticism for its lack of racial diversity. Though Jody Watley of Shalamar and members of Kool and the Gang — all African American — were present at SARM Studios that day, they were not included in the vocal lineup, and their visible participation was minimal [36]. This absence was made more conspicuous by the subject matter: a humanitarian crisis in Black East Africa, interpreted and voiced almost entirely by white British and Irish pop artists. The backlash didn’t go unnoticed. Geldof and others defended the effort, citing logistical and scheduling hurdles, but the visual optics and symbolic imbalance were difficult to ignore — and would go on to influence the more deliberately inclusive casting of We Are the World.
By contrast, the We Are the World session at A&M Studios felt markedly more relaxed and deliberately insulated. No press was allowed inside the studio — a decision made to preserve the integrity and intimacy of the moment. Richie, Jones and Jackson had designed the space not just as a recording session, but as a shared experience — something private, communal, and creatively loose. What emerged was a night marked by camaraderie and spontaneous joy, with Stevie Wonder and Ray Charles holding court at the piano at various points, drawing others into casual singalongs. There were a few notable moments of levity during the recording session. When Quincy Jones acknowledged Harry Belafonte’s influence on the project — Belafonte himself present in the room — Al Jarreau spontaneously broke into a rendition of Belafonte’s iconic “Day-O,” prompting the rest of the artists to join in. Later, during a break, when Ray Charles mentioned needing to use the restroom, Stevie Wonder offered to escort him — prompting the opportune quip about the blind leading the blind [37].
And yet, even in this seemingly harmonious setting, cultural tensions surfaced. Most notably, during a discussion about incorporating African languages into the lyrics, Stevie Wonder suggested that a line be sung in Swahili. It was meant as a gesture of connection to East Africa, but it triggered a moment of cultural and political misalignment. Waylon Jennings, one of the country musicians invited to participate, reportedly said “a country boy don’t sing in Swahili” and walked out of the session entirely. To further complicate matters, Wonder had to be informed that Swahili was not spoken in Ethiopia, where languages like Amharic, Oromo, and Tigrinya were predominant — a moment that highlighted the gap between symbolic inclusivity and cultural specificity. Still, We Are the World was much more intentional in centering Black American musical voices — artists like Ray Charles, Dionne Warwick, Lionel Richie, Michael Jackson, and Tina Turner — alongside white peers from the rock, country, and pop worlds. It was a visually and vocally integrated lineup: a conscious effort to build something that looked more like the America it represented [37].
Taken together, the contrast between the Do They Know It’s Christmas? and We Are the World sessions is instructive. One was competitive, the other collaborative. One was visually limited in its representation, the other more inclusive but still imperfect. And both, in different ways, reflected the racial, cultural, and geopolitical blind spots of Western celebrity humanitarianism — even as they made unprecedented strides in using popular music for global good.
Live Aid: Building The Global Stage
Live Aid was pulled together in just 19 weeks — a considerable feat, given that no blueprint existed [35]. This would place the starting point in early March 1985 — the same week We Are the World was released. While Geldof had been exploring ideas for months, it’s hard to ignore the symbolic timing: both Geldof and Belafonte — who had first floated the idea of a concert — had been present at A&M Studios the night We Are the World was recorded.
The pace from that moment was relentless. In the lead-up to Live Aid, Geldof was clocking up to 14 hours a day in meetings in London, followed by another four or five on the phone to the U.S. [35]. In truth, the concert’s global scale was only possible because of the BBC. Geldof had initially struck a deal with Channel 4, but the arrangement collapsed — prompting the BBC to step in and re-anchor the event [35]. They became the transmission hub, coordinating 14 satellite feeds to over 150 countries. The setup at Television Centre in London became the heartbeat of a global media operation.
The UK arm of Live Aid was led by legendary concert promoter Harvey Goldsmith, who managed the logistical planning for Wembley. Bernard Doherty, the PR lead from Rogers and Cowan, handled press relations, while BBC presenter Mark Ellen anchored the coverage from Wembley before shifting to Regent Street, where the American feed was transmitted [35].
The U.S. was a harder sell. Promoter Bill Graham, known for his work with the Fillmore — a historic music venue in San Francisco — had initially agreed to manage the American leg. But with just three weeks to go, he stopped returning calls, leaving Geldof scrambling [35]. Fortunately, by this point, Goldsmith had developed a strong working relationship with American promoter Larry Magid, who helped smooth operations in the final stretch and became instrumental in holding the U.S. side together as the concert approached.
According to production manager Andy Zweck, much of the lineup came together through bluffing. Geldof would tell Elton John that Queen and David Bowie were in — even when they weren’t — and then call Bowie and say Elton and Queen had signed on [35]. This sleight of hand worked, but only just. Queen was hesitant — with Freddie Mercury exhausted at the time, and the band unsure of its future — until Geldof confronted him in a restaurant and told him he’d publicly let it be known that Mercury had refused [35]. The tactic worked. Bowie signed on shortly after, raising the bar for the entire event [35].
Many artists still declined. The Eurythmics’ Dave Stewart, Liza Minnelli, Yoko Ono, and Cyndi Lauper all said no [35]. Billy Joel, Waylon Jennings, and Kris Kristofferson were on early U.S. promotional materials but ultimately didn’t appear. Paul Simon and Huey Lewis & the News accepted but pulled out over disagreements with Graham [35]. Rod Stewart wasn’t touring. David Byrne was in the middle of finishing a project. AC/DC declined. Deep Purple backed out of a satellite performance. Def Leppard cancelled following drummer Rick Allen’s car accident. Culture Club was sidelined due to Boy George’s persistent issues with substance use [35].
The biggest absence was Bruce Springsteen. Geldof had spent months trying to secure him, even shifting the date from July 6 to July 13 to accommodate his schedule [35]. Springsteen ultimately declined, later saying he hadn’t grasped the scale of the event. Still, his presence was indirectly felt — the stage setup at Wembley was based on his tour rig, which he allowed the organizers to use, saving them significant logistical costs [35].
There was also a brief moment when the three remaining Beatles — Paul McCartney, George Harrison, and Ringo Starr — considered reuniting with John Lennon’s son Julian on piano for the event, but the idea quickly leaked to the press and the plan was subsequently abandoned [35].
The day before the concert, Goldsmith placed oversized clocks all over the backstage area and sent notes to each of the artists, saying: “I don’t care what time you go on. I only care what time you come off.” [35] Geldof went to bed at 2:00 AM, unsure if anyone would actually show up, as no contracts had been signed [35]. On the day, he woke at 7:00 AM, stomach in knots, running on almost no sleep [35].
In London, Saturday, 13 July 1985 was a scorcher [35]. From early morning, a steady stream of attendees made their way toward Wembley Stadium — many by tube, descending from the Jubilee Line’s Wembley Park stop. Some had camped outside the gates overnight to secure a prime position near the front. The doors opened at 10:00 AM, two hours before the show was set to begin [35]. Tickets had cost £25: not an insignificant sum in 1985, especially when job centres were advertising employment at £1 to £1.25 per hour [35].
At exactly 12:00 PM, DJ Richard Skinner’s voice rang out over the Wembley PA system: “It’s 12 noon in London, 7:00 AM in Philadelphia, and around the world it’s time for Live Aid” — his emphasis falling on the word “Aid,” a term that had not yet fully entered the global lexicon [35]. Just after the announcement, the Coldstream Guards — one of Britain’s oldest regiments — performed a brief ceremonial piece, setting the stage for what would become the largest music event in history.
Status Quo went on at 12:01 PM, followed by The Style Council at 12:19 PM, The Boomtown Rats at 12:44 PM, and Adam Ant at 1:01 PM. Ultravox went on at 1:17 PM, followed by Spandau Ballet at 1:46 PM, Elvis Costello at 2:07 PM, and Nik Kershaw at 2:22 PM. Sade went on at 2:53 PM, followed by Sting and Phil Collins at 3:18 PM, Howard Jones at 3:49 PM, and Bryan Ferry with David Gilmour at 4:08 PM. Paul Young and Alison Moyet went on at 4:40 PM, followed by U2 at 5:19 PM, Dire Straits with Sting at 6:00 PM, and Queen at 6:41 PM. David Bowie went on at 7:23 PM, followed by The Who at 7:59 PM, and Elton John with Kiki Dee and Wham! at 8:50 PM. Paul McCartney, along with David Bowie, Bob Geldof, Alison Moyet and Pete Townshend, went on at 9:51 PM. The final performance was by the Band Aid ensemble performing Do They Know It’s Christmas? at 9:57 PM.
With the London Live Aid leg having begun at 12:00 PM British Summer Time, it would be two more hours before the Philadelphia leg would kick off at 9:00 AM Eastern Daylight Time — marking the official start of the American broadcast. The choice of Philadelphia was strategic. Goldsmith had selected the city after being offered free labour and policing, and because it boasted three airports. In 1985, Philadelphia was not only the largest city in Pennsylvania but also the fifth largest in the United States [35].
Before the official start, an 18-year-old high-school graduate, Bernard Watson, took to the stage at JFK Stadium at 8:51 AM with the curtain still closed, performing alone at the lip of the stage with a guitar. Having persuaded promoter Bill Graham to let him perform in the spirit of the day, Watson played largely unnoticed, with footage of his performance not included in the broadcast.
Joan Baez kicked off the official set at 9:00 AM, followed by The Hooters at 9:10 AM, The Four Tops at 9:30 AM, Billy Ocean at 9:45 AM, and Black Sabbath with Ozzy Osbourne at 10:00 AM. Run-D.M.C. performed at 10:15 AM, followed by Rick Springfield at 10:30 AM, REO Speedwagon at 10:50 AM, Crosby, Stills & Nash at 11:15 AM, and Judas Priest at 11:30 AM.
Bryan Adams went on at 12:00 PM, followed by The Beach Boys at 12:20 PM, George Thorogood & The Destroyers with Bo Diddley and Albert Collins at 12:45 PM, Simple Minds at 1:05 PM, and The Pretenders at 1:20 PM. Santana and Pat Metheny performed at 1:40 PM, followed by Ashford & Simpson with Teddy Pendergrass at 2:00 PM, Madonna with the Thompson Twins and Nile Rodgers at 2:27 PM, Tom Petty and the Heartbreakers at 3:00 PM, and Kenny Loggins at 3:30 PM.
The Cars went on at 3:50 PM, followed by Neil Young at 4:10 PM, Power Station at 4:40 PM, Thompson Twins with Madonna at 5:00 PM, and Eric Clapton at 5:20 PM. Phil Collins performed at 5:40 PM, followed by Led Zeppelin with Collins and Tony Thompson at 6:00 PM, Crosby, Stills, Nash & Young at 6:40 PM, Duran Duran at 7:00 PM, and Patti LaBelle at 7:20 PM. Hall & Oates with Eddie Kendricks and David Ruffin went on at 7:50 PM, followed by Mick Jagger with Tina Turner at 8:15 PM, and Bob Dylan with Keith Richards and Ron Wood at 8:40 PM. The final performance was by the USA for Africa ensemble performing We Are The World at 9:00 PM.
Live Aid in Motion: Spectacle, Scale, and Symbol
Live Aid was dubbed the “global jukebox” for a reason — a single-day musical phenomenon that spanned continents and screens, bringing together some of the world’s most recognizable pop hits and voices in what has been described as “a moment where the exuberance of the Fifties, the altruism of the Sixties, the cultural ambition of the Seventies, and the corporate muscle of the Eighties collided in spectacular fashion” [35]. Beyond London and Philadelphia, smaller satellite concerts took place in Sydney, Cologne, The Hague, and Moscow, reinforcing the event’s claim to global reach [35].
The numbers themselves were staggering: 72,000 attendees at Wembley, 90,000 at JFK Stadium, and an estimated 1.9 billion viewers tuning in from 150 countries on 500 million television sets, made possible by 14 satellites coordinating across time zones [33, 35]. The contrast in production costs was just as stark: Wembley reportedly cost $250,000 to stage, while Philadelphia ran over $3.5 million, due in large part to the fact that technical and support staff in the U.S. expected to be paid [35].
The music produced moments of almost mythic proportion. Queen’s set became the benchmark for live stadium performance — punctuated by Freddie Mercury’s improvised call-and-response, and the sheer scale of audience participation [35]. In a career-defining moment, U2’s Bono leapt into the crowd mid-performance, pulling a fan to safety, in what later emerged as a genuinely life-saving act, cited by the woman herself [35]. David Bowie introduced harrowing footage from the Ethiopian famine camps — scenes shot by Mohamed Amin and narrated by Michael Buerk — adding a jolt of moral gravity, and cutting a song from his own set to make room for the film [33]. Meanwhile, Paul McCartney’s appearance toward the end of the Wembley broadcast, performing Let It Be in what was his first live appearance since John Lennon’s assassination in 1980, brought many in the crowd to tears [35]. Phil Collins famously performed in both cities, flying from London to Philadelphia via Concorde — a gesture that encapsulated both the extravagance and the commitment of the day [35].
The scale of Live Aid’s broadcast required an unprecedented level of coordination. At the BBC, 300 phone lines were staffed throughout the day to enable viewers to donate via credit card — a logistical innovation at the time. In the final hours before the show, Geldof was reportedly still on the phone with postmasters around the world, working to ensure payment systems wouldn’t be delayed by bureaucratic red tape [35]. Behind the scenes at the events, volunteers and engineers operated under intense pressure, managing stage transitions, live camera feeds, and international handovers with no real precedent for the scale they were attempting.
Corporate sponsors played a central role in underwriting Live Aid’s production costs. Pepsi, Kodak, Chevrolet, and AT&T were among the multinationals whose financial support ensured the event’s technical execution. AT&T, in particular, crafted an effective marketing campaign around the event. Instead of hiring a celebrity spokesperson, the company aired commercials featuring famine victims’ faces fading in and out as a new rendition of their classic jingle played: “Reach out, reach out and touch someone / Someone whose only hope is you” [34]. A spokesperson for AT&T later revealed that the company saw Live Aid as not only a charitable contribution but also an opportunity to test new services and gain major brand visibility — what he described as “a good marketing or advertising buy” [34, 38]. Similar dynamics were evident in the partnership between Pepsi and Lionel Richie, whose co-authorship of We Are the World and concurrent brand association became mutually reinforcing. Richie, like Michael Jackson, had become a “compassionate artist” figure, and Pepsi leveraged that image to appeal to consumers who now saw ethical consumerism and pop music as part of the same emotional economy [34].
The aesthetic and curatorial contrasts between Wembley and JFK were also apparent. In London, the show felt tightly orchestrated, with acts delivering hits in sharp succession. In Philadelphia, while the line-up was more eclectic, the stage management was reportedly chaotic, and the tone more reminiscent of a traditional rock festival. “The Americans saw Live Aid clearly as just a rock event,” Geldof later said. “It’s not a political event in their eyes, and never was. For me, Live Aid was the most political gig of them all” [35]. Singer Paul Young echoed this divide, recalling that in the UK, artists were cooperative and relaxed about their place in the running order, while in the U.S., the atmosphere was reportedly tense with disputes over timing and placement [35]. This contrast extended to the cultural memory of the event. While the UK remembered Do They Know It’s Christmas? and the London gig as politically charged interventions rooted in a BBC-mediated famine narrative, many U.S. audiences remembered We Are the World and the Philadelphia gig as part of a more Hollywood-style pageantry, where charity and celebrity blended in a less overtly political register [35].
Whose Story Was Told? Narratives, Power, and Humanitarian Optics
Beyond the record-breaking sales, landmark broadcasts, and unprecedented global reach that defined the three campaigns, each would come under increasing scrutiny for the stories they told — and those they left out. Do They Know It’s Christmas?, We Are the World, and the Live Aid concerts were not merely acts of charity; they were also powerful instruments of narrative framing, shaped by cultural assumptions, emotional shorthand, and geopolitical omission. As the spotlight widened, questions emerged around authorship, representation, and the ethics of staging solidarity on a global stage.
A particularly controversial lyric from Do They Know It’s Christmas? that has persisted in public memory is: “There won’t be snow in Africa this Christmastime.” It’s unlikely that Bob Geldof meant the line literally — as if forecasting weather patterns in Ethiopia — but rather as a symbolic device, meant to evoke emotional contrast for a British audience, for whom snow is deeply intertwined with the iconography of Christmas. In that sense, it functioned as a rhetorical bridge, translating distant suffering into imagery that felt immediate and relatable for Western listeners. Yet the effect was still reductive. As journalist David Pilling writes, the line collapses a specific humanitarian crisis in one country into a vague and monolithic portrayal of an entire continent “bigger than China, India, the US and Europe combined”. Pilling also points out the absurdity of the suggestion in the song’s title: Ethiopia, which adopted Christianity in AD 325, has one of the oldest Christian traditions in the world — its people, therefore, most certainly knew it was Christmas. But perhaps the most widely condemned line is the one delivered by Bono: “Well, tonight, thank God it’s them instead of you.” Intended as a raw emotional jolt, the lyric has often been interpreted as reinforcing a sense of Western moral superiority — an “othering” mechanism that frames “African suffering” as not just distant, but as the unfortunate fate of someone thankfully “not us”. Even Bono himself would later admit discomfort with the line, which continues to resurface in debates about pity-based humanitarian messaging and the ethical framing of global inequality. More controversially, Pilling highlights what the song does not say: that the famine in Ethiopia was not simply a result of drought, but also the deliberate weaponization of starvation by a dictator — though he concedes that “to sing that may not have aroused as much sympathy” [39]. 
Such structural and political realities were flattened or omitted in favour of a more emotionally potent — but geopolitically sanitized — storyline. And while Do They Know It’s Christmas? succeeded in raising massive funds, it also cemented an enduring Western narrative of Africa as passive, helpless, and dependent, a view many African artists and thinkers continue to challenge today. As British-Ghanaian musician Fuse ODG recently put it, the song’s framing “fuels pity rather than partnership” [39].
Compared to Do They Know It’s Christmas?, We Are the World was not subject to the same level of immediate lyrical controversy. Its lyrics were more universal, emotionally cautious, and less geographically specific, intentionally sidestepping references to Africa, Ethiopia, or the famine itself. The refrain — “We are the world, we are the children” — spoke in broad moral strokes, framing the crisis as a shared human concern rather than a distant humanitarian emergency. This approach avoided the most explicit pitfalls of othering, but it came at a different cost: vagueness. While emotionally resonant, the song left its audience without a clear understanding of who needed help, why, or what structural issues were at play. The suffering was real, but abstract — its politics and geography blurred into a globalized call for compassion. One of the more subtle critiques of We Are the World lies in the absence of specificity: there’s no mention of Ethiopia, famine, hunger, dictatorship, or aid. The lyrics have been described by some scholars as flattening global inequality into a metaphor for unity — delivering sentiment without context. As cultural theorist Cheryl Lousley has argued, such emotionally expressive humanitarianism often prioritizes the donor’s sense of moral identity over the material complexity of the crisis being addressed [33]. In this sense, We Are the World may have been less problematic than Do They Know It’s Christmas?, but it was also less informative, less courageous, and more attuned to Western emotional safety. Furthermore, critics have pointed to the song’s heavy use of moral universals — “saving our own lives,” “we’re all a part of God’s great big family” — as inadvertently implying a sameness of condition across radically unequal realities. The line “there’s a choice we’re making, we’re saving our own lives” invites a sense of shared investment, but also blurs the actual power imbalances involved in global charity. The victims are never named. 
Their voices are never heard. The givers become the moral protagonists.
Whilst Live Aid may have been hailed as an unprecedented act of global musical solidarity, not all musicians embraced the event uncritically. Roger Waters, formerly of Pink Floyd, who was not performing at Live Aid, publicly criticized the event’s approach in later interviews, arguing that charity concerts risked oversimplifying complex political problems and that governments, not musicians, should be responsible for systemic change. Frank Zappa, formerly of The Mothers of Invention, also declined to participate. He objected to what he saw as Live Aid’s failure to tackle the deeper structural causes of poverty in the developing world. Never one to shy from provocation, Zappa would later claim the concert was little more than “the biggest cocaine money-laundering scheme of all time” [35]. Bob Dylan’s comment remains one of the most cited. Speaking between songs during his set with Keith Richards and Ron Wood of The Rolling Stones, Dylan suggested: “Wouldn’t it be great if we did something for our own farmers right here in America?” His remark reframed the entire event in a different light — pivoting from the famine in Ethiopia to domestic economic hardship. Dylan’s statement angered Geldof, who later called it a “crass, stupid and nationalistic” remark, arguing that it undermined the spirit of global solidarity the event sought to promote. Yet Dylan’s instinct to localize the cause reflected a broader challenge: whether musicians could — or should — sustain attention on distant suffering without reverting to familiar, national frames of reference. To top it off, as with Band Aid, Live Aid drew criticism for its lack of racial diversity. By the time the full lineup was announced, concerns had begun to circulate that the concert — especially its UK leg — reflected a predominantly white, male industry, with few Black performers and a structure some described as neocolonial in tone [35]. Beyond race, gender representation was also strikingly limited. 
Very few women featured across both stages, and only three women of colour — Patti LaBelle, Sade and Tina Turner — took the stage.
The critiques levelled at Do They Know It’s Christmas?, We Are the World, and Live Aid were not without merit — yet they also risk missing the complicated realities in which these campaigns were conceived and executed. While the shortcomings of representation and lyrical framing are clear, so too were the structural and political challenges that organizers faced.
First, the lack of Black artists on the Live Aid stages was not due to a lack of outreach. Promoters on both sides of the Atlantic maintained that every major Black act had been approached and invited. “We were criticised endlessly for not having enough black acts on the bill, but nobody wanted to do it,” said Harvey Goldsmith. “Honestly, we tried every major black act both here in the UK and in the US and none of them were interested. It was embarrassing. Some even wanted money” [35]. U.S. promoter Bill Graham echoed the sentiment: “What I could say was that I contacted every single major black artist. I won’t name them. But they all turned down Live Aid… That doesn’t mean they didn’t care… But all the major black artists? All the biggest ones? You name them. They all turned Live Aid down” [35]. The reasons for this may have varied — from skepticism about the event’s goals, to discomfort with its optics, to valid concerns about tokenism and the politics of representation. But the refusal was widespread. In a way, it underscored the difficulty of pulling off an inclusive global event in an industry still sharply divided along racial lines — not only in terms of who was visible, but also who was trusted.
Then there was the matter of political climate. These campaigns were staged at a time when governments were retreating from global humanitarian responsibilities. In the UK, Margaret Thatcher’s administration had little interest in pop culture and even less in foreign aid. Her memoirs make no mention of Live Aid, Africa, or Bob Geldof [35]. When asked to waive the 15% VAT on Live Aid ticket and merchandise sales, her government refused — thereby profiting from the very concert intended to raise funds for famine relief [34]. In the U.S., President Ronald Reagan declined to contribute taped remarks — as did Thatcher in the UK — wary that Live Aid might carry too strong a whiff of political critique or public mobilization [35]. This was the broader context in which these initiatives emerged: a decade defined not only by privatization and austerity, but by a foreign policy increasingly disinterested in multilateral humanitarian intervention — particularly in countries like Ethiopia, which were aligned with the Soviet Union. In this light, Do They Know It’s Christmas?, We Are the World and Live Aid were not simply naïve pop spectacles; they were radical refusals to wait. They sidestepped geopolitics, asserted moral urgency, and asked global citizens — not governments — to close the gap. The effort was improvisational, flawed, and shaped by the limitations of its moment. But it was also, by any standard of global civic action, unprecedented.
Mobilizing Emotion: Donors, Guilt, and the Moral Audience
The response to the campaigns was, in many cases, as emotionally charged as the broadcasts themselves. In the days following the release of Do They Know It’s Christmas?, letters flooded into the BBC — many from children — and were read aloud on air or displayed in studio windows. Across these messages, a shared emotional register emerged: guilt and shame, especially when contrasted with the “food mountains” stockpiled across Europe while images of famine in Ethiopia looped on screen [33]. Some of the letters became emblematic of the personal reckoning that the campaign inspired. One woman from the United States wrote that watching the broadcast left “a big lump in my throat,” especially as she sat down to a large Thanksgiving dinner “knowing there are many starving.” Another letter reported that the famine footage prompted its author to quit smoking and redirect the funds: “I think of that starving child who needs the money a lot worse than I need those cigarettes” [33].
The New York Times published accounts of donations made by two young girls pledging their $5 allowances, by a Vietnamese refugee who gave in gratitude to UNICEF for helping him resettle in the U.S., and by a local baker who turned up at a relief organization’s office carrying a large box of coins — collected entirely by his children [33]. Others offered their resources in kind. One British farmer, speaking on camera, insisted that diverting surplus grain to famine zones wasn’t only the ethical thing to do — it was also in Europe’s long-term self-interest: “There is real value in making sure that people who are hungry are fed” [33]. We Are the World similarly drew millions of individual donations and ultimately raised over $63 million for humanitarian relief efforts across Sub-Saharan Africa and the United States.
Live Aid, for its part, became a masterclass in real-time global giving. On the day of the event alone, £11 million was raised in the United Kingdom, with another £36 million generated in the United States. Total pledges surpassed £50 million. Merchandise sold out rapidly: 50,000 souvenir programmes priced at £5 and 10,000 posters at £2.50 each disappeared by day’s end. Donations flooded in via the 300 telephone lines staffed by the BBC, with viewers contributing by credit card from dozens of countries across the globe [35]. The single largest gift came from the ruling family of Dubai, who pledged £1 million in response to the appeal [35].
The outpouring of support following the three campaigns can be partially understood through foundational theories in media, persuasion, and behavioural psychology. While each campaign operated in its own aesthetic and political register, they all activated well-studied mechanisms of emotional influence and public action.
Framing Theory — articulated by political communication scholar Robert Entman in 1993 — offers a useful starting point. The theory holds that how the media presents an issue — what it emphasizes and what it omits — fundamentally shapes how audiences interpret that issue [40]. A frame is not just what is said, but what is made salient: causes, consequences, moral judgments, and potential solutions. In the case of the campaigns, the prevailing media frame was one of apolitical tragedy — a “biblical famine,” stripped of context and reduced to suffering on a mass scale. This framing omitted the civil war in Ethiopia, the regime’s use of starvation as a weapon, and Cold War geopolitics. Instead, it highlighted emaciated children and overwhelmed mothers, often paired with emotionally neutral narration and soaring musical scores. By centering innocence and need — and excluding power and responsibility — the campaigns framed famine not as a political failure, but as a moral emergency. This made it easier for audiences to respond emotionally without grappling with the structural forces that produced the crisis.
A second mechanism, Guilt Appeals, helps explain why audiences responded so urgently. As theorized by communication scholar Daniel O’Keefe in 2000 [41], guilt appeals work by making individuals aware of the discrepancy between their values and their actions, especially when those values concern compassion, fairness, or global justice. Importantly, guilt appeals are most effective when they provide a direct path to redemption — a clear action the viewer can take to relieve their emotional discomfort. The campaigns leaned heavily on this dynamic. The imagery of starving Ethiopian children was repeatedly juxtaposed with emblems of Western privilege — the sentimental warmth of snow-draped Christmas festivity, the polished exuberance of pop celebrity, and the theatrical spectacle of stadium-scale charity. The discomfort this provoked could be quickly converted into action: buying a single, phoning in a donation, or simply feeling morally aligned with the cause. Guilt, in this context, was not a paralyzing emotion but a mobilizing one — deliberately calibrated to make the audience feel both implicated and empowered.
Finally, Social Cognitive Theory — introduced by psychologist Albert Bandura in 1986 — sheds light on how collective action was modelled and amplified during the campaigns. The theory posits that people learn behaviours not only through direct experience but by observing others, particularly those they admire or identify with [42]. These observational cues help individuals assess both the value of the behaviour and their own ability to carry it out. The campaigns exemplified this principle in action. Featuring dozens of the most famous musicians in the world, the campaigns didn’t just deliver a message — they demonstrated participation. To give was to join a moral community; to abstain was to fall out of step with it. The act of donating was transformed into a symbolic performance of solidarity, modelled not only by celebrities but by children, newscasters, even global leaders in certain cases. Bandura’s concept of mediated modelling — learning by seeing others act within a mass media context — was deployed at scale, turning philanthropy into a kind of participatory ritual.
Together, these theories reveal not just how people were moved to give, but how they were taught to feel. The campaigns did not simply communicate a need — they framed it in moral terms, delivered emotional catalysts, and demonstrated behavioural scripts for viewers to follow. In doing so, they helped rewire how Western publics understood distant suffering: not as something to study, but as something to solve.
Backstage Bureaucracy: How the Band Aid Trust and USA for Africa Spent the Money
Following the fundraising successes of the campaigns, the spotlight shifted from emotional appeal to operational delivery. Two formal entities were established: the Band Aid Trust, registered as a UK charity in January 1985, and United Support of Artists for Africa — commonly known as USA for Africa — incorporated as a nonprofit in the United States that same month [33]. While both grew out of celebrity activism and rejected traditional donor models, they approached the task of humanitarian deployment with distinct philosophies and structures.
The Band Aid Trust was formed around Geldof’s personal vision of a streamlined, urgent response mechanism, unencumbered by the slow-moving bureaucracies of the international aid world. Geldof referred to existing humanitarian agencies as “pickpockets” [33], and sought instead to create a lean operation that would channel 96p of the £1.35 retail price of the charity single directly into relief efforts [33]. While the Band Aid Trust was governed by a board whose founding directors prioritized media, legal, and financial expertise over traditional aid credentials [33], USA for Africa adopted a more centralized leadership model, spearheaded by executive director Marty Rogol, a public interest attorney. The nonprofit explicitly rejected what it called the “Mount Olympus” model of foundation governance, refusing to act as an aloof benefactor reviewing unsolicited grant proposals [33]. Like the Band Aid Trust, it embraced an anti-bureaucratic identity but paired it with a more professionalized infrastructure. From the start, USA for Africa emphasized collaboration with African partners and local agencies, portraying itself as a cooperative donor rather than a top-down financier [33].
The operational cultures of the two entities quickly diverged. The Band Aid Trust ran on donated office space, volunteer labour, and interest income from bank deposits, with minimal administrative expenditure [33]. Trustees were expected to make complex financial decisions with limited reporting infrastructure, which led to internal problems — most notably, the discovery of an unallocated £23.7 million surplus in September 1985 [33]. This prompted the hiring of paid field directors and the adoption of computerized accounting systems. By 1986, the Trust had formalized oversight mechanisms through an advisory board chaired by former Oxfam director Brian Walker, under the leadership of Trust director Penny Jenden [33]. In contrast, USA for Africa established staffed offices in Los Angeles and New York, quickly forming institutional relationships with major players such as the UN and InterAction [33]. Under Rogol’s direction, the organization gradually professionalized its internal processes — hiring consultants to evaluate major grants by mid-1986 [33]. Though not bloated, its administrative structure was more deliberate and cautious than the Band Aid Trust’s reactive model [33].
The delivery of aid followed a similarly bifurcated path. The Band Aid Trust’s first relief plane landed in Ethiopia in March 1985, delivering vehicles, tents, biscuits, and powdered milk, each emblazoned with the label “With Love from Band Aid” [33]. In the following 18 months, the Trust leased cargo ships to move over 100,000 tons of goods to Sub-Saharan Africa and even offered free shipping to other NGOs like Oxfam and regional church groups [33]. Expenditures in the crisis year included US$6.7 million for logistics and US$5.4 million via the Christian Relief and Development Association [33]. The Trust also funded controversial actors, such as the Eritrean Relief Association (US$10 million) and the Relief Society of Tigray (US$1.8 million) [33]. Ultimately, across six countries, it disbursed US$71.3 million in emergency relief and US$70.2 million for longer-term development, while keeping administrative costs below US$2.5 million — all covered by bank interest [33]. USA for Africa reported a total disbursement of US$19 million in immediate relief and US$24.5 million in development aid by January 1986, along with US$900,000 allocated to hunger relief programs in the United States [33]. Its overall funding formula favoured development (55%) over emergency relief (35%), with 10% set aside for domestic initiatives. The most visible of these was Hands Across America in May 1986, a coast-to-coast solidarity event involving 6.5 million people and raising US$34 million, much of which went to U.S. anti-hunger programs [33].
Criticism, however, dogged both organizations. In the UK, aid professionals often viewed the Band Aid Trust as amateurish and ad hoc; one government consultation called its operations “fairly horrific” [33]. Geldof’s media-savvy but confrontational style alienated humanitarian workers who viewed Band Aid’s process as cavalier and unaccountable [33]. Nevertheless, agencies like Oxfam and CARE acknowledged its donor power and advised field staff to cooperate accordingly [33]. Tensions further escalated over the Trust’s policy of appointing local NGOs to chair peer-review committees, which challenged the dominance of multilateral and governmental donors [33]. USA for Africa drew scrutiny for its first grant cycle, which directed 44% of funding to UN agencies and 31% to U.S.-based organizations, leaving only a quarter for international actors [33]. A Save the Children USA poll revealed that over half of Americans assumed the money would be used domestically [33]. Voluntary organizations in both countries accused the celebrity campaigns of diverting funds away from traditional causes, particularly those relating to cancer, the elderly, youth, and the arts [33], though evidence showed that international giving had increased by 163% during the 1984–85 period [33].
Accountability practices lagged behind expectations. From its inception, the Band Aid Trust resisted conventional reporting mechanisms, arguing that its impact was best captured through symbolic media artifacts like the documentary Food and Trucks and Rock ‘n’ Roll and a photo book titled This Book Saves Lives [33]. It took until 1992 — seven years after its founding — for the Trust to publish With Love from Band Aid, a grant summary it insisted was “not a set of accounts” [33]. While internal controls were reportedly stringent in rebel-held territories [33], the lack of regular reporting and financial transparency earned criticism from UN officials and scholars like Alex de Waal, who argued that Band Aid had “debased the currency of humanitarianism” by concentrating resources on high-profile, media-driven responses [33]. A particularly damaging episode came in 2009, when the BBC aired a radio documentary alleging that Band Aid funds had been diverted to Eritrean rebel forces. Geldof, speaking in Nairobi, condemned the program as baseless and accused the BBC of doing “appalling” harm. It took the broadcaster over a year to issue a formal apology [35]. Although more procedurally cautious, USA for Africa also faced questions about transparency and pace. Its gradual disbursement model frustrated some donors and overwhelmed partner agencies, and several NGOs declined its grants outright for fear of reputational entanglement [33]. The organization issued a 1986 press statement defending its approach: “the money can either be spent wisely or spent fast” [33]. Yet its absence of a permanent presence in Ethiopia was seen as a weakness, particularly when compared to Band Aid’s more embedded local structures [33]. Like its counterpart, USA for Africa failed to adopt standard fiscal-year reporting, complicating future efforts at comparison or long-term evaluation [33].
Echoes and Reverberations: The Afterlife of a Trilogy
Do They Know It’s Christmas?, We Are the World, and Live Aid functioned as large-scale global health communication campaigns, though they were rarely framed as such. Each deployed emotionally charged, high-urgency messaging strategies designed to drive rapid donor response, often privileging moral appeal over contextual nuance. The framing of famine — and by extension, severe acute malnutrition — emphasized suffering, helplessness, and scarcity, with little explanation of underlying systemic, political, or ecological causes. In communication terms, these campaigns relied on a problem-focused frame (highlighting the crisis without unpacking its structural drivers) and did not integrate solution efficacy messaging (offering concrete, evidence-based actions beyond donating) into their appeals. The credibility of the message was heavily mediated through celebrity figures, most of whom lacked direct expertise in global health but held significant cultural authority. This created a dissonance: while audiences were highly engaged behaviourally (donating), their cognitive understanding of the health issue remained shallow. Furthermore, the near-total absence of African voices — and the predominance of white Western celebrities — raises key questions around cultural representation, source legitimacy, and ethical risk in global health storytelling. From a health communication perspective, these campaigns succeeded in visibility but faltered in promoting informed, equity-oriented understanding of public health emergencies.
The framing of famine in the three campaigns aligned closely with what public health scholars describe as the emergency model — a perspective that emphasizes crisis, immediacy, and the need for rapid external intervention. This stood in stark contrast to the structural model, which centers long-term determinants such as poverty, governance, agricultural policy, and healthcare infrastructure. While the emergency model tends to foreground visible suffering and dramatic rescue, the structural model insists on historical context, political accountability, and prevention. The campaigns overwhelmingly adopted the former: the famine was presented as a sudden moral emergency, stripped of its systemic roots. This preference for the emergency model had far-reaching implications. By elevating short-term intervention over structural diagnosis, the campaigns helped popularize a template for global response that privileges visibility and speed over sustained engagement. Donating became synonymous with saving lives in the moment, while deeper issues — such as land reform, international trade policy, or local governance — were left outside the frame. As a result, the global public was conditioned to respond to famine as an episodic crisis, not a chronic outcome of political and economic systems. This not only skewed funding priorities toward reactive measures but also reinforced the notion that Western actors were best positioned to solve “African suffering” through spectacle and immediacy, rather than solidarity and structural change.
The three campaigns also influenced the global imagination in ways that extended well beyond famine relief. The suffix “-Aid” became shorthand for charitable spectacle. In the years that followed, events such as Animal Aid, Tree Aid, Hear ’n’ Aid, Sport Aid, and Fashion Aid proliferated across sectors, each borrowing from the rhetorical and performative blueprint first laid down by Band Aid and Live Aid [35]. Live Aid also marked a seismic shift in the entertainment industry: the stadium concert became the benchmark of global artistic legitimacy [35]. The campaigns also elevated the cultural status of performers themselves. Pop stars were no longer just entertainers — they were rebranded as philanthropists. A well-received appearance on a charity stage often translated into commercial success, amplifying both message and market [35]. Yet the cultural impact went deeper still: Live Aid helped inspire a new generation of philanthropic events — Comic Relief being one of the most visible examples, and the evolution of Amnesty International’s concert series another [35].
Ironically, Michael Buerk, whose October 1984 BBC report on famine in Ethiopia had sparked the entire sequence of events, never saw Live Aid unfold. At the time, he was in South Africa — where the concert was not broadcast — covering the protests and civil unrest in the townships that dominated the news cycle. He later recalled that on the day of Live Aid, he was being tear-gassed by police [35].
Despite their diminished visibility today, the two institutions forged in the wake of these campaigns still exist. The Band Aid Trust continues to operate, quietly spending approximately £1 million a year on development projects across Sub-Saharan Africa. The Trust does not advertise, nor does it lobby; while widely assumed to be dormant, it maintains a presence in African humanitarian work [35]. Do They Know It’s Christmas? continues to resurface every few years in new iterations — re-recorded in 1989, 2004, 2014 and 2024 — each time blending a new generation of popular artists with the legacy of the original. These periodic revivals, often timed to coincide with fresh humanitarian appeals, aim to channel nostalgia and celebrity into renewed fundraising energy. But the song’s symbolism has grown more contentious over time. Singer-songwriter Ed Sheeran, who contributed vocals to the 2014 version, requested his part be removed from the 2024 release, expressing discomfort with the narrative the song perpetuates [39].
USA for Africa also remains active. Now focused on grant-making, educational partnerships, and domestic hunger relief, the organization funds programs that align with its founding ethos of artist-led, socially grounded philanthropy. It continues to operate as a 501(c)(3) nonprofit corporation, with occasional collaborations in music-based education, awareness campaigns, and fundraising for emergency response.
Ultimately, the most observable impact of these campaigns was on the trajectory of Bob Geldof himself. Before Band Aid, Geldof’s career had largely plateaued; after Live Aid, he became a globally recognized humanitarian figure and power broker. He leveraged his newfound platform to launch media ventures — including the DVD series Geldof in Africa, produced by his company Ten Alps — and cultivated relationships with multinational corporations and global NGOs [34]. His influence expanded well beyond music and activism, into the realm of public diplomacy and corporate philanthropy. For his role in galvanizing one of the most ambitious humanitarian movements of the 20th century, Geldof was awarded an honorary knighthood (KBE) by Queen Elizabeth II in 1986, in formal recognition of his efforts with Band Aid and Live Aid. The honour not only marked the peak of a global media phenomenon — it consecrated the rise of the celebrity humanitarian.
About the Author
Kevin Samuel is an early-career researcher exploring how sound, music, and mediated performance shape public narratives around health, identity, and collective wellbeing. The Chorus Effect is his first project within this domain.
Contact: kevin.samuel@soundalive.org
References
1. World Health Organization. (2013). Guideline: updates on the management of severe acute malnutrition in infants and children.
2. Waterlow, J. C. (1986). Metabolic adaptation to low intakes of energy and protein.
3. Keys, A., Brožek, J., Henschel, A., Mickelsen, O., & Taylor, H. L. (1950). The biology of human starvation (2 vols).
4. Collins, S., Dent, N., Binns, P., Bahwere, P., Sadler, K., & Hallam, A. (2006). Management of severe acute malnutrition in children. The Lancet, 368(9551), 1992–2000.
5. Golden, M. (1988). The effects of malnutrition in the metabolism of children. Transactions of The Royal Society of Tropical Medicine and Hygiene, 82(1).
6. Reid, M., Badaloo, A., Forrester, T., Heird, W. C., & Jahoor, F. (2002). Response of splanchnic and whole-body leucine kinetics to treatment of children with edematous protein-energy malnutrition accompanied by infection. The American Journal of Clinical Nutrition, 76(3), 633–640.
7. World Health Organization. (2023, August 9). Identification of severe acute malnutrition in children 6–59 months of age. https://www.who.int/tools/elena/interventions/sam-identification
8. Grey, K., Gonzales, G. B., Abera, M., Lelijveld, N., Thompson, D., Berhane, M., ... & Kerac, M. (2021). Severe malnutrition or famine exposure in childhood and cardiometabolic non-communicable disease later in life: A systematic review. BMJ Global Health, 6(3), e003161.
9. Black, R. E., Victora, C. G., Walker, S. P., Bhutta, Z. A., Christian, P., De Onis, M., ... & Uauy, R. (2013). Maternal and child undernutrition and overweight in low-income and middle-income countries. The Lancet, 382(9890), 427–451.
10. Thompson, D. S., Bourdon, C., Massara, P., Boyne, M. S., Forrester, T. E., Gonzales, G. B., & Bandsma, R. H. (2020). Childhood severe acute malnutrition is associated with metabolic changes in adulthood. JCI Insight, 5(24), e141316.
11. Tennant, I. A., Barnett, A. T., Thompson, D. S., Kips, J., Boyne, M. S., Chung, E. E., ... & Forrester, T. E. (2014). Impaired cardiovascular structure and function in adult survivors of severe acute malnutrition. Hypertension, 64(3), 664–671.
12. Liu, L., Pang, Z. C., Sun, J. P., Xue, B., Wang, S. J., Ning, F., & Qiao, Q. (2017). Exposure to famine in early life and the risk of obesity in adulthood in Qingdao: Evidence from the 1959–1961 Chinese famine. Nutrition, Metabolism and Cardiovascular Diseases, 27(2), 154–160.
13. Woo, J., Leung, J. C. S., & Wong, S. Y. S. (2010). Impact of childhood experience of famine on late life health. The Journal of Nutrition, Health & Aging, 14, 91–95.
14. Huang, C., Li, Z., Wang, M., & Martorell, R. (2010). Early life exposure to the 1959–1961 Chinese famine has long-term health consequences. The Journal of Nutrition, 140(10), 1874–1878.
15. Wang, Y., Wang, X., Kong, Y., Zhang, J. H., & Zeng, Q. (2010). The Great Chinese Famine leads to shorter and overweight females in Chongqing Chinese population after 50 years. Obesity, 18(3), 588–592.
16. Gueri, M., Andrews, N., Fox, K., Jutsum, P., & St Hill, D. (1985). A supplementary feeding programme for the management of severe and moderate malnutrition outside hospital.
17. Brewster, D. (2004). Improving quality of care for severe malnutrition. The Lancet, 363(9426), 2088–2089.
18. Cook, R. (1971). Is hospital the place for the treatment of malnourished children? Journal of Tropical Pediatrics, 17(1), 15–25.
19. Cook, R. (1968). The financial cost of malnutrition in the “Commonwealth Caribbean”. Journal of Tropical Pediatrics, 14(2), 60–65.
20. Roosmalen-Wiebenga, M. V., Kusin, J. A., & With, C. D. (1986). Nutrition rehabilitation in hospital—a waste of time and money? Evaluation of nutrition rehabilitation in a rural district hospital in southwest Tanzania. I. Short-term results.
21. Reneman, L., & Derwig, J. (1997). Long-term prospects of malnourished children after rehabilitation at the Nutrition Rehabilitation Centre of St Mary's Hospital, Mumias, Kenya. Journal of Tropical Pediatrics, 43(5), 293–296.
22. Bengoa, J. M. (1967). Nutrition rehabilitation centres.
23. Schofield, C., & Ashworth, A. (1996). Why have mortality rates for severe malnutrition remained so high? Bulletin of the World Health Organization, 74(2), 223.
24. Berg, A. (1993). Sliding toward nutrition malpractice: Time to reconsider and redeploy. The American Journal of Clinical Nutrition, 57(1), 3–7.
25. Kloos, H., & Lindtjørn, B. (2019). Famine and malnutrition. In The Ecology of Health and Disease in Ethiopia (pp. 103–120). Routledge.
26. Borton, J., & Clay, E. (1986). The African food crisis of 1982–1986. Disasters, 10(4), 258–272.
27. Koehn, P. (1979). Ethiopia: Famine, food production, and changes in the legal order. African Studies Review, 22(1), 51–71.
28. Varnis, S. (1990). Reluctant Aid or Aiding the Reluctant?: US Food Aid Policy and Ethiopian Famine Relief. Transaction Publishers.
29. Rubenson, S. (1991). Conflict and environmental stress in Ethiopian history: Looking for correlations. Journal of Ethiopian Studies, 24, 71–96.
30. Cutler, P. (1991). The political economy of famine in Ethiopia and Sudan. Ambio, 176–178.
31. Rahmato, D. (1991). Famine and Survival Strategies: A Case Study from Northeast Ethiopia. Nordic Africa Institute.
32. Adhana, A. H. (1988). Peasant responses to famine in Ethiopia, 1975–1985. Journal of Ethiopian Studies, 21, 1–56.
33. Götz, N., Brewis, G., & Werther, S. (2020). Humanitarianism in the Modern World: The Moral Economy of Famine Relief. Cambridge University Press.
34. Davis, H. L. (2010). Feeding the world a line?: Celebrity activism and ethical consumer practices from Live Aid to Product Red. Nordic Journal of English Studies, 9(S3), 89–118.
35. Jones, D. (2014). The Eighties: One Day, One Decade. Random House.
36. Live Aid. (2024, November 29). Band Aid – The Making of The Original ‘Do They Know It's Christmas?’ [Video]. YouTube.
37. Nguyen, B. (Director). (2024). The Greatest Night in Pop [Film]. Netflix.
38. Jones, A. (2017). Band Aid revisited: Humanitarianism, consumption and philanthropy in the 1980s. Contemporary British History, 31(2), 189–209.
39. Pilling, D. (2024, November 23). Do they know it’s Africa, at all? Financial Times. https://www.ft.com/content/8292727b-48a9-489a-8b69-443271ba3957
40. Entman, R. M. (1993). Framing: Toward clarification of a fractured paradigm. Journal of Communication, 43(4), 51–58.
41. O’Keefe, D. J. (2000). Guilt and social influence. Annals of the International Communication Association, 23(1), 67–101.
42. Bandura, A. (2014). Social-cognitive theory. In An Introduction to Theories of Personality (pp. 341–360). Psychology Press.