Would We Have Already Had a COVID-19 Vaccine Under Socialism?

Debunking the myth that capitalism drives innovation.

April 20, 2020

“Socialism is not a viable solution,” JPMorgan Chase CEO Jamie Dimon declared in his 2018 letter to shareholders. “There is no question that capitalism has been the most successful economic system the world has ever seen,” he went on. “It has helped lift billions of people out of poverty, and it has helped enhance the wealth, health and education of people around the world. Capitalism enables competition, innovation and choice.”

Fast forward two years: A global pandemic has plunged the world into crisis and the American markets into chaos. As of April 17, COVID-19 cases had ballooned to more than 2 million worldwide, with 139,378 reported deaths and counting. The United States accounts for 690,714 of these cases, though even that count is likely conservative. Comparisons to the Spanish flu and the Great Depression fail to capture the moment. Social distancing for up to 18 months, by some estimates. Shortages of tests and ventilators. Empty store shelves. Toilet paper a rare commodity. The economy grinding to a halt as countries close their borders and businesses shut their doors. “Even with moderate fiscal stimulus, we’re likely to see 3 million jobs lost by summertime,” wrote Josh Bivens, research director at the Economic Policy Institute, in a post published on March 17. One month later, the Washington Post reported that more than 22 million Americans had filed for unemployment aid. Absent a dramatic shift, the coronavirus pandemic will be remembered as one of the darkest chapters in American history.

As the crisis deepens, the country demands to know: How soon might “competition, innovation and choice” deliver a vaccine?

The answer looks grim. Anthony Fauci, who leads the National Institute of Allergy and Infectious Diseases, believes a vaccine is unlikely to arrive within a year. Others in the field suggest even 18 months is optimistic, and that’s assuming the vaccine can be quickly mass-produced and nothing goes wrong. As Bill Ackman, the billionaire founder of investment firm Pershing Square, warned on CNBC, “Capitalism does not work in an 18-month shutdown.”

So how did the United States, dubbed “the greatest engine of innovation that has ever existed” by New York Times pundit Thomas L. Friedman, end up so sorely unprepared?

Perhaps one clue lies in Texas, where a potentially effective vaccine has been stalled since 2016. Dr. Peter Jay Hotez and his team at Texas Children’s Hospital Center for Vaccine Development created a potential vaccine for one deadly strain of coronavirus four years ago—which Hotez believes could be effective against the strain we face now—but development halted after the team struggled to secure funding for human trials. Even the looming crisis did not guarantee additional money. Commenting on the effort to resume development, Hotez told NBC News on March 5, “We’ve had some conversations with big pharma companies in recent weeks about our vaccine, and literally one said, ‘Well, we’re holding back to see if this thing comes back year after year.’ ” Under this logic, vaccines for recurring seasonal illnesses, like the flu, are the more attractive investment. Unlike rarer or less-understood diseases, they promise a client base that can be mined again and again.

The evidence for Dimon and Friedman’s grand claims about capitalism is presumed obvious. Had the West never abandoned feudal and mercantilist systems for capitalism, there would never have been an industrial revolution, nor the technological progress we enjoy today: powerful pocket-sized computers, self-parking cars, robots that vacuum and mop, drugs that reverse overdoses and fend off HIV, maps that predict hurricane routes. The staunchest defenders of the status quo would tell you that a socioeconomic system any less individualistic could never have produced any of it. Without the pressure of competition and promise of riches, they say, no one in their right mind would invest time in useful discoveries. But what if this perspective fundamentally misunderstands the elements that drive innovation? What if innovation actually happens in spite of capitalism?

Innovation is not synonymous with mere “invention.” Rather, innovation describes a process—the stages of developing an existing discovery, moving it into production and disseminating it to a wider audience. Three ingredients seem necessary for innovation to flourish: ample resources (like education and equipment), free and creative minds, and the free sharing of information to expand the universe of people able to build on discoveries. Together, these ingredients make a powerful recipe for maximizing innovative output. Yet, American capitalism obstructs each of them. Taking a closer look at what actually drives innovation helps explain how we got to such a bleak place, and the transformation we need to push our research institutions ahead of societal needs.

Is Private Better Than Public?

President Ronald Reagan once said the best minds are not in government, and if any were, business would steal them away. (He also famously quipped, “The most terrifying words in the English language are: ‘I’m from the government and I’m here to help.’ ”) The idea that private firms are better than the government at solving problems persists despite our long history of innovation through public institutions—and despite the dire consequences caused by this mistaken fetishization of private innovation.

A significant amount of “basic research” for new drugs comes from government labs, university departments and nonprofit organizations—basic research being a technical term for research done without a practical end in mind (beyond a greater understanding of the unknown). Basic research is time-consuming, but it is essential for innovation. A 2011 study found that, in the 40 years prior, at least 153 Food and Drug Administration (FDA)-approved drugs were discovered with support from public research, while a 2018 study found that “[National Institutes of Health (NIH)] funding was associated directly or indirectly with every drug approved from 2010–2016,” including through basic research. Noting the robust budget for basic research at the NIH, the researchers concluded, “Any reduction in this funding that slows the pace of this research could significantly delay the emergence of new drugs in the future.”

Unfortunately, America’s rush toward privatization has come at the cost of public investment. The Information Technology & Innovation Foundation reports that America’s state and federal spending on university research has slipped dramatically since 2011. Between 2011 and 2018, U.S. spending on R&D fell 11%, from $165.6 billion to $147.3 billion. In fact, every proposed Trump administration budget has requested deep cuts to public research institutions. President Donald Trump’s budget proposal for fiscal year 2018, for instance, asked Congress for a $1.2 billion cut to the Centers for Disease Control and Prevention (CDC), including a $136 million blow to funding for public health preparedness and response. That same year, the CDC curtailed its work against global disease outbreaks by 80%.

The public health crisis now hammering America reveals how vital it is to invest in research before it is desperately needed. But that’s not how market logic works. Instead of directing attention to what will protect the largest number of people from the greatest harm, capitalism steers innovation toward the largest profit in the shortest amount of time.

“Many vaccines used to be produced in the public sector; now the majority are produced in the private sector,” Dana Brown, director of the Next System Project at The Democracy Collaborative, explains. “Big Pharma is not well set up to bring a vaccine into the market: Vaccine development and production is a long, risky process requiring patient capital and sustained interest. Big Pharma focuses on short-term gains and maximizing shareholder value—there’s little, if any, gain for shareholders when companies invest in vaccine development.”

What’s more, only four major vaccine producers exist worldwide (Pfizer, Merck, GlaxoSmithKline [GSK] and Sanofi), down from 26 in the United States alone in 1967. “Limiting access to vaccines through high prices and monopolies is terrible public health policy,” Brown says. “A number of companies reported losing money on Ebola or SARS vaccine programs, which might make them hesitant to invest again—and their track record shows it. In recent years, GSK made commitments to Ebola vaccine development and later pulled out. Sanofi did the same with Zika, and Novartis, a pharmaceutical company in Switzerland, dumped its whole vaccine development unit in 2014.”

The Ebola outbreak from 2014 through 2016 is a particularly pertinent example of what can go wrong when private companies become responsible for vaccine development. Much of the early research and development for an Ebola vaccine was conducted by Canada’s National Microbiology Laboratory, which then licensed the vaccine to a small U.S.-based biotech company for the final stages of development. That company then sublicensed the vaccine to Merck for $50 million. A 2020 report published in the Journal of Law and the Biosciences found that Merck then “failed to make any progress toward a phase 1 clinical trial until after the [World Health Organization] Public Health Emergency of International Concern freed substantial donor and public funds for the vaccine’s further development.” The report goes on: “It was unclear what Merck did during this period other than provide permission to use the Canadian procured and financed rVSV-ZEBOV clinical grade vaccine. What the record does establish is that it was the public sector, not Merck, that provided all of the financing, including for clinical trials, during the West African epidemic.”

Had public labs not relied on the private sector, the authors suggest, the vaccine’s five-year timeline to reach U.S. and European markets might have been shortened.

While capitalists insist innovation can only be driven by a profit motive, the advances made through public research tell a different story. Government-funded research projects are not lucrative for public institutions. Although federal law has allowed the public sector to license its research since 1980, this arrangement has proven far more profitable for the licensees than for the public institutions themselves. Private pharmaceutical companies routinely build on publicly funded research to develop new drugs and hold on to the windfalls through exclusive rights. Meanwhile, the institutions that performed the basic research see little financial reward.

The development of antidepressants, for example, is directly traceable to the NIH and the research of its Nobel Prize-winning biochemist Julius Axelrod into neurotransmitter hormones. Pharmaceutical company Eli Lilly relied on that work to develop Prozac, a drug that earned the company $2.6 billion a year until its patent expired in 2001. Similarly, the work of researchers Thomas Folks, then at the CDC, and Robert Grant at the University of California, San Francisco—supported by millions of federal dollars—laid the groundwork for HIV-preventative drugs. The pharmaceutical company Gilead Sciences, which used this public research to market its drug Truvada for pre-exposure prophylaxis (PrEP), now charges $1,600 to $2,000 for a month’s supply. In 2018, Gilead’s reported revenue from Truvada alone was $3 billion.

In March, The Intercept reported that the FDA granted a special designation for Gilead’s antiviral drug remdesivir (“one of dozens being tested as a possible treatment for COVID-19”) that would allow the company “to profit exclusively for seven years from the product.” The article adds: “Experts warn that the designation, reserved for treating ‘rare diseases,’ could block supplies of the antiviral medication from generic drug manufacturers and provide a lucrative windfall for Gilead.” The designation was particularly baffling in light of the fact that, in November 2019, the U.S. Department of Health and Human Services (HHS) had sued Gilead for deliberately infringing on its patents, conduct that HHS argues has allowed Gilead to profit “from research funded by hundreds of millions of taxpayer dollars.” After public backlash, Gilead asked the FDA to rescind the designation. However, The Intercept noted in its follow-up reporting, “Public health experts remain concerned about the potential for Gilead and other pharmaceutical companies to engage in price gouging during the global pandemic.”

“Government really has all the power here—it just has to use it in the public interest,” Brown says. “For instance, the United States has routinely nationalized companies or whole industries in times of crisis and has authorized government patent use on pharmaceutical products to assure an affordable supply. It has the power to do so again.” At the turn of the 20th century, Brown continues, “the New York City Health Department played a pivotal role in developing testing for and treatment of diphtheria, which had reached epidemic levels and caused thousands of deaths in the city. They offered free antitoxins to the poor. The city health department also made a key discovery related to the control of cholera and offered free laboratory analyses to assure patients were able to be diagnosed and treated.”

Finders Keepers

Capitalist America imagines a world in which free enterprise and free markets promote a race to the top, with creators pulling the levers of innovation as they climb, but this isn’t quite how things shake out in practice. Though it may sound counterintuitive, American capitalism deliberately restricts the movement of information and new knowledge. The phenomenon is perhaps best understood in the context of noncompete agreements and patents, both of which prevent information from circulating to maximize the profit of individual companies.

Around 20% of the American workforce—and roughly half of all engineers—is bound by noncompete agreements as a condition of employment. These contracts restrict employees from switching jobs, preventing workers from using prior experience to make meaningful contributions at a new firm. A 2017 study found that as noncompete agreements become more enforceable, the formation of new firms declines. Champions of these agreements claim they encourage discovery and increase human capital investment by incentivizing workers to stay put, but mounting evidence suggests otherwise. Highly skilled workers, for example, actually tend to leave regions where noncompete agreements are enforceable, migrating instead to places like Silicon Valley, where noncompete agreements have been banned since 1872. Today, the region is widely recognized as a world hub of innovation.

Private ownership of innovation was built into the American economy from the start: The founders included a patent clause in the Constitution, giving Congress the power to grant inventors exclusive rights to their discoveries, which the founders surely assumed would promote the progress of science. Patent protection allows the patent holder to charge high fees for the direct use or licensing of their discovery—and to sue anyone who doesn’t buy this permission.

For the dark side of patent rights, consider Italy, which had suffered the world’s highest COVID-19 death toll as of the end of March. As Italian hospitals began to run out of a particular valve needed for equipment used to treat COVID-19, Cristian Fracassi and Alessandro Romaioli reached out to Intersurgical, the patent holder and manufacturer of the valve, hoping to 3D-print them (at a cost of about $1 per piece). Intersurgical refused to share the design, however, citing “manufacturing regulations”; one volunteer claims they were told the design was “company property.” The two volunteers devised the design anyway and printed the valves. (On a related note, the German government was outraged to learn that Trump had offered the German pharmaceutical company CureVac $1 billion to develop a coronavirus vaccine exclusively for the United States.)

In fields like tech, where improvements can happen very fast, patent protections can hinder development. While the patent holder can certainly improve on their own invention, they may choose not to do so for any number of reasons: lack of skill or interest, say, or mediocre profit potential. Meanwhile, equally qualified competitors may be discouraged from trying to innovate at all, knowing they would need patent permissions, which take considerable time and money to obtain. On any modern smartphone, for example, the incorporation of now-standard features (such as Wi-Fi capability, touchscreen technology, video recording, digital photography and data transfers) involves thousands of patents. Research organization Engine notes that “Bluetooth 3.0 [is] a technology incorporating the contributions of more than 30,000 patent holders, including 200 universities.” Conglomerates like Google have large legal teams and millions of dollars devoted to patent purchases and litigation. (The company bought the phone manufacturer Motorola for $12.5 billion outright in 2011, reportedly to acquire Motorola’s trove of patents.) Because most people do not have this kind of economic power, the current patent system shrinks the overall pool of innovators, slowing down progress.

The cost of delayed innovation might be benign—a slower browser, a glitchy phone, choppier video—but it might also be measured in lives. Tom Frieden, former CDC director, has estimated that, in the worst-case scenario, more than 1.5 million people could die of COVID-19 on U.S. soil alone.

Stifling Workers, Stifling Creativity

Many of the most sophisticated innovations of our time, from groundbreaking drugs to smart car technology, have depended on a deep pool of creative labor. But the idea that capitalism allows the best-suited workers to join that pool is wishful thinking. As journalist Chris Hayes writes in Twilight of the Elites: America After Meritocracy, meritocracy “can only truly come to flower in a society that starts out with a relatively high degree of equality.” From 1979 to 2015, the annual average household income of the top 1% grew five times faster than that of the bottom 90%. The reality is that deep inequalities in how this country’s wealth is distributed make meritocracy all but a myth. Some people can afford to attend college and access spaces where discovery is encouraged, moving into a “creative pipeline,” while their poorer peers go right into the workforce or juggle demanding classes with work schedules. While some with great innate talent for innovation end up in these coveted creative jobs, many more—poor and working-class—are pushed by financial necessity into positions mismatched to their potential.

In theory, one doesn’t need a creative-focused job to innovate. But creativity requires a certain freedom—an ability to “waste” time, to work nonlinearly, to experiment and repeatedly fail. Capitalism’s constant dictate to maximize productivity leaves people with little time to spare, at work or at home—especially in poor and working-class households: The bottom fifth of earners have seen their work hours increase by 24.3% since 1979, compared to 3.6% for the top fifth.

Being in a more precarious financial position, or in a job with little security, also discourages workers from taking risks, even when the risks might lead to innovation. The precarity makes it difficult to approach one’s supervisors and ask for sick days, let alone personal time to go down rabbit holes. It makes it frightening to change fields or spend money on any project that might result in even more precarity.

Notably, the corporate structure itself has been known to stifle creation. Many corporate firms are under the effective control of shareholders, to whom managers owe a fiduciary duty to maximize profits. Shareholders who believe this duty has been breached typically have the right to sue the corporation. While this power can be used for the greater good—note how Tesla was sued by shareholders in response to its poor safety record—it also opens the door to shortsighted shareholders. One DuPont shareholder, for example, demanded the chemical company “not invest a single dollar in research that will not generate a positive return within five years.” What’s more, according to a 2017 working paper by the Institute for New Economic Thinking, “Many of America’s largest corporations, Pfizer and Merck among them, routinely distribute more than 100% of profits to shareholders, generating the extra cash by reducing reserves, selling off assets, taking on debt or laying off employees.”

Even the most creative of workers who make it into innovative roles in the private sector may find themselves starved of resources. As professors Chen Lin and Sibo Liu of the University of Hong Kong, and Gustavo Manso of the University of California, Berkeley, explain in a 2018 study, the threat of shareholder litigation generally discourages managers from “experimenting [with] new ideas,” which acts as an “uncontrolled tax on innovation.” 

Sharing Is Caring—And Absolutely Necessary

CityLab reports that, in response to the pandemic and the scarcity of official information, “a group of coders, analysts, scientists, journalists and others are working to follow coronavirus testing across the country through an open-sourced database called the COVID Tracking Project.”
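The openness is concrete: anyone with an internet connection can pull the project’s testing numbers and build on them. Below is a minimal sketch in Python of what that might look like; the endpoint URL and the field names (date, positive, totalTestResults) are assumptions based on the project’s public JSON feed as it existed in spring 2020, and may have changed since.

    # A minimal sketch of reading the COVID Tracking Project's open data.
    # NOTE: The endpoint and field names (date, positive, totalTestResults)
    # are assumptions about the project's public JSON feed and may differ.
    import json
    import urllib.request

    URL = "https://covidtracking.com/api/v1/us/daily.json"  # assumed endpoint

    with urllib.request.urlopen(URL) as response:
        days = json.load(response)  # a list of daily records, newest first

    # Print the three most recent days of national testing figures.
    for day in days[:3]:
        print(day.get("date"), "positive:", day.get("positive"),
              "total tests:", day.get("totalTestResults"))

Nothing about the sketch is proprietary: no license key, no paywall, no permission sought. That is the point of an open database.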

Open-source communities have existed since the 1980s and have contributed to a range of innovations, from the creation of the internet to cheaper prosthetics and better disaster management systems. Typically, these online open-source collectives are made up of unpaid volunteers who contribute code and features meant to be used freely. A 2006 study from the University of Illinois set out to understand why coders would donate hours of their personal time to open-source projects. Researchers found that some enjoyed the freedom and creativity of managing their own work and disliked the hierarchical communities that claimed exclusive control over projects. Other coders said they had withheld their labor from more privatized projects because seeing their contributions redirected to private hands sapped both creativity and motivation.

The European Organization for Nuclear Research in Switzerland, known as CERN, has long adhered to an “open-source” philosophy, the belief that “the recipients of technology should have access to all its building blocks … to study it, modify it and redistribute it to others.” This communal ethic has helped support incredibly innovative projects. CERN operates a massive particle physics laboratory; its Large Hadron Collider (LHC) particle accelerator runs through a 17-mile ring underground and relies on collaboration networks developed through the open-source cloud system OpenStack. The LHC discovered the Higgs boson particle in 2012, earning a Nobel Prize for physicists Peter Higgs and François Englert. Two years after the Higgs boson discovery, CERN made data from its LHC experiments free to the public through an open data portal, which CERN then-Director General Rolf-Dieter Heuer hoped would “support and inspire the global research community, including students and citizen scientists.”

Some of the most innovative companies in America are now open-sourcing certain projects. In 2014, for instance, on the heels of four years of lawsuits over a number of smartphone patents, Apple and Google announced they would settle out of court and “work together” on patent reform. The next year, Microsoft and Google co-founded the open-source Alliance for Open Media. A number of other companies came on board, including Apple, Amazon, Intel, Cisco, Facebook, Mozilla and Netflix. The alliance’s goal was to improve video compression technology, which had been held hostage by two patent holders charging millions in licensing fees. Netflix reported the new, open-source, royalty-free video technology improved its compression efficiency by 20%.

Rather than causing the imagination to stagnate, the Alliance for Open Media’s adoption of socialist principles benefited all involved—the larger media industry and its consumers alike. Of course, as Google demonstrated when it recently fired five employees involved in union organizing, the corporations involved are not challenging the economic status quo—they benefit from the precarity of work and go to great lengths to protect their bottom line. It is also highly likely that, aside from open-source advocates like Mozilla, most members of the Alliance for Open Media joined because it was economically advantageous. Intel’s director of strategy and planning, for example, explained his company believed the open-source project would “lower delivery costs across consumer and business devices as well as the cloud’s video delivery infrastructure.”

Without the likely economic benefit, it is difficult to say whether the Alliance for Open Media would have ever formed—and it is precisely because of this uncertainty that a more dramatic shift to a socialist economy is necessary to maximize new innovation. Those in power stand to benefit from sowing fear around socialism, but the rest of us would be better off in a society reorganized around democracy, equality, solidarity, autonomy and collective ownership.

That shift would mean more publicly funded medical research and cheaper drugs reaching those in need faster. It would mean more fearless development, unimpeded by expensive shareholder lawsuits and patent disputes. It would mean life-saving drugs like Truvada being available to all, rather than only those who can afford them, and widespread economic security through the kind of safety net that empowers anyone to explore their talents, rather than just the children of the well-off. It would mean labor rights that allow workers to shape their working conditions and turn workplaces into places where ideas can thrive. It would mean an expansion of the open-source philosophy to promote free knowledge and perspectives, with collective ownership of the discoveries fueled by collective investment. It would mean an unwavering commitment to the funding of public institutions and projects, without a constant eye toward financial return.

And perhaps paramount in our minds in this moment: It would mean developing a vaccine against a disease as dangerous as COVID-19—before it could become a global pandemic.

Vanessa A. Bee is a lawyer, writer and associate editor at Current Affairs.

