In the late 1990s, business managers and academic researchers tried to tackle what they saw as an urgent and growing problem: When knowledge workers such as industrial physicists walked out the door in the evening, they inevitably took valuable intellectual property with them. Managers did not fear the theft of patent documents. They feared losing a collection of intangible skills, a deep knowledge of the company’s processes, relationships with other technical workers, and the general know-how that makes an experienced employee more valuable than someone fresh out of college. In other words, businesses were worried that they did not fully own scientists’ minds.
Over the course of centuries, a struggle has been playing out about who gets to own ideas. Is it the person who comes up with them? The employer who funds the research? Or should the ideas be somehow shared between them?
For the most part, that struggle has resulted in scientists slowly losing control of their discoveries, both in private industry and in academia. Patents once went to the inventor by default, but now they belong to the employer. Hands-on skill and experience with the research process—sometimes called know-how or tacit knowledge—were once the most fundamentally personal part of what a worker brought to the table, yet business lawyers have built a variety of legal tools to keep skilled workers from offering them on the open market. By the 1990s, teams of MBAs and business-school scholars had joined forces to see whether advances in information technology, management techniques, law, and sociology could let companies extract workers’ know-how, store it, and own it indefinitely. The resulting academic research field and management fad became known as “knowledge management.”
This article traces changes in US law, business practices, and social expectations about research and invention in order to illuminate the history of business control over scientists’ ideas. It is not the whole history—I skip over government sponsorship of research, changing national and international economic conditions, ties between industrial and academic scientists, and many other topics a complete account would require.1 Still, it is a slice of history that physicists would do well to remember. We live in an age of strong intellectual property rights and relatively weak protections for workers, especially in high-tech fields where unionization is low. Where once an industrial scientist had unquestioned ownership of his or her ideas, that self-determination has eroded in many ways over the centuries. Knowing that past might help scientists evaluate what they hope to see in the future.
Patents and invention
In early America, the law—and society as a whole—strongly presumed that people were entitled to the fruits of their minds. As a result, it seemed obvious that inventors would own any patents on technologies they developed, regardless of who was employing them at the time. An inventor could sell a patent, of course, but a company generally could not require employees to sign them over. The DuPont chemical company (figure 1) was founded in 1802. At that time, if one of DuPont’s industrial chemists figured out a better way to manufacture gunpowder or developed a formula for a new dye, the patent would belong to the chemist. DuPont would receive a “shop right” to use the invention without paying royalties, but the employee owned the patent. He could use it to go into business for himself, license it, or sell it outright.
A DuPont company chemical plant in Camden, South Carolina, depicted on a postcard circa 1935. (Courtesy of the Boston Public Library.)
Until roughly the Civil War era, a basic tenet of contract law was that both sides must benefit nearly equally from a deal. That principle survives today to some extent—a contract in which only one side gets something of value, such as a promise to donate money, is not legally binding. In the 19th century, however, it went further; judges would sometimes invalidate contracts they saw as grossly unfair. For employment contracts with inventors, that meant judges often refused to enforce clauses requiring an inventor to sign over the rights to patents developed while working for the company.
The public and the law saw patents as a reward for the creative act, a reimbursement from society for inventing something useful. As Article I of the US Constitution puts it, patents exist “to promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries.”
By the early 20th century, the situation had almost completely reversed. After decades of rulings by business-friendly judges, courts came to favor the concept of freedom of contract. In the new line of thinking, it was not the job of the courts to impose their ideas of fairness onto business agreements. If a person chose to sell their house for one dollar, that was their right, even if the buyer was willing to pay far more. If a scientist chose to sign an employment agreement that handed over patent rights on anything invented as an employee, then that was their choice and the courts should respect it. In theory, the law assumed that scientists would demand and receive fair compensation for giving up those rights.
In practice, however, unemployed people rarely have much bargaining power over the details of employment contracts, especially if certain contract terms have become standard across an industry. Today, many private companies and universities require employees to agree that anything invented with any assistance from the company or university belongs to the organization, not to the individual. Employees have even less power when companies collude. In a famous 2013 case, several Silicon Valley tech firms, including Apple (figure 2), Google, and Pixar, were sued for agreeing not to hire one another’s workers in order to keep wages down. After a Department of Justice investigation, the companies paid an expensive settlement, but the case did little to shake the legal presumption that employees can bargain for fair compensation in their contracts.2
The Apple Campus in Cupertino, California, was the company’s corporate headquarters in 1993–2017. (Photo by Joe Ravi, CC BY-SA 3.0.)
Trade secrets and noncompete agreements
Although patents receive the lion’s share of attention in conversations about intellectual property, companies of all sizes far more often rely on trade secrecy to protect their innovations.
Relying on trade secrets means a little more than just keeping a process quiet. In essence, a company can receive special legal protection for its industrial secrets if it meets a few conditions. It has to be able to demonstrate in court that it made a good-faith effort to keep a process secret—for example, by limiting access to certain areas or training employees about what they can and cannot reveal. The process also has to have demonstrable value—in other words, you won’t get far suing an ex-employee for talking about your secret cafeteria menu. And the information can’t be public, such as a process your CEO wrote about in a publication like Physics Today, or common knowledge, such as something that a lot of other companies in the industry do the same way. Assuming a particular secret meets those qualifications, though, a company can take legal action to prevent it from being divulged.
Trade secrets, too, have a history, and depending on how you count, it can be either very short or very long. Some of the roots of trade-secrets law go back hundreds of years to early English law governing how master craftsmen in guilds could restrict knowledge to their apprentices. In practice, of course, businesses have long used secrecy to retain a competitive advantage, from the secrets of crafting optical glass to the algorithm behind Google’s search results.
The body of US law behind trade secrecy, however, is relatively young. As trade-secrets laws have changed, so have standards for how businesses treat industrial scientists. Before the 20th century, the law gave companies limited leverage over departing employees. It was possible for companies to sue people who tried to steal their trade secrets by, for example, bribing employees to smuggle out key documents. Proving that kind of illicit behavior was difficult, however, especially since re-inventing someone’s trade secrets through experimentation or reverse engineering was and is entirely legal.
Often courts faced hazier situations, such as an employee leaving his job to start a new company using technology he had developed, or moving to a competing firm without taking any documents. In those cases it was entirely conceivable that the former employee would divulge trade secrets, but proving he had done so would be next to impossible. Some companies asked courts to impose an order forbidding the employee from taking the new job, but judges almost always rejected those requests in the 19th century. At that time, a worker’s right to earn a living through his skill and knowledge was seen as a more important social value than the company’s right to control potentially valuable information.
During the 1940s and 1950s, businesses became much more interested in how to manage, profit from, and control intangible knowledge, or know-how. Why interest rose at that point is hard to evaluate. It’s possible that the postwar surge in international business made it clear that communicating science and technology across cultural borders is difficult work and can’t be accomplished by sending documents alone. The information usually has to be personally relayed by someone with the necessary knowledge. There were well-known examples of that in physics. After the invention of the cyclotron in the 1930s, for example, only labs visited in person by someone who had worked with a cyclotron were able to build their own machine.3 Reports were almost never enough. Historian David Kaiser has shown4 that Feynman diagrams, too, spread as people who had learned them in person traveled around the world (figure 3).
The cyclotron (left) and a Feynman diagram (right) are examples of scientific advances that were difficult to communicate and spread through writing alone. (Cyclotron image © 2010 The Regents of the University of California, through the Lawrence Berkeley National Laboratory. Feynman diagram redrawn from R. P. Feynman, Phys. Rev. 76, 769, 1949, p. 772, doi:10.1103/PhysRev.76.769.)
The interest in know-how may also be related to Cold War fears of communist spies stealing scientific secrets, like plans for nuclear weapons. In an interview in 1948, J. Robert Oppenheimer commented that “the best way to send information is to wrap it up in a person.”
Despite some deep historical roots, most of the law surrounding trade secrets goes back only a few decades. Individual states began adopting the first uniform, codified trade-secrets law, the Uniform Trade Secrets Act, in 1979 in response to demand from businesses; by 2013 forty-seven states had passed the act. The Defend Trade Secrets Act of 2016 carried most of those state-level provisions into federal law, giving companies a federal civil claim for trade-secret theft.
Noncompete clauses in employment contracts are closely connected to trade-secrets protections. By the early 1800s, businesses had started stipulating in job offers that the employee could not work for or start a competing firm after quitting or being fired. Those stipulations initially ran aground on concern for workers’ rights to the fruits of their minds, but the changing philosophy that favored freedom of contract slowly eroded that protection. By the World War I era, noncompete clauses were fairly common in highly skilled jobs. The trend has continued to the present day, when even fast-food workers sometimes find themselves signing agreements not to work for competitors.
New laws and new penalties
Although most changes in trade-secrets law were driven by judicial decisions, legislatures have also been important, especially in the move toward criminal penalties for divulging trade secrets. Violating someone’s patent opens you up to lawsuits but not jail time. Since the Economic Espionage Act of 1996, however, federal prosecutors can go after anyone in possession of written trade secrets without permission. In one famous 2009 case, Sergey Aleynikov, a former programmer for Goldman Sachs, was arrested, prosecuted, and sentenced to eight years in prison without parole for stealing trade secrets. The trade secret in question was code he had written himself and then backed up onto cloud storage while he was still an employee. There was no evidence he had accessed the code since leaving Goldman Sachs.
That same year, two engineers were charged and later acquitted of passing trade secrets about semiconductor manufacturing from a California firm to a Chinese manufacturer. In 2010 Kexue Huang, a Canadian citizen working for Dow AgroSciences in Indiana, pled guilty to passing trade secrets about pesticides to companies in Germany and China and was sentenced to 15 years in jail. In 2015 Xi Xiaoxing, a professor of physics at Temple University in Philadelphia, was arrested and charged with passing secrets to China about manufacturing thin films, apparently because agents for the Federal Bureau of Investigation (FBI) misunderstood the science and technology at stake. The charges were eventually dropped, and Xi has since sued the FBI.5
Criminal charges for trade-secrets theft are still very rare, but they are becoming more common. FBI investigations of trade-secrets theft increased 60% from 2009 to 2013. There is likely a real threat to the US economy from economic espionage, but a side effect of growing prosecution is that well-connected businesses are given even more power over their employees. It is one thing to know your former employer could sue you for monetary damages or prevent you from taking another job; it is quite another to know you might face a decade in jail.6
Important limits remain on how much control the law gives employers over your ideas, skills, and general employability as a skilled industrial scientist. Noncompete clauses are generally enforceable only within “reasonable” constraints: They may apply only for a year or two after departure, only within a set geographic area, or only to closely competing firms. In some states, such as California, lawmakers have decided that noncompete agreements should not be enforceable at all.
Furthermore, one aspect of an employee’s value has proven difficult to restrict or legislate. Anyone who has worked in science and technology can attest that making things work goes beyond textbook knowledge. We often call that type of knowledge know-how. It encompasses getting a feel for working with the instruments—for example, being able to quickly find out why a spectrometer is giving odd readings. It includes knowing how to order your time to keep things moving efficiently, rather than having hours of downtime between stages. It includes knowing who to ask for help and all the intangible skills and judgment that are the reason physics degrees include lab courses.
That kind of know-how is invaluable, but it doesn’t have a clear place in US intellectual property law. Some of it might count as trade secrets, but a master welder has little need to keep his techniques explicitly secret. Describing or demonstrating them to a less-skilled welder would not turn that junior welder into a serious competitor. Many skills and insights are not patentable because they are not some major innovative step beyond what other skilled practitioners might know, yet they can still be invaluable.
In the mid-20th century, lawyers for various firms worked to build up intellectual property protections for know-how. From the 1950s through the early 1970s, businesses fought for an expansive protection of know-how as a property right. Eventually, corporations lost the battle due to shifts in law beyond the scope of this article; the lawyers refocused on the push for stronger trade-secrets protections, and the idea of legal protection for know-how faded from most people’s memories. Business interest in controlling tacit knowledge did not fade, however. It would return in several forms, perhaps most visibly in a 1990s business management fad called knowledge management (KM).
Knowledge management
In some ways, KM is an odd place to end a discussion of the legal rights of scientists and inventors. The 1990s management fad rarely dealt with the law at all. It is, however, a recent example of how CEOs, consultants, and academics from several disciplines have tried to capture and control the knowledge that makes skilled workers valuable.
Bursting into the marketplace of business management ideas in the mid-1990s, KM peaked in the early 2000s and continues today with less urgency but real potency. To give a sense of the movement’s scale, the market for “knowledge management services” rose from $400 million in 1994 to $2.6 billion in 1997 and $3.6 billion in 1998. In 2002 the consulting firm McKinsey & Company invested $35.8 million in its own KM systems, up from $8.3 million in 1999.
The goal behind KM was to capture what was seen as the enormous, untapped potential of a company’s knowledge, including both formal intellectual property like patents and the know-how that employees brought to the table. For example, Division A might be developing a great technology that Division B could use, but without structures in place to get the two divisions to communicate, Division B might have to reinvent the wheel or even license someone else’s technology. As Lew Platt, a former CEO of Hewlett-Packard, is said to have put it, “If only HP knew what HP knows, we would be three times more productive.” KM also became a hot topic in university administration in the mid to late 2000s as administrators sought to encourage collaboration across disciplines and organizational units.
Despite the ambitious goal behind KM, a lot of KM-branded projects were quite simple in practice. Many centered on information technology, such as Lotus Notes collaboration software, internal discussion forums, and databases of frequently asked questions.
One firm that looked to KM for help was Buckman Laboratories, a chemical company in Memphis, Tennessee. Buckman had relied on “hiring Ph.D.s and putting them on airplanes” to spread technical knowledge to new production facilities and licensees. In the tight labor market of the late 1990s, however, the company “couldn’t hire enough and get them to run fast enough,” according to a Wall Street Journal profile.7 Buckman turned to KM-inspired technological solutions, including message boards for employees and a website directory of who knew what about the company’s processes, so that employees could quickly find the right person to ask.
Dow Chemical took KM a step further. In the late 1990s, the company’s newly appointed Chief Knowledge Officer Gordon Petrash set out to review its thousands of patents, find new uses for technologies across product lines, and license patents to other firms. Those efforts were a major success and brought in millions of dollars per year in licensing fees. Petrash began planning to bring together research scientists, engineers, managers, and patent attorneys to chart and replicate the process with even less tangible intellectual property: the company’s know-how.
Dow dove into a morass of philosophical, sociological, and economic questions. How exactly can we define “knowledge”? Can knowledge even be managed in any meaningful way? Can a company be said to learn, rather than just the individuals in it? Can an organization retain its knowledge even as employees come and go? How could Dow compel employees to share their unique, valuable knowledge and experience when that value is exactly what gives them job security? Was it a good or a bad thing if employees traded tips with friends working for rival firms? KM practitioners and scholars never truly answered those questions, but the questions formed the basis for a more human-centered knowledge management that overtook tech-based solutions by the mid-2000s.
KM and tacit knowledge
One important goal of KM researchers was to capture “tacit knowledge,” a term more or less synonymous with know-how. A touchstone in the KM literature is Ikujiro Nonaka and Hirotaka Takeuchi’s 1995 book The Knowledge-Creating Company: How Japanese Companies Create the Dynamics of Innovation (figure 4), which argued that cultivating tacit knowledge was a major source of Japanese business success. For US industry, still reeling from stiff Japanese competition in the 1980s, the idea was intriguing.
Cover of The Knowledge-Creating Company by Ikujiro Nonaka and Hirotaka Takeuchi (Oxford U. Press, 1995).
Nonaka and Takeuchi borrowed the term “tacit knowledge” from Michael Polanyi, a scientist and philosopher who wrote about the concept in the 1960s. For Polanyi, tacit knowledge meant the kinds of knowledge that we cannot necessarily express in writing. For example, it is hard to imagine a textbook on how to ride a bike.
The Knowledge-Creating Company—and many subsequent KM texts—largely ignored Polanyi’s notion that tacit knowledge cannot be put into words. Instead, they treated converting tacit knowledge into explicit forms like writing as a primary goal of KM. In a central example from the book, an R&D team designing a bread-making machine spent time interning with a master baker. The team found that a gentle twisting of the dough during kneading was a crucial missing step for their machine-made bread, and they captured the twisting in their new design schematics. US companies, Nonaka and Takeuchi argued, should look to that Japanese example and seek out valuable tacit knowledge both within their firms and through partnerships.
The desire to render tacit knowledge into explicit forms such as corporate databases, instruction manuals, technical writing, and instrument design connects KM to the earlier history of patents, trade secrets, noncompete agreements, and know-how licensing. A 1999 Wall Street Journal article about KM captured the movement’s ambition in its title: “Know-how in the bank: How to be ready when key employees walk out the door.”8
Put another way, KM aimed to make industrial scientists and skilled technicians far more easily interchangeable. But that end goal unsettled many knowledge workers. As one 56-year-old machinist told the Wall Street Journal in 2002, “If I gave away my tricks, management could use [them] to speed things up and keep me at a flat-out pace all day long.”9
By the late 2000s, the buzz around KM had started to fade, though the use of KM approaches did not. Although KM was never at the center of the business world, it was widespread. The consulting firm Bain & Company found in surveys that KM was one of the management techniques of most interest to CEOs throughout the late 1990s and 2000s. At least 20 KM-focused academic journals are still in print today. Businesses remain acutely interested in capturing the tacit knowledge that a skilled industrial or academic physicist brings to the table, even if they have not yet found the tools to do so.
So why did the excitement die down? There are likely a few reasons. At a basic level, some of KM’s key insights, such as the value of encouraging employees to maintain informal social networks throughout the industry, became an even more routine part of business than they had been. The buzz around businesses using word processors and networked computers has similarly worn off, not because word processors failed as a technology but because they became too normal to draw much notice.
Some of the ambitions central to KM were arguably never achievable. It makes sense that corporations would be receptive to the idea that they should try to maximize the value of the firm’s collective know-how, but that turned out to be easier said than done. Early attempts centered on technological solutions, like having employees write out their knowledge in centralized, computerized databases, but such approaches clearly can’t capture all aspects of any job. In addition, few workers enjoy making themselves more replaceable, even if they have the communication skills to do so.
Knowledge management, then, was not a great existential threat to such skilled workers as research scientists. But the movement’s history shows how far businesses’ ambitions have expanded when it comes to controlling what’s in their employees’ heads. Business owners in early America disliked skilled technical workers moving on to new jobs with company secrets in tow, but they generally accepted it as a reality of doing business. Today, managers’ desire to control the knowledge scientists possess is usually thwarted not by the law but by the nature of knowledge: We often know more than we can say or write down. Science is a process built on human skills, not just a collection of facts, figures, and to-do lists.
The economic benefits of moving ideas
There are reasonable arguments in favor of giving employers more control over intellectual property and leverage over skilled employees. In principle, a company that knows its employees cannot take ideas with them will be more inclined to invest in training those employees and funding research. That might well mean more jobs for trained physicists and other scientists and engineers. Without such protections, few might be willing to fund research that everyone else could simply take.
Those are not trivial concerns, but ongoing research in economics and other fields is providing strong evidence that the ability of skilled employees to move to new jobs has far greater economic benefits overall, especially in encouraging innovation and entrepreneurship.10 As knowledge workers take new jobs, they spread know-how and experience, and the spread of knowledge is at the heart of much scientific and technical creativity. Having a government-granted patent monopoly and locking down employees with ironclad contracts might seem appealingly stable, but allowing more freedom of movement for people and ideas can be more effective for individual companies, better for workers, and better for the overall economy.
There will likely always be limits to how much businesses can truly control scientists’ minds; many important things simply can’t be separated from the individual. A level of intuition and experience is necessary to know, for example, which tests to run, how to interpret data, or why a delicate process might be failing—things that cannot be captured in any database. However, scientists cannot afford to ignore the structure of law, the standards of business practice, or management techniques. It is entirely reasonable to think that businesses are within their rights to own what their employees develop, but it is also reasonable to favor the thinker’s right to the product of his thought and the worker’s right to the product of his labor.
Either way, we need to understand that the status quo is the result of centuries of political struggles, lobbying, and other historical change. It is not natural or inevitable, and it can and will keep changing, for better or worse. Businesses will continue to push for more control over knowledge, both in their home countries and through international treaties. Meanwhile, scientists will seek to retain their autonomy and their right to take their knowledge with them when they leave the office—even when they leave for good.
Douglas O’Reagan is a historian of science and a postdoctoral fellow at MIT in Cambridge, Massachusetts. He is the author of Taking Nazi Technology: Allied Scientific Espionage and Exploitation of German Technology After the Second World War, forthcoming from Johns Hopkins University Press.