Please note: this is the second article of a two-part series on this topic. For part one, please click here.
Open Source: Tools for Hippies and Fortune 500 CEOs
Open Source companies are paradoxes: they are nonprofit foundations that support billion-dollar companies. As such, Open Source serves as a wonderful bridge across the entire history of Silicon Valley. Its ideals began largely out of pragmatism in the 1950s, were subsequently bolstered by those wishing to maintain the hacker ethic in the 1970s, and by the 1990s afforded some the chance to make billions off the movement’s unique value proposition. This section seeks to explain what Open Source is, and how it took this marvelous journey over the time period of this paper.
Eric Raymond, Open Source evangelist and engineer, likened Open Source to a bazaar: the dynamic creation of many different vendors who, despite not knowing the stalls at the other end of the structure, all strengthen the grander unit and therefore themselves. This structure stands in contrast to top-down tech companies, which Raymond finds analogous to the construction of a cathedral: a rigid structure built from a blueprint supplied by someone at the top of a hierarchy. As Steven Weber explains in his book The Success of Open Source, source code is written in programming languages that convey to computer hardware how software packages — for example, any application on your computer — should run. Most commercial vendors sell only binary code, the strings of 0s and 1s that a computer actually executes, without users being able to dissect the original source. In contrast, Open Source lets everyone read these original recipes, affording the community the ability to inspect, and subsequently improve, its own code.
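The source-versus-binary distinction Weber describes can be sketched in a few lines (here in Python, purely as a modern illustration; the variable names are the author's own):

```python
# The human-readable "recipe": anyone can read, audit, and improve it.
source = "result = 40 + 2"

# Translating it for the machine yields an opaque object -- the analogue
# of the binary-only form most proprietary vendors shipped.
code_object = compile(source, "<demo>", "exec")

namespace = {}
exec(code_object, namespace)   # the machine-ready form still runs
print(namespace["result"])     # the computed value

# ...but its raw bytes are not something a reader can meaningfully edit.
print(code_object.co_code)
```

A proprietary vendor ships only the second form; an Open Source project ships the first, which is what makes community inspection and improvement possible.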
The principles behind Open Source were not just ethical ideals, but also reasons why Open Source code could, and still does, have the potential to outperform proprietary code. Raymond reframes collaboration as a law of large numbers: much as the MIT hackers of the 1950s shared code so that no one had to recreate tedious programs, Open Source allows people interested in coding to try it as a hobby before a vocation. As Steven Weber posits, a chief theory behind Open Source is that people naturally want to be creative, and need little more incentive than access to projects. Weber continues, “The only times when innovation will be ‘undersupplied’ is when creative people are prevented from accessing the raw materials and tools that they need for work”. Immediately, the word tool should jump out as a clear reference to Stewart Brand’s Whole Earth movement: technology as a tool for freeing one’s mind. As such, the goal of intellectual property rights for Open Source companies was to maximize the growth and dissemination of the code. As Weber contends, “Open Source intellectual property aims at creating a social structure that expands, not restricts, the commons”. Once again, there is a clear reference to the first generation of hackers’ New Communalist roots: using technology to create social structures whose primary goal is to help regular citizens.
This setup does not mean Open Source code is given away for free. Richard Stallman, the MIT hacker, explains that making code non-proprietary and open to the public conveys the “freedom [and] the right to run the program for any purpose… to redistribute copies to others, and to improve the program and share your improvements with the community so that all benefit”. This both embodies the hacker ethic and recognizes problems facing early software production. With so many technical challenges needing to be solved for anyone to even interact with computers, the more minds working on any given project, the better the opportunity for advancement. Open Source became a way for those who followed the hacker ethic to recognize a changing landscape: they could still hack for enjoyment and create tools to aid people, but could do so while also building formidable companies and organizations. Open Source companies give away part of their code for free, but as companies begin to build on it, they charge for greater access to their system code bank and support. These for-profit Open Source companies in turn pay the programmers who help advance the code. Viable business models were still able to emerge because Open Source still firmly believed in intellectual property rights; it just thought it could accomplish tasks better as a community. As Weber elaborates, “programmers often explain it with simple shorthand: when you hear the term free software, think ‘free speech’ not ‘free beer.’ Or, in pseudo-French, software libre not software gratis”. Like the First Amendment, Open Source use had restrictions that made it a viable competitor to proprietary code companies such as Bill Gates’ Microsoft.
The high cost of computing was a key variable in the origins of the Open Source movement. The first commercial computer on the market, the IBM 701, was released in 1952 at the extraordinary cost of $15,000 for an hour of use. Furthermore, the 701 was essentially an open sandbox; users had to build much of the software needed to run different applications themselves, with the code for processing radar images approaching a staggering 80,000 lines. With cost constraining time on the computer, it behooved the 1950s MIT researchers discussed earlier to work together on such programs so that everybody could maximize their computer time. Of specific interest was building a compiler: software that translates a programming language into instructions the computer can execute. As Weber explains, “The engineers facing this challenge recognized the obvious next step: get everyone together who was using the machine, regardless of what company they worked for, and build a compiler that everyone could use.” While this is perhaps more representative of the collaborative, sharing culture of the first-generation hackers, it set a salient precedent that would be expanded when computers became more affordable in the early 1970s.
Due to the proliferation of technology, Open Source groups in the 1970s were able to build operating systems (OS) to power computers. With the spread of the PDP-11 minicomputer in 1971, which “disseminat[ed] computing power to ‘the masses’”, communities of hackers continued the MIT collaborative approach to build accessible computer programs. This Open Source success is admittedly partly because AT&T, the telecom giant many expected to build the power-line equivalents of the early internet, was immensely worried about violating its recent antitrust settlement, which barred it from partaking in any activity not common for a telecom carrier. Bound by this legal framework, but not wanting to miss out on the newly created internet, corporate AT&T entered a strange marriage with Berkeley hackers. Carefully adhering to the restrictions placed on the company, AT&T decided that instead of selling the operating system it had created, called Unix, it would license Unix to universities for no royalties, but with no assistance in fixing bugs.
This was a dream environment for Bill Joy to encounter when he arrived at Berkeley; he worked with a group of hackers to rework and extend Unix, written in the accessible language of C, building the BSD operating system. An OS powers the entire computer — interpreting data, storing files, and running programs — making Unix and BSD both powerful inventions.
A third Open Source OS was created in the 1980s by Richard Stallman, whose model cements the clear rift between Open Source ideology and the proprietary business models emerging in the late 1970s. Stallman departed the MIT lab in 1984, infuriated by the seemingly corrupted mindset of third-generation hackers, those who viewed technology as something that could be individually owned. This was antithetical to Stallman, a man who protested the spread of copyrights in 1983 by saying, “I don’t believe that software should be owned… Because [the practice] sabotages humanity as a whole. It prevents people from getting the maximum benefit out of the program’s existence.” Stallman, whom Steven Levy refers to as the last true hacker, created the Free Software Foundation and, with it, an Open Source operating system that had slight technical differences from Unix but was more so a political statement. After antitrust rulings in the Reagan era freed AT&T from its earlier restrictions, the company began dramatically increasing the price of its Unix license. In protest of this anti-hacker decision, Stallman named his OS GNU, a recursive acronym for GNU’s Not Unix. Due to a convoluted legal and technical rift between GNU, BSD, and Unix, there was an opening for a new Open Source OS by the end of the 1980s, a hole which Linux would fill.
Linux falls slightly outside the time period of this essay, but the success of companies built on Open Source warrants a brief discussion. Created by Finnish computer hobbyist Linus Torvalds in 1991, by 2000 Linux would power over one-third of the servers in the world and see two companies that packaged Linux programs (VA Linux and Red Hat) achieve billion-dollar IPOs. Red Hat started by selling a package of Linux tools accompanied by documentation and support for $50. As it signed larger clients such as NASDAQ and IBM, the company began moving toward a software-as-a-service model in the late 1990s, offering clients more customized packages. In 2019, Red Hat was acquired by IBM for $34 billion; it sells scalable programs built on Linux and pays top Open Source contributors on its platform. The success of Open Source is as clear in 2019 as it was in the 1970s. This ability to prosper without closed, proprietary code or data harvesting should not be forgotten today.
Microsoft Diverges from the Hacker Ethic
If Open Source’s beginnings saw people like Bill Joy draw on Homebrew-style hackers to make their products viable, the parallel growth of Bill Gates’ Microsoft had just one ask of hackers: to leave its code alone. A People’s Computer Company newsletter from May 1976 prints both of Gates’ impassioned epistles to the community. In the first, Gates laments how little revenue he had earned on the Altair BASIC interpreter he had built. In a way, Altair BASIC was like the compilers MIT hackers built open source in the 1950s. The difference: those who did it at MIT were funded by the university to pursue computers as a hobby, whereas Gates was trying to build a company based on the sale of proprietary code. If hackers simply gave the program to each other, Gates’ hours put toward the company would not be properly compensated. “As the majority of hobbyists must be aware, most of you steal your software,” Gates bluntly wrote, before sarcastically adding, “Hardware must be paid for, but software is something to share. Who cares if the people who worked on it get paid?” In the second letter, Gates implored the hackers to consider that “the marketability of software to hardware companies is questionable when software is so freely shared among hobbyists”. The Bay Area hackers convening in Jordan Hall were likely as perplexed that Gates was trying to profit from computer technology as Gates was that they were not.
These contrasting views were intractable. Steven Weber remarked that “This was not a marginal disagreement or quarrel over how to interpret rules about intellectual property. Rather it was a clash between two distinct and incompatible cultural frames.” Weber adroitly points out the philosophical difference between computers as tools for artists and computers as devices for companies to sell, but also notes the technical question of where value-add actually happened in the hierarchies of computing. No mediator was called in. Gates would pursue Microsoft as a proprietary-code, software-as-a-product company, laying the foundation for what has become a trillion-dollar empire.
Companies Follow the Microsoft Business Model
As computing technology became more affordable, companies arose in the Microsoft mold, selling software products for a one-time, upfront fee and pioneering the software-as-a-product model. By the 1980s, the technology afforded companies the chance to sell a plethora of different products: VisiCalc offered a revolutionary digital spreadsheet, a predecessor to today’s Microsoft Excel; LSI Logic offered one of the first computer-aided design (CAD) programs; and a team of Xerox engineers worked on a keychain to keep track of a user’s most relevant digital information. While all of these companies followed the model of selling software for immediate, one-time revenue, it is clear from internal company memos and figures why such companies would, like Red Hat, eventually find selling their proprietary code as a recurring software-as-a-service charge more profitable. Benjamin Rosen, who wrote about tech companies for Morgan Stanley in the 1980s, said of Microsoft’s growth in 1982, “There’s no way Microsoft can become the success it looks to be without abandoning some of the charm and becoming more structured, more organized, and more market-oriented”. Even if hackers thought Microsoft was anti-hacker-ethic in 1976, the financial world thought the company was too enthusiast-focused to build a scalable business.
VisiCalc, the aforementioned transformational computer application, is a wonderful example of how companies followed Microsoft’s software-as-a-product model as the spread of personal computing afforded companies a chance to build more diverse businesses on top of the hardware. Carl Helmers, writing in Byte Magazine in 1979, said that VisiCalc “has to be one of the neatest software innovations of 1979, if not the most fundamental new concept to date in personal computing field”. The program was made to work on multiple operating systems and was one of the earliest to demonstrate software applications for business settings. VisiCorp, its parent company, said in an overview memo of VisiCalc’s business offerings that it offered “a powerful planning and forecasting tool” in which “you can also examine various alternatives.” This was truly paradigm-shifting technology in that it brought computer applications out of the Homebrew meetings in Jordan Hall and to working professionals. VisiCalc was sold as a bundled software package or standalone; a September 1, 1979 product sheet by Personal Software Inc— a distributor of software for Apple, Commodore, and TRS-80 computers— listed VisiCalc for $99.50.
This sale of software reflected a shift among technology companies from simply building hardware to offering services on top of those units. A February 1976 Interface Magazine chart showed services revenue growing from $5 billion in 1975 to nearly double, $9 billion, just a year later. Of the $9 billion, about $2 billion came from proprietary software services, the category growing fastest at 35%. Benjamin Rosen said VisiCalc was “perhaps the most important and most general purpose piece of software ever developed for the personal computer industry”. While the company would be bought by Lotus in 1985, Rosen’s prediction that spreadsheets “could wind up as one of the most ubiquitous of all software packages” has turned out to be largely true, even if spreadsheets would eventually move to a more recurring business model.
This software-as-a-product business model continued under LSI Logic, a company which pioneered computer-aided design (CAD). LSI Logic targeted technically experienced customers, as computing power was still fairly expensive (around $15,000 a month); the company offered services such as logic simulation, delay path simulation, cell placement, and automatic routing. In business plans left to Stanford by company engineer Rob Walker, the company offered these services for an upfront fee, despite their packages offering substantial long-term value. For example, their commercially ready test verification program was able to “check the effectiveness of these input patterns in detecting classes of faults in the array” by “using inputs from logic simulation and other sources”. Once again, one can understand how, in eventual recessions or when profits slowed, this business model left revenue on the table compared to a recurring model, in which the product is licensed annually and must be paid for again to renew.
Regardless of whether the software was sold for a one-time fee or a recurring charge, the predominant takeaway from these models is that their code was a black box. Unlike Open Source code, these proprietary products did not allow users to view the true source code. Perhaps this model arose because most commercial customers had no desire to tinker with code foreign to them, but it nonetheless represents an erosion of both the hacker ethic and the New Communalist desire for technology to be a tool for liberation from the workplace, not a tool to increase productivity on the job.
Unsurprisingly, the ultimate tool for self-liberation, a personal computer for the masses, was developed by scientists mere miles from the Whole Earth Catalog offices. Research by Xerox PARC scientists would alter daily human life when in 1973 they produced the Alto, the first personal computer. However, management did not see potential in selling it to the public, instead donating the devices to universities. A 1981 Byte Magazine article relayed, “It is unlikely that a person outside of the computer-science research community will ever be able to buy an Alto. They are not intended for commercial sale, but rather as development tools for Xerox, and so will not be mass-produced”. A few years after the Alto was developed, Steve Jobs and Apple would release the Apple II, which, as Ben Rosen wrote the same year, changed “the personal computer industry from a hobbyist-oriented market to a far broader one.” Despite tapping into commercial software applications, however, Apple would maintain many aspects of the New Communalist movement’s ethos. The company utilized open source code, invited other developers onto its platform, and presented itself as anti-big-government, but questions still linger as to whether this jockeying was for sales or to prove that Brand’s vision was still at the heart of the Valley.
The Paradox of Apple
Though technically selling hardware, Apple was founded to bring the software tools of computing to the masses. Co-founder Steve Wozniak was a frequent attendee of Homebrew meetings and, following the ethos of sharing developments to advance hacking, gave away his BASIC code for free to anyone who purchased a device that could connect cassette recorders to computers, and printed his 6502 monitor code so that anyone could dive into the machine’s memory. Furthermore, an early Apple ad proudly declared, “our philosophy is to provide software for our machines free or at minimal cost.” Even when Apple began commercially selling its computers, the ultimate mission was to offer an affordable computer so that people, and not just military and academic departments, could access the liberation of computing. A 1981 Macintosh business plan places computers into four bands, with IBM occupying the top (devices selling for between three and five thousand dollars); the Macintosh sat in band two, priced between five hundred and fifteen hundred dollars, a much more accessible price point.
Bringing computing to the masses opened a proverbial Pandora’s box of new businesses. The ecosystem of products built on Apple computers became known as Apple World, and its CEOs in the early 1980s followed a version of the hacker ethic, believing that “it was time to scrap the divisive practices of corporate business and adopt a more hacker-like approach”. While there was still a price attached to these products, there remained an underlying belief that the more people who had access to Apples and their built-in programs, the better and more creative the world would be. While this could have been a marketing tactic, Microsoft and IBM were focused on selling into corporations, whereas Apple’s primary consumers were everyday, creative users. To underscore just how willing Apple was to work with outside vendors— an embrace, in a way, of the Open Source ideology that the more minds working on a problem the better— the Apple II had a staggering 89 independent software vendors producing a prodigious 659 programs. However, this hacker ethic encountered problems in the mid-1980s.
Even though Wozniak and Apple engineers attended a 1984 conference at Fort Cronkhite convened by Stewart Brand, business and New Communalist thought had arguably already firmly parted ways. As Apple hired seasoned executives, early employee John Draper fell into trouble over his idiosyncratic tendencies; as the business wing of the company felt, “Apple was not a showcase for tricks; this was not Homebrew.” Still, it is undeniable that 1984 saw Apple make claims of which Stewart Brand would approve. Apple’s Super Bowl commercial that year trumpeted that its computer would help bring people away from a bureaucratic government machine and break through into freedom, closing with the assurance that the year 1984 would not be like George Orwell’s famous book. The same year, Stewart Brand delivered perhaps his most famous line at the Fort Cronkhite conference attended by the likes of Steve Wozniak and Richard Stallman: “On the one hand, information wants to be expensive, because it’s so valuable. The right information in the right place just changes your life. On the other hand, information wants to be free, because the cost of getting it out is getting lower and lower all the time. So you have these two fighting against each other”. For-profit Open Source companies capture this ethos well: their information is free up to a threshold, beyond which it costs money. Proprietary, closed-code models simply made information expensive. The data-driven models of today’s Google and Facebook are a perversion of the quote: the services and information within them, though nominally free, actually come at great cost.
In the Stanford special collections, Liza Loop left 1976 editions of the People’s Computer Company publication, in which the human is explored in typical Brand cybernetic thinking as a “human biocomputer.” In a design with wavy, hippie-influenced lettering, the blurb explains that “The human brain is assumed to be an immense biocomputer, several thousands of times larger than any constructed by Man from non-biological components”, accompanied on the same page by a written encouragement of brain-power expansion via LSD. The article continues, “It is hard to compare the operations of such a magnificent computer to any artificial ones existing today because of its very advanced and sophisticated construction”. This is not to say everyone in the 1970s believed AI would never do tasks better than the human brain; in fact, another article in the People’s Computer Company presciently asked whether we should ban AI altogether out of fear of what it could become. And one must recall that the prohibitive cost of computing, even into the 1970s, made many of today’s use cases (namely, business models relying on individuals possessing multiple personal computing devices) as impossible to imagine as AI outperforming humans.
That question, however, would have been difficult to answer in 1976, because there was little way to conceive of what AI would become. There was similarly no way to predict that a company named Facebook, leveraging the ubiquity of personal computing, would be cast as an enemy of democracy and face congressional hearings. The business models that followed Microsoft’s also included selling future updates to products; recurring, annual software revenue was predictable as a natural extension of how they sold proprietary code (even the Open Source company Red Hat has moved to this model), but influencing behavior via cookie-collected data was not. And perhaps this is the paradigm of technology: innovation is meant to change the world by bringing to life creations thought impossible. This drive has led to truly transformational inventions including, but not limited to, AI. But while platforms like Google and Facebook offer their services for free, it is certainly not the free Brand envisioned. Because most companies have followed the proprietary code model, society is unable to see and audit what these companies optimize their businesses to do. Many believe that Facebook and Google’s code prioritizes clicks and time spent on their platforms; their source code therefore incentivizes the spread of more inflammatory rhetoric. What originally made hacking, and open source, freeing was that there were essentially no secrets or walls between those with access to the tools.
The first generations of hackers opened a great unknown. And, true to form, they essentially made the password “empty string”. As such, the Stewart Brands and Lee Felsensteins of the world have a right to be upset that computers are no longer viewed as tools for personal liberation, but they perhaps should not be surprised. The surveillance capitalism that is today synonymous with top tech brands and products was not created linearly, and while it may be a wrong turn for society, it was a diverging path on the trail Brand and Felsenstein helped create. In hindsight, this should not be shocking: those who worked on early computer-related technology wanted to create tools that people could use to leave society; if they wanted later users to follow their hacker ethic, perhaps they should have written their own user agreement.
Eric Raymond, “The Cathedral and the Bazaar,” Knowledge, Technology & Policy 12, no. 3 (1999).
Steven Weber, The Success of Open Source (Harvard University Press, 2004).
Weber, 75-77.
Weber, 84.
Ibid., 85.
Free Software Foundation, “The Free Software Definition,” Boston: Free Software Foundation, 1996, at www.fsf.org/philosophy/free-sw.html.
Weber, 5.
Paul N. Edwards, The Closed World: Computers and the Politics of Discourse in Cold War America (Cambridge, Mass.: MIT Press, 1996), 124.
Weber, 21.
Ibid., 23.
Ibid., 28-30.
Levy, 365.
Weber, 53.
Weber, 199.
Lauren Feiner, “IBM Closes its $34 Billion Acquisition of Red Hat,” CNBC, July 9, 2019, https://www.cnbc.com/2019/07/09/ibm-closes-its-34-billion-acquisition-of-red-hat.html.
Bill Gates, “An Open Letter to Hobbyists,” reprinted in “Liza Loop Papers 1972-1986”.
Ibid.
Fascinatingly, a leaked 1998 internal Microsoft memo, published with annotations by Eric Raymond and known as the Halloween Document, explores Open Source as a viable threat. Its author, product manager Vinod Valloppillil, admits Open Source was “long term credible” and a “direct, short-term revenue and platform threat to Microsoft,” going as far as saying that “Linux and other OSS [open source software] advocates are making a progressively more credible argument that OSS software is at least as robust—if not more—than commercial alternatives”.
 Steven Weber, The Success of Open Source, (Harvard University Press: 2004), 37.
 “Microsoft Matures”, The Rosen Electronic Letters 1979-1988, 1982, 102661121, Box 1, Folder 3, Computer History Museum Archives, Fremont, California.
Opinion Leaders’ Thoughts on VisiCalc, “Liza Loop Papers 1972-1986”, Box 1, Folder 3.
 VisiCalc Business Offerings, “Liza Loop Papers 1972-1986”, Box 1 Folder 3.
 Personal Software Inc VisiCalc Order Form, “Liza Loop Papers 1972-1986”, Box 1 Folder 3.
 Interface Magazine Services Industry Growth chart, “Liza Loop Papers 1972-1986”, Box 1 Folder 6.
Thoughts on VisiCalc, The Rosen Electronic Letters 1979-1988, 1980, Box 1, Folder 4.
 LSI Business Plans, “Rob Walker Papers 1959-2005”, Box 4 Folder 3.
Thomas Walker, “The Xerox Alto Computer,” Byte Magazine 6, no. 9 (September 1981): 59.
 The Genesis of Apple, The Rosen Electronic Letters 1979-1988, Box 1 Folder 1.
Ibid., 215.
 Apple, “Preliminary Macintosh Business Plan”, digitally published by Computer History Museum Archives, 12 July 1981, http://archive.computerhistory.org/resources/text/2009/102712692.05.01acc.pdf.
 Levy, 268.
 Applications Software Availability, The Rosen Electronic Letters 1979-1988, Box 1 Folder 3.
 Levy, 231.
 Apple, “Introducing Macintosh Computer”, Advertisement aired on TV January 22nd 1984, 1 minute 3 seconds.
 Turner, 136.
Human Biocomputer, by John C. Lilly, Liza Loop Collection, M1141, Box 1, Folder 10, Special Collections, Stanford University Archives, Stanford, California.