The death of cleverness
In 1969, at age 16, Stallman, a precocious if socially maladroit high schooler, was invited by IBM to its New York Scientific Center to try out his first computer. "It was rather large," he says, "more than enough to fill this room. The program I wrote soon got too big for it."
After high school, Stallman headed to Harvard. (They'd offered more scholarship money than MIT, he reasons. Also: "The male-female relationship at Harvard was not quite as skewed as it was at MIT.")
Freshman year, however, Stallman landed a job at MIT's artificial-intelligence laboratory. "They had a computer with the biggest memory anywhere around," he says. "An entire megabyte."
It was there that Stallman became a linchpin of the campus's fabled '70s hacker culture — an anarchic, collaborative free-for-all where programmers, in the spirit of "playful cleverness," gave a huge push forward to computing power. (There Stallman developed Emacs, the massively popular text editor, in 1975.)
Code was lent and borrowed and swapped and adapted, freely and without concern for ownership. "This was simply people working together, advancing human knowledge," says Stallman.
But by the early '80s, the lure of huge profits at the dawn of the PC era had drawn away many of the lab's best minds, and "the community mostly died."
Nowadays, Stallman clearly still has respect and affection for the brainiacs who populate MIT. (He isn't employed there, but maintains an office on campus as an unpaid research affiliate.) Just watch him on YouTube, doing the Soulja Boy dance with them on the quad.
But he notes that lots of kids these days "get distracted by the things they can do with their computer and don't look into how it works. And maybe that's because what they're used to is proprietary software that won't let them see how it works."
"This stuff that we call open source, which has taken over the IT industry and is the foundation of the Internet, was started by Richard," says Perens. "If we were to put a dollar value on it, it would be at least $1 trillion."
But, says Stallman, "I couldn't make up for the loss of my self-respect with any amount of money. What I saw in the early '80s, when the earlier software-sharing community died, when the only way you could buy a new computer and use it was with a proprietary operating system, was a life in which people were trying to get as much power over each other as they possibly could." Pursuing a career in proprietary software, he once said, would be "making the world ugly for pay."
"There are some people who think the function of software is just to make a business money," says Perens. "Even some so-called open-source pundits. Richard thinks the function of software is to help people."
So Stallman set to work crafting an alternative. GNU — the name is a recursive acronym for "GNU's not Unix" — was conceived as the world's first entirely free operating system. It's been evolving (not unlike Wikipedia) via open and collaborative global tinkering for the past 25 years.
As one award given to the project noted, the "ubiquity, breadth, and quality of its freely available, redistributable, and modifiable software [have] changed the way the computer world operates."