The Lawrence Berkeley Hack, Part 1

Hosted By

Ran Levi

Co-Founder @ PI Media

Born in Israel in 1975, Ran studied Electrical Engineering at the Technion – Israel Institute of Technology, and worked as an electronics engineer and programmer for several high-tech companies in Israel.
In 2007, he created the popular Israeli podcast Making History, with over 15 million downloads as of July 2022.
He is the author of three books (all in Hebrew): Perpetuum Mobile: About the history of Perpetual Motion Machines; The Little University of Science: A book about all of Science (well, the important bits, anyway) in bite-sized chunks; and Battle of Minds: About the history of computer malware.

The Lawrence Berkeley Hack, Part 1

In March 1986, the annual CeBIT expo took place in Hannover, Germany. At the 4.8-million-square-meter Hannover fairground, computer enthusiasts from across Germany, Europe, and the wider world gathered to talk information security.

Later, at an after-hours party, two young hackers — who were, as Der Spiegel put it, quote, “in a hashish mood” — floated an idea to their friends. A way to make money off their unique set of skills. If they broke into certain targets, their argument went, they might find the kinds of information that could be valuable to a…certain buyer.

Not long after, some of those young men took a three-hour drive west, entered a wide, impressive-looking building surrounded by a tall, black metal fence, and struck a deal.

The Case of the Missing 75 Cents

Four decades ago, three quarters would’ve gone a lot further than they do today. With that kind of loose change you could’ve picked up some milk from the grocery store, or over half a gallon of gas, or a bus ticket. But that doesn’t explain why, on one fateful day in 1986, a systems administrator at the Lawrence Berkeley National Laboratory in California made such an issue over 75 missing cents.

Lawrence Berkeley is a Department of Energy-affiliated lab, employing thousands of scientists in cutting-edge research. 75 cents would’ve made not one iota of difference to anybody, especially because, back then, computers were so rare and so expensive that an hour on one of the lab’s dozen mainframe computers cost hundreds of dollars, and usage added up to thousands of dollars a month. On this particular month, the lab’s thorough accounting system showed a tiny, 75-cent discrepancy between how much time LBL’s scientists had used on the computers and how much they were billed for it: equivalent to just a few seconds of use.

Clifford Stoll — 35 years old, skinny, brown hair shaggy in that Albert Einstein kind of style — was only two days on the job when the systems administrator — his boss — walked into his office, mumbling about a few missing quarters.

“First degree robbery, huh?” Cliff joked.

Cliff was hardly thrilled about being part of the computer club. He’d been happily researching astronomy at the lab, until his grant ran out and he needed some money. So they transferred him from the Observatory on the top floor of the building, down to the basement, where they needed a third person to help with the dozen mainframe computers they ran 24/7. Cliff didn’t have much experience with computers, and they hardly compared with studying the planets and the stars. 

You’d figure that, compared to discovering the secrets of the universe, he wouldn’t have been thrilled about chasing after a 75-cent computer error. But this minor accounting discrepancy was something of a scientific puzzle. In his book, The Cuckoo’s Egg, he explained how it triggered that instinct in his brain. “[A]n error of a few thousand dollars is obvious and isn’t hard to find,” he wrote. “But errors in the pennies column arise from deeply buried problems[.]”

His boss, Dave Cleveland — a middle-aged systems developer with receding white hair and thick, square glasses — assigned him to discover the root cause. “Figure it out, Cliff,” he said, “and you’ll amaze everyone.”

The Error Found

One look at the lab’s accounting software, and it wasn’t hard to imagine an error cropping up somewhere.

It was a Jenga tower of different programs, probably created by different generations of programmers over time who hadn’t communicated much with one another. It was written in all of the most archaic programming languages — Assembler, Fortran, Cobol — and each bit had its own little, independent function. There was the program which collected and stored data, the one that read the data and converted it to charges, the one that collected the charges and converted them into bills for different departments, and, finally, the one that checked the other programs’ work. Then there was the entirely separate accounting system that did largely the same thing, in less detail. Somehow, this house of cards had worked as planned for years, until now.

Cliff wrote up a short program for verifying the accounting files, then used it to check program number one. Program one was working fine. Program two, also, no problem. Program three, four — there were no counting errors anywhere, it seemed. The software wasn’t leaking. “I got stubborn,” Cliff recalled. “Dammit, I’d stay there till midnight, if I had to.”
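The book doesn’t preserve Cliff’s verifier, but its core idea can be sketched: independently re-tally the raw usage records and compare the total against what the billing programs reported. The record format, names, and hourly rate below are hypothetical stand-ins, not the lab’s real software.

```python
# Re-tally raw usage records and compare against the billed total,
# the way Cliff's short verification program cross-checked each stage
# of the accounting pipeline. Record format and rate are hypothetical.

RATE_PER_SECOND = 0.05  # roughly $180/hour, in the right ballpark

def audit(usage_records, billed_total):
    """Recompute charges from raw (user, seconds) records and
    return the discrepancy against what was actually billed."""
    recomputed = sum(seconds * RATE_PER_SECOND
                     for _user, seconds in usage_records)
    return round(billed_total - recomputed, 2)

# Hunter's 15 seconds of computer time were never billed:
usage = [("dave", 3600), ("hunter", 15)]
billed = 3600 * RATE_PER_SECOND
assert audit(usage, billed) == -0.75  # the famous 75-cent shortfall
```

A few thousand dollars of error would have jumped out at the first stage; a 75-cent residue only shows up when every stage checks out and the totals still disagree.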

It took until around seven o’clock at night to find the discrepancy: one user who wasn’t quite like the others. An account with the name “Hunter,” which had no valid billing address. Somebody must’ve incorrectly added this account to the system, and used a few minutes of computer time without a means of being charged for it.

Problem solved. Cliff left to celebrate over late night cappuccinos with his girlfriend, Martha.

The next day, Cliff informed Dave that one of the systems managers must’ve inadvertently added a user “Hunter,” who caused the error. His hypothesis was quickly dispelled.

In fact, according to the rules of the system, even privileged users like Cliff or Dave couldn’t just create a new account out of thin air. The computer did it automatically, using special programs that covered all the bits necessary, including billing information. So a new account with a necessary field left blank wouldn’t have been authorized in the system to begin with. After some deliberation, everybody agreed: on one hand, nobody could’ve created the account, and on the other hand, the system couldn’t have created an account with an error like that.

It was strange. Anyway, Cliff removed the aberrant account. Good enough.


Another day passed, and the lab received an email.

It came from an unknown systems administrator in Maryland. The message was alarming: someone from Lawrence Berkeley, they said, had hacked into their network.

Suspicious. Who was this person anyway? Cliff looked around and, lo and behold, found another discrepancy in the LBL accounting system. This time it wasn’t Hunter, but the culprit was obvious. Only one user was logged in at Lawrence Berkeley at the time of the supposed breach: username “Sventek.”

Cliff went back to his boss, to report that he’d found a hacker. Dave was unimpressed. Joe Sventek, Dave explained, was a respected professor. Not to mention, he’d long since moved from Berkeley to the U.K. And another colleague chimed in: he’d heard that Sventek was currently on vacation, quote, “hiding out in the backwoods, far away from any computers.” End quote.

So Cliff was wrong yet again. “Had I screwed things up when I poked around last week? Or was there some other explanation?”

It was probably some troublemaker student, Dave said, breaking into their computer systems and hijacking accounts to cause a hassle for the poor systems admins. In a PBS documentary, Cliff remembered his frustration growing. Quote: “I wanted to teach this guy a lesson.”

So, how do you catch a hacker in 1986?

Cliff had an idea of where to start. He went to his personal terminal at the lab, and programmed it to beep, audibly, any time a new user logged in.
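Cliff’s actual program ran on a 1980s lab terminal, but the logic is simple to sketch in modern terms: snapshot the list of logged-in users, diff it against the previous snapshot, and sound the bell for any newcomer. The usernames and polling mechanism here are illustrative.

```python
# Sketch of Cliff's beeping terminal: snapshot who is logged in,
# diff against the previous snapshot, and flag newcomers. In the
# real version the snapshots came from the lab's login records.

def new_logins(previous, current):
    """Return the users present now who weren't in the last snapshot."""
    return set(current) - set(previous)

before = {"cliff", "dave"}
after = {"cliff", "dave", "sventek"}
assert new_logins(before, after) == {"sventek"}

# The watching loop would poll every few seconds and ring the bell:
#   for user in new_logins(before, after):
#       print(f"\a{user} just logged in")   # "\a" rings the terminal bell
```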

He watched as his colleagues upstairs logged in, working on their research papers and dissertations. Most of the pings were from strangers, though, accessing their documents, processing data, all the usual stuff.

Then, at 12:33 P.M…

Cliff hurried to rein in his catch — to see what the hacker was doing, where they were coming from. But in less than a minute, Sventek logged back off. It was almost as if he knew someone might be watching. 

He’d left just one trace behind: the port he’d used to connect into the system, number 23. Something to go on. Barely.

A proto-keylogger

Because it was the ’80s, LBL’s computers communicated remotely with one another via telephone lines. Unlike modern network cables, each phone line could carry only one connection at a time. So, in order to meet the needs of its community, Lawrence Berkeley ran 50 of them at once. Over 500 terminals had connected to those 50 lines over time, and Cliff’s first step in tracing Sventek was to figure out whether he was coming from one of the lines in the building, or one of those that led outside of the lab, to the wider world.

Cliff recalled the sight when his colleague, Paul, a hardware technician, revealed the network of lines running under the floorboards of the facility. Quote:

“In this roomful of wires, the telephones, intercoms, radios, and computers were all interconnected by a tangle of cables, wires, optical fibers, and patch panels. The suspicious port [23] entered this room and a secondary computer switched it to one of a thousand possible terminals.”

To determine whether the user connecting to this labyrinthine system was coming from within the lab, or using one of the dial-in lines from the outside, would be damn near impossible. “The next time I saw a suspicious character,” Cliff explained, “I’d have to run over to the switchyard and unwind the connection by probing the switching computer.” And if Sventek were only logging in for a minute at a time, it’d be no use.

One option to solve this problem was to write a program that could record network traffic. But if a hacker were smart enough to understand and hack into their systems, the admins worried, perhaps they were also perceptive enough to pick up on something like that. Indeed, later on, this assumption would prove correct.

Paul, the hardware technician, had a different idea. Rather than identify when Sventek logged in, rush to the control center, and reverse engineer which phone line it was coming in on, they could monitor each connection, all at once, all the time, without a computer program. Quote:

“About the only other place to watch our traffic was in between the modems and the computers. The modems converted the tones of a telephone into electronic pulses, palatable to our computers and the daemons in their operating systems. [. . .] A printer or personal computer could be wired to each of these lines, recording every keystroke that came through.”

A proto-keylogger. Except, where would they find a printer and computer for every single one of those lines?

Cliff could use the terminal from his office; that’s one. He went to a few of his colleagues and asked to borrow theirs. Still far from what he needed, he had only one option.

Around 5 P.M. the lab began to clear out, as people headed home for the evening. Cliff stayed behind. 5:00 turned into 5:30, then 6:00. As soon as just about everybody was gone, “I walked from office to office,” Cliff wrote, “liberating personal computers from secretaries’ desks. There’d be hell to pay on Monday, but it’s easier to give an apology than get permission.”

It took all night for Cliff to find, disconnect, and carry four dozen stolen computers and printers down to the basement, then repurpose each one to record every bit of information that passed through its own, dedicated inbound connection line.

When he was finished, he unrolled a sleeping bag, took off his shoes, grabbed a thermos of tomato juice, and went to sleep on the floor.

The next morning, Cliff woke up to the sound of confused scolding.

“I understand there’s some equipment missing from around the lab,” said the director of Lawrence Berkeley. Cliff reenacted the exchange in a 2011 interview with AT&T.

Blinking his eyes open, lying on the floor, he looked up at the man towering above him. “I don’t know anything about it,” Cliff replied.

Within arm’s length, 50 computers and printers were still humming along, printing out all the activity through every line into the network. “Well,” the director said, “it would be a right, neighborly thing if they were, sort of, returned?”

A Record of the Hack

Half awake, Cliff walked each of the computers and printers back to their rightful owners. Among all the printouts were 49 useless records of what’d happened on the network that evening. And one roll of paper that stood out from the rest for being, according to Cliff’s book, around 80 feet long. In the postmortem, Cliff explained, quote: 

“These printouts proved essential to our understanding of events; we had records of his every keystroke, giving his targets, keywords, chosen passwords, and methodologies. The recording was complete in that virtually all of these sessions were captured, either by printer or on the floppy disk of a nearby computer.”

Cliff now possessed a precise, detail-by-detail account of how a hacker had completely compromised the laboratory computer system, and what damage they’d done with that power.

Whenever a physicist at Lawrence Berkeley emailed a file to a colleague, they did so using a program called “GNU Emacs.”

Emacs was a text editor on steroids. Its manual described it as, quote, “the extensible, customizable, self-documenting, real-time display editor.” A highly customizable platform, it let users do just about anything, like combining its roughly 10,000 commands into macros to automate certain tasks, or developing new plug-ins to extend the program’s functionality.

Among Emacs’ many functions was the ability to send files inside the network. The feature was very basic, though: if you wanted to send somebody a file, all you did was rename it to belong to them, whether they wanted it or not.

So say, for example, that I ate shawarma with fries for lunch. There’s no way I’m getting up from my chair for at least an hour after that, especially if I had hummus on it. But I have some information I need to send to my colleague, Dani, who’s a couple doors down in our offices. I can simply create a file in the Emacs system, rename it to belong to Dani, and now it’s in his account. That’s it.

What the technicians at LBL didn’t realize was that, in addition to sending files peer-to-peer, Emacs also allowed a user to send a file to the systems area of the Unix system it was running on – what we’d call the ‘admin area’ nowadays.

This was unusual. We’re talking about the lowest, most powerful and, therefore, most sensitive layer of a computer. Only the most privileged administrator should be allowed inside, in theory — no other users, and no random software programs. Emacs, however, used the “Set-User-ID-to-Root” function in Unix to stick one foot in that otherwise protected space.

Knowing this, the hacker wrote a shell script and assigned it to the systems area of the LBL network. They named the file “atrun.”

Atrun was the name of an existing, default program that ran once every five minutes on Unix computers. It automatically performed mundane tasks that users didn’t have to think about, from the root level of the machine.

The hacker sent their atrun into the root of the LBL network through Emacs. At the next five minute mark, the system ran it, thinking it was its own, legitimate program. This malicious atrun told the system to grant its creator the powers of a superuser in the network.

In other words, the hacker didn’t merely gain access to the network. He was now, in effect, running the network.
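As a toy model of why this worked, consider a privileged scheduler that blindly executes whatever file sits at a well-known name. Everything below is a simplified, hypothetical stand-in, not the real Unix atrun or Emacs internals:

```python
# Toy model of the atrun trick: a privileged scheduler runs whatever
# script is stored under a well-known name, so write access to that
# name amounts to root access. All names here are stand-ins.

superusers = {"root"}
systems_area = {}  # the protected systems directory, modeled as a dict

def emacs_move_file(name, script):
    """Emacs' Set-User-ID-to-Root file move: an ordinary user drops
    a file into the protected systems area."""
    systems_area[name] = script

def scheduler_tick():
    """Every five minutes, run whatever is named 'atrun', with root
    authority and no questions asked."""
    script = systems_area.get("atrun")
    if script:
        script(privileged=True)

def payload(privileged):
    """The hacker's script: when run privileged, grant its creator
    superuser powers."""
    if privileged:
        superusers.add("sventek")

emacs_move_file("atrun", payload)  # sent through the Emacs hole
scheduler_tick()                   # five minutes later...
assert "sventek" in superusers     # the hacker now runs the network
```

The flaw wasn’t in the scheduler, which behaved exactly as designed; it was that Emacs gave an unprivileged user a way to write into the one place the scheduler trusted.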

“This much is for sure,” Cliff thought, “I was now dedicated to catching this hacker.” 

“Him Against Me”

To help Cliff track the hacker’s movements without needing 50 printers, one of the lab technicians devised a logic analyzer — a computer they could latch onto the network to periodically track activity, without leaving a digital imprint.

Still, overseeing the analyzer meant that Cliff had to stay glued to his computer terminal, 24/7, waiting for his adversary to show up for a minute at a time. He spent day after day sleeping on the floor of his office, and Sventek didn’t even show. “If only my computer would call me whenever the hacker appeared,” he thought, “then the rest of the time would be my own.”

Finally, he had an idea. He headed down to a hardware store to get a pager (a pager, if you’re under 30, is how cavemen like myself used to text before the Stone Age). Back then you could rent one at a cost of $20 a month. He wrote a program to ping the pager if one of the hacker’s dummy accounts was picked up by the logic analyzer. Further, it would send the alert in Morse code, so that Cliff, even if he was miles away, would know exactly which account the hacker was using and which telephone line he came in on.
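Encoding an account name into pager beeps is a small exercise; here is a sketch of the kind of Morse conversion Cliff’s alert would have performed (the table is trimmed to just the letters this example needs):

```python
# Sketch of the Morse conversion behind Cliff's pager alerts: each
# letter of the hijacked account's name becomes dots and dashes.
# Table trimmed to just the letters needed for this example.

MORSE = {
    "s": "...", "v": "...-", "e": ".", "n": "-.", "t": "-", "k": "-.-",
}

def to_morse(name):
    """Encode a username as Morse code, letters separated by spaces."""
    return " ".join(MORSE[c] for c in name.lower())

assert to_morse("sventek") == "... ...- . -. - . -.-"
```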

Any time the hacker even touched his network — for any move he made, at all — Cliff would know it within seconds.

“It was him against me now. For real.”

One day, just after 5 PM, Dave visited Cliff’s office. He had noticed one very minor detail — a specific command the hacker had used: ‘ps -eafg.’

This was a bigger deal than was immediately obvious. ‘ps’ just means ‘print status,’ and each of the other letters – commonly called ‘flags’ – modifies it in one way or another. The ‘e’, ‘a’ and ‘g’ flags weren’t so interesting – but the ‘f’ flag was, as Dave explained, quote,

“not in any Berkeley Unix. It’s the AT&T Unix way to list each process’ files. Berkeley Unix does this automatically, and doesn’t need the f flag.” 

It’s like that scene in Inglourious Basterds, where Michael Fassbender holds up his three middle fingers when ordering a round for the table, instead of his thumb, index and middle finger, in the German way. The hacker used an old-fashioned command still used elsewhere, like on the East Coast, but not in or around where they were. And so –

“From a single letter, Dave ruled out the entire computing population of the West Coast. Conceivably, a Berkeley hacker might use an old-fashioned command, but Dave discounted this.” 

As they tracked Sventek’s behavior further, in the days and weeks that followed, they realized that anything they’d seen the hacker do thus far was just one small part of a far bigger plot.

Take one case, where Cliff watched the hacker take advantage of Unix, capturing the files that stored user passwords in the system. They were encrypted, but publicly readable. Quote:

“We observed him downloading encrypted password files from compromised systems into his own computer. Within a week he reconnected to the same computers, logging into new accounts with correct passwords. The passwords he guessed were English words, common names, or place-names. We realized that he was decrypting password files on his local computer by successively encrypting dictionary words and comparing the results to password file entries.”
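The technique the quote describes is a classic offline dictionary attack. A minimal sketch, with SHA-256 standing in for the Unix crypt(3) of the era (which, unlike this sketch, salted each entry), and with made-up accounts and passwords:

```python
# Offline dictionary attack as described above: hash every candidate
# word and compare against entries lifted from a stolen password file.
# SHA-256 stands in for 1980s Unix crypt(3); data is made up.
import hashlib

def hash_pw(word):
    return hashlib.sha256(word.encode()).hexdigest()

def crack(stolen_entries, wordlist):
    """Map each user to a recovered password wherever the dictionary hits."""
    lookup = {hash_pw(w): w for w in wordlist}
    return {user: lookup[h]
            for user, h in stolen_entries.items() if h in lookup}

stolen = {"hunter": hash_pw("benson"), "sventek": hash_pw("hedges")}
words = ["aardvark", "benson", "berkeley", "hedges"]
assert crack(stolen, words) == {"hunter": "benson", "sventek": "hedges"}
```

The whole attack runs on the hacker’s own machine: since the stolen files were publicly readable, nothing about the guessing ever touches the victim’s network, which is why it left no trace.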

The hacker repeated this behavior at organizations beyond Lawrence Berkeley too, with dummy accounts just like Hunter and Sventek. He wasn’t dropping malware; he was raising an army of sleeper cells — entry points into networks around the country. Quote:

“On one obscure gateway computer, [the attacker] created an account with system privileges that remained untouched until six months later, when he began using it to enter other networked computers. On another occasion, he created several programs that gave him system-manager privileges and hid them in system software libraries. Returning almost a year later, he used the programs to become system-manager, even though the original operating-system hole had been patched in the meantime.”

This wasn’t merely a cyber attack against Lawrence Berkeley, and maybe another lab on the other side of the country. 

The reality of what was really going on revealed itself at 7:51 AM on a Wednesday in September. The hacker logged into the Lawrence Berkeley network, this time for six minutes. Cliff wasn’t at work yet, but a printer recorded the details.

The hacker entered as Sventek, then used LBL to connect to a distant IP address located in Alabama. He logged into that computer with the username “Hunter,” and checked for an instance of GNU Emacs.

That IP address just happened to belong to a U.S. army base. Cliff recalled in the PBS documentary how, quote, 

“once he got into this army computer, I could see him searching their database, looking for military information, looking for stuff about their missile plans. Weird stuff was happening here.”

Cliff called up the base to inform them of their breach, but didn’t get the response he’d expected. “Hunter,” it turned out, had already cracked their system multiple times before.

The longer Cliff and his colleagues watched Hunter-Sventek, the more they saw him use Lawrence Berkeley merely to get to the MilNet — America’s network for unclassified military communications. Around 450 computers in all, from the Army to the Navy, the Air Force and the Pentagon, the FBI and the CIA.

It was clear by now: this wasn’t just a hack of Lawrence Berkeley, and maybe another lab in Maryland. This was a coordinated campaign against the United States of America.