
Bob Supnik: A pioneer in the computer industry who (almost) eschews high tech



Bob Supnik, who admitted that he has a penchant for old technologies and preserving things that “work,” hooked up his patchwork computer via Ethernet for this interview. Bob, who spent over 50 years working in the computer industry and who was instrumental in the development of the semiconductor chips that now power laptops, smartphones, cars, vacuum cleaners and more, owns little modern technology himself. He has no smart devices, rarely uses the WiFi in his own home, and said that “when it comes to modern technology, I am a bit of a Luddite.” Bob kindly answered some questions about his career, what he has been doing since he retired in 2015, and his thoughts on the future of computing.


How did you get started in the computer industry?

I started doing programming basically straight out of high school because, when my elder sister’s suitor would come visit, he had to entertain me, the baby brother. The way he would do that was by teaching me about computers. This was in the late 1950s and early ’60s when computers were pretty rare and very exotic.


I worked as an intern during the summers while in high school and continued to do that throughout college and graduate school. I thought I was going to be a historian but, in 1972, when I was about halfway through my Ph.D. program, I came to the realization that I enjoyed my computer work much more than I enjoyed history. In fact, the history degree seemed to be training me for a career at McDonald’s more than anything else. So, history became my hobby and computers became my career.


Did you need to retrain to change course?

No. When I started in computers in 1964, computer science was not a “thing.” Everyone who came into computers and started programming came from some other background, whether physics, mathematics, the liberal arts, or even history. We didn’t know it was a science. At that time, we thought programming was maybe an art, maybe a craft, but it was the kind of thing that you could pick up if you had the bent for it.


When did you begin working at Digital Equipment Corporation (DEC)?

In 1977 I got an interview with DEC for a software position. I was in the mill building in Maynard looking for the DEC offices. There was a maze of passages, and somehow I ended up in the basement, completely lost. But while I was searching for the office, I got hijacked by a friend working in the same building who suggested that I interview for a position with the hardware group instead. I agreed, and got the job, probably because I had some project management experience. I knew nothing about hardware, but that was the bargain—I would bring software knowledge and project management experience and they would overlook the fact that I wasn’t an electrical engineer.


After about a year working with them, the hardware group moved out to Colorado Springs as DEC began to expand outside of Massachusetts, so I joined a then-tiny group working on semiconductor chips. There were maybe 10 or 12 people who were trying to figure out where they should go next. I had worked with a few of them previously, and they knew that I had expertise in systems analysis, so they said, “Come do analysis and we will teach you about semiconductors.” It was the same sort of bargain situation I had with the hardware group—I provided something I knew in return for a learning experience.


After about six months, I laid out a course that included a massive expansion for the semiconductor division. I believed that semiconductor chips were going to be the future of computing, not the massive machines that people were building at the time. DEC liked my ideas and asked me to choose one of the projects from my report that was most interesting to me and run with it. I chose to lead the chip development project in late 1979, which was a marvelous opportunity to learn again while bringing my own expertise in real-time programming, microcode software programming and project management to the team.


Did the semiconductor industry take off as you predicted?

Two years later, DEC pulled together all of the groups working on chip technology to form a real semiconductor group. They built a dedicated plant in Hudson and were looking at how to parcel out management duties, since the company was growing by leaps and bounds. As the only person whose resume showed relevant experience, I was asked to run the advanced development group, a group that would look out further into the future for new technologies. There were 80 to 90 people in the group. Remember, though, that I was only 34 years old at the time and had no formal credentials in any of those fields.


Ruth Rawa was the woman who ran the core technology group—those who developed the actual manufacturing process. She was a materials science engineer, which was unheard of for a woman at the time, and was one of very few women working in the semiconductor industry. She was also about 15 years older than me, with decades more experience. When I went in to introduce myself as her manager, she looked me up and down and said, “You know, sonny, when I started in this business, you were in kindergarten.” Despite that dubious introduction, we hit it off tremendously. I respected her expertise, and she could see that I was very good at clearing obstacles in the path of what she wanted to do. It was the beginning of my understanding that, if I was going to be in a leadership role, the real essence of leadership was actually a service function—it was helping people get what they wanted.


What was your next innovation in the semiconductor industry?

One day I got the idea of trying to take DEC’s premier, industry-leading minicomputer, the VAX, and put it all on one chip. It was a revolutionary idea at the time. I had the good fortune to work with Dan Dobberpuhl, one of the most brilliant engineering minds in the industry, ever. Together we figured out we could pull it off with the technology available if a whole host of things went our way. DEC got on board and we started the project in July of 1982. Twenty-one months later we had the first VAX on a chip.


The project was a tremendous technical and financial success for DEC. This new chip allowed companies to replace enormous machines that were five feet high and 18 feet wide with ones that were only three feet high by 5.25 feet wide. Machines that previously cost $200K to build now cost $18K. I realized then that this trend wasn’t going to stop any time soon, and that more and more powerful chips were going to wipe out everything else in the computer industry—minicomputers, mainframes, and eventually even supercomputers. I then laid out a multi-year plan for DEC to build faster and faster chips.


You mentioned that by the end of the 1980s the VAX computer line was not selling all that well; DEC had some financial troubles, and it was time for the company to come up with something new. How did you respond?

I was asked to put together a team to develop the next generation of computers that DEC would roll out. I decided to go for much higher performance levels using 64-bit computing. Back when we started developing the technology in 1988, the idea that you would ever need more memory than a 32-bit computer could hold seemed absurd. But we could see how chips were getting faster and faster, with more stuff on them, and we were able to plot out a long-range plan that indicated that sometime in the mid-1990s, big memory would start becoming very practical and very useful.


I led a tiny team of six people in developing DEC’s Alpha program (64-bit computing architecture) that launched in November of 1992. The rollout included a new chip, four systems, 130 software products and a world-wide marketing campaign. It was quite complicated, and the result was by far the fastest machine anyone had ever built from a single chip. In some ways the launch really seemed to me like the end of older technologies. As part of the rollout, we decommissioned a brilliant Cray-2 supercomputer that was worth $10-15M in its day and replaced it with basically a desktop-sized unit. Ten years after it was introduced, the Cray-2 was essentially scrap metal.


Sometimes it was difficult to get my head around how fast things moved. Neither my father—nor the president of DEC at the time—really understood how fast and revolutionary the silicon changes were going to be.


What other types of projects did you work on at DEC?

In 1996 I was asked to run DEC’s research group, which was tasked with developing products that were way over the horizon. A couple of the products had long-term influence on the industry, but unfortunately not long-term financial success for DEC. One of those products was AltaVista, the first Internet search engine with a simple user interface; it was eventually purchased by Yahoo. It was practical at the time because it was powered by machines using the Alpha 64-bit processor, so it could address as much data as was available on the Internet in the mid-1990s.


I worked on developing the first personal music player that had a large amount of storage, the Personal Jukebox, years before Apple came out with the iPod. I also worked on one of the first digital currencies, named Millicent. Millicent was trying to solve a different problem than current digital currencies, which try to provide anonymity of financial transactions; Millicent was intended to provide inexpensive micropayments. The model allowed you to charge pennies or less per transaction. For example, you wouldn’t need to buy a subscription to a full digital newspaper; you could just buy a single article. Most of the people I worked with at DEC on these projects eventually moved on and brought the high-performance design revolution to the rest of the industry—the AltaVista people ended up at Google, and people who worked on Alpha ended up at Apple, Intel and AMD.

What spurred your departure from DEC?

By the late 1990s, DEC couldn’t stay afloat financially because it was wedded to an old-fashioned business model that was high-touch and expensive. The PC industry had already moved to mass distribution, selling over the phone or the Internet, which was much less expensive, and people gravitated to what was cost-effective. Clayton Christensen wrote a pioneering book, The Innovator’s Dilemma, that basically describes how the low end of any market will eat the rest of it from the bottom up, and how that low-end market will always have an inherent cost advantage. DEC never understood that concept and eventually got swallowed up by Compaq, and Compaq by HP.

In 1999 I left DEC to try some other things. I worked on three startups over the next decade, none of which were particularly successful, I must confess, but they were all a lot of fun. During the recession in 2009, I took a position at Unisys because they had an interesting problem. The company was still running big old mainframes and wanted to move to high performance industry standard hardware, but they didn’t know how to do it. I was head of a huge worldwide engineering group of over 1,100 people in six locations that helped guide the company off of old hardware onto the new platforms with seamless transitions for their customers.

You eventually retired in 2015, but it sounds like retirement has been nearly as busy as working. What kinds of projects have you been working on?

When my wife, Martha, and I celebrated our 50th anniversary in July of 2014, I decided I had had enough, and officially retired in March of 2015. It didn’t take long to figure out my next gig, however. Martha is the Library Manager for the New England Quilt Museum in Lowell. When she went to work there in the 1990s, they had no computers or computer strategy. They needed computers to create a card catalog, so I was “volunteered” to be the IT coordinator. I had very little experience with personal computers and didn’t even own one at that point. But the museum was poor as a church mouse, so I became really good at finding old machines, refurbishing them and extending their lives. The museum finally received a grant in 2018 to purchase new equipment, but up to that time had spent less than $10K on computers over 20 years. That’s really hard to do.


How are you using your expertise to help others, including Carlisle senior citizens?

When I retired, I decided that I would also volunteer as a computer expert for the Council on Aging (COA). Technology can be very baffling for some seniors, and I make house calls. At the end of a call, I always ask whether there is any electronic trash they would like me to take away. I make sure all data is erased, and if the machine still works, I try to fix it up. My primary concern at the moment is keeping older computers out of landfills; one of the reasons for doing this work is to try to reduce the solid waste stream in Carlisle.


I began trying to refurbish computers that people were discarding, starting with my own family. Our two sons, one daughter-in-law and a nephew all work in the computer industry, and they treat themselves to good technology. If I could refurbish a machine, I would try to place it informally with a nonprofit or family that needed it. Over the years I have worked with a variety of organizations where old but gently used computers could be placed, including COAs in local towns, the Lexington Refugee Assistance Program and the International Institute of New England in Lowell.


Since I am just a novice fixer-upper of Apple equipment, I am lucky that my oldest son is an expert resource for Macs. My biggest resource when I have a question is actually Google. If someone has a problem, it’s likely that someone else has had the same problem. If you know how to frame the query to Google, it comes back with the right answer very precisely almost every time.


Where do you get spare parts?

I get parts from pretty much everywhere. When DEC closed, the equipment manager offered me as much excess equipment as I wanted. People in Carlisle have also been extraordinarily helpful and generous. Reed Savory put the word out with his financial services friends. Cynthia Sorn canvassed her network. There have been articles in the Concord Journal and Acton Patch. Even Carlisle Schools have pitched in with equipment that is at the end of its service life.

COVID put a real dent in the uptake of the machines since nonprofits closed, but I am hoping to reestablish contact once they reopen so I can place a large number of them. At the moment, I am pretty much overwhelmed, with about 40 machines in our basement.

How did you get involved with Japanese animé?

When Carlisle finally got broadband and cable TV in the early 2000s, I stumbled upon an annual animé festival on the SYFY Channel and started to watch some of the great old Japanese sci-fi cartoons. Around 2005-06, I discovered that there were volunteers who were putting English subtitles on those cartoons. I made the “mistake” of offering criticism of their captions, and they said, “If you think you can do better, come join us.” Soon I was a team leader of a group of internet volunteers who put English subtitles on Japanese animé cartoons. I don’t speak Japanese, but as you recall, I was a historian and liberal arts major, so I know how to write and edit English, and I became the editor.

What are your thoughts on the future of modern computing?

We’re beginning to hit the physical limits of what semiconductor technology can offer in the way of faster computers. We started hitting that limit nearly a decade ago, as chips were becoming too hot and consuming too much power, with multiple processors on a single chip. [Bob said that my iPhone 11 has more computing power than what was available to the federal government in the year 2000. “That phone is more powerful than the fastest Cray computer ever built.”]

What is not leveling off is the ability to marshal more and more computers together to work on a single problem. Surprisingly, many problems in computer science that were thought to be impossible turned out not to require inventing a new way of looking at the problem, but rather throwing the right amount of computing power at it. Speech recognition was daunting from the mid-1970s to 2010. It never worked well; you had to train devices and they still made a lot of errors. Now that we have supercomputers in our pockets, they do the job very well.

The same is true of artificial intelligence. I was a real skeptic. I used to say that AI was the technology of the future and always will be. But the problem wasn’t the algorithms—it turned out that we needed significantly more computing power than what was available when the algorithms were invented. Now the same algorithms are being used on massively powerful machines and they work. The trend now is towards massive-scale computing, like the Google search engine but directed to other kinds of problems. AI applications for our personal use through phones and laptops will improve, and computers will deal with us on our terms rather than theirs, using speech, gestures and visual cues instead of keyboards and mice.

Computers are so pervasive in our daily life already. You can’t count the number of computers in your house because you don’t even know where they are. Today’s cars are basically computer chips wrapped in sheet metal, so the current shortage of chips is impacting car production and driving up prices. Tesla is the most obvious example.

One thing to remember is that technology has no intent. Humans supply the intent. I’m not a pessimist, but I am also not one who thinks computers will bring us utopia. In the future, I would like to see some serious effort to marry technology development with an understanding of the moral and ethical side of human life. The divorce between liberal arts and the STEM world is so complete these days, it doesn’t seem to enter into people’s consciousness. Unless it does, things can get pretty dicey with computers.

Oral history project

For more detailed information on Bob’s career, the Computer History Museum provides an online oral history including six hours of pre-pandemic personal interviews. Bob said the museum was an important resource for him over the years because it was the central repository for documentation from many computer manufacturers. As a contributor, he has donated his complete engineering archives including 20 boxes of notebooks, emails and other artifacts.

If you are a Carlisle senior and need some help with your computer, contact the COA and they will put you in touch with Bob. He is pleased to be making house calls once again.


This article was published in The Carlisle Mosquito on May 12, 2021.
