An Evening With Steve Wozniak

Ed Wenck | Apr 18, 2019

Steve Wozniak’s shoes are connected.

Wozniak, appearing on stage at Purdue University’s Elliott Hall on the evening of April 17, revealed that his Nikes are controlled by an app that ties and tightens his sneakers to the perfect pressure. It’s a neat trick, but it’s also worrisome for Woz. The inventor of the Apple computer noted: “Can you imagine if some bigger company that made my shoes finds out I haven’t paid a bill someplace?”
 
“I wouldn’t be able to take my shoes off at night.”
 
Woz is a fan of local control: “Give me the switch, the manual lock – I need a backup in case something fails.”
 
“If there’s a system failure someplace, or I owe money to my security company, how am I supposed to get into my home if my powered, connected door doesn’t have an actual key?” 


The most terrifying example of a lack of this kind of control is a recent one. “It looks like the 787 MAX pilots that lost control of their aircraft couldn’t overcome bad software in the planes.”

The Singularity? Not so fast

Wozniak, working in an interview format with questions provided by Mung Chiang, Purdue’s John A. Edwardson Dean of the College of Engineering, had props with him: two very early Apple machines, including the iconic Apple II. That machine, early on, was a boon for video game designers. “Before we came along,” said Woz, “games were completely constructed out of hardware – hard-wired chips to create colors and patterns on a screen.” Because game design was software-based with the Apple II, the process of prototyping video games dropped from six months to six hours.
 
That intimate knowledge of machine development, coupled with Woz’s front-row-spectator vantage point as Moore’s Law proved itself over and over again well into the present decade, has led Woz to some conclusions that might seem counter-intuitive.
 
Example one: Wozniak doesn’t believe that autonomous vehicles are right around the corner. “There are so many road conditions that pop up when you’re behind the wheel that I just don’t see the technology developing that can handle all of it. I just don’t think we’ll see that ‘Level 5’ [the completely driverless experience] soon – certainly not in my lifetime.”
 
Example two – and this is a biggie – Woz doesn’t believe in the notion of the singularity anymore. He once did, admittedly, but now he has doubts about machines achieving human self-awareness, human emotion, and the ability to mimic true human interactions. “A baby sees a dog. The human can gain the awareness of the idea of ‘dog’ really rapidly. Google sees pixels.”
 
Beyond that, “Machines never think: ‘Is this what I should do? Will this be good, will this be ethical?’ I haven’t seen any true examples of that. The bottom line is simply this: We don’t know how the brain is wired, and until we can figure that out, we won’t get there.”


“Where do we stop?”

As the session neared its end, Woz took questions from the crowd. One student was curious about the notion of technology overload when it came to trying to put the processing genie back in the bottle: “When it comes to technology, where do we stop?” (The question paraphrases the famous line uttered by Jeff Goldblum in “Jurassic Park”: “They were so busy figuring out if they could, they never stopped to think if they should.”)
 
Woz’s answer was a bit startling, if utterly pragmatic: “We don’t have a say.
 
“People will continue to build and tinker and improve or re-work whatever the last person was trying to create. It just happens.
 
“I spent my career trying to do good work, trying to make products that regular people found easy to use.”
 
And then Woz uttered a sentence that could easily double as any CEDIA member’s mission statement: 


"The user is more important than the technology.”