A Life on the Digital Screen

One day in 1977, a mysterious package arrived at my school.  It turned out to be a brand-new sort of machine people were calling a “microcomputer” — this one a thirty-pound, sheet-metal-encased, black-and-white behemoth named the “PET 2001,” with less computing power than you would find today in a digital watch.  It was unpacked and unceremoniously dumped on a little desk in a corner of the counselor’s office.  And there it sat.  After a couple of weeks of peering curiously at it every time I walked by in the hall, I stopped and asked Mr. Woods what it did.  He said he didn’t know, but that I was welcome to learn.  So he gave me a stack of manuals, and I began spending my free time trying to figure it out.  I was already fascinated with technology, but this was the first time I had experienced a real computer face-to-screen, as it were.

Not long after, I found that the local vocational-technical school was the proud owner of an IBM System/360 Model 30 mainframe, and I arranged to take classes.  It was the end of an era in computing: clean rooms, card sorters, and tape racks would all soon disappear into history.  But that’s where I got my start with computers: as a kid on the cusp of the personal computer revolution that would usher out the age of “big iron.”

Computer hobbyists were also starting to talk about something the Defense Department had cooked up called the “ARPANET” — a fast-growing nationwide network of computers that was still largely the experimental province of computer scientists and select government-funded researchers.  By the time I got my first personal computer (an Apple ][+) in 1979, computers were clearly advancing — mine had 48K of RAM, a floppy disk drive, six-color high-resolution graphics, and a 300 baud acoustic coupler modem.  Bulletin Board Systems and other dial-up host services were forming the backbone of a vibrant and expanding underground geek community; homemade newsletters and a few glossy magazines catering to these hobbyists were proliferating; and computer clubs (patterned after early groups such as the Berkeley “Homebrew Computer Club” and the “Apple Pugetsound Program Library Exchange”) were popping up everywhere, bringing local enthusiasts together in person to swap software, share news and information, and show off our latest “hacks.”  (The term “hacker” only later took on a negative connotation; it originally meant someone who had a clever or innovative solution to a technical problem.)

By today’s standards, you couldn’t do much that was useful with those first PCs (they weren’t even yet called “PCs”), but they were mesmerizing artifacts of arcane technology that demanded a huge time commitment.  Everyone involved had the vague sense that things were changing, but no one could have predicted what actually happened.  In those early days, microcomputers came with complete copies of their schematics and technical specifications, and just about everyone who owned one immediately voided the warranty by taking it apart, reconfiguring it, and modifying it.  By the early eighties, I was one of those kids hacking into the ARPANET (which would soon evolve into the “Internet”), spending late nights programming, and tearing down and rebuilding the hardware of my increasingly customized stable of jury-rigged computers.  Other regional networks were being established (such as the “Cleveland Free-Net”), and eventually most of them would be incorporated into what we now know as the Internet (the “network of networks”).

I originally thought that I would have a career in the natural sciences.  But as I progressed through my education, I found people too fascinating to resist studying.  I sojourned through many of the social sciences and finally took my Ph.D. in Sociology with a collateral field in Psychology.  Through it all, I never lost my fascination with technology: I kept programming, building websites and apps, and using digital technology for everything from data science to video production.

I’ve lived through the transition from the age of “Big Iron” to our modern era of digital ubiquity, digital convergence, smartphones, and now the “Internet of Things.”  It’s been an exciting journey: from command lines to GUIs, from dial-up modems over Ma Bell’s aging copper wires to the wireless networking of everything, from punch cards and magnetic tape to the cloud.

In the past few years, I’ve been able to merge my lifelong hacker-geek mentality with my academic training as a social and behavioral scientist and research methodologist into the emerging role of “data scientist.”  I sort of feel like the world has finally caught up with what I’ve been doing all along.

As a technologist, data scientist, and social scientist, I help businesses integrate technology into their organizational processes.  If you want to find out more about how I can help your enterprise streamline its organizational and decision-making processes using the latest technology, then contact me now!

And, of course, sign up for my email list to keep up with the latest information about making better decisions, living better lives, and building better enterprises.