Machine intelligence: Scary? Necessary.

We owe a lot of rich drama and philosophy to the concept of machine intelligence. From the pure horror (or, uh, campy comedy?) robot overlords inspire, through Asimov’s deep probing of the nature of humanity, to the idea of the Singularity – rephrased as “Transcendence” in the upcoming movie of that name.

But what does an intelligent machine necessarily “transcend” (other than “the threshold of intelligence” – which is tautological)? Mechanisms – profoundly complex mechanisms, to be sure, that elude our current understanding – underlie human intelligence. Indeed, the human brain is routinely likened to a wonderful machine. Why should a machine that succeeds in performing a feat we normally associate with intelligence – such as consuming data from a wide variety of sources, “remembering” (storing) that data in a form it can readily access and correlate, and using what it has “learned” to arrive at a likely answer to a question of fact – not be characterized as intelligent?
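To make that capability set concrete, here's a toy sketch in Python of such an “information machine”: it consumes facts from multiple sources, “remembers” them in a form it can readily access and correlate, and uses what it has “learned” to answer a question of fact. Every name and data point here is a hypothetical illustration, not any particular system's design.

```python
# A toy "information machine": consume, remember, correlate, answer.
# All class names and sample facts are hypothetical illustrations.

from collections import defaultdict

class InformationMachine:
    def __init__(self):
        # "Remembered" facts, keyed by subject so they can be correlated.
        self.memory = defaultdict(dict)

    def consume(self, source):
        """Ingest (subject, attribute, value) triples from any source."""
        for subject, attribute, value in source:
            self.memory[subject][attribute] = value

    def answer(self, subject, attribute):
        """Use what was 'learned' to answer a question of fact."""
        return self.memory.get(subject, {}).get(attribute)

machine = InformationMachine()
machine.consume([("Jupiter", "type", "gas giant"), ("Mars", "moons", 2)])
machine.consume([("Jupiter", "moons", 95)])  # a second source, correlated by subject

print(machine.answer("Jupiter", "type"))  # prints: gas giant
```

Crude, yes – but it performs, in miniature, exactly the feat described above; the argument is over whether scaling that feat up counts as intelligence.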

I submit that many “machines” (human-designed systems) display intelligence today. To be sure, it is a far less flexible and remarkable intelligence than you, dear human, have displayed in arriving at this blog post and (I hope) making sense of it, in the context of your own background, interests, and motivations. It is also far less mysterious. I believe it is merely the fact that we do not (yet) understand how we are so intelligent that makes most of us believe – nay, insist – that we are far more intelligent than anything we design shall ever be.

Whatever you believe, as an information technology professional I trust you will agree that the “smarter” the system we can create for our client, the better. Let’s further agree that the output of a “smart” system is:

  • clear,
  • accurate,
  • timely,
  • consistent,
  • trusted, and
  • pertinent; and therefore
  • more valuable than the input.

That sets the bar nice and low for “machine intelligence” – or does it?

In future posts, I’ll share some of the best practices I’ve discovered for creating and maintaining intelligent “information machines”. Meanwhile, if this post inspires or provokes you, I’m sure you’ll let me know about it!

Raison d’être

Hello, World! 😉

My name is Sean Konkin, and I am an information technology professional (read: IT Guy). Through 25 years of trials and tribulations (read: tests, defects and rework) I have earned the right to call myself an architect.

I was immersed in the culture of oil & gas business systems for a long time – most of the last 20 years, in fact – and have recently resurfaced. I was laid off from my job as an IT Architect for a large energy company, and am taking this opportunity to look all around me, instead of straight ahead along the path of corporate strategy. The information technology landscape has changed a lot, from the viewpoint of the professional, since the last time I found myself at this sort of crossroads!

For one thing, it’s all around us. We (as workers and consumers) no longer walk to a single room in our houses or allocate a single compartment in our minds to the means of “going online”. We live online – unless we consciously and effortfully opt out. We keep a growing portion of ourselves in the so-called cloud. The Internet programs us. As professionals, we are expected somehow to stay ahead of the Internet, in terms of the risks and opportunities it presents to our employers and clients. No small feat!

Another huge shift is a function of both the development of IT and my own development as a professional: the meaning of “development” (in a systems, rather than personal or industry, sense). I got my start in the industry just at the point the title “Programmer” was giving way to “Developer” – but before the difference was anything more than aesthetic. I remember the developers’ bullpen in the first company I worked for: a darkened half-floor of cubicles stacked with empty Jolt Cola cans, occupied by jeans-clad Unix programmers (all male, of course). That group of “developers” was completely and intentionally different from the rest of the company and its clients. Their job function, Systems Development (really, programming), was mysterious, requiring a completely different set of skills than that of anyone else at the company – and seemingly a completely different personality. IT Guys were geeks – and proud of it.

Fast-forward 20 years: I’m an “Architect”. I love to program (or “code”; it’s now a verb) – but the task is completely different and much less mysterious than it used to be. How often does a developer write a single program, in a single language, to solve a problem anymore? Proficiency in languages is still important, as is understanding the problem – but more important is the ability to see, communicate, and build a complex solution out of the various components at hand, using available tools. (I realize that’s a value statement, and certainly arguable; I’d love to elucidate and defend it; perhaps that will become part of the raison d’être of this blog.) My main point here is that I now find myself in an environment where programming and other highly technical, specialized skills are secondary, and to a certain extent taken “as read”; the defining essence of IT systems development proficiency is inherent in much more nebulous attributes, such as abstraction, modelling, vision, and synthesis.

I am very curious as to how today’s IT landscape looks from the perspective of other “IT Guys” (inclusive of “Gals”, of course). If you’re another 40- or 50-something veteran, I’d love to commiserate! If you’re just starting out, your perspective is extremely valuable, untrammelled by the baggage we old hackers carry. If, through my posts and your comments (you’ll need to Register, at left), we can better understand and occupy our place as leaders in information technology, I’ll be most gratified.