A-Aye-Aye-Aye
There is nothing like having a front row seat for a megatrend as it starts to roll through and over world culture. It is well established that we are on the cusp of the Fourth Industrial Revolution. The Chairman of the World Economic Forum has said unequivocally that “We stand on the brink of a technological revolution that will fundamentally alter the way we live, work, and relate to one another.” It is generally agreed that the transformation will be unlike anything EVER experienced by mankind. That’s heavy stuff, and this thing will have impacts felt all across the public and private sectors, reaching from academia into almost every industry.
Let’s do a quick historical review of the first three Industrial Revolutions:
First Industrial Revolution – 1784 – This was all about steam and water power, which worked their magic on mechanical production equipment.
Second Industrial Revolution – 1870 – This combined the introduction of electricity with the division of labor, expressed most visibly in the factory assembly line for mass production.
Third Industrial Revolution – 1969 – This was first about electronics and then, very quickly, about information technology used to automate production and to spin out a global communications and dissemination network that connects everyone and everything and facilitates an endless list of activities.
We are said to be living in the Information Age, where technology has allowed us to merge physical, digital and biological realities. I think it’s fair to say that the last 50 years have produced some pretty astounding changes in the way we live and what we can make of our world. Naturally, it has only done so much to change the primary colors of the human race, and the human condition, while it has certainly improved remarkably over that time, is still very recognizable for all its faults and weaknesses. We can move mountains, look into the heavens and converse from anywhere with anyone, and yet we cannot see beyond our noses about how to live peacefully with one another or care for each other’s basic needs. It feels like we need another fifty years without a lot of further change just to catch up with ourselves and learn how to survive as a species without killing ourselves and our environment. But time waits for no man and neither does it pause for us to fix our frailties. It marches on and so, before many of us have fully absorbed the technology of the Third Industrial Revolution, we are being thrust into the Fourth.
There is some debate about whether this newest revolution began in 2011 or when it was first named in 2016. That matters less than what it entails. It is all about Artificial Intelligence (AI) and what are called cyber-physical systems. The less headline-grabbing pieces include robotics, the Internet of Things, autonomous vehicles, 3-D printing, nanotechnology, biotechnology, materials science, energy storage, and quantum computing. Each one of those is a revolution unto itself, and now we are getting all of them rushing forward together at the same time. One of the things we know is that the pace of change has accelerated because the technological improvements are themselves digitally based and thus enjoy the benefits of multiplier effects like the Network Effect, Moore’s Law, and all those other great exponential accelerants.
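To make “exponential accelerant” concrete, here is a minimal back-of-the-envelope sketch, assuming (purely for illustration) Moore’s classic two-year doubling period rather than any precise industry figure:

```python
# Rough sketch of what steady exponential doubling does over time.
# Assumes a two-year doubling period (Moore's classic rule of thumb) --
# an illustrative simplification, not a precise industry figure.

DOUBLING_PERIOD_YEARS = 2

def capability_multiplier(years: float) -> float:
    """Growth factor for a capability that doubles every two years."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

for years in (10, 25, 50):
    print(f"{years} years -> roughly {capability_multiplier(years):,.0f}x")

# Output:
# 10 years -> roughly 32x
# 25 years -> roughly 5,793x
# 50 years -> roughly 33,554,432x
```

That last number is the point: fifty years of steady doubling is a roughly 33-million-fold improvement, which is why a half-century of digital progress feels nothing like a half-century of steam.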
The movie A.I. Artificial Intelligence, with Jude Law, came out in 2001, and as you can tell from the name, the terminology needed to be spelled out and explained to us those 22 years ago because AI was more a concept than a reality. We’ve all heard about it every so often, and it has become a lot like Elon Musk’s vision of life on Mars for us all someday. It didn’t enter our daily lives too much, but then came ChatGPT, which launched in November 2022, and boom! The AI explosion is creating a global blast radius that is doing exactly what the Chairman of the World Economic Forum predicted. It is already transforming the world right before our eyes. My eyes.
I said I have a front row seat. Like any new technology, there are places where it starts, and that usually bears little resemblance to where it will all end. I am notoriously bad at predicting where technology will go (I was a naysayer about the viability of Bluetooth), but I have spent a lot of time at what is called the bleeding edge of it since the early 1980s. I had a Commodore 64, a Sinclair, a Compaq, an early Apple I and II, a very early IBM PC and even the short-lived PCjr. And that doesn’t even count all the PDAs and faux tablets I bought before the Apple iPhone and iPad came along and saved me. On the software front, I could talk about all the shrink-wrapped software I bought over the years before online software, APIs and apps took over. The primary applications we all use include email, word processing, spreadsheets, databases and CAD, and only a few of us have used all of those through their old-world iterations (we had an internal bank email application long before the internet and Gmail). I learned spreadsheets on VisiCalc and SuperCalc until we all graduated to Lotus 1-2-3 and Symphony and then found our way home to Excel. You get the point: I’ve been to the coalface and witnessed the transformations firsthand, and it has served me well. I’m about as old a guy as you will find who used most of these technologies in the workplace. I have literally forgotten more about all this technology than many ever knew.
So imagine my surprise as all the Fourth Industrial Revolution talk ignited just as ChatGPT launched. There are only a few places in the world more tech-focused than San Diego, what with its Qualcomm, military and biotech heritage. We all know that the early adopters of any new technology are the younger generation. What is less clear is whether adoption gets led by the commercial or academic sphere. So what could be more on the front lines than a private, top-ranked graduate business school in San Diego with a student body made up of commercial and military employees and veterans?
When I started the semester with the 48 students in my Law, Policy and Ethics course, I chose to allow AI usage rather than try to forbid it, and asked that students who used it declare as much. Over the semester’s 16 weeks and five essay assignments, I have watched the use of AI grow and learned a great deal about its strengths and weaknesses…at least at this point in its evolution. I am reminded that when I introduced ESG (Environmental, Social, Governance) topics into my syllabus two years ago, I was applauded by the school administration as very forward-thinking. Today, students complain that EVERY course tries to include ESG, and they are tiring of it as a topic. I just had my Department Chair for Management courses applaud my semester-end commentary to students about AI use. I told them that we, the professors, can tell the difference between original and AI content, and that they had better be cautious about how pervasively they try to use it if they want to optimize their grade results. I suspect in another year or two that same Department Chair will find my commentary trite as AI becomes dominant in the academic sphere.
The only thing worse than ignoring the advent of a new technology like AI is to overstate the immediacy and pervasiveness of its impact, and I assure you that both are going on simultaneously. Harkening back to the 1964 World’s Fair, there are still no flying cars, but handheld communicators are way beyond anything we could have imagined then. The same will happen with AI. It will completely transform some things and leave others largely unchanged. Don’t expect me to tell you which will be which. All I can say as I keep testing, using and watching AI evolve is A-Aye-Aye-Aye.