AI Is The Latest Victim Of The Dreadnought Effect

Written by Chris Crouch | 29/04/24 13:18

I love efficiency; there’s a reason my favourite poems are haikus. A wise person once said, ‘Why waste time say lot word when few word do trick?’

In that spirit, I’m going to write a blog. I’ll keep it concise, efficiently wading into the ongoing and ever-contentious AI debate, sailing straight down the middle and firing off broadsides from both sides at both sides… What’s the worst that could happen? For the keen-eyed, I’m also going to attempt to leverage as many maritime references as possible.

Avid strategists and historians will already have spotted where I’m going with this. HMS Dreadnought entered service in 1906 and was considered a very, very big deal in naval warfare. Too big, in fact. The Dreadnought “revolutionised naval warfare” and in doing so (plot twist) crippled the Royal Navy’s competitive edge against its many adversaries. Who could have predicted that when they had the biggest and best?

Because it was superior in just about every way to anything afloat at the time, competing navies merely had to build their own version of the Dreadnought to come out on top in a fight. Unfortunately, not only did this put an end to the Royal Navy’s resource superiority, it also turned the majority of maritime skirmishes into standoffs, changing the way nations fought altogether. The warfare game had changed, and anyone still in contention had to change along with it to remain a player.

Rather like the coming of the Dreadnought

LinkedIn today is awash with people who lionise AI as if it’s our tech Rapture, Doomsday, and the second coming of Tim Berners-Lee all rolled into one. It’s the answer to every problem that does or doesn’t exist. We now know that the Dreadnought was so effective it made itself redundant, and ultimately took part in very few naval battles. The naval warfare landscape had evolved around it, and almost as quickly as it left safe harbour, the massive vessel was retired with nary a scratch on its hull.

The metaphor here (stay with me) is that if you treat these new AI tools as anything other than just that, a tool, you’re probably going to be fooled by the hype first and ultimately left behind too; look at the controversy already stirred up over deepfakes, for example. In just 12 months we’ve catapulted to the likes of OpenAI’s Sora and Microsoft producing photorealistic video from text prompts alone.

It’s not all adulation for AI, either. LinkedIn is positively overflowing with controversial clickbait, rage bait, and hyper-inflated opinions of what AI is, what it isn’t, what it should be, what it will be… you get the picture. AI has now surfed the wave of hype and overshot into territory where it’s easy prey to shoot down.

Many people are quick to overlook what the real strategists are doing, though, and those are usually the ones you want to pay attention to when the landscape is changing. Microsoft has reportedly invested over $10bn in OpenAI (over $100bn including infrastructure initiatives). Nvidia’s new Blackwell chip could propel it beyond even its current scale; riding the AI wave, its market capitalisation has leapt to around $2.2tn, an increase of roughly $1.5tn in the last year alone. In a world of ‘well, actually’ talk, let’s still remember to pay attention to the silent mountains moving behind us. No, OpenAI isn’t going to stop trading next year despite legal battles and IP lawsuits. Yes, AI is here to stay. You can wring your hands and disagree, or you can brace yourself and keep up; the tech world isn’t simply moving, it’s accelerating.

In summary, the aforementioned acceleration really is the white whale in all of this. It’s not what’s out there right now that’s noteworthy; it’s how quickly AI is evolving that matters. Industry velocity is increasing, and the already-moving train we’re all on is only going one way. The real question is: ‘When will the pace of AI transformation normalise, and AI come to be viewed as an unremarkable tool in everybody’s toolbox?’

Better get on with it, though; Quantum Computing is just over the horizon, right?