Can We Ask the Right Question?

Suppose we do manage to create AGI or even super intelligence. Do we even know what to ask it? Will we understand its reply?

We often talk about AI taking over the world, with dystopian futures like those portrayed in The Terminator or The Matrix. How come we never use the example from my favourite bit of science fiction? What if we spend billions building Project Stargate and it just tells us the answer is 42?

I am of course referring to The Hitchhiker's Guide to the Galaxy and the supercomputer Deep Thought, which was tasked with calculating the answer to life, the universe and everything. Of course, they then had to build Earth to find out what the question was.

What if we build "AGI" or even "Super Intelligence" and it just replies to our prompts with the equivalent of 42?

Leaving aside for a moment whether it is even possible to achieve "AGI", let alone "Super Intelligence", would we even understand its answer? Do we even know what we'd ask it? I mean beyond the charade of using AI as an excuse to further weaken labour, IP and consumer laws, and to divide and enslave society at large, of course 😉


A more serious question. I tend to write most of my content assuming that the reader has an in-depth knowledge of the subject matter, so I don't really explain very much. This keeps things short and easy for me to write, but no doubt it makes little sense to some readers and comes across as cryptic.

The question is, should I be more explicit? Explain more? It might make things easier for more people to understand, but also more frustrating for those who already know the material. I find this with Theo's more recent videos, for example. He's taken to breaking everything down to the extent that they're sometimes tedious to watch now.

Thoughts?