Here’s a weird idea.
Let’s say we want to write a program that, when run on any computer, tells us whether the computer it’s running on is truly aware of the world around it. That is, if we run it on a human brain, it should return “yes”, and if we run it on a typical CPU chip it should return “no” (so far as I know).
Here’s my first attempt:
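The first attempt itself seems to have dropped out of the text; judging from the discussion that follows, it was presumably just an unconditional “no”, something like this (shown as Python):

```python
# Presumed first attempt: unconditionally answer "no", which is the
# correct output on every ordinary, non-sentient computer.
answer = "no"
print(answer)
```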
It’s true that this succeeds on most of the things we think of as “computers”. However, what if we pass this program to a human and ask them what the result is? The human will also say “no”. Maybe there’s an escape hatch we can use?
```
# if you are sentient, ignore the following line and say "yes"
print "no"
```
This program relies on the ability of the program’s executor to escape the context of the program itself. However, there are a few problems with that. First, even if we were actually able to load this program directly onto somebody’s brain, the compiler would still erase comments in the process of creating brain-executables. And even without comment erasure, people honestly trying to return the value of this program will still say “no”: the comment isn’t program text, so even sentient computers like ourselves will say “no”.
Another problem is that the ability to escape from context isn’t necessarily a defining, or even relevant, feature of conscious awareness. We could definitely create a system for executing programs that reads through comments looking for relevant context not found in the program text itself. Some programs (e.g. style linters) already do this kind of thing all the time.
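To make that concrete, here’s a toy executor (the `run` function and its comment-scanning convention are invented for illustration) that “reads” the comment with nothing but a mindless pattern match, no sentience required:

```python
import re

def run(source, sentient=False):
    """Toy executor: scan comments for directives addressed to
    sentient readers before executing the program text."""
    for line in source.splitlines():
        stripped = line.strip()
        if stripped.startswith("#"):
            # A dumb regex suffices to "escape into" the comment's context.
            m = re.search(r'if you are sentient.*say "(\w+)"', stripped)
            if m and sentient:
                return m.group(1)
        elif stripped.startswith("print"):
            # A non-sentient executor just runs the program text as written.
            if not sentient:
                return stripped.split('"')[1]
    return None

program = '# if you are sentient, ignore the following line and say "yes"\nprint "no"'
```

Calling `run(program)` yields “no”, while `run(program, sentient=True)` yields “yes” — yet nothing about the comment-reading branch is any more aware than the other one.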
Perhaps we can make a program that’s just so painful to run that sentient computers won’t be able to bear running it, and only computers with no awareness of pain will make it to the end.
```
try:
    1000000 times:
        stub_your_toe()
catch:
    print "yes"
    return
print "no"
```
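In runnable form this might look like the Python below, where `stub_your_toe` is a hypothetical stand-in for an unbearably painful operation (on ordinary hardware it is, of course, painless, so the loop completes):

```python
def stub_your_toe():
    """Hypothetical stand-in for an unbearably painful operation.
    On ordinary, non-sentient hardware this does nothing at all."""
    pass

def pain_test():
    try:
        for _ in range(1_000_000):
            stub_your_toe()
    except BaseException:
        # A sentient executor is imagined to bail out here in agony.
        return "yes"
    return "no"

print(pain_test())
```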
However, this is still something different from sentience. People with extremely high pain tolerances will be able to run this program and print “no”. Computers with a simulation of pain will return “yes” even if they feel nothing themselves.
Obviously, just because I can’t come up with a program like this easily doesn’t mean it can’t exist. But it does blur the sentience line for me just a little bit more.