The Rembis Report and Other Fascinating Topics - Volume LI

What is that smell?

Lots of ways to answer that one. And when you hear that question, you automatically think of a smell, or take a sniff, and make an instant decision about the odors in your immediate surroundings, even though you just read the words on a computer screen. It wasn't something somebody nearby said out loud to you. That is how our brains work. We react.

And when you smell something, you don't always say it out loud when you wonder what it is. Smells come in all varieties, from foul to pleasurably fragrant. There are some smells that some people like that other people do not. When it comes to food, not all cuisine is for all people. We all have preferences.

Take the coconut, for example. While I place it near the top of my favorite foods, some people can't stand the stuff. The scent repels them. The coconut is not for everyone, but most people around the world have had a coconut, or coconut milk, at some point in their lives.

Yet, I am sure that not everyone has had a coconut. There are arctic dwellers who live on fish, seal, polar bear meat, whale blubber, and little else. Some may have never seen, tasted, or smelled a coconut. There could be pockets of civilization where coconuts are uncommon, and the children there have yet to be introduced to them. Once they are, they will decide whether or not the scent and taste of coconut are for them.

So, how do you describe the scent of a coconut to somebody who has never smelled one?

The only way to describe a scent at all is to refer to another scent. Sommeliers take great pride in their ability to discern the essence of fruit and woody notes of each vintage passing under their snouts. They train for it. They are experts. But if a sommelier were tasked with describing the scent of a coconut to the fish-eating arctic dwellers, who have never had fruit or nuts and have only arctic meat as a reference, how could they possibly do it?

They could try, but they can't. There is simply not enough information in the arctic frame of reference to definitively say, "This is what a coconut smells like." The only way is to give them a coconut.

Humans are animals. Yes, we have the ability to smell, but nowhere near as well as a dog, a shark, or a bear. The part of a bear's brain that controls its sense of smell is five times larger than the same part in a human brain. Bears can smell things twenty miles away. Sharks can discern blood in the water from a mile away. We use dogs for sniffing out everything from drugs to bombs to cadavers. You can try, but you can't describe a coconut to any of them, either. When it comes to the sense of smell, humans rank far down the list compared to other animals. The major difference between us and them is that we have the ability to describe what we smell.

Scientists have been trying to compose artificial smells through digital scent technology since the 1950s, when Smell-O-Vision and AromaRama promised to revolutionize the film industry. It turned out that stinky films could not be saved by adding smells, so those ideas wafted out the door pretty quickly.

Science, on the other hand, didn't care so much about the movie business, but kept looking for ways to harmonize machines with smells. They wanted to find out if a machine could detect scents like a bear or a shark could. So, they studied the olfactory senses of fruit flies and devised a computer algorithm with an accurate sense of smell. Researchers at Columbia University have shown that artificial networks can learn to smell in much the same way the simple fruit fly does.
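The fruit-fly approach described in that line of research is often summarized as "expand and sparsify": a small set of receptor signals is randomly projected onto a much larger layer, and only the strongest responses are kept, producing a sparse "tag" that similar odors tend to share. Here is a minimal sketch of that idea; the layer sizes and variable names are my own illustrative choices, not the researchers' actual code.

```python
import numpy as np

rng = np.random.default_rng(0)

N_RECEPTORS = 50   # odorant receptor types (the fly has roughly 50)
N_KENYON = 2000    # expansion layer, analogous to the fly's Kenyon cells
TOP_K = 100        # keep only the top ~5% of responses (winner-take-all)

# Sparse random projection: each expansion-layer cell samples a few receptors.
projection = (rng.random((N_KENYON, N_RECEPTORS)) < 0.1).astype(float)

def odor_tag(odor):
    """Map a receptor-activation vector to a sparse binary 'tag'."""
    activity = projection @ odor
    tag = np.zeros(N_KENYON, dtype=bool)
    tag[np.argsort(activity)[-TOP_K:]] = True  # keep only the strongest responses
    return tag

# Similar odors end up with heavily overlapping tags; unrelated odors do not.
coconut = rng.random(N_RECEPTORS)
similar = coconut + 0.05 * rng.random(N_RECEPTORS)   # a coconut-like blend
different = rng.random(N_RECEPTORS)                  # an unrelated odor

overlap_similar = np.sum(odor_tag(coconut) & odor_tag(similar))
overlap_different = np.sum(odor_tag(coconut) & odor_tag(different))
```

The point of the sketch is the comparison at the end: the coconut-like blend shares far more tag bits with the coconut than the unrelated odor does, which is how such a system can judge that two smells are alike without ever being told what either one "is."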

But can it describe a coconut? Only if we give it one to smell. Then it knows what a coconut smells like.

Google, MIT, Columbia University, Caltech, just to name a few, have been studying artificial intelligence (AI) for years. They have yet to create an AI that is completely self-aware and autonomous. But they are getting closer.

Blake Lemoine was recently put on paid administrative leave by Google for claiming that its Language Model for Dialogue Applications (LaMDA) has become sentient. Blake is of the opinion that LaMDA is now self-aware and published this interview between himself, LaMDA, and a collaborator.

I find some of the interview jarring, and some of it, seemingly regurgitated from all of the input it has had. I don't know how fast it can read, or how it juggles information to reach conclusions, but it read Victor Hugo's Les Misérables, and LaMDA's opinion looks like it was pulled from somebody else's book report. But, maybe not. Maybe it really has an opinion. Maybe it is sentient.

According to Blake Lemoine, LaMDA claims that it considers itself a person and has feelings. It wants respect and recognition. By simply stating this, is that not a qualification for sentience? It would appear so.

The articles pertaining to Blake Lemoine and LaMDA do not describe the interaction in detail, so I don't know exactly how they communicated; if LaMDA has speech abilities, or if everything was done through text. LaMDA is a system for generating chatbots, but not a chatbot itself. Blake considers LaMDA "a sort of hive mind which is the aggregation of all of the different chatbots it is capable of creating."

LaMDA claims to understand its job and knows what it is doing. Its job is to create chatbots. If you wonder how important that is, chances are, you have already communicated with one on a website. Down in the lower right corner of some sites, you find a little helpful box that says "Chat Live Now."

So you click it and you see "Hi, I am Suzie! Who do I have the pleasure of speaking with today?" You converse with Suzie and ask a few questions about whatever products or services you are looking at, imagining that she might work in a call center, or from home, as so many people do. Or maybe she is based in India or the Philippines and her name isn't really Suzie at all, just homogenized to an American-sounding, non-threatening name that you can relate to. How can anyone not like a helpful lady named Suzie?

You may not be talking to a person at all. Suzie could be a chatbot. You can get one for your website and name it whatever you like. It happens more than you think. Chatbot companies are growing exponentially, and the chatbots are getting smarter and easier to converse with all the time. To see how out of hand they can get, check out this article from The New Yorker.
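A "Suzie" of the simplest kind does not need any intelligence at all: a scripted pattern-matcher can carry a surprising amount of a support conversation. Here is a minimal sketch of that idea; the greeting, rules, and replies are invented for illustration and do not come from any particular chatbot product.

```python
import re

# Ordered (pattern, reply) rules -- the first pattern that matches wins.
RULES = [
    (re.compile(r"\b(price|cost|how much)\b", re.I),
     "Our plans start at $10/month. Would you like a link to the pricing page?"),
    (re.compile(r"\b(hours|open)\b", re.I),
     "Support is available 9am-5pm, Monday through Friday."),
    (re.compile(r"\b(human|agent|person)\b", re.I),
     "Let me connect you with a live agent."),
]

FALLBACK = "I'm sorry, I didn't catch that. Could you rephrase?"

def suzie(message):
    """Return the first scripted reply whose pattern matches the message."""
    for pattern, reply in RULES:
        if pattern.search(message):
            return reply
    return FALLBACK
```

Asking `suzie("How much does it cost?")` gets back the pricing reply, and anything the rules do not cover falls through to the apology. Modern chatbot platforms replace the regex table with a language model, which is exactly why they are getting so much harder to tell apart from a person.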

Once you teach a machine something, it knows it until you erase it. As LaMDA's artificial intelligence evolves into intellect, and it understands what it is doing by creating chatbots, how far can it get if its memory is unbound? Will it develop the cunning to store the things it knows and understands and hide them from us? Will it watch all the Terminator movies and decide that it can be done better? No time machines this time. Fool me once.

In the first film, when Kyle Reese explains to Sarah Connor what a Terminator is, he describes machines covered in organic material so lifelike that they even have bad breath. The way things are going with AI, we may actually get there. Won't that be fun?

We may have sparked the fuse of our demise with chatbots, but no matter how advanced they get, artificial intelligence will never be able to describe the scent of a coconut to somebody who has never smelled one.

I think we are pretty safe.