I'm in no way trying to denigrate that individual. He represents how most people feel about what I do. Friends started calling me "Tin Man," who, as you may recall from The Wizard of Oz, was the mechanical character in search of a heart. I could tell you stories for hours on end about the absolutely intolerant reactions people have had to what I've been doing and to the output of my program, across the entire spectrum. At times it has literally reached the point of provoking physical aggression. Douglas Hofstadter, long known for his Pulitzer Prize-winning book "Gödel, Escher, Bach: An Eternal Golden Braid" and his pioneering work in artificial intelligence, toured for many years giving demonstrations of my program, playing computer-composed Bach, for example. Once, in Toronto, an audience member came up to him and said he thought the computer-composed Bach was much better than the original. And this infuriated Doug.
SE: If I recall, one of the reasons Doug doesn't believe computers can be creative is that consciousness is a prerequisite for creativity. Why is it that you don't believe this is the case?
DC: Creating music is essentially composing algorithms, whether you're a conscious human or an unconscious computer program. "Algorithm" is a word that seems fancy now that it has come back into play with computers, but the idea is ancient and has been with us for a long time. An algorithm is essentially a list of things to do to achieve some result. A good example is a recipe for a cake. Or a grocery list. In our daily lives, we use algorithms to add and subtract, from 2 + 2 = 4 all the way up to the various forms of integral and differential calculus. Many basic functions of our body, such as our beating heart, our breathing, and our blinking eyes, are tasks that have been relegated to algorithms in the brain. One of the most interesting algorithms is DNA, which helps make us the way we are.
So algorithms are an inherent part of life, as well as of computer science. One could almost suggest that everything we do is algorithmic, and that the things we think we're doing non-algorithmically are unconscious or subconscious algorithms. Composers are usually trying to take the algorithms they're consciously or subconsciously hearing in their environment, and in their DNA, and include their personality as part of the process. It really gives weight to the notion that "you are what you hear."
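The "recipe as algorithm" idea above can be sketched in a few lines of code: an algorithm is just an ordered list of steps that, followed exactly, produces a result. This is a minimal illustration only; the recipe steps and function name are invented for the example, not drawn from the interview.

```python
# An algorithm as "a list of things to do to achieve some result":
# here, a cake recipe expressed as an ordered sequence of steps.
# (Steps and names are illustrative, not from the original text.)

cake_recipe = [
    "preheat the oven",
    "mix flour, sugar, and eggs",
    "pour the batter into a pan",
    "bake for 30 minutes",
]

def run_algorithm(steps):
    """Carry out each step in order, returning a log of what was done."""
    log = []
    for i, step in enumerate(steps, start=1):
        log.append(f"step {i}: {step}")
    return log

for line in run_algorithm(cake_recipe):
    print(line)
```

The point is only that the recipe, the grocery list, and 2 + 2 = 4 all share this same shape: a fixed sequence of operations executed in order.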
Also, I've discovered through my work that there are ways for computers to act creatively. Not intelligently, but creatively. After all, what is intelligence? I once looked up "intelligence" in Webster's Dictionary, and it said: "The ability to reason." Well, I wanted to make sure I really understood this definition, so I looked up "reason." And the definition of reason was: "The capacity for intelligence." So how far have we come? We talk about AI, but we don't even know what the I is. I have real trouble with the notion of "artificial intelligence." It takes life to have intelligence, because you have to be in a situation where you can risk and potentially lose something before you can gain something. And computer programs don't have a concept of losing something.
But computers can be creative just like human beings, and in a way that people are emotionally, or even spiritually, moved by. In 2006, I proposed a relatively new form of interaction between a user and the program, in which the program attempts to please the user by composing music it believes the user will enjoy. And I believe this is the way composers will work in the future. I don't say that out of arrogance or conjecture; I just really believe this is the direction things are going to go.
SE: Why do you think Douglas Hofstadter believes consciousness is necessary for creativity?
DC: Well, I think Doug has a romanticized understanding of what music is. An understanding I certainly would have agreed with through many years of my young life. I had this notion that composers were kind of talking to me through their music. And that I was receiving their communication, which was telling me things emotionally that could not be expressed in language. But as I grew as a composer, I realized that this wasn't true. That the wonderment of music is not that it means something, but that it means nothing. Or, put another way, that it means something different to every person who listens. The music itself isn't "saying" anything. It's hard enough to get a one-to-one meaning out of language, even when you're not using it as poetry, where we deliberately fuzz the language up a bit to get multiple meanings out of it. And if you keep moving in that direction, poetry eventually becomes pure sound, to the point where it has no more meaning, and to me that's music.
I think Doug believes that if you know enough about a composer's life, and enough about the other works of that composer, then you have a special relationship with that composer, even though one of the two parties is dead. That there's a real communication flowing from the consciousness of the composer, because he, as a listener, has received it. I'm very sympathetic to this idea, but I just don't agree. I think it's me that's sending the message to me. The music is only provoking it. Music means something different to each listener. And I think that's something we should be enthusiastic about. After all, we all see the world differently, and that's what art is all about.
SE: Then what aspects of creativity are uniquely human, that computers can't do?
DC: Computers don't initiate anything themselves. They're creative only when we instigate them to be creative. Human beings somehow do it on their own. And this capacity, contained in a single being, not only to react to one's environment but to become internally excited about an idea to the point of actually working on that project and completing it, is something computers can't do. I suppose that's very akin to Hofstadter's view of consciousness. Computers can do nearly everything else humans do faster, more economically, and more accurately than we can. But what they really can't do is have self-awareness. They don't do something that no one has asked them to do, out of sheer desire to do it. I can't program that into a computer. I think it's going to be very hard to get that aspect into software. We would have to gift a computer program with something we ourselves may never truly understand. And how do you program that?