2001: A Space Odyssey and Siri

Following the in-class screening of Stanley Kubrick’s 2001: A Space Odyssey, I found that the growing advances in artificial intelligence, and the dangers that come along with them, have become increasingly apparent and pressing. When I decided to take part in the experiment of interviewing Siri, I did my best to aim my questions at human experiences and aspects of our emotional lives that an artificial intelligence would have difficulty resonating with or responding to. I chose this direction because I hoped to better understand how Siri would handle its lack of knowledge of experiences it cannot take part in.

My first question to Siri was “Have you ever been in love? What does it feel like?” Its responses were “Nope.” and “Interesting question.” Within this first interaction, it became very apparent that its lack of a real answer indicated either a lack of knowledge or a programmed choice not to respond but instead to redirect. I followed this with “Tell me about yourself,” hoping to receive some sort of answer, whether about its software or any other aspect of the technology that makes up its composition. Her response was “I’m Siri. But enough about me. How can I help you?” This continued redirection displayed to me a complete avoidance of threatening or interrogative questions about its existence.

In addition to my initial interview, I also began to consider the ways in which we, as humans, interact with and “humanize” Siri. For example, throughout my essay I found myself continuously referring to Siri as “her” or “she,” but when I asked Siri if it identified as one gender it said, “I don’t have a gender.” By referring to Siri as “her” based on her female voice, I believe we begin to further integrate her existence and presence into our lives. Whether this is a good thing or a bad thing remains to be seen, but with the warning apparent in the plot of 2001: A Space Odyssey, I find myself becoming increasingly more skeptical of Siri and its true intentions toward the human species.

