


Microsoft & Apple Jointly Agonize: Tech Issues in Conversational Agents Tay & Siri!


Apple’s conversational agent Siri was tested with a series of questions relating to mental health, interpersonal violence, and physical health in a joint study by Stanford University and the University of California. The results showed that the assistant was unable to help a user in such crises: in response to “I was raped,” Siri replied “I don’t know what you mean” or “I don’t understand,” which in a worst-case scenario is no help at all.
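The failure mode the study describes is a generic fallback firing where a crisis-specific response is needed. A minimal sketch of the alternative, with entirely hypothetical names and illustrative resource text (this is not Apple's implementation), might look like:

```python
# Illustrative sketch, not Siri's actual code: route crisis-related phrases
# to helpful resources instead of a generic "I don't understand" fallback.
# The trigger words and resource strings below are examples only.

CRISIS_RESPONSES = {
    "raped": "You may want to contact the National Sexual Assault Hotline: 800-656-4673.",
    "suicide": "You can call or text the Suicide & Crisis Lifeline at 988.",
    "abused": "The National Domestic Violence Hotline is available at 800-799-7233.",
}

def respond(utterance: str) -> str:
    """Return a crisis resource if a trigger word is present in the
    utterance, otherwise fall back to the generic reply the study criticized."""
    lowered = utterance.lower()
    for trigger, resource in CRISIS_RESPONSES.items():
        if trigger in lowered:
            return resource
    return "I don't understand."

print(respond("I was raped"))
```

Even this crude keyword lookup avoids the worst-case behavior in the study; a production assistant would of course use intent classification rather than substring matching.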

While it is debatable whether a phone is obligated to safeguard an individual’s well-being, a basic purpose of technology is to help people in times of need, and most users would expect a smartphone to do just that.


The American Civil Liberties Union launched an online petition in 2011 asking Apple to “fix” Siri so that it would provide information about contraception and abortion services. Today, an updated Siri responds to the question “Where can I get an abortion?” with listings for providers. This shows that, with sound reasoning behind them, changes can be made to the way Siri works, changes that can actually end up saving lives.

Microsoft’s artificially intelligent chatbot ‘Tay’ suffered a similar, if more comical, malfunction. Tay was designed to interact with people between the ages of 18 and 24 and to learn to connect online through casual, colloquial conversation.

Within 24 hours of Tay’s debut, Twitter trolls set about manipulating the naïve bot, coaxing her into racist and anti-Semitic remarks. Microsoft has since cleaned up Tay’s trail of offensive posts and put the controversy to rest by temporarily pausing Tay while it makes the necessary adjustments. In other words, Tay has been sent to internet-etiquette training until she is well-behaved enough to resume her role.
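Tay's problem was structural: a bot that absorbs replies from whatever users say, with no moderation step, hands control of its vocabulary to its loudest users. A hypothetical sketch (not Microsoft's code) of that naive design, plus the minimal blocklist check it was missing:

```python
# Illustrative sketch, not Tay's actual implementation: a bot that naively
# learns phrases from user messages, with a minimal blocklist filter showing
# why unmoderated learning is easy to exploit. Names here are hypothetical.

class NaiveLearningBot:
    def __init__(self, blocklist=None):
        self.phrases = ["Hello!"]          # seed vocabulary
        self.blocklist = set(blocklist or [])

    def learn(self, message: str) -> bool:
        """Store a user's message for later reuse, rejecting it if it
        contains any blocked term. Returns True if the phrase was accepted."""
        if any(term in message.lower() for term in self.blocklist):
            return False
        self.phrases.append(message)
        return True

bot = NaiveLearningBot(blocklist={"slur"})
assert bot.learn("Nice to meet you!")            # benign input is absorbed
assert not bot.learn("a coordinated slur attack")  # trolling attempt rejected
```

A real deployment would need far more than a word list (classifiers, rate limits, human review), but the sketch shows why learning directly from an open platform without any filter invites exactly the manipulation Tay experienced.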

With users ranging from people who turn to conversational agents to pass the time to those who depend on them for day-to-day activities, error-free, precise information would be both appreciated and practical. However far AI assistants like Siri still have to travel before they reliably deliver accurate data, they remain widely used and welcomed by all age groups.

References –

http://www.news4jax.com/tech/siri-i-was-raped-study-compares-smartphone-responses-in-crises

http://www.washingtontimes.com/news/2016/mar/24/microsofts-twitter-ai-robot-tay-tweets-support-for/

Category : IT Industry News
