Sunday, October 2, 2016

Closing - Arturo Amador

  • What personal lessons do you take away from the challenge?
    A: In short, I learned how an artificial intelligence — an "AI" — is built in a somewhat more practical way, and how to work with a framework.
    What recommendations would you give a friend who wanted to take on this challenge in the future?
    A: To review Python and start getting familiar with the framework early, since they will have very little time.
    What activity did you like most about the challenge?
    A: Programming the AI was extremely stressful yet enjoyable, or something like that.
    What activity did you like least?
    A: Maybe the "taller de favor" workshop; it could have gone a bit more in depth within the time it was given.
    What changes would you make to improve this challenge?
    A: The time we had to develop the AI was very short; perhaps the initial programming challenge was unnecessary. I also feel Python was not the best choice for working on an AI — consider working with other kinds of tools, or perhaps something focused on an area other than video games.

Saturday, October 1, 2016

Ex Machina

1.     Is Nathan a reliable or unreliable narrator of his own motives and story? What can we say with certainty that we know about him or his actions in the film?
Nathan is a reliable narrator of his own motives because he is the only one who knows them; he also did what he did only for the knowledge he could gain from his research. From his actions we can say with certainty that he is a man hungry for knowledge and for the evolution of technology.
2.     Does Caleb ever do anything we would consider truly unethical? Does he “deserve” his end?
Yes, he did several unethical things. The most unethical, in our view, is that he was prepared to sacrifice a human life (Nathan's) to "set free" a machine. He also modified code that was not his own.
3.     Speaking of the ending – how many legitimate storylines can you draft for the final scenes in the film? (“Legitimate” means the words and actions on screen as well as the previous scenes can support the storyline you suggest without breaking people’s expectations for story structure, honesty, or common sense.)
The storylines that could be drafted from the film's ending are nearly infinite. The most probable, for us, is that the machine, Ava, goes on living on Earth, perhaps learning to impersonate a real person more convincingly.
4.     Do you think there are any plot holes in the film?
Our main questions are why the helicopter left without Caleb, and whether Ava could repair herself if she suffered damage that could not be fixed easily.
5.     Before Ava “puts on” the skin of the other robots, do you think she passes the Turing Test? In other words, is her sentience/conscious awareness enough to allow her to exist with humans, or must she also take on the form of humanity?
We think Ava would not be able to pass the Turing Test without putting on the skin because, sadly, as humans we do not easily accept people who don't look "normal," so it would be even harder to accept entities we don't know.
6.     Kyoko is a disturbing character to watch. What do her interactions with the other characters show us about Nathan, Caleb, and Ava? And about herself?
The characters' interactions with Kyoko tell us a lot about their personalities. They show that Nathan is a lonely man with a superiority complex who probably avoided relationships with other humans because he thought himself superior. Caleb is a very insecure man who can be easily manipulated, with a strong sense of morality that is not always right. Ava probably has no emotions, as she is an AI, and is willing to use any means to reach her goals.
7.     If you say it fast enough, Bluebook sounds a lot like Google. The similarities were thinly veiled. What does the film say to us about the dangers of our technophilic world?

That our information is not as private as we think it is, or as the companies would have us believe. And that we have to be more careful about what we do with the tools and knowledge we are given.