It seems a week doesn’t pass without the media mentioning artificial intelligence in connection with robotics and progress in the field. Introducing AI-capable robots into our everyday lives could be beneficial, adding convenience and a sense of safety, among other effects. But where is the line drawn when it comes to teaching these artificially intelligent robots about our way of living?
Intentions Behind the Actions: Three Major Investors
Reid Hoffman and Pierre Omidyar, the founders of LinkedIn and eBay respectively, contributed $10 million each to this project, which is based at the MIT Media Lab and the Berkman Klein Center for Internet and Society at Harvard. The Knight Foundation, a less prominent investor, joined forces with them, adding another $5 million to the project’s budget. The project consists of finding the best approach to teaching an AI robot about religion, morality, ethics, and what it means to be human in general. The idea is, of course, backed by the expectation that this will improve the service robots provide us in everyday life: a better understanding of our ways will allow future series of robots to help us with greater ease and higher efficiency.
In essence, the project advocates a gradual equality between “species,” as robots learn more about us and how to look and behave like us. But there is an obvious problem it will encounter in later stages: emotions are what bind ideas like religion and morality together.
These carefully constructed ideas of a higher purpose, and the reasons for considering certain actions bad behavior, simply do not work without arousing emotions. And since we have no strict definition of how emotions work, except, roughly speaking, one that describes the communication of data within our brain evoking certain reactions in our nervous system, how are we to teach a machine what an emotion is? Considering this problem, the only thing that can be said about this investment and project is that the time, effort, and money it requires could be put to better use.