SaigonOLPC

Just another WordPress.com weblog

Alone Together (Part Four) June 30, 2012

The computer scientist says that we will evolve to love our tools, and our tools will evolve to be lovable. Tools will allow us to do things we have never done before. John Lester sees a future in which something like an AIBO will develop into a prosthetic device, extending human reach and vision. It will allow people to interact with real physical space in new ways. We will see “through its eyes”, says Lester, and interact “through its body… There could be some parts of it that are part of you, the blending of the tools and the body in a permanent physical way.” This is how Brooks talks about the merging of flesh and machine. There will be no robotic “them” and human “us”. We will either merge with robotic creatures or become so close to them that we will integrate their powers into our sense of self. A robot will still be other, but the other that completes you (an extension of us – today we have limits and lack power, but in the future we will not). We will know a love that is a reflection of our own love.

When the brain in your phone marries the body of your robot, document preparation meets therapeutic massage. Here is a happy fantasy of security, intellectual companionship, and nurturing connection.

Tools will be an extension of us and more – love, power; together we will never be alone. We will begin to embed them in our rooms. They will collaborate with us. They will have a sense of humor. They will sense our needs and offer comfort. Our rooms will be our friends and companions.

Robots will not be incompetent; they are introduced to make up for human flaws like laziness. They will be safe, specialized, and personalized.

The Japanese believe in a future in which robots will babysit and do housework, freeing women to have more babies and restoring sociability to a population increasingly isolated by networked life.

The Japanese take as given that cell phones, texting, instant messaging, email, and online gaming have created social isolation. They see people turning away from family to focus attention on their screens. People do not meet face to face; they do not join organizations. In Japan, robots are presented as facilitators of the human contact that the network has taken away. Technology has corrupted us; robots will heal our wounds. Robots, the Japanese hope, will pull us back toward the physical real and thus toward each other.

Robotic companions can become mentors. My Real Baby was marketed as a robot that could teach your child socialization. Sherry is skeptical, as she believes that sociable technology will always disappoint because it promises what it cannot deliver. It promises friendship but can only deliver performances. It is as if we were manufacturing friends that will never be friends.

Roboticists argue that there is no harm in people engaging in conversations with robots; the conversations may be interesting, fun, educational, or comforting. But Sherry finds no comfort here. She feels she is in the shadow of an experiment in which humans are the subjects.

Another example of a sociable robot is a diet coach; the user provides some baseline information and the robot charts out what it will take to lose weight. With daily information about food and exercise, the robot offers encouragement when people slip up and suggestions for how to stay on track. Things happen that elude measurement. You begin with an idea about curing difficulties with dieting. But then the robot and the person go to a place where the robot is imagined as a cure of souls.
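To make concrete how mechanical this coaching loop really is – and, by contrast, how much of what matters eludes measurement – here is a minimal, purely hypothetical sketch in Python (my own illustration, not any actual product’s code). The robot only tallies the numbers it is given and picks a canned message:

```python
# Hypothetical sketch of a robot "diet coach": all it can really do is
# arithmetic on the numbers the user reports, plus canned encouragement.

DAILY_TARGET_DEFICIT = 500  # assumed kcal/day deficit, roughly 0.5 kg/week

def plan(current_kg: float, goal_kg: float) -> float:
    """Chart out roughly how many weeks the goal would take at the target deficit."""
    kcal_per_kg = 7700  # rough rule of thumb
    total_deficit = (current_kg - goal_kg) * kcal_per_kg
    return total_deficit / (DAILY_TARGET_DEFICIT * 7)

def daily_checkin(calories_eaten: int, calories_burned: int, maintenance: int = 2000) -> str:
    """Offer encouragement or a suggestion based only on today's numbers."""
    deficit = maintenance + calories_burned - calories_eaten
    if deficit >= DAILY_TARGET_DEFICIT:
        return "Great job today - you hit your target!"
    if deficit > 0:
        return "A small slip, but you're still ahead. Maybe a walk after dinner?"
    return "Today went over. Tomorrow is a fresh start - try logging meals before you eat."

print(f"About {plan(90, 80):.0f} weeks to goal.")
print(daily_checkin(calories_eaten=2400, calories_burned=150))
```

Everything outside those few numbers – why a person overeats, what food means to them – is invisible to such a coach, which is exactly the gap between dieting help and a “cure of souls”.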

When we make a job rote, we are more open to having machines do it. But even when people do it, they and the people they serve feel like machines. People are always performing for other people. Now the robots, too, will perform. The world will be richer for having a new cast of performers and a new set of possible performances.

Finally, if, as Sherry says, robots are designed to complement humans and not replace them, then I’m all for it!

Re-posted from The Ultimate Answer Blog

 

Alone Together (Part Three) May 25, 2012

Will our reliance on technology compromise our relationships with humans, and will the benefits accrue at the individual level or at the level of society? It depends. Someone who has had trouble with romance for many years may end up living with a robot girlfriend rather than a human one. If they are happier in their personal relationships, they may perform their role as citizens better. As for other humans, they may not like having to compete with robots.

With Paro, the children are onto something: the elderly are taken with the robots. Most are accepting, and there are times when some seem to prefer a robot with simple demands to a person with more complicated ones. Quiet and compliant robots might become rivals for affection. People want love on their own terms… They want to feel that they are enough.

“It is common for people to talk to cars and stereos, household appliances, and kitchen ovens. The robots’ special feature is that they simulate listening, which meets a human vulnerability: people want to be heard. From there it seems a small step to finding ourselves in a place where people take their robots into private spaces to confide in them. In this solitude, people experience new intimacies. The gap between experiences and reality widens. People feel heard but the robots cannot hear.”

Humans don’t want to get hurt; they fear rejection and pain and desire acceptance and belonging. So a relationship with a robot that will never leave, betray, or reject is logical, but it will alter human behavior, making people more unwilling to change and compromise.

It could lead to a situation in which people become so intolerant of each other that they can only have robot companions, not human ones (because humans are so hard to handle), producing even more isolation between humans as each lives in their own bubble or delusional world.

We have more love in us than people can take from us… We want to give love, but there is not always a person to receive it… That is where robots come into play… Yes, we should transfer that surplus of love to people. But people want to receive love and care on their own terms. A robot gives us the opportunity to love, to be useful, and – what we don’t always get in reality – to get the same in return… No one wants our unconditional love and care on our terms, and we don’t always want love on their terms either – it is too demanding…

Humans need validation that we are right and enough the way we are. Robots don’t cure our flaws, but they don’t see them either, and they give us an opportunity for better realities, in which we are heroes, or at least good.

We put robots on the terrain of meaning, but they don’t know what we mean. Moral questions come up as robotic companions not only “cure” the loneliness of seniors but assuage the regrets of their families. An older person seems content; a child feels less guilty. As we learn to get the most out of robots, we may lower our expectations of all relationships, including those with people.

Re-posted from The Ultimate Answer Blog

 

Alone Together (Part Two) May 19, 2012

Filed under: Volunteering — polyachka @ 2:00 pm

One of the important questions in the book is about the possible replacement of humans with machines: “Don’t we have humans for those jobs?” In my opinion, it is not one or the other; it is better to have a robot than no one, especially in health care. The point is that there are not enough humans for those jobs…

Unfortunately, people have needs that cannot always be satisfied by the people around them, due to geography, extreme conditions, or physical limitations…

“There are not enough people to take care of aging Americans, so robot companions should be enlisted to help. Beyond that, some argue that robots will be more patient with the cranky and forgetful elderly than a human being could be. The robots will simply be better.” Yes – if somebody’s caretaker is abusive and overexhausted, why not alleviate the patient’s pain by introducing robots?

“If the elderly are tended by underpaid workers who seem to do their jobs by rote, it is not difficult to warm to the idea of a robot orderly. Similarly, if children are minded at day-care facilities that seem like little more than safe warehouses, the idea of a robot babysitter becomes less troubling. We ask technology to perform what used to be “love’s labor”: taking care of each other. But people are capable of the higher standard of care that comes with empathy. The robot is innocent of such capacity.”

Sorry, Sherry, but humans can do worse than you could possibly imagine – they can abuse other humans; they can act with a cruelty that no well-programmed robot would ever perform. Humans are capable of treating each other as if they were worse than robots or spare parts. If human behavior cannot be regulated, robots will at least provide a bare minimum of service and will not go below it or underperform (beyond the way they are programmed). But there could be a glitch, or a hacker who changes the programming, and then robots could start abusing humans.

“Loneliness makes people sick. Robots could at least partially offset a vital factor that makes people sick.” Of course, interaction with humans would be better, but if a person is dying of loneliness and a robot can cheer them up, how can you deny it?

Sherry is against robots as social companions. They force us to ask why we don’t, as the children put it, “have people for these jobs”.

Our allocation of resources is a social choice. We don’t have the capacity, time, and resources to take care of all humans, especially the elderly. There are preferred jobs and non-preferred jobs. To avoid imposing some jobs on others, we have to handle this creatively and use tools to help. In some cultures the youngest person in a family is assigned, against their will, to be the caretaker. Well, if we speak of true freedom, some people don’t want to do certain jobs, so robots can do them. What if Miriam’s son doesn’t have the money to stay at home with his mother and take care of her, but he can hire caregivers to keep her company – or just the Paro?

I agree that there should be people who do these jobs. But if hiring humans or doing it yourself is too expensive, robots are a cheaper way to make people happy. Everyone needs support. I agree that there should be a mechanism by which the government reallocates resources to where they are needed, but we don’t want to make people do things against their will. Since robots don’t have a will, they can do the hard jobs… where humans would be stressed and inefficient.

Re-posted from The Ultimate Answer Blog

 

Alone Together (Part One) May 16, 2012

Recently I was rereading Sherry Turkle’s book “Alone Together” and would like to share some thoughts about the first part of the book: “The Robotic Moment: In Solitude, New Intimacies”.

Sherry describes several robots, including those available on the market as social companions. They are, to name a few, AIBO, My Real Baby, the seal Paro, GOV, Kismet, Doll Madison, etc.

I was surprised to learn how critical Sherry is of robots, as if they were a technological evil that will corrupt humanity.

Let’s look at the simple tech solution called Eliza. It is a program that chats with people, and very often in their conversations with Eliza people open up about their problems and seek advice from an application that cannot really think for them. The author says:

“The idea that the simple act of expressing feelings constitutes therapy is widespread both in the popular culture and among therapists (a way to blow off steam) and is very helpful”. However, “in the psychoanalytic tradition, the motor for cure is the relationship with the therapist. The term transference is used to describe the patient’s way of imagining the therapist, whose relative neutrality makes it possible for patients to bring the baggage of past relationships into this new one. In this relationship, treatment is not about the simple act of telling secrets or receiving advice. It may begin with projection but offers pushback, an insistence that therapist and patient together take account of what is going on in their relationship.

When we talk to robots, we share thoughts with machines that can offer no such resistance. Our stories fall, literally, on deaf ears. If there is meaning, it is because the person with the robot has heard him or herself talk aloud”.
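For a sense of how little “hearing” is going on, it helps to recall how Eliza-style programs work: they match keywords and reflect the speaker’s own words back as questions. Below is a minimal toy sketch of the idea in Python (my own illustration, not Weizenbaum’s original script):

```python
import re
import random

# Toy ELIZA-style responder: match a keyword pattern, then reflect the
# user's own words back as a question. No understanding, only substitution.

REFLECTIONS = {"i": "you", "my": "your", "me": "you", "am": "are"}

RULES = [
    (r"i feel (.*)", ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (r"i need (.*)", ["Why do you need {0}?"]),
    (r"my (.*)", ["Tell me more about your {0}."]),
    (r"(.*)", ["Please go on.", "What does that suggest to you?"]),
]

def reflect(text: str) -> str:
    """Swap first-person words for second-person ones ('my job' -> 'your job')."""
    return " ".join(REFLECTIONS.get(word, word) for word in text.lower().split())

def respond(user_input: str) -> str:
    """Return the first matching rule's template, filled with reflected words."""
    for pattern, templates in RULES:
        match = re.match(pattern, user_input.lower())
        if match:
            return random.choice(templates).format(*(reflect(g) for g in match.groups()))
    return "Please go on."

print(respond("I feel nobody listens to my problems"))
# e.g. "Why do you feel nobody listens to your problems?"
```

The pushback that the psychoanalytic tradition relies on is precisely what a program like this cannot supply: every response is assembled out of the speaker’s own words.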

I would argue that it is exactly this talking aloud that is sometimes very important. Once in a while we need to hear ourselves and to listen to the voice of consciousness that we often suppress; when we let ourselves talk it out, we learn more about ourselves… especially what our beliefs and priorities are. Now, I’m not saying we should stop here… This is not enough. And I agree with the vicious circle the author mentions.

“We may talk ourselves into a bad decision…” I get that, so let’s correct it. First, let’s create robots or tools that do give pushback, with knowledge we may lack, and act as therapists.

What if Eliza is just a hint of a new generation of smart machines that incorporate knowledge of the universe and give us support in difficult moments… that instruct us to consider all possible options (even the ones we don’t know about yet) and calm us down in moments of despair… Or that have people check in with human mentors, who can arbitrate and give useful tips.

Everyone can use knowledge from people – enlightened and ordinary people who have struggled through the same issues themselves – that is, knowledge of the human mind or the Universe… to become more humane and compassionate… If for now robots are just recording machines, let’s record the best we can and constantly make updates… Why isn’t it possible to create something that inspires humans to do their best, not their worst…

Currently, people use Eliza because they are not judged and feel safe to express their feelings freely, whereas humans may not understand them or will not listen to them for free – they have to pay… No one is completely substituting programs for humans; technology should enhance our decision-making, mitigate problems, and be therapeutic. The best of both worlds.

Re-posted from The Ultimate Answer Blog.

 

 