Dani Dias - cha cha cha: The Nightmare of the Newscasts: "Honestly, I feel less and less like watching the news. So much macabre, awful news. Worse still, so much news about young women..."
The Benefits of Blowing Your Top
July 5, 2010
By BENEDICT CAREY
The longing for President Obama to vent some fury at oil executives or bankers may run far deeper than politics. Millions of people live or work with exasperatingly cool customers, who seem to be missing an emotional battery, or perhaps saving their feelings for a special occasion. People who — unlike the mining operators in the gulf — have a blowout preventer that works all too well.
Sang-froid has its place, especially during a crisis; but so does Sigmund Freud, who described the potential downside of suppressed passions. Those exhortations being directed at the president could just as easily be turned on countless co-workers, spouses, friends (or oneself):
Lose it. Just once. See what happens.
“One reason we’re so attuned to others’ emotions is that, when it’s a real emotion, it tells us something important about what matters to that person,” said James J. Gross, a psychologist at Stanford University. When it’s suppressed or toned down, he added, “people think, damn it, you’re not like us, you don’t care about the same things we do.”
Rigorous study of what psychologists call emotion regulation is fairly new, and for obvious reasons has focused far more on untamed passions than on the domesticated variety. Runaway emotion defines many mental disorders, after all; restraint is typically associated with good mental health, from childhood through later life.
Yet social functioning is a different matter. Research in the past few years has found that people develop a variety of psychological tools to manage what they express in social situations, and those techniques often become subconscious, affecting interactions in unintended ways. The better that people understand their own patterns, the more likely they are to see why some emotionally charged interactions go awry — whether from too little control or, in the president’s case, perhaps too much.
Most scientists agree that a person’s range of possible emotional expression is a matter of inborn temperament. Growing up is, in one sense, a living education in how to manage that temperament so it elicits help from others and does not torment oneself.
“As we grow, the prefrontal areas of the brain develop, and we become more biologically able to control our impulses as well,” said Stefan G. Hofmann, a professor of psychology at Boston University.
Psychologists divide regulation strategies into two broad categories: pre-emptive, occurring before an emotion is fully felt; and responsive, coming afterward. The best known of the latter category, and one of the first learned, is simple suppression. First-graders will cover a smile with their hand when a classmate does something embarrassing; in time, many become far more adept, reflexively masking surprise, alarm, even rage with a poker face.
Suppression, while clearly valuable in some situations (no laughing at funerals, please), has social costs that are all too familiar to those who know its cold touch. In one 2003 Stanford study, researchers found that people instructed to wear a poker face while discussing a documentary about the atomic bombings of Hiroshima and Nagasaki made especially stressful conversation partners.
In another, published last year, psychologists followed 278 men and women as they entered college, giving questionnaires and conducting interviews. Those who scored highest on measures of emotion suppression had the hardest time making friends.
“An individual who responds to the college transition by becoming emotionally guarded in the first few days,” the authors wrote, will most likely miss opportunities for friendships.
Pre-emptive techniques can work in more subtle ways. One of these is simple diversion, reflexively focusing on the good and ignoring the bad — rereading the praise in an evaluation and ignoring or dismissing any criticism. A 2009 study led by Derek Isaacowitz of Brandeis University found that people over 55 were much more likely than those aged 25 and under to focus on positive images when in a bad mood — thereby buoying their spirits. The younger group was more likely to focus on negative images when feeling angry or down.
More striking, Dr. Isaacowitz found in another study that older people were twice as likely as younger ones to be “rapid regulators” — people whose mood bounced back quickly, sometimes within minutes, after ruminating on depressing memories.
“We have found in general that older people tend to regulate their emotions faster, and are not as motivated to explore negative information, to engage negative images, as younger people are,” Dr. Isaacowitz said. “And it makes some sense, that younger adults would explore the negative side of things, that they need to and maybe want to experience them — to experience life — as they develop their own strategies to regulate.”
Socially speaking, in short, the ability to shrug off feelings of disgust or outrage may suit an older group but strike younger people as inauthentic, even callous.
Finally, people may choose the emotions they feel far more often than they are aware — and those choices, too, can trip up social interactions. A series of recent experiments led by Maya Tamir, a psychologist at Hebrew University in Jerusalem and at Boston College, has found that people subconsciously prime themselves to feel emotions they believe will be most useful to them in an anticipated situation. The researchers call these instrumental emotions.
In one experiment, published last year, Dr. Tamir and Brett Q. Ford of Boston College prepared participants to play a video game in which they would be hunted down by monsters. Before playing, the study volunteers rated what type of music they wanted to hear and what kind of autobiographical memories they preferred to recall.
They were much more likely to want to recall fearful memories, and to prefer to listen to ominous music, than others who were expecting to play a video game in which they would build a theme park or solve a simple puzzle. They were, the authors argue, adopting an emotion that would serve them well in the game.
Dr. Tamir has found similar results in a variety of situations, showing for example that people role-playing as landlords will ramp up their anger before confronting a tenant about late rent.
Mr. Obama’s analytical composure probably comes so easily because it has repeatedly served him well, Dr. Tamir said.
“If staying calm and patient and confident is what has worked for you in crisis situations in the past,” she said, “then subconsciously it may become automatic. And the more automatic it becomes, the less of the actual anger, or panic, you feel.”
All of which makes it a treacherous task to express the real thing, at exactly the moment and pitch that people expect. For people like the president, said Dr. Gross of Stanford, it means throwing the switch on two psychological systems at once: the habitual, analytical one (power down) and the instrumental one (power up).
“If that process interrupts expression even a little, people notice,” Dr. Gross said. “We have an exceptional capacity to track whether the timing and morphology of an emotion is correct.”
The most socially skilled among us — those who project the emotions they intend, when they intend to — are not wedded to any one strategy, Dr. Hofmann argues. In a paper published last month with Todd Kashdan of George Mason University, he proposed that emotion researchers adopt a questionnaire to measure three components of regulation: concealing (i.e., suppression), adjusting (quickly calming anger, for instance) and tolerating (openly expressing emotion).
“These are each valuable strategies, in different situations,” Dr. Hofmann said. “The people who get into trouble socially, I believe, are the ones who are inflexible — who stick to just one.”
July 10, 2010
Students, Meet Your New Teacher, Mr. Robot
By BENEDICT CAREY and JOHN MARKOFF
LOS ANGELES — The boy, a dark-haired 6-year-old, is playing with a new companion.
The two hit it off quickly — unusual for the 6-year-old, who has autism — and the boy is imitating his playmate’s every move, now nodding his head, now raising his arms.
“Like Simon Says,” says the autistic boy’s mother, seated next to him on the floor.
Yet soon he begins to withdraw; in a video of the session, he covers his ears and slumps against the wall.
But the companion, a three-foot-tall robot being tested at the University of Southern California, maintains eye contact and performs another move, raising one arm up high.
Up goes the boy’s arm — and now he is smiling at the machine.
In a handful of laboratories around the world, computer scientists are developing robots like this one: highly programmed machines that can engage people and teach them simple skills, including household tasks, vocabulary or, as in the case of the boy, playing, elementary imitation and taking turns.
So far, the teaching has been very basic, delivered mostly in experimental settings, and the robots are still works in progress, a hackers’ gallery of moving parts that, like mechanical savants, each do some things well at the expense of others.
Yet the most advanced models are fully autonomous, guided by artificial intelligence software like motion tracking and speech recognition, which can make them just engaging enough to rival humans at some teaching tasks.
Researchers say the pace of innovation is such that these machines should begin to learn as they teach, becoming the sort of infinitely patient, highly informed instructors that would be effective in subjects like foreign language or in repetitive therapies used to treat developmental problems like autism.
Several countries have been testing teaching machines in classrooms. South Korea, known for its enthusiasm for technology, is “hiring” hundreds of robots as teacher aides and classroom playmates and is experimenting with robots that would teach English.
Already, these advances have stirred dystopian visions, along with the sort of ethical debate usually confined to science fiction. “I worry that if kids grow up being taught by robots and viewing technology as the instructor,” said Mitchel Resnick, head of the Lifelong Kindergarten group at the Media Laboratory at the Massachusetts Institute of Technology, “they will see it as the master.”
Most computer scientists reply that they have neither the intention, nor the ability, to replace human teachers. The great hope for robots, said Patricia Kuhl, co-director of the Institute for Learning and Brain Sciences at the University of Washington, “is that with the right kind of technology at a critical period in a child’s development, they could supplement learning in the classroom.”
Lessons From RUBI
“Kenka,” says a childlike voice. “Ken-ka.”
Standing on a polka-dot carpet at a preschool on the campus of the University of California, San Diego, a robot named RUBI is teaching Finnish to a 3-year-old boy.
RUBI looks like a desktop computer come to life: its screen-torso, mounted on a pair of shoes, sprouts mechanical arms and a lunchbox-size head, fitted with video cameras, a microphone and voice capability. RUBI wears a bandanna around its neck and a fixed happy-face smile, below a pair of large, plastic eyes.
It picks up a white sneaker and says kenka, the Finnish word for shoe, before returning it to the floor. “Feel it; I’m a kenka.”
In a video of this exchange, the boy picks up the sneaker, says “kenka, kenka” — and holds up the shoe for the robot to see.
In person they are not remotely humanlike, most of today’s social robots. Some speak well, others not at all. Some move on two legs, others on wheels. Many look like escapees from the Island of Misfit Toys.
They make for very curious company. The University of Southern California robot used with autistic children tracks a person throughout a room, approaching indirectly and pulling up just short of personal space, like a cautious child hoping to join a playground game.
The machine’s only words are exclamations (“Uh huh” for those drawing near; “Awww” for those moving away). Still, it’s hard to shake the sense that some living thing is close by. That sensation, however vague, is enough to facilitate a real exchange of information, researchers say.
In the San Diego classroom where RUBI has taught Finnish, researchers are finding that the robot enables preschool children to score significantly better on tests, compared with less interactive learning, as from tapes.
Preliminary results suggest that these students “do about as well as learning from a human teacher,” said Javier Movellan, director of the Machine Perception Laboratory at the University of California, San Diego. “Social interaction is apparently a very important component of learning at this age.”
Like any new kid in class, RUBI took some time to find a niche. Children swarmed the robot when it first joined the classroom: instant popularity. But by the end of the day, a couple of boys had yanked off its arms.
“The problem with autonomous machines is that people are so unpredictable, especially children,” said Corinna E. Lathan, chief executive of AnthroTronix, a Maryland company that makes a remotely controlled robot, CosmoBot, to assist in therapy with developmentally delayed children. “It’s impossible to anticipate everything that can happen.”
The RUBI team hit upon a solution one part mechanical and two parts psychological. The engineers programmed RUBI to cry when its arms were pulled. Its young playmates quickly backed off at the sound.
If the sobbing continued, the children usually shifted gears and came forward — to deliver a hug.
Re-armed and newly sensitive, RUBI was ready to test as a teacher. In a paper published last year, researchers from the University of California, San Diego, the Massachusetts Institute of Technology and the University of Joensuu in Finland found that the robot significantly improved the vocabulary of nine toddlers.
After testing the youngsters’ knowledge of 20 words and introducing them to the robot, the researchers left RUBI to operate on its own. The robot showed images on its screen and instructed children to associate them with words.
After 12 weeks, the children’s knowledge of the 10 words taught by RUBI increased significantly, while their knowledge of 10 control words did not. “The effect was relatively large, a reduction in errors of more than 25 percent,” the authors concluded.
Researchers in social robotics — a branch of computer science devoted to enhancing communication between humans and machines — at Honda Labs in Mountain View, Calif., have found a similar result with their robot, a three-foot character called Asimo, which looks like a miniature astronaut. In one 20-minute session the machine taught grade-school students how to set a table — improving their accuracy by about 25 percent, a recent study found.
At the University of Southern California, researchers have had their robot, Bandit, interact with children with autism. In a pilot study, four children with the diagnosis spent about 30 minutes with this robot when it was programmed to be socially engaging and another half-hour when it behaved randomly, more like a toy. The results are still preliminary, said David Feil-Seifer, who ran the study, but suggest that the children spoke more often and spent more time in direct interaction when the robot was responsive, compared with when it acted randomly.
Making the Connection
In a lab at the University of Washington, Morphy, a pint-size robot, catches the eye of an infant girl and turns to look at a toy.
No luck; the girl does not follow its gaze, as she would a human’s.
In a video the researchers made of the experiment, the girl next sees the robot “waving” to an adult. Now she’s interested; the sight of the machine interacting registers it as a social being in the young brain. She begins to track what the robot is looking at, to the right, the left, down. The machine has elicited what scientists call gaze-following, an essential first step of social exchange.
“Before they have language, infants pay attention to what I call informational hotspots,” where their mother or father is looking, said Andrew N. Meltzoff, a psychologist who is co-director of the university’s Institute for Learning and Brain Sciences. This, he said, is how learning begins.
This basic finding, to be published later this year, is one of dozens from a field called affective computing that is helping scientists discover exactly which features of a robot make it most convincingly “real” as a social partner, a helper, a teacher.
“It turns out that making a robot more closely resemble a human doesn’t get you better social interactions,” said Terrence J. Sejnowski, a neuroscientist at the University of California, San Diego. The more humanlike machines look, the more creepy they can seem.
The machine’s behavior is what matters, Dr. Sejnowski said. And very subtle elements can make a big difference.
The timing of a robot’s responses is one. The San Diego researchers found that if RUBI reacted to a child’s expression or comment too fast, it threw off the interaction; the same happened if the response was too slow. But if the robot reacted within about a second and a half, child and machine were smoothly in sync.
Physical rhythm is crucial. In recent experiments at a day care center in Japan, researchers have shown that having a robot simply bob or shake at the same rhythm a child is rocking or moving can quickly engage even very fearful children with autism.
“The child begins to notice something in that synchronous behavior and open up,” said Marek Michalowski of Carnegie Mellon University, who collaborated on the studies. Once that happens, he said, “you can piggyback social behaviors onto the interaction, like eye contact, joint attention, turn taking, things these kids have trouble with.”
One way to begin this process is to have a child mimic the physical movements of a robot and vice versa. In a continuing study financed by the National Institutes of Health, scientists at the University of Connecticut are conducting therapy sessions for children with autism using a French robot called Nao, a two-foot humanoid that looks like an elegant Transformer toy. The robot, remotely controlled by a therapist, demonstrates martial arts kicks and chops and urges the child to follow suit; then it encourages the child to lead.
“I just love robots, and I know this is therapy, but I don’t know — I think it’s just fun,” said Sam, an 8-year-old from New Haven with Asperger’s syndrome, who recently engaged in the therapy.
This simple mimicry seems to build a kind of trust, and increase sociability, said Anjana Bhat, an assistant professor in the department of education who is directing the experiment. “Social interactions are so dependent on whether someone is in sync with you,” Dr. Bhat said. “You walk fast, they walk fast; you go slowly, they go slowly — and soon you are interacting, and maybe you are learning.”
Personality matters, too, on both sides. In their studies with Asimo, the Honda robot, researchers have found that when the robot teacher is “cooperative” (“I am going to put the water glass here; do you think you can help me by placing the water glass on the same place on your side?”), children 4 to 6 did much better than when Asimo lectured them, or allowed them to direct themselves (“place the cup and saucer anywhere you like”). The teaching approach made less difference with students ages 7 to 10.
“The fact is that children’s reactions to a robot may vary widely, by age and by individual,” said Sandra Okita, a Columbia University researcher and co-author of the study.
If robots are to be truly effective guides, in short, they will have to do what any good teacher does: learn from students when a lesson is taking hold and when it is falling flat.
Learning From Humans
“Do you have any questions, Simon?”
On a recent Monday afternoon, Crystal Chao, a graduate student in robotics at the Georgia Institute of Technology, was teaching a five-foot robot named Simon to put away toys. She had given some instructions — the flower goes in the red bin, the block in the blue bin — and Simon had correctly put away several of these objects. But now the robot was stumped, its doughboy head tipped forward, its fawn eyes blinking at a green toy water sprinkler.
Dr. Chao repeated her query, perhaps the most fundamental in all of education: Do you have any questions?
“Let me see,” said Simon, in a childlike machine voice, reaching to pick up the sprinkler. “Can you tell me where this goes?”
“In the green bin,” came the answer.
Simon nodded, dropping it in that bin.
“Makes sense,” the robot said.
In addition to tracking motion and recognizing language, Simon accumulates knowledge through experience.
Just as humans can learn from machines, machines can learn from humans, said Andrea Thomaz, an assistant professor of interactive computing at Georgia Tech who directs the project. For instance, she said, scientists could equip a machine to understand the nonverbal cues that signal “I’m confused” or “I have a question” — giving it some ability to monitor how its lesson is being received.
To ask, as Dr. Chao did: Do you have any questions?
This ability to monitor and learn from experience is the next great frontier for social robotics — and it probably depends, in large part, on unraveling the secrets of how the human brain accumulates information during infancy.
In San Diego, researchers are trying to develop a human-looking robot with sensors that approximate the complexity of a year-old infant’s abilities to feel, see and hear. Babies learn, seemingly effortlessly, by experimenting, by mimicking, by moving their limbs. Could a machine with sufficient artificial intelligence do the same? And what kind of learning systems would be sufficient?
The research group has bought a $70,000 robot, built by a Japanese company, that is controlled by a pneumatic pressure system that will act as its senses, in effect helping it map out the environment by “feeling” in addition to “seeing” with embedded cameras. And that is the easy part.
The much steeper challenge is to program the machine to explore, as infants do, and build on moment-to-moment experience. Ideally its knowledge will be cumulative, not only recalling the layout of a room or a house, but using that stored knowledge to make educated guesses about a new room.
The researchers are shooting for nothing less than capturing the foundation of human learning — or, at least, its artificial intelligence equivalent. If robots can learn to learn, on their own and without instruction, they can in principle make the kind of teachers that are responsive to the needs of a class, even an individual child.
Parents and educators would certainly have questions about robots’ effectiveness as teachers, as well as ethical concerns about potential harm they might do. But if social robots take off in the way other computing technologies have, parents may have more pointed ones: Does this robot really “get” my child? Is its teaching style right for my son’s needs, my daughter’s talents?
That is, the very questions they would ask about any teacher.
Choe Sang-Hun contributed reporting from Seoul.
Reprinted from The New York Times.
Most computer scientists reply that they have neither the intention, nor the ability, to replace human teachers. The great hope for robots, said Patricia Kuhl, co-director of the Institute for Learning and Brain Sciences at the University of Washington, “is that with the right kind of technology at a critical period in a child’s development, they could supplement learning in the classroom.”
Lessons From RUBI
“Kenka,” says a childlike voice. “Ken-ka.”
Standing on a polka-dot carpet at a preschool on the campus of the University of California, San Diego, a robot named RUBI is teaching Finnish to a 3-year-old boy.
RUBI looks like a desktop computer come to life: its screen-torso, mounted on a pair of shoes, sprouts mechanical arms and a lunchbox-size head, fitted with video cameras, a microphone and voice capability. RUBI wears a bandanna around its neck and a fixed happy-face smile, below a pair of large, plastic eyes.
It picks up a white sneaker and says kenka, the Finnish word for shoe, before returning it to the floor. “Feel it; I’m a kenka.”
In a video of this exchange, the boy picks up the sneaker, says “kenka, kenka” — and holds up the shoe for the robot to see.
In person, most of today's social robots are not remotely humanlike. Some speak well, others not at all. Some move on two legs, others on wheels. Many look like escapees from the Island of Misfit Toys.
They make for very curious company. The University of Southern California robot used with autistic children tracks a person throughout a room, approaching indirectly and pulling up just short of personal space, like a cautious child hoping to join a playground game.
The machine’s only words are exclamations (“Uh huh” for those drawing near; “Awww” for those moving away). Still, it’s hard to shake the sense that some living thing is close by. That sensation, however vague, is enough to facilitate a real exchange of information, researchers say.
In the San Diego classroom where RUBI has taught Finnish, researchers are finding that the robot enables preschool children to score significantly better on tests, compared with less interactive learning, as from tapes.
Preliminary results suggest that these students “do about as well as learning from a human teacher,” said Javier Movellan, director of the Machine Perception Laboratory at the University of California, San Diego. “Social interaction is apparently a very important component of learning at this age.”
Like any new kid in class, RUBI took some time to find a niche. Children swarmed the robot when it first joined the classroom: instant popularity. But by the end of the day, a couple of boys had yanked off its arms.
“The problem with autonomous machines is that people are so unpredictable, especially children,” said Corinna E. Lathan, chief executive of AnthroTronix, a Maryland company that makes a remotely controlled robot, CosmoBot, to assist in therapy with developmentally delayed children. “It’s impossible to anticipate everything that can happen.”
The RUBI team hit upon a solution: one part mechanical, two parts psychological. The engineers programmed RUBI to cry when its arms were pulled. Its young playmates quickly backed off at the sound.
If the sobbing continued, the children usually shifted gears and came forward — to deliver a hug.
Re-armed and newly sensitive, RUBI was ready to test as a teacher. In a paper published last year, researchers from the University of California, San Diego, the Massachusetts Institute of Technology and the University of Joensuu in Finland found that the robot significantly improved the vocabulary of nine toddlers.
After testing the youngsters’ knowledge of 20 words and introducing them to the robot, the researchers left RUBI to operate on its own. The robot showed images on its screen and instructed children to associate them with words.
After 12 weeks, the children’s knowledge of the 10 words taught by RUBI increased significantly, while their knowledge of 10 control words did not. “The effect was relatively large, a reduction in errors of more than 25 percent,” the authors concluded.
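The study's headline figure is a relative error reduction, which is easy to reproduce with a short calculation. The tallies below are invented for illustration; the article reports only the "more than 25 percent" result, not the raw counts:

```python
def error_reduction(errors_before, errors_after):
    """Relative reduction in errors, e.g. 8 -> 6 errors is a 25% reduction."""
    return (errors_before - errors_after) / errors_before

# Hypothetical per-group error tallies for the nine toddlers (not the study's data):
taught_before, taught_after = 60, 42      # errors on the 10 RUBI-taught words
control_before, control_after = 58, 56    # errors on the 10 untaught control words

print(error_reduction(taught_before, taught_after))    # 0.30, i.e. > 25%
print(error_reduction(control_before, control_after))  # ~0.03, essentially unchanged
```

The control words are what make the comparison meaningful: they show how much of the improvement comes from the robot's teaching rather than from the children simply growing older over the 12 weeks.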
Researchers in social robotics — a branch of computer science devoted to enhancing communication between humans and machines — at Honda Labs in Mountain View, Calif., have found a similar result with their robot, a three-foot character called Asimo, which looks like a miniature astronaut. In one 20-minute session the machine taught grade-school students how to set a table — improving their accuracy by about 25 percent, a recent study found.
At the University of Southern California, researchers have had their robot, Bandit, interact with children with autism. In a pilot study, four children with the diagnosis spent about 30 minutes with this robot when it was programmed to be socially engaging and another half-hour when it behaved randomly, more like a toy. The results are still preliminary, said David Feil-Seifer, who ran the study, but suggest that the children spoke more often and spent more time in direct interaction when the robot was responsive, compared with when it acted randomly.
Making the Connection
In a lab at the University of Washington, Morphy, a pint-size robot, catches the eye of an infant girl and turns to look at a toy.
No luck; the girl does not follow its gaze, as she would a human’s.
In a video the researchers made of the experiment, the girl next sees the robot “waving” to an adult. Now she’s interested; the sight of the machine interacting registers it as a social being in the young brain. She begins to track what the robot is looking at, to the right, the left, down. The machine has elicited what scientists call gaze-following, an essential first step of social exchange.
“Before they have language, infants pay attention to what I call informational hotspots,” where their mother or father is looking, said Andrew N. Meltzoff, a psychologist who is co-director of the university’s Institute for Learning and Brain Sciences. This, he said, is how learning begins.
This basic finding, to be published later this year, is one of dozens from a field called affective computing that is helping scientists discover exactly which features of a robot make it most convincingly “real” as a social partner, a helper, a teacher.
“It turns out that making a robot more closely resemble a human doesn’t get you better social interactions,” said Terrence J. Sejnowski, a neuroscientist at the University of California, San Diego. The more humanlike machines look, the more creepy they can seem.
The machine’s behavior is what matters, Dr. Sejnowski said. And very subtle elements can make a big difference.
The timing of a robot’s responses is one. The San Diego researchers found that if RUBI reacted to a child’s expression or comment too fast, it threw off the interaction; the same happened if the response was too slow. But if the robot reacted within about a second and a half, child and machine were smoothly in sync.
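The timing window the San Diego team describes can be pictured as a simple gate: the robot holds its reply until roughly a second and a half after the child's action, delivering it neither instantly nor late. A minimal sketch — the 1.5-second figure is from the article, but the function names are illustrative, not RUBI's actual interface:

```python
import time

def respond_in_sync(generate_response, event_time, target_delay=1.5,
                    now=time.monotonic):
    """Deliver the robot's reply about target_delay seconds after the
    child's action (event_time), per the timing the researchers report.

    generate_response: callable computing the reply (may take a moment).
    """
    reply = generate_response()          # compute while the clock runs
    elapsed = now() - event_time
    if elapsed < target_delay:
        time.sleep(target_delay - elapsed)   # too fast: hold the reply back
    # If computation already overran the budget, deliver immediately;
    # a real system would have to keep its processing under target_delay.
    return reply
```

The interesting design point is that "too fast" is as disruptive as "too slow": an instant reaction reads as mechanical, so the robot deliberately pads short computations out to the natural conversational beat.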
Physical rhythm is crucial. In recent experiments at a day care center in Japan, researchers have shown that having a robot simply bob or shake at the same rhythm a child is rocking or moving can quickly engage even very fearful children with autism.
“The child begins to notice something in that synchronous behavior and open up,” said Marek Michalowski of Carnegie Mellon University, who collaborated on the studies. Once that happens, he said, “you can piggyback social behaviors onto the interaction, like eye contact, joint attention, turn taking, things these kids have trouble with.”
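The synchrony trick — bobbing at whatever rhythm the child is already rocking — comes down to estimating the child's period from a motion signal and adopting it. A rough sketch under obvious simplifications (a clean one-dimensional motion trace; a real system would work from noisy camera or accelerometer data):

```python
def estimate_period(samples, timestamps):
    """Estimate the rocking period of a 1-D motion signal.

    Uses upward zero crossings of the mean-centred signal: one full
    cycle spans two consecutive crossings in the same direction.
    """
    mean = sum(samples) / len(samples)
    centred = [s - mean for s in samples]
    upward = [timestamps[i] for i in range(1, len(centred))
              if centred[i - 1] < 0 <= centred[i]]   # upward crossings only
    if len(upward) < 2:
        return None                                  # not enough cycles observed
    gaps = [b - a for a, b in zip(upward, upward[1:])]
    return sum(gaps) / len(gaps)                     # average cycle length
```

Once the period is known, the robot simply drives its own bobbing motion at the same rate; the social behaviors Mr. Michalowski mentions are layered on only after that rhythmic lock is established.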
One way to begin this process is to have a child mimic the physical movements of a robot and vice versa. In a continuing study financed by the National Institutes of Health, scientists at the University of Connecticut are conducting therapy sessions for children with autism using a French robot called Nao, a two-foot humanoid that looks like an elegant Transformer toy. The robot, remotely controlled by a therapist, demonstrates martial arts kicks and chops and urges the child to follow suit; then it encourages the child to lead.
“I just love robots, and I know this is therapy, but I don’t know — I think it’s just fun,” said Sam, an 8-year-old from New Haven with Asperger’s syndrome, who recently engaged in the therapy.
This simple mimicry seems to build a kind of trust, and increase sociability, said Anjana Bhat, an assistant professor in the department of education who is directing the experiment. “Social interactions are so dependent on whether someone is in sync with you,” Dr. Bhat said. “You walk fast, they walk fast; you go slowly, they go slowly — and soon you are interacting, and maybe you are learning.”
Personality matters, too, on both sides. In their studies with Asimo, the Honda robot, researchers have found that when the robot teacher is “cooperative” (“I am going to put the water glass here; do you think you can help me by placing the water glass on the same place on your side?”), children 4 to 6 did much better than when Asimo lectured them, or allowed them to direct themselves (“place the cup and saucer anywhere you like”). The teaching approach made less difference with students ages 7 to 10.
“The fact is that children’s reactions to a robot may vary widely, by age and by individual,” said Sandra Okita, a Columbia University researcher and co-author of the study.
If robots are to be truly effective guides, in short, they will have to do what any good teacher does: learn from students when a lesson is taking hold and when it is falling flat.
Learning From Humans
“Do you have any questions, Simon?”
On a recent Monday afternoon, Crystal Chao, a graduate student in robotics at the Georgia Institute of Technology, was teaching a five-foot robot named Simon to put away toys. She had given some instructions — the flower goes in the red bin, the block in the blue bin — and Simon had correctly put away several of these objects. But now the robot was stumped, its doughboy head tipped forward, its fawn eyes blinking at a green toy water sprinkler.
Ms. Chao repeated her query, perhaps the most fundamental in all of education: Do you have any questions?
“Let me see,” said Simon, in a childlike machine voice, reaching to pick up the sprinkler. “Can you tell me where this goes?”
“In the green bin,” came the answer.
Simon nodded, dropping it in that bin.
“Makes sense,” the robot said.
In addition to tracking motion and recognizing language, Simon accumulates knowledge through experience.
Just as humans can learn from machines, machines can learn from humans, said Andrea Thomaz, an assistant professor of interactive computing at Georgia Tech who directs the project. For instance, she said, scientists could equip a machine to understand the nonverbal cues that signal “I’m confused” or “I have a question” — giving it some ability to monitor how its lesson is being received.
To ask, as Ms. Chao did: Do you have any questions?
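Simon's behavior — sort what it knows, ask when it doesn't, and remember the answer — is an instance of what machine-learning researchers call active learning. A toy sketch of the pattern (the bin labels and confidence threshold are invented for illustration; the real robot works from vision and speech, not a lookup table):

```python
CONFIDENCE_THRESHOLD = 0.8   # below this, the robot asks rather than guesses

def sort_toy(toy, learned_bins, ask_teacher):
    """Place a toy in a bin, asking the teacher when unsure.

    learned_bins: dict mapping toy name -> (bin, confidence).
    ask_teacher:  callable(toy) -> bin; the answer is stored for next time.
    """
    bin_name, confidence = learned_bins.get(toy, (None, 0.0))
    if confidence < CONFIDENCE_THRESHOLD:
        bin_name = ask_teacher(toy)            # "Can you tell me where this goes?"
        learned_bins[toy] = (bin_name, 1.0)    # a direct answer is fully trusted
    return bin_name
```

The key property is that each question is asked at most once: after the teacher answers about the green sprinkler, the robot files it away and handles the same toy on its own thereafter — which is exactly the accumulation of experience the article describes.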
This ability to monitor and learn from experience is the next great frontier for social robotics — and it probably depends, in large part, on unraveling the secrets of how the human brain accumulates information during infancy.
In San Diego, researchers are trying to develop a human-looking robot with sensors that approximate the complexity of a year-old infant’s abilities to feel, see and hear. Babies learn, seemingly effortlessly, by experimenting, by mimicking, by moving their limbs. Could a machine with sufficient artificial intelligence do the same? And what kind of learning systems would be sufficient?
The research group has bought a $70,000 robot, built by a Japanese company, that is controlled by a pneumatic pressure system that will act as its senses, in effect helping it map out the environment by “feeling” in addition to “seeing” with embedded cameras. And that is the easy part.
The much steeper challenge is to program the machine to explore, as infants do, and build on moment-to-moment experience. Ideally its knowledge will be cumulative, not only recalling the layout of a room or a house, but using that stored knowledge to make educated guesses about a new room.
The researchers are shooting for nothing less than capturing the foundation of human learning — or, at least, its artificial intelligence equivalent. If robots can learn to learn, on their own and without instruction, they can in principle make the kind of teachers that are responsive to the needs of a class, even an individual child.
Parents and educators would certainly have questions about robots’ effectiveness as teachers, as well as ethical concerns about potential harm they might do. But if social robots take off in the way other computing technologies have, parents may have more pointed ones: Does this robot really “get” my child? Is its teaching style right for my son’s needs, my daughter’s talents?
That is, the very questions they would ask about any teacher.
Choe Sang-Hun contributed reporting from Seoul.
Taken from The New York Times
Thursday, June 17, 2010
Tuesday, June 1, 2010
Some more information...
BBC News - How volcano chaos unfolded: in graphics
news.bbc.co.uk
The ash cloud produced by the eruption of a sub-glacial volcano in Iceland brought chaos to the European air industry between 14 and 21 April. Since then, disruption to flights has continued sporadically depending on the varying intensity of ash cloud and weather patterns. ...
Sunday, May 30, 2010
Brazil and Turkey's Nuclear Deal with Iran
For a harshly critical view of Lula and Brazil’s actions vis-à-vis Iran, see Thomas L. Friedman, “As Ugly as It Gets,” The New York Times (op-ed), May 25, 2010 (May 26, 2010 print edition)
Compare Friedman’s views with those of his fellow op-ed columnist, Roger Cohen, cited below (May 20, 2010). Cohen has spent a lot of time reporting from Iran, up to and including the period of protests in the summer of 2009 following the June 12 first-round presidential elections.
See also “Obama havia dito a Lula que acordo com Irã criaria confiança,” Veja, May 21, 2010.
For an interesting look at the degree of Brazil’s growing relationship with Iran, see Claudio Dantas Sequeira, “O acordo secreto do Brasil com o Irã,” ISTOÉ, July 1, 2009, updated November 15, 2009.
Op-Ed Columnist - As Ugly As It Gets - NYTimes.com
www.nytimes.com
It’s shameful for Brazil and Turkey, nascent democracies, to embrace the Iranian president, who crushes democracy.
Op-Ed Columnist - America Moves the Goalposts - NYTimes.com
www.nytimes.com
Further sanctions will not change Iran's nuclear behavior; negotiations might. The Brazilian-Turkish Iran deal is worth pursuing.
Comments are invited
Saturday, May 29, 2010
Discussing natural disasters


Hi, there!
I've received a link from a friend with these pictures showing the eruption of the volcano in Iceland! The name... awfully difficult to pronounce. I've tried many times but wasn't able to! But the photos? Really amazing!
How about having a look at the photos and thinking about the words connected to them that you don't know in English! For example: ash clouds. Would that come straight to your mind? Even if you don't know a word in English, write it in Portuguese; if you're lucky, you'll be able to find it in the text!
Have you got the list of words with you?
Here's the text:
Ash cloud flight delays extended
Restrictions on England's airspace will remain in place until at least 1300 BST on Saturday as a cloud of volcanic ash continues to drift across the country.
However, the air traffic control body Nats said Manchester, Liverpool and all other airports north of there may be operational between 0400 and 1000 BST.
It warned that the situation was changing constantly.
The grounding of aircraft began on Thursday morning after a volcanic eruption in Iceland.
The restrictions were imposed because of the danger the ash poses to aircraft.
Tiny particles of rock, glass and sand in the cloud could damage engines.
Initially, Nats said all flights would be grounded until 1800 BST on Thursday.
However, further reviews meant the ban was extended until at least 0700 BST on Saturday, and then until 1300 BST - with the possible exception of some northern airports.
Barbados flights
In Manchester, Nats lifted its restriction on flights for an hour to allow a plane to take off before 1300 BST on Friday.
It travelled to Florida in the US to pick up stranded holidaymakers.
The restriction was lifted due to the positioning of the ash cloud.
Two flights from Barbados and Vancouver were also able to land at the airport.
A Nats spokesman said: "We are looking for opportunities when the ash cloud moves sufficient for us to enable some flights to operate under individual co-ordination with ATC [air traffic control]."
Restrictions were lifted in Scotland and Northern Ireland earlier.
Some smaller airlines in England have been operating flights, on routes where aircraft fly below the controlled airspace.
Isle of Man airline Manx2 has a few services between the island and Blackpool and Belfast, and there are flights between Newquay in Cornwall and the Isles of Scilly.
Thousands of passengers across England have faced severe disruption since Thursday.
Airports have been cleared of all but essential staff.
Travellers were warned further delays could be expected when restrictions are eventually lifted and they have been advised to check with their airlines for up-to-date information.
Teesside University lecturer Michael Short is stuck in Stockholm, Sweden, after flying out there for a conference.
Speaking on Friday afternoon, he said: "I should be on my way home now. I should have landed at Heathrow. It's annoying really.
"I can think of worse places to be stuck but I am supposed to be at a friend's wedding tomorrow and I've got to get back for my job as well.
"There's not a great deal I can do though."
Barbara and Tony Mallinder travelled to Heathrow from their home in Rotherham, South Yorkshire, hoping to catch a flight to Shanghai on Friday afternoon.
They are due to leave on a cruise around Hong Kong, Vietnam and Japan on Sunday, but now fear they won't make it.
Mrs Mallinder said: "We didn't know it would still be shut.
"Unlike other people, we can't go 48 hours later and have the rest of our holiday.
"But it can't be helped. They can't send the planes up if it's dangerous."
The ash has also affected air ambulances, with several areas grounding their aircraft and transferring crews to rapid response vehicles.
Helicopters which support North Sea oil and gas rigs have also been grounded.
Many air passengers have tried to find alternative transport methods.
Eurostar trains reported a complete sell-out of its services to Brussels and Paris for the second day on Friday.
Ferry operator Norfolkline laid on special coaches to take foot passengers from Dover to France.
The ash cloud was created by an eruption in the Eyjafjallajoekull area of Iceland, which began on Wednesday and is continuing.
It is the second in Iceland in less than a month.
An atmospheric research team from Gloucestershire has been monitoring the volcanic ash cloud, having flown to the edge of the plume in a specially-adapted plane.
Story from BBC NEWS: http://news.bbc.co.uk/go/pr/fr/-/2/hi/uk_news/england/8624178.stm | Published: 2010/04/16 20:39:58 GMT | © BBC MMX
Was it too long? Well, I hope you've read at least part of it!
Now, let's get down to some more serious work!
What would be relevant to say about it? Post your comments... your friends might have different ideas!
Let's have a look at how the linker however is used in the text.
As you can notice, the verb be occurred in the text 8 times! So let's have a look at the words on its left and on its right. Have you come to any conclusions?
Now let's observe the word been...
Now you could prepare a summary of this piece of news and afterwards record a one-minute podcast to post to your colleagues! Don't forget to include some of the new items! Enjoy!