Anyone care to share their thoughts?
By Tom Heneghan
PARIS (Reuters) - Neuroscientists are making such rapid progress in unlocking the brain's secrets that some are urging colleagues to debate the ethics of their work before it can be misused by governments, lawyers or advertisers.
The news that brain scanners can now read a person's intentions before they are expressed or acted upon has given a new boost to the fledgling field of neuroethics that hopes to help researchers separate good uses of their work from bad.
The same discoveries that could help the paralyzed use brain signals to steer a wheelchair or write on a computer might also be used to detect possible criminal intent, religious beliefs or other hidden thoughts, these neuroethicists say.
"The potential for misuse of this technology is profound," said Judy Illes, director of the Stanford University neuroethics program in California. "This is a truly urgent situation."
The new boost came from a research paper published last week showing that neuroscientists can now not only locate the brain area where a certain thought occurs but also probe that area to read out some kinds of thought occurring there.
Its author, John-Dylan Haynes of the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig, Germany, compared this to learning how to read books after simply being able to find them before. "That is a huge step," he said.
Haynes hastened to add that neuroscience is still far from developing a scanner that could easily read random thoughts.
"But what we can do is read out some simple things that are quite useful for applications, such as simple intentions, attitudes or emotional states," he said. "We're finding we can read out yes-or-no situations."
"THIS COULD BE REALLY BIG"
Haynes and his research team used a brain scanning technique called functional magnetic resonance imaging to detect a volunteer's unspoken decision to add or subtract two numbers flashed on a screen. They got it right 70 percent of the time.
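The decoding step described here — mapping a pattern of brain activity to one of two intended operations — is at its core binary pattern classification. Below is a minimal sketch using synthetic data and a nearest-centroid classifier; the voxel counts, class separation, and classifier choice are illustrative assumptions, not the researchers' actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for fMRI data: each trial is a vector of voxel
# activations; "add" and "subtract" intentions have slightly shifted means.
n_voxels = 50
mean_add = rng.normal(0.0, 1.0, n_voxels)
mean_sub = mean_add + rng.normal(0.0, 0.4, n_voxels)  # overlapping classes

def make_trials(mean, n):
    """Generate n noisy trials around a class mean."""
    return mean + rng.normal(0.0, 1.0, (n, n_voxels))

X_train = np.vstack([make_trials(mean_add, 100), make_trials(mean_sub, 100)])
y_train = np.array([0] * 100 + [1] * 100)  # 0 = add, 1 = subtract

# Nearest-centroid classifier: assign each trial to the closer class mean.
centroids = np.array([X_train[y_train == c].mean(axis=0) for c in (0, 1)])

def predict(X):
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

X_test = np.vstack([make_trials(mean_add, 50), make_trials(mean_sub, 50)])
y_test = np.array([0] * 50 + [1] * 50)
accuracy = (predict(X_test) == y_test).mean()
print(f"decoding accuracy: {accuracy:.0%}")  # well above the 50% chance level
```

As in the reported experiment, accuracy well short of 100 percent can still be meaningful, because the chance level for a two-way choice is 50 percent.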
Barbara Sahakian, a clinical neuropsychologist at Cambridge University in Britain, saw potential for misuse resembling the plot of Steven Spielberg's 2002 movie Minority Report, in which police arrest people whom psychics predict will commit murder.
"We have to discuss how we want to use this technology and who should have access to it," she said.
Martha Farah, director of the University of Pennsylvania's Center for Cognitive Neuroscience, said advances such as Haynes's way of reading more out of imaging data were opening the path to very rapid growth in understanding the brain.
"We're just beginning to find out the power of these more fine-grained analyses," she said. "From the neuroethics point of view, this could be really big."
Farah, Illes and Sahakian are among a small group of neuroscientists who founded the Neuroethics Society in May 2006 to promote an international debate about the proper use of the discoveries their field was making and will make in future.
"As a neuroscientist, I'm trained to think about these issues once I have a result in hand," Illes said. "But we need to think about those ethical implications right now.
"People want to know if, when they go to an airport, their luggage will go through one scanner and their brains will go through another. Do I think that's around the corner? I do."
NO TO COMMERCIAL MIND-READING
Haynes estimated his research into unspoken intentions could yield simple applications within the next 5 to 10 years, such as reading a person's attitude to a company during a job interview or testing consumer preferences through "neuromarketing."
There are already companies trying to use brain scanners to build a more accurate lie detector, a technology that could dazzle judges and juries so much that they could mistake it for the final word in deciding a case, the researchers said.
Law enforcement officials might use the technology, which tracks heightened activity in areas linked to mental responses to outside stimuli, to screen people for pedophilia, racial bias, aggression or other undesirable tendencies, they said.
"If you're reading out something for neuromarketing or job interviews, or doing this against people's wills, that could be considered unethical," Haynes said.
Lie detection is more complex, he said, because it can violate mental privacy but also prove innocence. In some cases, refusing to use it to uphold a right of mental privacy could end up denying an accused person's right of self-defense.
Amid all these worrying scenarios, Haynes said people should learn about the promise and the limits of brain scanning so they can make informed decisions when new applications arise. They should also keep the technology in its proper context.
"I still strongly support the power of simple questions in psychology," he said. "If you want to tell what someone is going to do, the best way to find out is to ask them."
© Reuters 2007. All Rights Reserved.
It rather reminds me of that 'futuristic' movie 'Minority Report'. You may remember this look into the future where the police used psychics to tell if someone was ABOUT to commit a crime EVENTUALLY. No? Well, it wasn't a very good movie, especially since the police already do that, but call it 'profiling'.
Back to this story, I REALLY took note of the repeated use of the words 'ethical' and 'UN-ethical'. LOL, when a SCIENTIST starts talking about what is or isn't ethical concerning a line of research, that should be the big giant RED FLAG telling them to STOP THE RESEARCH!!! But let's face it; it is all about the money! And there isn't a scientist that can honestly say different, because it is the money that makes the research possible! What is needed is for some stupidly rich entity to hand over a lab and just tell the scientists, 'don't worry about cost, invent/find/research something, ANYTHING!!!' Yeah, AS IF THAT WOULD HAPPEN!
LOL, yes but WE are BETTER than that! Aren't we?!?!
Quote: Originally Posted by Ariane: "Waaa... That's horrible! I don't see any medical purpose for this research. They talk about ethics but sincerely, who thinks that the military (because I suppose they are interested in developing these kinds of technologies) cares about ethics??? Science, money, politics and ideology... History already tells us that it can be a very dangerous mix." — Ariane
Actually, for a quadriplegic, and for people with several other major handicaps, this technology could be a wonderful gift. They could one day interface with electronic devices of all sorts just by thought. They have done some impressive early stage stuff towards this goal already.
THE PROBLEM IS THE OTHER THINGS THIS TECHNOLOGY CAN DO... to otherwise healthy people! Mind-reading people at the door to the sports arena, the courthouse, the grocery store, the airport, and perhaps even in your own home, sounds like an invasion of privacy at a deeply profound level.
Imagine if they learn to implant thoughts too, it would be the ultimate in sheep herding and commercial advertising technology.
We would never have privacy at any level ever again.
So, as the author of the article says, it is all in the ethical use of the technology. Personally, I am not too fond of what they, the powers that be, call “ethical.”
The way I see it, this technology does have the potential to do a lot of good but it also has the capacity to turn the human race into programmable machines with open source code that is viewable by anyone with the controls.
What MACJR said!!!!! Of course, atomic energy has been used for MANY great things, power, medicine, etc., but its ULTIMATE use still makes one wonder at the wisdom of inventing it so soon! On the other hand, the conventional bombing done in both Germany & Japan BEFORE the A-bomb was dropped made those two bombs pale in comparison, didn't it?
Scientists try to predict intentions
By MARIA CHENG, AP Medical Writer
BERLIN - At a laboratory in Germany, volunteers slide into a donut-shaped MRI machine and perform simple tasks, such as deciding whether to add or subtract two numbers, or choosing which of two buttons to press. They have no inkling that scientists in the next room are trying to read their minds — using a brain scan to figure out their intention before it is turned into action.
While still in its initial stages, the techniques may eventually have wide-ranging implications for everything from criminal interrogations to airline security checks. And that alarms some ethicists who fear the technology could one day be abused by authorities, marketers, or employers.
Tanja Steinbach, a 21-year-old student in Leipzig who participated in the experiment, found it a bit spooky but wasn't overly concerned about the civil liberties implications.
"It's really weird," she said. "But since I know they're only able to do this if they have certain machines, I'm not worried that everybody else on the street can read my mind."
Researchers have long used MRI machines to identify different types of brain activity, and scientists in the United States have recently developed brain scans designed for lie detection.
But outside experts say the work led by Dr. John-Dylan Haynes at the Bernstein Center is groundbreaking.
"The fact that we can determine what intention a person is holding in their mind pushes the level of our understanding of subjective thought to a whole new level," said Dr. Paul Wolpe, a professor of psychiatry at the University of Pennsylvania, who was not connected to the study.
The research, which began in July 2005, has been of limited scope: only 21 people have been tested so far. And the 71 percent accuracy rate is only about 20 percentage points better than chance, which is 50 percent for a two-way choice.
Still, the research conducted at the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig, about 150 kilometers (90 miles) southwest of Berlin, has been generating strong interest in the scientific community.
"Haynes' experiment strikes at the heart of how good we will get at predicting behaviors," said Dr. Todd Braver, an associate professor in the department of psychology at Washington University, who was not connected with the research.
"The barriers that we assumed existed in reading our minds keep getting breached."
In one study, participants were told to decide whether to add or subtract two numbers a few seconds before the numbers were flashed on a screen. In the interim, a computer captured images of their brain activity to predict the subject's decision — with one pattern suggesting addition, and another subtraction.
Haynes' team began its research by trying to identify which part of the brain stores intentions. By scanning for bursts of activity when subjects were given choices, they located it in the prefrontal cortex region.
Then they went about studying which patterns were associated with different intentions.
"If you knew which thought signatures to look for, you could theoretically predict in more detail what people were going to do in the future," said Haynes.
For the moment, reading minds is a cumbersome process and there is no chance scientists could spy on decision-making surreptitiously. Haynes' studies focus on people who choose between just two alternatives, not the infinite number present in everyday life.
But scientists are making enough progress to make ethicists nervous, since the research has already progressed from identifying the regions of the brain where certain thoughts occur to identifying the very content of those thoughts.
"These technologies, for the first time, give us a real possibility of going straight to the source to see what somebody is thinking or feeling, without them having any ability to stop us," said Dr. Hank Greely, director of Stanford University's Center for Law and the Biosciences.
"The concept of keeping your thoughts private could be profoundly altered in the future," he said.
Civil libertarians are concerned that mind-reading technology may fit into a trend of pre-emptive security measures in which authorities could take action against individuals before they commit a crime — a scenario explored in the 2002 science fiction film "Minority Report."
Already, Britain is creating a national DNA database that would allow authorities to track people with violent predispositions. In addition, the government has also floated the idea of locking up people with personality disorders that could lead to criminal behavior.
"We need to start thinking about how far we are going to allow these technologies to be used," said Wolpe.
Despite the fears, Haynes believes his research has more benign practical applications.
For example, he says it will contribute to the development of machines already in existence that respond to brain signals and allow the paralyzed to change TV channels, surf the Internet, and operate small robotic devices.
For now, the practical applications of Haynes' research are years if not decades away.
"We are making the first steps in reading out what the specific contents of people's thoughts are by trying to understand the language of the brain," Haynes said. "But it's not like we are going to have a machine tomorrow."
By RACHEL KONRAD, AP Technology Writer Mon Apr 30, 7:48 AM ET
SAN JOSE, Calif. - A convincing twin of Darth Vader stalks the beige cubicles of a Silicon Valley office, complete with ominous black mask, cape and light saber. But this is no chintzy Halloween costume. It's a prototype, years in the making, of a toy that incorporates brain wave-reading technology.
Engineers at NeuroSky Inc. have big plans for brain wave-reading toys and video games. They say the simple Darth Vader game — a relatively crude biofeedback device cloaked in gimmicky garb — portends the coming of more sophisticated devices that could revolutionize the way people play.
Technology from NeuroSky and other startups could make video games more mentally stimulating and realistic. It could even enable players to control video game characters or avatars in virtual worlds with nothing but their thoughts.
Adding biofeedback to "Tiger Woods PGA Tour," for instance, could mean that only those players who muster Zen-like concentration could nail a putt. In the popular action game "Grand Theft Auto," players who become nervous or frightened would have worse aim than those who remain relaxed and focused.
NeuroSky's prototype measures a person's baseline brain-wave activity, including signals that relate to concentration, relaxation and anxiety. The technology ranks performance in each category on a scale of 1 to 100, and the numbers change as a person thinks about relaxing images, focuses intently, or gets kicked, interrupted or otherwise distracted.
The technology is similar to more sensitive, expensive equipment that athletes use to achieve peak performance. Koo Hyoung Lee, a NeuroSky co-founder from South Korea, used biofeedback to improve concentration and relaxation techniques for members of his country's Olympic archery team.
"Most physical games are really mental games," said Lee, also chief technology officer at San Jose-based NeuroSky, a 12-employee company founded in 1999. "You must maintain attention at very high levels to succeed. This technology makes toys and video games more lifelike."
Boosters say toys with even the most basic brain wave-reading technology — scheduled to debut later this year — could boost mental focus and help kids with attention deficit hyperactivity disorder, autism and mood disorders.
But scientific research is scant. Even if the devices work as promised, some question whether people who use biofeedback devices will be able to replicate their relaxed or focused states in real life, when they're not attached to equipment in front of their television or computer.
Elkhonon Goldberg, clinical professor of neurology at New York University, said the toys might catch on in a society obsessed with optimizing performance — but he was skeptical they'd reduce the severity of major behavioral disorders.
"These techniques are used usually in clinical contexts. The gaming companies are trying to push the envelope," said Goldberg, author of "The Wisdom Paradox: How Your Mind Can Grow Stronger As Your Brain Grows Older." "You can use computers to improve the cognitive abilities, but it's an art."
It's also unclear whether consumers, particularly American kids, want mentally taxing games.
"It's hard to tell whether playing games with biofeedback is more fun — the company executives say that, but I don't know if I believe them," said Ben Sawyer, director of the Games for Health Project, a division of the Serious Games Initiative. The think tank focuses in part on how to make computer games more educational, not merely pastimes for kids with dexterous thumbs.
The basis of many brain wave-reading games is electroencephalography, or EEG, the measurement of the brain's electrical activity through electrodes placed on the scalp. EEG has been a mainstay of psychiatry for decades.
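The measurement the article describes usually comes down to estimating how much signal power falls in particular frequency bands of the scalp recording (alpha is conventionally linked to relaxation, beta to active concentration). A minimal sketch with a synthetic one-channel signal; the sampling rate, band edges, and signal itself are illustrative assumptions, not any vendor's implementation:

```python
import numpy as np

fs = 256  # sampling rate in Hz, a common choice for consumer EEG
t = np.arange(0, 4, 1 / fs)  # four seconds of signal

# Synthetic one-channel "EEG": a 10 Hz alpha rhythm buried in white noise.
rng = np.random.default_rng(1)
signal = 2.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1.0, t.size)

def band_power(x, fs, lo, hi):
    """Mean power spectral density within [lo, hi] Hz, via the FFT."""
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / x.size
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

alpha = band_power(signal, fs, 8, 12)   # relaxation-linked band
beta = band_power(signal, fs, 13, 30)   # concentration-linked band
print(f"alpha/beta power ratio: {alpha / beta:.1f}")
```

For this relaxed-looking synthetic signal the alpha band dominates; a biofeedback device would track ratios like this over time rather than a single four-second window.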
An EEG headset in a research hospital may have 100 or more electrodes that attach to the scalp with a conductive gel. It could cost tens of thousands of dollars.
But the price and size of EEG hardware is shrinking. NeuroSky's "dry-active" sensors don't require gel, are the size of a thumbnail, and could be put into a headset that retails for as little as $20, said NeuroSky CEO Stanley Yang.
Yang is secretive about his company's product lineup because of a nondisclosure agreement with the manufacturer. But he said an international toy manufacturer plans to unveil an inexpensive gizmo with an embedded NeuroSky biosensor at the Japan Toy Association's trade show in late June. A U.S. version is scheduled to debut at the American International Fall Toy Show in October.
"Whatever we sell, it will work on 100 percent or almost 100 percent of people out there, no matter what the condition, temperature, indoor or outdoors," Yang said. "We aim for wearable technology that everyone can put on and go without failure, as easy as the iPod."
Researchers at NeuroSky and other startups are also building prototypes of toys that use electromyography (EMG), which records twitches and other muscular movements, and electrooculography (EOG), which tracks eye movements via the standing electrical potential between the cornea and retina.
While NeuroSky's headset has one electrode, Emotiv Systems Inc. has developed a gel-free headset with 18 sensors. Besides monitoring basic changes in mood and focus, Emotiv's bulkier headset detects brain waves indicating smiles, blinks, laughter, even conscious thoughts and unconscious emotions. Players could kick or punch their video game opponent — without a joystick or mouse.
"It fulfills the fantasy of telekinesis," said Tan Le, co-founder and president of San Francisco-based Emotiv.
The 30-person company hopes to begin selling a consumer headset next year, but executives would not speculate on price. A prototype hooks up to gaming consoles such as the Nintendo Wii, Sony PlayStation 3 and Microsoft Xbox 360.
Le, a 29-year-old Australian woman, said the company decided in 2004 to target gamers because they would generate the most revenue — but eventually Emotiv will build equipment for clinical use. The technology could enable paralyzed people to "move" in virtual reality; people with obsessive-compulsive disorders could measure their anxiety levels, then adjust medication accordingly.
The husband-and-wife team behind CyberLearning Technology LLC took the opposite approach. The San Marcos-based startup targets doctors, therapists and parents of adolescents with autism, impulse control problems and other pervasive developmental disorders.
CyberLearning is already selling the SmartBrain Technologies system for the original PlayStation, PS2 and original Xbox, and it will soon work with the PlayStation 3 and Xbox 360. The EEG- and EMG-based biofeedback system costs about $600, not including the game console or video games.
Kids who play the race car video game "Gran Turismo" with the SmartBrain system can only reach maximum speed when they're focused. If attention wanes or players become impulsive or anxious, cars slow to a chug.
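The gating mechanic described — full speed only under sustained focus, a crawl when attention wanes — can be sketched as a simple mapping from an attention score to a speed cap. The threshold and speeds below are illustrative guesses, not CyberLearning's actual parameters:

```python
def speed_cap(attention, top_speed=200.0):
    """Map an attention score (0-100) to the car's maximum speed.

    Below a focus threshold the car slows to a chug; above it, the
    cap scales linearly with attention up to the game's top speed.
    """
    attention = max(0.0, min(100.0, attention))
    if attention < 40:              # distracted or impulsive: chug along
        return 0.2 * top_speed
    return top_speed * attention / 100.0

print(speed_cap(100))  # fully focused: 200.0
print(speed_cap(10))   # distracted:    40.0
```

A real system would also smooth the attention signal over a few seconds so momentary blinks or sensor noise don't make the car lurch.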
CyberLearning has sold more than 1,500 systems since early 2005. The company hopes to reach adolescents already being treated for behavior disorders. But co-founder Lindsay Greco said the budding niche is unpredictable.
"Our biggest struggle is to find the target market," said Greco, who has run treatment programs for children with attention difficulties since the 1980s. "We're finding that parents are using this to improve their own recall and focus. We have executives who use it to improve their memory, even their golf."
It seems we are getting ever closer to having to wear those brain scanning helmets at work… and possibly at all times of the day and night.
We are watched via video cameras in about all the public spaces these days, we are eavesdropped on in our private conversations, on the phone and online, and coming soon, we will be brain scanned to know our thoughts and even when we get bored. Next, we will have our thoughts manipulated and utterly, and completely, lose all illusions of free will.
Don’t you just love technology developed by the control freaks, not the kind of technology that makes your life easier, but the kind of technology meant only to control you?
The Discovery Channel News
April 22, 2008 -- It turns out that dull tasks really do numb the brain. Researchers have discovered that as people perform monotonous tasks, their brain shifts towards an at-rest mode whether they like it or not.
And by monitoring that area of the brain, they were able to predict when someone was about to make a mistake before they made it, a study published Monday in the Proceedings of the National Academy of Sciences found.
"There's this thing that's probably intrinsic where your brain says I do need to take a little break here and there's nothing you can do about it," said study author Tom Eichele of Norway's University of Bergen.
"Probably everyone knows that feeling that sometimes your brain is not as receptive or as well performing and you didn't do anything to actually induce that."
When that happens, blood flows into the part of the brain which is more active in states of rest.
And since this state begins about 30 seconds prior to a mistake being made, it could be possible to design an early-warning system which could alert people to be more focused or more careful, Eichele said.
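The early-warning idea described here — flag a drift into rest mode before a mistake happens — amounts to threshold detection on a monitored signal. A minimal sketch; the rolling-baseline window, threshold factor, and the trace itself are illustrative assumptions, not the study's method:

```python
def lapse_warnings(rest_activity, baseline_n=10, threshold=1.5):
    """Return sample indices where rest-mode activity exceeds the
    rolling baseline mean by a fixed factor — candidate moments to
    alert the user to refocus."""
    warnings = []
    for i in range(baseline_n, len(rest_activity)):
        baseline = sum(rest_activity[i - baseline_n:i]) / baseline_n
        if rest_activity[i] > threshold * baseline:
            warnings.append(i)
    return warnings

# Steady activity, then a drift into rest mode starting around sample 15.
trace = [1.0] * 15 + [1.2, 1.6, 2.1, 2.4]
print(lapse_warnings(trace))  # → [16, 17, 18]
```

Since the study reports that the rest-mode shift begins roughly 30 seconds before an error, even a crude detector like this could in principle fire early enough to be useful.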
That could significantly improve workplace safety and also improve performance in key tasks such as airport security screening.
"We might be able to build a device (that could be placed) on the heads of people that make these easy decisions," he said.
"We can measure the signal and give feedback to the user that well, your brain is in the state where your decisions are not going to be the right one."
Eichele and his colleagues in the United States, Britain and Germany were able to detect these brain patterns with MRI scans, which are not portable.
The next step is to see if more mobile EEG devices are able to detect the phenomenon.
A prototype of a wireless, mobile, and lightweight EEG amplifier is currently in development and could be ready for the market in 10 to 15 years, he said.
The Old MACJR'S Mini-Verse² Forum - Copyright © 2005-2016 - All rights reserved