Aetherius Rimor wrote:
What the result of my goal is used for, I could care less about honestly, but I know it could have many useful applications.
The easiest way to test the functionality of a software translation of the human brain, is as a form of AI. Sure it won't be nearly as efficient as specifically targeted AI programs, but it will suit my purposes just fine.
That mindset genuinely frightens me. You are not being technorealistic. What problem does your research seek to solve? Will it create new problems?
I sense a sort of doublethink. First you state that you don't care what it can be used for; so it may be used in bad ways too. Then you claim, without any argument, that it could have many useful applications. These two statements are in conflict: if you are certain it has useful applications, you must have thought about it and come up with a few to justify your research. And if so, you would care, because whatever you came up with would be useful in your own opinion.
I am nervous about any research being done these days involving our brain. As we come to understand the brain, we open up novel ways of, indeed, hacking into it. There are already completed studies that scan people's brains and infer the nature of their thoughts. There is even a project that can reconstruct, from brain activity alone, a rough version of the image you were recalling while being measured. It is a vague picture, but it is still a leap forward.
What about the ethical implications? Would your studies be one piece of a puzzle which, once completed, lets us detect what people are thinking and 'correct' it when governments find it... undesirable?
Curiosity is one thing, but we live in a society where privacy means less every day. Websites are hacked, identities are stolen, and individuals are massively profiled. With our privacy goes our autonomy, as we live in a fear-ridden society where child porn and terrorism are used as excuses to take away our most fundamental rights. We are entering an age where only the content of our minds is still safe, and technology and science are working hard to undo even that.
Of course, in most cases the intentions are genuinely honest: 'If we can understand how it works, then...' followed by a disease to cure, a drug that might treat some ailment, or some abstract goal that could possibly lead to another, much as quantum computing researchers keep announcing that they have found another piece of the puzzle bringing quantum computers closer.
Can you justify being part of this larger societal predicament, which carelessly dismisses such issues as mere fear? We live in a world connected by computers; in many ways it is the infrastructure of our own oppression. With the ethics we practice now, any study undertaken to understand the brain is bound to be misused by governments, Big Pharma and secret services.
You should tread very carefully and think hard about why you wish to make a career out of this. And if you do pursue it, consider what steps you can take to make sure the results of your findings cannot be used in some bad way.
I hope you do not take this as an attack. These are serious concerns to me as a privacy fighter/technorealist. I add this because, for some people, science is something they hold onto for psychological stability, and as such it becomes an ego issue.