I have a PhD student working on EEG audio decoding. We are presently focused on a simpler subtopic: the detection of consonance and dissonance in the brain as it listens to music.
Prediction: even if this requires surgery, unlocking inner thought will be used in criminal proceedings to establish guilt, or in attempts to prove innocence. It will definitely be used unethically in military/intelligence interrogations until the law catches up.
I'm not sure whether this could tell the difference between truthful thoughts about actual memories and intrusive thoughts that could give entirely the wrong impression.
Yet they still use lie detectors, even though the signals they measure can be faked, or triggered by personal alarm or offense. So it is entirely possible, regardless.
As I understand it, the big challenge with brain electrodes is that because they are implanted in a big jiggly piece of jelly, they shift out of position and/or cause localized scarring. The practical effect is that the brain-electrode interface "wears out" after a while, and you can't get useful data. Has this been solved, or are implants still temporary?
> "It wasn't perfect, but 60% of the words were judged intelligible by testers"
I don't understand this part. Are they trying to pull the audio of the words out of the brain or something? I'd think it would be easier to use a dictionary of words and some machine learning to pull out the most likely next word from the brain activity, in which case 100% of the words would be intelligible.
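To make the idea concrete, here's a toy sketch of closed-vocabulary decoding (everything here is made up for illustration — the vocabulary, the bigram probabilities, and the "neural" scores are stand-ins for a real classifier and language model). Because the decoder can only pick words from the dictionary, every emitted token is a real word by construction:

```python
import numpy as np

# Hypothetical closed-vocabulary decoder. Score each dictionary word
# against the neural features and combine with a bigram language-model
# prior, so the output is always a valid word.
VOCAB = ["i", "want", "water", "please"]

# Stand-in for P(word | previous word) from a language model.
BIGRAM = {
    ("<s>", "i"): 0.9, ("i", "want"): 0.8,
    ("want", "water"): 0.7, ("water", "please"): 0.6,
}

def decode(neural_logprobs, prev="<s>", lm_weight=1.0):
    """Pick the vocab word maximizing neural score + weighted LM prior."""
    best, best_score = None, -np.inf
    for word, neural in zip(VOCAB, neural_logprobs):
        lm = np.log(BIGRAM.get((prev, word), 1e-3))  # floor for unseen pairs
        score = neural + lm_weight * lm
        if score > best_score:
            best, best_score = word, score
    return best

# Fake per-word classifier outputs for three time steps.
steps = [
    [0.2, -1.0, -1.0, -1.0],   # evidence favors "i"
    [-1.0, 0.1, 0.0, -1.0],    # ambiguous between "want" / "water"
    [-1.0, -1.0, 0.3, -1.0],   # evidence favors "water"
]
prev, out = "<s>", []
for s in steps:
    prev = decode(s, prev)
    out.append(prev)
print(" ".join(out))  # → i want water
```

Note how the language-model prior resolves the ambiguous second step toward "want": the output is guaranteed intelligible, but that says nothing about whether it matches what the person was actually thinking.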
No idea, but the words themselves would be intelligible. The only way I can think they could be generating unintelligible words is if they're building them from tokens/letters, or generating audio directly.
They don't seem to mention whether it is elective. An all-or-nothing mechanism might spell out words that the patient really didn't intend for others to see (like "Ugh, that guy again! I can't stand the way he...").
It is pretty difficult to control your inner dialog against spontaneous and triggered thoughts.
I wanted to comment on this HN entry with "people with intrusive thoughts sweating profusely" or something similar, but in truth, are there people with no intrusive thoughts whatsoever?
I for one don't fight them, regardless of how horrible they would sound spoken aloud, because so far I haven't seen any evidence of anyone reading my mind.
I also made a point of explaining to my child that her thoughts are hers and hers alone, so she can think whatever she likes.
I would rather not have to backtrack on any of this.
I think every verbal person has the ability to “speak” phrases in their mind; people without an internal monologue (as is, I suppose, the case for me) just don’t need / tend to do that with every thought they have.
> I don't understand this part. Are they trying to pull the audio of the words out of the brain or something? I'd think it would be easier to use a dictionary of words and some machine learning to pull out the most likely next word from the brain activity, in which case 100% of the words would be intelligible.
what percentage of the words would be correct though?
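Right — "intelligible" and "correct" are different metrics. Speech-decoding work typically reports accuracy as word error rate (edit distance over words). A minimal sketch of the standard computation (the example sentences are invented, not from the article):

```python
def wer(reference, hypothesis):
    """Word error rate: (substitutions + insertions + deletions) / ref length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Classic dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution or match
    return d[-1][-1] / len(ref)

# Every hypothesis word is a perfectly intelligible English word,
# yet half of them are wrong:
print(wer("i want some water", "i need some coffee"))  # → 0.5
```

So a decoder constrained to a dictionary could hit 100% intelligibility while still having a high word error rate.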
> It is pretty difficult to control your inner dialog against spontaneous and triggered thoughts.
There are people with no internal monologue whatsoever.