The Technology Behind Remote Mind Reading
=========================================

The technology can be broken into multiple parts: the ability to read brainwaves
remotely; the ability to decode brainwaves; the ability to beam signals back to
the brain to influence it; the ability to apply this from a long distance; a
mechanism for automation (to be able to apply it to a large number of victims);
and an infrastructure for population-scale deployment.

Believe it or not, every single one of those exists, and I will provide
well-founded explanations in tangible detail.

This article is concerned with the first two, while the third is explained in
`a later article <capabilities.rst>`_. Those are the most important points to
explain here; the remaining points are all boring mechanical details, since
they have long been possible and in use for a myriad of things other than
mind reading. Nevertheless, this folder contains articles with explanations
(and some speculations) about each one of them.

1. Remote Brainwave Monitoring
------------------------------

A device for remotely monitoring brainwave activity was invented by `Robert G.
Malech <https://hatch.kookscience.com/wiki/Robert_G._Malech>`_ back in 1974.
This is the patent for the device on Google Patents:

`US3951134A on Google Patents
<https://patents.google.com/patent/US3951134A/en>`_

`Web Archive mirror
<https://web.archive.org/web/20210505115428/https://patents.google.com/patent/US3951134A/en>`_

Quoting from the patent:

   Apparatus for and method of sensing brain waves at a position remote from a
   subject whereby electromagnetic signals of different frequencies are
   simultaneously transmitted to the brain of the subject in which the signals
   interfere with one another to yield a waveform which is modulated by the
   subject's brain waves. The interference waveform which is representative of
   the brain wave activity is re-transmitted by the brain to a receiver where
   it is demodulated and amplified. The demodulated waveform is then displayed
   for visual viewing and routed to a computer for further processing and
   analysis. The demodulated waveform also can be used to produce a
   compensating signal which is transmitted back to the brain to effect a
   desired change in electrical activity therein.
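
To make that description concrete, below is a minimal signal-processing sketch
of the idea in the abstract: two transmitted tones interfere to produce a
beating waveform, the beat is amplitude-modulated by a slow "brainwave" signal,
and simple envelope demodulation recovers that slow signal at the receiver. All
of the numbers (sample rate, tone frequencies, the 10 Hz stand-in brainwave,
the modulation depth) are invented for illustration; this demonstrates only the
mathematics of interference and demodulation, not the patent's actual hardware.

.. code-block:: python

   import numpy as np

   fs = 10_000                      # sample rate in Hz (illustrative value)
   t = np.arange(0, 1.0, 1 / fs)

   # Two transmitted tones at slightly different frequencies interfere,
   # producing a beating waveform.
   f1, f2 = 1000.0, 1100.0
   interference = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)

   # A slow "brainwave" (hypothetical 10 Hz alpha-band tone) amplitude-modulates
   # the interference waveform before it is re-transmitted.
   brainwave = np.sin(2 * np.pi * 10.0 * t)
   received = (1.0 + 0.5 * brainwave) * interference

   # Receiver side: rectify and low-pass filter (crude moving average) to
   # recover the envelope, i.e. the slow modulating signal.
   rectified = np.abs(received)
   window = int(fs / 50)                              # ~20 ms averaging window
   envelope = np.convolve(rectified, np.ones(window) / window, mode="same")
   recovered = envelope - envelope.mean()             # drop the DC offset

   # The recovered envelope tracks the original 10 Hz "brainwave".
   corr = np.corrcoef(recovered, brainwave)[0, 1]
   print(f"correlation between recovered envelope and brainwave: {corr:.2f}")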

The patent contains a very detailed description of the device's components and
how it works, and near the end it states:

   [The] apparatus and method of the subject invention has numerous uses.
   Persons in critical positions such as drivers and pilots can be continuously
   monitored with provision for activation of an emergency device in the event
   of human failure. Seizures, sleepiness and dreaming can be detected. Bodily
   functions such as pulse rate, heartbeat [regularity] and others also can be
   monitored and occurrences of hallucinations can be detected. The system also
   permits medical diagnoses of patients, inaccessible to physicians, from
   remote stations.

2. Decoding of Brain Activity
-----------------------------

Even with the ability to record brainwaves remotely, thoughts have waveforms
that are far too complex and would seem entirely opaque from the outside. So
back when the aforementioned device was invented, people neither knew how, nor
had the computational power necessary, to decode them. That is, until the
advent of AI and deep learning algorithms.

Publicly documented experiments on decoding brainwave activity using neural
networks have been conducted successfully numerous times. For example, in this
YouTube video:

`https://www.youtube.com/watch?v=CBQuKW7vK-A
<https://www.youtube.com/watch?v=CBQuKW7vK-A>`_

`Web archive mirror
<https://web.archive.org/web/20210705120805/https://www.youtube.com/watch?v=CBQuKW7vK-A>`_

From the video's description:

   Russian scientists held an experiment on recognizing imagined objects at
   Moscow's Polytechnical Museum on April 25. This recognition system is based
   on a simple wireless electroencephaloscope. This device is generally used to
   recognize images for video games.

tl;dr: Every image you see (or imagine) has a **unique waveform** that your
brain consistently emits each time that same image is seen or imagined.
These waveforms differ between individuals (show the same picture to two
people, and each of their brains emits an entirely different and unique
waveform), but show the same image to the same person multiple times, and
their brain emits the same waveform each time.
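
As a purely illustrative toy model of that claim (the template-plus-noise setup
and every number in it are my own assumptions, not taken from the video), one
can give each hypothetical (person, image) pair a fixed random "template"
waveform, add noise for each repeated viewing, and check that repeated trials
from the same person correlate strongly while other pairings do not:

.. code-block:: python

   import numpy as np

   rng = np.random.default_rng(0)
   n_samples = 256     # length of one recorded waveform (arbitrary)
   noise_level = 0.3   # trial-to-trial variability (arbitrary)

   # Hypothetical fixed template per (person, image) pair, as the claim asserts.
   templates = {
       ("alice", "cat"): rng.standard_normal(n_samples),
       ("alice", "dog"): rng.standard_normal(n_samples),
       ("bob",   "cat"): rng.standard_normal(n_samples),
   }

   def trial(person, image):
       """One noisy 'viewing': the fixed template plus random noise."""
       return templates[(person, image)] + noise_level * rng.standard_normal(n_samples)

   def corr(a, b):
       return float(np.corrcoef(a, b)[0, 1])

   # Same person, same image, two separate viewings: high correlation.
   print("alice/cat vs alice/cat:", round(corr(trial("alice", "cat"), trial("alice", "cat")), 2))
   # Same person, different image: low correlation.
   print("alice/cat vs alice/dog:", round(corr(trial("alice", "cat"), trial("alice", "dog")), 2))
   # Different person, same image: also low (the templates are person-specific).
   print("alice/cat vs bob/cat:  ", round(corr(trial("alice", "cat"), trial("bob", "cat")), 2))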

AI can easily be trained to correlate brainwave patterns with the images you
see or imagine inside your head. While in the video they only disclosed having
trained the networks to detect images, the same concept can in fact be applied
to almost anything else going on inside your brain.

Not only images: audio, sensory perception, feelings, moods, thoughts, thought
processes (such as inference), intents (such as wanting to get up and do
something, even if not verbalized), and all sorts of other general cognitive
events (such as feelings of doubt, achievement, failure, arriving at a
conclusion, etc.) all have their own unique waveforms, which AI can be trained
to decipher.
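
To make the training idea tangible, here is a minimal sketch under the same toy
template-plus-noise assumption as above: synthetic "recordings" for a single
hypothetical subject are generated from four per-image templates, and a small
feed-forward network (scikit-learn's ``MLPClassifier``, used purely as a
stand-in for whatever models the researchers actually employed) is trained to
map a waveform back to the image label.

.. code-block:: python

   import numpy as np
   from sklearn.model_selection import train_test_split
   from sklearn.neural_network import MLPClassifier

   rng = np.random.default_rng(1)
   n_samples, n_trials, noise_level = 256, 200, 0.5   # all values are arbitrary
   images = ["cat", "dog", "house", "face"]

   # One fixed per-image template for a single hypothetical subject, plus noisy
   # repetitions standing in for repeated recordings of that subject.
   templates = {img: rng.standard_normal(n_samples) for img in images}
   X = np.vstack([templates[img] + noise_level * rng.standard_normal(n_samples)
                  for img in images for _ in range(n_trials)])
   y = np.array([img for img in images for _ in range(n_trials)])

   X_train, X_test, y_train, y_test = train_test_split(
       X, y, test_size=0.25, random_state=0, stratify=y)

   # A small feed-forward network learns to map a waveform to its image label.
   clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
   clf.fit(X_train, y_train)
   print("held-out accuracy:", round(clf.score(X_test, y_test), 3))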