Neuralink: Technological Panacea Or Time To Panic?
Artificial Intelligence to aid patients suffering from stroke or brain injury
Last week, we featured an article on Elon Musk and his new company, Neuralink. Mr. Musk is one of the most innovative and truly visionary CEOs working today; you only have to look at Tesla, SpaceX, and PayPal to appreciate his range of interests and intellectual acuity. Now he hopes to develop breakthrough technologies that use artificial intelligence (AI) to help patients who have lost the ability to communicate due to debilitating conditions such as stroke or brain injury.
A stroke occurs when blood flow to part of the brain is interrupted, depriving brain tissue of oxygen. According to the CDC, approximately 795,000 Americans experience a stroke every year, and about 130,000 of them die. Stroke is also a leading cause of long-term disability, costing the United States more than $33 billion each year in health care services, medicines, and missed days of work.
By using a computer interface that connects to the brain (exact details of how this technology will work remain sketchy), Neuralink hopes that patients will be able to recapture lost abilities, including language skills. Mr. Musk is proposing “consensual telepathy,” which, as far as we know, would allow a patient to express ideas to another person via a connected interface. Harnessing the potential of artificial intelligence is at the core of this technology, and AI is an increasingly exciting modality.
While we hope Mr. Musk succeeds, there are still questions society must answer. Who decides whether “consensual telepathy” is truly consensual? Could someone hack a patient’s brain the way cyber intruders hack websites, companies, and organizations? And who will protect a patient’s privacy? Patients and their vulnerabilities are very real concerns. Therefore, medical practitioners and bioethicists must be part of the solution and the process, not afterthoughts, as is too often the case.
Justin Killian, the Foundation’s legal advisor, believes that the concept of “consensual telepathy” raises fascinating issues, especially from a regulatory and legal perspective. According to Mr. Killian, “It is yet another radical technology that appears to be advancing so quickly that regulatory bodies (and later jurists) will be hard pressed to fully anticipate and comprehensively address potential risks prior to product deployment. Some broad issues would be privacy rights (both in terms of how to limit access to thoughts to authorized recipients, and also how to discern when a thought is ‘ready’ for consumption rather than just a fleeting concept that has not been subject to the patient’s editing and judgment) and property rights: are the fruits of someone’s consensual telepathy (the idea for an invention, the outline of a novel, the hook for a new song) protectable by any extant legal mechanism, and would such protection be appropriate?”
Mr. Killian strongly advises that these issues (and others that have likely yet to be conceived) must be addressed in a timely fashion. “The American system of jurisprudence, in which new regulations are implemented and then gradually refined by years (and sometimes decades) of jurisprudence, is ill-suited to the hyper-acceleration of technological innovation that is now beginning to crest.”
As Mr. Killian further suggests, whether inventors like Mr. Musk recruit ethicists and engage legislators preemptively will determine whether these amazing new advances can be implemented without unintentionally harming the very people they earnestly aim to serve.
To learn more about stroke, download the CDC fact sheet.
Mozaffarian D, Benjamin EJ, Go AS, Arnett DK, Blaha MJ, Cushman M, et al., on behalf of the American Heart Association Statistics Committee and Stroke Statistics Subcommittee. Heart disease and stroke statistics—2016 update: a report from the American Heart Association. Circulation. 2016;133(4):e38–360.