Neurotechnology Overview: Wiring Human Brains Directly to Computers
By Filippa Lentzos and Isobel Butorac
Global Research, April 30, 2020
Bulletin of the Atomic Scientists 28 April 2020
Elon Musk's newest venture, Neuralink, is attempting to wire brains directly to computers. The start-up's vision is to insert thousands of tiny threads into the neurons of your brain. The other ends of the threads are attached to chips, embedded under the skin on your head and wirelessly connected to a detachable Bluetooth ‘pod' behind your ear, enabling you to control a phone or another device with your thoughts. Sound far-fetched? The company has already successfully tested the technology in monkeys and aims to start testing it in humans later this year.
Neuralink's brain-machine interface could potentially help people with brain and spinal cord injuries who have lost the ability to move or sense, as Musk highlighted at the company's livestreamed launch event. Even more ambitiously, Musk said his long-term goal is "to achieve a sort of symbiosis with [artificial intelligence]." He wants to build what he calls a digital superintelligence layer to complement the parts of the brain responsible for thinking and planning (the cerebral cortex) and for emotions and memory (the limbic system). In fact, he said, "you already have this layer." It is your phone and your laptop. But you are limited by how quickly you can process what you see, and how quickly you can type a response. The answer, Musk says, is to increase the bandwidth of the brain-machine interface.
Neuralink is just one of the organizations developing cutting-edge neurotechnology, although others, like teams at Carnegie Mellon, Rice University, and Battelle, are not proposing drilling through people's skulls and inserting microscopic threads into their brains, opting instead for electromagnetics, light beams, and acoustic waves.
It's also not difficult to imagine neurotechnology being used for darker purposes, unrelated to the goals of the researchers developing it. A brain-machine interface could, for instance, be hacked and used to spy on or deliberately invade someone's innermost thoughts. It could be used to implant new memories, or to extinguish existing ones. It could even be used to direct bionic soldiers, remotely pilot aircraft, operate robots in the field, or telepathically control swarms of artificial-intelligence-enabled drones.
A monkey has already controlled a computer with its thoughts, according to Elon Musk. His startup Neuralink aims to start testing its neurotechnology on people this year. Credit: Steve Jurvetson. CC BY 2.0.
In the case of biological, chemical, and nuclear technologies, international rules exist to ensure these are not used for developing weapons. There are also controls to ensure things like certain electronics, computers, software, sensors, or telecommunications technology are not used in conventional weapons. In all cases, the underlying technologies in question have useful and beneficial purposes. But these regulations do not directly apply to neurotechnologies.
Of more relevance are discussions taking place at the United Nations on lethal autonomous weapons systems, particularly around aspects associated with human-machine interactions, the loss of human control, and accountability. While these are limited to weaponry, informal discussions at the United Nations are also examining broader issues around artificial intelligence and militarization, including military decision-making, intelligence-gathering, and command and control systems.
Yet, none of the international regimes or current discussions provide guidance for how people should consider the beneficial and harmful potential that neurotechnology holds, a growing area of research among scholars as militaries begin developing the technology.
Building on formative work by researchers like Jonathan Moreno, Malcolm Dando, James Giordano, and Diane DiEuliis, we talked to eight senior neurotechnologists from labs at established universities in the United States, the United Kingdom, and Australia about the risks they saw with the new technology and about who has responsibility for safely developing it. The interviews were part of a pilot project, in which participation was confidential and identifying information was removed from the data, as is usual practice in social science research.
In addition to brain-computer interfaces, the technologists were working on cutting-edge technologies like neuromorphic computing, a field with the goal of designing computer systems that mimic the form of the human brain, and cognitive robotics, an enterprise concerned with designing robots that can more seamlessly and empathetically interact with people. The technologists we talked to didn't see the potential for their particular technologies to be used as weapons or to pose security concerns. They saw themselves as being "away from the front line." Yet, at the same time, six of the eight technologists we talked to, spanning all three countries, had previously received direct or indirect Pentagon funding.
Some also said that technology they had created in the past had gone on to be used for entirely unexpected purposes that would have been impossible to predict. One, for instance, designed a component for airbags that eventually found its way into tech products like smartphones.
As neurotechnology advances and applications with potential military as well as civilian uses are developed, debates about the so-called dual-use risks it poses will become more acute.
Military neurotechnology and the definition of dual use. A common way to think about the concept of dual use relates to technology transfers between civilian and military organizations. Civilian and military research and development are thought to go hand-in-hand, where innovations, like the internet and GPS, can be maximized for the mutual benefit of both civilian and military stakeholders in a win-win scenario. Technologies are spun-in from basic research to military application or spun-out from military research to civilian application. The main drivers behind this form of dual use, however, are economic interests.
When the focus shifts to international security, the dual use concept becomes more complicated. Here, civilian and military uses stand in opposition to one another, and technology transfers between civilian and military applications are focused on restricting civilian technologies from migrating to foreign or non-aligned militaries. Under the export controls agreed on by the Australia Group, a group of many of the world's major economies that have agreed to harmonize regulations to control the spread of technology that could be used in chemical or biological weapons, a company in the United States couldn't, for instance, export a 20-liter fermenter capable of growing bacteria without a license. A license would be denied if the company were exporting to a country suspected of having a biological weapons program, regardless of whether the recipient was explicitly a military entity or not. As such, there is not just a civilian versus military distinction to dual use, but also a distinction between what are considered legitimate and illegitimate uses.
Representatives to the Biological Weapons Convention, the international treaty banning bioweapons activity, meet in 2015. Credit: Eric Bridiers/US Mission Geneva. CC BY-ND 2.0.
International disarmament and nonproliferation treaties like the Biological Weapons Convention, the international agreement that bans bioweapons activities, introduce yet another distinction. They do not use the term dual use but instead differentiate between peaceful and non-peaceful purposes of research and development activities. Originally aimed at curtailing proliferation by states, since 9/11 the Biological Weapons Convention has broadened in scope to also encompass proliferation by non-state actors like terrorists and criminals. This trend has layered on the idea that dual use has to also be thought of in terms of the juxtaposition of benevolent and malevolent purposes.
The technologists we spoke to found these security concepts of dual use too abstract to relate to their own work. The problem is that whichever concept of dual use is applied (civilian versus military, legitimate versus illegitimate, peaceful versus non-peaceful, benevolent versus malevolent), there is very little practical guidance for how to assess the risks of neurotechnology research being used for harm, or to determine the potential contribution of neurotechnologies to a military program. It's easy to understand how a fermenter that grows bacteria could be used in biological weapons. Countries have done that sort of thing before. There's no such direct line between existing neurotechnology and an already developed weapons system.
Developing clear guidance for neurotechnologies is increasingly urgent, because as it stands, militaries are already developing neurotechnology. The US Defense Department's research wing, the Defense Advanced Research Projects Agency (DARPA), is significantly expanding brain-machine interfaces for use in military applications. It is "preparing for a future in which a combination of unmanned systems, artificial intelligence, and cyber operations may cause conflicts to play out on timelines that are too short for humans to effectively manage with current technology alone," Al Emondi, manager of DARPA's Next-Generation Nonsurgical Neurotechnology (N3) program, said.
The N3 program is pushing for "a neural interface that enables fast, effective, and intuitive hands-free interaction with military systems by able-bodied warfighters," according to its funding brief, and the program is sponsored at approximately $120 million over four years. But DARPA also funds many other programs, as do military research and development units in other countries. These various programs are expanding the reach of neurotechnologies into military intelligence gathering, image analysis, and threat and deception detection, as well as developing technology to manipulate emotional states and to incapacitate adversaries.
The technologists we spoke to talked about the "capabilities race" they saw developing within countries and internationally, and said that "technological supremacy" was at the forefront of many researchers' minds. Despite this, none of the six technologists who had received DARPA funding believed their scientific work was being developed for military application. The other two neurotechnologists we talked to said they would refuse military funding on the grounds that they did not promote warfare and that such funding might instigate political tensions within their labs, echoing the mixed perspectives on defense dollars from the synthetic biology field.
Of course, militaries aren't the only organizations funding neurotechnology. Universities, major brain initiatives like the European Union's Human Brain Project, and national health funding schemes all fund projects as well. But it is private funders that really get technologists excited. According to an article last year in the journal Brain Stimulation, the technologies may constitute a $12 billion annual market by 2021.
The pursuit of private capital led two of the neurotechnologists we spoke with to move to Silicon Valley in California, a place where, as one of them said, "You don't even have to explain it." Half of the people we talked to had spinout companies, separate from their university research. These ventures may promote benefits by creating wider access to neurotechnology, but they also create privacy and other ethical dilemmas separate from concerns about whether a technology could be weaponized or not. For instance, as private companies potentially become gatekeepers of large amounts of personal brain data, they could choose to monetize it.
How can scientists and institutions account for the potential of misuse inherent in the development of neurotechnology? "Boundaries are not always so obvious when people are crossing them," one of the technologists we spoke to said. "It is only in hindsight that people think, ‘yeah this is bad.'" Different people have different boundaries. Perceptions of beneficial technology can vary, too.
Often the benefits or potential harms associated with a technology are tightly wrapped up in a particular implementation. Even if technologists hold "good" intentions, later applications of their technology are not always within their control. Talking with neurotechnologists underscores that what is and isn't a dual-use technology is often in the eye of the beholder, even when militaries are paying to develop the products.
While no treaty regulates neurotechnology, safely developing this sci-fi-like technology calls for a new framework that articulates specific harmful or undesirable uses of the technology in the political, security, intelligence, and military domains. It would be better to develop the framework now, at the stage when many entrepreneurs are more focused on telepathically controlling smartphones than on the weapons of the future.
Filippa Lentzos is a senior research fellow jointly appointed in the Departments of War Studies and of Global Health and Social Medicine at King's College London.
Isobel Butorac is a recent graduate from King's College London with a Master's in Bioethics and Society and an early career researcher with a research agenda looking at public health, neuroethics, and technology.
Featured image: Neurotechnology could help people with disabilities use their thoughts to control devices in the physical world. It may also be useful in weapons systems. Private companies, militaries, and other organizations are funding neurotechnology research. Credit: US Army.
The original source of this article is the Bulletin of the Atomic Scientists. Copyright © Filippa Lentzos and Isobel Butorac, Bulletin of the Atomic Scientists, 2020.