Highest Rated Comments


nanathanan · 506 karma

I did an undergraduate degree and a first master's degree (MEng) in electronic engineering and nanotechnology.

I then did a research master's degree (MRes) in 2-dimensional materials (graphene, 2D transition metal dichalcogenides, etc.). While doing my master's project on graphene biosensors for the brain, I got interested in neural interfaces.

For my PhD work, I then focused on designing a novel neural interface that could tackle the key problems with existing technology: biocompatibility (which limits the operable lifetime of sensors in the brain), data quality (single-neuron addressability, signal type, and computational value for external applications), and data quantity (the number of sensors in a given area or for a given implant).

Now, as I finish my PhD, I hope to further develop and eventually sell my sensors, so entrepreneurship is the way forward for me.

Fantastic that you are interested in neuroscience - there are plenty of exciting fields you can further specialize in. Have you considered doing a master's degree? With a bachelor's, I'm not sure what the career prospects are, to be honest. I never intended to enter the workforce at this stage of my education, so I never properly researched the opportunities. I'm sure there's plenty out there, but I'm not the best person to answer this particular question.

nanathanan · 233 karma

Simple answer: No / Maybe, but it depends on your definitions and what you're imagining.

To clarify, let's unpack the question a little bit:

'Downloading your brain'

This isn't the main goal of the technology, but there are of course people interested in pursuing it. The interconnectivity of the brain is computationally demanding to model, and as far as I know, mapping the entire human brain is beyond our current computing power. Beyond that, the real benefit of BCIs is to let humans utilize computing power that's not native to the brain (e.g. increased memory and numerical proficiency). It doesn't make sense to take what works well in the brain and replicate it in a computer, where it doesn't work as well. However, with a connection between the brain and a computer, the idea is that a human brain can benefit from the additional information handling and storage capabilities of a computer.
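For a sense of scale, here is a rough back-of-envelope sketch. Every figure in it is a ballpark assumption for illustration (roughly 86 billion neurons and ~10,000 synapses per neuron are commonly cited estimates; the 8 bytes per synapse is my own simplifying assumption):

```python
# Back-of-envelope: memory needed just to store a static human connectome.
# Every number is an order-of-magnitude assumption, not a measurement.
neurons = 86e9              # ~86 billion neurons (commonly cited estimate)
synapses_per_neuron = 1e4   # ~10,000 synapses per neuron (rough average)
bytes_per_synapse = 8       # assume one 8-byte weight, no dynamics at all

total_synapses = neurons * synapses_per_neuron            # ~8.6e14
storage_pb = total_synapses * bytes_per_synapse / 1e15    # petabytes

print(f"~{total_synapses:.1e} synapses -> ~{storage_pb:.0f} PB of static weights")
# Simulating the dynamics, rather than merely storing the wiring,
# is vastly more demanding again.
```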

A more sensible goal is that humans use their brain-computer interface to improve their cognitive abilities, much like what we already do today, but with a far greater bandwidth. By bandwidth, I'm referring to the quantity and speed of information transfer.
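To make "bandwidth" concrete, the raw data rate of an implant scales with channel count, sampling rate, and ADC resolution. The figures below are illustrative assumptions, not any specific device's spec:

```python
# Raw data rate of a hypothetical multi-channel neural implant.
# All three figures are illustrative assumptions, not a real device's spec.
channels = 1024           # recording sites
sample_rate_hz = 30_000   # a typical extracellular sampling rate
bits_per_sample = 10      # assumed ADC resolution

raw_mbit_per_s = channels * sample_rate_hz * bits_per_sample / 1e6
print(f"raw data rate: ~{raw_mbit_per_s:.0f} Mbit/s")   # ~307 Mbit/s
```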

'Turning your mind into AI'

An enhanced brain wouldn't fit the definition of artificial intelligence - and if it did, where would you draw the line? Our brains are already technically enhanced by technology, since we already store information and perform tasks with external tools.

However, as you asked: if we modeled a human brain on a computer, would that be AI? Well, it depends on your definition of AI. By a dictionary definition, a computer that can perform the cognitive tasks of a human is an AI. That definition may need updating in the future, when computing systems can perform the cognitive tasks of a human without needing to perfectly model the operation of a human brain. Such a system would be a very different AI, and perhaps truer to the meaning of 'artificial'.

nanathanan · 194 karma

I always intended to become an entrepreneur. I didn't really want to work on other people's projects, as I always had many ideas of my own. I've always felt that a good business is the best way to bring new technology to the world.

I have been in university education for almost 10 years, yes. Luckily, I come from a country that paid for some of it, and I have a government-backed zero-interest loan for the rest of my student fees. Studying mostly in Europe as a European, my tuition fees never exceeded $5k per year. For my PhD, I'm part of a fully funded program and I get paid a salary.

nanathanan · 178 karma

Neuralink's claims are somewhat true, but only if represented correctly. That sheer number of sensors has been achieved many times before, but without comparable biocompatibility and without the analog-to-digital conversion (ADC) chip. Their true industrial innovations are the ADC chip and the robotic arm for implanting the sensors; everything else is fairly established technology already demonstrated by countless research groups around the world. What Neuralink has excelled at is bringing some of the best technology from publicly available research and putting it all into a viable device. (I'm a massive fan.)
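To illustrate why on-implant electronics matter, here is a minimal sketch of generic threshold-based spike detection, one textbook way to cut the data rate by transmitting spike events instead of raw samples. This is an illustrative sketch with made-up signal parameters, not Neuralink's actual pipeline:

```python
import numpy as np

# Minimal threshold-based spike detection: transmit spike events,
# not raw samples. Illustrative only; not any company's real pipeline.
rng = np.random.default_rng(0)
signal = rng.normal(0.0, 10e-6, 30_000)    # 1 s of synthetic noise at 30 kHz, in volts
signal[[5_000, 12_000, 21_000]] -= 80e-6   # inject three artificial "spikes"

# Robust noise estimate (median absolute deviation), a standard choice
# because large spikes barely move the median.
noise_sigma = np.median(np.abs(signal)) / 0.6745
threshold = -4.0 * noise_sigma
spike_samples = np.flatnonzero(signal < threshold)

print(f"{signal.size} raw samples -> {spike_samples.size} supra-threshold samples")
```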

Academia is sadly not focused on making a commercially available neural interface. Research on neural interfacing in academia is, in my opinion, very slow because there's no common goal or objective. It's actually disappointing to see how little makes it through from the world of research to clinical applications. The clinically approved multi-electrode array devices supplied for clinical purposes by companies such as Blackrock Microsystems are almost 30 years old and extremely rudimentary compared to what we can do today. The industry is ripe for innovation and technology translation.

There are a huge number of companies in this space - current investment in neurotech around the world is on the order of $2 billion and growing by 10-20% a year. There's a great deal of work being done, and it's about to blow up in the coming decade. The most exciting companies, in my opinion, are BIOS, Neuralink, and Kernel.
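To put numbers on "about to blow up", compounding the ~$2 billion figure at the quoted 10-20% annual growth over a decade gives:

```python
# Compound the quoted ~$2B at the quoted 10-20% annual growth (illustrative only).
base_usd_bn = 2.0
for annual_growth in (0.10, 0.20):
    after_decade = base_usd_bn * (1.0 + annual_growth) ** 10
    print(f"at {annual_growth:.0%}/yr: ~${after_decade:.1f}B in ten years")
# -> ~$5.2B at 10%/yr, ~$12.4B at 20%/yr
```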

For a full view of the neurotech industry, this report is quite good (although, sadly, the author doesn't quite capture the value of invasive sensors):

https://www.idtechex.com/en/research-report/invasive-and-non-invasive-neural-interfaces-forecasts-and-applications-2018-2028/573

nanathanan · 89 karma

Thanks!

I'm currently still patenting my work before publishing. When I go public with my tech in a year or so, I will update this post.