So, this is only tangentially related to the topic, but I figured it goes here anyway. A lot of people think of the brain as a sort of meat computer. It's not (it's actually mostly made out of fat, but that's beside the point). The actual complexity is difficult to convey partly because we don't understand it very well; every time we zoom in, we just find another level of staggering complexity.
Let's start here: an average young healthy adult has about 86 billion neurons in their brain alone. That's a lot of neurons. Each neuron makes up to 10,000 connections with other neurons. If we're looking at connectivity, we're up to a really, really big number... somewhere around 10^15 synapses. But of course, when you're talking about functionality, just the number of connections doesn't tell the whole story. You have to account for combinations of connections. I can't even guess at how many different configurations are likely for each neuron, so let's just be really conservative and say ten, because it's a nice simple number. That gives us (unless I fucked up the math, which is completely possible) something like 10^86,000,000,000 possible states: a number with tens of billions of digits.
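If you want to sanity-check that back-of-envelope math yourself, here's a quick sketch. The figures are the same round, illustrative numbers used above, not precise measurements:

```python
# Back-of-envelope version of the numbers above (round illustrative figures).
NEURONS = 86e9            # ~86 billion neurons in a human brain
SYNAPSES_EACH = 10_000    # up to ~10,000 connections per neuron

total_synapses = NEURONS * SYNAPSES_EACH
print(f"synapses: {total_synapses:.1e}")   # on the order of 10^15

# If each neuron could be in just 10 configurations, the brain's state space
# would be 10^(number of neurons): a number with ~86 billion digits.
digits_in_state_space = int(NEURONS)
print(f"state space ~ 10^{digits_in_state_space:,}")
```

The point of the exponent trick: independent choices multiply, so ten options per neuron across N neurons gives 10^N total combinations, not 10 × N.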
That's a lot. That is a very, very big number. But that's not all.
There are roughly as many glial cells as neurons (the old "ten times more" figure turns out to be a myth). Glial cells are the support cells of the nervous system, and we don't know very much about what they do. We do know that they play a role in neuroplasticity (changing and strengthening connections) and in neurotransmitter regulation. I'm going to stop even trying with the math now, because I'm sure you get the point.
Speaking of neurotransmitters, there are over 100 different ones that we know about so far. What a neurotransmitter can do mostly depends on the types of receptors on the postsynaptic neuron, and where THAT neuron terminates. Some neurotransmitters only have one known receptor, but most have several; serotonin has at least 14 receptor subtypes. All but one of those (5-HT3, which is an ion channel) are G-protein-coupled receptors, which means they activate one of several types of G protein when the neurotransmitter binds. What's a G protein, you might ask? WELL, it's a sort of tiny molecular switch that acts as a messenger, triggering a variety of other effects in a cell, some of which ultimately affect the DNA in the nucleus, changing how it expresses its genes. Remember, neurons are cells, so this means it alters the functionality of the neuron, usually on a graded scale, in any one of the many ways the neuron's DNA is capable of expressing.
I am simplifying this WAY WAY down. It's much more complex than I'm making it sound. I'm just trying to give some insight into why I do not believe that computer technology is anywhere close to "brain like", and may never be.
Brains are not very good calculators, which is why we invented calculators. Electrical signals in a wire propagate at a sizeable fraction of the speed of light; electrochemical impulses in neurons depend on the physical movement of ions, and their speed maxes out at around 120 meters per second in humans. The trade-off is that brains are staggeringly complex and yet are literally made of the most common and easily-replaceable stuff in the universe; crap you'd just find lying around on a planet.
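To put numbers on that speed gap, here's a rough latency comparison. All figures are illustrative round numbers (a ~15 cm signal path, signals in copper at roughly two-thirds the speed of light), not measurements:

```python
# Rough latency comparison: nerve impulse vs. electrical signal in a wire.
# All figures are illustrative round numbers, not measurements.
C = 3.0e8                  # speed of light in vacuum, m/s
WIRE_SPEED = 0.66 * C      # signals in copper travel at roughly 2/3 c
NERVE_SPEED = 120.0        # fastest myelinated axons, ~120 m/s
DISTANCE = 0.15            # ~15 cm, roughly the length of a human brain

nerve_latency = DISTANCE / NERVE_SPEED   # on the order of a millisecond
wire_latency = DISTANCE / WIRE_SPEED     # well under a nanosecond
ratio = nerve_latency / wire_latency     # the wire wins by ~a million-fold
print(f"nerve: {nerve_latency*1e3:.2f} ms, wire: {wire_latency*1e9:.2f} ns, "
      f"~{ratio:,.0f}x faster")
```

So on raw signal propagation alone, silicon has a lead of roughly six orders of magnitude; the brain's advantages have to come from somewhere else.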
I pretty much agree with everything you're saying here, other than the fact that the brain, to me, is nothing more than a computer, in the classical sense that all it does is process, store, and output information.
I agree that it's nothing like our technological computers. For one thing, although the "clock speed" is pitifully slow, the massive parallelism would make it more akin to a super-network, with each neuron being a computer in its own right. Treated as a discrete unit, a neuron accepts input via various connections, processes those inputs, and stores and outputs data.
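That "discrete unit" view is easy to sketch. Here's a toy model of a neuron as a pure input-process-output device: weighted inputs, a threshold, one output. The weights are made-up illustrative values, not biological measurements:

```python
# A neuron treated as a discrete unit: weighted inputs in, one output out.
# Weights and threshold are made-up illustrative values, not biology.
def neuron(inputs, weights, threshold):
    """Fire (return 1) if the weighted sum of inputs crosses the threshold."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Three input "synapses": two excitatory (positive), one inhibitory (negative).
weights = [0.6, 0.5, -0.8]
print(neuron([1, 1, 0], weights, 1.0))  # excitation alone crosses the threshold
print(neuron([1, 1, 1], weights, 1.0))  # the inhibitory input suppresses firing
```

A real neuron is graded, stateful, and chemically modulated in all the ways described above, but as an I/O abstraction this is the whole interface.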
So we're down to use cases. When would you use meat and when would you use silicon? As silicon progresses, we're fast knocking down the use cases where the brain is better. The complexity and parallelism issues will, I'm confident (despite protests to the contrary), be solved on a long enough timeline; for the sake of argument, somewhere between this century and a billion years from now.
Information storage, organisation and retrieval is handled much better by machines.
Raw numerical and mathematical calculation is handled much better by machines.
Pattern recognition, which has until now been strictly the domain of meatware, is now falling fast to neural nets, which, despite being an architecture inspired by the brain, are not actually an attempt to replicate it (except in the minds of people who don't get machine learning). Neural nets are more stable and potentially far more accurate than meat in domain spaces that broaden as density, speed, and algorithmic topologies advance. Bear in mind that AI is currently at the stage computing was at back in the '80s. Current chips are (quite literally) millions of times more powerful than they were then. We can expect neural nets to be millions of times more powerful in the next few decades.
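To make the "inspired by, not a replica of" point concrete, here's the smallest possible artificial neural net: three threshold units wired to compute XOR, a pattern no single linear unit can recognise on its own. The weights here are hand-picked for illustration; real networks learn them from data:

```python
# A minimal artificial "neural net": three threshold units computing XOR.
# Weights are hand-picked for illustration; real nets learn them from data.
def unit(inputs, weights, bias):
    """One artificial neuron: weighted sum plus bias, then a hard threshold."""
    return 1 if sum(x * w for x, w in zip(inputs, weights)) + bias >= 0 else 0

def xor_net(a, b):
    h1 = unit([a, b], [1, 1], -0.5)      # hidden unit 1: fires if a OR b
    h2 = unit([a, b], [-1, -1], 1.5)     # hidden unit 2: fires unless a AND b
    return unit([h1, h2], [1, 1], -1.5)  # output: fires if both hidden fire

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "->", xor_net(a, b))  # prints 0, 1, 1, 0
```

Nothing in there is trying to be a biological neuron; it's just layered weighted sums, which is exactly why the architecture scales with silicon rather than with biology.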
One of the poster-boy domains in machine learning right now is face recognition, where machines now outperform meat by a significant margin. Pretty soon, recognition is going to be machine-dominated across the board, including kinds of recognition the human brain has never been capable of. Machine learning systems are already beginning to see patterns in data that no human being alive could comprehend.
So where does that leave meat? What's its strong suit? Even short term (the next 20-30 years), the only thing I can see being left is consciousness/personality, which seems to me to be the granddaddy of computational tasks. I think that's the one thing we might never understand but, in gaining greater degrees of incomplete understanding, maybe we find out some things it can do that we never figured would be possible. Maybe consciousness can migrate to an alternative substrate. Maybe it can exist in a less complex framework than the one which built itself by accident.
So do I think this means machines will take over? Fuck no. Machines are part of our mind. If I need to know something, I can employ a machine to find or calculate that information. That's just me doing something. The machine is an extension of me. The meat part hasn't changed in millennia but the machine part is advancing year on year. Stands to reason an increasingly large part of the entity considered as me will, over time, be made of machine. This has been happening since paper but it's only recently we seem to be approaching a tipping point.