
What is a Neural Network?

Michael Anissimov

In a typical computer, built according to what is called the von Neumann architecture, memory lives in a module separate from the processor. A single processor executes instructions and performs memory reads and writes one at a time, in a serial fashion. A neural network takes a different approach to computing. Made up of thousands or even millions of individual "neurons" or "nodes," a neural network processes information in a highly parallel and distributed way, and its "memories" are stored in the complex interconnections and weightings between nodes.

Neural networking is the type of computing architecture used by animal brains in nature. This is not necessarily because a neural network is an inherently superior mode of processing to serial computing, but because a brain that relied on serial computing would be much more difficult to evolve incrementally. Neural networks also tend to cope with "noisy data" better than serial computers.

Neural networks are made up of thousands or millions of individual neurons.

In a feedforward neural network, an "input layer" of specialized nodes takes in information, then sends signals to a second layer based on what it received from the outside. The signal passed along is usually a binary "yes or no." Sometimes, to switch from a "no" to a "yes," a node has to experience a certain threshold amount of excitement or stimulation.
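To make the threshold idea concrete, here is a minimal Python sketch of a single node that answers "yes" only when its weighted stimulation reaches a threshold. The inputs, weights, and threshold value are invented purely for illustration.

```python
# A minimal sketch of one threshold node: it fires ("yes") only when the
# weighted sum of its inputs reaches a chosen threshold.

def threshold_node(inputs, weights, threshold):
    """Return 1 ("yes") if the weighted stimulation reaches the threshold, else 0 ("no")."""
    stimulation = sum(x * w for x, w in zip(inputs, weights))
    return 1 if stimulation >= threshold else 0

# Three binary signals arriving from an imaginary input layer.
signal = threshold_node(inputs=[1, 0, 1], weights=[0.4, 0.9, 0.3], threshold=0.6)
print(signal)  # 1: enough excitement to pass a "yes" on to the next layer
```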

Our bodies contain trillions of synapse "data" connections, many of which are constantly active.

Data moves from the input layer to the secondary and tertiary layers, and so on, until it reaches a final "output layer," which displays results for programmers to analyze. The human retina is organized this way: first-level nodes detect simple geometric features in the visual field, such as colors, lines, and edges, while secondary nodes abstract more sophisticated features, such as motion, texture, and depth. The final "output" is what our consciousness registers when we look at the visual field. The initial input is just a complex arrangement of photons that would mean little without the neurological hardware to interpret it in terms of meaningful qualities, such as the idea of an enduring object.
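The same layer-to-layer flow can be sketched in code: each layer's yes/no signals become the next layer's inputs until the output layer is reached. The layer sizes and weights below are assumptions made up for illustration, not a model of the retina.

```python
# A minimal sketch of data flowing through successive layers of threshold nodes.

def forward(layer_weights, inputs, threshold=0.5):
    signals = inputs
    for weights in layer_weights:              # one weight table per layer
        signals = [
            1 if sum(x * w for x, w in zip(signals, node_weights)) >= threshold else 0
            for node_weights in weights        # one weight row per node in the layer
        ]
    return signals                             # the "output layer" result

layers = [
    [[0.6, 0.2, 0.7], [0.1, 0.9, 0.3]],  # input layer feeding 2 hidden nodes
    [[0.8, 0.4]],                         # hidden layer feeding 1 output node
]
print(forward(layers, inputs=[1, 0, 1]))  # [1]
```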

In backpropagating neural networks, signals from later layers can be fed back to earlier layers to constrain further processing. Most of our senses work this way: the initial data prompts an "educated guess" at the final result, and subsequent data is then interpreted in the context of that guess. In optical illusions, our senses make educated guesses that turn out to be wrong.
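In the training sense of backpropagation, it is an error signal that travels backward: the gap between the network's actual output and the desired output is sent back to adjust the connection weights. Below is a minimal sketch of that idea for a single sigmoid node; the inputs, target, and learning rate are illustrative assumptions.

```python
import math

# A minimal sketch of the backpropagation idea for one sigmoid node:
# the error measured at the output is sent backward to adjust the weights.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

inputs, target = [1.0, 0.5], 1.0
weights, learning_rate = [0.2, -0.1], 0.5

for step in range(20):
    output = sigmoid(sum(x * w for x, w in zip(inputs, weights)))
    error = output - target                      # how wrong the output is
    gradient = error * output * (1 - output)     # error signal sent backward
    weights = [w - learning_rate * gradient * x  # nudge each weight to reduce the error
               for w, x in zip(weights, inputs)]

print(round(output, 3))  # the output creeps toward the target of 1.0
```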

Neural networks are not programmed algorithmically; instead, they are configured through training or through delicate tuning of individual neurons. For example, training a neural network to recognize faces requires many training runs in which different "facelike" and "unfacelike" objects are shown to the network, accompanied by positive or negative feedback that coaxes it into improving its recognition skills.
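A rough sketch of that feedback-driven training, using a simple perceptron-style learning rule, is shown below. The "facelike" feature vectors and labels are invented for illustration and are not drawn from any real face-recognition system.

```python
# A minimal sketch of training by feedback rather than explicit programming:
# after each example, the weights are nudged toward the correct answer.

training_data = [
    ([1, 1, 1], 1),   # "facelike" example -> positive feedback
    ([1, 0, 0], 0),   # "unfacelike" example -> negative feedback
    ([0, 1, 1], 1),
    ([0, 0, 1], 0),
]

weights, bias, learning_rate = [0.0, 0.0, 0.0], 0.0, 0.1

for _ in range(25):                              # repeated training runs
    for features, label in training_data:
        prediction = 1 if sum(x * w for x, w in zip(features, weights)) + bias > 0 else 0
        correction = label - prediction          # +1, 0, or -1 feedback
        weights = [w + learning_rate * correction * x for w, x in zip(weights, features)]
        bias += learning_rate * correction

print(weights, bias)  # the learned weights now separate the two groups of examples
```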

Michael Anissimov

Michael is a longtime EasyTechJunkie contributor who specializes in topics relating to paleontology, physics, biology, astronomy, chemistry, and futurism. In addition to being an avid blogger, Michael is particularly passionate about stem cell research, regenerative medicine, and life extension therapies. He has also worked for the Methuselah Foundation, the Singularity Institute for Artificial Intelligence, and the Lifeboat Foundation.
