From: Erik Max Francis
Newsgroups: comp.os.msdos.djgpp
Subject: Re: neural network code
Date: Wed, 12 Mar 1997 08:46:53 -0800
Organization: Alcyone Systems
Lines: 29
Message-ID: <3326DDFD.4A7BC468@alcyone.com>
References: <199703040109.LAA09346@solwarra.gbrmpa.gov.au> <19970304.153858.7543.4.fwec@juno.com>
NNTP-Posting-Host: newton.alcyone.com
Mime-Version: 1.0
Content-Type: text/plain; charset=us-ascii
Content-Transfer-Encoding: 7bit
To: djgpp@delorie.com
DJ-Gateway: from newsgroup comp.os.msdos.djgpp

Mark T Logan wrote:

> All it is, (AFAIK, I'm no expert) is a set of functions (usually
> member functions), which call other functions based on simple tests.
> This is like a brain, where a neural cell has several tendrils snaking
> out to connect with the axon of another cell. When that axon
> fires, it activates the aforementioned neuron, which may/may not
> fire its axon.

Neural network implementations are even closer to the model of the
brain than what you describe. Usually you have virtual neurons, with
axons connecting the different neurons. Each neuron has a threshold
and each axon has a weight. The input to each neuron is the sum of
the weighted inputs from all the neurons it is connected to. A neuron
fires if and only if the input it receives exceeds its threshold.

Neural networks can be trained to produce a desired response, which
is what makes them valuable in the artificial intelligence field.

-- 
Erik Max Francis, &tSftDotIotE / email: max@alcyone.com
Alcyone Systems / web: http://www.alcyone.com/max/
San Jose, California, United States / icbm: 37 20 07 N 121 53 38 W
\ "I am become death, / destroyer of worlds." / J. Robert Oppenheimer (quoting legend)