Mail Archives: djgpp/1997/03/04/20:46:33

Date: Wed, 5 Mar 1997 09:31:19 +0800 (GMT)
From: Orlando Andico <orly AT gibson DOT eee DOT upd DOT edu DOT ph>
To: Mark T Logan <fwec AT juno DOT com>
cc: leathm AT solwarra DOT gbrmpa DOT gov DOT au, djgpp AT delorie DOT com
Subject: Re: neural network code
In-Reply-To: <19970304.153858.7543.4.fwec@juno.com>
Message-ID: <Pine.SGI.3.93.970305092728.8941K-100000@gibson.eee.upd.edu.ph>
MIME-Version: 1.0

On Tue, 4 Mar 1997, Mark T Logan wrote:

> Excuse me, do you really know what a neural network is?
> 
> All it is (AFAIK, I'm no expert) is a set of functions (usually
> member functions),
> which call other functions based on simple tests.  
> This is like a brain, where a neural cell has several tendrils snaking
> out to connect with the axon of another cell.  When that axon
> fires, it activates the aforementioned neuron, which may or may not
> fire its axon.

No, that's not it. Neural networks have been around longer than digital
computers. Some people in the 1930's implemented neural networks with a
bunch of resistors.

And the tests may look simple at first glance, but they're not. Basically,
training a neural network is an energy-minimization problem. And there are
lots of types, the most common being back-propagation and back-propagation
with a momentum term. And of course the perceptron (which Marvin Minsky
pretty much demolished in the 1960's).
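
To make that a bit more concrete, here's a quick sketch (my own toy code,
*not* the FUNET stuff) of the simplest possible case: a single perceptron
trained with the delta rule on logical AND. A backprop net is basically
layers of units like this with the error pushed back through the layers.
Plain C, should compile with DJGPP or anything else:

#include <stdio.h>

#define NIN  2
#define NPAT 4

int main(void)
{
    /* training set: logical AND */
    double x[NPAT][NIN] = { {0,0}, {0,1}, {1,0}, {1,1} };
    double t[NPAT]      = {   0,     0,     0,     1   };
    double w[NIN] = { 0.0, 0.0 }, bias = 0.0, eta = 0.1;
    int epoch, p, i;

    for (epoch = 0; epoch < 100; epoch++)
        for (p = 0; p < NPAT; p++) {
            double net = bias, out, err;
            for (i = 0; i < NIN; i++)
                net += w[i] * x[p][i];
            out = (net > 0.0) ? 1.0 : 0.0;   /* hard threshold unit */
            err = t[p] - out;                /* delta rule: w += eta*err*x */
            for (i = 0; i < NIN; i++)
                w[i] += eta * err * x[p][i];
            bias += eta * err;
        }

    printf("weights %g %g  bias %g\n", w[0], w[1], bias);
    return 0;
}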

So that this post doesn't become TOTALLY off-topic: you can get backprop
code from FTP.FUNET.FI. Look in the DJGPP area (of all places...); there's
some backprop code there.

Note that just having your backprop net ain't enough. The hardest part is
training your net  :)  If you're lazy, though, try a Kohonen net; they're
self-training (for a small class of problems).
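
Here's what "self-training" means in practice -- a tiny 1-D Kohonen net
sketch (again my own illustration, not anybody's library code): each step
just finds the node nearest the input and drags it and its neighbours a
bit closer. No target outputs anywhere.

#include <stdio.h>
#include <stdlib.h>

#define NODES 10
#define DIM    2

static double w[NODES][DIM];

/* squared Euclidean distance between an input and a node's weights */
static double dist2(const double *a, const double *b)
{
    double d = 0.0;
    int i;
    for (i = 0; i < DIM; i++)
        d += (a[i] - b[i]) * (a[i] - b[i]);
    return d;
}

/* one training step: find the winning node, then pull it and its
   neighbours toward the input */
static void som_step(const double *x, double eta, int radius)
{
    int winner = 0, n, i;
    double best = dist2(w[0], x);

    for (n = 1; n < NODES; n++) {
        double d = dist2(w[n], x);
        if (d < best) { best = d; winner = n; }
    }
    for (n = 0; n < NODES; n++)
        if (abs(n - winner) <= radius)
            for (i = 0; i < DIM; i++)
                w[n][i] += eta * (x[i] - w[n][i]);
}

int main(void)
{
    int n, i, step;
    double x[DIM];

    srand(1);
    for (n = 0; n < NODES; n++)              /* random initial weights */
        for (i = 0; i < DIM; i++)
            w[n][i] = (double)rand() / RAND_MAX;

    for (step = 0; step < 1000; step++) {    /* train on random 2-D points */
        for (i = 0; i < DIM; i++)
            x[i] = (double)rand() / RAND_MAX;
        som_step(x, 0.1, 1);
    }

    for (n = 0; n < NODES; n++)
        printf("node %d: %.3f %.3f\n", n, w[n][0], w[n][1]);
    return 0;
}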
 
.-----------------------------------------------------------------.
| Orlando Andico                email: orly AT gibson DOT eee DOT upd DOT edu DOT ph |
| IRC Lab/EE Dept/UP Diliman   http://gibson.eee.upd.edu.ph/~orly |
|  "through adventure we are not adventuresome" -- 10000 Maniacs  |
`-----------------------------------------------------------------'
