Politics : Should God be replaced?


To: cosmicforce who wrote (1740) 10/17/2000 9:17:46 AM
From: Dayuhan
 
Have you read any of the work of Jorge Luis Borges?

<edit>

If you haven't, one of the more accessible stories can be found at:

sccs.swarthmore.edu

I think you would like his work. The stories are very short, but they must be read very slowly, ideally aloud.



To: cosmicforce who wrote (1740) 10/17/2000 9:28:10 AM
From: TigerPaw
 
neural networks
When training a neural network, it will often become fixated on a pattern. For example, a network trained to recognize letters may get hung up on the difference between a and e. One technique to continue training is to present low-level white noise at the inputs, bumping various neurons out of a local minimum. During the process the network will spontaneously recognize various letters, sometimes with a high confidence level. I take these to be the dreams of the network. After the dream is over, the network is better able to continue training.
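The noise trick above can be sketched in a few lines of NumPy. Everything here is hypothetical: the "letters" are random bit-vectors, and the one-layer softmax model is convex (so it has no true local minima), but it shows the mechanism of adding low-level white noise to the inputs on every training step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 5x5-pixel "letter" patterns (random bits, not real letters),
# flattened to 25-element vectors; one-hot targets say which letter is which.
patterns = rng.integers(0, 2, size=(4, 25)).astype(float)
targets = np.eye(4)

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def train(noise_std, steps=500, lr=0.5):
    w = rng.normal(scale=0.1, size=(25, 4))
    for _ in range(steps):
        # Present the patterns with low-level white noise on the inputs,
        # so small perturbations jiggle the weights around during training.
        x = patterns + rng.normal(scale=noise_std, size=patterns.shape)
        p = softmax(x @ w)
        grad = x.T @ (p - targets) / len(patterns)  # cross-entropy gradient
        w -= lr * grad
    return w

w = train(noise_std=0.05)
# Evaluate on the clean, noise-free patterns.
acc = (softmax(patterns @ w).argmax(axis=1) == targets.argmax(axis=1)).mean()
print(acc)
```

During training, a noisy input can momentarily push the output for some class above threshold even though no clean pattern was shown — the "dream" TigerPaw describes.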

A trained network can be "killed" by arbitrarily setting connection values to zero. A well-trained network will continue to recognize its patterns even after damage, but when enough connections are zeroed it will begin to present spontaneous recognition indications with high confidence. This is the network's "near death" experience.
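The "killing" experiment can be sketched the same way. Again everything is hypothetical: a least-squares fit to random patterns stands in for a trained network, and `ablate()` zeroes a random fraction of the connections. Moderate damage usually leaves recognition intact; heavy damage breaks it down.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical trained network: random bit-vector patterns and a
# least-squares weight fit to one-hot targets.
patterns = rng.integers(0, 2, size=(4, 25)).astype(float)
labels = np.arange(4)
W, *_ = np.linalg.lstsq(patterns, np.eye(4), rcond=None)

def accuracy(w):
    """Fraction of clean patterns the (possibly damaged) network still recognizes."""
    return (np.argmax(patterns @ w, axis=1) == labels).mean()

def ablate(w, fraction):
    """'Kill' the network by zeroing a random fraction of its connections."""
    w = w.copy()
    w[rng.random(w.shape) < fraction] = 0.0
    return w

print(accuracy(W))               # intact network
print(accuracy(ablate(W, 0.3)))  # moderate damage: usually still recognizes
print(accuracy(ablate(W, 0.95))) # heavy damage: recognition breaks down
```

Graceful degradation under damage is a well-known property of distributed representations: because each pattern is stored across many connections, no single zeroed weight removes it outright.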

Both effects can be more pronounced if the network also stores state information, that is, if it has a sense of time.

TP