Strategies & Market Trends : NeuroStock

To: Jay Hartzok who wrote (399), 11/26/1998 7:41:00 PM
From: Jay Hartzok  Read Replies (2) of 805
 
To all,

The following has come to my attention:

If you train an unsatisfactory net and decide to change the settings and start again, simply clicking Forget before restarting is not adequate. Some memory of the old net is retained somewhere, either in the program itself or elsewhere on the computer, and it will contaminate the new net.

You can prove this for yourself. Set up a small net with a verify period, with both back prop and annealing checked. Use just a few neurons (10 to 14) so that it trains quickly. Watch it train, paying close attention to the verify period during back prop. When it shifts to annealing, watch for the changes that occur in the verify period. After annealing has run for a while, or completed, stop the training and click Forget. Then restart the training and watch the verify period closely: you will see that some of the changes annealing made to the supposedly forgotten net are now part of the initial back prop training of the new net.

I have had to go as far as completely deleting the .neu file and rebooting in order to totally free the new net of the contamination.
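NeuroStock's internals are not public, so the cause can't be confirmed, but the behavior described above is consistent with a "Forget" that re-randomizes the weights while leaving some auxiliary training state (momentum buffers, annealing state, and the like) untouched. The sketch below is purely hypothetical and illustrates the general idea with a tiny one-layer net: a weights-only reset carries old state into the next run, while a full reset rebuilds everything.

```python
import random

class TinyNet:
    """Minimal one-layer net with a momentum buffer, to illustrate how a
    partial 'forget' can leave hidden state behind. Hypothetical sketch;
    it does not reflect NeuroStock's actual implementation."""

    def __init__(self, n_inputs, seed=0):
        self.n_inputs = n_inputs
        self.rng = random.Random(seed)
        self.weights = [self.rng.uniform(-0.5, 0.5) for _ in range(n_inputs)]
        self.momentum = [0.0] * n_inputs  # auxiliary optimizer state

    def train_step(self, x, target, lr=0.1, mu=0.9):
        # One back-prop step with momentum on a linear unit.
        y = sum(w * xi for w, xi in zip(self.weights, x))
        err = target - y
        for i in range(self.n_inputs):
            self.momentum[i] = mu * self.momentum[i] + lr * err * x[i]
            self.weights[i] += self.momentum[i]

    def forget_weights_only(self):
        # A buggy 'Forget': re-randomizes the weights but keeps the
        # momentum buffer, so the next run inherits old training state.
        self.weights = [self.rng.uniform(-0.5, 0.5)
                        for _ in range(self.n_inputs)]

    def full_reset(self, seed=0):
        # A clean reset: rebuild all state, including optimizer buffers.
        self.__init__(self.n_inputs, seed)

net = TinyNet(3)
for _ in range(20):
    net.train_step([1.0, 2.0, 3.0], 1.0)

net.forget_weights_only()
leftover = any(m != 0.0 for m in net.momentum)  # old state survives

net.full_reset()
clean = all(m == 0.0 for m in net.momentum)     # truly fresh net
```

Deleting the .neu file and rebooting would be the real-world equivalent of `full_reset`: it discards every piece of on-disk and in-memory state instead of trusting the program's own partial reset.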

Jay