Strategies & Market Trends : NeuroStock


To: Jay Hartzok who wrote (199) | 10/7/1998 11:24:00 PM
From: Bill Scoggin
 
I saw an article a while back where someone wrote something to the effect of: "There's a tendency in neural networks to throw everything available at the network, kitchen sink included, and let the net sort it out... but at some point the network quits learning and starts memorizing, which does away with its ability to generalize."
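That learning-versus-memorizing distinction can be made concrete with a toy sketch (everything below is illustrative, not NeuroStock's internals): a lookup table "memorizes" its training data perfectly but fails on any input it hasn't seen, while a model that fits the underlying relationship generalizes to new inputs.

```python
import random

random.seed(0)

# Underlying relationship: y = 2*x, observed with noise.
def noisy_sample(x):
    return 2 * x + random.gauss(0, 0.5)

train = [(float(x), noisy_sample(x)) for x in range(20)]
test = [(x + 0.5, 2 * (x + 0.5)) for x in range(20)]  # unseen inputs

# "Memorizing" model: a lookup table of the training pairs.
table = dict(train)
def memorizer(x):
    return table.get(x, 0.0)  # knows nothing about unseen inputs

# "Generalizing" model: least-squares fit of a single weight w.
w = sum(x * y for x, y in train) / sum(x * x for x, _ in train)

def mse(model, data):
    return sum((model(x) - y) ** 2 for x, y in data) / len(data)

print(mse(memorizer, train))        # exactly zero: perfect recall
print(mse(lambda x: w * x, train))  # small but nonzero (noise remains)
print(mse(memorizer, test))         # large: cannot handle unseen inputs
print(mse(lambda x: w * x, test))   # small: generalizes
```

A net with too much capacity relative to its training set drifts toward the lookup-table end of this spectrum.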

Anyway, the point of the article was that if a net can be trained with a few reliable indicators as inputs, as opposed to a lot of inputs that may or may not have much effect on the outcome, then its predictions will often be more accurate on real-world data.

As best I remember, the article suggested monitoring the weight structure to see which inputs had the most influence on the output.
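One simple version of that monitoring, assuming you could get at the input-to-hidden weights: sum the absolute weights attached to each input and rank them. The matrix and input names below are made up for illustration; nothing here reflects how NeuroStock actually stores its weights.

```python
# Hypothetical trained input-to-hidden weight matrix: rows are hidden
# units, columns are inputs (names are illustrative only).
input_names = ["close", "volume", "rsi", "day_of_week"]
weights = [
    [ 0.85,  0.02, 0.60, -0.01],
    [-0.90,  0.05, 0.40,  0.03],
    [ 0.70, -0.04, 0.55,  0.02],
]

# Crude importance heuristic: total absolute weight per input column.
importance = {
    name: sum(abs(row[i]) for row in weights)
    for i, name in enumerate(input_names)
}

for name, score in sorted(importance.items(), key=lambda kv: -kv[1]):
    print(f"{name:12s} {score:.2f}")
```

Inputs whose columns stay near zero after training are candidates for removal, which ties back to the "few reliable indicators" point above.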

The problem with NeuroStock, I suppose, is whether we have access to the weight structure at all, and whether we know exactly how the input training sets are submitted to the training loop.

More food for thought.

Bill



To: Jay Hartzok who wrote (199) | 10/23/1998 10:16:00 AM
From: Jay Hartzok
 
Andrew, a suggestion for the next version:

The addition of two preprocessed filter inputs: one for relative strength and the other for average directional movement. I think using both of these indicators would give any net additional insight.
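For reference, here is a rough sketch of the two indicators, each scaled to 0-100 so they could feed a net directly. These are simplified simple-average variants of Wilder's RSI and directional movement index (the DX, i.e. the ADX before its final smoothing), and the bar data at the bottom is made up; this is not how NeuroStock computes anything.

```python
def rsi(closes, period=14):
    """Relative Strength Index over the last `period` price changes
    (simple averages rather than Wilder's exponential smoothing)."""
    changes = [b - a for a, b in zip(closes, closes[1:])][-period:]
    avg_gain = sum(c for c in changes if c > 0) / period
    avg_loss = sum(-c for c in changes if c < 0) / period
    if avg_loss == 0:
        return 100.0
    return 100.0 - 100.0 / (1.0 + avg_gain / avg_loss)

def dx(highs, lows, closes, period=14):
    """Directional movement index: 100 * |+DI - -DI| / (+DI + -DI),
    using simple sums of DM and true range over `period` bars."""
    plus_dm = minus_dm = tr = 0.0
    for i in range(len(highs) - period, len(highs)):
        up = highs[i] - highs[i - 1]
        down = lows[i - 1] - lows[i]
        plus_dm += up if up > down and up > 0 else 0.0
        minus_dm += down if down > up and down > 0 else 0.0
        tr += max(highs[i] - lows[i],
                  abs(highs[i] - closes[i - 1]),
                  abs(lows[i] - closes[i - 1]))
    plus_di = 100.0 * plus_dm / tr
    minus_di = 100.0 * minus_dm / tr
    return 100.0 * abs(plus_di - minus_di) / (plus_di + minus_di)

# Made-up bars: a steady one-point-per-bar uptrend.
closes = [float(c) for c in range(30, 45)]
highs = [c + 0.5 for c in closes]
lows = [c - 0.5 for c in closes]

print(rsi(closes))              # 100.0: every change is a gain
print(dx(highs, lows, closes))  # 100.0: purely one-directional movement
```

Since both outputs already live on a fixed 0-100 scale, they would need little further normalization before being presented as net inputs.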

Jay