Strategies & Market Trends : Neural Nets - A tool for the 90's

To: Larry Livingston who wrote (619), 8/7/1999 10:08:00 PM
From: LastShadow
I would think it would be easier to set up something like the Point of Balance Oscillator, or anything you can create that formulates all the lags into the one or two conditionals you're looking for. That is, if you are looking at a relationship that involves the 2-day high lag and the 4-day low lag, it is better (processor-wise) to combine that criterion into a single input, as in the sketch below.

The problem with using more than 30 inputs is that what you are really doing is just weighting specific (although perhaps not apparent) data. After all, we are only talking about OHLC and volume, and the mathematics of what can be done with those few pieces of information. Using more tends to defeat the purpose anyway. If you are using more than a dozen inputs, you should review the specific signals to make sure they are not simply reinforcing another input (thereby overweighting that characteristic). The other caveat is that the more lags you utilize, the more you are relying on specific historical patterns rather than general relationships. And it is the understanding and determination of those relationships that provides for better predictive accuracy, rather than the isolation of a specific lag occurrence.
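To make that concrete, here is a minimal sketch of folding two lags into one conditional input, plus a quick check for inputs that merely reinforce one another. It assumes pandas-style OHLC columns named close, high, and low; the function names are hypothetical, not anything from this thread.

```python
import pandas as pd

def combined_lag_signal(df: pd.DataFrame) -> pd.Series:
    """Fold the 2-day high lag and 4-day low lag into one input.

    Returns 1 when today's close is above both the high of 2 days ago
    and the low of 4 days ago, else 0 -- a single conditional instead
    of two raw lag columns fed to the net separately.
    """
    above_high_lag2 = df["close"] > df["high"].shift(2)
    above_low_lag4 = df["close"] > df["low"].shift(4)
    return (above_high_lag2 & above_low_lag4).astype(int)

def redundant_inputs(inputs: pd.DataFrame, threshold: float = 0.9) -> list:
    """Flag pairs of candidate inputs whose correlation exceeds the
    threshold -- a quick way to spot one signal simply reinforcing
    (and so overweighting) another."""
    corr = inputs.corr()
    pairs = []
    for i, a in enumerate(corr.columns):
        for b in corr.columns[i + 1:]:
            if abs(corr.loc[a, b]) > threshold:
                pairs.append((a, b, corr.loc[a, b]))
    return pairs
```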

One wants a net that is predictive of general directional movement (of price or volume or a stochastic or whatever). That it may give a 5% profit one time and 50% another is immaterial, as sentiment, news, and other nonlinear, non-input factors largely determine the degree of movement.

A neural net, by the nature of the math, will report a higher probability of fitness/correctness the more inputs you give it, but it is less likely to be right over shorter timeframes. Fewer pieces of data (characteristics), backed by a great number of occurrences, give a lower stated probability of being right but are more likely to be correct in practice. That is the nature of the beast.
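One way to see that tradeoff, rather than trusting the in-sample fit, is to score the net on held-out recent data. A minimal sketch on toy data, using scikit-learn purely as an illustration (nothing in the post implies any particular package):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Toy data: 500 bars, 30 candidate lag inputs, binary up/down target.
# Only the first two inputs carry signal; the rest are noise.
X = rng.normal(size=(500, 30))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500)) > 0

# Hold out the most recent bars; shuffling would leak future data.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, shuffle=False)

for n_inputs in (2, 30):
    net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
    net.fit(X_tr[:, :n_inputs], y_tr)
    print(n_inputs, "inputs:",
          "train", round(net.score(X_tr[:, :n_inputs], y_tr), 3),
          "test", round(net.score(X_te[:, :n_inputs], y_te), 3))
```

With the 28 noise inputs included, the larger net will typically fit the training bars better while doing no better, and often worse, on the held-out bars.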

How to know what works depends largely on whether you are creating a general tool for screening or a specific tool for an isolated equity.

lastshadow