Jay, if I'm seriously going to train the net, I use the maximum number of neurons.
Hope you don't mind my two cents. From what I understand about neural networks, there is a trade-off between the number of inputs, the size of the network (number of neurons), and reliability and robustness. While it may be true that a net with more inputs can see more of what is going on in the market, it also means the net has more points of failure if any of those related inputs diverge from their usual relationships. This is commonly referred to as degrees of freedom, and it is a concept I have taken to heart when applying a model to real-world trading.

The same applies to the number of neurons: the more neurons a model has, the more relationships or patterns it can discern. But this has the effect of 'memorizing' previous data, which means you get great results on the training set but poor results during verification and real-world trading.

The only way around this is to extend the training set. By forcing the net to see a large number of samples (historical data), you force it to generalize on the relationships between inputs, so longer-term patterns are used instead of short-term fluctuations. It also tends to decrease the profit of a network, since you become a longer-term trader, trading on longer-term patterns. You can keep a training set short so the net senses only the most recent, shorter-term patterns, but then you must watch that you don't overtrain or 'curve-fit' the data. A short training set will boost the profit of the nets, since they see the most recent patterns rather than ones from years ago that may not apply today.

I know the NeuroStock manual says that you cannot overtrain a model, but if I am betting real dollars in the market, I would want to be sure. This is also why you would want to combine neural networks with other methods such as technical analysis and fundamental analysis.

Great thread BTW.
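To make the 'memorizing' point concrete, here is a quick sketch using a polynomial fit as a stand-in for network size (the polynomial degree plays the role of the number of neurons). Everything here is synthetic and hypothetical; it illustrates the general curve-fitting trade-off, not NeuroStock's actual model:

```python
import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(0)

# Hypothetical "price" series: a slow cycle plus noise.
x = np.linspace(0.0, 1.0, 40)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.3, size=x.size)

# Split chronologically, like a training set and a verification set.
x_tr, y_tr = x[:30], y[:30]
x_va, y_va = x[30:], y[30:]

def fit_and_score(degree):
    """Least-squares polynomial fit; degree stands in for model size.

    Returns (training RMSE, verification RMSE)."""
    model = Polynomial.fit(x_tr, y_tr, degree)
    rmse = lambda xs, ys: float(np.sqrt(np.mean((model(xs) - ys) ** 2)))
    return rmse(x_tr, y_tr), rmse(x_va, y_va)

small_train, small_valid = fit_and_score(3)   # few degrees of freedom
big_train, big_valid = fit_and_score(15)      # many degrees of freedom

# The flexible model 'memorizes' the training data (lower training error)
# but does much worse on data it has never seen.
print(f"degree  3: train {small_train:.3f}  verify {small_valid:.3f}")
print(f"degree 15: train {big_train:.3f}  verify {big_valid:.3f}")
```

Run it and you should see the high-degree fit beat the low-degree fit on the training window while falling apart on the held-out points, which is exactly the trap with a short training set and a large net.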
Optim