Jay wrote: > I haven't given up on Dave yet.
Good. Actually, I was away for several days, so I couldn't respond to the discussion. Further, I'm still at work, so I won't take time to write a long post now, except to say thanks for the responses.
It appears you have figured out the verification issue--as Andrew says, just set the last day of the training set to a date before the last date in your data, and the remaining days become your verification set. As you will then see in the scatter plot, it is often the case that the network appears quite good on the training set but performs poorly when applied to "out of sample" data. Further, the confidence level does not take this into account. Thus, my SPX model with 72% confidence was in fact a poor model; if I had not used a verification set, I would have expected it to perform well. (That was really the point of the first item in my last posting.)
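For anyone following along, the idea can be sketched in a few lines. This is a hypothetical illustration, not the NN package Dave is using: a plain linear autoregressive fit on a synthetic price series stands in for the trained network, and the cutoff index plays the role of the "last day of the training set." All names here are made up for the example.

```python
# Sketch: hold out the tail of a time series as a verification set.
# A simple linear autoregressive fit stands in for the trained network.
import numpy as np

rng = np.random.default_rng(0)
prices = np.cumsum(rng.normal(0, 1, 500)) + 100  # synthetic price series

# Build lagged features: predict tomorrow's price from the last 3 days.
LAGS = 3
X = np.column_stack([prices[i:len(prices) - LAGS + i] for i in range(LAGS)])
y = prices[LAGS:]

# Train on everything before the cutoff date; verify on the held-out tail.
cutoff = int(0.8 * len(y))          # the "last day of the training set"
X_tr, y_tr = X[:cutoff], y[:cutoff]
X_ver, y_ver = X[cutoff:], y[cutoff:]

# Least-squares fit with an intercept term.
coef, *_ = np.linalg.lstsq(np.c_[np.ones(len(X_tr)), X_tr], y_tr, rcond=None)

def rmse(Xs, ys):
    """Root-mean-square error of the fitted model on (Xs, ys)."""
    pred = np.c_[np.ones(len(Xs)), Xs] @ coef
    return float(np.sqrt(np.mean((pred - ys) ** 2)))

print(f"training RMSE:     {rmse(X_tr, y_tr):.3f}")
print(f"verification RMSE: {rmse(X_ver, y_ver):.3f}")
```

Comparing the two numbers (or the two scatter plots) is the check Dave describes: a model that looks good in-sample but much worse on the verification tail is overfit, regardless of its reported confidence.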
Well, gotta go. Sorry for the long delay. I'll try to visit regularly, but no firm promises. I'm in the middle of changing jobs/relocating, so things will get interesting in the next 30-60 days...
Regards, Dave