Strategies & Market Trends : TA- Scans and System Tests

To: Derek Barkalow who wrote (374), 3/2/1997 5:02:00 PM
From: Paul Beattie, of 989
 
Derek, to grab a web page and import it into Excel, my suggested approach uses the following tools:

1. An offline browser to fetch the web page and store it. I use Surfbot, but I think equivalent routines are available in Perl. Surflogic's web site is surflogic.com. You can download an evaluation copy of Surfbot which will do the job. I use Surfbot 2.02. They have a new release (3.0) which may have more features.

2. A Perl script to parse the HTML fetched and extract the numbers into a text file (either delimited or fixed-column format). Perl 5.0 is available free for Windows 95. Details on how to download it were posted in the QuotesPlus thread.

3. A macro (part of an Excel spreadsheet) to read the text file into a worksheet where you can do additional work. The macro simply reads the text file into a worksheet, and it can be created with the record-macro facility in Excel. You may want to add the macro to the "Tools" dropdown menu in the spreadsheet for convenience.

4. WinBatch to automate these steps. This is a bit complex, but it really only runs the three steps above sequentially, to save you from having to do them manually.
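The heart of the pipeline is step 2: turning fetched HTML into a delimited text file that Excel can import. The original used a Perl script; as a rough modern sketch of the same idea, here is a Python version. The HTML layout and the `QUOTE_ROW` pattern are invented for illustration, so a real quote page would need its own pattern.

```python
import csv
import io
import re

# Hypothetical stand-in for the Perl parsing step: pull (symbol, price)
# pairs out of fetched HTML and emit comma-delimited text for Excel.
# The table layout matched here is invented for illustration.
QUOTE_ROW = re.compile(r"<td>([A-Z]+)</td>\s*<td>([\d.]+)</td>")

def extract_quotes(html):
    """Return (symbol, price) pairs found in the page."""
    return [(sym, float(px)) for sym, px in QUOTE_ROW.findall(html)]

def write_delimited(rows, out):
    """Write the rows as comma-delimited text, ready for Excel import."""
    writer = csv.writer(out)
    writer.writerow(["Symbol", "Price"])
    writer.writerows(rows)

# A fabricated sample page, standing in for the file Surfbot would fetch.
sample = """
<table>
  <tr><td>IBM</td><td>143.50</td></tr>
  <tr><td>MSFT</td><td>97.25</td></tr>
</table>
"""

rows = extract_quotes(sample)
buf = io.StringIO()
write_delimited(rows, buf)
print(buf.getvalue())
```

In the original setup the output would go to a file on disk rather than an in-memory buffer, and the Excel macro from step 3 would then read that file into a worksheet.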

I use this process to collect data from stock quote sites and analyze it for hedging opportunities.

Regards,

Paul