Nenad Milenkovic is an experienced OS/2 user, specializing in networking and communications, and the best-known OS/2 advocate in Yugoslavia. He has published two books and now writes for mainstream print computer magazines, but welcomes the opportunity to contribute to a magazine read by people with views and preferences similar to his own.
Summary: Learn how to make downloads painless with a simple utility that not only retries failed attempts automatically, but also supports "resume" (also known as "reget").
For many people, especially those with bad dial-up connections, Navigator's "Saving Location" is not good enough. Although the new version supports continuation of unfinished downloads, its implementation is unreliable and rather clumsy: the file must still be in the cache, nothing is written to disk if you press "Cancel" during a transfer to try restarting it, and so on. Many people use a command-line tool called WGET, ported from Unix and distributed under the GNU license. It is very stubborn when it wants to download a file, works reliably over slow connections, knows how to restart and retry if the connection is lost, and so on. But it is certainly easier just to click a link in Netscape than to start a command-line utility from a prompt and pass it all the necessary parameters (although '-c' and the URL are enough most of the time). Changing or adding file/MIME associations in Netscape doesn't help, since those only pass an already-downloaded file to an external program for processing.
Not surprisingly, the solution comes with a little help from the WPS and Rexx. It's called Auto Wget.
It works by starting a process that monitors requests for downloads and calls WGET for each of them. How do you issue those requests? As simply as possible: by dragging links from Navigator's page into the "ToDo" folder (a "shadow" of it is created on the Desktop by the installation procedure). Dragging a link creates a URL Object in the folder, and the "Auto Wget Daemon" scans its contents every minute (configurable), downloading whatever it finds there.
However, what really shines is the implementation of this idea. The "Auto Wget Daemon", which is a Rexx script, is started automatically from the "StartUp" folder. When it detects a URL Object in the "ToDo" folder (which has "Running" and "Info" subfolders), it will:
1. If configured to do so, check whether the dial-up connection is active (by running 'netstat -a' and analyzing its output).
2. Check whether any unfinished downloads exist in the "Running" subfolder (it does this with the help of semaphore files) and resume them if possible.
3. Check whether the maximum number of concurrent downloads has been reached.
4. Prepare semaphore files and call another Rexx script that handles invoking WGET with its parameters and does the reporting: writing to a log file, using the PMPOPUP program to show on-screen status messages about finished downloads, and so on.
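The four steps above can be sketched in Python (the real daemon is a Rexx script; the function names, the ".sem" extension, and the concurrency limit here are illustrative assumptions, not AutoWget's actual identifiers):

```python
import os
import subprocess

MAX_CONCURRENT = 3  # assumption: the real limit is read from AWGET.CFG


def connection_is_up(netstat_output):
    """Step 1: treat the dial-up link as active if 'netstat -a'
    reports at least one established TCP connection."""
    return any("ESTABLISH" in line.upper()
               for line in netstat_output.splitlines())


def count_running(running_dir):
    """Steps 2-3: semaphore files in the 'Running' folder mark downloads
    in progress; counting them enforces the concurrency limit."""
    return len([f for f in os.listdir(running_dir) if f.endswith(".sem")])


def start_download(url, running_dir, name):
    """Step 4: drop a semaphore file, then hand the URL to wget;
    '-c' makes wget continue a partially retrieved file."""
    open(os.path.join(running_dir, name + ".sem"), "w").close()
    return subprocess.Popen(["wget", "-c", url])
```

In the actual daemon this cycle simply repeats once a minute, so a lost connection or a full queue just postpones the download to a later pass.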
The entire process is so well designed that you'll have no worries once the URL Object is created. The Auto Wget Daemon will download the file now or the next day, on this call or the next, in one attempt or in a hundred. It's a true "fire and forget" solution. Once there is free space in the queue, a URL Object from the "ToDo" folder is used to start the download process; while it is still waiting, you can delete it or move it somewhere else if you change your mind. You can check the "Running" subfolder to see which files are currently downloading, and you'll find a log of activities in the "Info" folder too.
If you are familiar with Rexx, take some time to examine both scripts; they are a nice example of good Rexx programming and demonstrate some features that are not commonly used (queues, stems, etc.). To make AutoWget work, you must have the WGET utility installed and available somewhere in your PATH. The same goes for NETSTAT, but if you have installed TCP/IP for Warp then you already have it, and WGET is available from all the well-known OS/2 file repositories.
What makes this magic possible is the fact that a "URL Object" is actually a subclass of the "Data File" class of WPS objects: in addition to the usual data the Workplace Shell keeps about it, the file contains the whole URL as its first (and only) line, which is what AutoWget reads.
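Since a URL Object is just a data file whose first line is the URL, reading it takes only a couple of lines. A Python sketch (the real script does this in Rexx; the function name is illustrative):

```python
def read_url_object(path):
    """A WPS URL Object is a plain data file; its first (and only)
    line holds the URL, which is all AutoWget needs to extract."""
    with open(path) as f:
        return f.readline().strip()
```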
There are two problems with this solution, though. First, all documentation, including the Web page, is only in Russian (the authors are from Russia). Luckily, installation and usage are very simple, and you already know everything you need from this article. Just unzip the distribution archive into some directory, run INSTALL.CMD and check out the configuration file called AWGET.CFG. It is placed in the directory named by the %ETC% environment variable, where most other TCP/IP configuration files live. So, after installation, just type 'e %etc%\awget.cfg' and make sure the download directory is correct. The other parameters in this file are self-explanatory; just bear in mind that 1 means "yes" and 0 means "no". Make sure WGET is available via your PATH statement, make sure the check_connection parameter is 1 if you use dial-up, and you can start dragging and dropping.
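For orientation, a minimal AWGET.CFG might look something like this. Only check_connection is named in this article; the other key and the comment syntax are purely hypothetical placeholders, so check the real file after installation:

```
check_connection=1        ; 1 = yes: only download while the dial-up link is up
download_dir=D:\download  ; hypothetical key name: where finished files land
```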
The second problem is that Netscape v4.04 for OS/2 still doesn't support dropping URL Objects (or images and files, for that matter) directly into WPS folders, only onto the Desktop. So, if you're using the v4 browser, it takes two steps: first create the URL Object on the Desktop, then move it to the "ToDo" folder. Netscape v2 users should have no problems, and the developers of Communicator for OS/2 v4 have stated that full drag & drop functionality is their "number one priority" for the next release.
In short, if you have problems transferring files, or need a nice way of queuing multiple files for long overnight downloads, go to the authors' web page and download Auto Wget Daemon. You'll hardly find another solution as elegant and powerful as this one, even on OS/2, not to mention other platforms.
Copyright © 1998 - Falcon Networking | ISSN 1203-5696 | November 1, 1998