The author of this tool uploaded a YouTube video demonstrating it a few days ago: https://www.youtube.com/watch?v=15_-hgsX2V0
At one point in his demo, he uploads a file but terminates the upload roughly halfway through. He then starts downloading the file, which progresses only as far as the upload got and then stalls indefinitely. Finally, he finishes uploading the file (which resumes gracefully), and the still-running download seamlessly completes.
I found that particularly impressive.
It's very impressive, particularly if you remember waking up to a failed download from the night before over dial-up.
I recall we had special apps to queue and schedule our downloads, and resume them where servers supported it. They were a dream compared to the boredom of staring at progress bars.
In the BBS days there were transfer programs that allowed interrupted downloads to be resumed.
Anyone remember DAP, Download Accelerator Plus? The colorful bars were nice. A part of my childhood, downloading shareware Windows games through dial-up.
Download Accelerator Plus... wow what a memory.
Finding that piece of software around 2001-2002 finally let me download a specific piece of, ahem, 'shareware' (about 400 MB, zipped) that I could never have finished on a 14.4 kbps modem over a single, very noisy phone line that dropped the call every two hours or so. It eventually took three days, but the file came across uncorrupted. It wouldn't have been possible without the ability to resume downloads after dropped connections.
And that download set me on the path of learning what I wanted to learn, which paved the way for my engineering degrees and set me up for the last twenty-some years. Wild how little pieces of the puzzle like that drive so much of your life.
Some apps still do the same, e.g.:
(also a great app to download everything you wanted from a site, regex selections, etc.)
It makes several connections and downloads chunks in parallel; for sites that limit per-session speed (their upload, your download), it really speeds up the downloads.
Sadly, there hasn't been much development recently (the last commit was 9 months ago).
Download The Mall!
Wow, that's a throwback.
I remember that...
Getright!
The trouble is those special tools also needed downloading. So I could either sacrifice an evening's, ahem, download, or just chance it yet again. I eventually got an FTP client and it was like a superpower. BitTorrent was honestly more impressive to me than AI. Ah, the good old days.
> BitTorrent was honestly more impressive to me than AI. Ah, the good old days.
That’s because BitTorrent was immediately useful and empowering.
The server that has moved countless petabytes is glFTPd, which allows FXP (a client without bandwidth of its own can initiate a transfer of files directly from server to server).
That’s a built-in feature of FTP that doesn’t require server support.
Edit: Source: https://en.wikipedia.org/wiki/File_eXchange_Protocol#Technic...
1. You connect to servers A and B.
2. Tell B to receive a PASV transfer. It replies with the IP address and port it's receiving on.
3. Tell A to send to that address and port.
This is documented in RFC 959, starting with
"In another situation a user might wish to transfer files between two hosts, neither of which is a local host."
One of those things of the past even old nostalgic greybeards like me do not miss at all.
Most files were available via FTP, which supported resume.
Not most. There was (and still is) so much locked behind HTTP on poor servers.
The vast majority of web servers out there [1] support partial download, and have done for years. That the most common UAs for accessing them (web browsers) don't support the feature [2] without addons is not a server-side problem.
Sometimes there are server-side problems: some dynamic responses (i.e. files behind a user account, where the right to access must be checked before sending) are badly designed in ways that unnecessarily break sub-range downloads. This could be seen as a “poor server” issue, but I think it is more a “daft dev/admin” or “bad choice of software” problem.
--------
[1] admittedly not all, but…
[2] wget and curl do, though not automatically without a wrapper script
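To make [2] concrete, here is a minimal sketch (in Python; the URL and filename are placeholders) of the resume logic such a wrapper automates: check how much is already on disk, then ask for the rest with a Range header and expect a 206 back:

    import os
    import urllib.request

    url = "https://example.com/big.iso"   # hypothetical
    path = "big.iso"

    have = os.path.getsize(path) if os.path.exists(path) else 0
    req = urllib.request.Request(url, headers={"Range": f"bytes={have}-"})

    with urllib.request.urlopen(req) as resp, open(path, "ab") as out:
        # 206 Partial Content means the server resumed where we left off;
        # a 200 would mean it ignored the Range and restarted from zero,
        # and blindly appending that would corrupt the file.
        assert resp.status == 206, "server ignored the Range header"
        while chunk := resp.read(64 * 1024):
            out.write(chunk)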
Many sites also had an FTP server behind them, e.g. ftp.id.com and ftp.cdrom.com off the top of my head. Another I remember was downloading high-resolution images of Tyan motherboards from ftp.tyan.com. Supermicro also had an FTP server you grabbed BIOS images from. I don't really recall ever having to download anything big via HTTP; mostly images, PDFs and small zip files.
Obviously I wasn’t using FTP way back then, or I wouldn’t have made the comment I did.
FTP can't restart a PPP or SLIP connection.
No, but FTP and similar protocols shouldn't need to be aware of that layer, any more than they should be aware of the VPN they happen to be connecting through. You can resume an FTP transfer once the PPP or SLIP connection has been restored, though.
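Concretely, FTP resume is just the REST command issued before RETR. A minimal sketch with Python's ftplib (host and filename are placeholders, and the server must support REST):

    import os
    from ftplib import FTP

    name = "big.zip"  # hypothetical file
    have = os.path.getsize(name) if os.path.exists(name) else 0

    ftp = FTP("ftp.example.com")
    ftp.login()  # anonymous
    with open(name, "ab") as out:
        # rest=have makes ftplib send "REST <offset>" before RETR,
        # so the server resumes from where the last attempt died.
        ftp.retrbinary("RETR " + name, out.write, rest=have)
    ftp.quit()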
I remember redownloading Liero over and over again and failing, then cherishing it once I got a successful download. It just barely failed to fit on a floppy.
Amateur. Use FlashGet or NetAnts, which download the file in 8 simultaneous chunks. I used to cheer the threads on during the last legs of a whopping 5 MB download. I hated servers which didn't allow resume or even report the file size.
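For flavour, the FlashGet-style trick is roughly this (a toy sketch with a placeholder URL; the server must honour Range requests and report Content-Length):

    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    url = "https://example.com/big.zip"  # hypothetical
    N = 8

    head = urllib.request.urlopen(urllib.request.Request(url, method="HEAD"))
    size = int(head.headers["Content-Length"])

    def fetch(i):
        start = i * size // N
        end = (i + 1) * size // N - 1  # byte ranges are inclusive
        req = urllib.request.Request(url, headers={"Range": f"bytes={start}-{end}"})
        return urllib.request.urlopen(req).read()

    with ThreadPoolExecutor(max_workers=N) as pool:
        chunks = list(pool.map(fetch, range(N)))  # map preserves chunk order

    with open("big.zip", "wb") as out:
        for chunk in chunks:
            out.write(chunk)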
If I remember correctly, unfortunately the HTTP server hosting Liero back then didn't support requests with the Range header.
But it's also likely that I didn't have FlashGet on my cousin's computer, which was the one we used to play it.
Magic of HTTP 206?
And you forgot to disable call waiting!
I really didn't think I needed this software, but the video is so good that I'm going to try hard to find a use case.
Could be useful when launching a Doom shareware release.
“Race the beam”
That’s really cool. I’ve never seen that work before.
Sounds like...BitTorrent.
Or… proper adherence to HTTP RFCs… with some added devx
BitTorrent needs to know the complete file up front in order to compute the pieces. This tool doesn't need to know the complete file to start the upload, nor the download...
IIRC webtorrent /can/ do streaming though....
Sounds like BitTorrent needs better PR then.
You might like NNCP, which was written precisely to support severely constrained or even cut-down networks.
It would be even more impressive if he rebooted the server in the meantime.
NNCP supports that.