I think most of us have seen the Microsoft Download Center and the download sites provided by others. For example, Visual Studio Service Pack 1 was released yesterday. Somebody downloads it, puts it on a network drive, and tells everyone. Wouldn’t it be nice if this were automated? This process happens over and over with various software downloads throughout the year. It would conserve quite a bit of bandwidth if one person got the 130 MB file instead of ten. The same goes for the huge packages from InstallShield and all the patches needed to update a Windows XP box.
It would help network bandwidth to download certain files once to a “local” download center for the company. This should be done at the router level, so that when the router sees a request for the same file again it can serve it from its own copy instead of downloading it again. It’s the same idea as caching on a proxy server, but instead of piddly 20 KB web pages you’re doing it for 100+ MB files. There would need to be an admin screen to view and tag the large files that come in; surely you don’t want to save every large file. You might also want to store a product key or other metadata alongside the file so it can be installed or activated later.
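The core mechanism is simple: the first request for a file downloads it and keeps a local copy, and every later request for the same URL is served from that copy. A minimal sketch of that idea in Python (the function names, the cache directory, and the example URL are all my own illustrations, not part of any real product):

```python
import hashlib
import os
import urllib.request


def cache_path(url, cache_dir):
    """Map a URL to a stable local filename; hashing avoids name collisions."""
    name = hashlib.sha256(url.encode("utf-8")).hexdigest()
    return os.path.join(cache_dir, name)


def fetch(url, cache_dir):
    """Return a local path for url, downloading only on the first request."""
    os.makedirs(cache_dir, exist_ok=True)
    path = cache_path(url, cache_dir)
    if not os.path.exists(path):
        # Cache miss: the first requester pays the bandwidth cost once.
        urllib.request.urlretrieve(url, path)
    # Cache hit: every later requester is served from the local copy.
    return path
```

A real appliance would of course add cache eviction, integrity checks, and the tagging/metadata screen described above, but the hit-or-download decision is the whole trick.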