Hello all, I have successfully uploaded the entire kit (18 MB) twice, and of course for every new website you create you have to upload the entire complement of files each time. Or do you?
First, let me explain that although I have broadband here in Australia, that only gives you fast download speed; the upload speed is still restricted to the speed your telephone line is rated at, which in my case is pretty slow. It takes me about two hours all up to upload the full set of files, but I can't just start the upload and go off somewhere else, because the connection seems to drop out every now and again. So I have worked out that I need to upload each sub-folder one at a time and physically sit at the computer to keep an eye on things. I am uploading to a shared environment, namely GoDaddy, which I find is great, but the speed issue of course belongs to me.
My question is this: can I cheat the system and keep just one set of files on the server, then somehow point new domains at that folder? Perhaps I could have a unique storage folder for the information that differs from site to site, while keeping the common files that are never altered in one central position.
Even if you can do that, it would still be great if I could somehow increase the speed of the upload. I tried the built-in GoDaddy FTP facility, which is a nice unit, and I use VS2008 for FTP as well, but regardless it is slow. I was wondering if compressing the entire package and then expanding it on the server might help, or if there is some utility out there designed to transfer large files around the 18 MB mark or greater. Years ago I had a utility that would reconnect automatically if you were disconnected, but I can't remember what it was called.
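For what it's worth, here is a rough sketch of the compress-then-upload idea combined with the auto-reconnect behaviour of that old utility, using Python's standard zipfile and ftplib modules. The host, login, and folder names are placeholders, the resume trick relies on the server honouring REST on uploads (most do, but not all), and you'd still need some way to extract the archive on the server, such as your host's file manager, so treat this as a sketch rather than a turnkey solution:

```python
# Sketch: zip the whole site kit into one archive, then upload it over FTP,
# reconnecting and resuming from the last byte if the connection drops.
# HOST/USER/PASSWORD and the folder names below are placeholders.
import os
import time
import zipfile
from ftplib import FTP, error_temp

KIT_DIR = "site_kit"            # local folder holding the ~18 MB of files
ARCHIVE = "site_kit.zip"
HOST, USER, PASSWORD = "ftp.example.com", "user", "secret"

def make_archive(src_dir, archive_path):
    """Zip every file under src_dir, keeping paths relative to src_dir."""
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(src_dir):
            for name in files:
                full = os.path.join(root, name)
                zf.write(full, os.path.relpath(full, src_dir))

def upload_with_resume(archive_path, retries=20, wait=10):
    """Keep reconnecting and resuming until the whole archive is uploaded."""
    total = os.path.getsize(archive_path)
    remote_name = os.path.basename(archive_path)
    for _attempt in range(retries):
        try:
            ftp = FTP(HOST, timeout=60)
            ftp.login(USER, PASSWORD)
            ftp.voidcmd("TYPE I")           # binary mode, so SIZE is in bytes
            try:
                sent = ftp.size(remote_name) or 0
            except Exception:
                sent = 0                    # remote file doesn't exist yet
            if sent >= total:
                ftp.quit()
                return                      # already fully uploaded
            with open(archive_path, "rb") as fh:
                fh.seek(sent)               # resume where the server left off
                ftp.storbinary("STOR " + remote_name, fh, rest=sent)
            ftp.quit()
            return
        except (OSError, error_temp):
            time.sleep(wait)                # connection dropped; retry shortly
    raise RuntimeError("upload kept failing after %d attempts" % retries)

if __name__ == "__main__":
    make_archive(KIT_DIR, ARCHIVE)
    upload_with_resume(ARCHIVE)
```

Even if the server can't extract the zip, compressing first usually shrinks 18 MB of web files noticeably, and moving one big file tends to be faster than thousands of tiny FTP transfers because each small file pays its own round-trip overhead.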
Generally, is there a better way of getting files to a remote server? Is there a way of eliminating having to do it every time you create a new website that has a unique domain? Are there any secrets??? Thanks for reading :-)