What we need is a simple server setup and a script for file hosting on a website that serves large static files.
The website is a portal for downloading files between 500 MB and 900 MB. We have a large dedicated server with a dedicated 10 GB/s line attached to it. It runs CentOS with Apache and PHP, and it also has nginx with PHP-FPM on another port, but that setup does not serve these files well. When we tried nginx it overloaded the server at more than 300 concurrent connections. We tried lighttpd but had no luck configuring it. So far the best solution we've found is the PHP file on Apache: it handles more than 600 concurrent connections without overload, but at around 700 connections the load climbs too high.
We have a PHP file that hides the direct download path and controls the bandwidth per user, but we know there are other methods for that. What we do is call the file [url removed, login to view], but that way we don't prevent hotlinking, and it consumes a lot of resources. (I've attached the PHP file.)
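One common way to hide the real path without streaming every byte through PHP is the X-Accel-Redirect handoff (nginx; Apache has the equivalent mod_xsendfile): the PHP script authorizes the request and then lets the web server itself send the file, which is far cheaper per connection. A minimal sketch, assuming nginx, with the storage path, URI prefix, and rate limit all placeholders to adapt:

```nginx
# Internal location: reachable only via X-Accel-Redirect, never by direct URL,
# so the real storage path stays hidden and plain hotlinks get a 404.
location /protected/ {
    internal;
    alias /var/www/files/;   # placeholder: real storage folder
    limit_rate 1m;           # per-connection bandwidth cap, moved out of PHP
}

# The PHP side, after its own checks, would emit something like:
#   header('X-Accel-Redirect: /protected/' . $generatedname);
# and exit; nginx then serves the file itself.
```

With this split, PHP only does the per-user checks and a header, so the worker frees up immediately instead of being tied to the whole transfer.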
What we want is the server properly configured to handle the number of users it should; our competitors have fewer resources in their servers, yet they sustain more concurrent connections per server.
We will ask the freelancer to: configure the engine (Apache, Lighttpd, Nginx, etc.), whichever works best, to hide the direct download path (a PHP script is acceptable), and make sure the generated download link carries a random MD5 hash of the downloading user's IP, required before the download can start. We have a folder in our root with all the files in it, and we have a script that generates the file name depending on the content. Example: we have [url removed, login to view]'.$generatedname.' That way we can select the file depending on the content, so the solution must be compatible with it.
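For the per-IP MD5 hash in the link, stock nginx ships with ngx_http_secure_link_module, which does exactly this comparison on the server side. A sketch of the idea; the secret key, parameter names, expiry, and paths below are assumptions to adapt, not a finished config:

```nginx
# Client requests /files/name?md5=HASH&expires=TIMESTAMP.
location /files/ {
    secure_link $arg_md5,$arg_expires;
    # nginx recomputes the hash from the expiry, the URI, the client's IP
    # ($remote_addr) and a shared secret; "my_secret" is a placeholder.
    secure_link_md5 "$secure_link_expires$uri$remote_addr my_secret";
    if ($secure_link = "")  { return 403; }  # hash mismatch: hotlink blocked
    if ($secure_link = "0") { return 410; }  # link expired
    limit_rate 2m;                           # per-connection bandwidth cap
    root /var/www;                           # placeholder storage root
}
```

The link generator (the existing PHP script) would build the same string, take the raw MD5, and base64url-encode it (replace `+` with `-`, `/` with `_`, strip `=`) before appending it as the `md5` parameter; because the client IP is part of the hash, a copied link fails for anyone else, which is the hotlink protection asked for above.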
A good example of this is Bayfiles ([url removed, login to view]): they generate the download link encrypted and compare it against the one being downloaded. That way you also prevent hotlinking, and a user can't download two files at the same time. They used Lighttpd and a PHP script to manage that; we need it to work like that, or close to it.