On 09/01/12 18:14, mick wrote:
Hi Guys
I have recently devoted a VM to tails (tails.boum.org). That VM appears in a DNS round robin with five other servers. When I first offered the server, trials suggested I could expect outbound traffic of around 20-30 GB per day. I can live with that. Unfortunately, the traffic has now jumped to 150-180 GB per day. I can't live with that. It will cost me too much.
So I need some advice on bandwidth shaping. Can anyone recommend any tools which will allow me to set limits so that I can throttle web traffic to around 30 GB per day (or 250 GB per week or 1000 GB per month, for example)?
I am not sure what you want to do to enforce the limit.
Do you propose to serve 30GB of data over 24 hours and then stop/go offline?
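If it's the former, one way I know of on Linux is the iptables quota match: accept outbound traffic until a byte allowance is spent, then drop, and reset the counter from cron. A rough sketch, assuming eth0 and a 30GB (30 x 1024^3 byte) allowance:

  # sketch only: hard-cap egress on eth0 at ~30GB, then drop everything
  # (eth0 and the byte count are assumptions; adjust to taste)
  iptables -A OUTPUT -o eth0 -m quota --quota 32212254720 -j ACCEPT
  iptables -A OUTPUT -o eth0 -j DROP

The quota counter only resets when the rules are reloaded, so a daily cron job would need to flush and re-add them. Crude, but it does enforce a hard cap.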
Or do you wish to limit bandwidth so that it isn't possible to transfer more than 30GB over 24 hours? In that case you need to throttle bandwidth to roughly 2.8Mb/s (assuming the utilisation is constant, which presumably it won't be).
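For reference, the arithmetic: 30GB over 86400 seconds is 30e9 x 8 / 86400, or about 2.8Mbit/s. If constant throttling is what you want, a minimal sketch with tc's token bucket filter (again assuming eth0, with burst and latency values picked out of the air):

  # shape all egress on eth0 to ~2.8Mbit/s (sketch; tune burst/latency)
  tc qdisc add dev eth0 root tbf rate 2800kbit burst 32kb latency 400ms

tbf queues packets rather than dropping them outright, so short peaks get smoothed at the cost of some added latency.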
It sounds like having such a heavy continuous rate limit on the sort of service you are providing would make your contribution less useful, and the only "clever" way I know of doing it slightly better is to use Committed Access Rates (CAR), which can at least give burst allowances to smooth out the peaks.
AFAIK you can do something CAR-like in Linux using the traffic-control (tc) tools in the iproute2 suite. Last time I did it I was using a Cisco router.
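The nearest Linux analogue I can think of is tc's policing filter, which takes a rate plus a burst allowance much like CAR does. A hedged sketch for policing inbound traffic (interface and numbers are assumptions, adapted from the usual LARTC ingress recipe):

  # police ingress on eth0 to ~2.8Mbit/s with a 256k burst allowance
  tc qdisc add dev eth0 handle ffff: ingress
  tc filter add dev eth0 parent ffff: protocol ip prio 50 u32 \
      match ip src 0.0.0.0/0 police rate 2800kbit burst 256k drop flowid :1

The same police action can hang off an egress filter instead if, as here, it's outbound traffic you need to cap.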