Hi all! (HowtoForge is a wonderful website, I love it!) I have some doubts about the real transfer bandwidth of a server with a guaranteed 500 Mbit/s connection, which I want to use to stream a live event with the Shoutcast software.

1) A 100 Mbit/s network interface can transfer ~12.5 megabytes per second.
2) 500 Mbit/s ≈ 62,500 KB/s. Each listener of a 128 kbps stereo stream consumes 16 KB/s, so the theoretical maximum is 62,500 / 16 ≈ 3,900 listeners (not the ~500 I first calculated, where I had confused kilobits with kilobytes).

But how much of that bandwidth is lost in practice? How much should I subtract from the nominal figure to make a realistic forecast of the number of concurrently connected listeners?

The hardware will be: Dell PowerEdge™ R710 with a Quad-Core Intel® Xeon® E5504 CPU and 4 GB of RAM.

Any advice is very welcome! Thank you very much in advance.
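To make my question concrete, here is a sketch of the calculation I have in mind. The 5% protocol-overhead figure and the 20% safety margin are my own guesses (not numbers from the hosting provider), which is exactly what I'd like feedback on:

```python
def max_listeners(link_mbps: float, stream_kbps: float,
                  overhead: float = 0.05, safety_margin: float = 0.20) -> int:
    """Conservative estimate of concurrent Shoutcast listeners.

    link_mbps     -- guaranteed uplink in megabits per second
    stream_kbps   -- per-listener stream bitrate in kilobits per second
    overhead      -- fraction lost to TCP/IP and HTTP framing (assumed ~5%)
    safety_margin -- fraction of the link deliberately left unused (assumed 20%)
    """
    usable_kbps = link_mbps * 1000 * (1 - overhead) * (1 - safety_margin)
    return int(usable_kbps // stream_kbps)

# 500 Mbit/s link, 128 kbps stereo stream:
print(max_listeners(500, 128))  # -> 2968 with these assumed margins
```

So with those assumed margins, the theoretical ~3,900 listeners shrinks to roughly 3,000. Are 5% overhead and a 20% headroom reasonable values, or should I budget more?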