But let's consider the network as you requested. It sounds like multiple concurrent uses of the existing link are the problem – too many users downloading too many large files at once. If you have no control over the user behavior, you certainly want to check whether you're getting the bandwidth you are currently paying for.
You have several options to determine your bandwidth capabilities:
- Use a utility like iPerf or Chariot (assuming you can set up a remote agent at the other end) and stress the link to determine if it can deliver bandwidth as rated.
- Use a utility like AppareNet (if you can't set up a remote agent or are not allowed to stress the link) to derive the same information without instrumenting either end or deploying agents.
NOTE: Don't trust FTP downloads or other forms of bandwidth testing – there are far too many uncontrolled factors that may impact the measurements.
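To make concrete what a tool like iPerf is actually measuring, here is a minimal sketch of a bulk-transfer throughput test in Python. It pushes a fixed payload through a TCP connection and times it – over loopback here, so the number reflects the local stack, not your WAN link; pointing the client at a real remote server (the names, port choice, and payload size below are illustrative assumptions, not part of any tool's API) is what turns this into a link test.

```python
import socket
import threading
import time

PAYLOAD_MB = 32          # total data to push through the connection
CHUNK = 64 * 1024        # send/receive buffer size in bytes

def run_server(sock, result):
    """Accept one connection and count every byte received."""
    conn, _ = sock.accept()
    received = 0
    while True:
        data = conn.recv(CHUNK)
        if not data:
            break
        received += len(data)
    conn.close()
    result["bytes"] = received

def measure_throughput(host="127.0.0.1"):
    """Return measured throughput in Mbit/s for a bulk TCP transfer."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind((host, 0))            # port 0: let the OS pick a free port
    server.listen(1)
    port = server.getsockname()[1]

    result = {}
    t = threading.Thread(target=run_server, args=(server, result))
    t.start()

    client = socket.create_connection((host, port))
    payload = b"\x00" * CHUNK
    start = time.monotonic()
    for _ in range(PAYLOAD_MB * 1024 * 1024 // CHUNK):
        client.sendall(payload)
    client.close()
    t.join()
    elapsed = time.monotonic() - start
    server.close()

    mbits = result["bytes"] * 8 / 1_000_000
    return mbits / elapsed            # megabits per second

if __name__ == "__main__":
    print(f"throughput: {measure_throughput():.1f} Mbit/s")
```

Note how this differs from a casual FTP download test: the payload, connection count, and timing window are all controlled, which is exactly why dedicated tools give more trustworthy numbers.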
If it turns out you have an unexpected bandwidth bottleneck or other performance degradation, finding its cause is your next challenge. Here you have fewer options; this problem is the focus of my own research, embodied in our product AppareNet.
Upgrading to more dedicated bandwidth may help – it is often the only thing you can do when you have poor visibility. And it may not; it depends on what is at the root of your problem. Guessing is a poor basis for business decisions, so I would suggest you get more solid information first.