Both UTP and STP cabling systems are designed to reduce the impact of internal and external sources of electromagnetic interference (EMI) by continuously twisting the four pairs of internal wires around each other. This is done for two reasons:
- To reduce interference from the other pairs in the same cable.
- To reduce interference from external sources by constantly changing the angle at which external electromagnetic interference meets the signal travelling through the cable.
Realistically, I have seen very few circumstances where STP was actually needed. In fact, I can recall only one manufacturing plant and one restaurant that required STP, because the cable path passed extremely close to large generators and motors.
Now, under typical circumstances, UTP may at times be exposed to passing sources of EMI, which usually shows up as an increased number of CRC errors and retries. While I have seen this happen in a couple of abnormal instances (abnormal because most cable runs are static and routed away from EMI sources), it has tended to mean a 10-20% reduction in performance, and even those degradation patterns were sporadic.
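To make that 10-20% figure concrete, here is a rough sketch of the mechanism: a frame that arrives with a bad CRC is discarded and must be retransmitted, so a given frame-error rate translates fairly directly into lost goodput. The numbers are the illustrative figures from the text, not measurements.

```python
# Sketch: EMI-induced CRC errors force retransmissions, reducing
# the effective throughput of a link. Figures are illustrative.

def effective_throughput(link_rate_mbps: float, frame_error_rate: float) -> float:
    """Approximate goodput when a fraction of frames fail their CRC check.

    On average each frame must be sent 1 / (1 - FER) times, so the
    useful data rate drops to link_rate * (1 - FER).
    """
    return link_rate_mbps * (1.0 - frame_error_rate)

# A 100 Mbps Ethernet link under the 10-20% EMI penalty from the text:
for fer in (0.10, 0.20):
    print(f"{fer:.0%} frame errors -> ~{effective_throughput(100.0, fer):.0f} Mbps")
```

This is deliberately simplified (it ignores timeouts and backoff, which make real-world degradation lumpier), which is consistent with the sporadic patterns described above.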
Now, on to your question :-)
Assuming a BEST case scenario, you're using an access point that has the following attributes:
- An Ethernet interface of 100Mbps
- Supports the operation of both 802.11a and 802.11g simultaneously (54Mbps per radio), giving you a maximum across-the-air bandwidth of 108Mbps. This translates to approximately 50-60Mbps of real Ethernet throughput.
As above, if you were to experience some level of EMI on your UTP, you would probably be looking at a 10-20% reduction in maximum throughput, leaving you with a maximum Ethernet throughput of roughly 80Mbps, which is still well above the throughput provided by BOTH your wireless interfaces combined!
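The back-of-the-envelope comparison above can be sketched in a few lines. All figures are the assumed best-case numbers from the text (100Mbps wired uplink, a worst-case 20% EMI penalty, ~60Mbps combined real wireless throughput), not measurements.

```python
# Sketch: even an EMI-degraded wired uplink still out-runs the
# combined wireless throughput, so the wireless side is the bottleneck.

ethernet_mbps = 100.0          # access point's wired interface
emi_loss = 0.20                # worst case of the 10-20% EMI penalty
degraded_ethernet = ethernet_mbps * (1.0 - emi_loss)   # ~80 Mbps

wireless_real_mbps = 60.0      # upper end of the ~50-60 Mbps real
                               # throughput of 802.11a + 802.11g combined

bottleneck = min(degraded_ethernet, wireless_real_mbps)
print(f"Degraded wired uplink:  {degraded_ethernet:.0f} Mbps")
print(f"Combined wireless:      {wireless_real_mbps:.0f} Mbps")
print(f"Effective bottleneck:   {bottleneck:.0f} Mbps (the wireless side)")
```

Since the bottleneck stays on the wireless side even under worst-case EMI, swapping UTP for STP cannot change the throughput the clients actually see.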
So, will STP give you a performance increase? Possibly. Will you notice it in 99.9% of cases? Probably not.
This was first published in March 2004