Round-trip time (RTT), also called round-trip delay, is the time required for a signal pulse or packet to travel from a specific source to a specific destination and back again. In this context, the source is the computer initiating the signal and the destination is the remote computer or system that receives the signal and retransmits it. Several factors can affect the RTT, including:
- The data transfer rate of the source's Internet connection
- The nature of the transmission medium (copper, optical fiber, wireless or satellite)
- The physical distance between the source and the destination
- The number of nodes between the source and the destination
- The amount of traffic on the LAN (local area network) to which the end user is connected
- The number of other requests being handled by intermediate nodes and the remote server
- The speed with which intermediate nodes and the remote server function
- The presence of interference in the circuit
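Because so many of these factors vary moment to moment, RTT is usually measured rather than calculated. As a minimal sketch, the round trip can be approximated by timing a TCP connection handshake to a remote host; this is the same idea tools like `ping` use, though `ping` sends ICMP echo requests instead. The host and port below are illustrative choices, not part of the article.

```python
import socket
import time

def measure_rtt(host: str, port: int = 443, timeout: float = 2.0) -> float:
    """Approximate the RTT by timing a TCP three-way handshake.

    This is a rough proxy: the result includes connection-setup
    overhead at both ends, not just propagation delay.
    """
    start = time.perf_counter()
    # create_connection blocks until the handshake completes or times out
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000.0  # milliseconds
```

A call such as `measure_rtt("example.com")` returns the handshake time in milliseconds; averaging several samples smooths out transient queuing delays at intermediate nodes.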
In a network, particularly a WAN (wide-area network) or the Internet, RTT is one of several factors affecting latency, which is the time between a request for data and the complete return or display of that data. The RTT can range from a few milliseconds (thousandths of a second) under ideal conditions between closely spaced points to several seconds under adverse conditions between points separated by a large distance.
The RTT has a theoretical minimum: it can never be less than the total time the signals spend propagating in or through the transmission media. In a satellite communications system this minimum can be considerable, because the RF (radio frequency) signals may have to propagate tens of thousands of kilometers through space between the Earth's surface and the satellite transponder.
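This propagation floor is easy to estimate. Taking a geostationary satellite as an example (altitude roughly 35,786 km), a request and its reply each make one up-link and one down-link hop, so the signal covers about four times the altitude at the speed of light:

```python
# Minimum RTT imposed by propagation delay alone, assuming a
# geostationary-satellite link (altitude figure is approximate).
SPEED_OF_LIGHT_KM_S = 299_792.458   # km/s, in vacuum
GEO_ALTITUDE_KM = 35_786            # approximate geostationary altitude

path_km = 4 * GEO_ALTITUDE_KM       # up + down, for both request and reply
min_rtt_s = path_km / SPEED_OF_LIGHT_KM_S
print(f"{min_rtt_s:.3f} s")         # roughly half a second
```

No amount of faster hardware at either end can reduce the RTT below this figure; only a shorter path (for example, a low-Earth-orbit constellation) can.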
In a radar system, the RTT is the length of time between the transmission of an RF pulse towards a target and the arrival of the returned echo from that target.
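Radar turns this relationship around: since the pulse covers the distance to the target twice, the target's range follows directly from the echo's RTT as range = c × RTT / 2. A minimal worked example:

```python
# Target range from a radar echo's round-trip time:
# the pulse travels out and back, so range = c * rtt / 2.
SPEED_OF_LIGHT_M_S = 299_792_458  # m/s

def range_from_echo(rtt_seconds: float) -> float:
    """Target range in metres for a given pulse round-trip time."""
    return SPEED_OF_LIGHT_M_S * rtt_seconds / 2

# An echo returning after 1 millisecond puts the target about 150 km away.
print(round(range_from_echo(1e-3) / 1000), "km")
```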