Timeout and RTT Estimation
Timeout: for robust detection of packet loss
Problem: how long should the timeout be?
- Too long => underutilization; too short => wasteful retransmissions
- Solution: adaptive timeout based on the measured RTT
RTT estimation:
- Early method: exponential averaging:
- R ← α*R + (1 - α)*M   {M = measured RTT}
- RTO = β*R   {β = delay variance factor}
- Suggested values: α = 0.9, β = 2
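To make the update rule concrete, here is a minimal Python sketch of the exponential-averaging estimator; the class name RttEstimator, the initial RTT value, and the sample measurements are illustrative assumptions, not part of the notes.

class RttEstimator:
    """Exponentially weighted RTT estimator with a multiplicative RTO."""

    def __init__(self, initial_rtt_ms, alpha=0.9, beta=2.0):
        self.alpha = alpha          # weight on the existing estimate R
        self.beta = beta            # delay variance factor
        self.r = initial_rtt_ms     # smoothed RTT estimate R

    def update(self, m_ms):
        # R <- alpha*R + (1 - alpha)*M : each new sample gets weight (1 - alpha)
        self.r = self.alpha * self.r + (1 - self.alpha) * m_ms
        return self.rto()

    def rto(self):
        # RTO = beta * R : the timeout is a fixed multiple of the smoothed RTT
        return self.beta * self.r

# Hypothetical sample measurements, in milliseconds:
est = RttEstimator(initial_rtt_ms=100.0)
for m in [120.0, 110.0, 300.0, 115.0]:
    rto = est.update(m)
    print(f"M = {m:5.1f} ms -> R = {est.r:6.2f} ms, RTO = {rto:6.2f} ms")

Note how the heavy weight on history (α = 0.9) means the single 300 ms spike moves R only modestly; this smoothing makes the timeout robust to one-off delays, at the cost of reacting slowly to genuine RTT changes.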