1) In computer technology, throughput is the amount of work that a computer can do in a given time period. Historically, throughput has been a measure of the comparative effectiveness of large commercial computers that run many programs concurrently. An early throughput measure was the number of batch jobs completed in a day. More recent measures assume a more complicated mixture of work or focus on some particular aspect of computer operation. While "cost per million instructions per second (MIPS)" provides a basis for comparing the cost of raw computing over time or by manufacturer, throughput theoretically tells you how much useful work the MIPS are producing.
Another measure of computer productivity is performance: the speed at which one or more batch programs run under a given workload, or how many interactive user requests are handled and how responsively. The time between entering a single interactive user request and receiving the application's response is known as response time.
A benchmark can be used to measure throughput.
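As an illustration only, the sketch below times a fixed measurement window and reports completed work items per second. The process_job() workload and the five-second window are assumptions for the example, not part of any standard benchmark.

```python
import time

def process_job():
    # Placeholder workload; a real benchmark would run a representative job mix.
    sum(i * i for i in range(10_000))

def measure_throughput(duration_seconds=5.0):
    # Run jobs back to back for the measurement window and count completions.
    completed = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration_seconds:
        process_job()
        completed += 1
    elapsed = time.perf_counter() - start
    return completed / elapsed  # jobs completed per second

if __name__ == "__main__":
    print(f"Throughput: {measure_throughput():.1f} jobs/second")
```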
2) In data transmission, throughput is the amount of data moved successfully from one place to another in a given time period.
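In this sense, throughput is simply the data moved successfully divided by the elapsed time. A minimal sketch of the calculation follows; the figures used (125 MB transferred in 10 seconds) are hypothetical.

```python
def throughput_mbps(bytes_transferred: int, elapsed_seconds: float) -> float:
    # Convert bytes to bits, then divide by elapsed time to get bits per second,
    # and scale to megabits per second.
    bits = bytes_transferred * 8
    return bits / elapsed_seconds / 1_000_000

print(f"{throughput_mbps(125_000_000, 10.0):.1f} Mbit/s")  # 100.0 Mbit/s
```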