In data communications, a gigabit is one billion bits, or 1,000,000,000 (that is, 10^9) bits. It's commonly used for measuring the amount of data that is transferred in a second between two telecommunication points. For example, Gigabit Ethernet is a high-speed form of Ethernet (a local area network technology) that can provide data transfer rates of about 1 gigabit per second. Gigabits per second is usually shortened to Gbps.
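To make the rate concrete, here is a minimal sketch of how long a transfer would take at Gigabit Ethernet's nominal 1 Gbps. The file size is a hypothetical example, and real-world throughput would be lower because of protocol overhead:

```python
# Minimal sketch: transfer time at a nominal 1 Gbps link speed.
# The 250 MB file size is an assumed example; protocol overhead is ignored.

GBPS = 1_000_000_000            # 1 gigabit per second, in bits per second

file_size_bytes = 250_000_000   # hypothetical 250 MB file
file_size_bits = file_size_bytes * 8   # 8 bits per byte

seconds = file_size_bits / GBPS
print(f"{file_size_bytes:,} bytes = {file_size_bits:,} bits")
print(f"At 1 Gbps, transfer takes about {seconds:.1f} seconds")
```

Note that network rates count bits, while file sizes are usually quoted in bytes, so the factor of 8 matters when estimating transfer times.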
Some sources define a gigabit to mean 1,073,741,824 (that is, 2^30) bits. Although the bit is a unit of the binary number system, bits in data communications are discrete signal pulses and have historically been counted using the decimal number system. For example, 28.8 kilobits per second (Kbps) is 28,800 bits per second. Because of computer architecture and memory address boundaries, bytes are always some power of two. See kilobyte, etc.
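The difference between the two definitions can be shown directly. This short sketch computes both values, how far apart they are, and the decimal interpretation of 28.8 Kbps mentioned above:

```python
# Decimal (SI) vs. binary interpretations of "gigabit".

DECIMAL_GIGABIT = 10**9   # 1,000,000,000 bits (data communications usage)
BINARY_GIGABIT = 2**30    # 1,073,741,824 bits (some sources)

# How much larger the binary definition is, as a percentage of the SI value
difference_pct = (BINARY_GIGABIT - DECIMAL_GIGABIT) / DECIMAL_GIGABIT * 100

# Data rates use the decimal kilo prefix: 28.8 Kbps is 28,800 bits per second
kbps_28_8 = 28.8 * 1000

print(f"Decimal gigabit: {DECIMAL_GIGABIT:,} bits")
print(f"Binary gigabit:  {BINARY_GIGABIT:,} bits ({difference_pct:.1f}% larger)")
print(f"28.8 Kbps = {kbps_28_8:,.0f} bits per second")
```

The roughly 7% gap between the two definitions is why it matters which convention a source is using.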
Continue Reading About gigabit
- Cary Lu's The Race for Bandwidth: Understanding Data Transmission is recommended. Note, however, that his definitions of kilobit, megabit, and gigabit use binary number system exponents and their decimal equivalents.