Definition

gigabit

In data communications, a gigabit is one billion bits, or 1,000,000,000 (that is, 10^9) bits. It's commonly used for measuring the amount of data that is transferred in a second between two telecommunication points. For example, Gigabit Ethernet is a high-speed form of Ethernet (a local area network technology) that can provide data transfer rates of about 1 gigabit per second. Gigabits per second is usually shortened to Gbps.
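As a quick illustration of the figures above, the following sketch computes the ideal time to move a file over a 1 Gbps link. The function name and the file size are illustrative assumptions, and real transfers are slower because of protocol overhead:

```python
GIGABIT = 10**9  # bits; the decimal definition used in data communications

def transfer_seconds(file_bytes: int, rate_bps: int = GIGABIT) -> float:
    """Ideal transfer time in seconds, ignoring protocol overhead."""
    return (file_bytes * 8) / rate_bps

# A 1 GB (10**9-byte) file over Gigabit Ethernet at line rate:
print(transfer_seconds(10**9))  # 8.0 seconds
```

The factor of 8 converts bytes to bits, since link speeds are quoted in bits per second while file sizes are quoted in bytes.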

Some sources define a gigabit to mean 1,073,741,824 (that is, 2^30) bits. Although the bit is a unit of the binary number system, bits in data communications are discrete signal pulses and have historically been counted using the decimal number system. For example, 28.8 kilobits per second (Kbps) is 28,800 bits per second. Because of computer architecture and memory address boundaries, bytes are always expressed in powers of two. See kilobyte, etc.
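The gap between the two definitions is easy to quantify; a minimal sketch (variable names are illustrative):

```python
decimal_gigabit = 10**9  # 1,000,000,000 bits; the usual data-communications definition
binary_gigabit = 2**30   # 1,073,741,824 bits; the alternative binary definition

# The binary gigabit is about 7.4% larger than the decimal one.
difference = binary_gigabit - decimal_gigabit
print(difference)  # 73741824
```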

This was last updated in August 2006
