gigahertz (GHz)

The gigahertz, abbreviated GHz, is a unit of alternating current (AC) or electromagnetic (EM) wave frequency equal to one billion hertz (1,000,000,000 Hz). The gigahertz is used as an indicator of the frequency of ultra-high-frequency (UHF) and microwave EM signals, and also to express microprocessor clock speed in computers.
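As a quick illustration of what a gigahertz clock speed means (a minimal Python sketch, not part of the original definition; the function name is illustrative), the clock period is the reciprocal of the clock frequency, so a 1 GHz processor completes one clock cycle every nanosecond:

    # Minimal sketch: clock period (seconds per cycle) is the
    # reciprocal of clock frequency (cycles per second).
    def clock_period_seconds(frequency_hz):
        return 1.0 / frequency_hz

    print(clock_period_seconds(1e9))  # 1e-09 seconds, i.e. one nanosecond per cycle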

An EM signal having a frequency of 1 GHz has a wavelength of 300 millimeters, or a little less than a foot. An EM signal of 100 GHz has a wavelength of 3 millimeters, roughly 1/8 of an inch. Some radio transmissions are made at frequencies up to hundreds of gigahertz. Personal computer clock speeds have increased steadily as the technology advances; they reached the 1 GHz mark in March 2000 with a processor from AMD, followed closely by a 1 GHz Pentium III from Intel.
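Those wavelength figures follow from the relation wavelength = c / f, where c is the speed of light and f is the frequency. A minimal Python sketch (the function name is illustrative, not from the article) reproduces them:

    # Minimal sketch: wavelength in millimeters for a frequency in GHz,
    # using wavelength = speed of light / frequency.
    SPEED_OF_LIGHT_M_PER_S = 299_792_458  # speed of light in a vacuum

    def wavelength_mm(frequency_ghz):
        frequency_hz = frequency_ghz * 1e9
        return SPEED_OF_LIGHT_M_PER_S / frequency_hz * 1000  # meters to millimeters

    print(round(wavelength_mm(1)))    # ~300 mm at 1 GHz
    print(round(wavelength_mm(100)))  # ~3 mm at 100 GHz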

Other commonly used units of frequency are the kilohertz (kHz), equal to 1,000 Hz or 0.000001 GHz, and the megahertz (MHz), equal to 1,000,000 Hz or 0.001 GHz.
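Because these units differ only by powers of 1,000, converting between them is a single multiplication. A minimal sketch (the lookup table and function are illustrative, assumed for this example):

    # Minimal sketch: convert between hertz-based units via powers of 1,000.
    UNITS_IN_HZ = {"Hz": 1, "kHz": 1e3, "MHz": 1e6, "GHz": 1e9}

    def convert(value, from_unit, to_unit):
        return value * UNITS_IN_HZ[from_unit] / UNITS_IN_HZ[to_unit]

    print(convert(1, "kHz", "GHz"))  # 1e-06, i.e. 0.000001 GHz
    print(convert(1, "MHz", "GHz"))  # 0.001 GHz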

This was last updated in September 2005
