What Is Gigabyte


Video: What Is Gigabyte

Video: What Is A MB, GB, and TB? The Difference Between Megabytes, Gigabytes, and Terabytes! 2024, May
Anonymous

When civilization reached the point of processing data by computer, many new terms and concepts appeared. In particular, it became necessary to designate the units of information stored on storage devices and transmitted over networks. With the spread of personal computers, media players and mobile phones, many of these once highly specialized terms became widely known.


The smallest possible unit of information distinguishes between just two values: "yes" or "no", 0 or 1. Since 1948 this unit has been called a "bit". In computer processing, any information is broken down into bits: numbers, text, color, sound, spatial position and so on. If a processor handled data one bit at a time, the queue of operations would become far too long, limiting performance. Modern processors therefore work with groups of eight bits; such a group is called a "byte" and is considered the minimum unit of computer data handling. Grouped into bytes, information is stored on disks or in memory, and also transmitted over network connections.
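As a small illustration of the idea above (a Python sketch; the letter 'A' is just an arbitrary example value), the 8-bit pattern behind a single byte can be inspected directly:

```python
# Every byte is a group of 8 bits. Python can display the bit pattern
# behind a small value, e.g. the ASCII code of the letter 'A' (65).
value = ord("A")             # 65
bits = format(value, "08b")  # zero-padded 8-bit binary string
print(value, bits)           # 65 01000001
```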

The SI metric system, adopted today in most countries, fixes the rules by which units of measurement are scaled. To designate a value one thousand times larger than the base unit, the prefix "kilo" is added to its name: 1000 grams = 1 kilogram, and likewise 1000 bytes = 1 kilobyte. Analogous prefixes exist for each further step of a thousand: a million takes the prefix "mega" (1,000,000 bytes = 1,000 kilobytes = 1 megabyte) and a billion takes "giga". In this decimal reading, 1 gigabyte therefore corresponds to one billion of the minimum units of information, bytes.
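The decimal scaling described above can be written out directly (a minimal sketch; the constant names are ours, not standard identifiers):

```python
# Decimal (SI) prefixes scale by powers of 1000.
KILO = 10**3   # 1 kilobyte = 1 000 bytes
MEGA = 10**6   # 1 megabyte = 1 000 000 bytes = 1 000 kilobytes
GIGA = 10**9   # 1 gigabyte = 1 000 000 000 bytes

print(GIGA // MEGA)  # 1000 megabytes in one decimal gigabyte
print(GIGA // KILO)  # 1 000 000 kilobytes in one decimal gigabyte
```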

However, because computer information is binary in nature (yes/no, 0/1), computer science has, since the first processors, used the binary number system internally rather than the decimal system of SI. This often causes confusion over the exact definition of a gigabyte: in the binary interpretation the unit corresponds not to 10⁹ bytes (1 billion) but to 2³⁰ bytes (1,073,741,824). The discrepancy is most often encountered when buying storage devices (hard drives, flash drives, players and so on): manufacturers state capacity in decimal gigabytes, which shows the product in a more favorable light, while operating systems typically count in binary units and so report a smaller number.