The symbols kB, MB and GB denote units of measurement for digital information. They usually appear when referring to the size of a digital file or the storage capacity of computers, cell phones and other computing devices.
However, kB, MB and GB are just three of these units. As the amount of information processed grows every day with the advance of technology, the units have been adapted to new sizes and uses.
Name | Symbol | Definition |
---|---|---|
bit | - | The basic unit of binary code (0 or 1) for storing computational information. |
byte | B / Byte | 8 bits |
kilobyte | kB / KByte | 1024 bytes |
megabyte | MB / MByte | 1024 kilobytes |
gigabyte | GB / GByte | 1024 megabytes |
terabyte | TB / TByte | 1024 gigabytes |
petabyte | PB / PByte | 1024 terabytes |
exabyte | EB / EByte | 1024 petabytes |
zettabyte | ZB / ZByte | 1024 exabytes |
yottabyte | YB / YByte | 1024 zettabytes |
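The 1024-based factors in the table can be applied directly in code. The sketch below (in Python, with hypothetical names such as `human_readable`) converts a raw byte count into a human-readable size by repeatedly dividing by 1024:

```python
# Units from the table above, each one 1024 times the previous one.
UNITS = ["B", "kB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB"]

def human_readable(n_bytes: int) -> str:
    """Divide by 1024 until the value fits the current unit."""
    value = float(n_bytes)
    for unit in UNITS:
        if value < 1024 or unit == UNITS[-1]:
            return f"{value:.1f} {unit}"
        value /= 1024
    return f"{value:.1f} {UNITS[-1]}"  # not reached

print(human_readable(500))      # 500.0 B
print(human_readable(1536))     # 1.5 kB
print(human_readable(1048576))  # 1.0 MB
```

For example, a 1,048,576-byte file is 1024 kilobytes, which the table maps to exactly 1 megabyte.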
## Computer measurement units
To understand computer measurement units in detail, bear in mind that computing is based on binary code, a mathematical system carried over into the transmission of information.
In this sense, computers use electrical impulses; each impulse forms a bit, which binary code represents as a state of 0 or 1.
The word bit comes from English and is an abbreviation of binary digit. It is important not to confuse a bit with a byte, which is formed by joining 8 bits: a bit is a single 0 or 1, and eight such 0s and 1s make up 1 byte.
By grouping 8 bits, 2⁸ = 256 different combinations of 0s and 1s are possible. That is enough to encode the letters, digits and symbols found on a computer keyboard, so each of those characters fits in 1 byte.
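The 256-combination count can be checked by enumerating every possible 8-bit pattern, as in this small Python sketch:

```python
from itertools import product

# Each of the 8 bit positions is either 0 or 1, so there are 2**8 patterns.
patterns = list(product([0, 1], repeat=8))
print(len(patterns))  # 256
```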
From there, larger units of measurement appear: one kilobyte (kB) is equivalent to 1024 bytes, one megabyte equals 1024 kilobytes, and one gigabyte is 1024 megabytes. With each new denomination, the amount of information being stored grows enormously.
Although the prefix kilo normally refers to 1,000 units (10³ = 1000), in computing it traditionally refers to 1024 units.
This is because the computational transmission of information is based on binary code, that is, on the number 2. Since 2¹⁰ = 1024, this value came to be used to define computer measurement units.
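The gap between the decimal prefix (10³) and the binary value (2¹⁰) is small for kilobytes but compounds at larger units, which a quick Python check makes visible:

```python
# Decimal kilo vs binary "kilo": close, but not equal.
print(10 ** 3)  # 1000
print(2 ** 10)  # 1024

# The difference compounds at larger units: 2**40 bytes (a binary
# terabyte) is roughly 10% more than 10**12 bytes (a decimal terabyte).
print((2 ** 40) / (10 ** 12))  # ≈ 1.0995
```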
See also the difference between:
- hardware and software
- Input and output devices