Megabyte

The megabyte is a multiple of the unit byte for digital information. Its recommended unit symbol is MB. The unit prefix mega is a multiplier of 1000000 (10⁶) in the International System of Units (SI).[1] Therefore, one megabyte is one million bytes of information. This definition has been incorporated into the International System of Quantities.

Multiples of bytes
Decimal
Value     Metric
1         B    byte
1000      kB   kilobyte
1000²     MB   megabyte
1000³     GB   gigabyte
1000⁴     TB   terabyte
1000⁵     PB   petabyte
1000⁶     EB   exabyte
1000⁷     ZB   zettabyte
1000⁸     YB   yottabyte

Binary
Value     IEC              JEDEC
1         B    byte        B    byte
1024      KiB  kibibyte    KB   kilobyte
1024²     MiB  mebibyte    MB   megabyte
1024³     GiB  gibibyte    GB   gigabyte
1024⁴     TiB  tebibyte    –
1024⁵     PiB  pebibyte    –
1024⁶     EiB  exbibyte    –
1024⁷     ZiB  zebibyte    –
1024⁸     YiB  yobibyte    –

However, in the computer and information technology fields, several other definitions are used that arose for historical reasons of convenience. A common usage has been to designate one megabyte as 1048576 bytes (2²⁰ B), a measurement that conveniently expresses the binary multiples inherent in digital computer memory architectures. Most standards bodies, however, have deprecated this usage in favor of a set of binary prefixes,[2] in which this quantity is designated by the unit mebibyte (MiB). Less common is a convention that uses the megabyte to mean 1000×1024 (1024000) bytes.[2]
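
The three conventions differ only in the multiplier applied to the byte. A minimal Python sketch, with purely illustrative constant names, makes the difference concrete:

    # The three historical interpretations of "one megabyte", in bytes.
    MEGABYTE_SI    = 1000 ** 2     # 1,000,000 bytes (decimal, SI/IEC definition)
    MEBIBYTE       = 1024 ** 2     # 1,048,576 bytes (binary usage, now called mebibyte)
    MEGABYTE_MIXED = 1000 * 1024   # 1,024,000 bytes (mixed convention, e.g. floppy disks)

    print(MEGABYTE_SI, MEBIBYTE, MEGABYTE_MIXED)   # 1000000 1048576 1024000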

Definitions

The megabyte is commonly used to measure either 1000² bytes or 1024² bytes. The base-1024 interpretation originated as technical jargon for byte multiples that needed to be expressed in powers of 2 but lacked a convenient name. As 1024 (2¹⁰) approximates 1000 (10³), roughly corresponding to the SI prefix kilo-, it was a convenient term to denote the binary multiple. In 1998 the International Electrotechnical Commission (IEC) proposed standards for binary prefixes, requiring the use of megabyte to strictly denote 1000² bytes and mebibyte to denote 1024² bytes. By the end of 2009, the IEC standard had been adopted by the IEEE, EU, ISO and NIST. Nevertheless, the term megabyte continues to be widely used with different meanings:

Base 10
1 MB = 1000000 bytes (= 1000² B = 10⁶ B) is the definition recommended by the International System of Units (SI) and the International Electrotechnical Commission (IEC).[2] This definition is used in networking contexts and for most storage media, particularly hard drives, flash-based storage,[3] and DVDs, and is also consistent with the other uses of the SI prefix in computing, such as CPU clock speeds or measures of performance. The file manager of Mac OS X 10.6 (Snow Leopard) is a notable example of this usage in software; since that release, file sizes are reported in decimal units.[4]

In this convention, one thousand megabytes (1000 MB) is equal to one gigabyte (1 GB), where 1 GB is one billion bytes.
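
As a worked example of the decimal convention (the drive size is hypothetical), a drive marketed as 500 GB holds 500 × 10⁹ bytes, which an operating system reporting in binary units displays as roughly 465.7 GiB:

    # A hypothetical 500 GB drive under the decimal (SI) definition.
    capacity_bytes = 500 * 10**9      # 500,000,000,000 bytes, as marketed
    print(capacity_bytes / 1000**3)   # 500.0 decimal gigabytes
    print(capacity_bytes / 1024**3)   # ~465.66 gibibytes (binary)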

Base 2
1 MB = 1048576 bytes (= 1024² B = 2²⁰ B) is the definition used by Microsoft Windows in reference to computer memory, such as RAM. This definition is synonymous with the unambiguous binary prefix mebibyte.

In this convention, one thousand and twenty-four megabytes (1024 MB) is equal to one gigabyte (1 GB), where 1 GB is 1024³ bytes.
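
A minimal sketch of the binary convention, using a hypothetical 512 MB RAM module:

    # Binary megabytes, as used for memory sizes such as RAM.
    MB_BINARY = 1024 ** 2             # 1,048,576 bytes (one mebibyte)
    ram_bytes = 512 * MB_BINARY       # a hypothetical 512 MB module
    print(ram_bytes)                  # 536870912
    print(ram_bytes / 1024**3)        # 0.5 -> exactly half of a binary gigabyte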

Mixed
1 MB = 1024000 bytes (= 1000×1024 B) is the definition used to describe the formatted capacity of the 1.44 MB 3.5-inch HD floppy disk, which actually has a capacity of 1,474,560 bytes.[5]
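
The floppy figure follows directly from the mixed multiplier; a short sketch, kept in integer arithmetic:

    # "1.44 MB" floppy capacity under the mixed convention (1 MB = 1000 × 1024 bytes).
    MB_MIXED = 1000 * 1024            # 1,024,000 bytes
    capacity = 1440 * 1024            # 1.44 × 1,024,000 = 1,474,560 bytes
    print(capacity)                   # 1474560
    print(capacity / 1000**2)         # ~1.47 decimal megabytes
    print(capacity / 1024**2)         # ~1.41 mebibytes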

Semiconductor memory doubles in size for each address lane added to an integrated circuit package, which favors counts that are powers of two. The capacity of a disk drive is the product of the sector size, the number of sectors per track, the number of tracks per side, and the number of disk platters in the drive. Changes in any of these factors would not usually double the size. Sector sizes were set as powers of two (most commonly 512 bytes or 4096 bytes) for convenience in processing. It was a natural extension to give the capacity of a disk drive in multiples of the sector size, giving a mix of decimal and binary multiples when expressing total disk capacity.
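
As an illustration, the standard geometry of the 3.5-inch HD floppy (512-byte sectors, 18 sectors per track, 80 tracks per side, one platter recorded on both sides) reproduces the 1,474,560-byte figure from exactly this kind of product:

    # Disk capacity as a product of geometry factors (3.5-inch HD floppy).
    sector_size       = 512           # bytes per sector (a power of two)
    sectors_per_track = 18
    tracks_per_side   = 80
    sides             = 2             # a single platter, recorded on both sides

    capacity = sector_size * sectors_per_track * tracks_per_side * sides
    print(capacity)                   # 1474560 -> marketed as "1.44 MB"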

Examples of use

1.44 MB floppy disks can store 1,474,560 bytes of data. MB in this context means 1,000×1,024 bytes.

Depending on compression methods and file format, a megabyte of data can roughly be any of the following (the arithmetic behind a few of these entries is sketched after the list):

  • a 1 megapixel bitmap image with 256 colors (8 bits/pixel color depth) stored without any compression.
  • a 4 megapixel JPEG image with normal compression.
  • approximately 1 minute of 128 kbit/s MP3 compressed music.
  • 6 seconds of uncompressed CD audio.
  • a typical English book volume in plain text format (500 pages × 2000 characters per page).
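
A rough back-of-the-envelope check of a few entries above, assuming 44.1 kHz, 16-bit stereo CD audio and a 128 kbit/s MP3 stream:

    MB = 10**6                             # one decimal megabyte

    # Uncompressed CD audio: 44,100 samples/s × 2 channels × 2 bytes per sample.
    cd_bytes_per_second = 44100 * 2 * 2    # 176,400 bytes/s
    print(MB / cd_bytes_per_second)        # ~5.7 s, roughly 6 seconds

    # 128 kbit/s MP3: 128,000 bits/s = 16,000 bytes/s.
    mp3_bytes_per_second = 128000 // 8     # 16,000 bytes/s
    print(MB / mp3_bytes_per_second)       # 62.5 s, about one minute

    # Plain-text book: 500 pages × 2,000 characters per page, one byte per character.
    print(500 * 2000)                      # 1,000,000 characters ~ 1 MB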

The human genome consists of DNA representing 800 MB of data. The parts that differentiate one person from another can be compressed to 4 MB.[6]

References

  1. "Archived copy". Archived from the original on June 7, 2007. Retrieved June 1, 2007.CS1 maint: archived copy as title (link)
  2. "Definitions of the SI units: The binary prefixes". National Institute of Standards and Technology.
  3. SanDisk USB Flash Drive. "Note: 1 megabyte (MB) = 1 million bytes; 1 gigabyte (GB) = 1 billion bytes."
  4. "How Mac OS X reports drive capacity". Apple Inc. 2009-08-27. Retrieved 2009-10-16.
  5. Tracing the History of the Computer - History of the Floppy Disk
  6. Christley, S.; Lu, Y.; Li, C.; Xie, X. (2008). "Human genomes as email attachments". Bioinformatics. 25 (2): 274–275. doi:10.1093/bioinformatics/btn582. PMID 18996942.