bit

A bit is the smallest unit of digital data.

“Bit” is short for “binary digit”. Binary means a bit can hold only one of two values, 0 or 1. These two values correspond to the electrical states off and on: a signal is either absent or present in the transistor of the memory cell where the data is stored on a computer memory chip.

Eight bits make up one byte. Thus, one byte of data is represented by an eight-digit string of binary code such as 10001011.
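As a minimal illustration in Python (assuming nothing beyond the standard library), the example byte 10001011 can be parsed as a base-2 number:

```python
# A minimal sketch: interpreting the example byte 10001011 as a number.
byte_string = "10001011"     # eight binary digits = one byte

value = int(byte_string, 2)  # parse the string as base 2
print(value)                 # 139, i.e. 128 + 8 + 2 + 1

# The same byte written directly as a Python binary literal:
assert value == 0b10001011
```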

Binary data is the most common form of digital data in modern information systems. Large amounts of binary data are quantified in the following units:

  • 1 kilobyte (kB) = one thousand bytes
  • 1 megabyte (MB) = one million bytes
  • 1 gigabyte (GB) = one billion bytes
  • 1 terabyte (TB) = one trillion bytes
  • 1 petabyte (PB) = one quadrillion bytes
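These decimal multipliers translate directly into code. A small Python sketch follows; the helper name human_readable is hypothetical, introduced here only for illustration:

```python
# A small sketch using the decimal (SI) prefixes listed above,
# where each step up multiplies by 1000.
UNITS = ["B", "kB", "MB", "GB", "TB", "PB"]

def human_readable(num_bytes: int) -> str:
    """Convert a raw byte count into the largest sensible decimal unit."""
    value = float(num_bytes)
    for unit in UNITS:
        if value < 1000 or unit == UNITS[-1]:
            return f"{value:.2f} {unit}"
        value /= 1000.0

print(human_readable(1_500_000))      # 1.50 MB
print(human_readable(3_000_000_000))  # 3.00 GB
```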

Because it uses only two symbols (0 and 1), binary code can be seen as the simplest way to represent data. All other representations, such as hexadecimal code or the English alphabet, can be expressed as, and converted into, binary code.
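A brief Python sketch of this idea, converting a hexadecimal value and a short English string into strings of binary digits (the example values are chosen here for illustration):

```python
# A minimal sketch: other representations reduce to binary.

# Hexadecimal to binary: 0x8B is the same byte as 10001011 above.
hex_value = "8B"
bits = format(int(hex_value, 16), "08b")  # parse base 16, format as 8 binary digits
print(bits)                               # 10001011

# English text to binary, via each character's one-byte ASCII value:
text = "Hi"
encoded = " ".join(format(b, "08b") for b in text.encode("ascii"))
print(encoded)                            # 01001000 01101001
```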

For the sake of simplicity, My Big TOE therefore assumes that the basic data structure within consciousness, represented by the concept of reality cells, is binary. Whether this is really the case is irrelevant to the validity of the model; reality cells are only a metaphor conveying the idea that consciousness is an information system.
