What is a Bit?

Bits in a computer processor

Early processors, such as the 8088 and 80286, were 16-bit processors, able to work with 16-bit binary numbers. Later, 32-bit processors were introduced to work with 32-bit binary numbers. Nowadays, most computers come with 64-bit processors that can work with 64-bit binary numbers.
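To give a feel for what these widths mean in practice, here is a small sketch (not tied to any particular processor) showing the largest unsigned value that fits in 16, 32, and 64 bits:

```python
# An n-bit binary number can represent 2**n distinct values,
# so the largest unsigned value it can hold is 2**n - 1.
for n in (16, 32, 64):
    print(f"{n}-bit max unsigned value: {2**n - 1}")
```

Running this prints 65535 for 16 bits, 4294967295 for 32 bits, and 18446744073709551615 for 64 bits, which is why wider processors can handle much larger numbers in a single operation.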

History of the bit

The use of discrete bits to encode data in punched cards was invented by Basile Bouchon and Jean-Baptiste Falcon in 1732, and developed further by Joseph Marie Jacquard in 1804. It was later adopted by Charles Babbage, Semyon Korsakov, Hermann Hollerith, and early computer manufacturers such as IBM. Perforated paper tape was another variation of the same concept. In all of those systems, the medium (the card or tape) conceptually carried a collection of hole positions; each position could either be punched through or not, and therefore carried one bit of information. The encoding of text by bits was used in Morse code from 1844, and from 1870 in early digital communications machines such as teletypes and stock tickers.

In 1928, Ralph Hartley suggested a logarithmic measure of information and described how to use it. The word "bit" was first used by Claude E. Shannon in 1948 in his seminal paper "A Mathematical Theory of Communication". He credited its origin to John W. Tukey, who in a Bell Labs memo written on 9 January 1947 had contracted "binary information digit" to "bit". Earlier, in 1936, Vannevar Bush had written of "bits of information" that could be stored on the punched cards used by the mechanical computers of that time.
