By using all eight bits, EBCDIC may have encouraged IBM's adoption of the eight-bit byte, while ASCII was more likely to be adopted by systems with 36-bit words, since five seven-bit ASCII characters fit into one word. Source: Internet
IN and OUT tokens contain a seven-bit device number and a four-bit function number (for multifunction devices) and command the device to transmit DATAx packets or to receive the following DATAx packets, respectively. Source: Internet
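As a rough sketch of those field widths only (not the real USB wire format, which also specifies a PID, least-significant-bit-first transmission, and a five-bit CRC over these fields), a hypothetical helper could pack the seven-bit address and four-bit function number into one eleven-bit value:

```python
def pack_token_fields(address: int, endpoint: int) -> int:
    """Pack a 7-bit device address and a 4-bit endpoint/function number.

    Illustrative only: real USB tokens also carry a PID and a 5-bit CRC,
    and bits are transmitted least-significant-bit first.
    """
    if not 0 <= address < 0x80:
        raise ValueError("device address must fit in 7 bits")
    if not 0 <= endpoint < 0x10:
        raise ValueError("endpoint/function number must fit in 4 bits")
    # 11-bit result: endpoint/function number in the upper 4 bits,
    # device address in the lower 7 bits.
    return (endpoint << 7) | address
```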
This meant that, while the IBM 1401 had a seven-bit word, almost no one ever thought to use this as a feature and override the assignment of the seventh bit to, for example, handle ASCII codes. Source: Internet
However, it was necessary to limit message length to 128 bytes (later improved to 160 seven-bit characters) so that messages could fit into the existing signalling formats. Source: Internet
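The arithmetic behind the 160-character figure is that 160 × 7 = 1120 bits, exactly the 140 octets of payload available (140 × 8 = 1120). The sketch below packs seven-bit characters into octets least-significant-bit first, in the style of GSM-type encodings; the real alphabet mapping and padding rules are omitted, and pack_septets is a hypothetical name:

```python
def pack_septets(septets):
    """Pack a sequence of 7-bit values into bytes, LSB first (sketch only)."""
    acc, nbits, out = 0, 0, bytearray()
    for s in septets:
        acc |= (s & 0x7F) << nbits   # append 7 new bits above those held so far
        nbits += 7
        while nbits >= 8:            # emit full octets as they become available
            out.append(acc & 0xFF)
            acc >>= 8
            nbits -= 8
    if nbits:                        # flush any leftover bits
        out.append(acc & 0xFF)
    return bytes(out)

# 160 seven-bit characters occupy exactly 140 bytes.
assert len(pack_septets([0x41] * 160)) == 140
```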
In a seven-bit message, there are seven possible single-bit errors, so three error control bits could potentially specify not only that an error occurred but also which bit was in error: the eight values of three bits cover the no-error case plus the seven bit positions. Source: Internet
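One concrete scheme that realizes this counting argument is the classic Hamming(7,4) layout, in which the three parity bits sit at positions 1, 2 and 4 and are chosen so that the XOR of the positions of all set bits is zero; the sentence above does not name a specific code, so the syndrome helper below is only an illustrative sketch:

```python
def syndrome(codeword):
    """Return the 3-bit syndrome of a 7-bit Hamming codeword.

    codeword[0..6] holds bit positions 1..7. A result of 0 means no
    single-bit error; a nonzero result names the flipped position.
    """
    s = 0
    for pos, bit in enumerate(codeword, start=1):
        if bit:
            s ^= pos
    return s

word = [0, 0, 0, 0, 0, 0, 0]   # the all-zero word is a valid codeword
word[4] ^= 1                    # flip position 5
assert syndrome(word) == 5      # the syndrome names the erroneous bit
```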
The committee voted to use a seven-bit code to minimize costs associated with data transmission. Source: Internet