Modem

This is an old revision of this page, as edited by 195.194.18.253 at 12:43, 10 January 2006 (Broadband).

A modem (a portmanteau word constructed from modulator and demodulator) is a device that modulates a carrier signal to encode digital information, and also demodulates such a carrier signal to decode the transmitted information. The goal is to produce a signal that can be transmitted easily and decoded to reproduce the original digital data.

The most familiar example of a modem turns the digital 1s and 0s of a personal computer into sounds that can be transmitted over the telephone lines of the Plain Old Telephone Service (POTS); once received on the other side, another modem converts those sounds back into 1s and 0s.

Far more exotic modems are also in everyday use carrying internet traffic. In telecommunications, "radio modems" transmit repeating frames of data at very high data rates over microwave radio links; some microwave modems transmit more than a hundred million bits per second.

Optical modems transmit data over optical fibers. Most intercontinental data links now use optical modems transmitting over undersea optical fibers. Optical modems usually use interferometric filters called etalons to separate the different colors of light, and then individually turn the pulses of each color into electronic digital data streams. Optical modems routinely have data rates in excess of a billion (10^9) bits per second. Their bandwidths are currently limited by the thermal expansion of the etalons: heat changes an etalon's size, and thus its frequency.

Modems can be used over any means of transmitting analog signals, from driven diodes to radio.

History

Modems were first introduced as a part of the SAGE air-defense system in the 1950s, connecting terminals located at various airbases, radar sites and command-and-control centers to the SAGE director centers scattered around the US and Canada. SAGE ran on dedicated communications lines, but the devices at either end were otherwise similar in concept to today's modems. IBM was the primary contractor for both the computers and the modems used in the SAGE system. A few years later a chance meeting between the CEO of American Airlines and a regional manager of IBM led to a "mini-SAGE" being developed as an automated airline ticketing system. In this case the terminals were located at ticketing offices, tied to a central computer that managed availability and scheduling. The system, known as Sabre, is the distant parent of today's SABRE system.

By the early 1960s commercial computer use had bloomed, due in no small part to the developments above, and in 1962 AT&T released the first commercial modem, the Bell 103. Using frequency-shift keying, where two tones are used to represent the 1's and 0's of digital data, the 103 had a transmission rate of 300 bit/s. Only a short time later they released the Bell 212 modem, switching to the more reliable phase-shift keying system and increasing the data rate to 1200 bit/s. The similar Bell 201 system used both sets of signals (send and receive) on 4-wire leased lines for 2400 bit/s operation.

The next major advance in modems was the Hayes Smartmodem, introduced in 1981 by Hayes Communications. The Smartmodem was a simple 300 bit/s modem using the Bell 103 signaling standards, but attached to a small controller that let the computer send commands to it to operate the phone line. The basic Hayes command set remains the basis for computer control of most modern modems.

Part of the reason for the advance to line-connected modems like the Smartmodem, rather than acoustically-coupled modems, was internationally widespread deregulation of telephone companies. At one time, any equipment electrically connected to most telephone lines had to belong to the telephone company. The acoustic coupler, despite inherent speed and reliability limitations, would allow a user's computer to communicate through a telephone which was rented from the telephone company.

Acoustically coupled modem

Prior to the Smartmodem, modems almost universally required a two-step process to activate a connection: first, manually dial the remote number on a standard phone handset, then plug the handset into a modem-attached acoustic coupler, a device with two rubber cups for the handset that converted between the audio signals and the electrical modem signals. With the Smartmodem, the acoustic coupler was eliminated by plugging the modem directly into a modular phone set or wall jack, and the computer was "smart" enough to bypass the phone and dial the number directly. These changes greatly simplified installation and operation of bulletin board systems (BBS).

Modems stayed at about these rates into the 1980s. A 2400 bit/s system very similar to the Bell 212 signalling was introduced in the US, and a slightly different, and incompatible, one in Europe. By the late 1980s most modems could support all of these standards, and 2400 bit/s was becoming common. A huge number of other standards were also introduced for special-purpose situations, commonly using a high-speed channel for receiving, and a lower-speed channel for sending. One typical example was used in the French Minitel system, where the user's terminals spent the majority of their time receiving information. The modem in the Minitel terminal thus operated at 1200 bit/s for reception, and 75 bit/s for sending commands back to the servers.

These sorts of solutions were useful in a number of situations where one side would be sending more data than the other. In addition to "medium-speed" standards like Minitel's, four US companies became famous for high-speed versions of the same concept: Microcom Systems introduced their MNP, Hayes their Ping Pong, USR had their HST protocol, and Telebit used software to increase performance. In all of these cases the high-speed line was set to 9600 bit/s, and the low-speed line to between 75 and 300 bit/s. Each company carved out a niche in the market: Telebit was huge in the universities due to its direct support of UUCP protocols in the modem itself, Microcom became common in commercial settings, and USR was huge among BBS operators (as they could download Fidonet messages more quickly), but the Hayes standard never caught on. In all of these cases there was a well-defined high-speed and low-speed direction, but such a split was not so obvious for users who were uploading and downloading files in the same session, and these solutions were rarely used by them.

Operations at these speeds pushed the limits of the phone lines and would have been very error-prone. This led to the introduction of error-correction systems built into the modems, made most famous by Microcom's MNP systems. A string of MNP standards came out in the 1980s, each reducing the error-correction overhead, from about 25% of throughput in MNP 1 down to 5% in MNP 4. MNP 5 took this a step further, adding compression to the system and thereby actually increasing the effective data rate: in general use, the user could expect an MNP modem to transfer at about 1.3 times the nominal data rate of the modem. MNP was later "opened" and became popular on a series of 2400 bit/s modems, although it never became universal.
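The effect of compression on throughput can be illustrated with a toy run-length encoder. This is not the actual MNP 5 algorithm, only a sketch of the underlying idea: redundant data can be carried in fewer bytes, so the effective data rate rises above the line rate.

```python
def rle_encode(data: bytes) -> bytes:
    """Collapse runs of repeated bytes into (count, value) pairs."""
    out = bytearray()
    i = 0
    while i < len(data):
        run = 1
        while i + run < len(data) and data[i + run] == data[i] and run < 255:
            run += 1
        out += bytes([run, data[i]])
        i += run
    return bytes(out)

def rle_decode(data: bytes) -> bytes:
    """Expand (count, value) pairs back into the original bytes."""
    out = bytearray()
    for i in range(0, len(data), 2):
        out += bytes([data[i + 1]]) * data[i]
    return bytes(out)

payload = b"AAAAAABBBCCCCCCCCCC"
packed = rle_encode(payload)
assert rle_decode(packed) == payload
assert len(packed) < len(payload)   # fewer bytes on the wire
```

Compression ratios vary with the data: repetitive text compresses well, while already-compressed files may not shrink at all, which is why figures like "1.3 times the nominal rate" are averages.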

Another common feature of these high-speed modems was the concept of fallback, allowing them to talk to less-capable modems. During the call initiation the modem would play a series of signals into the line and wait for the remote modem to "answer" them. They would start at high speeds and progressively get slower and slower until they heard an answer. Thus two USR modems would be able to connect at 9600 bit/s, but when another user with a 2400 bit/s modem called in, the USR would "fall back" to the common 2400 bit/s speed. Without such a system the operator would be forced to have multiple phone lines for high and low speed use.
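The fallback behaviour described above can be sketched as a search for the fastest rate both ends support. This is a simplified model: real modems negotiate by exchanging answer tones, not rate sets.

```python
def negotiate(caller_rates, answerer_rates):
    """Try the caller's rates from fastest to slowest until the
    answering modem supports one -- a simplified model of fallback."""
    for rate in sorted(caller_rates, reverse=True):
        if rate in answerer_rates:
            return rate
    return None

usr = {300, 1200, 2400, 9600}     # a high-speed modem's repertoire
cheap = {300, 1200, 2400}         # a basic 2400 bit/s modem
assert negotiate(usr, usr) == 9600
assert negotiate(usr, cheap) == 2400
```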

Long haul modems

In the 1960s, Bell began to digitize the telephone system, and developed early high-speed radio modems for this purpose. Once digital long-haul networks were in place, they were leased for every other purpose.

Optic fiber manufacturing was perfected in the 1980s, and optic modems were first invented for these early systems. The first systems simply used light emitting diodes and PIN diodes. Faster modulation was quickly adopted for long-haul networks. In the 1990s, multispectral optical modems were adopted as well.

Echo cancellation

Echo cancellation was the next major advance in modem design. Normally the phone system sends a small amount of the outgoing signal, called sidetone, back to the earphone, in order to give the user some feedback that their voice is indeed being sent. However, this same signal can confuse the modem: is the signal it is "hearing" the one from the remote modem, or its own signal being sent back to itself? This was the reason for splitting the signal frequencies into answer and originate: if a modem received a signal on its own frequency set, it simply ignored it. Even with improvements to the phone system allowing for higher speeds, this splitting of the available phone signal bandwidth still imposed a half-speed limit on modems.

Echo cancellation was a way around this problem. By accounting for the slight delay the phone system introduces, a modem could tell whether a received signal was its own transmission reflected back or the remote modem. Once this was possible, modems were able to send at "full speed" in both directions at the same time, opening the market to a slew of 9600 bit/s bidirectional modems in the late 1980s. These early systems were not very popular due to their price, but by the early 1990s prices started falling. The "breaking point" occurred with the introduction of the SupraFax 14400 in 1991, which cost the same as a 2400 bit/s modem from a year or two earlier (about US$300), but ran at the latest 14,400 bit/s rate (14.4 kbit/s) and also included fax capability. Over the next few years the speed increased to 28.8 kbit/s, then to 33.6 kbit/s, along with a slew of one-off non-standards like AT&T's 19.2 kbit/s system.
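The core of an echo canceller is an adaptive filter that learns the echo path and subtracts an estimate of the modem's own signal from what it receives. A minimal sketch using the LMS (least mean squares) algorithm; the tap counts, step size, and signal lengths here are illustrative choices, not values from any real modem:

```python
import random

def lms_echo_cancel(tx, rx, taps=4, mu=0.05):
    """Adapt an FIR filter to model the echo path, then subtract the
    estimated echo of our own transmitted signal from what we receive."""
    w = [0.0] * taps                  # estimated echo-path coefficients
    residual = []
    for n in range(len(rx)):
        # the last few transmitted samples, as seen by the echo path
        x = [tx[n - k] if n - k >= 0 else 0.0 for k in range(taps)]
        echo_est = sum(wi * xi for wi, xi in zip(w, x))
        e = rx[n] - echo_est          # what is left after cancellation
        residual.append(e)
        for k in range(taps):         # LMS update: nudge taps toward the echo
            w[k] += mu * e * x[k]
    return residual

# Demo: rx is purely an echo of tx through an unknown two-tap path.
random.seed(1)
tx = [random.uniform(-1, 1) for _ in range(3000)]
rx = [0.5 * tx[n] + (0.3 * tx[n - 1] if n else 0.0) for n in range(3000)]
residual = lms_echo_cancel(tx, rx)
# After adaptation, the residual echo is essentially gone.
assert max(abs(e) for e in residual[-100:]) < 0.05
```

In a real modem the residual, once the echo is removed, is exactly the remote modem's signal, which is what makes simultaneous bidirectional transmission on one frequency band possible.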

The last major advance in modem design was the 56 kbit/s standard, introduced in the late 1990s. This standard is similar to the earlier high-speed/low-speed systems rejected by users in the 1980s, but with the increasing use of the internet, which is largely "read only", the small sacrifice for higher speeds made sense once again.


Narrowband

28.8kbps serial-port modem from Motorola

A standard modem of today is what would have been called a "smart modem" in the 1980s. It contains two functional parts: an analog section for generating the signals and operating the phone, and a digital section for setup and control. In practice this functionality is incorporated into a single chip, but the division remains conceptually.

In operation the modem can be in one of two "modes": data mode, in which data is sent to and from the computer over the phone lines, and command mode, in which the modem listens to the data from the computer for commands and carries them out. A typical session consists of powering up the modem (often inside the computer itself), which automatically assumes command mode, then sending it the command for dialing a number. After the connection is established to the remote modem, the modem automatically goes into data mode and the user can send and receive data. When the user is finished, the escape sequence, "+++" followed by a pause of about a second, is sent to the modem to return it to command mode, and the command to hang up the phone is sent. One problem with this method of operation is that the modem cannot always tell whether a string is a command or data, and odd behavior can result when it guesses wrong.
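The guard-time logic around the escape sequence can be sketched as follows. This is a simplified model: real modems use a configurable guard time (and some use other schemes entirely), so treat the details as illustrative.

```python
def escape_detected(events, guard=1.0):
    """events: (timestamp_seconds, char) pairs as sent by the computer.
    True if '+++' appears with at least `guard` seconds of silence on
    both sides -- a simplified model of the Hayes escape sequence."""
    for i in range(len(events) - 2):
        (t0, b0), (t1, b1), (t2, b2) = events[i:i + 3]
        if (b0, b1, b2) != ("+", "+", "+"):
            continue
        quiet_before = i == 0 or t0 - events[i - 1][0] >= guard
        quiet_after = i + 3 == len(events) or events[i + 3][0] - t2 >= guard
        if quiet_before and quiet_after and t2 - t0 < guard:
            return True
    return False

# '+++' surrounded by silence switches to command mode...
assert escape_detected([(0.0, "d"), (2.0, "+"), (2.01, "+"), (2.02, "+")])
# ...but '+++' buried in a continuous data stream does not.
assert not escape_detected([(0.0, "a"), (0.01, "+"), (0.02, "+"),
                            (0.03, "+"), (0.04, "b")])
```

The guard time is exactly what prevents a file that happens to contain "+++" from hanging up the connection, since file data arrives with no surrounding pause.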

The commands themselves are typically from the Hayes command set, although that term is somewhat misleading. The original Hayes commands were useful only for 300 bit/s operation, and were then extended for their 1200 bit/s modems. Hayes was much slower to upgrade to faster speeds, however, leading to a proliferation of command sets in the early 1990s as each of the high-speed vendors introduced their own command styles. Things became considerably more standardized in the second half of the 1990s, when most modems were built from one of a very small number of "chip sets", invariably supporting a rapidly converging command set. We call this the Hayes command set even today, although it has three or four times the number of commands of the original.

The 300 bit/s modems used frequency-shift keying to send data. In this system the stream of 1's and 0's in computer data is translated into sounds which can easily be sent over the phone lines. In the Bell 103 system the originating modem sends 0's with a 1070 Hz tone and 1's at 1270 Hz, while the answering modem puts its 0's on 2025 Hz and 1's on 2225 Hz. These frequencies were chosen carefully: they are in the range that suffers minimum distortion on the phone system, and they are not harmonics of each other. In early systems the choice of answer or originate was selected by a switch on the front of the modem, but as time went on the Smartmodems would assume originate if asked to dial, and answer if asked to answer the phone.
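The originate-side signalling can be sketched as follows. The tone frequencies are the Bell 103 values given above; the sample rate and samples-per-bit are illustrative implementation choices, not part of the standard.

```python
import math

RATE = 8000                      # audio samples per second (illustrative)
SAMPLES_PER_BIT = RATE // 300    # roughly one bit time at 300 baud
MARK, SPACE = 1270, 1070         # originate-side tones: 1 = mark, 0 = space

def fsk_modulate(bits):
    """Turn a string of '0'/'1' into a phase-continuous audio waveform,
    one tone per bit -- frequency-shift keying."""
    samples, phase = [], 0.0
    for bit in bits:
        freq = MARK if bit == "1" else SPACE
        for _ in range(SAMPLES_PER_BIT):
            samples.append(math.sin(phase))
            phase += 2 * math.pi * freq / RATE
    return samples

# A run of 1's should come out as (roughly) a 1270 Hz tone; estimate the
# frequency by counting zero crossings of the waveform.
wave = fsk_modulate("1" * 30)
crossings = sum(1 for a, b in zip(wave, wave[1:]) if a * b < 0)
assert abs(crossings * RATE / (2 * len(wave)) - 1270) < 25
```

A receiving modem does the reverse, deciding bit by bit which of the two tones is present; keeping the phase continuous across bit boundaries (as above) avoids clicks that would smear energy across the band.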

In the 1200 bit/s and faster systems, phase-shift keying was used. In this system the two tones for any one side of the connection are sent at similar frequencies to those in the 300 bit/s systems, but slightly out of phase. By comparing the phase of the two signals, 1's and 0's could be recovered: for instance, if the signals were 90 degrees out of phase, this represented the two digits "1, 0"; at 180 degrees it was "1, 1". In this way each cycle of the signal represents two digits instead of one; 1200 bit/s modems were, in effect, 600 baud modems with "tricky" signalling.

It was at this point that the difference between baud and bits per second became real. Baud refers to the signalling rate of a system: in a 300 bit/s modem the signals carried one bit each, so the data rate and signalling rate were the same. In the 1200 bit/s systems this was no longer true; the modems were actually 600 baud. This led to a series of flame wars on the BBSes of the 1980s.
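The distinction is plain arithmetic: the bit rate is the signalling rate (baud) times the number of bits carried per signal.

```python
def bit_rate(baud, bits_per_symbol):
    """Data rate in bit/s: symbols per second times bits per symbol."""
    return baud * bits_per_symbol

assert bit_rate(300, 1) == 300    # Bell 103: one bit per signal
assert bit_rate(600, 2) == 1200   # a "1200 bit/s" PSK modem is 600 baud
```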

Increases in speed have since used increasingly complicated communications theory. The Bell 208 introduced the 8-phase-shift-keying concept, which could transmit three bits per signalling instance (baud). The next major advance was introduced by the Codex Corporation in the late 1960s. Here the bits were encoded into a combination of amplitude and phase, best visualized as a two-dimensional "eye pattern": the bits are mapped onto points on a graph with the x (real) and y (quadrature) coordinates transmitted over a single carrier. This technique became very effective and was incorporated into an international standard, V.29, by the CCITT (now ITU) arm of the United Nations. The standard transmitted 4 bits per symbol at a signalling rate of 2400 baud, giving an effective bit rate of 9600 bits per second. For many years, most considered this rate to be the limit of data communications over telephone networks.
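The idea of mapping bit groups to amplitude-and-phase points can be sketched with a square 16-point constellation. This is an illustration of the technique only; V.29's actual constellation layout and bit mapping differ.

```python
# Four amplitude levels per axis give a 4 x 4 grid of 16 (I, Q) points,
# so each point carries 4 bits.
LEVELS = [-3, -1, 1, 3]

def qam16_map(bits):
    """Map a string of '0'/'1' (length a multiple of 4) to (I, Q) points."""
    points = []
    for i in range(0, len(bits), 4):
        x = LEVELS[int(bits[i:i + 2], 2)]      # first two bits -> I axis
        y = LEVELS[int(bits[i + 2:i + 4], 2)]  # last two bits -> Q axis
        points.append((x, y))
    return points

# Every 4-bit pattern lands on its own constellation point.
all_points = qam16_map("".join(f"{n:04b}" for n in range(16)))
assert len(set(all_points)) == 16
# 4 bits per symbol at 2400 symbols per second gives 9600 bit/s.
assert 4 * 2400 == 9600
```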

In 1980, Gottfried Ungerboeck of IBM applied powerful channel-coding techniques to the search for new ways to increase the speed of modems. His results were astonishing but at first conveyed only to a few colleagues. Finally, in 1982, he agreed to publish what is now a landmark paper in the theory of information coding. By applying powerful parity-check coding to the bits in each symbol, and mapping the encoded bits onto a two-dimensional "eye pattern", Ungerboeck showed that it was possible to double the speed with the same error rate. The new technique was called mapping by set partitions. This view was an extension of the "penny packing" problem and the related, more general problem of how to pack points into an N-dimensional sphere such that they are far from their neighbors (so that noise cannot confuse the receiver).
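The benefit of set partitioning can be checked numerically: splitting a square constellation into a diagonal "checkerboard" coset doubles the squared minimum distance between points, which is the extra margin against noise that the added code bits buy. A minimal sketch of one partitioning step:

```python
from itertools import combinations, product

def min_dist_sq(points):
    """Squared distance between the closest pair of constellation points."""
    return min((ax - bx) ** 2 + (ay - by) ** 2
               for (ax, ay), (bx, by) in combinations(points, 2))

# A 16-point square constellation with a spacing of 2 between neighbors.
full = [(x, y) for x, y in product([-3, -1, 1, 3], repeat=2)]
# One level of set partitioning: keep one diagonal "checkerboard" coset.
subset = [(x, y) for (x, y) in full if (x + y) % 4 == 0]

assert len(subset) == 8
assert min_dist_sq(full) == 4
assert min_dist_sq(subset) == 8   # the squared minimum distance doubles
```

Repeating the split gives the nested subsets that trellis-coded modulation assigns to coded bit patterns, so that the most easily confused points are protected by the code.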

The industry was galvanized into new research and development. More powerful coding techniques were developed, commercial firms rolled out new product lines, and the standards organizations rapidly adopted the new technology. Today the ITU standard V.34 represents the culmination of these joint efforts. It employs the most powerful coding techniques, including channel encoding and shape encoding. From a mere 16 points per symbol, V.34 uses over 1000 points and very sophisticated algorithms to achieve 33.6 kbit/s.

In the late 1990s, Rockwell and U.S. Robotics introduced new technology based on the digital transmission used in modern telephony networks. The standard digital transmission in modern networks is 64 kbit/s, but some networks use part of the bandwidth for remote-office signalling (e.g. hanging up the phone), so the effective rate is the 56 kbit/s of a DS0. This rate is possible only from the central office to the user site (the downlink); the uplink (from the user to the central office) still uses V.34 technology. This technology was adopted into the ITU standards V.90 and V.92, which are common in modern computers.


This rate is thought to be near the theoretical Shannon limit for the telephone channel. Higher speeds are possible, but may owe more to improvements in the underlying phone system than to the technology of the modems themselves.

Software is as important to the operation of the modem today as the hardware. Even with the improvements in the performance of the phone system, modems still lose a considerable amount of data due to noise on the line. The MNP standards were originally created to automatically fix these errors, and were later expanded to compress the data at the same time. Today's V.42 and V.42bis fill these roles in the vast majority of modems, and although later MNP standards were released, they are not common.

With such systems it is possible for the modem to transmit data faster than its basic rate would imply. For instance, a 2400 bit/s modem with V.42bis can transmit up to 9600 bit/s, at least in theory. One problem is that the compression ratio varies over time: at some points the modem will be sending the data at 4000 bit/s, at others at 9000 bit/s. In such situations it becomes necessary to use hardware flow control, extra pins on the modem-computer connection that allow the two to signal about data flow. The computer is then set to supply the modem at some higher rate (in this example 9600 bit/s), and the modem tells the computer to stop sending if it cannot keep up. A small amount of memory in the modem, a buffer, holds the data while it is being sent.
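Hardware flow control can be modelled as a bounded buffer guarded by a clear-to-send flag. The class name, buffer size, and per-tick rates below are made up for illustration; real modems signal with dedicated RTS/CTS wires.

```python
class ModemBuffer:
    """Toy model of hardware flow control: the computer checks a
    clear-to-send flag before handing the modem another byte."""

    def __init__(self, capacity=16):
        self.capacity = capacity
        self.queue = []

    def clear_to_send(self):
        # state of the (simulated) CTS line
        return len(self.queue) < self.capacity

    def accept(self, byte):
        # the computer only calls this while CTS is asserted
        self.queue.append(byte)

    def drain(self, n):
        """The phone line carries away up to n queued bytes."""
        sent, self.queue = self.queue[:n], self.queue[n:]
        return sent

# The computer offers 4 bytes per tick; the line only carries 2.
# CTS throttles the computer, so nothing is lost and order is kept.
modem = ModemBuffer()
pending, delivered = list(range(100)), []
while pending or modem.queue:
    for _ in range(4):
        if pending and modem.clear_to_send():
            modem.accept(pending.pop(0))
    delivered += modem.drain(2)
assert delivered == list(range(100))
```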

Almost all modern modems also do double duty as fax machines. Digital faxes, introduced in the 1980s, are simply a particular image format sent over a high-speed (9600/1200 bit/s) modem. Software running on the host computer can convert any image into fax format, which can then be sent using the modem. Such software was at one time an add-on, but has since become largely universal.

Winmodem

A Winmodem or softmodem is a stripped-down modem for Windows that replaces tasks traditionally handled in hardware with software. In this case the computer's built-in sound hardware is used to generate the tones normally handled by the analog portion of the modem, and a small piece of hardware then connects the sound hardware to the phone line. Modern computers often include a very simple card slot, the Communications and Networking Riser (CNR) slot, to lower the cost of connecting it up. The CNR slot includes pins for sound, power and basic signalling, instead of the more expensive PCI slot normally used. Winmodems are often cheaper than traditional modems, since they have fewer hardware components.

One downside of a Winmodem is that the software generating the modem tones is not simple, and the performance of the computer as a whole often suffers while it is in use. For online gaming this can be a real concern. Another problem with Winmodems is lack of flexibility, due to their strong tie to the underlying operating system. A given Winmodem might not be supported by other operating systems (such as Linux), because its manufacturer may neither support the other operating system nor provide enough technical data to create an equivalent driver. A Winmodem might not even work (or work well) with a later version of Microsoft Windows, if its driver turns out to be incompatible with that later version of the operating system.

Apple's GeoPort modems from the second half of the 1990s were similar, and are generally regarded as having been a bad move. Although a clever idea in theory, enabling the creation of more-powerful telephony applications, in practice the only programs created were simple answering-machine and fax software, hardly more advanced than their physical-world counterparts, and certainly more error-prone and cumbersome. The software was finicky and ate up significant processor time, and no longer functions in current operating system versions.

Modern audio modems (ITU-T V.92 standard) closely approach the Shannon capacity of the PSTN telephone channel. They are plug-and-play fax/data/voice modems that can broadcast voice messages and record touch-tone responses.

Wireless "modems"

Wireless modems come in a variety of types, bandwidths, and speeds, and are often referred to as transparent or smart. They transmit information modulated onto a carrier frequency, which allows many wireless communication links to operate simultaneously on different frequencies.

Transparent modems operate in a manner similar to their phone line modem cousins. Typically, they are half duplex, meaning that they cannot send and receive data at the same time. Typically transparent modems are polled in a round robin manner to collect small amounts of data from scattered locations that do not have easy access to wired infrastructure. Transparent modems are most commonly used by utility companies for data collection.
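The round-robin polling described above can be sketched as follows. The station names and the `link` callable are hypothetical stand-ins for the radio round trip; half-duplex operation means each reply must arrive (or time out) before the next station is polled.

```python
def poll_cycle(stations, link):
    """Poll each remote transparent modem in turn and collect replies.
    `link` is a hypothetical callable simulating the radio round trip;
    it returns a reading, or None when a station times out."""
    readings = {}
    for station in stations:          # strict round-robin order
        reply = link(station)
        if reply is not None:
            readings[station] = reply
    return readings

# Simulated field sites: a dict lookup stands in for the radio link.
meters = {"north-substation": 42.5, "river-pump": 17.0}
result = poll_cycle(["north-substation", "river-pump", "offline-site"],
                    meters.get)
assert result == {"north-substation": 42.5, "river-pump": 17.0}
```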

Smart modems come with a media access controller inside, which prevents random data from colliding and resends data that is not correctly received. Smart modems typically require more radio-frequency bandwidth than transparent modems. The IEEE 802.11 standard defines a short-range modem that is used on a large scale throughout the world.

Wireless data modems are used in the WiFi and WiMAX standards.

WiFi, for example, can be used in laptops for internet connections via a wireless access point.

Broadband

DSL Modem

ADSL modems, a more recent development, are not limited to the telephone's "voiceband" audio frequencies. Some ADSL modems use coded orthogonal frequency division modulation.

Cable modems use a range of frequencies originally intended to carry RF television channels. Multiple cable modems attached to a single cable can use the same frequency band, using a low-level media access protocol to allow them to work together within the same channel. Typically, 'up' and 'down' signals are kept separate using frequency division multiplexing.
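The frequency split between directions can be modelled as two disjoint bands. The band edges below are typical of North American cable plants but are used here only for illustration.

```python
# Frequency-division multiplexing on a cable plant: upstream and
# downstream occupy disjoint bands, so both directions are active at
# once. Values in Hz; the exact split varies by plant and standard.
UPSTREAM = (5e6, 42e6)        # roughly 5-42 MHz
DOWNSTREAM = (88e6, 860e6)    # roughly 88-860 MHz

def overlaps(a, b):
    """True if two (low, high) frequency ranges intersect."""
    return a[0] < b[1] and b[0] < a[1]

def band_for(freq):
    """Which direction a given carrier frequency belongs to, if any."""
    if UPSTREAM[0] <= freq <= UPSTREAM[1]:
        return "up"
    if DOWNSTREAM[0] <= freq <= DOWNSTREAM[1]:
        return "down"
    return None

assert not overlaps(UPSTREAM, DOWNSTREAM)
assert band_for(30e6) == "up" and band_for(550e6) == "down"
```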

New types of broadband modems are beginning to appear, such as two-way satellite and powerline modems.

Broadband modems should still be classed as modems, since they do utilise analog/digital conversion. They are more advanced devices than traditional telephone modems, as they are capable of modulating/demodulating hundreds of channels simultaneously.

Many broadband "modems" include the functions of a router and other features such as DHCP, NAT and firewall features.

When broadband technology was introduced, networking and routers were not very familiar to most people. However, many people knew what a modem was as most internet access was through dialup. Due to this familiarity, companies started selling broadband adapters using the familiar term "modem".

Popularity

Modems are the most popular means of Internet access. A 2001 UCLA study of American Internet users showed that 81.3% of them used a telephone modem and 11.5% used a cable modem, though with the advent of new ways of accessing the internet, the traditional 56K modem is fast losing popularity.

See also