Scheme for transmitting information through technical channels. Transmission, encoding, and decoding of information

The process of information transfer is shown schematically in the figure. It assumes a source and a recipient of information; the message travels from the source to the recipient through a communication channel (information channel).

Fig. 3. Information transfer process

In this process, information is presented and transmitted as a sequence of signals, symbols, or signs. For example, during a direct conversation between people, sound signals are transmitted: speech. When reading a text, a person perceives letters, which are graphic symbols. The transmitted sequence is called a message. From the source to the receiver, the message travels through some material medium (sound as acoustic waves in the atmosphere, an image as light electromagnetic waves). If technical means of communication are used in the transmission, they are called information transmission channels (information channels). These include the telephone, radio, and television.

We can say that the human senses act as biological information channels. With their help, information acting on a person reaches his or her memory.

Claude Shannon proposed a diagram of the process of transmitting information through technical communication channels, presented in the figure.

Fig. 4. Information transfer process according to Shannon

The operation of such a scheme can be explained using a telephone conversation. The source of information is the person speaking. The encoding device is the microphone of the telephone handset, with whose help sound waves (speech) are converted into electrical signals. The communication channel is the telephone network (wires and the switches of telephone exchanges through which the signal passes). The decoding device is the earpiece of the listener's handset; the listener is the receiver of information. Here the incoming electrical signal is converted back into sound.

Communication in which the transmission is carried out in the form of a continuous electrical signal is called analog communication.

Coding refers to any transformation of information coming from a source into a form suitable for its transmission over a communication channel.

Currently, digital communication is widely used: the transmitted information is encoded in binary form (0 and 1, the binary digits) and then decoded back into text, image, or sound. Digital communication is discrete.
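To make this concrete, here is a minimal Python sketch (an illustration, not a description of any particular transmission system) of encoding a text message into binary digits and decoding it back:

```python
# A minimal sketch of digital encoding: text is converted into a sequence
# of binary digits (0 and 1) for transmission, then decoded back.
message = "hi"

# Encode: each byte of the UTF-8 representation becomes eight binary digits.
bits = "".join(f"{byte:08b}" for byte in message.encode("utf-8"))
print(bits)  # 0110100001101001

# Decode: group the bits back into bytes and restore the text.
decoded = bytes(
    int(bits[i:i + 8], 2) for i in range(0, len(bits), 8)
).decode("utf-8")
assert decoded == message
```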

The term "noise" refers to various kinds of interference that distort the transmitted signal and lead to loss of information. Such interference arises primarily for technical reasons: poor quality of communication lines, or insufficient isolation from one another of different information streams transmitted over the same channel. In such cases, noise protection is necessary.

Technical methods of protecting communication channels from noise are applied first of all: for example, using shielded cable instead of bare wire, or using various kinds of filters that separate the useful signal from the noise.

Claude Shannon developed a special coding theory that provides methods for dealing with noise. One of its important ideas is that the code transmitted over a communication line must be redundant: thanks to this, the loss of some part of the information during transmission can be compensated.

However, the redundancy must not be too large, or it will lead to delays and higher communication costs. Shannon's coding theory allows one to obtain a code that is optimal: the redundancy of the transmitted information is the minimum possible, while the reliability of the received information is the maximum.
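The principle of useful redundancy can be illustrated by the simplest redundant code, a triple-repetition code. This toy Python sketch is far from the optimal codes Shannon's theory promises (its redundancy is 200%), but it shows how redundancy compensates for losses:

```python
# Toy illustration of redundancy: a triple-repetition code.
# Each bit is sent three times; the receiver takes a majority vote,
# so any single corrupted copy of a bit is tolerated.

def encode(bits: str) -> str:
    return "".join(b * 3 for b in bits)

def decode(coded: str) -> str:
    out = []
    for i in range(0, len(coded), 3):
        triple = coded[i:i + 3]
        out.append("1" if triple.count("1") >= 2 else "0")
    return "".join(out)

sent = encode("1011")           # '111000111111'
noisy = "110000111111"          # noise flipped the third symbol
assert decode(noisy) == "1011"  # the message is still recovered
```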

In modern digital communication systems, the following technique is often used to combat loss of information during transmission. The entire message is divided into portions, or blocks. For each block a checksum (the sum of its binary digits) is calculated and transmitted along with the block. At the receiving end, the checksum of the received block is recalculated, and if it does not match the transmitted one, the transmission of this block is repeated. This continues until the sent and received checksums match.
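Below is a hypothetical Python simulation of this block-checksum technique. The byte-sum checksum and the 30% corruption rate are illustrative assumptions (real systems typically use CRC checksums), and the transmitted checksum is assumed to arrive intact:

```python
import random

# Sketch of the block-checksum technique described above. The "channel"
# occasionally corrupts a block; the receiver recomputes the checksum
# and requests retransmission until the sums match.

def checksum(block: bytes) -> int:
    return sum(block) % 65536

def noisy_channel(block: bytes) -> bytes:
    if random.random() < 0.3:          # 30% chance of corruption
        i = random.randrange(len(block))
        corrupted = bytearray(block)
        corrupted[i] ^= 0xFF           # flip all bits of one byte
        return bytes(corrupted)
    return block

def transmit(block: bytes) -> bytes:
    expected = checksum(block)         # sent alongside the block
    while True:
        received = noisy_channel(block)
        if checksum(received) == expected:
            return received            # checksums match: accept the block

message = b"some message split into blocks"
blocks = [message[i:i + 8] for i in range(0, len(message), 8)]
received = b"".join(transmit(b) for b in blocks)
assert received == message
```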

The information transfer rate is the information volume of a message transmitted per unit of time. Units for measuring it: bit/s, byte/s, etc.

Technical communication lines (telephone lines, radio links, fiber optic cable) have a limit on the data transfer rate, called the information channel capacity. These limits are physical in nature.
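A worked example with hypothetical numbers shows how the channel capacity limits the transfer time:

```python
# Worked example (hypothetical numbers): how long a transfer takes when
# the channel capacity is the limit. Note bytes vs bits: 1 byte = 8 bits.
file_size_bytes = 10 * 10**6        # a 10 MB file
capacity_bits_per_s = 100 * 10**6   # a 100 Mbit/s channel

transfer_time_s = file_size_bytes * 8 / capacity_bits_per_s
print(f"{transfer_time_s:.1f} s")   # 0.8 s at best; noise and protocol
                                    # overhead only increase this
```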

Using Internet resources, find answers to the questions:

Task 1

1. What is the process of transmitting information?

Transfer of information is the physical process by which information is moved in space. We recorded information on a disk and carried it to another room: that is a transfer. The process is characterized by the presence of the following components: a source of information, a receiver, and a communication channel between them.


2. General scheme of information transfer

3. List the communication channels you know

A communication channel (English: channel, data line) is a system of technical means and a signal propagation medium for transmitting messages (not only data) from a source to a receiver (and back). A communication channel understood in the narrow sense (a communication path) is only the physical medium of signal propagation, for example a physical communication line.

Based on the type of propagation medium, communication channels are divided into wired, acoustic, optical, and radio channels.

4. What are telecommunications and computer telecommunications?

Telecommunications (from Greek tele, "far away", and Latin communicatio, "communication") is the transmission and reception of any information (sound, image, data, text) over a distance via various electromagnetic systems (cable and fiber optic channels, radio channels, and other wired and wireless communication channels).

A telecommunications network is a system of technical means through which telecommunication is carried out.

Telecommunication networks include:
1. Computer networks (for data transmission)
2. Telephone networks (transmission of voice information)
3. Radio networks (transmission of voice information - broadcast services)
4. Television networks (voice and image transmission - broadcast services)

Computer telecommunications are telecommunications whose terminal devices are computers.

The direct transfer of information from computer to computer is called synchronous communication; transfer through an intermediate computer, which allows messages to be accumulated and forwarded to personal computers as the user requests them, is called asynchronous.

Computer telecommunications are beginning to be introduced into education. In higher education they are used to coordinate scientific research, for prompt exchange of information between project participants, for distance learning, and for consultations. In school education they serve to increase the effectiveness of students' independent activity connected with various kinds of creative work, including educational activity, based on the wide use of research methods, free access to databases, and exchange of information with partners both within the country and abroad.

5. What is the bandwidth of an information transmission channel?
Bandwidth is a metric characteristic showing the maximum number of units (of information, objects, or volume) passing through a channel, system, or node per unit of time.
In computer science, the definition of bandwidth is usually applied to a communication channel and is given as the maximum amount of information transmitted or received per unit of time.
Bandwidth is one of the most important factors from the user's point of view. It is estimated by the amount of data that the network can, in the limit, transfer per unit of time from one device connected to it to another.

The speed of information transfer depends largely on the speed at which it is produced (source performance) and on the methods of encoding and decoding. The highest possible transmission speed in a given channel is called its throughput. Channel capacity is, by definition, the speed of information transmission achieved with the "best" (optimal) source, encoder, and decoder for that channel, so it characterizes the channel alone.

Today, information spreads so quickly that there is not always enough time to comprehend it. Most people rarely think about how and by what means it is transmitted, much less imagine a scheme for transmitting information.

Basic Concepts

The transfer of information is considered to be the physical process of moving data (signs and symbols) in space. From the data-transmission point of view, it is a pre-planned, technically equipped operation for moving units of information in a set time from the so-called source to the receiver via an information channel, or data transmission channel.

A data transmission channel is a set of means, or a medium, for data propagation. In other words, it is the part of the information transmission circuit that ensures the movement of information from the source to the recipient and, under certain conditions, back.

There are many classifications of data transmission channels. The main ones are radio, optical, and acoustic channels (wireless) and wired channels.

Technical channels for transmitting information

Technical data transmission channels include radio channels, fiber optic channels, and cable. Cable can be coaxial or twisted pair: the former is an electrical cable with a copper conductor inside, while the latter consists of pairwise-insulated twisted pairs of copper wires in a dielectric sheath. These cables are quite flexible and easy to use. Optical fiber consists of optical fiber strands that carry light signals by reflection.

The main characteristics of a channel are throughput and noise immunity. Throughput is usually understood as the amount of information that can be transmitted over the channel in a given time, and noise immunity is the channel's resistance to external interference (noise).

Understanding Data Transfer

If no particular field of application is specified, the general scheme of information transmission looks simple: it includes three components, the "source", the "receiver", and the "transmission channel".

Shannon scheme

Claude Shannon, an American mathematician and engineer, stood at the origins of information theory. He proposed a scheme for transmitting information through technical communication channels.

This diagram is not difficult to understand, especially if you picture its elements as familiar objects and phenomena. For example, the source of information is a person talking on the phone. The handset acts as an encoder, converting speech (sound waves) into electrical signals. The data transmission channel here is the communication nodes, in general the entire telephone network leading from one telephone set to another. The decoding device is the called party's handset, which converts the electrical signal back into sound, that is, into speech.

In this diagram of the information transfer process, data is represented as a continuous electrical signal. This type of communication is called analog.

Coding concept

Coding is considered to be the transformation of information sent by a source into a form suitable for transmission over the communication channel being used. The most familiar example of coding is Morse code, in which information is converted into a sequence of dots and dashes, that is, of short and long signals. The receiving side must decode this sequence.
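A minimal Python sketch of this example (with only a fragment of the Morse alphabet, for brevity) might look as follows:

```python
# Sketch of the Morse-code example: encoding turns text into a sequence
# of short and long signals; decoding reverses it. Only a fragment of
# the alphabet is included here.
MORSE = {"S": "...", "O": "---", "E": ".", "T": "-"}
REVERSE = {code: letter for letter, code in MORSE.items()}

def encode(text: str) -> str:
    return " ".join(MORSE[ch] for ch in text.upper())

def decode(signals: str) -> str:
    return "".join(REVERSE[code] for code in signals.split())

assert encode("sos") == "... --- ..."
assert decode("... --- ...") == "SOS"
```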

Modern technology uses digital communication, in which information is converted (encoded) into binary data, that is, 0s and 1s; there is even a binary alphabet. Such communication is called discrete.

Interference in information channels

There is also noise in data transmission circuits. The term "noise" here means interference that distorts the signal and, as a result, causes its loss. The causes of interference vary; for example, information channels may be poorly protected from each other. To prevent interference, various technical methods of protection are used: filters, shielding, and so on.

C. Shannon developed and proposed a coding theory for combating noise. Its idea is that, since information is lost under the influence of noise, the transmitted data should be redundant, but not so redundant as to reduce the transmission speed.

In digital communication channels, information is divided into parts, or packets, and a checksum is calculated for each of them and transmitted along with the packet. The receiver recalculates this sum and accepts the packet only if the two match; otherwise the packet is sent again, and so on until the sent and received checksums coincide.

General scheme for transmitting information in a communication line

Earlier, the source of information was defined as an object or subject that generates information and is able to present it in the form of a message, i.e., as a sequence of signals in a material medium. In other words, the source ties information to its material carrier. The transmission of a message from source to receiver is always associated with some non-stationary process in a material medium; this condition is mandatory, since information itself is not a material object or a form of existence of matter. There are many ways of transmitting information: mail, telephone, radio, television, computer networks, and so on. Yet for all the variety of concrete implementations, common elements can be identified in them, presented in the diagram (Fig. 9).

A situation is possible in which the encoding device is external to the source of information, for example a telegraph apparatus or a computer in relation to the operator working on it. Next, the codes must be translated into a sequence of material signals, that is, placed on a material carrier; this operation is performed by a converter. The converter may be combined with the encoding device (for example, a telegraph apparatus),

Fig. 9.

or it can be an independent element of the communication line (for example, a modem that converts the discrete electrical signals of a computer into analog signals at a frequency at which their attenuation in telephone lines is minimal). Converters also include devices that transfer a message from one medium to another: a megaphone or telephone that converts voice signals into electrical ones; a radio transmitter that converts voice signals into radio waves; a television camera that converts images into a sequence of electrical impulses. In the general case, the output signals of a conversion do not reproduce all the features of the input message, only its essential aspects, i.e., part of the information is lost in conversion. For example, the frequency band of telephone communication is from 300 to 3400 Hz, while the frequencies perceived by the human ear lie in the range 16 to 20,000 Hz (telephone lines cut off the high frequencies, which distorts the sound); in black-and-white television, the color of the image is lost in conversion. This is why the task arises of developing a message encoding method that provides the most complete representation of the original information during conversion and is at the same time consistent with the speed of information transmission over the given communication line.

From the converter the signals enter and propagate through the communication channel. The concept of a "communication channel" includes the material medium as well as the physical or other process by means of which the message is transmitted, i.e., by which signals propagate in space over time. Table 10 gives examples of some communication channels.

Any real communication channel is subject to external influences, and internal processes may also occur in it, as a result of which the transmitted signals, and hence the message associated with them, are distorted. Such influences are called noise (interference). Sources of interference may be external or internal with respect to the channel.

Table 10. Communication channels

Communication channel | Medium | Message carrier | Process used to transmit messages
Mail, couriers | Human habitat | Paper | Mechanical movement of the carrier
Telephone, computer networks | Conductor | Electric current | Movement of electric charges
Radio, television | Electromagnetic field | Electromagnetic waves | Propagation of electromagnetic waves
Vision | — | Light waves | Propagation of light waves
Hearing | — | Sound waves | Propagation of sound waves
Smell, taste | Air, food | Chemicals | Chemical reactions
Touch | Skin surface | Object affecting the sense of touch | Heat transfer, pressure

After the message passes through the communication channel, the signals are converted by a receiving converter into a sequence of codes, which the decoding device presents in the form required by the receiver of information. At the receiving stage, as at transmission, the converter may be combined with the decoding device (for example, a radio or TV set) or exist separately (for example, a modem).

The concept of a "communication line" unites all the elements shown in the diagram, from the source to the receiver of information. The characteristics of any communication line are the speed with which a message can be transmitted over it and the degree to which the message is distorted in transmission. Of these parameters, we single out those that relate directly to the communication channel, i.e., that characterize the medium and the transmission process.

Communication channel characteristics

In what follows we consider communication channels over which messages are transmitted by means of electrical impulses. From a practical point of view, and for computer communication lines in particular, these channels are of the greatest interest.

Bandwidth

Any converter whose operation is based on oscillations (electrical or mechanical) can generate and transmit signals only from a limited frequency range. (The example of telephone communication was given above.) The same applies to radio and television: the entire frequency spectrum is divided into ranges (LW, MW, HF, VHF, UHF), within which each station occupies its own sub-band so as not to interfere with the broadcasts of others.

The frequency range used by a given communication channel to transmit signals is called bandwidth.

For the theory it is not the bandwidth itself that matters but the maximum frequency of the band, $\nu_m$, since it is this that determines the possible speed of information transmission over the channel.

The duration of an elementary pulse can be determined from the following considerations. If the signal parameter varies sinusoidally, then, as can be seen from the figure, over one oscillation period $T$ the signal takes one maximum value and one minimum value.

Fig. 10.

If we approximate the sinusoid by rectangular pulses and shift the reference point to the level of the minimum value, the signal takes only two values: the maximum (denote it "1"), a pulse, and the minimum (denote it "0"), a pause. A pulse and a pause can be considered elementary signals; with the chosen approximation their durations are obviously the same and equal to

$$\tau_0 = \frac{T}{2}.$$

If the pulses are generated by a clock generator of frequency $\nu_m$, then $T_m = 1/\nu_m$ and

$$\tau_0 = \frac{1}{2\nu_m}.$$

Thus, a pulse or a pause can be transmitted every $\tau_0$ seconds, with certain codes associated with their sequence. It is possible in principle to use signals of longer duration than $\tau_0$ (for example, $2\tau_0$); this does not lead to loss of information, although it reduces the speed of transmission over the channel. Using signals shorter than $\tau_0$ may lead to loss of information, since the signals would then take intermediate values between the minimum and the maximum, complicating their interpretation.

Thus, $\nu_m$ determines the duration $\tau_0$ of the elementary signal used to convey a message.

Communication channel capacity

If the transmission of one pulse is associated with an amount of information $I_{\text{pulse}}$, and it is transmitted in time $\tau_0$, then the ratio $I_{\text{pulse}}/\tau_0$ obviously reflects the average amount of information transmitted over the channel per unit of time. This quantity is a characteristic of the communication channel and is called the channel capacity $C$:

$$C = \frac{I_{\text{pulse}}}{\tau_0}.$$

If $I_{\text{pulse}}$ is expressed in bits and $\tau_0$ in seconds, then the unit of measure of $C$ is bit/s. Such a unit was once called the baud, but the name did not catch on, and for this reason the throughput of a communication channel is measured in bit/s. Derived units are:

  • 1 Kbit/s = 10³ bit/s,
  • 1 Mbit/s = 10⁶ bit/s,
  • 1 Gbit/s = 10⁹ bit/s.

Information transfer rate

Suppose an amount of information $I$ is transmitted over a communication channel in time $t$. We can introduce a quantity characterizing the speed of this process, the information transfer rate $J$:

$$J = \frac{I}{t}.$$

The dimension of $J$, like that of $C$, is bit/s. What is the relation between these characteristics? Since $\tau_0$ is the minimum duration of an elementary signal, $C$ corresponds to the maximum speed of information transmission over the given line, i.e., $J \le J_{\max} = C$. Thus, the maximum speed of information transmission over a communication channel equals its throughput.
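These relations are easy to check numerically. The sketch below uses the telephone bandwidth figure quoted earlier ($\nu_m = 3400$ Hz) and assumes binary signalling with one bit per pulse:

```python
# Numeric sketch of the relations above (hypothetical channel values):
# an elementary signal lasts tau0 = 1 / (2 * nu_m), and if each pulse
# carries I_pulse bits, the capacity is C = I_pulse / tau0.
nu_m = 3400              # Hz, highest frequency passed by the channel
tau0 = 1 / (2 * nu_m)    # duration of an elementary signal, seconds
I_pulse = 1              # bits per elementary signal (binary signalling)

C = I_pulse / tau0       # channel capacity, bit/s
print(f"tau0 = {tau0 * 1e6:.0f} us, C = {C:.0f} bit/s")  # C = 6800 bit/s

# Any actual transfer rate J over this line cannot exceed C.
```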

Entropy and information

Random events can be described using the concept of "probability". The relations of probability theory make it possible to find (calculate) the probabilities of both single random events and complex experiments that combine several independent or interconnected events. However, random events can be described not only in terms of probabilities.

The fact that an event is random means that there is no complete certainty of its occurrence, which in turn creates uncertainty in the outcomes of experiments associated with the event. Of course, the degree of uncertainty differs from situation to situation. For example, if the experiment consists in determining the age of a randomly chosen first-year full-time university student, then we can say with high confidence that it will be under 30: although people up to age 35 may study full-time, most full-time students are recent school graduates. The same experiment has much less certainty if we check whether the age of a randomly chosen student is under 18. For practice it is important to be able to assess numerically the uncertainty of different experiments. Let us try to introduce such a quantitative measure of uncertainty.

Let us start with the simple situation in which an experiment has $n$ equally probable outcomes. Obviously, the uncertainty of each of them depends on $n$, i.e., the measure of uncertainty is a function of the number of outcomes, $f(n)$.

Some properties of this function can be stated:

  • 1. $f(1) = 0$, because for $n = 1$ the outcome of the experiment is not random and there is no uncertainty;
  • 2. $f(n)$ increases with $n$, because the greater the number of possible outcomes, the harder it becomes to predict the result of the experiment.

The unit of measurement of uncertainty, corresponding to an experiment with two equally probable outcomes, is called the bit.

An explicit form has been established for the function that describes the measure of uncertainty of an experiment with $n$ equally probable outcomes:

$$H = f(n) = \log_2 n.$$

This quantity is called entropy; in what follows we denote it $H$.

Statement. The entropy of an experiment is equal to the information about the experiment that is contained in it.

You can clarify:

The entropy of an experiment is equal to the information we receive as a result of carrying it out.

Properties of information:

  • 1. $I(\alpha, \beta) \ge 0$, and $I(\alpha, \beta) = 0$ if and only if the experiments $\alpha$ and $\beta$ are independent;
  • 2. $I(\alpha, \beta) = I(\beta, \alpha)$, i.e., information is symmetric with respect to the sequence of experiments;
  • 3. $I(\alpha) = \sum_i p(A_i) \log_2 \frac{1}{p(A_i)}$, i.e., the information of an experiment is equal to the average amount of information contained in any one of its outcomes.

A corollary of this formula is easy to obtain for the case when all $n$ outcomes are equally likely. In this case every $p(A_i) = 1/n$, and therefore

$$I = \log_2 n.$$

This formula was derived in 1928 by the American engineer R. Hartley and bears his name. It relates the number of equally probable states $n$ to the amount of information $I$ in the message that one of these states has occurred. Its meaning is that if a set contains $n$ elements and $x$ belongs to this set, then to single it out (identify it unambiguously) among the others requires an amount of information equal to $\log_2 n$.

A special case of Hartley's formula is the situation when $n = 2^k$. Substituting this value into the formula, we obviously get

$$I = k \text{ bits}.$$

Shannon's formula

Suppose the probabilities $p_1, p_2, \ldots, p_n$ with which a system assumes each of its $n$ states are known. Then the entropy of the system is given by Shannon's formula, which is also the formula for measuring the amount of information:

$$H = -\sum_{i=1}^{n} p_i \log_2 p_i.$$

Properties of entropy

The entropy is maximal when all $n$ states are equally probable; in this case $p_i = 1/n$ and Shannon's formula reduces to the Hartley formula, $H_{\max} = \log_2 n$. This is the case of maximum entropy.
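As an illustration, here is a minimal Python sketch (using only the standard library) that evaluates Shannon's formula and confirms the maximum-entropy case:

```python
from math import log2

# Shannon's formula: H = -sum(p_i * log2(p_i)). For equal probabilities
# p_i = 1/n it reduces to Hartley's H = log2(n), the maximum-entropy case.

def entropy(probabilities):
    return -sum(p * log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))        # 1.0 bit: one fair coin toss
print(entropy([0.9, 0.1]))        # ~0.469 bits: less uncertainty
print(entropy([0.25] * 4))        # 2.0 bits = log2(4), Hartley's case
assert entropy([1.0]) == 0        # a certain outcome carries no uncertainty
```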

Shannon's first theorem.

In the absence of interference, it is always possible to encode a message in such a way that the code redundancy will be arbitrarily close to zero.

Shannon's second theorem.

When transmitting information over a noisy channel, there always exists a coding method such that the message is transmitted with arbitrarily high reliability, provided the transmission rate does not exceed the channel capacity.
