Information is information about something

The concept and types of information; the transmission, processing, search, and storage of information

Information: definition

Information is any knowledge that is received, transmitted, or stored by various sources; it is the entire body of facts about the world around us and about all the processes occurring in it that can be perceived by living organisms, electronic machines, and other information systems.

Information is meaningful knowledge about something, where the form of its presentation is itself information; that is, the form performs a formative function in accordance with the information's own nature.

Information is anything that can supplement our knowledge and assumptions.

Information is knowledge about something, regardless of the form in which it is presented.

Information is the mental product of any psychophysical organism, produced by it using whatever means is called an information medium.

Information is knowledge perceived by humans and (or) special devices as a reflection of the facts of the material or spiritual world in the process of communication.

Information is data organized in such a way that it makes sense to the person handling it.

Information is the meaning a person attaches to data based on the known conventions used to represent it.

Information is news, explanation, exposition.

Information is any data or knowledge that is of interest to someone.

Information is knowledge about the objects and phenomena of the environment, their parameters, properties, and states, which information systems (living organisms, control machines, etc.) perceive in the course of their life and work.

The same information message (a newspaper article, an advertisement, a letter, a telegram, a certificate, a story, a drawing, a radio broadcast, etc.) may contain different amounts of information for different people, depending on their prior knowledge, their level of understanding of the message, and their interest in it.

When we speak of automated work with information by means of technical devices, what matters is not the content of a message but how many characters the message contains.


In relation to computer data processing, information is understood as a certain sequence of symbolic designations (letters, numbers, encoded graphic images and sounds, etc.), carrying a semantic load and presented in a form understandable to the computer. Each new character in such a sequence of characters increases the information volume of the message.
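
To make this character-counting view concrete, here is a minimal Python sketch (the sample message and the choice of an 8-bit UTF-8 encoding are illustrative assumptions, not from the text): it shows a message the way a computer stores it, as a sequence of numeric symbol codes whose total volume grows with every character added.

```python
# A toy illustration: a message as a sequence of encoded symbols.
# The choice of UTF-8 here is an illustrative assumption.
message = "information"
encoded = message.encode("utf-8")   # symbols -> numeric codes
print(list(encoded))                # [105, 110, 102, 111, 114, ...]
print(len(encoded) * 8, "bits")     # 88 bits; each new character adds more
```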

Currently there is no single definition of information as a scientific term. Each field of knowledge describes the concept through its own specific set of characteristics. For example, the concept of "information" is fundamental in a computer science course, and it cannot be defined through other, "simpler" concepts (just as in geometry it is impossible to express the content of the basic concepts "point", "line", and "plane" through simpler concepts).

The content of the basic, foundational concepts of any science must be explained through examples or identified by comparison with the content of other concepts. In the case of "information", the problem of definition is even harder, since it is a general scientific concept. It is used in many sciences (computer science, cybernetics, biology, physics, etc.), and in each science it is tied to a different system of concepts.

The concept of information

In modern science, two types of information are considered:

Objective (primary) information is the property of material objects and phenomena (processes) to generate a variety of states, which through interactions (fundamental interactions) are transmitted to other objects and imprinted in their structure.

Subjective (semantic, secondary) information is the semantic content of objective information about the objects and processes of the material world, formed by human consciousness with the help of semantic images (words, images, and sensations) and recorded on some material medium.

In the everyday sense, information is information about the surrounding world and the processes occurring in it, perceived by a person or a special device.

According to Claude Shannon's conception, information is the removal of uncertainty: knowledge that removes, to one degree or another, the uncertainty that existed in the recipient before it was received, and that expands his understanding of an object with useful facts.

From Gregory Bateson's point of view, the elementary unit of information is a "non-indifferent difference", an effective difference for some larger perceiving system. Differences that are not perceived he calls "potential", and those that are perceived "effective". "Information consists of non-indifferent differences"; "any perception of information is necessarily the receipt of information about a difference."

From the point of view of computer science, information has a number of fundamental properties: novelty, relevance, reliability, objectivity, completeness, value, and so on. The analysis of information is above all the business of the science of logic. The word "information" comes from the Latin informatio, meaning explanation or exposition. The concept of information was already being considered by the ancient philosophers.


Before the Industrial Revolution, defining the essence of information remained mainly the prerogative of philosophers. Later, the new science of cybernetics took up the questions of information theory.

Sometimes, in order to comprehend the essence of a concept, it is useful to analyze the meaning of the word by which this concept is denoted. Clarifying the inner form of a word and studying the history of its use can shed unexpected light on its meaning, obscured by the usual "technological" use of the word and modern connotations.

The word information entered the Russian language in the Petrine era. It was first recorded in the "Spiritual Regulation" of 1721 in the meaning of "idea, concept of something." (In the European languages it became established earlier, around the 14th century.)


Based on this etymology (the Latin informatio derives from forma, "form"), information can be considered any significant change of form or, in other words, any materially recorded traces formed by the interaction of objects or forces and amenable to understanding. Information is thus a converted form of energy. The carrier of information is the sign, and its mode of existence is interpretation: identifying the meaning of a sign or a sequence of signs.

The meaning can be an event reconstructed from a sign that caused its occurrence (in the case of “natural” and involuntary signs, such as traces, evidence, etc.), or a message (in the case of conventional signs inherent in the sphere of language). It is the second type of signs that makes up the body of human culture, which, according to one definition, is “a set of non-hereditarily transmitted information.”


Messages may contain information about facts or interpretation of facts (from the Latin interpretatio, interpretation, translation).

A living being receives information through the sense organs, as well as through reflection or intuition. The exchange of information between subjects is communication (from the Latin communicatio, message, transfer, derived in turn from the Latin communico, to make common, to impart, to talk, to connect).

From a practical point of view, information is always presented in the form of a message. The information message is associated with the source of the message, the recipient of the message and the communication channel.

Returning to the Latin etymology of the word information, let's try to answer the question of what exactly is given form here.

Obviously, form is given, first of all, to a certain meaning which, initially formless and unexpressed, exists only potentially and must be "assembled" in order to become perceivable and transmissible.

Secondly, form is given to the human mind, which is trained to think structurally and clearly. Thirdly, to a society which, precisely because its members share these meanings and use them together, gains unity and functionality.


Information as expressed and comprehended meaning is knowledge that can be stored, transmitted, and serve as the basis for generating other knowledge. The forms of preserving knowledge (historical memory) are diverse: from myths, chronicles, and pyramids to libraries, museums, and computer databases.

Information is knowledge about the world around us and the processes occurring in it, perceived by living organisms, control machines, and other information systems.

The word "information" is Latin. Over its long life its meaning has evolved, sometimes expanding and sometimes narrowing its boundaries. At first the word "information" meant "representation" or "concept"; later it came to mean "intelligence" or "the transmission of messages".

In recent years, scientists decided that the usual (generally accepted) meaning of the word "information" was too elastic and vague, and gave it a narrower meaning: "a measure of the certainty in a message."


Information theory was called into being by the needs of practice. Its emergence is associated with Claude Shannon's work "A Mathematical Theory of Communication", published in 1948. The foundations of information theory rest on results obtained by many scientists. By the second half of the twentieth century the globe was buzzing with information running along telephone and telegraph cables and radio channels. Later, electronic computers appeared: processors of information. For that time, the main task of information theory was, first of all, to increase the efficiency of communication systems. The difficulty in designing and operating communication means, systems, and channels is that it is not enough for the designer and engineer to solve the problem in physical and energy terms. From those points of view a system may be the most advanced and economical; but when creating transmission systems it also matters how much information will pass through them. Information can be measured quantitatively, counted, and in such calculations one proceeds in the most ordinary way: one abstracts from the meaning of the message, just as one abandons concreteness in the arithmetic operations familiar to all of us (moving from adding two apples and three apples to adding numbers in general: 2 + 3).

The scientists stated that they "completely ignored the human evaluation of information." To a sequential series of 100 letters, for example, they assign a definite amount of information, paying no attention to whether that text makes sense or whether the sense has any practical application. The quantitative approach is the most developed branch of information theory. By this definition, a collection of 100 letters, whether a 100-letter phrase from a newspaper, from a Shakespeare play, or from Einstein's theorem, contains exactly the same amount of information.

This definition of the quantity of information is extremely useful and practical. It corresponds exactly to the task of the communications engineer, who must convey all the information contained in a submitted telegram, regardless of that information's value to the addressee. The communication channel is soulless. One thing matters to the transmission system: to transmit the required amount of information in a given time. How, then, is the amount of information in a particular message calculated?


The estimation of the amount of information rests on the laws of probability theory; more precisely, it is determined through the probabilities of events. This is understandable: a message has value, carries information, only when we learn from it the outcome of an event that is random in nature, when it is to some degree unexpected. A message about what is already known contains no information. That is, if someone calls you on the telephone and says, "It is light during the day and dark at night," such a message will surprise you only with the absurdity of stating the obvious, not with any news it contains. Another matter is, for example, the result of a race. Who will come first? The outcome here is hard to predict. The more random outcomes an event of interest to us has, the more valuable the message about its result, and the more information it carries. A message about an event with only two equally probable outcomes contains a single unit of information, called a bit. The choice of this unit is not accidental: it is tied to the most common, binary way of encoding information during transmission and processing. Let us try, in the most simplified form at least, to picture the general principle of the quantitative assessment of information, the cornerstone of all information theory.

We already know that the amount of information depends on the probabilities of the possible outcomes of an event. If an event has, as scientists say, two equally probable outcomes, then the probability of each outcome is 1/2. This is the probability of getting heads or tails when tossing a coin. If an event has three equally probable outcomes, the probability of each is 1/3. Note that the sum of the probabilities of all outcomes is always equal to one: one of the possible outcomes will certainly occur. An event can, of course, have unequally probable outcomes. Thus, in a football match between a strong and a weak team, the probability of the strong team winning is high, say 4/5; the probability of a draw is much smaller, say 3/20; and the probability of defeat is very small, 1/20.
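
To make the arithmetic above concrete, here is a minimal Python sketch, assuming the probabilities from the examples just given (the coin, and the illustrative football match with outcomes 4/5, 3/20, and 1/20): it computes the information delivered by a single outcome and the average information (Shannon entropy) of an event.

```python
import math

def self_information(p):
    """Information, in bits, delivered by an outcome of probability p."""
    return -math.log2(p)

def entropy(probs):
    """Average information of an event with the given outcome probabilities."""
    assert abs(sum(probs) - 1.0) < 1e-9   # the probabilities must sum to one
    return sum(-p * math.log2(p) for p in probs)

print(self_information(1 / 2))           # a coin toss outcome: exactly 1 bit
print(entropy([1 / 2, 1 / 2]))           # two equiprobable outcomes: 1 bit
print(entropy([4 / 5, 3 / 20, 1 / 20]))  # the lopsided match: ~0.884 bits
```

The lopsided match carries less information on average than a fair coin toss, exactly because its result is easier to predict.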

It turns out that the amount of information is a measure of the reduction in the uncertainty of some situation. Various amounts of information are transmitted over communication channels, and the amount of information passing through a channel cannot exceed its capacity, which is determined by how much information passes through it per unit of time. One of the heroes of Jules Verne's novel "The Mysterious Island", the journalist Gideon Spilett, transmitted a chapter of the Bible over the telephone so that his competitors could not use the line. In that case the channel was fully loaded, yet the amount of information transmitted was zero, because information already known to him was being delivered to the subscriber. The channel was running idle, carrying a strictly defined number of pulses without loading them with anything. Meanwhile, the more information each of a given number of pulses carries, the more fully the channel's capacity is used. Therefore, one needs to encode information wisely and find an economical, sparing language for conveying messages.

Information is "sifted" in the most thorough way. In the telegraph, frequently occurring letters, combinations of letters, even whole phrases are represented by a short set of zeros and ones, while rarer ones are represented by a longer set. When the length of the codeword is reduced for frequent symbols and increased for rare ones, one speaks of effective encoding of information. In practice, however, a code that has emerged from the most careful "sifting", a convenient and economical code, can distort a message because of the interference that, unfortunately, always occurs in communication channels: sound distortion on the telephone, atmospheric interference on the radio, distortion or darkening of the image in television, transmission errors in the telegraph. This interference, or noise, as the experts call it, attacks the information. And that produces the most incredible and, naturally, unpleasant surprises.
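
The "sifting" described above is what variable-length coding does in practice. Below is a minimal Python sketch of one classic effective-encoding scheme, Huffman coding (the sample text is an arbitrary assumption): frequent symbols receive short codewords and rare symbols receive long ones.

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a prefix code in which frequent symbols get short codewords."""
    freq = Counter(text)
    # Heap entries are (frequency, tie-breaker, tree); a tree is either a
    # leaf symbol or a pair of subtrees. The integer tie-breaker keeps
    # tuple comparison well-defined when frequencies are equal.
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    if counter == 1:                        # degenerate one-symbol message
        return {heap[0][2]: "0"}
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)   # take the two rarest subtrees
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, counter, (left, right)))
        counter += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):         # internal node: branch on 0/1
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:                               # leaf: a symbol of the message
            codes[tree] = prefix
    walk(heap[0][2], "")
    return codes

codes = huffman_code("abracadabra")
print(codes)  # 'a' occurs five times and so gets the shortest codeword
print("".join(codes[ch] for ch in "abracadabra"))
```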

Therefore, to increase the reliability of the transmission and processing of information, extra characters must be introduced as a kind of protection against distortion. These extra symbols do not carry the actual content of the message; they are redundant. From the standpoint of information theory, everything that makes a language colorful, flexible, rich in shades, many-sided, and many-valued is redundancy. How redundant, from such a standpoint, is Tatyana's letter to Onegin! How much informational excess it contains relative to the brief and clear message "I love you"! And how informationally precise are the hand-drawn signs, understandable to everyone who enters the subway today, where instead of worded announcements there are laconic symbolic signs indicating "Entrance" and "Exit".
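
As a minimal sketch of such protective redundancy, here is the simplest scheme of all, a single parity bit, in Python (nothing beyond the text's idea of "extra characters" is assumed): one redundant bit that carries no content of its own but exposes any single-bit distortion in the channel.

```python
def add_parity(bits):
    """Append one redundant bit so that the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def is_consistent(bits):
    """Return True if the received word still has even parity."""
    return sum(bits) % 2 == 0

word = add_parity([1, 0, 1, 1])  # -> [1, 0, 1, 1, 1]
assert is_consistent(word)
word[2] ^= 1                     # noise flips a single bit in the channel
assert not is_consistent(word)   # the redundant bit reveals the distortion
```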

In this connection it is useful to recall the anecdote told by the famous American scientist Benjamin Franklin about a hatter who invited his friends to discuss a draft of his shop sign. The sign was to show a hat and read: "John Thompson, hatter, makes and sells hats for cash." One friend observed that the words "for cash" were unnecessary: such a reminder would offend the buyer. Another found the word "sells" superfluous, since it goes without saying that a hatter sells hats rather than giving them away. A third thought that the words "hatter" and "makes hats" were a needless tautology, and the latter words were dropped. A fourth suggested throwing out the word "hatter" as well: the painted hat says clearly enough who John Thompson is. Finally, a fifth insisted that it made no difference whatsoever to the buyer whether the hatter was called John Thompson or something else, and proposed dispensing with that indication too. Thus, in the end, nothing remained on the sign but the hat. Of course, if people used only codes of this kind, without redundancy in their messages, then all "information forms" - books, reports, articles - would be extremely brief. But they would lose in clarity and beauty.

Information can be divided into types according to various criteria:

by truth value: true and false;

by way of perception:

Visual - perceived by the organs of vision;

Auditory - perceived by the organs of hearing;

Tactile - perceived by tactile receptors;

Olfactory - perceived by olfactory receptors;

Gustatory - perceived by taste buds.

according to presentation form:

Text - transmitted in the form of symbols intended to denote lexemes of the language;

Numerical - in the form of numbers and signs indicating mathematical operations;

Graphic - in the form of images, objects, graphs;

Sound - oral or recorded transmission of language lexemes by auditory means.

by purpose:

Mass - contains trivial information and operates with a set of concepts understandable to most of society;

Special - contains a specific set of concepts; when used, information is transmitted that may not be understandable to the bulk of society, but is necessary and understandable within the narrow social group where this information is used;

Secret - transmitted to a narrow circle of people and through closed (protected) channels;

Personal (private) - a set of information about a person that determines the social status and types of social interactions within the population.

by value:

Relevant - information that is valuable at a given moment in time;

Reliable - information obtained without distortion;

Understandable - information expressed in a language understandable to those to whom it is intended;

Complete - information sufficient for making a correct decision or reaching understanding;

Useful - the usefulness of information is determined by the subject who received the information depending on the scope of possibilities for its use.

The value of information in various fields of knowledge

In information theory today, many systems, methods, approaches, and ideas are being developed. However, scientists believe that new directions will be added to the current ones and new ideas will appear. As proof of the correctness of their assumptions they cite the "living", developing nature of science, pointing out that information theory is being introduced surprisingly quickly and firmly into the most diverse areas of human knowledge. Information theory has penetrated physics, chemistry, biology, medicine, philosophy, linguistics, pedagogy, economics, logic, the technical sciences, and aesthetics. According to the experts themselves, the doctrine of information, which arose from the needs of communication theory and cybernetics, has crossed their boundaries. And now, perhaps, we have the right to speak of information as a scientific concept that puts into researchers' hands a theoretical-informational method with which one can penetrate many sciences of living and inanimate nature and of society, and that will allow one not only to look at all problems from a new side but also to see what has not yet been seen. That is why the term "information" has become so widespread in our time, entering such concepts as information system, information culture, and even information ethics.

Many scientific disciplines use information theory to highlight new directions in old sciences. This is how, for example, information geography, information economics, and information law arose. But the term “information” has acquired extremely great importance in connection with the development of the latest computer technology, the automation of mental work, the development of new means of communication and information processing, and especially with the emergence of computer science. One of the most important tasks of information theory is the study of the nature and properties of information, the creation of methods for processing it, in particular the transformation of a wide variety of modern information into computer programs, with the help of which the automation of mental work occurs - a kind of strengthening of intelligence, and therefore the development of the intellectual resources of society.

The word "information" comes from the Latin informatio, meaning explanation or exposition. The concept of "information" is fundamental in a computer science course, but it cannot be defined through other, "simpler" concepts. It is used in various sciences, and in each science it is tied to a different system of concepts. Information in biology: biology studies living nature, and there the concept of "information" is associated with the purposeful behavior of living organisms. In living organisms, information is transmitted and stored by means of objects of various physical natures (the state of DNA), which are regarded as signs of biological alphabets. Genetic information is inherited and is stored in all the cells of living organisms. The philosophical approach: information is interaction, reflection, cognition. The cybernetic approach: information is the characteristic of a control signal transmitted over a communication line.

The role of information in philosophy

A subjectivist traditionalism constantly dominated the early definitions of information as a category, a concept, a property of the material world. Information exists outside our consciousness and can be reflected in our perception only as a result of interaction: reflection, reading, reception in the form of a signal or stimulus. Information is immaterial, as are all properties of matter. Information stands in the same row as matter, space, time, systematicity, and function, which are the fundamental concepts of a formalized reflection of objective reality in its distribution and variability, diversity and manifestations. Information is a property of matter and reflects its properties (state or ability to interact) and quantity (measure) through interaction.

From the material point of view, information is the order of the objects of the material world. For example, the order of letters on a sheet of paper according to certain rules is written information. The order of multicolored dots on a sheet of paper according to certain rules is graphic information. The order of musical notes is musical information. The order of genes in DNA is hereditary information. The order of bits in a computer is computer information, and so on. To carry out an information exchange, necessary and sufficient conditions are required.


The necessary conditions:

The presence of at least two different objects of the material or intangible world;

The presence of a common property among objects that allows them to be identified as a carrier of information;

The presence of a specific property in the objects that allows them to be distinguished from one another;

The presence of a space property that allows you to determine the order of objects. For example, the layout of written information on paper is a specific property of paper that allows letters to be arranged from left to right and from top to bottom.

There is only one sufficient condition: the presence of a subject capable of recognizing information. Such subjects are man and human society, animal societies, robots, and so on. An information message is constructed by selecting copies of objects from a basis and arranging these objects in space in a certain order. The length of the information message is defined as the number of copies of basis objects and is always expressed as an integer. One must distinguish the length of an information message, which is always measured as an integer, from the amount of knowledge contained in it, which is measured in a unit of measurement as yet unknown. From the mathematical point of view, information is a sequence of integers written into a vector. Each number is the index of an object in the information basis. The vector is called an information invariant, since it does not depend on the physical nature of the basis objects. One and the same information message can be expressed in letters, words, sentences, files, pictures, notes, songs, video clips, or any combination of all of these.
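
A toy Python rendering of this "information invariant" (the two bases below are arbitrary choices): the same vector of integers can be expressed in any basis of objects of equal size, and the length of the message remains the same integer.

```python
lower = list("abcdefghijklmnopqrstuvwxyz")  # one basis of objects
upper = list("ABCDEFGHIJKLMNOPQRSTUVWXYZ")  # another basis of the same size

def to_invariant(message, basis):
    """Encode a message as a vector of indices into the chosen basis."""
    return [basis.index(ch) for ch in message]

def render(vector, basis):
    """Express the same invariant in another basis of objects."""
    return "".join(basis[i] for i in vector)

vec = to_invariant("information", lower)  # a sequence of integers
print(len(vec))                           # the length: always an integer, 11
print(render(vec, upper))                 # 'INFORMATION' - the same invariant
```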


The role of information in physics

Information is knowledge about the surrounding world (an object, a process, a phenomenon, an event) which is the object of transformation (including storage, transmission, etc.) and which is used to shape behavior, to make decisions, for management, or for learning.

The characteristic features of the information are as follows:

It is the most important resource of modern production: it reduces the need for land, labor, and capital and cuts the consumption of raw materials and energy. For example, if you know how to archive your files (i.e., you have that information), you do not have to spend money buying new floppy disks;

Information calls new industries into being. For example, the invention of the laser was the reason for the emergence and development of the production of laser (optical) discs;

Information is a commodity, and it is not lost after a sale. Thus, if a student tells a friend the class schedule for the semester, he does not thereby lose that data himself;

Information adds value to other resources, in particular labor. Indeed, a worker with a higher education is valued more than one with a secondary education.

As follows from the definition, three concepts are always associated with information:

The source of information is that element of the surrounding world (object, phenomenon, event), information about which is the object of transformation. Thus, the source of information that the reader of this textbook currently receives is computer science as a sphere of human activity;

The acquirer of information is that element of the surrounding world which uses the information (to shape behavior, to make decisions, for management, or for learning). The acquirer of this particular information is the reader himself;

A signal is a material medium that records information to transfer it from the source to the recipient. In this case, the signal is electronic in nature. If a student takes this manual from the library, then the same information will be on paper. Having been read and remembered by the student, the information will acquire another carrier - biological, when it is “recorded” in the student’s memory.

The signal is the most important element in this circuit. The forms of its presentation, as well as the quantitative and qualitative characteristics of the information it contains, which are important for the acquirer of information, are discussed further in this section of the textbook. The main characteristics of a computer as the main tool that maps the source of information into a signal (link 1 in the figure) and “brings” the signal to the recipient of information (link 2 in the figure) are given in the Computer section. The structure of procedures that implement connections 1 and 2 and make up the information process is the subject of consideration in the Information Process part.

Objects of the material world are in a state of continuous change, which is characterized by the exchange of energy between the object and the environment. A change in the state of one object always leads to a change in the state of some other environmental object. This phenomenon, regardless of how, what states and what objects have changed, can be considered as the transmission of a signal from one object to another. Changing the state of an object when a signal is transmitted to it is called signal registration.

A signal or a sequence of signals forms a message that can be perceived by a recipient in one form or another and in one or another volume. Information in physics is a term that qualitatively generalizes the concepts "signal" and "message". If signals and messages can be quantified, then one can say that signals and messages are units of measurement of the volume of information. Different systems interpret the same message (signal) differently. For example, one long beep followed by two short beeps is, in Morse code terminology, the letter D; in the BIOS terminology of the company Award, it signals a video card malfunction.


The role of information in mathematics

In mathematics, information theory (the mathematical theory of communication) is a branch of applied mathematics that defines the concept of information and its properties and establishes limiting relations for data transmission systems. The main branches of information theory are source coding (compression coding) and channel (noise-resistant) coding. Mathematics is more than a scientific discipline: it creates a unified language for all of science.

The objects of mathematical research are abstract objects: number, function, vector, set, and others. Moreover, most of them are introduced axiomatically, i.e., without any connection to other concepts and without any definition.


Information is not among the objects of mathematical research, yet the word "information" is used in mathematical terms: self-information and mutual information belong to the abstract (mathematical) part of information theory. In mathematical theory, however, the concept of "information" is tied exclusively to abstract objects, random variables, whereas in modern information theory the concept is treated much more broadly, as a property of material objects. The connection between these two uses of the same term is undeniable. It was the mathematical apparatus of random variables that Claude Shannon, the author of information theory, employed. By the term "information" he himself meant something fundamental (irreducible). Shannon's theory intuitively assumes that information has content: information reduces overall uncertainty and information entropy, and the amount of information is measurable. At the same time, he warned researchers against mechanically transferring concepts from his theory into other fields of science.
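
For the two mathematical terms mentioned here, a small Python sketch may help (the joint distribution is an illustrative assumption, not from the text): it computes the self-information of a single value and the mutual information between two random variables.

```python
import math

# An illustrative joint distribution p(x, y) of two binary random variables.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
p_x = {x: sum(p for (xx, _), p in p_xy.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yy), p in p_xy.items() if yy == y) for y in (0, 1)}

# Self-information of a single value: I(x) = -log2 p(x).
print(-math.log2(p_x[0]))  # 1.0 bit, since p_x[0] = 1/2

# Mutual information: I(X;Y) = sum over x,y of p(x,y) log2(p(x,y)/(p(x)p(y))).
mi = sum(p * math.log2(p / (p_x[x] * p_y[y])) for (x, y), p in p_xy.items())
print(round(mi, 3))        # ~0.278 bits: observing Y tells us about X
```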

"The search for ways to apply information theory in other fields of science does not come down to a trivial transfer of terms from one field of science to another. This search is carried out in a long process of putting forward new hypotheses and testing them experimentally." - Claude Shannon


The role of information in cybernetics

The founder of cybernetics, Norbert Wiener, spoke about information like this:

"Information is not matter or energy; information is information." But the basic definition of information, which he gave in several of his books, is this: information is a designation of the content we receive from the external world in the process of our adaptation to it and of the adaptation of our senses.

Information is the basic concept of cybernetics, just as economic information is the basic concept of economic cybernetics.

There are many definitions of this term, and they are complex and contradictory. The reason, evidently, is that information as a phenomenon is studied by different sciences, and cybernetics is only the youngest of them. Information is the subject of study of such sciences as management science, mathematics, genetics, the theory of mass media (print, radio, television), and computer science, which deals with problems of scientific and technical information, and so on. Lately, philosophers too have shown great interest in the problems of information: they tend to regard information as one of the main universal properties of matter, connected with the concept of reflection. In all interpretations, the concept of information presupposes the existence of two objects: a source of information and an acquirer (recipient) of information. Information is conveyed from one to the other by means of signals which, generally speaking, may have no physical connection with its meaning: the connection is established by agreement. For example, the tolling of the veche bell meant that one had to gather in the square; but to someone who did not know of this arrangement, it communicated no information at all.

In the situation with the veche bell, a person who is party to the agreement about the meaning of the signal knows that at the given moment there are two alternatives: the veche assembly will take place or it will not. Or, in the language of information theory, an uncertain event (the veche) has two outcomes. The received signal reduces the uncertainty: the person now knows that the event (the veche) has only one outcome, it will take place. However, if it were known in advance that the assembly would take place at such-and-such an hour, the bell would announce nothing new. It follows that the less probable (i.e., the more unexpected) a message, the more information it contains; and conversely, the greater the probability of the outcome before the event occurs, the less information the message carries. Reasoning of roughly this kind led, in the 1940s, to the emergence of the statistical, or "classical", theory of information, which defines the concept of information through the measure of the reduction in the uncertainty of knowledge about the occurrence of an event (this measure was called entropy). At the origins of this science stood N. Wiener, C. Shannon, and the Soviet scientists A. N. Kolmogorov, V. A. Kotelnikov, and others. They managed to derive mathematical laws for measuring the amount of information, and hence such concepts as channel capacity, the storage capacity of information devices, and so on, which served as a powerful stimulus for the development of cybernetics as a science and of electronic computing technology as a practical application of its achievements.

As for determining the value and usefulness of information for the recipient, much here is still unresolved and unclear. If we proceed from the needs of economic management and, therefore, of economic cybernetics, then information can be defined as all the knowledge, facts, and messages that help solve a particular management problem (that is, reduce the uncertainty of its outcomes). Then some possibilities open up for evaluating information: it is the more useful, the more valuable, the sooner or at the lower cost it leads to the solution of the problem. The concept of information is close to the concept of data. There is, however, a difference between them: data are signals from which the information still has to be extracted, and data processing is the process of bringing them into a form suitable for this.

The process of their transfer from source to recipient and perception as information can be considered as passing through three filters:

Physical, or statistical (purely quantitative limitation on channel capacity, regardless of the data content, i.e. from the point of view of syntactics);

Semantic (selection of those data that can be understood by the recipient, i.e. correspond to the thesaurus of his knowledge);

Pragmatic (selection, among the understood information, of that which is useful for solving the given problem).

This is clearly shown in the diagram taken from E. G. Yasin’s book on economic information. Accordingly, three aspects of the study of linguistic problems are distinguished—syntactic, semantic, and pragmatic.

By content, information is divided into socio-political, socio-economic (including economic information), scientific-technical, and so on. In general there are many classifications of information, built on various criteria. Because the concepts are so close, classifications of data are usually constructed in the same way. For example, information is divided into static (constant) and dynamic (variable), and data into constant and variable. Another division is into primary, derivative, and output information (data are classified in the same way). A third division is into controlling and informing information. A fourth: redundant, useful, and false. A fifth: complete (continuous) and selective. Wiener's idea quoted above gives a direct indication of the objectivity of information, i.e., of its existence in nature independently of human consciousness (perception).


Modern cybernetics defines objective information as the objective property of material objects and phenomena to generate a variety of states that, through the fundamental interactions of matter, are transmitted from one object (process) to another and are imprinted in its structure. A material system in cybernetics is considered as a set of objects that themselves can be in different states, but the state of each of them is determined by the states of other objects of the system.


In nature, many states of a system represent information; the states themselves represent the primary code, or source code. Thus, every material system is a source of information. Cybernetics defines subjective (semantic) information as the meaning or content of a message.

The role of information in computer science

The subject of this science is data: the methods of their creation, storage, processing, and transmission. Content (also "site content") is a term meaning all the types of information (text as well as multimedia: images, audio, video) that make up the content visualized for the visitor of a website. It serves to separate the notion of the information that makes up the internal structure of a page or site (its code) from what will ultimately be displayed on the screen.

The word “information” comes from the Latin word informatio, which means information, explanation, introduction. The concept of “information” is basic in a computer science course, but it is impossible to define it through other, more “simple” concepts.

The following approaches to determining information can be distinguished:

Traditional (everyday) - used in computer science: information is knowledge, facts, and messages about the state of affairs that a person perceives from the outside world with the sense organs (sight, hearing, taste, smell, touch).

Probabilistic - used in the theory of information: information is information about objects and phenomena of the environment, their parameters, properties and state, which reduce the degree of uncertainty and incompleteness of knowledge about them.

Information is stored, transmitted and processed in symbolic (sign) form. The same information can be presented in different forms:

In sign-based written form, consisting of various signs, among which one distinguishes symbolic signs in the form of text, numbers, and special characters; graphic form; tabular form, etc.;

In the form of gestures or signals;

Oral verbal form (conversation).

Information is presented by means of languages as sign systems, which are built on the basis of a particular alphabet and have rules for performing operations on signs. A language is a particular sign system for presenting information. There exist:

Natural languages: spoken languages in oral and written form. In some cases spoken language can be replaced by the language of facial expressions and gestures or by the language of special signs (for example, road signs);

Formal languages: special languages for various areas of human activity, characterized by a strictly fixed alphabet and stricter rules of grammar and syntax. These include the language of music (notes), the language of mathematics (numbers, mathematical symbols), number systems, programming languages, and so on. The basis of any language is its alphabet, a set of symbols or signs. The total number of symbols in the alphabet is usually called the power of the alphabet.
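
This "alphabetic" measure is easy to state as a sketch in Python (the alphabet and message below are illustrative assumptions): an alphabet of power N carries log2(N) bits per symbol, so a message of K symbols carries K * log2(N) bits.

```python
import math

def message_volume(message, alphabet):
    """K symbols from an alphabet of power N carry K * log2(N) bits."""
    n = len(set(alphabet))  # the power (number of symbols) of the alphabet
    return len(message) * math.log2(n)

# A 4-symbol alphabet: each symbol carries log2(4) = 2 bits of volume.
print(message_volume("ACGTACGTACGT", "ACGT"))  # 12 symbols -> 24.0 bits
```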

An information medium is an environment or physical body for the transmission, storage, and reproduction of information. (Examples include electrical, light, thermal, sound, and radio signals, magnetic and laser disks, printed publications, photographs, etc.)

Information processes are processes connected with receiving, storing, processing, and transmitting information (i.e., actions performed with information). That is, they are processes in the course of which the content of information or the form of its presentation changes.

To support an information process, a source of information, a communication channel, and a recipient of information are required. The source transmits (sends) information, and the receiver receives (perceives) it. The transmitted information travels from source to receiver by means of a signal (code). Changes in the signal make it possible to obtain information.

Being an object of transformation and use, information is characterized by the following properties:

Syntax is a property that determines the way information is presented on a medium (in a signal). Thus, this information is presented on electronic media using a specific font. Here you can also consider such information presentation parameters as font style and color, its size, line spacing, etc. The selection of the necessary parameters as syntactic properties is obviously determined by the intended method of transformation. For example, for a person with poor vision, the size and color of the font is important. If you plan to enter this text into a computer via a scanner, the paper size is important;

Semantics is a property that determines the meaning of information as the correspondence of the signal to the real world. Thus, the semantics of the “computer science” signal lies in the definition given earlier. Semantics can be considered as some agreement, known to the acquirer of information, about what each signal means (the so-called interpretation rule). For example, it is the semantics of signals that a novice motorist studies, studying the rules of the road, learning road signs (in this case, the signs themselves are the signals). The semantics of words (signals) is learned by a student of a foreign language. We can say that the point of teaching computer science is to study the semantics of various signals - the essence of the key concepts of this discipline;

Pragmatics is a property that determines the influence of information on the behavior of the acquirer. Thus, the pragmatics of the information received by the reader of this textbook is, at the very least, successful passing of the computer science exam. I would like to believe that the pragmatics of this work will not be limited to this, and it will serve for the further education and professional activities of the reader.


It should be noted that signals differing in syntax can have the same semantics. For example, the signals "computer" and "electronic computing machine" denote the same electronic device for converting information; in this case one usually speaks of signal synonymy. On the other hand, one signal (i.e., information with one syntactic property) can have different pragmatics for different consumers and different semantics. Thus, the road sign known as a "brick", with its very specific semantics ("no entry"), means a ban on entry for a motorist but has no effect on a pedestrian. At the same time, the signal "key" can have different semantics: a treble clef, a spring of water, a key for opening a lock, or a key used in computer science to encode a signal in order to protect it from unauthorized access (in this case one speaks of signal homonymy). There are also signal antonyms with opposite semantics, for example "cold" and "hot", "fast" and "slow", and so on.

The subject of study of the science of computer science is data: methods of their creation, storage, processing and transmission. And the information itself recorded in the data, its meaningful meaning, is of interest to users of information systems who are specialists in various sciences and fields of activity: a physician is interested in medical information, a geologist is interested in geological information, a businessman is interested in commercial information, etc. (In particular, a computer scientist is interested in information on working with data).

Semiotics - science of information

Information cannot be imagined without its receipt, processing, transmission, etc., that is, outside the framework of information exchange. All acts of information exchange are carried out through symbols or signs, with the help of which one system influences another. Therefore, the main science that studies information is semiotics - the science of signs and sign systems in nature and society (theory of signs). In each act of information exchange one can find three “participants”, three elements: a sign, an object that it designates, and a recipient (user) of the sign.

Depending on which elements' relations are considered, semiotics is divided into three sections: syntactics, semantics, and pragmatics. Syntactics studies signs and the relations between them, abstracting from the content of the sign and from its practical significance for the recipient. Semantics studies the relations between signs and the objects they denote, abstracting from the recipient of the signs and from their value for him: it is clear that studying the regularities of the semantic representation of objects in signs is impossible without taking into account and using the general regularities of the construction of any sign systems, studied by syntactics. Pragmatics studies the relations between signs and their users. Within pragmatics one studies all the factors that distinguish one act of information exchange from another, all questions of the practical results of using information, and of its value for the recipient.

In doing so, many aspects of the relations of signs to one another and to the objects they denote are inevitably touched upon. Thus the three sections of semiotics correspond to three levels of abstraction from the features of specific acts of information exchange. The study of information in all its richness corresponds to the pragmatic level. Abstracting from the recipient of the information, excluding him from consideration, we move to studying it at the semantic level. Abstracting further from the content of the signs, we carry the analysis of information to the level of syntactics. This interpenetration of the main sections of semiotics, tied to different levels of abstraction, can be represented by the diagram "The three sections of semiotics and their interrelation". Accordingly, information is measured in three aspects: syntactic, semantic, and pragmatic. The need for such different dimensions of information, as will be shown below, is dictated by the practice of designing and organizing the operation of information systems. Consider a typical production situation.

At the end of a shift, the site planner prepares data on the fulfillment of the production schedule. These data go to the enterprise's information and computing center (ICC), where they are processed and issued to managers as reports on the current state of production. On the basis of the data received, the shop manager decides whether to change the production plan for the next planning period or to take other organizational measures. Obviously, for the shop manager the amount of information contained in the summary depends on the magnitude of the economic effect obtained from using it in decision-making, on how useful the received information was. For the site planner, the amount of information in the same message is determined by the accuracy of its correspondence to the actual state of affairs on the site and by the degree of unexpectedness of the reported facts: the more unexpected they are, the sooner management must be informed of them, and the more information the message contains. For the ICC workers, what matters above all is the number of characters, the length of the message carrying the information, since it is this that determines the loading time of the computing equipment and communication channels; the usefulness of the information and the quantitative measure of its semantic value are of practically no interest to them.

Naturally, when organizing a production management system and building models of decision selection, we will use the usefulness of information as the measure of a message's informativeness. When building a system of accounting and reporting that keeps management informed of the progress of the production process, the novelty of the received information should be taken as the measure of its amount. Organizing the same procedures of mechanical information processing requires measuring the volume of messages as the number of processed characters. These three essentially different approaches to measuring information do not contradict or exclude one another. On the contrary, by measuring information on different scales they allow a fuller and more comprehensive assessment of the information content of each message and a more effective organization of the production management system. In the apt words of Prof. N. E. Kobrinsky, when it comes to the rational organization of information flows, the quantity, novelty, and usefulness of information are as interconnected as the quantity, quality, and cost of products in production.

Information in the material world

Information is one of the general concepts associated with matter. Information exists in any material object in the form of the variety of its states, and it is transferred from object to object in the process of their interaction. The existence of information as an objective property of matter follows logically from the known fundamental properties of matter: structure, continuous change (motion), and the interaction of material objects.

The structure of matter manifests itself as the internal dismemberment of integrity, the regular order of connection of elements within the whole. In other words, any material object, from a subatomic particle to the Metauniverse as a whole, is a system of interconnected subsystems. Owing to continuous motion, understood broadly as movement in space and development in time, material objects change their states. The states of objects also change in interactions with other objects. The set of states of a material system and of all its subsystems represents information about the system.

Strictly speaking, owing to the uncertainty, infinity, and structured character of matter, the amount of objective information in any material object is infinite. This information is called complete. However, one can single out structural levels with finite sets of states. Information existing at a structural level with a finite number of states is called partial. For partial information the concept of the amount of information is meaningful.

From the above presentation, a unit of measurement for the amount of information can be chosen logically and simply. Imagine a system that can be in only two equally probable states. Assign the code "1" to one of them and "0" to the other. This is the minimum amount of information the system can contain. It is the unit of measurement of information and is called the bit. There exist other methods and units for measuring the amount of information that are harder to define.

Depending on the material form of the carrier, information is of two main types: analog and discrete. Analog information changes continuously in time and takes values from a continuum. Discrete information changes at certain moments in time and takes values from a certain set of values. Any material object or process is a primary source of information. All its possible states make up its source code. The instantaneous value of a state is represented as a symbol ("letter") of this code. For information to be transmitted from one object to another, as to a receiver, there must be some intermediate material carrier that interacts with the source. Such carriers in nature are, as a rule, rapidly propagating processes of wave structure: cosmic, gamma, and X-ray radiation, electromagnetic and sound waves, and the potentials (and perhaps waves not yet discovered) of the gravitational field. When electromagnetic radiation interacts with an object, its spectrum changes as a result of absorption or reflection, i.e., the intensities of certain wavelengths change. The harmonics of sound vibrations also change in interactions with objects. Information is transmitted by mechanical interaction as well, but mechanical interaction, as a rule, leads to large changes in the structure of objects (up to their destruction), and the information is greatly distorted. Distortion of information in the course of its transmission is called disinformation.
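
A minimal Python sketch of the analog/discrete distinction (the signal, the sampling instants, and the four quantization levels are all illustrative assumptions): a continuous signal is sampled at fixed moments and each sample is mapped to one of a finite set of states.

```python
import math

def quantize(value, levels=4, lo=-1.0, hi=1.0):
    """Map a continuous value in [lo, hi] to one of `levels` discrete codes."""
    step = (hi - lo) / levels
    return min(int((value - lo) / step), levels - 1)  # clamp the top edge

analog = lambda t: math.sin(2 * math.pi * t)  # continuous in time and value
samples = [quantize(analog(t / 8)) for t in range(8)]
print(samples)  # discrete information: a finite sequence of codes from {0..3}
```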

The transfer of source information into the structure of a carrier is called encoding. In this process the source code is converted into the carrier code. A carrier with the source code transferred to it in the form of a carrier code is called a signal. A signal receiver has its own set of possible states, called the receiver code. A signal, interacting with a receiving object, changes the receiver's state. The process of converting the signal code into the receiver code is called decoding. The transfer of information from a source to a receiver can be regarded as an information interaction. Information interaction differs fundamentally from other interactions. In all other interactions of material objects an exchange of matter and (or) energy takes place: one of the objects loses matter or energy and the other gains it. This property of interactions is called symmetry. In an information interaction the receiver acquires information while the source loses nothing; information interaction is asymmetrical. Objective information itself is not material: it is a property of matter, like structure or motion, and it exists on material carriers in the form of its codes.

Information in living nature

Living nature is complex and diverse. Its sources and receivers of information are living organisms and their cells. An organism has a number of properties that distinguish it from inanimate material objects.

The basic ones are:

Continuous exchange of matter, energy and information with the environment;

Irritability, the body’s ability to perceive and process information about changes in the environment and internal environment of the body;

Excitability, the ability to respond to stimuli;

Self-organization, manifested as changes in the body to adapt to environmental conditions.

An organism, considered as a system, has a hierarchical structure. Relative to the organism itself, this structure is divided into internal levels: the molecular, the cellular, the organ level, and finally the organism itself. But the organism also interacts with supra-organismal living systems, whose levels are the population, the ecosystem, and living nature as a whole (the biosphere). Flows not only of matter and energy but also of information circulate among all these levels. Information interactions in living nature occur in the same way as in inanimate nature. At the same time, in the course of evolution living nature has created a wide variety of sources, carriers, and receivers of information.

The reaction to the influences of the external world is manifested in all organisms, since it is caused by irritability. In higher organisms, adaptation to the external environment is a complex activity, effective only with sufficiently complete and timely information about the environment. The receivers of information from the external environment are the sense organs: vision, hearing, smell, taste, touch and the vestibular apparatus. In the internal structure of organisms there are numerous internal receptors connected with the nervous system. The nervous system consists of neurons, whose processes (axons and dendrites) are analogous to information transmission channels. The main organs that store and process information in vertebrates are the spinal cord and the brain. In accordance with the characteristics of the sense organs, the information perceived by an organism can be classified as visual, auditory, gustatory, olfactory and tactile.

When a signal reaches the retina of the human eye, it excites the cells that make up the retina in a particular way. Nerve impulses from the cells are transmitted through axons to the brain. The brain remembers this sensation as a certain combination of states of its constituent neurons. (This example is continued in the section "Information in human society".) By accumulating information, the brain creates, in its own structure, a connected information model of the surrounding world. In living nature, an important characteristic of information for the organism receiving it is its availability. The amount of information that the human nervous system is capable of delivering to the brain when reading text is approximately 1 bit per 1/16 of a second.

The study of organisms is complicated by their complexity. The abstraction of structure as a mathematical set, acceptable for inanimate objects, is hardly acceptable for a living organism: to create a more or less adequate abstract model of an organism, all the hierarchical levels of its structure must be taken into account. It is therefore difficult to introduce a measure of the amount of information, and very difficult to determine the connections between the components of the structure. Even if it is known which organ is the source of information, what is the signal and what is the receiver?

Before the advent of computers, biology, which deals with the study of living organisms, used only qualitative, i.e. descriptive, models. In a qualitative model it is practically impossible to take into account the information connections between the components of the structure. Electronic computing technology has made it possible to apply new methods in biological research, in particular the method of machine modeling, which involves a mathematical description of known phenomena and processes occurring in the body, the addition of hypotheses about some unknown processes, and the calculation of possible patterns of behavior of the organism. The resulting variants are compared with the actual behavior of the organism, which makes it possible to determine the truth or falsity of the hypotheses put forward. Such models can also take information interaction into account. The information processes that ensure the existence of life itself are extremely complex. And although it is intuitively clear that this property is directly related to the formation, storage and transmission of complete information about the structure of the organism, an abstract description of this phenomenon seemed impossible for some time. Nevertheless, the information processes that ensure the existence of this property have been partially revealed through the deciphering of the genetic code and the reading of the genomes of various organisms.
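As a toy illustration of the machine-modeling method just described: describe a known process mathematically, add a hypothesis (here, the growth rate r of a logistic model), compute the predicted behavior, and compare it with observations. The model and the "observed" numbers below are invented for illustration, not real biological data:

    def simulate_population(r, p0, steps):
        """Logistic growth: p_{t+1} = p_t + r * p_t * (1 - p_t)."""
        p = p0
        trajectory = [p]
        for _ in range(steps):
            p = p + r * p * (1 - p)
            trajectory.append(p)
        return trajectory

    observed = [0.10, 0.17, 0.28, 0.43, 0.60]   # made-up field data
    for r_hypothesis in (0.4, 0.8, 1.2):
        predicted = simulate_population(r_hypothesis, observed[0], len(observed) - 1)
        error = sum((a - b) ** 2 for a, b in zip(predicted, observed))
        print(f"r = {r_hypothesis}: squared error = {error:.4f}")

    # The hypothesis with the smallest error agrees best with the organism's
    # actual behavior; the others are rejected.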

Information in human society

The development of matter in the process of motion is directed toward the complication of the structure of material objects. One of the most complex structures is the human brain. So far this is the only structure known to us that possesses a property man himself calls consciousness. Speaking about information, we, as thinking beings, a priori assume that information, besides being present in the form of the signals we receive, also has meaning. By forming in his mind a model of the surrounding world as an interconnected set of models of its objects and processes, a person operates not with information but with semantic concepts. Meaning is the essence of a phenomenon that does not coincide with the phenomenon itself and connects it with the broader context of reality. The word "semantic" itself indicates that the semantic content of information can be formed only by thinking receivers of information. In human society it is not information itself that is of decisive importance, but its semantic content.

Example (continued). Having experienced such a sensation, a person assigns the concept "tomato" to the object, and the concept "red color" to its state. In addition, his consciousness fixes the connection "tomato" - "red". This is the meaning of the received signal. (The example is continued below in this section.) The brain's ability to create semantic concepts and connections between them is the basis of consciousness. Consciousness can be considered a self-developing semantic model of the surrounding world. Meaning is not information. Information exists only on a material medium, whereas human consciousness is considered immaterial. Meaning exists in the human mind in the form of words, images and sensations. A person can pronounce words not only out loud, but also "to himself"; he can also create (or recall) images and sensations "in his mind". However, he can recover the information corresponding to this meaning by speaking or writing the words.

Example (continued). If the words "tomato" and "red" are the meaning of the concepts, then where is the information? The information is contained in the brain, in the form of certain states of its neurons. It is also contained in the printed text consisting of these words: when the letters are encoded with an eight-bit binary code, the phrase (15 characters in the original Russian) amounts to 120 bits. If the words are spoken out loud, there will be much more information, but the meaning will remain the same. A visual image carries the greatest amount of information, which is reflected even in folklore: "it is better to see once than to hear a hundred times." Information restored in this way is called semantic information, since it encodes the meaning of some primary information (its semantics). Having heard (or seen) a phrase spoken (or written) in a language he does not know, a person receives information but cannot determine its meaning. Therefore, to transmit the semantic content of information, some agreements between the source and the receiver about the semantic content of the signals, i.e. the words, are necessary. Such agreements can be reached through communication. Communication is one of the most important conditions for the existence of human society.

In the modern world, information is one of the most important resources and, at the same time, one of the driving forces of the development of human society. Information processes occurring in the material world, in living nature and in human society are studied (or at least taken into account) by all scientific disciplines, from philosophy to marketing. The increasing complexity of scientific research problems has made it necessary to attract large teams of scientists of different specialties to solve them, so almost all of the theories discussed below are interdisciplinary. Historically, the study of information itself has been pursued by two complex branches of science: cybernetics and computer science.

Modern cybernetics is a multidisciplinary branch of science that studies highly complex systems, such as:

Human society (social cybernetics);

Economics (economic cybernetics);

Living organism (biological cybernetics);

The human brain and its function, consciousness (artificial intelligence).

Computer science, which formed as a science in the middle of the last century, separated from cybernetics and is engaged in research in the field of methods of obtaining, storing, transmitting and processing semantic information. Both of these branches use several underlying scientific theories, among them information theory and its sections: coding theory, the theory of algorithms and automata theory. Research into the semantic content of information is based on a complex of scientific theories under the general name of semiotics. Information theory is a complex, mainly mathematical theory that includes the description and assessment of methods of retrieving, transmitting, storing and classifying information. It considers information carriers as elements of an abstract (mathematical) set and the interactions between carriers as a way of arranging the elements of this set. This approach makes it possible to formally describe the information code, that is, to define an abstract code and to study it by mathematical methods. For these studies it uses the methods of probability theory, mathematical statistics, linear algebra, game theory and other mathematical theories.

The foundations of this theory were laid by the American scientist R. Hartley in 1928, who determined the measure of the amount of information for certain communication problems. Later the theory was significantly developed by the American scientist C. Shannon and by the Soviet scientists A. N. Kolmogorov, V. M. Glushkov and others. Modern information theory includes sections such as coding theory, the theory of algorithms, the theory of digital automata (see below) and some others. There are also alternative information theories, for example the "qualitative information theory" proposed by the Polish scientist M. Mazur. Every person is familiar with the concept of an algorithm without even knowing it. Here is an example of an informal algorithm: "Cut the tomatoes into circles or slices. Place chopped onion in them, pour in vegetable oil, then sprinkle with finely chopped capsicum and stir. Before eating, sprinkle with salt, place in a salad bowl and garnish with parsley." (Tomato salad.)

The first rules in the history of mankind for solving arithmetic problems were developed by the famous medieval scientist al-Khwarizmi in the 9th century AD. In his honor, formalized rules for achieving a goal are called algorithms. The subject of the theory of algorithms is finding methods for constructing and evaluating effective (including universal) computational and control algorithms for information processing. To substantiate such methods, the theory of algorithms uses the mathematical apparatus of information theory. The modern scientific concept of algorithms as methods of information processing was introduced in the works of E. Post and A. Turing in the 1930s (the Turing machine). The Russian scientists A. Markov (the Markov normal algorithm) and A. Kolmogorov made great contributions to the development of the theory of algorithms. Automata theory is a branch of theoretical cybernetics that studies mathematical models of actually existing or fundamentally possible devices that process discrete information at discrete moments in time.

The concept of an automaton arose in the theory of algorithms. If there are universal algorithms for solving computational problems, there must also be devices (albeit abstract ones) for implementing such algorithms. In fact, the abstract Turing machine considered in the theory of algorithms is at the same time an informally defined automaton. The theoretical justification of the construction of such devices is the subject of automata theory, which uses the apparatus of mathematical theories: algebra, mathematical logic, combinatorial analysis, graph theory, probability theory, etc. Automata theory, together with the theory of algorithms, is the main theoretical basis for the creation of electronic computers and automated control systems. Semiotics is a complex of scientific theories studying the properties of sign systems. The most significant results have been achieved in the branch of semiotics called semantics, whose subject is the semantic content of information.

A sign system is a system of concrete or abstract objects (signs, words) with each of which a certain meaning is associated in a certain way. The theory shows that there can be two such correspondences. The first kind of correspondence directly determines the material object that the word denotes and is called the denotation (or, in some works, the nominatum). The second kind of correspondence determines the meaning of a sign (word) and is called the concept. Properties of these correspondences such as "meaning", "truth", "definability", "entailment" and "interpretation" are studied with the apparatus of mathematical logic and mathematical linguistics. The ideas of semantics, outlined by G. W. Leibniz and F. de Saussure, were formulated and developed by C. Peirce (1839-1914), C. Morris (b. 1901), R. Carnap (1891-1970) and others. The main achievement of the theory is the creation of an apparatus of semantic analysis that allows the meaning of a text in a natural language to be represented as a record in some formalized semantic language. Semantic analysis is the basis for creating devices (programs) for machine translation from one natural language to another.

Information is stored by transferring it to material media. Semantic information recorded on a material storage medium is called a document. Humanity learned to store information long ago. The most ancient forms of information storage used the arrangement of objects: shells and stones on sand, knots on a rope. A significant development of these methods was writing, the graphic representation of symbols on stone, clay, papyrus and paper. The invention of printing was of great importance in the development of this direction. Over its history, humanity has accumulated a huge amount of information in libraries, archives, periodicals and other written documents.

At present, the storage of information in the form of sequences of binary characters has acquired particular importance. To implement these methods, a variety of storage devices are used; they are the central link of information storage systems. In addition to them, such systems use means of searching for information (search engines), means of obtaining information (information and reference systems) and means of displaying information (output devices). Organized according to the purpose of the information, such information systems form databases, data banks and knowledge bases.

The transfer of semantic information is the process of its spatial transfer from the source to the recipient (addressee). Man learned to transmit and receive information even earlier than to store it. Speech is the method of transmission our distant ancestors used in direct contact (conversation), and we still use it now. To transmit information over long distances, much more complex information processes must be used. To carry out such a process, the information must be formatted (presented) in some way. For presenting information, various sign systems are used: sets of predetermined semantic symbols, such as objects, pictures, and written or printed words of a natural language. Semantic information about some object, phenomenon or process presented with their help is called a message.

Obviously, in order to transmit a message over a distance, information must be transferred to some kind of mobile medium. Carriers can move through space using vehicles, as happens with letters sent by mail. This method ensures complete reliability of the transmission of information, since the addressee receives the original message, but requires significant time for transmission. Since the middle of the 19th century, methods of transmitting information have become widespread using a naturally propagating information carrier - electromagnetic vibrations (electrical vibrations, radio waves, light). Implementation of these methods requires:

Preliminary transfer of information contained in a message to a medium - encoding;

Ensuring the transmission of the signal thus received to the recipient via a special communication channel;

Reverse conversion of the signal code into a message code - decoding.

The use of electromagnetic media makes the delivery of a message to the addressee almost instantaneous, but it requires additional measures to ensure the quality (reliability and accuracy) of the transmitted information, since real communication channels are subject to natural and artificial interference. Devices that implement the data transfer process form communication systems. Depending on the method of presenting information, communication systems can be divided into sign systems (telegraph, telefax), sound systems (telephone), video and combined systems (television). The most developed communication system of our time is the Internet.

Data processing

Since information is not material, its processing consists of various transformations. Processing includes any transfer of information from one medium to another. Information intended for processing is called data. The main kind of processing of the primary information received by various devices is its transformation into a form that can be perceived by the human senses. Thus, photographs of space taken in X-rays are converted into ordinary color photographs using special spectrum converters and photographic materials. Night vision devices convert the image obtained in infrared (thermal) rays into an image in the visible range. For some communication and control tasks, a conversion of analog information is necessary; for this purpose, analog-to-digital and digital-to-analog signal converters are used.

The most important kind of processing of semantic information is determining the meaning (content) of a certain message. Unlike primary information, semantic information has no statistical characteristics, that is, no quantitative measure: either there is meaning or there is not, and how much of it there is, if any, cannot be established. The meaning contained in a message is described in an artificial language that reflects the semantic connections between the words of the source text. A dictionary of such a language, called a thesaurus, is located in the message receiver. The meaning of the words and phrases of a message is determined by assigning them to certain groups of words or phrases whose meaning has already been established. The thesaurus thus makes it possible to establish the meaning of a message and is, at the same time, replenished with new semantic concepts. This kind of information processing is used in information retrieval systems and machine translation systems.
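A minimal sketch of the thesaurus idea, with a deliberately tiny, made-up dictionary standing in for a real thesaurus:

    # Thesaurus-based processing: the meaning of a word is determined by
    # assigning it to a group whose meaning is already established.
    thesaurus = {
        "tomato": "vegetables",
        "onion": "vegetables",
        "red": "colors",
        "green": "colors",
    }

    def interpret(message):
        """Assign each word of the message to a group with established meaning."""
        meanings = {}
        for word in message.lower().split():
            if word not in thesaurus:
                thesaurus[word] = "new concept"   # the thesaurus is replenished
            meanings[word] = thesaurus[word]
        return meanings

    print(interpret("red tomato"))   # {'red': 'colors', 'tomato': 'vegetables'}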

One widespread kind of information processing is the solution of computational and automatic control problems by computers. Information processing is always carried out for some purpose, and to achieve it, the order of actions on the information leading to the given goal must be known. Such a procedure is called an algorithm. Besides the algorithm itself, a device that implements the algorithm is also needed; in scientific theories such a device is called an automaton. Note the most important feature of information: owing to the asymmetry of information interaction, new information appears when information is processed, while the original information is not lost.

Analog and digital information

Sound consists of wave vibrations in some medium, for example in air. When a person speaks, the vibrations of the vocal cords are converted into wave vibrations of the air. If we consider sound not as a wave but as vibrations at a single point, then these vibrations can be represented as air pressure changing over time. With a microphone, the pressure changes can be detected and converted into an electrical voltage: the air pressure is converted into fluctuations of electrical voltage.

Such a transformation can occur according to various laws, most often the transformation occurs according to a linear law. For example, like this:

U(t)=K(P(t)-P_0),

where U(t) is the electrical voltage, P(t) is the air pressure, P_0 is the average air pressure, and K is the conversion factor.

Both electrical voltage and air pressure are continuous functions of time. The functions U(t) and P(t) are information about the vibrations of the vocal cords. These functions are continuous, and such information is called analog. Music is a special case of sound, and it too can be represented as some function of time; this is an analog representation of music. But music is also written down in the form of notes. Each note has a duration that is a multiple of a predetermined unit duration, and a pitch (do, re, mi, fa, sol, etc.). If these data are converted into numbers, we obtain a digital representation of the music.

Human speech is also a special case of sound. It too can be represented in analog form. But just as music can be broken down into notes, speech can be broken down into letters. If each letter is given its own set of numbers, we obtain a digital representation of speech. The difference between analog and digital information is that analog information is continuous, while digital information is discrete. The transformation of information from one kind to another is named differently depending on the kind of transformation: simple transformations are called "conversion", such as digital-to-analog and analog-to-digital conversion; complex transformations are called "coding", for example delta coding and entropy coding; transformations between characteristics such as amplitude, frequency or phase are called "modulation", for example amplitude-frequency modulation or pulse-width modulation.
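Here is a sketch of analog-to-digital conversion under simplifying assumptions (the "analog" signal is a pure 440 Hz tone like u(t) above, sampled 8000 times per second and quantized to 8-bit values); real converters differ in many details:

    import math

    FREQ_HZ = 440.0          # pitch of the tone
    SAMPLE_RATE = 8000       # samples per second (discretization in time)
    LEVELS = 256             # 8-bit quantization (discretization in amplitude)

    def u(t):
        """Continuous signal: voltage as a function of time, in [-1, 1]."""
        return math.sin(2 * math.pi * FREQ_HZ * t)

    def sample_and_quantize(duration_s):
        digital = []
        for n in range(int(duration_s * SAMPLE_RATE)):
            t = n / SAMPLE_RATE                           # sampling instant
            level = round((u(t) + 1) / 2 * (LEVELS - 1))  # map [-1, 1] to 0..255
            digital.append(level)
        return digital

    samples = sample_and_quantize(0.001)   # the first millisecond: 8 samples
    print(samples)                         # eight 8-bit values, e.g. [128, 171, ...]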

Typically, analog transformations are quite simple and are easily handled by various devices invented by man. A tape recorder converts the magnetization of the tape into sound, a voice recorder converts sound into the magnetization of the tape, a video camera converts light into magnetization, an oscilloscope converts electrical voltage or current into an image, and so on. Converting analog information into digital form is much more difficult. Some transformations a machine cannot perform at all, or only with great difficulty: for example, converting speech into text, or converting a recording of a concert into sheet music; even with an inherently digital representation, text on paper is very difficult for a machine to convert into the same text in computer memory.

Why then use a digital representation of information if it is so complex? The main advantage of digital information over analog information is noise immunity. That is, in the process of copying, digital information is copied as it is, and it can be copied almost an infinite number of times, whereas analog information becomes noisy during copying, and its quality deteriorates. Typically, analog information can be copied no more than three times. If you have a dual-cassette tape recorder, you can perform the following experiment: try re-recording the same song several times from cassette to cassette; after just a few such re-recordings you will notice how much the recording quality has deteriorated. The information on a cassette is stored in analog form. Music in mp3 format, by contrast, can be rewritten as many times as you like, and its quality does not deteriorate: the information in an mp3 file is stored digitally.
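The cassette experiment can be imitated numerically. In this sketch, each pass through the "channel" adds small random noise; the digital copy snaps every level back to 0 or 1 after each pass, while the analog copy accumulates the noise (the noise model itself is, of course, an assumption):

    import random

    def copy_analog(signal):
        """Each copy adds noise that is never removed."""
        return [s + random.gauss(0, 0.05) for s in signal]

    def restore_digital(signal):
        """A digital copy snaps each level back to the nearest of 0 and 1."""
        return [1.0 if s > 0.5 else 0.0 for s in signal]

    original = [0.0, 1.0, 1.0, 0.0, 1.0]
    analog, digital = original[:], original[:]
    for _ in range(10):                                  # ten generations of copies
        analog = copy_analog(analog)                     # noise accumulates
        digital = restore_digital(copy_analog(digital))  # same noisy channel, then restored

    print(analog)    # drifts further from the original with every generation
    print(digital)   # almost certainly still [0.0, 1.0, 1.0, 0.0, 1.0]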

Amount of information

A person, or some other receiver of information, resolves some uncertainty by receiving a portion of information. Let us take the same tree as an example. When we saw the tree, we resolved a number of uncertainties: we learned the height of the tree, its species, the density of its foliage, the color of its leaves and, if it was a fruit tree, we saw the fruits on it and how ripe they were. Before we looked at the tree, we did not know all this; after we looked at it, we resolved the uncertainty and received information.

If we go out into a meadow and look at it, we obtain a different kind of information: how big the meadow is, how tall the grass is, and what color the grass is. If a biologist goes to the same meadow, then, among other things, he will also be able to determine what varieties of grass grow there, what type of meadow it is, which flowers have bloomed and which are about to bloom, and whether the meadow is suitable for grazing cows. That is, he will receive more information than we do, because before he looked at the meadow he had more questions; the biologist resolves more uncertainties.

The more uncertainty is resolved in the process of obtaining information, the more information we receive. But this is a subjective measure of the amount of information, and we would like an objective one. There is a formula for calculating the amount of information: suppose an uncertainty has N possible outcomes, and each outcome has a certain probability; then the amount of information received can be calculated by the following formula, which Shannon proposed:

I = -(p_1 log_2 p_1 + p_2 log_2 p_2 + ... + p_N log_2 p_N), where

I - amount of information;

N - number of outcomes;

p_1, p_2, ..., p_N are the probabilities of the outcomes.
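In Python, the formula can be transcribed directly; the probability lists below are example values only:

    import math

    def amount_of_information(probabilities):
        """I = -(p_1 log2 p_1 + ... + p_N log2 p_N), in bits."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(amount_of_information([0.5, 0.5]))   # a fair coin: 1.0 bit
    print(amount_of_information([1 / 6] * 6))  # a fair die: ~2.585 bits
    p = 1e-10                                  # the Martian dinosaur (see below)
    print(amount_of_information([p, 1 - p]))   # a few billionths of a bit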

The amount of information is measured in bits; "bit" is an abbreviation of the English words BInary digiT, binary digit.

For equally probable events, the formula can be simplified:

I = log_2 N, where

I - amount of information;

N is the number of outcomes.

Let's take, for example, a coin and toss it onto the table. It will land either heads or tails: two equally probable events. After we have tossed the coin, we have received log_2 2 = 1 bit of information.

Let's try to find out how much information we get after rolling a die. A die has six faces: six equally probable events. We get log_2 6 ≈ 2.6; after we have rolled the die onto the table, we have received approximately 2.6 bits of information.

The odds of us seeing a Martian dinosaur when we leave the house are one in ten billion. How much information will we get about the Martian dinosaur once we leave home?

I = -((1/10^10) log_2 (1/10^10) + (1 - 1/10^10) log_2 (1 - 1/10^10)) ≈ 3.4 · 10^(-9) bits.

Let's say we tossed 8 coins. There are 2^8 possible outcomes of the toss, so after tossing the coins we receive log_2 (2^8) = 8 bits of information.
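The simplified formula I = log_2 N covers the coin, the die and the eight coins in one stroke:

    import math

    def bits_for(n_outcomes):
        """I = log2(N) for N equally probable outcomes."""
        return math.log2(n_outcomes)

    print(bits_for(2))       # a coin toss: 1.0 bit
    print(bits_for(6))       # a die roll: ~2.585 bits (the text rounds to 2.6)
    print(bits_for(2 ** 8))  # eight coins: 8.0 bits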

When we ask a question and are equally likely to receive a “yes” or “no” answer, then after answering the question we receive one bit of information.

It is remarkable that if we apply Shannon's formula to analog information, we get an infinite amount of information. For example, the voltage at a point of an electrical circuit can take any value from zero to one volt, all equally probable. The number of outcomes is infinite, and substituting this into the formula for equally probable events we get infinity: an infinite amount of information.

Now let us see how to encode "War and Peace" using just one mark on a metal rod. We encode all the letters and characters occurring in "War and Peace" with two-digit numbers; they should be enough for us. For example, we give the letter "A" the code "00", the letter "B" the code "01", and so on, also encoding the punctuation marks, Latin letters and digits. Recoding "War and Peace" with this code, we get a long number, for example 70123856383901874..., and put "0." in front of it (0.70123856383901874...). The result is a number between zero and one. Now we scratch a mark on a metal rod so that the ratio of the length of the left part of the rod to the length of the whole rod is exactly equal to our number. If we ever want to read "War and Peace", we simply measure the left part of the rod up to the mark and the length of the whole rod, divide one number by the other, obtain the number, and recode it back into letters ("00" into "A", "01" into "B", and so on).
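For a feel of the recoding step (not the physical rod), here is a sketch with a made-up 29-character code table; the resulting "fraction" is kept as a string precisely because an ordinary floating-point number could not hold more than a few characters' worth of digits:

    # The code table below (two digits per symbol) is an illustrative assumption.
    ALPHABET = "abcdefghijklmnopqrstuvwxyz ,."
    to_code = {ch: f"{i:02d}" for i, ch in enumerate(ALPHABET)}
    from_code = {code: ch for ch, code in to_code.items()}

    def text_to_fraction(text):
        """Recode the text with two-digit codes and read it as one number in (0, 1)."""
        return "0." + "".join(to_code[ch] for ch in text.lower())

    def fraction_to_text(fraction):
        digits = fraction[2:]                      # drop the leading "0."
        return "".join(from_code[digits[i:i + 2]] for i in range(0, len(digits), 2))

    mark = text_to_fraction("war and peace")
    print(mark)                    # 0.22001726001303261504000204
    print(fraction_to_text(mark))  # war and peace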

In reality, we cannot do this, because we cannot determine lengths with infinite accuracy. Engineering problems prevent us from increasing the accuracy of measurements, and quantum physics shows that beyond a certain limit quantum laws will interfere. Intuitively, we understand that the lower the measurement accuracy, the less information we receive, and the higher the measurement accuracy, the more information we receive. Shannon's formula is not suitable for measuring the amount of analog information, but there are other methods for this, discussed in information theory. In computer technology, a bit corresponds to a physical state of the information carrier: magnetized or not magnetized, hole or no hole, charged or not charged, reflects light or does not reflect light, high electrical potential or low electrical potential. One state is usually denoted by the digit 0 and the other by the digit 1. Any information can be encoded by a sequence of bits: text, image, sound, etc.

Along with a bit, a value called a byte is often used; it is usually equal to 8 bits. And if a bit allows you to choose one equally probable option from two possible ones, then a byte is 1 out of 256 (2^8). To measure the amount of information, it is also common to use larger units:

1 KB (one kilobyte) = 2^10 bytes = 1024 bytes

1 MB (one megabyte) = 2^10 KB = 1024 KB

1 GB (one gigabyte) = 2^10 MB = 1024 MB

In reality, the SI prefixes kilo-, mega-, giga- should be used for the factors 10^3, 10^6 and 10^9, respectively, but historically there has been a practice of using factors with powers of two.
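A small helper can translate byte counts into the binary units above; the function follows this text's KB/MB/GB usage (the IEC names KiB, MiB, GiB denote the same power-of-two factors unambiguously):

    def format_size(n_bytes):
        """Express a byte count using the 1024-based units from the text."""
        for unit in ("bytes", "KB", "MB", "GB"):
            if n_bytes < 1024 or unit == "GB":
                return f"{n_bytes:.0f} {unit}"
            n_bytes /= 1024

    print(format_size(1024))          # 1 KB
    print(format_size(10 * 2 ** 20))  # 10 MB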

A Shannon bit and a bit used in computer technology coincide if the probabilities of a zero or a one appearing in the computer bit are equal. If the probabilities are not equal, then the amount of information in Shannon's sense is smaller; we saw this in the example of the Martian dinosaur. The computer measure of the amount of information thus gives an upper bound on the amount of information. Volatile memory, after power is applied to it, is usually initialized with some value, for example all ones or all zeros. Clearly, right after power-up there is no information in the memory, since the values in the memory cells are strictly determined and there is no uncertainty: memory can store a certain amount of information, but immediately after power-up it contains none.

Disinformation is deliberately false information provided to an enemy or a business partner for the more effective conduct of military operations or cooperation, for checking for information leakage and the direction of its leakage, or for identifying potential black-market clients. Disinformation (also: misinforming) is also the process of manipulating information itself: misleading someone by providing incomplete information, or complete but no longer needed information, distorting the context, or distorting part of the information.

The goal of such influence is always the same - the opponent must act as the manipulator needs. The action of the target against whom disinformation is directed may consist in making a decision that the manipulator needs or in refusing to make a decision that is unfavorable for the manipulator. But in any case, the final goal is the action that will be taken by the opponent.

Disinformation, then, is a product of human activity, an attempt to create a false impression and, accordingly, to push toward the desired actions and/or inaction.

Types of disinformation:

Misleading a specific person or group of people (including an entire nation);

Manipulation (of the actions of a person or a group of people);

Creating public opinion regarding a problem or object.

Misrepresentation is nothing more than outright deception, the provision of false information. Manipulation is a method of influence aimed directly at changing the direction of people’s activity. The following levels of manipulation are distinguished:

Strengthening the values (ideas, attitudes) that exist in people’s minds and are beneficial to the manipulator;

Partial change in views on a particular event or circumstance;

A radical change in life attitudes.

Creating public opinion is the formation in society of a certain attitude towards a chosen problem.



Finally, a few words about the term itself. It comes from the Latin informatio: explanation, presentation. One of the meanings of the term is information communicated in various ways (oral, written, technical, visual, etc.), as well as the process of transmitting this information. By the middle of the twentieth century, "information" had become one of the central concepts of cybernetics and acquired general scientific significance; a special branch of knowledge, computer science, appeared.


Information

From the Latin informatio: familiarization, presentation. 1) Any information, data or messages transmitted by means of signals; 2) the reduction of uncertainty as a result of the transmission of information, data or messages; in this capacity, information is opposed to entropy. Until the middle of the 20th century, the concept of information referred only to information and messages transmitted by people using sign means; the ability to transmit information was considered a distinguishing characteristic of man as an intelligent species. With the development of science and technology, however, the concept came to be used to characterize processes of signal exchange in living nature (signaling behavior in animals and plants, genetic transmission of data in cells, etc.), as well as in the world of automated means. Interest in the study of information processes and in the quantitative and qualitative assessment of information, which arose at the beginning of the 20th century, was driven by logical-mathematical, logical-semantic and semiotic studies, which drew attention to the problems of the representation of signs and meanings, by the significant increase in the volume of transmitted information, and by the development of the technical means of its transmission (telegraph, telephone, radio, television) within the framework of modernization processes. In the first third of the 20th century, research on information pursued primarily the goal of clarifying the processes of its formalization (see Meaning) and of optimizing the conditions of its transmission. By the middle of the 20th century, however, the first theoretical studies appeared that later formed a number of theories of information: probabilistic, combinatorial, algorithmic, etc. These theories, developed by mathematical means, made it possible to model the process of information transfer mathematically, to identify the main elements of this process (in the classical scheme proposed by C. Shannon, information exchange includes six components: source, transmitter, transmission channel, receiver, recipient, and source of interference), and to establish principles for the quantitative assessment of information (throughput) and of the degree of its distortion (noise immunity). The development of these theories led to the emergence of computer science as a science whose subject is information and the methods of its transmission. A decisive influence on this research, however, came from the emergence of automated means of information processing (computers) and of cybernetics, the science of communication, control and information processing. Computer processing of information stimulated research into the formalization and algorithmization (reduction to operations on elementary statements) of information and the emergence of detailed theories of algorithmic syntax, as well as of numerous algorithmic and programming languages. Attempts to algorithmize semantic processes, meaning and understanding, were far less successful, but they nevertheless had a significant influence on the development of English linguistic philosophy and linguistic semantics, as well as on transformational grammar, in the search for a universal language for recording semantic characteristics.
In cybernetics, information is viewed in a narrower sense: not as any information, but only as information that leads to a reduction of uncertainty (a reduction in the number of possible alternatives) in a communication situation, information aimed at control and coordination. In line with this approach, general theories of management developed research into the pragmatic aspects of information: its assessment from the standpoint of relevance (sufficient, redundant, unnecessary information), value, usefulness, adequacy, and so on. Within cybernetics it became possible to synthesize mathematical models and theories of information with theories of social interaction and communication, which significantly enriched scientific ideas about communication and translation processes in society. At the intersection of computer science, cybernetics and anthropology, neuroinformational and neurolinguistic studies developed, examining the processes of information transmission at the level of higher nervous activity. As applied to sociocultural material, mathematical models of information underwent substantial transformation. It was found that information processes in human communication are influenced, in addition to the six main elements, by barriers and filters: internal (the individual mental characteristics of the participants in the information exchange, their experience and competence) and external (social and cultural norms, values, collective ideas), which significantly transform and distort information and do not always have a rational character. A quantitative assessment of these distorting influences (interference) is entirely insufficient, since their individual nature and complexity make a qualitative analysis of their content and of the mechanisms of their impact on information fundamentally necessary. Information processes in human communication cannot always be interpreted as leading to a reduction of the uncertainty of the situation, and irrelevant information (noise) is no less important here than relevant information. Accordingly, for socially and culturally significant information, not only relevance matters but also adequacy, reliability, completeness, novelty, persuasiveness, expressiveness, perceptibility, and so on. The understanding that information processes are an important component of any cultural community (historical or modern), and the application of mathematical modeling methods to their study (partially carried out by structuralism and European social anthropology), significantly enriched the theoretical and methodological baggage of the sociocultural sciences. The functional approach to communication was further developed in communication theory. Within semiotics (see Semiotics), information is studied mainly in its semantic aspects (information as a space of meanings and senses). At present, the study of information in the sociocultural sciences proceeds in two directions: 1) the study of the information processes (information culture) of various cultural communities (states, ethnic groups, civilizations, etc.); 2) the study of local information processes in various types of activity (management, marketing, advertising, social participation, political activity, etc.). Research of the latter kind, as a rule, has an applied focus and draws most widely on the achievements of computer science and cybernetics.
Such studies are devoted primarily to the problems of modern information exchange. The progressive activation and globalization of information processes (mass media, popular culture, global information networks, etc.) in twentieth-century culture led to the recognition of the exceptional importance of information processes for the development of modern societies and made information the subject not only of scientific but also of philosophical consideration. In cultural philosophy, information was interpreted, as a rule, within the general ideas of one school or another (neo-Thomism is characterized by the idea of information as a transcendental phenomenon; existentialism and phenomenology by an orientation toward its subjectivist interpretation; philosophical hermeneutics (see Hermeneutics) by the desire to ground information processes in cultural experience; postpositivism by an emphasis on the non-cognitive aspects of information). The modern situation is often characterized in cultural philosophy as an "information explosion" or "information boom"; information processing is regarded as the main type of activity in the emerging "post-industrial society", and attempts are made at philosophical interpretations of it and at predictions of the possible paths of development of an "information civilization". Lit.: Shannon C. E. Works on Information Theory and Cybernetics. Moscow, 1963; Pierce J. Symbols, Signals, Noise: Patterns and Processes of Information Transfer. Moscow, 1967; Wiener N. Cybernetics, or Control and Communication in the Animal and the Machine. Moscow, 1968; Grishkin I. I. The Concept of Information. Moscow, 1973; Afanasyev V. G. Social Information and the Management of Society. Moscow, 1975; Stratonovich R. L. Information Theory. Moscow, 1975; Dubrovsky D. I. Information, Consciousness, Brain. Moscow, 1980; Strassman P. A. Information in the Age of Electronics: Problems of Management. Moscow, 1987; Keen J. Mass Media and Democracy. Moscow, 1994; Broy M. Computer Science: A Basic Introduction. Parts 1-3. Moscow, 1996; Fedotova L. N. Mass Information: Production Strategy and Consumption Tactics. Moscow, 1996; Ivanov A. M., Kozlov V. I. Information. Computer Science. Computer. Samara, 1996. (A. G. Sheikin. Cultural Studies of the Twentieth Century: Encyclopedia. Moscow, 1996.)

In their efforts to define the concept of information, scientists have, over the past 50 years, evolved from formal (mainly theoretical-mathematical) definitions of what information is and how its amount can be measured, to recent attempts to build universal concepts of the information society, a universal metalanguage, a universal metatheory, and so on. The paradox of many of these concepts is that the very concept of information is not defined in them but is accepted at an intuitive level. This explains the professional interest of philosophers in understanding the phenomenon of information.

Developments in the field of information theory contributed to shifts in the methodology of scientific knowledge, expressed in a shift of emphasis from things to relations, from the search for a universal fundamental principle of being to the recognition of diversity as the basic principle of scientific research. It is these categories of philosophy, relation and diversity, that occupy a central place today in attempts to determine the nature of information phenomena.

At the same time, numerous studies of the phenomenon of information have discovered its connection with organization, systematicity, orderliness, structure, as well as with functional states and processes in complex control systems. And then information appears as a functional property of control processes, inseparable from the latter, and the theory of information appears as a branch of cybernetics.

From scientific and technical developments based on information theory, specialized scientific disciplines were born. One is computer science (informatics, a combination of the words "information" and "automation"): the field that studies scientific and technical information, focusing on the automated processing of data and bodies of knowledge for production, technical and social purposes using computer technology, communications and mathematical software. Another is information science (the science of information): the field that studies information as a fundamental factor of existence and the laws of the production, transmission, receipt, storage and use of information.

Information theory in the narrow sense (mathematical communication theory) is the field of study of information processes from the perspective of the quantity of information passing through communication channels, memorized, etc.; it considers the issues of optimal encoding of messages into a signal form, the maximum throughput of communication channels, etc. (the issue of the content of a message (signal) is usually outside the scope of this theory).

The main historical stages of the information evolution of society are determined by the emergence of various information carriers: writing, printing, and modern information and cybernetic (in particular, computing) technology. Nowadays the concept of information is associated with computers, advertising, publishing, television, radio and telegraph communications, and other media. The concept was introduced into science in 1928 by R. Hartley (USA) to designate a measure of the quantitative measurement of information disseminated through technical communication channels, regardless of the content of this information. Because of the limited possibilities of recording and transmitting oral speech, information is converted by its source first into the form of a language (sign) message and then, by the transmitter, into a secondary signal form convenient for transmission over technical communication channels; this involves an encoding operation, followed by decoding on the receiver's side. The addressee thus has at the output of the receiver a message that, interference ("noise") permitting, is, with a certain degree of correspondence, a copy of the message on the source's side. Note that bringing information to the addressee (recipient), provided the information is not false (disinformation), always reduces the uncertainty in the latter's knowledge and actions. Hartley proposed the logarithm to base two as a measure of the amount of information: a measure of the uncertainty eliminated, for the recipient, as a result of receiving the information. This is how the unit of information, the bit, arose: "one of two", either "yes" or "no" with respect to a question that captures the uncertainty of the recipient's knowledge about something of interest to him. In the 1940s another American scientist, C. Shannon, who specialized in the capacity of communication channels and the coding of messages, gave this measure of the amount of information a more universal form: the amount of information came to be understood as the amount of entropy by which the total entropy of a system decreases as a result of the system's receiving information. The formula expresses entropy as a sum of probabilities multiplied by their logarithms and relates only to the entropy (uncertainty) of a message.

In other words, the information content of a message is inversely related to its obviousness, predictability and probability: the less predictable, obvious and probable a message is, the more information it carries for the recipient. A completely obvious message (one whose probability equals 1) is as empty as no message at all (one whose probability is obviously equal to 0). Both, according to Shannon's assumption, are uninformative and convey no information to the recipient. For a number of reasons connected with mathematics and the convenience of formalization, the entropy of a message is described by Shannon as a function of the distribution of random variables.

The problem of information is multifaceted not only in the general scientific but also in the philosophical sense. In the ontological and worldview aspects, attempts are made to reveal the relation of information to matter and energy, and its nature and status in the structure of being; in the epistemological aspect, to correlate information with the content and form of knowledge, with images, signs, models, etc.; in the logical-methodological aspect, to identify the quantitative, mathematically measurable aspects of information processes in the mathematical theory of communication, in models of mass communications, and in cybernetics.

In the 1960s-80s, many results obtained in the research of the preceding twenty years were explicated in connection with studies of cybernetic models of machine translation from one language to another, game theory and decision making, and pattern recognition. Along with the further development of the statistical (syntactic) concept of information, semantic and pragmatic concepts appeared. It became clear that the works of Fisher, Nyquist, Hartley and Shannon, being attempts to quantitatively explicate the qualitative concept of information as news or messages, do not answer the question: a quantity of what quality is being measured? The interpretation of information in these works is of a formal, abstract-mathematical nature. The initial principle of the creation of a message is sequential selection, sign by sign, letter by letter, from an infinite reservoir of ready-made messages (an ensemble); the creation of an individual message is its statistical selection from the ensemble. Messages are statistically homogeneous (the property of ergodicity), so the mathematical theory of communication is not interested in the individual differences between messages, nor in the amount of information contained in an individual message; it can only determine the average amount of information per selected message. But the information of the choice of a message is not the information of the message itself (E. K. Voishvillo). The individuality of an event should not disappear in the homogeneity of the statistical ensemble. Moreover, A. N. Kolmogorov and his students showed that the statistical concept of information expresses not its absolute quantity, but additional information, an addition to the information already available to the recipient.

This gave impetus, first, to the development of the so-called thesaurus model and, as a consequence, of the semantic and pragmatic concepts of information, and, second, to the clarification of the relationship between information and diversity.

In the 1980s, and especially in the 1990s, specialists in the field of information theory became noticeably divided into pessimists and optimists, critics and apologists. Discussions moved from the semantic and mathematical problems of communication theory to the socio-ethical and political problems of the information society. The objective basis for these changes was the enormous advantage that a developed information infrastructure gives to the states and regions, organizations and individuals that possess it: the possibility of time-compressed processing of large volumes of information, almost instantaneous communication anywhere on the globe, the design and management of complex systems, and so on. In a number of works, the term "information society" symbolizes an essentially new social paradigm (O. Toffler), a historically new and special type of civilization replacing the agricultural and the industrial. The real advantages increasingly enjoyed by states and regions with developed information technologies and computer networks (the USA, Europe, Japan) lead to changes in the character of economic, political and social relations, of family life, everyday life, leisure and lifestyle, and overturn traditional ideas about the values of agricultural and industrial production. At the same time, the informatization of all spheres of modern life is, in the words of its pessimistic critics, accompanied by dehumanization and gives rise to a new virtual reality of existence in an illusory world, unknown to previous eras. From the socio-psychological point of view, informatization destroys people's habitual natural rhythms and life cycles; from the moral and ethical point of view, it displaces the value and attractiveness of live communication, empathy and understanding; from the political point of view, it sharply increases the possibilities of manipulating mass and individual consciousness and the influence of the "fourth estate", the media, and changes the potential of the power elites, including by moving their powers and capabilities from the domestic sphere to the hierarchy of interstate relations.

From a socio-historical perspective, the negative manifestations of informatization can be described as an apotheosis of rationality, bringing the European classical type of rationality to the logically completed form of information domination on a planetary scale. In the modern typology of information research there is a noticeable tendency among the "optimists" to develop, on the basis of information theory, a general metatheory and a universal information metalanguage for scientific and non-scientific fields of knowledge. The well-known Russian researcher I. I. Yuzvishin, developing a new generalized science, informatiology (1993), proposes the concepts of the information code of man and the Universe, information approaches to maintaining health and increasing longevity, the building of a new world community, and so on. The goal of the future is seen in the creation of a single worldwide distributed information-cellular community, a new information civilization, and, in the epistemological aspect, in a revolutionary breakthrough through information into transcendental worlds.

Primary sources of marketing information include data obtained in the course of primary marketing research: for example, the results of surveys, experiments and questionnaires.

Both external and internal sources may serve as sources of secondary information; they can be classified as follows (a small illustrative data model follows the list):

    internal sources of information include the enterprise's accounting, statistical and warehouse records, cost calculations, short-term profit calculations, etc.;

    external sources of information include: personal, non-formalized information obtained in communication with journalists and distributors; reports, prospectuses and business catalogues; data from periodicals, the Internet, etc. Among secondary sources, special attention today is paid to automated databases.
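
The classification above maps naturally onto a simple data model. The Python sketch below is a hypothetical illustration of that split; the type and field names are our own assumptions, not terms from any marketing standard.

    from dataclasses import dataclass
    from enum import Enum

    class Origin(Enum):
        INTERNAL = "internal"  # the firm's own accounting, warehouse and cost data
        EXTERNAL = "external"  # press, catalogues, the Internet, external databases

    @dataclass
    class SecondarySource:
        name: str
        origin: Origin

    sources = [
        SecondarySource("warehouse accounting data", Origin.INTERNAL),
        SecondarySource("short-term profit calculations", Origin.INTERNAL),
        SecondarySource("business catalogues", Origin.EXTERNAL),
        SecondarySource("periodicals and the Internet", Origin.EXTERNAL),
    ]

    # Filtering by origin reproduces the classification given in the list above:
    external = [s.name for s in sources if s.origin is Origin.EXTERNAL]
    print(external)  # ['business catalogues', 'periodicals and the Internet']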

(Diagram: the scheme of a marketing information system.)

10. Characteristics of information: types, sources, and requirements.

The following types of marketing information are distinguished:

1. Secondary, i.e. data previously collected for other purposes.

Advantages of such information: it is relatively inexpensive; it can be collected quickly; multiple sources are available; some sources are independent of the firm; it is usually reliable; etc.

Disadvantages: it may not meet the firm's requirements because it is incomplete; it may be outdated; the methodology by which the data were collected is unknown; it may be partial in character; it may contain contradictions; etc.

Secondary information is divided into internal and external.

Internal information is information available within the company: budgets, sales data, profits, losses, customer accounts, inventory data and much more.

External information is data from external sources, governmental and non-governmental.

Government data includes statistical data and descriptive material on many issues (pricing, credit, etc.).

Non-governmental sources include periodicals, books, monographs and non-periodical publications.

2. Primary, i.e. information newly collected to solve a specific problem.

Firms resort to this type of information when secondary analysis cannot provide the necessary data.

Advantages: it is collected in accordance with specific goals and objectives; the collection methodology is known and can be controlled; all results are available and known; the information is not outdated; it contains no contradictions; it is reliable; it can answer every question posed.

Disadvantages: it is quite expensive; it requires large expenditures of time and labor; certain types of information cannot be obtained this way (e.g. census data); the firm itself may be unable to collect primary data.

If primary data are needed, the company must develop a plan and methods for obtaining them.
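
The trade-off between the two types of data described above can be summarized as a simple decision rule. The Python sketch below is hypothetical, an illustration of the logic rather than a prescribed marketing procedure.

    def choose_data_source(secondary_is_sufficient: bool,
                           budget_allows_field_research: bool) -> str:
        # Start with cheap, quickly available secondary data; commission the
        # expensive primary research only when secondary data falls short.
        if secondary_is_sufficient:
            return "use secondary data"
        if budget_allows_field_research:
            return "develop a plan for primary research (surveys, experiments)"
        return "narrow the question or buy access to external databases"

    print(choose_data_source(secondary_is_sufficient=False,
                             budget_allows_field_research=True))
    # -> develop a plan for primary research (surveys, experiments)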

Product positioning

Product positioning is a set of measures and techniques by which a given product comes to occupy, in the minds of target consumers, its own place distinct from competing products; it includes the formation of a competitive position and a detailed marketing mix.

M.Video is the largest retail chain in Russia by sales of electronics and household appliances. It has been operating since 1993 and today runs more than 400 stores in 169 cities of Russia, from Kaliningrad to Vladivostok.

M.Video is the first network on the Russian market to implement a full-fledged omnichannel approach to sales: a single assortment, price and service both when purchasing in stores and online. The retailer offers its customers more than 20,000 items of audio/video and digital equipment, small and large household appliances, media products and accessories. M.Video stores have a uniform format and a distinctive design concept. M.Video is also developing the m_mobile project: special areas in stores focused on sales of smartphones and related devices and accessories. These zones have dedicated cash registers, a large assortment of new products on open display, and consultants who help the buyer choose a comprehensive solution at the best possible price.

In addition to its efficient retail format and customer-oriented store concept, the company offers customers high-quality service support under the M.Service brand, while M.Credit experts help customers quickly complete a credit application and obtain approval from several partner banks at once.

In 2016 the M.Video brand entered the top 50 most valuable Russian brands according to the British consulting company Brand Finance and was recognized as the most successful among Russian non-food retail chains. The M.Video network was also among the top 10 best employers in Russia at the end of 2017; M.Video is the only Russian retail company included in the rating of the best employers in Russia compiled by Aon Hewitt and AXES. The M.Bonus loyalty program was recognized as the best among non-food retailers and as the best online loyalty program in 2016 and 2017. In 2016 the company won the professional rating of Russian e-commerce and the Grand Prix for the best online store.

The M.Video retail chain is part of the M.Video-Eldorado Group (M.Video PJSC), which unites the M.Video and Eldorado retail brands in the household appliances and electronics market, as well as the Goods marketplace. The companies' total annual revenue exceeds 360 billion rubles including VAT. The group operates more than 800 stores in the 200 largest cities of Russia. M.Video is the only Russian company in the electronics retail sector whose shares are traded on the stock market; they are currently listed on Russia's largest exchange platform, the Moscow Exchange (ticker: MVID).