Mathematical Machines and Systems. 2019 #4

ABSTRACTS


COMPUTER SYSTEMS

UDC 621.3.019.3

Cespedes Garcia N.V., Cespedes Garcia P.D. Vulnerabilities of computer systems – a threat to the information security of society. Mathematical machines and systems. 2019. N 4. P. 3–8. 

This paper describes certain computer hardware components that allow remote unauthorized access to computers: Intel Management Engine (Intel ME) and Intel AMT. Intel ME is an autonomous subsystem that has been integrated into almost every Intel processor chipset since 2008. Because the chipset is always connected to a current source (battery or alternative power source), this subsystem continues to work even when the computer is turned off. Vulnerabilities were discovered in Intel AMT, after which many computers using Intel processors became accessible to remote and local intruders. The paper also describes Chinese microchips that were implanted into Supermicro equipment. This equipment was supplied not only to US commercial organizations but to governmental ones as well. The Supermicro Chinese microchips are able to edit the code stream heading to the processor by inserting their own code, or to change the order of instructions for the processor. The chip can also “steal” encryption keys for secure communications and block security updates intended to neutralize the attack. The paper also provides an overview of the recent high-profile Meltdown, Spectre and ZombieLoad vulnerabilities in Intel and ARM processors, which allow a computer to be manipulated to one degree or another. These vulnerabilities are similar to each other: they allow a malicious application to read any type of computer memory, including kernel memory. This became feasible thanks to speculative code execution. Personal user data can be stolen, such as browser history, website content and passwords, as well as system data such as disk encryption keys. Security experts should take the points above into account, as in certain cases they could turn into national-scale problems, both financial and political. Refs.: 6 titles.


UDC 004.83

Pogoriliy S.D., Kramov A.A., Yatsenko F.M. A method for analyzing the coherence of Ukrainian-language texts using a recurrent neural network. Mathematical machines and systems. 2019. N 4. P. 9–16. 

The urgency of solving the problem of text coherence estimation is justified in the paper. A comparative analysis of the corresponding methods of computational linguistics is performed. The automated estimation of text coherence falls into the category of natural language processing; therefore, it should be considered an AI-complete task. Machine learning methods and computational linguistics tools are used to estimate text coherence. It is advisable to use neural networks because such an approach does not require expert knowledge. It should be noted that the semantic component of a text has to be taken into account. The formalized semantic representation of text units (words or sentences) is obtained using a model previously trained on the subject area of the information. The corresponding neural networks can be designed using convolutional and recurrent layers that allow the processing of input data of unfixed size (words and sentences). The principle of the distributed sentence representation method using a recurrent neural network is considered in detail. The key advantage of recurrent layers is the presence of feedback connections within neurons: the output value from the previous step is fed back to the input of the neuron. Such an approach models the process of reading a text by a reader, because the analysis of current information should be based on previously retrieved knowledge. A recurrent neural network has been created and trained on a set of Ukrainian scientific articles. In order to increase the accuracy of the method, the articles were preprocessed, since they contained incorrect symbol sequences due to the automated extraction of their content from PDF files. The accuracy of the method was analyzed through experimental examination on the document discrimination task and the insertion task. The results obtained indicate that the method based on recurrent neural networks can be used for the coherence estimation of Ukrainian texts. Tabl.: 1. Figs.: 4. Refs.: 11 titles.
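
A minimal illustration of the distributed sentence representation idea described above (a generic PyTorch sketch with assumed dimensions and layer choices, not the authors' exact architecture): word embeddings are read by a sentence-level recurrent layer, the resulting sentence vectors are read by a document-level recurrent layer, and a linear head yields a coherence score.

    import torch
    import torch.nn as nn

    class CoherenceScorer(nn.Module):
        """Sketch: sentence-level GRU over word embeddings, document-level GRU
        over sentence vectors, linear head producing a coherence score."""
        def __init__(self, vocab_size, emb_dim=100, hid_dim=128):
            super().__init__()
            self.emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
            self.sent_rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
            self.doc_rnn = nn.GRU(hid_dim, hid_dim, batch_first=True)
            self.head = nn.Linear(hid_dim, 1)

        def forward(self, doc):
            # doc: LongTensor (n_sentences, max_words) of word indices, 0 = padding
            emb = self.emb(doc)                   # (n_sent, max_words, emb_dim)
            _, sent_vecs = self.sent_rnn(emb)     # (1, n_sent, hid_dim)
            _, doc_vec = self.doc_rnn(sent_vecs)  # (1, 1, hid_dim)
            return torch.sigmoid(self.head(doc_vec.squeeze(0)))  # coherence in (0, 1)

    # Training on the document discrimination task would label original documents 1,
    # sentence-shuffled copies 0, and minimise binary cross-entropy.
    model = CoherenceScorer(vocab_size=20000)
    fake_doc = torch.randint(1, 20000, (5, 12))   # 5 sentences, 12 tokens each
    print(model(fake_doc))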



     INFORMATION AND TELECOMMUNICATION TECHNOLOGY


UDC 004.91

Dodonov O.G., Nikiforov A.V., Putiatin V.G., Kniaz I.V. Situational control of forces on the example of air defense units. Mathematical machines and systems. 2019. N 4. P. 17–37.

The constantly growing scope and speed of modern warfare processes bring force management up against the threshold of human capabilities for processing diverse information. Decisions made during the force application process (especially for air defense forces) are, as a rule, taken under acute time pressure and at the same time require the coordination (refinement) of many parameters before control commands are issued. To solve this problem, a situational management approach can be used: determining control parameters on the basis of automatisms formed in advance, when there is a sufficient margin of time. In order for these automatisms to take into account the specifics of particular management situations, it is necessary to create situational management databases adapted to the specific processes of force application. In practice, such data are created in the course of the daily activities of troops during the operational training of headquarters. The paper proposes a form for presenting these data, as well as a procedure for filling and then using the database. A general model of an automated process of situational control of forces is presented on the example of combined air defense using problem-oriented knowledge bases. The structure of special software for a prospective automated system is described, along with the basic mathematical methods and models needed to develop the complexes of functional tasks of the special software. Mechanisms are proposed for the functioning of the headquarters of air defense units and subunits, equipped with special automation equipment, under distributed filling of an adapted database and implementation of situational management on this basis. Tabl.: 1. Figs.: 7. Refs.: 27 titles.
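
As an illustration of how prepared automatisms might be retrieved from such a situational management database (a hypothetical Python sketch; the record fields and the nearest-situation matching rule are assumptions, not the data form proposed in the paper):

    # Each record pairs a typical situation (its describing features) with
    # control parameters worked out in advance during operational training.
    AUTOMATISMS = [
        {"situation": {"target_type": "cruise_missile", "altitude": "low", "count": 2},
         "controls": {"assign_unit": "battery_1", "readiness": "immediate"}},
        {"situation": {"target_type": "aircraft", "altitude": "high", "count": 4},
         "controls": {"assign_unit": "battery_2", "readiness": "high"}},
    ]

    def match_score(stored, current):
        """Number of situation features that coincide."""
        return sum(stored.get(k) == v for k, v in current.items())

    def select_controls(current_situation):
        """Pick the prepared control parameters of the closest stored situation."""
        best = max(AUTOMATISMS,
                   key=lambda rec: match_score(rec["situation"], current_situation))
        return best["controls"]

    print(select_controls({"target_type": "cruise_missile", "altitude": "low", "count": 3}))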


UDC 004.056.5

Lytvynov V.V., Stoianov N., Skiter I.S., Trunova O.V., Grebennyk A.G. Using decision support methods in searching for the sources of attacks on computer networks under conditions of uncertainty. Mathematical machines and systems. 2019. N 4. P. 38–51.

A theoretical and, in part, methodological approach to the creation of special decision support systems (DSS) for searching for the most likely sources of attacks is proposed. Ideas about decision support methods for searching for the sources of attacks on computer networks are systematized, generalized and developed. A structural scheme of the process of finding the sources of attacks is proposed, based on a decision-making methodology whose main goal is to minimize the number of options put forward, so that at subsequent stages a limited number of alternatives is evaluated in detail. Each stage is briefly characterized separately, namely: identification of a problem situation, formation and coordination of the target hierarchy, formation of alternatives (solutions), analysis of alternatives, formation of selection criteria, optimal choice, and resolution of the problem situation. In particular, the problem of multi-criteria choice, its system model and generalized solution methods are considered. The proposed resolution of the problem situation involves a method for determining the fuzzy judgments of the decision maker, which combines classical approaches, fuzzy automatic control based on expert judgments, and the idea of splitting the criterion space into domains into a combined decision support method. A structure of the DSS is proposed in which the choice of alternatives (solutions), that is, the identification of the most likely source of attacks, makes it possible to use general information about the cyberspace threat for a given network, defined as the number of unknown attacks per unit of time, and to set up an adaptive SDA. It also allows developing a new generation of firewalls that are based not only on protocols but on address information from network packets as well. Figs.: 4. Refs.: 15 titles.
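
A simplified sketch of the multi-criteria ranking step (hypothetical criteria, weights and hosts; the paper's actual method additionally involves fuzzy decision-maker judgments and splitting the criterion space into domains, which are not reproduced here):

    # Candidate attack sources scored on several criteria (already normalised to [0, 1]).
    candidates = {
        "host_10.0.0.5":  {"traffic_anomaly": 0.9, "geo_risk": 0.4, "past_incidents": 0.7},
        "host_10.0.0.12": {"traffic_anomaly": 0.6, "geo_risk": 0.8, "past_incidents": 0.2},
    }
    weights = {"traffic_anomaly": 0.5, "geo_risk": 0.2, "past_incidents": 0.3}

    def rank_sources(cands, w):
        """Weighted-sum multi-criteria ranking: higher score = more likely source."""
        scored = {name: sum(w[c] * vals[c] for c in w) for name, vals in cands.items()}
        return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

    for name, score in rank_sources(candidates, weights):
        print(f"{name}: {score:.2f}")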


UDC 004.02

Kalmykov V.G., Sharypanov A.V. Determining object boundaries in a grayscale image. Mathematical machines and systems. 2019. N 4. P. 52–64.

A grayscale image contains information about an object of interest; the part of the image free of the object of interest belongs to the background. Known recognition methods, in particular statistical ones, involve processing the entire image area. If the background changes unpredictably, the processing and recognition quality of such an image becomes problematic, because the boundary that separates the object of interest from the background is unknown. The task of detecting object boundaries in grayscale images with a heterogeneous background and in the presence of noise, with the purpose of using the results in recognition problems, is therefore urgent. In the human visual system, object boundary detection happens imperceptibly, at a subconscious level, even at a high noise level, which is due to the variable-resolution mechanism of human vision. The paper considers a method and an algorithm for detecting object boundaries in grayscale images that are treated as discrete realizations of unknown piecewise-smooth functions. For an adequate representation, the image is considered as a set of cellular complexes, which makes it possible to reproduce object boundaries as one-dimensional lines without thickness and, in turn, to accurately display the smallest details of the object shape at a given resolution. Variable resolution is used in image processing, which enables automatic processing in the presence of noise without first specifying its parameters. The results of object boundary detection experiments on a grayscale image are presented in comparison with the results obtained using known methods. Figs.: 5. Refs.: 15 titles.
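
A rough illustration of the variable-resolution idea (a generic multi-scale gradient sketch using NumPy/SciPy; it does not reproduce the cellular-complex representation or the exact algorithm of the paper): gradients are computed at several Gaussian scales and combined, so that boundaries survive noise without hand-tuned noise parameters.

    import numpy as np
    from scipy import ndimage

    def multiscale_boundaries(image, sigmas=(1.0, 2.0, 4.0), rel_threshold=0.5):
        """Detect candidate object boundaries in a grayscale image by combining
        gradient magnitudes computed at several resolutions (Gaussian scales)."""
        boundary = np.zeros(image.shape, dtype=bool)
        for sigma in sigmas:
            grad = ndimage.gaussian_gradient_magnitude(image.astype(float), sigma)
            # Keep pixels whose gradient is strong relative to this scale's maximum.
            boundary |= grad > rel_threshold * grad.max()
        return boundary

    # Usage on a synthetic image: a bright square on a noisy background.
    img = np.random.normal(0.0, 0.1, (64, 64))
    img[20:44, 20:44] += 1.0
    mask = multiscale_boundaries(img)
    print(mask.sum(), "boundary pixels found")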


UDC 004.891.2

Kovalenko O.Ye. Principles of situational systems engineering. Mathematical machines and systems. 2019. N 4. P. 65–78. 

The unpredictability, randomness and variability of environments in different fields of activity require the use of adequate approaches and principles in carrying out such activities. Such an approach is the situational approach to organizing activities within the relevant situational systems. The analysis of situational management models (situation-oriented behavior) showed that the processes of situational management are realized within the perceptual cycle (having a cyclical nature) and include the stages of empirical awareness of the state of the environment (target domain), constructing its model, and applying this model in forming rational behavior in the environment based on periodic updates of awareness of the current state of the environment (target domain). The awareness of the state of the environment and the formation of rational behavior on its basis are carried out using the mechanisms of logical reasoning corresponding to the stages of the perceptual cycle. Such mechanisms of logical reasoning in the cycle of situational interaction with the environment are induction, deduction and abduction. Since situational management is generally concerned with unforeseen accidental events, the target situational system should be adapted to the specific situation. Such adaptation might be implemented based on an ad-hoc system architecture. The principles of systems engineering may be applied, with appropriate refinements, to situational systems, since the creation of a system of systems is usually determined by situational factors. The key principles for the engineering of situational systems, as well as of systems of systems, are interoperability and coordination between their constituent systems. The specific principles of situational systems engineering include the use of means for describing situational semantics and situational behavior (situational calculus). Tabl.: 2. Figs.: 4. Refs.: 29 titles.


UDC 519.876.2:336

Neskorodieva T.V. A method for formalizing first-level variables in audit IT. Mathematical machines and systems. 2019. N 4. P. 79–86.

The article is devoted to the problem of creating information technologies for decision support systems in auditing. The problem is relevant because the IT created must provide analysis of large amounts of data, and this analysis should be invariant with respect to the features of the economic and production activities of the enterprise and its accounting system. It is noted that the data characterizing an enterprise as an audit object have a global multi-level hierarchical structure of heterogeneous, multifactor, multifunctional connections, interdependencies and interactions of its subsystems, from the IT of control, accounting, management and business to other IT and systems in the structure of the information system of the national economy. It is noted that, to create IT conforming to these requirements, the author proposed in previous works a method of generalized-multiple mapping of audit data. This paper addresses the problem of formalizing first-level variables for the tasks of auditing accounting standards. The proposed method of variable formalization is invariant with respect to the characteristics of the enterprise, the types of objects, the accounting operations and the correspondence relations between them. The method is based on formalizing the relations of correspondence between the characteristics of objects and operations of two tasks in the form of correspondence graphs. The variables are also structured by the levels at which the corresponding generalized indicators are calculated, which will allow the analysis of interrelated levels. The technique is illustrated by the example of the first local task of settlements with suppliers, which is formalized as a test of the “paid-received” mapping. Tabl.: 1. Refs.: 12 titles.
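
As a toy illustration of such a “paid-received” correspondence check (hypothetical record fields and values; the paper formalizes these correspondences as graphs over the characteristics of objects and operations):

    # Level-1 variables: payments to suppliers and goods received from them.
    payments = [
        {"supplier": "A", "contract": "C-1", "amount": 1000.0},
        {"supplier": "B", "contract": "C-2", "amount": 500.0},
    ]
    receipts = [
        {"supplier": "A", "contract": "C-1", "amount": 1000.0},
        {"supplier": "B", "contract": "C-2", "amount": 450.0},
    ]

    def paid_received_mismatches(paid, received):
        """Build the paid->received correspondence per (supplier, contract)
        and report pairs whose totals disagree (candidate audit findings)."""
        def totals(rows):
            agg = {}
            for r in rows:
                key = (r["supplier"], r["contract"])
                agg[key] = agg.get(key, 0.0) + r["amount"]
            return agg
        p, r = totals(paid), totals(received)
        return {k: (p.get(k, 0.0), r.get(k, 0.0))
                for k in set(p) | set(r) if p.get(k, 0.0) != r.get(k, 0.0)}

    print(paid_received_mismatches(payments, receipts))  # {('B', 'C-2'): (500.0, 450.0)}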


UDC 004.942:930.24

Khymytsia N.O., Holub S.V. Intellectual analysis of the results of cliometric monitoring. Mathematical machines and systems. 2019. N 4. P. 87–92.

In order to increase the adequacy of the results of cliometric studies, the information technology of multilevel intelligent monitoring is used in conjunction with expert estimates of the simulation results. The results were interpreted within the classical concepts of historical science. This contributes to involving artificial intelligence systems (knowledge bases, expert systems, cognitive computer models of text understanding, frame systems), in which the knowledge of historians is modeled, in the analysis of sources. When forming an array of numerical characteristics of particular historical periods, there is a tendency toward an increasing proportion of historical sources created collectively. The application of intelligent monitoring technology allows automating the processing of historical data and improving the efficiency of research and the adequacy of the findings. The results of applying the technology of multilevel intelligent monitoring to one of the problems of cliometrics are presented. The problem of determining the similarity of historical periods was solved. The list of features describing historical periods was determined by experts. The length of the time interval was selected based on the results of processing the statistics. Numerical characteristics of the selected features were formed over identical time intervals and constituted a feature vector for each of the historical periods. The feature vectors of the historical periods were clustered based on the modeling results. The model synthesis method for forming each cluster was selected separately, based on the results of testing each of the model synthesis algorithms of the monitoring intellectual system. In most cases, the cluster models were built with a multi-row GMDH algorithm. The processes of forming the input data array, synthesizing the models and determining the effect of the features are described. The feasibility of using the new method for determining the similarity of historical periods has been confirmed experimentally. Tabl.: 1. Refs.: 16 titles.
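
A schematic example of the clustering step (a generic k-means sketch over hypothetical feature vectors; the paper itself selects the model synthesis method per cluster and mostly uses multi-row GMDH-based clusterizers rather than k-means):

    import numpy as np
    from sklearn.cluster import KMeans

    # Hypothetical feature vectors of historical periods: each row is one period,
    # each column a numerical characteristic extracted from the sources.
    periods = ["1900-1910", "1910-1920", "1920-1930", "1930-1940"]
    features = np.array([
        [0.2, 0.8, 3.0],
        [0.9, 0.3, 7.0],
        [0.8, 0.2, 6.5],
        [0.3, 0.7, 3.2],
    ])

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
    for period, label in zip(periods, labels):
        print(period, "-> cluster", label)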


UDC 004.9:004.75

Lysetsky Yu.M. An integrated approach to data management. Mathematical machines and systems. 2019. N 4. P. 93–99.

Every day, enterprises and organizations from different branches of the economy have to solve numerous tasks connected with the processing and storage of growing data volumes, access control, support and business continuity. At the same time, the data storage infrastructure has to provide rational resource utilization, be flexible, and minimize information management expenses. Given that today information is one of the most valuable assets of any enterprise and organization, more attention should be paid to an integrated approach to data management. Thus, such tasks are urgent and require qualitatively new solutions for data management and for ensuring the effective functioning of enterprises and organizations. The paper describes the essential concepts of data storage and management and considers integrated data management with the CommVault Simpana platform, its structure and basic components. The functionality of CommVault Simpana is analysed in comparison with similar products of other manufacturers. The features of different versions are described; special attention is paid to their capabilities in virtual platform and cloud integration, as well as the protection of workplaces, including mobile devices. The future development of the Simpana platform is described; it is oriented toward productivity growth together with an expansion of the number of corporate users and active support of intelligent analytics systems working in real time. These improvements will significantly increase the architecture's scaling capabilities, while new indexing methods will increase productivity. It is demonstrated that an integrated approach to data management is a priority for corporate-level organizations and enterprises, as it significantly improves the efficiency of data management and thus the efficiency of the functioning of enterprises and organizations in general. Figs.: 2. Refs.: 4 titles.


UDC 004.42

Golub B.L., Vetrova D.V., Pronishina K.O. A software system for forming the educational schedule in a higher education institution. Mathematical machines and systems. 2019. N 4. P. 100–109.

This paper is about solving problems related to the lack of a unified electronic records system and outdated methods of scheduling in higher education institutions. One of the most important tasks in ensuring the quality organization of the educational process in a higher education institution is the automation of the formation of the educational schedule. A properly and accurately compiled schedule ensures uniform loading of classrooms, student groups and academic teaching staff. Higher education institutions, like any enterprise, inevitably undergo automation, and although the concept of educational activity is common to all educational institutions, this process differs in every institution. The availability of funds and the willingness to use existing software solutions have a significant impact on automation processes; the latter is related to the specifics of the processes in each higher education institution and to the human factor. It is suggested to use a software product that greatly simplifies workflow and scheduling and allows moving away from manual and paper work, owing to the automated execution of data storage and transmission functions, validation, information dissemination, etc. Schedules are intended to be distributed to students and teachers through an online service, which will allow convenient access to the schedule. The proposed software system can be used by any institution. Figs.: 9. Refs.: 3 titles.
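
For illustration, one basic check such a scheduling system performs is detecting collisions, i.e. the same room, group or lecturer booked twice in one timeslot (a hypothetical Python sketch with invented names, not the described system's code):

    from collections import defaultdict

    lessons = [
        {"slot": "Mon-1", "room": "101", "group": "CS-21", "lecturer": "Ivanenko"},
        {"slot": "Mon-1", "room": "101", "group": "CS-22", "lecturer": "Petrenko"},  # room clash
        {"slot": "Mon-2", "room": "102", "group": "CS-21", "lecturer": "Ivanenko"},
    ]

    def find_conflicts(schedule):
        """Report timeslots where a room, group, or lecturer is double-booked."""
        conflicts = []
        for resource in ("room", "group", "lecturer"):
            seen = defaultdict(list)
            for lesson in schedule:
                seen[(lesson["slot"], lesson[resource])].append(lesson)
            conflicts += [key for key, items in seen.items() if len(items) > 1]
        return conflicts

    print(find_conflicts(lessons))  # [('Mon-1', '101')]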

UDC 37.091.2+004.5

Kraskevych V.Ye., Yurchenko Yu.Yu. Using educational situation centers for resource management. Mathematical machines and systems. 2019. N 4. P. 110–116.

The article considers the expediency of using educational situation centers (ESC) for resource management. The application of information technologies in the learning process in situation centers is considered, along with the principles they implement: the systemicity and consistency principle, the principle of individualization of education, and the activity principle. The use of training situation centers for forming and developing practical management design skills in students is considered. The aspects of protecting important information from leakage when using information technologies in educational situation centers are considered and analyzed. The basic principles of classifying data are given, which should be used when classifying information that needs protection. The main types of network security tools used in constructing a system for the protection of critical information are presented. The process of developing and introducing ESC into the educational process is considered, both for training future users of situation centers and for training future developers of such centers in different sectors of the national economy of Ukraine. The use of information-technology-based solutions for protecting information in situation centers is presented. The question of what needs to be protected is considered; it covers three aspects: confidential information, sensitive data and commercial secrets. It is specified that in each particular case the answer to this question depends on what kind of information the organization considers most valuable for itself. Figs.: 2. Refs.: 3 titles.


                         SIMULATION AND MANAGEMENT

UDC 004.54; 004.77

Davies J.N., Verovko M.V., Posadska A.S., Solomakha I.V. Usage of simulation for QA in real-time network environments. Mathematical machines and systems. 2019. N 4. P. 117–125. 

ISO 9000 design and development quality standards are widely used in software development. Most companies seek accreditation to ISO 9001 to improve their overall performance, and for some industries compliance with these standards is mandatory. Many companies now rely on information systems for doing business, which is why considerable effort is being made to accredit IT companies in these areas. The use of networked software in almost all areas of modern life significantly increases the importance of the stability of its operation and hence the quality of its performance. The quality assurance (QA) process is a powerful instrument that helps to identify and eliminate potential failures of an information system. However, providing a high quality level requires considering the impact of all system components. This paper is devoted to testing real-time software over computer networks of varying performance, allowing conditions to be emulated as close as possible to those of end users. The cost-effectiveness of quality assurance processes was also analyzed. Algorithms of users' interaction with existing real-time systems under different network performance indicators are investigated in order to determine the most frequent user actions that cause software failures and to prevent, where possible, the consequences of these actions during the testing process. The authors propose general approaches to testing the performance and usability of a system in the event of a persistent deterioration of the network. Tabl.: 3. Figs.: 4. Refs.: 14 titles.
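
A minimal sketch of the kind of test harness meant here: wrapping a service call with injected latency and loss so that client-side timeout handling can be exercised (hypothetical names and parameters, not the authors' test suite):

    import random
    import time

    class FlakyNetwork:
        """Simulates a degraded link by adding delay and randomly dropping calls."""
        def __init__(self, delay_s=0.05, loss_rate=0.3, seed=42):
            self.delay_s = delay_s
            self.loss_rate = loss_rate
            self.rng = random.Random(seed)

        def call(self, func, *args, **kwargs):
            time.sleep(self.delay_s)                 # emulate latency
            if self.rng.random() < self.loss_rate:   # emulate packet loss
                raise TimeoutError("simulated network loss")
            return func(*args, **kwargs)

    def fetch_order_status(order_id):
        return {"order": order_id, "status": "shipped"}

    net = FlakyNetwork()
    failures = 0
    for i in range(20):
        try:
            net.call(fetch_order_status, i)
        except TimeoutError:
            failures += 1          # a robust client would retry or degrade gracefully
    print(f"{failures}/20 calls failed under the simulated degraded network")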


UDC 519.237.5

Lapach S.M. Determination of outliers in correlation and one-dimensional regression analysis. Mathematical machines and systems. 2019. N 4. P. 126–138.

This paper considers a method for detecting outliers in the preliminary processing of data for correlation and one-dimensional regression analysis. A classification of outliers according to their nature is proposed. The developed detection algorithm is based on the consistent application of the jackknife technique and elements of distance calculation using the LSD criterion, with correlation coefficients used instead of mean values and confidence intervals instead of critical distances. The application of the algorithm is shown in examples for different conditions of its use; the whole spectrum of possible conditions is considered, from those in which the method determines the outliers accurately to those in which it does not work or its results are ambiguous. It is established that the method described in the paper allows outliers to be detected automatically, regardless of their nature, when the relationship between the variables is linear. For a nonlinear relationship between the variables, outliers are detected after linearization of the variables (linearization in such situations is a prerequisite for data analysis). In questionable and uncertain cases, and with a complex dependence, the algorithm defines outliers as those experiments whose removal leads to an improvement of the linear regression model (a priori, prior to its construction). The method does not work in the case of outliers that compensate each other, but in such situations the presence of outliers does not lead to a significant shift of the model coefficients. Regression models have been constructed that show the changes in the characteristics and values of the model's regression coefficients under the influence of outliers. Using the method makes it possible to identify questionable experimental points for further decision making. The method can be used in automated information processing systems, since it automatically detects the presence of outliers or improves the linear regression model. Tabl.: 14. Figs.: 9. Refs.: 12 titles.
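
The leave-one-out idea behind the algorithm can be illustrated as follows (a simplified sketch: a point is flagged when deleting it noticeably increases the correlation coefficient; the paper's actual procedure uses LSD-type distance calculations and confidence intervals rather than this fixed threshold):

    import numpy as np

    def jackknife_outliers(x, y, min_gain=0.05):
        """Flag points whose removal raises |Pearson r| by at least `min_gain`."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        r_full = abs(np.corrcoef(x, y)[0, 1])
        flagged = []
        for i in range(len(x)):
            mask = np.arange(len(x)) != i
            r_i = abs(np.corrcoef(x[mask], y[mask])[0, 1])
            if r_i - r_full >= min_gain:
                flagged.append(i)
        return flagged

    # Linear data with one injected outlier at index 5.
    rng = np.random.default_rng(0)
    x = np.linspace(0, 10, 20)
    y = 2 * x + rng.normal(0, 0.5, 20)
    y[5] += 15.0
    print(jackknife_outliers(x, y))  # expected to include index 5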


UDC 004.82, 004.89, 004.853, 004.855, 005

Zhyriakova I.A., Holub S.V. A new approach to the formal explicit specification of knowledge acquisition. Mathematical machines and systems. 2019. N 4. P. 139–145. 

The decision-making process in many subject areas is a rather nontrivial problem. The use of “smart systems” to support decision-making processes requires new approaches to knowledge acquisition and storage. A new concept of building model knowledge bases is described. At the heart of the concept is the assertion that knowledge is represented as a hierarchical combination of multiparametric models. A comprehensive analysis of existing approaches to working with knowledge and the choice of a knowledge representation model is envisaged. In accordance with the methodology proposed by the authors for creating model knowledge bases, a new method of ontological modeling is described. This method is based on fundamentally new approaches to forming the knowledge representation model and to constructing the architecture of model knowledge bases and of the intelligent system as a whole. This makes it possible to significantly expand the functionality of intelligent systems, including monitoring intelligent systems, to automate the synthesis of models and global functional dependencies, and to reduce the influence of the researcher's subjective judgments on the results of data conversion. The results of investigating the ontological modeling methods used for the formalization and systematization of the elements of global functional dependencies, which combine models of the objects of observation, are described. These models are solutions to local problems of information transformation in the selected subject areas. Depending on the purpose of the intelligent system, a combination of global functional dependencies is realized at the highest level, and thus the structure of the model knowledge base is formed. Using the strategy of coordinating elements, which is used by the monitoring intelligent system, allows synergies to be formed between individual local models within the structure of a global functional dependence and between the global functional dependencies at the highest level of the model knowledge base. Figs.: 3. Refs.: 12 titles.
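
The core idea of representing knowledge as a hierarchical combination of multiparametric models can be sketched as follows (hypothetical local models and an arbitrary combination rule, not the paper's ontological formalism):

    # Level 1: local models, each solving one information-transformation task
    # in its subject area (hypothetical linear forms).
    def local_model_a(x1, x2):
        return 0.4 * x1 + 1.2 * x2

    def local_model_b(x2, x3):
        return 2.0 * x2 - 0.5 * x3

    # Level 2: a global functional dependency combining the local models;
    # the combination itself is what the model knowledge base stores.
    def global_dependency(x1, x2, x3):
        return 0.7 * local_model_a(x1, x2) + 0.3 * local_model_b(x2, x3)

    print(global_dependency(1.0, 2.0, 3.0))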


   QUALITY, RELIABILITY AND CERTIFICATION OF COMPUTER EQUIPMENT AND SOFTWARE

UDC 621.3.019.3

Mukha Ar.A. Quantitative assessment of the level of computer systems dependability. Mathematical machines and systems. 2019. N 4. P. 146–153. 

This article discusses topical issues of the quantitative assessment of the level of computer system dependability. The proposed method, having obtained numerical values for the dependability levels of the system under study, makes it possible to analyze that system and to decide on the preferred implementation variant with respect to dependability. For this purpose, an attribute model of computer system dependability (AMD) is presented, which is based on the basic properties (attributes) of dependable computer systems (DCS) and on the metrics of these properties. The definitions of the basic attributes and metrics and their computed and qualitative characteristics are given. The problem of formalizing a generalized criterion for the achieved level of system dependability is solved. A formalized criterion for assessing the level of dependability, which is a linear functional, is proposed. Each attribute of the dependable system is considered as a set of metrics that can be measured by computational, experimental or expert methods. On the basis of quantitative estimates of the metrics, quantitative estimates of the attributes are calculated, which makes it possible to calculate quantitative estimates of the achieved level of dependability of the system under consideration for various variants of its implementation. The criterion also takes into account the weighting factors of the influence of the attributes and metrics of the system under study. Quantitative estimates of the level of performance of the metrics are relative values, normalized to the values required in the specification or to limit values. The calculation of the quantitative assessment of the level of computer system dependability is demonstrated on a simplified example, which shows how to obtain the numerical value of the dependability level of the system under study. The described calculation method allows a detailed analysis of the dependability levels of various implementation options of the CS under investigation and allows reasoned conclusions about the advantages of one system being created over another. Tabl.: 9. Refs.: 4 titles.
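
A worked numeric sketch of such a weighted, normalized criterion (hypothetical attributes, metrics and weights; the AMD model in the paper defines its own sets of attributes and metrics):

    # Each attribute is a weighted set of metrics given as
    # (name, achieved value, value required by the specification, metric weight);
    # each metric is normalized to its requirement and capped at 1.0.
    attributes = {
        "reliability":  {"weight": 0.5, "metrics": [("mtbf_hours", 9000.0, 10000.0, 0.7),
                                                    ("test_coverage_pct", 98.0, 95.0, 0.3)]},
        "availability": {"weight": 0.3, "metrics": [("uptime_pct", 99.5, 99.9, 1.0)]},
        "security":     {"weight": 0.2, "metrics": [("audited_code_pct", 80.0, 100.0, 1.0)]},
    }

    def dependability_level(attrs):
        """Linear functional: weighted sum of attribute scores, each of which is a
        weighted sum of metric values normalized to the specification requirement."""
        total = 0.0
        for attr in attrs.values():
            score = sum(w * min(achieved / required, 1.0)
                        for _, achieved, required, w in attr["metrics"])
            total += attr["weight"] * score
        return total

    print(f"Dependability level: {dependability_level(attributes):.3f}")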


UDC 681.3 (075)

Kostanovskyi V.V. Determining the area of existence of reliability indicators depending on the acceptable values of the performance indicators of the active phased array antenna. Mathematical machines and systems. 2019. N 4. P. 154–164.

The paper considers the determination of the range of possible values of the reliability indicators of the active phased array antennas (APAA) of multifunctional radars. The modernized probabilistic-physical model of APAA reliability is presented in the form of a system of restrictions on the values of the antenna array efficiency indicators: the relative radiation power and the minimum level of the near side lobes of the amplitude-phase distribution of the APAA beam pattern. It is shown that during operation the values of the APAA efficiency indicators can lie in the domain of operable states or in the domain of inoperable states, which in general can be represented as a state matrix. A structural diagram of APAA reliability in the form of a three-level hierarchical structure is proposed. The author refines the failure criteria and presents two new definitions of APAA failures. The formulas and equations for calculating the mean time between failures and the probability of failure-free operation of a phased antenna sublattice and of the APAA as a whole are clarified. The application of the upgraded probabilistic-physical model of APAA reliability is illustrated by the example of calculating a hypothetical APAA with 6400 SCDs arranged in 100 antenna sublattices. It is assumed that failures of the control panel, receiver modules and secondary power supplies are exponentially distributed. The results of the reliability calculations are presented in the form of figures and a 10 × 15 matrix of APAA states. Based on the calculated matrix elements, the domains of operable and inoperable states of the APAA are determined. The modernized probabilistic-physical model of APAA reliability proposed in the paper can be useful in the design and operation of multifunctional radars. Tabl.: 1. Figs.: 3. Refs.: 11 titles.
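
For orientation, the exponential-failure arithmetic involved in such calculations looks like this (a much-simplified sketch assuming independent exponential failures and a series structure within one sublattice, with invented failure rates; it does not reproduce the paper's three-level model or its failure criteria):

    import math

    # Hypothetical failure rates (failures per hour) of the components of one sublattice.
    failure_rates = {
        "transceiver_modules": 64 * 2.0e-6,   # 64 modules assumed per sublattice
        "control_unit":        1.0e-6,
        "secondary_power":     3.0e-6,
    }

    lam = sum(failure_rates.values())          # series structure: rates add
    mtbf_hours = 1.0 / lam                     # mean time between failures
    t = 1000.0                                 # mission time, hours
    p_no_failure = math.exp(-lam * t)          # probability of failure-free operation

    print(f"MTBF = {mtbf_hours:.0f} h, P(no failure over {t:.0f} h) = {p_no_failure:.4f}")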

