Currently, there is a growing number of publications devoted to the analysis, evaluation, and optimal selection of AI models for specific applications, as well as commercial services that compare and select AI tools for specific tasks. Alongside this mainstream information, it is interesting to review the capabilities and characteristics of tools based on AI models' «opinions» about themselves and their «colleagues». The purpose of this article is to conduct an experimental comparative review of model characteristic assessments and to identify possible trends in them. The current generation of popular, accessible intelligent assistants, aimed at a wide range of users and tasks, has been chosen for review. In particular, mutual assessments of the characteristics of ChatGPT versions 3.5+DALL-E, 4o, and 5, DeepSeek V3, Gemini 1.5 and 2.5, and Claude Sonnet 3 and 4 are obtained and discussed. Cross-referenced 5-point self-assessments of current model generations according to specified criteria, together with their assessments of the expected capabilities of ChatGPT-5, are provided and analyzed. The advantages of the models are discussed in the context of balanced assessments and recommendations for their application. Ambiguity, inconsistency, and possible deviations from objectivity have been identified. Models often demonstrate bias by overestimating their own capabilities. Among the possible reasons for this (obsolescence of some knowledge bases, a tendency to hallucinate, etc.), the most likely is considered to be the competitive and marketing orientation of models from different manufacturers. Therefore, when using the models themselves for such assessments and selection, it is essential to weigh their opinions collectively and to exclude self-assessments. The emergence of GPT-5 is noted as significant for the «competitive landscape» and for approaches to model evaluation criteria. Tabl.: 7. Refs.: 10 titles.
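As an illustration of the aggregation rule recommended above, the following minimal Python sketch averages 5-point cross-assessments for each model while excluding its self-assessment; the model names and scores are hypothetical placeholders, not the article's measured data.

```python
# A minimal sketch of the recommended aggregation: average the 5-point
# cross-assessments for each model while excluding its self-assessment.
# Model names and scores below are hypothetical placeholders.

ratings = {
    # rater -> {rated model -> 5-point score}
    "ChatGPT": {"ChatGPT": 5.0, "Gemini": 4.0, "Claude": 4.0},
    "Gemini":  {"ChatGPT": 4.5, "Gemini": 5.0, "Claude": 4.0},
    "Claude":  {"ChatGPT": 4.0, "Gemini": 4.5, "Claude": 5.0},
}

def peer_score(model: str) -> float:
    """Mean score a model receives from all raters except itself."""
    peers = [r[model] for rater, r in ratings.items() if rater != model]
    return sum(peers) / len(peers)

for model in ratings:
    print(f"{model}: self={ratings[model][model]:.1f}, peers={peer_score(model):.2f}")
```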
In recent years, the transfer of critical data from institutions and enterprises to the cloud has been rapidly gaining momentum, so researching the capabilities of cloud technologies, particularly those offered by Oracle, is highly relevant. Oracle Cloud offers three types of cloud services: SaaS (Software as a Service), PaaS (Platform as a Service), and IaaS (Infrastructure as a Service), each addressing its own range of tasks and needs. Oracle Cloud Infrastructure offers a wide range of cloud services focused on high-performance computing and data storage and provides flexible and powerful solutions for computing tasks, including virtual machines and bare-metal servers that support various architectures, such as x86 and ARM. Special attention is paid to options that include high-performance graphics processing units (GPUs) suitable for intensive AI computing. These GPU instances include NVIDIA graphics processors (H100, A100, L40S), which are well suited to training large language models and other AI workloads. In addition to GPU instances, Oracle supports container environments and serverless infrastructure, simplifying the deployment of cloud applications and microservices. Storage services include a wide range of features designed for different data storage needs. This enables efficient management of large amounts of data in real time, which is especially important for business-critical processes. Together, these capabilities create a powerful and flexible infrastructure stack that is suitable for a wide range of tasks, from database management to high-performance computing and long-term data storage. Support for different architectures and high-speed networks enables Oracle to compete with leading cloud providers and meet the requirements of companies that need efficient and reliable cloud solutions for their business. Figs.: 4. Refs.: 5 titles.
The problem of visualization and visual analytics in modern information systems of organizational management is considered. The difference between visualization and visual analytics is clarified. Various techniques, such as military maps, bar charts, and organizational charts, are analyzed together with their connection to specific analytics goals. The role of integrating individual tools into a unified environment, which contributes to a common understanding, in particular through the use of military maps as a foundation, is examined. The paper shows that the use of dashboards for aggregating multiple metrics enables more rapid comprehension of the operational status. The transformation of complex geospatial and tactical information into intelligible formats for command and control is considered. To check the effectiveness of visualization tools in information systems of organizational management, it is proposed to use a method of visual form evaluation that makes it possible to calculate the values of their characteristics based on the concept of entropy. Information content and complexity are quantified through entropy, enabling the assessment of detail preservation and noise levels. Thus, visualization tools are optimized to become a key component of visual analytics, which will improve decision-making. Special attention is given to the visualization of ontologies, which enables semantic structuring of knowledge domains and facilitates interoperability among decision-support tools. Visualization tools are adapted to the needs of visual analytics using such indicators as usability, interactivity, adaptability, and cognitive load reduction. A list of methods has been compiled that were tested and validated within a complex systems modeling framework and demonstrated effectiveness in visualizing information critical for managerial decision-making. Figs.: 8. Refs.: 15 titles.
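The entropy-based evaluation mentioned above can be illustrated by a minimal sketch that computes the Shannon entropy of a grayscale-intensity histogram as a proxy for the information content of a visual form; the test images are synthetic placeholders.

```python
# A minimal sketch of entropy-based evaluation of a visual form: the
# Shannon entropy of the intensity histogram serves as a proxy for
# information content. The image arrays are synthetic placeholders.

import numpy as np

def shannon_entropy(image: np.ndarray, bins: int = 256) -> float:
    """Entropy (bits) of the intensity distribution of a 2-D image."""
    hist, _ = np.histogram(image, bins=bins, range=(0, bins))
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins to avoid log(0)
    return float(-(p * np.log2(p)).sum())

# A noisy region yields higher entropy than a flat background.
flat = np.full((64, 64), 128)
noisy = np.random.randint(0, 256, size=(64, 64))
print(shannon_entropy(flat), shannon_entropy(noisy))
```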
One of the urgent tasks of environmental management is the efficient recycling of waste based on its sorting. In an intelligent waste sorting system, the image processing algorithm is a key component that determines its efficiency and accuracy. In open datasets, objects in images are usually large, unlike those captured under real sorting-line conditions, which leads to a shortage of samples containing small objects. As a result, a model may have difficulty with detection, manifested in missed detections or inaccurate positioning. The article combines materials from open sources and photographs obtained in laboratory conditions to create a specialized dataset for visual waste sorting. The dataset includes images with a single object, multiple objects, blurred objects, a laboratory background, and a complex background. All images are divided into four main categories: food waste, recyclable waste, hazardous waste, and other waste. An efficient, lightweight YOLO model has been chosen as the base model because its smaller number of parameters and high processing speed help achieve low energy consumption and cost efficiency in the intelligent sorting system. The paper analyzes popular image processing and recognition algorithms based on convolutional neural networks and, building on them, proposes an improved composite algorithm for garbage detection and classification. A computational experiment has also been conducted to confirm the effectiveness of the proposed algorithm on real training samples. It is shown that the overall performance of the proposed algorithm exceeds that of current popular object recognition algorithms, ensuring efficient and accurate garbage recognition. Tabl.: 6. Figs.: 18. Refs.: 18 titles.
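As a hedged illustration of running a lightweight YOLO detector on sorting-line imagery, the sketch below uses the stock Ultralytics API; the weights file waste_yolo.pt and the frame name are hypothetical, and the paper's improved composite algorithm is not reproduced here.

```python
# A minimal inference sketch assuming the lightweight baseline is used
# through the Ultralytics API; "waste_yolo.pt" is a hypothetical name for
# weights fine-tuned on the four waste categories.

from ultralytics import YOLO

model = YOLO("waste_yolo.pt")  # hypothetical fine-tuned lightweight checkpoint
results = model.predict("conveyor_frame.jpg", conf=0.25, imgsz=640)

for r in results:
    for box in r.boxes:
        cls_name = r.names[int(box.cls)]  # e.g. food / recyclable / hazardous / other
        x1, y1, x2, y2 = box.xyxy[0].tolist()
        print(f"{cls_name}: conf={float(box.conf):.2f}, "
              f"box=({x1:.0f},{y1:.0f},{x2:.0f},{y2:.0f})")
```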
This article explores a modern approach to environmental monitoring of ambient air quality using OLAP (On-Line Analytical Processing) technologies. The relevance of this research stems from the high levels of air pollution in urbanized areas, which lead to serious environmental and social consequences. Existing monitoring systems often lack real-time access to multidimensional analytics and are unable to effectively support decision-making processes. The proposed information-analytical system is based on OLAP cubes, enabling flexible data analysis across various dimensions, such as time, location, pollutant type, and quality indices. Two separate analytical blocks have been developed: one for analyzing measured concentrations of harmful substances (Measurements), and the other for detecting exceedances of regulatory thresholds (ThresholdExceed). Data are integrated from multiple sources, including automated stationary stations and community-based sensor networks. The system is implemented in Power BI, providing interactive visualizations, KPI indicators, and analytical dashboards. It enables anomaly detection, seasonal pattern analysis, and a comprehensive overview of pollution dynamics. In particular, abnormal PM10 values were recorded in 2024, along with radiation spikes observed in 2022. The proposed approach demonstrates the effectiveness of OLAP technologies as tools for developing flexible, scalable, and visually oriented environmental analytical systems that can be integrated into national environmental monitoring frameworks. Figs.: 9. Refs.: 11 titles.
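The logic of the ThresholdExceed block can be sketched with pandas standing in for the OLAP cube: measurements are sliced by time, station, and pollutant and compared against regulatory limits. The column names, thresholds, and data below are illustrative assumptions, not the system's actual schema.

```python
# A minimal sketch of threshold-exceedance detection with an OLAP-style
# roll-up; schema, limits, and values are illustrative assumptions.

import pandas as pd

measurements = pd.DataFrame({
    "timestamp": pd.to_datetime(["2024-03-01", "2024-03-01", "2024-03-02"]),
    "station":   ["S1", "S2", "S1"],
    "pollutant": ["PM10", "PM10", "NO2"],
    "value":     [62.0, 38.0, 45.0],
})
limits = {"PM10": 50.0, "NO2": 40.0}  # assumed regulatory thresholds

measurements["limit"] = measurements["pollutant"].map(limits)
measurements["exceeded"] = measurements["value"] > measurements["limit"]

# Roll up along the (month, pollutant) dimensions, as an OLAP cube would.
cube = measurements.pivot_table(index=measurements["timestamp"].dt.to_period("M"),
                                columns="pollutant", values="exceeded", aggfunc="sum")
print(cube)
```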
Modern agricultural technologies involve the widespread use of information and technical systems to obtain timely information about the agrobiological state of the soil environment. Such information is the basis for operational management of the agrobiological potential of fields. There is therefore a need to develop fundamentally new methods for operational monitoring of the agrobiological conditions of agricultural lands and for operational decision-making on variable rates of technological material application, aimed at ensuring the proper quality of technological operations in agro-industrial production. One of the modern elements of such an information and technical system is a system for measuring the conductive properties of the soil environment, so analyzing its hardware is an important and relevant topic of modern scientific research. The article reviews the requirements for and architecture of the hardware of the information and technical system designed to measure the electrical conductivity and related parameters of the soil environment. The measurement methods (two-electrode, four-electrode, and electrical tomography) are analyzed; the components of the hardware are described (current source/excitation, electrodes, amplifiers/filters, ADC, controllers, communications, power supply); and recommendations are given for the selection of components, calibration, improving the accuracy and stability of measurements, as well as testing and validation approaches. Additionally, the issues of protection, grounding, interface unification, and integration with software for data processing and transmission are discussed. Tabl.: 1. Refs.: 15 titles.
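As a worked example for the four-electrode method reviewed above, assuming the classic Wenner configuration with equal electrode spacing a, the apparent resistivity is ρ = 2πaV/I and the conductivity is its reciprocal:

```python
# A worked example for the four-electrode method under the assumption of a
# Wenner array with equal electrode spacing a: rho = 2*pi*a*V/I, EC = 1/rho.

import math

def wenner_conductivity(a_m: float, v_volts: float, i_amps: float) -> float:
    """Apparent soil electrical conductivity (S/m) from a Wenner array."""
    rho = 2.0 * math.pi * a_m * v_volts / i_amps  # apparent resistivity, ohm*m
    return 1.0 / rho

# Example: 0.3 m spacing, 0.12 V measured at 10 mA of injected current.
print(f"EC = {wenner_conductivity(0.3, 0.12, 0.010):.4f} S/m")
```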
The work presents the adjustment and evaluation of the WRF meteorological model for forecasting meteorological conditions accompanying storms over the Black Sea (BS). For the first time, modeling of meteorological conditions over the surfaces of the Black and Azov Seas has been carried out for the most powerful storm that occurred between February 4 and February 11, 2023. The synoptic conditions of the storm occurrence have been analyzed. The modeling has been carried out using computational grids with resolutions of 0.15, 0.05, and 0.025 degrees. Calculations of the impact of resolution on the modeling results show that when the grid step is reduced threefold (from 0.15 to 0.05 degrees), the absolute errors of meteorological parameters are reduced by 8.4% for pressure, 5.3% for wind direction, 6% for wind speed, and 16% for temperature. Further refinement of the grid from 0.05 to 0.025 degrees led to only a slight reduction in errors. The best values of the average absolute errors of meteorological parameters for the considered scenario have been obtained using a grid with a spatial resolution of 0.025 degrees: 1.17°C (temperature), 2.37 m/s (wind speed), 31.79 degrees (wind direction), and 79.19 Pa (surface pressure). The best agreement between the simulation results and measurements has been obtained for weather stations located in the northwestern, western, and southern parts of the BS. The degree of influence of the spatial resolution on the simulated wind fields varied depending on time and location within the computational domain. The impact of the grid spatial resolution is more significant near the northern and eastern coasts of the BS, which is most likely associated with the more complex topography of the terrain in the eastern part of the BS coast. In conclusion, the adjusted WRF model can be further used for operational forecasting of meteorological conditions accompanying severe storms in the Black Sea, including its coupling with models of oceanographic conditions. Tabl.: 3. Figs.: 5. Refs.: 15 titles.
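The error metric underlying the comparison above can be illustrated by a minimal sketch computing the mean absolute error of simulated versus station-observed values for a single parameter; the arrays are placeholders, not the study's data.

```python
# A minimal sketch of the verification metric: mean absolute error between
# model output and station measurements. Arrays are placeholder values.

import numpy as np

def mae(simulated: np.ndarray, observed: np.ndarray) -> float:
    """Mean absolute error between model output and measurements."""
    return float(np.mean(np.abs(simulated - observed)))

sim_temp = np.array([4.1, 3.8, 5.2, 6.0])  # degrees C, model grid points
obs_temp = np.array([3.0, 4.5, 4.8, 7.1])  # degrees C, station records
print(f"MAE(temperature) = {mae(sim_temp, obs_temp):.2f} C")
```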
This paper presents an approach to embedded system software implementation based on a Model-View-Controller (MVC) architecture, specially modified for the multi-agent domain. It aims to overcome the poor scalability and single-point-of-failure risks associated with centralized, cyclic-polling architectures in unmanned aerial vehicle (UAV) swarms. In this novel interpretation, the UAV swarm itself is re-envisioned as the User, a continuously updated state of the Control E-Network (CEN) (token markings and attributes) serves as the View, and the network's transitions and listeners function as the Controller, manipulating the model. The model itself is expressed as a CEN, an extension of Petri nets for control purposes. The places and data of the CEN represent the agent's state, while, more broadly, the CEN provides a view of the control algorithm as a set of transitions and their related places that implement event-driven logic. At the same time, listeners integrate external input signals and generate output commands in a fully reactive manner. The resulting chain-driven, event-oriented execution eliminates cyclic polling and reduces CPU overhead while simultaneously supporting synchronous, asynchronous, and parallel event processing in real time. The paper details the Python-based data structures (places, transitions, queues/stacks, a thread pool, and reactive listeners), together with a dynamic verification method that automatically collects statistics on transition activity and timing, enabling on-the-fly profiling and detection of performance bottlenecks. A comprehensive example demonstrates an agent program with a three-layered (reactive, planning, and cooperative) control model that reacts in parallel to sensor events, performs delayed actions, and synchronizes results through a joining transition, thereby confirming the effectiveness of the proposed approach for multi-agent applications. Figs.: 4. Refs.: 14 titles.
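A minimal sketch, not the authors' implementation, of the event-driven execution described above: a listener deposits tokens into places, and a transition fires through a thread pool as soon as all of its input places are marked, with no cyclic polling. All names and structures below are illustrative.

```python
# A minimal event-driven transition in the spirit of the CEN: places are
# deques of tokens, a listener enqueues tokens, and the transition fires
# asynchronously once every input place is marked. Illustrative only.

from collections import deque
from concurrent.futures import ThreadPoolExecutor

class Transition:
    def __init__(self, name, inputs, action):
        self.name, self.inputs, self.action = name, inputs, action

    def enabled(self):
        return all(place for place in self.inputs)  # every input place marked

    def fire(self, pool: ThreadPoolExecutor):
        tokens = [place.popleft() for place in self.inputs]
        pool.submit(self.action, tokens)  # asynchronous, parallel execution

sensor, command = deque(), deque()
t = Transition("react", [sensor, command],
               lambda toks: print("fired with", toks))

with ThreadPoolExecutor(max_workers=2) as pool:
    def on_event(place, token):      # listener: deposit token, then try to fire
        place.append(token)
        if t.enabled():
            t.fire(pool)
    on_event(sensor, "obstacle")     # transition not yet enabled
    on_event(command, "turn-left")   # both places marked -> fires once
```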
Quality characteristics of rolled products represent a vector of parameters whose components include mechanical properties, surface condition, and geometric dimensions. The required values of the quality parameters for rolled products are specified by national standards (DSTU) and technical specifications (TU) as limited (one-sided or two-sided) tolerance intervals. Quality parameters are formed to a greater or lesser extent during the rolling process and can be described by corresponding mathematical models. Such models are primarily mathematical dependencies of the respective quality indicator on the vector of rolling mode parameters. Automated rolling control aims at optimizing the balance of quality indicators to achieve maximum profit. Under these conditions, an optimization problem with a vector criterion may be posed. However, when reducing the vector criterion to a scalar one, which is necessary for solving the problem, the issue of determining weighting coefficients for the different indicators arises; in practice, it is usually resolved through expert evaluations. With such a construction of the criterion, the efficiency of optimization for automatic control according to this criterion depends largely on the experts' competence. A more rigorous approach to formulating the automation problem, whose solution would ensure maximum automation efficiency, is enabled by decomposing the general automation problem for the object into a set of single-criterion problems. The paper considers the formulation and decomposition of the problem of comprehensive automated quality control of rolled products on a heavy-plate mill. It presents a description of the autonomous control tasks obtained as a result of decomposition, which ensure the specified quality indicators of the rolled products. The scientific and technical solutions presented can be used in the development of automated quality control systems for rolling on heavy-plate mills. Refs.: 18 titles.
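The scalarization step discussed above can be illustrated with a minimal sketch in which a vector of normalized quality indicators is folded into a single criterion using expert weights; the indicator names and weights are illustrative assumptions.

```python
# A minimal sketch of reducing a vector quality criterion to a scalar one
# with expert weights; names and values are illustrative assumptions.

def scalar_criterion(indicators: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted sum of normalized quality indicators (weights sum to 1)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[k] * indicators[k] for k in indicators)

q = {"strength": 0.92, "surface": 0.85, "geometry": 0.78}  # normalized to [0, 1]
w = {"strength": 0.5,  "surface": 0.2,  "geometry": 0.3}   # expert weights
print(f"J = {scalar_criterion(q, w):.3f}")
```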
Following the previous paper, in which an evacuation time calculation algorithm was implemented based on manually prepared data, this paper formalizes a complete end-to-end module for importing a PDF/image evacuation plan and automatically converting it into an object and graph model for further calculations according to DSTU standards. A hybrid computer vision pipeline is proposed: vector analysis of PDF objects (lines/polylines/texts/layers) and a raster branch using OpenCV (HSV segmentation of green «EXIT» signs and arrows, wall detection using LSD/Hough, door detection as local «gaps», morphological skeletonization, and a distance transform for estimating the local passage width). The module output consists of a set of objects (exits, doors, stairs, sections, passages) and a directed graph (V, E) with attributes of length and width, directly consumed by the software from the first paper [1]. To evaluate the performance of automatic evacuation plan import, several key factors are considered. First, we check whether the system correctly finds the exits. Next, the accuracy of recognized directional arrows, which indicate movement orientation, is assessed. For corridors, it is important that the generated «passage mask» matches the actual layout. Additionally, the connectivity of the graph is analyzed (whether an exit can be reached from any point and whether the routes can be fully reconstructed). Moreover, the final outcome, the calculated evacuation time, is evaluated and compared with manual calculations according to the DSTU methodology. Thus, both the correctness of object recognition and the accuracy of the final calculations are verified. The result is the elimination of manual input to the program, reproducibility, and suitability for reporting. The module is integrated with the previously developed object-oriented software (classes Area, EvacuationRoute, Metrics; BFS/DFS), ensuring a full cycle. Figs.: 8. Refs.: 17 titles.
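The graph connectivity check described above can be sketched as a breadth-first search verifying that an exit is reachable from every node of the imported graph (V, E); the adjacency data below are illustrative placeholders, not output of the module.

```python
# A minimal sketch of the connectivity check: verify with BFS that at least
# one exit is reachable from every node. Adjacency data are placeholders.

from collections import deque

def reachable_exits(graph: dict[str, list[str]], exits: set[str]) -> set[str]:
    """Nodes from which at least one exit can be reached via BFS."""
    ok = set()
    for start in graph:
        queue, seen = deque([start]), {start}
        while queue:
            node = queue.popleft()
            if node in exits:
                ok.add(start)
                break
            for nxt in graph.get(node, []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return ok

plan = {"room1": ["corridor"], "room2": ["corridor"],
        "corridor": ["exit1"], "exit1": []}
print(reachable_exits(plan, {"exit1"}) == set(plan))  # True: fully connected
```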
The article is devoted to the study of current trends in the development of the dairy industry, with an emphasis on integrating mathematical modeling and artificial intelligence tools into the processes of production planning and product supply. The paper proposes a mathematical model for planning the production of dairy products that takes into account the balance between available raw material stocks and production needs, the shelf life of finished products, the minimum profitable and maximum permissible production volumes, as well as storage and transportation costs. The model belongs to the class of multi-criteria NP-hard problems, which requires the use of modern optimization algorithms and approximate solution methods. It is focused on minimizing the costs of the entire production and logistics chain and on forming an optimal product range for different consumer groups. Considerable attention is paid to the need for flexible order management, where dispatchers and technologists coordinate the use of raw materials, ensuring the fulfillment of both domestic and export contracts in large batches. Research has also been conducted on the use of artificial intelligence and data analysis libraries, which will be included in the decision support system under development. The authors emphasize that the combination of mathematical methods and artificial intelligence technologies forms a new management paradigm in the dairy industry that meets the modern requirements of sustainable development and digital transformation of the food industry. The proposed mathematical model and well-founded tools for its implementation can become the basis for developing innovative decision support systems that will help increase the competitiveness of milk processing enterprises in domestic and global markets. Refs.: 13 titles.
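The cost-minimization core of such a model can be sketched, after reduction to a single-criterion linear program, with scipy; the products, coefficients, and bounds below are illustrative assumptions rather than the authors' actual data.

```python
# A minimal sketch of the cost-minimization core reduced to a linear
# program; all products, coefficients, and bounds are illustrative.

from scipy.optimize import linprog

# Decision variables: tonnes of milk, kefir, cheese to produce.
costs = [12.0, 14.0, 55.0]  # production + storage + transport, per tonne

# Raw-milk balance: 1.0, 1.1, and 9.5 tonnes of raw milk per tonne of
# product, limited by an assumed available stock of 500 tonnes.
A_ub = [[1.0, 1.1, 9.5]]
b_ub = [500.0]

# Minimum profitable and maximum permissible volumes per product.
bounds = [(50, 300), (20, 150), (5, 40)]

res = linprog(c=costs, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print(res.x, res.fun)  # optimal plan and its total cost
```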
QUALITY, RELIABILITY, AND CERTIFICATION OF COMPUTER TECHNIQUE AND SOFTWARE
This article focuses on the development of a comprehensive method for assessing the residual life of a corroded pipeline, based on the analysis of its segments in the defective section. Approaches defined by the international standards ASME B31G and API 570, as well as a probabilistic-physical failure model, the diffusion-monotonic (DM) distribution, have been used as components of the method to analyze the pipeline's suitability for continued operation. The parameters of this failure model include the coefficient of variation of the degradation process, which reflects the dispersion of random variable values, and the mathematical expectation of the operating time to the limiting state. The key parameter of this model is the residual thickness of the defective pipe wall, particularly its threshold value. The initial parameters for the residual life estimation method are derived from measured corrosion depths in defective pipeline segments, to which classical statistical methods are subsequently applied. Two methods have been applied to predict the residual life of the defective pipeline, both based on a probabilistic-physical failure model in accordance with the DSTU 8646 standard: the cumulative DM-distribution function for operating time to the limiting state and the gamma-percentile method for residual life estimation. To ensure a consistent and accurate comparison of these methods, an additional formula has been derived, since the applied approaches use different forms of probabilistic representation of results, which requires consistent interpretation. The developed comprehensive approach enables prediction of the residual life of the defective pipeline for specified probabilities of failure-free operation or failure, as well as for any values of operating time until the limit state is reached. Tabl.: 5. Figs.: 3. Refs.: 13 titles.
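Under the common assumption that the DM-distribution has the CDF F(t) = Φ((t − μ)/(ν√(μt))), where μ is the mean operating time to the limiting state and ν is the coefficient of variation of the degradation process, the gamma-percentile residual life can be sketched as the root of F(t) = 1 − γ:

```python
# A minimal sketch assuming the DM (diffusion-monotonic) CDF
# F(t) = Phi((t - mu) / (nu * sqrt(mu * t))); the gamma-percentile life
# is the operating time at which the failure-free probability equals gamma.

from math import sqrt
from scipy.stats import norm
from scipy.optimize import brentq

def dm_cdf(t: float, mu: float, nu: float) -> float:
    """Probability that the limit state is reached by operating time t."""
    return norm.cdf((t - mu) / (nu * sqrt(mu * t)))

def gamma_percentile_life(gamma: float, mu: float, nu: float) -> float:
    """Operating time with probability gamma of failure-free operation."""
    return brentq(lambda t: dm_cdf(t, mu, nu) - (1.0 - gamma), 1e-6, 10 * mu)

# Illustrative parameters: mu = 30 years, nu = 0.4, gamma = 0.95.
print(f"t_0.95 = {gamma_percentile_life(0.95, 30.0, 0.4):.1f} years")
```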
The word «spatial» fundamentally relates to human existence and activity in any terrestrial, and now even celestial, spaces. The aim of this book is to describe a universal high-level model and technology for dealing with spatial systems of different natures. After reviewing the spatial features of many areas, the book describes the basics of high-level Spatial Grasp Technology for dealing with large distributed systems, together with its Spatial Grasp Language (SGL), which can provide spatial vision, awareness, management, and even consciousness. It shows how SGL scenarios self-evolve in physical, virtual, or imaginable spaces, and demonstrates the main details of the technology implementation, in which a distributed SGL interpreter integrated with communication systems can convert the whole into a powerful spatial engine. Universal techniques are described in SGL for vision, understanding, and modification of large distributed spaces. The numerous SGL scenarios provided reflect applications in many areas, including security, autonomous robots, automatic control, and withstanding cyberattacks in networked systems. The book also compares the Spatial Grasp model with traditional algorithms, confirming the universality of the former for any spatial systems, with the latter being just tools for concrete applications. The technology can be implemented on any platform, as has already been done in prototypes of its previous versions in different countries.