2022, Volume 26, Issue 1 (PDF)
We present an extension of the paraconsistent logic N4 with operators for meaningfulness and nonsensicality. This logic contains three congruentiality-breaking unary connectives, giving rise to a tetralateral sequent calculus with four different sequent arrows.
Keywords: inconsistency-tolerant logic, sequent calculus, proof-theoretic multilateralism.
Free electron lasers are becoming increasingly powerful and affordable installations for determining the structure of proteins. Conformational rearrangements are characteristic of the vast majority of proteins, but the transience of transformations often does not allow obtaining crystals of sufficient size. The solution could be to obtain X-ray diffraction patterns from single molecules, but in this case it is not possible to determine their orientation, which makes it impossible to restore the structure by modern methods. The paper considers the applicability of a number of neural network architectures to the recognition of the conformational state of a protein based on diffraction data on single bacteriorhodopsin molecules.
Keywords: X-ray diffraction, neural networks, conformational states of protein, bacteriorhodopsin
Machine learning of intelligent control systems by symbolic regression methods is considered. Symbolic regression makes it possible to find mathematical expressions for various problems in which the structure and parameters of an unknown multidimensional function must be found. The search for the unknown function is carried out by a genetic algorithm in the code space of the symbolic regression method. Particular attention is paid to functions containing conditional operators, which are a mandatory component of intelligent control system programs.
Keywords: symbolic regression, control synthesis, machine learning, optimal control.
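To make the search idea concrete, here is a minimal, purely illustrative sketch: candidate programs are random expression trees over a tiny grammar that includes a conditional operator, and a population search keeps the best-fitting trees. The grammar, the `ifpos` operator, the target function |x| and the selection scheme are all assumptions for this example, not the method from the paper.

```python
import random

random.seed(0)

TERMINALS = ['x', '1.0']

def gen(depth=3):
    # random expression tree; 'ifpos' is the conditional operator
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    op = random.choice(['add', 'mul', 'neg', 'ifpos'])
    if op == 'neg':
        return ('neg', gen(depth - 1))
    if op == 'ifpos':  # if arg1 > 0 then arg2 else arg3
        return ('ifpos', gen(depth - 1), gen(depth - 1), gen(depth - 1))
    return (op, gen(depth - 1), gen(depth - 1))

def ev(e, x):
    if e == 'x':
        return x
    if e == '1.0':
        return 1.0
    op = e[0]
    if op == 'add':
        return ev(e[1], x) + ev(e[2], x)
    if op == 'mul':
        return ev(e[1], x) * ev(e[2], x)
    if op == 'neg':
        return -ev(e[1], x)
    return ev(e[2], x) if ev(e[1], x) > 0 else ev(e[3], x)

# target: |x|, which needs a conditional operator to express exactly
xs = [i / 4 - 2 for i in range(17)]
def fitness(e):
    return sum((ev(e, x) - abs(x)) ** 2 for x in xs)

pop = [gen() for _ in range(200)]
for _ in range(30):
    pop.sort(key=fitness)
    pop = pop[:50] + [gen() for _ in range(150)]  # keep elites, add fresh programs
best = min(pop, key=fitness)
```

A full genetic algorithm would also apply crossover and mutation to the trees; the elitism-plus-resampling loop above is a deliberately simplified stand-in.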
In the contemporary economy, we see evidence of the “productivity paradox” — investments in new information technology do not increase labor productivity on the national economy level. We suppose that centralized online platforms dominating the modern digital economy are the key drivers of this paradox. Artificial Intelligence (AI) development strengthens their dominance due to the concentration of data necessary for AI algorithms learning. At the same time, the business models of online platforms were developed at the turn of the 1990s – 2000s and may not be optimal in the presence of other metaphors, primarily peer-to-peer networks. To limit the dominance of centralized online platforms, the government needs to change its policy towards them, including leading domestic operators. To reduce platform operators’ monopoly power, we propose dividing operators into companies that support collecting and processing data on platforms and business operators that use this data to provide end-user services.
Keywords: productivity paradox, centralized online platform, business model, peer-to-peer platform
This paper provides a brief overview of sentiment analysis, the problems it involves, and the approaches used to solve them. In recent years, machine learning methods have been actively applied to sentiment analysis. The article discusses the main machine learning approaches and their features. Using one of the datasets for the Russian language as an example, the progress of methods on the sentiment analysis task is traced.
Keywords: sentiment analysis, targeted sentiment analysis, sentiment lexicon, machine learning, neural network, BERT.
It’s one thing to solve a problem mathematically and develop an appropriate method, but it’s quite another to port it to hardware components. Even seemingly simple tasks require careful thought about the software architecture of the hardware implementation. It is necessary to understand the structure of RAM and internal memory, including hardware mechanisms for direct copying of data between various subsystems, as well as the functioning and interaction of the individual subsystems (controllers) of a large system (system on a chip). In particular, one must take into account that calculations can be performed not only on the processor but also, for example, on some controller: there are tensor calculations, and there are even more specialized ones based on digital image processing. Even the effective output of graphic information requires attention from the developer, otherwise artifacts (data distortion) may occur during visualization. For the performance of methods, not only the mathematical factor is important, but also hardware support for the implementation of separate algorithm elements. The latter is reflected in the architectures of modern computing systems and embedded processors.
Keywords: hardware implementation, processor, random access memory, controller.
We consider how in the course of biological evolution the brain gradually formed a hierarchical architecture of deep reinforcement learning. Based on this architecture, a working AGI model, ADAM, is proposed, capable of learning more and more complex behavioral skills as the depth of the hierarchy of control levels increases.
Keywords: artificial general intelligence, deep reinforcement learning, hierarchical control systems.
In an infinitely generated functional system of automata with a superposition operation, there are both pre-complete classes and classes that do not fit into any pre-complete one. The article provides examples of both the first and second classes.
Keywords: finite automaton, superposition, closed class.
Cryptocurrencies whose price is tied to physical assets are called stablecoins. This paper shows that the use of a Proof-of-Work consensus algorithm is impractical for stablecoins, because someone has to pay for the hard work done to reach consensus, and the amount of money in the system would decrease. This paper proposes a new consensus algorithm based on the lottery principle that can be used for stablecoins and for central bank digital currencies.
Keywords: Cryptocurrencies, digital currencies of central banks, consensus building algorithms.
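The lottery principle can be illustrated with a minimal sketch (not the algorithm from the paper): the next block producer is drawn deterministically from public data, with winning chances proportional to holdings, so every node computes the same winner and no costly "work" has to be paid for. The function name, the stake-weighting choice and the seeding scheme are assumptions made for this illustration.

```python
import hashlib
import random

def lottery_leader(prev_hash, round_no, stakes):
    """Draw the next block producer by a stake-weighted lottery.

    The draw is seeded with public data (previous block hash and round
    number), so every node computes the same winner deterministically.
    """
    seed = hashlib.sha256(f"{prev_hash}:{round_no}".encode()).digest()
    rng = random.Random(seed)
    names = sorted(stakes)  # fixed order so the draw is reproducible
    return rng.choices(names, weights=[stakes[n] for n in names], k=1)[0]
```

A production design would additionally need the seed to be unbiasable by the previous block's producer (e.g. via a verifiable random function), which this sketch does not address.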
The report presents a new mathematical model of parallel programs and gives an example of its application for verification of a parallel program for matrix multiplication.
Keywords: parallel programs, distributed processes, verification.
In this paper, a present-day problem is considered — the design of hypergraphs. For this purpose, the simulated annealing method was used. The subject area was reviewed along with the main parts of the algorithm and the program was designed. The developed algorithm can be used to solve other similar problems.
Keywords: hypergraphs, simulated annealing, intelligent optimization method
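A minimal sketch of how simulated annealing can drive such a design task, assuming for illustration a toy objective: partition the vertices of a hypergraph into two blocks so that as few hyperedges as possible are cut. The instance, the move (single-vertex flip) and the geometric cooling schedule are illustrative choices, not details from the paper.

```python
import math
import random

random.seed(42)

# Toy hypergraph on 8 vertices; hyperedges are vertex subsets.
edges = [{0, 1, 2}, {1, 2, 3}, {0, 3}, {4, 5, 6}, {5, 6, 7}, {4, 7}]

def energy(assign):
    # number of hyperedges whose vertices fall into both blocks
    return sum(1 for e in edges if len({assign[v] for v in e}) > 1)

assign = [random.randint(0, 1) for _ in range(8)]
best, best_e = assign[:], energy(assign)
T = 2.0
for _ in range(4000):
    v = random.randrange(8)
    cand = assign[:]
    cand[v] ^= 1  # move: flip one vertex to the other block
    d = energy(cand) - energy(assign)
    # accept improvements always, worsenings with Boltzmann probability
    if d <= 0 or random.random() < math.exp(-d / T):
        assign = cand
        if energy(assign) < best_e:
            best, best_e = assign[:], energy(assign)
    T *= 0.999  # geometric cooling
```

Only the energy function and the move are problem-specific; the acceptance rule and cooling loop transfer unchanged to other similar problems, as the abstract notes.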
The report considers the problem of classifying the behavior of complex systems based on a comparison of a characteristic function presented by an expert and a function derived from data. Such situations are typical for economics, sociology, biology, and other fields. To solve the problem, a characteristic function is described in the form of a system of fuzzy conditions describing its behavior, and an algorithm is proposed for calculating the degree of compliance of an empirical function with such a system of fuzzy conditions.
The paper proves that the ring of integers is topologically simple with respect to a certain ring topology. Then we construct a cryptosystem based on commutative topologically simple rings.
Keywords: topologically simple ring, topologically irreducible module, cryptosystem
We discuss the problems of computer science and focus on computational complexity.
Keywords: computer science, computational complexity, fast multiplication
The implementation of the modified Rabiner’s method (MMR) is proposed for calculating the maximum probability of generating a Markov sequence of a given length based on one of a set of stochastic matrices belonging to the class of ergodic stochastic matrices (ESM). The implementation is carried out using elements of a single type, neural adders (NA). Estimates of the complexity of implementing the MMR in terms of the number of NAs and storage elements are obtained as a function of the power of the set of ESMs and their dimensions.
Keywords: Rabiner’s method, stochastic matrices, neural adders, complexity estimates.
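The quantity being computed can be sketched in plain Python, leaving the neural-adder implementation aside: the maximum probability of a state sequence of a given length under a stochastic matrix follows from a max-product (Viterbi-style) recursion, and the result is then maximized over the set of matrices. This is a baseline sketch of the computation, not the circuit construction of the paper.

```python
def max_sequence_prob(P, length):
    # v[j]: maximum probability of a state sequence of the current
    # length ending in state j (max-product recursion)
    n = len(P)
    v = [1.0] * n  # a length-1 sequence is a single state, probability 1
    for _ in range(length - 1):
        v = [max(v[i] * P[i][j] for i in range(n)) for j in range(n)]
    return max(v)

def max_over_matrices(matrices, length):
    # maximize over the whole set of stochastic matrices
    return max(max_sequence_prob(P, length) for P in matrices)
```

For example, under P = [[0.9, 0.1], [0.5, 0.5]] the most probable length-3 sequence stays in state 0, with probability 0.9 · 0.9 = 0.81.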
Fast multiplication algorithms for both large numbers and large square matrices are considered here. When multiplying numbers, a group algebra is introduced, and the Fourier transform is expressed as a representation of the elements of the group algebra in another basis associated with characters. Next, a bigroup algebra is introduced as an extension of the group algebra by operators built from characters acting as diagonal matrices in the standard basis of the group algebra. The analogue of multiplying large numbers via the Fourier transform extends to the bigroup algebra, i.e. to the algebra of matrices.
Keywords: group algebra, characters, bigroup algebra, sign automorphisms, symmetries, values.
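For reference, the classical scheme that the group-algebra construction generalizes, multiplying large numbers via the Fourier transform of their digit sequences, can be sketched as follows. This is the textbook FFT-convolution approach, not the bigroup-algebra algorithm of the paper: the pointwise product in the character (Fourier) basis corresponds to convolution of the digit sequences in the standard basis.

```python
import cmath

def fft(a, invert=False):
    # radix-2 Cooley-Tukey FFT (len(a) must be a power of two)
    n = len(a)
    if n == 1:
        return a[:]
    even, odd = fft(a[0::2], invert), fft(a[1::2], invert)
    ang = (2 if invert else -2) * cmath.pi / n
    out = [0j] * n
    for k in range(n // 2):
        t = cmath.exp(1j * ang * k) * odd[k]
        out[k], out[k + n // 2] = even[k] + t, even[k] - t
    return out

def multiply(x, y):
    # digit lists in little-endian order
    a = [int(d) for d in str(x)][::-1]
    b = [int(d) for d in str(y)][::-1]
    n = 1
    while n < len(a) + len(b):
        n *= 2
    fa = fft([complex(v) for v in a + [0] * (n - len(a))])
    fb = fft([complex(v) for v in b + [0] * (n - len(b))])
    # pointwise product in the character basis = convolution of digits
    conv = fft([u * v for u, v in zip(fa, fb)], invert=True)
    digits = [round(c.real / n) for c in conv]
    carry, out = 0, []
    for d in digits:  # propagate carries to obtain base-10 digits
        carry += d
        out.append(carry % 10)
        carry //= 10
    while carry:
        out.append(carry % 10)
        carry //= 10
    return int(''.join(map(str, out[::-1])))
```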
Cluster analysis has a very wide range of applications; its methods are used in medicine, chemistry, archeology, marketing, geology and other disciplines. Clustering consists of grouping similar objects together, and this task is one of the fundamental tasks in the field of data mining. Usually, clustering is understood as a partition of a given set of points of a certain metric space into subsets in such a way that close points fall into one group, and distant points fall into different ones. In this paper, we offer a local averaging method for calculating the distribution density of data as points in a metric space. Choosing further sections of the set of points at a certain level of density, we get a partition into clusters. The proposed method offers a stable partitioning into clusters and is free from a number of disadvantages inherent in known clustering methods.
Keywords: cluster, algorithm, density, averaging method.
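A minimal sketch of the general scheme described above, with illustrative choices of kernel and level: the density at each point is a local average (here, the fraction of points within a fixed radius), the level set of points above a density threshold is taken, and its connected components under the same radius give the clusters. The radius and level values are assumptions for the toy data, not parameters from the paper.

```python
import random
from collections import deque

random.seed(3)
# two well-separated 2-D blobs as toy data
pts = ([(random.gauss(0, 0.3), random.gauss(0, 0.3)) for _ in range(60)]
       + [(random.gauss(5, 0.3), random.gauss(5, 0.3)) for _ in range(60)])

R = 1.0      # averaging / linking radius (illustrative)
LEVEL = 0.2  # density threshold (illustrative)

def d2(p, q):
    return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

def density(i):
    # local averaging: fraction of all points within radius R of point i
    return sum(d2(pts[i], q) <= R * R for q in pts) / len(pts)

# keep points above the density level, then split the level set into
# connected components of the R-neighbourhood graph
keep = [i for i in range(len(pts)) if density(i) >= LEVEL]
seen, clusters = set(), []
for i in keep:
    if i in seen:
        continue
    comp, queue = [], deque([i])
    seen.add(i)
    while queue:  # breadth-first search over the level set
        j = queue.popleft()
        comp.append(j)
        for k in keep:
            if k not in seen and d2(pts[j], pts[k]) <= R * R:
                seen.add(k)
                queue.append(k)
    clusters.append(comp)
```

Thresholding the averaged density before linking is what keeps sparse outliers from bridging distinct groups, which is one source of the stability the abstract claims.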
The modern situation in the development of society is considered, as society comes close to perceiving the methods and means of artificial intelligence (AI) in everyday life as the implementation of the principles of a new scientific doctrine built on the foundation of «Data Science». The author’s concept of constructing educational trajectories for training specialists in the field of AI is offered.
Keywords: Artificial Intelligence, Higher Education, Data Scientist, Data Engineer, Data Analyst.
The principles of dynamic calculation of indicators and some algorithms of data mining used in the calculation of maps of sequestration and stock of organic carbon in the soils of Russia in the framework of FAO UN projects to create global maps are considered. The description of multi-modal and multi-temporal source data is given: raster grids of various resolutions, vector data in the geographical coordinate system, attribute information. The calculation of final maps and error maps in a distributed network of soil data centers is described as a Big Data task.
Keywords: soil databases, statistical methods, distributed systems, organic carbon.
The report presents the results of joint research carried out by researchers of the National Research Center for Therapy and Preventive Medicine of the Ministry of Health of the Russian Federation and researchers of the Faculty of Mechanics and Mathematics of Lomonosov Moscow State University, showing the applicability of data analysis methods to the problem of predicting the risk of an unfavorable clinical outcome based on clinic patient information (medical information system «Medialog»).
Keywords: preventive medicine, adverse clinical outcome, in-depth data analysis.
To reduce the risks associated with alcohol abuse, it is extremely important to assess alcohol consumption levels. Indicators that capture only the volume of retail sales are not enough, since they do not take into account unrecorded alcohol consumption. The development of an integral index based on available statistical data would make it possible to reduce the risks of purely expert assessment and increase the efficiency of spending budget funds in the field of healthcare. The report describes the available expert and factual information and the structure of the index, and proposes an approach to its assessment by means of fuzzy set theory. The features of implementing the index calculation in the Matlab environment are considered, and examples are given. The related direct and inverse problems, whose solution would optimize the efficiency of the healthcare system with respect to this parameter at the regional and federal levels, are formulated and discussed. The alcohol well-being index is, on the one hand, an important parameter of the healthcare system and, on the other, a typical index of socio-economic processes, so the approaches described in the report can be applied to construct a wide range of such indices.
Keywords: preventive medicine, alcohol well-being, process evaluation and monitoring.
An approach to building a search engine that takes into account both the context and the occurrence of common words between the query and documents is presented.
Keywords: search engines.
The presentation addresses the possibility of applying logical-semantic tools to natural language analysis. Epistemic contexts have been chosen as the most productive for such analysis, since they impose high requirements on systems trying to represent their dynamics. In particular, the truth relation between a sub-operative expression and the complex sentence containing it is not obvious: though the sub-operative expression is false, the complex expression may be true, and vice versa. Another non-obvious point: it is impossible to perform substitution in such contexts based on identity of denotation; a coincidence in meaning (synonymy) is required. In this regard, an intuitively transparent scheme for understanding the semantics of belief contexts, based on the concept of ideal worlds, is proposed.
Keywords: epistemic contexts, semantics, natural language, denotation, meaning.
The cluster approach to natural language analysis addresses the labor and time costs of analyzing this form of data. It makes it possible to see the overall objective picture of scientific research both in dynamic form and in static form, to analyze the current situation. The method combines the capabilities of machine learning and human cognitive abilities, which makes it an efficient way to analyze large amounts of data. The method is suitable for a variety of purposes, and this article gives some examples of its applications.
Keywords: Cluster approach, natural language analysis, patent analysis, tokenization, network
Nowadays, supervised word sense disambiguation (WSD) algorithms attain the best results on the main benchmarks. However, large sense-tagged training sets are required for their training. This requirement hinders the development of word sense disambiguation systems for many low-resource languages, including Russian. To address the knowledge acquisition bottleneck in Russian, in this work we investigate a method for automatic text labelling that is based on an ensemble of weakly supervised WSD models. Our experiments demonstrated that the models retrained on the new pseudo-annotated data outperform the initial models.
Keywords: Word sense disambiguation; Russian dataset; ELMo; BERT.
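The ensemble-based pseudo-labelling step can be sketched as follows. The toy keyword "models" for the ambiguous word *bank* stand in for the actual weakly supervised WSD models, which the abstract does not specify, and the agreement threshold is an illustrative assumption: only contexts on which enough ensemble members agree become training examples.

```python
def pseudo_label(contexts, models, min_agree):
    # keep only contexts on which enough ensemble members agree
    labelled = []
    for ctx in contexts:
        preds = [m(ctx) for m in models]
        top = max(set(preds), key=preds.count)
        if preds.count(top) >= min_agree:
            labelled.append((ctx, top))
    return labelled

# toy stand-ins for weakly supervised WSD models ("bank": river vs. finance)
models = [
    lambda ctx: "river" if "water" in ctx else "finance",
    lambda ctx: "river" if "water" in ctx or "fish" in ctx else "finance",
    lambda ctx: "finance" if "money" in ctx else "river",
]
```

Contexts on which the ensemble splits are simply discarded, trading coverage for pseudo-label precision before the supervised models are retrained.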
The paper examines opinions of VKontakte network users about childbirth. A dataset was annotated with opinions on three classes of positions towards six childbirth-related topics. Experiments on automatic classification of opinions have been carried out. The best results were obtained using the neural network model BERT in the NLI (Natural Language Inference) formulation. It was revealed that the phenomenon of conscious childlessness is actively represented in the network, while having many children remains an uncommon model of behavior. Within the framework of a pro-natalist policy, it is important to support a positive public opinion about parenting and to develop family-life balance for parents.
Keywords: stance detection, classification, BERT, VKontakte, reproductive values, pro-natalist policy.
Acceptability judgments have become the key method of obtaining an empirical base for theoretical language modelling and rule-based NLP. A signature property of this method is the heterogeneity of speakers’ judgments. In this talk we will consider its possible sources. We will present metrics elaborated to measure this heterogeneity and ways to interpret it.
Keywords: grammar, acceptability, consistency, grammaticality judgments.
The present paper is devoted to legal document processing and deals with semantic analysis of the Road Traffic Law. We describe the basic procedures for semantic analysis of «Give way» and «Prohibitive» types of sentences. We describe the search for these sentences and implement their parsing and semantic analysis using syntactic patterns.
Keywords: semantic analysis, syntactic analysis, legal document, rules of the road.
Classical models of artificial neural networks have several disadvantages. To eliminate these shortcomings, a fundamentally new model of an artificial neural network, called the interferential model, is proposed. This model is based on the structure of biological neurons of the human brain. This work describes the operating principles of the interferential model. The results show that the interferential model does not suffer from the disadvantages of classical neural networks. It is well suited for classification tasks, as well as for pattern recognition.
Keywords: neural network, interferential model, interference, data classification, neurotransmitter, synapse, receptor, image recognition.
We give an overview of the main technical problems in modern neurobiology studies that can be successfully solved using machine learning and artificial intelligence technologies. The main difficulties encountered by researchers in the analysis of biosignal time series and in image segmentation are considered. An overview of modern technical solutions using convolutional neural networks for practical problems in neurobiological research is given.
Keywords: neural networks, machine learning, data processing, image processing, time-series analysis
The current success in the field of neural networks is largely due to the availability of a sufficient amount of hardware resources. The paper analyzes the main modern hardware solutions for AI (CPU, GPU, TPU), considers their advantages and disadvantages.
Keywords: GPU, TPU, hardware, neural networks
The magnetic flux leakage (MFL) method is the most common approach for non-destructive testing of oil and gas pipelines. As a result of MFL detection, magnetograms are obtained, often analyzed by semi-automated methods, which leads to a decrease in accuracy and an increase in analysis time. The paper proposes a new CNN architecture for automatic image classification based on magnetograms for oil pipeline diagnostics. Testing the developed algorithms on a held-out sample confirmed the high accuracy and efficiency of the developed solution.
Keywords: Deep learning, Computer vision, Convolutional neural networks, Anomaly detection, Oil pipeline diagnostics, Magnetic Flux Leakage data processing
The purpose of this study is to examine the utility of a software system and data exchange protocols that support the interaction of data collection systems (in particular biosensors), AI-based data processing systems, and user algorithms. The system is designed to reduce the “entry barrier” into the field of AI-based data processing. A prototype of the system has been successfully tested on the example of a gesture recognition problem using a small number of EMG sensors.
Keywords: protocols, electromyography, machine learning, data processing.
Modern technical systems such as power plants are equipped with diagnostic systems. But these systems are imperfect, and accidents still happen. Accidents can lead not only to economic losses but also to socially frightening consequences, such as man-made catastrophes. Another problem, seemingly typical for Nuclear Power Plants, is the excessive duplication of safety systems, which increases the cost of the NPP itself. The solution could be an advanced diagnostic system. Diagnostics as a science can be divided into three main parts: the first is the monitoring of technical condition; the second is finding the root cause of anomalies; the third is forecasting the future state of a technical system. The developed framework can be used to solve all these diagnostic tasks.
Keywords: technical systems diagnostics, machine learning, deep learning, time series analysis, anomaly detection, time series forecasting, data preprocessing, framework
This paper is devoted to the application of statistical models to increase the prediction accuracy of oceanological data. The initial time series are modelled with mixtures of finite normal distributions. Statistical characteristics of constructed mixtures are used to pre-initialize the layers of a recurrent neural network. Forecasts made with the application of statistical models are compared with forecasts made using only the original data. It is demonstrated that a significant improvement in accuracy is observed for all analyzed series.
Keywords: EM, MSM, LSTM, neural networks, machine learning, feature enrichment, finite normal mixtures.
The report considers a topical machine learning problem: human pose estimation from an image. A new formulation of the problem, as the problem of classifying each pixel of the image, is proposed. A solution in the proposed formulation is implemented and compared with the existing regression approach.
Keywords: human pose estimation, convolutional neural networks, keypoint detection.
Currently, video surveillance systems are becoming more widespread. One of the main goals of such systems is to control and track a person’s movement. Solving this problem makes it possible to address such applied tasks as tracking the occupancy of various premises (be they shopping facilities or educational and cultural institutions), creating a motion heatmap, or controlling access to a particular object. The present paper proposes a method based on a combination of various neural networks, which allows solving these problems with high accuracy.
Keywords: Deep Learning, convolutional neural network, ReID, Mask R-CNN, OsNet, ResNet.
Automatic identification of minerals in images of polished sections is in high demand in exploratory geology, since it can significantly reduce the time spent by a human expert in the study of ores and automatically provide high-quality statistics of mineral distribution across different deposits. In this work we propose a deep-learning-based algorithm for automatic identification of minerals in images of polished sections and present the LumenStone dataset, which unites high-quality geological images of different mineral associations and provides pixel-level semantic segmentation masks.
Keywords: Image Segmentation, Deep Learning, Geology, Mineral Identification, Polished Sections, Ore.
An algorithm for designing an identifier based on a recurrent neural network is proposed, and its applicability is demonstrated on a centrifuge-type stand.
Keywords: recurrent neural network, non-parametric identifier, centrifuge, motion cueing.
We solve the problem of autonomous robot path planning by applying and comparing the classical approach and an approach using machine learning algorithms. The task was to "move from point A to point B" on various experimental maps representing a maze with obstacles. We studied the behavior of a two-wheeled robot with different methods of building routes and different sets of sensors. It is assumed that the machine learning approach is easier to develop and requires fewer sensors, which significantly reduces the cost of such a robot.
Keywords: path planning; robotics; motion primitives; reinforcement learning; rl.
Software for pupil detection in video recordings of eye movement has been developed. Two methods are implemented: circle detection and ellipse detection. A method of calibration by the size of the iris is also implemented.
Keywords: video oculograph, recognition of pupil position, vertical and horizontal pupil displacement, OpenCV computer vision libraries.
The existing methods for evaluating the quality of the acceleration effects imitated by the motion cueing system of flight simulation training devices are considered. A technology based on a model of the vestibulo-ocular reflex is proposed. It may make it possible to apply modern technical equipment to an objective comparison of the vestibular response to acceleration effects in a real flight and in simulation. This technology could enhance simulation systems and their validation, improving upset prevention and recovery training.
Keywords: vestibulo-ocular reflex, flight simulators objective quality evaluation, flight safety.
This study is devoted to the problem of forming the movements of an avatar in a virtual environment and developing proposals for modifying the tracking algorithms to create an interactive virtual environment. The presented proposals will allow filtering the tracking data more successfully and predicting the motion of the attachment points of the tracking sensors and, consequently, of the avatar in the virtual environment.
Keywords: inertial body tracking, optical body tracking, virtual reality, VR, tracking algorithms.
The study of declarative memory is a topical area of neurophysiology. In the present work, we tested the possibility of short-term and long-term memory formation in laboratory mice in the novel object location task. The results indicate the possibility of forming both types of declarative memory using this method. We also showed that the glutamatergic system plays a crucial role in long-term declarative memory formation.
Keywords: declarative memory, novel object location, short-term memory, long-term memory.
The Aesop’s test assesses the ability to retrieve bait floating on the surface of the water in a narrow cylinder, out of direct reach, by dropping sinking objects into it. We investigated the ability of hooded crows to solve a modified version of this test (with five types of cylinders and two types of objects) both spontaneously and after training with one cylinder and one object.
Keywords: Aesop’s fable paradigm, understanding cause–and–effect relationships, tool–use, hooded crows.
This study is aimed at elucidating the mechanisms of memory formation in cognitive systems using a mouse model of post-traumatic stress disorder (PTSD). The pattern of activation of brain structures during the development and prevention of PTSD was investigated. Differences were shown in the level of activation of the structures of the fear network of the brain during the development of PTSD and the formation of aversive memory.
Keywords: cognitive systems; memory; post-traumatic stress disorder; aversive learning; protein synthesis inhibition; c-fos.
The study evaluated the induction of immediate early genes c-fos and Arc/Arg3.1 in mouse brain after contextual fear conditioning and after memory retrieval. It was shown that the induction of both genes in a number of brain structures occurs after this exposure, however, only 30% of neurons simultaneously express both genes.
Keywords: immediate early genes, c-Fos, Arc, neuroplasticity, engram.
A principally new type of video camera, the so-called DVS (dynamic vision sensor), became commercially available recently. These cameras are capable of dramatically improving the speed and energy consumption of video signal processing, because instead of scanned raster data of the whole camera view field they send an asynchronous stream of spikes indicating brightness changes of individual pixels. However, this new kind of output signal requires novel processing algorithms. One such algorithm, called SCoBUL (spike correlation based unsupervised learning), is described in the present work. SCoBUL uses a one-layer spiking neural network with lateral inhibition to extract informative features from spike trains in an unsupervised learning regime. SCoBUL’s most important feature is a generalization of the STDP (spike timing dependent plasticity) rule optimized especially for this task.
Keywords: DVS, spiking neural network, network with lateral inhibition, unsupervised learning, synaptic plasticity, STDP.
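The abstract does not give the exact form of SCoBUL's generalized rule, so for orientation here is the classical pair-based exponential STDP that it generalizes; all constants are illustrative. The sign and magnitude of the weight change depend on the relative timing of pre- and postsynaptic spikes.

```python
import math

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    # classical pair-based STDP; dt = t_post - t_pre, in ms
    # pre-before-post (dt > 0) potentiates, post-before-pre depresses
    if dt > 0:
        return a_plus * math.exp(-dt / tau)
    return -a_minus * math.exp(dt / tau)

def update_weight(w, pre_spikes, post_spikes, w_min=0.0, w_max=1.0):
    # all-pairs application of the rule to one synapse, with clipping
    for tp in pre_spikes:
        for tq in post_spikes:
            w += stdp_dw(tq - tp)
    return min(w_max, max(w_min, w))
```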
String-pulling tasks are used to study animal cognition. We developed a new type of string-pulling task and studied the ability of hooded crows to understand its logical structure spontaneously and after training.
Keywords: tool–use, string-pulling task, understanding
To analyze the data generated with whole-brain staining and imaging approaches, several assays have been proposed that register brain samples to atlases and detect stained cells in individual brain regions. However, the algorithms for quantifying labeled cells in the brain and grounding these counts in the brain’s functional anatomy are still in the early stages of development. To bridge this gap, we introduce DOGHOUSE, an end-to-end assay for probing spatiotemporal dynamics of brain-wide activity. The assay consists of two components. CORGI, a software package for registering whole-brain sample data in space and time, overcomes the differences between dissimilar perinatal brains. DALMATIAN, a software package for cell detection in whole-brain sample data, separates densely packed dividing cells. The staining protocol has been validated for staining specificity, while the software packages have shown competitive performance, meeting or exceeding the state of the art on diverse datasets. Our assay automates a variety of tasks, including group comparison and monitoring brain development dynamics. All methods are available for download via open access.
Keywords: whole-brain, 3D-analysis, developmental dynamics, microscopy, registration, morphing, cell counting.
Modern artificial intelligence (AI) systems based on the von Neumann architecture have a number of fundamental limitations compared to the brain. In the paper, we partially revealed these limitations, proposed the principle of classification of AI neuromorphic systems and presented a comparative analysis of popular neuromorphic projects in the context of the proposed classification.
Keywords: Artificial intelligence, neuromorphic systems, local learning, spiking neural networks, sparse computing, in-memory computing, analog computing.
Cellular mechanisms of the neural code for information about the environment are one of the topical areas of neurophysiology. The work is aimed at studying the influence of the value and novelty of objects on the cognitive specialization of neurons in the hippocampus of mice. A comparative study of the relationship between cellular representations of individually and socially acquired memories was carried out using optical calcium imaging.
Keywords: neuronal encoding, calcium imaging, object recognition, social memory, CA1, hippocampus.
We investigated the collective activity of CA1 place field neurons of the mouse hippocampus during a free exploration task. Using methods of nonlinear dimensionality reduction, we showed that the activity of a population of several hundred cells can be reduced to two collective variables that determine the position of the animal in the explored environment. Our results support the distributed coding hypothesis and can be used to analyze the collective activity of neurons in other brain regions.
Keywords: place neurons, dimension reduction, distributed coding.
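A toy illustration of the idea (not the authors' pipeline): simulated Gaussian place-cell responses to a 2D trajectory are reduced to two collective variables with Isomap, a standard nonlinear dimensionality-reduction method from scikit-learn. The simulation, tuning widths and all parameters below are illustrative assumptions.

```python
import numpy as np
from sklearn.manifold import Isomap

rng = np.random.default_rng(0)
T, n_cells = 1000, 300

# Random-walk trajectory of the animal in a unit box.
pos = np.cumsum(rng.normal(scale=0.01, size=(T, 2)), axis=0) % 1.0

# Each simulated cell has a Gaussian place field at a random centre.
centers = rng.uniform(size=(n_cells, 2))
d2 = ((pos[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
rates = np.exp(-d2 / (2 * 0.05 ** 2))

# Noisy spike counts of the whole population at each time step.
activity = rng.poisson(5 * rates).astype(float)

# Nonlinear dimensionality reduction: the 300-dimensional population
# activity collapses onto two collective variables tracking position.
emb = Isomap(n_neighbors=15, n_components=2).fit_transform(activity)
print(emb.shape)
```

With well-separated place fields the two embedding coordinates recover the animal's trajectory up to a smooth distortion, which is the sense in which population activity is "reduced to two collective variables".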
The retrosplenial cortex (RSC) plays a key role in spatial navigation and the coding of spatial information. However, the participation of the RSC in encoding information about objects has been little studied. In this work, we used fiber-optic photometry to record RSC calcium activity in novel object and novel place recognition tasks. We have shown that at the moment of the animal's contact with the object there is a decrease in RSC calcium activity, regardless of the behavioral task. At the same time, another form of exploratory behavior, rearing, was not accompanied by a similar decrease in RSC activity. Thus, we have shown a specific change in RSC activity during object exploration in mice.
Keywords: retrosplenial cortex, object recognition, calcium imaging, exploratory behavior.
Neuromorphic computer vision systems, also called event cameras, are sensors whose basic principle of operation was borrowed from the physiology of human vision. They differ from traditional cameras in that, instead of taking pictures at fixed intervals, they asynchronously detect changes in the brightness of each point in the observed space. At the output, these sensors generate a stream of data packets, where each packet corresponds to a brightness change event and contains information about the time of the event, the pixel coordinates and the nature of the brightness change (increase or decrease). The article describes the operating principles of event cameras and provides an overview of their main advantages, as well as the most promising areas for applying this technology in various fields of human activity.
Keywords: Event cameras, machine vision, neuromorphic systems, computer vision, asynchronous sensors, high dynamic range, robots.
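The event packet described above (timestamp, pixel coordinates, polarity) can be sketched as a minimal data structure. Field names and widths are illustrative assumptions, not tied to any particular camera; the `accumulate` helper shows one common first step, integrating an event stream into a signed frame for conventional algorithms.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Event:
    t_us: int      # microsecond timestamp of the brightness change
    x: int         # pixel column
    y: int         # pixel row
    polarity: int  # +1 = brightness increase, -1 = decrease

def accumulate(events, width, height):
    """Integrate a stream of events into a signed per-pixel frame."""
    frame = [[0] * width for _ in range(height)]
    for e in events:
        frame[e.y][e.x] += e.polarity
    return frame

# Three events on a hypothetical 2x1-pixel sensor.
stream = [Event(10, 0, 0, +1), Event(25, 1, 0, -1), Event(40, 0, 0, +1)]
frame = accumulate(stream, width=2, height=1)
print(frame)  # [[2, -1]]
```

Because only changed pixels emit events, the stream is sparse in static scenes, which underlies the low latency and high dynamic range mentioned in the keywords.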
In this study we investigated the influence of post-traumatic stress disorder on the spontaneous behavior and resting-state brain activity of animals. Using automated behavior analysis, Fos neuroimaging and resting-network mapping, we showed that stressful experiences can alter spontaneous behavior, induced and spontaneous brain activity, and the patterns of functional connections in resting-state neuronal networks long after the trauma.
Keywords: PTSD, resting state networks, c-fos, spontaneous behavior, protein synthesis inhibition.
In the proposed model of agents' reflection, the effect of forgetting unused constraints and its impact on complexity are taken into account. The formal setting, its modifications, the motivation and the justification of its adequacy are discussed.
Keywords: calculus, completeness, multi-agent system, non-classical logics, complexity, social network.
One of the biggest hypes in current research is Artificial Intelligence. As with the AI 1.0, 2.0 and 3.0 hypes, the claim is that AI will help to solve all problems everywhere and anytime, will replace almost all human activities with greater machinery, will be far more intelligent than humans, will be far more reliable than humans, and will be the basis for greater wealth. We briefly investigate whether this is possible and find that these and other promises are not realistic. A silver bullet, however, is modelling, since it is more concerned with human intelligence.
Keywords: artificial intelligence, human intelligence, modelling, AI models.
The report demonstrates that the development of AI systems is associated with the personalization and individualization of the educational process: the formation of individual educational programs and an individual schedule that take into account the child's experience, style of thinking and level of knowledge, for maximum psychological comfort. Proponents of AI in education highlight a number of possible advantages of AI technologies for the child: novelty, a possible increase in involvement and motivation, and the ability to set individual tasks depending on the child's characteristics, which helps the child feel special. Education should address the dilemma of individual autonomy and the public good, as AI raises tension between privacy, respect for human dignity and autonomy, and the understanding of education as a public good. A very acute problem is the confidentiality of information and the definition of modes and levels of access to the information with which the AI works and to the recommendations it gives. Datafication in education risks becoming a factor of stigma and discrimination. AI systems can serve to enhance social normalization; that is, the result of using AI can be exactly the opposite of personification. The problem of responsibility in the application of AI and the problem of creating "trustworthy AI" have not been solved for education. The ethics of the child's interaction with artificial intelligence still needs to be developed, and the psychological effects of this interaction are not yet clear.
Keywords: philosophy of education, artificial intelligence, essence of education, ethics of artificial intelligence.
In recent years a new field of study has appeared in social psychology. It concerns voice assistants, which have contributed to major changes in our society during their relatively brief existence. They are increasingly used in situations that require multi-user interaction: in families, classrooms and, more importantly, in task-oriented workgroups. This study analyzes the process of group interaction moderation by a voice assistant. The algorithmic repetition of the voice assistant's lines, noticeable to the group members, reduces involvement in the interaction process. However, due to technical limitations, the members of the group organize joint activities to build interaction with the voice assistant. This makes it possible to increase perceived cohesion in the early stages of group development.
Keywords: Human–computer interaction, Voice assistant, Group interaction.
The message will focus on ethical and applied problems of creating and using artificial intelligence technologies. Existing ethical codes regulating the field of artificial intelligence and their key principles are considered.
Keywords: artificial intelligence, philosophy of technology, applied ethics.
Artificial and natural neural network models are a new toolkit that could potentially be used to clarify complex brain functions. To attain this goal, such models need to be neurobiologically realistic. In this work we discuss different types of neural models and identify aspects in which their biological plausibility can be improved.
Keywords: neural networks, complex systems modeling, computational neuroscience.
The report describes the concept of human-computer intelligent systems (hybrid intelligence systems) and the main problems of their development, and provides examples of such systems for organizations at the international, federal and corporate levels. The focus is on the analytical capabilities of such systems. Formulations of direct and inverse problems are given. It is shown that, in contrast to standard OLAP analytics (analysis of the past) and to data mining analytics (forecasting the future), such systems make it possible to construct the future: in particular, to understand what should be changed in the analyzed system, and how, in order to achieve the maximum effect within a given budget and/or to achieve a given effect with a minimum budget. These features of hybrid intelligence systems make it possible to use them in a wide class of problems of managing socio-economic processes and socio-technical systems.
This paper discusses the results of the exploratory stage (the first of three) of a study of young people's social representations of Artificial Intelligence (AI). Our work is based on Moscovici's theory of social representations [11]. Interviews (N = 14) and observations (N = 14) were conducted, and free associations on the topic of AI were collected (N = 127). The purpose of this step is to map the common meanings associated with AI. The most prominent topics were the following. 1) Ideas about the future with AI: scenarios of catastrophe and unemployment coexist with those of a free, creative and carefree future for humans. 2) Reflections on the "non-humanity" of AI. 3) The division of areas of AI application into acceptable and unacceptable. The next steps of this study will be an analysis of the media to identify new topics and to confirm and analyze the positioning of these topics.
Keywords: Artificial Intelligence, Human-AI interaction, Social representations.
The report explores the possibilities of using neural networks for creativity assessment. The combination of psychometric and machine learning methods allowed us to assess students' creativity skills (a sample of 1831 fourth-graders) through graphical solutions created in a computer environment. In addition, based on a trained neural network model, the influence of the instructions in creativity tasks on the results of creativity diagnostics is evaluated.
Keywords: creativity assessment, neural networks, image analysis, psychometrics.
Creativity assessment is an urgent task for researchers and practitioners. At present there is no instrument in Russian psychology for measuring creativity at school age that is implemented in a modern computer format with automatic (non-expert-based) scoring. This study demonstrates the prospects that modern computer technologies and the trend towards automation open up in the field of psychological testing.
Keywords: computer testing, automatic scoring, creativity assessment.
The paper introduces the concept of de Morgan multimonoid and considers the possibility of constructing a relevant multilattice logic on its basis. The problem of constructing a sequent calculus for it is discussed.
Keywords: relevant logic, multilattice logic, multilattice, multimonoid, sequent calculus.
In our talk we will consider three epistemic logics for ignorance representation. The overall objective of the talk is to provide a comparative analysis of these three settings and evaluate the extent to which each logic succeeds in formalizing ignorance.
Keywords: Epistemic logic, ignorance that, ignorance whether, factive ignorance, frame definability.
We propose an extension of DeGroot's learning model with an epistemic-doxastic modal logic. It is argued that this framework can be applied to the analysis of informational cascades, pluralistic ignorance and other phenomena.
Keywords: DeGroot, social influence, epistemic logic.
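For readers unfamiliar with the base model being extended: in DeGroot learning, each agent repeatedly updates its opinion to a weighted average of its neighbors' opinions, x(t+1) = W x(t), where W is a row-stochastic trust matrix. The sketch below uses an arbitrary illustrative trust matrix; under standard conditions (W irreducible and aperiodic) the opinions converge to a consensus.

```python
import numpy as np

# Row-stochastic trust matrix: W[i][j] is how much agent i trusts agent j.
# The values are purely illustrative.
W = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.1, 0.1, 0.8],
])
x = np.array([1.0, 0.0, 0.5])  # initial opinions of the three agents

# DeGroot update: repeatedly average opinions over the trust network.
for _ in range(200):
    x = W @ x

print(np.round(x, 3))  # all agents end up holding (nearly) the same opinion
```

An epistemic-doxastic extension, as proposed in the abstract, would add modal operators describing what agents know and believe about this averaging process, e.g. to model agents who conform publicly while believing otherwise (pluralistic ignorance).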
Progress in the creation of artificial intelligence (hereinafter, AI) is reflected both in the evolution of its definitions and in the increasing interest in its use by financial market participants around the world and by regulators. From the perspective of users of financial services, one can identify certain opportunities and challenges accompanying the introduction of AI in this sphere. Despite warnings about such risks, including from representatives of the business community, large companies promote as "best practices" data-use recommendations that are very controversial from the consumer's point of view. The "human" factor is of particular importance, as is a more thorough study of human behavior and of decision-making that cannot be directly rationalized, including in consumer financial services, for which AI is becoming an increasingly widespread technology.
Keywords: artificial intelligence, consumer financial services, decision making.
We set out a formal system for the logical analysis of existence judgements. Its language contains an existence constant; atomic formulas are formed by concatenating this constant with any finite sequence of general terms. We formulate an analytic tableaux version of this logic and a decision procedure for it. This research has been supported by the Interdisciplinary Scientific and Educational School of Moscow University "Brain, Cognitive Systems, Artificial Intelligence".
Keywords: existence judgements, logical calculus, semantics, analytic tableaux, decision procedure.
The use of artificial intelligence and machine learning methods to perform routing tasks, followed by predictive modeling of the optimal action strategy, is becoming an effective tool in application to the development of an innovative economy. Decision support systems that offer possible paths of business development and propose solutions based on labeled datasets are becoming drivers both for startups and for large corporations working on the development of new technologies.
Keywords: artificial intelligence, personal investment routes for technology projects, decision-making.
Modern digital technologies, in particular artificial intelligence, are the basis for the existence of digital ecosystems that operate in a variety of areas and, consequently, markets. The presented study concerns the analysis of the behavior of ecosystem participants, which is supported by mathematical modeling. The findings from the analysis for various market structures indicate multiple effects, not only direct and indirect network effects, but also “feedback loops” that need to be taken into account when making business and management decisions.
Keywords: digital ecosystems, competition policy, monopolies, platforms, artificial intelligence.
Technologies are algorithms whose result is produced by a sequence of external physical processes initiated by an active agent. We define a language and semantics for technologies, and axiomatize the minimal logic of technologies. One possible area of application of the proposed language is evolutionary programming, for the adaptive optimization of existing technologies and the search for new ones.
Keywords: technologies, logic of technologies, natural algorithms, evolutionary programming.