An effective economy requires prompt prevention of misconduct by legal entities. With ever-increasing transaction rates, an important part of this work is detecting market collusion from the statistics of electronic traces. We report a solution to this problem based on a quantum-theoretic approach to behavioral modeling. In particular, cognitive states of economic subjects are represented by complex-valued vectors in a space spanned by the basis of decision alternatives, while decision probabilities are defined by projections of these states onto the corresponding directions. Coordination of multilateral behavior then corresponds to entanglement of the joint cognitive state, measured by standard metrics of quantum theory. A high score on these metrics indicates a likelihood of collusion between the considered subjects. The resulting method for collusion discovery was tested on open data on the participation of legal entities in public procurement between 2015 and 2020, available at the federal portal https://zakupki.gov.ru. Quantum models were built for about 80 thousand unique pairs and 10 million unique triples of agents in the obtained dataset. The reliability of collusion discovery was assessed by comparison with open data of the Federal Antimonopoly Service available at https://br.fas.gov.ru. The achieved performance allows the discovery of about one-half of known pairwise collusions with a reliability of more than 50%, which is comparable with detection based on classical correlation and mutual information. For three-party behavior, in contrast, the quantum model is practically the only available option, since classical measures are typically limited to the bilateral case. Half of such collusions are detected with a reliability of 40%. The obtained results indicate the efficiency of the quantum-probabilistic approach to modeling economic behavior.
The developed metrics can be used as informative features in analytic systems and algorithms of machine learning for this field.
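The construction described above can be sketched numerically. The following is a minimal illustration only: it assumes a two-agent, two-alternative setup, builds a joint state whose squared amplitudes match hypothetical joint decision frequencies, and uses entanglement entropy as the quantum metric; the paper's exact state construction and metric choice are not reproduced here.

```python
import numpy as np

# Hypothetical joint decision frequencies of two agents over binary
# alternatives (e.g. bid / abstain), normalized to 1. The strong
# diagonal reflects coordinated behavior.
p = np.array([[0.45, 0.05],
              [0.05, 0.45]])

# Joint cognitive state: a two-qubit vector whose squared amplitudes
# reproduce the observed frequencies (phases set to zero for simplicity).
psi = np.sqrt(p).reshape(4)

# Reduced density matrix of agent A: trace out agent B's index.
rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)
rho_a = np.trace(rho, axis1=1, axis2=3)

# Entanglement entropy: 0 for independent behavior, up to 1 bit for
# perfectly coordinated decisions.
eigs = np.linalg.eigvalsh(rho_a)
entropy = -sum(e * np.log2(e) for e in eigs if e > 1e-12)
print(round(entropy, 3))  # ≈ 0.722 bits for these frequencies
```

A score near zero would indicate statistically independent agents; values approaching one bit flag candidate pairs for closer inspection.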
Spectral analysis of signals is one of the main methods for studying systems and objects of various physical natures. Under conditions of a priori statistical uncertainty, signals are subject to random changes and noise, and their spectral analysis involves estimating the power spectral density (PSD). One of the classical methods for estimating the PSD is the periodogram method. The algorithms that implement this method in digital form are based on the discrete Fourier transform, in which digital multiplication is the dominant bulk operation; the use of window functions further increases the number of such operations. Multiplication is among the most time-consuming arithmetic operations: it largely determines the computational demands of an algorithm and defines its multiplicative complexity. The paper addresses the problem of reducing the multiplicative complexity of computing the periodogram PSD estimate with window functions. The problem is solved by using binary-sign stochastic quantization to convert the signal into digital form; this two-level quantization introduces no systematic error. Based on the theory of discrete-event modeling, the result of binary-sign stochastic quantization over time is treated as a chronological sequence of significant events determined by changes in its values. The discrete-event model of the binary-sign stochastic quantization result made it possible to carry out the integration operations analytically in the transition from the analog form of the periodogram PSD estimate to the mathematical procedures for computing it in discrete form. These procedures became the basis for the development of a digital algorithm whose main computational operations are addition and subtraction.
Reducing the number of multiplication operations decreases the overall computational complexity of the PSD estimation. Numerical experiments were carried out to study the operation of the algorithm, based on simulation of the discrete-event procedure of binary-sign stochastic quantization. The results of calculating PSD estimates are presented for several of the most widely used window functions. The results obtained indicate that the developed algorithm computes periodogram PSD estimates with high accuracy and frequency resolution in the presence of additive white noise at a low signal-to-noise ratio. The algorithm is implemented in practice as a functionally independent software module, which can be used as part of complex, metrologically significant software for the operational analysis of the frequency composition of complex signals.
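The key property of binary-sign stochastic quantization, and its use in a windowed periodogram, can be sketched as follows. This is an illustrative assumption-laden demo, not the paper's algorithm: it compares the signal with uniform auxiliary noise on [-A, A], so the two-level (+1/-1) stream is an unbiased estimate of x/A, and then shows that a windowed periodogram of a single binary realization still resolves the test tone.

```python
import numpy as np

rng = np.random.default_rng(0)

A = 2.0                                  # quantization range; |x| <= A assumed
n = 4096
t = np.arange(n)
x = np.sin(2 * np.pi * t / 8)            # test tone with period 8 samples

# Binary-sign stochastic quantization: compare the signal with uniform
# dither; the sign result carries no systematic quantization error.
xi = rng.uniform(-A, A, size=(500, n))   # independent dither realizations
z = np.sign(x + xi)                      # two-level (+1/-1) signal

# Averaging over dither realizations recovers x/A without bias:
err = np.max(np.abs(A * z.mean(axis=0) - x))

# A windowed periodogram of one binary realization still peaks at the
# tone frequency (bin n/8 = 512), despite the coarse two-level input.
w = np.hanning(n)
P = np.abs(np.fft.rfft(w * A * z[0])) ** 2
print(err < 0.5, np.argmax(P) == n // 8)
```

Note that since the quantized samples are ±1, the products with window coefficients in a real implementation reduce to sign changes, which is the source of the multiplication savings the abstract describes.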
The paper introduces the features of quantum infocommunication, defines its basic concepts, and discusses the historical background, possible applications, and development prospects.
The aim of this paper is to describe the main concepts of neural networks and quantum computing in order to show their common properties. Attention is focused on the problem of parallel computation as the most important property providing a considerable growth in computing power. In conclusion, the problem of simulating quantum computations on neural networks using the quantum Fourier transform algorithm is described.
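For reference, the quantum Fourier transform mentioned above is, as a matrix, the normalized DFT. A minimal textbook sketch (not the paper's neural-network simulation) builds it explicitly and checks two defining properties:

```python
import numpy as np

def qft_matrix(n_qubits):
    """Unitary of the quantum Fourier transform on n_qubits qubits:
    the normalized DFT matrix of size 2**n_qubits (textbook definition)."""
    N = 2 ** n_qubits
    j, k = np.meshgrid(np.arange(N), np.arange(N))
    return np.exp(2j * np.pi * j * k / N) / np.sqrt(N)

F = qft_matrix(3)

# Unitarity: F F† = I.
print(np.allclose(F @ F.conj().T, np.eye(8)))
# Action on |000>: the uniform superposition over all basis states.
e0 = np.zeros(8)
e0[0] = 1.0
print(np.allclose(F @ e0, np.ones(8) / np.sqrt(8)))
```

Simulating this 2^n-dimensional unitary is exactly the kind of massively parallel linear operation that motivates the comparison with neural networks.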
The problem of determining signal quantization errors is considered from the standpoint of spline theory. A method for estimating these errors is proposed, based on power-spectrum formulas with the signal approximated by step B-splines of zero degree and by polynomial splines of higher degrees.
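The zero-degree B-spline case can be illustrated numerically. This is a hedged sketch under simple assumptions (a pure tone, an empirical error-power estimate) rather than the paper's spectral formulas: a zero-degree B-spline approximation is piecewise-constant (sample-and-hold), and its error power shrinks as the knot spacing decreases.

```python
import numpy as np

t = np.linspace(0, 1, 8000, endpoint=False)
x = np.sin(2 * np.pi * 5 * t)            # smooth test signal

# Zero-degree B-spline approximation: hold each knot sample constant
# over the knot interval (sample-and-hold).
step = 100                               # samples per knot interval
x_hold = np.repeat(x[::step], step)
err_power = np.mean((x - x_hold) ** 2)   # empirical error power

# Halving the knot spacing reduces the error power (roughly as h**2
# for zero-degree splines on a smooth signal).
x_hold2 = np.repeat(x[::step // 2], step // 2)
print(np.mean((x - x_hold2) ** 2) < err_power)
```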
Difficulties in the algorithmic simulation of natural thinking point to the inadequacy of the information encodings used to this end. A promising approach to this problem represents information by the qubit states of quantum theory, structurally aligned with major theories of cognitive semantics. The paper develops this idea by linking qubit states with color as a fundamental carrier of affective meaning. The approach builds on the geometric affinity of the Hilbert space of qubit states and color solids, which is used to establish a precise one-to-one mapping between them. This is enabled by an original decomposition of the qubit over three non-orthogonal basis vectors corresponding to the red, green, and blue colors. The real-valued coefficients of this decomposition are identical to the tomograms of the qubit state in the corresponding directions, related to the ordinary Stokes parameters by a rotational transform. Classical compositions of black, white, and the six main colors (red, green, blue, yellow, magenta, and cyan) are then mapped to analogous superpositions of qubit states. Pure and mixed colors intuitively map to pure and mixed qubit states on the surface and in the volume of the Bloch ball, while grayscale is mapped to the diameter of the Bloch sphere. Here, the lightness of a color corresponds to the probability of the qubit's basis state |1⟩, while saturation and hue encode the coherence and phase of the qubit, respectively. The developed code identifies color as a bridge between the quantum-theoretic formalism and qualitative regularities of the natural mind. This opens prospects for deeper integration of quantum informatics into the semantic analysis of data, image processing, and the development of nature-like computational architectures.
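The qualitative correspondence described above (lightness → probability of |1⟩, hue → phase, saturation → coherence) can be sketched in a few lines. This is an illustrative construction, not the paper's exact tomographic mapping; the specific coherence scaling below is an assumption chosen only to keep the density matrix physical.

```python
import colorsys
import numpy as np

def color_to_qubit(r, g, b):
    """Illustrative color-to-qubit map: lightness -> probability of |1>,
    hue -> phase, saturation -> coherence magnitude (hypothetical
    scaling, not the paper's exact construction)."""
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    p1 = l
    phase = 2 * np.pi * h
    coh = s * np.sqrt(p1 * (1 - p1))     # bound keeps rho positive semidefinite
    off = coh * np.exp(1j * phase)
    return np.array([[1 - p1, np.conj(off)],
                     [off, p1]])

purity = lambda rho: np.trace(rho @ rho).real

# Grayscale (saturation 0) lies on the diameter of the Bloch ball:
rho_gray = color_to_qubit(0.5, 0.5, 0.5)
# A fully saturated hue is a pure state on the Bloch sphere surface:
rho_red = color_to_qubit(1.0, 0.0, 0.0)
print(round(purity(rho_gray), 3), round(purity(rho_red), 3))  # 0.5 1.0
```

The purity check reproduces the abstract's picture: mixed (gray) colors sit inside the Bloch ball, pure saturated hues on its surface.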
In this paper we study one of the possible variants of smooth approximation of probability criteria in stochastic programming problems. The research applies to optimization problems for the probability function and the quantile function of a loss functional depending on the control vector and a one-dimensional absolutely continuous random variable. The main idea of the approximation is to replace the discontinuous Heaviside function in the integral representation of the probability function with a smooth function possessing continuity, smoothness, and easily computable derivatives. An example of such a function is the distribution function of a random variable distributed according to the logistic law with zero mean and finite variance, which is a sigmoid. The value inversely proportional to the square root of the variance serves as a parameter that controls the proximity of the original function and its approximation. This replacement yields a smooth approximation of the probability function for which derivatives with respect to the control vector and other parameters of the problem are easily found. The article proves the convergence of the probability function approximation obtained by replacing the Heaviside function with the sigmoid to the original probability function, and an error estimate for this approximation is obtained.
Next, approximate expressions for the derivatives of the probability function with respect to the control vector and the parameter of the function are obtained, and their convergence to the true derivatives is proved under a number of conditions on the loss functional. Using known relations between the derivatives of probability and quantile functions, approximate expressions for the derivatives of the quantile function with respect to the control vector and the probability level are obtained. Examples demonstrate the applicability of the proposed estimates to stochastic programming problems with criteria in the form of a probability function or a quantile function, including the case of a multidimensional random variable.
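The core idea, replacing the Heaviside indicator with a sigmoid so the probability function becomes differentiable, can be sketched in a toy setting. The loss functional Φ(u, X) = X - u with standard normal X below is an assumption for illustration only; the paper's general conditions and error bounds are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

# Toy setup: Phi(u, X) = X - u, X ~ N(0, 1), so the probability
# function P(u) = P{Phi(u, X) <= 0} is the standard normal CDF at u.
x = rng.standard_normal(200_000)
u = 0.5
k = 50.0   # smoothing parameter; larger k -> sharper sigmoid

# Monte Carlo estimate with the discontinuous Heaviside indicator ...
p_exact = np.mean(x - u <= 0)
# ... versus the smooth approximation (Heaviside replaced by sigmoid):
p_smooth = np.mean(sigmoid(-k * (x - u)))

# The smooth form is differentiable in u; its derivative estimates the
# density of Phi at zero (here, the standard normal pdf at u ≈ 0.352).
s = sigmoid(-k * (x - u))
dp_du = np.mean(k * s * (1 - s))

print(abs(p_exact - p_smooth) < 0.01)
print(abs(dp_du - np.exp(-u**2 / 2) / np.sqrt(2 * np.pi)) < 0.02)
```

The gradient estimate `dp_du` is what makes gradient-based optimization of probability criteria feasible, which is the practical point of the approximation.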
The article discusses the paradigm shift from the traditional mathematical models of control theory to A.N. Kolmogorov's algorithmic theory of computer science. A comparison is drawn between the information in an identifiable object and Shannon's ensemble (entropy) information. The proposed algorithmic models are based on approximations of space-filling curves, which are also regarded as self-similar recursive structures (the fractal approach).
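As an illustration of the self-similar recursive structure of space-filling curves, the classical Hilbert curve index-to-coordinate mapping can be sketched as follows (a standard construction for context; the article's algorithmic models are not reproduced here).

```python
def hilbert_d2xy(order, d):
    """Map a 1-D index d to (x, y) on the Hilbert curve of the given
    order (grid side 2**order): the classical recursive, self-similar
    space-filling construction."""
    x = y = 0
    t = d
    s = 1
    while s < 2 ** order:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:                      # rotate/reflect the quadrant
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

# The curve visits every cell of the 8x8 grid exactly once, and
# consecutive indices always map to adjacent cells (locality).
pts = [hilbert_d2xy(3, d) for d in range(64)]
print(len(set(pts)) == 64)
print(all(abs(a[0] - b[0]) + abs(a[1] - b[1]) == 1
          for a, b in zip(pts, pts[1:])))
```

The locality property shown by the adjacency check is what makes such curves useful as recursive, fractal-like models of spatial data.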