421 result(s) for "algorithmic complexity"
The black hole interior from non-isometric codes and complexity
Quantum error correction has given us a natural language for the emergence of spacetime, but the black hole interior poses a challenge for this framework: at late times the apparent number of interior degrees of freedom in effective field theory can vastly exceed the true number of fundamental degrees of freedom, so there can be no isometric (i.e. inner-product preserving) encoding of the former into the latter. In this paper we explain how quantum error correction nonetheless can be used to explain the emergence of the black hole interior, via the idea of “non-isometric codes protected by computational complexity”. We show that many previous ideas, such as the existence of a large number of “null states”, a breakdown of effective field theory for operations of exponential complexity, the quantum extremal surface calculation of the Page curve, post-selection, “state-dependent/state-specific” operator reconstruction, and the “simple entropy” approach to complexity coarse-graining, all fit naturally into this framework, and we illustrate all of these phenomena simultaneously in a soluble model.
A Review of Methods for Estimating Algorithmic Complexity: Options, Challenges, and New Directions
Several established and some novel techniques for estimating algorithmic (Kolmogorov) complexity now co-exist and are reviewed here, ranging from dominant approaches such as statistical lossless compression to newer methods that advance and complement them while posing new challenges and exhibiting limitations of their own. Evidence is presented that these different methods complement each other in different regimes, and that, despite their many challenges, some of them are better motivated by and better grounded in the principles of algorithmic information theory. We explain how different approaches to algorithmic complexity relax different necessary and sufficient conditions in their pursuit of numerical applicability, with some approaches entailing greater risks than others in exchange for greater relevance. We conclude with a discussion of directions that may or should be considered to advance the field and encourage methodological innovation, but more importantly, to contribute to scientific discovery. This paper also serves as a rebuttal of claims made in a previously published minireview by another author, and offers an alternative account.
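The statistical lossless-compression approach this review calls dominant can be illustrated in a few lines: the length of a compressed string is a (loose) upper-bound proxy for its Kolmogorov complexity. A minimal sketch using Python's zlib; the strings and seed below are illustrative assumptions:

```python
import random
import zlib

def compression_complexity(s: str) -> int:
    # Length in bytes of the zlib-compressed string: a crude upper-bound
    # proxy for Kolmogorov complexity (includes fixed header overhead).
    return len(zlib.compress(s.encode("utf-8"), 9))

rng = random.Random(0)                                  # seeded for reproducibility
regular = "ab" * 100                                    # 200 chars, highly patterned
noisy = "".join(rng.choice("ab") for _ in range(200))   # 200 chars, pseudo-random

# The patterned string compresses far better than the pseudo-random one,
# so its estimated complexity is lower.
print(compression_complexity(regular), compression_complexity(noisy))
```

Note that for very short strings the compressor's fixed overhead dominates the estimate, one of the limitations of this approach that motivates the alternatives the review surveys.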
Motion Sensors for Knee Angle Recognition in Muscle Rehabilitation Solutions
The progressive loss of functional capacity due to aging is a serious problem that can compromise human locomotion capacity, requiring the help of an assistant and reducing independence. The NanoStim project aims to develop a system capable of performing treatment with electrostimulation at the patient’s home, reducing the number of consultations. The knee angle is one of the essential attributes in this context, helping to understand the patient’s movement during the treatment session. This article presents a wearable system that recognizes the knee angle through IMU sensors. The hardware chosen for the wearable is low-cost, comprising an ESP32 microcontroller and an MPU-6050 sensor. However, this hardware impairs signal accuracy in the multitasking environment expected in rehabilitation treatment. Three optimization filters with algorithmic complexity O(1) were tested to reduce noise in the signal. The complementary filter obtained the best result, presenting an average error of 0.6 degrees and an improvement of 77% in MSE. Furthermore, an interface in the mobile app was developed to respond immediately to the recognized movement. The system was tested with volunteers in a real environment and successfully measured the movement performed. In the future, the recognized angle is planned to be used together with the electromyography sensor.
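The complementary filter credited with the best result above amounts to a single O(1) update per sample, blending a gyro-integrated angle (smooth but drifting) with an accelerometer-derived angle (noisy but drift-free). The following sketch is an assumption for illustration; the blend factor, sample data, and function names are not taken from the project:

```python
def complementary_filter(prev_angle, acc_angle, gyro_rate, dt, alpha=0.98):
    # O(1) per sample: trust the gyro-integrated angle at short timescales,
    # the accelerometer angle at long timescales.
    return alpha * (prev_angle + gyro_rate * dt) + (1.0 - alpha) * acc_angle

# Toy stream: true knee angle near 30 degrees, noisy accelerometer, idle gyro.
acc_samples = [28.0, 33.0, 29.0, 31.0, 30.5, 29.5]
angle = acc_samples[0]            # initialise from the first accelerometer reading
for acc in acc_samples[1:]:
    angle = complementary_filter(angle, acc, gyro_rate=0.0, dt=0.01)
print(round(angle, 2))            # stays close to the true angle despite the noise
```

Because each update touches only the previous estimate and the current sample, memory and time cost are constant, which is what makes such filters viable on an ESP32-class microcontroller.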
Information Theory in Perception of Form: From Gestalt to Algorithmic Complexity
In 1948, Claude Shannon published a revolutionary paper on communication and information in engineering, one that made its way into the psychology of perception and changed it for good. However, the path to truly successful applications to psychology has been slow and bumpy. In this article, we present a readable account of that path, explaining the early difficulties as well as the creative solutions offered. The latter include Garner’s theory of sets and redundancy as well as mathematical group theory. These solutions, in turn, enabled rigorous objective definitions to the hitherto subjective Gestalt concepts of figural goodness, order, randomness, and predictability. More recent developments enabled the key notion of complexity to be defined in an exact mathematical sense. In this article, we demonstrate, for the first time, the presence of the association between people’s subjective impression of figural goodness and the pattern’s objective complexity. The more attractive the pattern appears to perception, the less complex it is and the smaller the set of subjectively similar patterns.
A comparative evaluation of measures to assess randomness in human-generated sequences
Whether and how well people can behave randomly is of interest in many areas of psychological research. The ability to generate randomness is often investigated using random number generation (RNG) tasks, in which participants are asked to generate a sequence of numbers that is as random as possible. However, there is no consensus on how best to quantify the randomness of responses in human-generated sequences. Traditionally, psychologists have used measures of randomness that directly assess specific features of human behavior in RNG tasks, such as the tendency to avoid repetition or to systematically generate numbers that have not been generated in the recent choice history, a behavior known as cycling. Other disciplines have proposed measures of randomness that are based on a more rigorous mathematical foundation and are less restricted to specific features of randomness, such as algorithmic complexity. More recently, variants of these measures have been proposed to assess systematic patterns in short sequences. We report the first large-scale integrative study to compare measures of specific aspects of randomness with entropy-derived measures based on information theory and measures based on algorithmic complexity. We compare the ability of the different measures to discriminate between human-generated sequences and truly random sequences based on atmospheric noise, and provide a systematic analysis of how the usefulness of randomness measures is affected by sequence length. We conclude with recommendations that can guide the selection of appropriate measures of randomness in psychological research.
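Two of the measure families this study compares can be put side by side in a few lines: a first-order information-theoretic measure (Shannon entropy) and a behavioral measure (immediate-repetition rate). The toy sequence below is an assumption chosen to show why symbol entropy alone can miss systematic cycling:

```python
import math
from collections import Counter

def shannon_entropy(seq):
    # Bits per symbol from the first-order symbol distribution.
    n = len(seq)
    return -sum(c / n * math.log2(c / n) for c in Counter(seq).values())

def repetition_rate(seq):
    # Fraction of immediate repeats; humans tend to push this below
    # the chance level of 1/alphabet-size.
    return sum(a == b for a, b in zip(seq, seq[1:])) / (len(seq) - 1)

cycling = [1, 2, 3, 4, 5, 6, 7, 8, 9, 0] * 5   # perfectly systematic cycling

print(shannon_entropy(cycling))   # ~3.32 bits: maximal, despite the obvious pattern
print(repetition_rate(cycling))   # 0.0: repeats are completely avoided
```

First-order entropy rates this blatantly non-random sequence as maximally random, while the repetition measure only captures one narrow behavioral feature; this gap is exactly what motivates the algorithmic-complexity-based measures the study also evaluates.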
Filter Bubbles? Also Protector Bubbles! Folk Theories of Zhihu Algorithms Among Chinese Gay Men
In light of the awareness that we know little about how algorithms are perceived by groups other than those in the mainstream, this study investigates how Chinese gay men on Zhihu generate folk theories of the operation and impact of the platform algorithms. After recruiting 16 long-term users on Zhihu as informants and conducting thematic analysis, two overarching themes are identified: (1) the algorithm as evictor, supported by the users’ folk theories of sidelining, disorganizing, and defaming; and (2) the algorithm as protector, supported by the users’ folk theories of shielding, recognizing, and exclusive networks. Based on the empirical data collected, this study provides inspiration for understanding algorithmic complexity, and challenges the mainstream appeal to break through filter bubbles (information cocoons) by indicating its (hetero)normativity.
Bounds on the dimension of lineal extensions
Let E ⊆ R^n be a union of line segments and F ⊆ R^n the set obtained from E by extending each line segment in E to a full line. Keleti’s line segment extension conjecture posits that the Hausdorff dimension of F should equal that of E. Working in R^2, we use effective methods to prove a strong packing dimension variant of this conjecture. Furthermore, a key inequality in this proof readily entails the planar case of the generalized Kakeya conjecture for packing dimension. This is followed by several doubling estimates in higher dimensions and connections to related problems.
An Additively Optimal Interpreter for Approximating Kolmogorov Prefix Complexity
We study practical approximations of Kolmogorov prefix complexity (K) using IMP2, a high-level programming language. Our focus is on investigating the optimality of the interpreter for this language as the reference machine for the Coding Theorem Method (CTM). This method is designed to address applications of algorithmic complexity that differ from the popular traditional lossless compression approach based on the principles of algorithmic probability. The chosen model of computation is proven to be suitable for this task, and a comparison to other models and methods is conducted. Our findings show that CTM approximations using our model do not always correlate with the results from lower-level models of computation. This suggests that some models may require a larger program space to converge to Levin’s universal distribution. Furthermore, we compare the CTM with an upper bound on Kolmogorov complexity and find a strong correlation, supporting the CTM’s validity as an approximation method with finer-grained resolution of K.
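The Coding Theorem Method described above can be sketched on a deliberately tiny model of computation (the three-instruction machine below is an assumption for illustration, far simpler than IMP2): enumerate every program up to a length bound, tally how often each output appears, and estimate K(x) as -log2 of that output's frequency, following the coding theorem K(x) ≈ -log2 P(x):

```python
import itertools
import math
from collections import Counter

OPS = "ABD"  # toy instruction set: A appends '0', B appends '1',
             # D duplicates the output produced so far

def run(program: str) -> str:
    out = ""
    for op in program:
        if op == "A":
            out += "0"
        elif op == "B":
            out += "1"
        else:
            out += out
    return out

def ctm_estimates(max_len: int) -> dict:
    # CTM: K(x) ~ -log2 P(x), where P(x) is the fraction of all programs
    # up to max_len whose output is exactly x.
    counts, total = Counter(), 0
    for n in range(1, max_len + 1):
        for prog in itertools.product(OPS, repeat=n):
            counts[run("".join(prog))] += 1
            total += 1
    return {x: -math.log2(c / total) for x, c in counts.items()}

K = ctm_estimates(6)
# "0000" is reachable by many short programs (e.g. "ADD": 0 -> 00 -> 0000),
# while "0110" essentially must be spelled out ("ABBA"), so K["0000"] < K["0110"].
```

Even this toy reproduces the method's key property: strings producible by many short programs receive lower complexity estimates, giving finer resolution on short strings than compression can.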
Structural Change in Romanian Land Use and Land Cover (1990–2018): A Multi-Index Analysis Integrating Kolmogorov Complexity, Fractal Analysis, and GLCM Texture Measures
Monitoring land use and land cover (LULC) transformations is essential for understanding socio-ecological dynamics. This study assesses structural shifts in Romania’s landscapes between 1990 and 2018 by integrating algorithmic complexity, fractal analysis, and Grey-Level Co-occurrence Matrix (GLCM) texture analysis. Multi-year maps were used to compute Kolmogorov complexity, fractal measures, and 15 GLCM metrics. The measures were compiled into a unified matrix, and temporal trajectories were explored with principal component analysis and k-means clustering to identify inflection points. Informational complexity and the Higuchi 2D fractal dimension decline over time, while homogeneity and angular second moment rise, indicating greater local uniformity. A structural transition around 2006 separates an early heterogeneous regime from a more ordered state; 2012 appears as a turning point when several indices reach extreme values. Strong correlations between fractal and texture measures imply that geometric and radiometric complexity co-evolve, whereas large-scale fractal dimensions remain nearly stable. The multi-index approach provides a replicable framework for identifying critical transitions in LULC. It can support landscape monitoring, and future work should integrate finer temporal data and socio-economic drivers.
Algorithmic Information Distortions in Node-Aligned and Node-Unaligned Multidimensional Networks
In this article, we investigate limitations of importing methods based on algorithmic information theory from monoplex networks into multidimensional networks (such as multilayer networks) that have a large number of extra dimensions (i.e., aspects). In the worst-case scenario, it has been previously shown that node-aligned multidimensional networks with non-uniform multidimensional spaces can display exponentially larger algorithmic information (or lossless compressibility) distortions with respect to their isomorphic monoplex networks, so that these distortions grow at least linearly with the number of extra dimensions. In the present article, we demonstrate that node-unaligned multidimensional networks, either with uniform or non-uniform multidimensional spaces, can also display exponentially larger algorithmic information distortions with respect to their isomorphic monoplex networks. However, unlike the node-aligned non-uniform case studied in previous work, these distortions in the node-unaligned case grow at least exponentially with the number of extra dimensions. On the other hand, for node-aligned multidimensional networks with uniform multidimensional spaces, we demonstrate that any distortion can only grow up to a logarithmic order of the number of extra dimensions. Thus, these results establish that isomorphisms between finite multidimensional networks and finite monoplex networks do not preserve algorithmic information in general and highlight that the algorithmic information of the multidimensional space itself needs to be taken into account in multidimensional network complexity analysis.