506 result(s) for "Marx, Daniel"
Signatures of hierarchical temporal processing in the mouse visual system
A core challenge for the brain is to process information across various timescales. This could be achieved by a hierarchical organization of temporal processing through intrinsic mechanisms (e.g., recurrent coupling or adaptation), but recent evidence from spike recordings of the rodent visual system seems to conflict with this hypothesis. Here, we used an optimized information-theoretic and classical autocorrelation analysis to show that information- and correlation timescales of spiking activity increase along the anatomical hierarchy of the mouse visual system under visual stimulation, while information-theoretic predictability decreases. Moreover, intrinsic timescales for spontaneous activity displayed a similar hierarchy, whereas the hierarchy of predictability was stimulus-dependent. We could reproduce these observations in a basic recurrent network model with correlated sensory input. Our findings suggest that the rodent visual system employs intrinsic mechanisms to achieve longer integration for higher cortical areas, while simultaneously reducing predictability for an efficient neural code.
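The intrinsic timescales discussed above are typically read off the decay of the spiking autocorrelation. The sketch below is not the paper's pipeline; the AR(1) surrogate signal, the function names, and the 1/e-crossing criterion are illustrative assumptions. It estimates a timescale from a synthetic signal with a known ground truth of roughly 9.5 time steps:

```python
import math
import random

def autocorrelation(x, max_lag):
    """Normalized autocorrelation of a 1-D signal for lags 1..max_lag."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    ac = []
    for lag in range(1, max_lag + 1):
        cov = sum((x[t] - mean) * (x[t + lag] - mean)
                  for t in range(n - lag)) / (n - lag)
        ac.append(cov / var)
    return ac

def fit_timescale(ac, dt=1.0):
    """Crude intrinsic-timescale estimate: the first lag at which the
    autocorrelation drops below 1/e, scaled by the bin width dt."""
    for lag, c in enumerate(ac, start=1):
        if c < 1.0 / math.e:
            return lag * dt
    return len(ac) * dt

# Synthetic AR(1) activity with known timescale tau = -1/ln(a) ~ 9.5 steps.
random.seed(0)
a = 0.9
x = [0.0]
for _ in range(20000):
    x.append(a * x[-1] + random.gauss(0, 1))

tau_hat = fit_timescale(autocorrelation(x, 50))
```

Real analyses usually fit an explicit exponential to the autocorrelation of binned spike counts rather than taking a 1/e crossing, but the crossing already recovers the right order of magnitude here.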
Size Bounds and Query Plans for Relational Joins
Relational joins are at the core of relational algebra, which in turn is the core of the standard database query language SQL. As their evaluation is expensive and very often dominated by the output size, it is an important task for database query optimizers to compute estimates on the size of joins and to find good execution plans for sequences of joins. We study these problems from a theoretical perspective, both in the worst-case model and in an average-case model where the database is chosen according to a known probability distribution. In the former case, our first key observation is that the worst-case size of a query is characterized by the fractional edge cover number of its underlying hypergraph, a combinatorial parameter previously known to provide an upper bound. We complete the picture by proving a matching lower bound and by showing that there exist queries for which the join-project plan suggested by the fractional edge cover approach may be substantially better than any join plan that does not use intermediate projections. On the other hand, we show that in the average-case model, every join-project plan can be turned into a plan containing no projections in such a way that the expected time to evaluate the plan increases only by a constant factor independent of the size of the database. Not surprisingly, the key combinatorial parameter in this context is the maximum density of the underlying hypergraph. We show how to make effective use of this parameter to eliminate the projections.
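The fractional edge cover bound can be checked concretely on the triangle query $Q(a,b,c) = R(a,b) \bowtie S(b,c) \bowtie T(a,c)$: assigning weight $1/2$ to each relation is a fractional edge cover, so $|Q| \le \sqrt{|R|\,|S|\,|T|}$ (the AGM bound). A minimal sketch with hypothetical helper names, not taken from the paper:

```python
import math
from itertools import product

def triangle_join(R, S, T):
    """Evaluate Q(a,b,c) = R(a,b) JOIN S(b,c) JOIN T(a,c)."""
    R_idx = {}
    for a, b in R:
        R_idx.setdefault(a, set()).add(b)
    S_idx = {}
    for b, c in S:
        S_idx.setdefault(b, set()).add(c)
    T_set = set(T)
    out = []
    for a, bs in R_idx.items():
        for b in bs:
            for c in S_idx.get(b, ()):
                if (a, c) in T_set:
                    out.append((a, b, c))
    return out

# Small instance: each relation holds all ordered pairs of 4 distinct
# elements (12 tuples each), so Q is all ordered distinct triples (24).
V = range(4)
edges = [(u, v) for u, v in product(V, V) if u != v]
R = S = T = edges

result = triangle_join(R, S, T)
agm_bound = math.sqrt(len(R) * len(S) * len(T))   # sqrt(12^3) ~ 41.6
```

On this instance the output has 24 tuples, comfortably below the AGM bound of about 41.6; worst-case instances meet the bound up to constant factors.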
Embedding optimization reveals long-lasting history dependence in neural spiking activity
Information processing can leave distinct footprints on the statistics of neural spiking. For example, efficient coding minimizes the statistical dependencies on the spiking history, while temporal integration of information may require the maintenance of information over different timescales. To investigate these footprints, we developed a novel approach to quantify history dependence within the spiking of a single neuron, using the mutual information between the entire past and current spiking. This measure captures how much past information is necessary to predict current spiking. In contrast, classical time-lagged measures of temporal dependence like the autocorrelation capture how long (potentially redundant) past information can still be read out. Strikingly, we find for model neurons that our method disentangles the strength and timescale of history dependence, whereas the two are mixed in classical approaches. When applying the method to experimental data, which are necessarily of limited size, a reliable estimation of mutual information is only possible for a coarse temporal binning of past spiking, a so-called past embedding. To account for the vastly different spiking statistics and potentially long history dependence of living neurons, we developed an embedding-optimization approach that varies not only the number and size of past bins but also their exponential stretching. For extracellular spike recordings, we found that the strength and timescale of history dependence can indeed vary independently across experimental preparations: in hippocampus, history dependence was strong and long-lasting; in visual cortex, it was weak and short; and in vitro, it was strong but short. This work enables an information-theoretic characterization of history dependence in recorded spike trains, which captures a footprint of information processing that is beyond time-lagged measures of temporal dependence. To facilitate the application of the method, we provide practical guidelines and a toolbox.
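The mutual-information notion of history dependence can be illustrated on synthetic data. The sketch below uses a naive plug-in estimator on a toy refractory neuron; the function names and the spiking model are illustrative assumptions, not the paper's optimized estimator. It embeds the past `depth` bins as a tuple and measures how informative they are about the current bin:

```python
import math
import random
from collections import Counter

def mutual_information(pairs):
    """Plug-in estimate of I(past; current) in bits from (past, spike) samples."""
    n = len(pairs)
    joint = Counter(pairs)
    past_marg = Counter(p for p, _ in pairs)
    cur_marg = Counter(s for _, s in pairs)
    mi = 0.0
    for (p, s), c in joint.items():
        pj = c / n
        mi += pj * math.log2(pj / ((past_marg[p] / n) * (cur_marg[s] / n)))
    return mi

def history_dependence(spikes, depth):
    """Embed the past `depth` bins and estimate their information
    about the current bin."""
    pairs = [(tuple(spikes[t - depth:t]), spikes[t])
             for t in range(depth, len(spikes))]
    return mutual_information(pairs)

# Toy neuron: spiking is suppressed right after a spike, so the
# immediate past carries information about the current bin.
random.seed(1)
spikes = [0]
for _ in range(50000):
    p = 0.05 if spikes[-1] == 1 else 0.4
    spikes.append(1 if random.random() < p else 0)

m1 = history_dependence(spikes, depth=1)
m3 = history_dependence(spikes, depth=3)
```

Because the toy process is first-order Markov, deeper embeddings carry no additional true information, yet the plug-in estimate for depth 3 comes out at least as large as for depth 1; this overestimation with growing embeddings is precisely the bias that motivates the paper's careful embedding optimization on limited data.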
Social Decision Making in Autistic Adolescents: The Role of Theory of Mind, Executive Functioning and Emotion Regulation
Social decision making is often challenging for autistic individuals. Twenty autistic adolescents made decisions in the socially interactive context of a one-shot ultimatum game, and their performance was compared to a large matched typical reference sample. Theory of mind, executive functioning, and emotion regulation were measured via direct assessments and self- and parent reports. Relative to the reference sample, autistic adolescents proposed fewer fair offers, and this was associated with poorer theory of mind. Autistic adolescents responded similarly to the reference sample when making decisions about offers proposed to them; however, they did not appear to down-regulate their negative emotion in response to unfair treatment in the same way. Atypical processes may underpin even apparently typical decisions made by autistic adolescents.
Fixed-Parameter Tractability of Multicut Parameterized by the Size of the Cutset
Given an undirected graph $G$, a collection $\{(s_1,t_1), \dots, (s_k,t_k)\}$ of pairs of vertices, and an integer $p$, the Edge Multicut problem asks if there is a set $S$ of at most $p$ edges such that the removal of $S$ disconnects every $s_i$ from the corresponding $t_i$. Vertex Multicut is the analogous problem where $S$ is a set of at most $p$ vertices. Our main result is that both problems can be solved in time $2^{O(p^3)}\cdot n^{O(1)}$, i.e., fixed-parameter tractable parameterized by the size $p$ of the cutset in the solution. By contrast, it is unlikely that an algorithm with running time of the form $f(p)\cdot n^{O(1)}$ exists for the directed version of the problem, as we show it to be W[1]-hard parameterized by the size of the cutset.
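For intuition about the problem definition (not the paper's FPT algorithm, whose running time is $2^{O(p^3)}\cdot n^{O(1)}$), Edge Multicut can be decided on tiny instances by brute force over all edge subsets of size at most $p$; the function names below are illustrative:

```python
from itertools import combinations

def disconnects(n, edges, removed, pairs):
    """True if deleting `removed` separates every (s, t) pair."""
    adj = {v: [] for v in range(n)}
    for e in edges:
        if e not in removed:
            u, v = e
            adj[u].append(v)
            adj[v].append(u)
    for s, t in pairs:
        seen, stack = {s}, [s]
        while stack:                      # DFS from s
            u = stack.pop()
            for w in adj[u]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        if t in seen:
            return False
    return True

def edge_multicut(n, edges, pairs, p):
    """Smallest multicut of size <= p by exhaustive search, else None."""
    for k in range(p + 1):
        for removed in combinations(edges, k):
            if disconnects(n, edges, set(removed), pairs):
                return list(removed)
    return None

# Path 0-1-2-3 with pairs (0,2) and (1,3): one edge, (1,2), cuts both.
edges = [(0, 1), (1, 2), (2, 3)]
cut = edge_multicut(4, edges, [(0, 2), (1, 3)], p=1)
```

The exhaustive search is $O(\binom{m}{p})$ and only illustrates the decision problem; the point of the paper is that the exponential part of the running time can be confined to $p$ alone.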
Fixed-Parameter Tractability of Directed Multiway Cut Parameterized by the Size of the Cutset
Given a directed graph $G$, a set of $k$ terminals, and an integer $p$, the Directed Vertex Multiway Cut problem asks whether there is a set $S$ of at most $p$ (nonterminal) vertices whose removal disconnects each terminal from all other terminals. Directed Edge Multiway Cut is the analogous problem where $S$ is a set of at most $p$ edges. These two problems are indeed known to be equivalent. A natural generalization of the multiway cut is the Multicut problem, in which we want to disconnect only a set of $k$ given pairs instead of all pairs. Marx [Theoret. Comput. Sci., 351 (2006), pp. 394--406] showed that in undirected graphs Vertex/Edge Multiway Cut is fixed-parameter tractable (FPT) parameterized by $p$. Marx and Razgon [Proceedings of the 43rd ACM Symposium on Theory of Computing, 2011, pp. 469--478] showed that undirected Multicut is FPT and Directed Multicut is W[1]-hard parameterized by $p$. We complete the picture here by our main result, which is that both Directed Vertex Multiway Cut and Directed Edge Multiway Cut can be solved in time $2^{2^{O(p)}}n^{O(1)}$, i.e., FPT parameterized by size $p$ of the cutset of the solution. This answers an open question raised by the aforementioned papers. It follows from our result that Directed Edge/Vertex Multicut is FPT for the case of $k=2$ terminal pairs, which answers another open problem raised by Marx and Razgon.
Immersions in Highly Edge Connected Graphs
We consider the problem of how much edge connectivity is necessary to force a graph $G$ to contain a fixed graph $H$ as an immersion. We show that if the maximum degree in $H$ is $\Delta$, then all the examples of $\Delta$-edge connected graphs which do not contain $H$ as a weak immersion must have a treelike decomposition called a tree-cut decomposition of bounded width. If we consider strong immersions, then it is easy to see that there are arbitrarily highly edge connected graphs which do not contain a fixed clique $K_t$ as a strong immersion. We give a structure theorem which roughly characterizes those highly edge connected graphs which do not contain $K_t$ as a strong immersion.
Closest Substring Problems with Small Distances
We study two pattern matching problems that are motivated by applications in computational biology. In the Closest Substring problem $k$ strings $s_1,\dots, s_k$ are given, and the task is to find a string $s$ of length $L$ such that each string $s_i$ has a consecutive substring of length $L$ whose distance is at most $d$ from $s$. We present two algorithms that aim to be efficient for small fixed values of $d$ and $k$: for some functions $f$ and $g$, the algorithms have running time $f(d)\cdot n^{O(\log d)}$ and $g(d,k)\cdot n^{O(\log\log k)}$, respectively. The second algorithm is based on connections with the extremal combinatorics of hypergraphs. The Closest Substring problem is also investigated from the parameterized complexity point of view. Answering an open question from [P. A. Evans, A. D. Smith, and H. T. Wareham, Theoret. Comput. Sci., 306 (2003), pp. 407-430; M. R. Fellows, J. Gramm, and R. Niedermeier, Combinatorica, 26 (2006), pp. 141-167; J. Gramm, J. Guo, and R. Niedermeier, Lecture Notes in Comput. Sci. 2751, Springer, Berlin, 2003, pp. 195-209; J. Gramm, R. Niedermeier, and P. Rossmanith, Algorithmica, 37 (2003), pp. 25-42], we show that the problem is W[1]-hard even if both $d$ and $k$ are parameters. It follows as a consequence of this hardness result that our algorithms are optimal in the sense that the exponent of $n$ in the running time cannot be improved to $o(\log d)$ or to $o(\log \log k)$ (modulo some complexity-theoretic assumptions). Consensus Patterns is the variant of the problem where, instead of the requirement that each $s_i$ has a substring that is of distance at most $d$ from $s$, we have to select the substrings in such a way that the average of these $k$ distances is at most $\delta$. By giving an $f(\delta)\cdot n^9$ time algorithm, we show that the problem is fixed-parameter tractable. This answers an open question from [M. R. Fellows, J. Gramm, and R. Niedermeier, Combinatorica, 26 (2006), pp. 141-167].
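On toy instances the Closest Substring problem can be solved by exhausting all $|\Sigma|^L$ candidate center strings. The sketch below (illustrative names, tiny binary alphabet; exponential in $L$, so only meant to make the problem statement concrete) checks each candidate against every input string under Hamming distance:

```python
from itertools import product

def hamming(a, b):
    """Hamming distance between two equal-length strings."""
    return sum(x != y for x, y in zip(a, b))

def closest_substring(strings, L, d, alphabet="ab"):
    """Exhaustively search for a length-L center s such that every
    input string has a length-L substring within distance d of s."""
    for cand in product(alphabet, repeat=L):
        s = "".join(cand)
        if all(any(hamming(s, t[i:i + L]) <= d
                   for i in range(len(t) - L + 1))
               for t in strings):
            return s
    return None

strings = ["abab", "abba", "baab"]
center = closest_substring(strings, L=2, d=1)
```

Candidates are tried in lexicographic order, so "aa" is returned here: every input string has a length-2 substring within Hamming distance 1 of it. The paper's point is that the $n^{O(\log d)}$ and $n^{O(\log\log k)}$ exponents of its nontrivial algorithms cannot be improved to constants under standard assumptions.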
Tractable Structures for Constraint Satisfaction with Truth Tables
The way the graph structure of the constraints influences the complexity of constraint satisfaction problems (CSP) is well understood for bounded-arity constraints. The situation is less clear if there is no bound on the arities. In this case the answer depends also on how the constraints are represented in the input. We study this question for the truth table representation of constraints. We introduce a new hypergraph measure adaptive width and show that CSP with truth tables is polynomial-time solvable if restricted to a class of hypergraphs with bounded adaptive width. Conversely, assuming a conjecture on the complexity of binary CSP, there is no other polynomial-time solvable case. Finally, we present a class of hypergraphs with bounded adaptive width and unbounded fractional hypertree width.
Constraint Satisfaction Parameterized by Solution Size
In the constraint satisfaction problem (CSP) corresponding to a constraint language (i.e., a set of relations) $\Gamma$, the goal is to find an assignment of values to variables so that a given set of constraints specified by relations from $\Gamma$ is satisfied. The complexity of this problem has received a substantial amount of attention in the past decade. In this paper, we study the fixed-parameter tractability of CSPs parameterized by the size of the solution in the following sense: one of the possible values, say 0, is "free," and the number of variables allowed to take other, "expensive," values is restricted. A size constraint requires that exactly $k$ variables take nonzero values. We also study a more refined version of this restriction: a global cardinality constraint prescribes how many variables have to be assigned each particular value. We study the parameterized complexity of these types of CSPs where the parameter is the required number $k$ of nonzero variables. As special cases, we can obtain natural and well-studied parameterized problems such as Independent Set, Vertex Cover, $d$-Hitting Set, Biclique, etc. In the case of constraint languages closed under substitution of constants, we give a complete characterization of the fixed-parameter tractable cases of CSPs with size constraints, and we show that all the remaining problems are W[1]-hard. For CSPs with cardinality constraints, we obtain a similar classification, but for some of the problems we are only able to show that they are Biclique-hard. The exact parameterized complexity of the Biclique problem is a notorious open problem, although it is believed to be W[1]-hard.
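A size-constrained CSP of this kind specializes to Vertex Cover when the language contains the binary relation forbidding both variables to be zero. A brute-force sketch (illustrative and exponential in general; the function name is hypothetical, not from the paper) searches over the $\binom{n}{k}$ candidate supports:

```python
from itertools import combinations

def csp_with_size_constraint(n, constraints, k):
    """Boolean CSP where each constraint (u, v) forbids both u and v
    being assigned the free value 0, under the size constraint that
    exactly k variables take the expensive value 1."""
    for ones in combinations(range(n), k):
        assign = set(ones)
        if all(u in assign or v in assign for u, v in constraints):
            return sorted(assign)
    return None

# Vertex Cover as a size-constrained CSP: the edges of a 4-cycle,
# with exactly 2 variables allowed to be nonzero.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
cover = csp_with_size_constraint(4, edges, k=2)
```

On the 4-cycle this finds the cover $\{0, 2\}$: variables 0 and 2 take the expensive value and touch every constraint. The naive $\binom{n}{k}$ search is $n^{O(k)}$; the dichotomy in the paper identifies exactly which languages admit an $f(k)\cdot n^{O(1)}$ algorithm instead.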