MBRL Search Results

46 results for "block-based analysis"
Computationally Efficient Wildfire Detection Method Using a Deep Convolutional Network Pruned via Fourier Analysis
In this paper, we propose a deep convolutional neural network for camera-based wildfire detection. We train the network via transfer learning and use a window-based analysis strategy to increase the fire detection rate. To achieve computational efficiency, we compute the frequency response of the kernels in the convolutional and dense layers and eliminate filters with low-energy impulse responses. Moreover, to reduce storage requirements on edge devices, we compare the convolutional kernels in the Fourier domain and discard similar filters using the cosine similarity measure in the frequency domain. We test the network on a variety of wildfire video clips; the pruned system performs as well as the regular network on daytime wildfire detection and also works well on some nighttime wildfire clips.
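A minimal sketch of the pruning idea described above, assuming NumPy and random kernels; the energy and similarity thresholds, the FFT padding size, and the `prune_kernels` helper are illustrative choices, not values from the paper:

```python
import numpy as np

def prune_kernels(kernels, energy_thresh=0.01, sim_thresh=0.95):
    """Prune conv kernels by spectral energy and frequency-domain similarity.

    kernels: array of shape (n_filters, kh, kw). Thresholds are
    illustrative placeholders, not the paper's values.
    """
    # Frequency response of each kernel (zero-padded 2-D FFT).
    spectra = np.fft.fft2(kernels, s=(16, 16))
    mags = np.abs(spectra).reshape(len(kernels), -1)

    # 1) Energy-based pruning: drop filters whose spectral energy is a
    #    small fraction of the strongest filter's energy.
    energy = (mags ** 2).sum(axis=1)
    keep = energy >= energy_thresh * energy.max()

    # 2) Similarity-based pruning: among survivors, drop filters whose
    #    magnitude spectra are nearly cosine-parallel to an earlier one.
    kept_idx = [i for i in range(len(kernels)) if keep[i]]
    selected = []
    for i in kept_idx:
        u = mags[i] / np.linalg.norm(mags[i])
        if all(u @ (mags[j] / np.linalg.norm(mags[j])) < sim_thresh
               for j in selected):
            selected.append(i)
    return kernels[selected], selected

# Example: 32 random 3x3 kernels, one a scaled copy of another.
k = np.random.randn(32, 3, 3)
k[5] = k[0] * 2.0   # same frequency shape, different scale: gets discarded
pruned, idx = prune_kernels(k)
print(len(idx), "filters kept")
```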
An Early Flame Detection System Based on Image Block Threshold Selection Using Knowledge of Local and Global Feature Analysis
Fire is a volatile hazard that damages property and destroys forests. Many researchers work on early warning systems, which considerably reduce the consequences of fire damage. However, many existing image-based fire detection systems perform well only in particular settings. This paper proposes a general framework that works under realistic conditions. The approach filters out image blocks based on thresholds over different temporal and spatial features: the image is first divided into blocks, flame blocks are extracted from the image foreground and background, and candidate blocks are then analyzed for the local features of color, source immobility, and flame flickering. Each local feature filter resolves a different class of false-positive fire cases. Filtered blocks are further examined by a global analysis that extracts flame texture and flame reflection in surrounding blocks. Sequences of successful detections are buffered by a decision alarm system to reduce errors due to external camera influences. The proposed algorithms have low computation time. Through a sequence of experiments, the results are consistent with the empirical evidence and show that the detection rate of the proposed system exceeds that of previous studies while reducing false alarm rates across various environments.
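A rough sketch of the local filtering stage under stated assumptions: frames arrive as an RGB NumPy clip, a simple color rule (R > G > B with a red floor) and a temporal-variance flicker test stand in for the paper's local feature filters, and all thresholds are illustrative:

```python
import numpy as np

def candidate_flame_blocks(frames, block=16, r_min=180, flicker_min=8.0):
    """Flag image blocks that pass simple local flame filters.

    frames: (T, H, W, 3) uint8 RGB clip. Surviving blocks would go on
    to the global texture/reflection analysis stage.
    """
    T, H, W, _ = frames.shape
    f = frames.astype(np.float32)
    mask = np.zeros((H // block, W // block), dtype=bool)
    for by in range(H // block):
        for bx in range(W // block):
            blk = f[:, by*block:(by+1)*block, bx*block:(bx+1)*block, :]
            r, g, b = blk[..., 0], blk[..., 1], blk[..., 2]
            # Local color filter: fire pixels tend to satisfy R > G > B.
            colored = np.mean((r > g) & (g > b) & (r > r_min))
            # Flicker filter: flame regions vary over time, while static
            # red objects (e.g. signs) do not.
            flicker = blk.mean(axis=(1, 2, 3)).std()
            mask[by, bx] = colored > 0.5 and flicker > flicker_min
    return mask
```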
Measuring frequency domain Granger causality for multiple blocks of interacting time series
In recent years, several frequency-domain causality measures based on vector autoregressive time series modeling have been proposed to assess directional connectivity in neural systems. The most common approaches represent the considered set of multiple time series either as a realization of two or three vector-valued processes, yielding the so-called Geweke linear feedback measures, or as a realization of multiple scalar-valued processes, yielding popular measures such as the directed coherence (DC) and the partial DC (PDC). In the present study, these two approaches are unified and generalized by proposing novel frequency-domain causality measures that extend the existing measures to the analysis of multiple blocks of time series. Specifically, the block DC (bDC) and block PDC (bPDC) extend DC and PDC to vector-valued processes, while their logarithmic counterparts, denoted as multivariate total feedback and direct feedback, recast Geweke's measures in a fully multivariate framework. Theoretical analysis shows that the proposed measures: (i) possess desirable properties of causality measures; (ii) reflect either direct causality (bPDC) or total (direct + indirect) causality (bDC) between time series blocks; (iii) reduce to the DC and PDC measures for scalar-valued processes, and to Geweke's measures for pairs of processes; and (iv) capture internal dependencies between the scalar constituents of the analyzed vector processes. Numerical analysis showed that the proposed measures can be estimated efficiently from short time series, represent the information derived from the causal analysis of several pairs of time series in an objective, compact way, and may detect frequency-domain causality more accurately than existing measures. The proposed measures find their natural application in the evaluation of directional interactions in neurophysiological settings where brain activity signals are recorded simultaneously from multiple regions of interest.
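For orientation, a small sketch of the classical (scalar) squared directed coherence that bDC generalizes, computed from given VAR coefficients with NumPy; the toy VAR(1) and its values are invented for illustration, and this is the textbook DC, not the paper's block estimator:

```python
import numpy as np

def directed_coherence(A, sigma2, freqs):
    """Squared directed coherence (DC) from VAR coefficients.

    A: (p, M, M) VAR coefficient matrices; sigma2: (M,) innovation
    variances (diagonal noise covariance assumed).
    """
    p, M, _ = A.shape
    dc = np.zeros((len(freqs), M, M))
    for n, f in enumerate(freqs):
        # Spectral coefficient matrix A(f) = I - sum_k A_k e^{-i 2 pi f k}
        Af = np.eye(M, dtype=complex)
        for k in range(p):
            Af -= A[k] * np.exp(-2j * np.pi * f * (k + 1))
        H = np.linalg.inv(Af)                          # transfer function
        num = sigma2[None, :] * np.abs(H) ** 2
        dc[n] = num / num.sum(axis=1, keepdims=True)   # row-normalized
    return dc  # dc[n, i, j]: causality from series j to i at freqs[n]

# Toy bivariate VAR(1): series 0 drives series 1, not vice versa.
A = np.array([[[0.5, 0.0],
               [0.4, 0.5]]])
dc = directed_coherence(A, sigma2=np.ones(2), freqs=np.linspace(0, 0.5, 64))
print(dc[:, 1, 0].max())   # nonzero 0 -> 1 coupling across frequencies
```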
Latent linkage semantic kernels for collective classification of link data
Generally, links among objects exhibit certain patterns and contain rich semantic clues that can be used to improve classification accuracy. However, real-world link data may exhibit more complex regularity; for example, there may be noisy links that carry no human editorial endorsement about semantic relationships. To capture such regularity effectively, this paper proposes latent linkage semantic kernels (LLSKs), first introducing linkage kernels to model the local and global dependency structure of a link graph and then applying the singular value decomposition (SVD) in the kernel-induced space. For computational efficiency on large datasets, we also develop a block-based algorithm for LLSKs. A kernel-based contextual dependency network (KCDN) model is then presented to exploit the dependencies in a network of objects for collective classification. Experimental results demonstrate that the KCDN model, together with LLSKs, is relatively robust on datasets with complex link regularity, and that the block-based computation method scales well with the size of the problem.
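A toy sketch of the overall recipe (a linkage kernel followed by SVD in the kernel-induced space), assuming NumPy; the particular co-linkage kernel and the tiny graph are stand-ins, and the paper's block-based computation for large datasets is not shown:

```python
import numpy as np

def latent_link_features(adj, rank=2):
    """Toy latent linkage features: a linkage kernel, then truncated SVD.

    The kernel choice here is an illustrative stand-in, not the
    paper's definition of LLSKs.
    """
    A = adj.astype(float)
    # Linkage kernel: objects are similar if they link to similar
    # objects (A A^T) or are linked from similar objects (A^T A).
    K = A @ A.T + A.T @ A
    # SVD in the kernel-induced space; keep the top singular directions
    # as denoised latent features for a downstream classifier.
    U, s, _ = np.linalg.svd(K)
    return U[:, :rank] * s[:rank]

# 4 objects: 0-1 interlinked, 2-3 interlinked, one noisy 1 -> 2 link.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 0, 0, 1],
                [0, 0, 1, 0]])
print(latent_link_features(adj))
```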
A Comparison of Plugged and Unplugged Tools in Teaching Algorithms at the K-12 Level for Computational Thinking Skills
This study examines the effects of plugged and unplugged programming tools used in algorithm teaching at the K-12 level on students' computational thinking skills, and determines whether gender is a factor in this process. The study used a quasi-experimental, control-group pre-test/post-test design with 109 sixth-grade students at a secondary school. Three of the school's four class branches were randomly selected, and the experiment and control groups were determined by random assignment. Experiment group 1 was taught with Code.org, experiment group 2 with unplugged tools, and the control group with Scratch. The study lasted 6 weeks, with 2 lessons per week. The data collection tool was the "Computational Thinking Levels Scale", administered as the pre- and post-test. Our findings showed that while the group taught with unplugged activities showed positive development in computational thinking skills, no significant improvements were observed in the other groups. When comparing computational thinking skills across groups, again, no significant difference was found. Group and gender cofactors did not create significant variation among the groups; when examined on a group basis, differences favored male students performing unplugged activities.
Characterizing Students’ 4C Skills Development During Problem-based Digital Making
Amid the maker movement, educators are proposing various making activities with programmable artifacts to prepare students for coping with the challenges of the twenty-first century. Today, the “4C” skills—critical thinking, creativity, communication, and collaboration—are regarded as significant learning outcomes in Science, Technology, Engineering, and Mathematics education; however, few researchers have investigated the adoption of problem-based learning in K-12 programming education for developing students’ 4C skills. A case study was conducted in a “digital making” camp in which 54 upper elementary and lower secondary school students (10–14 years old) harnessed a block-based programming tool, Scratch, to carry out various problem-solving tasks. By triangulating multiple sources of qualitative data (including lesson plans, classroom field notes, videotaped lesson records, student solutions/artifacts, and post-intervention interviews) with microgenetic learning analysis, this study characterizes students’ 4C skills development in the process of problem-based digital making. We found that the problem-based digital making environment supported the students’ development of (a) critical thinking in the form of critical modeling and critical data handling; (b) creativity in the form of creative explorations, creative solutions, and creative expressions; and (c) communication and collaboration in the form of communicative scaffolding and collaborative debugging. Evidence-based suggestions for scaffolding problem-based digital making activities are also provided.
Debugging during block-based programming
In this study, we investigated the debugging process that early childhood preservice teachers used during block-based programming. Its purpose was to provide insights into how to prepare early childhood teachers to integrate computer science into instruction. This study reports the types of errors that early childhood preservice teachers commonly made and how they debugged those errors. Findings are discussed in relation to research and practice that could benefit from debugging instruction. This study provides directions for future computer science education research that aims to prepare teachers for programming, computational thinking, and STEM education. Though this study used robotics as its programming context, the findings on early childhood preservice teachers' debugging processes could be applicable to other contexts involving block-based programming.
Exploring Force and Motion Concepts in Middle Grades Using Computational Modeling: a Classroom Intervention Study
Computational thinking (CT) and modeling are authentic practices that scientists and engineers use frequently in their daily work. Advances in computing technologies have further emphasized the centrality of modeling in science by making computationally enabled model use and construction more accessible to scientists. As such, it is important for all students to be exposed to these practices in K-12 science classrooms. This study investigated how a week-long intervention in a regular middle school science classroom, which introduced CT and simulation-based model building through block-based programming, influenced students’ learning of CT and of force and motion concepts. Eighty-two seventh-grade students from a public middle school participated. Quantitative data sources included pre- and post-assessments of students’ understanding of force and motion concepts and of their CT abilities. Qualitative data sources included classroom observation notes, student interviews, and students’ reflection statements. During the intervention, students were introduced to CT using block-based programming and engaged in constructing simulation-based computational models of physical phenomena. The findings indicated that building computational models resulted in significant conceptual learning gains for the study sample. The dynamic nature of computational models let students both observe and interact with the target phenomenon in real time, while the generative dimension of model construction promoted a rich classroom discourse that facilitated conceptual learning. This study contributes to the nascent literature on integrating CT into K-12 science curricula by emphasizing the affordances and the generative dimension of model construction through block-based programming.
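As a flavor of the simulation-based models the students constructed, here is a minimal text-based analogue in Python: a cart under a constant applied force and kinetic friction, advanced by Euler integration; all values are illustrative:

```python
# Minimal force-and-motion simulation: F_net = m*a, Euler integration.
mass, force, friction = 2.0, 6.0, 1.5   # kg, N, N (kinetic friction)
x, v, dt = 0.0, 0.0, 0.1                # m, m/s, s

for step in range(50):
    net = force - friction               # net force on the cart
    a = net / mass                       # Newton's second law
    v += a * dt                          # update velocity from acceleration
    x += v * dt                          # update position from velocity
    if step % 10 == 0:
        print(f"t={step*dt:4.1f}s  v={v:5.2f} m/s  x={x:6.2f} m")
```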
Self-efficacy and behavior patterns of learners using a real-time collaboration system developed for group programming
To promote the practice of co-creation, a real-time collaboration (RTC) version of the popular block-based programming (BBP) learning environment MIT App Inventor (MAI) was proposed and implemented. RTC overcomes challenges related to non-collocated group work, lowering barriers to cross-region, multi-user collaborative software development. An empirical study probed the differential impact of the environment on learners' self-efficacy and collaborative behavior depending on their disciplinary background. The study serves as an example of using learning analytics to explore the frequent behavior patterns of adult learners, in this case while performing BBP in MAI integrated with RTC. The study compares collaborative and individual behavior patterns that occurred on the platform and investigates the effects of collaboration on learners working within the RTC depending on whether or not they were CS majors. We highlight advantages of the new MAI design for multi-user programming in the online RTC, drawing on the connections between the interface design and BBP as illustrated by two significant behavior patterns found in this instructional experiment. First, multi-user programming in the RTC allowed multiple tasks to happen at the same time, which promoted engagement in joint behavior; for example, one user arranged components in the interface design while another dragged blocks to complete the program. Second, the study confirmed that Computer Programming Self-Efficacy (CPSE) was similar for individual and multi-user programming overall. The CPSE of the homogeneous CS-major groups engaged in programming within the RTC was higher than that of the homogeneous non-CS-major groups and the heterogeneous groups. There was no significant difference between the CPSE of the homogeneous non-CS groups and that of the heterogeneous groups, regardless of whether they engaged in individual or collaborative programming within their groups. The results support the value of engaging with MAI collaboratively, especially for CS majors, and suggest directions for future work in RTC design.
A Block-Based Adaptive Decoupling Framework for Graph Neural Networks
Graph neural networks (GNNs) with feature propagation have demonstrated their power in handling unstructured data. However, feature propagation is also a smoothing process that tends to make all node representations similar as the number of propagation steps increases. To address this problem, we propose a novel Block-Based Adaptive Decoupling (BBAD) framework that produces effective deep GNNs by utilizing backbone networks. In this framework, each block contains a shallow GNN with feature propagation and transformation decoupled. We also introduce layer regularizations and flexible receptive fields to automatically adjust the propagation depth and to provide different aggregation hops for each node, respectively. We prove that traditional coupled GNNs are more likely to suffer from over-smoothing as they become deep, and we demonstrate the diversity of outputs from different blocks of our framework. In experiments on semi-supervised and fully supervised node classification over benchmark datasets, the results verify that our method not only improves the performance of various backbone networks but is also superior to existing deep graph neural networks while using fewer parameters.
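A compact sketch of the decoupling idea, assuming NumPy: parameter-free propagation over several hops, combined and only then transformed; the uniform averaging of hop outputs is a crude fixed stand-in for BBAD's learned, per-node aggregation depth:

```python
import numpy as np

def normalized_adj(A):
    """Symmetrically normalized adjacency with self-loops: D^-1/2 (A+I) D^-1/2."""
    A = A + np.eye(len(A))
    d = A.sum(axis=1)
    Dinv = np.diag(d ** -0.5)
    return Dinv @ A @ Dinv

def decoupled_block(X, A_hat, W, max_hops=3):
    """One decoupled block: propagate up to max_hops without any weights,
    then apply a single transformation step.
    """
    hops, H = [X], X
    for _ in range(max_hops):
        H = A_hat @ H            # parameter-free propagation (smoothing)
        hops.append(H)
    H = np.mean(hops, axis=0)    # combine receptive fields of all sizes
    return np.maximum(H @ W, 0)  # transformation, decoupled from propagation

# Toy graph: 5 nodes on a path, 4-dim features -> 2-dim output.
A = np.diag(np.ones(4), 1); A = A + A.T
X = np.random.randn(5, 4)
W = np.random.randn(4, 2)
print(decoupled_block(X, normalized_adj(A), W).shape)  # (5, 2)
```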