Search Results

429 results for "programming paradigm"
A Processing-in-Memory Architecture Programming Paradigm for Wireless Internet-of-Things Applications
The widespread application of the wireless Internet of Things (IoT) is one of the leading factors in the emergence of Big Data. Huge amounts of data need to be transferred and processed, and the bandwidth and latency of data transfers pose a new challenge for traditional computing systems. In Big Data application scenarios, large-scale data movement affects performance, power efficiency, and reliability, the three fundamental attributes of a computing system. Thus, a change in the computing paradigm is in demand. Processing-in-Memory (PIM), which aims at placing computation as close as possible to memory, has become of great interest to academia as well as industry. In this work, we propose a programming paradigm for PIM architecture that is suitable for wireless IoT applications. A data-transferring mechanism and middleware architecture are presented. We present our methods and experiences in designing a simulation platform, as well as an FPGA demo, for PIM architecture. Typical IoT applications, such as multimedia and MapReduce programs, are used to demonstrate the validity and efficiency of our method. The programs run successfully on the simulation platform built on Gem5 and on the FPGA demo. Results show that our method can largely reduce power consumption and execution time for those programs, which is very beneficial in IoT applications.
QCCP: a taskflow programming model for emerging computing scenario
As the demand for computing power continues to rise, it is difficult for any single type of computing device or architecture to satisfy it, and diversity and heterogeneity are becoming increasingly common. Seamlessly integrating high-performance computing and quantum computing, and harnessing their collective potential, has emerged as a consensus approach to addressing the pressing need for increased computing power. In this emerging computing scenario, the various types of computing devices are super-heterogeneous, with significant differences in computational principles, programming models, parallelism, and so on. Effectively harnessing these disparate resources and achieving a unified programming paradigm have become urgent imperatives. To address these problems, this paper introduces QCCP, a taskflow programming model that enables efficient collaborative computing between classical and quantum computers. QCCP establishes a unified programming abstraction, shields the super-heterogeneous characteristics of the underlying network and hardware, and supports flexible scheduling across different computational backends. The experimental results indicate that QCCP can support the processing of hybrid classical-quantum applications with diverse program structures. In particular, QCCP shows its potential and advantages in tackling real-world challenges, specifically quantum circuit cutting and reconstruction.
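As a rough sketch of the taskflow idea described above (not QCCP's actual API; every class, function, and backend name below is hypothetical), a unified abstraction might register tasks with dependencies and dispatch each one to a classical or quantum backend:

```python
# Hypothetical sketch of a taskflow-style model that dispatches tasks to
# classical or quantum backends; this is NOT the QCCP API, only an illustration.
from collections import defaultdict, deque
from typing import Callable


class Task:
    def __init__(self, name: str, fn: Callable, backend: str = "classical"):
        self.name, self.fn, self.backend = name, fn, backend
        self.deps: list["Task"] = []

    def after(self, *tasks: "Task") -> "Task":
        self.deps.extend(tasks)
        return self


class Taskflow:
    def __init__(self):
        self.tasks: list[Task] = []

    def add(self, task: Task) -> Task:
        self.tasks.append(task)
        return task

    def run(self):
        # Topologically order tasks, then run each one on its tagged backend.
        indeg = {t: len(t.deps) for t in self.tasks}
        children = defaultdict(list)
        for t in self.tasks:
            for d in t.deps:
                children[d].append(t)
        ready = deque(t for t in self.tasks if indeg[t] == 0)
        results = {}
        while ready:
            t = ready.popleft()
            # A real scheduler would submit "quantum" tasks to a QPU service and
            # "classical" tasks to a CPU/GPU pool; here we simply call the function.
            results[t.name] = t.fn(*(results[d.name] for d in t.deps))
            for c in children[t]:
                indeg[c] -= 1
                if indeg[c] == 0:
                    ready.append(c)
        return results


flow = Taskflow()
prep = flow.add(Task("prepare", lambda: [0.1, 0.2]))
circ = flow.add(Task("circuit", lambda p: sum(p), backend="quantum").after(prep))
post = flow.add(Task("post", lambda e: e * 2).after(circ))
print(flow.run())  # runs prepare -> circuit -> post in dependency order
```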
A Proposal of Naturalistic Software Development Method
Naturalistic programming aims to include natural language elements in programming languages to increase software expressiveness. Even though natural language is inherently ambiguous, it is richer and thus more expressive than any artificial language. Currently, the Naturalistic Programming Paradigm (NPP) is supported by its conceptual model and by three general-purpose naturalistic programming languages that can generate executable binary code. Nevertheless, to date, no research efforts have concentrated on applying the NPP within a software development process. To address this gap, in this article we propose a naturalistic software development method to test the advantages of the NPP. The method focuses on the analysis and design stages of the software development process and seeks to help close the gap between the problem and solution domains. We also present an example implementation using Cal-4700, a naturalistic programming language, showing the differences in expressiveness compared with a traditional programming language such as Python.
On the effectiveness of the test-first approach to programming
Test-driven development (TDD) is based on formalizing a piece of functionality as a test, implementing the functionality such that the test passes, and iterating the process. This paper describes a controlled experiment for evaluating an important aspect of TDD: in TDD, programmers write functional tests before the corresponding implementation code. The experiment was conducted with undergraduate students. While the experiment group applied a test-first strategy, the control group applied a more conventional development technique, writing tests after the implementation. Both groups followed an incremental process, adding new features one at a time and regression testing them. We found that test-first students on average wrote more tests and, in turn, students who wrote more tests tended to be more productive. We also observed that the minimum quality increased linearly with the number of programmer tests, independent of the development strategy employed.
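The test-first cycle the abstract describes, write a failing test, implement just enough code to make it pass, then iterate, can be illustrated with a minimal Python example; the function, pricing rule, and test names are ours, not taken from the experiment:

```python
# Step 1 (test-first): formalize the desired behaviour as tests before writing
# any implementation; at that point the tests fail because the function is missing.
import unittest


def shipping_cost(weight_kg: float) -> float:
    """Flat rate up to 1 kg, then a per-kg surcharge (illustrative rule only)."""
    # Step 2: write just enough code to make the tests below pass, then iterate
    # by adding the next test and extending the implementation.
    if weight_kg <= 1.0:
        return 5.0
    return 5.0 + 2.0 * (weight_kg - 1.0)


class TestShippingCost(unittest.TestCase):
    # In a test-first workflow this class is written before shipping_cost exists.
    def test_flat_rate_below_threshold(self):
        self.assertEqual(shipping_cost(0.5), 5.0)

    def test_surcharge_above_threshold(self):
        self.assertAlmostEqual(shipping_cost(2.5), 8.0)


if __name__ == "__main__":
    unittest.main()
```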
A qualitative assessment of αRby in the perspective of the supervisory control theory
It is becoming increasingly evident that SAT-solving approaches have the potential to verify properties and synthesize supervisors of controlled systems described at a high level of abstraction. Such approaches can be particularly appropriate when engineers give more importance to decentralized, hierarchical, and parameterized control paradigms than to centralized ones in the design of systems composed of multiple small agents. One advantage of declarative programming languages, such as relational logic, in specifying control problems, including their underlying properties and reasoning methods, is their proximity to the mathematical objects used in the formulation of the theory itself, which allows new fragments of it to be implemented faster. The disadvantage, however, is that SAT-solving approaches do not lend themselves to efficient calculations of auxiliary objects involved in some control problems, even if they can be described with the logic at hand. In some cases, the latter is not sufficiently powerful to express the entire solution logically. Such difficulties can be circumvented with αRby, a fusion of Alloy and Ruby. Based on earlier experiments conducted with Alloy, this paper provides a qualitative assessment of αRby and reports on the results of new experiments with two fragments of the supervisory control theory: state-based control and decentralized control.
The Influence of the Developed Specific Multi-Paradigm Programming in Digital Logic Education
This article introduces a programming discourse developed to support training in the area of digital logic. The discourse merges several programming paradigms into one solution. The intended learners are secondary school students focused on digital system programming. The main intent is to find out whether a digital logic curriculum based on Digital Circuits Based Logical Programming (DCBLP) has a positive impact on students and on the way they explore digital logic itself. Students' cognitive and affective domains are within the scope of this preliminary research, supported by questionnaires and cognitive tests. Experimental and control groups were used to gather relevant records. The two-tailed chi-square test was used to analyse and interpret the data gathered by the questionnaires, and ANOVA was used to evaluate the achievement test results. The preliminary research revealed that the developed programming discourse DCBLP can be used in digital logic training. Students reported the overall usefulness of the discourse in their training, although no strong motivational power of the programming discourse itself was observed. From the achievement test we conclude that the performance of students trained with the new programming discourse is significantly better. It is possible to combine different programming paradigms, such as imperative and declarative, in one solution to support training in digital logic. Such solutions can enhance the way students deal with programming languages and also support interdisciplinary relationships.
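The closing claim, that imperative and declarative styles can be combined in one solution for digital logic training, can be sketched in Python; this is a generic illustration of the idea, not the DCBLP discourse itself:

```python
# Illustration only (not the DCBLP discourse): a declarative gate specification
# (truth tables as data) combined with an imperative simulation driver.
from itertools import product

# Declarative part: gates are defined by truth tables, not by control flow.
GATES = {
    "AND": {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1},
    "OR":  {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 1},
    "XOR": {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0},
}


def half_adder(a: int, b: int) -> tuple[int, int]:
    """Imperative part: compose the declaratively defined gates step by step."""
    total = GATES["XOR"][(a, b)]
    carry = GATES["AND"][(a, b)]
    return total, carry


# Imperative driver: enumerate all input combinations and print the results.
for a, b in product((0, 1), repeat=2):
    s, c = half_adder(a, b)
    print(f"a={a} b={b} -> sum={s} carry={c}")
```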
The Evolution of Imperative Programming Paradigms as a Search for New Ways to Reduce Code Duplication
The cause of shifts in the imperative programming paradigm is the impossibility of developing software systems at a new level of complexity. We consider the evolution of programming paradigms: structured, procedural, and object-oriented. We demonstrate that new ways of reducing code duplication have appeared with each paradigm shift. We conclude that the reduction of code duplication is the factor that determines the direction of programming paradigm evolution. We find that the constraints introduced by these paradigms simplify the development of software systems, and we conclude that the new constraints allow the development of more complex software systems. The main reason for code duplication is the low qualification of programmers. Therefore, in the process of learning programming, one should pay attention to code duplication and ways to reduce it.
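As a hedged illustration of the abstract's theme (our example, not taken from the paper), the same repeated logic can be reduced first with a procedural abstraction and then with an object-oriented one:

```python
# Illustration only: the same formatting rule written twice, then factored out
# procedurally, then generalized with an object-oriented variation point.

# 1. Duplicated code: the rule is copied for every record.
print("invoice #" + str(101).rjust(6, "0"))
print("invoice #" + str(102).rjust(6, "0"))


# 2. Procedural paradigm: the shared rule lives in one function.
def invoice_label(number: int) -> str:
    return "invoice #" + str(number).rjust(6, "0")


for n in (101, 102):
    print(invoice_label(n))


# 3. Object-oriented paradigm: variation points become subclasses, not copies.
class Document:
    prefix = "document"

    def __init__(self, number: int):
        self.number = number

    def label(self) -> str:
        return f"{self.prefix} #{self.number:06d}"


class Invoice(Document):
    prefix = "invoice"


class Receipt(Document):
    prefix = "receipt"


for doc in (Invoice(101), Receipt(7)):
    print(doc.label())
```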
Developing Web-Based Process Management with Automatic Code Generation
Automated code generation and process flow management are central to web-based application development today. This database-centric approach targets the form and process management challenges faced by corporations. It minimizes the time lost in managing hundreds of forms and processes, especially in large companies. Shortening development times, optimizing user interaction, and simplifying the code are critical advantages offered by this methodology. These low-code systems accelerate development, allowing organizations to adapt to the market quickly. This approach simplifies the development process with drag-and-drop features and enables developers to produce more effective solutions with less code. Automatic code generation with flow diagrams allows inter-page interactions and processes to be managed more intuitively. The interactive Process Design Editor developed in this study makes code generation more user-friendly and accessible. The case study results show that the system achieved a 98.68% improvement in development processes, a 95.84% improvement in test conditions, and a 36.01% improvement in code size. In conclusion, automated code generation and process flow management represent a significant evolution in web application development processes. This methodology both shortens development times and improves code quality. In the future, demand for these technologies is expected to increase even further.
Designing Modular Software: A Case Study in Introductory Statistics
Modular programming is a development paradigm that emphasizes self-contained, flexible, and independent pieces of functionality. This practice allows new features to be added seamlessly when desired and unwanted features to be removed, thus simplifying the software's user interface. The recent rise of web-based software applications has presented new challenges for designing an extensible, modular software system. In this article, we outline a framework for designing such a system, with a focus on reproducibility of the results. As a case study, we present a Shiny-based web application called intRo, which allows the user to perform basic data analyses and statistical routines. Finally, we highlight some challenges we encountered when combining modular programming concepts with the reactive programming used by Shiny, and how to address them. Supplementary material for this article is available online.
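The modular design described here, self-contained pieces of functionality that can be registered or removed without touching the rest of the system, can be sketched as a generic plug-in registry in Python; this is an illustration of the concept, not intRo's actual architecture:

```python
# Generic plug-in registry illustrating modular design; not intRo's architecture.
from statistics import mean, stdev
from typing import Callable

MODULES: dict[str, Callable[[list[float]], dict]] = {}


def register(name: str):
    """Decorator that plugs a self-contained module into the application."""
    def wrap(fn):
        MODULES[name] = fn
        return fn
    return wrap


@register("summary")
def summary_module(data: list[float]) -> dict:
    return {"mean": mean(data), "sd": stdev(data)}


@register("extremes")
def extremes_module(data: list[float]) -> dict:
    return {"min": min(data), "max": max(data)}


def run_enabled(data: list[float], enabled: list[str]) -> dict:
    # Only the enabled modules run; removing one never breaks the others.
    return {name: MODULES[name](data) for name in enabled if name in MODULES}


print(run_enabled([1.0, 2.0, 3.0, 4.0], ["summary", "extremes"]))
```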
Comparative Study of the Inference Mechanisms in PROLOG and SPIDER
Control Network Programming (CNP) is a graphical, nonprocedural programming style whose built-in inference engine (interpreter) is based on search in a recursive network. This paper is the third in a series of reports that share a common objective: a comparison between the CNP language SPIDER and the logic programming language PROLOG. The focus here is on a comparative investigation of their interpreters, presented in a generic formal frame: the reduction of goals. By juxtaposing their pseudo-code, the advantages of SPIDER are outlined.
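Goal reduction, the generic formal frame mentioned above, can be illustrated with a toy propositional interpreter: a list of goals is repeatedly rewritten by replacing its first goal with the body of a matching rule. This is a simplified sketch, not the actual PROLOG or SPIDER interpreter; real PROLOG additionally handles variables and unification, and SPIDER's interpreter searches a recursive network:

```python
# Toy propositional goal-reduction interpreter (illustration only).
RULES = {
    # head: list of alternative bodies, each body being a list of subgoals
    "grandparent": [["parent", "parent_of_parent"]],
    "parent": [["mother"], ["father"]],
    "mother": [[]],            # fact: provable with no remaining subgoals
    "parent_of_parent": [[]],  # fact
}


def solve(goals: list[str]) -> bool:
    """Reduce the goal list by replacing the first goal with a matching rule body."""
    if not goals:
        return True  # every goal has been reduced away
    first, rest = goals[0], goals[1:]
    for body in RULES.get(first, []):
        if solve(body + rest):  # depth-first search over the alternatives
            return True
    return False  # no rule reduces this goal, so backtrack


print(solve(["grandparent"]))  # True
print(solve(["uncle"]))        # False: no rule for "uncle"
```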