Search Results

2,301 result(s) for "Analytical Engine"
Gender scripts in the old age subjectivation of men: A theoretical pledge through empirics
The article is based on a research study and uses less well-known conceptual and analytical tools. It focuses on the divergent gendering of older and old men from the perspective of the public/private divide. Preliminary research shows a greater presence of men at home and more care work, while women are becoming more active in the public sphere. To research the hypothetical alternative of gendered old age, a target group of highly educated and/or professionally well-skilled older and old men was constituted. All of them possess above-average social and cultural capital. The results show familial involvement already during the employment period, and a perceptible presence in public life after retirement. Nevertheless, the research pointed to the hidden mechanisms of hegemonic masculinity, which could add to the theoretical deconstruction of gender hierarchies.
The Computer from Pascal to von Neumann
In 1942, Lt. Herman H. Goldstine, a former mathematics professor, was stationed at the Moore School of Electrical Engineering at the University of Pennsylvania. It was there that he assisted in the creation of the ENIAC, the first electronic digital computer. The ENIAC was operational in 1945, but plans for a new computer were already underway. The principal source of ideas for the new computer was John von Neumann, who became Goldstine's chief collaborator. Together they developed EDVAC, successor to ENIAC. After World War II, at the Institute for Advanced Study, they built what was to become the prototype of the present-day computer. Herman Goldstine writes as both historian and scientist in this first examination of the development of computing machinery, from the seventeenth century through the early 1950s. His personal involvement lends a special authenticity to his narrative, as he sprinkles anecdotes and stories liberally through his text.
When Computers Were Human
Before Palm Pilots and iPods, PCs and laptops, the term "computer" referred to the people who did scientific calculations by hand. These workers were neither calculating geniuses nor idiot savants but knowledgeable people who, in other circumstances, might have become scientists in their own right. When Computers Were Human represents the first in-depth account of this little-known, 200-year epoch in the history of science and technology. Beginning with the story of his own grandmother, who was trained as a human computer, David Alan Grier provides a poignant introduction to the wider world of women and men who did the hard computational labor of science. His grandmother's casual remark, "I wish I'd used my calculus," hinted at a career deferred and an education forgotten, a secret life unappreciated; like many highly educated women of her generation, she studied to become a human computer because nothing else would offer her a place in the scientific world. The book begins with the return of Halley's comet in 1758 and the effort of three French astronomers to compute its orbit. It ends four cycles later, with a UNIVAC electronic computer projecting the 1986 orbit. In between, Grier tells us about the surveyors of the French Revolution, describes the calculating machines of Charles Babbage, and guides the reader through the Great Depression to marvel at the giant computing room of the Works Progress Administration. When Computers Were Human is the sad but lyrical story of workers who gladly did the hard labor of research calculation in the hope that they might be part of the scientific community. In the end, they were rewarded by a new electronic machine that took the place and the name of those who were, once, the computers.
Sacramental Engines: The Trinitarian Ontology of Computers in Charles Babbage’s Analytical Engine
Charles Babbage’s Analytical Engine can be recollected as a fossilized image of the first digital computer. It is essentially distinguished from all prior and analog computers by the transcription of the ‘mechanical notation’, the separation of the mnemonic ‘store’ from the cybernetic ‘mill’, and the infinite miniaturization of its component parts. This substitution of finite space for an accelerating singularity of time creates the essential rupture of the digital, in which a singular calculation of mechanical force stands opposed to the universal totality of space. Babbage’s criticism of Christian doctrine to preserve the mathematical consistency of mechanics and computing would result in the collapse of the Christian Trinity into a digital theology. This Arian subordinate difference of the Son to the Father would then be infinitely transcribed in a technical contradiction that would threaten to annul the metaphysical ground of any machine. Against digital and postdigital theologies alike, this rupture can only be repaired by a dialectical analysis of the digital into a hyperdigital grammar, which is created by Christ the Logos in a trinitarian ontology of computers. Digital computers can thus be vindicated from theological suspicion as incarnationally accelerated calculators of the sacraments, or ‘sacramental engines’ of the digital age.
MAES_GR: A Web-Based, Spatially Enabled Field Survey Platform for the MAES Implementation in Greece
This study presents a standardized approach to collecting, registering, and reporting field-survey data for baseline MAES (Mapping and Assessment of Ecosystems and their Services) information in Greece. This is accomplished through a web-based platform (MAES_GR) exclusively developed under the relevant, nation-wide LIFE-IP 4 NATURA project. Based on the European Commission's guidance for ecosystem condition (EC) and ecosystem services (ES) MAES studies, we conceptualized and structured an online platform to support EC and ES assessments, integrating all relevant fields of information needed for registering EC and ES parameters. A novel algorithm calculating EC was also developed and is available as an integral part of the platform. The use of the MAES_GR platform was evaluated during nationwide field survey efforts, increasing time efficiency and reducing costs. Field recording of EC and ES pinpoints spatial priorities for ecosystem restoration, conservation, and sustainable development. This work highlights that MAES implementation can be favored by the use of technology tools such as mobile survey platforms, developed according to scientific needs and policy guidelines. Such tools, apart from the data inventory phase, can be used for data analysis, synthesis, and extraction, providing timely, standardized information suitable for reporting at the local, regional, national, and European Union scale.
Whiplash Syndrome Reloaded: Digital Echoes of Whiplash Syndrome in the European Internet Search Engine Context
In many Western countries, after a motor vehicle collision, those involved seek health care for the assessment of injuries and for insurance documentation purposes. In contrast, in many less wealthy countries, there may be limited access to care and no insurance or compensation system. The purpose of this infodemiology study was to investigate the global pattern of evolving Internet usage in countries with and without insurance and the corresponding compensation systems for whiplash injury. We used Internet search engine analytics via Google Trends to study health information-seeking behavior concerning whiplash injury at national population levels in Europe. We found that the search for "whiplash" is strikingly and consistently often associated with the search for "compensation" in countries or cultures with a tort system. Frequent or traumatic painful injuries; diseases or disorders such as arthritis, headache, radius, and hip fracture; depressive disorders; and fibromyalgia were not associated similarly with searches on "compensation." In this study, we present evidence from the evolving viewpoint of naturalistic Internet search engine analytics that expectations for receiving compensation may influence Internet search behavior in relation to whiplash injury.
Cyber risk at the edge: current and future trends on cyber risk analytics and artificial intelligence in the industrial internet of things and industry 4.0 supply chains
Digital technologies have changed the way supply chain operations are structured. In this article, we conduct systematic syntheses of literature on the impact of new technologies on supply chains and the related cyber risks. A taxonomic/cladistic approach is used to evaluate progress in the area of supply chain integration in the Industrial Internet of Things and Industry 4.0, with a specific focus on the mitigation of cyber risks. An analytical framework is presented, based on a critical assessment of issues related to new types of cyber risk and the integration of supply chains with new technologies. This paper identifies a dynamic and self-adapting supply chain system supported with Artificial Intelligence and Machine Learning (AI/ML) and real-time intelligence for predictive cyber risk analytics. The system is integrated into a cognition engine that enables predictive cyber risk analytics with real-time intelligence from IoT networks at the edge. This enhances capacities and assists in creating a comprehensive understanding of the opportunities and threats that arise when edge computing nodes are deployed and when AI/ML technologies are migrated to the periphery of IoT networks.
Charles Babbage and his world
Most people have heard something about Charles Babbage, F.R.S., whose work on computers in the nineteenth century was much ahead of its time. Babbage worked on two computing devices, neither of which he completed: the Difference Engine and the Analytical Engine. It is the Analytical Engine that gives him a claim to major fame. Unfortunately, most of the details remained buried in his manuscript notebooks and were not unearthed until the modern computer age had begun. For that reason, it is rather overstating the case to describe him, as is often done, as the father of the computer. I shall not here be concerned with Babbage's technical achievements, but rather with the age in which he lived. It was an age in which rapid scientific and industrial progress was being made and in that respect it resembles our own. Babbage was born in 1791 and went up to Cambridge University in 1810. Steam engines were then rare, and railways and electric telegraphs lay in the future. The word 'electricity' denoted static electricity generated by friction, and there was no hint of the marvels that electric currents would bring.
Liquidity Risk Management Information Systems
As a result of the global financial crisis that manifested as both systemic and firm‐specific liquidity shortfalls and the consequent heightened regulatory oversight, banks and financial institutions are keenly focused on the need to identify, measure, manage, monitor, assess, and mitigate their liquidity risk positions in a timely manner. An effective management information system (MIS) is an important prerequisite that enables these foundational capabilities for achieving sound liquidity risk management (LRM) practice. The liquidity risk MIS should enable timely and relevant information to the board of directors, senior management, and liquidity managers to address both operational liquidity management and regulatory requirements effectively. To achieve this, a robust MIS infrastructure should be automated, auditable, adaptive, and sustainable. The core components of the liquidity risk MIS infrastructure should include a centralized data foundation, advanced analytical and stress test engines, management information portal, and reporting platform, all supported by a well‐defined data governance and quality control framework.
Building the Right Technology Landscape
This chapter discusses the key aspects of the technology ecosystem and how to build the right technology foundation in a scalable and cost-affordable manner. The topics covered are data platforms, analytics engines, multi-agent systems, adaptive user experiences, universal software gateways, and technology partner ecosystems. Database management architecture and systems are the most critical components of the technical infrastructure for AI projects. Analytics engines help reduce the time to develop and deploy AI algorithms. These engines most often come with a domain-specific common modeling language to uniformly represent analytical functions. Many industries and companies are adopting microservices architecture as a potential intermediate step before migration to a more mature multi-agent system. Paying attention to user experience and making it adaptive is therefore critical to the success of any AI initiative. Technology partners become critical as they may bring the speed and scale needed to win with AI.