73 results for "Building Great Britain Data processing."
Ground-Penetrating Radar Mapping Using Multiple Processing and Interpretation Methods
Ground-penetrating radar processing and interpretation methods have developed over time to follow a fairly standard pathway, leading from the raw reflection data to amplitude slice-maps for three-dimensional visualization. In this standard series of analysis steps a great deal of important information contained in the raw data can be lost or ignored, and without careful consideration, data filtering and re-analysis, important buried features can sometimes go unnoticed. A typical ground-penetrating radar (GPR) dataset should, instead, be processed, re-evaluated, re-processed and then new images made from new sets of data as a way to enhance the visualization of radar reflections of interest. This should be done in an intuitive way, but only once a preliminary series of images has been produced using standard processing steps. An example from data collected in an agricultural field in France illustrates how obvious buried features are readily discovered and interpreted using standard processing steps, while additional frequency filtering, migration and re-processing of certain portions of the data produced images of a subtle Roman villa foundation that might otherwise have gone undiscovered. In sand dunes in coastal Brazil, geological complexity obscured the reflections from otherwise hidden anthropogenic strata, and only an analysis of multiple profiles using different scales and processing made this small buried feature visible. Foundations of buildings in a Roman city in England could be discovered easily using standard processing methods, but a more detailed analysis of reflection profiles after re-processing, and a comparison of GPR images with magnetic gradiometry maps, provided information about the functions of some buried buildings and also allowed an analysis of the city's destruction by fire.
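As a hedged illustration of the kind of re-processing step this abstract describes, the sketch below applies a zero-phase bandpass filter to a raw trace volume before new amplitude slice-maps are built. The paper supplies no code; the array layout, sampling interval and passband here are assumptions.

```python
# Hypothetical sketch: bandpass-filtering GPR traces before re-imaging.
# Assumes a 2-D numpy array `traces` (samples x traces); names and
# parameters are illustrative only, not the paper's workflow.
import numpy as np
from scipy.signal import butter, sosfiltfilt

def bandpass_traces(traces, dt_ns, low_mhz, high_mhz, order=4):
    """Zero-phase Butterworth bandpass applied trace by trace.

    traces : ndarray, shape (n_samples, n_traces), raw GPR amplitudes
    dt_ns  : sample interval in nanoseconds
    low_mhz, high_mhz : passband corners in MHz
    """
    fs_mhz = 1000.0 / dt_ns              # sampling frequency in MHz
    nyq = fs_mhz / 2.0                   # Nyquist frequency
    sos = butter(order, [low_mhz / nyq, high_mhz / nyq],
                 btype="band", output="sos")
    return sosfiltfilt(sos, traces, axis=0)

# Example: keep 200-600 MHz energy from a 0.2 ns sample-interval survey,
# then rebuild amplitude slice-maps from the filtered volume.
# filtered = bandpass_traces(raw, dt_ns=0.2, low_mhz=200, high_mhz=600)
```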
How the impacts of burst water mains are influenced by soil sand content
Society relies on infrastructure, but as infrastructure systems are often collocated and interdependent, they are vulnerable to cascading failures. This study investigated the cross-infrastructure and societal impacts of burst water mains, with the hypothesis that multi-infrastructure failures triggered by burst water mains are more common in sandy soils. When water mains in sandy soils burst, pressurised water can create subsurface voids and abrasive slurries, contributing to further infrastructure failures. Three spatial data investigations, at nested scales, were used to assess the influence of soil sand content on the frequency of burst water mains and the damage they cause: (1) to roads in the county of Lincolnshire, (2) to other proximal water mains in East Anglia and (3) to other proximal infrastructure and wider society across England and Wales. These investigations used infrastructure network and failure data, media reports and soil maps, and were supported by workshop discussions and structured interviews with infrastructure industry experts. The workshop, interviews and media reports produced a greater depth of information on the infrastructure and societal impacts of cascading failures than the analysis of infrastructure data. Cross-infrastructure impacts were most common on roads, built structures and gas pipes, and they occurred at a higher rate in soils with very high sand contents.
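The nested-scale comparison lends itself to a simple tabulation. The sketch below is a hypothetical Python example rather than the study's actual workflow: it bins bursts by the sand content of the surrounding soil and compares how often each class led to cross-infrastructure damage. The input file and column names are invented.

```python
# Illustrative sketch only: comparing burst-main impact rates across
# soil sand-content classes. `bursts` is an assumed table with one row
# per recorded burst, a `sand_pct` value sampled from a soil map, and a
# boolean `cascaded` flag for cross-infrastructure damage.
import pandas as pd

bursts = pd.read_csv("bursts_with_soil.csv")   # hypothetical input file

# Bin sand content and compare how often bursts caused further damage.
bins = [0, 20, 40, 60, 80, 100]
labels = ["0-20", "20-40", "40-60", "60-80", "80-100"]
bursts["sand_class"] = pd.cut(bursts["sand_pct"], bins=bins, labels=labels)

rates = bursts.groupby("sand_class", observed=True)["cascaded"].agg(["mean", "count"])
print(rates)   # a higher `mean` in the sandiest classes supports the hypothesis
```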
Detecting Associations between Archaeological Site Distributions and Landscape Features: A Monte Carlo Simulation Approach for the R Environment
Detecting association between archaeological sites and physical landscape elements such as geological deposits, vegetation, drainage networks, or areas of modern disturbance like mines or quarries is a key goal of archaeological projects. This goal is complicated by the incomplete nature of the archaeological record, the high degree of uncertainty of typical point distribution patterns, and, in the case of deeply buried archaeological sites, the absence of reliable information about the ancient landscape itself. Standard statistical approaches may not be applicable (e.g., the χ² test) or are difficult to apply correctly (regression analysis). Monte Carlo simulation, devised in the late 1940s by mathematical physicists, offers a way to approach this problem. In this paper, we apply a Monte Carlo approach to test for association between Lower and Middle Palaeolithic sites in Hampshire and Sussex, UK, and quarries recorded on historical maps. We code our approach in the popular 'R' software environment, describing our methods step-by-step and providing complete scripts so others can apply our method to their own cases. Association between sites and quarries is clearly shown. We suggest ways to develop the approach further, for example for detecting associations between sites or artefacts and remotely sensed deposits or features derived from aerial photographs or geophysical survey.
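The authors provide complete R scripts; the sketch below is a minimal Python analogue of the same idea, with invented coordinates: compare the observed mean site-to-nearest-quarry distance against the same statistic for randomly placed points, and read the Monte Carlo p-value off the simulated distribution.

```python
# Minimal Python analogue of the Monte Carlo association test described
# above (the paper's actual scripts are in R; all data here are toy).
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical coordinates: sites and quarries inside a unit study window.
sites = rng.random((30, 2))
quarries = rng.random((50, 2))

def mean_nearest_distance(points, targets):
    """Mean distance from each point to its nearest target."""
    d = np.linalg.norm(points[:, None, :] - targets[None, :, :], axis=2)
    return d.min(axis=1).mean()

observed = mean_nearest_distance(sites, quarries)

# Simulate the null hypothesis: sites placed at random in the same window.
n_sim = 999
sims = np.array([
    mean_nearest_distance(rng.random(sites.shape), quarries)
    for _ in range(n_sim)
])

# One-sided Monte Carlo p-value: are real sites closer to quarries
# than randomly placed points would be?
p = (1 + np.sum(sims <= observed)) / (n_sim + 1)
print(f"observed = {observed:.4f}, p = {p:.3f}")
```

A real application would condition the random placement on the known study window and surviving landscape rather than a unit square, but the test logic is the same.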
Inside Smartgeometry
Smartgeometry (SG) is a key influence on the architectural community that explores creative computational methods for the design of buildings. An informal international network of practitioners and researchers, the group meets annually to experiment with new technologies and collaborate on developing digital design techniques. When SG was founded in 2001 by London-based architects and friends Hugh Whitehead (Foster + Partners), J Parrish (AECOM) and Lars Hesselgren (PLP), there was little in the way of parametric tools for architecture. SG was founded to encourage the development, discussion and experimentation of digital design techniques driven by design intent rather than by construction specifications. SG calls for a reconsideration of the design process, in which the creation of computational mechanisms becomes an integral part of designing, not a task done prior to or separate from the process. In the early years of the workshops this need for new ways of design thinking led to the development of Bentley's GenerativeComponents software. In recent years, the ecology of these design environments has diversified to include multiple software platforms, as well as innovative fabrication techniques and interactive environments. SG has grown accordingly from a handful of experts to an international network of designers who are defining the future of design. Founded by digital pioneers, it creates the algorithmic designers of the future. Inside Smartgeometry can be seen as a retroactive manifesto for SG, examining and contextualising the work of the SG community: the digital spaces, prototypes and buildings designed using bespoke tools created in response to architectural ideas. From interactive crowd-sourcing tools to responsive agent-based systems to complex digitally fabricated structures, it explores more than a decade of advances that have been influential for architecture. Through 23 original texts, including reflections by the founders and key contributors such as Robert Aish, Martin Bechthold, Mark Burry, Chris Williams and Robert Woodbury, the book offers a critical state of the art of computational design for architecture. Many international design and engineering firms have participated in SG, and the book includes chapters by practitioners from offices such as CASE, Design2Production, Foster + Partners, Grimshaw, Populous and SOM.
Assessing stone degradation using an integrated database and geographical information system (GIS)
An integrated database and geographical information system (GIS) for the recording and monitoring of stone degradation is outlined. GIS requirements for stone degradation are summarized, and a simple classification system for weathering forms is identified and applied. The potential use of the system for identifying change in weathering forms using historic imagery is illustrated with an example from Oxford. By combining information from imagery of different buildings at different time periods, it is possible to put forward a possible scenario of weathering form evolution.
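As a toy illustration of the change-detection idea, the sketch below compares weathering-form codes recorded for mapped facade zones at two survey epochs. The class names and data layout are invented, not the system's actual schema.

```python
# Hedged sketch: tracking change in recorded weathering forms between
# two survey dates for mapped zones of a facade. Codes are hypothetical.
survey_1950 = {"zone_A": "flaking", "zone_B": "sound",    "zone_C": "crusting"}
survey_2020 = {"zone_A": "scaling", "zone_B": "crusting", "zone_C": "crusting"}

for zone in sorted(survey_1950):
    before, after = survey_1950[zone], survey_2020[zone]
    status = "unchanged" if before == after else f"{before} -> {after}"
    print(f"{zone}: {status}")
```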
The GIS approach to evaporite-karst geohazards in Great Britain
Evaporite karst in Great Britain has formed in Permian and Triassic gypsum, and in Triassic salt. Active dissolution of these deposits can occur on a human rather than a geological timescale, causing subsidence and building damage. The British Geological Survey has taken two approaches towards understanding and advising on hazards caused by dissolution of these soluble rocks. At a detailed level, a national database and GIS of karstic features is being populated; the information gathered includes dolines, springs, stream sinks, caves and building damage. At a national level, the soluble rocks in Great Britain have been identified and digital-map polygon information relating to them extracted from the British 1:50,000-scale digital geological map. These areas have been assessed, and in places their margins extended to include overlying rocks where subsidence features are known to penetrate upwards through the overlying sequence. The national areas have then been assessed using detailed local information to assign a susceptibility rating from A (extremely low) to E (high), depending on the nature and regularity of the subsidence events that occur. This national zonation of the soluble rocks can be used in planning, construction and the insurance business, and has proved useful for assessing the potential stability of linear routes such as roads and pipelines, and of other important structures such as bridges and buildings. The information can also be used to delineate zones of karstic groundwater flow.
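A rating scheme of this shape is easy to sketch. The example below maps the density of recorded karst features in a polygon to an A-E class; the thresholds are invented for illustration, whereas the BGS assessment weighs the nature and regularity of actual subsidence events.

```python
# Illustrative sketch only: assigning an A-E susceptibility rating to a
# soluble-rock polygon from the density of recorded karst features.
# Thresholds are invented; they are not the BGS criteria.
def susceptibility_rating(features_per_km2: float) -> str:
    """Map karst-feature density to a rating from A (extremely low) to E (high)."""
    thresholds = [(0.1, "A"), (0.5, "B"), (2.0, "C"), (5.0, "D")]
    for limit, rating in thresholds:
        if features_per_km2 < limit:
            return rating
    return "E"

for density in (0.05, 0.3, 1.5, 4.0, 10.0):
    print(density, "->", susceptibility_rating(density))
```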
Numerical modelling of flood flows over irregular topography
In solving the hyperbolic equations of environmental shallow water flows, computational difficulty arises when the topography is irregular. In particular, under a conventional operator-splitting framework, a very small time step is often necessary to ensure stability; this limitation is dictated by the magnitude of the source terms arising from the irregular topography and inevitably increases the computational cost. Here a practically efficient finite volume algorithm is presented for solving the hyperbolic equations of shallow water flows. The numerical solution is achieved under an operator-splitting framework: a second-order weighted average flux (WAF) total variation diminishing (TVD) method, along with the Harten–Lax–van Leer (HLL) approximate Riemann solver, for the homogeneous equations, and a Runge–Kutta scheme for the ordinary differential equations of the source terms. For numerical stability, a self-adaptive time step method is proposed within the Runge–Kutta scheme. Numerical tests for a case of glacier-lake outburst flooding demonstrate that the present model is essentially free from the restriction on time step arising from irregular topography, and computational efficiency is therefore substantially enhanced. The model is benchmarked against an urban flooding event in Glasgow, UK, and the modelling results are in reasonable agreement with those obtained by others from the Flood Risk Management Research Consortium, UK.
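One building block of such a scheme can be sketched compactly. The Python example below implements an HLL approximate Riemann flux for the 1-D shallow water equations, the homogeneous-part solver named in the abstract; the WAF/TVD reconstruction, the source-term Runge–Kutta step and the adaptive time stepping are omitted, and the wave-speed estimates are the simple Davis bounds rather than whatever the paper uses.

```python
# Minimal sketch of an HLL approximate Riemann flux for the 1-D shallow
# water equations, U = (h, hu). A toy building block, not the paper's code.
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def swe_flux(h, hu):
    """Physical flux F(U) for the shallow water equations."""
    u = hu / h
    return np.array([hu, hu * u + 0.5 * G * h * h])

def hll_flux(hL, huL, hR, huR):
    """HLL numerical flux between left and right cell states."""
    uL, uR = huL / hL, huR / hR
    cL, cR = np.sqrt(G * hL), np.sqrt(G * hR)
    # Davis estimates: fastest left- and right-going signal speeds.
    sL = min(uL - cL, uR - cR)
    sR = max(uL + cL, uR + cR)
    FL, FR = swe_flux(hL, huL), swe_flux(hR, huR)
    if sL >= 0.0:          # all waves move right: upwind on the left state
        return FL
    if sR <= 0.0:          # all waves move left: upwind on the right state
        return FR
    UL, UR = np.array([hL, huL]), np.array([hR, huR])
    return (sR * FL - sL * FR + sL * sR * (UR - UL)) / (sR - sL)

# Example interface: still water meeting a faster, shallower stream.
print(hll_flux(2.0, 0.0, 1.0, 1.5))
```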