170 result(s) for "Transport theory Data processing."
Atomistic simulation of quantum transport in nanoelectronic devices
"Computational nanoelectronics is an emerging multi-disciplinary field covering condensed matter physics, applied mathematics, computer science, and electronic engineering. In recent decades, a few state-of-the-art software packages have been developed to carry out first-principles atomistic device simulations. Nevertheless, those packages are either black boxes (commercial codes) or accessible only to a very limited set of users (private research codes). The purpose of this book is to open one of the commercial black boxes and to demonstrate the complete procedure from theoretical derivation, to numerical implementation, all the way to device simulation. Meanwhile, the affiliated source code constitutes an open platform for new researchers. This is the first book of its kind. We hope the book will make a modest contribution to the field of computational nanoelectronics." -- Provided by publisher.
Mastering Wireshark 2
Wireshark, often used alongside tools such as Kali Linux and Metasploit, works from the second to the seventh layer of the network protocol stack. The book introduces various protocol-analysis methods and teaches you how to analyze them. You will discover and work with advanced features that enhance the capabilities of your application. By the end, you…
Emulating spin transport with nonlinear optics, from high-order skyrmions to the topological Hall effect
Exploring material magnetization led to countless fundamental discoveries and applications, culminating in the field of spintronics. Recently, research efforts in this field have focused on magnetic skyrmions – topologically robust chiral magnetization textures, capable of storing information and routing spin currents via the topological Hall effect. In this article, we propose an optical system emulating any 2D spin-transport phenomenon with unprecedented controllability, by employing three-wave mixing in 3D nonlinear photonic crystals. Precise photonic crystal engineering, as well as active all-optical control, enable the realization of effective magnetization textures beyond the limits of thermodynamic stability in current materials. As a proof-of-concept, we theoretically design skyrmionic nonlinear photonic crystals with arbitrary topologies and propose an optical system exhibiting the topological Hall effect. Our work paves the way towards quantum spintronics simulations and novel optoelectronic applications inspired by spintronics, for both classical and quantum optical information processing. Control of effective magnetization textures like skyrmions is limited by the thermodynamic stability in current materials. Here, the authors propose a 3D nonlinear photonic crystal to emulate 2D spin transport phenomena with excellent controllability.
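The topological robustness invoked in this abstract is captured by the skyrmion number Q = (1/4π) ∫ m · (∂m/∂x × ∂m/∂y) dx dy, which is ±1 for a single skyrmion. A minimal numerical check on a textbook Néel-skyrmion texture (the profile, radius, and grid are illustrative assumptions, unrelated to the article's photonic-crystal design):

```python
import numpy as np

def skyrmion_number(m, dx, dy):
    """Topological charge Q = (1/4pi) * integral of m . (dm/dx x dm/dy)
    for a unit vector field m of shape (3, ny, nx); |Q| = 1 for a
    single skyrmion."""
    dmx = np.gradient(m, dx, axis=2)   # d m / d x
    dmy = np.gradient(m, dy, axis=1)   # d m / d y
    density = np.sum(m * np.cross(dmx, dmy, axis=0), axis=0)
    return density.sum() * dx * dy / (4 * np.pi)

# Illustrative Neel skyrmion of radius 1 on a [-20, 20]^2 grid.
n = 400
xs = np.linspace(-20.0, 20.0, n)
dx = xs[1] - xs[0]
X, Y = np.meshgrid(xs, xs)
r = np.hypot(X, Y)
theta = 2.0 * np.arctan2(1.0, r)       # pi at the core, -> 0 far away
phi = np.arctan2(Y, X)
m = np.array([np.sin(theta) * np.cos(phi),
              np.sin(theta) * np.sin(phi),
              np.cos(theta)])
Q = skyrmion_number(m, dx, dx)
```

Because the texture wraps the unit sphere exactly once, |Q| comes out numerically close to 1; smooth deformations of the texture leave it unchanged, which is the robustness the abstract refers to.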
New Era of Air Quality Monitoring from Space
The Geostationary Environment Monitoring Spectrometer (GEMS) is scheduled for launch in February 2020 to monitor air quality (AQ) at an unprecedented spatial and temporal resolution from a geostationary Earth orbit (GEO) for the first time. With the development of UV–visible spectrometers at sub-nm spectral resolution and sophisticated retrieval algorithms, estimates of the column amounts of atmospheric pollutants (O₃, NO₂, SO₂, HCHO, CHOCHO, and aerosols) can be obtained. To date, all the UV–visible satellite missions monitoring air quality have been in low Earth orbit (LEO), allowing one to two observations per day. With UV–visible instruments on GEO platforms, the diurnal variations of these pollutants can now be determined. Details of the GEMS mission are presented, including instrumentation, scientific algorithms, predicted performance, and applications for air quality forecasts through data assimilation. GEMS will be on board the Geostationary Korea Multi-Purpose Satellite 2 (GEO-KOMPSAT-2) satellite series, which also hosts the Advanced Meteorological Imager (AMI) and Geostationary Ocean Color Imager 2 (GOCI-2). These three instruments will provide synergistic science products to better understand air quality, meteorology, the long-range transport of air pollutants, emission source distributions, and chemical processes. Faster sampling rates at higher spatial resolution will increase the probability of finding cloud-free pixels, leading to more observations of aerosols and trace gases than is possible from LEO. GEMS will be joined by NASA's Tropospheric Emissions: Monitoring of Pollution (TEMPO) and ESA's Sentinel-4 to form a GEO AQ satellite constellation in the early 2020s, coordinated by the Committee on Earth Observation Satellites (CEOS).
The first 1-year-long estimate of the Paris region fossil fuel CO2 emissions based on atmospheric inversion
The ability of a Bayesian atmospheric inversion to quantify the Paris region's fossil fuel CO2 emissions on a monthly basis, based on a network of three surface stations operated for one year as part of the CO2-MEGAPARIS experiment (August 2010–July 2011), is analysed. Differences in hourly CO2 atmospheric mole fractions between the near-ground monitoring sites (CO2 gradients), located at the north-eastern and south-western edges of the urban area, are used to estimate the 6 h mean fossil fuel CO2 emission. The inversion relies on the CHIMERE transport model run at 2 km × 2 km horizontal resolution, on the spatial distribution of fossil fuel CO2 emissions in 2008 from a local inventory established at 1 km × 1 km horizontal resolution by the AIRPARIF air quality agency, and on the spatial distribution of the biogenic CO2 fluxes from the C-TESSEL land surface model. It corrects a prior estimate of the 6 h mean budgets of the fossil fuel CO2 emissions given by the AIRPARIF 2008 inventory. We found that a stringent selection of CO2 gradients is necessary for reliable inversion results, due to large modelling uncertainties. In particular, the most robust data selection analysed in this study uses only mid-afternoon gradients, if wind speeds are larger than 3 m s⁻¹ and if the modelled wind at the upwind site is within ±15° of the transect between the downwind and upwind sites. This stringent data selection removes 92% of the hourly observations. Even though this leaves few remaining data to constrain the emissions, the inversion system diagnoses that their assimilation significantly reduces the uncertainty in monthly emissions: from 9% in November 2010 to 50% in October 2010. The inverted monthly mean emissions correlate well with independent monthly mean air temperature. Furthermore, the inverted annual mean emission is consistent with the independent revision of the AIRPARIF inventory for the year 2010, which corresponds to the measurement period better than the 2008 inventory does.
Several tests of the inversion's sensitivity to prior emission estimates, to the assumed spatial distribution of the emissions, and to the atmospheric transport modelling demonstrate the robustness of the measurement constraint on inverted fossil fuel CO2 emissions. The results, however, show significant sensitivity to the description of the emissions' spatial distribution in the inversion system, demonstrating the need to rely on high-resolution local inventories such as that from AIRPARIF. Although the inversion constrains emissions through the assimilation of CO2 gradients, the results are hampered by the improperly modelled influence of remote CO2 fluxes when air masses originate from urbanised and industrialised areas north-east of Paris. The drastic data selection used in this study limits the ability to continuously monitor Paris fossil fuel CO2 emissions: the inversion results for specific months such as September or November 2010 are poorly constrained by too few CO2 measurements. The high sensitivity of the inverted emissions to the prior emissions' diurnal variations highlights the limitations induced by assimilating data only during the afternoon. Furthermore, even though the inversion improves the seasonal variation and the annual budget of the city's emissions, the assimilation of data during a limited number of suitable days does not necessarily yield robust estimates for individual months. These limitations could be overcome through a refinement of the data processing for a wider data selection, and through the expansion of the observation network.
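The stringent selection rule described in this abstract (mid-afternoon gradients only, wind speed above 3 m s⁻¹, modelled wind within ±15° of the up/downwind transect) amounts to a simple mask over the hourly records. A sketch, with the field names and the exact afternoon window as assumptions:

```python
import numpy as np

def select_gradients(hour, wind_speed, wind_dir, transect_dir,
                     afternoon=(12, 17), min_speed=3.0, max_dev=15.0):
    """Boolean mask implementing the stringent selection described in
    the abstract: mid-afternoon hours only, wind speed > 3 m/s, and
    wind direction within +/-15 degrees of the station transect.
    Field names and the afternoon window are illustrative assumptions."""
    hour = np.asarray(hour)
    # Smallest angular deviation between wind and transect, in degrees.
    dev = np.abs((np.asarray(wind_dir) - transect_dir + 180.0) % 360.0 - 180.0)
    return ((hour >= afternoon[0]) & (hour <= afternoon[1])
            & (np.asarray(wind_speed) > min_speed)
            & (dev <= max_dev))
```

Applied to a year of hourly records, a mask of this kind keeps only a few percent of the data, consistent with the 92% rejection rate the study reports.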
Novel applications of intelligent computing paradigms for the analysis of nonlinear reactive transport model of the fluid in soft tissues and microvessels
This article presents a methodology to solve a one-dimensional steady-state nonlinear reactive transport model (RTM) for fluid and solute transport in soft tissues and microvessels. The methodology integrates artificial neural networks (ANNs), genetic algorithms (GAs), and pattern search (PS), aided by the active-set technique (AST) and the interior-point technique (IPT). The RTM is represented as a nonlinear second-order boundary value problem in ordinary differential equations. ANN modelling of the governing expression of the RTM is used to form a fitness function in the mean-square sense, and optimization solvers based on GA, PS, GA-AST, GA-IPT, PS-AST, and PS-IPT are used for viable learning of the weights. The proposed techniques are applied to different nonlinear RTMs with varying characteristic reaction rates and half-saturation concentrations. The proposed stochastic numerical solutions are compared with state-of-the-art solvers to check accuracy and convergence, based on a sufficiently large number of runs of the algorithms.
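The mean-square fitness construction described here can be sketched on a toy problem: below, the second-order BVP u'' = u with u(0) = 0, u(1) = 1 stands in for the RTM, the candidate solution is a plain grid vector rather than the paper's ANN trial solution, and the fitness is the mean squared ODE residual plus boundary penalties; a GA or PS solver would then minimize this function over the vector.

```python
import numpy as np

def make_fitness(n=21):
    """Mean-square fitness for the toy BVP u'' = u, u(0)=0, u(1)=1,
    discretized on n grid points (a finite-difference stand-in for
    the ANN trial solutions used in the article)."""
    x = np.linspace(0.0, 1.0, n)
    h = x[1] - x[0]

    def fitness(u):
        u = np.asarray(u, dtype=float)
        # Interior residual of u'' - u = 0 via central differences.
        res = (u[:-2] - 2.0 * u[1:-1] + u[2:]) / h**2 - u[1:-1]
        # Penalty terms enforce the boundary conditions.
        return np.mean(res**2) + u[0]**2 + (u[-1] - 1.0)**2

    return fitness
```

The exact solution of this toy problem, u(x) = sinh(x)/sinh(1), scores near zero, while any vector violating the equation or the boundary conditions scores higher, which is what lets a population-based optimizer steer toward the solution.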
Matching code and law: achieving algorithmic fairness with optimal transport
Increasingly, discrimination by algorithms is perceived as a societal and legal problem. As a response, a number of criteria for implementing algorithmic fairness in machine learning have been developed in the literature. This paper proposes the continuous fairness algorithm (CFAθ) which enables a continuous interpolation between different fairness definitions. More specifically, we make three main contributions to the existing literature. First, our approach allows the decision maker to continuously vary between specific concepts of individual and group fairness. As a consequence, the algorithm enables the decision maker to adopt intermediate “worldviews” on the degree of discrimination encoded in algorithmic processes, adding nuance to the extreme cases of “we’re all equal” and “what you see is what you get” proposed so far in the literature. Second, we use optimal transport theory, and specifically the concept of the barycenter, to maximize decision maker utility under the chosen fairness constraints. Third, the algorithm is able to handle cases of intersectionality, i.e., of multi-dimensional discrimination of certain groups on grounds of several criteria. We discuss three main examples (credit applications; college admissions; insurance contracts) and map out the legal and policy implications of our approach. The explicit formalization of the trade-off between individual and group fairness allows this post-processing approach to be tailored to different situational contexts in which one or the other fairness criterion may take precedence. Finally, we evaluate our model experimentally.
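In one dimension, a Wasserstein barycenter reduces to an average of quantile functions, so the continuous interpolation between raw scores (the individual-fairness end) and a common barycenter (the group-fairness end) can be sketched in a few lines. The function below is an illustrative stand-in, not the paper's CFAθ, and the uniform group weights are an assumption:

```python
import numpy as np

def partial_repair(scores_by_group, theta):
    """Move each group's empirical score distribution a fraction
    `theta` of the way toward the 1D Wasserstein barycenter.

    theta = 0 keeps the raw scores (no repair); theta = 1 maps every
    group onto the common barycenter (full group fairness)."""
    qs = np.linspace(0.0, 1.0, 101)  # common quantile grid
    # Quantile function of each group's score distribution.
    group_q = {g: np.quantile(s, qs) for g, s in scores_by_group.items()}
    # In 1D the uniform-weight barycenter is the average quantile function.
    bary_q = np.mean(list(group_q.values()), axis=0)
    repaired = {}
    for g, s in scores_by_group.items():
        s = np.asarray(s, dtype=float)
        # Rank of each individual within their own group.
        ranks = np.searchsorted(np.sort(s), s, side="right") / len(s)
        target = np.interp(ranks, qs, bary_q)  # barycenter score at same rank
        repaired[g] = (1.0 - theta) * s + theta * target
    return repaired
```

At theta = 1, individuals with the same within-group rank receive the same score regardless of group, while the ordering inside each group is preserved, which is the flavor of trade-off the paper formalizes.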
Experimentally achieving minimal dissipation via thermodynamically optimal transport
Optimal transport theory, originally developed in the 18th century for civil engineering, has since become a powerful optimization framework across disciplines, from generative AI to cell biology. In physics, it has recently been shown to set fundamental bounds on thermodynamic dissipation in finite-time processes. This extends beyond the conventional second law, which guarantees zero dissipation only in the quasi-static limit and cannot characterize the inevitable dissipation in finite-time processes. Here, we experimentally realize thermodynamically optimal transport using optically trapped microparticles, achieving minimal dissipation within a finite time. As an application to information processing, we implement the optimal finite-time protocol for information erasure, confirming that the excess dissipation beyond the Landauer bound is exactly determined by the Wasserstein distance, a fundamental geometric quantity in optimal transport theory. Furthermore, our experiment achieves the bound governing the trade-off between speed, dissipation, and accuracy in information erasure. To enable precise control of microparticles, we develop scanning optical tweezers capable of generating arbitrary potential profiles. These results provide guiding principles for information processing that saturates the trade-off. Optimal transport theory has been applied across diverse fields including nonequilibrium thermodynamics. The authors present an experimental realization of optimal transport of microparticles, minimizing energy cost within finite time, and optimal finite-time information erasure.
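The Wasserstein distance that sets the excess-dissipation bound has a closed form in one dimension: the optimal transport plan matches sorted samples, so the order-2 distance is the root-mean-square gap between order statistics. A small sketch of that geometric quantity (not of the experimental protocol):

```python
import numpy as np

def wasserstein2_1d(x, y):
    """Order-2 Wasserstein distance between two equal-size 1D samples.

    In 1D the optimal transport plan simply matches sorted samples,
    so W2 is the root mean squared difference of the order statistics."""
    x, y = np.sort(x), np.sort(y)
    if len(x) != len(y):
        raise ValueError("expects equal-size samples")
    return np.sqrt(np.mean((x - y) ** 2))
```

As a sanity check, translating a whole sample by a constant c gives W2 = c, reflecting that the distance measures how far probability mass must physically move, which is exactly why it bounds dissipation in finite-time transport.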
Are Trends of Gulf Stream Transport Uniform Along the Florida Shelf?
The Gulf Stream and Yucatán Current are key components of the Atlantic Ocean's circulation system. Although the Florida Current at 27°N has been continuously monitored for over 40 years, spatial variability in transport along the broader Florida Shelf remains less explored. Three decades (1993–2023) of satellite altimetry allow assessment of surface transport trends across four primary transects from the Yucatán Channel to 30°N, and along 32 supplementary sections tracking the Gulf Stream path. Bayesian regression reveals negative trends upstream and quasi‐stable behavior downstream (off North Florida). Wavelet analysis identifies dominant seasonal to decadal variability, with the strongest coherence within the straits. Model (HYbrid Coordinate Ocean Model) output confirms spatially heterogeneous trends and downstream stabilization of volume transport. These findings suggest that trends of the Gulf Stream are spatially non‐uniform, involving apparent transport reductions balanced by downstream compensation. Thus, it is necessary to resolve along‐stream variability when assessing long‐term changes in western boundary current systems.
Machine Learning for Protein Subcellular Localization Prediction
Comprehensively covers protein subcellular localization from single-label prediction to multi-label prediction, and includes prediction strategies for virus, plant, and eukaryote species. Three machine learning tools are introduced to improve classification refinement, feature extraction, and dimensionality reduction.