Filters:
  • Discipline
  • Is Peer Reviewed
  • Series Title
  • Reading Level
  • Year (From / To)
  • More Filters: Content Type, Item Type, Is Full-Text Available, Subject, Country of Publication, Publisher, Source, Target Audience, Donor, Language, Place of Publication, Contributors, Location
1,353,730 result(s) for "Joe"
The Onion book of known knowledge : a definitive encyclopaedia of existing information in 27 excruciating volumes
An encyclopedia of all worldly facts in existence, and the last book one need ever purchase. Well, no... really, it's an encyclopedic collection of satirical commentary on world events, human behavior, and journalistic convention, full of the Onion's typical surreal wit.
DEA under big data: data enabled analytics and network data envelopment analysis
This paper proposes that data envelopment analysis (DEA) should be viewed as a method (or tool) for data-oriented analytics in performance evaluation and benchmarking. While computational algorithms have been developed to deal with large volumes of data (decision-making units, inputs, and outputs) under conventional DEA, valuable information hidden in big data represented by network structures should also be extracted by DEA. These network structures, e.g., transportation and logistics systems, encompass a broader range of inter-linked metrics that cannot be modelled by conventional DEA. It is proposed that network DEA is related to the value dimension of big data. It is shown that network DEA differs from standard DEA, although it bears the name of DEA and some similarity with the conventional approach. Network DEA is big-data-enabled analytics (big DEA) when multiple (performance) metrics or attributes are linked through network structures that are too large or complex to be dealt with by conventional DEA. Unlike conventional DEA, which is solved via linear programming, general network DEA corresponds to nonconvex optimization problems. This represents an opportunity to develop techniques for solving non-linear network DEA models. Areas such as transportation and logistics systems, as well as supply chains, have great potential to use network DEA in big data modeling.
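The conventional linear-programming form of DEA contrasted with network DEA above can be sketched in a few lines. The example below solves the standard input-oriented CCR envelopment model with SciPy; the function name and the small data set are illustrative, not taken from the paper (which concerns the nonconvex network extension).

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency score of DMU `o`.
    X: (m, n) input matrix, Y: (s, n) output matrix; columns are DMUs."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                # minimise theta; vars = [theta, lambda]
    A_in = np.hstack([-X[:, [o]], X])          # sum_j lam_j x_ij <= theta * x_io
    A_out = np.hstack([np.zeros((s, 1)), -Y])  # sum_j lam_j y_rj >= y_ro
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun  # efficiency score in (0, 1]

# Illustrative data: two inputs, one output, four decision-making units.
X = np.array([[2.0, 4.0, 8.0, 4.0],
              [4.0, 2.0, 2.0, 8.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])
scores = [ccr_efficiency(X, Y, j) for j in range(X.shape[1])]
```

The first two units lie on the efficient frontier (score 1), while the last is radially dominated and scores 0.5; a network DEA model would replace this single LP with a nonconvex problem linking several such stages.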
Strange weather : four short novels
\"Snapshot\" is the disturbing story of a Silicon Valley adolescent who finds himself threatened by \"The Phoenician,\" a tattooed thug who possesses a Polaroid Instant Camera that erases memories, snap by snap.
A Quantitative Survey of Local Adaptation and Fitness Trade‐Offs
The long history of reciprocal transplant studies testing the hypothesis of local adaptation has shown that populations are often adapted to their local environments. Yet many studies have not demonstrated local adaptation, suggesting that sometimes native populations are no better adapted than are genotypes from foreign environments. Local adaptation may also lead to trade‐offs, in which adaptation to one environment comes at a cost of adaptation to another environment. I conducted a survey of published studies of local adaptation to quantify its frequency and magnitude and the costs associated with local adaptation. I also quantified the relationship between local adaptation and environmental differences and the relationship between local adaptation and phenotypic divergence. The overall frequency of local adaptation was 0.71, and the magnitude of the native population advantage in relative fitness was 45%. Divergence between home site environments was positively associated with the magnitude of local adaptation, but phenotypic divergence was not. I found a small negative correlation between a population’s relative fitness in its native environment and its fitness in a foreign environment, indicating weak trade‐offs associated with local adaptation. These results suggest that populations are often locally adapted but stochastic processes such as genetic drift may limit the efficacy of divergent selection.
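The two summary statistics reported above, the frequency of local adaptation and the magnitude of the native advantage in relative fitness, can be illustrated on toy reciprocal-transplant data. The numbers below are hypothetical, not the survey's dataset.

```python
import numpy as np

# Hypothetical reciprocal-transplant results: at each test site, mean
# fitness of the native population vs. a foreign population transplanted in.
# (Illustrative numbers only, not the survey's actual data.)
native  = np.array([5.2, 3.1, 4.0, 2.4, 6.0])
foreign = np.array([3.8, 3.4, 2.9, 1.6, 4.1])

# Frequency of local adaptation: fraction of sites where natives outperform.
frequency = np.mean(native > foreign)

# Magnitude: mean proportional advantage of natives in relative fitness.
advantage = np.mean((native - foreign) / foreign)
```

On these toy numbers natives win at 4 of 5 sites (frequency 0.8) with a mean advantage of roughly 32%, analogous to the survey's reported 0.71 and 45%.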
An apple a day : myths, misinformation, and facts about the food we eat
This book does not claim to be an encyclopedia of nutrition or a comprehensive guide to healthy eating; rather, it offers a framework for sound thinking about nutrition and a perspective on what truly deserves our concern when we serve ourselves that mixture of molecules we call food. Nutrition matters a great deal, and the challenge lies in separating the wheat from the chaff and in drawing some practical conclusions about what to eat, based not on hearsay but on sound science. People differ in their tastes when it comes to choosing food: some care about the nutritional merits of particular foods, others are fascinated by what antioxidants do, while still others worry about the safety of food additives.
Next-generation prediction metrics for composite-based PLS-SEM
Purpose: The purpose of this study is to provide an overview of emerging prediction assessment tools for composite-based PLS-SEM, particularly proposed out-of-sample prediction methodologies. Design/methodology/approach: A review of recently developed out-of-sample prediction assessment tools for composite-based PLS-SEM that will expand the skills of researchers and inform them of new methodologies for improving the evaluation of theoretical models. Recently developed and proposed cross-validation approaches for model comparison and benchmarking are reviewed and evaluated. Findings: The results summarize next-generation prediction metrics that will substantially improve researchers' ability to assess and report the extent to which their theoretical models provide meaningful predictions. Improved prediction assessment metrics are essential to justify (practical) implications and recommendations developed on the basis of theoretical model estimation results. Originality/value: The paper provides an overview of recently developed and proposed out-of-sample prediction metrics for composite-based PLS-SEM that will enhance the ability of researchers to demonstrate generalization of their findings from sample data to the population.
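The cross-validation logic behind such out-of-sample metrics can be sketched generically. The snippet below estimates k-fold out-of-sample RMSE for a simple OLS stand-in model and compares it against a naive mean benchmark, in the spirit of PLSpredict-style assessment; it is an illustration under simplifying assumptions, not the actual PLS-SEM procedure, and all names are hypothetical.

```python
import numpy as np

def kfold_oos_rmse(X, y, k=5, seed=0):
    """Out-of-sample RMSE of an OLS proxy model and a naive mean benchmark,
    estimated by k-fold cross-validation. A real PLS-SEM assessment would
    re-estimate the composite model on each training fold instead."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, k)
    err_model, err_naive = [], []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        Xtr = np.c_[np.ones(len(train)), X[train]]   # add intercept column
        Xte = np.c_[np.ones(len(test)), X[test]]
        beta, *_ = np.linalg.lstsq(Xtr, y[train], rcond=None)
        err_model.append((Xte @ beta - y[test]) ** 2)
        err_naive.append((y[train].mean() - y[test]) ** 2)
    rmse = lambda e: float(np.sqrt(np.concatenate(e).mean()))
    return rmse(err_model), rmse(err_naive)

# Synthetic data: a model with predictive validity should beat the benchmark.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.5, size=200)
model_rmse, naive_rmse = kfold_oos_rmse(X, y)
```

Comparing the model's out-of-sample error against a naive benchmark is the core idea of the prediction metrics surveyed: a model that cannot beat the benchmark out of sample offers little basis for practical recommendations.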
Anomalous Z′ bosons for anomalous B decays
Motivated by the intriguing discrepancies in b → sℓℓ transitions, the fermion mass problem, and a desire to preserve the accidental symmetries of the Standard Model (SM), we extend the SM by an anomalous U(1)_X gauge symmetry, where X = Y_3 + a(L_μ − L_τ)/6. The heavy Z′ boson associated with spontaneously breaking U(1)_X at the TeV scale mediates the b → sℓℓ anomalies via O_9^μ ~ (1/Λ²)(s̄ γ_ρ P_L b)(μ̄ γ^ρ μ). We show that this model, which features mixed gauge anomalies involving U(1)_X and hypercharge, can be made anomaly-free for any a ∈ ℤ by integrating in a pair of charged fermions whose masses naturally reside somewhere between 1 and 30 TeV. The gauge symmetry permits only the third-family Yukawas at the renormalisable level, and so the light quark masses and mixings are controlled by accidental U(2)³ flavour symmetries, which we assume are minimally broken alongside U(1)_X. The lepton sector is not governed by U(2) symmetries, but rather one expects a nearly diagonal charged-lepton Yukawa with m_{e,μ} ≪ m_τ. The model does not explain the hierarchy m_e ≪ m_μ, but it does possess high-quality lepton flavour symmetries that are robust to the heavy physics responsible for generating m_{e,μ}. We establish the viability of these models by checking agreement with the most important experimental constraints. We comment on how the model could also explain neutrino masses and the muon g − 2.
A detailed introduction to density-based topology optimisation of fluid flow problems with implementation in MATLAB
This article presents a detailed introduction to density-based topology optimisation of fluid flow problems. The goal is to allow new students and researchers to get started quickly in the research area and to skip many of the initial steps, which often consume time that would be better spent on the scientific advancement of the field. This is achieved by providing a step-by-step guide to the components necessary to understand and implement the theory, as well as to extend the supplied MATLAB code. The continuous design representation used, and how it is connected to the Brinkman penalty approach for simulating an immersed solid in a fluid domain, are illustrated. The different interpretations of the Brinkman penalty term and how to choose the penalty parameters are explained. The accuracy of the Brinkman penalty approach is analysed through parametric simulations of a reference geometry. The chosen finite element formulation and the solution method are explained. The minimum-dissipated-energy optimisation problem is defined, and how to solve it using an optimality criteria solver and a continuation scheme is discussed. The included MATLAB implementation is documented, with details on the mesh, pre-processing, optimisation and post-processing. The code has two benchmark examples implemented, and its application to these is reviewed. Subsequently, several modifications for more complicated examples are presented through provided code modifications and explanations. Lastly, the computational performance of the code is examined through studies of computational time and memory usage, along with recommendations for decreasing computational time through approximations.
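The Brinkman penalty approach described above adds a term α(ρ)·u to the momentum equation, with α interpolated between fluid and solid limits by the design field ρ. A minimal sketch of the standard Borrvall-Petersson-style interpolation follows (in Python rather than the article's MATLAB); the parameter values are illustrative, not those used in the article's code.

```python
import numpy as np

def brinkman_alpha(rho, alpha_min=0.0, alpha_max=1e4, q=0.1):
    """Interpolate the Brinkman penalty coefficient from the design field.
    rho = 1 is fluid (alpha -> alpha_min), rho = 0 is solid (alpha -> alpha_max).
    q controls the convexity of the interpolation; larger q makes it closer
    to linear. Values here are illustrative defaults, not the article's."""
    rho = np.asarray(rho, dtype=float)
    return alpha_max + (alpha_min - alpha_max) * rho * (1.0 + q) / (rho + q)
```

In a density-based optimisation loop this coefficient multiplies the velocity in each element, so intermediate densities are penalised smoothly; a continuation scheme typically raises alpha_max (or lowers q) as the design converges.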