Search Results

1,003,068 result(s) for "model analysis"
A quantitative analysis on policies of China’s fuel cell electric vehicle industry
Fuel cell electric vehicles (FCEVs) are still in the early phases of development and are largely shaped by policy measures. However, the current support policies have not effectively fostered the growth of the FCEV industry, and most existing research is based on qualitative analysis, which falls short in tackling this issue. This study developed a policy evaluation model that combines the analytic hierarchy process and content analysis (AHP-CA) to assess the differences between ideal and actual policy distributions, grounded in AHP and CA theories. Furthermore, a CA-industry chain analysis model and a CA-value chain analysis model were constructed. These models use a two-dimensional matrix to evaluate how policies are distributed across the industry chain and value chain. The results show that, in terms of policy types, preferential fiscal and tax policies exhibited a large negative deviation (-0.29), while comprehensive macro-policies exhibited a large positive deviation (0.17). In terms of policy instruments, demand-oriented instruments demonstrated a large negative deviation (-0.09), while environment-oriented ones demonstrated a large positive deviation (0.06). In terms of specific policy measures, a large negative deviation was observed in government procurement (-0.15) and a large positive deviation in policy strategies (0.13). From the perspective of the industry chain and value chain, the supply side still needs to be strengthened. It is proposed to enhance preferential fiscal and tax policies to further address the economic barriers in the early stages of FCEV industry development. Specifically, a demand-oriented approach should be adopted, with government procurement as the main measure. Additionally, efforts should be made to guide the improvement of supply-side capabilities and to strengthen the demonstration and guidance of private consumption.
Future research will focus on ways to classify policies and policy-making mechanisms that can adapt to changes over time.
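The analytic hierarchy process at the core of the AHP-CA model derives criterion weights from a pairwise-comparison matrix. As a minimal sketch of that step, with a hypothetical 3-criterion matrix rather than the paper's data, the weights can be taken from the normalised principal eigenvector:

```python
import numpy as np

def ahp_weights(pairwise):
    """AHP criterion weights: normalised principal eigenvector of the
    pairwise-comparison matrix."""
    A = np.asarray(pairwise, dtype=float)
    vals, vecs = np.linalg.eig(A)
    principal = np.abs(vecs[:, np.argmax(vals.real)].real)
    return principal / principal.sum()

# Hypothetical judgements on Saaty's 1-9 scale: criterion 1 is 3x as
# important as criterion 2 and 5x as important as criterion 3, etc.
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w = ahp_weights(A)  # weights sum to 1, ordered by stated importance
```

In a full AHP study one would also check the consistency ratio of the matrix before accepting the weights.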
The Phenomenon and Development of K-Pop: The Relationship between Success Factors of K-Pop and the National Image, Social Network Service Citizenship Behavior, and Tourist Behavioral Intention
This study aims to understand the impact of six success factors of K-pop on the national image of Korea perceived by global viewers and on SNS citizenship behavior. In addition, it seeks to validate the impact of the national image of Korea and SNS citizenship behavior, as defined above, on tourist behavioral intention. The analysis was conducted within the theoretical frameworks of SERVQUAL, Image Theory, and the Theory of Planned Behavior. To that end, 1247 global viewers (from eight countries) who had listened to, watched, and searched for information on K-pop were surveyed. Four statistical programs (SPSS/SmartPLS/GSCA Pro/JASP) were used for regression analysis and structural equation modeling. The results indicate the following. (1) Four K-pop success factors (producers, casting, producing/promotion, and contents) demonstrate a statistically significant positive influence on national image. (2) Two K-pop success factors (casting and producing/promotion) demonstrate a statistically significant positive influence on SNS citizenship behavior; K-pop content also has a statistically significant positive influence on SNS citizenship behavior (in SPSS only). (3) National image has a statistically significant positive influence on SNS citizenship behavior, and national image and SNS citizenship behavior have a statistically significant positive influence on tourist behavioral intention. As an exemplary cultural product, K-pop is creating economic added value. It is necessary to establish measures to integrate K-pop into product planning and PR for sustainable Hallyu tourism marketing.
A Method to Construct an Environmental Vulnerability Model Based on Multi-Source Data to Evaluate the Hazard of Short-Term Precipitation-Induced Flooding
Flood hazards resulting from short-term severe precipitation have caused serious social and economic losses and have posed extraordinary threats to the safety of lives and property. Vulnerability, which reflects the degree of the adverse impact of flooding on a city, the sensitivity of the environment, and the extent to which rescues are possible during flooding, is one of the significant factors in disaster risk assessment. Accordingly, this paper proposes an Environmental Vulnerability Analysis Model (EVAM) based on a comprehensive evaluation of multi-source remote sensing data. The EVAM comprises a two-stage, short-term flood vulnerability assessment. In the first stage, the flood's areal extent and land-use classification are extracted from multi-source satellite remote sensing images using the U-NET++ network. The results of the first stage feed the second stage, in which multi-source data are combined with the extracted features to establish an Exposure-Sensitivity-Adaptive capacity framework, and the short-term flood vulnerability index is calculated using the analytic hierarchy process (AHP) and the entropy method for environmental vulnerability evaluation. This proposed framework for short-term flood vulnerability evaluation is demonstrated for Henan Province. The experimental results show that the proportions of cities in Henan Province at each vulnerability level, from high to low, are 22.22%, 22.22%, 38.89%, and 16.67%, respectively. These conclusions can provide a scientific basis for regional flood control and risk management, as well as data support for post-disaster reconstruction in disaster regions.
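The entropy method used alongside AHP in the second stage weights each indicator by how much it varies across regions. A minimal sketch, with an illustrative score matrix rather than the study's data:

```python
import numpy as np

def entropy_weights(X):
    """Entropy-method weights for an (m regions x n indicators) matrix of
    non-negative scores: indicators that vary more across regions carry
    more information and receive larger weights."""
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0)                         # each region's share per indicator
    plogp = np.where(P > 0, P * np.log(np.where(P > 0, P, 1.0)), 0.0)
    e = -plogp.sum(axis=0) / np.log(X.shape[0])   # entropy, in [0, 1]
    d = 1.0 - e                                   # degree of divergence
    return d / d.sum()

# Illustrative scores for 3 regions on 2 indicators: the first indicator
# is identical everywhere (uninformative), the second varies strongly.
w = entropy_weights([[1, 1],
                     [1, 5],
                     [1, 9]])
```

The uninformative first indicator receives a weight near zero, while the varying second indicator absorbs almost all the weight.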
What Will the European Climate Look Like in the Future? A Climate Analog Analysis Accounting for Dependencies Between Variables
Increasing society's awareness of climate change by explaining its impacts in a simplified way may be one of the key elements of adaptation to and mitigation of its possible effects. This study investigates climate analogs, which make it possible to find, today, a place on land where climatic conditions are similar to those that a specific area will face in the future. The grid-based calculation of analogs over the selected European domain was carried out using a newly proposed distance between multivariate distributions, the Wasserstein distance, which has not previously been used for climate analog calculations. By operating on the whole multivariate distributions, the Wasserstein distance allows us to account for dependencies between the variables of interest and for the shape of their distributions. Its features are compared with the Euclidean and Mahalanobis distances, which have been the most widely used methods to date. A multi-model climate analog analysis is performed between the reference period 1981-2010 and three future periods, 2011-2040, 2041-2070, and 2071-2100, for seasonal temperatures (mean, min, and max) and precipitation, from five different climate models and three different socio-economic scenarios. The agreement between climate models on the location and degree of similarity of the best analogs decreases as warming intensifies and/or as time approaches the end of the century. As the climate warms, the similarity between future and current climatic conditions gradually decreases, and the spatial (geographical) distance between a location and its best analog increases. Plain Language Summary: This study explores the concept of climate analogs, which can help us understand and prepare for future climate conditions. Climate analogs are places on Earth today that have a climate similar to what a specific area will experience in the future.
The study focuses on Europe and uses a new method, the Wasserstein distance, to calculate these analogs. This method takes into account the relationships between different climate variables. We analyze multiple climate models and emission scenarios over different time periods. The findings indicate that as we approach the end of the century, and as scenarios become more severe, the agreement between climate models on the best analogs decreases, although they point to similar geographical areas. Toward the end of the century, the similarity between future and current climate conditions will decline, and the distance between a location and its best analog will increase, making it more challenging to find suitable climate analogs. Overall, this study highlights the importance of understanding climate change impacts and of finding ways to adapt to and mitigate its effects through simplified explanations and climate analogs. Key Points: (1) The Wasserstein distance is used for the first time in climate analog calculations and compared with the Euclidean and Mahalanobis distances. (2) Europe's future analogs are today mostly located to the south of Europe, except for the Balkans, which must look east to find their analogs. (3) As the climate warms, it will become more difficult to find a proper analog, posing more challenges for planning adaptation.
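For equal-size empirical samples of a single variable, the 1-D Wasserstein distance reduces to the mean absolute difference of the sorted values. A toy sketch of ranking candidate analogs this way, using synthetic temperatures rather than the study's multivariate, multi-model setup:

```python
import numpy as np

def w1_distance(x, y):
    """1-D Wasserstein (earth mover's) distance between two equal-size
    empirical samples: mean absolute difference of the sorted values."""
    return float(np.mean(np.abs(np.sort(x) - np.sort(y))))

rng = np.random.default_rng(0)
# Hypothetical 30 years x 12 months of mean temperatures (deg C) for one
# grid cell's projected future climate and two present-day candidates.
future   = rng.normal(14.0, 2.0, size=360)
analog_a = rng.normal(13.8, 2.1, size=360)  # similar mean and spread
analog_b = rng.normal(8.0, 3.0, size=360)   # cooler, more variable
best = "A" if w1_distance(future, analog_a) < w1_distance(future, analog_b) else "B"
```

Because the distance compares whole distributions, candidate A wins on both its mean and its spread; the paper's multivariate version extends this to joint distributions over several climate variables.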
Study and Application of a Flood Control Risk Trend Analysis Model
To analyze the comprehensive risks of natural disasters quantitatively and improve the accuracy of natural disaster management and control, this paper adds an F (Forecast) indicator, covering real-time natural disaster monitoring and early-warning data, to the PSR framework, forming a flood control risk trend analysis framework named FPSR (Forecast-Pressure-State-Response), composed of static and dynamic data. By establishing a four-level index system for flood control risk trend analysis in Fangshan District of Beijing, screening factors, and using the analytic hierarchy process (AHP) and expert scoring to determine the weight of each factor, it constructs the flood control risk trend analysis model (FCRTAM). Finally, using real-time natural disaster monitoring and early-warning data for Beijing, together with information on disaster-causing factors, historical natural disasters, major hidden dangers, disaster-bearing bodies, and disaster reduction resources (capacities) from the National Natural Disaster Comprehensive Risk Census in Fangshan, it analyzes the flood control situation of each town in Fangshan. The results show that the flood control risk index calculated with FCRTAM is basically consistent with the actual flood control situation of the towns in Fangshan and can provide a theoretical basis for comprehensive flood control risk trend analysis and for disaster prevention and reduction decision-making in Fangshan District, giving it high practical value.
Limiting Ship Accidents by Identifying Their Causes and Determining Barriers to Application of Preventive Measures
When analyzing ship accidents, doubts may arise about whether appropriate countermeasures were taken to prevent known types of accidents. This study aimed to suggest possible solutions by investigating the status of, and issues associated with, the implementation of countermeasures using importance-performance analysis (IPA), Borich's needs assessment, and locus-for-focus models, based on previously identified causes of ship accidents. As a result, firstly, we confirmed the need to enhance education and training on specific knowledge, understanding, and proficiency (KUP) regarding ship stability, emergency response, and type-specific training. Secondly, we confirmed the need for a system to monitor a seafarer's KUP even while onboard a vessel, that is, after completion of the identified training. Additionally, it is necessary to improve seafarers' working environment, which is subject to regulations. Thirdly, difficulties in correcting practices that wrongly trade safety against efficiency, such as the costs associated with implementing safety regulations, were identified as the main reasons why some causes fell into the "not amended yet" category after accidents. Lastly, the tools employed in this analysis can be used to confirm the implementation status of actions to be taken after a ship accident.
Changepoint detection in the presence of outliers
Many traditional methods for identifying changepoints can struggle in the presence of outliers, or when the noise is heavy-tailed. Often they will infer additional changepoints in order to fit the outliers. To overcome this problem, data often needs to be pre-processed to remove outliers, though this is difficult for applications where the data needs to be analysed online. We present an approach to changepoint detection that is robust to the presence of outliers. The idea is to adapt existing penalised cost approaches for detecting changes so that they use loss functions that are less sensitive to outliers. We argue that loss functions that are bounded, such as the classical biweight loss, are particularly suitable, as we show that only bounded loss functions are robust to arbitrarily extreme outliers. We present an efficient dynamic programming algorithm that can find the optimal segmentation under our penalised cost criteria. Importantly, this algorithm can be used in settings where the data needs to be analysed online. We show that we can consistently estimate the number of changepoints, and accurately estimate their locations, using the biweight loss function. We demonstrate the usefulness of our approach for applications such as analysing well-log data, detecting copy number variation, and detecting tampering of wireless devices.
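The key idea, that a bounded loss caps each point's contribution so a single extreme outlier cannot force a spurious changepoint, can be illustrated with a brute-force single-changepoint search (a toy sketch under that assumption, not the paper's optimal dynamic programming algorithm):

```python
import numpy as np

K = 3.0  # residuals beyond K contribute a constant K**2 (bounded loss)

def bounded_loss(resid):
    """Biweight-style bounded loss: squared for small residuals, capped
    for large ones, so one outlier cannot dominate the segment cost."""
    return float(np.minimum(np.asarray(resid) ** 2, K ** 2).sum())

def one_changepoint(y):
    """Brute-force single-changepoint search minimising the total bounded
    loss around each segment's median."""
    best_tau, best_cost = None, np.inf
    for tau in range(2, len(y) - 1):
        left, right = y[:tau], y[tau:]
        cost = (bounded_loss(left - np.median(left))
                + bounded_loss(right - np.median(right)))
        if cost < best_cost:
            best_tau, best_cost = tau, cost
    return best_tau

# Mean shift from 0 to 5 at index 50, plus one extreme outlier at index
# 20: the bounded loss still locates the true changepoint.
y = np.concatenate([np.zeros(50), np.full(50, 5.0)])
y[20] = 100.0
tau = one_changepoint(y)
```

With squared loss the outlier at index 20 would pull the fit toward a spurious split; here its contribution is capped at `K**2`, so the search recovers the shift at index 50.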
Uncertainty Quantification for Computer Models With Spatial Output Using Calibration-Optimal Bases
The calibration of complex computer codes using uncertainty quantification (UQ) methods is a rich area of statistical methodological development. When applying these techniques to simulators with spatial output, it is now standard to use principal component decomposition to reduce the dimensions of the outputs in order to allow Gaussian process emulators to predict the output for calibration. We introduce the "terminal case," in which the model cannot reproduce observations to within model discrepancy, and for which standard calibration methods in UQ fail to give sensible results. We show that even when there is no such issue with the model, the standard decomposition on the outputs can and usually does lead to a terminal case analysis. We present a simple test to allow a practitioner to establish whether their experiment will result in a terminal case analysis, and a methodology for defining calibration-optimal bases that avoid this whenever it is not inevitable. We present the optimal rotation algorithm for doing this, and demonstrate its efficacy for an idealized example for which the usual principal component methods fail. We apply these ideas to the CanAM4 model to demonstrate the terminal case issue arising for climate models. We discuss climate model tuning and the estimation of model discrepancy within this context, and show how the optimal rotation algorithm can be used in developing practical climate model tuning tools. Supplementary materials for this article are available online.
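The standard output-dimension reduction the paper starts from, projecting an ensemble of spatial fields onto a truncated principal component basis and checking how well an observation is reconstructed in that basis, can be sketched as follows (synthetic ensemble with hypothetical sizes, not the CanAM4 setup):

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical ensemble: 40 simulator runs, each a flattened 10x10 field.
runs = rng.normal(size=(40, 100))
mean = runs.mean(axis=0)

# Principal-component basis of the centred ensemble via SVD; keep 5 modes.
_, _, Vt = np.linalg.svd(runs - mean, full_matrices=False)
basis = Vt[:5]                        # (5, 100); rows are orthonormal
coeffs = (runs - mean) @ basis.T      # low-dimensional targets to emulate

# Project a (synthetic) observation onto the basis and reconstruct it; a
# residual larger than the model discrepancy allows is the kind of
# mismatch behind a "terminal case".
obs = rng.normal(size=100)
recon = mean + ((obs - mean) @ basis.T) @ basis
residual = float(np.linalg.norm(obs - recon))
```

The paper's contribution is to rotate this basis so that, whenever possible, the part of the observation outside the spanned subspace stays within model discrepancy.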
Crack detection and characterization techniques-An overview
Crack occurrence and propagation are among the critical factors that affect the performance and lifespan of civil infrastructure such as bridges and pipelines. As a consequence, numerous crack detection and characterization techniques have been researched and developed over the past decades in the areas of structural health monitoring (SHM) and non-destructive evaluation (NDE). The significant number of studies performed and the large number of publications give rise to the need to systematize, condense, and summarize this enormous effort. The aims of this paper are to summarize the knowledge about cracking and its sources, review both existing and emerging methods for crack detection and characterization, and identify the advantages and challenges of these methods. In general, this paper identifies two sensing approaches (direct and indirect) and two data analysis approaches (model-based and model-free, or data-driven), along with a range of associated technologies. The advantages and challenges of each approach and technology are discussed and summarized, and future research needs are identified. This paper is intended to serve as a reference for researchers who are interested in crack detection and characterization, as well as for those who are generally interested in SHM and NDE. Copyright © 2014 John Wiley & Sons, Ltd.