143,367 results for "Neuronal network analysis"
The social media sentiment analysis framework: deep learning for sentiment analysis on social media
Researching public opinion can help us learn important facts. People may quickly and easily express their thoughts and feelings on any subject using social media, which creates a deluge of unorganized data. Sentiment analysis on social media platforms like Twitter and Facebook has developed into a potent tool for gathering insights into users' perspectives. However, difficulties in interpreting natural language limit the effectiveness and precision of sentiment analysis. This research focuses on developing a social media sentiment analysis (SMSA) framework, incorporating a custom-built emotion thesaurus to enhance the precision of sentiment analysis. It delves into the efficacy of various deep learning algorithms, under different parameter calibrations, for sentiment extraction from social media. The study distinguishes itself by its unique approach towards sentiment dictionary creation and its application to deep learning models. It contributes new insights into sentiment analysis, particularly in social media contexts, showcasing notable advancements over previous methodologies. The results demonstrate improved accuracy and deeper understanding of social media sentiment, opening avenues for future research and applications in diverse fields.
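The core idea of augmenting sentiment analysis with a custom emotion thesaurus can be sketched with a toy lexicon-based scorer. The lexicon entries, weights, and negation handling below are invented for illustration; the SMSA framework's actual thesaurus and deep learning models are not reproduced here.

```python
# Minimal sketch of lexicon-assisted sentiment scoring (hypothetical lexicon).
EMOTION_LEXICON = {
    "love": 2.0, "great": 1.5, "good": 1.0,
    "bad": -1.0, "awful": -1.5, "hate": -2.0,
}

NEGATORS = {"not", "never", "no"}

def sentiment_score(text: str) -> float:
    """Sum lexicon weights over tokens, flipping sign after a negator."""
    score, flip = 0.0, 1.0
    for tok in text.lower().split():
        if tok in NEGATORS:
            flip = -1.0          # negate the next sentiment-bearing word
            continue
        if tok in EMOTION_LEXICON:
            score += flip * EMOTION_LEXICON[tok]
            flip = 1.0           # negation consumed
    return score

print(sentiment_score("i love this"))       # 2.0
print(sentiment_score("not good at all"))   # -1.0
```

In a deep learning pipeline such as the one described, scores like these would typically enter as auxiliary features alongside learned word embeddings rather than replace them.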
Nonnegative matrix factorization for analyzing state dependent neuronal network dynamics in calcium recordings
Calcium imaging allows recording from hundreds of neurons in vivo with the ability to resolve single-cell activity. Evaluating and analyzing neuronal responses while considering all dimensions of the data set to draw specific conclusions is extremely difficult. Often, descriptive statistics are used to analyze these forms of data. These analyses, however, remove variance by averaging the responses of single neurons across recording sessions, or across combinations of neurons, to create single quantitative metrics, losing the temporal dynamics of neuronal activity and the neurons' responses relative to each other. Dimensionality Reduction (DR) methods serve as a good foundation for these analyses because they reduce the dimensions of the data into components while still maintaining the variance. Nonnegative Matrix Factorization (NMF) is an especially promising DR method for analyzing activity recorded in calcium imaging because of its mathematical constraints, which include positivity and linearity. We adapt NMF for our analyses and compare its performance to alternative dimensionality reduction methods on both artificial and in vivo data. We find that NMF is well-suited for analyzing calcium imaging recordings, accurately capturing the underlying dynamics of the data and outperforming alternative methods in common use.
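A minimal sketch of the NMF decomposition described above, using scikit-learn on a synthetic neurons-by-time matrix (the dimensions and component count are invented stand-ins for a real calcium recording):

```python
import numpy as np
from sklearn.decomposition import NMF

# Hypothetical stand-in for a calcium recording: 50 neurons x 200 time points,
# built from 3 nonnegative "network" components plus noise.
rng = np.random.default_rng(0)
W_true = rng.random((50, 3))            # neuron loadings per component
H_true = rng.random((3, 200))           # component time courses
X = W_true @ H_true + 0.01 * rng.random((50, 200))

# NMF factorizes X ~ W @ H with W, H >= 0 (the positivity and linearity
# constraints the abstract highlights).
model = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(X)              # neuron-by-component weights
H = model.components_                   # component-by-time activations

print(W.shape, H.shape)                 # (50, 3) (3, 200)
```

The nonnegativity of both factors is what makes the components interpretable as additive groups of co-active neurons, unlike PCA components, which can cancel each other with negative loadings.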
Bank efficiency estimation in China: DEA-RENNA approach
The current study proposes a new DEA model to evaluate the efficiency of 39 Chinese commercial banks over the period 2010–2018. In the second stage, the paper also investigates the inter-relationships between efficiency and some bank-specific variables (i.e. bank profitability, bank size, expenses management, traditional business and non-traditional business) under Robust Endogenous Neural Network Analysis. The findings suggest that the sample of Chinese banks experiences a consistent increase in the level of bank efficiency up to 2015, when the efficiency score reaches 0.915; after this the efficiency level declines and then fluctuates slightly, finally ending at a score of 0.746 by the end of 2018. We also find that among the different bank ownership types, the state-owned banks have the highest efficiency, the rural commercial banks are found to be least efficient, and the foreign banks experience the strongest volatility over the examined period. The second-stage analysis shows that bank size exerts a positive influence on the development of non-traditional banking business and proactive expense management; bank size and non-traditional businesses have a positive impact on efficiency levels, while bank profitability, traditional businesses and expenses management have negative influences on bank efficiency.
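The abstract does not specify its DEA formulation, but the classic input-oriented CCR model (constant returns to scale) that such studies build on can be sketched as a linear program. The toy bank data below is invented purely to show the mechanics:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of DMU k.
    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs).
    Decision variables: [theta, lambda_1..lambda_n]."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(n + 1)
    c[0] = 1.0                                   # minimize theta
    # inputs:  sum_j lambda_j * x_j <= theta * x_k  ->  X.T @ lam - theta*x_k <= 0
    A_in = np.hstack([-X[k].reshape(-1, 1), X.T])
    b_in = np.zeros(m)
    # outputs: sum_j lambda_j * y_j >= y_k  ->  -Y.T @ lam <= -y_k
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    b_out = -Y[k]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

# Toy data: 4 banks, 1 input (cost), 1 output (loans).
X = np.array([[2.0], [4.0], [3.0], [5.0]])
Y = np.array([[2.0], [2.0], [3.0], [4.0]])
scores = [round(ccr_efficiency(X, Y, k), 3) for k in range(4)]
print(scores)   # [1.0, 0.5, 1.0, 0.8]
```

Banks on the efficient frontier score 1.0; the others are scored by how far their inputs could be contracted while still producing their outputs. The second-stage neural network regression of the paper is not reproduced here.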
Surgical timing and clinical factors predicting in-hospital mortality in older adults with hip fractures: a neuronal network analysis
Introduction Hip fractures in older adults are associated with a significant mortality rate, reported to be around 35% within a year. Today, the incidence of these fractures is on the rise, and this trend is expected to increase even further owing to the aging of the population. Treatment timing and perioperative management of these patients are typically challenging owing to the presence of multiple comorbidities that are important risk factors for mortality after surgery. This study aims to evaluate the relationship between surgical timing and in-hospital mortality, analyzing the role of both acute events and chronic preexisting comorbidities in patient outcomes. Materials and methods This is a single-center, retrospective observational study (January 2018 to June 2023). All consecutive patients ≥ 65 years with a diagnosis of proximal femur fracture were enrolled. The primary study endpoint was to evaluate risk factors associated with in-hospital mortality. The secondary endpoint was the assessment of the relationship between surgical timing and in-hospital mortality, including factors such as preexisting comorbidities, the Charlson Comorbidity Index, and the Nottingham Hip Fracture Score. The relative weight of each factor for predicting the mortality rate was also evaluated using neural network analysis, comparing patients treated within 24 h to those treated after a longer surgical delay. Results Among the 2320 patients enrolled, 1391 (60%) underwent surgery within 24 h, while 929 (40%) were treated after 24 h. In-hospital mortality was 2.8% for patients who underwent surgery within 24 h and 5.2% for those treated after 24 h (p = 0.046; odds ratio (OR) 1.58). Age (p = 0.001; OR 1.06) and Nottingham score (p = 0.04; OR 1.32) were factors predicting mortality. Acute infections were related to a high risk of mortality (p = 0.001; OR 5.99), both in patients treated within and after 24 h.
Acute events, such as atrial fibrillation and electrolyte imbalance, were related to mortality risk only in patients treated within 24 h (p = 0.001 versus p = 0.51). Neural network analysis revealed that atrial fibrillation (AF), flutter, and electrolyte imbalance had the highest relative weight for mortality in patients treated within the first 24 h; by contrast, renal failure and pneumonia carried the most weight among patients treated after 24 h who died. Conclusions Hip fracture is a significant cause of morbidity and mortality in older adults, and the timing of surgical treatment is crucial for postoperative outcomes. Early surgery is essential to reduce the risk of mortality. Our study has shown that for acute and reversible conditions, waiting about 24 h to stabilize the patient with preoperative stabilization protocols, such as managing anticoagulation, optimizing hemodynamics, or addressing acute medical conditions including infection prevention, yields better results; in the presence of sepsis or acute infection, however, prolonging the wait to optimize patients before and after surgery does not improve outcomes.
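The paper's "relative weight" of predictors is derived from its own neural network analysis; a commonly used approximation of the same idea is permutation importance on a fitted multilayer perceptron. The sketch below uses synthetic data with invented feature names and effect sizes, not the study's cohort:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.inspection import permutation_importance

# Synthetic stand-in for the cohort; feature names and effect sizes are invented.
rng = np.random.default_rng(42)
n = 1000
X = rng.random((n, 3))                  # columns: age (scaled), atrial fibrillation, infection
y = (0.2 * X[:, 0] + 1.5 * X[:, 2] + 0.3 * rng.random(n) > 1.0).astype(int)

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, y)

# Approximate each predictor's "relative weight" via permutation importance:
# the accuracy drop when that feature's values are shuffled.
imp = permutation_importance(clf, X, y, n_repeats=10, random_state=0)
for name, score in zip(["age", "atrial_fib", "infection"], imp.importances_mean):
    print(f"{name}: {score:.3f}")
```

In this synthetic setup the "infection" column dominates by construction, mirroring the kind of ranking the study reports for acute infections.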
Mapping theme trends and knowledge structures for human neural stem cells: a quantitative and co-word biclustering analysis for the 2013-2018 period
Neural stem cells, which are capable of multi-potential differentiation and self-renewal, have recently been shown to have clinical potential for repairing central nervous system tissue damage. However, the theme trends and knowledge structures for human neural stem cells have not yet been studied bibliometrically. In this study, we retrieved 2742 articles from the PubMed database from 2013 to 2018 using "Neural Stem Cells" as the retrieval term. Co-word analysis was conducted to statistically quantify the characteristics and popular themes of human neural stem cell-related studies. Bibliographic data matrices were generated with the Bibliographic Item Co-Occurrence Matrix Builder. We identified 78 high-frequency Medical Subject Heading (MeSH) terms. A visual matrix was built with the repeated bisection method in gCLUTO software. A social network analysis was performed with Ucinet 6.0 and GraphPad Prism 5 software. The analyses demonstrated that in the 6-year period, hot topics were clustered into five categories. As suggested by the constructed strategic diagram, studies related to cytology and physiology were well-developed, whereas those related to neural stem cell applications, tissue engineering, metabolism and cell signaling, and neural stem cell pathology and virology remained immature. Neural stem cell therapy for stroke and Parkinson's disease, the genetics of microRNAs and brain neoplasms, as well as neuroprotective agents, Zika virus, Notch receptor, neural crest and embryonic stem cells were identified as emerging hot spots. These undeveloped themes and popular topics are potential points of focus for new studies on human neural stem cells.
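Co-word analysis of the kind described starts from a symmetric term-by-term co-occurrence matrix. A minimal sketch, using a few hypothetical MeSH term lists (not actual PubMed records) in place of the 2742 retrieved articles:

```python
from itertools import combinations
from collections import Counter

# Hypothetical MeSH term lists for four articles (illustrative only).
papers = [
    ["Neural Stem Cells", "Stroke", "Cell Therapy"],
    ["Neural Stem Cells", "Parkinson Disease", "Cell Therapy"],
    ["Neural Stem Cells", "Zika Virus"],
    ["Neural Stem Cells", "Stroke", "Zika Virus"],
]

# Count how often each pair of terms appears in the same article.
pair_counts = Counter()
for mesh in papers:
    for a, b in combinations(sorted(set(mesh)), 2):
        pair_counts[(a, b)] += 1

# Assemble the symmetric term x term co-occurrence matrix.
terms = sorted({t for p in papers for t in p})
index = {t: i for i, t in enumerate(terms)}
matrix = [[0] * len(terms) for _ in terms]
for (a, b), c in pair_counts.items():
    matrix[index[a]][index[b]] = matrix[index[b]][index[a]] = c

print(pair_counts[("Neural Stem Cells", "Stroke")])   # 2
```

Tools such as the Bibliographic Item Co-Occurrence Matrix Builder automate exactly this step at scale; the resulting matrix is then fed into biclustering (gCLUTO) or network analysis (Ucinet).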
Big data analytics adoption success: value chain process-level perspective
Purpose: Despite the considerable hype about how Big Data Analytics (BDA) can transform businesses and advance their capabilities, recognising its strategic value through successful adoption is yet to be appreciated. The purpose of this paper is to focus on the process-level value-chain realisation of BDA adoption between SMEs and large organisations.
Design/methodology/approach: Resource-based theory offered the lens for developing a conceptual BDA process-level value chain adoption model. A combined two-staged regression-artificial neural network approach has been utilised for 369 small, medium (SMEs) and large organisations to verify their critical value chain process-level drivers for successful organisational adoption of BDA.
Findings: The findings revealed that organisational BDA adoption success is driven predominantly by product- and service-process-level value, with distinctive discrepancies dependent on the organisation's size. Large organisations primarily embrace BDA for their external value chain dimensions, while SMEs encompass its internal value chain cues. As such, businesses are advised to acknowledge their organisational dynamics and precise size to develop the right strategies to adopt BDA successfully.
Research limitations/implications: The study advances the understanding of the role of internal and external value chain drivers in influencing how BDA can be successfully adopted in SMEs and large organisations. Thus, the organisation's unique attributes, including its size, will need to be carefully examined. By investigating these elements, this research has shed new light on how developing such innovative capabilities and competencies must be carefully crafted to help create a sustainable competitive advantage.
Practical implications: For organisational positioning, acknowledging the role of internal and external value chain drivers is critical for implementing the right strategies for adopting BDA. For larger businesses, resources for innovation are often widely available compared to SMEs. As such, they can manage their costs and associated risks resourcefully. By considering the identified value-chain-related adoption success factors, businesses should be better positioned to assess their competencies while being prepared to adopt BDA.
Originality/value: The study offers the research and business community empirical-based insights into the strategies needed to successfully adopt big data in an organisation from a process-level value chain perspective.
Comparison of machine learning algorithms and multiple linear regression for live weight estimation of Akkaraman lambs
This study was designed to predict the post-weaning weights of Akkaraman lambs reared on different farms using multiple linear regression and machine learning algorithms. The effects of the age of the dam, gender, type of lambing, enterprise, type of flock, birth weight, and weaning weight were analyzed. The data were collected from a total of 25,316 Akkaraman lambs raised at multiple farms in the Çiftlik District of Niğde province. Comparative analysis was conducted using multiple linear regression, Random Forest, Support Vector Machines (and Support Vector Regression), Extreme Gradient Boosting (XGBoost) (and Gradient Boosting), Bayesian Regularized Neural Network, Radial Basis Function Neural Network, Classification and Regression Trees, Exhaustive Chi-squared Automatic Interaction Detection (and Chi-squared Automatic Interaction Detection), and Multivariate Adaptive Regression Splines algorithms. In this study, the dataset was divided into five folds using the K-fold cross-validation method. The performance of the models was compared on the test populations using criteria such as Adjusted R-squared (Adj-R²), Root Mean Square Error (RMSE), Mean Absolute Deviation (MAD), and Mean Absolute Percentage Error (MAPE). Additionally, the low standard deviations observed for these criteria indicate the absence of an overfitting problem. The comparison results showed that the Random Forest algorithm had the best predictive performance, with Adj-R², RMSE, MAD, and MAPE values of 0.75, 3.683, 2.876, and 10.112, respectively. In conclusion, the results obtained through multiple linear regression for the live weights of Akkaraman lambs were less accurate than those obtained through artificial neural network analysis.
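The comparison workflow described (5-fold cross-validation, RMSE with per-fold standard deviations) can be sketched with scikit-learn. The synthetic nonlinear data below merely stands in for the lamb dataset, so the resulting ranking need not match the study's:

```python
from sklearn.datasets import make_friedman1
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_score

# Synthetic nonlinear regression data standing in for the lamb dataset
# (7 predictors -> live weight); the real data is not reproduced here.
X, y = make_friedman1(n_samples=500, n_features=7, noise=1.0, random_state=0)

kf = KFold(n_splits=5, shuffle=True, random_state=0)   # 5-fold CV as in the study
results = {}
for name, model in [("MLR", LinearRegression()),
                    ("RandomForest", RandomForestRegressor(n_estimators=100,
                                                           random_state=0))]:
    # Negated RMSE per fold; a low std across folds hints at no overfitting.
    scores = cross_val_score(model, X, y, cv=kf,
                             scoring="neg_root_mean_squared_error")
    results[name] = (-scores.mean(), scores.std())
    print(f"{name}: RMSE = {results[name][0]:.3f} +/- {results[name][1]:.3f}")
```

Extending the loop with the study's other candidates (SVR, gradient boosting, MARS, various neural networks) is a matter of adding estimators to the list.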
Neural network analysis for predicting metrics of fragmented laminar artifacts: a case study from MPPNB sites in the Southern Levant
This study aimed to introduce a new method for predicting the original metrics of fragmented standardized artifacts, specifically flint blades from the Middle Pre-Pottery Neolithic B (10,200/100–9,500/400 cal B.P.) in the Southern Levant. The excessive re-use of these artifacts or poor preservation conditions often prevent a complete set of metric data from being obtained. Our suggested approach is based on readily accessible machine learning (artificial intelligence) and neural network analysis, performed in a multi-paradigm programming language and numeric computing environment, with parameters represented by a rapid measurement system based on the technological features shared by all lithic artifacts of the studied assemblages. This method can be applied to various chronologies and/or contexts. A full set of metric data, including potential typological and functional elements of the assemblages studied, may provide a better understanding of the lithic technology involved; highlight cultural aspects related to the chaîne opératoire of the studied lithic production; and address issues related to cultural sub-divisions in larger-scale applications. Herein, neural network analysis was performed on blade samples from Middle Pre-Pottery Neolithic B sites in the Southern Levant, specifically Nahal Yarmuth 38, Motza, Yiftahel, and Nahal Reuel.
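The general idea of regressing a missing artifact metric from surviving measurements can be sketched with a small neural network. The study works in a numeric computing environment (MATLAB-like); the sketch below uses Python instead, and the measurement names and the linear-plus-noise relation are invented for illustration:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

# Hypothetical blade measurements: fragment width and thickness predicting
# original blade length via an invented relation.
rng = np.random.default_rng(1)
n = 400
width = rng.uniform(10, 25, n)          # mm
thickness = rng.uniform(2, 6, n)        # mm
length = 3.2 * width + 4.0 * thickness + rng.normal(0, 2.0, n)

X = np.column_stack([width, thickness])
X_tr, X_te, y_tr, y_te = train_test_split(X, length, random_state=0)

net = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(16, 8),
                                 max_iter=5000, random_state=0))
net.fit(X_tr, y_tr)
r2 = net.score(X_te, y_te)              # R^2 on held-out blades
print(f"held-out R^2 = {r2:.2f}")
```

Once trained on complete blades, such a model can be applied to fragments for which only the surviving dimensions are measurable.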
The interplay of soft TQM practices and knowledge sharing: moderating role of market turbulence
Purpose: Emerging competitive dynamics demand small and medium-sized enterprises (SMEs) to continuously comprehend and respond to changing market conditions by implementing effective soft total quality management (STQM) practices. Firstly, the study intends to identify the key STQM practices perceived to foster knowledge sharing (KS). Secondly, this study aims to investigate the impact of market turbulence (MT) on the interaction between STQM practices and KS among SMEs.
Design/methodology/approach: A total of 215 valid samples were analysed. Incorporating a two-hidden-layer deep artificial neural network (ANN) into SEM approaches allows for more in-depth testing and high prediction power. This study employs a two-stage PLS-SEM-ANN predictive-analytical technique to provide a more comprehensive analysis and significant statistical contribution.
Findings: The PLS-SEM-ANN analysis reveals that STQM practices including employee involvement (EI), employee training (ET), top management commitment (TMC) and employee teamwork (EM) are critical to boosting KS. MT, interestingly, moderates the relationship between EM and KS while negatively moderating the relationship between TMC and KS.
Originality/value: The study contributes to knowledge-based view theory by demonstrating the importance of integrating STQM and KS among SMEs to thrive in today's dynamic market environment.
Sexual homicide and the forensic process: The decision-making process of collecting and analyzing traces and its implication for crime solving
The focus of the current study is to examine the collection and analysis of traces related to crime scene behaviors in sexual homicide cases, as well as the factors influencing the solving of these crimes. Using 230 sexual homicide cases from the SHielD database, we computed two neural network models based on the multi-layer perceptron algorithm. First, we determined whether certain crime scene characteristics predicted the collection and analysis of traces (dependent variable for Model 1). Not surprisingly, the results indicate that trace collection and analysis were more likely to occur in sexual homicide cases with crime scene behaviors exhibiting the highest risk for trace transfer (e.g. close interactions with the victim) as well as the best conditions for trace persistence (e.g. the body is found indoors). Situational and physical aspects of the crime scene are thus taken into account when deciding on the collection and analysis of traces. Second, we examined the situations in which the collection and analysis of traces contributes to crime solving (dependent variable for Model 2). The results suggest that the collection and analysis of traces does not necessarily predict the resolution of the case. Specifically, the analyses show that the collection and analysis of traces is useful for crime solving when: (1) the offenders' behaviors increase the opportunities for leaving traces at the crime scene, and (2) the environmental and temporal aspects are favorable to the collection of traces. The impact of trace collection and analysis on case resolution thus depends on the context of the case. Furthermore, subsequent steps, such as the result of the trace analysis, its introduction into a database, and whether that comparison yields a result, might also affect case resolution, and thus intervene in the link between trace collection and analysis and case resolution.
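The multi-layer perceptron modelling described can be sketched on synthetic binary crime-scene indicators. The variable names and the data-generating rule below are invented (they are not SHielD variables), chosen only to echo the finding that trace collection helps solve cases mainly when conditions are favorable:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

# Synthetic binary indicators: close_contact, body_indoors, traces_collected.
rng = np.random.default_rng(7)
n = 230                                 # same order of magnitude as the study sample
X = rng.integers(0, 2, size=(n, 3))
# Solving depends on traces being collected *and* a favorable scene (indoors),
# so collection alone is a weak predictor of resolution.
p = 0.15 + 0.55 * (X[:, 2] & X[:, 1]) + 0.1 * X[:, 0]
y = (rng.random(n) < p).astype(int)

mlp = MLPClassifier(hidden_layer_sizes=(8, 4), max_iter=3000, random_state=0)
acc = cross_val_score(mlp, X, y, cv=5).mean()
print(f"mean 5-fold CV accuracy = {acc:.2f}")
```

A multi-layer perceptron suits this setup because the interaction between trace collection and scene conditions is exactly the kind of non-additive effect a linear model would miss.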