Catalogue Search | MBRL
Search Results
Explore the vast range of titles available.
841 result(s) for "Object oriented modeling"
Object analysis patterns for embedded systems
by Campbell, L.A.; Konrad, S.; Cheng, B.H.C.
in Application software; Automatic logic units; Automobile industry
2004
Some of the most challenging tasks in building a software system are capturing, refining, and analyzing requirements. How well these tasks are performed significantly impacts the quality of the developed software system. The difficulty of these tasks is greatly exacerbated for the software of embedded systems as these systems are commonly used for critical applications, have to operate reliably for long periods of time, and usually have a high degree of complexity. Current embedded systems software development practice, however, often deals with the (requirements) analysis phase in a superficial manner, instead emphasizing design and implementation. This research investigates how an approach similar to the well-known design patterns, termed object analysis patterns, can be applied in the analysis phase of embedded systems development, prior to design and coding. Specifically, our research explores how object-oriented modeling notations, such as the Unified Modeling Language (UML), can be used to represent structural and behavioral information as part of commonly occurring object analysis patterns. This work also investigates how UML-based conceptual models of embedded systems, based on the diagram templates in the object analysis patterns, can be automatically analyzed using the Spin model checker for adherence to properties specified in linear-time temporal logic (LTL) using a previously developed UML formalization framework. We have applied these patterns to several embedded systems applications obtained from the automotive industry. This paper describes one of our case studies and illustrates how our approach facilitates the construction of UML-based conceptual models of embedded systems and the analysis of these models for adherence to functional requirements.
Journal Article
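The analysis flow summarized in the abstract above (build a conceptual model from a pattern template, then check it against temporal-logic properties) can be illustrated on a much smaller scale without UML or Spin. The sketch below is a minimal, hypothetical stand-in: a hand-coded state machine for a cruise-control-like component and an exhaustive search that checks a safety property playing the role of an LTL invariant. All states, transitions, and names are assumptions made for illustration, not taken from the paper's case study.

```python
from collections import deque

# Hypothetical conceptual model of a cruise-control-like embedded component
# (illustrative only; not the paper's actual case study or notation).
# Each state is (mode, brake_pressed).
INITIAL = ("inactive", False)

def successors(state):
    mode, brake = state
    nxt = []
    if mode == "inactive" and not brake:
        nxt.append(("active", brake))          # driver enables cruise control
    if mode == "active":
        nxt.append(("inactive", brake))        # driver disables it
    if not brake:
        # Pressing the brake while active must deactivate the system.
        forced_mode = "inactive" if mode == "active" else mode
        nxt.append((forced_mode, True))
    else:
        nxt.append((mode, False))              # brake released
    return nxt

def invariant(state):
    # Safety property, roughly "[] !(active && brake_pressed)" in LTL terms.
    mode, brake = state
    return not (mode == "active" and brake)

def check(initial):
    """Exhaustively explore the reachable states; return a violating state, if any."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if not invariant(state):
            return state
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return None

violation = check(INITIAL)
print("property holds" if violation is None else f"violated in {violation}")
```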
Formalization of the Whole-Part relationship in the Unified Modeling Language
2003
A formal definition of the semantics of the Whole-Part relationship in the Unified Modeling Language (UML) is introduced. This provides a directly usable specification that can be incorporated into version 2.0 of UML. An improvement to the current metamodel fragment relating to relationships is proposed, supplemented by axioms expressed in the Object Constraint Language (OCL). The overall formalization places a clear and concise emphasis on carefully enunciated (primary) characteristics that apply to all instances of a new Whole-Part metatype. Specific kinds of the Whole-Part relationship are defined in terms of secondary characteristics, which must be possessed by subtypes: in UML 1.4, these are Aggregation (the white diamond) and Composition (the black diamond). Primary and secondary characteristics may then be consistently combined with each other, which allows supplementary forms of Whole-Part to be introduced. Such a revision is necessary because Aggregation and Composition in UML 1.4 do not cover the full spectrum of Whole-Part theory.
Journal Article
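To give a rough feel for what "primary characteristics shared by all Whole-Part instances, plus secondary characteristics for particular kinds" might mean in practice, the sketch below renders two commonly cited properties as Python checks instead of OCL axioms: acyclicity of the part-of relation as a primary characteristic, and exclusive ownership of composed parts as a secondary one. These particular characteristics and the example objects are assumptions for illustration, not a transcription of the paper's axioms.

```python
# Hypothetical rendering of Whole-Part axioms as runtime checks.
# Assumed primary characteristic: the part-of relation is acyclic.
# Assumed secondary characteristic for Composition: a part has at most one owner.

class WholePartModel:
    def __init__(self):
        self.links = []                      # (whole, part, kind) triples

    def add(self, whole, part, kind):
        assert kind in ("aggregation", "composition")
        self.links.append((whole, part, kind))

    def _parts(self, whole):
        return [p for w, p, _ in self.links if w == whole]

    def violates_acyclicity(self):
        """Primary characteristic: no object is (transitively) a part of itself."""
        def reaches_itself(start):
            seen, stack = set(), [start]
            while stack:
                node = stack.pop()
                for p in self._parts(node):
                    if p == start:
                        return True
                    if p not in seen:
                        seen.add(p)
                        stack.append(p)
            return False
        return any(reaches_itself(w) for w, _, _ in self.links)

    def violates_exclusive_composition(self):
        """Secondary characteristic: a composed part has at most one whole."""
        owners = {}
        for w, p, kind in self.links:
            if kind == "composition":
                owners.setdefault(p, set()).add(w)
        return any(len(ws) > 1 for ws in owners.values())

model = WholePartModel()
model.add("Car", "Engine", "composition")
model.add("Car", "Wheel", "aggregation")
model.add("Truck", "Engine", "composition")    # same engine composed twice
print(model.violates_acyclicity())             # False
print(model.violates_exclusive_composition())  # True
```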
On the Semantics of Associations and Association Ends in UML
2007
Association is one of the key concepts in UML and is used intensively in conceptual modeling. Unfortunately, although the concept is old and inherited from other successful modeling techniques, a fully unambiguous understanding of it still does not exist, especially in relation to newer concepts attached to association ends, such as uniqueness. This paper describes a problem with one widely assumed interpretation of the uniqueness of association ends, the restrictive interpretation, and proposes an alternative, the intentional interpretation. Instead of restricting the association from having duplicate links, uniqueness of an association end in the intentional interpretation modifies the way in which the association end maps an object of the opposite class to a collection of objects of the class at that association end. If the association end is unique, the collection is a set obtained by projecting the collection of all linked objects. In that sense, the uniqueness of an association end modifies the view of the objects at that end but does not constrain the underlying object structure. This paper demonstrates how the intentional interpretation improves the expressiveness of the modeling language and has some other interesting advantages. Finally, this paper gives a completely formal definition of the concepts of association and association end, along with the related notions of uniqueness, ordering, and multiplicity. The semantics of the UML actions on associations are also defined formally.
Journal Article
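The contrast the abstract draws can be made concrete with a toy association. In the sketch below, links are stored as a bag so duplicates are representable; under the restrictive reading a unique end would make the duplicate-containing model ill-formed, while under the intentional reading uniqueness only changes what navigating the end returns (a duplicate-free projection) and leaves the stored links untouched. The example data and function names are invented; this is an illustration of the idea, not the paper's formal semantics.

```python
# Toy association between Person and Committee, stored as a bag of links
# so that duplicate links are representable.
links = [
    ("alice", "budget"),
    ("alice", "budget"),    # duplicate link
    ("alice", "hiring"),
    ("bob", "hiring"),
]

def navigate(person, unique):
    """Navigate from a Person to the 'committee' association end."""
    collection = [c for p, c in links if p == person]
    if unique:
        # Intentional interpretation: uniqueness changes the *view* --
        # the result is the set obtained by projecting the linked objects.
        deduped = []
        for c in collection:
            if c not in deduped:
                deduped.append(c)
        return deduped
    return collection          # non-unique end: the full bag is visible

def restrictive_check():
    """Restrictive interpretation: duplicate links are simply not allowed."""
    return len(links) == len(set(links))

print(navigate("alice", unique=False))  # ['budget', 'budget', 'hiring']
print(navigate("alice", unique=True))   # ['budget', 'hiring']
print(restrictive_check())              # False: this model would be ill-formed
```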
Thermal Performance Visualization Using Object-Oriented Physical and Building Information Modeling
by Jeong, WoonSeong; Lee, Chang Joon; Yan, Wei
in Building information modeling; Design; Energy management
2020
This study presents the research and development of a method for visualizing thermal performance simulation results. The objective is to bring thermal performance simulation results into building information modeling (BIM) models, display a series of thermal performance results, and enable stakeholders to use the BIM tool as a common user interface in the early design stage. The method combines object-oriented physical modeling (OOPM) and BIM. To implement it, the application programming interface (API) of a BIM authoring tool was used, together with an external database that holds the thermal energy performance results produced by the OOPM tool. Based on this method, the study created a prototype called thermal energy performance visualization (TEPV). The TEPV translates the information from the external database into a thermal energy performance indicator (TEPI) parameter in the BIM tool. Whenever BIM models are generated for building design, the thermal energy performance results are visualized by color-coding the building components in the BIM models. Visualizing thermal energy performance results enables non-engineers, such as architects, to inspect the simulation results directly. Moreover, the TEPV helps architects use BIM as an interface for visualizing building thermal energy performance during building design, enhancing design production at the early design stages.
Journal Article
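The pipeline the abstract outlines (read OOPM simulation results from an external database, write each value into a per-component indicator parameter, then color-code the components in the BIM model) can be mocked up with plain data structures. The component names, result values, and color bands below are invented, and a real implementation would go through the authoring tool's API rather than dictionaries; this is only a sketch of the data flow.

```python
# Stand-in for the external database of OOPM simulation results,
# keyed by BIM component id (all values invented for illustration).
simulated_heat_loss_w = {"wall-north": 420.0, "wall-south": 180.0,
                         "roof": 660.0, "window-east": 95.0}

# Stand-in for the BIM model: each component carries a TEPI-like parameter
# and a display color that would normally be set through the tool's API.
bim_components = {name: {"TEPI": None, "color": None}
                  for name in simulated_heat_loss_w}

def color_for(value):
    """Map a thermal result onto a traffic-light color band (assumed bands)."""
    if value < 200:
        return "green"
    if value < 500:
        return "yellow"
    return "red"

def update_visualization():
    for name, component in bim_components.items():
        result = simulated_heat_loss_w.get(name)
        if result is None:
            continue                      # no simulation result for this element
        component["TEPI"] = result        # write the indicator parameter
        component["color"] = color_for(result)

update_visualization()
for name, component in sorted(bim_components.items()):
    print(f"{name:12s} TEPI={component['TEPI']:6.1f} W -> {component['color']}")
```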
Power-Laws in a Large Object-Oriented Software System
2007
We present a comprehensive study of an implementation of Smalltalk, one of the first and purest object-oriented programming environments, searching for scaling laws in its properties. We study ten system properties, including the distributions of variable and method names, inheritance hierarchies, class and method sizes, and the system architecture graph. We systematically found Pareto distributions, or sometimes log-normal distributions, in these properties. This indicates that programming activity, even when modeled from a statistical perspective, cannot simply be modeled as a random addition of independent increments with finite variance, but exhibits strong organic dependencies on what has already been developed. We compare our results with similar ones obtained for large Java systems, reported in the literature or computed by ourselves for properties never studied before, and show that the behavior is similar in all of the object-oriented systems studied. We show how the Yule process can stochastically model the generation of several of the power-laws found, identifying the process parameters and comparing theoretical and empirical tail indexes. Lastly, we discuss how the distributions found relate to existing object-oriented metrics, such as Chidamber and Kemerer's, and how they could provide a starting point for measuring the quality of a whole system rather than that of single classes. In fact, the usual evaluation of systems based on the mean and standard deviation of metrics can be misleading; it is more informative to measure differences in the shape and coefficients of the data's statistical distributions.
Journal Article
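To see why a Yule-style process produces the heavy-tailed distributions reported above, the sketch below grows a toy system in which each new method either starts a new class (with small probability) or attaches to an existing class with probability proportional to that class's current size. The parameter values are arbitrary assumptions; the point is only that the resulting class-size distribution is strongly right-skewed rather than bell-shaped.

```python
import random
from collections import Counter

random.seed(0)

def yule_class_sizes(n_methods=10000, p_new_class=0.05):
    """Grow classes by preferential attachment (a Yule-like process)."""
    class_sizes = [1]                       # start with one class of one method
    for _ in range(n_methods):
        if random.random() < p_new_class:
            class_sizes.append(1)           # a brand-new class appears
        else:
            total = sum(class_sizes)        # attach to an existing class with
            r = random.uniform(0, total)    # probability proportional to size
            acc = 0.0
            for i, size in enumerate(class_sizes):
                acc += size
                if r <= acc:
                    class_sizes[i] += 1
                    break
    return class_sizes

sizes = yule_class_sizes()
hist = Counter(sizes)
print("classes:", len(sizes), " largest class:", max(sizes))
# A crude log-binned view of the tail: counts fall off slowly, not exponentially.
for lo in (1, 2, 4, 8, 16, 32, 64, 128, 256, 512):
    in_bin = sum(c for s, c in hist.items() if lo <= s < lo * 2)
    print(f"size [{lo:3d},{lo*2:3d}): {in_bin}")
```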
Empirical Validation of Three Software Metrics Suites to Predict Fault-Proneness of Object-Oriented Classes Developed Using Highly Iterative or Agile Software Development Processes
by Quattlebaum, S.; Olague, H.M.; Etzkorn, L.H.
in Case studies; Computer industry; Computer programs
2007
Empirical validation of software metrics suites to predict fault proneness in object-oriented (OO) components is essential to ensure their practical use in industrial settings. In this paper, we empirically validate three OO metrics suites for their ability to predict software quality in terms of fault-proneness: the Chidamber and Kemerer (CK) metrics, Abreu's Metrics for Object-Oriented Design (MOOD), and Bansiya and Davis' Quality Metrics for Object-Oriented Design (QMOOD). Some CK class metrics have previously been shown to be good predictors of initial OO software quality. However, the other two suites have not been heavily validated except by their original proposers. Here, we explore the ability of these three metrics suites to predict fault-prone classes using defect data for six versions of Rhino, an open-source implementation of JavaScript written in Java. We conclude that the CK and QMOOD suites contain similar components and produce statistical models that are effective in detecting error-prone classes. We also conclude that the class components in the MOOD metrics suite are not good class fault-proneness predictors. Analyzing multivariate binary logistic regression models across six Rhino versions indicates these models may be useful in assessing quality in OO classes produced using modern highly iterative or agile software development processes.
Journal Article
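For readers who have not met the suites being compared, the sketch below computes deliberately simplified stand-ins for two CK metrics (WMC as a bare method count, DIT as inheritance depth) and one MOOD-style system-level figure (a crude method hiding factor) over a toy Python hierarchy. The metric definitions here are simplifications assumed for illustration; the study itself applies the full suites to Java code from Rhino.

```python
import inspect

# Toy hierarchy standing in for a system under measurement.
class Shape:
    def area(self):
        raise NotImplementedError

    def describe(self):
        return f"{type(self).__name__}: {self.area():.2f}"

class Rectangle(Shape):
    def __init__(self, w, h):
        self.w, self.h = w, h

    def area(self):
        return self.w * self.h

    def __scale(self, k):            # name-mangled, i.e. "hidden", helper
        self.w *= k
        self.h *= k

class Square(Rectangle):
    def __init__(self, s):
        super().__init__(s, s)

def own_methods(cls):
    return [n for n, m in vars(cls).items() if inspect.isfunction(m)]

def wmc(cls):
    """CK WMC, simplified: methods defined in the class, each with weight 1."""
    return len(own_methods(cls))

def dit(cls):
    """CK DIT, simplified: number of ancestors, counting 'object' as the root."""
    return len(cls.__mro__) - 1

def method_hiding_factor(classes):
    """MOOD-style MHF, crudely: share of defined methods that are 'hidden'."""
    names = [n for c in classes for n in own_methods(c)]
    hidden = [n for n in names if n.startswith("_") and not n.endswith("__")]
    return len(hidden) / len(names)

classes = [Shape, Rectangle, Square]
for cls in classes:
    print(f"{cls.__name__:10s} WMC={wmc(cls)}  DIT={dit(cls)}")
print(f"MHF={method_hiding_factor(classes):.2f}")
```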
Solving the Class Responsibility Assignment Problem in Object-Oriented Analysis with Multi-Objective Genetic Algorithms
by Labiche, Yvan; Bowman, Michael; Briand, Lionel C
in Acceptability; Algorithm design and analysis; Assignment problem
2010
In the context of object-oriented analysis and design (OOAD), class responsibility assignment is not an easy skill to acquire. Though there are many methodologies for assigning responsibilities to classes, they all rely on human judgment and decision making. Our objective is to provide decision-making support to reassign methods and attributes to classes in a class diagram. Our solution is based on a multi-objective genetic algorithm (MOGA) and uses class coupling and cohesion measurement for defining fitness functions. Our MOGA takes as input a class diagram to be optimized and suggests possible improvements to it. The choice of a MOGA stems from the fact that there are typically many evaluation criteria that cannot be easily combined into one objective, and several alternative solutions are acceptable for a given OO domain model. Using a carefully selected case study, this paper investigates the application of our proposed MOGA to the class responsibility assignment problem, in the context of object-oriented analysis and domain class models. Our results suggest that the MOGA can help correct suboptimal class responsibility assignment decisions and perform far better than simpler alternative heuristics such as hill climbing and a single-objective GA.
Journal Article
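A heavily simplified flavor of the search described above: candidate solutions assign methods to classes, two objectives (coupling, to minimize, and cohesion, to maximize) are computed from assumed method-to-attribute dependencies, and a naive random search keeps the Pareto-nondominated assignments. A real MOGA such as NSGA-II would use crossover, mutation, and crowding, and the paper's approach also reassigns attributes; the data here is invented and only methods move.

```python
import random
from itertools import combinations

random.seed(1)

# Hypothetical domain model: which attributes each method touches, and which
# of three classes currently owns each attribute (all data invented).
USES = {
    "printInvoice": {"invoiceNo", "total"},
    "computeTotal": {"total", "unitPrice", "qty"},
    "reorderStock": {"qty", "stockLevel"},
    "rateSupplier": {"supplierId", "rating"},
}
ATTRS = {"invoiceNo": 0, "total": 0, "unitPrice": 1, "qty": 1,
         "stockLevel": 1, "supplierId": 2, "rating": 2}
N_CLASSES = 3

def objectives(assignment):
    """Return (coupling, cohesion) for a method-to-class assignment."""
    # Coupling (minimize): distinct foreign classes whose attributes a method uses.
    coupling = sum(len({ATTRS[a] for a in attrs} - {assignment[m]})
                   for m, attrs in USES.items())
    # Cohesion (maximize): same-class method pairs that share an attribute.
    cohesion = sum(1 for m1, m2 in combinations(USES, 2)
                   if assignment[m1] == assignment[m2] and USES[m1] & USES[m2])
    return coupling, cohesion

def dominates(x, y):
    (cx, hx), (cy, hy) = x, y
    return cx <= cy and hx >= hy and (cx < cy or hx > hy)

def pareto_front(n_candidates=500):
    """Naive stand-in for a MOGA: sample assignments, keep the nondominated ones."""
    front = []                                    # list of (objectives, assignment)
    for _ in range(n_candidates):
        cand = {m: random.randrange(N_CLASSES) for m in USES}
        obj = objectives(cand)
        if any(a == cand for _, a in front):
            continue                              # already kept
        if any(dominates(o, obj) for o, _ in front):
            continue                              # dominated by an earlier candidate
        front = [(o, a) for o, a in front if not dominates(obj, o)]
        front.append((obj, cand))
    return front

for (coupling, cohesion), assignment in pareto_front():
    print(f"coupling={coupling} cohesion={cohesion}  {assignment}")
```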
Empirical Analysis of Object-Oriented Design Metrics for Predicting High and Low Severity Faults
2006
In the last decade, empirical studies on object-oriented design metrics have shown some of them to be useful for predicting the fault-proneness of classes in object-oriented software systems. This research did not, however, distinguish among faults according to the severity of impact. It would be valuable to know how object-oriented design metrics and class fault-proneness are related when fault severity is taken into account. In this paper, we use logistic regression and machine learning methods to empirically investigate the usefulness of object-oriented design metrics, specifically a subset of the Chidamber and Kemerer suite, in predicting fault-proneness when taking fault severity into account. Our results, based on a public domain NASA data set, indicate that 1) most of these design metrics are statistically related to the fault-proneness of classes across fault severity, and 2) the prediction capabilities of the investigated metrics greatly depend on the severity of faults. More specifically, these design metrics are able to predict low severity faults in fault-prone classes better than high severity faults in fault-prone classes.
Journal Article
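A minimal sketch of the modeling step described above, under the assumption of synthetic data: one binary logistic regression is fit per severity level on class-level metric vectors, and the fitted coefficients and in-sample accuracy give a feel for how predictive power can differ across severities. The metric columns are placeholders for CK-style measures (roughly WMC, CBO, RFC) and the generating process is invented; the paper itself works with a public NASA data set.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)

def synthetic_classes(n=300):
    """Synthetic class-level metrics: WMC-, CBO-, RFC-like columns (invented)."""
    X = rng.normal(loc=[10, 5, 20], scale=[4, 2, 8], size=(n, 3)).clip(min=0)
    risk = 0.08 * X[:, 0] + 0.3 * X[:, 1] + 0.02 * X[:, 2] - 3.5
    faulty = rng.random(n) < 1 / (1 + np.exp(-risk))
    # Arbitrary assumption: high-severity faults are driven mostly by coupling.
    high = faulty & (rng.random(n) < 1 / (1 + np.exp(-(0.5 * X[:, 1] - 3))))
    return X, faulty.astype(int), high.astype(int)

X, any_fault, high_fault = synthetic_classes()
low_fault = ((any_fault == 1) & (high_fault == 0)).astype(int)

for label, y in [("any severity", any_fault),
                 ("high severity", high_fault),
                 ("low severity", low_fault)]:
    model = LogisticRegression(max_iter=1000).fit(X, y)
    print(f"{label:13s} accuracy={model.score(X, y):.2f} "
          f"coefficients={np.round(model.coef_[0], 2)}")
```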
A hierarchical model for object-oriented design quality assessment
2002
The paper describes an improved hierarchical model for the assessment of high-level design quality attributes in object-oriented designs. In this model, structural and behavioral design properties of classes, objects, and their relationships are evaluated using a suite of object-oriented design metrics. The model relates design properties such as encapsulation, modularity, coupling, and cohesion to high-level quality attributes such as reusability, flexibility, and complexity, using empirical and anecdotal information. The relationships, or links, from design properties to quality attributes are weighted in accordance with their influence and importance. The model is validated by comparing its results against empirical data and expert opinion on several large commercial object-oriented systems. A key attribute of the model is that it can be easily modified to include different relationships and weights, making it a practical quality assessment tool adaptable to a variety of demands.
Journal Article
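The structure of such a hierarchical model is easy to mock up: each quality attribute is computed as a weighted sum of normalized design-property scores, and the weight table is exactly the part the abstract says can be modified. All properties, attributes, weights, and scores below are invented for illustration and do not reproduce the published model.

```python
# Invented weight table linking design properties to quality attributes
# (illustrative values only; a published model would define its own).
WEIGHTS = {
    "reusability": {"cohesion": 0.4, "coupling": -0.3, "design_size": 0.3},
    "flexibility": {"encapsulation": 0.4, "polymorphism": 0.4, "coupling": -0.2},
    "complexity":  {"coupling": 0.5, "design_size": 0.3, "polymorphism": 0.2},
}

def assess(property_scores, weights=WEIGHTS):
    """Roll normalized design-property scores up into quality-attribute scores."""
    return {attribute: sum(w * property_scores.get(prop, 0.0)
                           for prop, w in links.items())
            for attribute, links in weights.items()}

# Normalized design-property scores for a hypothetical system (invented values,
# e.g. as produced by a metrics suite and scaled to [0, 1]).
scores = {"cohesion": 0.4, "coupling": 0.7, "design_size": 0.6,
          "encapsulation": 0.8, "polymorphism": 0.5}

for attribute, value in assess(scores).items():
    print(f"{attribute:12s} {value:+.2f}")

# Swapping in a different weight table is all it takes to adapt the assessment.
```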
Exception Handling Patterns for Process Modeling
2010
Process modeling allows for analysis and improvement of processes that coordinate multiple people and tools working together to carry out a task. Process modeling typically focuses on the normative process, that is, how the collaboration transpires when everything goes as desired. Unfortunately, real-world processes rarely proceed that smoothly. A more complete analysis of a process requires that the process model also include details about what to do when exceptional situations arise. We have found that, in many cases, there are abstract patterns that capture the relationship between exception handling tasks and the normative process. Just as object-oriented design patterns facilitate the development, documentation, and maintenance of object-oriented programs, we believe that process patterns can facilitate the development, documentation, and maintenance of process models. In this paper, we focus on the exception handling patterns that we have observed over many years of process modeling. We describe these patterns using three process modeling notations: UML 2.0 Activity Diagrams, BPMN, and Little-JIL. We present both the abstract structure of the pattern as well as examples of the pattern in use. We also provide some preliminary statistical survey data to support the claim that these patterns are found commonly in actual use and discuss the relative merits of the three notations with respect to their ability to represent these patterns.
Journal Article
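As a loose programming analogy to the process-level patterns discussed above, the sketch below separates a normative step from its exceptional paths and plugs in two hypothetical handler shapes, "retry the step" and "compensate and continue". The pattern names, steps, and data are invented; the paper itself expresses its patterns in UML 2.0 Activity Diagrams, BPMN, and Little-JIL rather than code.

```python
class StepFailed(Exception):
    """Raised when a normative process step cannot complete."""

def run_step(name, action, handler):
    """Execute one normative step; delegate exceptional situations to a handler."""
    try:
        return action()
    except StepFailed as exc:
        return handler(name, action, exc)

# Hypothetical pattern 1: retry the failed step a bounded number of times.
def retry_handler(retries=2):
    def handle(name, action, exc):
        for _ in range(retries):
            try:
                return action()
            except StepFailed:
                continue
        raise StepFailed(f"{name}: still failing after {retries} retries")
    return handle

# Hypothetical pattern 2: log, perform a compensating task, continue.
def compensate_handler(compensation):
    def handle(name, action, exc):
        print(f"[{name}] exceptional path: {exc}; compensating")
        return compensation()
    return handle

# A toy normative process: two steps, the second of which fails.
def collect_reviews():
    return "3 reviews collected"

def schedule_meeting():
    raise StepFailed("no common time slot")

print(run_step("collect reviews", collect_reviews, retry_handler()))
print(run_step("schedule meeting", schedule_meeting,
               compensate_handler(lambda: "meeting deferred to next cycle")))
```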