Catalogue Search | MBRL
Search Results
Explore the vast range of titles available.
10,893 result(s) for "Extensible Markup Language"
Centralized cloud information accountability with bat key generation algorithm (CCIA-BKGA) framework in cloud computing environment
2019
In cloud storage, clients can outsource their data and enjoy on-demand, high-quality applications and services from a shared pool of configurable computing resources, without the burden of local data storage and maintenance. Nevertheless, protecting data integrity in cloud computing becomes a tedious task because the client does not have physical possession of the outsourced data, especially for users with restricted computing resources. An essential characteristic of cloud services is that clients' data are processed remotely on unknown machines over which the clients have no rights. To rectify this issue, a novel highly centralized cloud information accountability with bat key generation algorithm (CCIA-BKGA) framework is suggested to keep track of the actual usage of users' data in the cloud. It encloses a logging mechanism together with users' data and policies. Before the data are uploaded to the cloud server, the general information and sensitive information must first be converted into an Extensible Markup Language (XML) file format and saved in a file. After that, Java ARchive (JAR) label setting is carried out on the data owner's side, and the label attributes then function as a signature key value and a unique identifier. The JAR's programmable capability is used both to produce a dynamic and travelling object and to guarantee that any access to clients' data triggers authentication and automated logging local to the JARs. The CCIA-BKGA framework is platform-independent and highly decentralized; it does not require any dedicated authentication or storage infrastructure, and extensive experimental studies clarify the efficiency and effectiveness of the proposed CCIA-BKGA system compared with existing methodologies.
Journal Article
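The abstract above describes converting general and sensitive information into an XML file carrying a unique identifier and a signature key value before the JAR label is applied. Below is a minimal, hypothetical sketch of such a serialisation step; the element names, the SHA-256 stand-in for the signature key value, and the build_accountability_xml helper are illustrative assumptions, not the paper's actual CCIA-BKGA implementation.

```python
# Hypothetical sketch of the pre-packaging step described above: general and
# sensitive information is serialised to an XML file whose root carries a
# unique identifier and a signature key value, before JAR labelling.
# Element and attribute names are illustrative, not the paper's schema.
import hashlib
import uuid
import xml.etree.ElementTree as ET

def build_accountability_xml(general, sensitive, policy, key):
    root = ET.Element("ownerData", {
        "uniqueIdentifier": str(uuid.uuid4()),
        # A simple hash stands in for the signature key value.
        "signatureKeyValue": hashlib.sha256(key + policy.encode()).hexdigest(),
    })
    ET.SubElement(root, "accessPolicy").text = policy
    general_el = ET.SubElement(root, "generalInformation")
    for name, value in general.items():
        ET.SubElement(general_el, "item", {"name": name}).text = str(value)
    sensitive_el = ET.SubElement(root, "sensitiveInformation")
    for name, value in sensitive.items():
        ET.SubElement(sensitive_el, "item", {"name": name}).text = str(value)
    return ET.tostring(root, encoding="unicode")

print(build_accountability_xml({"department": "finance"}, {"salary": 90000},
                               policy="log-every-access", key=b"owner-secret"))
```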
An exponent based error detection mechanism against DXDOS attack for improving the security in cloud
2019
Providing security to the Cloud against harmful attacks is essential nowadays, because many attacks aim to degrade the performance of data transmission in the Cloud. In particular, the extensible markup language denial-of-service (XML-DoS) attack causes severe damage to the Cloud by misusing protocols to inject attack packets and disrupt protocol handlers. This type of attack must therefore be detected to enable reliable and secure service delivery in the Cloud. For this purpose, traditional works developed various attack detection mechanisms to identify and block XML-DoS attacks, but they suffer from increased computation overhead, reduced detection accuracy, and inefficient classification. To solve these issues, this paper develops a new attack detection framework based on the XML schema. The stages involved in this work are pattern validation, traffic extraction, error classification, and IP traceback. At first, individual users transmit packets to the routers, and packet marking is performed based on the router IP. After that, the packet patterns are validated; if a pattern is valid, the distinct IP is counted for IP matching. Consequently, the time-sequence Tsallis entropy, source-IP Tsallis entropy, and Lyapunov exponent are estimated. Based on the estimated exponent value, the error is classified as chaotic or non-chaotic. Finally, the bee colony algorithm is implemented to perform IP traceback, which takes the appropriate decision to block the attacker's packets at the particular server. In experiments, the performance of the proposed method is evaluated using various performance measures, and its superiority is demonstrated by comparison with existing techniques.
Journal Article
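The detection stage above combines Tsallis entropy of the traffic (time sequence and source IP) with a Lyapunov exponent estimate. As a rough illustration of one of these features, the sketch below computes the Tsallis entropy S_q = (1 - Σ p_i^q) / (q - 1) over a window of observed source IPs; the window contents, the entropic index q, and the interpretation are illustrative assumptions, not values or thresholds from the paper.

```python
# Rough sketch: Tsallis entropy of the source-IP distribution in one traffic
# window, one of the features the abstract above uses for XML-DoS detection.
# The sample window and the entropic index q are illustrative assumptions.
from collections import Counter

def tsallis_entropy(samples, q=1.5):
    counts = Counter(samples)
    total = sum(counts.values())
    probabilities = [c / total for c in counts.values()]
    return (1.0 - sum(p ** q for p in probabilities)) / (q - 1.0)

window = ["10.0.0.1"] * 950 + ["10.0.0.%d" % i for i in range(2, 52)]
print("source-IP Tsallis entropy: %.4f" % tsallis_entropy(window))
# A sharp change in this entropy between windows is the kind of signal
# that would feed the chaotic / non-chaotic classification step.
```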
Interoperability between Building Information Modelling (BIM) and Building Energy Model (BEM)
by Del Valle de Lersundi, Kattalin; Fernández Bandera, Carlos; Bastos Porsani, Gabriela
in Architecture, Automation, Building construction
2021
Building information modelling (BIM) is the first step towards the implementation of the industrial revolution 4.0, in which virtual reality and digital twins are key elements. At present, buildings are responsible for 40% of the energy consumption in Europe and, so, there is a growing interest in reducing their energy use. In this context, proper interoperability between BIM and building energy model (BEM) is paramount for integrating the digital world into the construction sector and, therefore, increasing competitiveness by saving costs. This paper evaluates whether there is an automated or semi-automated BIM to BEM workflow that could improve the building design process. For this purpose, a residential building and a warehouse were modelled using the same BIM authoring tool (Revit), and two open schemas were used: green building extensible markup language (gbXML) and industry foundation classes (IFC). These transfer files were imported into software compatible with the EnergyPlus engine (Design Builder, Open Studio, and CYPETHERM HE), in which simulations were performed. Our results showed that the energy models were built up to 7.50% smaller than in the BIM and with missing elements in their thermal envelope. Nevertheless, the materials were properly transferred to the gbXML and IFC formats. Moreover, the simulation results revealed a huge difference in values between the models generated by the two open schemas, in the range of 6 to 900 times. Overall, we conclude that there exists a semi-automated workflow from BIM to BEM, but it does not work well for big and complex buildings, as they present major problems when creating the energy model. Furthermore, most of the issues encountered in BEM were errors in the transfer of BIM data to the gbXML and IFC files. Therefore, we emphasise the need for developers to improve compatibility between BIM and model exchange formats, in order to promote BIM–BEM interoperability.
Journal Article
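For readers unfamiliar with the gbXML transfer files mentioned above, the sketch below totals the space areas reported in a gbXML export, a quick way to spot the kind of geometry shrinkage the authors report. The file name and the assumption that each Space element carries an Area child are illustrative; this is not the authors' comparison workflow.

```python
# Illustrative sketch: total the Space areas reported in a gbXML export,
# e.g. to compare against the corresponding areas in the BIM model.
# Assumes each Space carries an Area child, which depends on the exporter.
import xml.etree.ElementTree as ET

GBXML_NS = {"gb": "http://www.gbxml.org/schema"}  # standard gbXML namespace

def total_space_area(path):
    total = 0.0
    for space in ET.parse(path).getroot().iter("{http://www.gbxml.org/schema}Space"):
        area = space.find("gb:Area", GBXML_NS)
        if area is not None and area.text:
            total += float(area.text)
    return total

print("total space area:", total_space_area("warehouse_export.xml"))
```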
BD5: An open HDF5-based data format to represent quantitative biological dynamics data
by Ho, Kenneth H. L.; Tohsato, Yukako; Onami, Shuichi
in Binary data, Biological research, Biology
2020
BD5 is a new binary data format based on HDF5 (hierarchical data format version 5). It can be used for representing quantitative biological dynamics data obtained from bioimage informatics techniques and mechanobiological simulations. Biological Dynamics Markup Language (BDML) is an XML (Extensible Markup Language)-based open format that is also used to represent such data; however, it becomes difficult to access quantitative data in BDML files when the file size is large because parsing XML-based files requires large computational resources to first read the whole file sequentially into computer memory. BD5 enables fast random (i.e., direct) access to quantitative data on disk without parsing the entire file. Therefore, it allows practical reuse of data for understanding biological mechanisms underlying the dynamics.
Journal Article
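The random-access advantage described above is straightforward to see with the h5py library: a single time point is read directly from disk, whereas an XML/BDML file of the same data would have to be parsed in full first. The file name and the dataset path below are placeholders, not the actual BD5 layout.

```python
# Illustration of the random (direct) access property discussed above: with
# HDF5, only the requested dataset or slice is read from disk, whereas an
# XML/BDML file would have to be parsed sequentially in full.
# "tracking/t0500/objects" is a placeholder path, not the BD5 layout.
import h5py

with h5py.File("dynamics.h5", "r") as f:
    objects_at_t500 = f["tracking/t0500/objects"][:]   # reads only this dataset
    first_ten = f["tracking/t0500/objects"][:10]       # or just a partial slice
    print(objects_at_t500.shape, first_ten.shape)
```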
Mapping of extensible markup language-to-ontology representation for effective data integration
by Kusumo, Dana Sulistyo; Naveen, Palanichamy; Haw, Su-Cheng
in Complexity, Data exchange, Data integration
2023
Extensible markup language (XML) is well known as the standard for data exchange over the Internet. It is flexible and highly expressive in representing relationships between the stored data, yet the structural complexity and the semantic relationships are not well expressed. Ontology, on the other hand, models structural, semantic, and domain knowledge effectively, and combining ontology with visualization lets users take a closer view based on their respective requirements. In this paper, we propose several mapping rules for transforming XML into an ontology representation. Subsequently, we show how the ontology is constructed based on the proposed rules, using the sample domain ontology from the University of Wisconsin-Milwaukee (UWM) and the Mondial datasets.
We also examine the schemas, query workload, and evaluation to derive extended knowledge from the existing ontology. The correctness of the ontology representation has proven effective in supporting various types of complex queries in the SPARQL Protocol and RDF Query Language (SPARQL).
Journal Article
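A hypothetical miniature of the kind of XML-to-ontology mapping proposed above: elements become OWL classes and individuals, element nesting becomes an object property, and the result is queried with SPARQL via rdflib. The mapping rules, names, and data are simplified assumptions, not the paper's rules or the UWM/Mondial schemas.

```python
# Hypothetical miniature of an XML-to-ontology mapping: elements become
# classes/individuals, nesting becomes an object property, and the graph
# is then queried with SPARQL. Not the paper's actual mapping rules.
import xml.etree.ElementTree as ET
from rdflib import Graph, Literal, Namespace, OWL, RDF, RDFS, URIRef

EX = Namespace("http://example.org/univ#")
xml_doc = """
<department name="ComputerScience">
  <course title="Databases"/>
  <course title="SemanticWeb"/>
</department>
"""

g = Graph()
g.bind("ex", EX)
g.add((EX.Department, RDF.type, OWL.Class))
g.add((EX.Course, RDF.type, OWL.Class))
g.add((EX.offersCourse, RDF.type, OWL.ObjectProperty))

dept_el = ET.fromstring(xml_doc)
dept = URIRef(EX[dept_el.get("name")])
g.add((dept, RDF.type, EX.Department))
for i, course_el in enumerate(dept_el.findall("course")):
    course = URIRef(EX["course%d" % i])
    g.add((course, RDF.type, EX.Course))
    g.add((course, RDFS.label, Literal(course_el.get("title"))))
    g.add((dept, EX.offersCourse, course))   # nesting -> object property

query = """
PREFIX ex: <http://example.org/univ#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?title WHERE { ?d ex:offersCourse ?c . ?c rdfs:label ?title . }
"""
for row in g.query(query):
    print(row.title)
```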
Analysis and Design of Intelligent Management and Control System for Relay Protection Based on Information Fusion Technology
by Liu, Yang; Li, Lisheng; Zhu, Wenqiang
in Control systems design, Data integration, Document markup languages
2020
Addressing the current problems of the large number of specialised relay protection systems and their inability to share data collaboratively, this paper analyzes the current application of relay protection information systems. Through analysis of four key relay protection systems, including the OCS (Operation Control System), the Protection and Information System, the Wave Recording System, and the Traveling Wave System, the core technical parameters are sorted out and an XML-based (Extensible Markup Language) data specification and model fusion method is proposed. The paper introduces the overall architecture of the information-fusion relay protection intelligent management and control system and designs its application functions, realizing information fusion and intercommunication among multi-source heterogeneous systems to improve the power grid's coordinated control and emergency response capabilities.
Journal Article
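As a hedged illustration of the direction described above, the sketch below fuses records from two hypothetical relay protection source systems into one XML document keyed by device identifier. All tag and field names are invented; they are not the paper's data specification or model fusion method.

```python
# Illustrative only: fuse records from two hypothetical relay-protection
# source systems into one XML document keyed by device ID, in the spirit of
# the XML data specification and model fusion described above.
# Tag and field names are invented, not the paper's specification.
import xml.etree.ElementTree as ET

protection_records = {"PT-0153": {"settingGroup": "2", "tripCount": "7"}}
wave_records = {"PT-0153": {"lastRecord": "2020-03-14T02:11:09", "samples": "4096"}}

root = ET.Element("relayProtectionFusion")
for device_id in sorted(set(protection_records) | set(wave_records)):
    device = ET.SubElement(root, "device", {"id": device_id})
    protection_el = ET.SubElement(device, "protectionInfo")
    for tag, value in protection_records.get(device_id, {}).items():
        ET.SubElement(protection_el, tag).text = value
    wave_el = ET.SubElement(device, "waveRecording")
    for tag, value in wave_records.get(device_id, {}).items():
        ET.SubElement(wave_el, tag).text = value

print(ET.tostring(root, encoding="unicode"))
```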
Big-data: transformation from heterogeneous data to semantically-enriched simplified data
by Ahmad, Tauqir; Jabbar, Sohail; Khalid, Shehzad
in Big Data, Complexity, Computer Communication Networks
2016
In big data, data originates in real time from many distributed and heterogeneous sources in the form of audio, video, text, and sound, which makes it too massive and complex for traditional systems to handle. Data therefore needs to be represented in a semantically enriched form for better utilization, while keeping it simple is essential. Such a representation is possible using the Resource Description Framework (RDF) introduced by the World Wide Web Consortium (W3C). However, bringing and transforming rapidly growing data from different sources and formats into RDF is still an open issue. It requires improvements that cover the transfer of information among all applications while introducing simplicity to reduce the complexity of storing data. In this study, we work on big data representation by transforming data first into Extensible Markup Language (XML) and then into RDF triples linked in real time, with the aim of making the transformation more data friendly. We develop a process that translates data without any loss of information. This requires managing data and metadata in a way that does not increase complexity and keeps a strong linkage between them; the metadata is kept general so that it remains useful rather than being dedicated to a specific type of data source. The paper includes a model explaining the functionality of the process and the corresponding algorithms describing how it is implemented. A case study shows the transformation of relational database textual data into RDF, and the results are discussed at the end.
Journal Article
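The two-step path sketched in the abstract (source data to XML, then XML to RDF triples) might look roughly like the following for a single relational row; the table, column names, and URI scheme are invented for illustration and do not claim to match the authors' model or algorithms.

```python
# Rough illustration of the relational -> XML -> RDF path described above.
# Table, column names, and URI scheme are invented for illustration only.
import xml.etree.ElementTree as ET
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/data#")
row = {"id": "42", "name": "Ada Lovelace", "city": "London"}   # one relational row

# Step 1: relational row -> XML record
record = ET.Element("person", {"id": row["id"]})
for column, value in row.items():
    if column != "id":
        ET.SubElement(record, column).text = value
xml_text = ET.tostring(record, encoding="unicode")

# Step 2: XML record -> RDF triples
g = Graph()
subject = EX["person/" + record.get("id")]
g.add((subject, RDF.type, EX.Person))
for child in ET.fromstring(xml_text):
    g.add((subject, EX[child.tag], Literal(child.text)))

print(g.serialize(format="turtle"))
```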
A review of attacks, objects, and mitigations on web services
by Masrur Rauf, Rafif; Amiruddin, Amiruddin; Eiffel Rivaldo, Tubagus
in Computer networks, Cybersecurity, Denial of service attacks
2020
Web services are a technology that continues to develop to this day. They are needed for exchanging or disseminating information between applications over a computer network. Various methods have been developed to improve the performance and security of web services; however, as their performance increases, so does the level of vulnerabilities and threats. Attacks that often target web services include Denial of Service (DoS), Structured Query Language (SQL) injection, and eXtensible Markup Language (XML) injection. Researchers have also proposed many methods and techniques to mitigate these attacks. In this research, a literature review has been carried out on attacks on web services, their targets, and the mitigations against them.
Journal Article
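One concrete mitigation in the XML-based DoS family surveyed above is to reject documents that declare entities (the "billion laughs" pattern) before they are expanded. The sketch below uses the third-party defusedxml package for this; it is an editor-chosen hardening example, not a mitigation attributed to the paper.

```python
# Generic hardening example against XML entity-expansion ("billion laughs")
# DoS payloads: defusedxml rejects documents that declare entities instead
# of expanding them. Editor-chosen illustration, not from the paper.
import defusedxml.ElementTree as SafeET
from defusedxml import EntitiesForbidden

payload = """<?xml version="1.0"?>
<!DOCTYPE bomb [
  <!ENTITY a "aaaaaaaaaa">
  <!ENTITY b "&a;&a;&a;&a;&a;&a;&a;&a;&a;&a;">
]>
<request>&b;</request>"""

try:
    SafeET.fromstring(payload)
except EntitiesForbidden:
    print("rejected: XML entity declarations are not allowed")
```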
Transforming XML schemas into OWL ontologies using formal concept analysis
by Cruz, Christophe; Nait-Bahloul, Safia; Hacherouf, Mokhtaria
in Compilers, Computer Science, Extensible Markup Language
2019
Ontology Web Language (OWL) is considered a data representation format that builds on the Extensible Markup Language (XML) format; OWL extends XML by providing properties that further express the semantics of data. To this effect, transforming XML data into OWL proves important and constitutes an added value for indexing XML documents and re-engineering ontologies. In this paper, we propose a formal method to transform XSD schemas into OWL schemas using transformation patterns. To achieve this end, we first extend a set of existing transformation patterns to allow the maximum transformation of XSD schema constructs. In addition, a formal method, named PIXCO, is presented to transform an XSD schema using the extended patterns; it comprises several processes. The first process models both the transformation patterns and all the constructs of the XSD schema to be transformed: the patterns are modeled using the context of Formal Concept Analysis, and the XSD constructs are modeled using a proposed mathematical model. This modeling is used in the design of the following processes. The second process identifies the most appropriate patterns to transform each construct set of the XSD schema. The third process generates an OWL model for each XSD construct set according to the identified pattern. Finally, the OWL file encompassing the generated OWL models is created.
Journal Article
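As a toy illustration of the transformation direction discussed above (not of the PIXCO patterns or the Formal Concept Analysis machinery), the sketch below turns each named xs:complexType of a small schema into an owl:Class; the schema snippet and namespace are invented.

```python
# Toy illustration of the XSD -> OWL direction discussed above: each named
# xs:complexType becomes an owl:Class. This does not reproduce the PIXCO
# patterns or the Formal Concept Analysis used in the paper.
import xml.etree.ElementTree as ET
from rdflib import Graph, Literal, Namespace, OWL, RDF, RDFS

XS = "{http://www.w3.org/2001/XMLSchema}"
EX = Namespace("http://example.org/schema#")

xsd_snippet = """
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:complexType name="Book"/>
  <xs:complexType name="Author"/>
</xs:schema>
"""

g = Graph()
g.bind("ex", EX)
for complex_type in ET.fromstring(xsd_snippet).iter(XS + "complexType"):
    name = complex_type.get("name")
    if name:
        g.add((EX[name], RDF.type, OWL.Class))
        g.add((EX[name], RDFS.label, Literal(name)))

print(g.serialize(format="turtle"))
```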
A novel ISO 6983 interpreter for open architecture CNC systems
2015
Computer numerical control (CNC) technology is a key technology in machine tools and is also the basis of industrial unit computerization. CNC machines are operated by controllers, which contain a software module known as the interpreter. The function of an interpreter is to extract data from computer-aided manufacturing (CAM) system-generated code and convert it into controller motion commands. However, with the development of numerical control technology, existing CNC systems are limited by interpreters lacking expandability, modularity, and openness. To overcome these problems, open architecture control (OAC) technology was employed in the CNC controller. In this paper, a new technique is presented for the interpretation of the International Organization for Standardization (ISO) 6983 data interface model. The proposed technique is able to interpret ISO 6983 data and translate it into the internal structure required by the CNC machine. It takes an input file in text format and extracts position, feed rate, tool, spindle, and other data. It is also able to generate output as text and eXtensible Markup Language (XML) files according to a user-defined file structure. The use of .txt and .xml files enables shop-floor data modification and internet accessibility facilities in the CNC system. The paper includes an introduction, a brief description of the architecture, the algorithm design, the operational pattern, and validation of the system through experimental studies. The results of these experiments illustrate satisfactory performance of the interpreter with an OAC CNC system.
Journal Article
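A much-reduced sketch of the interpretation step described above: one line of ISO 6983 (G-code) text is split into address/value words and re-emitted as XML. The tag names and single-block scope are illustrative assumptions; the paper's interpreter handles full part programs and the controller's internal structures.

```python
# Much-reduced sketch of the interpretation step described above: one ISO 6983
# (G-code) block is split into address/value words and re-emitted as XML.
# Tag names are illustrative, not the paper's user-defined file structure.
import re
import xml.etree.ElementTree as ET

ADDRESSES = {"G": "preparatory", "X": "x", "Y": "y", "Z": "z",
             "F": "feedRate", "S": "spindleSpeed", "T": "tool", "M": "misc"}

def block_to_xml(block):
    element = ET.Element("block", {"source": block.strip()})
    for address, value in re.findall(r"([A-Z])\s*(-?\d+\.?\d*)", block.upper()):
        tag = ADDRESSES.get(address, address.lower())
        ET.SubElement(element, tag).text = value
    return ET.tostring(element, encoding="unicode")

print(block_to_xml("N10 G01 X12.5 Y-3.0 F150 S1200 T2 M03"))
```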