The paper presents a number of research challenges for storing medical data in Health Information Systems (HIS), such as complex-data modeling features, advanced classification structures, and integration of very complex data, and demonstrates how this area may benefit from the functionality offered by data warehousing. In addition, a case study is presented that configures a data warehouse developed...
Memcached is a widely used in-memory caching solution in large-scale searching scenarios. The most pivotal performance metric in Memcached is latency, which is affected by various factors including the workload pattern, the service rate, the unbalanced load distribution, and the cache miss ratio. To quantify the impact of each factor on latency, we establish a theoretical model for the Memcached...
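The abstract truncates before giving the paper's model; purely as an illustration of how the service rate and miss ratio interact, the expected latency of a cache front-ending a slower backend can be sketched with an M/M/1 approximation. All function and parameter names below are assumptions for this sketch, not from the paper:

```python
def mm1_latency(arrival_rate, service_rate):
    # M/M/1 mean response time W = 1 / (mu - lambda); requires lambda < mu
    if arrival_rate >= service_rate:
        raise ValueError("unstable system: arrival rate >= service rate")
    return 1.0 / (service_rate - arrival_rate)

def expected_latency(arrival_rate, cache_rate, backend_rate, miss_ratio):
    # Every request pays the cache lookup; a miss additionally pays the backend,
    # which only sees the miss traffic.
    cache_t = mm1_latency(arrival_rate, cache_rate)
    backend_t = mm1_latency(arrival_rate * miss_ratio, backend_rate)
    return cache_t + miss_ratio * backend_t
```

Even this toy model reproduces the qualitative effect the abstract names: latency grows both with load (as the arrival rate approaches the service rate) and with the cache miss ratio.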
Femtocells are deployed to provide good indoor coverage and to offload data traffic from macrocell networks. Unnecessary handoffs (HOs), ping-pong effects, and cell utilization are important performance metrics for evaluating the quality of connections and data offloading in femtocell networks. Though significant research has been conducted on HO decision algorithms to reduce unnecessary HOs and ping-pong...
The field of software vulnerability research reaches a significant milestone when it matures to the point of warranting related security patterns represented by intelligent data. A substantial body of empirical findings, a distinctive taxonomy, theoretical models, and a set of novel or adapted detection methods justify a unifying research map. The growing interest in software vulnerability is...
The quality prediction performance of the recently standardized parametric P.1203 models is analyzed for real streaming services: YouTube, Vimeo, Amazon Instant Video, and a proprietary DASH-based streaming framework. In particular, a validation database comprising bitstream traces from the aforementioned services is used to evaluate the performance of the P.1203 (mode 0 and mode 1) models. It is understood from...
The architecture of database servers is an important factor in the performance of web applications. In this paper, a model is proposed for guiding PostgreSQL database server sizing for concurrent users. Coloured Petri nets are chosen to represent the model, which brings out the need to change the deployment architecture when the current architecture may not suffice. The focus of the proposed model...
Technology management data refers to all the data produced in the process of research, and the literature on it has witnessed rapid development. However, it has not yet been mapped and visualized on a global scale. To identify the state and trends of technology management data, the paper uses CiteSpace to conduct a series of analyses, including the distribution of core authors, journals, countries...
Open source projects and the globalization of the software industry have been a driving force in the reuse of system components across traditional system boundaries. As a result, vulnerabilities and security concerns no longer impact only individual systems but now also global software ecosystems. Known vulnerabilities and security concerns are reported in specialized vulnerability databases, which often...
Component Analysis (CA) comprises statistical techniques that decompose signals into appropriate latent components relevant to the task at hand (e.g., clustering, segmentation, classification). Recently, an explosion of research in CA has been witnessed, with several novel probabilistic models proposed (e.g., Probabilistic Principal CA, Probabilistic Linear Discriminant Analysis (PLDA), Probabilistic...
Creating the content available on social media platforms such as Twitter is easy and creative. Such information has become strategically important for companies interested in obtaining feedback on their products, for brand endorsements, merchandising, etc. By analyzing content from social media such as Twitter, companies can decide their target groups and select influential users from these platforms...
Mobile phones are widely used in our day-to-day life. They are used not only by ordinary people but also by antisocial elements, so it is no surprise that today, in almost every case, the first step towards solving a crime is to analyze the call records of the suspects. Today, in almost all criminal cases, analysis of a suspect's mobile phone calls plays an important role in the investigation...
In this paper, we propose a novel technique that integrates morphological analysis of the ECG signal with a Markov model to derive four major subclasses of ventricular arrhythmia in real time. The subclasses are: ventricular tachycardia, ventricular fibrillation, ventricular flutter, and premature ventricular complex. Markov models have been trained using the MIT-BIH ventricular arrhythmia database. Parameters...
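The abstract does not give the model's parameters or feature alphabet; the general pattern of classifying a symbolized signal with per-class first-order Markov models can be sketched as follows. The two-symbol alphabet, Laplace smoothing, and all names here are illustrative assumptions, not the paper's method:

```python
import math

def train_transitions(sequences, n_symbols, alpha=1.0):
    # Laplace-smoothed first-order transition probabilities from symbol sequences
    counts = [[alpha] * n_symbols for _ in range(n_symbols)]
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    return [[c / sum(row) for c in row] for row in counts]

def log_likelihood(seq, trans):
    # Log-probability of the observed transitions under one class model
    return sum(math.log(trans[a][b]) for a, b in zip(seq, seq[1:]))

def classify(seq, models):
    # models: dict mapping class name -> transition matrix; pick the best fit
    return max(models, key=lambda name: log_likelihood(seq, models[name]))
```

One model is trained per arrhythmia subclass, and an incoming sequence is assigned to the class whose model gives it the highest log-likelihood.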
Entity-Relationship (ER) models are commonly used to design database applications in industry. Therefore, ER models are taught in database courses at many schools offering tertiary education. However, teaching ER modeling effectively is a challenge for instructors. The research presented in this paper is based on our previous work, which identified frequent modeling errors. This paper identifies...
Program execution traces (simply "traces" in the rest of this paper) that include data/control dependency information are indispensable for new kinds of debugging, such as back-in-time debugging. We aim to support debugging of Java programs. Traces of practical programs tend to contain vast amounts of complex data, which makes it difficult to develop practical debuggers that use them. In our previous...
Frequent sequence mining methods often make use of constraints to control which subsequences should be mined, e.g., length, gap, span, regular-expression, and hierarchy constraints. We show that many subsequence constraints—including and beyond those considered in the literature—can be unified in a single framework. In more detail, we propose a set of simple and intuitive "pattern expressions"...
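The abstract truncates before defining the pattern-expression syntax; as one concrete instance of the constraint families it lists, a maximum-gap subsequence check can be sketched. The function and parameter names are assumptions for illustration, not the paper's framework:

```python
def matches_with_gap(sequence, pattern, max_gap):
    # Does `pattern` occur as a subsequence of `sequence` with at most
    # `max_gap` items skipped between consecutive matched pattern items?
    def search(start, p_idx):
        if p_idx == len(pattern):
            return True
        # The first pattern item may start anywhere; later items are gap-bounded
        limit = len(sequence) if p_idx == 0 else min(len(sequence), start + max_gap + 1)
        for i in range(start, limit):
            if sequence[i] == pattern[p_idx] and search(i + 1, p_idx + 1):
                return True
        return False
    return search(0, 0)
```

Length, span, and regular-expression constraints can be expressed with the same predicate-over-embeddings shape, which is what makes a single unifying framework plausible.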
SQL injection is the most common web application vulnerability. The vulnerability can be introduced unintentionally by software developers during the development phase. To ensure that all secure coding practices are adopted to prevent the vulnerability, a framework for SQL injection prevention using a compiler platform and machine learning is proposed. The machine learning part will be described primarily...
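The framework itself is compiler- and ML-based, but the secure coding practice it would enforce is query parameterization. A minimal sketch with Python's sqlite3 module (the schema and data are illustrative):

```python
import sqlite3

def find_user(conn, username):
    # Bound parameter: the input is passed as data and never spliced
    # into the SQL text, so injection payloads match nothing.
    cur = conn.execute("SELECT id, name FROM users WHERE name = ?", (username,))
    return cur.fetchall()

# Illustrative in-memory database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES ('alice')")

# The injectable counterpart, for contrast, would be string formatting:
#   conn.execute("SELECT id, name FROM users WHERE name = '%s'" % username)
```

A classic payload such as `' OR '1'='1` simply fails to match any row when bound as a parameter, instead of rewriting the query's logic.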
Prioritizing a database of items in response to a given query object is a fundamental task in information retrieval and machine learning. We examine a specific realization of this problem in the context of a collection of biomedical articles. Given a query PubMed article, we investigate the problem of identifying and ranking recommended papers that are topically related to the query article. The two...
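The abstract truncates before naming the two approaches it compares; as a common baseline for topical relatedness between articles, a TF-IDF cosine ranking can be sketched with the standard library alone. All names here are assumptions for this sketch:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    # docs: list of token lists; returns one {term: tf-idf weight} dict per doc
    n = len(docs)
    df = Counter(t for doc in docs for t in set(doc))
    return [
        {t: c * math.log(n / df[t]) for t, c in Counter(doc).items()}
        for doc in docs
    ]

def cosine(u, v):
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def rank(query_idx, vecs):
    # Order the other documents by similarity to the query document
    sims = [(cosine(vecs[query_idx], v), i)
            for i, v in enumerate(vecs) if i != query_idx]
    return [i for _, i in sorted(sims, reverse=True)]
```

In practice a PubMed recommender would use richer signals (MeSH terms, citations), but this shows the query-article-against-collection shape of the task.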
The analysis of clinical pathways from event logs provides new insights about care processes. In this paper, we propose a new methodology to automatically perform simulation analysis of patients' clinical pathways based on a national hospital database. Process mining is used to build highly representative causal nets, which are then converted to state charts in order to be executed. A joint multi-agent...
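The methodology's details are beyond the abstract, but the usual first step toward a causal net in process mining is extracting the directly-follows relation from the event log. A minimal sketch (activity names are illustrative, not from the paper's database):

```python
from collections import defaultdict

def directly_follows(log):
    # log: list of traces, each a list of activity names in temporal order.
    # Returns counts of how often activity a is immediately followed by b.
    dfg = defaultdict(int)
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dict(dfg)
```

Discovery algorithms then derive causal dependencies from these counts before the model is converted into an executable form such as a state chart.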
Modular construction has been a widely used method for industrial construction in Alberta. Heavy piperack modules are prefabricated and assembled offsite and transported to site for installation, which minimizes the impact of Alberta's harsh weather and improves efficiency. Such projects are large in scale, ranging from hundreds of modules to thousands; because of this, project planning often requires...
Simulation-based acquisition (SBA) is a robust, collaborative use of modeling and simulation (M&S) technologies that are integrated across acquisition phases and programs. Our research goal is to quantitatively show the benefits of M&S in SBA. To that end, we should consider costs arising from the use of M&S in SBA, e.g., development costs related to M&S. This paper presents a simulation...