State-of-the-art scientific instruments and simulations routinely produce massive datasets requiring intensive processing to disclose key features of the artifact or model under study. Scientists commonly call these data-processing pipelines, which are structured according to the pipe-and-filter architecture pattern [1]. Different stages typically communicate using files; each stage is an executable...
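As a rough illustration of the pipe-and-filter style this excerpt describes, here is a minimal sketch, assuming hypothetical stage executables ("capture", "transform", "analyze") that each read an input file and write an output file; none of these names come from the paper itself:

```python
import subprocess
from pathlib import Path

# Pipe-and-filter sketch: each stage is a standalone executable invoked as
# "stage <input> <output>", and stages communicate only through files.
# The stage names below are hypothetical placeholders, not from the paper.
STAGES = ["capture", "transform", "analyze"]

def run_pipeline(input_file: str, workdir: str = ".") -> Path:
    current = Path(input_file)
    for stage in STAGES:
        output = Path(workdir) / f"{current.stem}.{stage}.out"
        subprocess.run([stage, str(current), str(output)], check=True)
        current = output  # this stage's output file feeds the next stage
    return current

if __name__ == "__main__":
    print(run_pipeline("raw_events.dat"))
```

Because each filter touches only its input and output files, stages can be developed, tested, and swapped independently.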
Systems biology is characterized by a large community of scientists who use a wide variety of fragmented and competing data sets and computational tools of all scales to support their research. In order to provide a more coherent computational environment for systems biology, we are working as part of the Department of Energy Systems Biology Knowledgebase (KBase) project to define a federated cloud-based...
Modern scientific enterprises are inherently knowledge-intensive. In general, scientific studies in domains such as geoscience, chemistry, physics and biology require the acquisition and manipulation of large amounts of experimental and field data in order to create inputs for large-scale computational simulations. The results of these simulations must then be analyzed, leading to refinements of inputs...
Independent, greedy collection of data events using simple heuristics results in massive over-sampling of the prominent data features in large-scale studies over what should be achievable through "intelligent", online acquisition of such data. As a result, data generated are more aptly described as a collection of a large number of small experiments rather than a true large-scale experiment...
Operating the electrical power grid to prevent power black-outs is a complex task. An important aspect of this is contingency analysis, which involves understanding and mitigating potential failures in power grid elements such as transmission lines. When taking into account the potential for multiple simultaneous failures (known as the N-x contingency problem), contingency analysis becomes a massively...
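To make the combinatorial growth behind the N-x problem concrete, a small worked sketch (the element names are hypothetical):

```python
from itertools import combinations
from math import comb

# The number of N-x failure combinations over n grid elements is C(n, x),
# which is why multi-failure contingency analysis demands massive parallelism.
elements = [f"line_{i}" for i in range(1, 6)]  # hypothetical element names

for x in (1, 2, 3):
    sample = next(iter(combinations(elements, x)))
    print(f"N-{x}: {comb(len(elements), x)} cases, e.g. {sample}")

# Even modest grids explode: 10,000 elements give C(10000, 2) = 49,995,000
# independent N-2 cases to screen.
print(comb(10_000, 2))
```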
Scientific applications are often structured as workflows that execute a series of interdependent, distributed software modules to analyze large data sets. The order of execution of the tasks in a workflow is commonly controlled by complex scripts, which over time become difficult to maintain and evolve. In this paper, we describe how we have integrated the Kepler scientific workflow platform with...
Enterprise level cyber security requires the deployment, operation, and monitoring of many sensors across geographically dispersed sites. Communicating with the sensors to gather data and control behavior is a challenging task when the number of sensors is rapidly growing. This paper describes the system requirements, design, and implementation of T3, the third generation of our transport software...
Architecture reviews are an effective way of ensuring design quality and addressing architectural concerns. However, the software engineering community rarely adopts the methods and techniques available to support disciplined architecture review processes.
Through the development of new classes of software, algorithms, and hardware, data-intensive applications provide timely and meaningful analytical results in response to exponentially growing data complexity and associated analysis requirements.
Systems biology research demands the availability of tools and technologies that span a comprehensive range of computational capabilities, including data management, transfer, processing, integration, and interpretation. To address these needs, we have created the bioinformatics resource manager (BRM), a scalable, flexible, and easy-to-use tool for biologists to undertake complex analyses. This paper...
Management of software architecture knowledge is vital for improving an organisation's architectural capabilities. Despite the recognition of the importance of capturing and reusing software architecture knowledge, there is currently no suitable support mechanism available. To address this issue, we have developed a conceptual framework for managing architecture design knowledge. A Web-based knowledge...
Video games have now existed in various forms for over 30 years, and have evolved from humble beginnings into remarkably complex software projects. The ever-present emphasis on an immersive audio/visual experience has put game developers in the position of being on the bleeding edge of exploring the performance of modern consumer hardware. This talk will discuss the elements...
Data intensive computing is concerned with creating scalable solutions for capturing, analyzing, managing and understanding multi-terabyte and petabyte data volumes. Such data volumes exist in a diverse range of application domains, including scientific research, bio-informatics, cyber security, social computing and commerce. Innovative hardware and software technologies to address these problems...
Building high performance analytical applications for data streams generated from sensors is a challenging software engineering problem. Such applications typically comprise a complex pipeline of processing components that capture, transform and analyze the incoming data stream. In addition, applications must provide high throughput, be scalable and easily modifiable so that new analytical components...
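A minimal sketch of such a pipeline, assuming plain in-process generator components rather than any particular streaming framework (all names are illustrative): new analytical stages can be added simply by extending the component list.

```python
from typing import Callable, Iterable, Iterator

# Each pipeline component is a generator transformation over an event
# stream, so components compose lazily and can be added or reordered
# without touching the others. All names here are illustrative.
Component = Callable[[Iterator[dict]], Iterator[dict]]

def transform(events: Iterator[dict]) -> Iterator[dict]:
    for e in events:
        e["value"] *= 2.0  # placeholder transformation step
        yield e

def analyze(events: Iterator[dict]) -> Iterator[dict]:
    total = 0.0
    for i, e in enumerate(events, 1):
        total += e["value"]
        e["running_mean"] = total / i  # simple online analysis
        yield e

def run(source: Iterable[dict], components: list[Component]) -> Iterator[dict]:
    stream: Iterator[dict] = iter(source)
    for component in components:
        stream = component(stream)  # compose stages lazily
    return stream

if __name__ == "__main__":
    sensor_events = ({"value": float(v)} for v in range(5))
    for event in run(sensor_events, [transform, analyze]):
        print(event)
```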
Having morphed their external personae to become almost mainstream, real programmers face a bright future. In this paper, the author discusses the evolution of the programming profession.
An enterprise service bus (ESB) is a standards-based integration platform that combines messaging, web services, data transformation, and intelligent routing in a highly distributed environment. The ESB has been adopted as a key component of SOA infrastructures. For SOA implementations with large numbers of users or services, or heavy traffic, maintaining the necessary performance levels of applications integrated...
Industry best practices are widely held but not necessarily empirically verified software engineering beliefs. Best practices can be documented in distributed web-based public repositories as pattern catalogues or practice libraries. There is a need to systematically index and organize these practices to enable their better practical use and scientific evaluation. In this paper, we propose a semi-automatic...
Performance and scalability are critical quality attributes for server applications in Internet-facing business systems. These applications operate in dynamic environments with rapidly fluctuating user loads and resource levels, and unpredictable system faults. Adaptive (autonomic) systems research aims to augment such server applications with intelligent control logic that can detect and react to...
Capturing the technical knowledge, contextual information, and rationale surrounding the design decisions underpinning system architectures can greatly improve the software development process. If not managed, this critical knowledge is implicitly embedded in the architecture, becoming tacit knowledge which erodes as personnel on the project change. Moreover, the unavailability of architecture knowledge...