Incorporating various specialty libraries written in different programming languages (FORTRAN and C/C++) into the main body of the source code remains a major challenge for developing scientific and engineering application software packages. The main difficulty originates from the fact that Fortran 90/95 pointers and C/C++ pointers are structurally different. In this paper, we present a technique...
Understanding the temporal evolution of features of interest requires the ability to: (i) extract features from each snapshot; (ii) correlate them over time; and (iii) understand the resulting tracking graph. This paper provides new solutions for the last two challenges in the context of large-scale turbulent combustion simulations. In particular, we present a simple and general algorithm to correlate...
The value of integrated metrology (IM) in high-volume manufacturing (HVM) has been a topic of discussion for over a decade. This publication finally brings HVM data to take a fresh look at the value of IM. A detailed analysis was done using a very large amount of HVM data (2x production node) in order to quantify both the cycle-time advantage and the on-product overlay improvement benefits of IM. A significant...
Super-resolution localization microscopy (SRLM) techniques such as STORM and PALM overcome the ∼200 nm diffraction limit of conventional light microscopy by randomly activating separate fluorophores over time and computationally aggregating their nanometer resolution detected locations for image reconstruction. However, a basic limitation of current SRLM approaches for live cell imaging is their low...
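As a rough illustration of the aggregation step described above (a generic sketch, not the method of any paper listed here), a super-resolution image can be rendered by binning the accumulated localization coordinates into a fine 2D histogram; the function name, field of view, and pixel size below are assumptions made for the example.

```python
import numpy as np

def render_srlm_image(localizations_nm, fov_nm=20000.0, pixel_nm=10.0):
    """Render a super-resolution image from detected fluorophore locations.

    localizations_nm: (N, 2) array of positions in nm, accumulated over the
    whole time series of activation frames. The field-of-view and pixel
    size are illustrative values, not taken from the abstracts above.
    """
    n_bins = int(fov_nm / pixel_nm)
    image, _, _ = np.histogram2d(
        localizations_nm[:, 0], localizations_nm[:, 1],
        bins=n_bins, range=[[0, fov_nm], [0, fov_nm]],
    )
    return image  # pixel value = number of localizations falling in that bin

# Example: 100,000 simulated localizations over a 20 µm field of view
locs = np.random.uniform(0, 20000, size=(100_000, 2))
sr_image = render_srlm_image(locs)
```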
Super-resolution localization microscopy (SRLM) is a new imaging modality that is capable of resolving cellular structures at nanometer resolution, providing unprecedented insight into biological processes. Each SRLM image is reconstructed from a time series of images of randomly activated fluorophores that are localized at nanometer resolution and represented by clusters of particles of varying spatial...
Localization-based super-resolution techniques are revolutionizing biological research by breaking the diffraction limit of fluorescence microscopy. Each super-resolution image is reconstructed from a time series of images of randomly activated fluorophores. Here, a fundamental question is to determine the minimal imaging length so that the reconstructed image faithfully reflects the biological structures...
We propose an automated algorithm for segmentation of mitochondria from widefield fluorescence microscopy images for quantitative morphology characterization. Mitochondria are membrane-bound organelles that are essential to cells of higher living organisms. Reliable and precise quantitative characterization of their shape is crucial to understanding related physiology and disease mechanisms. Building...
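For orientation only, a bare-bones segmentation pipeline of the kind this abstract refers to (thresholding plus simple morphological cleanup, not the algorithm proposed in the paper) might look like the sketch below, assuming scikit-image is available and the input is a 2D widefield fluorescence frame; the minimum-area cutoff is an invented parameter.

```python
import numpy as np
from skimage import filters, measure, morphology

def segment_mitochondria(image, min_area_px=50):
    """Toy segmentation: Otsu threshold, small-object removal, labeling.

    `image` is assumed to be a 2D fluorescence intensity array; the
    min_area_px cutoff is illustrative, not a value from the paper.
    """
    mask = image > filters.threshold_otsu(image)
    mask = morphology.remove_small_objects(mask, min_size=min_area_px)
    labels = measure.label(mask)
    # Basic shape descriptors for quantitative morphology characterization
    props = measure.regionprops_table(
        labels, properties=("area", "eccentricity", "perimeter")
    )
    return labels, props
```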
The size and scope of cutting-edge scientific simulations are growing much faster than the I/O and storage capabilities of their runtime environments. The growing gap gets exacerbated by exploratory data-intensive analytics, such as querying simulation data for regions of interest with multivariate, spatio-temporal constraints. Query-driven data exploration induces heterogeneous access...
The size and scope of cutting-edge scientific simulations are growing much faster than the I/O subsystems of their runtime environments, not only making I/O the primary bottleneck, but also consuming space that pushes the storage capacities of many computing facilities. These problems are exacerbated by the need to perform data-intensive analytics applications, such as querying the dataset by variable...
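As a concrete, deliberately simplified example of the query-driven exploration these abstracts describe, a region-of-interest query with multivariate constraints reduces to a boolean mask over the simulation variables; the variable names and thresholds below are invented for illustration.

```python
import numpy as np

# Invented stand-ins for two simulation variables on the same 3D grid
temperature = np.random.rand(128, 128, 128) * 2000.0   # K
pressure = np.random.rand(128, 128, 128) * 10.0        # atm

# Multivariate region-of-interest query: hot, low-pressure cells
roi = (temperature > 1500.0) & (pressure < 2.0)
indices = np.argwhere(roi)  # grid coordinates of matching cells
print(f"{indices.shape[0]} cells ({roi.mean():.1%} of the domain) match the query")
```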
Efficient handling of large volumes of data is a necessity for exascale scientific applications and database systems. To address the growing imbalance between the amount of available storage and the amount of data being produced by high-speed (high-FLOPS) processors on the system, data must be compressed to reduce the total amount of data placed on the file systems. General-purpose lossless compression...
The growing gap between the massive amounts of data generated by petascale scientific simulation codes and the capability of system hardware and software to effectively analyze this data necessitates data reduction. Yet, the increasing data complexity challenges most, if not all, of the existing data compression methods. In fact, lossless compression techniques offer no more than 10% reduction on...
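The limited payoff of lossless compression on floating-point simulation output, cited above, is easy to reproduce; the sketch below uses zlib on synthetic double-precision data (the data and compression level are assumptions, not the datasets studied in these papers).

```python
import zlib
import numpy as np

# Synthetic stand-in for double-precision simulation output
data = np.random.rand(1_000_000)
raw = data.tobytes()
packed = zlib.compress(raw, 9)

reduction = 1.0 - len(packed) / len(raw)
print(f"lossless reduction: {reduction:.1%}")  # typically on the order of 10% or less
```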
Efficient analytics of scientific data from extreme-scale simulations is quickly becoming a top priority. The increasing simulation output data sizes demand a paradigm shift in how analytics is conducted. In this paper, we argue that query-driven analytics over compressed, rather than original full-size, data is a promising strategy for meeting storage- and I/O-bound application challenges...