Cloud database management systems (DBMSs) often decouple database instances from physical storage to provide reliability and high availability to users. This design can robustly handle a single point of failure, but needs substantial effort to attain good performance. In this paper, we analyze the decoupled architecture and present important optimization issues that we faced in implementing the design...
The drastic increase in commodity computer and network performance over the last generation has resulted in faster hardware and more sophisticated software. However, even current-generation supercomputers remain incapable of solving current problems in science, engineering, and business. This problem arises because a single machine cannot facilitate the availability of various...
All organizations, big or small, need to manage their data effectively, both the structured data stored in operational databases and the unstructured data spread across documents. As data size grows, it becomes essential to move the data to a warehouse so that it is available for ad hoc queries and analysis. This data repository of both useful data and documents, along with metadata...
Maintaining situational awareness in dynamic and complex systems, such as information and communications networks, is essential to protecting assets and increasing mission assurance. Network monitoring tools generate tremendous amounts of data, overwhelming network defenders with alerts.
Today is the era of the cloud. Most organizations already use cloud services for data storage, and many more are looking to shift to cloud computing. Data stored on cloud servers by data owners or organizations should be accessible to privileged users. Enormous amounts of data are stored on the cloud, but when an authorized user wants to access specific data of interest, resource matchmaking is required...
An organization's data grows day by day, and the emergence of different technologies brings challenges in maintaining and securing it. Data isolation in SaaS applications needs a different management approach: one that assures isolation with a low cost of ownership while maintaining a high-quality security mechanism to protect the data. A new approach, different from traditional approaches,...
This paper describes features of remote laboratory development. Tools and implementations of remote laboratory software were investigated. The development technology was chosen and approved based on a requirements analysis for remote labs. Different approaches to organizing a remote lab management system were considered. The practical results of the remote lab implementation...
Despite its very recent origins, cloud computing has matured into a mainstream technology over the past few years. The private cloud, where an organization sets up an internal cloud infrastructure, is gaining traction these days because of its perceived security advantages. Two of the major open-source cloud middlewares are OpenStack and CloudStack. This paper provides a comparative study of these...
IDIPOS, which stands for Italian Database Infrastructure for Polar Observation Sciences, was conceived to carry out a feasibility study on an infrastructure devoted to managing data coming from polar areas. This framework adopted a modular approach, identifying two main parts: the first defines the main components of the infrastructure, and the second selects possible cloud solutions to manage and...
Microfinance, as a financial institution fighting poverty in rural areas, faces challenges in reducing operating costs and increasing the effectiveness of its business activities. Information technology (IT) has emerged as a powerful tool to improve the efficiency and effectiveness of business operations, making it viable for the microfinance sector to expand into rural and low-income areas. IT value implementation...
Programmability is one of the key factors in software-defined networks (SDN), since it decouples the data plane from the control plane. However, in this context, managing the inputs and outputs of several applications is a little-explored task in the field of controllers, which do not operate in a controlled and organized environment. This article proposes a method to make SDN applications reach the administrator,...
This paper shows how IT-governance audit can be computerized through the use of multi-agent systems and interorganizational workflow (IOW). An IOW aims at business process orchestration, with the particularity that these processes are heterogeneous, autonomous, and independent. In fact, information system (IS) components nowadays should cooperate despite their divergence to ensure value-added services. In...
Nowadays, almost all organizations use information technology (IT) tools to manage their business. The information systems (IS) involved are as diverse as system management, advertising, online sales, and also maintenance and communication. IS are becoming increasingly complex, making it difficult to keep pace with the growing demands on IT resources. Cloud computing...
A Grid system is a promising solution for sharing the distributed resources of registered participants to perform tasks whose demands may exceed the capacity of any individual participating organization. However, the collaborative aspect of Grids is still underdeveloped, as they lack features and mechanisms for human interaction and collaboration. This paper presents a prototype system for collaborative task...
Among different emerging disruptive technologies, P2P computing has shown its usefulness in designing decentralized and scalable online learning systems. One important feature of P2P systems explored in this context is that of direct peer-to-peer communication. It has been shown in several recent research works that the direct communication between peers increases the interaction among peers and eventually...
Scale-out datacenters mandate high per-server throughput to get the maximum benefit from the large TCO investment. Emerging applications (e.g., data serving and web search) that run in these datacenters operate on vast datasets that are not accommodated by on-die caches of existing server chips. Large caches reduce the die area available for cores and lower performance through long access latency...
Through the analysis of the different iterations of the Geometry Mobile (GEM) project, a mobile learning effort in the field of mathematics, we have identified a major architectural issue to be addressed in the design and implementation of m-learning applications. Due to the dynamic nature of the field, many challenging requirements are continuously emerging. One of them relates to the possibility...
Performance and total cost of ownership (TCO) are key optimization metrics in large-scale data centers. According to these metrics, data centers designed with conventional server processors are inefficient. Recently introduced processors based on low-power cores can improve both throughput and energy efficiency compared to conventional server chips. However, a specialized Scale-Out Processor (SOP)...
The proliferation of desktops requires addressing challenges related to high SLAs for desktop provisioning and increased operational expenses. Virtual Desktop Infrastructure (VDI) is a strategy of providing desktops to the user, where the virtual machines are hosted in a virtualized data center. A VDI implementation is a complex IT project that involves deployment and scaling complexities and high...
The need to store huge amounts of data has grown over the past years. Data should be stored for future reuse or for sharing among users. Data files can be stored on a local file system or on a distributed file system. A distributed file system provides many advantages, such as reliability, scalability, and security. This paper shows new trends in these systems, with a focus on increasing performance...