This is the Big Data era, in which terabytes of data are commonplace, and so is large-scale metadata. Metadata volume becomes a barrier in the era of exascale computation; however, careful tuning and design of metadata can enhance the performance of a file system. Large-scale metadata server (MDS) design has therefore become a key research topic today. The designing...
Big Data is the most prominent paradigm today. It began its slow rise around 2003 and is expected to dominate the IT industry at least until 2030; since 2009 it has captured much of the market. Big Data is spreading worldwide across every domain. Big Data, a massive amount of data, is able to generate...
In this exabyte-scale era, data grows at an exponential rate, which in turn generates a massive amount of metadata in the file system. Hadoop is the most widely used framework for dealing with big data. Because of this enormous growth in metadata, however, the efficiency of Hadoop has been questioned by many researchers. Therefore, it is essential to create an efficient and scalable...
The size of the data used in today's enterprises has been expanding enormously over the last few years. Simultaneously, the need to process and analyze large volumes of data has also increased. The Hadoop Distributed File System (HDFS), an open-source Apache implementation, is designed to run on commodity hardware and to handle applications with large datasets (terabytes to petabytes). The HDFS architecture is based...
The size of the data used in today's enterprises has been growing at an exponential rate over the last few years, and with it the need to process and analyze large volumes of data. To handle and analyze such large datasets, the open-source Apache Hadoop framework is widely used today. For managing and storing all the resources across its cluster, Hadoop possesses...
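The abstracts above share a common concern: in HDFS the NameNode keeps the entire namespace in memory, so metadata growth directly limits scalability. The back-of-the-envelope sketch below illustrates why. It assumes roughly 150 bytes of heap per namespace object (file or block), a commonly cited rule of thumb rather than an exact Hadoop figure, and the function name is illustrative, not a real Hadoop API.

```python
# Toy estimate of the NameNode heap needed to hold file-system metadata.
# Assumption: ~150 bytes of heap per namespace object (file or block),
# a widely quoted rule of thumb, not an exact figure for any Hadoop release.

BYTES_PER_OBJECT = 150

def namenode_heap_bytes(num_files: int, blocks_per_file: int) -> int:
    """Each file contributes one file object plus one object per block."""
    objects = num_files * (1 + blocks_per_file)
    return objects * BYTES_PER_OBJECT

# 100 million files at 2 blocks each -> roughly 45 GB of heap
# consumed by metadata alone, before any other NameNode work.
estimate = namenode_heap_bytes(100_000_000, 2)
print(estimate // 10**9, "GB")
```

Under these assumptions, metadata memory grows linearly with the number of files and blocks, which is why a single centralized metadata server becomes the bottleneck the papers above set out to address.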