Web QoS is an important research field for today's Internet applications. Current Web servers generally adopt the HTTP/1.1 protocol, so the network connection follows a persistent, time-maintained model rather than the typical cached mode. Based on the operational features of such Web servers, and using a feed-forward neural network to predict response delay, a kind of proactive TCP connection management...
The Lab Intelligent Monitoring System, based on an S3C2440 ARM microprocessor and embedded Linux, has been designed to accomplish remote monitoring of environmental parameters and video images in the laboratory. The embedded database SQLite manages the data collected by sensor networks. A system program based on Qt/Embedded and Linux drivers realizes the local management of environmental data...
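The SQLite-based management of sensor data described above can be illustrated with a minimal sketch; the table schema and sensor names are assumptions for illustration, not details taken from the paper:

```python
import sqlite3

def store_reading(conn, sensor_id, value):
    # Insert one sensor reading (schema is an illustrative assumption).
    conn.execute(
        "INSERT INTO readings (sensor_id, value) VALUES (?, ?)",
        (sensor_id, value),
    )
    conn.commit()

# An embedded system would open a file-backed database; :memory: keeps
# this sketch self-contained.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE readings (id INTEGER PRIMARY KEY, sensor_id TEXT, value REAL)"
)
store_reading(conn, "temp-lab1", 22.5)
store_reading(conn, "humidity-lab1", 41.0)
rows = conn.execute("SELECT sensor_id, value FROM readings ORDER BY id").fetchall()
```

On a resource-constrained ARM board, a single serverless SQLite file avoids the footprint of a client/server database, which is presumably why the authors chose it.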
Clustering is one of the most popular techniques for addressing the scalability of Internet servers, which is becoming more and more important. However, as is well known, Web server cluster schemes have many disadvantages, such as a lack of self-organization and system bottlenecks. This paper proposes a model of a server cluster based on mobile agents. To improve the capability of load reassignment...
Using malicious sites to launch attacks against client applications has been a growing threat in recent years. This has led to the emergence of new technologies to counter and detect this type of client-side attack. One of these technologies is the honeyclient. Honeyclients crawl the Internet to find and identify web servers that exploit client-side vulnerabilities. In this paper, we address honeyclients by studying...
Web access logs contain information which can be converted to represent the access history of individual users. A large number of essential attributes can be extracted from the access history: for example, the access count of each webpage, the occurrence of different webpage access sequences, and the time spent between consecutive accesses. Each of the above attributes represents a dimension in the...
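Two of the attributes mentioned, per-page access counts and the time between consecutive accesses, can be extracted with a short sketch; the log entries and field layout below are hypothetical stand-ins for a parsed access log:

```python
from collections import Counter

# Each entry is (user, page, unix_time); a real log would be parsed
# from, e.g., the Common Log Format.
log = [
    ("u1", "/home", 100), ("u1", "/news", 130),
    ("u2", "/home", 150), ("u1", "/home", 200),
]

# Attribute 1: access count of each webpage.
page_counts = Counter(page for _, page, _ in log)

# Attribute 2: time spent between consecutive accesses, per user.
by_user = {}
for user, _, t in log:
    by_user.setdefault(user, []).append(t)
gaps = {
    user: [b - a for a, b in zip(times, times[1:])]
    for user, times in by_user.items()
}
```

Each such extracted attribute can then serve as one dimension of a per-user feature vector, in the spirit of the abstract's "dimension" remark.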
We argue that applications can benefit from the offloading of processing tasks into the data path of next-generation networks, where custom packet processing functions can be implemented. We present the concept of application-layer services and how they can be integrated into our existing network service architecture for easy use and control. Using a web server application as an example, we discuss...
The prevalence of the Internet has promoted the development of electronic commerce, and digital-product shopping systems have come into public view. Security problems have thus become very important; the core issue is how to guarantee secure and flexible access to digital products. As the most promising access control model, usage control has been generally accepted, but the trust problem remains its shortcoming...
The E-marketing model in China adopts the international generic model, which develops from the WAN into the LAN. This model requires complete support systems, such as a modern logistics distribution system, a sound credit system, and an online settlement system, but these systems in China do not yet satisfy this demand. This paper addresses this problem and proposes a community-based E-marketing model. The model includes...
HTTP does not secure its requests and responses. Using a man-in-the-middle attack, it is possible to alter the HTTP communication while it still looks authentic. This can be a problem if you download data such as a PGP key or a TOR client, access banking services online, or when there is an interest in filtering what you can read on the Internet. It should be noted that under particular circumstances,...
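One common mitigation for the integrity problem described above is to compare a downloaded file against a digest obtained over a trusted channel; a minimal sketch, assuming the expected SHA-256 value was fetched over HTTPS or pre-distributed out of band:

```python
import hashlib

def verify_download(data: bytes, expected_sha256: str) -> bool:
    # This detects in-transit modification only if the expected digest
    # itself came over a trusted (e.g. HTTPS-authenticated) channel.
    return hashlib.sha256(data).hexdigest() == expected_sha256

original = b"-----BEGIN PGP PUBLIC KEY BLOCK----- ..."
digest = hashlib.sha256(original).hexdigest()

ok = verify_download(original, digest)           # unmodified download
bad = verify_download(original + b"x", digest)   # tampered download
```

This is exactly the guarantee plain HTTP lacks: without an authenticated digest or TLS, a man-in-the-middle can substitute the payload undetected.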
Staged design has been introduced as a programming paradigm to implement high performance Internet services that avoids the pitfalls related to conventional concurrency models. However, this design presents challenges concerning resource allocation to the individual stages, which have different demands that change during execution. On the other hand, processing resources have been shown to form the...
An Internet proxy server must respond to client requests and transmit the data. Accordingly, it is important to resolve the problem of how to connect to the remote server and handle the receiving and sending of data. In this paper, caching techniques were applied to reduce system I/O traffic, together with a thread pool to handle I/O. In this way, resolving the problem of system...
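The combination of a cache to reduce I/O and a thread pool to service requests can be sketched as follows; `fetch_origin` is a hypothetical stand-in for the real upstream connection, not the paper's implementation:

```python
from concurrent.futures import ThreadPoolExecutor
from threading import Lock

cache = {}
cache_lock = Lock()

def fetch_origin(url):
    # Stand-in for connecting to the remote server; assumed for illustration.
    return f"payload for {url}"

def handle_request(url):
    # Serve from the cache when possible, avoiding origin I/O.
    with cache_lock:
        if url in cache:
            return cache[url]
    body = fetch_origin(url)
    with cache_lock:
        cache[url] = body
    return body

# The pool replaces one-thread-per-connection handling.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(handle_request, ["/a", "/b", "/a"]))
```

The repeated request for "/a" is answered from the cache, so only two origin fetches are needed for three requests.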
This model of communication uses database replication and simple database abstraction models to transmit data from a web hosting backend to its dedicated client servers, ensuring asynchronous data transmission, a backup solution, and a high degree of portability.
The impact of software faults present in components on the larger system is currently a relevant and still open research topic. Web-based applications are both a highly relevant type of system for our society and typically exposed to many software components on the server side. The impact of faults in these components on web servers is an important aspect when evaluating the dependability...
Web-based systems commonly face a unique set of vulnerabilities and security threats due to their high exposure, access by browsers, and integration with databases. In this paper we present an empirical analysis of attacker activities based on data collected by two high-interaction honeypots. The contributions of our work include: (1) classification of the malicious traffic into port scans, vulnerability...
Measurement-based research benefits greatly from efficient and minimum-effort management of large-scale data and their metadata. Effective data management has become key to promoting data sharing within our community. During the past decade, a great deal of effort has been devoted to building Internet data archives. Several institutions run repositories where they post metadata of their collections,...
Energy efficiency has become a very important issue for keeping the Earth "green", especially in recent years. We have to reduce energy consumption anywhere and anytime, including in large sets of servers, such as those run by Google. In our research, we mainly consider one important form of energy: electric power. In our previous work, we discussed different energy models in cluster servers, and...
Recently, due to the spread of broadband access to the Internet, the speedup of JavaScript in Web browsers, and the development of communication technologies such as Ajax, a wide variety of Web applications are being provided. However, access congestion at the Web server and reduced usability under highly frequent communication are major problems for Web applications. We have investigated the bottleneck...
In this paper, we have designed and implemented a kernel-level Web-based QoS (WQoS) mechanism that could efficiently support differentiated services when serving multiple diverse types of Web requests in a cluster-based Web server system. Our mechanism is implemented at kernel level to effectively reduce the number of protection domain switches and data copying between kernel space and user space...
As the number of Internet users increases explosively, the delay in network response time is also increasing. An economical and efficient solution to this problem is web caching. But the use of a cache server can cause another bottleneck because of the concentration of requests at the cache server. Many studies on improving cache server performance have been proposed, but existing studies have focused...
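A web cache of the kind discussed typically bounds its memory with an eviction policy; below is a minimal LRU sketch (an illustrative policy choice, not necessarily the one the cited studies use):

```python
from collections import OrderedDict

class LRUCache:
    # Evicts the least-recently-used entry once capacity is exceeded.
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()

    def get(self, key):
        if key not in self.items:
            return None                    # cache miss
        self.items.move_to_end(key)        # mark as most recently used
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)
        self.items[key] = value
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)  # evict the LRU entry

c = LRUCache(2)
c.put("/a", "A")
c.put("/b", "B")
c.get("/a")        # "/a" becomes most recently used
c.put("/c", "C")   # capacity exceeded: "/b" is evicted
```

Keeping the eviction decision O(1), as the `OrderedDict` does here, matters precisely because the cache server itself can become the bottleneck the abstract describes.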
A web crawler forms the backbone of a search engine, and this backbone needs a careful reassessment that could enhance the efficiency of search engines. This paper conducts such a reassessment from the perspective of systems, achieved through the implementation and analysis of a web crawler, "VisionerBOT", as a feed-forward engine for search engines using the MapReduce distributed...
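At its core, a crawler of this kind maintains a frontier of discovered URLs and visits them in order; the in-memory link graph below is a hypothetical stand-in for real HTTP fetching and parsing, and the sketch omits the MapReduce distribution the paper employs:

```python
from collections import deque

# Hypothetical link graph standing in for fetched-and-parsed pages.
LINKS = {
    "a.html": ["b.html", "c.html"],
    "b.html": ["c.html"],
    "c.html": [],
}

def crawl(seed):
    # Breadth-first traversal: the deque is the crawl frontier,
    # `seen` prevents re-downloading the same page.
    seen, order = {seed}, []
    frontier = deque([seed])
    while frontier:
        page = frontier.popleft()
        order.append(page)
        for nxt in LINKS.get(page, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return order

order = crawl("a.html")
```

In a distributed design such as the one the abstract mentions, this frontier would be partitioned across MapReduce workers rather than held in a single in-process deque.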