The exchange of information and the cooperation of enterprise information systems play a key role in the development of applications based on merging data from a diversity of sources. Data sources can be stored in traditional databases and are increasingly available in semi-structured formats, including dynamic web pages, which can be accessed through web forms. To alleviate the inherent heterogeneity...
We present our initial experimental findings from the collaborative deployment of network Anomaly Detection (AD) sensors. Our system examines ingress HTTP traffic and correlates AD alerts from two administratively disjoint domains: Columbia University and George Mason University. We show that, by exchanging packet content alerts between the two sites, we can achieve zero-day attack detection capabilities...
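The cross-site correlation idea above can be illustrated with a minimal sketch (not the authors' implementation): each site reduces suspicious payloads to digests it is willing to share, and a payload flagged at both administratively disjoint domains becomes a strong zero-day candidate. The payloads and function names here are hypothetical.

```python
import hashlib

def content_alert(payload: bytes) -> str:
    """Reduce a suspicious HTTP payload to a shareable digest."""
    return hashlib.sha256(payload).hexdigest()

def correlate(site_a_alerts: set, site_b_alerts: set) -> set:
    """Digests independently flagged by both domains."""
    return site_a_alerts & site_b_alerts

# Hypothetical example: the same exploit payload hits both sites,
# while each site also sees unrelated local alerts.
payload = b"GET /index.php?id=1%27%20UNION%20SELECT..."
alerts_site_a = {content_alert(payload), content_alert(b"local-noise-a")}
alerts_site_b = {content_alert(payload), content_alert(b"local-noise-b")}

shared = correlate(alerts_site_a, alerts_site_b)
print(len(shared))  # → 1: only the common payload survives correlation
```

Exchanging digests rather than raw packet content is one way such a scheme could limit what each site reveals about its own traffic.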
The study of macromolecular protein structures at atomic resolution is the source of many data- and compute-intensive challenges, from simulation to image processing to model building. We have developed a general platform for the secure deployment of structural biology computational tasks and workflows into a federated grid which maximizes robustness, ease of use, and performance, while minimizing...
Cyber security analysis tools are necessary to evaluate the security, reliability, and resilience of networked information systems against cyber attack. It is common practice in modern cyber security analysis to separately utilize real systems of computers, routers, switches, firewalls, computer emulations (e.g., virtual machines) and simulation models to analyze the interplay between cyber threats...
This paper proposes that social network data should be assumed public but treated private. Assuming this rather confusing requirement means that anonymity models such as k-anonymity cannot be applied to the most common form of private data release on the internet, social network APIs. An alternative anonymity model, q-Anon, is presented, which measures the probability of an attacker logically deducing...
As reliance on Internet connected systems expands, the threat of damage from malicious actors, especially undetected actors, rises. Masquerade attacks, where one individual or system poses as another, are among the most harmful and difficult to detect types of intrusion. Previous efforts to detect masquerade attacks have focused on host-based approaches, including command line, system call, and GUI...
In services and cloud computing, processes need to be continually adapted to changing environments and requirements. Undisciplined process adaptation could easily lead to data flow anomalies, e.g., input missing for some activities in the process. In this paper, we study the problem of data-flow-correctness-preserving adaptation and propose three important criteria that can maintain the data flow...
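A data-flow anomaly of the kind mentioned above (an activity whose input is never produced) can be sketched as a simple check. This is a toy illustration under assumed definitions, not the paper's formal adaptation criteria; the process and activity names are hypothetical.

```python
def missing_inputs(process, initial_data=frozenset()):
    """Detect the 'input missing' data-flow anomaly.

    process: ordered list of (name, inputs, outputs) tuples.
    Returns a dict mapping each anomalous activity to the data
    items it reads that no earlier activity has produced.
    """
    available = set(initial_data)
    anomalies = {}
    for name, inputs, outputs in process:
        missing = set(inputs) - available
        if missing:
            anomalies[name] = missing
        available |= set(outputs)
    return anomalies

# Hypothetical adaptation that dropped the activity producing "invoice":
adapted = [
    ("receive_order", {"order"}, {"order_id"}),
    ("ship_goods", {"order_id", "invoice"}, {"tracking_no"}),
]
print(missing_inputs(adapted, initial_data={"order"}))
# → {'ship_goods': {'invoice'}}
```

A correctness-preserving adaptation would be one after which this check (and its counterparts for the other anomaly types) still reports nothing.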
In large-scale compute cloud systems, component failures become the norm rather than the exception. Failure occurrence, as well as its impact on system performance and operation costs, is becoming an increasingly important concern to system designers and administrators. When a system fails to function properly, health-related data are valuable for troubleshooting. However, it is challenging to effectively...
Research on preserving private data in web data mining has practical value. After introducing the basic concepts of web log mining and private-data protection, this paper analyzes the status quo of privacy preservation in web log mining, and then puts forward a privacy-preserving mining model based on a cloud-model evolutionary algorithm, combining the evolutionary algorithm...
3GPP does not supply any active security service interface or available security algorithm model for terminals, which greatly restricts the ability to initiate and configure secure communication at the application layer. Focusing on this problem, we first propose a three-layer architecture that guarantees application-layer communication security for 3G terminals,...
Loopholes in online auction sites enable fraudsters to hide themselves easily. To reduce the odds of being defrauded, online auction traders usually rely on reputation systems to estimate a trading partner's credibility. However, reported dollar losses from online auction fraud have hit record highs for years, which implies that existing reputation systems may not prevent fraud as effectively as expected. To reduce...
Internet usage has become so ubiquitous that even desktop applications assume the computer system they are running on is connected to the Internet. Desktop applications rely on Internet connectivity for software license authentication and for maintenance through the downloading of software patches. However, the latter can be an annoyance to the user when he or she is relying on...
This paper studies a data acquisition model, introduces its design principles and architecture, examines its key technologies in detail, and summarizes a method for multi-source data acquisition and synthesis, thereby laying a good foundation for higher-level data analysis in a unified network security management system.
The Internet has evolved into a generic platform and become a fully pervasive infrastructure providing services anywhere and anytime. The development of the cloud allows the assumption that every needed service has already been implemented. The authors argue that this technical viewpoint needs to be enlarged by service science aspects, where technology is only one perspective among others, such as new business...
The Internet and computers did not invent or even cause privacy issues; the issues existed long before the creation of computers and the Internet. The existence of the Internet, computers, and large data storage makes it possible to collect, process, and transmit large volumes of data, including personal data. In this paper, we study privacy from the following two views, namely the legal framework...
HTTP-related vulnerabilities are increasingly exploited as HTTP applications become the dominant class of application on the Internet. Several HTTP-specific anomaly detection methods have been proposed; among them, grammar-based methods tend to better reflect the underlying structure of HTTP communications, and have therefore shown promising capability in classifying benign and malicious accesses...
This paper presents an in-depth study of anomaly features for a particular server. By continuously monitoring a honeypot deployed in an Internet Data Center for more than two months, we summarized the experimental results and built some initial exploratory models. The models show that the number of attackers for the main attack types and ports can be described by a normal distribution;...
This paper discusses network gatekeeper technology and proposes a security domain isolation and data exchange model based on a virtual machine monitor (VMM). We then present an implementation framework for this model based on Xen. Finally, we discuss the security features and the prospective applications of the model.
Internet technology has developed rapidly, and both software systems and hardware equipment have improved greatly in recent years. However, the Internet brings people not only convenience but also serious potential threats; in fact, safety hazards have existed since the emergence of the Internet. As an effective information security safeguard, intrusion detection makes up for the defects...
The Flume system is an implementation of decentralized information flow control (DIFC) at the operating system level. Prior work has shown that Flume can be implemented as a practical extension to the Linux operating system, allowing real Web applications to achieve useful security guarantees. However, the question remains whether the Flume system is actually secure. This paper compares Flume with other recent...
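The core secrecy rule in Flume-style DIFC can be illustrated with a toy sketch (simplified: capabilities, declassification, and integrity labels are omitted; the tag names are hypothetical). Roughly, a process may send data to another endpoint only if the sender's secrecy label is a subset of the receiver's.

```python
def can_send(secrecy_src: frozenset, secrecy_dst: frozenset) -> bool:
    """Simplified DIFC secrecy check: data may flow only to endpoints
    whose label includes every tag tainting the source."""
    return secrecy_src <= secrecy_dst

alice_tag = "alice_private"
tainted_app = frozenset({alice_tag})  # process that has read Alice's data
network = frozenset()                 # the network carries no secrecy tags

print(can_send(tainted_app, network))  # → False: tainted data cannot leak out
print(can_send(network, tainted_app))  # → True: untainted data may flow in
```

In the real Flume model a process holding the appropriate declassification capability may drop a tag and then export the data; this sketch shows only the default rule that such capabilities relax.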