International Journal on Recent and Innovation Trends in Computing and Communication ISSN: 2321-8169 Volume: 2 Issue: 6 1519 – 1521
IJRITCC | June 2014, Available @ http://www.ijritcc.org

Efficient Storage and Processing of High Volume Network Monitoring Data

Ms. S. Malathy 1, Ms. D. Saranya 2, Ms. S. Naveena 3, Mr. D. Rajesh Kumar 4
Asst. Professor 1,2,3,4, Department of Information Technology, Bannari Amman Institute of Technology, Sathyamangalam - 638401, India.
E-mail: ksmalathy@bitsathy.ac.in 1, saranya.d@bitsathy.ac.in 2, naveena@bitsathy.ac.in 3, rajeshkumard@bitsathy.ac.in 4

Abstract - Network traffic monitoring describes the use of a system that constantly monitors a computer network for slow or failing components and notifies the network administrator in case of outages. Monitoring modern networks involves storing and transferring huge amounts of data. To overcome this problem, a technique is used that allows a number of operations to be performed directly on the transformed data with a controlled loss of accuracy. The results show that the transformed data closely approximates the original data (within 5% relative error) while achieving a compression ratio of 20%. A sensitivity analysis shows that the technique allows the accuracy on different input fields to be traded off, while a scalability analysis indicates that the technique scales with input sizes spanning up to three orders of magnitude. With the increasing sophistication of attacks, there is a need for network security monitoring systems that store and examine very large amounts of historical network flow data. An efficient storage infrastructure should provide both high insertion rates and fast data access.
Traditional row-oriented Relational Database Management Systems (RDBMS) provide satisfactory query performance only for network flow data collected over a period of several hours.

Keywords - Network monitoring, traffic analysis, monitoring data compression, space utilization.

I INTRODUCTION

Network monitoring for a corporate network is a critical IT function that can save money in network performance, employee productivity, and infrastructure cost overruns. A network monitoring system monitors an internal network for problems. Telecom operators have to constantly monitor the network for a number of tasks such as management, provisioning, and service offering [4]. Tasks such as billing require the analysis of the data in real time. Network provisioning is performed on a set of statistical indicators calculated over the last month and year. Monitoring infrastructures for these tasks have been presented that rely on data reduction techniques to keep the amount of data manageable. These issues were also addressed early on by the network research community.

Cloud storage is an important service of cloud networking, which allows data owners to move their data from their own systems to the cloud. This new paradigm of data hosting service also introduces new security challenges. Owners may worry that their data could be lost in the cloud, because data loss can happen in any infrastructure, no matter what degree of reliability measures the cloud service provider adopts. A provider could discard data that have not been accessed, or are rarely accessed, to save storage space, while claiming that the data are still correctly stored in the cloud. Therefore, owners need to be convinced that their data are correctly stored in the cloud. Owners can check data integrity based on two-party storage auditing protocols.
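As a minimal illustration of the two-party setting (the block layout and hash choice here are assumptions for the sketch, not a protocol from the paper), the owner can retain per-block digests before uploading and later recompute them over whatever the provider returns:

```python
import hashlib
import os

def owner_checksums(data: bytes, block_size: int = 4096) -> list:
    """Split the data into fixed-size blocks and keep a SHA-256
    digest of each block before the data is moved to the cloud."""
    return [hashlib.sha256(data[i:i + block_size]).hexdigest()
            for i in range(0, len(data), block_size)]

def audit(stored: bytes, checksums: list, block_size: int = 4096) -> bool:
    """Two-party check: the owner recomputes the digests over the
    copy returned by the provider and compares them to the saved list."""
    return owner_checksums(stored, block_size) == checksums

data = os.urandom(10_000)
sums = owner_checksums(data)
assert audit(data, sums)  # intact copy passes the audit
# Flip one byte: any modification or silent discard is detected.
tampered = data[:5000] + bytes([data[5000] ^ 0xFF]) + data[5001:]
assert not audit(tampered, sums)
```

This sketch also shows the limitation discussed next: the owner must fetch and rehash the data itself, which is why delegating the check to a third-party auditor is attractive.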
In this setting, third-party auditing is a natural choice for storage auditing in cloud networking. A third-party auditor, having expertise and capabilities, can do a more efficient job and convince both cloud service providers and owners. The auditing protocol should have the following properties: confidentiality, dynamic auditing, and batch auditing. Some existing remote integrity checking methods can only serve static archive data; they cannot be applied to the auditing service, since data in the cloud can be dynamically updated. Thus, an efficient and secure dynamic auditing protocol is needed to convince data owners that their data are correctly stored in the cloud.

II LITERATURE SURVEY

SECURE STORAGE IN HIGH VOLUME NETWORKS: A spatially efficient data representation is produced that allows approximate computations on network monitoring logs, with no need for a decompression stage. In order to ease the description of the technique, of its characteristics, and of the challenges it must deal with, a specific format of monitoring log is considered, although the technique can be applied to more complex monitoring data. The log format considered, analogous to flow-level traffic traces such as NetFlow ones, consists of records, each comprising four fields: timestamp, source IP, destination URL, and load. Each record represents a single HTTP session originated by a source IP to retrieve the given URL, together with the amount of data that was exchanged. This format represents an example of data commonly logged by operators. To be able to perform computations directly on the representation, a technique is applied which converts the log into numerical matrices, where each row/column is a
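As a minimal sketch of the record-to-matrix conversion (the dictionary encoding and field layout below are illustrative assumptions, not the paper's exact transform), the categorical fields of each four-field record can be mapped to small integers so that every record becomes a numeric row:

```python
import numpy as np

# Example HTTP-session log: (timestamp, source IP, destination URL, load)
log = [
    (1402000001, "10.0.0.1", "http://example.com/a", 512),
    (1402000002, "10.0.0.2", "http://example.com/b", 2048),
    (1402000005, "10.0.0.1", "http://example.com/a", 128),
]

def encode(values):
    """Dictionary-encode a categorical column: each distinct value
    gets a small integer id; return the ids and the dictionary."""
    ids, table = [], {}
    for v in values:
        ids.append(table.setdefault(v, len(table)))
    return ids, table

ips, ip_table = encode(r[1] for r in log)
urls, url_table = encode(r[2] for r in log)

# One numerical matrix: rows are records, columns are the four fields.
matrix = np.array([[r[0], ips[i], urls[i], r[3]]
                   for i, r in enumerate(log)])

# Aggregations can now run directly on the numeric form, e.g. total
# load per source IP, without touching the original strings.
total_per_ip = {ip: int(matrix[matrix[:, 1] == idx, 3].sum())
                for ip, idx in ip_table.items()}
print(total_per_ip)  # {'10.0.0.1': 640, '10.0.0.2': 2048}
```

Note that this sketch only shows the numerical-matrix view; the compression and controlled-accuracy transform described in the paper would operate on top of such matrices.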