An enhanced intelligent classification approach to improve the encryption of big data

Progress in cloud applications has been hindered by critical issues such as data security and privacy. A particularly aggravating concern is the ability of cloud operators to access sensitive data. Consequently, uptake of and enthusiasm for cloud computing have remained lukewarm, with users skeptical of the level of confidentiality promised; the financial industry and governmental agencies in particular have shown little interest. This paper analyses the aforementioned issue and presents a creative cryptographic proposal that effectively restricts a cloud service operator's access to sensitive data, curtailing the operator's free rein in handling it. In the proposed method, a file is partitioned with greater accuracy using an intelligent classification technique; alternatively, the method can determine whether the data need splitting at all, reducing operating time and storage space. The results show that the approach can resolve numerous risks associated with cloud computing while requiring acceptable computing time, owing to effective intelligent machine-learning classification techniques. To this end, a novel model entitled Sensitive Encrypted Storage (SES) is proposed, in which three algorithms are integrated: Convolution Neural Network with Logistic Regression, and Elliptic-Curve Diffie-Hellman-Shifted Adaptation Homomorphism Encryption and Decryption.


Introduction
Cloud computing is an innovative technology that has brought a boom to the industry. Previously, if a user wished to use certain features of an application but the local interface was blocked, the application could not be developed at that point and the entire development system came to a halt [4]. Many large companies, such as Google, Microsoft, Facebook and Amazon, are efficient at serving cloud interfaces and offer their own clouds. A cloud interface is efficient, can be built with minimal investment, and its services can be used at any time and for extended periods. It helps migrate data in the most convenient way, and customers can opt for flexible services and competitive storage packages [1][2][3]. Cloud interfaces also offer the distinct service models IaaS, SaaS and PaaS, depending on customer demand. Some security problems exist, as users are concerned about the storage of their sensitive data; these are better addressed with the advent of the cloud interface, since no knowledge of the physical location of the data is given to the user, leaving limited or no chance of unauthorized access and breach. In this way a partnership between cloud service providers and customers is formed. The cloud interface works well with big data and provides various solutions at the users' end [5][6][7].

Phases of the proposed model
Below are the different phases involved in the proposed model, as shown in figure 1:

Phase 1. In this phase, precision, recall and accuracy form the basis for evaluation and analysis. The model at this stage performs secure data classification; protecting the sensitive data is the major concern here (IOP Publishing, doi:10.1088/1757-899X/1049/1/012008).
Phase 2. This phase is responsible for storing the data in a safe environment, and also stores data of lesser importance for reliable use. It evaluates the encryption approach and encryption time, and analyses storage.
Phase 3. The main focus of this phase is enhancement of the overall process: relevant changes are made to time, accuracy, parameters, data security and integrity to ensure better results. The process is considered successful if it completes in minimal time and storage.
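The Phase 1 evaluation metrics (precision, recall and accuracy) can be sketched as below. This is only an illustrative computation from predicted versus true labels; the label encoding (1 = sensitive, 0 = non-sensitive) and the sample data are assumptions, not taken from the paper.

```python
def evaluate(y_true, y_pred, positive=1):
    """Return (precision, recall, accuracy) for a binary classification run."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    accuracy = correct / len(y_true)
    return precision, recall, accuracy

# Hypothetical labels: 1 = sensitive, 0 = non-sensitive
truth = [1, 1, 0, 0, 1, 0]
preds = [1, 0, 0, 0, 1, 1]
p, r, a = evaluate(truth, preds)
```

These three figures are exactly the quantities Phase 1 uses as its basis for evaluation and analysis.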

Simulation model
The simulation model works on the three main algorithms outlined below:

Convolution Neural Network with Logistic Regression (CNN-LR) [6,12]. The tweet text is taken as input to the algorithm. The input text is pre-processed with the aim of extracting the main features. In the initial step, convolution, pooling and activation are performed so that the resulting features become suitable for a logistic-regression layer. Training over multiple epochs enhances the process overall, improving accuracy and reducing loss.
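The CNN-LR forward pass described above can be sketched in a few lines of numpy: a 1-D convolution over a token-embedding matrix, max pooling, a ReLU activation, and a logistic-regression (sigmoid) head. All layer sizes, weights and the random input here are illustrative assumptions, not the paper's trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

def cnn_lr_forward(x, conv_w, lr_w, lr_b):
    """x: (seq_len, emb) embedded tweet; returns P(sensitive) in (0, 1)."""
    seq_len, emb = x.shape
    k = conv_w.shape[0]                       # kernel width
    # Convolution: slide each filter window over the token sequence
    feats = np.array([
        [np.sum(x[i:i + k] * conv_w[:, :, f]) for i in range(seq_len - k + 1)]
        for f in range(conv_w.shape[2])
    ])                                        # (filters, positions)
    pooled = feats.max(axis=1)                # max pooling per filter
    hidden = np.maximum(pooled, 0.0)          # ReLU activation
    z = hidden @ lr_w + lr_b                  # logistic-regression head
    return 1.0 / (1.0 + np.exp(-z))           # sigmoid probability

x = rng.normal(size=(10, 8))                  # 10 tokens, 8-dim embeddings
conv_w = rng.normal(size=(3, 8, 4))           # width-3 kernels, 4 filters
lr_w = rng.normal(size=4)
lr_b = 0.0
prob = cnn_lr_forward(x, conv_w, lr_w, lr_b)  # probability the tweet is sensitive
```

In practice the weights would be learned over the epochs mentioned above; this sketch only shows the layer ordering (convolution, pooling, activation, logistic regression).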
Diffie-Hellman-Shifted Adaptation Homomorphism Encryption [8,13]. This algorithm is responsible for processing the data before it is actually transferred or shared with the cloud interface [10,11].
Elliptic-Curve Diffie-Hellman-Shifted Adaptation Homomorphism Decryption [9]. This algorithm likewise processes the data before it is transferred or shared with the cloud interface. The algorithm is aligned until the data are sent to the cloud processors for actual processing, and it represents the data on the elliptic curve [14].
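The elliptic-curve Diffie-Hellman idea underlying the two algorithms above can be illustrated with textbook point arithmetic, followed by a simple byte-shift cipher keyed by the agreed secret. This is only a sketch of the ECDH-plus-shifted-encryption pattern, not the paper's scheme: the curve (y² = x³ + 2x + 2 mod 17, generator (5, 1)) is a standard classroom example and NOT a secure parameter set.

```python
P_MOD, A = 17, 2          # toy curve y^2 = x^3 + 2x + 2 over F_17
G = (5, 1)                # generator point

def ec_add(p1, p2):
    """Add two curve points (None is the point at infinity)."""
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                                   # inverse points
    if p1 == p2:
        s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (s * s - x1 - x2) % P_MOD
    return (x3, (s * (x1 - x3) - y1) % P_MOD)

def ec_mul(k, point):
    """Scalar multiplication by double-and-add."""
    result = None
    while k:
        if k & 1:
            result = ec_add(result, point)
        point = ec_add(point, point)
        k >>= 1
    return result

# Key agreement: each side combines its secret with the other's public point.
alice_secret, bob_secret = 3, 7
alice_pub = ec_mul(alice_secret, G)
bob_pub = ec_mul(bob_secret, G)
shared = ec_mul(alice_secret, bob_pub)    # same point as ec_mul(bob_secret, alice_pub)
key = shared[0]                           # derive a small shift key from the x-coordinate

def shift_encrypt(data: bytes, k: int) -> bytes:
    return bytes((b + k) % 256 for b in data)

def shift_decrypt(data: bytes, k: int) -> bytes:
    return bytes((b - k) % 256 for b in data)

ct = shift_encrypt(b"sensitive record", key)   # processed before sharing with the cloud
pt = shift_decrypt(ct, key)                    # recovered at the receiving end
```

The point of the sketch is the division of labour the paper describes: the key agreement happens between the parties, so the data is already transformed before it ever reaches the cloud interface.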

Results and discussion
This section presents the outputs gained from the analyses and evaluations performed. First, the accuracies of different machine learning techniques are compared: data of the same size is inserted, and the accuracy rate of each classification technique is calculated. Figure 2 shows both the accuracy and the precision of the different machine learning techniques. The same data size is then used to measure encryption time.

Reasons for selecting CNN-LR
- Moving from KNN to a neural network yields more appropriate results, and a layered convolutional network is selected.
- No non-linearity is left unhandled in the selected model; the extracted features are advanced and linear in nature.
- The classification is further optimized through machine learning.
The experimental scale mostly ranges from 50 MB to 512 MB. The output graph reflects enhancement in encryption, decryption, time and security. Performance is calculated for both large and small data sets, and the model accommodates both effectively. The main reasons observed for the improvement are:
- A binary stream is used instead of a unary stream, which eliminates overhead time.
- Storage is improved by the encryption and decryption methods, which shift in the left and right directions respectively.
- The shifting analysis is mainly concerned with binary streaming.

Conclusion
To conclude, this paper mainly discusses the services and benefits offered by the cloud interface, and highlights all the algorithms used for this purpose. The analysis finds that the proposed methodology is effective in protecting cloud-based methods against various threats and challenges. The concepts related to encryption and decryption are also highlighted in the document.