Research on Disaster Prevention Technology of Remote Sensing Image Segmentation of Ocean Big Data Platform Based on Internet of Things

To further improve the informatization of disaster prevention and mitigation, this article starts from the idea of comprehensive planning and construction, draws on the division-of-labour principles of the human body, and clearly coordinates the various components of the Internet of Things. Building on the Internet of Things, big data, artificial intelligence and other information technologies, and on research into remote sensing image segmentation disaster prevention technology for an IoT-based ocean big data platform, it proposes a smart marine engineering construction plan covering the construction of a 3D marine environment monitoring system, an ocean construction data platform, the development of marine application systems, information standards and specifications, plan design and many other aspects. The platform's remote sensing image segmentation disaster prevention technology covers marine environmental protection, disaster prevention and mitigation, islands, marine law enforcement, the marine economy, marine fisheries and many other commercial fields. The research results have been put into practice in smart ocean construction in many coastal provinces and cities in China, and have important reference significance for smart ocean construction in other coastal provinces and cities.


Introduction
At present, after years of informatization construction, most coastal provinces and cities in China have successively built computer networks, basic and business databases, application systems, monitoring equipment and so on, and informatization has made significant progress. Computer networks include the marine fishery government private network, the marine surveillance private network, the national maritime private network, the e-government network, etc. Infrastructure consists of the computer room and environmental monitoring, servers and storage equipment, UPS power supplies, etc. Application systems include those organized by coastal provinces and municipalities themselves and those deployed at the national and provincial levels. As can be seen from this construction process, informatization often starts from a single specific business field and lacks overall consideration and planning. Contradictions and problems have gradually been exposed during construction, such as poor network connectivity, a relatively weak basic data foundation, low business informatization, information silos, repeated construction and low intelligence. These problems urgently need to be resolved, otherwise they will seriously restrict the future application and development of marine informatization [1].
Given the important role of ocean big data, China has gradually stepped up the construction of ocean big data platforms. In practice, however, problems remain, such as the difficulty of integrating heterogeneous data from many sources, the difficulty of analysing and storing massive data, and the inability to make full use of network data resources, all of which hinder platform construction. Against this background, this article explores the construction and application of the ocean big data platform.

Marine big data disaster prevention and mitigation processing technology
As a new type of data processing technology, computer data mining offers a better solution to people's growing information needs. Data mining technology selects data that meets users' needs from large, loosely structured collections in big data. At the same time, data mining is a continuously developing process: for work that does not reach its goal or expected effect, the mining system automatically re-runs until the goal is met. It therefore provides a more convenient, continuously refinable way to retrieve the information required for life, study and production. Establishing a security incident threat feature map facilitates graph-based attack analysis and attacker characterization. This paper mainly uses the Louvain clustering algorithm to cluster the constructed security event feature map, and combines the clustering results to analyse the attackers represented by each community and obtain a portrait of the attackers. Clustering based on graph features effectively retains the characteristics carried by the graph itself and automatically clusters events with similar features, thereby avoiding the limitations and false positives caused by over-reliance on artificially constructed static models.
The Louvain community clustering algorithm is a partitioning algorithm based on the idea of graph aggregation. In its standard definition, the modularity value is usually used to measure the clustering effect. The algorithm performs well on networks with many nodes and is one of the most commonly used unsupervised community clustering algorithms. The modularity value (denoted Q) and the modularity increment (denoted ΔQ) are its two key indices. Q measures the degree of modularity of the community partition; the closer it is to 1, the stronger the community structure. It is calculated as:

Q = (1 / 2m) · Σ_{i,j} [ A_ij − (k_i · k_j) / 2m ] · σ(c_i, c_j)    (1)

In equation (1), A_ij is the edge weight between node i and node j; k_i and k_j are the sums of the weights of the edges attached to node i and node j respectively; c_i and c_j are the communities to which node i and node j belong; σ(c_i, c_j) = 1 when nodes i and j are in the same community, otherwise σ(c_i, c_j) = 0; and m is the sum of all edge weights in the network. During clustering, when a node is moved into a community, the modularity is recalculated; the resulting change is expressed by the modularity increment ΔQ, calculated as:

ΔQ = [ (Σ_in + k_{i,in}) / 2m − ( (Σ_tot + k_i) / 2m )² ] − [ Σ_in / 2m − ( Σ_tot / 2m )² − ( k_i / 2m )² ]    (2)

In equation (2), Σ_in is the sum of the weights of the edges inside the community; Σ_tot is the sum of the weights of all edges incident to nodes of the community; k_{i,in} is the sum of the weights of the edges connecting node i to community c; and k_i is the sum of the weights of all edges attached to node i.
The Louvain algorithm completes the clustering of the constructed security event feature map through multiple iterations of two stages; the specific process is shown in Figure 1. a) Initially, each node is treated as a community of its own. Traverse all nodes, moving each node into the community that yields the largest modularity increment, until no node changes community, then label all nodes assigned to the same community. b) Treat each community found in stage a) as a single super node, compute the edge weights between these super nodes and the internal edge weights within each community, and repeat stage a) on the resulting graph. After each round of the two stages, if the modularity Q of the graph no longer changes, or the total number of iterations exceeds the set value, the iteration ends; otherwise the next round continues [2].
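The local-moving stage a) described above can be sketched in plain Python. This is a minimal, simplified illustration of the greedy modularity-maximizing node moves on a toy graph (two triangles joined by one bridge edge), not the paper's implementation; it recomputes Q from scratch instead of using the incremental ΔQ formula, and all node names are made up.

```python
def modularity(edges, community):
    """Q = (1/2m) * sum_ij [A_ij - k_i*k_j/(2m)] * sigma(c_i, c_j)."""
    m2 = sum(w for _, _, w in edges) * 2            # 2m: each edge counted twice
    degree = {}
    for u, v, w in edges:
        degree[u] = degree.get(u, 0) + w
        degree[v] = degree.get(v, 0) + w
    nodes = list(degree)
    adj = {(u, v): 0.0 for u in nodes for v in nodes}
    for u, v, w in edges:                           # symmetric adjacency A_ij
        adj[(u, v)] += w
        adj[(v, u)] += w
    q = 0.0
    for i in nodes:
        for j in nodes:
            if community[i] == community[j]:
                q += adj[(i, j)] - degree[i] * degree[j] / m2
    return q / m2

def louvain_phase1(edges, nodes):
    """Stage a): move each node to the neighbouring community with the
    largest modularity gain, repeating until no node changes community."""
    community = {n: n for n in nodes}               # every node starts alone
    neighbours = {n: [] for n in nodes}
    for u, v, w in edges:
        neighbours[u].append(v)
        neighbours[v].append(u)
    improved = True
    while improved:
        improved = False
        for n in nodes:
            best_c, best_q = community[n], modularity(edges, community)
            for nb in neighbours[n]:
                trial = dict(community)             # try moving n next to nb
                trial[n] = community[nb]
                q = modularity(edges, trial)
                if q > best_q + 1e-12:              # keep strictly better moves
                    best_c, best_q = community[nb], q
            if best_c != community[n]:
                community[n] = best_c
                improved = True
    return community

# Toy network: two triangles joined by a single bridge edge c-d.
edges = [("a", "b", 1), ("b", "c", 1), ("a", "c", 1),
         ("d", "e", 1), ("e", "f", 1), ("d", "f", 1),
         ("c", "d", 1)]
nodes = ["a", "b", "c", "d", "e", "f"]
part = louvain_phase1(edges, nodes)
print(part)   # each triangle ends up in its own community
```

On this toy graph the greedy pass recovers the two triangles as two communities; the full algorithm would then aggregate each community into a super node and repeat, as stage b) describes.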

Smart Ocean Construction Plan
The mapping relationship between the smart ocean and the human body is shown in Figure 1. The monitoring equipment in the smart ocean is like the facial features of the human body: through a reasonable layout and division of labour, comprehensive perception and monitoring management is built. The hardware infrastructure, such as PCs, servers, storage equipment, computer rooms and conference rooms, is like the human skeleton, supporting the smart ocean. The network architecture is like the human nervous system: it is complicated, but in the overall planning and design all its strands should be sorted out, effectively integrated and subdivided, so that each performs its duties and a high degree of unification is achieved. Intelligent decision-making in the smart ocean is like the brain, directing and coordinating the functions of the body. Systems such as island management, resources and environment, disaster forecasting, comprehensive fisheries, marine law enforcement and comprehensive administrative management are like the limbs, with clear responsibilities, advancing in coordination and cooperation. The big data scattered throughout the smart ocean is like the body's cells: large in volume, it requires a systematically designed operating mechanism for data collection, management, maintenance and updating.

Big data and ocean monitoring data management process
The characteristics of ocean monitoring data make it suitable for processing with big data technology. The design of an ocean monitoring data platform must take into account the collection, storage, management, processing, analysis, sharing and visualization of monitoring data. The relationship between the ocean monitoring data management model and big data is shown in Figure 2.

Figure 2. Big data and ocean monitoring data management process
The preceding narrative describes the relationship between each of the above links and big data; it can be seen that big data runs through the entire process of data management.

Architecture of Ocean Big Data Platform
The software and hardware platform architecture is divided into three layers: the data layer, the technology layer and the application layer. The foundation of the ocean big data platform is the data layer, which includes the information collected by all platforms, observing from the ground, the sky, ships and so on, and obtaining remote sensing, physical, biological, chemical and other data. After the data are obtained, they are pre-processed, and ocean big data is organized and managed in a unified mode. The technology layer is composed of ocean big data fusion, analysis, forecasting and related functions; it integrates the relevant technologies, develops cloud platforms, provides personalized search for relevant information, and accurately predicts marine elements. The application layer assembles application modules on the basis of data retrieval and integration technology to improve the openness and comprehensiveness of the marine application service management system (as shown in Figure 3), combining data sharing, information processing and other technologies with scientific research to promote its development [3].

Figure 3. Application layer of ocean big data platform
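The three-layer flow described above can be sketched as a minimal Python pipeline. All class and field names here are illustrative assumptions for exposition, not the platform's actual API; the "fusion" step is deliberately reduced to a per-kind average.

```python
from dataclasses import dataclass, field

@dataclass
class Observation:
    source: str      # e.g. "satellite", "buoy", "ship" (hypothetical labels)
    kind: str        # e.g. "remote_sensing", "physics", "chemistry"
    value: float

@dataclass
class DataLayer:
    """Data layer: collects observations and keeps them in a unified form."""
    records: list = field(default_factory=list)

    def ingest(self, obs):
        if obs.value == obs.value:          # simple pre-processing: drop NaN
            self.records.append(obs)

class TechnologyLayer:
    """Technology layer: fuses the unified data (here, a mean per kind)."""
    def fuse(self, data):
        grouped = {}
        for r in data.records:
            grouped.setdefault(r.kind, []).append(r.value)
        return {k: sum(v) / len(v) for k, v in grouped.items()}

class ApplicationLayer:
    """Application layer: exposes fused results to application modules."""
    def report(self, fused):
        return {k: round(v, 2) for k, v in fused.items()}

data = DataLayer()
data.ingest(Observation("satellite", "remote_sensing", 21.5))
data.ingest(Observation("buoy", "physics", 18.3))
data.ingest(Observation("buoy", "physics", 19.7))
result = ApplicationLayer().report(TechnologyLayer().fuse(data))
print(result)   # {'remote_sensing': 21.5, 'physics': 19.0}
```

The point of the layering is that each stage depends only on the output shape of the one below it, so storage, fusion methods or application modules can be swapped independently.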

Data acquisition platform.
In the entire process of building an ocean big data platform, data collection is the beginning of the chain. Full play should be given to the important role of modern marine communication and network technology to improve the efficiency and quality of data collection and maximize its scope, including satellite remote sensing, offshore surveying and mapping, scientific investigations, underwater surveys, etc. Multi-source origin and heterogeneity are the main features of ocean big data; only pre-processed data can guarantee the high quality of the data used in building the platform.
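The pre-processing of multi-source, heterogeneous records mentioned above can be sketched as normalization onto one unified schema. The two source record formats, the field names and the example readings below are all made-up assumptions for illustration.

```python
def normalise(record):
    """Map differently shaped source records onto one common schema,
    dropping anything that does not match a known source format."""
    if "sst_celsius" in record:                      # hypothetical satellite feed
        return {"source": "satellite", "temp_c": record["sst_celsius"]}
    if "temp_f" in record:                           # hypothetical buoy feed
        return {"source": "buoy",
                "temp_c": round((record["temp_f"] - 32) * 5 / 9, 2)}
    return None                                      # unrecognised record: drop

raw = [{"sst_celsius": 21.4}, {"temp_f": 68.0}, {"noise": 1}]
clean = [r for r in (normalise(x) for x in raw) if r is not None]
print(clean)
```

After this step, every record entering the platform carries the same fields and units, which is what makes the later storage and fusion stages uniform.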

Storage and computing platform.
Supported by cloud computing, the collected data can be transferred to the basic platform over wireless or wired networks. Storing and managing ocean big data requires cloud storage, virtualized network and host services, cloud platforms and related capabilities. The key storage and computation methods are distributed: for example, the Apache Software Foundation developed the distributed system infrastructure Hadoop, which includes the HDFS distributed file system and the MapReduce parallel computing framework.
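The MapReduce pattern mentioned above can be illustrated locally in plain Python: map each record to a key/value pair, shuffle by key as the framework would, then reduce each key's values to an aggregate. The buoy names and wave-height readings are made-up example data, and this is only a single-process illustration of the programming model, not Hadoop itself.

```python
from itertools import groupby
from operator import itemgetter

records = [
    ("buoy-01", 2.1), ("buoy-02", 3.4), ("buoy-01", 2.8),
    ("buoy-02", 3.1), ("buoy-03", 1.9), ("buoy-01", 2.5),
]

# Map: emit key/value pairs (station, reading).
mapped = [(station, level) for station, level in records]

# Shuffle: group all values by key, as the framework would between phases.
mapped.sort(key=itemgetter(0))
grouped = {k: [v for _, v in g] for k, g in groupby(mapped, key=itemgetter(0))}

# Reduce: aggregate each key's values (maximum reading per buoy).
reduced = {station: max(levels) for station, levels in grouped.items()}
print(reduced)   # {'buoy-01': 2.8, 'buoy-02': 3.4, 'buoy-03': 1.9}
```

In a real deployment the map and reduce steps run in parallel over data partitioned across HDFS; the value of the model is that the same two functions scale from this toy to a cluster.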

Analysis and application platform.
In processing ocean big data, the core step is to analyse the data: using existing data analysis methods and tools to check and transform the data and mine the potential value it contains. Commonly used analysis methods include classification, neural networks, optimization and social network analysis. Real-time analysis and processing of the data, and timely use of the analysis results, can achieve good outcomes in early warning of marine disasters, prevention of disaster damage, monitoring of the marine ecological environment and prediction of fishing conditions.
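As one simple stand-in for the early-warning analysis described above, a stream of monitoring values can be screened with a z-score rule. The rule, the threshold and the sea-level readings below are illustrative assumptions, not the platform's actual disaster model.

```python
from statistics import mean, stdev

def flag_anomalies(readings, z_threshold=2.0):
    """Flag readings whose z-score against the sample exceeds the threshold."""
    mu, sigma = mean(readings), stdev(readings)
    return [x for x in readings
            if sigma > 0 and abs(x - mu) / sigma > z_threshold]

# Hypothetical hourly sea-level readings (metres) with one surge-like spike.
levels = [1.01, 0.98, 1.03, 1.00, 0.97, 1.02, 2.40, 0.99]
print(flag_anomalies(levels))   # [2.4]
```

In practice such a screen would only be a first filter ahead of the classification or neural-network models the text lists, but it shows the shape of real-time analysis: each incoming value is judged against the statistics of recent data.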

Information visualization platform.
By analysing ocean data and constructing relevant application models, it is possible to use dynamic methods to visually express ocean elements, processes and forecasts in multiple dimensions, and to provide information visualization services to relevant industries in the ocean field or to related scientific research [4].

Summary
Cloud computing and big data are currently developing rapidly and have been widely applied in many fields. With the development of technology, the Internet and the Internet of Things, big data will penetrate every corner of our lives; big data technology is quietly changing the way we live. The characteristics of ocean environment monitoring data make it suitable for big data processing technology, and building a cloud environment and data processing platform with cloud computing and big data technology as the carrier is in line with the trend of IT development. Using cloud computing and big data to establish a marine environment monitoring data platform will surely bring tremendous changes to the operational management and decision-making of marine environment monitoring data, and will be of great significance for the scientific development of marine environment monitoring.