A research port test bed based on distributed optical sensors and sensor fusion framework for ad hoc situational awareness

Maritime study sites used as a physical experimental test bed for sensor data fusion, communication technology and data stream analysis tools can provide substantial frameworks for the design and development of e-navigation technologies. Increasing safety through observation and monitoring of the maritime environment with new technologies meets forward-looking needs to facilitate situational awareness. Further, such test beds offer a solid basis for standardizing new technologies to advance growth by reducing the “time to market” of up-to-date industrial products and technologies. Optical sensor technologies in particular are well suited to provide a situational and marine environmental assessment of waterways for (i) online detection of relevant situations, (ii) collection of data for further analysis and (iii) reuse of data, e.g. for training or testing of assistant systems. The test bed set-up has to consider maintainability, flexibility and extensibility for efficient test set-ups. This means that new use cases and applications within the test bed infrastructure, here represented by a research port, can easily be developed and extended by installing new sensors, actuators and software components. Furthermore, the system supports reliable remote communication between onshore and offshore participants. A series of in situ experiments at the research port of Bremerhaven and in other maritime environments was performed, representing applications and scenarios that demonstrate the capability of the proposed system framework and design.


Introduction
Modern information and communication technologies hold the potential to improve the efficiency and safety of maritime traffic. A working group of the NMMT (German National Masterplan Maritime Technologies) states in its position paper that "efficient maritime transport reduces the environmental impact by reducing pollutant emissions, preventive accident prevention protects against incidents with disastrous consequences for persons, environment and economy" (NMMT, 2015).
By means of cooperative technical systems, evolving synergies for supportive situational awareness can assist users, enabling decision support and reducing workload (comparable to developments in the automotive sector). Therefore, our intention is to provide information about safety-critical and environmentally hazardous situations in order to assess and avoid the latter, which is among others the aim of e-navigation services (IALA, 2008; Hagen, 2013). E-navigation aims to establish a framework concept and strategy for the knowledge and development of marine information services (Graff, 2009).
The design of such new supportive e-navigation systems requires methods and tools for testing and simulation approaches. They need to be based on fundamental experiments to prove the concept, reduce development costs and enable the application of new system-engineering methods (Hahn, 2014). Therefore, a maritime industry and research initiative supports the deployment of e-navigation services in the form of an e-Maritime Integrated Reference Platform (eMIR), which enfolds tools for simulation as well as experimental physical test beds for new approaches and technologies. In this way, eMIR also includes logistics and hinterland connection, sustainable and compatible use of marine ecosystems and other shore-side aspects. All these test beds extend simulation environments into the real world and offer services for fundamental experiments. These services include a reference waterway, a research port, a mobile bridge system and a Vessel Traffic Services (VTS) system (Stasch et al., 2014).
In the following, we focus on the research port as one of these experimental platforms to facilitate research and testing of optical sensor technology, sensor data fusion, communication technology and data stream analysis tools. We explore optical sensors, namely cameras (visible light, infrared) and lidar (light detection and ranging) systems, deployed and tested in natural environmental study areas. Video sensors are widely used for civil surveillance, providing a situational picture (Trivedi et al., 2005). In addition, optical remote sensing has become increasingly important for (in this case, maritime) environmental observations (Holman et al., 2003; Rüssmeier et al., 2016; Schulz et al., 2015; Zielinski et al., 2009).
In the course of this work, we present a suitable framework to provide information from distributed optical sensors to facilitate situational awareness in the research port area. We address current needs for an applicable and tested solution in the form of a generic optical sensor system test bed for users in the port area and the maritime research community. Through this, we offer a potential way to provide regional live video streams for participants on board and ashore, which has until now not been considered by supportive e-navigation services (Plass et al., 2014).
The paper is structured as follows. First, we give an overview of the requirements in the context of the applications proposed for the research port, based on existing test bed structures, as the initial situation for the concept development of the test bed, which addresses optical sensors as well as sensor fusion technologies. In the system architecture section, we describe the framework for sensor-based research and development platforms and the implementation of the data stream architecture, which are based on prior work (Surm and Nicklas, 2015). Further on, we present the system set-up of the physical experimental test bed, followed by the results and discussion section. Therein, onshore and offshore field experiments within the research port and the maritime environment are presented. We provide a proof of concept for the following applications of the test bed: (i) online detection of relevant situations, (ii) collection of data for further analysis and (iii) reuse of data.
Finally, we conclude the work with a short outlook.

Envisaged test bed applications
Starting from the envisaged applications (online detection of relevant situations, collecting data for further analyses, reuse of data), the test bed system requirements constrain various aspects. To increase the spatial coverage of the sensors in a dynamic environment with moving objects in the waterway, to increase the reliability of the system and to provide synoptical information redundancy, geographically distributed sensors are far more advisable than a single sensor. As a consequence of the demand to set up distributed sensors in real-world study areas, there are challenges for the complete sensor framework. Among others (Pearlman et al., 2014; When et al., 2007), we need to address distributed deployment, extensibility for large-area sensor networks, data stream and information transmission, remote access between onshore and offshore participants, central system administration, tailored user interfaces, a uniform open data interface and networked synchronized operation. This leads to the following research question: how is it possible to use independent flexible sensor systems with open interfaces for the generation of distributed situational awareness by sensor fusion technology, thereby providing an open test bed for system verification and validation?

Test bed requirements
In general, there are requirements to be covered by hardware components, e.g. mobility, suitable sensor interfaces, energy supply or communication infrastructure technology. In this way, the flow of information targeted to the individuals involved in the test bed area is made possible, allowing redundancy in the data distribution of the common operational picture. Further on, there are test bed functionalities to be covered by the system architecture and data management, namely consolidated adaptive and flexible configuration of sensors at runtime, support and extensibility for different data types, information quality processing, adaptive distribution of processing, validation of data management, reproducibility of experiments and integration of modelling (Surm et al., 2016).
- Adaptive and flexible configuration of sensors at runtime: While conducting sensor experiments, the experiment set-up may change often, sometimes even while an experiment is running. New sensors will be added, reconfigured or removed at runtime, and even the integration of new sensor types may be necessary. Sensors mounted on moving objects (ships, etc.) are not permanently accessible due to communication constraints, so sensor fusion queries must adapt to changing sensor configurations.
- Support and extensibility for different data types: The architecture must be able to adapt to different data types and domain-specific data models. This includes, e.g. video data and streams, images, audio data and data generated by laser scanners.
- Information quality processing: Data quality indicators like existence probability, data confidence, error margin, etc. shall be integrated from compatible sensors or can be calculated from previous sensor data.
- Adaptive distribution of processing: For large-area applications, communication between processing nodes may be limited or impossible. Here, an adaptive distribution of processing is needed to adjust to changing communication modes. Nodes which generate large amounts of data, for example video sources, may preprocess their data and send only semantic information if the communication link provides a low bandwidth, or exchange whole video streams, which are fused together at a central processing node for better object detection, if the communication link provides a high bandwidth.
- Validation of data management: Since the sensor data processing is a vital part of the system, it should be verifiable and deterministic. Instead of allowing arbitrary sensor data processing programs in a fully general programming language, a processing model with a formally defined, limited algebra would enable systems where the sensor data processing can be automatically validated to fulfil specified requirements.
- Reproducibility of experiments and tests; intelligent archiving: To support the execution and evaluation of experiments, recorded sensor data must be archived in an intelligent way, allowing access to the data of each sensor separately and with georeferencing of sensor positions to gain full spatial information for the sensor data. Also, experiments must be reproducible, such that the same data processing pipeline of the original experiment can be reused for replays.
- Linking of modelling, simulation, execution and evaluation: Since research on sensor-based applications includes modelling, simulation, execution and evaluation, these activities must be linked together tightly. In addition to the reproducibility of experiments, simulation environments must be easy to integrate into the architecture via an interface. The simulation interface therefore provides an input, e.g. for data grounded on archived data or on modified, enriched archived data. The idea behind this is to couple in data from different eMIR services or data generated by other simulations, e.g. lidar sensor data or environment models. Thus, the environment picture can be exhibited for safely simulated scenarios or for testing and training purposes, like dashboards for situational awareness.
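The bandwidth-dependent behaviour described under "adaptive distribution of processing" can be sketched as a simple policy function (a minimal illustration; the threshold, names and payload types are hypothetical and not part of the test bed implementation):

```python
# Sketch: a sensor node decides, per measured link bandwidth, whether to ship
# raw video frames or only locally extracted semantic detections.

LOW_BANDWIDTH_THRESHOLD_MBPS = 10.0  # hypothetical cut-off value


def select_payload(frame_bytes: bytes, detections: list, bandwidth_mbps: float):
    """Return what a sensor node should transmit for one video frame."""
    if bandwidth_mbps >= LOW_BANDWIDTH_THRESHOLD_MBPS:
        # High-bandwidth link: stream the whole frame so a central node
        # can fuse several streams for better object detection.
        return ("raw_frame", frame_bytes)
    # Low-bandwidth link: pre-process locally and send semantic info only.
    return ("detections", detections)


# On a 2.5 Mbps link, only the semantic detections are transmitted.
payload_kind, _ = select_payload(b"\x00" * 100_000, ["vessel@(12,34)"], 2.5)
```

The same decision could of course be re-evaluated continuously as the link quality changes, matching the "changing communication modes" requirement.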

Initial situation
As initially mentioned, various services and experimental test beds are provided within the eMIR platform (Hahn, 2014). Within the latter, the research port test bed is based on optical sensors and sensor fusion technologies for the generation of situational awareness for participating users in civil applications. Other platforms focus on distributed information fusion for coastal surveillance, like NEOps (Network Enabled Operations; When et al., 2007), which promises comprehensive working capacity and processes incoming data immediately; however, this system is not envisaged for civil applications. Furthermore, context-based adaptive fusion systems which can handle video data (Garcia et al., 2011) present research opportunities in the harbour; however, this approach is focused on post-processing distributed data. None of the existing platforms follow an open-source approach, cover an explicit way of sharing channelled distributed information with multiple users within the test bed or allow a simulation or a replay with stored data. Especially the aspect of testing new optical sensors for varying scenarios or application tools to increase safety by observation and monitoring of the maritime environment is mostly unaddressed; this shall be covered by the research port test bed and the underlying architecture presented here.

System architecture
As mentioned in the previous chapter, the system architecture needs to support distributed installations for sensor-based research. Figure 1 shows an overview of the main components that the system architecture needs to implement. Solid lines denote the data flow, dashed lines the control flow. Before sensor data are used within the architecture, they should be processed by quality and validation components that can, e.g. check whether the illumination is sufficient for a certain camera system. After the data are cleansed in the quality and validation module, they will be further processed or archived for later usage (e.g. replay). Often, applications do not need raw data streams but some kind of processing (e.g. filtering or selection of data), which is provided by processing modules.
On the top layer of this test bed architecture, applications under testing (like dashboards for situational awareness or decision support systems) receive data from the system. They specify their data needs to the adaptive configuration component, which makes sure that the needed data are routed by the communication component to the application. The results of such modules are not only sent directly to applications but are also integrated into the so-called world model. This component is key for situational awareness: live data updates from sensors are correlated with pre-knowledge (semantics) about the observed situation, like map data or mobility models of detected objects. If a history of data needs to be kept, live world model data can be stored in archives. On the bottom layer, several distributed sensor systems should be supported; if a real-world field test is to be enriched by simulated data, we also need interfaces to simulation modules.
To implement this architecture, we set up a data stream management system (DSMS; Babcock et al., 2002) to cope with the previously mentioned requirements. Similar to a database management system (DBMS; Date, 2004), a DSMS provides a high-level interface to specify the applications' information needs. In a DBMS, this is typically realized by an SQL (Structured Query Language) API (application programming interface), where the application sends a one-time query and gets the results. Since the applications in the test bed need an ongoing data stream, they would need to query the framework at a high frequency, leading to high data volumes not only on the data flow but also on the control flow (the queries). Therefore, a DSMS is tailored to process continuous queries (Geesen et al., 2012). Here, the query is sent only once to the data management system, and results are continuously sent to the application whenever they are available (similar to the publish-subscribe paradigm, sometimes also referred to as event-driven architecture).
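The contrast between a one-time query and a continuous query can be illustrated with a minimal publish-subscribe sketch (illustrative Python only; the class and method names are hypothetical and do not reflect the actual DSMS interface):

```python
# Sketch: a continuous query is registered once and results are pushed to
# the subscriber whenever a matching stream tuple arrives.

class StreamEngine:
    def __init__(self):
        self.subscriptions = []  # (predicate, callback) pairs

    def register_continuous_query(self, predicate, callback):
        # The query is sent only once; results flow back continuously.
        self.subscriptions.append((predicate, callback))

    def push(self, tuple_):
        # Each arriving stream tuple is tested against all standing queries.
        for predicate, callback in self.subscriptions:
            if predicate(tuple_):
                callback(tuple_)


engine = StreamEngine()
alerts = []
# Continuous query: notify whenever the water temperature exceeds 20 degC.
engine.register_continuous_query(lambda t: t["temp_c"] > 20.0, alerts.append)

for reading in [{"temp_c": 18.0}, {"temp_c": 21.5}, {"temp_c": 19.9}]:
    engine.push(reading)
# alerts now holds only the matching tuple.
```

A DBMS client would instead have to poll with the same query at high frequency to approximate this behaviour, which is exactly the control-flow overhead the DSMS avoids.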

Implementation of access framework, processing and data fusion steps
Our set-up is based on the DSMS framework Odysseus (http://odysseus.offis.uni-oldenburg.de/), developed by the University of Oldenburg. In general, the system architecture could also be implemented using other DSMS systems. However, since Odysseus was designed as an extensible framework for data stream management systems, our requirements could be realized with reasonable effort (Appelrath et al., 2012; Bolles et al., 2010). The framework includes sensor data access, processing and fusion from one or more processing units, as these are essential requirements for our applications to facilitate situational awareness with distributed optical sensor system operations.
Referring to optical sensors, data fusion can happen at three semantic levels, which require different efforts for the set-up, configuration and analysis of sensors and sensor data by the researcher or the system administrator: visual alignment, feature and model level. For the creation of a context model, comprehensive categorized processing steps and data fusion levels for this system framework were introduced by Kuka et al. (2014) in accordance with the JDL (Joint Directors of Laboratories) data fusion process model (Hall and Llinas, 1997). Figure 2 presents the implementation of our proposed architecture, assigned to the following structure:

1. At sensor level, base sensors and protocol handlers were connected to the data stream management level, using standard data protocols such as NMEA (NMEA, 2002).
2. Fusion level 0 provides pre-processing steps through an access framework and attribute mapping. The access framework accomplishes communication configuration and a transfer of raw sensor data into a common data structure and union base. The result is a sensor source with a transport, protocol and data handler, e.g. a tuple element for a picture. The attribute mapping determines which data elements are included in the output stream of an incoming sensor source, e.g. video streams. This way, sensor quality and validation elements can be annotated for downstream operations, e.g. calibration flags of the IR camera, remission coefficients of the lidar sensor or GPS sensor quality and number of satellites. The only processing provided here is a temporal synchronization, the simplest form of sensor data fusion. Here, data from several sensors can be visualized on the same screen in different frames.
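The temporal synchronization mentioned above can be sketched as a timestamp-based pairing of two streams (a simplified illustration; the tolerance value and data layout are assumptions, not the test bed's actual mechanism):

```python
# Sketch: temporal synchronization of two sensor streams by pairing each
# tuple of stream A with the closest-in-time tuple of stream B, within a
# tolerance window.

def synchronize(stream_a, stream_b, tolerance_s=0.5):
    """Pair (timestamp, value) tuples from two time-ordered streams whose
    timestamps differ by at most `tolerance_s` seconds."""
    pairs, j = [], 0
    for ts_a, val_a in stream_a:
        # Advance through stream_b towards the timestamp closest to ts_a.
        while j + 1 < len(stream_b) and \
                abs(stream_b[j + 1][0] - ts_a) <= abs(stream_b[j][0] - ts_a):
            j += 1
        ts_b, val_b = stream_b[j]
        if abs(ts_b - ts_a) <= tolerance_s:
            pairs.append((ts_a, val_a, val_b))
    return pairs


camera = [(0.0, "frame0"), (1.0, "frame1"), (2.0, "frame2")]
lidar = [(0.1, "scan0"), (1.9, "scan1")]
# frame0 pairs with scan0, frame2 with scan1; frame1 has no close partner.
pairs = synchronize(camera, lidar)
```

Synchronized pairs like these are what allow data from several sensors to be shown side by side in consistent frames.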
3. Fusion level 1 establishes a feature level by object elaboration of a sensor stream. A feature (Kuka et al., 2014) is a measurable data fact, e.g. the temperature of the water at a given position. If several sensors provide this information, a user could define an aggregation function to fuse several temperature values into one. Such aggregations can easily be defined using standard data stream PQL (Procedural Query Language) queries. Image feature operators were applied to camera sources to resize raw images from 4K pixel visual cameras or to reinterpret temperature values from IR cameras into a usable, educible greyscale picture. New features can also be assigned to a new protocol handler, like FFmpeg Java bindings, to use third-party video compression algorithms via image functions, a necessary step to enable effective networked streaming operations. The matrix function provides arbitrary functions to work with matrices from lidar sensor streams, and due to the high scanning frequency the sample operator reduces load by discarding data stream tuples. The result of fusion level 1 provides data sinks with live streamed features via UDP (User Datagram Protocol).
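The user-defined aggregation function for fusing several temperature values could, for instance, be a confidence-weighted mean (an illustrative sketch; the weighting scheme is an assumption, not the paper's implementation):

```python
# Sketch of a feature-level aggregation: fusing water-temperature readings
# from several sensors at (roughly) the same position into one value.

def fuse_temperatures(readings):
    """Confidence-weighted mean of (value, confidence) readings; a stand-in
    for the user-defined aggregation function mentioned in the text."""
    total_weight = sum(conf for _, conf in readings)
    if total_weight == 0:
        raise ValueError("no confident readings to fuse")
    return sum(value * conf for value, conf in readings) / total_weight


# Three sensors report the water temperature with different confidences;
# the low-confidence outlier contributes little to the fused value.
fused = fuse_temperatures([(14.8, 0.9), (15.2, 0.9), (16.0, 0.2)])
```

In the actual system, such a function would be expressed as a query-language aggregation over the stream rather than as application code.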
4. At sensor fusion level 2, local models are generated using only those sensors available to one node. Subscribers can then access these models via network connections. To relieve the user of formulating the queries, we developed a data stream admin client application. Using a graphical user interface (illustrated in Fig. 3), the user can register new sensors to the system, start and stop data archiving, or replay stored data. Also, the visual rendering of local models can easily be realized with the general admin client application.
5. At the application level (the highest provided level), the previous sensor fusion levels from multiple processing units of the architecture provide the output of local models to the world model (Kuka et al., 2014), which enables distributed situational awareness between participants. A sample world model would contain the map of the harbour, including the current features/status of all objects. A global model fuses, cleanses and filters raw sensor data from all available data sources to obtain high semantic levels relevant for situation detection and decision making. If no sufficient communication link between distributed processing units is available, world models are generated using only those sensors available to connected nodes. Multiple (distributed) subscribers, e.g. admin client applications, can then access this world model via network connections.

System set-up
The set-up consists of a sensor network communicating with an internet-accessible base station, providing a flexible system for sensor data recording, processing and access. The nodes of the sensor network, the mobile sensor boxes, can easily be placed at different locations, where they establish a multi-hop Wi-Fi link to the base box (Fig. 4, top). Different kinds of sensors can be attached to each box. GPS allows the referencing of the sensor systems to a geoposition and delivers internal system timing (all sensor systems were also synchronized to the time server ntp.uni-oldenburg.de).
The sensor boxes may also be placed on a vessel, where they record and store data until these can be transferred to the base box or to a mobile box with a connection to the base box. The base box also provides remote access to the stored data and the research port infrastructure via Wi-Fi to an admin client in the harbour area or via the internet. The mobile sensor boxes provide an internal power supply, Ethernet, a USB interface, Wi-Fi access and a LAN (local area network) hub, packaged in a weather-proof box for mobile outdoor sensor measurements (illustrated in Fig. 4, bottom). Multiple self-sufficient boxes form a sensor network using Wi-Fi bridges, directional antennas (Nanobeam NBE-M5AC-500, Ubiquiti Networks, USA) or wired Ethernet to overcome long communication distances to the base box. Sensor data recording, processing and access are realized using the high-volume data stream architecture. In the context of research on e-navigation, high-volume data streams are often provided from moving objects such as vessels or come from ad hoc sensor configurations for a specific experiment, which can be flexibly handled with this framework.

Network management and administration
Any number of the geographically distributed sensor nodes can be connected via Ethernet cable, Wi-Fi or directional antennas, forming a network with a single subnet. Since each node contains a distinct Wi-Fi router which offers a DHCP (dynamic host configuration protocol) server for wireless devices, different DHCP address ranges had to be used for each box to avoid collisions between multiple DHCP servers on a single subnet. To simplify static and dynamic address allocation and recognition, a block of 255 IPv4 addresses was assigned to each box, starting with 190.168.1.x/16 for the first node, 190.168.2.x/16 for the second node and so on. Note that, using a 16 bit subnet mask, all addresses in the block 190.168.x.y belong to the same subnet, which drastically simplifies network topology and set-up. Each DHCP server then allocates addresses in the 190.168.x.100–190.168.x.254 range.
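The per-box address plan above can be made concrete with a small helper (illustrative only; it merely reproduces the 190.168.x.y convention described in the text):

```python
# Sketch: computing the per-box address blocks of the described scheme,
# where the third octet identifies the sensor box and each box's DHCP
# server hands out host addresses .100-.254 within its own block.

def box_addresses(box_index: int):
    """Return (static_block, dhcp_range) for the n-th sensor box (1-based)."""
    prefix = f"190.168.{box_index}"
    static_block = f"{prefix}.0/16"          # third octet reserved per box
    dhcp_range = (f"{prefix}.100", f"{prefix}.254")
    return static_block, dhcp_range


_, dhcp = box_addresses(2)
# Box 2 hands out wireless addresses 190.168.2.100-190.168.2.254, while
# all boxes still share the single 16 bit subnet 190.168.0.0/16.
```

Because every block lies within the same /16 subnet, adding a box never requires routing changes, only a new third-octet assignment.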
Connecting sensor nodes is then as easy as plugging in an Ethernet cable or setting up a Wi-Fi bridge using the HTTP interface of the Wi-Fi routers. For distances up to 5 km, directional Wi-Fi antennas can be used, which act as a reliable network bridge connecting two networks with the same subnet settings. To establish sensor data streams with clients over even greater distances, in general over the internet, a VPN (virtual private network) approach, e.g. via LTE (long-term evolution), is preferable. For this, the open-source software of the SoftEther VPN project, developed by the University of Tsukuba, Japan (SoftEther, 2016), was integrated in the sensor nodes. In addition to basic VPN server and client functions, the software also offers a VPN bridge, connecting two spatially separated parts of the same subnet via the internet. If the bridge is established on two PCs in different subnets, it routes packets from different clients in each subnet, emulating a hardware network bridge. Regarding the maximum transmission unit (MTU) settings of network nodes on the VPN bridge, lower MTU values are preferred to receive stable video streams. This allows clients in one part of the network to directly access sensors in another part of the network via Ethernet, and no additional specific sensor data access software is needed on the sensor node PCs.

Results and discussion
The preceding sections described the operating principles and development of the uniform flexible optical sensor systems at a technical level. Accordingly, technical tests were performed successfully for ad hoc networking connectivity, sensor data stream scripts and the central administration of the distributed sensor systems. Wireless high-volume data transfer for long-distance onshore-offshore communication was tested in the field over a range of 1000 m via a 5 GHz WLAN (wireless local area network) beam bridge (Nanobeam M5 AC 500) at up to 300 Mbps for video streaming suitability. The antenna specifications promise further ranges of up to 5000 m, with a limited reception angle. Furthermore, special check-ups of the visual and infrared camera lenses were performed in the optical laboratory.
In the following, we introduce applications for the purposes of the physical test bed and present results from the optical sensor system set-up in the research port maritime environment.
For the field experiments, we used the port in Bremerhaven, Geestemündung, as a research port. The port is located in Bremerhaven, Germany (53° 32.133′ N, 8° 34.667′ E). With its highly frequented ferry terminals, berths at the entry to a double lock and the upstream waterway, it is a suitable study area for the geographically distributed optical sensor systems. Up to four observation stations were temporarily equipped with the self-sufficient sensor systems, attached to VIS cameras, lidar scanners and one IR camera (highlighted in blue in Fig. 5), where these establish a multi-hop Wi-Fi link (illustrated in Fig. 4, top) covering the study area. Additional offshore applications were performed during a campaign in close proximity to the JadeWeserPort terminals in Wilhelmshaven, Germany (53° 35.500′ N, 8° 9.300′ E).

Online detection of relevant situations
As the optical sensor system is designed to provide distributed optical sensor information for participants over a channelled data stream interface, it is obvious to use it for the detection of relevant situations. These may occur due to obscure information management, e.g. insufficient dissemination or incorrect appraisal of a local threat (Hall and Jordan, 2014), and can cause safety-critical and environmentally hazardous situations. Therefore, the sensor system information could be used to support situational awareness to assess and avoid such situations.
In the context of e-navigation environment applications, we present the following scenarios for the online detection of relevant situations. To that end, we inspect various berthing and critical-situation manoeuvres, person-overboard situations and the remote sensing of sea surface temperature, which is related to the environmental circumstances and the state of the upper ocean.

Berthing assistance
Berthing assistance tests were performed with stationary onshore-placed and mobile offshore-placed sensor systems. The objectives of these surveys were to test different operating principles and the applicability of a shore-based supporting system and a mobile shipboard-based system.
In the onshore demonstration, the video and laser scanner data (sensors introduced in Sect. 3.1) were wirelessly streamed between the stationary sensor systems in the covered area of the research port (Fig. 5) and the research vessel RV Heincke and vice versa. The latter utilized a vessel-based LTE router (B593s-12, Huawei Technologies, China) connected to the mobile sensor system. Furthermore, a forward-looking visual camera and a lidar scanner were placed at the bow deck of the RV and connected to the sensor system for a mobile survey. The streamed data were then displayed by the admin client application (introduced in Fig. 3) on the bridge of the research vessel, as seen in Fig. 6, and also by an admin client application located at a station near the ferry terminals in the harbour. Berthing procedures were performed inside the double lock, before entering the area covered by the stationary systems in the harbour. Meanwhile, the information of the shipboard-driven system was also streamed to the stationary systems. Further tests inside the stationary covered area were performed while the RV left its berth toward the port entrance and also when the RV came into the port from the upstream waterway. In all test cases, a flexible view of different scenes of the maritime study site could be achieved for all participants, which is a particular mutual benefit for safety when entering the port from the upstream waterway. The tight entrance of the port, used by ferries together with other small private and large transportation vessels, elucidates the beneficial character of shared information from scenes covered by optical sensors, allowing participants to coordinate manoeuvres within this area. With the research port, which offers various scenes and manoeuvres in a closed, well-known area, the field study set-up of optical sensor systems and its results therefore become representative for upscaling approaches.
Another assistance approach for a berthing manoeuvre was performed with a mobile sensor system installed on RV Senckenberg. Two lidar scanners were placed near midship on the port side and starboard side, both at working deck level (about 1 m above the sea surface). This position offers an unobstructed horizontal bearing range of 180° and operating ranges of 20 m radius for coverage of the vessel sides/board walls, illustrated in Fig. 7. Two visual cameras, equipped with wide-angle lenses, were additionally positioned at bridge level, aligned to the field of view of the lidar scanners. They provided the corresponding live video picture of the measured lidar structures and supplemented the lidar distance map. This was also done to enhance the assessment of the lidar information, because the horizontal scanning range requires a corresponding object height above sea surface level for covering structures, which, especially during high tide, is not pre-existing at a tide-dependent berthing facility. Measurements were then performed while the vessel approached the quay bulkhead to berth on its starboard side, illustrated in Fig. 8.
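Deriving a berthing distance from one lidar scan line, as in the set-up above, can be sketched as taking the shortest valid range over the 180° bearing (a simplified illustration; real scans would additionally require filtering of spurious returns and coordinate transforms):

```python
# Sketch: estimating the distance to the nearest structure (e.g. the quay
# bulkhead) from a single horizontal lidar scan line.

def min_quay_distance(scan, max_range_m=20.0):
    """scan: list of (angle_deg, range_m) pairs over a 180 deg bearing.
    Returns the shortest valid range, i.e. the distance to the nearest
    structure within the operating radius, or None if nothing is in range."""
    valid = [r for _, r in scan if 0.0 < r <= max_range_m]
    return min(valid) if valid else None


# Synthetic scan: open water ahead (full range) and a quay wall appearing
# from 60 deg onward, closest at the 60 deg bearing.
scan = [(a, 20.0 if a < 60 else 3.2 + 0.01 * a) for a in range(0, 181, 5)]
closest = min_quay_distance(scan)
```

With the reported distance resolution of less than 0.1 m, such a minimum-range estimate is well within the precision needed for berthing support.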
For the surveys realized herein, the lidar scanner provides distance resolutions of less than 0.1 m, a sufficient capability for berthing applications, as has also been identified by Jimenez et al. (2004). Through berthing-supportive approaches, the potential for expansions, e.g. extended lidar scanner scopes or appropriate field-of-view set-ups in the environment, can be demonstrated. Ambient conditions, e.g. low surface-object remission (infrared, 905 nm) such as from (vessel) paintings or biofouling, can interfere with the lidar scanner measurements by reducing the detection range (Sick AG, 2015). It is therefore recommended to pre-investigate the arrangement of the set-up (field of view) and the positions of the distributed optical sensor systems in the field study area to prevent adverse effects.
Comparing stationary and mobile optical-based sensor systems, it is obvious that there are individual pros and cons for both berthing approaches. Connected sensor systems, as demonstrated in the research port, provide mutual benefits for all participants, but the information value for individual users depends on the sensor-covered areas. On the other hand, the mobile approach with shipboard-driven systems provides its own independent information for a single user, but the mutual benefit for others is constrained. In all cases, the results demonstrate the successful application of flexible optical sensor systems, providing a possibility to develop, test and demonstrate applications in real study areas.

Person overboard situation
A life-endangering situation can arise if a person goes overboard. Seafarers and dock labourers in particular are exposed to this danger. To demonstrate support for locating a person in seawater, a controlled test (without danger to life) with shipboard-carried sensors was carried out.
Infrared thermal camera technology is not limited to daylight and operates without illumination of a scene or object by visible light. In this way, the night vision characteristics of the optical sensor system become a decisive advantage for person SAR (search and rescue) applications. Therefore, the thermal image information (7.5-13 µm band) of the infrared camera was used to locate and track a warm object emulating a person in the open sea. Additionally, the visual camera (sensors introduced in Sect. 3.1) provided the corresponding live video picture of the scene, illustrated in Fig. 9. This set-up enables synoptical information, and therefore mutual support, to recognize a critical situation while daylight is present. The optical sensor system was deployed onboard RV Senckenberg, and the sensors were placed looking forward at bow deck level (about 6 m above the sea surface). Emulating a person, a safety buoy with a water-filled container (initial temperature of 40 °C) was prepared and lowered into the cold (4.5 °C) water (performed in February). Subsequently, the vessel moved away up to 400 m from the lifebuoy. Meanwhile, the temperature difference of the warm body decreased over time due to heat exchange with the ambient water. An automatic detection of the maximum temperature in the thermal image by a measuring grid of 3 × 3 pixels was possible with this observation at about 150 m object distance within the first 10 min for the warm body (down to 1 K temperature difference between object and environment). A minimum of a 3 × 3 pixel MFOV (measuring object field of view) is recommended and represents 750 mm² of object surface at 150 m distance to the measuring object. A 1 × 1 pixel evaluation did not provide reliable results to discriminate objects against disturbances, e.g. radiant reflections of waves. At distances above 150 m, the measuring grid falls short of 3 × 3 pixels for the tracked object, and in addition waves masked the tracked object, according to the vessel-carried camera perspective and prime lens (38° × 29°) of the recognition approach. Through this successful study we also gained information about the characteristics of optical sensors and the measuring arrangement to extend the upper detection distance for calm environmental conditions. There are potentially more advanced and practical observation levels to reduce wave-masking effects on objects, e.g. above the bridge of the RV (about 10 m above the sea surface), which can provide up to 100 m of additional detection distance. Furthermore, available infrared telephoto lenses (13° × 10°) can enable an additional 260 m of detection distance while keeping a 3 × 3 pixel MFOV. For this purpose, newer infrared camera models with optical sensor resolutions up to 640 × 480 pixels are also promising in comparison to the applied (382 × 288 pixels) infrared camera model. Concluding the technical aspect, the usage of optical sensor systems can enhance SAR for persons overboard. Considering the seriousness of the life-threatening situation, this technology shall support situational awareness by providing safety-relevant information, while all other possible safety precautions remain indispensable.
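The automatic detection step can be sketched as follows; this is a hypothetical NumPy illustration (not the deployed software), assuming a radiometric temperature image and the 3 × 3 pixel measuring grid with a roughly 1 K discrimination limit described above:

```python
import numpy as np

def detect_warm_object(temp_img, delta_k=1.0):
    """Detect a warm object in a radiometric image via a 3x3 MFOV grid.

    temp_img: 2-D array of temperatures. Averaging over a 3x3 window
    suppresses single-pixel disturbances such as radiant wave
    reflections, mirroring the finding that a 1x1 evaluation is not
    reliable. Returns the (row, col) centre pixel of the warmest 3x3
    window, or None if its mean does not exceed the image median
    (taken as background) by delta_k.
    """
    t = np.asarray(temp_img, dtype=float)
    h, w = t.shape
    # 3x3 box means over all window positions (plain loops, no SciPy)
    means = np.empty((h - 2, w - 2))
    for r in range(h - 2):
        for c in range(w - 2):
            means[r, c] = t[r:r + 3, c:c + 3].mean()
    r, c = np.unravel_index(np.argmax(means), means.shape)
    background = np.median(t)
    if means[r, c] - background < delta_k:
        return None  # below the ~1 K discrimination limit
    return int(r + 1), int(c + 1)  # centre pixel of the 3x3 window
```

Run per frame, the returned pixel position can be tracked over time and overlaid on the visual camera's live picture.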

Remote sensing of the sea surface temperature
Optical sensor systems are highly relevant as a remote sensing tool for maritime environmental research and operational applications (Moore et al., 2009). Thermal infrared imagery provides the sea surface temperature (SST), which varies with environmental conditions (e.g. according to the daily cycle of heating by the sun) and is also an important variable in air-sea interactions. This passive sensing method derives information from surface infrared emissivity and (when applied to water) from a layer only several micrometres thick.
The infrared emissivity factor of seawater for nadir view observations is in the range of 0.96-0.99 at wavelengths relevant to thermal remote sensing (the corresponding ideal Planck (black body) emitted radiance is 1.00) and can hence be considered high (Soloviev and Lukas, 2014). The sea surface temperature is intimately related to surfactants, surface films, thermal convection and pollution impacts (Zielinski, 2003), corresponding to the state of the upper ocean. Therefore, the optical sensor system could provide access to information relevant to the good environmental status. In this way, anomalies, e.g. local high-temperature patterns, can also be revealed, facilitating a focus on regional situational awareness. Our proposed sensor system, equipped with a PI 450 microbolometer infrared camera (7.5-13 µm band), enables us to exploit the spatio-temporal variation of the SST in stationary and mobile applications.
As part of a demonstration in the research port area, a stationary system was positioned looking towards the port entry, waterways and a ferry terminal, as illustrated in Fig. 10 (top). Regions of interest (ROIs) were predefined in the radiometric video image for the sea surface of the port area and for the waterway area where vessels pass by, illustrated in Fig. 10 (bottom). The measurement output is the mean temperature of the ROI (emissivity factor 0.96). The temporal temperature profile of the port ROI (black graph) showed the heating of the sea surface by the sun under cloudless sky conditions within 20 min (here in June). The waterway ROI provided the temporal profile of the mean temperature (blue graph) within this area. Here we found, in addition to the solar heating, a sudden temperature rise as a ferry came into port at about 100 s and departed at about 500 s. The offset between both ROI temperatures indicated that the SST differed between the waterway and the port area, which in this case could indicate a transportation impact or different mixing of water masses. With this exemplary approach, we revealed multiple patterns from the sensor information, representing environmental as well as transportation conditions.
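The ROI evaluation above reduces, per frame, to a mean over a pixel region of the radiometric image. A minimal sketch, with a hypothetical helper name and rectangular ROI bounds as assumptions (the deployed software defines ROIs graphically):

```python
import numpy as np

def roi_mean_series(frames, roi):
    """Mean-temperature time series for a region of interest (ROI).

    frames: iterable of 2-D radiometric images (temperatures);
    roi: (row0, row1, col0, col1) half-open pixel bounds of the ROI,
    e.g. covering the port water surface or the waterway.
    Returns a 1-D array with one mean value per frame, i.e. one point
    of the temporal temperature profile per video frame.
    """
    r0, r1, c0, c1 = roi
    return np.array([np.asarray(f, dtype=float)[r0:r1, c0:c1].mean()
                     for f in frames])
```

Evaluating two such series (port ROI and waterway ROI) side by side yields the temperature offset and event signatures, such as a ferry arrival, discussed above.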
For a mobile demonstration of remote sensing of the SST, the optical sensor system was deployed onboard RV Senckenberg, where the infrared camera was installed on the bow deck (about 6 m above the sea surface), directed in nadir view towards the sea surface (emissivity factor 0.96). Meanwhile, the visual camera provided the video images of the associated region in front view, illustrated in Fig. 7. The route of the research vessel started near the offshore terminals of the JadeWeserPort, Germany, turned astern and then proceeded forward parallel to the terminal section, as illustrated in Fig. 11 (bottom). The temporal temperature series shows the turbulent water mass mixing effect of the vessel propeller at 150 s while it turned astern. After 290 s, a higher surface temperature occurred while leaving the terminal section forward towards an offshore landing pier, illustrated in Fig. 11 (top). Thereby, we could use the current spatial temperature information in addition to the visual video images to infer environmental circumstances. In this case, we identified the influence of industrial infrastructure on the SST.
As a consequence of changing environmental conditions, like sun exposure, sea surface glint or thermal reflections (from sky, clouds and also from harbour structures), the optical sensor information and covered scenes can be subject to interference, as shown in Fig. 12. This has to be considered especially in mobile applications, where the covered scenes can change rapidly. In the case of interference, the video image information on sea surface conditions, like the presence of reflected sunlight, could even be useful as supplementary quality information and flagging criteria for above-water optical measurement systems (Garaba et al., 2012, 2013).
It should be noted that the measurements presented in this paper do not represent systematic measurements but should be seen as use case demonstrations of the presented optical sensor systems. The correction of the infrared camera readings to absolute temperature has to involve air humidity and sea surface infrared emissivity; the latter varies with viewing angle, salinity and sea surface roughness (Kuenzer and Dech, 2013; Garbe, 2014; Minkina and Klecha, 2016).

Data collection for further analyses
Recording sensor data is handled similarly to starting queries from the admin client application interface. Input sources (live sensor data) can be separately assigned to a logging category to set up file size (split into chunks), recording interval or external triggers. Each server instance logs sensor data locally in raw format to enable replays of the recorded experiments reproducing the original scene (Fig. 3). Metadata like timestamps, logging category or query orders can be saved in the original file or as separate metadata files, for example, when using pre-existing video compression libraries which do not offer support for metadata. Figure 13 shows a logging query graph for an infrared camera, where two different logging queries for the same video sensor are running. The buffer of the 16 bit one-channel temperature image generated by the camera is reinterpreted as a 32 bit RGBA image, such that common video compression and streaming algorithms can be used. Each PQL logging query is translated into two Odysseus operators: a mapping operator which performs the image reinterpretation and a sending operator which logs the compressed video file. Using Odysseus query sharing, multiple logging queries share the reinterpreted image, such that the mapping operation is executed only once.
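The buffer reinterpretation can be sketched in NumPy (the actual Odysseus mapping operator is Java-based); a minimal illustration under the assumption that two adjacent 16 bit temperature pixels are viewed as the four bytes of one 32 bit RGBA pixel, so the transformation is lossless and invertible:

```python
import numpy as np

def reinterpret_for_video(temp16):
    """View a 16 bit one-channel temperature image as 32 bit RGBA.

    Two adjacent uint16 pixels become one uint32 (RGBA) pixel, halving
    the width, so common video compression and streaming code paths can
    be reused unchanged. No pixel values are copied or altered.
    """
    img = np.ascontiguousarray(temp16, dtype=np.uint16)
    h, w = img.shape
    assert w % 2 == 0, "width must be even to pair 16 bit pixels"
    return img.view(np.uint32)  # shape (h, w // 2)

def recover_temperatures(rgba):
    """Inverse view: 32 bit RGBA back to the 16 bit temperature image."""
    return np.ascontiguousarray(rgba, dtype=np.uint32).view(np.uint16)
```

Because both directions are pure dtype views, a replay can reconstruct the original radiometric data exactly (assuming lossless video compression in the logging chain).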
The following specific sensor types and data formats, which were used in the research port application, are implemented in the server software:

- Lidar scanners scan distances in a plane around the sensor. The scan results are represented as a polar coordinate distance map.

- Visual and infrared cameras provide a stream of RGB or temperature images. A video bundle extends Odysseus to support video streaming and video writing functionality.

- NMEA sensors provide data in the NMEA format (NMEA, 2002) over TCP/UDP or serial connections. This includes (differential) GPS sensors and wind speed sensors.

- Text file sensors combine operators so that sensor data continuously logged to text files can be read out. Timestamps can either be extracted from the log file or be extrapolated using information about the logging frequency.
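As an illustration of the NMEA input path, the following is a minimal parser sketch for an RMC position sentence. The function is a hypothetical helper, not the server implementation; it applies the NMEA 0183 checksum rule (XOR of all characters between "$" and "*") when a checksum is present:

```python
def parse_nmea_rmc(sentence):
    """Minimal parser for an NMEA 0183 RMC sentence (time/position).

    Validates the checksum if one is appended and returns a dict with
    UTC time and latitude/longitude in decimal degrees. A sketch only,
    covering a small subset of the NMEA format.
    """
    body, _, checksum = sentence.strip().lstrip('$').partition('*')
    calc = 0
    for ch in body:
        calc ^= ord(ch)  # XOR checksum over the sentence body
    if checksum and int(checksum[:2], 16) != calc:
        raise ValueError("NMEA checksum mismatch")
    fields = body.split(',')
    if not fields[0].endswith('RMC'):
        raise ValueError("not an RMC sentence")

    def to_deg(value, hemi, deg_digits):
        # NMEA encodes e.g. 4807.038 as 48 deg 07.038 min
        deg = float(value[:deg_digits]) + float(value[deg_digits:]) / 60.0
        return -deg if hemi in ('S', 'W') else deg

    return {
        'utc': fields[1],
        'lat': to_deg(fields[3], fields[4], 2),
        'lon': to_deg(fields[5], fields[6], 3),
    }
```

In the server, such parsed tuples (here from a GPS sensor; wind sensors use other sentence types) enter the data stream with their extracted timestamps.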

Reuse of data
As initially mentioned, the experimental platforms are designed for system-engineering elaborations, e.g. testing new strategies, networking communication and information/data fusion for the generation of situational awareness in a research port, provided here by optical sensors. The experiment platform therefore also becomes important for data mining, providing grounded experimental data and encouraging exchange with simulation tools. Recorded data can be reused retrospectively to support further developments and to process additional queries or test new applications without the need to re-run expensive real-world scenarios. The design of the system architecture facilitates (concurrently bilateral) data interchange between real sensor nodes and simulated nodes, and between the real world and the simulated world, as shown in Fig. 1. In this way, complicated scenarios can be recorded once and then replayed several times. Additionally, replay sensor nodes can be used concurrently with simulations and real sensor nodes.

Conclusions and outlook
We conceptualized and implemented an experimental maritime test bed for the design and evaluation of sensor data fusion, communication technology and data stream analysis tools. The approach especially considered the application of distributed optical sensor technologies in maritime study areas, here a port in Bremerhaven (Germany) as a reference research port field study area. The described set-up offers high flexibility for sensor interfaces, networking/communication connectivity and information-processing query instances, as well as for the physical mobile properties of the sensor system periphery. Therefore, it can be applied in various research fields, from e-navigation through marine environment studies to the generation of shared situational awareness (Surm et al., 2016). Engineering approaches were evaluated to derive the requirements of the proposed framework, which guided the design of the system and the data stream architecture. We presented a general-purpose structure for processing interfaces of sensor systems, simulation modules, replay modules and also applications, which can specify the needed data via an adaptive configuration component. The results of such modules can not only be sent directly to applications but are also integrated into the so-called world model, which is the key component for situational awareness.
While translating the architecture concept into a data stream management system tool for sensor fusion processes and communication, we coped with additional requirements, e.g. the management of distributed sensors or the provision of tailored user interfaces. Based on scenarios and use cases in the maritime study area, we presented applications for situational awareness and facilitated a situational and marine environmental live picture. The possibility to integrate the sensor nodes freely and flexibly into (maritime) study areas offers advantages in the research field under various aspects. Infrastructures in environmental study areas are often not fully developed; communication and the supply of energy for sensors are often limited. Moreover, the central administration of distributed sensor systems enables findings without disturbance of the scene, which is important for sensitive environmental areas.
Cooperative technical systems for supportive situational awareness can provide information about safety-critical and environmentally hazardous situations. They play a vital role in today's society, and their complexity has dramatically increased over the last decades, since more and more supportive systems have evolved into so-called "socio-technical systems": humans are integrated by providing and assessing information and by making decisions in otherwise semi-autonomous systems. One example of such a system is automotive traffic: it evolves from purely human-driven cars through cars with advanced driver assistance systems to a cooperative system of systems with car-to-car, car-to-infrastructure, car-to-human and (still) human-to-human interaction. Similar trends can be observed for systems that are to be used in the maritime region, such as so-called cooperative e-navigation systems, which need appropriate test environments.
The obtained data can therefore be used for scientific foundation and in situ experiments by different applications, e.g. e-navigation services (Hahn, 2014; illustrated in Fig. 14), socio-technological research, data mining to build up a digital mock-up of the environment, pollution surveillance (Zielinski, 2003) or the monitoring of changing environmental conditions.
The developed optical sensor systems therefore provide a platform with flexible interfaces and sensor fusion technology to perform field studies in distributed areas for onshore and offshore applications. Thus, our development embeds comprehensive forward-looking aspects for future multi-platform observations.

Code availability
The open-source data stream management framework Odysseus is available at http://odysseus.informatik.uni-oldenburg.de, with detailed tutorials, instructions and libraries to set up various applications (the presented demonstration use case gives just a small impression of the whole range of possibilities). The framework consists of two instances: the core system Odysseus Server for sensor data processing, the stream querying interface and user management, and Odysseus Studio for the development and implementation of queries and Odysseus server administration.
Java 8 RE (Oracle, USA) and the Eclipse release 4.5.1 for rich client platform (RCP) and remote application platform (RAP) developers were used as the integrated development environment (IDE). The sensor-specific Pylon Viewer software was used for the initial adjustment of the visual cameras, e.g. frame rate, exposure time or white balance, and is available at http://www.baslerweb.com. The initial set-up of the communication interface, temperature range and picture orientation of the infrared video camera was performed with the PI Connect software available at www.optris.de.

Data availability
The sensor data presented in this paper are available upon request from the corresponding author. Videos are not publicly accessible because participants are identifiable.

Figure 1 .
Figure 1. Components of the architecture showing information and control/data flow with actors, environment (real/simulated), sensor, world model, communication and application layer connections.

Figure 2 .
Figure 2. Processing and data fusion steps in the DSMS framework, beginning at the sensor level, through pre-processing and feature annotation steps, to local fusion into a local model. Multiple distributed processing sources integrate local models via communication links into a world model at the application level, shareable with users in the research port area.

Figure 3 .
Figure 3. Graphical user interface of the data stream admin client application in replay mode with covered scenes from the research port in Bremerhaven, Germany.

Figure 4 .
Figure 4. Top: exemplary structure of a single mobile sensor system with sensors and integrated features (left) and schematic of a multi-hop Wi-Fi network formed by distributed sensor systems (right). Bottom: self-sufficient mobile sensor system nodes with lidar scanner (left) and visual camera (right) in front of a port entry.

Figure 5 .
Figure 5. Overview of the covered area (highlighted by semi-circles) from distributed sensor systems in the port of Bremerhaven, Geestemündung: ferry terminals (at the top), a double lock with berths (at the right) and the port entry (at the bottom).

Figure 6 .
Figure 6. Presentation of wirelessly streamed live data from shipboard sensor systems on the bridge of RV Heincke. The visual camera is shown on the left display side, the laser scanner on the right display side.

Figure 7 .
Figure 7. Shipboard-driven sensor system set-up and field of view (FOV) on RV Heincke; lidar scanners and visual cameras on the port side and starboard side; tilted forward-looking visual and infrared camera on bow deck level.

Figure 10 .
Figure 10. Top: view from the base optical sensor system station (shown at the top of Fig. 4) towards the port entry with ferry terminals (on the right) in visual and thermal infrared (pseudocolour) video images; bottom: region-of-interest measuring fields for mean sea surface temperature determination in the research port area (left) and corresponding temperature time series profile (right).

Figure 11 .
Figure 11. Mean temperature time series profile of the sea surface derived by a shipborne infrared camera during an offshore campaign in the JadeWeserPort area (top) and visual video images of the associated region (bottom).

Figure 12 .
Figure 12. Example of direct sun exposure where the dynamic image response of the visual camera becomes partly saturated. Surface reflections and sun glint are prominent in the visual picture (left); sea surface reflections of the sun and sky from the same scene are shown in the infrared greyscale picture (right).

Figure 13 .
Figure 13.Query graph for an infrared camera with two active logging branches which share one processing operator.

Figure 14 .
Figure 14. Maritime surveillance system with optical sensors and an additional wind sensor, AIS (automatic identification system) receiver and radar for e-navigation services at a reference waterway.