Data and information in the digital age are spearheading the evolution of services and product development, serving a continuum of user demands at all levels and scales, and boosting research and innovation. Data is now widely recognised as a key ingredient for competitiveness, a trend set to accelerate in the immediate future. New applications and innovative products are indeed often triggered by smart ideas and the exploitation of fresh technological avenues, but in practice quality development rests heavily on adding value to data and transforming it into information and knowledge.
We often refer to this as ‘intelligence’. The dictionary definition is straightforward enough; interpreted in our context, it is essentially about having access to more data layers (or parameters) than others, and the ability to go into greater detail and precision.
A typical example that comes to mind is ‘police intelligence’, which refers to the intricate system of records that a specialised element of the police force deploys to track and anticipate crime in the attempt to restrain it. For some, the definition stops here. Yet intelligence also involves harnessing skills to churn through greater volumes of data and, most importantly, the ability to probe deeper into data for the extraction and delivery of knowledge, to acquire further insight, to solve problems and to innovate.
To express it mathematically, intelligence equals data plus skill. Both are essential and complementary ingredients. Raw data on its own, without skilled processing, is futile. Skill on its own is a wasted resource, unexploited brain energy: it needs to be fed with data. For all his skill, would Einstein have arrived at relativity theory without the contemporary baseline of mathematics and science that he inherited from the past? Skill and data going together again.
In an era when data and information technology are the key driving forces of the knowledge economy, the creation and support of intelligence thus becomes critical to excellence. Essential steps for intelligence in the digital age include: a) the ability to access, share, codify, re-use and transform data and information into knowledge; b) the creation and recreation of hierarchical levels of increasing complexity in the interpretation, merging and synthesis of data; and c) the organised use of increasingly ramified networks and clusters of distributed activities.
The development of computer systems has enabled these articulated and complex processes to be performed by very fast machines and computer programmes that emulate human intelligence, such as in visual perception, speech recognition, decision-making and translation between languages. Artificial Intelligence (AI), as it is known, deals with large volumes of data (big data) interpreted by intelligent computer programmes at high speed, leading to self-teaching machines that process real-time data from multiple sensors, exploiting highly efficient processors and huge memory reservoirs, and exhibiting intelligence through machine learning protocols. The culmination of AI is often linked to autonomous robots, cars and ships, whose performance excels thanks to systems that use algorithms to learn from their mistakes and adapt autonomously to changing environments. In reality, AI is much more than this, and other lucrative applications lurk in high-performance human-machine interfaces and in medicine, such as in drug design, which will no longer remain constrained by relatively limited data samples.
All such systems derive their accuracy and usability from the availability of large volumes of fitting data. It is therefore not surprising that the most valuable companies in the world are those which drive their competitive edge through the allocation of significant resources to collect and process data. Digital platforms like Facebook and Google are two giant examples that take advantage of the collection of huge amounts of data from users. Other businesses are following suit and keeping up with the pace of digitisation by investing in massive data capture efforts and extracting intelligence from their data patrimony.
Unfortunately, few players have the resources to collect such extensive volumes of data. Not that the data does not exist. Data production is growing and covering many realms, types and scales, but most of it remains locked up in closed databases, enterprises and institutions, even though in some cases it is acquired with public funds. Unofficially it is estimated that the world generates 16 zettabytes (ZB) of data each year, but only one per cent is analysed. The problem is that data is withheld by its collectors, who consider hoarding it their right and refrain from sharing it. Even where data is released, it often does not flow automatically to users, and data remains untapped and scarce.
This is also the shape of things in the marine sector. Marine data and information services are triggering an unprecedented leap in the economic value of meteo-marine data, which is becoming essential for managing marine resources efficiently and for delivering benefits to the marine-related industry and services sectors. The future points to multiple-purpose observing systems and enhanced techniques to simulate the functioning and response of the marine ecosystem to external factors, linking marine data to economic, environmental and social domains.
Such systems cater not only for monitoring, but also for research, service provision, security, safety and policy purposes. However, the overwhelming leap in data valorisation comes about through the merging of these scientific datasets with broader classes of data encompassing non-scientific layers such as spatial information, demography, socio-economic factors, accessible resources and infrastructures, deployment of assets, tracking of maritime activities and many other diverse forms of information over different scales in space and time. This is already becoming critical to competitiveness, product development and the enhancement of services, and is conducive to the future goals of ocean-based economies.
In Europe, economic activity linked to the sea is estimated to involve 5.4 million jobs and to generate added value of around €500 billion annually. Europe has set the scene for what is anticipated to be a strong push in research and innovation to exploit the untapped potential of the oceans, seas and coasts in favour of the creation of sea-related jobs and growth. In the EU’s jargon this is termed ‘blue growth’, and it embraces the challenge of bolstering economic excellence in the maritime sector by exploiting regional niches. It opens new dimensions for marine research and innovation, for data integration and enhancement, and for the valorisation of data into information, knowledge and smart value-added downstream services.
Through its essential components of earth observation and forecasting, operational oceanography provides the routine data flows that are the essential ingredient for the delivery of operational integrated services using real-time multidisciplinary data, and for the merging of different data sources to deliver marine information in applications requiring hindcast, nowcast and forecast fields. These capabilities are set to shape and support the evolving maritime economies of countries, to meet the opportunities of an expanding marine-related industry, to exploit the positive trends in trade, and to satisfy the growing demands of societies for better standards of living.
This new engagement with the sea is set to follow a seamless path of sustainability, in synergy with principles of equity of benefits amongst countries, and without exacerbating the challenges set by climate change scenarios and socio-political situations. Yet even here the largest bottleneck is data-sharing. Individual marine data owners tend to hold on tightly to their data patrimony because its acquisition requires personal effort, and they guard assiduously against third parties taking advantage of their data before they have fully exploited it. Developers engaged in engineering works and projects commit to data collection for environmental impact assessments, feasibility studies and monitoring obligations, but tend to restrict the use of their data for other purposes.
Organisations and national responsible entities often assemble large datasets in connection with resource exploration and mining, as well as for sustainability assessments and regular monitoring, but tend to impose restrictions on the availability of their marine data on the premise of data sensitivity, proprietary interests or disclosure constraints. There are then the questions related to quality controls, data formats and interoperability, and data standards, including for the metadata descriptors.
All these factors have hampered free and direct marine data exchange and have led to the practice of large centralised databases run by supported organisations, partnerships and programmes that undertake data archaeology and compilation at local, regional and global scales. These address the needs of different user groups, such as for climatic studies and inputs to ocean numerical models, but each imposes its own licensing procedures, which have ultimately restrained the power of free data flows and established unnecessary monopolies and restrictive dependencies. The end result is that data remains locked up because owners are afraid of losing control of it once it is delivered to data exchanges.
Blockchain comes to our aid here, as a disruptive technology that is changing many established ways of doing things, including data exchange. It provides a solution to meeting the demands of emerging data-hungry users, promising to unleash the full power of data. Blockchain and Distributed Ledger Technology (DLT) are better known for introducing a digital payment and peer-to-peer monetary transaction system which does not require third-party endorsements.
Transactions are verified by independent nodes and recorded on a public digital ledger, which can be imagined as an openly accessible logbook existing in multiple copies on different servers (participants) over the internet, each copy carrying exactly the same content. All the copies constantly talk to each other, synchronising their content continuously. The content entails details about the parties involved, the transaction parameters, a time stamp and much more. Diverging copies are sieved out and discarded.
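The mechanics of such a ledger can be sketched in a deliberately simplified way. The snippet below is illustrative only, not a specification of any real blockchain: the class and function names are invented, SHA-256 stands in for the "encrypted checksum", and the whole network of synchronising copies is reduced to a single participant's logbook.

```python
import hashlib
import json
import time

def seal(block: dict) -> str:
    """Compute a checksum (hash) over a block's content.
    Any change to the content yields a different checksum."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

class ToyLedger:
    """One participant's copy of the shared, append-only logbook."""

    def __init__(self):
        genesis = {"index": 0, "prev_hash": "0" * 64,
                   "timestamp": 0.0, "transactions": []}
        self.chain = [genesis]

    def add_block(self, transactions: list) -> None:
        """Append a block whose checksum locks in the previous block."""
        block = {"index": len(self.chain),
                 "prev_hash": seal(self.chain[-1]),
                 "timestamp": time.time(),
                 "transactions": transactions}
        self.chain.append(block)

    def is_valid(self) -> bool:
        """Re-check every link; tampering with any past block breaks
        all the links that follow it."""
        return all(self.chain[i]["prev_hash"] == seal(self.chain[i - 1])
                   for i in range(1, len(self.chain)))

ledger = ToyLedger()
ledger.add_block([{"from": "A", "to": "B", "amount": 5}])
ledger.add_block([{"from": "B", "to": "C", "amount": 2}])
assert ledger.is_valid()

# Altering an earlier record invalidates every later link:
ledger.chain[1]["transactions"][0]["amount"] = 500
assert not ledger.is_valid()
```

The key property shown here is that each block carries the checksum of its predecessor, so retroactive tampering is immediately detectable by every copy of the ledger.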
A block is saved when a sufficient number of participants have secured a set number of identical records. It is then sealed with a cryptographic checksum, so that each closed block has distinct, unique and indelible content that makes tampering practically impossible. The links to the previous transactions are locked, and new links to the next blocks are initiated, forming a web of links, or chains, that keeps a detailed trace of all the interlinked transactions. Blocks can also reside in different layers that are themselves chained in a common system. The greatest strength of Blockchain lies in its decentralised architecture and in its open, permissionless and irrevocable qualities, which render the system affordable, flexible and secure. The exciting attribute is that DLT and the blockchain protocol can be used for non-currency purposes too, and are already revolutionising the business world, including the maritime domain. For example, blockchain is the most promising technology in maritime logistics and will revolutionise shipping in general. It is the
backbone for the development of the digital port and holds the potential for improved services and competitiveness in this sector. Imagine eliminating the need for printed shipping documents for the carriage of goods, speeding up processes, and making the freight and logistics industry more efficient at much lower cost. Great benefits are anticipated from making the exchange of information between ship owners, shippers, ports, equipment manufacturers and IT companies more effective. The pooling of the vast quantities of unofficial and validated crowd-sourced data and information generated by shipping, and its open exchange over faster digital sharing lanes, will open avenues yet to be explored, such as the intelligent deployment of assets, lowering fuel consumption, improving berth occupancy, enabling tighter time windows for the delivery of services, lowering emissions and reducing the environmental footprint.
Can the Maltese shipping flag take advantage of this ahead of its competitors? This is just a glimpse of how data will feed and innovate our activities in the coming years. The very near future will bring changes in how we generate, package and exchange data and information, and this does not of course exclude the realm of marine data. The Blockchain protocol is evolving to specifically power decentralised marine data marketplaces, which are equivalent to distributed point-to-point databases relying on a system of collective mutual verification. Through the use of data encryption, time stamps, distributed consensus and economic incentives in the distributed system, Blockchain can provide a system for data transactions without the need for formal mutual trust between parties, offering data flows at lower cost, higher efficiency and greater security than traditional data storage and exchange mechanisms.
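One way such a marketplace can avoid putting the (possibly huge) datasets themselves on the ledger is to record only their fingerprints and the transactions over them. The sketch below illustrates the idea under stated assumptions: all function names, the buoy example and the ledger-as-a-list are hypothetical simplifications, and in a real system the log would be the replicated, consensus-checked ledger described above.

```python
import hashlib
import time

def fingerprint(data: bytes) -> str:
    """A compact, tamper-evident identifier for a dataset.
    The data itself stays with its owner; only the hash is shared."""
    return hashlib.sha256(data).hexdigest()

# Stand-in for the distributed ledger: an append-only event log.
ledger: list = []

def register_dataset(owner: str, data: bytes) -> str:
    """The owner publishes the fingerprint, keeping the data private."""
    fp = fingerprint(data)
    ledger.append({"event": "register", "owner": owner,
                   "fingerprint": fp, "timestamp": time.time()})
    return fp

def record_purchase(buyer: str, fp: str, price: float) -> None:
    """The sale is logged; both parties can later prove it happened."""
    ledger.append({"event": "purchase", "buyer": buyer,
                   "fingerprint": fp, "price": price,
                   "timestamp": time.time()})

def verify_delivery(fp: str, delivered: bytes) -> bool:
    """The buyer checks the received data against the registered
    fingerprint, with no trusted third party involved."""
    return fingerprint(delivered) == fp

wave_data = b"2024-01-01T00:00Z,Hs=1.2m,Tp=6.4s\n"
fp = register_dataset("buoy-operator", wave_data)
record_purchase("port-authority", fp, 50.0)
assert verify_delivery(fp, wave_data)        # genuine data passes
assert not verify_delivery(fp, b"tampered")  # altered data fails
```

Because the fingerprint is on the shared ledger, the provider never loses the ability to prove ownership and the buyer can verify what was delivered, which is precisely the trust gap that keeps data owners hoarding today.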
Blockchain motivates what are being coined ‘data commons’: places where data is freely unlocked and pooled to be used and re-used by many, while carrying with it the value and wealth that grows with the data itself and evolves seamlessly with its usage, bringing economic returns back to the original data providers, the data product developers and the service providers alike.
This is just a peep at the exciting digital revolution looming ahead. Are we ready to be among the first to take this dip into the future? Well, it is already happening.
Prof. Aldo Drago leads the Physical Oceanography Research Group (formerly the Physical Oceanography Unit, PO-Unit) within the Department of Geosciences of the University of Malta. The group undertakes oceanographic research in a holistic perspective, including operational marine observations and forecasts and specialised data management and analysis, also through participation in international cooperative research ventures.