MEDIN 2025 Webinar Series

This webinar series is hosted by the Marine Environmental Data and Information Network (MEDIN), a UK initiative dedicated to improving access to high-quality marine data. MEDIN works with organisations across sectors to promote best practices in marine data management, ensuring data is discoverable, accessible, and reusable for the long term.

Throughout the year, this series will feature monthly one-hour online sessions led by expert guest speakers, each focusing on specialised topics not typically covered in MEDIN’s regular free online workshops. These webinars are designed to support better data stewardship and highlight emerging tools, standards, and approaches in marine data.

Sessions will be recorded and made available on the MEDIN YouTube channel, creating a lasting resource for the marine data community. 

Whether you are new to marine data management or have years of experience, and no matter which sector you work in, these sessions are designed to be inclusive, informative, and accessible to all who are interested in improving the way marine data is managed and shared.


20th August 2025 14:00 – 15:00

An Introduction to Marine Data Stewardship and the MEDIN Framework

TBC (MEDIN)

This presentation provides an overview of marine data stewardship and introduces the MEDIN Framework as a foundation for best practice in marine data management. It outlines the principles of effective stewardship, the importance of standardisation, and how MEDIN supports the marine community in making data more discoverable, accessible, and reusable.

Data Planning and Stewardship 

Monica Hanley (BODC – NOC)

Data management planning is integral to the data lifecycle and should be done early in every project. There are a variety of ways to plan for data, but the important things to capture are the datasets themselves, storage requirements, access restrictions, the data originator, and whether the data should be archived; if so, where and when. Data Management Plans (DMPs) are a tool used to document and track datasets throughout the data lifecycle.

Once a DMP is drafted, it should be reviewed and amended throughout the lifetime of the project. Identifying ownership of a DMP is important to encourage accountability for the data and to support (re)use of the data beyond the initial collection or scope of the project. The DMP should address whether data are expected to be submitted to an appropriate archive or repository. A DMP will also document expectations around metadata and formats (i.e. non-proprietary). Without appropriate metadata and suitable formats, there is a risk that reuse of the data will be reduced or prevented entirely. The choice of repository might be constrained by funder, journal, or organisational requirements. Data depositors should be aware of best practices for data management; data stewards have a role in guiding these, but not all repositories have data stewards. Best practices depositors might consider include: supplying suitable metadata, using controlled vocabularies, applying data access policies and licences, following archive processes, and applying quality control measures. These practices aim to support reuse of the data in the future.

The Marine Data Exchange: How we use MEDIN to enhance the value of our database

Harry Richardson (The Crown Estate)

This presentation introduces the Marine Data Exchange (MDE), outlining its origins, current role, and future direction. It then explores MDE’s collaboration with MEDIN, focusing on how MEDIN’s guidelines are integrated into MDE’s quality assurance processes. By aligning with these standards, MDE ensures adherence to industry best practices and defines robust data requirements. The session highlights how this partnership enhances the quality, consistency, and value of marine data shared through the platform.

17th September 2025 14:00 – 15:00

MEDIN Marine Data Standards and Guidelines

TBC (MEDIN)

This presentation offers a practical introduction to the MEDIN Marine Data Standards and Guidelines, designed to support consistent, high-quality marine data management. It will outline the purpose and structure of the standards, explain how they promote interoperability and data reuse, and demonstrate their relevance across a range of marine sectors.

Data standards and interoperability - A view from Ordnance Survey 

Allan Jamieson (Ordnance Survey)

Geospatial data interoperability between data suppliers and data users is a persistent problem. In an age where no single organisation holds all the data necessary for particular use cases, increased data interoperability is both desirable and essential. Geospatial data standards, when consistently applied, can be part of the solution. The longer-term goal is interoperability between the land and the sea.


Marine data use in ports

Katie Eades (OceanWise)

Having worked with ports for 15 years, we have seen many changes in the adoption and practical implementation of marine data management. This presentation discusses these changes through a number of case studies and shares best-practice lessons on how ports are managing and utilising marine data to improve safety and efficiency.

22nd October 2025 14:00 – 15:00

Introduction to controlled vocabularies and their importance in FAIR data sharing

Danielle Wright (BODC – NOC)

An introduction to controlled vocabularies, why they are important in describing and marking up marine datasets, and how they can be used to support FAIR data sharing. The session will introduce the different methods for finding suitable vocabulary collections and concepts, either directly from the NERC Vocabulary Server (NVS) or via tools such as the SeaDataNet Hierarchical and Facet searches and the British Oceanographic Data Centre’s (BODC) vocabulary builder and vocabulary editor tools. It will also cover the different options for submitting new term requests to the BODC NVS team, such as email or GitHub, and will introduce some of the marine vocabulary management communities the NVS team are involved with, such as ICES, OBIS, and EMODnet Chemistry, stressing the importance of collaboration and alignment to reduce ambiguity and duplication of effort.

Semantic models & linking of NVS to other semantic web resources 

Alexandra Kokkinaki (BODC – NOC)

This presentation introduces the semantic model underpinning the NVS (NERC Vocabulary Server), placing it within the broader context of knowledge graphs and linked data. It explores the structure and design of the NVS semantic model, alongside the various methods available for accessing and integrating its content. The session highlights key tools used to explore the model and its data, including a walkthrough of mappings that connect NVS to other semantic resources. Additionally, it showcases current collaborations with RDA groups and alignment with initiatives such as FAIR-IMPACT, illustrating how NVS contributes to and evolves with global semantic data practices.

Practical guidance on SPARQL and/or API access to vocabs

Alexandra Kokkinaki (BODC – NOC)

This tutorial will equip participants with the skills to access the NERC Vocabulary Server (NVS) programmatically via its APIs and SPARQL endpoint. It will provide a practical understanding of the underlying semantic model, enabling users to retrieve and integrate precisely the data they need. Whether for building new tools, enhancing existing workflows, or aligning with semantic web practices, this session will support effective and targeted use of NVS in real-world applications.
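As a rough illustration of the kind of programmatic access the tutorial covers, the sketch below builds a SPARQL query for SKOS concept labels and shows how it could be posted to the NVS endpoint. The endpoint path and the example collection URI are assumptions and should be checked against BODC’s current NVS documentation before use.

```python
# Minimal sketch of querying the NVS SPARQL endpoint for concept labels.
# The endpoint path and collection URI are assumptions for illustration only.
import urllib.parse
import urllib.request

NVS_SPARQL_ENDPOINT = "https://vocab.nerc.ac.uk/sparql/sparql"  # assumed path

def build_label_query(collection_uri: str, limit: int = 10) -> str:
    """Return a SPARQL query listing concepts and preferred labels in a collection."""
    return f"""
    PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
    SELECT ?concept ?label WHERE {{
        <{collection_uri}> skos:member ?concept .
        ?concept skos:prefLabel ?label .
    }} LIMIT {limit}
    """

def fetch(query: str) -> bytes:
    """POST the query to the endpoint, asking for SPARQL JSON results."""
    data = urllib.parse.urlencode({"query": query}).encode()
    req = urllib.request.Request(
        NVS_SPARQL_ENDPOINT, data=data,
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req) as resp:  # network call; run online
        return resp.read()

# Example (requires network access; P02 is an assumed example collection):
# print(fetch(build_label_query("http://vocab.nerc.ac.uk/collection/P02/current/")))
```

The same query could equally be run from any SPARQL client; the point is that the semantic model (SKOS collections, members, and preferred labels) determines the shape of the query.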

12th November 2025 14:00 – 15:00

Archiving and Publishing Marine Biodiversity Data with DASSH: Tools, Standards, and Processes 

Chloe Figueroa Ashforth

DASSH (the UK Archive for Marine Species and Habitats Data and UK Node of OBIS) is the MEDIN Data Archive Centre (DAC) for marine biodiversity data, providing tools and services to support the archiving, management, and publication of long-term marine biodiversity data in standardised formats. Data are published and made discoverable on the DASSH Mapper and on various data aggregators’ portals such as OBIS, EMODnet, and NBN Atlas, with metadata for all datasets available on the MEDIN Discovery Metadata portal.

This webinar will explain DASSH’s submission process for marine biodiversity data, from the initial contact with the provider, through support with data and metadata standardisation, to the final publication of the data following a robust quality assurance process. We will highlight the tools and services we provide, including data validation tools, the DASSH Mapper, the IPT, and the MEDIN Metadata Helpdesk, and the processes used to publish high-quality data that is findable, openly accessible, interoperable, and reusable.

Digital Object Identifiers (DOIs) and Citations: What they are and how MEDIN can support or make use of them 

James Ayliffe (BODC – NOC)

Digital Object Identifiers (DOIs) are persistent identifiers used to uniquely distinguish digital objects. A DOI provides a permanent link to metadata describing the resource, which in turn can be used to cite resources such as journal articles, physical samples, and data. Generating a DOI is a simple process that can support FAIR (meta)data; however, maintaining the principles behind the DOI is difficult and complex, especially considering the impacts and use cases DOIs are intended to support. DOIs should support attribution of credit, traceability, and transparency, and a tension arises between these three as technology and usage evolve. We will discuss what DOIs are, how they work with citation, and how they support the scientific method. We hope the session can explore how MEDIN can support and utilise DOIs and citations going forward.
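To make the mechanics concrete, a DOI is a prefix/suffix pair that resolves through the doi.org proxy to the current landing page for the resource. The sketch below turns a DOI into its resolver URL and a minimal dataset citation; the example DOI and citation values are hypothetical, and the citation shape loosely follows the DataCite recommendation (creator, year, title, publisher, identifier).

```python
# Sketch: from a DOI to a resolvable link and a simple citation string.
# The example DOI and metadata below are hypothetical, for illustration only.
def doi_to_url(doi: str) -> str:
    """DOIs resolve through the doi.org proxy to the current landing page."""
    prefix, _, suffix = doi.partition("/")
    if not prefix.startswith("10.") or not suffix:
        raise ValueError(f"not a valid DOI: {doi!r}")
    return f"https://doi.org/{doi}"

def cite(authors: str, year: int, title: str, publisher: str, doi: str) -> str:
    """Minimal dataset citation, loosely following the DataCite recommendation."""
    return f"{authors} ({year}). {title}. {publisher}. {doi_to_url(doi)}"

print(cite("Smith, J.", 2025, "Example CTD dataset", "BODC", "10.5285/abc-123"))
```

The persistence promise lives entirely in the metadata behind the resolver: the landing page can move, but the DOI and its citation string do not.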

10th December 2025 14:00 – 15:00

Autonomous Platform Data Management at BODC 

Robyn Owen (BODC – NOC)

The use of autonomous platforms is increasing rapidly as technologies advance and science works towards Net Zero. Autonomous platforms are now established as a key asset in multidisciplinary oceanographic research, often able to reach locations inaccessible to traditional research vessels, whether due to remoteness or time of year. To account for this increase, data centres need to provide reliable data management systems that are operational all year round.

The British Oceanographic Data Centre (BODC), part of the Digital Ocean Division at the UK’s National Oceanography Centre (NOC), provides data management for three autonomous platform types: Argo floats, gliders, and Autosub Long Range (ALR). BODC supports scientific and commercial deployments in both near-real time and delayed mode.

This talk will cover BODC data management practices for each of the three platform types, including data processing, data dissemination, and how we work with the wider platform communities. Data processing has a strong focus on metadata, applying controlled vocabularies from the NERC Vocabulary Server and, for glider and ALR deployments, utilising a Semantic Sensor Network (SSN) database to register sensors and platforms. There are multiple data dissemination pathways across the platforms, with Argo float and glider data contributing to the Global Ocean Observing System (GOOS).

High Volume Data Archiving and Challenges 

Monica Hanley & Danielle Wright (BODC – NOC)

Archiving data can be a challenge for both data originators and data repositories, and high-volume data adds an extra layer of complexity due to its size and format. High volume in this instance is considered to be data above 100 GB. The British Oceanographic Data Centre handles a few different data types that regularly fall into this category, including model outputs, geophysical and geospatial data, and image and video data. All data, regardless of type, should be in a standard, non-proprietary format accompanied by rich metadata.

Model outputs, if deemed appropriate for archiving, should be stored as CF-NetCDF files with global attributes, and the model code should be published somewhere accessible so it can be linked to the outputs. These datasets vary widely in size, and file compression is recommended for larger outputs.
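As a minimal illustration of what "global attributes" means in practice, the CF conventions recommend attributes such as `Conventions`, `title`, `institution`, `source`, and `history`. The sketch below shows them as a plain dictionary with hypothetical values; writing an actual NetCDF file would additionally require a library such as netCDF4 or xarray.

```python
# Sketch: CF-recommended global attributes for an archived model output.
# Attribute values are hypothetical; the attribute names follow the CF conventions.
cf_global_attributes = {
    "Conventions": "CF-1.8",          # CF version the file claims to follow
    "title": "Example shelf-sea model output",
    "institution": "Example institute",
    "source": "Example model v1.0",   # model and version that produced the data
    "history": "2025-01-01: output converted to CF-NetCDF",
    "references": "Link to the published model code",
    "comment": "Hypothetical example for illustration only",
}

def missing_recommended(attrs: dict) -> list:
    """Return CF-recommended global attributes absent from a file's attributes."""
    recommended = ["Conventions", "title", "institution", "source", "history"]
    return [name for name in recommended if name not in attrs]

print(missing_recommended(cf_global_attributes))  # []
```

A simple completeness check like this is the kind of screening a repository can run before accepting a deposit.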

Geophysical data such as bathymetry and seismics can range from a few hundred MB to tens of TB in volume, and industry-standard formats such as SEG-Y (seismic) and XYZ or TIF (bathymetry) are preferred. BODC also handles still imagery and video data, generally from camera surveys by the NOC fleet of remotely operated or autonomous underwater vehicles. Volumes can range from 1-10 TB per dive/mission, with anywhere from 5-15 dives/missions per cruise/campaign. Raw data are generally archived on offline tape servers, with useful, reusable formats made available online. Transferring and archiving such high volumes requires dedicated servers and collaboration with CEDA to use their high-volume JASMIN infrastructure, ensuring adequate metadata is supplied according to community standards to enable discovery and access. Bathymetry data at BODC are also made available at the European level through the EMODnet Bathymetry project and at the international level through Seabed 2030 and GEBCO.

21st January 2026 14:00 – 15:00

Boosting data sharing in the Ocean Decade 

Adam Leadbetter (Decade Coordination Office for Ocean Data Sharing)

The Decade Coordination Office for Ocean Data Sharing is the focal point for marine data and information management in the UN Decade of Ocean Science for Sustainable Development. It is committed to fostering effective knowledge and information exchange, and aims to raise awareness about data sharing, capacity building, and collaboration with Decade Actions and other entities. In this webinar we will introduce the work of the UN Ocean Decade and explore the overlaps between the work of MEDIN and that of the Decade in building a community of practice for data sharing, developing guidelines and data standards, and encouraging good data stewardship practices. We will also discuss how these approaches scale into the digital ecosystem being built across the Decade and the Intergovernmental Oceanographic Commission of UNESCO, feeding an ecosystem that runs from observation and the creation or acquisition of data, through data management, to forecasting, prediction, and action. This work aligns with the Decade Data and Information Strategy and supports the achievement of the Ocean Decade Vision 2030 goals.

The UN Ocean Decade's approach to reshaping the global ocean datascape 

Pier Luigi Buttigieg (Alfred Wegener Institute)

This contribution will present the essence of the Implementation Plan created to follow the UN Ocean Decade's Data and Information Strategy. In its broad and deep recommendations, this plan contains a multifaceted and concrete approach for data systems around the world to follow, in order to effectively operate within and contribute to the maturation of the ocean's digital ecosystem. Including guidance on digital architectures, organisational processes, strategies, data readiness, measures to prepare for the impacts of new technologies, and core values, the Implementation Plan will serve all ocean communities in enhancing their systems while preparing for sustained and meaningful interoperation.


Please use this form to register.

Which webinars in the series are you interested in attending?
Webinar 1: 20th August 2025 - Navigating Marine Data: Planning, Stewardship, and the value of MEDIN
Webinar 2: 17th September 2025 - Interoperability in Action: Data Standards and Marine Applications
Webinar 3: 22nd October 2025 - Speaking the Same Language: How Controlled Vocabularies Facilitate Data Sharing and Interoperability
Webinar 4: 12th November 2025 - Making Marine Data Count: Archiving Biodiversity Data with the UK Data Archive for Marine Species and Habitats (DASSH) and Citing with DOIs
Webinar 5: 10th December 2025 - Ocean Data at Scale: Autonomous Data Management and High-Volume Archiving at the British Oceanographic Data Centre (BODC)
Webinar 6: 21st January 2026 - Unlocking Ocean Knowledge: The Global Push for Better Data Sharing