MELT Track Repository

The TrackRepository checks whether the required ontologies and alignments are available in the cache folder (~/oaei_track_cache); if data is missing, it is automatically downloaded and cached for the next access. The MELT Track Repository has been used by the OAEI since 2020.

Exemplary calls using the TrackRepository:

// access the Anatomy track
Track anatomyTrack = TrackRepository.Anatomy.Default;

// access all Conference test cases
List<TestCase> conferenceTestCases = TrackRepository.Conference.V1.getTestCases();

The resulting instances can be directly used by the Executor or any other MELT functionality that requires tracks or test cases.
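
For illustration, here is a minimal sketch of running a matcher on such a track and writing the evaluation to CSV. It assumes the Executor, ExecutionResultSet, and EvaluatorCSV classes of MELT's matching-eval module and the MatcherYAAAJena base class of matching-jena; the EmptyMatcher and RunAnatomyExample classes are illustrative placeholders only, so adapt names and signatures to the MELT version you use:

import java.util.Properties;

import org.apache.jena.ontology.OntModel;

import de.uni_mannheim.informatik.dws.melt.matching_data.TrackRepository;
import de.uni_mannheim.informatik.dws.melt.matching_eval.ExecutionResultSet;
import de.uni_mannheim.informatik.dws.melt.matching_eval.Executor;
import de.uni_mannheim.informatik.dws.melt.matching_eval.evaluator.EvaluatorCSV;
import de.uni_mannheim.informatik.dws.melt.matching_jena.MatcherYAAAJena;
import de.uni_mannheim.informatik.dws.melt.yet_another_alignment_api.Alignment;

public class RunAnatomyExample {

    // trivial matcher that returns an empty alignment; replace with real matching logic
    static class EmptyMatcher extends MatcherYAAAJena {
        @Override
        public Alignment match(OntModel source, OntModel target, Alignment inputAlignment,
                               Properties properties) throws Exception {
            return new Alignment();
        }
    }

    public static void main(String[] args) {
        // run the matcher on every test case of the Anatomy track
        // (missing data is downloaded to ~/oaei_track_cache on first access)
        ExecutionResultSet results = Executor.run(TrackRepository.Anatomy.Default, new EmptyMatcher());

        // write precision, recall, F-measure, and further statistics as CSV files
        new EvaluatorCSV(results).writeToDirectory();
    }
}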

Available Tracks

MELT also provides a server that hosts the default datasets used by the OAEI. Each dataset is identified by three components:

  1. Repository (location-URI): http://oaei.webdatacommons.org/tdrs/
  2. Track/Suite - ID (collection-name): <see table below>
  3. Version ID: <see table below>

The MELT evaluation client uses these track identifiers. The MELT repository is built so that it is fully compatible with the former SEALS client. For example, you can still use the SEALS client JAR to fetch the data from the MELT repository.
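
Most of these datasets are also reachable through the predefined constants in TrackRepository, but a track can also be addressed directly by its three identifiers. A minimal sketch, assuming the SealsTrack class of MELT's matching-data module with a (location, collection, version) constructor; verify class name and visibility against your MELT version:

import de.uni_mannheim.informatik.dws.melt.matching_data.SealsTrack;
import de.uni_mannheim.informatik.dws.melt.matching_data.Track;

public class CustomTrackExample {
    public static void main(String[] args) {
        // repository location, Track/Suite-ID, and Version-ID (see the table below)
        Track anatomy = new SealsTrack(
                "http://oaei.webdatacommons.org/tdrs/",  // repository (location-URI)
                "anatomy_track",                         // Track/Suite-ID (collection-name)
                "anatomy_track-default");                // Version-ID

        // data is downloaded and cached on first access
        System.out.println(anatomy.getTestCases().size() + " test case(s)");
    }
}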

You can also download a ZIP file of the data by clicking on the download icon next to the Version-ID in the table below.

Name                Suite-ID            Version-ID
anatomy             anatomy_track       anatomy_track-default
biodiv              biodiv              2018, 2021, 2021owl, 2022, 2023
complex             geolink             geolink-v1
                    hydrography         hydrography-v1
                    popgeolink          popgeolink-v1
                    popenslaved         popenslaved-v1
                    popconference       popconference-0-v1, popconference-20-v1, popconference-40-v1,
                                        popconference-60-v1, popconference-80-v1, popconference-100-v1
food                food                v1, v2, v2sub
conference          conference          conference-v1, conference-v1-all (also test cases without reference),
                                        conference-dbpedia
knowledgegraph      knowledgegraph      v3, v4
commonkg            commonkg            nell-dbpedia-v1, yago-wikidata-v1, yago-wikidata-v1-small
GeoLinkCruise       geolinkcruise       geolinkcruise-v1
Laboratory          laboratory          laboratory-v1
iimb                iimb                v1
largebio            largebio            largebio-all_tasks_2016, largebio-fma_nci_small_2016,
                                        largebio-fma_nci_whole_2016, largebio-fma_snomed_small_2016,
                                        largebio-fma_snomed_whole_2016, largebio-snomed_nci_small_2016,
                                        largebio-snomed_nci_whole_2016
link                link                2017
multifarm           <language_pair>     <language_pair>-v2 (see the language pairs below)
                    multifarm           all-v2
phenotype           phenotype           phenotype-hp-mp-2017-bioportal, phenotype-doid-ordo-2017-bioportal
pm                  pm                  2017-all, 2017-br, 2017-ua
mse                 mse                 2021
pgx                 pharmacogenomics    v1
digital humanities  dh                  2024all, 2024arch, 2024cult, 2024dhcs
circular economy    ce_track            ce_track
archaeology         archaeology         2024all

Available multifarm language pairs:
ar-cn, ar-cz, ar-de, ar-en, ar-es, ar-fr, ar-nl, ar-pt, ar-ru, cn-cz, cn-de, cn-en, cn-es, cn-fr, cn-nl, cn-pt, cn-ru, cz-de, cz-en, cz-es, cz-fr, cz-nl, cz-pt, cz-ru, de-en, de-es, de-fr, de-nl, de-pt, de-ru, en-es, en-fr, en-nl, en-pt, en-ru, es-fr, es-nl, es-pt, es-ru, fr-nl, fr-pt, fr-ru, nl-pt, nl-ru, pt-ru
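
A single language pair can also be retrieved programmatically. A short sketch, assuming the getSpecificMultifarmTrack helper in TrackRepository.Multifarm (verify the method name against your MELT version):

import de.uni_mannheim.informatik.dws.melt.matching_data.Track;
import de.uni_mannheim.informatik.dws.melt.matching_data.TrackRepository;

public class MultifarmExample {
    public static void main(String[] args) {
        // retrieve the German-English multifarm test cases
        // (getSpecificMultifarmTrack is assumed here; check your MELT version)
        Track deEn = TrackRepository.Multifarm.getSpecificMultifarmTrack("de-en");
        System.out.println(deEn.getTestCases().size() + " test case(s) for de-en");
    }
}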

ML-based Tracks

The following tracks provide dedicated training and validation data (note the -ml suffix at the end of the Suite-ID):

Name            Suite-ID                    Version-ID
anatomy         anatomy_track-ml            anatomy_track-default
biodiv          biodiv-ml                   2023
knowledgegraph  knowledgegraph-ml           v4
bio-ml          bio-ml-equiv-supervised-ml  2022

Further Services

TestCase/Track Validation Service

Creating new tracks and test cases can be very cumbersome. The MELT validation service allows you to check whether your test cases:

  1. Contain parseable ontologies.
  2. Contain a parseable reference alignment.
  3. Mention only URIs in the reference alignment that also appear in the corresponding source and target ontologies.

Exemplary call using the TestCaseValidationService:

// build a local test case from the source/target ontologies and the reference alignment
URI sourceUri = new File("<path to source ontology file>").toURI();
URI targetUri = new File("<path to target ontology file>").toURI();
URI referenceUri = new File("<path to reference alignment file>").toURI();
TestCase testCase = new TestCase("FSDM", sourceUri, targetUri, referenceUri, null);

// run the validation and print the report
TestCaseValidationService validator = new TestCaseValidationService(testCase);
System.out.println(validator);

You can also automatically test your track against different versions of Jena and the OWL API by adapting TestLocalFile and running runAll.cmd in the Windows shell. The release versions to be tested can be edited in the corresponding pom.xml.