Release 3.0.0
Over recent years, several extensions have been developed for the server. Up until now, all of these changes were developed in separate branches. Unfortunately, these branches tended to diverge, which made features incompatible with each other.
To avoid such incompatibilities in the future, we have modularized the server.
Concretely, the server itself has been split up into different modules (npm packages). Each of these modules contains a specific feature: for example, the module @ldf/datasource-hdt contains an HDT-based datasource type, and the module @ldf/feature-memento adds support for the Memento protocol.
Using these modules, people can configure their own server, containing only the features they need.
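For instance, a custom server project could declare only the modules it needs as dependencies. The sketch below is hypothetical: apart from @ldf/datasource-hdt and @ldf/feature-memento, which are mentioned above, the package name @ldf/core and the version range are assumptions, so check the module READMEs for the exact packages your setup requires.
{
  "//": "Hypothetical sketch of a custom server; @ldf/core and the version range are assumptions.",
  "name": "my-custom-ldf-server",
  "version": "1.0.0",
  "dependencies": {
    "@ldf/core": "^3.0.0",
    "@ldf/datasource-hdt": "^3.0.0",
    "@ldf/feature-memento": "^3.0.0"
  }
}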
For people who just want to use the server as they did before, @ldf/server has been created: it contains all default features you would expect in an LDF server, such as support for different kinds of datasources that are exposed via a TPF/QPF interface.
Upgrading the package itself is then a matter of uninstalling the old package and installing the new one:
npm uninstall -g ldf-server
npm install -g @ldf/server
Existing config files also need to be converted to the new format (see the examples below). This can either be done manually, or by invoking the ldf-server-migrate-config-3x tool on an existing config file:
ldf-server-migrate-config-3x my-config.json # Simulate conversion
ldf-server-migrate-config-3x my-config.json --apply # Convert and update file
When invoked in apply mode, this command updates the file and creates a my-config.json.back backup file containing the pre-3.x.x format.
The main npm package has been moved from ldf-server to @ldf/server. This is in line with the naming convention of the modules of this server, and allows other types of servers in the future. Concretely, instead of running npm install -g ldf-server, you must now run npm install -g @ldf/server. The CLI command of the new server is still exposed through ldf-server.
If you already had a previous version of the server installed, make sure to uninstall it first via npm uninstall -g ldf-server.
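If you install the server as a local project dependency instead of globally, the same rename applies to your package.json. A minimal sketch, assuming the ^3.0.0 version range:
{
  "//": "Replace the old ldf-server dependency with the new package name; the version range is an assumption.",
  "dependencies": {
    "@ldf/server": "^3.0.0"
  }
}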
While config files used to be written in plain JSON, the new config files are written in JSON-LD. The main difference is that the new configs require a preamble of fixed entries, such as @context, @id, and import. The other config entries remain similar, with a few renames here and there (compare the two examples below).
New config file:
{
"@context": "https://linkedsoftwaredependencies.org/bundles/npm/@ldf/server/^3.0.0/components/context.jsonld",
"@id": "urn:ldf-server:my",
"import": "preset-qpf:config-defaults.json",
"title": "My Linked Data Fragments server",
"datasources": [
{
"@id": "ex:myHdtDatasource",
"@type": "HdtDatasource",
"datasourceTitle": "DBpedia 2014",
"description": "DBpedia 2014 with an HDT back-end",
"datasourcePath": "dbpedia",
"hdtFile": "data/dbpedia2014.hdt"
},
{
"@id": "ex:mySparqlDatasource",
"@type": "SparqlDatasource",
"datasourceTitle": "DBpedia (Virtuoso)",
"description": "DBpedia with a Virtuoso back-end",
"datasourcePath": "dbpedia-sparql",
"sparqlEndpoint": "https://dbpedia.org/sparql"
}
]
}
Old config file:
{
"title": "My Linked Data Fragments server",
"datasources": {
"dbpedia": {
"title": "DBpedia 2014",
"type": "HdtDatasource",
"description": "DBpedia 2014 with an HDT back-end",
"settings": { "file": "data/dbpedia2014.hdt" }
},
"dbpedia-sparql": {
"title": "DBpedia 3.9 (Virtuoso)",
"type": "SparqlDatasource",
"description": "DBpedia 3.9 with a Virtuoso back-end",
"settings": { "endpoint": "https://dbpedia.org/sparql", "defaultGraph": "http://dbpedia.org" }
}
}
}
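Comparing the two datasource entries makes the renames explicit: the datasources object keyed by URL path becomes an array in which each entry carries an @id and a datasourcePath, title becomes datasourceTitle, type becomes @type, and back-end settings such as settings.file are flattened into type-specific properties such as hdtFile and sparqlEndpoint. For example, the old entry
"dbpedia": {
  "title": "DBpedia 2014",
  "type": "HdtDatasource",
  "description": "DBpedia 2014 with an HDT back-end",
  "settings": { "file": "data/dbpedia2014.hdt" }
}
becomes
{
  "@id": "ex:myHdtDatasource",
  "@type": "HdtDatasource",
  "datasourceTitle": "DBpedia 2014",
  "description": "DBpedia 2014 with an HDT back-end",
  "datasourcePath": "dbpedia",
  "hdtFile": "data/dbpedia2014.hdt"
}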
Below you can find an example of an advanced config file that enables all possible features. This can be used as a reference for checking how your config files can be upgraded:
{
"@context": "https://linkedsoftwaredependencies.org/bundles/npm/@ldf/server/^3.0.0/components/context.jsonld",
"@id": "urn:ldf-server:my",
"import": "preset-qpf:config-defaults.json",
"title": "My Linked Data Fragments server",
"baseURL": "https://example.org/",
"port": 3000,
"workers": 2,
"protocol": "http",
"datasources": [
{
"@id": "ex:myDatasourceVersion1",
"@type": "SparqlDatasource",
"datasourceTitle": "My SPARQL source",
"description": "My datasource with a SPARQL-endpoint back-end",
"datasourcePath": "mysparql",
"sparqlEndpoint": "https://dbpedia.org/sparql",
"enabled": true,
"hide": false,
"license": "MIT",
"licenseUrl": "http://example.org/my-license",
"copyright": "This datasource is owned by Alice",
"homepage": "http://example.org/alice"
},
{
"@id": "ex:myDatasourceVersion2",
"@type": "TurtleDatasource",
"datasourceTitle": "My Turtle file",
"description": "My dataset with a Turtle back-end",
"datasourcePath": "myttl",
"file": "path/to/file.ttl",
"graph": "http://example.org/default-graph"
},
{
"@id": "ex:myCompositeDatasource",
"@type": "CompositeDatasource",
"datasourceTitle": "Composite",
"description": "An example composite datasource",
"datasourcePath": "composite",
"compose": [ "ex:myDatasourceVersion1", "ex:myDatasourceVersion2" ]
}
],
"prefixes": [
{ "prefix": "rdf", "uri": "http://www.w3.org/1999/02/22-rdf-syntax-ns#" },
{ "prefix": "rdfs", "uri": "http://www.w3.org/2000/01/rdf-schema#" },
{ "prefix": "owl", "uri": "http://www.w3.org/2002/07/owl#" },
{ "prefix": "xsd", "uri": "http://www.w3.org/2001/XMLSchema#" },
{ "prefix": "hydra", "uri": "http://www.w3.org/ns/hydra/core#" },
{ "prefix": "void", "uri": "http://rdfs.org/ns/void#" },
{ "prefix": "skos", "uri": "http://www.w3.org/2004/02/skos/core#" },
{ "prefix": "dcterms", "uri": "http://purl.org/dc/terms/" },
{ "prefix": "dc11", "uri": "http://purl.org/dc/elements/1.1/" },
{ "prefix": "foaf", "uri": "http://xmlns.com/foaf/0.1/" },
{ "prefix": "geo", "uri": "http://www.w3.org/2003/01/geo/wgs84_pos#" },
{ "prefix": "dbpedia", "uri": "http://dbpedia.org/resource/" },
{ "prefix": "dbpedia-owl", "uri": "http://dbpedia.org/ontology/" },
{ "prefix": "dbpprop", "uri": "http://dbpedia.org/property/" }
],
"logging": true,
"loggingFile": "access.log",
"dereference": [
{
"dereferenceDatasource": "ex:myDatasourceVersion2",
"dereferencePath": "/resource/"
}
],
"responseHeaders": [
{ "headerName": "Access-Control-Allow-Origin", "headerValue": "*" },
{ "headerName": "Access-Control-Allow-Headers", "headerValue": "Accept-Datetime" },
{ "headerName": "Access-Control-Expose-Headers", "headerValue": "Content-Location,Link,Memento-Datetime" }
],
"sslKey": "../core/config/certs/localhost-server.key",
"sslCert": "../core/config/certs/localhost-server.crt",
"sslCa": "../core/config/certs/localhost-ca.crt",
"router": [
{
"@id": "preset-qpf:sets/routers.json#myPageRouter",
"pageSize": 50
}
],
"mementos": [
{
"timegatePath": "dbpedia",
"versions": [
{
"mementoDatasource": "ex:myDbpedia2015",
"start": "2014-09-14T11:59:59Z",
"end": "2015-04-15T00:00:00Z",
"originalBaseURL": "http://fragments.mementodepot.org/dbpedia_201510"
},
{
"mementoDatasource": "ex:myDbpedia2014",
"start": "2013-06-15T11:59:59Z",
"end": "2014-09-15T00:00:00Z"
}
]
}
],
"summaryDir": "../../summaries",
"summaryPath": "/summaries/"
}
More details on how each module can be configured can be found in the README of each module.
JSON-LD allows more flexibility in how and where these config files can be used, as they are now valid RDF. Additionally, this format is required by the new Components.js dependency injection framework. If you don't like JSON-LD, your config files can also be written in other RDF formats, such as Turtle.