Each numbered category below compares, where information is available, the following systems: ConedaKOR, LinkedDataHub, Metaphacts Community, Omeka S, ResearchSpace, Vitro, Wikibase and WissKI.
1 Official website
2 Licence
3 Source code availability Source code available at https://github.com/coneda/kor https://github.com/AtomGraph/LinkedDataHub Source code available at https://github.com/omeka/omeka-s Source code available at https://github.com/vivo-project/Vitro Source code available at https://github.com/wikimedia/Wikibase The source code is available at drupal-git (you have to register to get detailed access).
4 Community maturity 5 contributors 4 contributors 1 contributor 30 contributors 37 contributors The underlying MediaWiki software is used by tens of thousands of websites and thousands of companies and organizations. As of 2022-05-03: the main MediaWiki GitHub page has 109,774 commits, 2.9K stars, 1.2K forks and 563 contributors; the dedicated Wikibase GitHub page has 194 contributors. WissKI module: 93 community members, including 10 developers; Drupal has 1.3M community members (incl. 46K developers). 22 contributors
5 Statistics (with timestamp) 317 stars, 72 forks (2022-06-30) 2022-08-31: 6267 commits, 297 stars, 100 forks GitHub stats as of 2023-02-03: 37,389 commits, 122 stars and 59 forks.
6 Payment models SaaS, individual contract Free to use Free to use WissKI is free to use, but you can have your instance hosted and maintained by IGSD e.V., which also mediates contact to developers and other service providers.
7 Maintainer and Support The software is free to use. SaaS options are available through DAASI International; paid support and customizations are available through Wendig OÜ AtomGraph Metaphacts GmbH no longer develops the platform further. Some public forks are available, e.g. https://github.com/swiss-art-research-net/metaphacts-community maintained by the Swiss Art Research Infrastructure at UZH. Digital Scholar, Roy Rosenzweig Center for History and New Media, Alfred P. Sloan Foundation, Institute of Museum and Library Services, Samuel H. Kress Foundation, Library of Congress, The Andrew W. Mellon Foundation, The Getty Foundation, National Endowment for the Humanities
Official repositories maintained by Wikimedia Deutschland with long-term funding support: 2 separate teams for Wikibase Cloud and Wikibase Suite, including permanent positions for product and community managers, UX designers, dedicated developers and engineering leads. In addition to the official maintainers, a membership-based Wikibase Stakeholders Group features institutional and consulting agency teams working in a professional capacity to develop additional software extensions. A dedicated Wikimedia Affiliated User Group with corresponding open communication channels provides further support to the community. Actively maintained and supported with 1 permanent position by the Germanisches Nationalmuseum Nürnberg (GNM); community building within IGSD e.V., WissKICommunity and yearly user meetings. Issue handling at the Drupal WissKI module page. Ongoing project funding (Objekte im Netz (2017-2020), Semantics4Art&Architecture (2019-2021), NFDI-Flexfunds for WissKI-Cloud (2022)). Multiple support possibilities (weekly hackathons, chat, Skype groups, direct contact with Robert Nasarek / Mark Fichtner)
8 Documentation Documentation is available on GitHub, see the readme, the install docs and the user guide. Current funding will enable more extensive, tutorial-style documentation as well as developer docs for the JSON API. https://atomgraph.github.io/LinkedDataHub/linkeddatahub/docs/ Documentation and support are available on GitHub, on the Omeka Forum and in the User Manual. There is documentation for VIVO, which is Vitro plus research-information-related functions. Most of the documentation applies to Vitro, too. Documentation for the various components is available through the relevant channels on the code repository and issue tracking platforms (GitHub and Phabricator). Additionally, there are resources available that address the system holistically.
9 Infrastructure stack Docker, nginx, Apache Tomcat, Apache Jena, Eclipse Jersey, Varnish, Saxon-HE, Saxon-JS. Compatible with any SPARQL 1.1 triplestore; Apache Jena Fuseki is used by default. Apache, MySQL 5.6.4+ (or MariaDB 10.0.5+), PHP 7.4+ Wikibase is a MediaWiki extension, generally making use of a MariaDB database, as well as adding an additional Blazegraph triplestore for semantic queries. The integrated triple store, however, will be changed in future releases due to long-term sustainability issues with Blazegraph. Wikibase is commonly containerised as a Docker package, with web interface services managed by Apache or Nginx. WissKI is a module of Drupal. The Drupal CMS has a vast community and focuses on accessibility, security, backwards compatibility and stability. It is very flexible and highly expandable (>3000 actively maintained modules), but coding in Drupal has a high learning curve. Drupal is written in PHP, but offers the possibility to encapsulate JavaScript software, so you can write and (with some coding) reuse JS packages.
10 APIs W3C standard SPARQL 1.1 Protocol and SPARQL 1.1 Graph Store HTTP Protocol REST API Multiple custom endpoints, SPARQL Query and Update, JSON, Triple Pattern Fragments The MediaWiki HTTP API is used to interact with Wikibase in a manner similar to a traditional MediaWiki. A SPARQL endpoint is supplied for queries or data visualisations. The MediaWiki API is bidirectional; the Blazegraph/SPARQL query service is not (i.e. read only). RESTful API in common formats (JSON, XML, HAL, CSV) via Drupal Views, SPARQL 1.1 endpoint via the triple store
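As an illustration of the Wikibase-style HTTP API interaction described above, here is a minimal Python sketch that fetches a single entity via the standard `wbgetentities` action; the endpoint URL and entity ID are placeholders, not taken from any specific installation.

```python
# Minimal sketch: fetch a Wikibase entity via the MediaWiki Action API.
# The endpoint URL and entity ID are illustrative placeholders.
import requests

API_ENDPOINT = "https://example-wikibase.org/w/api.php"  # hypothetical instance

params = {
    "action": "wbgetentities",  # standard Wikibase API action
    "ids": "Q1",                # placeholder entity ID
    "format": "json",
}

response = requests.get(API_ENDPOINT, params=params, timeout=30)
response.raise_for_status()
entity = response.json()["entities"]["Q1"]
print(entity["labels"].get("en", {}).get("value"))
```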
11 API usability level In addition to the OAI-PMH API, ConedaKOR has several JSON endpoints exposing all functionality available in the web interface All API methods are available in the UI as well as the CLI (command line interface). Well documented Well documented (for VIVO) The API comes with extensive documentation and examples, which are deployed by default into each instance. The RESTful API is easy to configure and to describe within Drupal, but since there is no out-of-the-box interface, the usability depends on the developer (who can take well-designed web services as a model, such as those of the World Register of Marine Species). Regarding the SPARQL endpoint, it depends on which triple store software you use (GraphDB has quite a decent interface).
12 SPARQL Endpoint usability level YASGUI as SPARQL editor. Multiple chart types can be used to visualize results. Both queries and charts can be saved. The SPARQL endpoint has a GUI "query helper" and includes customisable "example queries" as an entry point. A range of data visualisation options is also included by default.
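Rows 10-12 mention SPARQL 1.1 endpoints for several systems; the following hedged sketch shows how query results might be retrieved programmatically with the SPARQLWrapper library. The endpoint URL and query are illustrative assumptions, not taken from any of the platforms.

```python
# Minimal sketch: run a SELECT query against a generic SPARQL 1.1 endpoint.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://example.org/sparql")  # hypothetical endpoint
sparql.setQuery("""
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT ?s ?label WHERE { ?s rdfs:label ?label . } LIMIT 10
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for binding in results["results"]["bindings"]:
    print(binding["s"]["value"], binding["label"]["value"])
```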
13 Reification Statements can be added any time, fields can be added per type and per entity Reification is supported by default on all data statements, by way of a "Reference" string or URL. Qualifiers can also be used as a way to introduce additional "statements about statements". There is no agnostic technical solution; reification has to be actively modelled (e.g. with CIDOC-CRM attribute assignment).
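The "actively modelled" reification mentioned above could, for example, use the CIDOC-CRM E13 Attribute Assignment pattern. A minimal rdflib sketch, with all URIs as illustrative placeholders rather than platform code:

```python
# Minimal sketch (not platform code): a statement about a statement modelled
# with the CIDOC-CRM E13 Attribute Assignment pattern.
from rdflib import Graph, Namespace, RDF

CRM = Namespace("http://www.cidoc-crm.org/cidoc-crm/")
EX = Namespace("http://example.org/")  # hypothetical local namespace

g = Graph()
g.bind("crm", CRM)

assignment = EX["assignment/1"]  # the attribution itself becomes a node
g.add((assignment, RDF.type, CRM["E13_Attribute_Assignment"]))
g.add((assignment, CRM["P140_assigned_attribute_to"], EX["object/painting1"]))
g.add((assignment, CRM["P141_assigned"], EX["actor/artist1"]))
g.add((assignment, CRM["P14_carried_out_by"], EX["actor/curator1"]))

print(g.serialize(format="turtle"))
```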
14 Reasoning operations via the triplestore endpoint Depends on the triplestore configuration Jena offers an RDF reasoner, an OWL reasoner and transitive reasoning Basic reasoning in the form of following class and property hierarchy structures is supported, but there is no current mechanism for logical inferencing. Performs reasoning for class and property hierarchy, domain, range and inverse, and stores it locally. Besides that, "normal" reasoning is performed by the triple store (if enabled).
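The class- and property-hierarchy reasoning described above can be approximated locally, e.g. with the owlrl library on top of rdflib; the namespace, class names and triples below are invented for illustration and do not come from any of the systems.

```python
# Minimal sketch: RDFS entailment (class hierarchy) computed locally.
from rdflib import Graph, Namespace, RDF, RDFS
import owlrl

EX = Namespace("http://example.org/")
g = Graph()
g.add((EX.Painting, RDFS.subClassOf, EX.HumanMadeObject))
g.add((EX.painting1, RDF.type, EX.Painting))

# Expand the graph with the RDFS closure; the inferred type is added.
owlrl.DeductiveClosure(owlrl.RDFS_Semantics).expand(g)
print((EX.painting1, RDF.type, EX.HumanMadeObject) in g)  # True
```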
15 Method for ontology definition The CIDOC-CRM can be imported as a baseline; then a derivative ontology can be defined with the web interface. Alternatively, ontologies can be defined from scratch. OWL ontologies. Can be imported or authored. Hybrid. Vitro/VIVO has an integrated component for ontology creation and editing, but can import externally created OWL and SKOS, too. Data models can be constructed by defining properties and classes, but there is no overview to visualise taxonomies or class structures. There is no possibility to develop the semantics (classes, properties) and constraints (hierarchy, domains, ranges, transitivity etc.) of the underlying ontology in WissKI; you have to use editors like Protégé for that. Designing the application ontology requires a high level of experience in ontology and data model development!
16 Ontology constraints SPIN constraints are used by default, SHACL is supported as well Classes and relationships can be mapped to fields, but Omeka does not respect ontological constraints, nor is it possible to assign complete semantic paths to one field. Based on semantically defined rules. Constraints on range and domain can be defined in Vitro, too. It is not currently possible to apply ontological constraints. An ontology, e.g. CIDOC-CRM, must be used; entity hierarchy, property constraints, transitivity and irreflexivity are supported. WissKI maps entities and their properties to ontological structures (so-called paths) with WissKI's Pathbuilder. The Pathbuilder adheres to the constraints of the ontology and accordingly offers only domains and ranges belonging to the respective relations and the appropriate classes.
17 Data validation Fields can be defined as "required" or a regex can be used to validate text input. When creating relationships, their type can be selected according to the defined ontology SPIN constraints are used by default, SHACL is supported as well Data statements are validated against the expected datatype per property.
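Where SHACL is supported (rows 16-17), validation could look roughly like this pySHACL sketch; the shapes graph and data graph are invented for illustration.

```python
# Minimal sketch: validating RDF data against a SHACL shape with pySHACL.
from rdflib import Graph
from pyshacl import validate

shapes_ttl = """
@prefix sh:  <http://www.w3.org/ns/shacl#> .
@prefix ex:  <http://example.org/> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .

ex:PersonShape a sh:NodeShape ;
    sh:targetClass ex:Person ;
    sh:property [
        sh:path ex:name ;
        sh:minCount 1 ;
        sh:datatype xsd:string ;
    ] .
"""

data_ttl = """
@prefix ex: <http://example.org/> .
ex:alice a ex:Person .  # missing ex:name, so validation should fail
"""

conforms, _, report_text = validate(
    Graph().parse(data=data_ttl, format="turtle"),
    shacl_graph=Graph().parse(data=shapes_ttl, format="turtle"),
)
print(conforms)      # False
print(report_text)   # human-readable validation report
```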
18 Standardization and interoperability for data mapping / harvesting Supports Linked Data resources and SPARQL 1.1 compliant endpoints. Can map and import CSV data. XML imports can be easily added if required. Supports Linked Data resources (JSON-LD). It can map and import CSV data. XML imports can be added through additional plugins. There are numerous tools and workflows for harvesting and mapping, for example OpenRefine, Karma, VIVO Harvester, Generate2VIVO etc. Extensions such as WikibaseRDF or WikibaseExport can be used for mapping / harvesting respectively. The third-party tool OpenRefine with a configured reconciliation service can also be used for both operations. WissKI's Pathbuilder can connect to any SPARQL 1.1 endpoint and map the data to new or existing entities. In addition to that, WissKI has adapters for Geonames, Getty AAT, GND/DNB, GNM DMS, SKOS, XML and Zotero.
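A hedged sketch of the CSV-to-RDF mapping step mentioned above, assuming a hypothetical `objects.csv` with `id` and `title` columns and an illustrative namespace:

```python
# Minimal sketch: map CSV rows to RDF triples before loading them into a
# triplestore. File name, columns and namespace are assumptions.
import csv
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/")
g = Graph()

with open("objects.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        subject = EX[f"object/{row['id']}"]
        g.add((subject, RDF.type, EX.Object))
        g.add((subject, EX.title, Literal(row["title"])))

g.serialize(destination="objects.ttl", format="turtle")
```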
19 PID generation UUIDs are generated for every entity and every relationship. Fields can be defined as "identifier", which enables URL resolution and ensures uniqueness. Every resource is identified with a URI. Blank nodes can be automatically skolemized by the server. PIDs can be created by using a specific module. Consecutive numerical identifiers are generated by default. Every entity gets its own URI according to a configured namespace.
20 PID reuse Any Linked Data resource can be imported into a local named graph. Existing PIDs can be referenced via "sameAs" statements, but reuse of the entity itself is not supported. A purely numerical identifier could theoretically be reused, although Wikibase does not currently support this behaviour by default. Reuse of PIDs is managed via owl:sameAs properties, so it is not a "real" reuse but rather a mapping. Besides that, you can always have a field with the PID that links to the authority record.
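For rows 19-20, a hedged sketch of minting a UUID-based entity URI and mapping it to an external authority record via owl:sameAs; all URIs below are placeholders.

```python
# Minimal sketch: mint a local PID and link it to an authority record.
import uuid
from rdflib import Graph, Namespace, URIRef
from rdflib.namespace import OWL

EX = Namespace("http://example.org/entity/")  # hypothetical local namespace

g = Graph()
local_entity = EX[str(uuid.uuid4())]  # locally minted, UUID-based PID
authority = URIRef("https://d-nb.info/gnd/placeholder-id")  # placeholder authority URI

g.add((local_entity, OWL.sameAs, authority))
print(g.serialize(format="turtle"))
```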
21 Import / export functions The OAI-PMH API delivers a full snapshot of the graph (read-only). Incremental harvesting is supported (deletion is supported). Data can be accessed through the JSON API as well (read/write) Data can mainly be exported in JSON-LD or tabular formats. Data can be transported via MySQL/MariaDB snapshots, as well as graphical interface tools to import or export data as XML. Data can also be dumped as JSON or RDF. The Drupal instance is backed up and migrated via its own routines. WissKI data can be exported and imported via the triple store (GraphDB, for example, supports common formats like N-Quads or Turtle), but it offers a second possibility by importing data from a MySQL/MariaDB database via the WissKI ODBC module. Pathbuilders and related ontologies are exportable/importable via the WissKI GUI.
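The incremental, deletion-aware OAI-PMH harvesting mentioned above could be consumed roughly as follows; the repository URL and metadata prefix are assumptions for illustration.

```python
# Minimal sketch: incremental OAI-PMH harvest with the ListRecords verb.
import requests
import xml.etree.ElementTree as ET

OAI_ENDPOINT = "https://example.org/oai"  # hypothetical repository endpoint
OAI_NS = {"oai": "http://www.openarchives.org/OAI/2.0/"}

params = {
    "verb": "ListRecords",
    "metadataPrefix": "oai_dc",  # common default metadata prefix
    "from": "2023-01-01",        # harvest only changes since this date
}

response = requests.get(OAI_ENDPOINT, params=params, timeout=60)
response.raise_for_status()

root = ET.fromstring(response.content)
for record in root.iter(f"{{{OAI_NS['oai']}}}record"):
    header = record.find("oai:header", OAI_NS)
    identifier = header.findtext("oai:identifier", namespaces=OAI_NS)
    status = header.get("status", "active")  # "deleted" marks removed entities
    print(identifier, status)
```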
22 Data input format support Custom JSON, Excel All RDF file formats; CSV using the CSV2RDF SPARQL mappings. JSON-LD, tabular format. Everything that's RDF 1.1 compliant and defined in the used ontology. Support for 17 distinct data types: CommonsMedia, EDTF, ExternalID, Form, GeoShape, GlobeCoordinate, ItemID, Lexeme, Math, MonolingualText, MusicalNotation, Property, Quantity, Sense, String, TabularData, Time and Url. The Drupal database stores a common range of data types. Most triplestores support the datatypes defined by the OWL language. Drupal core supports basic field types, field widgets and field formatters, but can be extended with modules, e.g. for geospatial data, or custom field types can be developed. In addition to that, WYSIWYG editors like CKEditor or N1ED support their own input formats.
23 Multiple value entries per field Multiple items can be entered against the same property.
24 Content templating Small HTML fragments can be generated from JSON data (e.g. links, video embeds or 3D viewers). More substantial changes to the look & feel have to be implemented. Standard XSLT 3.0 templates Yes, but requires additional configuration for complex ontologies. Simple property graphs work out of the box. Templates for entities belonging to specific classes can be specified by Entity Schemas and accessed via the Cradle tool. For presentation, Lua templates can be configured to populate MediaWiki pages, but there are no ready-made models to follow; the work requires a significant amount of custom development. Entities with their properties are defined through the so-called "pathbuilder". Data model templates are available via a dedicated repository. Additionally, you can define display modes, both for data entry and presentation, to display the same entity with a different set of properties depending on the context.
25 Front-end design The front-end design is hard-coded. The color palette for the interface is customizable. Bootstrap 2.3.2. Planned upgrade to Material Design. Omeka S comes with specific themes out of the box, and a guide on how to create new ones. Only templates There is a limited number of "theme skins", but further alteration requires additional extensions or code modification. There are over 3000 Drupal themes, and extensive documentation on creating your own themes.
26 Translatable/ multilingual Interface Yes, built-in support for English and German Yes. Built-in support for English and Spanish UI labels. By default a multilingual interface and support for multilingual entries for every claim, as well as support for monolingual information where appropriate.
27 Version control Implements "soft-delete" to support OAI-PMH incremental harvesting with deleted entities Wikibase features a complete edit history of all changes made per entity, including authorship (IP address, if not made from a user account), timestamps, and alteration of the data footprint.
28 User access control on the entity/fact level Document-level ACL using the WebAccessControl vocabulary. Access level is checked using a customizable SPARQL query. Single declarations can be authored via the MediaWiki interface, which are then propagated to the Blazegraph triplestore. Editing directly in Blazegraph is not supported. WissKI stores its data independently in the triplestore; if you alter the data in the triplestore, it will be reflected in the Drupal front-end and vice versa. This is the foundation for federating multiple repositories or working with different software tools on the same data repositories.
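A document-level authorization in the W3C WebAccessControl vocabulary, as mentioned in the row above, might be expressed like the following minimal rdflib sketch; the agent and document URIs are illustrative, not taken from any platform.

```python
# Minimal sketch: one read-only WebAccessControl (WAC) authorization.
from rdflib import Graph, Namespace, RDF

ACL = Namespace("http://www.w3.org/ns/auth/acl#")
EX = Namespace("http://example.org/")

g = Graph()
g.bind("acl", ACL)

auth = EX["acl/authorization1"]
g.add((auth, RDF.type, ACL.Authorization))
g.add((auth, ACL.agent, EX["users/alice"]))        # who is granted access
g.add((auth, ACL.accessTo, EX["documents/doc1"]))  # which document
g.add((auth, ACL.mode, ACL.Read))                  # which access mode

print(g.serialize(format="turtle"))
```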
29 Access rights settings The UI editor for WAC allows creating granular authorizations for every document or document class Omeka S supports 6 different types of users and permissions. Through additional plugins (e.g. Annotate), additional roles can be created. Access controls are deployed via User Groups, but are not available at anything below entity view. The SPARQL endpoint does not allow for granular access permissions. Drupal has extensive rights management on page, entity and field level. Dynamic access (e.g. "show field 81 years after death") has to be coded individually. Authentication for the database and the triple store has to be configured separately.
30 Media integrations Images: supported formats via ImageMagick, video/audio: supported formats via FFmpeg Omeka S supports different formats out of the box (link), additional formats, e.g. 3D files, have to be implemented through external plugins. Wikibase provides support for embedding audio, image and moving image files, through Wikimedia Commons or via the new WikibaseLocalMedia extension. Geospatial data must be imported and linked through Wikimedia Commons. WissKI relies on Drupal's Media / Media Library management, which allows embedding many different formats of audio, video and images, even if they are from a remote source. In most cases, users use modules to customize the playback of audio files (e.g. with AudioField) or how images are displayed (e.g. with Colorbox). WissKI also offers the possibility to integrate 3D objects (via Sketchfab and Kompakkt), geospatial data e.g. with Leaflet, and the possibility to display and annotate images in a IIIF-capable viewer (via the WissKI Mirador 3 integration).
31 Data visualization Maps using Google Maps (planned upgrade to OpenLayers), charts using Google Charts, graph view using an interactive SVG renderer. Different modules are available, e.g. Datavis, Cartography The SPARQL endpoint includes default support for plotting and charting query results. Simple diagrams can be realised with the Charts module.
32 Data analysis A few aggregate statistics are available out of the box, while more sophisticated analyses have to be performed through external software. Basic numerical analysis can be achieved by creative SPARQL queries, but in-depth analysis requires third-party tools or extensions. Only very few aggregational statistics are available out of the box, like counting entities. More sophisticated analyses have to be performed with external software, like Excel, R or SPSS.
33 Third-party tools and libraries The community is active in the creation of multiple modules / extensions. Includes support from a large community of developers contributing extensions. Vast selection of existing libraries with the possibility of independent integration.
34 Scalability From experience: installations with a million or more entries are no problem Depends on the triplestore. Wikidata remains the high-water mark for scalability of the Wikibase system, and as of 2022-05-02 this comprises almost 100 million entities. Since WissKI uses native graph structures, it can handle billions of data records, but achieving this capacity depends on the complexity of the data model in combination with the technical infrastructure. In addition to that, Drupal does not implement asynchronous data loading (yet). Lots of entity references can slow down the system, so caching and indexing mechanisms should be implemented.