
Semantic Graph Developer's Guide — Chapter 11

Client-Side APIs for Semantics

MarkLogic Semantics can be accessed through client-side APIs that provide support for management of triples and graphs, SPARQL and SPARQL Update, and access to the search features of MarkLogic Server. The Java Client and Node.js Client source are available on GitHub, as are MarkLogic RDF4J and MarkLogic Jena.

The chapter includes the following sections:

  • Java Client API
  • MarkLogic RDF4J API
  • MarkLogic Jena API
  • Node.js Client API
  • Queries Using Optic API

Java Client API

The Java Client API enables you to create client-side Java applications that interact with MarkLogic. Semantics related features include support for graph and triple management, SPARQL Query, SPARQL Update, and Optic queries. The Java Client API can also be used in conjunction with the MarkLogic Jena API and the MarkLogic RDF4J API.

For details, see Working With Semantic Data in the Java Application Developer's Guide and the following interfaces and classes in the com.marklogic.client.semantics package in the Java Client API Documentation:

  • GraphManager
  • SPARQLQueryManager
  • SPARQLQueryDefinition
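As a rough sketch of how these interfaces fit together, the block below assembles a Turtle payload and a SPARQL SELECT of the kind GraphManager and SPARQLQueryManager consume. The class name, graph URI, and triple data are illustrative only, and the client calls themselves appear in comments because they require a running MarkLogic server and real connection details:

```java
// Illustrative sketch only: builds the payload and query strings that the
// com.marklogic.client.semantics interfaces consume. The MarkLogic calls are
// in comments because they need a reachable server; all names are examples.
public class SemanticsSketch {

    // Turtle payload of the kind passed to GraphManager.write
    static String buildTurtle() {
        return "@prefix foaf: <http://xmlns.com/foaf/0.1/> .\n"
             + "<http://example.org/john> a foaf:Person ;\n"
             + "    foaf:name \"John\" .\n";
    }

    // SPARQL SELECT of the kind passed to SPARQLQueryManager.executeSelect
    static String buildQuery() {
        return "PREFIX foaf: <http://xmlns.com/foaf/0.1/>\n"
             + "SELECT ?name WHERE { ?s foaf:name ?name }";
    }

    public static void main(String[] args) {
        // Against a live server the calls would look roughly like:
        //   DatabaseClient client = DatabaseClientFactory.newClient(host, port,
        //       new DatabaseClientFactory.DigestAuthContext(user, password));
        //   GraphManager gmgr = client.newGraphManager();
        //   gmgr.write("http://example.org/graphs/people",
        //       new StringHandle(buildTurtle()).withMimetype("text/turtle"));
        //   SPARQLQueryManager sqmgr = client.newSPARQLQueryManager();
        //   SPARQLQueryDefinition qdef = sqmgr.newQueryDefinition(buildQuery());
        //   JacksonHandle results = sqmgr.executeSelect(qdef, new JacksonHandle());
        System.out.println(buildTurtle());
        System.out.println(buildQuery());
    }
}
```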

MarkLogic RDF4J API

RDF4J is a Java API for processing and handling RDF data, including creating, parsing, storing, inferencing, and querying over this data. Java developers who are familiar with the RDF4J API can now use that same API to access RDF data in MarkLogic. The MarkLogic RDF4J API is a full-featured interface that provides access to MarkLogic Semantics functionality. The MarkLogic RDF4J API replaces the MarkLogic Sesame API, which was built using the older Sesame API. The MarkLogic RDF4J API includes the MarkLogicRepository interface, part of the persistence layer.

By including the MarkLogic RDF4J API, you can leverage MarkLogic as a Triple Store using standard RDF4J for SPARQL query and SPARQL Update. The MarkLogic RDF4J API extends the standard RDF4J API so that you can also do combination queries, variable bindings, and transactions. The MarkLogicRepository class provides support for both transactions and variable bindings.

The following example uses the MarkLogic RDF4J API to perform a SPARQL query and a SPARQL Update against semantic data stored in a MarkLogic database. The example first instantiates a MarkLogicRepository and defines a default ruleset and default permissions. Notice that the SPARQL query is constrained by an additional combined query that limits the results returned:

package com.marklogic.semantics.rdf4j.examples;

import com.marklogic.client.DatabaseClient;
import com.marklogic.client.DatabaseClientFactory;
import com.marklogic.client.io.Format;
import com.marklogic.client.io.StringHandle;
import com.marklogic.client.query.QueryManager;
import com.marklogic.client.query.RawCombinedQueryDefinition;
import com.marklogic.client.query.StringQueryDefinition;
import com.marklogic.client.semantics.Capability;
import com.marklogic.client.semantics.GraphManager;
import com.marklogic.client.semantics.SPARQLRuleset;
import com.marklogic.semantics.rdf4j.MarkLogicRepository;
import com.marklogic.semantics.rdf4j.MarkLogicRepositoryConnection;
import com.marklogic.semantics.rdf4j.query.MarkLogicTupleQuery;
import com.marklogic.semantics.rdf4j.query.MarkLogicUpdateQuery;
import org.eclipse.rdf4j.model.Resource;
import org.eclipse.rdf4j.model.IRI;
import org.eclipse.rdf4j.model.ValueFactory;
import org.eclipse.rdf4j.model.vocabulary.FOAF;
import org.eclipse.rdf4j.model.vocabulary.RDF;
import org.eclipse.rdf4j.model.vocabulary.RDFS;
import org.eclipse.rdf4j.model.vocabulary.XMLSchema;
import org.eclipse.rdf4j.query.*;
import org.eclipse.rdf4j.repository.RepositoryException;
import org.eclipse.rdf4j.rio.RDFParseException;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.io.IOException;

public class Example2_Advanced {

    protected static Logger logger = LoggerFactory.getLogger(Example2_Advanced.class);

    public static void main(String... args) throws RepositoryException, IOException, RDFParseException, MalformedQueryException, QueryEvaluationException {

        // instantiate MarkLogicRepository using a Java Client API DatabaseClient
        DatabaseClient adminClient = DatabaseClientFactory.newClient("localhost", 8200, new DatabaseClientFactory.DigestAuthContext("username", "password"));
        GraphManager gmgr = adminClient.newGraphManager();
        QueryManager qmgr = adminClient.newQueryManager();

        // create repo and init
        MarkLogicRepository repo = new MarkLogicRepository(adminClient);

        // get repository connection
        MarkLogicRepositoryConnection conn = repo.getConnection();

        // set default rulesets
        conn.setDefaultRulesets(SPARQLRuleset.ALL_VALUES_FROM);

        // set default perms
        conn.setDefaultGraphPerms(gmgr.permission("admin", Capability.READ).permission("admin", Capability.EXECUTE));

        // set a default constraining query
        StringQueryDefinition stringDef = qmgr.newStringDefinition().withCriteria("First");
        conn.setDefaultConstrainingQueryDefinition(stringDef);

        // return number of triples contained in repository
        logger.info("1. number of triples: {}", conn.size());

        // add a few constructed triples
        Resource context1 = conn.getValueFactory().createIRI("http://marklogic.com/examples/context1");
        Resource context2 = conn.getValueFactory().createIRI("http://marklogic.com/examples/context2");
        ValueFactory f = conn.getValueFactory();
        String namespace = "http://example.org/";
        IRI john = f.createIRI(namespace, "john");

        // add triple statements to named graphs (contexts)
        conn.add(john, RDF.TYPE, FOAF.PERSON, context1);
        conn.add(john, RDFS.LABEL, f.createLiteral("John", XMLSchema.STRING), context2);

        logger.info("2. number of triples: {}", conn.size());

        // perform SPARQL query
        String queryString = "select * { ?s ?p ?o }";
        MarkLogicTupleQuery tupleQuery = conn.prepareTupleQuery(QueryLanguage.SPARQL, queryString);

        // enable inferencing using the rulesets set on the MarkLogic database
        tupleQuery.setIncludeInferred(true);

        // a base uri for resolving relative uris can be supplied as a third
        // argument to prepareTupleQuery

        // set rulesets for inferencing
        tupleQuery.setRulesets(SPARQLRuleset.ALL_VALUES_FROM, SPARQLRuleset.HAS_VALUE);

        // set a combined query to constrain results
        String combinedQuery =
                "{\"search\":" +
                "{\"qtext\":\"First\"}}";
        RawCombinedQueryDefinition rawCombined = qmgr.newRawCombinedQueryDefinition(new StringHandle().with(combinedQuery).withFormat(Format.JSON));
        tupleQuery.setConstrainingQueryDefinition(rawCombined);

        // evaluate query with pagination
        TupleQueryResult results = tupleQuery.evaluate(1,10);

        // iterate through query results
        while (results.hasNext()) {
            BindingSet bindings = results.next();
            logger.info("predicate:{}", bindings.getValue("p"));
            logger.info("object:{}", bindings.getValue("o"));
        }

        logger.info("3. number of triples: {}", conn.size());

        //update query
        String updatequery = "INSERT DATA { GRAPH <http://marklogic.com/test/context10> {  <http://marklogic.com/test/subject> <pp1> <oo1> } }";
        MarkLogicUpdateQuery updateQuery = conn.prepareUpdate(QueryLanguage.SPARQL, updatequery,"http://marklogic.com/test/baseuri");

        // set perms to be applied to data
        updateQuery.setGraphPerms(gmgr.permission("admin", Capability.READ).permission("admin", Capability.EXECUTE));

        // execute the update
        try {
            updateQuery.execute();
        } catch (UpdateExecutionException e) {
            e.printStackTrace();
        }

        logger.info("4. number of triples: {}", conn.size());

        // clear all triples
        conn.clear();
        logger.info("5. number of triples: {}", conn.size());

        // close connection and shutdown repository
        conn.close();
        repo.shutDown();
    }
}

The MarkLogic RDF4J API is available on GitHub at http://github.com/marklogic/marklogic-rdf4j, along with javadocs and examples.

The key interfaces of MarkLogic RDF4J API are:

  • MarkLogicRepository
  • MarkLogicRepositoryConnection
  • MarkLogicQuery
    • MarkLogicTupleQuery
    • MarkLogicGraphQuery
    • MarkLogicBooleanQuery
    • MarkLogicUpdateQuery

MarkLogic Jena API

Jena is a Java API for processing and handling RDF data, including creating, parsing, storing, inferencing, and querying over this data. Java developers who are familiar with the Jena API can now use that same API to access RDF data in MarkLogic. The MarkLogic Jena API is a full-featured interface that provides access to MarkLogic Semantics functionality.

By including the MarkLogic Jena API, you can leverage MarkLogic as a Triple Store using standard Jena for SPARQL query and SPARQL Update. The MarkLogic Jena API extends Jena so you can also do combination queries, variable bindings, and transactions. The MarkLogicDatasetGraph class provides support for both transactions and variable bindings.

Here is an example showing how to run queries using MarkLogic Jena:

package com.marklogic.jena.examples;

import org.apache.jena.riot.RDFDataMgr;
import org.apache.jena.riot.RDFFormat;

import com.hp.hpl.jena.graph.NodeFactory;
import com.hp.hpl.jena.query.QueryExecutionFactory;
import com.hp.hpl.jena.query.QuerySolution;
import com.hp.hpl.jena.query.ResultSet;
import com.hp.hpl.jena.update.UpdateExecutionFactory;
import com.hp.hpl.jena.update.UpdateFactory;
import com.hp.hpl.jena.update.UpdateProcessor;
import com.hp.hpl.jena.update.UpdateRequest;
import com.marklogic.semantics.jena.MarkLogicDatasetGraph;

/**
 * How to run queries.
 */
public class SPARQLUpdateExample {
    private MarkLogicDatasetGraph dsg;

    public SPARQLUpdateExample() {
        dsg = ExampleUtils.loadPropsAndInit();
    }

    private void run() {
        String insertData = "PREFIX foaf: <http://xmlns.com/foaf/0.1/> "
                + "PREFIX : <http://example.org/> "
                +"INSERT DATA {GRAPH :g1 {"
                + ":charles a foaf:Person ; "
                + "        foaf:name \"Charles\" ;"
                + "        foaf:knows :jim ."
                + ":jim    a foaf:Person ;"
                + "        foaf:name \"Jim\" ;"
                + "        foaf:knows :charles ."
                + "} }";
        System.out.println("Running SPARQL update");
        UpdateRequest update = UpdateFactory.create(insertData);
        UpdateProcessor processor = UpdateExecutionFactory.create(update, dsg);
        processor.execute();
        System.out.println("Examine the data as JSON-LD");
        RDFDataMgr.write(System.out, dsg.getGraph(NodeFactory.createURI("http://example.org/g1")),
                RDFFormat.JSONLD_PRETTY);
        System.out.println("Remove it.");
        update = UpdateFactory.create("PREFIX : <http://example.org/> DROP GRAPH :g1");
        processor = UpdateExecutionFactory.create(update, dsg);
        processor.execute();
    }

    public static void main(String... args) {
        SPARQLUpdateExample example = new SPARQLUpdateExample();
        example.run();
    }
}

The MarkLogic Jena source is available on GitHub at http://github.com/marklogic/marklogic-jena, along with javadocs and examples.

The key interfaces of the MarkLogic Jena API are:

  • MarkLogicDatasetGraph
  • MarkLogicDatasetGraphFactory
  • MarkLogicQuery
  • MarkLogicQueryEngine

Node.js Client API

The Node.js Client API supports CRUD (create, read, update, and delete) operations on semantic graphs and the triples they contain. The DatabaseClient.graphs.write function creates a graph containing triples, DatabaseClient.graphs.read reads from a graph, DatabaseClient.graphs.remove removes a graph, and DatabaseClient.graphs.sparql queries semantic data.

See Working With Semantic Data in the Node.js Application Developer's Guide for more details. The Node.js Client source can be found on GitHub at http://github.com/marklogic/node-client-api. For additional operations, see the Node.js Client API Reference.

These operations only work with managed triples contained in a graph. Embedded triples cannot be manipulated using the Node.js Client API.
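As a sketch of these call shapes, the block below builds a Turtle payload and a SPARQL query; the graph URI and data are illustrative, and the DatabaseClient calls appear only in comments because they require the marklogic package and a running server:

```javascript
// Illustrative sketch only: the graph URI, triples, and connection settings
// are examples, and the marklogic client calls are shown in comments because
// they require a running MarkLogic server.
const graphUri = 'http://example.org/graphs/people';

// Turtle payload of the kind DatabaseClient.graphs.write stores as a graph
const turtle = [
  '@prefix foaf: <http://xmlns.com/foaf/0.1/> .',
  '<http://example.org/john> a foaf:Person ;',
  '    foaf:name "John" .'
].join('\n');

// SPARQL query of the kind DatabaseClient.graphs.sparql evaluates
const query =
  'PREFIX foaf: <http://xmlns.com/foaf/0.1/> ' +
  'SELECT ?name WHERE { ?s foaf:name ?name }';

// Against a live server the calls would look roughly like:
//   const marklogic = require('marklogic');
//   const db = marklogic.createDatabaseClient({host: 'localhost', port: 8000,
//       user: 'user', password: 'password', authType: 'DIGEST'});
//   db.graphs.write({uri: graphUri, contentType: 'text/turtle', data: turtle})
//     .result()
//     .then(() => db.graphs.sparql('application/sparql-results+json', query).result())
//     .then(results => console.log(JSON.stringify(results)));

console.log(turtle);
console.log(query);
```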

Queries Using Optic API

The Optic API can be used to search and work with semantic triples in both client-side and server-side queries. Optic queries over triple data can be issued client-side with the Java Client API and the REST Client API, but not with the Node.js Client API. See Optic Java API for Relational Operations in the Java Application Developer's Guide and Invoking an Optic API Query Plan in the REST Application Developer's Guide for more details.

For server-side queries using the Optic API, see Querying Triples with the Optic API. Also see the op:from-triples and op.fromTriples functions in the Optic API, and the Data Access Functions section in the Application Developer's Guide.
