Joining SQL and SPARQL Data
We can also find all our employees in the Northeast with a multi-model inner-join query, using fromSQL() as our "left table" instead of fromView() and fromSPARQL() as our "right table" instead of fromTriples(). (The fromSPARQL() part of this query is from the Building Semantic Queries example):
// Select from employee profile columns via SQL:
const sqlView = op.fromSQL(`
  SELECT *
  FROM Employee.Profile
`);

// Select with category expansion via SPARQL:
const sparqlView = op.fromSPARQL(`
  PREFIX ex: <https://example.com/semantics/geo#>
  PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
  PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
  SELECT ?code ?region
  FROM <https://example.com/semantics/geo>
  WHERE {
    ?state skos:broader ?region .
    ?state rdfs:isDefinedBy ?code .
    FILTER (?region = ex:Northeast)
  }
`);

// Join two different data models to create a multi-model query:
sqlView
  .joinInner(sparqlView, op.on('State', 'code'))
  .offsetLimit(0, 100)
  .result()
We used this query to obtain results similar to those shown in Joining View and Triples Data:
The Data Accessor Function fromSQL() pulls data into a row sequence containing the SELECT-specified columns from the FROM-specified view. In our query, we selected all the columns from our employee view.

The Data Accessor Function fromSPARQL() pulls data into a row sequence containing the SELECT-specified columns from the FROM-specified named graph WHERE the FILTER-specified column matches a specified value. In our query, we selected the code and region columns where the region is Northeast.

When you use more than one data accessor function in a query, it is best practice to use a variable to represent each one to keep the query simple and readable; for contrast, a sketch of the same join written inline follows below.
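To see why, here is a minimal sketch of the same join written as a single inline chain, with no intermediate variables. It uses only the calls from the query above and is functionally equivalent; the nesting is what makes it harder to read:

// The same multi-model join written inline, without sqlView/sparqlView:
op.fromSQL(`
  SELECT *
  FROM Employee.Profile
`).joinInner(
    op.fromSPARQL(`
      PREFIX ex: <https://example.com/semantics/geo#>
      PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
      PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
      SELECT ?code ?region
      FROM <https://example.com/semantics/geo>
      WHERE {
        ?state skos:broader ?region .
        ?state rdfs:isDefinedBy ?code .
        FILTER (?region = ex:Northeast)
      }
    `),
    op.on('State', 'code'))
  .offsetLimit(0, 100)
  .result()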
The Operator Function joinInner() returns all rows from both fromSQL() and fromSPARQL() if the specified columns match, considering the data accessor function specified first in the query to be the "left table" and the second to be the "right table."

The Auxiliary Function on() lets you specify a join condition for join-type operator functions like joinInner().

The Operator Function offsetLimit() restricts the results returned. The first parameter specifies the number of results to skip; the second, the number of results to return. So, (0, 100) returns the first 100 results (see the paging sketch below).

The Executor Function result() executes the query and returns the results as a row sequence.
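For example, reusing the sqlView and sparqlView variables from above, this minimal sketch pages past the first 100 matching rows and returns the next 100 instead:

// Skip the first 100 matching rows, then return the next 100 (rows 101-200):
sqlView
  .joinInner(sparqlView, op.on('State', 'code'))
  .offsetLimit(100, 100)
  .result()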
Here is row 1 of the original query's 100-row x 25-column result:
{ "Employee.Profile.GUID": "91ed05cb-a2ff-4bd5-9ea1-e8a5829bc665", // EC1 "code": "MA", // TC1 = EC10 "region": "https://example.com/semantics/geo#Northeast", // TC2 "Employee.Profile.HiredDate": "2013-02-28", // EC2 "Employee.Profile.Gender": "female", // EC3 "Employee.Profile.Title": "Mrs.", // EC4, etc. "Employee.Profile.GivenName": "Priscilla", "Employee.Profile.MiddleInitial": "J", "Employee.Profile.Surname": "Torres", "Employee.Profile.StreetAddress": "4121 Cedar Lane", "Employee.Profile.City": "Cambridge", "Employee.Profile.State": "MA", // EC10 = TC1 "Employee.Profile.ZipCode": "2142", "Employee.Profile.Country": "US", "Employee.Profile.EmailAddress": "PriscillaJTorres@teleworm.us", "Employee.Profile.TelephoneNumber": "617-639-9170", "Employee.Profile.TelephoneCountryCode": "1", "Employee.Profile.Birthday": "1/11/45", "Employee.Profile.NationalID": "027-18-9604", "Employee.Profile.Point": "42.307606,-71.037903", "Employee.Profile.BaseSalary": 14480, "Employee.Profile.Bonus": 1448, "Employee.Profile.Department": "Engineering", "Employee.Profile.Status": "Active - Regular Exempt (Full-time)", "Employee.Profile.ManagerGUID": "3ad0ffbc-3ade-4897-902b-718417a721f5" }
This query returned 100 rows for view documents with matching triples documents.
The rows are in an unspecified order, which could change between executions. You can specify row order with the orderBy() operator function, as in the sketch below. The 25 columns are the 23 from our "left table," the Employee Profile view (EC#), plus the 2 from our "right table," the triples row sequence (TC#).
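For example, reusing the same variables, this minimal sketch sorts the joined rows before paging; the choice of the code column as the sort key is illustrative only:

// Sort the joined rows by the two-letter state code so the row order is
// stable across executions, then return the first 100 rows:
sqlView
  .joinInner(sparqlView, op.on('State', 'code'))
  .orderBy(op.asc('code'))
  .offsetLimit(0, 100)
  .result()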
You now know how to use Optic to query your data in many ways. Next, you will learn how to use Optic to update your data.