Please run the following statements to set up the tutorial. They can also be used to reset the tutorial if you want to start again at any point.
create or replace view EMPLOYEE_KEY_VALUE
as
select EMPLOYEE_ID as ID, 'EmployeeId' as KEY, to_char(EMPLOYEE_ID) as VALUE
from HR.EMPLOYEES
union all
select EMPLOYEE_ID as ID, 'FirstName' as KEY, FIRST_NAME as VALUE
from HR.EMPLOYEES
union all
select EMPLOYEE_ID as ID, 'LastName' as KEY, LAST_NAME as VALUE
from HR.EMPLOYEES
union all
select EMPLOYEE_ID as ID, 'EmailAddress' as KEY, EMAIL as VALUE
from HR.EMPLOYEES
union all
select EMPLOYEE_ID as ID, 'TelephoneNumber' as KEY, PHONE_NUMBER as VALUE
from HR.EMPLOYEES
union all
select EMPLOYEE_ID as ID , 'HireDate' as KEY, to_char(HIRE_DATE) as VALUE
from HR.EMPLOYEES
union all
select EMPLOYEE_ID as ID, 'JobId' as KEY, JOB_ID as VALUE
from HR.EMPLOYEES
union all
select EMPLOYEE_ID as ID, 'Salary' as KEY, to_char(SALARY) as VALUE
from HR.EMPLOYEES
union all
select EMPLOYEE_ID as ID, 'Commission' as KEY, to_char(COMMISSION_PCT) as VALUE
from HR.EMPLOYEES
union all
select EMPLOYEE_ID as ID, 'ManagerId' as KEY, to_char(MANAGER_ID) as VALUE
from HR.EMPLOYEES
union all
select EMPLOYEE_ID as ID, 'DepartmentId' as KEY, to_char(DEPARTMENT_ID) as VALUE
from HR.EMPLOYEES
/
declare
cursor getTable
is
select TABLE_NAME
from ALL_TABLES
where TABLE_NAME in ( 'J_PURCHASEORDER', 'JSON_DUMP_CONTENTS','CITY_LOT_FEATURES')
and OWNER = SYS_CONTEXT('USERENV','CURRENT_USER');
begin
for t in getTable loop
execute immediate 'DROP TABLE "' || t.TABLE_NAME || '" PURGE';
end loop;
end;
/
In Oracle there is no dedicated JSON data type. JSON documents are stored in the database using standard Oracle data types such as VARCHAR2, CLOB and BLOB. VARCHAR2 can be used where the size of the JSON document will never exceed 4K (32K in databases where extended VARCHAR2 support has been enabled). Larger documents are stored using the CLOB or BLOB data types.
In order to ensure that the content of the column is valid JSON data, a new constraint, IS JSON, is provided that can be applied to a column. This constraint returns TRUE if the content of the column is well-formed, valid JSON and FALSE otherwise.
The first statement in this module creates a table that will be used to contain JSON documents.
create table J_PURCHASEORDER (
ID RAW(16) NOT NULL,
DATE_LOADED TIMESTAMP(6) WITH TIME ZONE,
PO_DOCUMENT CLOB CHECK (PO_DOCUMENT IS JSON)
)
This statement creates a very simple table, J_PURCHASEORDER. The table has a column PO_DOCUMENT of type CLOB. The IS JSON constraint is applied to the column PO_DOCUMENT, ensuring that the column can store only well-formed JSON documents.
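To see the constraint in action, here is a minimal sketch (assuming the J_PURCHASEORDER table created above). The first insert fails with “ORA-02290: check constraint violated” because the text is not well-formed JSON; the second insert succeeds and is then rolled back.
-- Rejected by the IS JSON check constraint
insert into J_PURCHASEORDER (ID, DATE_LOADED, PO_DOCUMENT)
values (SYS_GUID(), SYSTIMESTAMP, 'This is not JSON')
/
-- Accepted: the document is well-formed JSON
insert into J_PURCHASEORDER (ID, DATE_LOADED, PO_DOCUMENT)
values (SYS_GUID(), SYSTIMESTAMP, '{"PONumber":9999}')
/
rollback
/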
JSON documents can come from a number of different sources. Since Oracle stores JSON data using standard SQL data types, all of the popular Oracle APIs can be used to load JSON documents into the database. JSON documents contained in files can be loaded directly into the database using External Tables.
This statement creates a simple external table that can read JSON documents from a dump file generated by a typical No-SQL style database. In this case, the documents are contained in the file purchaseOrders.json. The SQL directory object ORDER_ENTRY points to the folder containing the dump file, and the SQL directory object JSON_LOADER_OUTPUT points to the database’s trace folder which will contain any ‘log’ or ‘bad’ files generated when the table is processed.
The following code is for reference only. External Tables are not supported in the LiveSQL environment.
CREATE TABLE DUMP_FILE_CONTENTS (
  PO_DOCUMENT CLOB
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY ORDER_ENTRY
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY 0x'0A'
    BADFILE JSON_LOADER_OUTPUT: 'JSON_DUMPFILE_CONTENTS.bad'
    LOGFILE JSON_LOADER_OUTPUT: 'JSON_DUMPFILE_CONTENTS.log'
    FIELDS (
      PO_DOCUMENT CHAR(5000)
    )
  )
  LOCATION (
    ORDER_ENTRY:'purchaseOrders.json'
  )
)
PARALLEL
REJECT LIMIT UNLIMITED
The following code creates a simulation of an external table that can read a dump file that has been stored in a location that is accessible when using LiveSQL.
create or replace view JSON_DUMP_CONTENTS (
PO_DOCUMENT
)
as
select JSON_DOCUMENT
from JSON_TABLE(
xdburitype('/public/tutorials/json/testdata/purchaseOrders.json').getClob(),
'$[*]'
columns (
JSON_DOCUMENT VARCHAR2(32000) FORMAT JSON PATH '$'
)
)
The following statement copies the JSON documents from the dump file into the J_PURCHASEORDER table.
insert into J_PURCHASEORDER
select SYS_GUID(), SYSTIMESTAMP, PO_DOCUMENT
from JSON_DUMP_CONTENTS
where PO_DOCUMENT IS JSON
and rownum < 1001
/
commit
/
The IS JSON condition is used to ensure that the insert operation takes place only for well formed documents. Make sure that the commit statement is executed after the insert statement has completed. SYS_GUID and SYSTIMESTAMP are used to populate columns ID and DATE_LOADED.
Oracle allows a simple ‘dotted’ notation to be used to perform a limited set of operations on columns containing JSON. It also introduces a set of SQL operators that allow the full power of the JSON path language to be used with JSON. This is similar to the way in which the database allows XQuery to be used to access the contents of XML documents stored in the database.
The dotted notation can be used to perform basic operations on JSON stored in Oracle Database. Using the dotted notation you can access the value of any keys contained in the JSON document. In order to use the dotted notation, a table alias must be assigned to the table in the FROM clause, and any reference to the JSON column must be prefixed with the assigned alias. All data is returned as VARCHAR2(4000). Also the simplified syntax can only be used with columns that have had an IS JSON constraint applied to them.
The following examples demonstrate how to use the simplified syntax to extract values from a JSON document and how to filter a result set based on the content of a JSON document. The first query shows the count of PurchaseOrder documents by cost center:
select j.PO_DOCUMENT.CostCenter, count(*)
from J_PURCHASEORDER j
group by j.PO_DOCUMENT.CostCenter
order by j.PO_DOCUMENT.CostCenter
The second query shows how to fetch the JSON document where the PONumber key has the value 450:
select j.PO_DOCUMENT
from J_PURCHASEORDER j
where j.PO_DOCUMENT.PONumber = 450
The third query shows how to fetch data from any document where the key PONumber contains the value 450. The statement returns the values of the keys Reference, Requestor and CostCenter, as well as the value of the key city, which is a property of the object identified by the key Address, which in turn is a property of the object identified by the key ShippingInstructions:
select j.PO_DOCUMENT.Reference,
j.PO_DOCUMENT.Requestor,
j.PO_DOCUMENT.CostCenter,
j.PO_DOCUMENT.ShippingInstructions.Address.city
from J_PURCHASEORDER j
where j.PO_DOCUMENT.PONumber = 450
The fourth query shows how to fetch the content of the Address object:
select j.PO_DOCUMENT.ShippingInstructions.Address
from J_PURCHASEORDER j
where j.PO_DOCUMENT."PONumber" = 450
The Address object is returned as JSON text in a VARCHAR2(4000).
The JSON_VALUE operator uses a JSON path expression to access a single scalar value. It is typically used in the select list to return the value associated with the JSON path expression or in the WHERE clause to filter a result set based on the content of a JSON document.
JSON_VALUE takes two arguments, a JSON column and a JSON path expression. It provides a set of modifiers that allow control over the format in which the data is returned and how to handle any errors encountered while evaluating the JSON path expression.
The following examples demonstrate how to use the JSON_VALUE operator to extract scalar values from a JSON document using JSON path expressions and to filter a result set based on the content of a JSON document. The first query shows the count of PurchaseOrder documents by cost center:
select JSON_VALUE(PO_DOCUMENT ,'$.CostCenter'), count(*)
from J_PURCHASEORDER
group by JSON_VALUE(PO_DOCUMENT ,'$.CostCenter')
The query uses the JSON_VALUE operator to return the value of the CostCenter key from each document in the J_PURCHASEORDER table. The SQL COUNT and GROUP BY operations are then applied to the values returned to generate the desired result. This shows how storing JSON documents in Oracle Database makes it possible to bring the full power of SQL to bear on JSON content without giving up any of the advantages of the JSON paradigm.
The second query shows how to access a value from within an array. This example uses index 0 to access the value of the UPCCode from the Part object inside the first member of the LineItems array. This query also shows how to use JSON_VALUE as a filter in the WHERE clause of the SQL statement, so that the JSON_VALUE operator in the select list is executed only for those rows where the JSON_VALUE based condition in the WHERE clause evaluates to TRUE.
select JSON_VALUE(PO_DOCUMENT ,'$.LineItems[0].Part.UPCCode')
from J_PURCHASEORDER p
where JSON_VALUE(PO_DOCUMENT ,'$.PONumber' returning NUMBER(10)) = 450
JSON_VALUE also provides control over the value returned. You can specify size when returning a VARCHAR2 value, and size and precision when returning a NUMBER value. The third query shows how to specify the SQL data type that is returned by JSON_VALUE. In this case, the value is returned as a NUMBER(5,3) value.
select JSON_VALUE(PO_DOCUMENT, '$.LineItems[0].Part.UnitPrice' returning NUMBER(5,3))
from J_PURCHASEORDER p
where JSON_VALUE(PO_DOCUMENT ,'$.PONumber' returning NUMBER(10)) = 450
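The returning clause can also be used to control the size of a character result. The following query is a sketch of the same idea, returning the Requestor key as a VARCHAR2(32) rather than the default VARCHAR2(4000):
select JSON_VALUE(PO_DOCUMENT, '$.Requestor' returning VARCHAR2(32))
from J_PURCHASEORDER p
where JSON_VALUE(PO_DOCUMENT ,'$.PONumber' returning NUMBER(10)) = 450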
The JSON_VALUE function also provides options for handling errors that might be encountered when applying the JSON path expression to a JSON document. The available options are NULL ON ERROR (the default), ERROR ON ERROR and DEFAULT <literal> ON ERROR.
The most common source of an error with JSON_VALUE is that the JSON path expression resolves to something other than a scalar value. JSON, by its nature, is highly flexible, which raises the possibility of having a small number of documents that do not conform to a particular pattern. The NULL ON ERROR behavior ensures that a single error caused by one outlier does not abort an entire operation.
The fourth query shows the default behavior for JSON_VALUE in the event of an error during JSON path evaluation.
select JSON_VALUE(PO_DOCUMENT ,'$.ShippingInstruction.Address')
from J_PURCHASEORDER
where JSON_VALUE(PO_DOCUMENT ,'$.PONumber' returning NUMBER(10)) = 450
In this example, there is an error in the first key in the JSON path expression. The first key should be ShippingInstructions, not ShippingInstruction. Since the JSON path expression does not match a scalar value, the NULL ON ERROR behavior kicks in and the JSON_VALUE operator returns NULL.
The fifth query demonstrates the DEFAULT ON ERROR behavior.
select JSON_VALUE(PO_DOCUMENT, '$.ShippingInstruction.Address' DEFAULT 'N/A' ON ERROR)
from J_PURCHASEORDER
where JSON_VALUE(PO_DOCUMENT ,'$.PONumber' returning NUMBER(10)) = 450
Since the JSON path expression does not evaluate to a scalar value, the DEFAULT ON ERROR behavior kicks in and the JSON_VALUE operator returns “N/A”.
The sixth query demonstrates the ERROR ON ERROR behavior.
select JSON_VALUE(PO_DOCUMENT, '$.ShippingInstruction.Address' ERROR ON ERROR)
from J_PURCHASEORDER
where JSON_VALUE(PO_DOCUMENT ,'$.PONumber' returning NUMBER(10)) = 450
Since the JSON path expression does not evaluate to a scalar value, the ERROR ON ERROR behavior kicks in and the error “ORA-40462: JSON_VALUE evaluated to no value” is raised.
The seventh query is another example of the ERROR ON ERROR behavior.
select JSON_VALUE(PO_DOCUMENT, '$.ShippingInstructions.Address' ERROR ON ERROR)
from J_PURCHASEORDER
where JSON_VALUE(PO_DOCUMENT ,'$.PONumber' returning NUMBER(10)) = 450
In this case, the JSON path expression does evaluate to a key in the document; however the value associated with the key is an object, not a scalar. Since JSON_VALUE can only return a scalar value, when the JSON path expression is evaluated the ERROR ON ERROR behavior kicks in and results in error “ORA-40456: JSON_VALUE evaluated to non-scalar value” being raised.
The eighth query shows that the error handling clauses only apply to runtime errors that arise when the JSON path expression is applied to a set of JSON documents:
select JSON_VALUE(PO_DOCUMENT, '$.ShippingInstructions,Address' NULL ON ERROR)
from J_PURCHASEORDER
where JSON_VALUE(PO_DOCUMENT ,'$.PONumber' returning NUMBER(10)) = 450
In this example, the error is that a comma, rather than a period, has been used as the separator between the first and second keys in the JSON path expression. Despite the fact that NULL ON ERROR semantics were requested, when the statement is executed the error “ORA-40597: JSON path expression syntax error ('$.ShippingInstructions,Address') JZN-00209: Unexpected characters after end of path at position 23” is raised, because an invalid JSON path expression is a compile-time error rather than a runtime error.
JSON_QUERY is the complement of JSON_VALUE. Whereas JSON_VALUE can return only a scalar value, JSON_QUERY can return only an object or an array. Like JSON_VALUE, JSON_QUERY takes two arguments, a JSON column and a JSON path expression. It provides a set of modifiers that allow control over the format in which the data is returned and how to handle any errors encountered while evaluating the JSON path expression. The following examples show the use of JSON_QUERY.
This module demonstrates how to use the JSON_QUERY operator to extract objects and arrays from a JSON document using JSON path expressions.
The first query shows how to return the contents of the key ShippingInstructions:
select JSON_QUERY(PO_DOCUMENT ,'$.ShippingInstructions')
from J_PURCHASEORDER p
where JSON_VALUE(PO_DOCUMENT ,'$.PONumber' returning NUMBER(10)) = 450
Since the contents of ShippingInstructions is an object, the value is returned as a JSON object delimited by curly braces ‘{‘ and ‘}’.
The second query shows how to return the contents of the key LineItems:
select JSON_QUERY(PO_DOCUMENT ,'$.LineItems')
from J_PURCHASEORDER p
where JSON_VALUE(PO_DOCUMENT ,'$.PONumber' returning NUMBER(10)) = 450
Since the contents of LineItems is an array, the value is returned as a JSON array delimited by square brackets ‘[‘ and ‘]’.
By default, for maximum efficiency, objects and arrays are printed with the minimal amount of whitespace. This minimizes the number of bytes needed to represent the value, and is fine when the object will be consumed by a computer. However, sometimes it is necessary for the output of a JSON_QUERY operation to be human readable. JSON_QUERY provides the keyword PRETTY for this purpose.
The third query shows the use of the PRETTY keyword to format the output of the JSON_QUERY operator.
select JSON_QUERY(PO_DOCUMENT ,'$.LineItems' PRETTY)
from J_PURCHASEORDER p
where JSON_VALUE(PO_DOCUMENT ,'$.PONumber' returning NUMBER(10)) = 450
The array is output as neatly indented JSON, making it much easier for a human to understand. However this comes at the cost of increasing the number of bytes needed to represent the content of the array.
The fourth query shows how to use an array offset to access one item from an array of objects:
select JSON_QUERY(PO_DOCUMENT ,'$.LineItems[0]' PRETTY)
from J_PURCHASEORDER p
where JSON_VALUE(PO_DOCUMENT ,'$.PONumber' returning NUMBER(10)) = 450
Since the selected item is a JSON object, it is delimited by curly braces ‘{‘ and ‘}’.
The fifth query shows how to access the value of a key from one specific member of an array of objects.
select JSON_QUERY(PO_DOCUMENT, '$.LineItems[0].Part')
from J_PURCHASEORDER p
where JSON_VALUE(PO_DOCUMENT ,'$.PONumber' returning NUMBER(10)) = 450
Since the contents of Part is an object, the value is returned as a JSON object delimited by curly braces ‘{‘ and ‘}’.
The JSON_QUERY function also provides options for handling errors that might be encountered when applying the JSON path expression to a JSON document. The available options are NULL ON ERROR (the default), ERROR ON ERROR and EMPTY ON ERROR.
The most common source of an error with JSON_QUERY is that the JSON path expression resolves to something other than an object or array. The NULL ON ERROR behavior ensures that a single error caused by one outlier does not abort an entire operation.
The sixth query shows the default behavior for JSON_QUERY in the event of an error during JSON path evaluation.
select JSON_QUERY(PO_DOCUMENT, '$.LineItems[0].Part.UPCCode')
from J_PURCHASEORDER p
where JSON_VALUE(PO_DOCUMENT ,'$.PONumber' returning NUMBER(10)) = 450
In this example the error is that the JSON path expression maps to a scalar value rather than an object or an array. Since JSON_QUERY can only return an object or an array, the NULL ON ERROR handling kicks in and results in the query returning NULL.
The seventh query shows the ERROR ON ERROR behavior for JSON_QUERY in the event of an error during JSON path evaluation.
select JSON_QUERY(PO_DOCUMENT, '$.LineItems[0].Part.UPCCode' ERROR ON ERROR)
from J_PURCHASEORDER p
where JSON_VALUE(PO_DOCUMENT ,'$.PONumber' returning NUMBER(10)) = 450
Since the result of the JSON path expression is a scalar value, the ERROR ON ERROR handling kicks in and results in an “ORA-40480: result cannot be returned without array wrapper” error being raised.
The eighth query shows the use of the EMPTY ON ERROR behavior
select JSON_QUERY(PO_DOCUMENT, '$.LineItems[0].Part.UPCCode' EMPTY ON ERROR)
from J_PURCHASEORDER p
where JSON_VALUE(PO_DOCUMENT ,'$.PONumber' returning NUMBER(10)) = 450
Since the result of the JSON path expression is a scalar value, the EMPTY ON ERROR handling kicks in and the result of the JSON_QUERY operation is an empty array, “[]”.
The ninth query shows how to use the “WITH CONDITIONAL ARRAY WRAPPER” option to avoid a JSON_QUERY error when a JSON path expression evaluates to a scalar value. When this option is specified, a JSON path expression that evaluates to a scalar value is returned as an array. The array consists of one element containing a scalar value.
select JSON_QUERY(PO_DOCUMENT, '$.LineItems[0].Part.UPCCode' WITH CONDITIONAL ARRAY WRAPPER)
from J_PURCHASEORDER p
where JSON_VALUE(PO_DOCUMENT ,'$.PONumber' returning NUMBER(10)) = 450
Since the result of the JSON path expression is a scalar value, JSON_QUERY automatically converts the result into an array with that one item and returns the array.
The tenth query shows how to use WITH ARRAY WRAPPER to force JSON_QUERY to always return the result as an array. In this example, the index is an asterisk, indicating that all members of the LineItems array should be processed, and the last key in the JSON path expression is also an asterisk, indicating that all children of the Part key should be processed.
select JSON_QUERY(PO_DOCUMENT, '$.LineItems[*].Part.*' WITH ARRAY WRAPPER)
from J_PURCHASEORDER p
where JSON_VALUE(PO_DOCUMENT ,'$.PONumber' returning NUMBER(10)) = 450
The result of executing this query is an array with 6 items. The first 3 items come from the Description, UnitPrice and UPCCode associated with the Part key of the first member of the LineItems array. The second 3 come from the Description, UnitPrice and UPCCode associated with the Part key of the second member of the LineItems array. Also note that since Description and UPCCode are strings, and UnitPrice is a numeric, the resulting array is heterogeneous in nature, containing a mixture of string and numeric values.
The JSON_TABLE operator is used in the FROM clause of a SQL statement. It enables the creation of an inline relational view of JSON content. The JSON_TABLE operator uses a set of JSON path expressions to map content from a JSON document into columns in the view. Once the contents of the JSON document have been exposed as columns, all of the power of SQL can be brought to bear on the content of JSON document.
The input to the JSON_TABLE operator is a JSON object or a JSON array. The JSON_TABLE operator is driven by a row pattern which consists of a JSON path expression. The row pattern determines how many rows the JSON_TABLE operator will generate. It will generate 1 row from each key that matches the supplied row pattern. If the row pattern matches a single key then the JSON_TABLE operator will generate exactly one row. If the row pattern references one or more arrays then it will generate a row from each item in the deepest array.
The rows output by a JSON_TABLE operator are laterally joined to the row that generated them. There is no need to supply a WHERE clause to join the output of the JSON_TABLE operator with the table containing the JSON document.
Columns are defined by one or more column patterns that appear after the COLUMNS keyword. The inline view contains 1 column for each entry in the columns clause. Each column pattern consists of a column name, a SQL data type and a JSON path expression. The column name determines the SQL name of the column, the SQL data type determines the data type of the column and the JSON path expression maps a value from the document to the column. The JSON path expressions in the columns clause are relative to the row pattern.
This module shows how to use JSON_TABLE to generate inline relational views of JSON documents.
The first query shows how to project a set of columns from values that occur at most once in the document. The values can come from any level of nesting, as long as they do not come from keys that are part of, or descended from an array, unless an index is used to identify one item in the array.
select M.*
from J_PURCHASEORDER p,
JSON_TABLE(
p.PO_DOCUMENT ,
'$'
columns (
PO_NUMBER NUMBER(10) path '$.PONumber',
REFERENCE VARCHAR2(30 CHAR) path '$.Reference',
REQUESTOR VARCHAR2(32 CHAR) path '$.Requestor',
USERID VARCHAR2(10 CHAR) path '$.User',
COSTCENTER VARCHAR2(16 CHAR) path '$.CostCenter',
TELEPHONE VARCHAR2(16 CHAR) path '$.ShippingInstructions.Phone[0].number'
  )
) M
where PO_NUMBER between 450 and 455
This example generates a view with six columns: PO_NUMBER, REFERENCE, REQUESTOR, USERID, COSTCENTER and TELEPHONE. The use of ‘$’ as the JSON path expression for the row pattern means that the row pattern matches the entire JSON document, so all of the column patterns are evaluated relative to the top-level object.
The second query shows how to work with arrays. In order to expose the contents of an array as a set of rows, the array must be processed using the NESTED PATH syntax. When a JSON_TABLE operator contains a NESTED PATH clause it will output one row for each member of the array referenced by the deepest NESTED PATH clause. The row will contain all of the columns defined by each level of the JSON_TABLE expression.
select D.*
from J_PURCHASEORDER p,
JSON_TABLE(
p.PO_DOCUMENT ,
'$'
columns(
PO_NUMBER NUMBER(10) path '$.PONumber',
REFERENCE VARCHAR2(30 CHAR) path '$.Reference',
REQUESTOR VARCHAR2(32 CHAR) path '$.Requestor',
USERID VARCHAR2(10 CHAR) path '$.User',
COSTCENTER VARCHAR2(16) path '$.CostCenter',
NESTED PATH '$.LineItems[*]'
columns(
ITEMNO NUMBER(16) path '$.ItemNumber',
DESCRIPTION VARCHAR2(32 CHAR) path '$.Part.Description',
UPCCODE VARCHAR2(14 CHAR) path '$.Part.UPCCode',
QUANTITY NUMBER(5,4) path '$.Quantity',
UNITPRICE NUMBER(5,2) path '$.Part.UnitPrice'
)
)
) D
where PO_NUMBER between 450 and 455
The NESTED PATH option simplifies the processing of nested collections. The NESTED PATH clause removes the need to define columns whose sole purpose is to pass a collection from one operator to the next, making it possible to use a “*” for the select list since the collections are no longer emitted by the JSON_TABLE operator.
One common use of JSON_TABLE is to create relational views of JSON content which can then be operated on using standard SQL syntax. This has the advantage of allowing programmers and tools that have no concept of JSON data and JSON path expressions to work with JSON documents that have been stored in the Oracle Database.
In Oracle 12.1.0.2.0 it is highly recommended to avoid joins between these views. A fully expanded detail view will provide much better performance than defining a master view and a detail view and then attempting to write a query that returns a result set consisting of columns selected from both the master view and the detail view.
This module shows how to use JSON_TABLE to create relational views of JSON content that can be queried using standard SQL.
The first statement creates a relational view called PURCHASEORDER_MASTER_VIEW which exposes values that occur at most once in each document.
create or replace view PURCHASEORDER_MASTER_VIEW
as
select m.*
from J_PURCHASEORDER p,
JSON_TABLE(
p.PO_DOCUMENT ,
'$'
columns (
PO_NUMBER NUMBER(10) path '$.PONumber',
REFERENCE VARCHAR2(30 CHAR) path '$.Reference',
REQUESTOR VARCHAR2(128 CHAR) path '$.Requestor',
USERID VARCHAR2(10 CHAR) path '$.User',
COSTCENTER VARCHAR2(16) path '$.CostCenter',
SHIP_TO_NAME VARCHAR2(20 CHAR) path '$.ShippingInstructions.name',
SHIP_TO_STREET VARCHAR2(32 CHAR) path '$.ShippingInstructions.Address.street',
SHIP_TO_CITY VARCHAR2(32 CHAR) path '$.ShippingInstructions.Address.city',
SHIP_TO_COUNTY VARCHAR2(32 CHAR) path '$.ShippingInstructions.Address.county',
SHIP_TO_POSTCODE VARCHAR2(32 CHAR) path '$.ShippingInstructions.Address.postcode',
SHIP_TO_STATE VARCHAR2(2 CHAR) path '$.ShippingInstructions.Address.state',
SHIP_TO_PROVINCE VARCHAR2(2 CHAR) path '$.ShippingInstructions.Address.province',
SHIP_TO_ZIP VARCHAR2(8 CHAR) path '$.ShippingInstructions.Address.zipCode',
SHIP_TO_COUNTRY VARCHAR2(32 CHAR) path '$.ShippingInstructions.Address.country',
SHIP_TO_PHONE VARCHAR2(24 CHAR) path '$.ShippingInstructions.Phones[0].number',
INSTRUCTIONS VARCHAR2(2048 CHAR) path '$.SpecialInstructions'
)
) m
As can be seen from the output of a describe operation, this view looks like any other relational view. The JSON operators and JSON path expressions are hidden away in the DDL statement that created the view.
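For reference, the describe operation referred to here is simply:
describe PURCHASEORDER_MASTER_VIEW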
The second statement creates a relational view called PURCHASEORDER_DETAIL_VIEW that exposes the contents of the LineItems array as a set of rows.
create or replace view PURCHASEORDER_DETAIL_VIEW
as
select D.*
from J_PURCHASEORDER p,
JSON_TABLE(
p.PO_DOCUMENT ,
'$'
columns (
PO_NUMBER NUMBER(10) path '$.PONumber',
REFERENCE VARCHAR2(30 CHAR) path '$.Reference',
REQUESTOR VARCHAR2(128 CHAR) path '$.Requestor',
USERID VARCHAR2(10 CHAR) path '$.User',
COSTCENTER VARCHAR2(16) path '$.CostCenter',
SHIP_TO_NAME VARCHAR2(20 CHAR) path '$.ShippingInstructions.name',
SHIP_TO_STREET VARCHAR2(32 CHAR) path '$.ShippingInstructions.Address.street',
SHIP_TO_CITY VARCHAR2(32 CHAR) path '$.ShippingInstructions.Address.city',
SHIP_TO_COUNTY VARCHAR2(32 CHAR) path '$.ShippingInstructions.Address.county',
SHIP_TO_POSTCODE VARCHAR2(10 CHAR) path '$.ShippingInstructions.Address.postcode',
SHIP_TO_STATE VARCHAR2(2 CHAR) path '$.ShippingInstructions.Address.state',
SHIP_TO_PROVINCE VARCHAR2(2 CHAR) path '$.ShippingInstructions.Address.province',
SHIP_TO_ZIP VARCHAR2(8 CHAR) path '$.ShippingInstructions.Address.zipCode',
SHIP_TO_COUNTRY VARCHAR2(32 CHAR) path '$.ShippingInstructions.Address.country',
SHIP_TO_PHONE VARCHAR2(24 CHAR) path '$.ShippingInstructions.Phones[0].number',
INSTRUCTIONS VARCHAR2(2048 CHAR) path '$.SpecialInstructions',
NESTED PATH '$.LineItems[*]'
columns (
ITEMNO NUMBER(38) path '$.ItemNumber',
DESCRIPTION VARCHAR2(256 CHAR) path '$.Part.Description',
UPCCODE VARCHAR2(14 CHAR) path '$.Part.UPCCode',
QUANTITY NUMBER(12,4) path '$.Quantity',
UNITPRICE NUMBER(14,2) path '$.Part.UnitPrice'
)
)
) D
/
The following statements show how, once the relational views have been created, the full power of SQL can now be applied to JSON content, without requiring any knowledge of the structure of the JSON or how to manipulate JSON using SQL.
select SHIP_TO_STREET, SHIP_TO_CITY, SHIP_TO_STATE, SHIP_TO_ZIP
from PURCHASEORDER_MASTER_VIEW
where PO_NUMBER = 450
select PO_NUMBER, REFERENCE, SHIP_TO_PHONE, DESCRIPTION, QUANTITY, UNITPRICE
from PURCHASEORDER_DETAIL_VIEW
where UPCCODE = '27616854773'
select PO_NUMBER, REFERENCE, SHIP_TO_PHONE, QUANTITY, DESCRIPTION, UNITPRICE
from PURCHASEORDER_DETAIL_VIEW
where UPCCODE in ('27616854773', '56775053895', '75993852820')
order by PO_NUMBER
select COSTCENTER, count(*)
From PURCHASEORDER_MASTER_VIEW
group by COSTCENTER
order by COSTCENTER
select COSTCENTER, sum (QUANTITY * UNITPRICE) TOTAL_VALUE
from PURCHASEORDER_DETAIL_VIEW
group by COSTCENTER
select PO_NUMBER, REFERENCE, INSTRUCTIONS, ITEMNO, UPCCODE, DESCRIPTION, QUANTITY, UNITPRICE
from PURCHASEORDER_DETAIL_VIEW d
where REQUESTOR = 'Steven King'
and QUANTITY > 7
and UNITPRICE > 25.00
select UPCCODE, count(*) "Orders", Quantity "Copies"
from PURCHASEORDER_DETAIL_VIEW
where UPCCODE in ('27616854773', '56775053895', '75993852820')
group by rollup(UPCCODE, QUANTITY)
select UPCCODE, PO_NUMBER, REFERENCE, QUANTITY, QUANTITY - LAG(QUANTITY,1,QUANTITY) over (ORDER BY PO_NUMBER) as DIFFERENCE
from PURCHASEORDER_DETAIL_VIEW
where UPCCODE = '27616854773'
order by PO_NUMBER DESC
Relational views effectively provide schema-on-query semantics, making it possible to define multiple views of the JSON content. Application developers are still free to evolve the content of the JSON as required. New variants of the JSON can be stored without affecting applications that rely on the existing views.
As can be seen from the queries used, all the power of the SQL language can be used to access the JSON data. These views allow developers and, more importantly, tools that understand only the relational paradigm to work with JSON content.
The JSON_EXISTS operator is used in the WHERE clause of a SQL statement. It is used to test whether or not a JSON document contains content that matches the provided JSON path expression.
The JSON_EXISTS operator takes two arguments, a JSON column and a JSON path expression. It returns TRUE if the document contains a key that matches the JSON path expression, FALSE otherwise. JSON_EXISTS provides a set of modifiers that provide control over how to handle any errors encountered while evaluating the JSON path expression.
The following examples show the use of JSON_EXISTS. The first query counts the number of JSON documents that contain a ShippingInstructions key with an Address key that contains a state key:
select count(*)
from J_PURCHASEORDER
where JSON_EXISTS(PO_DOCUMENT ,'$.ShippingInstructions.Address.state')
JSON_EXISTS makes it possible to differentiate between a document where a key is not present and a document where the key is present but the value is null. The next three queries show the difference between using JSON_EXISTS to test for the presence of a key and using JSON_VALUE to check if a key is null or empty.
The first query simply shows the set of possible values for the county key in the Address object.
select JSON_VALUE(PO_DOCUMENT ,'$.ShippingInstructions.Address.county'),
count(*)
from J_PURCHASEORDER
group by JSON_VALUE(PO_DOCUMENT ,'$.ShippingInstructions.Address.county')
The results show that there are two possible values for the county key; it is either NULL or contains the value “Oxon.”. The problem with this is that, for the NULL values, we cannot differentiate between cases where JSON_VALUE returned NULL because the key did not exist in the document and the cases where it returned NULL because the key was present but the value contains a null or empty value.
With JSON_EXISTS it is possible to differentiate between the case where the key is not present and the case where the key has a null or empty value. The following query shows how this is done:
select JSON_VALUE(PO_DOCUMENT ,'$.ShippingInstructions.Address.county'),
count(*)
from J_PURCHASEORDER
where JSON_EXISTS(PO_DOCUMENT ,'$.ShippingInstructions.Address.county')
group by JSON_VALUE(PO_DOCUMENT ,'$.ShippingInstructions.Address.county')
The results show that there are 13 documents where the key is present but empty.
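Conversely, negating JSON_EXISTS makes it possible to count the documents where the county key is not present at all. The following query is a sketch of this approach:
select count(*)
from J_PURCHASEORDER
where not JSON_EXISTS(PO_DOCUMENT ,'$.ShippingInstructions.Address.county')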
Starting with Oracle Database 12c Release 2, the JSON path expression used with JSON_EXISTS supports predicates. This significantly extends the power of JSON_EXISTS by enabling JSON path expressions to include conditions on the value of a key, as well as on the presence of a key.
Predicates are specified by adding a ‘?’ following the parent key and then specifying conditions in parentheses. Within the predicate, the @ sign is used to reference the parent key. A variable is indicated by a name prefixed with a $ sign. The value of the variable is set using the PASSING clause of the JSON_EXISTS operator. This allows the value to be supplied using a bind variable when developing applications.
The following statement shows a very simple example of using a predicate in a JSON path expression. In this example, a predicate is placed on the value of the PONumber key. PONumber is a member of the top-level JSON object.
select j.PO_DOCUMENT
from J_PURCHASEORDER j
where JSON_EXISTS(
PO_DOCUMENT,
'$?(@.PONumber == $PO_NUMBER)'
passing 450 as "PO_NUMBER"
)
The predicate in this case is “@.PONumber == $PO_NUMBER”. Since the PONumber key is part of the top-level object, the JSON path expression in this example starts with a ‘$’. The presence of the predicate is indicated by adding “?()” to the JSON path expression. The predicate is then placed inside the parentheses.
The value of the variable PO_NUMBER is supplied using the PASSING clause of the JSON_EXISTS operator.
In the next example a predicate is placed on the value of the UPCCode key. UPCCode occurs inside Part which in turn is one of the keys found in the objects that make up the LineItems array.
select j.PO_DOCUMENT.PONumber PO_NUMBER
from J_PURCHASEORDER j
where JSON_EXISTS(
PO_DOCUMENT,
'$?(@.LineItems.Part.UPCCode == $UPC)'
passing 27616854773 as "UPC"
)
The predicate in this case is “@.LineItems.Part.UPCCode == $UPC”. Since LineItems contains an array, the condition will be met if any member of the array contains a Part object with a UPCCode key whose value matches the specified condition.
The next statement shows how to provide multiple predicates and how to perform a numeric range query. The first condition is on the value of the User key. User is member of the top-level object. The second condition is on the value of the Quantity key. Quantity occurs in each of the objects that make up the array LineItems. The two conditions are linked using the ‘&&’ operator. The ‘>’ conditional in the second condition indicates that a range search is required.
select count(*)
from J_PURCHASEORDER j
where JSON_EXISTS(
PO_DOCUMENT,
'$?(@.User == $USER && @.LineItems.Quantity > $QUANTITY)'
passing 'AKHOO' as "USER", 8 as "QUANTITY"
)
In the next example the predicate contains two conditions.
The first condition is on the value of the User key. The second condition is an existence check on the contents of the LineItems array. The two conditions are linked using the ‘&&’ operator.
The exists operator is used when working with arrays. The exists operation returns true if a single member of the array contains an object that satisfies all of the conditions specified by the supplied path expression. It returns false otherwise.
Without the exists operator, the document could be selected even though no single member of the array satisfies all of the conditions specified in the path expression. As long as each of the individual conditions specified in the path expression is satisfied by at least one member of the array, the document would be selected. The following example uses a nested predicate to check whether any member of the array contains a Part object where the value of the UPCCode key matches the value specified using the variable UPC and a Quantity key whose value is greater than the value specified using the variable QUANTITY. If any member of the array satisfies the supplied predicates, the exists operator returns true and the document is included in the result set.
select j.PO_DOCUMENT.PONumber PO_NUMBER
from J_PURCHASEORDER j
where JSON_EXISTS(
PO_DOCUMENT,
'$?(
@.User == $USER
&&
exists(
@.LineItems?(@.Part.UPCCode == $UPC && @.Quantity > $QUANTITY)
)
)'
passing 'AKHOO' as "USER",
17153100211 as "UPC",
8 as "QUANTITY"
)
/
Oracle Database supports multiple ways of indexing JSON documents. There are two approaches to indexing JSON content. The first is to create functional indexes based on the JSON_VALUE operator. The second is to leverage Oracle’s full-text indexing capabilities to create an inverted list index on the JSON documents.
Indexing scalar values using JSON_VALUE
JSON_VALUE can be used to create an index on any key in a JSON document, as long as the following two conditions are met: the JSON path expression identifies a scalar value, and the key occurs at most once in each document.
Indexes created using JSON_VALUE can be conventional B-TREE indexes or BITMAP indexes.
The first statement shows how to create a unique B-Tree index on the contents of the PONumber key.
create unique index PO_NUMBER_IDX
on J_PURCHASEORDER (
JSON_VALUE(
PO_DOCUMENT,'$.PONumber' returning NUMBER(10) ERROR ON ERROR NULL ON EMPTY
)
)
As can be seen, when a query includes a predicate on the expression that was used to create the index, the index is used to satisfy the predicate.
select PO_DOCUMENT
from J_PURCHASEORDER j
where JSON_VALUE(
PO_DOCUMENT,'$.PONumber' returning NUMBER(10) ERROR ON ERROR
) = 123
/
explain plan for
select PO_DOCUMENT
from J_PURCHASEORDER j
where JSON_VALUE(
PO_DOCUMENT,'$.PONumber' returning NUMBER(10) ERROR ON ERROR
) = 123
/
select * from TABLE(DBMS_XPLAN.DISPLAY)
/
The second statement shows how to create a BITMAP index on the contents of the CostCenter key
create bitmap index COSTCENTER_IDX
on J_PURCHASEORDER (JSON_VALUE(PO_DOCUMENT ,'$.CostCenter'))
The third statement shows how to create a B-Tree index on the contents of the zipCode key
create index ZIPCODE_IDX
on J_PURCHASEORDER(
JSON_VALUE(
PO_DOCUMENT,
'$.ShippingInstructions.Address.zipCode'
returning NUMBER(5)
)
)
Note the use of the returning clause in the JSON_VALUE operator to create a numeric index.
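As with the PONumber index, a query whose predicate uses the same JSON_VALUE expression can be satisfied by this index. The following query is a sketch; the zip code value shown is illustrative rather than taken from the sample data.
select count(*)
from J_PURCHASEORDER
where JSON_VALUE(
        PO_DOCUMENT,
        '$.ShippingInstructions.Address.zipCode'
        returning NUMBER(5)
      ) = 99236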
In addition to creating indexes on specific nodes, Oracle Database also supports creating a JSON search index. A JSON search index allows a single index to optimize JSON path operations on any key. A JSON search index requires no advance knowledge of the structure of the JSON documents being indexed, or of the JSON path expressions that will be used when searching.
In Oracle Database 12c release 2 the DDL statements required to create a JSON search index were significantly simplified. The following statement will create the JSON Search Index.
create search Index JSON_SEARCH_INDEX
on J_PURCHASEORDER (PO_DOCUMENT) for json
The new “CREATE SEARCH INDEX” syntax makes creating a JSON search index much more like creating any other kind of index. It is no longer necessary to create preferences, or to provide information about section groups and lexers, when creating a search index. The parameters clause may still be used to override the default settings of certain configurable options, such as when to sync the index.
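For illustration only, the following sketch shows how the parameters clause could be used to request that the index be synchronized on commit (the index created above would first need to be dropped; the same SYNC setting is used later in this tutorial):
drop index JSON_SEARCH_INDEX
/
create search index JSON_SEARCH_INDEX
on J_PURCHASEORDER (PO_DOCUMENT) for json
parameters('SYNC (ON COMMIT)')
/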
As well as being much easier to create, in Oracle Database 12c Release 2 the JSON search index received a number of significant enhancements. These enhancements make it a much more general solution for indexing JSON documents.
The JSON dataguide allows you to discover information about the structure and content of JSON documents stored in Oracle Database. This information can be used in a number of ways, including discovering the structure of existing documents and generating relational views that expose their content.
In order to generate a JSON dataguide we need to perform an analysis of the JSON content that the database is managing. The following statements show how to create a dataguide on a set of JSON documents that have been stored in an Oracle Database, and how to use the dataguide to discover metadata about the documents stored in the database. It also shows how to use the dataguide to create relational views that expose the content of the JSON documents in a manner that enables SQL-based reporting and analytics.
There are two ways to create a dataguide. The first is to use the json_dataguide aggregate operator.
select json_dataguide(po_document) from j_purchaseorder
Just like any other aggregate operator, json_dataguide scans the column containing the JSON documents and builds the dataguide on the fly. Users can add a where clause or a sampling clause to the statement.
Where clause:
select json_dataguide(po_document) from j_purchaseorder where JSON_VALUE(PO_DOCUMENT ,'$.PONumber' returning NUMBER(10)) <= 450
Sampling clause to reduce the cost of the operation on a large table:
select json_dataguide(po_document) from j_purchaseorder sample (20)
json_dataguide returns a flat-format dataguide by default: a listing of all of the possible JSON path expressions that can be used to access the content stored in the JSON documents. Users can change the returned dataguide format to JSON Schema by passing the argument dbms_json.format_hierarchical to the operator.
select json_dataguide(po_document, dbms_json.format_hierarchical) from j_purchaseorder
Users can also pass the dbms_json.pretty argument to pretty-print the dataguide with indentation.
select json_dataguide(po_document, dbms_json.format_hierarchical, dbms_json.pretty) from j_purchaseorder
Once the JSON dataguide has been created, we can use it to understand the structure of the JSON documents that have been stored in the column that has been indexed. A new PL/SQL package, DBMS_JSON, has been introduced to assist with this.
The next statement shows how to use the create_view procedure to create a relational view based on the information in the dataguide. The procedure takes 4 arguments: The name of the view, the name of the underlying table, the name of the column and the hierarchical formatted dataguide from json_dataguide operator.
declare
dg clob;
begin
select json_dataguide(po_document, dbms_json.format_hierarchical) into dg from j_purchaseorder;
dbms_json.create_view('J_PURCHASEORDER_VIEW1', 'J_PURCHASEORDER', 'PO_DOCUMENT', dg, resolveNameConflicts=>TRUE);
end;
/
Starting from 19c, the create_view procedure can automatically resolve view name conflicts if users specify "resolveNameConflicts=>TRUE" in the procedure call; it can also take a "path" argument to expand only the given nested collection path. Please refer to create_view_on_path below for a more detailed explanation.
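For illustration, a hedged sketch of such a call is shown below. The "path" parameter name, the view name and the '$.LineItems[*]' path are assumptions based on the description above rather than output taken from this tutorial.
declare
dg clob;
begin
select json_dataguide(po_document, dbms_json.format_hierarchical) into dg from j_purchaseorder;
-- The "path" parameter name is an assumption based on the description above
dbms_json.create_view('J_PURCHASEORDER_VIEW2', 'J_PURCHASEORDER', 'PO_DOCUMENT', dg, resolveNameConflicts=>TRUE, path=>'$.LineItems[*]');
end;
/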
The other way to create a dataguide is to use the JSON search index infrastructure. The syntax used to create the dataguide is similar to the syntax used to create a search index.
drop index JSON_SEARCH_INDEX
/
create search index JSON_SEARCH_INDEX on J_PURCHASEORDER (PO_DOCUMENT) for json parameters('search_on none dataguide on');
This example creates a basic JSON dataguide that will keep track of all of the keys used by all of the documents stored in the PO_DOCUMENT column of table J_PURCHASEORDER.
Since the dataguide is based on the JSON search index infrastructure, we have the option of using a single index to drive both the dataguide and search operations. However, if we only require the dataguide functionality, we can avoid the overhead associated with maintaining the search index by using the parameters clause of the create search index statement to specify the options “search_on none” and “dataguide on”. Compared to json_dataguide, this approach automatically maintains the dataguide as JSON documents with new fields are inserted. On the other hand, it increases DML overhead.
The method get_index_dataguide() can return the dataguide stored in the JSON search index in two formats, just like the json_dataguide operator. The first provides a flat listing of all of the possible JSON path expressions that can be used to access the content stored in the JSON documents. The second uses JSON Schema (http://json-schema.org/documentation.html) to represent the information. Both representations also provide information about the data type and size of the value associated with each JSON path expression.
The next statement shows how to use the get_index_dataguide method to generate a JSON Schema document that describes the documents analyzed by the dataguide.
select DBMS_JSON.GET_INDEX_DATAGUIDE('J_PURCHASEORDER', 'PO_DOCUMENT', DBMS_JSON.FORMAT_HIERARCHICAL, DBMS_JSON.PRETTY ) "HIERARCHICAL DATA GUIDE" from dual;
The next statement shows how to request the flattened representation of the dataguide.
select DBMS_JSON.GET_INDEX_DATAGUIDE('J_PURCHASEORDER', 'PO_DOCUMENT', DBMS_JSON.FORMAT_FLAT, DBMS_JSON.PRETTY ) "FLAT DATA GUIDE" from dual;
The next statement shows how to use the create_view_on_path procedure to create a relational view based on the information in the dataguide. The procedure takes 4 arguments: The name of the view, the name of the underlying table, the name of the column and a JSON path expression that identifies which JSON path expressions are to be included in the generated view.
begin
DBMS_JSON.CREATE_VIEW_ON_PATH( 'J_PURCHASEORDER_VIEW', 'J_PURCHASEORDER', 'PO_DOCUMENT', '$' );
end;
In this example the JSON path expression supplied to the procedure is ‘$’. This means that the view will contain one column for each JSON path expression that can occur in the set of documents that have been analyzed. The view will expand all possible nested collections and contain one row for each member of the deepest nested arrays. If the document contains more than one hierarchy, right outer join logic is used to ensure that each of the hierarchies present is represented by rows in the view.
The names of the columns in the view are generated from the JSON path expressions that provide the content of the column. Name mangling will occur when the path expression cannot be converted directly into a valid SQL column name. This can lead to some less than intuitive column names. In order to avoid this, package DBMS_JSON provides a procedure called rename_column that can be used to associate a user-friendly name with a JSON path expression.
The next statement shows how to use the rename_column procedure to provide user-friendly names for some of the columns derived from the PurchaseOrder documents. The procedure takes five arguments: the name of the table, the name of the column that contains the JSON documents, the JSON path expression, the data type that should be used for the value identified by the JSON path expression, and the name that should be used for any column based on the JSON path.
begin
DBMS_JSON.RENAME_COLUMN( 'J_PURCHASEORDER','PO_DOCUMENT', '$.PONumber',DBMS_JSON.TYPE_NUMBER,'PO_NUMBER');
DBMS_JSON.RENAME_COLUMN( 'J_PURCHASEORDER','PO_DOCUMENT', '$.Reference',DBMS_JSON.TYPE_STRING,'REFERENCE');
DBMS_JSON.RENAME_COLUMN( 'J_PURCHASEORDER','PO_DOCUMENT', '$.LineItems.Part.UPCCode',DBMS_JSON.TYPE_NUMBER, 'UPC_CODE');
end;
Once user-friendly names have been associated with the JSON path expressions, creating or recreating the view using procedure create_view_on_path will result in the creation of views which use the user-friendly column names.
begin
DBMS_JSON.CREATE_VIEW_ON_PATH( 'J_PURCHASEORDER_VIEW', 'J_PURCHASEORDER', 'PO_DOCUMENT', '$' );
end;
The next statement shows that views created using this technique function just like any other relational view. This allows the full power of the Oracle SQL language to be used to query JSON content and ensure that tools and programmers who only understand relational techniques can work with content stored as JSON in an Oracle database.
select REFERENCE from J_PURCHASEORDER_VIEW where PO_NUMBER between 500 and 504
JSON Dataguide and virtual columns
Virtual columns can also be used to expose JSON content in a relational manner. The virtual columns are appended to the table that includes the column containing the JSON content. Each virtual column can expose the value identified by a JSON path expression, as long as the JSON path expression identifies a key that can occur at most once in each document. This section of the Hands-on-Lab shows how to create a JSON dataguide that automatically appends virtual columns to a table containing JSON documents as new JSON path expressions are detected by the dataguide.
The following statements show how to use a dataguide that will expose JSON content using Virtual Columns.
The second statement shows how to modify the parameters clause of the create search index statement to create a JSON search index that will automatically add Virtual columns to the base table. Note the use of the “on change add_vc” clause rather than the “dataguide on” clause.
DROP INDEX JSON_SEARCH_INDEX
/
create search index JSON_SEARCH_INDEX on J_PURCHASEORDER (PO_DOCUMENT) for json parameters('sync (on commit) SEARCH_ON NONE DATAGUIDE ON CHANGE ADD_VC');
The third statement shows that after the dataguide has been created the base table has been modified. A Virtual Column has been appended to the base table for each JSON path expression that references a leaf level key that occurs no more than once per document.
describe J_PURCHASEORDER
This technique cannot be used with keys that occur within an array, so in this example no columns are generated for JSON path expressions that include the keys Phone and LineItems, since the contents of these keys are arrays.
The fourth statement shows that you can use SQL to query the Virtual Columns added by the dataguide just like any other column in the table.
select "PO_DOCUMENT$PONumber", "PO_DOCUMENT$User" from J_PURCHASEORDER where "PO_DOCUMENT$PONumber" between 500 and 504
The fifth statement shows what happens when you insert a JSON document that contains keys that were not present in any of the documents stored in the table when the dataguide was created.
insert into J_PURCHASEORDER (ID, DATE_LOADED, PO_DOCUMENT) values (SYS_GUID(),SYSTIMESTAMP,'{"NewKey1":1,"NewKey2":"AAAA"}');
commit;
As can be seen the insert succeeded, as would be expected since the document contained valid JSON. However, if we describe the table after the insert has been committed, and after the index is synchronized (which happens automatically since the “sync (on commit)” option was specified as one of the parameters passed to the create search index statement), we see that the dataguide has automatically appended Virtual Columns corresponding to the new keys to the base table.
describe J_PURCHASEORDER
The last statement shows that the new columns can be queried just like any other columns.
select "PO_DOCUMENT$NewKey1" KEY, "PO_DOCUMENT$NewKey2" VALUE from J_PURCHASEORDER where "PO_DOCUMENT$NewKey1" = 1
Oracle Database includes support for new SQL/JSON operators that allow JSON documents to be generated directly from a SQL statement without having to resort to error-prone string concatenation techniques. These operators are JSON_OBJECT, JSON_OBJECTAGG, JSON_ARRAY and JSON_ARRAYAGG.
The first statement shows how to use JSON_ARRAY. JSON_ARRAY returns each row of data generated by the SQL query as a JSON Array. The input to JSON_ARRAY is a list of columns. Each column will be mapped to one member of the array.
select JSON_ARRAY(
DEPARTMENT_ID, DEPARTMENT_NAME, MANAGER_ID, LOCATION_ID
) DEPARTMENT
from HR.DEPARTMENTS
where DEPARTMENT_ID < 110;
The second statement shows how to use JSON_OBJECT. JSON_OBJECT returns each row of data generated by the SQL Query as a JSON object. The generated object contains a key:value pair for each column referenced in the JSON_OBJECT operator. The name of the key can be supplied using a SQL string or a column.
SELECT JSON_OBJECT(
'departmentId' IS d.DEPARTMENT_ID,
'name' IS d.DEPARTMENT_NAME,
'manager' IS d.MANAGER_ID,
'location' IS d.LOCATION_ID
) DEPARTMENT
from HR.DEPARTMENTS d
where DEPARTMENT_ID < 110;
The third statement shows how the value of a column can be used to generate the name of the key.
select JSON_OBJECT(OBJECT_TYPE is OBJECT_NAME, 'Status' is STATUS)
from USER_OBJECTS
where ROWNUM < 10;
The fourth statement shows how to nest JSON_OBJECT operators to generate nested JSON structures. In this example a nested JSON_OBJECT operator is used to generate the object associated with the “manager” key. The object contains information about the department’s manager.
SELECT JSON_OBJECT(
'departmentId' IS d.DEPARTMENT_ID,
'name' IS d.DEPARTMENT_NAME,
'manager' is JSON_OBJECT(
'employeeId' is EMPLOYEE_ID,
'firstName' is FIRST_NAME,
'lastName' is LAST_NAME,
'emailAddress' is EMAIL
),
'location' is d.LOCATION_ID
) DEPT_WITH_MGR
from HR.DEPARTMENTS d, HR.EMPLOYEES e
where d.MANAGER_ID is not null
and d.MANAGER_ID = e.EMPLOYEE_ID
and d.DEPARTMENT_ID = 10;
The fifth statement shows how to use the JSON_ARRAYAGG operator to generate JSON arrays from the results of a subquery that returns multiple rows. In this example we are showing the employees for each department as a JSON array inside the department document.
select JSON_OBJECT(
'departmentId' is d.DEPARTMENT_ID,
'name' is d.DEPARTMENT_NAME,
'employees' is(
select JSON_ARRAYAGG(
JSON_OBJECT(
'employeeId' is EMPLOYEE_ID,
'firstName' is FIRST_NAME,
'lastName' is LAST_NAME,
'emailAddress' is EMAIL
)
)
from HR.EMPLOYEES e
where e.DEPARTMENT_ID = d.DEPARTMENT_ID
)
) DEPT_WITH_EMPLOYEES
from HR.DEPARTMENTS d
where DEPARTMENT_NAME = 'Executive';
The sixth statement shows the contents of view EMPLOYEE_KEY_VALUE. This view stores information about employees using a Key:Value pair storage model.
select ID, KEY, VALUE
from EMPLOYEE_KEY_VALUE
where ID < 105
order by ID, KEY;
The seventh statement shows the use of JSON_OBJECTAGG. JSON_OBJECTAGG makes it easy to generate a JSON object from data that has been stored using Key, Value pair storage.
select JSON_OBJECTAGG(KEY,VALUE)
from EMPLOYEE_KEY_VALUE
where ID < 105
group by ID;
Oracle Database provides new PL/SQL APIs for manipulating JSON documents stored in an Oracle Database. These APIs make it possible to add, update and remove key:value pairs. The APIs allow you to navigate the structure of a JSON document, in a manner that is very similar to the way in which the XML Document Object Model (DOM) enables navigation of an XML document.
The PL/SQL JSON API consists of three PL/SQL objects, JSON_ELEMENT_T, JSON_OBJECT_T and JSON_ARRAY_T. JSON_OBJECT_T and JSON_ARRAY_T extend JSON_ELEMENT_T, so they inherit all of JSON_ELEMENT_T’s methods.
Any JSON document can be represented by combining instances of these objects. All PL/SQL operators that return JSON content return an instance of JSON_ELEMENT_T. The JSON_ELEMENT_T instance can be cast to either JSON_OBJECT_T or JSON_ARRAY_T before further operations are performed on the data.
Let's start off by taking a look at the new PL/SQL objects that are used when processing JSON documents.
--
-- desc JSON_ELEMENT_T
--
select xdburitype('/public/tutorials/json/describe/JSON_ELEMENT_T.log').getClob() "DESCRIBE JSON_ELEMENT_T" from dual
/
JSON_ELEMENT_T provides a number of important methods. The most important are PARSE and STRINGIFY. PARSE is used to convert textual JSON into an instance of JSON_ELEMENT_T. STRINGIFY is used to generate textual JSON from a PL/SQL JSON_ELEMENT_T object.
JSON_ELEMENT_T also provides a set of methods for extracting the value part of a key:value pair as a PL/SQL data type, and methods for determining what the actual concrete type of any given instance of JSON_ELEMENT_T is.
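As a brief illustration, the following sketch (the JSON literal is invented for the example) uses PARSE to build a JSON_ELEMENT_T instance from text, checks its concrete type, and then uses STRINGIFY to serialize it back to text:
declare
V_ELEMENT JSON_ELEMENT_T;
begin
-- Convert textual JSON into a PL/SQL object representation
V_ELEMENT := JSON_ELEMENT_T.parse('{"PONumber":9999,"Requestor":"Jane Doe"}');
-- Determine the concrete type of the element
if (V_ELEMENT.is_Object()) then
DBMS_OUTPUT.PUT_LINE('The element is an object');
end if;
-- Serialize the object representation back to textual JSON
DBMS_OUTPUT.PUT_LINE(V_ELEMENT.stringify());
end;
/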
Now let's take a look at the PL/SQL object JSON_OBJECT_T
--
-- desc JSON_OBJECT_T
--
select xdburitype('/public/tutorials/json/describe/JSON_OBJECT_T.log').getClob() "DESCRIBE JSON_OBJECT_T" from dual
/
In addition to the methods inherited from JSON_ELEMENT_T, JSON_OBJECT_T provides methods for manipulating the keys associated with a JSON object. This includes getters and setters for all of the data types supported by the API and methods for listing, adding, removing and renaming keys.
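The following sketch illustrates a few of these methods. The sample object, key names and values are invented for the example; get_String, put and stringify appear elsewhere in this tutorial, while remove is a standard JSON_OBJECT_T method.
declare
V_OBJECT JSON_OBJECT_T;
begin
V_OBJECT := JSON_OBJECT_T.parse('{"Requestor":"Jane Doe","CostCenter":"A50"}');
-- Read the value of a key as a PL/SQL string
DBMS_OUTPUT.PUT_LINE(V_OBJECT.get_String('Requestor'));
-- Add a new key:value pair and remove an existing key
V_OBJECT.put('Priority','HIGH');
V_OBJECT.remove('CostCenter');
DBMS_OUTPUT.PUT_LINE(V_OBJECT.stringify());
end;
/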
Next we'll look at the PL/SQL object JSON_ARRAY_T.
--
-- desc JSON_ARRAY_T
--
select xdburitype('/public/tutorials/json/describe/JSON_ARRAY_T.log').getClob() "DESCRIBE JSON_ARRAY_T" from dual
/
In addition to the methods inherited from JSON_ELEMENT_T, JSON_ARRAY_T provides methods for working with the contents of a JSON array. This includes methods for counting the number of items in an array, accessing an item of the array, adding an item to the array and removing an item from the array.
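The following sketch illustrates some of these methods. The sample array and values are invented for the example; get_size and get appear elsewhere in this tutorial, while append is a standard JSON_ARRAY_T method.
declare
V_ARRAY JSON_ARRAY_T;
V_ELEMENT JSON_ELEMENT_T;
begin
V_ARRAY := JSON_ARRAY_T.parse('["Pen","Pencil"]');
-- Add an item to the end of the array
V_ARRAY.append('Eraser');
-- Iterate over the items; array positions are zero based
for i in 0 .. V_ARRAY.get_size() - 1 loop
V_ELEMENT := V_ARRAY.get(i);
DBMS_OUTPUT.PUT_LINE(V_ELEMENT.stringify());
end loop;
end;
/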
The next section of the Hands-on-Lab shows how we can use the PL/SQL API for JSON to access and update JSON content stored in Oracle Database without having to resort to complex and error-prone string manipulation techniques. In this example we are going to update the value of the UPCCode key for any object where the current value of the UPCCode key is '17153100211' and add a DateModified key to the affected objects.
This code sample shows how to use the new PL/SQL API for JSON to access and update JSON content stored in an Oracle Database.
The first statement uses JSON_TABLE to show the current values of the keys PONumber, ItemNumber, and DateModified from documents where the LineItems array contains an object with a UPCCode key whose value is 17153100211. Since the UPCCode predicate is specified as a SQL condition on the output of the JSON_TABLE operator, only rows generated from objects which satisfy the UPCCode condition are included in the result set.
select LI.*
from J_PURCHASEORDER j,
JSON_TABLE(
J.PO_DOCUMENT,
'$'
columns(
PO_NUMBER NUMBER(5) PATH '$.PONumber',
NESTED PATH '$.LineItems[*]'
columns(
ITEM_NUMBER NUMBER(4) PATH '$.ItemNumber',
UPC_CODE VARCHAR2(14) PATH '$.Part.UPCCode',
UPDATED VARCHAR2(38) PATH '$.Part.DateModified'
)
)
) LI
where UPC_CODE = '17153100211'
The next statement shows a PL/SQL block that selects each document where a member of the LineItems array contains a Part object with a UPCCode of 17153100211, parses the document into PL/SQL objects, updates the UPCCode of the matching Part objects to 9999999999, adds a DateModified key to those Part objects, and writes the modified document back to the J_PURCHASEORDER table.
declare
cursor getDocuments
is
select PO_DOCUMENT
from J_PURCHASEORDER j
where JSON_EXISTS(
PO_DOCUMENT,
'$?(@.LineItems.Part.UPCCode == $UPC)'
passing 17153100211 as "UPC"
)
for UPDATE;
V_RESULT JSON_ELEMENT_T;
V_DOCUMENT_OBJECT JSON_OBJECT_T;
V_LINEITEMS_ARRAY JSON_ARRAY_T;
V_LINEITEM_OBJECT JSON_OBJECT_T;
V_PART_OBJECT JSON_OBJECT_T;
V_NEW_DOCUMENT VARCHAR2(4000);
begin
for doc in getDocuments loop
V_RESULT := JSON_ELEMENT_T.parse(doc.PO_DOCUMENT);
V_DOCUMENT_OBJECT := treat (V_RESULT as JSON_OBJECT_T);
V_RESULT := V_DOCUMENT_OBJECT.get('LineItems');
if (V_RESULT.is_array()) then
V_LINEITEMS_ARRAY := treat ( V_RESULT as JSON_ARRAY_T);
for i in 1..V_LINEITEMS_ARRAY.GET_SIZE() loop
V_RESULT := V_LINEITEMS_ARRAY.get(i-1);
if (V_RESULT.is_Object()) then
V_LINEITEM_OBJECT := treat (V_RESULT as JSON_OBJECT_T);
V_PART_OBJECT := treat(V_LINEITEM_OBJECT.get('Part')
as JSON_OBJECT_T);
if (V_PART_OBJECT.get_String('UPCCode') = '17153100211') then
V_PART_OBJECT.put('UPCCode','9999999999');
V_PART_OBJECT.put('DateModified',to_char(SYSTIMESTAMP,
'YYYY-MM-DD"T"HH24:MI:SSTZH:TZM'));
end if;
end if;
end loop;
end if;
V_NEW_DOCUMENT := V_DOCUMENT_OBJECT.stringify();
update J_PURCHASEORDER
set PO_DOCUMENT = V_NEW_DOCUMENT
where current of getDocuments;
end loop;
commit;
end;
The next statement shows that, after the PL/SQL block has executed, no documents exist where the UPCCode key contains the value 17153100211.
select count(*)
from J_PURCHASEORDER j
where JSON_EXISTS(
PO_DOCUMENT,'$?(@.LineItems.Part.UPCCode == $UPC)'
passing 17153100211 as "UPC"
)
The final statement shows that the objects that satisfied the predicate UPCCODE = 17153100211 now satisfy the predicate UPCCODE = 9999999999 and that each of these objects now has a value associated with the key DateModified.
select LI.*
from J_PURCHASEORDER j,
JSON_TABLE(
J.PO_DOCUMENT,
'$'
columns(
PO_NUMBER NUMBER(5) PATH '$.PONumber',
NESTED PATH '$.LineItems[*]'
columns(
ITEM_NUMBER NUMBER(4) PATH '$.ItemNumber',
UPC_CODE VARCHAR2(14) PATH '$.Part.UPCCode',
UPDATED VARCHAR2(38) PATH '$.Part.DateModified'
)
)
) LI
where UPC_CODE = '9999999999'
GeoJSON is a popular standard for representing spatial information in JSON documents. Oracle Database provides an extension to the JSON_VALUE operator that makes it possible to use Oracle Database’s powerful spatial features on JSON content containing information encoded using the GeoJSON standard. This allows location-based queries to be performed on JSON documents.
In the following example, table CITY_LOT_FEATURES contains information about San Francisco city lots, including their coordinates. The coordinates are specified using GeoJSON. We'll start by creating a table containing FEATURE documents from the contents of the cityLots.json file.
First create a table with JSON Documents that include GeoJSON encoded information.
drop table J_PURCHASEORDER
/
drop table CITY_LOT_FEATURES
/
create table CITY_LOT_FEATURES
as
select JSON_DOCUMENT as FEATURE
from JSON_TABLE(
xdburitype('/public/tutorials/json/testdata/cityLots.json').getClob(),
'$[*]'
columns (
JSON_DOCUMENT VARCHAR2(32000) FORMAT JSON PATH '$'
)
)
where rownum < 1000
Let's start by looking at the details of a feature object.
select JSON_QUERY(FEATURE, '$' PRETTY)
from CITY_LOT_FEATURES
where JSON_EXISTS(
FEATURE,
'$?(@.properties.MAPBLKLOT == $LOT)'
passing '0001001' as "LOT"
)
As can be seen, a “Feature” consists of a properties object that contains information describing the feature, and a GeoJSON “geometry” object that, in this example, consists of a set of coordinates defining a polygon that identifies where the feature is located.
The remaining code snippets show how to query and index JSON documents that contain GeoJSON content.
The first statement shows how to use JSON_VALUE to extract the GeoJSON information as an Oracle SDO_GEOMETRY object.
select XMLTYPE(JSON_VALUE(
FEATURE,
'$.geometry'
returning SDO_GEOMETRY
ERROR ON ERROR
))
from CITY_LOT_FEATURES
where JSON_EXISTS(
FEATURE,
'$?(@.properties.MAPBLKLOT == $LOT)'
passing '0001001' as "LOT"
)
JSON_VALUE has been extended to support using Oracle Spatial to convert GeoJSON encoded location information into SDO_GEOMETRY objects. Generating SDO_GEOMETRY objects from GeoJSON allows all of the Oracle Spatial functionality to be used on this kind of content.
Since LiveSQL does not currently support rendering Oracle object data types (in this case an SDO_GEOMETRY object), the SDO_GEOMETRY object returned by JSON_VALUE has been wrapped in an XMLTYPE constructor to generate an XML representation of the SDO_GEOMETRY object that can be rendered by LiveSQL.
The second statement shows how this allows the creation of spatial indexes on GeoJSON content.
create index FEATURE_GEO_INDEX
on CITY_LOT_FEATURES(
JSON_VALUE(
FEATURE,
'$.geometry'
returning SDO_GEOMETRY
ERROR ON ERROR
)
)
indextype is mdsys.spatial_index
The third statement shows how we can use Oracle Spatial operators to query GeoJSON content. In this example we return the GeoJSON that describes any objects that are within 50 meters of the chosen object.
select JSON_QUERY(s.FEATURE, '$.geometry') SOURCE_LOCATION,
JSON_QUERY(t.FEATURE, '$.geometry') TARGET_LOCATION,
JSON_QUERY(t.FEATURE, '$.properties') TARGET_PROPERTIES,
SDO_GEOM.SDO_DISTANCE (
JSON_VALUE(t.FEATURE, '$.geometry' returning SDO_GEOMETRY ERROR ON ERROR),
JSON_VALUE(s.FEATURE, '$.geometry' returning SDO_GEOMETRY ERROR ON ERROR),
.05
) DISTANCE
from CITY_LOT_FEATURES s, CITY_LOT_FEATURES t
where JSON_EXISTS(
s.FEATURE,
'$?(@.properties.MAPBLKLOT == $LOT)'
passing '0001001' as "LOT"
)
and SDO_WITHIN_DISTANCE(
JSON_VALUE(t.FEATURE, '$.geometry' returning SDO_GEOMETRY ERROR ON ERROR),
JSON_VALUE(s.FEATURE, '$.geometry' returning SDO_GEOMETRY ERROR ON ERROR),
'distance=50'
) = 'TRUE'
and NOT JSON_EXISTS(
t.FEATURE,
'$?(@.properties.MAPBLKLOT == $LOT)'
passing '0001001' as "LOT"
)
/