While researching the use of JSON in Oracle 19c I came across some interesting behaviour that wasn’t immediately obvious (to me). With this post I am hoping to save you a couple of minutes scratching your head when working with JSON_TABLE(). This is Oracle 19.3.0 on Linux and I’m connecting to it using SQLcl 19.1.
Some background
As part of my JSON-support-in-Oracle research I had a good look at JSON_TABLE. Although complex at first sight, it is a lot less intimidating if you know how to use XMLTABLE :) My goal for this post is to convert a JSON document to a relational structure.
To convert JSON to (relational) rows and columns, I believe I need to use the JSON_value_column from the list of available options in JSON_TABLE‘s JSON_column_definition. Interestingly, and different from the syntax diagram for XMLTABLE(), I’m allowed to specify a JSON_value_return_type. There are quite a few of them, and I guess the most commonly used ones are going to be varchar2, date and timestamp. This post is about the latter two, since parsing out a text string is pretty simple.
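As a quick sketch (simplified, and with made-up table and column names for illustration, not the full grammar), a JSON_value_column with an explicit return type looks roughly like this:

```sql
-- Simplified sketch of JSON_value_column entries with explicit return types;
-- some_table, json_doc and the paths are hypothetical names for illustration.
SELECT jt.*
FROM   some_table st,
       JSON_TABLE ( st.json_doc, '$'
         COLUMNS (
           c_text VARCHAR2(100) PATH '$.someText',
           c_date DATE          PATH '$.someDate',
           c_ts   TIMESTAMP     PATH '$.someTimestamp'
         )
       ) jt;
```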
The Test Case
I created a very simple table to test the functionality:
SQL> create table t1 (
       id number not null,
       json_txt clob not null,
       constraint pk_t1 primary key (id),
       constraint c_is_json check (json_txt is json)
     );

Table created.
Essentially a table to hold a JSON column plus a check constraint to make sure that I’m not inserting data that isn’t JSON. Here’s the initial JSON document I started with:
SQL> select json_query(json_txt, '$' pretty error on error)
     from t1 where id = 1;

JSON_QUERY(JSON_TXT,'$'PRETTYERRORONERROR)
------------------------------------------------------------
{
  "results" :
  [
    {
      "resultID" : 1,
      "resultDate" : "13-JUN-19 08:10:00",
      "details" : "none"
    }
  ]
}
Thanks to my check constraint I know this is valid JSON. I specified the PRETTY keyword to make the output more readable. When developing with new(-ish) features I prefer Oracle to raise an error, so I’m going with ERROR ON ERROR.
These are my NLS settings by the way, this is a system installed with the default locale set to en_US. I have added these settings for reference and they will play a role later on.
SQL> select * from nls_session_parameters
     where regexp_like(parameter,'(DATE|TIME)');

PARAMETER                      VALUE
------------------------------ ------------------------------
NLS_DATE_FORMAT                DD-MON-RR
NLS_DATE_LANGUAGE              AMERICAN
NLS_TIME_FORMAT                HH24.MI.SSXFF
NLS_TIMESTAMP_FORMAT           DD-MON-RR HH24.MI.SSXFF
NLS_TIME_TZ_FORMAT             HH24.MI.SSXFF TZR
NLS_TIMESTAMP_TZ_FORMAT        DD-MON-RR HH24.MI.SSXFF TZR

6 rows selected.
I am now ready to start testing.
First Test
My query is shown here:
SELECT
    jt.*
FROM
    t1,
    JSON_TABLE ( t1.json_txt, '$.results[*]' ERROR ON ERROR
        COLUMNS (
            resID NUMBER PATH '$.resultID',
            resDATE DATE PATH '$.resultDate'
        )
    )
    AS jt
WHERE
    t1.id = 1;
Not too dissimilar to using XMLTABLE, I need to pass the column containing my JSON document to the function, followed by the path to the data within the JSON document I want to parse. I could have stored multiple results in the array (and I did initially, but considered that a little too confusing), hence the reference to all of them.
The columns() clause (of type JSON_value_column) allows me to define columns, their respective data types and the path relative to the second argument to JSON_TABLE. In my case those relative paths are '$.resultID' and '$.resultDate'. These two are mapped to resID and resDATE in the output. I am limiting the query to the first row in t1.
When I execute this query against the first record in the table, I get an error:
SQL> SELECT
       jt.*
     FROM
       t1,
       JSON_TABLE ( t1.json_txt, '$.results[*]' ERROR ON ERROR
         COLUMNS (
           resID NUMBER PATH '$.resultID',
           resDATE DATE PATH '$.resultDate'
         )
       )
       AS jt
     WHERE
       t1.id = 1;

Error starting at line : 1 in command
Error report -
ORA-01861: literal does not match format string
This is kind of surprising; my suspicion at the time was that something was wrong with the NLS_DATE_FORMAT settings. But whatever I tried, I always got the same error. Searching MOS and other resources on the net turned up nothing, which might well be down to my inadequate search efforts. Tracing didn’t really help either.
Second Test
I tried a few other things until, with the help of an educated guess, I got lucky: I removed the time portion from resultDate and formatted it to match the output you get from selecting sysdate from dual, as shown here:
SQL> select json_query(json_txt, '$.results' pretty error on error)
     from t1 where id = 2;

JSON_QUERY(JSON_TXT,'$.RESULTS'PRETTYERRORONERROR)
------------------------------------------------------------
[
  {
    "resultID" : 2,
    "resultDate" : "13-JUN-19",
    "details" : "none"
  }
]
This time around, the error message was different:
SQL> SELECT
       jt.*
     FROM
       t1,
       JSON_TABLE ( t1.json_txt, '$.results[*]' ERROR ON ERROR
         COLUMNS (
           resID NUMBER PATH '$.resultID',
           resDATE DATE PATH '$.resultDate'
         )
       )
       AS jt
     WHERE
       t1.id = 2;

Error starting at line : 1 in command
Error report -
ORA-40688: date or time stamp string value does not conform to ISO 8601 format
Aha! That’s much better than ORA-01861: literal does not match format string. So it would appear as if the date the system expected to find in the JSON document must be ISO 8601 compliant. Ummm, what exactly is an ISO 8601 date format? The Internet knew!
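For reference, Oracle’s own to_char() format elements can produce a basic ISO 8601 date-time string (this is a sketch; it omits the optional UTC offset part):

```sql
-- A sketch: format sysdate as a basic ISO 8601 date-time string.
-- "T" is a literal separator between the date and time portions.
SELECT to_char(sysdate, 'YYYY-MM-DD"T"HH24:MI:SS') AS iso_string
FROM dual;
```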
Third Test
So maybe using an ISO 8601 time format in the JSON document is the answer? Worth a try, the JSON document now looks like this:
SQL> select json_query(json_txt, '$.results' pretty error on error)
     from t1 where id = 3;

JSON_QUERY(JSON_TXT,'$.RESULTS'PRETTYERRORONERROR)
------------------------------------------------------------
[
  {
    "resultID" : 3,
    "resultDate" : "2019-06-13T16:39:00+00:00",
    "details" : "none"
  }
]
It seems that ISO 8601 requires a slightly more complex time expression, much more like a timestamp in Oracle. Let’s see if this works:
SQL> SELECT
       jt.*
     FROM
       t1,
       JSON_TABLE ( t1.json_txt, '$.results[*]' ERROR ON ERROR
         COLUMNS (
           resID NUMBER PATH '$.resultID',
           resDATE DATE PATH '$.resultDate'
         )
       )
       AS jt
     WHERE
       t1.id = 3;

     RESID RESDATE
---------- ---------
         3 13-JUN-19
Hurray, no errors, but actual result data for a change!
Caveat
But wait, that’s not the whole story. I have stored a time component in the input document, so how do I get that? I’d normally use to_char() for that purpose, so let’s give that a go:
SQL> SELECT
       to_char(jt.resDATE, 'dd.mm.yyyy hh24:mi:ss') as formattedDate
     FROM
       t1,
       JSON_TABLE ( t1.json_txt, '$.results[*]' ERROR ON ERROR
         COLUMNS (
           resID NUMBER PATH '$.resultID',
           resDATE DATE PATH '$.resultDate'
         )
       )
       AS jt
     WHERE
       t1.id = 3;

FORMATTEDDATE
-------------------
13.06.2019 00:00:00
Hmmm, that’s midnight, and not 16:39:00+00:00 …. Unless I have missed something important it seems as if the DATE column type in the columns() clause stripped the input of its time component. Not to worry though, there’s a timestamp type. Maybe that can help?
SQL> SELECT
       jt.*
     FROM
       t1,
       JSON_TABLE ( t1.json_txt, '$.results[*]' ERROR ON ERROR
         COLUMNS (
           resID NUMBER PATH '$.resultID',
           resDATE TIMESTAMP PATH '$.resultDate'
         )
       )
       AS jt
     WHERE
       t1.id = 3;

     RESID RESDATE
---------- ----------------------------
         3 13-JUN-19 16.39.00.000000000
Voila! I’ve got a date and time.
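The syntax diagram also lists timestamp with time zone as a return type. I haven’t tested this variant, but if you need to preserve the UTC offset stored in the document, something along these lines might work (an untested sketch against my test table):

```sql
-- Untested sketch: try to keep the time zone offset from the ISO 8601 string
-- by using TIMESTAMP WITH TIME ZONE as the column's return type.
SELECT
    jt.*
FROM
    t1,
    JSON_TABLE ( t1.json_txt, '$.results[*]' ERROR ON ERROR
        COLUMNS (
            resID NUMBER                   PATH '$.resultID',
            resTS TIMESTAMP WITH TIME ZONE PATH '$.resultDate'
        )
    ) jt
WHERE
    t1.id = 3;
```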
Summary
From my very limited testing it appears as if input dates in the JSON document have to be provided in ISO 8601 format, otherwise Oracle raises an error. Unlike in SQL*Loader’s control file, there appears to be no option to tell Oracle about the input formatting. Changing NLS_DATE_FORMAT (not shown here) affected only the output of the query; it didn’t appear to be applied when parsing the input.
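If you are stuck with documents containing non-ISO dates you can’t change, one possible workaround (a sketch, assuming you know the incoming format, here the one from my first test record) is to extract the value as a string and convert it yourself:

```sql
-- Sketch of a workaround for non-ISO 8601 dates: extract the value
-- as VARCHAR2, then apply an explicit format mask with to_date().
SELECT
    jt.resID,
    to_date(jt.resDateStr, 'DD-MON-RR HH24:MI:SS') AS resDATE
FROM
    t1,
    JSON_TABLE ( t1.json_txt, '$.results[*]' ERROR ON ERROR
        COLUMNS (
            resID      NUMBER       PATH '$.resultID',
            resDateStr VARCHAR2(30) PATH '$.resultDate'
        )
    ) jt
WHERE
    t1.id = 1;
```

The obvious downside is that the format mask now lives in every query instead of in the data, so this is more of a stopgap than a fix.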