
...

  • Recommended limit of 1 GB in source file size. Since conversion happens within the node, this limit may vary depending on the memory of the node.
  • Each JSON record must be less than 20 MB in size.

  • Filename extensions must be .json or .JSON.
  • Conversion of compressed JSON files is not supported. Compressed JSON files can be imported using the previous method. See Working with JSON v1. 
  • For best results, all keys and values should be quoted and imported as strings. 

    Info

    NOTE: Escape characters that make JSON invalid can cause your JSON file to fail to import.

    • You can escape quote values to treat them as literals in your strings using the backslash character. For example: \"

    • When the values are imported into the Transformer page, the application re-infers the data type for each column.
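The quoting and escaping guidance above can be checked locally before import. A minimal sketch in Python, using only the constraints stated above (the 20 MB per-record limit and backslash-escaped quotes); the sample records are hypothetical:

```python
import json

MAX_RECORD_BYTES = 20 * 1024 * 1024  # per-record limit noted above (20 MB)

def check_record(line: str) -> bool:
    """Return True if a single JSON record parses and fits the size limit."""
    if len(line.encode("utf-8")) >= MAX_RECORD_BYTES:
        return False
    try:
        json.loads(line)  # raises on invalid escape characters
        return True
    except json.JSONDecodeError:
        return False

# A quote inside a string value must be escaped with a backslash:
good = '{"title": "He said \\"hello\\"", "score": "19669"}'
# An unescaped quote makes the record invalid JSON:
bad = '{"title": "He said "hello""}'
```

Here `check_record(good)` succeeds while `check_record(bad)` fails, mirroring the import behavior described in the note above.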

...

  1. Through the Import Data page, navigate and select your JSON file for import. 

    Info

    NOTE: File formats are detected based on the file extension. Verify that your file extension is .json or .JSON, which ensures that the file is passed through the conversion service.

    1. The file is passed through the conversion process, which reviews the JSON file and stores it on the base storage layer in a row-per-record format that can be easily ingested. This process happens within the Import Data page. You can track progress on the right side of the screen.

  2. After the file has been converted, click the Preview icon on the right side of the screen. In the Preview, you can review the first few rows of the imported file.

    1. If some rows are missing from the preview, you may have a syntax error in the first row after the last well-structured row. Fix the error in the source file and re-import.

    2. If all of the rows are problematic, your data is likely malformed.

  3. Complete the rest of the import process. For more information, see Import Data Page.


  4. In Flow View, add the JSON-based imported dataset to your flow and create a recipe for it. For more information, see Flow View Page.
    1. Select the recipe, and click Edit Recipe...
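The preview troubleshooting in step 2 can be approximated locally before re-importing: parse the file line by line to find the first record that is not valid JSON. A minimal sketch, assuming the source is one JSON record per line (the file path is hypothetical):

```python
import json

def first_bad_line(path):
    """Return the 1-based number of the first line that is not valid JSON,
    or None if every non-blank line parses cleanly."""
    with open(path, encoding="utf-8") as f:
        for lineno, line in enumerate(f, start=1):
            if not line.strip():
                continue  # skip blank lines
            try:
                json.loads(line)
            except json.JSONDecodeError:
                return lineno
    return None

# Example (hypothetical path):
# first_bad_line("records.json")
```

If this reports a line number, that is typically "the first row after the last well-structured row" described in step 2a; if it flags line 1, the whole file is likely malformed (step 2b).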

...

| id | score | short | title | ups | url | Average_score |
| --- | --- | --- | --- | --- | --- | --- |
| 9kt8ex | 19669 | bzygw285fpp11.jpg | M/07/1'3" [23lbs > 13lbs = 10lbs] Still a bit to go, but my owner no longer refers to me as his chunky boy! | 19669 | https://i.redd.it/bzygw285fpp11.jpg | 18090.25 |
| 9x2774 | 19171 | wbbufmll0cy11.jpg | M/29/5'11" [605 pounds > 375 pounds = 230 pounds lost] (14 months) Still considered super morbidly obese but I've made some good progress. | 19171 | https://i.redd.it/wbbufmll0cy11.jpg | 18090.25 |
| a8guou | 16778 | 3t0kmljnmq521.jpg | F/28/5’7” [233lbs to 130lbs] Got tired of being obese and took control of my life! | 16778 | https://i.redd.it/3t0kmljnmq521.jpg | 18090.25 |
| atla3n | 16743 | 9t6tvsjs16i21.jpg | M/22/5'11" [99lbs > 150lbs = 51lbs] Anorexia my recovery | 16743 | https://i.redd.it/9t6tvsjs16i21.jpg | 18090.25 |
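For reference, the first row of the example data above would correspond to a source record like the following, with all values quoted as strings per the guidance earlier on this page. This reconstruction is illustrative only, not the actual source file:

```python
import json

# One row-per-record JSON line; values are quoted as strings, as recommended.
record = {
    "id": "9kt8ex",
    "score": "19669",
    "short": "bzygw285fpp11.jpg",
    "title": "M/07/1'3\" [23lbs > 13lbs = 10lbs] Still a bit to go, "
             "but my owner no longer refers to me as his chunky boy!",
    "ups": "19669",
    "url": "https://i.redd.it/bzygw285fpp11.jpg",
    "Average_score": "18090.25",
}
line = json.dumps(record)  # one record per line in the converted file
```

Note that the embedded quote in the title becomes an escaped `\"` in the serialized line, which is exactly the escaping the import note above requires.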
