
...

Warning

When you upgrade to a new version of Trifacta Wrangler Enterprise, you must download your CLI packages from the Transformer page after the upgrade is complete. You may be able to execute packages from the previous release, but backward compatibility of exported CLI packages is not explicitly supported.

 

Changes for Release 6.0

Warning

In the next release of Trifacta Wrangler Enterprise after Release 6.0, the Command Line Interface (CLI) will reach its End of Life (EOL). The CLI tools will no longer be included in the software distribution and will not be supported for use against the platform. You should transition away from using the CLI as soon as possible. For more information, see CLI Migration to APIs.

...

  1. Search your CLI scripts for:

    Code Block
    pig
  2. Replace each reference to this job type with one of the following values, depending on your deployment. A search-and-replace example follows these steps.

     Job Type Value   Description
     spark            Runs the job in the Spark running environment.
     hadoop           Runs the job in the default running environment for the Hadoop cluster. For this release, that environment is Spark. This setting future-proofs your scripts against subsequent changes to the default Hadoop running environment.
     photon           Runs the job on the Trifacta node. This setting is recommended only for smaller jobs.

  3. Save your scripts. 
  4. Run an example of each on your upgraded Release 4.1 instance.
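
For example, to locate lingering pig references across a directory of CLI scripts and switch them to spark, you could use something like the following sketch. The script directory and the exact parameter syntax are assumptions; adjust the pattern to match how your scripts pass the job type.

Code Block
    # List scripts that still reference the pig job type.
    # Assumes scripts live in ~/cli_scripts and pass the value as --job_type pig.
    grep -rln -e '--job_type pig' ~/cli_scripts/
    # Switch them to spark in place, keeping a .bak backup of each file.
    sed -i.bak 's/--job_type pig/--job_type spark/g' ~/cli_scripts/*.sh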

...

In Release 4.1 and later, the Javascript execution engine has been deprecated, and the default job type for CLI jobs is now Photon, which is the default execution engine on the Trifacta node.

Info

NOTE: If your CLI scripts do not specify a job_type parameter, the job is executed on the Photon running environment, which replaces the Javascript running environment. If this is acceptable, no action is required.

Otherwise, you must review your scripts and manually specify a job_type parameter for execution.
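
For instance, a job run that explicitly pins the running environment might look like the following sketch. Only the job_type parameter is taken from this page; the script name and the other flags are assumptions to verify against the CLI help for your release.

Code Block
    # Illustrative only: run a job on the Photon running environment.
    # trifacta_cli.py, --data, and --cli_output_path are assumed names;
    # confirm the flags for your release before use.
    ./trifacta_cli.py run_job \
        --user_name <username> \
        --password <password> \
        --job_type photon \
        --data /path/to/source.csv \
        --cli_output_path ./job_info.out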

...

The above option instructs the CLI to execute the job on the default running environment for the integrated Hadoop cluster:

  • For upgrading customers who have not enabled Spark, the job is executed on the Pig running environment, as in previous releases.
  • In a future release, when the Pig running environment is deprecated, this setting will point to the new default running environment (Spark), and your scripts will not need to be updated.
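
Concretely, the same hypothetical invocation with the environment left to the cluster default looks like this (flag names other than job_type remain assumptions):

Code Block
    # Illustrative only: defer to the cluster's default running environment.
    # Resolves to Pig on pre-Spark deployments today, and to Spark once it
    # becomes the default, with no further script changes needed.
    ./trifacta_cli.py run_job \
        --user_name <username> \
        --password <password> \
        --job_type hadoop \
        --data /path/to/source.csv \
        --cli_output_path ./job_info.out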

...

Info

NOTE: If you are upgrading from a previous version in which the Trifacta node is connected to Hive, you must recreate the Hive connection through the Command Line Interface.
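
The sketch below shows what recreating the connection might look like, assuming a create_connection operation and flag names patterned on typical releases of the tool; confirm the exact operation name and the required Hive parameters in the CLI help before running it.

Code Block
    # Illustrative only: recreate a Hive connection through the CLI.
    # The operation and flag names here are assumptions; verify with --help.
    ./trifacta_cli.py create_connection \
        --user_name <username> \
        --password <password> \
        --conn_type hadoop_hive_jdbc \
        --conn_name hive_connection \
        --conn_host <hive_server_host> \
        --conn_port 10000 \
        --cli_output_path ./conn_create.out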

...

For Release 3.1.1, the messages delivered to standard output and in the JSON response from the Trifacta node have been made consistent with the parameter names entered at the command line.

...