
...

  1. After you have upgraded to Release 4.0, you must re-enable Pig for job execution. See Running Environment Options.
  2. For each of your existing Python UDFs, author a new one in Java:
    1. This new UDF must have the same name and schema.
    2. Deploy each new Java UDF to the node.
    3. For more information on authoring and deployment, see Java UDFs.
  3. When all of your existing Python UDFs have been converted to Java and installed on the node, enable the Spark running environment on the node. See Running Environment Options.
  4. At this point:
    1. When you select Run on Server, the Photon running environment uses the Python UDFs.
      1. In later versions, this selection is Photon.
    2. When you select Run on Hadoop, the Spark running environment uses the Java UDFs.
      1. In later versions, this selection is Spark.
  5. Verify job results of executions on each running environment for each of your converted UDFs.
  6. Remove the Python UDFs from the node.
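For step 2, the key requirement is that the Java replacement keeps the same name and schema as the Python UDF it supersedes, so jobs resolve the same function on either running environment. The following is a minimal sketch only; the interface name (`CustomUdf`), its method signatures, and the schema types are illustrative assumptions, not the product's actual API. See Java UDFs for the real authoring contract.

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical UDF contract (assumed for illustration): declare the input
// columns and output type, and implement the per-row evaluation.
interface CustomUdf {
    List<String> inputSchema();     // column names the UDF consumes
    String outputType();            // type of the column it produces
    Object exec(List<Object> inputs);
}

// Java replacement for a Python UDF named "concat_names". The class keeps
// the same UDF name and the same two-string-input / one-string-output
// schema as the Python version it replaces.
class ConcatNamesUdf implements CustomUdf {
    public List<String> inputSchema() {
        return Arrays.asList("first_name", "last_name");
    }

    public String outputType() {
        return "String";
    }

    // Same row-level behavior as the original Python UDF: join the two
    // input values with a single space.
    public Object exec(List<Object> inputs) {
        return inputs.get(0) + " " + inputs.get(1);
    }
}
```

Because both environments must return identical results (step 5), a quick row-level check like `new ConcatNamesUdf().exec(...)` against known Python outputs is a reasonable sanity test before removing the Python UDFs.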

For Release 4.1 upgrade:

...