For individual users of the Trifacta platform, you can define custom properties that are passed to Spark when their jobs are executed.
NOTE: User-specific custom properties override or append any other custom properties that are passed to Hadoop. Suppose the Java properties file contains a single property.
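For illustration, such a single-property file might look like the following; the property name and value are hypothetical examples, not required settings:
Code Block
spark.executor.memory=8g
If the same property is also passed to the job from elsewhere in the configuration, the user-specific value overrides it for that user's jobs; otherwise, it is appended to the properties passed to Hadoop.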
NOTE: You cannot specify user-specific properties for S3 jobs.
Enable Java properties directory
This feature is enabled by defining the Java properties directory for Spark in the platform configuration.
Steps:
- Set the following property to the directory where the user-specific property files can be stored on the Trifacta node. Example:
Code Block
"spark.userPropsDirectory": "/opt/trifacta/conf/usr",
- Save your changes and restart the platform.
Required Permissions
For the above locations, the platform must have the following permissions:
- Execute permission on the directory defined in spark.userPropsDirectory.
- Read permission on the files in it.
Define user-specific properties files
For each user that is passing in custom properties, a separate file must be created in the appropriate directory with the following filename pattern:
Code Block
userEmail-user.properties
where:
userEmail is the email address for the user that is registered with the Trifacta platform. For example, for the user joe@example.com, the filename is joe@example.com-user.properties.
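Putting the directory setting and the filename pattern together, the user properties directory contains one file per user. A sketch, using the example path from above and a second, hypothetical user:
Code Block
/opt/trifacta/conf/usr/
    joe@example.com-user.properties
    maria@example.com-user.properties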
File Format:
Each file must follow the Java properties file format:
Code Block
property.a=value.a
property.b=value.b
NOTE: Property names must use the full property name.
For more information on this format, see https://docs.oracle.com/cd/E23095_01/Platform.93/ATGProgGuide/html/s0204propertiesfileformat01.html.
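As a final illustration, a complete user-specific file might look like the following sketch. The property names are standard Spark properties written in full, per the note above; the values are examples only, not recommendations:
Code Block
# joe@example.com-user.properties (illustrative contents)
spark.driver.memory=4g
spark.executor.memory=6g
spark.executor.cores=2
Each property appears on its own line in name=value form.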