This section describes how to enable or disable logging of user activities and transfer of the logs to Trifacta. When this feature is enabled, user activities are captured locally on the Trifacta node in a series of log files. Periodically, these log files are uploaded to a predefined S3 bucket, where Trifacta can ingest the logging activity to improve the product and assist in troubleshooting.
Tip: This feature helps provide better suggestions and machine learning-based improvements to your instance.
NOTE: During initial deployment, this service may be enabled for you. You can use the information below to disable the service.
Trifacta captures the following types of usage information, which are available in different releases.
For more information on the data that is captured, see https://community.trifacta.com/s/article/Trifacta-Usage-Data-Collection-1515802070895.
|Telemetry|Proprietary capture of information about the platform.|
|Segment Analytics|Analytics for various common data segments, such as Google and Marketo.|
The following configuration steps must be completed:
Open the user logging port, if it is not already open.
To enable this service, customers must file a support ticket with Trifacta. In your request, please ask for the appropriate API write key values to insert into the configuration. Details are below.
The platform's custom-built telemetry system is controlled by the following configuration field:
|true|Telemetry capture is enabled.|
|false|Telemetry capture is disabled.|
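As an illustration only, an enable/disable flag of this kind is typically a Boolean in the platform configuration file. The property name below is a hypothetical placeholder; use the actual key provided in your support ticket:

```json
{
  "telemetry.enabled": true
}
```

After changing the value, restart the platform for the setting to take effect.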
The following settings apply to segment analytics.
For remote analytics, this property identifies the Segment API writeKey for the project to which data is pushed. The Segment project contains the configuration for each sink (e.g., Google Analytics, Marketo).
The channels for which to record data:
For remote analytics, this value is used together with the writeKey above.
|true|Remote segment analytics are enabled.|
|false|Remote segment analytics are disabled.|
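A minimal sketch of how the segment analytics settings might look. All key names and the writeKey value are hypothetical placeholders; the real key names and writeKey values come from your support ticket:

```json
{
  "segment.enabled": true,
  "segment.writeKey": "<writeKey-provided-by-support>",
  "segment.channels": ["web"]
}
```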
NOTE: To receive the full benefits of this feature, the Trifacta node must be able to connect to the public Internet.
On the server hosting the Trifacta platform, the following port must be opened:
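For example, on a server using firewalld, a port can be opened as follows. The port number 3091 is a hypothetical placeholder; substitute the port listed above for your deployment:

```shell
# Open the user logging port (3091 is a placeholder for your deployment's port)
sudo firewall-cmd --permanent --add-port=3091/tcp
# Reload firewalld so the new rule takes effect
sudo firewall-cmd --reload
```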
To connect to S3, the platform requires that a set of credentials be generated and stored in the following directory. These credentials are provided by Trifacta.
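As an illustration, S3 credentials are commonly stored in the standard AWS credentials file format. The values below are placeholders; use the credentials provided to you:

```
[default]
aws_access_key_id = <access-key-provided-by-support>
aws_secret_access_key = <secret-key-provided-by-support>
```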
To regularly upload the generated logs to the S3 bucket, you can configure a cron job to transfer the files.
The transfer is performed by log-forwarding.js, which you should run once per day.
An example command to run this script from the deployment directory is the following:
node bin/log-forwarding/src/log-forwarding.js protobuf-events.log segment-proto.log cleaned-join-logs.txt
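For example, the daily run can be scheduled with a crontab entry like the following. The installation path /opt/trifacta and the log destination are hypothetical placeholders; adjust both for your deployment:

```shell
# Run the log-forwarding script once per day at 02:00 from the deployment directory
0 2 * * * cd /opt/trifacta && node bin/log-forwarding/src/log-forwarding.js protobuf-events.log segment-proto.log cleaned-join-logs.txt >> /var/log/trifacta/log-forwarding.log 2>&1
```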
To disable the service, set the configuration property described above to false. Then, restart the platform.