  • Linux- and Windows-based SFTP servers are supported. 


  • Ingest of over 500 files through SFTP at one time is not supported.
  • You cannot run jobs using Avro or Parquet sources uploaded via SFTP.
  • When you specify a parameterized output as part of your job execution, the specified output location may include extraneous information about the SFTP connection identifier. None of this information is sensitive. This is a known issue.
  • Jobs can be executed from SFTP sources on the following running environments:
    • D s photon
    • HDFS-based Spark, which includes Cloudera and Hortonworks

  • You cannot publish TDE format to SFTP destinations.
  • You cannot publish compressed Snappy files to SFTP destinations.


  • Acquire user credentials to access the SFTP server. You can use username/password credentials or SSH keys. See below.

  • Verify that the credentials can access the proper locations on the server where your data is stored. The initial directory of the user account must be accessible.


By default, this connection type is automatically enabled. If it is not enabled in your environment, please complete the following: 



D s config


Locate the following parameter and set it to true:

Code Block
"feature.sftp.enabled": true,



NOTE: You must provide the protocol identifier and storage locations for the SFTP server. See below.
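To illustrate how a storage location breaks down into a protocol identifier, host, port, and path, here is a sketch using Python's standard URL parser (the host and path below are hypothetical examples, not values from your environment):

```python
from urllib.parse import urlparse

# Hypothetical SFTP storage location; substitute your server's host and path.
location = "sftp://sftp.example.com:22/landing/incoming"
parsed = urlparse(location)

protocol = parsed.scheme      # "sftp" — the protocol identifier
host = parsed.hostname        # "sftp.example.com"
port = parsed.port            # 22
directory = parsed.path       # "/landing/incoming"
```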

Configure file storage protocols and locations