
Connection tab

Updated on April 5, 2022

From the Connection tab, define all the connection details for the Hadoop host.

Note: Before you can connect to an Apache HBase or HDFS data store, upload the relevant client JAR files to the application container that hosts Pega Platform. For more information, see HDFS and HBase client and server versions supported by Pega Platform.
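As a quick sanity check that the uploaded client JARs are visible to the application, a small classpath probe such as the following can help. This is only an illustrative sketch: the class names are the standard Hadoop and HBase client entry points, not anything Pega-specific, and the exact classes and JARs depend on the client versions you deploy.

    // Minimal classpath probe: these are standard Hadoop and HBase client classes
    // that the uploaded client JARs are expected to provide. The class list is a
    // hypothetical example; adjust it to the client versions you actually deploy.
    public class ClientJarCheck {
        public static void main(String[] args) {
            String[] classes = {
                "org.apache.hadoop.fs.FileSystem",              // hadoop-common
                "org.apache.hadoop.hdfs.DistributedFileSystem", // hadoop-hdfs client
                "org.apache.hadoop.hbase.client.Connection"     // hbase-client
            };
            for (String cls : classes) {
                try {
                    Class.forName(cls);
                    System.out.println("Found:   " + cls);
                } catch (ClassNotFoundException e) {
                    System.out.println("Missing: " + cls + " -- upload the corresponding client JAR");
                }
            }
        }
    }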
  1. In the Connection section, specify the master Hadoop host. This host must contain the HDFS NameNode and the HBase master node. (The sketch after this list shows how these settings map to the standard Hadoop and HBase client configuration.)
  2. Optional: To configure settings for the HDFS connection, select the Use HDFS configuration checkbox.
  3. Optional: To configure settings for the HBase connection, select the Use HBase configuration checkbox.
  4. Optional: Enable running external data flows on the Hadoop record by configuring the following objects:

    Note: You can configure Pega Platform to run predictive models directly on a Hadoop record with an external data flow. Through Pega Platform, you can view the input for the data flow and its outcome.

    Using the Hadoop infrastructure lets you process large amounts of data directly on the Hadoop cluster and reduces data transfer between the Hadoop cluster and Pega Platform.
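For orientation, the sketch below shows what the master host, HDFS, and HBase settings from steps 1 through 3 correspond to at the level of the standard Hadoop and HBase Java client APIs. This is not Pega Platform code: the host name, ports, and table name are placeholders, and the property keys are the usual Hadoop and HBase client settings rather than fields on the Connection tab.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Table;

    public class HadoopHostConnectionSketch {
        public static void main(String[] args) throws Exception {
            // HDFS: fs.defaultFS points at the NameNode running on the master Hadoop host.
            Configuration hdfsConf = new Configuration();
            hdfsConf.set("fs.defaultFS", "hdfs://hadoop-master.example.com:8020"); // placeholder host and port
            try (FileSystem fs = FileSystem.get(hdfsConf)) {
                System.out.println("HDFS root exists: " + fs.exists(new Path("/")));
            }

            // HBase: the client locates the HBase master node through ZooKeeper on the same host.
            Configuration hbaseConf = HBaseConfiguration.create();
            hbaseConf.set("hbase.zookeeper.quorum", "hadoop-master.example.com"); // placeholder host
            hbaseConf.set("hbase.zookeeper.property.clientPort", "2181");
            try (Connection hbase = ConnectionFactory.createConnection(hbaseConf);
                 Table table = hbase.getTable(TableName.valueOf("my_table"))) { // placeholder table name
                System.out.println("Connected to HBase table: " + table.getName());
            }
        }
    }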
