
Connection tab

Updated on March 11, 2021

From the Connection tab, define all of the connection details for the Hadoop host.

Note: Before you can connect to an Apache HBase or HDFS data store, upload the relevant client JAR files into the application container with Pega Platform. For more information, see the Pega Community article JAR files dependencies for the HBase and HDFS data sets.
  1. In the Connection section, specify a master Hadoop host. This host must contain the HDFS NameNode and the HBase master node.
  2. Optional: To configure settings for the HDFS connection, select the Use HDFS configuration check box.
  3. Optional: To configure settings for the HBase connection, select the Use HBase configuration check box.
  4. Optional: Enable running external data flows on the Hadoop record, and configure the required objects. Note: You can configure Pega Platform to run predictive models directly on a Hadoop record with an external data flow. Through Pega Platform, you can view the input for the data flow and its outcome.

    Using the Hadoop infrastructure lets you process large amounts of data directly on the Hadoop cluster and reduces the data transfer between the Hadoop cluster and Pega Platform.
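The connection details above correspond to standard Hadoop and HBase client settings. As an illustration only (the property names below are standard Hadoop/HBase client configuration keys, and the host name and ports are placeholders, not values from this article), an equivalent client-side configuration might look like:

```xml
<!-- core-site.xml-style setting: point clients at the HDFS NameNode
     running on the master Hadoop host -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://hadoop-master.example.com:8020</value>
  </property>
</configuration>

<!-- hbase-site.xml-style settings: point clients at the HBase cluster
     via its ZooKeeper quorum on the same master host -->
<configuration>
  <property>
    <name>hbase.zookeeper.quorum</name>
    <value>hadoop-master.example.com</value>
  </property>
  <property>
    <name>hbase.zookeeper.property.clientPort</name>
    <value>2181</value>
  </property>
</configuration>
```

In the Connection tab, the Use HDFS configuration and Use HBase configuration check boxes expose the settings that play the role of these client configuration keys.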
