Loading Additional Drivers
This article describes how to add external drivers not included with the MDM packages.
Overview
Drivers for some third-party data sources and other components are not included in the standard packages for size or licensing reasons.
The MDM Server can, however, load these additional drivers from a designated external drivers folder named lib-ext.
The individual JARs can be stored in subfolders.
See the following sections for examples of configuration and lists of drivers required by the third-party components.
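To illustrate the subfolder layout, the following sketch recursively collects every JAR placed under a lib-ext folder. This is an illustration only: the class and method names are hypothetical, and the server's actual scanning logic is internal to the product.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

/**
 * Illustration only: recursively collects the .jar files placed under
 * a lib-ext folder, including any subfolders.
 */
public class LibExtScanner {
    public static List<Path> findJars(Path libExt) throws IOException {
        try (Stream<Path> paths = Files.walk(libExt)) {
            return paths
                    .filter(p -> p.toString().endsWith(".jar"))
                    .sorted()
                    .collect(Collectors.toList());
        }
    }
}
```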
Data sources
HBase
| Required JAR |
|---|
| |
The required configuration is as follows:
<dataSource driverClass="cdata.jdbc.apachehbase.ApacheHBaseDriver" url="jdbc:apachehbase:Server=127.0.0.1;Port=8080;" name="HBase-CData" user="<user_name>" password="<password>">
<properties/>
<propertiesEncrypted/>
<propertiesFile/>
<propertiesFileEncrypted/>
</dataSource>
Databricks
| Driver group (image-based) |
|---|
| databricks |
The databricks driver group provides the Apache Calcite libraries required to run MDM data operations on Databricks execution clusters.
For cluster connection configuration, see the Databricks connector documentation.
Amazon Redshift
| Driver group (image-based) | |
|---|---|
| Required JAR (manual install) | |
The required configuration is as follows:
<dataSource driverClass="com.amazon.redshift.jdbc42.Driver" url="jdbc:redshift://<host>:<port>/<database>" name="<name>" user="<user_name>" password="<password>">
<properties/>
<propertiesEncrypted/>
<propertiesFile/>
<propertiesFileEncrypted/>
</dataSource>
Snowflake
| Driver group (image-based) | |
|---|---|
| Required JAR (manual install) | |
The required configuration is as follows:
<dataSource driverClass="net.snowflake.client.jdbc.SnowflakeDriver" url="jdbc:snowflake://<account>.snowflakecomputing.com/?db=<database>&amp;schema=<schema>&amp;warehouse=<warehouse>" name="<name>" user="<user_name>" password="<password>">
<properties/>
<propertiesEncrypted/>
<propertiesFile/>
<propertiesFileEncrypted/>
</dataSource>
Teradata
| Driver group (image-based) | |
|---|---|
| Required JAR (manual install) | |
The required configuration is as follows:
<dataSource driverClass="com.teradata.jdbc.TeraDriver" url="jdbc:teradata://<host>/database=<database>,charset=UTF8" name="<name>" user="<user_name>" password="<password>">
<properties/>
<propertiesEncrypted/>
<propertiesFile/>
<propertiesFileEncrypted/>
</dataSource>
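The url attributes in the dataSource examples above follow each vendor's standard JDBC URL shape. The following sketch composes the same shapes; the class, helper names, and all values are hypothetical placeholders.

```java
/**
 * Illustration only: composes JDBC URLs in the same shapes used by the
 * dataSource examples above. Helper names and values are hypothetical.
 */
public class JdbcUrls {
    public static String hbase(String server, int port) {
        return "jdbc:apachehbase:Server=" + server + ";Port=" + port + ";";
    }

    public static String redshift(String host, int port, String database) {
        return "jdbc:redshift://" + host + ":" + port + "/" + database;
    }

    public static String snowflake(String account, String db, String schema, String warehouse) {
        // When this URL is placed in an XML attribute, each '&' must be escaped as '&amp;'.
        return "jdbc:snowflake://" + account + ".snowflakecomputing.com/?db=" + db
                + "&schema=" + schema + "&warehouse=" + warehouse;
    }

    public static String teradata(String host, String database) {
        return "jdbc:teradata://" + host + "/database=" + database + ",charset=UTF8";
    }
}
```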
Message queue providers
Amazon SQS
| Driver group (image-based) | |
|---|---|
| Required JARs (manual install) | |
The required configuration is as follows:
<component class="com.ataccama.dqc.jms.JmsProviderComponent">
<connectionPoolSize>5</connectionPoolSize>
<jmsResources>
<resource>awssqsmdm</resource>
</jmsResources>
</component>
<config class="com.ataccama.dqc.jms.config.JmsContributor">
<jmsConnections>
<jmsConnection connectionFactory="QueueConnectionFactory" name="awssqsmdm">
<contextParams>
<contextParam name="java.naming.factory.initial" value="com.ataccama.dqc.jms.sqs.SQSInitialContextFactory"/>
<contextParam name="java.naming.provider.url" value="https://sqs.eu-central-1.amazonaws.com/773634404733/awssqsmdmqueue"/>
<contextParam name="region" value="eu-central-1"/>
<contextParam name="queue.awssqsmdmqueue" value="awssqsmdmqueue"/>
<contextParam name="authType" value="AWS_ACCESS_KEY"/>
<contextParam name="accessKey" value="crypted:AES:{value}"/>
<contextParam name="secretKey" value="crypted:AES:{value}"/>
</contextParams>
</jmsConnection>
</jmsConnections>
</config>
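The contextParam entries above are standard JNDI environment properties. As a sketch, a plain Java JMS client would assemble the equivalent environment like this; the class name and the region/queue values are placeholders, while the SQSInitialContextFactory class ships with the product.

```java
import java.util.Hashtable;

/**
 * Illustration only: the JNDI environment equivalent to the contextParams
 * in the Amazon SQS example above (placeholder region and queue URL).
 */
public class SqsJndiEnv {
    public static Hashtable<String, String> build(String region, String queueUrl) {
        Hashtable<String, String> env = new Hashtable<>();
        env.put("java.naming.factory.initial",
                "com.ataccama.dqc.jms.sqs.SQSInitialContextFactory");
        env.put("java.naming.provider.url", queueUrl);
        env.put("region", region);
        env.put("authType", "AWS_ACCESS_KEY");
        return env;
    }
}
```

Such an environment table is what would normally be passed to javax.naming.InitialContext to look up the QueueConnectionFactory.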
Apache Kafka
| Driver group (image-based) |
|---|
| |
The required configuration is as follows:
<config class="com.ataccama.dqc.streaming.config.KafkaContributor">
<kafkaConnections>
<kafkaConnection name="<connection-name>" connectString="<broker-host>:<port>"/>
</kafkaConnections>
</config>
Kafka stream ingestion is configured in the model’s nme-stream.gen.xml.
For Avro deserialization with Schema Registry, set value.deserializer to io.confluent.kafka.serializers.KafkaAvroDeserializer and provide schema.registry.url in the stream source Kafka properties.
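For example, the consumer properties for Avro ingestion via Schema Registry would look like the following sketch; the broker and registry addresses are placeholders, and the class name is hypothetical.

```java
import java.util.Properties;

/**
 * Illustration only: Kafka consumer properties for Avro deserialization
 * with a Confluent Schema Registry, as described above. The broker and
 * registry addresses are placeholders.
 */
public class AvroConsumerProps {
    public static Properties build(String bootstrapServers, String schemaRegistryUrl) {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", bootstrapServers);
        props.setProperty("value.deserializer",
                "io.confluent.kafka.serializers.KafkaAvroDeserializer");
        props.setProperty("schema.registry.url", schemaRegistryUrl);
        return props;
    }
}
```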
RabbitMQ, IBM WebSphere MQ, Tibco MQ
Note: ActiveMQ is included in the standard package.
These drivers are not available via the driver image and must be installed manually into the lib-ext folder.
| Driver | Required JARs |
|---|---|
| RabbitMQ | |
| IBM WebSphere MQ | |
| Tibco MQ | |
The required configuration is as follows:
<component class="com.ataccama.dqc.jms.JmsProviderComponent">
<connectionPoolSize>5</connectionPoolSize>
<jmsResources>
<resource>{QUEUE NAME}</resource>
</jmsResources>
</component>
<config class="com.ataccama.dqc.jms.config.JmsContributor">
<jmsConnections>
<jmsConnection connectionFactory="QueueConnectionFactory" name="{QUEUE NAME}">
<contextParams>
<contextParam name="java.naming.factory.initial"
value="org.apache.activemq.jndi.ActiveMQInitialContextFactory"/>
<contextParam name="java.naming.provider.url" value="tcp://localhost:61616"/>
</contextParams>
</jmsConnection>
...
</jmsConnections>
</config>
Other
Azure Data Lake Storage Gen 2
| Driver group (image-based) |
|---|
| |
The required configuration is as follows:
<contributedConfigs>
    <config class="com.ataccama.dqc.azure.config.AzureGen2Contributor">
        <azureGen2Connections>
            <!-- authType can be AAD_CLIENT_CREDENTIAL or AAD_MANAGED_IDENTITY -->
            <!-- authTokenEndpoint is used with an AAD Service Principal -->
            <azureGen2Connection
                name="AzureGen2"
                storageAccount="storageAccount"
                containerName="containerName"
                clientId="clientID"
                clientKey="crypted:AES:encryptedKey"
                authenticateUser="false"
                authTokenEndpoint="https://login.microsoftonline.com/<tokenID>/oauth2/token"/>
        </azureGen2Connections>
    </config>
</contributedConfigs>
Apache Avro
| Driver group (image-based) | avro |
|---|---|
| Required JAR (manual install) | |
The avro driver group provides standalone Apache Avro file format support for DQC steps that read or write Avro files, without requiring the full Kafka stack.
No additional XML configuration is required.
Parquet File Reader
The hadoop driver group must be enabled; it includes parquet-hadoop-bundle together with all required Hadoop ecosystem dependencies.
For manual installations, the following JARs are required:
| Required JARs | |
|---|---|
| Additional JARs for use with AWS S3 | |
| Additional JARs for use with Azure Blob Storage | |
Salesforce
Drivers are included in the standard package.
The required configuration is as follows:
<config class="com.ataccama.extension.salesforce.dqc.config.SalesforceContributor">
<salesforceConnections>
<connection name="SFServer" credentialsType="login" password="<password>" secretToken="<secret_token>" username="<user_name>" />
</salesforceConnections>
</config>