Before installing and running Oracle GoldenGate for Java, you must install Java (JDK or JRE) version 1.8 or later.
Installation of Oracle GoldenGate for Big Data:
- Create an installation directory that has no spaces in its name. Then extract the ZIP file into this new installation directory. For example:
Shell> mkdir installation_directory
Shell> cp path/to/installation_zip installation_directory
Shell> cd installation_directory
Shell> unzip installation_zip
If you are on Linux or UNIX, run:
Shell> tar -xf installation_tar
- Stay in the installation directory and bring up GGSCI to create the remaining subdirectories in the installation location.
Shell> ggsci
GGSCI> CREATE SUBDIRS
- Create a Manager parameter file:
GGSCI> EDIT PARAM MGR
- Specify a port for the Manager to listen on by using the editor to add a line to the Manager parameter file. For example:
PORT 7801
- If you are on Windows and running Manager as a service, set the system variable PATH to include jvm.dll, then delete the Manager service and re-add it.
- Go to GGSCI, start the Manager, and check to see that it is running:
GGSCI> START MGR
GGSCI> INFO MGR
Note:
To check for environmental variable problems locating the JVM at runtime:
Add the parameter GETENV(PATH) for Windows or GETENV(LD_LIBRARY_PATH) for UNIX to the Replicat parameter file.
Start the Replicat process
Check the output for the report using the GGSCI command: SEND REPLICAT group_name REPORT
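For example, on UNIX the top of the Replicat parameter file might include the following (the group name rkafka here is just an illustration):

```
REPLICAT rkafka
-- Write the JVM library search path into the report file for debugging
GETENV (LD_LIBRARY_PATH)
```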
Having installed OGG for Big Data, we need to set up the Kafka adapter. As for other adapters,
we copy the configuration files from the $OGG_HOME/AdapterExamples/big-data directory.
We need to adjust our kafka.props file to define the Kafka/ZooKeeper topics for data and
schema changes (the TopicName and SchemaTopicName parameters), and the gg.classpath for the Kafka and Avro Java classes.
Configuring Oracle GoldenGate to send transactions to the Connect API in Kafka:
Kafka Adapter:
Three files are important for the Kafka adapter:
1. custom_kafka_producer.properties
2. kafka.props
3. rkafka.prm
The above three files need to be placed in /u05/ggadapter/dirprm.
1. Kafka Connect settings: Edit the existing /u05/ggadapter/dirprm/custom_kafka_producer.properties.
custom_kafka_producer.properties:
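A minimal sketch of what this file typically contains; the broker address localhost:9092 and the batching values are assumptions to adapt to your environment:

```properties
bootstrap.servers=localhost:9092
acks=1
compression.type=gzip
reconnect.backoff.ms=1000
value.serializer=org.apache.kafka.common.serialization.ByteArraySerializer
key.serializer=org.apache.kafka.common.serialization.ByteArraySerializer
# Trade latency for throughput by batching sends
batch.size=102400
linger.ms=10000
```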
2. Handler configuration: Edit the existing /u05/ggadapter/dirprm/kafka.props and amend gg.classpath as shown below.
kafka.props:
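A sketch of the handler properties, based on the sample shipped in AdapterExamples/big-data/kafka; the topic names and the Kafka install path are placeholders to adjust:

```properties
gg.handlerlist=kafkahandler
gg.handler.kafkahandler.type=kafka
gg.handler.kafkahandler.KafkaProducerConfigFile=custom_kafka_producer.properties
# Topics for data and schema changes
gg.handler.kafkahandler.TopicName=oggtopic
gg.handler.kafkahandler.SchemaTopicName=oggschematopic
gg.handler.kafkahandler.format=avro_op
gg.handler.kafkahandler.mode=tx
# Must point at the Kafka client and Avro jars on this host
gg.classpath=dirprm/:/opt/kafka/libs/*
```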
3. Replicat parameters: Create /u05/ggadapter/dirprm/rkafka.prm with the following contents:
rkafka.prm:
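A sketch of the Replicat parameter file; the schema name is an assumption, and on Windows the library would be ggjava.dll rather than libggjava.so:

```
REPLICAT rkafka
-- Hand all trail data to the Java adapter configured in kafka.props
TARGETDB LIBFILE libggjava.so SET property=dirprm/kafka.props
SOURCEDEFS dirdef/source.def
REPORTCOUNT EVERY 1 MINUTES, RATE
GROUPTRANSOPS 1000
MAP myschema.*, TARGET myschema.*;
```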
Our source is a table in Oracle, and our destination is of a different type (topics in Kafka), so we need to configure a SOURCEDEF file.
Below are the steps to create the SOURCEDEF file:
Create the SOURCEDEF file using the GoldenGate DEFGEN utility.
Data definitions are needed when the source and target tables have different definitions or the databases are of different types.
Perform the steps below on the SOURCE database from which you want to obtain metadata definitions.
From the Oracle GoldenGate directory, run GGSCI.
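The steps above can be sketched as follows; the schema, table, and credentials are placeholders. First create a DEFGEN parameter file (for example with GGSCI> EDIT PARAM defgen):

```
-- dirprm/defgen.prm
DEFSFILE ./dirdef/source.def, PURGE
USERID gguser, PASSWORD password
TABLE myschema.test;
```

Then run DEFGEN from the GoldenGate directory and copy the generated source.def to the target's dirdef directory:

Shell> ./defgen paramfile dirprm/defgen.prm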
Testing the Replication:
Connect to Oracle and insert a row, not forgetting to commit the transaction.
Now if you list the topics defined within Kafka, you should see a new one has been created, for the elastic_test table:
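To list the topics you can use the script shipped with Kafka (the ZooKeeper address is an assumption for a local install):

```
Shell> bin/kafka-topics.sh --list --zookeeper localhost:2181
```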
If we don't want to replicate all of a table's columns to the destination, we can filter them in the parameter file by specifying the column names.
example:
In the example below we transfer only the columns test_id and test_name, and do not send test_name_ar to the destination:
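A sketch of such a clause, assuming the table is myschema.test; with COLS only the listed columns are captured, while COLSEXCEPT would instead name the columns to drop:

```
TABLE myschema.test, COLS (test_id, test_name);
```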