IBM Data and AI Ideas Portal for Customers


This portal is for opening public enhancement requests against products and services offered by the IBM Data & AI organization. To view all of your ideas submitted to IBM, create and manage groups of ideas, or create an idea explicitly set to be either visible by all (public) or visible only to you and IBM (private), use the IBM Unified Ideas Portal (https://ideas.ibm.com).


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:


Search existing ideas

Start by searching and reviewing ideas and requests to enhance a product or service. Take a look at ideas others have posted, and add a comment, vote, or subscribe to updates on them if they matter to you. If you can't find what you are looking for, post a new idea.


Post your ideas

Post ideas and requests to enhance a product or service. Take a look at ideas others have posted and upvote them if they matter to you:

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea


Specific links you will want to bookmark for future use

Welcome to the IBM Ideas Portal (https://www.ibm.com/ideas) - Use this site to find out additional information and details about the IBM Ideas process and statuses.

IBM Unified Ideas Portal (https://ideas.ibm.com) - Use this site to view all of your ideas, create new ideas for any IBM product, or search for ideas across all of IBM.

ideasibm@us.ibm.com - Use this email to suggest enhancements to the Ideas process or request help from IBM for submitting your Ideas.

IBM Employees should enter Ideas at https://ideas.ibm.com



Replication: Change Data Capture (showing 351 of 15496 ideas)

Enhance User ID and Password encryption algorithm

Please enhance the encryption algorithm used for the User ID and Password to AES 128-bit or higher, both during authentication and when stored in the system.
over 7 years ago in Replication: Change Data Capture | 0 | Not under consideration
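
As a rough illustration of what "AES 128-bit or higher" could look like for stored credentials, the sketch below uses AES-256-GCM from the Python cryptography package. It is a generic example only; the key handling and function names are hypothetical and do not describe how CDC or Access Server actually store credentials.

```python
# Generic sketch: AES-256-GCM encryption of a stored credential.
# Hypothetical example; not how CDC/Access Server handle their metadata.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_credential(key: bytes, plaintext: str) -> bytes:
    """Encrypt with AES-256-GCM; the 12-byte random nonce is prepended to the ciphertext."""
    nonce = os.urandom(12)
    return nonce + AESGCM(key).encrypt(nonce, plaintext.encode("utf-8"), None)

def decrypt_credential(key: bytes, blob: bytes) -> str:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None).decode("utf-8")

key = AESGCM.generate_key(bit_length=256)   # 256-bit key, above the requested AES-128 minimum
blob = encrypt_credential(key, "s3cret-p4ss")
assert decrypt_credential(key, blob) == "s3cret-p4ss"
```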

IBM-supported schema registry component in the CDC replication solution

When using IBM's CDC to write to Kafka, there is a dependency on utilising the Confluent Schema Registry. To move this to production, regulation requires a support contract, thus requiring a license on the Confluent platform. Remove the tight coupling to Co...
over 7 years ago in Replication: Change Data Capture | 0 | Not under consideration
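
For context, the coupling mentioned above is typically the registry's REST interface that Kafka producers call when registering Avro schemas. The sketch below shows that interaction against a Confluent-compatible registry; the URL, subject name, and schema are placeholders and are not part of the CDC product.

```python
# Hypothetical sketch: registering an Avro schema with a Confluent-compatible
# schema registry over its REST API. URL, subject, and schema are placeholders.
import json
import requests

REGISTRY_URL = "http://schema-registry:8081"   # placeholder endpoint
SUBJECT = "cdc.customers-value"                # placeholder subject name

avro_schema = {
    "type": "record",
    "name": "Customer",
    "fields": [
        {"name": "id", "type": "long"},
        {"name": "name", "type": "string"},
    ],
}

resp = requests.post(
    f"{REGISTRY_URL}/subjects/{SUBJECT}/versions",
    headers={"Content-Type": "application/vnd.schemaregistry.v1+json"},
    data=json.dumps({"schema": json.dumps(avro_schema)}),
)
resp.raise_for_status()
print("registered schema id:", resp.json()["id"])
```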

Ability for the IBM CDC tool to load data from S3 to Snowflake

IBM CDC is able to replicate from DB2 z/OS to S3 (which can then be uploaded to Snowflake via Snowpipe). It can also replicate to Snowflake tables, but the performance can be very slow. DB2 z/OS to S3 performs well, but we need the tool to take S3 and...
over 2 years ago in Replication: Change Data Capture | 1 | Future consideration
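
By way of illustration, the S3-to-Snowflake leg described above is usually handled on the Snowflake side with an external stage plus a Snowpipe (or a plain COPY INTO). The sketch below shows that setup through the Python connector; every object name, credential, and file format is a placeholder, and the notification wiring for AUTO_INGEST is assumed to already exist.

```python
# Hypothetical sketch: auto-ingesting CDC files from S3 into Snowflake via a
# stage and a Snowpipe. All names and credentials are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="<warehouse>", database="<database>", schema="<schema>",
)
cur = conn.cursor()

# External stage pointing at the S3 prefix that CDC writes to.
cur.execute("""
    CREATE OR REPLACE STAGE cdc_s3_stage
      URL = 's3://<bucket>/cdc/'
      CREDENTIALS = (AWS_KEY_ID = '<aws-key-id>' AWS_SECRET_KEY = '<aws-secret-key>')
      FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"')
""")

# Snowpipe that loads new files from the stage into the target table.
cur.execute("""
    CREATE OR REPLACE PIPE cdc_pipe AUTO_INGEST = TRUE AS
      COPY INTO customers FROM @cdc_s3_stage
""")
```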

Poor handling of the PostgreSQL timestamp format by the CDC parser

Cf. CASE TS009333426. PostgreSQL supports the full set of SQL date and time types, shown in the table at https://www.postgresql.org/docs/11/datatype-datetime.html (columns: Name, Storage Size, Description, Low Value, High Value, Resolution); timestamp [ ( p ...
over 2 years ago in Replication: Change Data Capture | 1 | Future consideration

CDC for z/OS should use the same access control module as DB2 for z/OS.

CDC for z/OS can use only the Db2 ACM exit named DSNX@XAC, made accessible to CDC via the STEPLIB concatenation. However, starting with DB2 for z/OS v10, customers can use a member name other than DSNX@XAC for the access control exit routine (zPARM ACCESS...
about 5 years ago in Replication: Change Data Capture | 2 | Planned for future release

Oracle fastload Refresh method does not do intermediate commits

When doing a Refresh to Oracle using the fastload method, no intermediate commits are done. That means that all records are loaded to the target table before committing the transaction (and the fastload_refresh_commit_after_max_operations paramete...
about 5 years ago in Replication: Change Data Capture | 0 | Not under consideration
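
For illustration, the behaviour requested here is essentially a batched commit during the bulk load. The sketch below shows that pattern against a generic DB-API connection; the table, column, and parameter names are hypothetical, and this is not the actual fastload implementation.

```python
# Generic sketch: commit every N operations during a bulk refresh instead of a
# single commit at the end. Hypothetical; not the CDC fastload code path.
def refresh_with_intermediate_commits(conn, rows, commit_after_max_operations=10_000):
    cur = conn.cursor()
    pending = 0
    for row in rows:
        cur.execute("INSERT INTO target_table (id, payload) VALUES (?, ?)", row)
        pending += 1
        if pending >= commit_after_max_operations:
            conn.commit()      # intermediate commit: releases undo/lock resources early
            pending = 0
    conn.commit()              # final commit for any remaining rows
```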

Graceful recovery from any error with /tmp in install of CDC engine on Linux/Unix

Currently the installer scripts for CDC on Linux and Unix use /tmp by default for unpacking objects deposited on the server at the start of the install process. The installer deals gracefully with the case that /tmp does not have enough free stor...
about 5 years ago in Replication: Change Data Capture | 0 | Not under consideration
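
As a generic illustration of the requested behaviour, an installer can honour a temporary-directory override and verify free space before it starts unpacking, failing with an actionable message otherwise. The sketch below uses only the Python standard library; the required size and environment variable are assumptions, and this is not the actual installer logic.

```python
# Generic sketch: pre-flight check before unpacking installer payloads.
# Honours a TMPDIR override and verifies free space. Not the CDC installer.
import os
import shutil
import sys

REQUIRED_BYTES = 2 * 1024**3                 # assumed payload size (placeholder)

def choose_unpack_dir() -> str:
    return os.environ.get("TMPDIR", "/tmp")  # allow overriding the /tmp default

def check_free_space(path: str, required: int) -> None:
    free = shutil.disk_usage(path).free
    if free < required:
        sys.exit(f"Not enough free space in {path}: {free} bytes available, "
                 f"{required} required. Set TMPDIR to a larger filesystem and retry.")

unpack_dir = choose_unpack_dir()
check_free_space(unpack_dir, REQUIRED_BYTES)
```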

IDR parallel table refresh

To improve the initialization phase, especially for data lake projects (Kafka agents, HDFS agent), it would be very useful to be able to execute table refreshes in parallel rather than sequentially within the same subscription.
about 5 years ago in Replication: Change Data Capture | 0 | Not under consideration
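
To illustrate the request, an initial load that currently runs table by table would instead fan the per-table refreshes out across a bounded worker pool. The sketch below is hypothetical; refresh_table() is a placeholder, not a CDC or IDR API.

```python
# Hypothetical sketch: refreshing the tables of one subscription in parallel
# with a bounded worker pool. refresh_table() is a placeholder, not a CDC API.
from concurrent.futures import ThreadPoolExecutor, as_completed

def refresh_table(subscription: str, table: str) -> str:
    ...  # placeholder: perform the initial load of a single table
    return table

def refresh_subscription(subscription: str, tables: list[str], parallelism: int = 4) -> None:
    with ThreadPoolExecutor(max_workers=parallelism) as pool:
        futures = [pool.submit(refresh_table, subscription, t) for t in tables]
        for fut in as_completed(futures):
            print(f"refresh finished for {fut.result()}")

refresh_subscription("DATALAKE_SUB", ["CUSTOMERS", "ORDERS", "PAYMENTS"])
```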

Stronger encryption for usernames and passwords in CDC metadata

Currently the usernames and passwords are not strongly encrypted in the CDC metadata or in the Access Server files. Many customers in secure environments would like stronger encryption to be used.
about 5 years ago in Replication: Change Data Capture | 0 | Not under consideration

Filter using comparison with Unicode character field

The handling of multibyte character set data has changed with V11.3.3, and some features which worked fine with V6.1 are no longer working correctly. In the present case, the customer has row filtering based on a comparison of a column with a string...
about 5 years ago in Replication: Change Data Capture | 0 | Planned for future release