IBM Data and AI Ideas Portal for Customers

This portal is for opening public enhancement requests against products and services offered by the IBM Data & AI organization. To view all of your ideas submitted to IBM, create and manage groups of ideas, or create an idea explicitly set to be either visible by all (public) or visible only to you and IBM (private), use the IBM Unified Ideas Portal.

Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:

Search existing ideas

Start by searching and reviewing ideas and requests to enhance a product or service. Take a look at ideas others have posted, and add a comment, vote, or subscribe to updates on them if they matter to you. If you can't find what you are looking for, post a new idea.

Post your ideas

Post ideas and requests to enhance a product or service. Take a look at ideas others have posted and upvote them if they matter to you.

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea

Specific links you will want to bookmark for future use

Welcome to the IBM Ideas Portal - Use this site to find out additional information and details about the IBM Ideas process and statuses.

IBM Unified Ideas Portal - Use this site to view all of your ideas, create new ideas for any IBM product, or search for ideas across all of IBM.

Use this email to suggest enhancements to the Ideas process or request help from IBM for submitting your ideas.

IBM Employees should enter Ideas at



Showing 21 ideas

Jobs API - have synchronous call to launch Datastage jobs

One of our customers (CNP in France) is evaluating CP4D DataStage. They have DataStage 11.5 and are currently migrating to 11.7. They have a framework (BASIC code that orchestrates the DataStage jobs) that is the centerpiece of their DataStage proje...
8 months ago in DataStage 0 Submitted
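Until a truly synchronous launch call exists, the usual workaround is to launch the job and poll its status until it reaches a terminal state. The sketch below is a minimal, generic version of that pattern: `launch` and `get_status` are caller-supplied callables wrapping whatever Jobs API client is in use, and the status names are illustrative, not the actual CP4D SDK or its run states.

```python
import time

def run_job_sync(launch, get_status, poll_secs=1.0, timeout_secs=600):
    """Launch a job and block until it reaches a terminal state.

    `launch` returns a run id; `get_status(run_id)` returns the current
    run state. Terminal state names here are assumptions, not the real
    CP4D Jobs API vocabulary.
    """
    run_id = launch()
    deadline = time.monotonic() + timeout_secs
    while time.monotonic() < deadline:
        status = get_status(run_id)
        if status in ("Completed", "Failed", "Canceled"):
            return status
        time.sleep(poll_secs)  # avoid hammering the API between checks
    raise TimeoutError(f"job run {run_id} did not finish in {timeout_secs}s")
```

The `cpdctl dsjob run` command is reported to accept a wait option that behaves similarly; check the current cpdctl documentation before relying on it.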

Request to implement Folders for Jobs, similar to what has been implemented for Assets

While a folder structure for many assets has been implemented with 4.8.2, it is missing for jobs. This is particularly important in a large environment with many thousands of jobs running daily. In legacy DataStage 11.7 the...
3 months ago in DataStage 0 Submitted

CP4D - Would like to be able to delete multiple jobs at the same time in the GUI

While the asset browser now allows selection of multiple assets for deletion in most places, the "job" view has no such selection and jobs must be selected one at a time and then deleted. The GUI should offer the same type of selection boxes for j...
3 months ago in DataStage 0 Submitted
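Until multi-select lands in the GUI, bulk deletion can be scripted around the single-job call. The sketch below is a hypothetical helper, not an existing CP4D API: `delete_one` stands in for whatever per-job deletion call (CLI or REST) is available, and failures are collected rather than aborting the whole batch.

```python
def delete_jobs(job_ids, delete_one):
    """Delete many jobs in one pass, collecting per-job failures
    instead of stopping at the first error.

    `delete_one` wraps the single-job deletion call; its name and
    signature are illustrative assumptions.
    """
    failures = {}
    for job_id in job_ids:
        try:
            delete_one(job_id)
        except Exception as exc:  # keep going; report failures at the end
            failures[job_id] = str(exc)
    return failures
```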

MDM Connector for CP4D-DataStage

An MDM connector would use API calls directly integrated with the MDM server, resulting in better performance when processing large data volumes. The alternate method for an MDM connector is to use a web service call, which involves XML parsing or using ...
8 months ago in DataStage 0 Submitted

PostgreSQL does not support Reject Link in DataStage on CP4D.

PostgreSQL does not support Reject Link in DataStage on CP4D.
3 months ago in DataStage 0 Submitted

Detailed Log Files

One of our customers wanted to be able to track from the logs which IP address transactions were made from, but we found that this is not available. This could be very useful, because if a username is used by more than one user, it is not enough to se...
about 5 hours ago in DataStage 0 Submitted
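The requested enrichment amounts to adding a client-IP field next to the username in each log record, so that two sessions sharing one username stay distinguishable. A minimal sketch using Python's standard `logging` module, assuming the field names `user` and `client_ip` (illustrative, not an existing DataStage log schema):

```python
import logging

# Formatter that places the client IP alongside the username in every
# line; field names here are assumptions for illustration only.
formatter = logging.Formatter(
    "%(asctime)s %(levelname)s user=%(user)s ip=%(client_ip)s %(message)s"
)
handler = logging.StreamHandler()
handler.setFormatter(formatter)
log = logging.getLogger("tx")
log.addHandler(handler)
log.setLevel(logging.INFO)

# Per-record context is supplied via `extra`, so each transaction line
# carries the session's source address.
log.info("transaction committed",
         extra={"user": "etl_user", "client_ip": "10.0.0.12"})
```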

BigQuery connector uses the legacy API by default, which adds 90 minutes of wait time for the streaming buffer to be flushed before DML operations can be performed. We need functionality in the BQ connector to use the Storage Write API to avoid this wait time

If a BQ job abends for some reason (due to a connectivity issue, etc.), the application team has to wait 90 minutes before restarting the job a second time. If this issue occurs in a critical flow, it causes an SLA miss, which has business impact.
6 days ago in DataStage 0 Submitted

create a command allowing to update parameters inside a datastage flow or pipeline in a project

We are requesting an option for the cpdctl dsjob command (a suggestion) that would allow changing the value of a parameter inside a DataStage flow or pipeline (not to be confused with a Parameter, Parameter Set, or environment variable). After migration f...
8 days ago in DataStage 0 Submitted
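In the meantime, a flow's parameter value can be patched by exporting the flow definition, editing the JSON, and re-importing it. The sketch below assumes a simplified `"parameters"` layout for illustration; the real export schema produced by cpdctl may differ, so the lookup path would need adapting.

```python
import json

def set_flow_parameter(flow_json: str, name: str, value: str) -> str:
    """Rewrite the value of a named parameter in an exported flow
    definition.

    The {"parameters": [{"name": ..., "value": ...}]} shape is an
    assumption for this sketch, not the actual DataStage export format.
    """
    flow = json.loads(flow_json)
    for param in flow.get("parameters", []):
        if param.get("name") == name:
            param["value"] = value
            break
    else:
        raise KeyError(f"parameter {name!r} not found in flow")
    return json.dumps(flow)
```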

Unable to call BigQuery stored procedure using BigQuery connector stage in DataStage 11.7

As part of the existing Teradata-to-BigQuery migration POC, we are planning to replace the Teradata macros with BigQuery stored procedures and call those stored procedures from the BigQuery connector stage in DataStage 11.7. But the issue is we are...
15 days ago in DataStage 0 Submitted

Salesforce Connector to support Salesforce API 60 and after June 2024 API version 61

Salesforce regularly updates its API versions, but the IBM DataStage Salesforce connector constantly falls behind. We need a connector that supports API v60 as soon as possible.
17 days ago in DataStage 0 Submitted