IBM Data and AI Ideas Portal for Customers


This portal is for opening public enhancement requests against products and services offered by the IBM Data & AI organization. To view all of your ideas submitted to IBM, create and manage groups of ideas, or create an idea explicitly set to be either visible by all (public) or visible only to you and IBM (private), use the IBM Unified Ideas Portal (https://ideas.ibm.com).


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:


Search existing ideas

Start by searching and reviewing ideas and requests to enhance a product or service. Take a look at ideas others have posted, and add a comment, vote, or subscribe to updates on them if they matter to you. If you can't find what you are looking for, post a new idea.


Post your ideas

Post ideas and requests to enhance a product or service. Take a look at ideas others have posted and upvote them if they matter to you:

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea


Specific links you will want to bookmark for future use

Welcome to the IBM Ideas Portal (https://www.ibm.com/ideas) - Use this site to find out additional information and details about the IBM Ideas process and statuses.

IBM Unified Ideas Portal (https://ideas.ibm.com) - Use this site to view all of your ideas, create new ideas for any IBM product, or search for ideas across all of IBM.

ideasibm@us.ibm.com - Use this email to suggest enhancements to the Ideas process or request help from IBM for submitting your Ideas.

IBM Employees should enter Ideas at https://ideas.ibm.com



DataStage

Showing 468 of 15091

Enhancing DataStage Monitoring and Debugging Capabilities

One of the critical aspects of monitoring, tuning, or debugging individual jobs or an entire DataStage project is to clearly understand which database system activities are triggered by which stage in a specific DataStage job within a particular D...
4 months ago in DataStage 1 Under review

Unable to call BigQuery stored procedure using BigQuery connector stage in DataStage 11.7

As part of the existing Teradata to BigQuery migration POC, we are planning to replace the Teradata Macros with BigQuery Stored Procedures and call the stored procedures from the BigQuery connector stage in DataStage 11.7. But the issue is we are...
4 months ago in DataStage 0 Under review

Delta Files from Datastage

Currently DataStage does not support delta files, so it cannot seamlessly integrate with or leverage the functionality supported by potential downstreams of an ETL tool like Fabric Warehouse/Databricks. For example, if DataStage wou...
4 months ago in DataStage 1 Under review

MFA for Datastage Connections

We are undergoing a security assessment, and per IA-2(1) for privileged access and IA-2(2) for non-privileged users (see below link and attachment for official documentation), we need to implement Multifactor Authentication for Privileged or Non-Pri...
4 months ago in DataStage 0 Submitted

When a DataStage job creates a dataset with a parameterized name, the dataset does not utilize the parameter value at runtime for naming.

When a dataset is created by a DataStage job, and the dataset's name is composed of a parameter, the dataset does not use the parameter's value at runtime to name it.
5 months ago in DataStage 0 Under review

Add business terms on DataStage flow

On IGC (11.x) to WKC (4.x) migration projects, clients ask to add business terms on DataStage NextGen flows, but this feature is not currently available in the new DataStage NextGen flow interface.
over 2 years ago in DataStage 0 Future consideration

[SAP Pack] ABAP Extract Stage's non-default ftp port setup support request

Hello, I want to use FTP, the data transfer method of the ABAP Extract Stage. There is no ability to specify the FTP port. The use of the default FTP port 21/tcp shows up as a vulnerability in security audits, so a different port must be used. The IBM case...
about 1 year ago in DataStage 0 Not under consideration

Require DataStage job interim run status in cpdctl dsjob jobinfo command output, as is available in the classic DataStage dsjob jobinfo option

Require DataStage job interim run status in the cpdctl dsjob jobinfo command output, as is available in the classic DataStage dsjob jobinfo option. This option is used to get the DataStage job interim status, which is useful when we are trying to captur...
over 1 year ago in DataStage 1 Functionality already exists

Integration of DataStage parameters and parameter sets in Pipelines

Parameters and parameter sets in Pipelines are absolutely necessary for designing dynamic data integrations with DataStage.
over 1 year ago in DataStage 1 Functionality already exists

Request to implement Auto Map functionality on Output in Hierarchical Stage

I'm using the Hierarchical Data stage to pull data via a REST API, then feed that through a JSON_Parser stage, and finally map it to the output columns on the Output tab. I know in the DataStage thick client there was an option to use the Auto M...
over 3 years ago in DataStage 4