IBM Data and AI Ideas Portal for Customers


This portal is for opening public enhancement requests against products and services offered by the IBM Data & AI organization. To view all of your ideas submitted to IBM, create and manage groups of ideas, or create an idea explicitly set to be either visible to all (public) or visible only to you and IBM (private), use the IBM Unified Ideas Portal (https://ideas.ibm.com).


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:


Search existing ideas

Start by searching and reviewing ideas and requests to enhance a product or service. Take a look at ideas others have posted, and add a comment, vote, or subscribe to updates on them if they matter to you. If you can't find what you are looking for, post a new idea.


Post your ideas

Post ideas and requests to enhance a product or service. Take a look at ideas others have posted and upvote them if they matter to you:

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea


Specific links you will want to bookmark for future use

Welcome to the IBM Ideas Portal (https://www.ibm.com/ideas) - Use this site to find out additional information and details about the IBM Ideas process and statuses.

IBM Unified Ideas Portal (https://ideas.ibm.com) - Use this site to view all of your ideas, create new ideas for any IBM product, or search for ideas across all of IBM.

ideasibm@us.ibm.com - Use this email to suggest enhancements to the Ideas process or request help from IBM for submitting your Ideas.

IBM Employees should enter Ideas at https://ideas.ibm.com



DataStage

Showing 10 of 15558

In Pipelines' DataStage nodes using Parameter Sets, add an option to use a paramfile

In some scenarios, the values of the DataStage parameters in a Parameter Set must be loaded from a file at runtime. This option is possible through DataStage paramfiles. However, when you're running DataStage jobs from Pipelines using a DataStage ...
1 day ago in Cloud Pak for Data / DataStage 0 Submitted
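For context, a paramfile in this sense is conventionally a plain-text file of name=value pairs read at job start. A minimal sketch of the idea — parsing such a file and overlaying it on Parameter Set defaults — follows; the file name, parameter names, and format details are hypothetical illustrations, not DataStage-defined identifiers:

```python
# Sketch of runtime parameter loading from a name=value file.
# All names and the file format here are hypothetical.

def load_paramfile(path):
    """Read name=value pairs, ignoring blank lines and # comments."""
    params = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            name, _, value = line.partition("=")
            params[name.strip()] = value.strip()
    return params

# Write a tiny example paramfile so the sketch is self-contained.
with open("batch.paramfile", "w") as fh:
    fh.write("# nightly overrides\nDB_SCHEMA = PROD\nBATCH_DATE = 2024-01-01\n")

# Parameter Set defaults, overridden by whatever the paramfile supplies.
defaults = {"DB_SCHEMA": "DEV", "BATCH_DATE": "1970-01-01"}
runtime = {**defaults, **load_paramfile("batch.paramfile")}
# runtime == {"DB_SCHEMA": "PROD", "BATCH_DATE": "2024-01-01"}
```

The request is essentially for the Pipelines DataStage node to expose this file-driven override step as a built-in option rather than requiring each parameter value to be wired individually.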

Need HTTP proxy support on the Google connectors.

Ford intends to switch its authentication method to Workload Identity Federation (WIF) with a token URL. Our CP4D runs behind a firewall that prevents internet access, so the communication has to be done via a proxy, and we currently have an HTTP proxy,...
about 1 month ago in Cloud Pak for Data / DataStage 0 Submitted

CP4D - Cloudera Impala Connector - Truststore/SSL

In Legacy DataStage, we connect to Hive/Impala using a cacerts file defined at the server level. We just pass the location of the cacerts, and DataStage is intelligent enough to use that cacerts file and establish the connectivity. But in CP4D we d...
about 1 month ago in Cloud Pak for Data / DataStage 0 Submitted

Automation of code (UNIX scripts) deployment to storage volumes for CPD DataStage service

This is for automation of code deployment, especially UNIX scripts used for ETL batch processing by the DataStage service on the CPD platform. The DataStage service needs storage volumes for processing files which are accessible by both conductor & compute PODs on CP...
about 2 months ago in Cloud Pak for Data / DataStage 0 Under review

Ability to set up a Maintenance Mode to prevent users from logging in and executing new jobs.

Hi Team, as the Admin team has the responsibility of maintaining the CPD environment for DataStage, we wanted to check if we can enable a capability to add a Maintenance Mode to prevent users from logging in and initiating any new jobs. We need this capab...
29 days ago in Cloud Pak for Data / DataStage 0 Under review

Need Postgres connector to not use positional binding

We have tested positional binding for one use case and want to test it for multiple scenarios; it would be better if IBM could make the connector independent of positional binding.
about 1 month ago in Cloud Pak for Data / DataStage 1 Not under consideration

Open a Pipeline or DataStage Flow from the Pipeline Canvas with 2 clicks only

For Pipeline Job or DataStage Job nodes in the canvas when editing a Pipeline, if you want to open the Pipeline or DataStage Flow that's being executed, you must click 4 times: double-click the node (or right-click/menu then Open), click the menu ...
4 months ago in Cloud Pak for Data / DataStage 0 Under review

Folder in Watson Studio for DataStage files

InfoSphere DataStage users who transition to NextGen DataStage find it very messy if they cannot organize DataStage-related files in folders.
6 months ago in Cloud Pak for Data / DataStage 0 Under review

Job Details view window limitations

There are actually two problems regarding this Job Details view: 1. There is no easy navigation from a given design view of a flow or pipeline to the corresponding Job Details window. 2. There is no way back from this Job Details window into t...
7 months ago in Cloud Pak for Data / DataStage 0 Under review

Allow using “GCS Staging” when reading from Google BigQuery using read-method "General"

When using the Google BigQuery connector in read-method "Select Statement" it is possible to configure a bucket (Google Cloud Storage = GCS) for better performance. When using the read-method "General" such a configuration is not available. It wou...
8 months ago in Cloud Pak for Data / DataStage 0 Under review