IBM Data and AI Ideas Portal for Customers


This portal is for opening public enhancement requests against products and services offered by the IBM Data & AI organization. To view all of your ideas submitted to IBM, create and manage groups of ideas, or create an idea explicitly set to be either visible to all (public) or visible only to you and IBM (private), use the IBM Unified Ideas Portal (https://ideas.ibm.com).


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:


Search existing ideas

Start by searching and reviewing ideas and requests to enhance a product or service. Take a look at ideas others have posted, and add a comment, vote, or subscribe to updates on them if they matter to you. If you can't find what you are looking for, post a new idea.


Post your ideas

Post ideas and requests to enhance a product or service. Take a look at ideas others have posted and upvote them if they matter to you:

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea


Specific links you will want to bookmark for future use

Welcome to the IBM Ideas Portal (https://www.ibm.com/ideas) - Use this site to find out additional information and details about the IBM Ideas process and statuses.

IBM Unified Ideas Portal (https://ideas.ibm.com) - Use this site to view all of your ideas, create new ideas for any IBM product, or search for ideas across all of IBM.

ideasibm@us.ibm.com - Use this email to suggest enhancements to the Ideas process or request help from IBM for submitting your Ideas.

IBM Employees should enter Ideas at https://ideas.ibm.com



My subscriptions: DataStage

Showing 37 ideas

Git integration/versioning

I would like to be able to version my DataStage flows using a Git repository.
9 months ago in DataStage 0 Delivered

Add a feature for more frequent commits when using the DB2 connector

When loading large amounts of data into DB2 for Warehousing, we exceed the DB2 300 GB limitation, which means we need to load the data with more than just one commit. However, when using external tables in DataStage, the DB2 connector ignores the co...
about 3 years ago in DataStage 1 Delivered
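The idea above asks the connector to commit in intervals instead of one giant transaction. A minimal sketch of that commit-interval technique, assuming a generic DB-API style connection (e.g. ibm_db_dbi for DB2) — this is illustrative, not DataStage connector code, and the function and parameter names are hypothetical:

```python
# Sketch: commit every N rows so no single transaction holds enough
# DB2 transaction-log space to hit the size limit described above.
# `conn` is any DB-API connection; names here are illustrative only.

def load_with_interval_commits(conn, rows, insert_sql, commit_interval=10_000):
    """Insert rows, committing every `commit_interval` rows."""
    cur = conn.cursor()
    pending = 0
    for row in rows:
        cur.execute(insert_sql, row)
        pending += 1
        if pending >= commit_interval:
            conn.commit()          # release log space held by this batch
            pending = 0
    if pending:
        conn.commit()              # commit the final partial batch
```

Smaller intervals reduce log pressure but add commit overhead, so the interval would presumably need to be a tunable stage property.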

WHERE-Condition for DB2 Connector Stage

The DB2 connector stage should have a WHERE clause field to provide filter expressions to limit the number of rows when using "Generate SQL" = "Yes". There is a similar feature in the DRS Connector Stage.
almost 7 years ago in DataStage 0 Delivered
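The requested behaviour amounts to appending a user-supplied filter to the SQL the stage generates. A minimal sketch of that idea, assuming hypothetical names (this is not the actual connector's SQL generator):

```python
# Sketch: compose a generated SELECT from the column list, table name,
# and an optional user-supplied WHERE expression, so rows are filtered
# at the source rather than downstream in the job.

def generate_select(table, columns, where=None):
    sql = f"SELECT {', '.join(columns)} FROM {table}"
    if where:
        sql += f" WHERE {where}"   # user-supplied filter expression
    return sql
```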

User access management

A more granular split of user access on DataStage projects is necessary. A new role is needed that permits a user to only start a DataStage job, without the right to edit it. Currently, in CP4D version 4.6.5, when you build the access...
over 1 year ago in DataStage 1 Delivered

Kafka connector should be able to read only committed messages

Our Kafka producer writes messages to a Kafka topic transactionally and may roll back the transaction when certain events occur. As of now, the behaviour of the Kafka connector is to also read uncommitted messages (see case TS003289786)...
over 4 years ago in DataStage 1 Delivered
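The behaviour requested above maps to Kafka's consumer isolation level: `isolation.level=read_committed` is a real Kafka/librdkafka setting that makes a consumer skip messages from aborted transactions. A sketch of the config one would pass to, say, a confluent-kafka `Consumer` (broker address and group id are placeholders):

```python
# Sketch: consumer configuration for reading only committed messages.
# "isolation.level" is a standard Kafka consumer setting; the other
# values here are placeholder assumptions, not DataStage options.

consumer_config = {
    "bootstrap.servers": "broker:9092",   # placeholder broker address
    "group.id": "datastage-reader",       # placeholder consumer group
    "isolation.level": "read_committed",  # skip aborted-transaction messages
    "auto.offset.reset": "earliest",
}
# Consumer(consumer_config) would then never deliver rolled-back messages.
```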

DataStage connector for SalesForce should attempt reconnecting several times if the connection is temporarily interrupted

This refers to IBM InfoSphere DataStage v11.5! I could not find this exact product in the list. ETL jobs that connect to SalesForce to fetch data do not extract all available data reliably. Sometimes the connection is briefly interrupt...
about 8 years ago in DataStage 1 Delivered
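The reconnect behaviour this idea asks for is the standard retry-with-backoff pattern. A minimal sketch under stated assumptions — `fetch` stands in for any Salesforce extract call, and the exception type, attempt count, and delays are illustrative choices, not connector options:

```python
import time

def fetch_with_retries(fetch, attempts=5, base_delay=1.0):
    """Call fetch(), retrying with exponential backoff on ConnectionError."""
    for attempt in range(attempts):
        try:
            return fetch()
        except ConnectionError:
            if attempt == attempts - 1:
                raise                              # retries exhausted: surface the error
            time.sleep(base_delay * 2 ** attempt)  # back off: 1s, 2s, 4s, ...
```

Transient network blips are absorbed by the retries, while a persistent outage still fails the job with the original error.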

Datastage to Azure DataLake Storage (ADLS) connectivity

We need the ability to connect to Azure Data Lake Storage (ADLS) from DataStage to enhance and augment our enterprise data warehouse.
about 6 years ago in DataStage 0 Delivered

HDP YARN preemption support in BigIntegrate

Provide HDP YARN preemption support in Information Server BigIntegrate. The current behavior is that the ResourceManager would eventually kill the BigIntegrate container, and the job would ultimately fail, if one of our AM's containers was marked preempted. Aetna's standard is to have...
about 7 years ago in DataStage 0 Delivered

Support all data types in the Parquet/Avro/ORC file formats; we really need it for Parquet

As of today, the "date", "decimal/numeric", and "timestamp" data types in the Parquet file format are not supported by DataStage 11.5 or 11.7, which is required for a critical data migration project. And now we are converting the above data types to different format...
almost 5 years ago in DataStage 2 Delivered
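The workaround described above — converting unsupported logical types before writing — can be sketched as a small conversion step. This is illustrative Python, not DataStage code; the function name and string formats are assumptions:

```python
from datetime import date, datetime
from decimal import Decimal

# Sketch of the workaround: render date/decimal/timestamp values as
# strings before writing to Parquet when native logical types are not
# supported, preserving the information for conversion back on read.

def to_parquet_safe(value):
    """Render unsupported logical types as ISO/plain strings."""
    if isinstance(value, datetime):        # check datetime before date:
        return value.isoformat(sep=" ")    # datetime is a date subclass
    if isinstance(value, date):
        return value.isoformat()           # e.g. "2024-01-31"
    if isinstance(value, Decimal):
        return str(value)                  # preserve exact scale
    return value                           # other types pass through
```

The obvious cost of this workaround is that downstream consumers lose the real column types, which is why native support is being requested.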

Enhancement of DB2 Connector stage for table creation

Currently, when using the create table feature in the DB2 Connector stage (Table Action = 'Create' and Generate create statement at runtime = 'Yes'), it is not possible to specify other parameters for table creation, such as Organize by Row/Column, In ...
over 8 years ago in DataStage 0 Delivered