IBM Data and AI Ideas Portal for Customers

Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:

Post your ideas

Post ideas and requests to enhance a product or service. Take a look at ideas others have posted and upvote them if they matter to you:

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea

Help IBM prioritize your ideas and requests

The IBM team may need your help to refine an idea, so they may ask you for more information or feedback. The product management team will then decide whether they can begin working on it. If work can start during the next development cycle, the idea is put on the priority list. Each team at IBM works on its own schedule: some ideas can be implemented right away, while others may be scheduled for later.

Receive notification on the decision

Some ideas can be implemented at IBM, while others may not fit within the development plans for the product. In either case, the team will let you know as soon as possible. In some cases, we may be able to find alternatives for ideas which cannot be implemented in a reasonable time.

Additional Information

To view our roadmaps:

Reminder: This is not the place to submit defects or support needs; please use the normal support channels for those cases.

IBM Employees:

The correct URL for entering your ideas is:




Showing 58 ideas

Teradata Assets - Column analysed incorrectly as Primary Key

In Teradata, assets are defined with PRIMARY INDEX and UNIQUE PRIMARY INDEX. A column that is only a PRIMARY INDEX is not a primary key unless UNIQUE is specified.
about 3 years ago in Connectivity 1 Delivered
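The distinction behind this idea can be sketched in Teradata DDL; the table and column names below are illustrative only:

```sql
-- PRIMARY INDEX alone controls row distribution across AMPs;
-- it allows duplicate values, so it should not be catalogued
-- as a primary key.
CREATE TABLE sales_npi (
    order_id INTEGER,
    store_id INTEGER
) PRIMARY INDEX (order_id);

-- Only a UNIQUE PRIMARY INDEX enforces uniqueness and can
-- therefore stand in for a primary key.
CREATE TABLE sales_upi (
    order_id INTEGER,
    store_id INTEGER
) UNIQUE PRIMARY INDEX (order_id);
```

Metadata analysis should flag `order_id` as a primary-key candidate only in the second case.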

Adding functionality to BigQuery to report total rows modified after the AFTER-SQL executes.

We are loading Data into Google BigQuery via the BQ stage. We would like to know the total rows impacted after the stage completes.
about 1 year ago in Connectivity 0 Delivered
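One way this could surface, assuming the stage runs its after-SQL as a BigQuery script, is via BigQuery's `@@row_count` system variable, which reports the rows modified by the preceding DML statement (the dataset and table names here are hypothetical):

```sql
-- BigQuery script sketch: after a DML statement, @@row_count
-- holds the number of rows that statement modified.
UPDATE mydataset.orders
SET status = 'shipped'
WHERE status = 'packed';

SELECT @@row_count AS rows_modified;
```

Exposing a value like this in the stage output would give the total rows impacted after the load completes.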

Connectivity for Oracle 19c

Our organization is updating all of its Oracle DB instances to 19c, and Information Server is unable to support the connection to 19c. We would like to know about connectivity from Information Server 11.7.1 to Oracle 19c, as our developers ...
over 2 years ago in Connectivity 13 Delivered

Sending a data file larger than 10 MB to Cloud Object Storage via DataStage 11.7 currently fails

Here is the error we are receiving in the DataStage 11.7 log (PRDCEDPDSENG01.W3-969.IBM.COM): PutFile_COSBucket,0: Fatal Error: CDIER0410E: Error in step=REST, CDIER0905E: The REST step foun...
almost 4 years ago in Connectivity 1 Delivered

Greenplum connector in DataStage: CSV instead of TEXT for external tables

When a field (varchar/text) in the source query (Postgres) contains a carriage return, we cannot use the Greenplum connector of DataStage.
almost 3 years ago in Connectivity 7 Delivered

Kafka connector: JSON message with key value

The Kafka Connector stage does not support writing key values with messages to Kafka.
about 2 years ago in Connectivity 1 Delivered

Connection to MariaDB

I need to connect to MariaDB, so I configured odbc.ini, but after searching for and trying several libs, the connection was not possible. When I opened a case with IBM support, I was told that there is really no official support for MariaDB and I...
almost 3 years ago in Connectivity 0 Delivered

Google Cloud Storage bucket region

We would like to use the Google Cloud Storage stage to create a bucket in our environment. Unfortunately, it supports only the US region.
over 1 year ago in Connectivity 0 Delivered

The REST step found a large input data item whose actual data size exceeds the supported maximum data size li...

The DataStage job is not able to import a file of size 350 MB using the XML hierarchical stage. When we tried to import a smaller file, the response from the REST API call was Success, whereas in this case the REST API response is Fail. In this case ...
over 4 years ago in Connectivity 3 Delivered

Oracle connector support for AWS RDS

A lot of our source systems are moving to AWS RDS for Oracle. We use DataStage 11.7 to read from these source systems and load our data warehouse, which is also an RDS Oracle database.
over 1 year ago in Connectivity 1 Delivered