IBM Data Platform Ideas Portal for Customers


Use this portal to open public enhancement requests against products and services offered by the IBM Data Platform organization. To view all of your ideas submitted to IBM, create and manage groups of Ideas, or create an idea explicitly set to be either visible by all (public) or visible only to you and IBM (private), use the IBM Unified Ideas Portal (https://ideas.ibm.com).


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:


Search existing ideas

Start by searching and reviewing ideas and requests to enhance a product or service. Take a look at ideas others have posted, and add a comment, vote, or subscribe to updates on them if they matter to you. If you can't find what you are looking for, post a new idea.


Post your ideas

Post ideas and requests to enhance a product or service. Take a look at ideas others have posted and upvote them if they matter to you:

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea


Specific links you will want to bookmark for future use

Welcome to the IBM Ideas Portal (https://www.ibm.com/ideas) - Use this site to find additional information and details about the IBM Ideas process and statuses.

IBM Unified Ideas Portal (https://ideas.ibm.com) - Use this site to view all of your ideas, create new ideas for any IBM product, or search for ideas across all of IBM.

ideasibm@us.ibm.com - Use this email to suggest enhancements to the Ideas process or request help from IBM for submitting your Ideas.

IBM Employees should enter Ideas at https://ideas.ibm.com



Ideas

Showing 87

An enhancement to allow switching the character type handling method for retrieving multibyte data from char and varchar types in the ODBC connector by using an environment variable.

When retrieving multibyte data from char or varchar columns through the ODBC connector, DataStage currently requires the corresponding target column types to be set to nchar or nvarchar, depending on whether the source system handles the data base...
2 days ago in Information Server / DataStage · 0 · Submitted
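
As rough context for the kind of switch being requested, some client drivers already expose a per-connection character-handling toggle; the sketch below shows this with the pyodbc Python driver rather than the DataStage ODBC connector itself, and the DSN, credentials, table, and column names are placeholders.

```python
import pyodbc

# Placeholder DSN and credentials; the point is the character-decoding switch,
# not the connection details.
conn = pyodbc.connect("DSN=MYDSN;UID=user;PWD=secret")

# Decode CHAR/VARCHAR (SQL_CHAR) data as UTF-8 instead of the driver's default
# single-byte encoding, so multibyte text can be read without redefining the
# target columns as NCHAR/NVARCHAR.
conn.setdecoding(pyodbc.SQL_CHAR, encoding="utf-8")
conn.setdecoding(pyodbc.SQL_WCHAR, encoding="utf-16le")

cursor = conn.cursor()
for char_col, varchar_col in cursor.execute("SELECT char_col, varchar_col FROM some_table"):
    print(char_col, varchar_col)
```

An environment variable on the ODBC connector, as requested above, would expose a comparable toggle without changing job designs.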

The level of bundled open-source components should be clearly stated in the Release Notes or Technotes at the time of DataStage release. Additionally, the impact of differences in open-source components on DataStage should also be clearly indicated.

Currently, there is no mention of open-source components in the Release Notes (e.g. https://www.ibm.com/docs/ja/iis/11.7.0?topic=rn-infosphere-information-server-version-1171-fix-pack-5-releasenotes) or in the installation-related information prov...
4 months ago in Information Server / DataStage · 1 · Not under consideration

Schema Validation using Avro File Format under File Connector stage (AvroStreams)

Schema validation exists with the old Avro 1.x version but not with AvroStreams. Schema validation should always be available with AvroStreams, which would avoid the CLASSPATH setting.
about 1 month ago in Information Server / DataStage · 0 · Submitted
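
As a rough illustration of the validation being asked for, the sketch below checks a record against an Avro schema with the fastavro Python library; the schema and record are made up, and the File Connector/AvroStreams runtime is not modeled here.

```python
from fastavro.validation import validate

# Made-up schema and record, purely for illustration.
schema = {
    "type": "record",
    "name": "Customer",
    "fields": [
        {"name": "id", "type": "long"},
        {"name": "name", "type": "string"},
    ],
}

record = {"id": 42, "name": "Alice"}

# Raises fastavro.validation.ValidationError if the record does not conform
# to the schema; returns True when it does.
validate(record, schema)
```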

DataStage incorporates multiple open-source components. To ensure system stability and operational consistency, any behavioral changes resulting from updates to these bundled open-source components should be managed and absorbed by the DataStage installer. This approach prevents disruptions during system deployment and post-deployment operations.

Currently, there is no mention of open-source components in the Release Notes (e.g. https://www.ibm.com/docs/ja/iis/11.7.0?topic=rn-infosphere-information-server-version-1171-fix-pack-5-releasenotes) or in the installation-related information prov...
3 months ago in Information Server / DataStage · 0 · Under review

Add/enable connection from DataStage to Confluent Cloud topics with OAuth support for the schema registry.

We are trying to connect to a Confluent Cloud topic from on-prem DataStage with the Kafka connector used in jobs. To connect to the Confluent Cloud topic we select the OAUTHBEARER property in the Kafka connector. According to our use case, we are facing an issue while...
25 days ago in Information Server / DataStage · 0 · Submitted
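
For reference, a broker-side SASL/OAUTHBEARER (OIDC) setup of the kind the Kafka connector already supports might look like the confluent-kafka-python sketch below; the endpoints, client credentials, cluster and pool identifiers, and topic are placeholders. The schema registry OAuth piece that this idea asks for is not shown, since that is the missing part.

```python
from confluent_kafka import Producer

# Placeholder endpoints, credentials, and identifiers; the OIDC settings are
# standard librdkafka properties for SASL/OAUTHBEARER.
conf = {
    "bootstrap.servers": "pkc-xxxxx.region.provider.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "OAUTHBEARER",
    "sasl.oauthbearer.method": "oidc",
    "sasl.oauthbearer.client.id": "my-client-id",
    "sasl.oauthbearer.client.secret": "my-client-secret",
    "sasl.oauthbearer.token.endpoint.url": "https://idp.example.com/oauth2/token",
    # Confluent Cloud expects the logical cluster and identity pool as extensions.
    "sasl.oauthbearer.extensions": "logicalCluster=lkc-xxxxx,identityPoolId=pool-xxxxx",
}

producer = Producer(conf)
producer.produce("my-topic", value=b"hello")
producer.flush()
```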

Information Server 11.7.1.5 certify Teradata TTU v20

RBC currently has over 50 projects with more than 10,000 jobs running on Teradata v17.20. The Teradata DB is scheduled to be upgraded to v20 in January 2026. However, TTU v20 is not yet certified with IIS 11.7.1.5, and the minimum required IIS ver...
4 months ago in Information Server / DataStage · 0 · Planned for future release

Db2 LUW SQL Syntax Check

It would be good if Db2 LUW had an SQL and SQL PL syntax check engine. You can check queries for correct syntax by using the explain facility, but this doesn't work with DDL statements, functions, and stored procedures. An SQL syntax check engine would...
over 1 year ago in Db2 / Application Development · 0 · Future consideration
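
The explain-facility workaround mentioned above can be driven from a client, for example with the ibm_db Python driver as sketched below; the connection string and query are placeholders, the EXPLAIN tables are assumed to exist, and, as the idea notes, this only covers DML, not DDL, functions, or stored procedures.

```python
import ibm_db

# Placeholder connection string.
conn = ibm_db.connect(
    "DATABASE=SAMPLE;HOSTNAME=localhost;PORT=50000;UID=db2inst1;PWD=secret;", "", ""
)

# While EXPLAIN mode is active, dynamic DML is compiled (and therefore checked)
# but not executed. The EXPLAIN tables must already exist.
ibm_db.exec_immediate(conn, "SET CURRENT EXPLAIN MODE EXPLAIN")

try:
    ok = ibm_db.exec_immediate(conn, "SELECT nam FROM employee")  # 'nam' is a deliberate mistake
    if not ok:
        print("Compile failed:", ibm_db.stmt_errormsg())
except Exception as exc:
    print("Compile failed:", exc)

ibm_db.exec_immediate(conn, "SET CURRENT EXPLAIN MODE NO")
```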

DB2 Community Edition needs to support Ubuntu 24.04.3 LTS

I receive errors when running db2prereqcheck: ERROR : Requirement not matched. See cat -n of the output log file in Additional details ... specific errors: 46 ERROR : Requirement not matched. 114 ERROR : Requirement not matched. 182 ERROR : Require...
4 months ago in Db2 / Application Development · 0 · Submitted

Conversion of DECIMAL to CHAR

We are in the process of migrating data from Db2 z/OS to Db2 LUW. In many of our programs we convert DECIMAL data into CHAR, e.g. to display it on reports in a readable format. Unfortunately, Db2 LUW formats the data differently than Db2 z/OS. Let us ...
almost 2 years ago in Db2 / Application Development · 1 · Submitted
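
Until the platform difference is addressed, one possible workaround is to make the text format explicit yourself, either with a server-side format function such as VARCHAR_FORMAT where your Db2 level provides it, or on the client as in the ibm_db sketch below; the connection string, table, and column are placeholders.

```python
import ibm_db

# Placeholder connection string, table, and column.
conn = ibm_db.connect(
    "DATABASE=SAMPLE;HOSTNAME=localhost;PORT=50000;UID=db2inst1;PWD=secret;", "", ""
)

stmt = ibm_db.exec_immediate(conn, "SELECT amount FROM report_lines")
row = ibm_db.fetch_assoc(stmt)
while row:
    # Format the DECIMAL with an explicit picture (zero-padded width 12, two
    # decimal places) so the text is identical regardless of how CHAR(amount)
    # behaves on Db2 for z/OS versus Db2 LUW.
    amount = float(row["AMOUNT"])
    print(f"{amount:012.2f}")
    row = ibm_db.fetch_assoc(stmt)
```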

DB2 LUW: Add GnuCOBOL to supported compilers under Linux and Windows

Users of DB2 LUW regularly need to run their COBOL on LUW as well. DB2 LUW offers the necessary precompiler for EXEC SQL to provide that. Those users previously had to use Micro Focus (now called Rocket COBOL), but have a working alternative in G...
7 months ago in Db2 / Application Development · 1 · Under review