IBM Data Platform Ideas Portal for Customers


Use this portal to open public enhancement requests against products and services offered by the IBM Data Platform organization. To view all of the ideas you have submitted to IBM, to create and manage groups of ideas, or to create an idea explicitly set to be either visible to all (public) or visible only to you and IBM (private), use the IBM Unified Ideas Portal (https://ideas.ibm.com).


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:


Search existing ideas

Start by searching and reviewing ideas and requests to enhance a product or service. Take a look at ideas others have posted, and add a comment, vote, or subscribe to updates on them if they matter to you. If you can't find what you are looking for, post a new idea.


Post your ideas

Post ideas and requests to enhance a product or service. Take a look at ideas others have posted and upvote them if they matter to you:

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea


Specific links you will want to bookmark for future use

Welcome to the IBM Ideas Portal (https://www.ibm.com/ideas) - Use this site to find out additional information and details about the IBM Ideas process and statuses.

IBM Unified Ideas Portal (https://ideas.ibm.com) - Use this site to view all of your ideas, create new ideas for any IBM product, or search for ideas across all of IBM.

ideasibm@us.ibm.com - Use this email to suggest enhancements to the Ideas process or request help from IBM for submitting your Ideas.

IBM Employees should enter Ideas at https://ideas.ibm.com




Ideas

Showing 304 of 18234

An enhancement to allow switching the character type handling method for retrieving multibyte data from char and varchar types in the ODBC connector by using an environment variable.

When retrieving multibyte data from char or varchar columns through the ODBC connector, DataStage currently requires the corresponding target column types to be set to nchar or nvarchar, depending on whether the source system handles the data base...
2 days ago in Information Server  / DataStage 0 Submitted
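A minimal sketch of the requested behavior, assuming a hypothetical environment variable (DS_ODBC_MULTIBYTE_AS_CHAR below is an invented name; no such setting exists in the ODBC connector today) that controls whether multibyte char/varchar data can keep its original target types:

    import os

    # Hypothetical toggle: the variable name and values are illustrative only.
    MULTIBYTE_AS_CHAR = os.environ.get("DS_ODBC_MULTIBYTE_AS_CHAR", "0") == "1"

    def target_column_type(source_type: str) -> str:
        """Choose the target column type for a char/varchar source column."""
        if MULTIBYTE_AS_CHAR:
            # Requested behavior: keep char/varchar even for multibyte data.
            return source_type
        # Current behavior: multibyte data requires the wide (n*) types.
        return {"char": "nchar", "varchar": "nvarchar"}.get(source_type, source_type)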

The version levels of bundled open-source components should be clearly stated in the Release Notes or Technotes at the time of a DataStage release. Additionally, the impact of differences in open-source components on DataStage should also be clearly indicated.

Currently, there is no mention of open-source components in the Release Notes (e.g. https://www.ibm.com/docs/ja/iis/11.7.0?topic=rn-infosphere-information-server-version-1171-fix-pack-5-releasenotes) or in the installation-related information prov...
4 months ago in Information Server  / DataStage 1 Not under consideration

Schema Validation using Avro File Format under File Connector stage (AvroStreams)

Schema validation exists with the old Avro 1.x version but not with AvroStreams. Schema validation should always be available with AvroStreams, which avoids the CLASSPATH setting.
about 1 month ago in Information Server  / DataStage 0 Submitted

Request for Provision of APIs for Automated Control of Resource Limits at Project Level in Cloud Pak for Data (on Prem) Monitoring

In IBM Cloud Pak for Data (on Prem), under Monitoring → Status summary → Projects → Set quotas, it is currently possible to manually define and enforce resource limits via the UI—such as vCPU and memory thresholds—at project level. To enable autom...
4 months ago in Cloud Pak for Data System 0 Submitted
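A rough sketch of the automation being asked for, assuming a hypothetical REST endpoint (the host, path, and payload fields below are illustrative placeholders; no such documented API exists today):

    import requests

    CPD_URL = "https://cpd.example.com"        # placeholder cluster URL
    TOKEN = "<bearer token from the platform>"  # placeholder credential

    # Hypothetical endpoint for setting project-level quotas; the request is
    # that IBM expose an equivalent, documented API.
    resp = requests.put(
        f"{CPD_URL}/zen-data/v1/projects/my-project/quotas",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"vcpu_limit": 8, "memory_limit_gb": 32},
    )
    resp.raise_for_status()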

DataStage incorporates multiple open-source components. To ensure system stability and operational consistency, any behavioral changes resulting from updates to these bundled open-source components should be managed and absorbed by the DataStage installer. This approach prevents disruptions during system deployment and post-deployment operations.

Currently, there is no mention of open-source components in the Release Notes (e.g. https://www.ibm.com/docs/ja/iis/11.7.0?topic=rn-infosphere-information-server-version-1171-fix-pack-5-releasenotes) or in the installation-related information prov...
3 months ago in Information Server  / DataStage 0 Under review

Make the production NRS mount CIFS mountable

While NRS is a great product, it requires a backup of the database before registering it into a DR server as a valid NRS database. The backup of the database must still occur and be stored. Once stored, the backup must be moved to the DR server(s) t...
16 days ago in Cloud Pak for Data System 0 Submitted

Change the JWT 'iss' claim to the introspection endpoint URL of the issuing IM/Zen

Customers' landscapes are still somewhat siloed, meaning there is a large department managing BAW, and another big department managing FileNet. Current production for both runs on WebSphere traditional (tWAS), (deployed on CloudPak Systems a...
23 days ago in Cloud Pak for Data / Bedrock 0 Submitted
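To illustrate the proposal, a comparison of token claims as they might look today versus with the requested change (all values below are placeholders, not actual IM/Zen output):

    # Illustrative claim sets only; not actual IM/Zen tokens.
    current_claims = {
        "iss": "<opaque issuer label>",   # today: does not identify the issuing cluster
        "sub": "user@example.com",
        "aud": "cp4d",
    }

    proposed_claims = {
        # Requested: the issuer names the introspection endpoint of the IM/Zen
        # instance that minted the token, so each relying silo can validate it
        # against the right cluster. The URL is a placeholder.
        "iss": "https://cpd-zen.example.com/idprovider/v1/auth/introspect",
        "sub": "user@example.com",
        "aud": "cp4d",
    }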

Add/enable connections from DataStage to Confluent Cloud topics, with OAuth support for the schema registry.

We are trying to connect to a Confluent Cloud topic from on-prem DataStage with the Kafka connector used in jobs. To connect to the Confluent Cloud topic we select the OAUTHBEARER property in the Kafka connector. According to our use case, we are facing an issue while...
25 days ago in Information Server  / DataStage 0 Submitted
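For reference, this is roughly the configuration a standalone Kafka client needs for Confluent Cloud with OAUTHBEARER (property names follow librdkafka/confluent-kafka-python; endpoints and credentials are placeholders). The ask is for the DataStage Kafka connector to cover the equivalent OAuth settings for the schema registry as well:

    from confluent_kafka import Producer

    # Placeholders: substitute your Confluent Cloud endpoint and IdP details.
    conf = {
        "bootstrap.servers": "pkc-xxxxx.region.provider.confluent.cloud:9092",
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "OAUTHBEARER",
        "sasl.oauthbearer.method": "oidc",
        "sasl.oauthbearer.token.endpoint.url": "https://idp.example.com/oauth2/token",
        "sasl.oauthbearer.client.id": "my-client-id",
        "sasl.oauthbearer.client.secret": "my-client-secret",
        "sasl.oauthbearer.scope": "kafka",
    }
    producer = Producer(conf)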

Multiple identity providers (IdPs) using OIDC with OAuth for token generation

Add multiple IdP support for Zen custom tokens for CP4BA - we have multiple IdPs on site. Westpac consumers use JWT tokens generated by IdPs from frontend applications. This JWT token is locally validated by WebSphere when a request is received for ...
26 days ago in Cloud Pak for Data / Bedrock 0 Submitted

Token exchange - Use a token created in the customer's OIDC enterprise IdP to log in to CP4D

Context: The customer is building their own Data Marketplace in .NET, based on IBM CP4D (IKC+DV). One of the principal users will be Healthcare, which wants to publish and share data between users on a secure platform. Then, because of security requirem...
7 months ago in Cloud Pak for Data / Bedrock 0 Under review
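The standard building block for this request is OAuth 2.0 Token Exchange (RFC 8693). A minimal sketch of the exchange call, assuming a placeholder IdP token endpoint and client credentials; whether CP4D accepts the resulting token is exactly what this idea asks for:

    import requests

    TOKEN_ENDPOINT = "https://idp.example.com/oauth2/token"  # placeholder IdP endpoint

    resp = requests.post(
        TOKEN_ENDPOINT,
        data={
            "grant_type": "urn:ietf:params:oauth:grant-type:token-exchange",
            "subject_token": "<access token issued by the enterprise IdP>",
            "subject_token_type": "urn:ietf:params:oauth:token-type:access_token",
            # Audience the exchanged token should target, e.g. the CP4D instance.
            "audience": "https://cpd.example.com",
        },
        auth=("my-client-id", "my-client-secret"),  # placeholder client credentials
    )
    resp.raise_for_status()
    exchanged_token = resp.json()["access_token"]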