IBM Data Platform Ideas Portal for Customers


Use this portal to open public enhancement requests against products and services offered by the IBM Data Platform organization. To view all of your ideas submitted to IBM, create and manage groups of ideas, or create an idea explicitly set to be either visible to all (public) or visible only to you and IBM (private), use the IBM Unified Ideas Portal (https://ideas.ibm.com).


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:


Search existing ideas

Start by searching and reviewing ideas and requests to enhance a product or service. Take a look at ideas others have posted, and add a comment, vote, or subscribe to updates on them if they matter to you. If you can't find what you are looking for, post a new idea.


Post your ideas

Post ideas and requests to enhance a product or service. Take a look at ideas others have posted and upvote them if they matter to you:

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea


Specific links you will want to bookmark for future use

Welcome to the IBM Ideas Portal (https://www.ibm.com/ideas) - Use this site to find out additional information and details about the IBM Ideas process and statuses.

IBM Unified Ideas Portal (https://ideas.ibm.com) - Use this site to view all of your ideas, create new ideas for any IBM product, or search for ideas across all of IBM.

ideasibm@us.ibm.com - Use this email to suggest enhancements to the Ideas process or request help from IBM for submitting your Ideas.

IBM Employees should enter Ideas at https://ideas.ibm.com





My votes: StreamSets


Handle Redis password as a credential field in Redis Destination rather than in plain text within the URL

Increases security
6 months ago in StreamSets 1 Delivered
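For context on the request above, StreamSets Data Collector exposes credential functions that stage properties can use in place of plain-text secrets. A hedged sketch of what the lookup might look like once the Redis password is a credential field (the store ID "streamsets", group "all", and credential name "redis/password" are illustrative, not taken from the idea):

```
${credential:get("streamsets", "all", "redis/password")}
```

The credential store resolves the secret at runtime, so the password never appears in the pipeline configuration or the URL.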

Request support for Python 3.10 and beyond in the StreamSets SDK for StreamSets Self-Managed (Legacy) Control Hub

The current SDK supports only Python 3.9, which is scheduled to reach end-of-life on October 17, 2025. Continuing to use software beyond its EOL poses security, compliance, and operational risks, as it no longer receives updates or patches—includi...
5 months ago in StreamSets 0 Delivered

Support temporary AWS credentials generated via AWS SDK in SDC S3 stages.

Currently, the Amazon S3 stages do not support authentication methods that require a session token, such as temporary AWS credentials. This is an authentication method AWS accepts that we do not support. This would benefit any customer using this authentication...
9 months ago in StreamSets 0 Delivered

Update libraries for StreamSets 5.12 to work with Cloudera Hadoop and Kudu on Java 11 or Java 17 (during upgrade, found Kudu only works when using Java 1.8)

Upgraded to StreamSets 5.12, tried to start using Java 17 (from 1.8), and found an error; it would only work using Java 1.8. Support ticket -> TS019887719 (PC) - StreamSets 5.12.0 Java 17 support. Customer = SYNIVERSE TECHNOLOGIES LLC prima...
4 months ago in StreamSets 0 Delivered

Azure Key Vault for Deployable Transformer for Snowflake

For security-conscious customers, being able to store credentials in the credential stores we support in SDC would be helpful; Azure Key Vault is this customer's choice.
about 1 year ago in StreamSets 2 Delivered

AWS S3 pipeline ability to use temporary credentials with AWS resources by passing a third parameter (session token)

https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_temp_use-resources.html State Street is starting its cloud movement and wishes to move data from On-Prem -> to AWS S3 State Street Security/internal IT wishes to use 60 min exp Tem...
7 months ago in StreamSets 0 Delivered

Read Parquet Schema from Origin that can be used in the Pipeline

When a Parquet file is saved from a previous pipeline where the schema was inferred from the record, the parquetSchema field does not appear to be populated correctly. It appears to be an Avro schema, and in the destination stage when the Parquet Sche...
10 months ago in StreamSets 5 Delivered

Snowflake Destination does not support the "Object" data type

IBM StreamSets Control Hub | Platform, SDC 5.10. High-level pipeline logic: MongoDB → Snowflake. The customer is attempting to ingest data from MongoDB and write it into Snowflake. They utilize the “Object” datatype a considerable amount in both MongoD...
about 1 year ago in StreamSets 0 Delivered

Allow HTTP response headers to be configured in Data Collector

The following headers need to be configurable via sdc.properties: Content-Security-Policy, X-Content-Type-Options
about 1 year ago in StreamSets 0 Delivered
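A hedged sketch of what such configuration could look like in sdc.properties. The property names below are hypothetical illustrations only; the point of the idea is precisely that no such properties exist yet:

```properties
# Hypothetical properties -- illustrative names, not an existing SDC feature
http.header.Content-Security-Policy=default-src 'self'
http.header.X-Content-Type-Options=nosniff
```

Making the headers properties (rather than hard-coded values) would let operators tighten or relax the response policy per deployment.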

MongoDB Atlas CDC origin unable to capture update operations with save() method

When a record is updated through the Java Spring MongoDB application's MongoTemplate.save() method, the MongoDB Atlas CDC origin does not capture the update operation. If the record is updated through the MongoTemplate.updateFirst() method, then MongoDB At...
about 1 year ago in StreamSets 0 Delivered