IBM Data Platform Ideas Portal for Customers


This portal is for opening public enhancement requests against products and services offered by the IBM Data Platform organization. To view all of your ideas submitted to IBM, create and manage groups of ideas, or create an idea explicitly set to be either visible to all (public) or visible only to you and IBM (private), use the IBM Unified Ideas Portal (https://ideas.ibm.com).


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:


Search existing ideas

Start by searching and reviewing ideas and requests to enhance a product or service. Take a look at ideas others have posted, and add a comment, vote, or subscribe to updates on them if they matter to you. If you can't find what you are looking for, post a new idea.


Post your ideas

Post ideas and requests to enhance a product or service, and upvote ideas others have posted if they matter to you:

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea


Specific links you will want to bookmark for future use

Welcome to the IBM Ideas Portal (https://www.ibm.com/ideas) - Use this site to find additional information about the IBM Ideas process and statuses.

IBM Unified Ideas Portal (https://ideas.ibm.com) - Use this site to view all of your ideas, create new ideas for any IBM product, or search for ideas across all of IBM.

ideasibm@us.ibm.com - Use this email to suggest enhancements to the Ideas process or request help from IBM for submitting your Ideas.

IBM Employees should enter Ideas at https://ideas.ibm.com




Ideas


Snowflake Bulk unable to read from a read-only database due to the temporary file format

When Snowflake Bulk stage tries to read from a read-only database, it fails with the message: "Could not create a temporary file format (SNOWFLAKE_47)" due to the lack of permissions to create the file format object in the source database. This is...
over 1 year ago in StreamSets 1 Under review

New option to stop jobs from instantiating pipelines inside a Data Collector, then shut it down automatically once all running pipelines terminate

A new option is needed that would stop any job from instantiating a pipeline inside a Data Collector; once all currently running pipelines terminate, the Data Collector would shut down automatically. 1. Provide a lock at the control plane le...
8 months ago in StreamSets 0 Needs more information

Support for Kerberos in JDBC stages

To provide official support for utilizing Kerberos in JDBC stages for supported database systems. Kerberos settings should be allowed via the JDBC connection string or as a separate configuration in the pipeline, such as with Kafka stages. An exam...
8 months ago in StreamSets 0 Functionality already exists
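For illustration, some JDBC drivers already accept Kerberos settings directly in the connection string, which is the kind of configuration this idea asks for. A hedged example using the Microsoft SQL Server JDBC driver (host, port, and database names are placeholders, not StreamSets-specific settings):

```
jdbc:sqlserver://dbhost:1433;databaseName=sales;integratedSecurity=true;authenticationScheme=JavaKerberos
```

With this driver, the Kerberos principal and ticket handling come from the JVM's JAAS/Kerberos configuration rather than from the URL itself.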

Azure Synapse Stage - Exposing the Merge Query Options

Currently, the MERGE logic in StreamSets Synapse SQL Stage is largely fixed, limiting users in scenarios where custom match logic, conditional updates, or fine-grained control over insert/update/delete behaviour is needed. This idea is for a featu...
8 months ago in StreamSets 1 Needs more information

[Customer-HCA-Parallon] Projects Feature: Allow moving of organization level objects (pipelines, jobs, connections etc.) to relevant project

HCA has been keen on using Projects feature since it was demoed to them. They have the feature enabled now, they have explored it and shared feedback. HCA currently has many teams using StreamSets and they have numerous pipelines, fragments, jobs,...
8 months ago in StreamSets 4 Functionality already exists

Allow "MongoDBAtlasLookup" stage to search collections dynamically

I would like to use ${record:value("/collection")} in the "MongoDBAtlasLookup" stage, as I need my pipeline to check multiple different collections. Because only a hardcoded value or parameter is allowed, I'm forced to use multiple "MongoD...
8 months ago in StreamSets 0 Under review

Replace the 404 page for SDC 6.0.0 with an Informational page like Platform's headless UI page.

This is really to stay consistent in Platform like when navigating to the SDC page to accept the certificate. The 404 page causes confusion as there might be something wrong. A replacement page with information that SDC 6 no longer has a UI would ...
over 1 year ago in StreamSets 0 Under review

Native connector/stage for Workday including CDC

Native connectivity (stage) within StreamSets that enables connectivity and CDC for Workday. Changes / updates to Workday would be sourced (pulled or pushed) by this stage and be available in the pipeline.
over 1 year ago in StreamSets 0 Needs more information

Runtime resource files - support for encryption and decryption at runtime within the pipeline using keys from azure key vault

Humana has been successful using the exec function in the credential store by fetching encryption keys from Azure Key Vault. They are looking for a similar feature/implementation for resource files, specifically encrypting resource files and decrypting them ...
over 1 year ago in StreamSets 3 Not under consideration

Snowflake CDC

Snowflake is a central part of our ecosystem, and multiple producer and consumer components exchange data through it. The lack of CDC support for Snowflake tables means we'll have to configure a poll at a certain duration, which will incur cost - ...
over 1 year ago in StreamSets 0 Not under consideration