IBM Data Platform Ideas Portal for Customers


This portal is for opening public enhancement requests against products and services offered by the IBM Data Platform organization. To view all of the ideas you have submitted to IBM, create and manage groups of ideas, or create an idea explicitly set to be either visible to all (public) or visible only to you and IBM (private), use the IBM Unified Ideas Portal (https://ideas.ibm.com).


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:


Search existing ideas

Start by searching and reviewing ideas and requests to enhance a product or service. Take a look at ideas others have posted, and add a comment, vote, or subscribe to updates on them if they matter to you. If you can't find what you are looking for, post a new idea.


Post your ideas

Post ideas and requests to enhance a product or service. Take a look at ideas others have posted and upvote them if they matter to you:

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea


Specific links you will want to bookmark for future use

Welcome to the IBM Ideas Portal (https://www.ibm.com/ideas) - Use this site to find out additional information and details about the IBM Ideas process and statuses.

IBM Unified Ideas Portal (https://ideas.ibm.com) - Use this site to view all of your ideas, create new ideas for any IBM product, or search for ideas across all of IBM.

ideasibm@us.ibm.com - Use this email to suggest enhancements to the Ideas process or request help from IBM for submitting your Ideas.

IBM Employees should enter Ideas at https://ideas.ibm.com



StreamSets

Orchestrator connection - Test connection must check that API creds are valid for the organization

This is related to Support Case TS020495554: "Connection is successful using API Credential from another Organization", Engineering intervention INT-3683. Please check the attachment for detailed steps to reproduce the issue. Issue: Humana ...
4 months ago in StreamSets 4 Not under consideration
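The request is that "Test connection" verify not only that the API credentials are valid, but that they were issued for the same organization as the connection. A minimal sketch of that check, using hypothetical organization-ID fields (Control Hub's actual credential payload is not shown in the idea text):

```python
def validate_credentials_org(credential_org_id: str, connection_org_id: str) -> None:
    """Reject API credentials issued for a different organization.

    Both arguments are illustrative identifiers, not an actual
    Control Hub API; the point is the equality check itself.
    """
    if credential_org_id != connection_org_id:
        raise PermissionError(
            f"API credential belongs to organization {credential_org_id!r}, "
            f"not {connection_org_id!r}; test connection should fail here."
        )
```

With this check in place, `validate_credentials_org("org-a", "org-a")` passes silently, while a credential borrowed from another organization raises instead of reporting a successful connection.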

Customisable URL for Control Hub

Currently all users, e.g. in the AP region, use the Control Hub instance with the URL ap01.hub.streamsets.com. The customer's security team has a problem with this, and they require a custom URL such as asb.streamsets.com (the name in the custom URL is not ...
4 months ago in StreamSets 1 Not under consideration

There is no way to manage local pipelines in Data Collector 6.x through Legacy Control Hub

There is no way to stop and delete local Data Collector 6.x pipelines from Legacy Control Hub. This includes SCH Test Runs that have been disassociated from a pipeline in Control Hub and any pipeline that may have been created using the Data Colle...
4 months ago in StreamSets 1 Not under consideration

Add support for Hadoop corresponding to AWS EMR 7.8.x

OCLC has many critical business use cases around Hadoop and runs an open-source flavor of it. They are currently running components corresponding to Cloudera CDP 7.1.7 but are planning to move to components corresponding to AWS EMR 7.8.0 and are a...
6 months ago in StreamSets 1 Not under consideration

Add support for top level union schemas in Kafka Avro schema support

Support top-level unions for Avro schemas instead of just top-level record schemas. Example top-level union: ["null", { "type": "record", "name": "SampleRecord", "fields": [ { "name": "id", "type": "string" }, { "name": "name", "type": "strin...
9 months ago in StreamSets 2 Not under consideration
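In Avro, a schema whose top level is a JSON array is a union of its branches, whereas today only a top-level record is accepted. The idea's truncated example can be sketched in full as follows (the second field's type is assumed to be string, reconstructed from the cut-off example):

```python
import json

# A top-level *union* Avro schema (a JSON array), as the idea requests,
# contrasted with the top-level *record* schema supported today.
union_schema = json.loads("""
[
  "null",
  {
    "type": "record",
    "name": "SampleRecord",
    "fields": [
      {"name": "id",   "type": "string"},
      {"name": "name", "type": "string"}
    ]
  }
]
""")

# Per the Avro specification, a JSON array at the top level is a union.
assert isinstance(union_schema, list) and union_schema[0] == "null"
assert union_schema[1]["type"] == "record"
```

A union like `["null", SampleRecord]` is the conventional Avro idiom for an optional record, which is why top-level unions commonly appear in Kafka topics.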

Add Option to Truncate Table in Destination Stage Before Inserting Data

In many data transfer scenarios, it is essential to ensure that the destination table is empty before inserting a fresh set of data. Currently, this requires the creation of a separate pipeline to truncate the table, followed by an orchestration p...
11 months ago in StreamSets 1 Not under consideration
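The behavior being requested — empty the destination table, then load the fresh batch, without a separate truncation pipeline — can be sketched in generic SQL terms. This is not StreamSets API code; it uses Python's built-in sqlite3 purely to illustrate the atomic truncate-then-insert step:

```python
import sqlite3

def truncate_and_load(conn: sqlite3.Connection, table: str, rows) -> None:
    """Empty the destination table, then insert the fresh batch atomically.

    Illustrative sketch only; table and column names are hypothetical.
    """
    with conn:  # one transaction: either both steps commit or neither does
        conn.execute(f"DELETE FROM {table}")  # SQLite lacks TRUNCATE; DELETE empties the table
        conn.executemany(f"INSERT INTO {table} (id, name) VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dest (id INTEGER, name TEXT)")
conn.execute("INSERT INTO dest VALUES (0, 'stale')")
truncate_and_load(conn, "dest", [(1, "a"), (2, "b")])
```

Wrapping both statements in a single transaction is what the two-pipeline workaround (truncate pipeline plus orchestration) cannot easily guarantee.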

Provide native capability to write metrics to a standard product, e.g. Prometheus

Platform teams want pipeline metrics to be available to users through standard platforms such as Prometheus. While this can be worked around with custom instrumentation, it should be available out of the box.
about 1 year ago in StreamSets 0 Not under consideration

Simplify expression evaluation notation and reference to variables

Currently we need a syntax of $ then a set of curly brackets {} and then a value, and for values it's record:value('/<the attribute>'). The above becomes very convoluted and long when you have a case statement or other long expressions. It wo...
about 1 year ago in StreamSets 1 Not under consideration
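For context, the current StreamSets expression-language form the idea refers to looks like this (field and function names are illustrative):

```
${record:value('/first_name')}                          reference one field
${str:toUpper(record:value('/first_name'))}             nesting grows quickly
```

Every field reference carries the full `${record:value('/...')}` wrapper, which is what makes long conditional expressions hard to read and is the verbosity the idea asks to reduce.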

Make it easier to achieve advanced data engineering tasks within the UI instead of using Groovy

Currently, many advanced tasks require us to jump out of the standard UI capability and directly into Groovy. It would be worthwhile to consider whether these can be done in a UI-like framework and still wrapped into a single stage for execution. This ...
about 1 year ago in StreamSets 0 Not under consideration

Adding resource files to Deployable Transformer for Snowflake engines by introducing External Resource for Platform Users

In the same way that StreamSets Data Collector (SDC) allows for the upload of external resources via the user interface, there should be a similar option available for the Deployed Transformer for Snowflake. Use Case 1: According to our organizatio...
about 1 year ago in StreamSets 0 Not under consideration