IBM Data Platform Ideas Portal for Customers


Use this portal to open public enhancement requests against products and services offered by the IBM Data Platform organization. To view all of your ideas submitted to IBM, create and manage groups of ideas, or create an idea explicitly set to be either visible to all (public) or visible only to you and IBM (private), use the IBM Unified Ideas Portal (https://ideas.ibm.com).


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:


Search existing ideas

Start by searching and reviewing ideas and requests to enhance a product or service. Take a look at ideas others have posted, and add a comment, vote, or subscribe to updates on them if they matter to you. If you can't find what you are looking for, post a new idea.


Post your ideas

Post ideas and requests to enhance a product or service. Take a look at ideas others have posted and upvote them if they matter to you:

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea


Specific links you will want to bookmark for future use

Welcome to the IBM Ideas Portal (https://www.ibm.com/ideas) - Use this site to find out additional information and details about the IBM Ideas process and statuses.

IBM Unified Ideas Portal (https://ideas.ibm.com) - Use this site to view all of your ideas, create new ideas for any IBM product, or search for ideas across all of IBM.

ideasibm@us.ibm.com - Use this email to suggest enhancements to the Ideas process or request help from IBM for submitting your Ideas.

IBM Employees should enter Ideas at https://ideas.ibm.com




Ideas (showing 185 of 18241)

DataStage Snowflake Connector lacks "OAuth" support

The IBM DataStage Snowflake Connector is lacking a feature to support OAuth. From the perspective of our security and risk management department, DTCC's organizational objective is that all connections to Snowflake leverage the OAuth token feature. We need IBM to w...
over 5 years ago in Connectivity | 6 | Not under consideration
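
For reference, a minimal sketch of what OAuth-based authentication to Snowflake looks like with the official snowflake-connector-python library, outside DataStage; the account identifier, user, token source, warehouse, and database names are placeholder assumptions, not values from the idea.

    import snowflake.connector

    ACCESS_TOKEN = "..."  # OAuth access token obtained from the identity provider

    # Authenticate with an OAuth token instead of a password.
    conn = snowflake.connector.connect(
        account="myorg-myaccount",   # placeholder account identifier
        user="etl_user",             # placeholder user
        authenticator="oauth",
        token=ACCESS_TOKEN,
        warehouse="ETL_WH",          # placeholder warehouse
        database="ANALYTICS",        # placeholder database
    )
    try:
        cur = conn.cursor()
        cur.execute("SELECT CURRENT_USER(), CURRENT_ROLE()")
        print(cur.fetchone())
    finally:
        conn.close()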

ODBC - Bulk Load capability

CP4D DataStage will be used in the data flows defined in our data lake using Postgres and Azure SQL. Because we are required to use the ODBC Connector for these connections, we require the capability of doing bulk loads, exactly as this is offered ...
over 4 years ago in Connectivity | 2 | Not under consideration
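
For context, a minimal sketch of one way bulk-style inserts are done over ODBC in Python with pyodbc's fast_executemany option; how much it helps depends on the ODBC driver (it is most effective with SQL Server / Azure SQL drivers). The DSN, credentials, and table name are placeholder assumptions.

    import pyodbc

    # Placeholder DSN and credentials; fast_executemany sends parameter arrays
    # in bulk instead of one round trip per row (driver-dependent behaviour).
    conn = pyodbc.connect("DSN=my_azure_sql;UID=etl_user;PWD=...")
    cur = conn.cursor()
    cur.fast_executemany = True

    rows = [(i, f"name-{i}") for i in range(10_000)]
    cur.executemany("INSERT INTO staging.customers (id, name) VALUES (?, ?)", rows)
    conn.commit()
    conn.close()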

Azure connector with SAS token

A shared access signature (SAS) provides a way to grant limited access to objects in the Azure storage account to other clients without exposing the account key. SAS keys are standard practice, and since it is not possible to use them with IS v.11.7, the i...
over 7 years ago in Connectivity | 1 | Functionality already exists
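
For reference, a minimal sketch of SAS-token access to Azure Blob Storage with the azure-storage-blob SDK, which is the pattern the idea asks the connector to support; the storage account, container, blob path, and the SAS token itself are placeholder assumptions.

    from azure.storage.blob import BlobServiceClient

    # Placeholder, time-limited SAS token scoped to read/list on blob storage.
    SAS_TOKEN = "sv=...&ss=b&srt=co&sp=rl&sig=..."

    # Authenticate with the SAS token instead of the storage account key.
    service = BlobServiceClient(
        account_url="https://mystorageacct.blob.core.windows.net",
        credential=SAS_TOKEN,
    )
    container = service.get_container_client("landing")
    data = container.download_blob("extracts/customers.csv").readall()
    print(len(data), "bytes downloaded without exposing the account key")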

Add Tririga Connector to be able to connect to Tririga databases from CP4D

Add Tririga Connector to be able to connect to Tririga databases from CP4D.
over 1 year ago in Connectivity | 0 | Under review

Connector Key Columns

Add "Key Columns" as in Netezza Connector also to other Connectors.
over 7 years ago in Connectivity 4 Not under consideration

AWS IAM Role Compatibility with S3 Connections

Problem Statement/Pain Points: We work with a government agency that has specific security requirements, and we are utilizing Cloud Pak for Data within their environment. They have a need to utilize S3 Data Connections to resources in AWS GovClou...
over 4 years ago in Connectivity | 2 | Future consideration
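
For context, a minimal sketch of the IAM-role pattern the idea describes: assuming a role through STS and using the resulting temporary credentials to read from S3, instead of a long-lived access key / secret key pair. The role ARN, bucket, and object key are placeholder assumptions.

    import boto3

    # Exchange the current identity for short-lived credentials on a role
    # (placeholder ARN); no static access key / secret key is stored.
    sts = boto3.client("sts")
    creds = sts.assume_role(
        RoleArn="arn:aws:iam::123456789012:role/cp4d-s3-reader",
        RoleSessionName="cp4d-session",
    )["Credentials"]

    # Use the temporary, auto-expiring credentials for the S3 call.
    s3 = boto3.client(
        "s3",
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )
    obj = s3.get_object(Bucket="my-govcloud-bucket", Key="landing/customers.csv")
    print(obj["ContentLength"], "bytes read with temporary role credentials")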

BigQuery connector uses the legacy API by default, which adds 90 minutes of wait time for the streaming buffer to be flushed before DML operations can be performed. We need functionality in the BQ connector to use the Storage Write API to avoid this wait time

If a BQ job abends for some reason, such as a connectivity issue, the application team has to wait 90 minutes to restart the job a second time. If this issue occurs in a critical flow, it causes an SLA miss, which has a business impact.
over 1 year ago in Connectivity | 1 | Under review
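
For context, a minimal sketch of the legacy streaming path the idea refers to, using the google-cloud-bigquery client: rows written with insert_rows_json land in the streaming buffer, and DML against those rows is rejected until the buffer is flushed. The requested alternative, the Storage Write API (BigQueryWriteClient in google.cloud.bigquery_storage_v1), avoids that buffer but needs a protobuf row schema, so it is only named here. Project, dataset, and table names are placeholder assumptions.

    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")   # placeholder project
    table_id = "my-project.analytics.events"         # placeholder table

    # Legacy streaming insert: rows pass through the streaming buffer.
    rows = [{"event_id": 1, "status": "ok"}, {"event_id": 2, "status": "retry"}]
    errors = client.insert_rows_json(table_id, rows)
    if errors:
        raise RuntimeError(f"streaming insert failed: {errors}")

    # UPDATE/DELETE statements touching rows still in the streaming buffer fail
    # until the buffer is flushed, which is the 90-minute wait described above.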

Allow File Connector to manage Hive tables properly when they are located in HDFS Transparent Data Encryption Zones

When the File Connector is configured to write files to an encrypted zone (TDE) within HDFS, with "Create Hive Table" set to Yes and "Drop existing table" set to Yes, jobs will fail if the table already exists. This is because Hive requires the PU...
about 8 years ago in Connectivity | 0 | Not under consideration

LZ4 DB2 Compression Option - DB2 Connector Stage

As we've been migrating from Netezza to DB2 (with the support of IBM), it has been noticed (by IBM) that the LZ4 compression option is not available in the DB2 Connector. Tests performed via the DB2 command line (on the DataStage server) show signi...
about 5 years ago in Connectivity | 0 | Functionality already exists

Redshift connector - Allow the use of IAM role authentication to establish the connection instead of the access key and secret key.

Allow the use of IAM role authentication to establish this connection instead of the access key and secret key.
almost 2 years ago in Connectivity | 0 | Under review
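
For context, a minimal sketch of IAM-based Redshift authentication in Python: temporary database credentials are issued through GetClusterCredentials and used for the connection, so no static access key / secret key or database password is stored. The cluster identifier, host, region, database, and user names are placeholder assumptions.

    import boto3
    import redshift_connector

    # Request short-lived database credentials for an IAM identity
    # (placeholder cluster and user).
    rs = boto3.client("redshift", region_name="us-east-1")
    creds = rs.get_cluster_credentials(
        DbUser="etl_user",
        DbName="analytics",
        ClusterIdentifier="my-cluster",
        DurationSeconds=900,
    )

    # Connect with the temporary user/password returned for the IAM identity.
    conn = redshift_connector.connect(
        host="my-cluster.abc123.us-east-1.redshift.amazonaws.com",  # placeholder
        database="analytics",
        user=creds["DbUser"],
        password=creds["DbPassword"],
    )
    cur = conn.cursor()
    cur.execute("SELECT current_user")
    print(cur.fetchone())
    conn.close()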