IBM Data and AI Ideas Portal for Customers


Use this portal to open public enhancement requests against products and services offered by the IBM Data & AI organization. To view all of your ideas submitted to IBM, create and manage groups of ideas, or create an idea explicitly set to be either visible to all (public) or visible only to you and IBM (private), use the IBM Unified Ideas Portal (https://ideas.ibm.com).


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:


Search existing ideas

Start by searching and reviewing ideas and requests to enhance a product or service. Take a look at ideas others have posted, and add a comment, vote, or subscribe to updates on them if they matter to you. If you can't find what you are looking for, post a new idea.


Post your ideas

Post ideas and requests to enhance a product or service. Take a look at ideas others have posted and upvote them if they matter to you:

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea


Specific links you will want to bookmark for future use

Welcome to the IBM Ideas Portal (https://www.ibm.com/ideas) - Use this site to find out additional information and details about the IBM Ideas process and statuses.

IBM Unified Ideas Portal (https://ideas.ibm.com) - Use this site to view all of your ideas, create new ideas for any IBM product, or search for ideas across all of IBM.

ideasibm@us.ibm.com - Use this email to suggest enhancements to the Ideas process or request help from IBM for submitting your Ideas.

IBM Employees should enter Ideas at https://ideas.ibm.com



All ideas

Showing 718 ideas

Pipeline node level log should be accessible through API

Currently, the pipeline log accessible through the API is a consolidated log across the whole pipeline, without node-level error details, which are crucial for debugging but currently only accessible through cpdctl. Customers need pipeline node-level logs ...
about 2 months ago in watsonx.ai 0 Submitted
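
For illustration only, a minimal sketch of what the requested capability might look like as a REST call from Python; the endpoint path, parameters, and response shape below are assumptions describing the idea, not an existing Watson Pipelines API.

```python
import requests

# Hypothetical sketch of the requested node-level log endpoint.
# Today only a consolidated run log is exposed over the API; the
# "nodes/<node_id>/logs" path below is an assumption illustrating
# the idea, not a documented Cloud Pak for Data endpoint.
CPD_URL = "https://cpd-host.example.com"   # placeholder Cloud Pak for Data URL
TOKEN = "..."                              # bearer token obtained separately

def get_node_log(pipeline_id: str, run_id: str, node_id: str) -> str:
    url = (f"{CPD_URL}/v2/pipelines/{pipeline_id}/runs/{run_id}"
           f"/nodes/{node_id}/logs")       # hypothetical path
    resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
    resp.raise_for_status()
    return resp.text                       # per-node log, the detail customers want

# Example: inspect the log of one failed node directly, instead of
# grepping the consolidated pipeline log or falling back to cpdctl.
# print(get_node_log("pipeline-123", "run-456", "node-789"))
```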

Pipeline Run Metrics in the Ops console DB

The ability to collect run-time metrics for pipelines is currently missing in the CP4D environment. On-prem, routines help capture these metrics for all jobs and sequences, but on CP4D this is limited to DataStage flows only. Looking for pipeline metrics t...
about 2 months ago in Cloud Pak for Data System 0 Submitted

MDM 11.6 and 12.x compatibility checking with OCP 4.14

ICICI Bank plans to upgrade its OCP platforms from OCP 4.12 to OCP 4.14 and wants to know whether its existing MDM setup (MDM 11.6 and MDM 12.x) can be supported on the OCP 4.14 platform. However, the current IBM MDM product release document fo...
about 2 months ago in Master Data Management 1 Submitted

Each window should contain links to corresponding documentation

IBM controls each window that is displayed in each Cloud Pak for Data service, and each documentation page. Users would benefit from having a direct link to the documentation that will help them understand/use the product.
7 months ago in watsonx.ai 0 Submitted

DMC - "Other Waits" detail graphic needs to be dynamic and smarter

DMC v3.1.12 introduced a fixed bucket named "Other Times" under the "Other Waits" main category on the Database Time Spent monitor page. Please see the attached screenshot. In the main graphic, we can see that "Other Waits" is the largest contributor. And in...
9 months ago in Db2 / Data Management Console 0 Submitted

ER should have an option to replicate tables between different databases

Currently there is no option in ER that allows us to replicate a table from database A to database B; you must use the same database. There are real-life situations where you need to replicate a table to a different server/instance/data...
9 months ago in Informix / Informix Server 0 Submitted

Remove admin vulnerability on routing sets when non admin users copy data modules / packages / files

Only Cognos administrators can create and edit routing sets. This allows administrators to specify that certain Cognos dispatchers should only be used for certain bits of data. For example, all the Finance data modules / packages should be sent to ...
9 months ago in Cognos Analytics / Administration 0 Submitted

Automatic & scheduled workflows to be available as part of PAW

Please refer to the attached spreadsheet for more details.
5 months ago in Planning Analytics 0 Submitted

Audit table for datasets

Want to be able to monitor dataset growth/decline over time, e.g. size, number of rows, number of columns.
9 months ago in Cognos Analytics / Other 0 Submitted
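
For illustration only, a minimal sketch of the kind of audit record such a feature might store per dataset snapshot; the field names and capture mechanism are assumptions, not an existing Cognos Analytics capability.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical audit record: one row per dataset per capture, so that
# growth or decline in size, rows, and columns can be charted over time.
@dataclass
class DatasetAuditEntry:
    dataset_id: str
    captured_at: datetime
    size_bytes: int
    row_count: int
    column_count: int

def capture_snapshot(dataset_id: str, size_bytes: int,
                     rows: int, cols: int) -> DatasetAuditEntry:
    """Record one point in time for later trend analysis."""
    return DatasetAuditEntry(dataset_id, datetime.now(timezone.utc),
                             size_bytes, rows, cols)
```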

Automate the clean up of orphan rows in the database catalogs during the db2ckupgrade processing.

We have been dealing with orphan rows in the database catalogs for many years now. The first tool to address this was db2cleancat, and it has gone through various iterations to handle all of the various flavors of this type of issue and now has ...
5 months ago in Db2 / Installation & Upgrade 0 Submitted