IBM Data Platform Ideas Portal for Customers


This portal is for opening public enhancement requests against products and services offered by the IBM Data Platform organization. To view all of the ideas you have submitted to IBM, create and manage groups of ideas, or create an idea explicitly set to be either visible to all (public) or visible only to you and IBM (private), use the IBM Unified Ideas Portal (https://ideas.ibm.com).


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:


Search existing ideas

Start by searching and reviewing ideas and requests to enhance a product or service. Take a look at ideas others have posted, and add a comment, vote, or subscribe to updates on them if they matter to you. If you can't find what you are looking for, post a new idea.


Post your ideas

Post ideas and requests to enhance a product or service. Take a look at ideas others have posted and upvote them if they matter to you:

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea


Specific links you will want to bookmark for future use

Welcome to the IBM Ideas Portal (https://www.ibm.com/ideas) - Use this site to find out additional information and details about the IBM Ideas process and statuses.

IBM Unified Ideas Portal (https://ideas.ibm.com) - Use this site to view all of your ideas, create new ideas for any IBM product, or search for ideas across all of IBM.

ideasibm@us.ibm.com - Use this email to suggest enhancements to the Ideas process or request help from IBM for submitting your Ideas.

IBM Employees should enter Ideas at https://ideas.ibm.com



Ideas

Showing 92 of 18234

Reservation of nodes for Spark jobs / labeling, taints, and tolerations of workers and Spark pods

As a CP4D service provider, we would like to use dedicated worker nodes only for Spark job execution called via the PySpark API. One reason for this is that we sometimes see the Spark jobs use up all resources. In this scenario we would like a...
about 3 years ago in Analytics Engine 0 Not under consideration
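A minimal sketch of the Kubernetes mechanism this idea refers to: the reserved workers would carry a taint (e.g. applied with `kubectl taint nodes worker-3 dedicated=spark:NoSchedule`), and only Spark pods would get a matching toleration plus a node selector. The taint key/value `dedicated=spark` and the container names here are illustrative assumptions, not part of any Analytics Engine API.

```python
# Hypothetical taint the cluster admin would apply to the reserved workers:
#   kubectl taint nodes worker-3 dedicated=spark:NoSchedule
SPARK_TAINT = {"key": "dedicated", "value": "spark", "effect": "NoSchedule"}


def spark_pod_spec(image: str) -> dict:
    """Build a pod spec fragment that tolerates the Spark-only taint
    and pins the pod to the labeled workers via a node selector."""
    return {
        "nodeSelector": {"dedicated": "spark"},  # matches the node label
        "tolerations": [
            {
                "key": SPARK_TAINT["key"],
                "operator": "Equal",
                "value": SPARK_TAINT["value"],
                "effect": SPARK_TAINT["effect"],
            }
        ],
        "containers": [{"name": "spark", "image": image}],
    }


spec = spark_pod_spec("spark:3.5")
```

Pods without the toleration are repelled by the `NoSchedule` taint, so non-Spark workloads cannot land on the reserved nodes; the node selector closes the other direction by keeping Spark pods off the general workers.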

Permission on (specific) query based on user role, user group or user

It should be possible to restrict a query or a group of queries within a mandator to specific user roles, user groups (for cases) or even specific users (e.g. private queries). Currently every user that has the "view query" permission sees and can...
almost 3 years ago in IBM Safer Payments / Business 0 Future consideration

In Case Management, allow visibility of fired rules for only the mandator level they fired in.

Currently, in a Case all rules that fired will be shown in that case, even if another case class was raised by the same or other rules firing at the same time. Therefore, if there are sensitive rules from 1 department firing, these would be visibl...
about 3 years ago in IBM Safer Payments / Business 1 Not under consideration

Defined risk list entry upload detailed audit trail

Currently, when a file is uploaded to a defined risk list, there is no indication in the audit trail of which individual entries were added to or updated on that list.
over 2 years ago in IBM Safer Payments / Business 0 Not under consideration

Support auth mode 3 for scheduling forecasts in PAW

Scheduling a forecast is not supported in Planning Analytics Local configured with Windows integrated authentication (mode 3). Compare with: https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=forecasts-schedule-baseline-forecast since ther...
over 1 year ago in Analytics Engine 0 Submitted

Return Case ID in query

The business creates a daily report from a query to get all cases generated on that day. At this time there is no way to return the Case ID in query results when running ad-hoc or index queries against transactions. It would be great to get the required informat...
9 months ago in IBM Safer Payments / Business 0 Under review

Allow displaying spark CPU and memory values on Monitoring page

Currently, Spark runtimes do not register the pods on the Spark side with the runtimes framework. So the Monitoring page shows only 1 vCPU and 1 GB RAM regardless of the size of the Spark environment chosen; all that is registered is the proxy pod on the Jupyter client sid...
over 3 years ago in Analytics Engine 1 Future consideration

Allow option to count business days instead of calendar days

Looking for the option to view a count of business days across Safer Payments, including case investigation and case reporting: How many business days have passed since the case was created or since it was last transitioned? What is the average n...
about 3 years ago in IBM Safer Payments / Business 0 Not under consideration
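The calculation this idea asks for can be sketched in a few lines: counting only weekdays between two dates. This is a minimal illustration assuming "business days" means Monday through Friday; a real implementation would also need a holiday calendar, which is omitted here.

```python
from datetime import date, timedelta


def business_days_between(start: date, end: date) -> int:
    """Count weekdays (Mon-Fri) from start up to, but not including, end.

    Holidays are intentionally ignored in this sketch; a production
    version would subtract dates from a configurable holiday list.
    """
    days = 0
    d = start
    while d < end:
        if d.weekday() < 5:  # 0=Monday .. 4=Friday
            days += 1
        d += timedelta(days=1)
    return days
```

For example, from Monday 2024-01-01 to the following Monday 2024-01-08 this counts 5 business days, whereas a calendar-day count would report 7.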

Adding Name of job to job page/UI

Right now, on the job page you can see the timestamp and the AppID/JobID, but there is no information about the actual name of the job. We would like to have that information to quickly check the status without having to collect the IDs and swap...
10 months ago in Analytics Engine 1 Submitted

HTTP Response codes displaying for job status

Right now, the HTTP response codes return 200s for everything about the jobs. When a job fails or errors out, we would expect to receive the proper response code for the error, but it still only returns 200.
10 months ago in Analytics Engine 4 Submitted