IBM Data Platform Ideas Portal for Customers


Use this portal to open public enhancement requests against products and services offered by the IBM Data Platform organization. To view all of your ideas submitted to IBM, create and manage groups of ideas, or create an idea explicitly set to be either visible by all (public) or visible only to you and IBM (private), use the IBM Unified Ideas Portal (https://ideas.ibm.com).


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:


Search existing ideas

Start by searching and reviewing ideas and requests to enhance a product or service. Take a look at ideas others have posted, and add a comment, vote, or subscribe to updates on them if they matter to you. If you can't find what you are looking for, post a new idea.


Post your ideas

Post ideas and requests to enhance a product or service. Take a look at ideas others have posted and upvote them if they matter to you:

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea


Specific links you will want to bookmark for future use

Welcome to the IBM Ideas Portal (https://www.ibm.com/ideas) - Use this site to find out additional information and details about the IBM Ideas process and statuses.

IBM Unified Ideas Portal (https://ideas.ibm.com) - Use this site to view all of your ideas, create new ideas for any IBM product, or search for ideas across all of IBM.

ideasibm@us.ibm.com - Use this email to suggest enhancements to the Ideas process or request help from IBM for submitting your Ideas.

IBM Employees should enter Ideas at https://ideas.ibm.com





Ideas


Spark job to be able to access, read and write on external database without the credentials being exposed anywhere (instance, logs, job metadata, etc…)

We want to create a job on a Spark instance of the Analytics Engine powered by Apache Spark that runs a Spark application reading from and writing to an Exasol database. To enable database access, we need to pass our user credentials into the job....
about 1 month ago in Analytics Engine 0 Submitted
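For context on the exposure concern, here is a minimal PySpark sketch of the pattern this request describes; the Exasol JDBC URL, table names, and credentials are placeholders. Because the credentials have to be passed into the job as options, they end up visible in the job configuration and metadata:

    # Illustrative sketch only; URL, tables, and credentials are placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("exasol-read-write").getOrCreate()

    df = (
        spark.read.format("jdbc")
        .option("url", "jdbc:exa:exasol-host:8563")   # placeholder connection URL
        .option("dbtable", "SCHEMA.SOURCE_TABLE")     # placeholder source table
        .option("user", "db_user")                    # credential visible in job config/logs
        .option("password", "db_password")            # credential visible in job config/logs
        .load()
    )

    (
        df.write.format("jdbc")
        .option("url", "jdbc:exa:exasol-host:8563")
        .option("dbtable", "SCHEMA.TARGET_TABLE")     # placeholder target table
        .option("user", "db_user")
        .option("password", "db_password")
        .save()
    )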

Enhance the Gateway Java API to include the ability to get and use a database connection in user exits

Publish a supported Java API from which customer-implemented Gateway User Exits can obtain a database connection. The database connection should be based upon the same database credentials as the ones the Gateway uses. The database connection should be ...
about 1 month ago in Financial Transaction Manager  / FTM Common Services 2 Submitted

Ability to paste '-' characters as 0.

0 values formatted as a '-' in Excel cannot be pasted into Planning Analytics. This causes a spreading error. Users often format 0 values as a '-'. It would be helpful if Planning Analytics could interpret '-' as a 0 on paste.
28 days ago in Analytics Engine 0 Submitted
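As an illustration of the requested behaviour (not Planning Analytics code), a pasted cell formatted as '-' would simply be interpreted as the number 0 before spreading:

    # Illustrative only: treat an Excel-style "-" placeholder as zero when
    # parsing a pasted cell value; anything else parses as a normal number.
    def parse_pasted_value(text: str) -> float:
        stripped = text.strip()
        if stripped == "-":                      # Excel accounting format for zero
            return 0.0
        return float(stripped.replace(",", ""))

    print(parse_pasted_value("-"))       # 0.0
    print(parse_pasted_value("1,250"))   # 1250.0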

Certain items that fail the modulus check do not trigger an error.

From TS007184858: when a modulus 11 routine is applied to checks for fields 3 (account) and 7 (aux-on-us), and the routine results in a check digit of 10, that digit is represented by a dash in the MICR field. For certain items that have a nume...
about 2 months ago in Financial Transaction Manager  / FTM Common Services 0 Submitted
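For readers unfamiliar with modulus 11 routines, here is a small illustrative sketch; the weights are an example, not the exact routine FTM applies to fields 3 and 7. The point is that a computed check digit of 10 cannot be written as a single digit, which is why it appears as a dash in the MICR field:

    # Illustrative modulus 11 check-digit calculation with example weights.
    def mod11_check_digit(digits: str, weights=(2, 3, 4, 5, 6, 7)) -> str:
        total = sum(int(d) * weights[i % len(weights)]
                    for i, d in enumerate(reversed(digits)))
        check = (11 - total % 11) % 11
        return "-" if check == 10 else str(check)   # 10 has no single-digit form

    print(mod11_check_digit("123456"))   # "0" for this example input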

Would like Business Rules to not allow a second click of the Activate button, or to set a flag when activation has started so that further activation attempts are not allowed until it is complete.

A release was accidentally activated twice within 2 minutes; while Business Rules was tied up, it kept inbound files from processing. This caused files to go into error in inbound physical transmissions. Files were able to be successfully restarted ...
5 months ago in Financial Transaction Manager  / FTM Common Services 0 Under review
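A minimal sketch of the kind of guard being requested; this is a hypothetical illustration, not FTM code. A flag (here a non-blocking lock) is set when activation starts, and any further attempt is rejected until the first one completes:

    # Hypothetical guard against double activation; not FTM code.
    import threading

    _activation_in_progress = threading.Lock()

    def deploy(release_id: str) -> None:
        ...   # placeholder for the real activation work

    def activate_release(release_id: str) -> None:
        if not _activation_in_progress.acquire(blocking=False):
            raise RuntimeError(f"Activation of {release_id} is already in progress")
        try:
            deploy(release_id)
        finally:
            _activation_in_progress.release()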

Add a Trace Security in order to understand how a right is evaluated by the engine

As with business rules, PAL/TM1 security can be a mix of data and/or rules on the system security objects. For business rules, we can use the "trace calculations" feature, which is really great. For security, we can only use... our brain, so a "...
over 2 years ago in Analytics Engine 3 Not under consideration

MDE job fails with error "Deployment not found with given id"

We found the issue related to the Spark/MDE blocker issue. The MDE job triggered a spark-runtime deployment first. The spark-runtime deployment then triggered the corresponding spark-master/worker pods. If the 1st spark-master/worker pod does not come up wi...
over 1 year ago in Analytics Engine 0 Under review

Ability to resize the columns across the job page, as well as any other columns in order to see text that is too long

Column resizing so that we can see the full line of text without it being cut off when the text is too long. The ability to resize would help when copying and pasting pieces of the string or the entire string itself.
10 months ago in Analytics Engine 1 Submitted

Reservation of nodes for Spark jobs / labeling, taints, and tolerations of workers and Spark pods

As a CP4D service provider, we would like to use special worker nodes only for Spark job execution called via the pyspark API. One reason for this is that we sometimes see the Spark jobs use up all resources. In this scenario we would like a...
about 3 years ago in Analytics Engine 0 Not under consideration
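For illustration, the standard Kubernetes mechanism the title refers to, shown as placeholder pod-spec fragments in Python dictionary form; the label and taint names are examples, not CP4D configuration:

    # Placeholder fragments: reserve labeled/tainted workers for Spark pods.
    spark_pod_spec_fragment = {
        "nodeSelector": {"workload": "spark"},    # label applied to the reserved workers
        "tolerations": [{
            "key": "dedicated",                   # matches the taint on the reserved workers
            "operator": "Equal",
            "value": "spark",
            "effect": "NoSchedule",
        }],
    }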

Support auth mode 3 for scheduling forecasts in PAW

Scheduling a forecast is not supported in Planning Analytics Local configured with Windows integrated authentication (mode 3). Compare with: https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=forecasts-schedule-baseline-forecast since ther...
over 1 year ago in Analytics Engine 0 Submitted