IBM Data Platform Ideas Portal for Customers


This portal is for opening public enhancement requests against products and services offered by the IBM Data Platform organization. To view all of your ideas submitted to IBM, create and manage groups of ideas, or create an idea explicitly set to be either visible to all (public) or visible only to you and IBM (private), use the IBM Unified Ideas Portal (https://ideas.ibm.com).


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:


Search existing ideas

Start by searching and reviewing ideas and requests to enhance a product or service. Take a look at ideas others have posted, and add a comment, vote, or subscribe to updates on them if they matter to you. If you can't find what you are looking for, post a new idea.


Post your ideas

Post ideas and requests to enhance a product or service. Take a look at ideas others have posted and upvote them if they matter to you:

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea


Specific links you will want to bookmark for future use

Welcome to the IBM Ideas Portal (https://www.ibm.com/ideas) - Use this site to find out additional information and details about the IBM Ideas process and statuses.

IBM Unified Ideas Portal (https://ideas.ibm.com) - Use this site to view all of your ideas, create new ideas for any IBM product, or search for ideas across all of IBM.

ideasibm@us.ibm.com - Use this email to suggest enhancements to the Ideas process or request help from IBM for submitting your Ideas.

IBM Employees should enter Ideas at https://ideas.ibm.com




Ideas

Showing 152 of 18235

Spark job to be able to access, read and write on external database without the credentials being exposed anywhere (instance, logs, job metadata, etc…)

We want to create a job on a Spark instance of the Analytics Engine powered by Apache Spark that runs a Spark application reading from and writing to an Exasol database. To enable database access, we need to pass our user credentials into the job....
about 1 month ago in Analytics Engine 0 Submitted
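A common way to keep credentials out of the job definition, logs, and metadata is to have the job resolve them at runtime from its environment (or a secrets store). A minimal sketch, assuming environment-variable names of our own choosing (they are illustrative, not part of Analytics Engine or Exasol):

```python
import os

def jdbc_options_from_env(url_var="EXA_JDBC_URL",
                          user_var="EXA_USER",
                          pw_var="EXA_PASSWORD"):
    """Build Spark JDBC connection options from environment variables,
    so the credentials never appear in code, job metadata, or logs.
    The variable names here are illustrative."""
    missing = [v for v in (url_var, user_var, pw_var) if v not in os.environ]
    if missing:
        raise KeyError(f"missing credential variables: {missing}")
    return {
        "url": os.environ[url_var],
        "user": os.environ[user_var],
        "password": os.environ[pw_var],
    }

# In a Spark application the options would then be applied like:
#   df = spark.read.format("jdbc").options(**jdbc_options_from_env()).load()
# without the password ever being written into the job definition.
```

The same pattern works for writes; only the injection mechanism (environment, mounted secret, vault client) differs per platform.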

Having more parameters in API calls

It would be beneficial to be able to pass more parameters during an API call (server/execute, for example). Currently, we can use an overrides file to modify a DBalias or the path to an XF file, for instance, in the case of a data extraction reque...
14 days ago in Optim / Optim Test Data Management 0 Submitted

Ability to paste '-' characters as 0.

0 values formatted as a '-' in Excel cannot be pasted into Planning Analytics. This causes a spreading error. Users often format 0 values as a '-'. It would be helpful if Planning Analytics could interpret '-' as a 0 on paste.
29 days ago in Analytics Engine 0 Submitted
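The requested behavior amounts to a small parsing rule applied on paste. A sketch of what such handling could look like (the function name and the treatment of empty cells are illustrative assumptions, not existing Planning Analytics behavior):

```python
def parse_pasted_number(cell: str) -> float:
    """Interpret a pasted Excel cell, treating the accounting-style
    '-' placeholder (and empty cells) as 0 instead of raising a
    spreading error. Blank handling is an illustrative choice."""
    s = cell.strip()
    if s in ("-", ""):
        return 0.0
    # Strip thousands separators that Excel may include in formatted values.
    return float(s.replace(",", ""))
```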

Add a Trace Security in order to understand how a right is evaluated by the engine

As with business rules, PAL/TM1 security can be a mix of data and/or rules on the system security objects. For business rules, we can use "trace calculations", which is a really great feature. For security, we can only use... our brain, so a "...
over 2 years ago in Analytics Engine 3 Not under consideration

Requesting Java SE 9 or higher with 64-bit support for IBM Optim 11.7

We are currently using the Oracle-provided 32-bit JRE 1.8 on all of our Optim servers on Linux/Windows. Since Oracle has made Java/JRE licensing paid, which would require us to pay license costs, our organization adopted the OpenJDK prov...
6 months ago in Optim / Optim Test Data Management 1 Submitted

MDE job fails with error "Deployment not found with given id"

We found an issue related to the Spark/MDE blocker. The MDE job first triggered a spark-runtime deployment. The spark-runtime deployment then triggered the corresponding spark-master/worker pods. If the 1st spark-master/worker pod does not come up wi...
over 1 year ago in Analytics Engine 0 Under review

Ability to resize columns on the job page, as well as any other columns, in order to see text that is too long

Column resizing so that we can see the full line of text without it being cut off when the text is too long. The ability to resize would help when copying and pasting pieces of the string or the entire string.
10 months ago in Analytics Engine 1 Submitted

OPTIM/HPU INTERFACE CHANGE TO ALLOW USE OF MULTIPLE IMAGE-COPY DATASETS

This is an enhancement request for OPTIM TDM for z/OS (FMID HAEYB70). OPTIM TDM/DP currently allows users to specify only one image-copy dataset (taken in another DB2 subsystem) in the process of extracting and converting data. In our environment w...
about 3 years ago in Optim / Optim Test Data Management 1 Future consideration

Reservation of nodes for Spark jobs / labeling, taints, and tolerations for workers and Spark pods

As a CP4D service provider, we would like to use special worker nodes only for Spark job execution called via the PySpark API. One reason for this is that we sometimes see that the Spark jobs use up all resources. In this scenario we would like a...
about 3 years ago in Analytics Engine 0 Not under consideration
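Dedicating worker nodes to one workload is commonly done in Kubernetes with a taint on the nodes plus a matching toleration and node selector on the pods. A minimal sketch of the general mechanism (the node names, label key, and image are illustrative assumptions, not CP4D configuration):

```yaml
# Taint and label the dedicated workers (illustrative names):
#   kubectl taint nodes spark-worker-1 workload=spark:NoSchedule
#   kubectl label nodes spark-worker-1 workload=spark
apiVersion: v1
kind: Pod
metadata:
  name: spark-executor-example
spec:
  nodeSelector:
    workload: spark        # schedule only onto the labeled Spark nodes
  tolerations:
  - key: "workload"
    operator: "Equal"
    value: "spark"
    effect: "NoSchedule"   # tolerate the taint that repels other pods
  containers:
  - name: executor
    image: example/spark:latest
```

The taint keeps other workloads off the reserved nodes, while the toleration and selector together pin the Spark pods onto them.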

Optim is not able to see nested tables and their parent table

The nested tables fall into the same realm as complex nested views, where Optim currently is not robust enough to access the correct metadata for them.
7 months ago in Optim / Optim Test Data Management 0 Submitted