IBM Data and AI Ideas Portal for Customers

Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:

Post your ideas

Post ideas and requests to enhance a product or service. Take a look at ideas others have posted and upvote them if they matter to you.

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea

Help IBM prioritize your ideas and requests

The IBM team may need your help to refine an idea, so they may ask you for more information or feedback. The product management team will then decide whether they can begin working on it. If they can start during the next development cycle, they will put the idea on the priority list. Each team at IBM works on a different schedule: some ideas can be implemented right away, while others may be placed on a later schedule.

Receive notification of the decision

Some ideas can be implemented at IBM, while others may not fit within the development plans for the product. In either case, the team will let you know as soon as possible. In some cases, we may be able to find alternatives for ideas which cannot be implemented in a reasonable time.

Additional Information

Reminder: This is not the place to submit defects or support needs; please use the normal support channels for these cases.

Status: Under review
Workspace: DataStage
Created by: Guest
Created on: Feb 22, 2022

Enhance the BigIntegrate BDFS stage to support parameters.

This request is an offshoot of Case TS008439858.

While migrating jobs to 11.7 that run against a different Hadoop cluster, we ran into an issue writing HDFS data to an encryption zone.

Our Hadoop team is hesitant to set these properties cluster wide, because these two in particular can have wide-reaching effects when applied to the whole cluster.


We tested access outside of BigIntegrate by adding the above settings.

Failed access:

hdfs dfs -touchz hdfs://prodmid:8020/jad/integration/intprod/use_case_11/MyTestFile

touchz: No KeyProvider is configured, cannot access an encrypted file

Successful access:

hdfs dfs -Ddfs.encryption.key.provider.uri=kms:// -touchz hdfs://prodmid:8020/jad/integration/intprod/use_case_11/MyTestFile

We request the ability to add Hadoop-related parameters to the BDFS stage so that they are available at job runtime and do not have to be set globally at the cluster level.
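
As a minimal sketch of what this amounts to on the Hadoop client side (not a confirmed BDFS design): the same property that the -D flag supplies in the command above can be set on a Hadoop Configuration object before the file system is opened, which is essentially what a per-job stage parameter would let the BDFS stage do internally. The class name below is hypothetical, and the KMS URI is left truncated exactly as in the case notes.

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class EncryptionZoneTouch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Supplied per run instead of cluster wide; URI truncated as in the case notes.
        conf.set("dfs.encryption.key.provider.uri", "kms://");
        FileSystem fs = FileSystem.get(new URI("hdfs://prodmid:8020"), conf);
        // Equivalent of the successful "hdfs dfs -touchz" test above.
        fs.create(new Path("/jad/integration/intprod/use_case_11/MyTestFile")).close();
    }
}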

Needed By Quarter