IBM Data and AI Ideas Portal for Customers


Use this portal to open public enhancement requests against products and services offered by the IBM Data & AI organization. To view all of your ideas submitted to IBM, create and manage groups of ideas, or create an idea explicitly set to be either visible by all (public) or visible only to you and IBM (private), use the IBM Unified Ideas Portal (https://ideas.ibm.com).


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:


Search existing ideas

Start by searching and reviewing ideas and requests to enhance a product or service. Take a look at ideas others have posted, and add a comment, vote, or subscribe to updates on them if they matter to you. If you can't find what you are looking for, post a new idea.


Post your ideas

Post ideas and requests to enhance a product or service. Take a look at ideas others have posted and upvote them if they matter to you:

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea


Specific links you will want to bookmark for future use

Welcome to the IBM Ideas Portal (https://www.ibm.com/ideas) - Use this site to find out additional information and details about the IBM Ideas process and statuses.

IBM Unified Ideas Portal (https://ideas.ibm.com) - Use this site to view all of your ideas, create new ideas for any IBM product, or search for ideas across all of IBM.

ideasibm@us.ibm.com - Use this email to suggest enhancements to the Ideas process or request help from IBM for submitting your Ideas.

IBM Employees should enter Ideas at https://ideas.ibm.com



DataStage

Showing 445 ideas

DataStage - orchadmin cp command - copy the schema and contents to the target

The orchadmin cp command doesn't copy the schema if the target file already exists. IHAC knows that it works fine if the target is deleted before orchadmin cp is executed. However, the man page is not clear: "Copy the schema, contents and preserve-partiti...
over 3 years ago in DataStage 0 Not under consideration

Create SF_USE_LOCAL_TIMEZONE_DATEFIELD functionality for Timestamp fields

There seems to be an issue converting Timestamp fields extracted from Salesforce to their correct timezone with the SF_USE_LOCAL_TIMEZONE_DATEFIELD setting that was suggested to me by IBM Support. It should be possible to have that same functional...
over 3 years ago in DataStage 0 Not under consideration

WSDL 2.0 support enhancement

We would like DataStage to support WSDL 2.0 importing.
over 3 years ago in DataStage 0 Not under consideration

Have to flip back and forth between Flow Designer and the Desktop Client

We are wondering if there is already a patch for this, or if we can request an enhancement, for something we see when using Flow Designer. Because we are building jobs in Flow Designer and using the Hierarchical stage, which can only be used in Flow...
almost 2 years ago in DataStage 0 Not under consideration

Support for functionality to retrieve job logs via REST API

Currently the DataStage Flow Designer REST API does not have a function to retrieve logs upon job completion. We would like to see such functionality added, similar to the dsjob -logsum and -logdetail options. Thank you!
almost 2 years ago in DataStage 0 Not under consideration
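For reference, the existing dsjob options mentioned above take the project and job name on the command line. A minimal sketch of a wrapper that builds those command lines (the project and job names shown in the usage are hypothetical):

```python
from typing import List


def dsjob_logsum_cmd(project: str, job: str) -> List[str]:
    """Build the dsjob command line that summarizes a job's log entries."""
    return ["dsjob", "-logsum", project, job]


def dsjob_logdetail_cmd(project: str, job: str, event_id: int) -> List[str]:
    """Build the dsjob command line that prints the full text of one log event."""
    return ["dsjob", "-logdetail", project, job, str(event_id)]


# Hypothetical usage: run the built command with subprocess on an engine tier
# host, e.g. subprocess.run(dsjob_logsum_cmd("MyProject", "MyJob"), ...).
```

The idea above asks for the same information to be exposed through the REST API instead of requiring shell access to the engine tier.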

Support for Custom URL within File connector HDFS

We need to import data files and folders, as well as the file structure, to IGC from Atlas HDFS, which is not possible when using a Custom URL.
over 3 years ago in DataStage 4 Not under consideration

Enhance DS Git integration for proper segregation between DTAP environments (TS003975527)

Currently, while using the Git integration from DataStage Flow Designer, it is easy for any user to connect to any GitLab repository they have access to and upload jobs into it. The issue comes when, on a Production environment, for example, you can lo...
over 3 years ago in DataStage 0 Not under consideration

AWS Lambda

We want to call a Lambda function that triggers another process on AWS; this needs to happen immediately after we successfully land the file in the S3 bucket.
almost 2 years ago in DataStage 3 Not under consideration
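A flow like this is commonly wired up today with an S3 event notification that invokes a Lambda handler when the object lands. A minimal handler sketch (the bucket and key in the test event are hypothetical; the actual downstream trigger is left as a comment):

```python
import json


def lambda_handler(event, context):
    """Hypothetical Lambda handler fired by an S3 PUT event notification.

    Extracts the bucket and key of each newly landed object. In a real
    deployment, this is where the downstream process would be started
    (e.g. invoking another service via boto3).
    """
    landed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Trigger the next step of the pipeline for this object here.
        landed.append({"bucket": bucket, "key": key})
    return {"statusCode": 200, "body": json.dumps(landed)}
```

The handler can be exercised locally by passing a sample S3 event dictionary; the enhancement request is for DataStage itself to emit such a trigger natively rather than relying on this external wiring.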

DataStage Super Operator should have Run Now + Stop ability parameterised

In our production DataStage 11.5 environment we submit all of our jobs from the command line via Linux scripts, so that jobs are only submitted via an external scheduled job. This gives a degree of control over accidental updates to the produ...
almost 4 years ago in DataStage 3 Functionality already exists

BigQuery connector certified for Balanced Optimization

In the customer case referenced, the customer asks for this enhancement: certify the BigQuery connector for Balanced Optimization, in order to obtain optimized jobs for that connector as well.
almost 2 years ago in DataStage 0 Not under consideration