IBM Data and AI Ideas Portal for Customers


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:

Post your ideas

Post ideas and requests to enhance a product or service. Take a look at ideas others have posted and upvote them if they matter to you:

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea

Help IBM prioritize your ideas and requests

The IBM team may need your help to refine an idea, so they may ask you for more information or feedback. The product management team will then decide whether they can begin working on it. If work can start during the next development cycle, the idea is added to the priority list. Each team at IBM works on its own schedule: some ideas can be implemented right away, while others are placed on a later release.

Receive notification on the decision

Some ideas can be implemented at IBM, while others may not fit within the development plans for the product. In either case, the team will let you know as soon as possible. In some cases, we may be able to suggest alternatives for ideas that cannot be implemented in a reasonable time.

Additional Information

To view our roadmaps: http://ibm.biz/Data-and-AI-Roadmaps

Reminder: this is not the place to submit defects or support needs; please use the normal support channels for those cases.

IBM Employees:

The correct URL for entering your ideas is: https://hybridcloudunit-internal.ideas.aha.io



Filter by category: DataStage (showing 340 ideas)

DataStage - orchadmin cp command - copy the schema and contents to the target

The orchadmin cp command doesn't copy the schema if the target file already exists. IHAC knows that it works fine if the target is deleted and orchadmin cp is executed. However, the man page is not clear. Copy the schema, contents and preserve-partitio...
about 1 year ago in DataStage 0 Not under consideration
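The workaround described above can be sketched as a short script. This is a sketch only: the dataset names are hypothetical, and `orchadmin rm`/`cp` subcommand behavior should be confirmed against your DataStage version's documentation.

```
# Hypothetical dataset names; orchadmin is the DataStage parallel
# dataset utility. Removing the target first forces cp to recreate
# it, so the source schema is carried over along with the contents.
orchadmin rm target.ds
orchadmin cp source.ds target.ds
```

The idea, as stated, is for `cp` to handle the schema correctly even when the target already exists, making the preliminary `rm` unnecessary.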

Create SF_USE_LOCAL_TIMEZONE_DATEFIELD functionality for Timestamp fields

There seems to be an issue converting TimeStamp fields extracted from Salesforce to their correct timezone with the SF_USE_LOCAL_TIMEZONE_DATEFIELD setting that was suggested to me by IBM Support. It should be possible to have that same functional...
about 1 year ago in DataStage 0 Future consideration

WSDL 2.0 support enhancement

We would like DataStage to support WSDL 2.0 importing.
about 1 year ago in DataStage 0 Not under consideration

Add search capability for stages on the designer canvas

If you have very large sequences, it can be difficult to find the job you need to change, as you have to scroll through the whole sequence. It would be very handy to be able to search the designer canvas based on different criteria (stage...
almost 3 years ago in DataStage 1 Planned for future release

Import a DataStage job into a different environment on a different server (Git integration)

Currently, the script dfdGitCli.sh doesn't allow a job to be imported into a slightly different project name on a different server, e.g. from project_dev on erads11.ci.datastage.com to project_tst on erads12.ci.datastage.com. Given that our work ...
7 months ago in DataStage 3 Not under consideration

IBM DataStage jobs migrated from V11.5 to V11.7 using the Azure File System for the storage account

We have migrated our jobs from a V11.5 to a V11.7 environment and use a soft link to the Azure file system (storage account) for users' input/output files. Users are not allowed to drop files on the server's local file system due to an Azure server restriction on our en...
over 1 year ago in DataStage 3 Is a defect

Generate schema from SQL statement

In the DB2 Connector stage it should be possible to generate the schema based on a (hand-coded) SELECT statement. It could be an additional option of the Tools button ("Generate schema/columns").
about 4 years ago in DataStage 0 Future consideration

Project-specific Git repository in the Git Setup server tab of the Dataflow Designer tool

Summary of observation: we have different projects in our DataStage application. Each project has its own Git repository in Bitbucket, and we are facing 2 different missing functionalities in the DFD tool. Scenario 1: As per our Git repository se...
7 months ago in DataStage 1 Not under consideration

Support for Custom URL within File connector HDFS

We need to import data files and folders, as well as the file structure, to IGC from Atlas HDFS, which is not possible when using a custom URL.
about 1 year ago in DataStage 4 Not under consideration

Kafka connector should be able to read only committed messages

Our Kafka producer writes messages to a Kafka topic transactionally and may roll back the transaction when certain events occur. As of now, the Kafka connector also reads uncommitted messages (see case TS003289786)...
almost 2 years ago in DataStage 1 Delivered
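The behavior requested above corresponds to a standard setting in the Apache Kafka consumer itself; the idea, as delivered, presumably exposes the equivalent through the DataStage connector. A minimal sketch of the underlying consumer configuration (property names are from the Apache Kafka client, not from the connector's UI):

```
# Read only messages from committed transactions; records from
# aborted or still-open transactions are skipped by the consumer.
isolation.level=read_committed

# The Kafka default is read_uncommitted, which returns all records
# regardless of transaction outcome.
```

With `read_committed`, the consumer's visible end of the partition is bounded by the last stable offset, so messages after an open transaction are held back until that transaction commits or aborts.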