IBM Data and AI Ideas Portal for Customers


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:

Post your ideas

Post ideas and requests to enhance a product or service. Take a look at ideas others have posted and upvote them if they matter to you.

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea

Help IBM prioritize your ideas and requests

The IBM team may need your help to refine an idea, so they may ask you for more information or feedback. The product management team will then decide whether they can begin working on your idea. If they can start during the next development cycle, they will put the idea on the priority list. Each team at IBM works on a different schedule, so some ideas can be implemented right away while others may be placed on a longer-term schedule.

Receive notification on the decision

Some ideas can be implemented at IBM, while others may not fit within the development plans for the product. In either case, the team will let you know as soon as possible. In some cases, we may be able to find alternatives for ideas which cannot be implemented in a reasonable time.

Additional Information

To view our roadmaps: http://ibm.biz/Data-and-AI-Roadmaps

Reminder: This is not the place to submit defects or support needs; please use the normal support channels for those cases.

IBM Employees:

The correct URL for entering your ideas is: https://hybridcloudunit-internal.ideas.aha.io



Connectivity

Showing 137 ideas

SAS ODBC Connectivity with DataStage

Need the capability to use a SAS ODBC connection with DataStage.
over 5 years ago in Connectivity 3 Future consideration

In DataStage, wildcards in the Azure Data Lake Storage connector are only allowed in the filename, not in the filepath

In the Azure Data Lake Storage connector DataStage can read multiple files at once by specifying wildcards. But these wildcards are only allowed in the filename, not in the filepath. This means that all the files we want to read have to be present...
9 months ago in Connectivity 0 Future consideration
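The request above concerns the connector's file pattern, which today can only wildcard the file name, not the directory part of the path. As a rough, hedged illustration of path-level matching done outside the connector, the Python sketch below lists paths with the azure-storage-file-datalake SDK and filters them with a full-path wildcard; the account URL, credential, filesystem name, and pattern are all hypothetical.

```python
import fnmatch
from azure.storage.filedatalake import DataLakeServiceClient

# Hypothetical account, credential, and filesystem.
service = DataLakeServiceClient(
    account_url="https://myaccount.dfs.core.windows.net",
    credential="<account-key>",
)
fs = service.get_file_system_client("landing")

# Match a wildcard against the whole path, so files spread across
# date partitions (raw/2024-*/...) are picked up, not just one folder.
pattern = "raw/2024-*/orders_*.csv"
matches = [
    p.name
    for p in fs.get_paths(path="raw", recursive=True)
    if not p.is_directory and fnmatch.fnmatch(p.name, pattern)
]
print(matches)
```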

Application Continuity for Data Stage - Oracle DB 19c

The Oracle 12.2 version has a feature called 'Application Continuity'. Application Continuity is used to improve the user experience when handling both unplanned outages and planned maintenance. It masks outages from end users and applications by ...
over 2 years ago in Connectivity 3 Future consideration

Enable renaming input/output links of a Hierarchical stage

Currently it is impossible to rename the input/output links of a Hierarchical stage without putting the assembly in an invalid state. See Case: TS000177036
about 4 years ago in Connectivity 0 Future consideration

Reading data from Amazon Simple Queue Service (SQS)

Need DataStage integration with Amazon SQS (Simple Queue Service) to exchange JSON files (see the sketch below).
over 4 years ago in Connectivity 0 Future consideration
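There is no SQS connector in DataStage today; purely as a point of reference for what such an integration would consume, here is a minimal Python sketch that reads JSON message bodies from a queue with boto3 and deletes them after processing. The region, queue URL, and the assumption that bodies are JSON are all hypothetical.

```python
import json
import boto3

sqs = boto3.client("sqs", region_name="us-east-1")
queue_url = "https://sqs.us-east-1.amazonaws.com/123456789012/example-queue"

# Long-poll for up to 10 messages at a time.
resp = sqs.receive_message(
    QueueUrl=queue_url,
    MaxNumberOfMessages=10,
    WaitTimeSeconds=20,
)
for msg in resp.get("Messages", []):
    record = json.loads(msg["Body"])  # assumes each body is a JSON document
    print(record)
    # Remove the message once it has been handled.
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
```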

Support for PostgreSQL and EnterpriseDB through ODBC

The officially supported ODBC drivers from DataDirect do not support PostgreSQL versions above 9.2, and EnterpriseDB (the commercial implementation of PostgreSQL) is not supported at all. PostgreSQL and EnterpriseDB are increasingly used; it is ridi...
over 5 years ago in Connectivity 0 Future consideration
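Outside the bundled DataDirect drivers, the community psqlODBC driver does reach current PostgreSQL and EnterpriseDB releases. As a minimal sketch of what a connection through such a driver looks like, assuming a DSN named PostgresDSN has been defined in odbc.ini (the DSN name and credentials are hypothetical):

```python
import pyodbc

# Connect through a hypothetical psqlODBC DSN defined in odbc.ini.
conn = pyodbc.connect("DSN=PostgresDSN;UID=etl_user;PWD=secret")
cur = conn.cursor()
cur.execute("SELECT version()")
print(cur.fetchone()[0])
conn.close()
```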

Include 'NTLM' as an authentication method in DataStage Hierarchical stage

We have an important project that needs to source data from a SharePoint page. The default authentication method in our SharePoint farm is 'NTLM', which is not supported by the Hierarchical stage. It's impossible to enable the other method on our...
almost 2 years ago in Connectivity 3 Not under consideration
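For context on what the requested authentication method involves, here is a minimal, hedged Python sketch of an NTLM-authenticated call to a SharePoint REST endpoint using requests and requests_ntlm; the site URL and DOMAIN\user credentials are hypothetical, and this is independent of the Hierarchical stage itself.

```python
import requests
from requests_ntlm import HttpNtlmAuth

# Hypothetical SharePoint REST endpoint and service account.
url = "https://sharepoint.example.com/sites/team/_api/web/lists"
resp = requests.get(
    url,
    auth=HttpNtlmAuth("DOMAIN\\svc_account", "secret"),
    headers={"Accept": "application/json;odata=verbose"},
)
resp.raise_for_status()
print(resp.json())
```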

DataStage connectivity for Microsoft Azure Cosmos DB SQL API (noSQL)

As you've already done for the Azure Cosmos DB Cassandra API, can you also develop a connector for the Azure Cosmos DB SQL API? (A sketch of that API follows below.)
over 2 years ago in Connectivity 3 Future consideration
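No such connector exists yet; as a sketch of the API a connector would wrap, here is a minimal query against the Cosmos DB SQL (Core) API using the azure-cosmos Python SDK. The endpoint, key, database, container, and query are all hypothetical.

```python
from azure.cosmos import CosmosClient

# Hypothetical account endpoint and key.
client = CosmosClient(
    "https://myaccount.documents.azure.com:443/",
    credential="<account-key>",
)
container = client.get_database_client("sales").get_container_client("orders")

# Parameterized SQL-style query over the container.
items = container.query_items(
    query="SELECT c.id, c.total FROM c WHERE c.region = @region",
    parameters=[{"name": "@region", "value": "EMEA"}],
    enable_cross_partition_query=True,
)
for item in items:
    print(item)
```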

Support for Apache Accumulo through DataWave

USG customer uses Apache Accumulo as one of their main data repositories. Need CP4D to be able to use Accumulo as a source and target data source for Information Governance, Data Science, Virtualization, and AI functionality (All of CP4D)
over 2 years ago in Connectivity 2 Future consideration

Suppress metadata check in ODBC connector

The ODBC connector compares the metadata in DataStage with the metadata in the target system. When there is a mismatch in the metadata, warnings are issued in the DataStage Director log. Please add an option so this check can be disabled for speci...
over 3 years ago in Connectivity 0 Future consideration