IBM Data and AI Ideas Portal for Customers


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:

Post your ideas

Post ideas and requests to enhance a product or service. Take a look at ideas others have posted and upvote them if they matter to you.

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea

Help IBM prioritize your ideas and requests

The IBM team may need your help to refine an idea, so they may ask you for more information or feedback. The product management team will then decide whether they can begin working on your idea. If they can start during the next development cycle, they will put the idea on the priority list. Each team at IBM works on a different schedule: some ideas can be implemented right away, while others may be scheduled for a later cycle.

Receive notification on the decision

Some ideas can be implemented at IBM, while others may not fit within the development plans for the product. In either case, the team will let you know as soon as possible. In some cases, we may be able to find alternatives for ideas which cannot be implemented in a reasonable time.

Additional Information

To view our roadmaps: http://ibm.biz/Data-and-AI-Roadmaps

Reminder: This is not the place to submit defects or support needs; please use the normal support channels for those cases.

IBM Employees:

The correct URL for entering your ideas is: https://hybridcloudunit-internal.ideas.aha.io



Category: Connectivity (showing 141 ideas)

Snowflake Connector Error handling

1) We encountered an issue while using the insert or update option. The 'continue on error' option does not work, and the job fails if it encounters bad data. We opened a case with IBM. There is no reject capture mechanism. Workaround: Two step solut...
almost 2 years ago in Connectivity 6 Future consideration
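The workaround above is truncated; purely as an illustration, a common two-step pattern for Snowflake (stage the rows with ON_ERROR = 'CONTINUE', capture the rejects, then MERGE into the target) might look like the Python sketch below. The account, stage, and table names are hypothetical, and this is not necessarily the submitter's exact workaround.

    import snowflake.connector  # pip install snowflake-connector-python

    # Hypothetical account and credentials; replace with real values.
    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="ETL_WH", database="SALES", schema="STAGING",
    )
    cur = conn.cursor()

    # Step 1: load the file into a staging table, skipping bad rows
    # instead of failing the whole load.
    cur.execute("""
        COPY INTO STAGING.ORDERS_STG
        FROM @ORDERS_STAGE/orders.csv
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'CONTINUE'
    """)

    # Capture the rows rejected by the last COPY for later inspection.
    cur.execute("""
        CREATE OR REPLACE TABLE STAGING.ORDERS_REJECTS AS
        SELECT * FROM TABLE(VALIDATE(STAGING.ORDERS_STG, JOB_ID => '_last'))
    """)

    # Step 2: upsert the clean rows into the target table.
    cur.execute("""
        MERGE INTO SALES.ORDERS t
        USING STAGING.ORDERS_STG s ON t.ORDER_ID = s.ORDER_ID
        WHEN MATCHED THEN UPDATE SET t.AMOUNT = s.AMOUNT
        WHEN NOT MATCHED THEN INSERT (ORDER_ID, AMOUNT) VALUES (s.ORDER_ID, s.AMOUNT)
    """)
    conn.close()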

Data Direct Driver for Presto DB Connectivity

Need a DataDirect ODBC/JDBC driver to connect to Presto DB. Currently, Presto DB connectivity is certified using a JDBC driver with the external Simba JDBC Presto driver, but not through the internal/native DataDirect drivers.
over 2 years ago in Connectivity 1 Future consideration
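For context, a minimal Python sketch of how Presto connectivity is typically reached today through an open-source client (the presto-python-client package is assumed here; host, catalog, and schema are placeholders) rather than a native DataDirect driver:

    import prestodb  # pip install presto-python-client

    # Hypothetical coordinator host, catalog, and schema.
    conn = prestodb.dbapi.connect(
        host="presto.example.com",
        port=8080,
        user="etl_user",
        catalog="hive",
        schema="default",
    )
    cur = conn.cursor()
    cur.execute("SELECT COUNT(*) FROM orders")
    print(cur.fetchone())
    conn.close()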

ODBC - Bulk Load capability

CP4D - DataStage will be used in the data flows defined in our data lake using Postgres and Azure SQL. Because we are required to use the ODBC Connector for these connections, we require the capability of doing bulk loads, exactly as this is offered ...
12 months ago in Connectivity 2 Not under consideration
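To illustrate the difference a bulk load makes, here is a hedged Python sketch of PostgreSQL's COPY path via psycopg2 (connection details and table names are placeholders; this shows the database-side capability, not the DataStage ODBC connector itself):

    import psycopg2  # pip install psycopg2-binary

    # Hypothetical connection details.
    conn = psycopg2.connect(host="pg.example.com", dbname="lake",
                            user="etl", password="***")
    cur = conn.cursor()

    # Bulk path: stream a CSV file through COPY instead of issuing one
    # INSERT per row, which is typically orders of magnitude faster.
    with open("orders.csv") as f:
        cur.copy_expert(
            "COPY staging.orders FROM STDIN WITH (FORMAT csv, HEADER true)", f)

    conn.commit()
    conn.close()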

Include 'NTLM' as an authentication method in DataStage Hierarchical stage

We have an important project that needs to source data from a SharePoint page. The default authentication method in our SharePoint farm is 'NTLM', which is not supported by the Hierarchical stage. It's impossible to enable the other method on our...
about 2 years ago in Connectivity 3 Not under consideration
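For reference, NTLM authentication against an on-premises SharePoint REST endpoint looks roughly like this Python sketch using the requests-ntlm package (the site URL and service account are hypothetical); the ask above is for the Hierarchical stage to support the same handshake natively.

    import requests
    from requests_ntlm import HttpNtlmAuth  # pip install requests-ntlm

    # Hypothetical on-premises SharePoint site and service account.
    url = "https://sharepoint.example.com/sites/finance/_api/web/lists"
    resp = requests.get(
        url,
        auth=HttpNtlmAuth("DOMAIN\\svc_datastage", "***"),
        headers={"Accept": "application/json;odata=verbose"},
    )
    resp.raise_for_status()
    print(resp.json()["d"]["results"][0]["Title"])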

IIAS-only connector (like Netezza)

Nowadays, we have to use the DB2 connector when we develop jobs on IIAS (IBM Integrated Analytics System), so they want an IIAS-only connector, similar to the Netezza connector. If there is no plan for an IIAS-only connector, then they want to separate DB2 a...
over 3 years ago in Connectivity 0 Not under consideration

Azure Synapse Connector

A dedicated connector stage for Azure Synapse would be helpful for organizations with a multi-cloud presence. The current version of Cloud Pak doesn't have this feature.
9 months ago in Connectivity 0 Future consideration
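Until a dedicated stage exists, Azure Synapse dedicated SQL pools can be reached over the SQL Server (TDS) protocol; a hedged Python sketch using pyodbc and Microsoft's ODBC driver, with a placeholder workspace name and credentials:

    import pyodbc  # pip install pyodbc

    # Hypothetical Synapse workspace endpoint; Synapse speaks the
    # SQL Server (TDS) protocol, so the generic SQL Server driver works.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=myworkspace.sql.azuresynapse.net,1433;"
        "DATABASE=sqlpool01;UID=etl_user;PWD=***;Encrypt=yes;"
    )
    cur = conn.cursor()
    cur.execute("SELECT TOP 5 * FROM dbo.orders")
    for row in cur.fetchall():
        print(row)
    conn.close()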

Redshift connector slow in IIS DataStage for Insert/Update

We have a use case in which we need to process records from IIS DataStage into an AWS Redshift instance. When using the stage as currently implemented, we're seeing a throughput that averages 2-6 records per second, which is far below the acceptable ...
about 1 year ago in Connectivity 3 Planned for future release
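For comparison, the load pattern Redshift's own documentation recommends for high throughput is staging files in S3 and issuing a COPY rather than per-row INSERT/UPDATE statements; a hedged Python sketch follows (the bucket, cluster endpoint, and IAM role are placeholders, and this is not necessarily how the connector change will be delivered):

    import boto3      # pip install boto3
    import psycopg2   # Redshift speaks the PostgreSQL protocol

    # Hypothetical bucket, cluster endpoint, and IAM role.
    boto3.client("s3").upload_file("orders.csv", "my-etl-bucket",
                                   "incoming/orders.csv")

    conn = psycopg2.connect(
        host="my-cluster.abc123.us-east-1.redshift.amazonaws.com",
        port=5439, dbname="dw", user="etl", password="***",
    )
    cur = conn.cursor()

    # Bulk-load the staged file; COPY parallelizes across slices and is
    # far faster than per-row inserts.
    cur.execute("""
        COPY staging.orders
        FROM 's3://my-etl-bucket/incoming/orders.csv'
        IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopy'
        FORMAT AS CSV IGNOREHEADER 1
    """)
    conn.commit()
    conn.close()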

Teradata connector restricts max Response Row size to 64K bytes only

We have data sources where column field lengths are in excess of varchar(3000). When using the Teradata connector in immediate mode, the job fails with the error "RDBMS code 9990: Response Row size exceeds 64K bytes and is incompatible with the Client softwa...
9 months ago in Connectivity 0 Future consideration

Application Continuity for Data Stage - Oracle DB 19c

Oracle Database 12.2 has a feature called 'Application Continuity'. Application Continuity is used to improve the user experience when handling both unplanned outages and planned maintenance. It masks outages from end users and applications by ...
over 2 years ago in Connectivity 4 Future consideration

Add option to use FTPS protocol in FTP Enterprise Stage in DataStage parallel jobs

Add an option to use the FTPS protocol in the FTP Enterprise stage in DataStage parallel jobs. DataStage has only allowed the SFTP and FTP protocols for a long time.
almost 2 years ago in Connectivity 1 Future consideration
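For reference, FTPS (FTP over TLS) is distinct from SFTP; Python exposes it through ftplib.FTP_TLS, as in this minimal sketch with placeholder host, credentials, and file paths, which illustrates the behaviour the stage would need to add:

    from ftplib import FTP_TLS

    # Hypothetical FTPS server and credentials.
    ftps = FTP_TLS("ftps.example.com")
    ftps.login("etl_user", "***")
    ftps.prot_p()                      # switch the data channel to TLS

    with open("orders.csv", "wb") as f:
        ftps.retrbinary("RETR /outbound/orders.csv", f.write)

    ftps.quit()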