IBM Data and AI Ideas Portal for Customers


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:

Post your ideas

Post ideas and requests to enhance a product or service. Take a look at ideas others have posted, and upvote the ones that matter to you:

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea

Help IBM prioritize your ideas and requests

The IBM team may need your help to refine an idea, so they may ask you for more information or feedback. The product management team then decides whether they can begin working on the idea; if they can start during the next development cycle, they add it to the priority list. Each team at IBM works on its own schedule: some ideas can be implemented right away, while others must wait for a later cycle.

Receive notification on the decision

Some ideas can be implemented at IBM, while others may not fit within the development plans for the product. In either case, the team will let you know as soon as possible. In some cases, we may be able to find alternatives for ideas which cannot be implemented in a reasonable time.

Additional Information

To view our roadmaps: http://ibm.biz/Data-and-AI-Roadmaps

Reminder: This is not the place to submit defects or support needs; please use the normal support channels for those cases.

IBM Employees:

The correct URL for entering your ideas is: https://hybridcloudunit-internal.ideas.aha.io



FILTER BY CATEGORY

Connectivity

Showing 197

Salesforce Plugin Authentication Enhancement

Update 6/23 - The main question for Aha is NTLM authentication support for the Salesforce connector. See below for more details from Dayo Nelson (images attached): We currently use the Salesforce plugin stage with proxy and no proxy ID and password. The pro...
8 months ago in Connectivity 1 Future consideration

Support for latest Snowflake JDBC Connectivity on AIX

3.10.3 is the last version of the Snowflake JDBC driver supported on AIX. Partner with Snowflake to build a more current version of the JDBC driver for InfoSphere on AIX.
8 months ago in Connectivity 2 Future consideration

DataStage Designer - Fields for Siebel Business Objects or Components returning wrong values

This issue has been experienced on both InfoSphere Information Server 11.5 and 11.7.1.1. After submitting an IBM PMR to try to resolve the issue, the final response from IBM Support was: Our Engineering team reviewed the issue and have advised th...
9 months ago in Connectivity 2 Not under consideration

Support for MongoDB version 4.x

Currently there are ODBC drivers for older versions of MongoDB, but not for the latest version.
9 months ago in Connectivity 1 Future consideration

ODBC and JDBC support for CockroachDB

Getting the following error messages when trying to connect to CockroachDB using the ODBC PostgreSQL Wire Protocol driver or the JDBC driver:
9 months ago in Connectivity 2 Future consideration

Extending temp file creation in BQ to include microseconds

The file names do not include the full microseconds: the timestamp stops at 3 digits (milliseconds), so all the files are created with 4 zeroes at the end. Node 4 and node 21 tried to create a file with the same name bigQueryTempFile20210415133...
9 months ago in Connectivity 0 Delivered
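For illustration only (this is not the connector's actual code): a minimal Java sketch of how a millisecond-precision timestamp can produce colliding temp file names across parallel nodes, and how extending the pattern to microseconds avoids it. Only the bigQueryTempFile prefix comes from the idea text; the class, method, and node numbers are hypothetical.

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class TempFileNameSketch {
    // Millisecond precision (3 fractional digits), as described in the idea:
    // two nodes starting in the same millisecond generate the same name.
    private static final DateTimeFormatter MILLIS =
            DateTimeFormatter.ofPattern("yyyyMMddHHmmssSSS");

    // Microsecond precision (6 fractional digits) makes a collision far less likely.
    private static final DateTimeFormatter MICROS =
            DateTimeFormatter.ofPattern("yyyyMMddHHmmssSSSSSS");

    static String tempFileName(DateTimeFormatter fmt, int nodeNumber) {
        // Including the node number in the name would also avoid collisions,
        // independently of the timestamp precision.
        return "bigQueryTempFile" + LocalDateTime.now().format(fmt) + "_node" + nodeNumber;
    }

    public static void main(String[] args) {
        System.out.println(tempFileName(MILLIS, 4));
        System.out.println(tempFileName(MICROS, 21));
    }
}
```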

BQ connector parallelism solution for export job flow

The BQ connector has issues exporting data from BQ to the DataStage on-premises solution in parallel mode. We would like it to work in parallel.
10 months ago in Connectivity 1 Delivered

Support Oracle 19c multitenant on datastage

Oracle 19c is now the next long-term support version of Oracle. A new piece of functionality in Oracle is Multitenant. We would like the option to use Multitenant both when using the Oracle database as the DataStage repository and when connecting from DataStage...
11 months ago in Connectivity 1 Functionality already exists
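As background on what connecting to a multitenant database typically involves: each pluggable database (PDB) is reached through its own service name, so a JDBC client uses a service-based thin URL rather than a SID. A minimal sketch, assuming a hypothetical host dbhost, PDB service MYPDB, and credentials; the actual DataStage or repository configuration is not shown in the idea.

```java
import java.sql.Connection;
import java.sql.DriverManager;

public class PdbConnectSketch {
    public static void main(String[] args) throws Exception {
        // Service-name form of the Oracle thin JDBC URL; a PDB is addressed by its
        // service (MYPDB here) rather than by the container database's SID.
        String url = "jdbc:oracle:thin:@//dbhost:1521/MYPDB"; // hypothetical host/service
        try (Connection conn = DriverManager.getConnection(url, "dsadm", "password")) {
            System.out.println("Connected to: " + conn.getMetaData().getURL());
        }
    }
}
```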

Connection Pooling for DataStage

We would like DataStage to have the ability to use a connection pool. We have thousands of connects and disconnects in Db2, which stresses LDAP and uses a lot of resources in Db2. It also costs a lot of unnecessary time.
11 months ago in Connectivity 3 Not under consideration
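To make the request concrete, here is a generic JDBC connection-pooling sketch, using HikariCP purely as an illustration and not as anything DataStage provides: connections to Db2 are opened once, kept in a pool, and reused, so each job borrows a connection instead of doing a fresh connect/disconnect that hits LDAP every time. All names and settings below are hypothetical.

```java
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class Db2PoolSketch {
    public static void main(String[] args) throws Exception {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl("jdbc:db2://db2host:50000/WAREHSE"); // hypothetical Db2 database
        config.setUsername("etluser");
        config.setPassword("secret");
        config.setMaximumPoolSize(10); // at most 10 physical Db2 connections
        config.setMinimumIdle(2);      // keep a couple warm instead of reconnecting

        try (HikariDataSource pool = new HikariDataSource(config)) {
            // Each "job" borrows a pooled connection; no new Db2 login (and no LDAP hit)
            // is needed as long as an idle connection is available in the pool.
            try (Connection conn = pool.getConnection();
                 PreparedStatement stmt = conn.prepareStatement("SELECT 1 FROM SYSIBM.SYSDUMMY1");
                 ResultSet rs = stmt.executeQuery()) {
                rs.next();
            }
        }
    }
}
```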

Increase data block size when writing to BQ/Google Cloud

Maximize your single stream performance by uploading >16MB objects if you can control the object size, as there is a significant knee of the curve as the object size increases from 2 to 16 MB. Doing so will very likely also decrease your Cloud ...
11 months ago in Connectivity 0 Delivered
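A hedged illustration of the underlying point, independent of whatever the DataStage connector itself exposes: when streaming an object to Google Cloud Storage with the google-cloud-storage Java client, the upload chunk size can be raised (for example to 16 MB) so each request carries a larger block. The bucket and object names below are hypothetical.

```java
import com.google.cloud.WriteChannel;
import com.google.cloud.storage.BlobId;
import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class LargeChunkUploadSketch {
    public static void main(String[] args) throws Exception {
        Storage storage = StorageOptions.getDefaultInstance().getService();
        BlobInfo blob = BlobInfo.newBuilder(BlobId.of("my-staging-bucket", "export/part-0001")).build();

        try (WriteChannel writer = storage.writer(blob)) {
            // Raising the chunk size to 16 MB means fewer, larger requests per object,
            // which is where the single-stream throughput gain comes from.
            writer.setChunkSize(16 * 1024 * 1024);
            ByteBuffer data = ByteBuffer.wrap("example payload".getBytes(StandardCharsets.UTF_8));
            writer.write(data);
        }
    }
}
```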