IBM Data and AI Ideas Portal for Customers


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:

Post your ideas

Post ideas and requests to enhance a product or service. Take a look at ideas others have posted and upvote them if they matter to you:

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea

Help IBM prioritize your ideas and requests

The IBM team may need your help to refine an idea, so they may ask you for more information or feedback. The product management team will then decide whether they can begin working on your idea. If they can start during the next development cycle, they will put the idea on the priority list. Each team at IBM works on a different schedule: some ideas can be implemented right away, while others may be scheduled for later.

Receive notification on the decision

Some ideas can be implemented at IBM, while others may not fit within the development plans for the product. In either case, the team will let you know as soon as possible. In some cases, we may be able to find alternatives for ideas which cannot be implemented in a reasonable time.

Additional Information

To view our roadmaps: http://ibm.biz/Data-and-AI-Roadmaps

Reminder: This is not the place to submit defects or support needs; please use the normal support channels for those cases.

IBM Employees:

The correct URL for entering your ideas is: https://hybridcloudunit-internal.ideas.aha.io



Category: Connectivity

Showing 191 ideas

Salesforce connector: Support SFDC Bulk API 2.0

Bulk API 2.0 simplifies the API and improves performance through its ability to do chunking automatically: https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/asynch_api_intro.htm
over 1 year ago in Connectivity 0 Planned for future release
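
For context on the request above, here is a minimal sketch of the Bulk API 2.0 ingest flow in which Salesforce handles the chunking server-side; the instance URL, token, object, and file name are placeholders, and error handling is omitted.

```python
import requests

INSTANCE = "https://example.my.salesforce.com"   # hypothetical org URL
TOKEN = "ACCESS_TOKEN"                           # placeholder OAuth access token
API = f"{INSTANCE}/services/data/v58.0/jobs/ingest"
JSON_HEADERS = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

# 1. Create the ingest job; Bulk API 2.0 batches/chunks the data server-side.
job = requests.post(API, headers=JSON_HEADERS,
                    json={"object": "Account", "operation": "insert"}).json()

# 2. Upload the whole CSV in one request; Salesforce splits it into batches itself.
requests.put(f"{API}/{job['id']}/batches",
             headers={"Authorization": f"Bearer {TOKEN}", "Content-Type": "text/csv"},
             data=open("accounts.csv", "rb"))

# 3. Mark the upload complete so Salesforce starts processing the job.
requests.patch(f"{API}/{job['id']}", headers=JSON_HEADERS,
               json={"state": "UploadComplete"})
```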

DataStage to support numeric Column Family (CF) in HBase

Currently, DataStage only supports alphabetic (letter-based) Column Family (CF) names in HBase tables, yet it is common practice to create numeric CFs in HBase tables (e.g. the default CF is zero). Therefore, if a CF has a numeric name, DataStage cannot write ...
over 1 year ago in Connectivity 4 Delivered
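
As an illustration of the numeric column family case, a small sketch using the happybase Thrift client (the host, table, and data are hypothetical); the point is only that HBase itself accepts a CF named "0".

```python
import happybase

# Connect to an HBase Thrift server (hypothetical host).
connection = happybase.Connection("thrift-host.example.com")

# Create a table whose only column family has the numeric name "0".
connection.create_table("events", {"0": dict()})

# Write a cell into column family "0" to show HBase accepts it.
table = connection.table("events")
table.put(b"row-1", {b"0:payload": b"hello"})
```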

Amazon S3 connector MFA support

We would like to use the Amazon S3 connector to crawl our MFA-enabled S3 instance and pull in data for use in the Watson CP4D environment, but the current Amazon S3 connector does not support connecting to MFA-enabled environments.
3 months ago in Connectivity 1
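
As background on MFA-protected S3 access, here is a sketch of the usual pattern outside DataStage: exchange the MFA one-time code for temporary STS credentials, then call S3 with them. The device ARN and token code are placeholders.

```python
import boto3

sts = boto3.client("sts")

# Trade the MFA one-time code for temporary credentials.
creds = sts.get_session_token(
    SerialNumber="arn:aws:iam::123456789012:mfa/example-user",  # hypothetical MFA device
    TokenCode="123456",                                          # current one-time code
)["Credentials"]

# Use the temporary credentials (including the session token) for S3 calls.
s3 = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
print([b["Name"] for b in s3.list_buckets()["Buckets"]])
```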

Kafka Connector job failed when schema compatibility mode set to FULL

TS006561571: We have IIS 11.7.1.1 with SP1 installed on Red Hat 7. Users are using the Kafka Connector stage to publish messages with an Avro schema. The job succeeds when the Kafka schema compatibility is set to NONE, but it fails with the error "Schema being reg...
about 1 month ago in Connectivity 0 Submitted
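
For context, FULL compatibility requires each new schema version to be both backward and forward compatible, so a schema the connector registers successfully under NONE can be rejected under FULL. Below is a sketch of inspecting and setting a subject's compatibility level through the Schema Registry REST API; the registry URL and subject name are placeholders.

```python
import requests

REGISTRY = "http://schema-registry.example.com:8081"   # hypothetical registry URL
SUBJECT = "my_topic-value"                              # hypothetical subject name

# Read the subject-level compatibility setting.
print(requests.get(f"{REGISTRY}/config/{SUBJECT}").json())

# Explicitly set the subject to FULL compatibility.
requests.put(f"{REGISTRY}/config/{SUBJECT}", json={"compatibility": "FULL"})
```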

Denodo support

We need Denodo support for DataStage using the Denodo JDBC drivers. Currently JDBC can be used, but connecting to Denodo is not supported by IBM.
about 2 months ago in Connectivity 1 Future consideration
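
While waiting for official support, Denodo can be reached over JDBC from outside DataStage; the sketch below uses jaydebeapi, and the driver class name, URL format, and jar path are assumptions that should be checked against your Denodo installation.

```python
import jaydebeapi

conn = jaydebeapi.connect(
    "com.denodo.vdp.jdbc.Driver",                       # assumed Denodo VDP driver class
    "jdbc:vdb://denodo-host.example.com:9999/my_vdb",   # assumed JDBC URL format
    ["jdbc_user", "jdbc_password"],                     # placeholder credentials
    "/opt/denodo/lib/denodo-vdp-jdbcdriver.jar",        # assumed path to the driver jar
)
cur = conn.cursor()
cur.execute("SELECT 1")
print(cur.fetchall())
conn.close()
```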

Snowflake Connector Error handling

1) We encountered an issue while using the insert-or-update option. The continue-on-error option does not work and the job fails if it encounters bad data. We opened a case with IBM; there is no reject capture mechanism. Workaround: two-step solutio...
12 months ago in Connectivity 3 Future consideration
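
One possible shape of the two-step workaround the idea alludes to, sketched with the Snowflake Python connector: load with ON_ERROR = 'CONTINUE', then pull the rejected rows with VALIDATE(). Connection parameters, stage, and table names are placeholders.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",   # placeholder credentials
    warehouse="ETL_WH", database="ETL_DB", schema="STAGING",
)
cur = conn.cursor()

# Step 1: load the good rows, skipping bad records instead of failing the whole job.
cur.execute(
    "COPY INTO target_table FROM @my_stage/file.csv "
    "FILE_FORMAT = (TYPE = CSV) ON_ERROR = 'CONTINUE'"
)

# Step 2: capture the records rejected by the load that just ran.
cur.execute("SELECT * FROM TABLE(VALIDATE(target_table, JOB_ID => '_last'))")
rejects = cur.fetchall()
```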

DataStage Amazon S3 Connector Enhancement

Allow users to control the network path that our connections into S3 follow. What we're looking for involves providing an alternate URL for S3 communication. As an example, when run v...
9 months ago in Connectivity 2 Planned for future release
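
Outside DataStage, the "alternate URL" request corresponds to pointing an S3 client at a specific endpoint (for example a VPC interface endpoint) instead of the default public one; the endpoint and bucket below are hypothetical.

```python
import boto3

s3 = boto3.client(
    "s3",
    region_name="us-east-1",
    # Route traffic through a specific (e.g. private) endpoint instead of the default URL.
    endpoint_url="https://s3.internal.example.com",
)
s3.list_objects_v2(Bucket="my-bucket", MaxKeys=10)
```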

DataStage - Azure [Datalake] Storage Connector - Support parallelism for Parquet format

From the PMR it has been confirmed that, for parallel read/write, the CSV and Delimited formats only support parallel write operations; all other file formats do not support parallel read/write, and the Parquet format does not support parallel read/write. It is not documented...
10 months ago in Connectivity 0 Future consideration
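
The connector limitation itself has to be addressed by IBM, but as an illustration of a common workaround pattern: write the Parquet output as a partitioned dataset of many smaller files so that downstream readers can still parallelize at the file level. The sketch uses pyarrow, not DataStage, and the paths and columns are placeholders.

```python
import pyarrow as pa
import pyarrow.parquet as pq

table = pa.table({"region": ["east", "west", "east"], "value": [1, 2, 3]})

# One file per partition value; readers can process the files in parallel.
pq.write_to_dataset(table, root_path="/tmp/out_parquet", partition_cols=["region"])
```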

Include 'NTLM' as an authentication method in DataStage Hierarchical stage

We have an important project that needs to source data from a SharePoint page. The default authentication method in our SharePoint farm is 'NTLM', which is not supported by the Hierarchical stage. It's impossible to enable the other method on our...
over 1 year ago in Connectivity 3 Not under consideration
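
For reference, this is what NTLM authentication against a SharePoint REST endpoint looks like outside DataStage, using the requests-ntlm package; the site URL and credentials are placeholders.

```python
import requests
from requests_ntlm import HttpNtlmAuth

resp = requests.get(
    "https://sharepoint.example.com/sites/mysite/_api/web/lists",  # hypothetical site
    auth=HttpNtlmAuth("DOMAIN\\svc_account", "password"),          # placeholder account
    headers={"Accept": "application/json;odata=verbose"},
)
print(resp.status_code)
```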

Support for Avro Logical Data Types in Google Cloud Storage Connector Stage

Related to existing idea - "Support for Avro Logical Data Types in File Connector Stage"
5 months ago in Connectivity 0 Future consideration
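
To make the request concrete, here is an example of the Avro logical types in question, sketched with fastavro: a decimal and a timestamp-millis field, which a connector without logical-type support would surface only as raw bytes and a long. The file path and field names are hypothetical.

```python
from datetime import datetime, timezone
from decimal import Decimal
import fastavro

schema = fastavro.parse_schema({
    "type": "record", "name": "Payment",
    "fields": [
        # Decimal logical type carried on bytes.
        {"name": "amount",
         "type": {"type": "bytes", "logicalType": "decimal", "precision": 10, "scale": 2}},
        # Timestamp (milliseconds) logical type carried on long.
        {"name": "paid_at",
         "type": {"type": "long", "logicalType": "timestamp-millis"}},
    ],
})

with open("/tmp/payments.avro", "wb") as out:
    fastavro.writer(out, schema,
                    [{"amount": Decimal("19.99"),
                      "paid_at": datetime.now(timezone.utc)}])
```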