IBM Data and AI Ideas Portal for Customers


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:

Post your ideas

Post ideas and requests to enhance a product or service. Take a look at ideas others have posted and upvote them if they matter to you:

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea

Help IBM prioritize your ideas and requests

The IBM team may need your help to refine an idea, so they may ask you for more information or feedback. The product management team will then decide whether they can begin working on your idea. If they can start during the next development cycle, they will put the idea on the priority list. Each team at IBM works on its own schedule: some ideas can be implemented right away, while others may be placed on a longer-term schedule.

Receive notification on the decision

Some ideas can be implemented at IBM, while others may not fit within the development plans for the product. In either case, the team will let you know as soon as possible. In some cases, we may be able to find alternatives for ideas which cannot be implemented in a reasonable time.

Additional Information

To view our roadmaps: http://ibm.biz/Data-and-AI-Roadmaps

Reminder: This is not the place to submit defects or support needs; please use the normal support channels for those cases.

IBM Employees:

The correct URL for entering your ideas is: https://hybridcloudunit-internal.ideas.aha.io



FILTER BY CATEGORY: DataStage (showing 25)

Kafka connector should be able to read only committed messages

Our Kafka producer writes messages to a Kafka topic transactionally, and it may roll back the transaction when certain events occur. As of now, the Kafka connector also reads uncommitted messages (see case TS003289786)... (A consumer configuration sketch follows below.)
over 1 year ago in DataStage 1 Delivered
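For context, the requested behaviour maps to the standard Kafka consumer setting isolation.level=read_committed. Below is a minimal sketch of a consumer that skips messages from aborted (rolled-back) transactions, using the plain Kafka client API rather than the connector's internals; the broker address, group id, and topic name are hypothetical.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ReadCommittedConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker:9092");   // hypothetical broker address
        props.put("group.id", "datastage-demo");          // hypothetical consumer group
        props.put("key.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");
        // The key setting: only return messages from committed transactions.
        props.put("isolation.level", "read_committed");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("demo-topic")); // hypothetical topic
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("offset=%d value=%s%n", record.offset(), record.value());
            }
        }
    }
}
```

The consumer default is read_uncommitted, which matches the connector behaviour described in the idea; exposing read_committed is what the idea asks for.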

Support all data types in the Parquet/Avro/ORC file formats; we really need it for Parquet in particular

As of today, the “date”, “decimal/numeric”, and “timestamp” data types in the Parquet file format are not supported by DataStage 11.5 or 11.7, and they are required for a critical data migration project. For now we are converting these data types to a different format... (A schema sketch follows below.)
almost 2 years ago in DataStage 2 Delivered
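For reference, Parquet written through the Avro bindings represents these columns as logical-type annotations on primitive types. A brief sketch using the Apache Avro Java API is below; the precision/scale values are illustrative, and this is not DataStage code.

```java
import org.apache.avro.LogicalTypes;
import org.apache.avro.Schema;

public class LogicalTypeSchemas {
    public static void main(String[] args) {
        // date: days since the epoch, stored as an int
        Schema dateType = LogicalTypes.date()
                .addToSchema(Schema.create(Schema.Type.INT));
        // decimal(18,2): unscaled value stored as bytes
        Schema decimalType = LogicalTypes.decimal(18, 2)
                .addToSchema(Schema.create(Schema.Type.BYTES));
        // timestamp-millis: milliseconds since the epoch, stored as a long
        Schema timestampType = LogicalTypes.timestampMillis()
                .addToSchema(Schema.create(Schema.Type.LONG));

        System.out.println(dateType);
        System.out.println(decimalType);
        System.out.println(timestampType);
    }
}
```

A stage without support for these logical types has to fall back to strings or raw integers, which is essentially the conversion workaround described in the idea.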

DFD engine as non-root user

As a financial company, we have regulations around who gets root access; it is strictly reserved for server administration group personnel. Our InformationServer/DataStage administration and support staff will not get any type of superuser access (s...
over 2 years ago in DataStage 0 Delivered

Move Checkout/In from ISM to Designer

Check-out/in and versioning functionality should be moved from ISM to the DataStage Designer. Deployment should stay within ISM. The current implementation is not practical, which leads people to not use it and to keep working with DSX exports f...
over 2 years ago in DataStage 1 Delivered

DataStage Snowflake connector: add key pair authentication

Requesting an enhancement to support key pair authentication with the DataStage Snowflake connector rather than the typical username/password authentication. https://docs.snowflake.net/manuals/user-guide/jdbc-configure.html#using-key-pair-authentic... (A JDBC sketch follows below.)
over 2 years ago in DataStage 2 Delivered
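The Snowflake documentation linked above describes key pair authentication at the JDBC driver level; the hedged sketch below shows that pattern with a hypothetical account, user, and key file path. It illustrates the driver mechanism, not how the DataStage connector would surface it.

```java
import java.nio.file.Files;
import java.nio.file.Paths;
import java.security.KeyFactory;
import java.security.PrivateKey;
import java.security.spec.PKCS8EncodedKeySpec;
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Base64;
import java.util.Properties;

public class SnowflakeKeyPairAuth {
    public static void main(String[] args) throws Exception {
        // Load an unencrypted PKCS#8 private key (hypothetical path).
        String pem = new String(Files.readAllBytes(Paths.get("/secure/rsa_key.p8")));
        pem = pem.replace("-----BEGIN PRIVATE KEY-----", "")
                 .replace("-----END PRIVATE KEY-----", "")
                 .replaceAll("\\s", "");
        byte[] der = Base64.getDecoder().decode(pem);
        PrivateKey key = KeyFactory.getInstance("RSA")
                                   .generatePrivate(new PKCS8EncodedKeySpec(der));

        Properties props = new Properties();
        props.put("user", "ETL_USER");   // hypothetical user
        props.put("privateKey", key);    // private key passed in place of a password

        // Hypothetical account locator in the JDBC URL.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:snowflake://myaccount.snowflakecomputing.com/", props)) {
            System.out.println("Connected: " + !conn.isClosed());
        }
    }
}
```

The private key is supplied through the driver's privateKey connection property, so no password needs to be stored in job parameters.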

Update iLog Stage To Allow Execution of Decision Engine/Decision Service

The current iLog Stage allows for execution of ruleset applications and rulesets using the 'classic' rule engine. A newer rule engine (Decision Engine) and rule application archive (Decision Service) architecture was implemented in version 8.5+ o...
almost 3 years ago in DataStage 0 Delivered

HDP3.0 certification with BigIntegrate 11.7

We would like to upgrade from HDP 2.6 to HDP 3.0 and use ACID transactions in Hive within BigIntegrate jobs.
almost 3 years ago in DataStage 0 Delivered

Datastage 11.7.1 Snowflake Connector Upgrades Required

We have been trialling the Snowflake connector, and some functionality has not yet been implemented in it, which greatly impacts its performance. 1. The use of TWTs - the lack of this functionality limits performance o...
about 3 years ago in DataStage 1 Delivered

POWER9 Support

The competitive bidding will be held in April 2019, so we need to decide the configuration of the proposed server and software in October. We would like to propose a POWER9 server, but Information Server 11.7 doesn't support POWER9. The request is fo...
about 3 years ago in DataStage 0 Delivered

Datastage to Azure DataLake Storage (ADLS) connectivity

We need the ability to connect to Azure Data Lake Storage (ADLS) from DataStage to enhance and augment our enterprise data warehouse. (A connectivity sketch follows below.)
about 3 years ago in DataStage 0 Delivered
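For illustration only, the sketch below shows the kind of ADLS Gen2 access the requested connectivity would need to provide, using the Azure Storage Data Lake Java SDK directly rather than DataStage; the storage account, key, container, and file path are hypothetical assumptions.

```java
import com.azure.storage.common.StorageSharedKeyCredential;
import com.azure.storage.file.datalake.DataLakeFileClient;
import com.azure.storage.file.datalake.DataLakeFileSystemClient;
import com.azure.storage.file.datalake.DataLakeServiceClient;
import com.azure.storage.file.datalake.DataLakeServiceClientBuilder;
import java.io.ByteArrayOutputStream;

public class AdlsRead {
    public static void main(String[] args) {
        // Hypothetical storage account name and access key.
        StorageSharedKeyCredential credential =
                new StorageSharedKeyCredential("mystorageaccount", "account-key");

        DataLakeServiceClient service = new DataLakeServiceClientBuilder()
                .endpoint("https://mystorageaccount.dfs.core.windows.net")
                .credential(credential)
                .buildClient();

        // Hypothetical container ("file system") and file path.
        DataLakeFileSystemClient fileSystem = service.getFileSystemClient("warehouse");
        DataLakeFileClient file = fileSystem.getFileClient("staging/customers.csv");

        ByteArrayOutputStream out = new ByteArrayOutputStream();
        file.read(out); // download the file contents into the stream
        System.out.println(out.size() + " bytes read");
    }
}
```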