This portal is for opening public enhancement requests against products and services offered by the IBM Data & AI organization. To view all of your ideas submitted to IBM, create and manage groups of ideas, or create an idea explicitly set to be either visible to all (public) or visible only to you and IBM (private), use the IBM Unified Ideas Portal (https://ideas.ibm.com).
Shape the future of IBM!
We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:
Search existing ideas
Start by searching and reviewing ideas and requests to enhance a product or service. Take a look at ideas others have posted, and add a comment, vote, or subscribe to updates on them if they matter to you. If you can't find what you are looking for, post a new idea.
Post your ideas
Post ideas and requests to enhance a product or service. Take a look at ideas others have posted and upvote them if they matter to you.
Post an idea
Upvote ideas that matter most to you
Get feedback from the IBM team to refine your idea
Specific links you will want to bookmark for future use
DataStage Support for MQv9.2 and MQv9.3 Required for HSBC (and very likely other users)
This is useful because:
HSBC and other enterprise-class customers build standardised packages of software at compatible levels in support of business applications. Re-creation of these standard packages e.g. when a component reaches end of suppo...
IBM is involved in implementing a data integration solution for State Bank of India using DataStage (version 11.7) and Spectrum Scale (version 5.05). DataStage is supposed to read/write object storage data hosted in Spectrum Scale using the Swift prot...
Requesting an enhancement to use key pair authentication with the DataStage Snowflake connector rather than the typical username/password authentication. https://docs.snowflake.net/manuals/user-guide/jdbc-configure.html#using-key-pair-authentic...
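As a sketch of what this request is asking for, the Snowflake JDBC driver already accepts key-pair credentials via connection properties such as private_key_file and private_key_file_pwd (see the linked Snowflake docs); the connector would need to expose equivalent fields. The user name, key path, and passphrase below are placeholders, not real values:

```python
# Hypothetical connection properties for Snowflake key-pair
# authentication, modelled on the Snowflake JDBC documentation.
# All values below are placeholders.
jdbc_props = {
    "user": "ETL_SERVICE_USER",                     # placeholder service account
    "private_key_file": "/secure/keys/rsa_key.p8",  # PKCS#8 private key
    "private_key_file_pwd": "<key-passphrase>",     # only if the key is encrypted
    # Note: no "password" entry -- key-pair auth replaces it.
}

def uses_key_pair_auth(props):
    """Return True if the connection is configured for key-pair
    authentication: a private key is supplied and no password is."""
    return "private_key_file" in props and "password" not in props
```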
Add a feature for more frequent commits when using the DB2 connector
When loading a large amount of data into DB2 for Warehousing, we exceed DB2's 300 GB limit, which means we need to load the data with more than just one commit. However, when using external tables in DataStage, the DB2 connector ignores the co...
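The behaviour this idea asks for can be sketched as a batching loop that commits every N rows instead of once at the end, so no single transaction approaches the 300 GB limit. This is an illustrative sketch only; write_batch and commit stand in for the connector's internals, and batch_size would correspond to a commit-count property on the stage:

```python
def load_with_commits(rows, write_batch, commit, batch_size=10_000):
    """Load rows in batches, committing after each batch rather than
    once at the end, so no single transaction grows unboundedly.
    `write_batch` and `commit` are placeholders for connector internals."""
    batch, committed = [], 0
    for row in rows:
        batch.append(row)
        if len(batch) >= batch_size:
            write_batch(batch)
            commit()                   # intermediate commit every batch_size rows
            committed += len(batch)
            batch = []
    if batch:                          # flush and commit the final partial batch
        write_batch(batch)
        commit()
        committed += len(batch)
    return committed
```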
The DB2 connector stage should have a WHERE clause field to provide filter expressions to limit the number of rows when using "Generate SQL" = "Yes". There is a similar feature in the DRS Connector Stage.
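A minimal sketch of what "Generate SQL" with an optional WHERE field might produce (the function and its parameters are hypothetical, not the connector's actual implementation):

```python
def generate_select(table, columns, where=None):
    """Sketch of 'Generate SQL' with an optional WHERE filter,
    mirroring the field requested for the DB2 connector stage
    (the DRS connector stage already offers something similar)."""
    sql = "SELECT {} FROM {}".format(", ".join(columns), table)
    if where:
        sql += " WHERE " + where       # user-supplied filter expression
    return sql
```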
Kafka connector should be able to read only committed messages
Our Kafka producer writes messages to a Kafka topic transactionally and may roll back the transaction when certain events occur. As of now, the behaviour of the Kafka connector is to also read uncommitted messages (see case TS003289786)...
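Kafka itself supports this via the consumer setting isolation.level=read_committed, which makes a consumer skip messages from aborted producer transactions; the request is essentially for the connector to expose it. A sketch of such a consumer configuration (broker address and group id are placeholders):

```python
# Hypothetical consumer configuration illustrating Kafka's
# isolation.level setting. "read_uncommitted" is the default and
# matches the connector's current behaviour described above;
# "read_committed" skips messages from rolled-back transactions.
consumer_conf = {
    "bootstrap.servers": "broker:9092",   # placeholder broker
    "group.id": "datastage-etl",          # placeholder consumer group
    "isolation.level": "read_committed",
}
```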
DataStage connector for SalesForce should attempt reconnecting several times if connection is temporarily interrupted
This is referring to IBM InfoSphere DataStage v11.5! I could not find this exact product in the list. ETL jobs that connect to SalesForce to fetch data do not extract all available data reliably. Sometimes the connection is briefly interrupt...
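The requested behaviour can be sketched as a retry wrapper with exponential backoff: instead of failing the job on the first dropped connection, the connector would retry the fetch a few times. This is an illustrative sketch, not the connector's API; fetch stands in for one extract call and ConnectionError for the interruption:

```python
import time

def with_retries(fetch, attempts=3, base_delay=1.0, sleep=time.sleep):
    """Retry `fetch` up to `attempts` times with exponential backoff
    (base_delay, 2*base_delay, ...) before giving up. `sleep` is
    injectable so the backoff can be tested without waiting."""
    for attempt in range(attempts):
        try:
            return fetch()
        except ConnectionError:
            if attempt == attempts - 1:
                raise                  # out of retries: surface the error
            sleep(base_delay * 2 ** attempt)
```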
Support all data types in the Parquet/Avro/ORC file formats; we especially need this for Parquet
As of today, the "date", "decimal/numeric", and "timestamp" data types in the Parquet file format are not supported by DataStage 11.5 or 11.7, which is required for a critical data migration project. And now we are converting the above data types to different format...
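The workaround the submitter describes, converting the unsupported types before the file stage, can be sketched as a small mapping function. The function name is hypothetical; it simply renders date, timestamp, and decimal values as strings, which is one plausible reading of "converting the above data types to a different format":

```python
import datetime
import decimal

def to_parquet_safe(value):
    """Convert date/timestamp/decimal values to strings so a Parquet
    writer that lacks those logical types can still store them.
    Other values pass through unchanged."""
    if isinstance(value, datetime.datetime):      # check before date:
        return value.isoformat(sep=" ")           # datetime subclasses date
    if isinstance(value, datetime.date):
        return value.isoformat()                  # date -> "YYYY-MM-DD"
    if isinstance(value, decimal.Decimal):
        return str(value)                         # decimal -> exact string
    return value
```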
Do not place IBM confidential, company confidential, or personal information into any field.