Use this portal to open public enhancement requests against products and services offered by the IBM Data & AI organization. To view all of your ideas submitted to IBM, create and manage groups of Ideas, or create an idea explicitly set to be either visible by all (public) or visible only to you and IBM (private), use the IBM Unified Ideas Portal (https://ideas.ibm.com).
Shape the future of IBM!
We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:
Search existing ideas
Start by searching and reviewing ideas and requests to enhance a product or service. Take a look at ideas others have posted, and add a comment, vote, or subscribe to updates on them if they matter to you. If you can't find what you are looking for, post a new idea.
Post your ideas
Post ideas and requests to enhance a product or service:
Post an idea
Upvote ideas that matter most to you
Get feedback from the IBM team to refine your idea
Specific links you will want to bookmark for future use
Welcome to the IBM Ideas Portal (https://www.ibm.com/ideas) - Use this site to find additional information about the IBM Ideas process and statuses.
IBM Unified Ideas Portal (https://ideas.ibm.com) - Use this site to view all of your ideas, create new ideas for any IBM product, or search for ideas across all of IBM.
ideasibm@us.ibm.com - Use this email to suggest enhancements to the Ideas process or request help from IBM for submitting your Ideas.
IBM Employees should enter Ideas at https://ideas.ibm.com
The WatsonX AI platform currently lacks support for batch mode inferencing, which is essential for handling large-scale data processing tasks required by our enterprise customers. Many of our users have identified the need to perform inferencing on multiple inputs simultaneously, often involving a significant volume of text data that needs to be processed quickly and efficiently.
For example, one of our customers operates in an environment where they frequently encounter limitations when trying to process large batches of texts for inferencing, even with a Bring Your Own Model (BYOM) instance. These limitations stem from network constraints, model configuration issues, or the need to perform pre/post-processing steps on a high volume of data before it can be fed into WatsonX AI for inference. This not only slows down their operations but also forces them to rely on technical workarounds that are often cumbersome and inefficient.
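To make the pain point concrete, here is a minimal sketch of the kind of client-side workaround described above, assuming inputs are scored one request at a time. The score_single_text helper and its parameters are hypothetical placeholders, not part of any watsonx.ai API; only the fan-out pattern itself is the point:

```python
import concurrent.futures
from typing import Callable, Iterable, List


def score_single_text(text: str) -> str:
    """Hypothetical placeholder for one synchronous inference call
    (e.g. one HTTPS request per input to a deployed model)."""
    raise NotImplementedError("wire this up to the deployment's scoring endpoint")


def score_in_client_side_batches(
    texts: Iterable[str],
    score_fn: Callable[[str], str] = score_single_text,
    max_workers: int = 8,
) -> List[str]:
    """Fan out one request per input from the client side.

    Every input still costs a separate network round trip, so throughput is
    capped by connection limits and rate limiting rather than by the model.
    """
    with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(score_fn, texts))
```

Even with aggressive concurrency, this pattern keeps all batching, retries, and pre/post-processing on the client side, which is exactly the overhead a native batch mode would remove.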
Implementing batch mode inferencing would directly address these pain points by enabling users to send large batches of texts or other types of data in a single request, thereby improving processing speed and reducing operational bottlenecks. This feature would not only enhance the usability of WatsonX AI but also open up new opportunities for customers to apply its capabilities at a much larger scale.
Implementation Ideas/Options:
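One possible shape for the feature, sketched below, is an asynchronous submit-and-poll batch endpoint: the client sends all inputs in a single request, receives a job identifier, and polls for results once the service has processed the whole batch. The endpoint paths, payload fields, and job statuses shown are illustrative assumptions only; they are not an existing watsonx.ai API.

```python
import time

import requests  # plain HTTPS client; assumed sufficient for this sketch

WATSONX_URL = "https://us-south.ml.cloud.ibm.com"  # regional endpoint, for illustration


def submit_batch(texts, model_id, api_token):
    """Hypothetical: submit every input in one request and receive a job ID."""
    resp = requests.post(
        f"{WATSONX_URL}/ml/v1/text/generation_batch",  # endpoint name is invented
        headers={"Authorization": f"Bearer {api_token}"},
        json={"model_id": model_id, "inputs": list(texts)},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["job_id"]


def wait_for_results(job_id, api_token, poll_seconds=10):
    """Hypothetical: poll the batch job until it finishes, then return outputs."""
    while True:
        resp = requests.get(
            f"{WATSONX_URL}/ml/v1/text/generation_batch/{job_id}",  # invented path
            headers={"Authorization": f"Bearer {api_token}"},
            timeout=60,
        )
        resp.raise_for_status()
        body = resp.json()
        if body["status"] in ("completed", "failed"):
            return body.get("results", [])
        time.sleep(poll_seconds)
```

A submit-and-poll design avoids holding long-lived HTTP connections open for large batches and leaves room for the service to schedule pre/post-processing and model execution on its own side.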
This enhancement would be a valuable asset to WatsonX, offering both immediate benefits to current users and future growth opportunities. I would welcome the opportunity to discuss this proposal further and explore its feasibility and implementation timeline.
Needed By: Yesterday (Let's go already!)