IBM Data and AI Ideas Portal for Customers

Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:

Post your ideas

Post ideas and requests to enhance a product or service. Take a look at ideas others have posted and upvote them if they matter to you.

  1. Post an idea

  2. Upvote ideas that matter most to you

  3. Get feedback from the IBM team to refine your idea

Help IBM prioritize your ideas and requests

The IBM team may need your help to refine an idea, so they may ask you for more information or feedback. The product management team will then decide whether they can begin working on your idea. If they can start during the next development cycle, they will put the idea on the priority list. Each team at IBM works on a different schedule; some ideas can be implemented right away, while others may be placed on a later schedule.

Receive notification on the decision

Some ideas can be implemented at IBM, while others may not fit within the development plans for the product. In either case, the team will let you know as soon as possible. In some cases, we may be able to find alternatives for ideas which cannot be implemented in a reasonable time.

Additional Information

To view our roadmaps:

Reminder: This is not the place to submit defects or support needs; please use the normal support channels for those cases.

IBM Employees:

The correct URL for entering your ideas is:

Status Future consideration
Workspace Optim
Components Optim Archive
Created by Guest
Created on Oct 18, 2018

Allow record count limit for reference tables

We only do archiving for decommissioning. We like to test Optim against each of the tables to ensure we will not have any surprises when we run the full archive. It is very disappointing to run an archive and have it fail several hours into the process; we like to say that we want to fail fast, if it is going to fail at all.

To do this, we have a process that takes the original archive request and access definition and generates new test versions that limit the number of rows to be archived. We have chosen to do 10 rows at a time. For Oracle, we insert SQL to limit the rows. We have not found a way to do this for other databases, such as SQL Server, so instead we generate one archive request and one access definition for each table, and for each one we place the row limit on the start table. This can generate a large number of items.

It would be nice to have Optim apply a row limit to the other tables as well (which we always set as reference tables). Since we are doing decommission archiving, we use only the start table and reference tables; because row limits are not allowed on reference tables, this is a problem. We would like the ability to set row limits on reference tables, and also to be able to set them in the user interface the way Set All Reference works. The ability to do this in the GUI for all reference tables is secondary to allowing the row limit on reference tables in the first place.

This gives us some level of confidence that when we do the real archive, Optim can handle the database. It also gives us a test archive to ensure that Optim Connect can handle the different column names, table names, and data types. That way, if there is an issue, we can find it up front instead of waiting until the full archive has been produced to find issues that have to be dealt with, and then having to rerun the archive.
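The per-table workaround described above can be sketched in code. This is a minimal, hypothetical illustration only: Optim access definitions are not plain Python dictionaries, and all names here (the function, the field names, the table names) are invented for the example. It shows the shape of the generation step, producing one test definition per table with that table as the start table and a small row limit, while the remaining tables stay as reference tables.

```python
# Hypothetical sketch of the test-generation process described above.
# Assumption: an access definition is represented as a simple dict; in
# reality Optim stores these in its own format.

ROW_LIMIT = 10  # archive only a handful of rows per table, to "fail fast"

def make_test_definitions(access_definition, row_limit=ROW_LIMIT):
    """For each table in the original definition, emit a new test
    definition that uses that table as the start table with a row limit.
    This mirrors the workaround for databases (e.g. SQL Server) where a
    per-table SQL row restriction is not available, and shows why the
    number of generated items grows with the number of tables."""
    tests = []
    for table in access_definition["tables"]:
        tests.append({
            "name": f"TEST_{table}",
            "start_table": table,          # row limit is honored here only
            "row_limit": row_limit,
            # the other tables remain reference tables, as in
            # decommission archiving
            "reference_tables": [
                t for t in access_definition["tables"] if t != table
            ],
        })
    return tests

# Example: a three-table decommission definition yields three test
# definitions instead of one.
full_def = {"name": "DECOM_APP1",
            "tables": ["ORDERS", "ORDER_LINES", "CUSTOMERS"]}
for d in make_test_definitions(full_def):
    print(d["name"], d["start_table"], d["row_limit"])
```

If row limits were allowed on reference tables, a single definition with a limit on every table would replace the whole generated set.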