IBM Data Platform Ideas Portal for Customers

Status Submitted
Workspace Spectrum Conductor
Components Version 2.5.1
Created by Guest
Created on Aug 13, 2025

Have a way to enable/disable the perfdata feature (-XX:+UsePerfData)

We need a way to enable/disable the perfdata feature (-XX:+UsePerfData). Under high load, a race condition occurs that causes the host to be blocked with exit code 127. This happens because more than one process accesses /tmp/hsperfdata_<user>/<PID>.

Needed By Not sure -- Just thought it was cool
  • Guest
    Oct 6, 2025

    In addition to the last post: any time a user (or we, as system owners) adds another option such as "-Dlog4j.configurationFile=file:///… " to extraJavaOptions, it overrides the extraJavaOptions value that is absolutely necessary to stop the locking of the hsperfdata file, and therefore brings back the "Error code 127 -- Command not found" failure.
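    As an illustration (a minimal sketch; the property name is Spark's standard spark.executor.extraJavaOptions, and the file path is a placeholder, not from this ticket):

    ```
    # Setting the property to add a log4j file silently replaces whatever
    # value was there before, dropping the flag that prevents the
    # hsperfdata locking:
    spark.executor.extraJavaOptions=-Dlog4j.configurationFile=file:///path/to/log4j2.xml
    # What is actually needed is the combined value:
    spark.executor.extraJavaOptions=-XX:-UsePerfData -Dlog4j.configurationFile=file:///path/to/log4j2.xml
    ```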

     

  • Guest
    Oct 6, 2025

    What I am suggesting is that we put in a way to inject arguments, but make it optional, and therefore completely backwards compatible.

     

    Since we are already taking advantage of the SPARK_EGO_CONF_DIR_EXTRA option to set some options across several SIGs at the same time, we can define two environment variables there:

     

    SPARK_EGO_SPARK_CLASS_LAUNCH_JAVA_OPTIONS="-XX:-UsePerfData"

    SPARK_EGO_SPARK_CLASS_CMD_JAVA_OPTIONS="-XX:-UsePerfData"
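    As a concrete sketch (only the two variable names come from this proposal; the idea of exporting them from a script picked up via SPARK_EGO_CONF_DIR_EXTRA is an assumption):

    ```shell
    #!/bin/bash
    # Hypothetical fragment of a script sourced via SPARK_EGO_CONF_DIR_EXTRA.
    # Both variables default the JVM perfdata feature to off, so no
    # hsperfdata files are created under /tmp.
    export SPARK_EGO_SPARK_CLASS_LAUNCH_JAVA_OPTIONS="-XX:-UsePerfData"
    export SPARK_EGO_SPARK_CLASS_CMD_JAVA_OPTIONS="-XX:-UsePerfData"
    ```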

     

    Then, in the bin/spark-class code:

     

    • At the beginning, it loads bin/load-spark-env.sh, which also loads the SPARK_EGO_CONF_DIR_EXTRA options, so we’d get the variables we need, *if* they are set.

     

    • The build_command() function becomes as follows. Note that if we read the options into a bash array and the array is empty, it does the right thing and expands to no arguments at all, whereas an empty string would be passed along as one empty argument.

    build_command() {
      LAUNCH_ARGS=($SPARK_EGO_SPARK_CLASS_LAUNCH_JAVA_OPTIONS) # array; no quotes around the variable so it word-splits
      "$RUNNER" "${LAUNCH_ARGS[@]}" -cp "$LAUNCH_CLASSPATH" org.apache.spark.sessionscheduler.launcher.Main "$@"
      printf "%d\0" $?
    }
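    The empty-array point can be checked in isolation with plain bash (count() is a made-up helper that just reports how many arguments it received):

    ```shell
    #!/bin/bash
    # Demonstrates why the options are read into an unquoted array:
    # an empty variable word-splits into an empty array, which expands to
    # no arguments at all, while an empty quoted string is still one
    # (empty) argument.
    count() { echo $#; }

    OPTS=""
    ARGS=($OPTS)             # word-split into an array -> empty array
    count "${ARGS[@]}"       # prints 0: expands to no arguments
    count "$OPTS"            # prints 1: one empty argument

    OPTS="-XX:-UsePerfData -Xss4m"
    ARGS=($OPTS)
    count "${ARGS[@]}"       # prints 2: one argument per option
    ```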

     

    • After that, there is a loop that constructs CMD, removes the last argument as returned by the launcher, etc. We don't touch that; we leave it alone. Then, at the end, we use array slicing to insert our arguments first. Alternatively, you could intervene inside the loop, but that would be more intrusive.

    CMD=("${CMD[@]:0:$LAST}")
    REST=("${CMD[@]:1:$LAST}")                         # everything but the first element
    CMD_ARGS=($SPARK_EGO_SPARK_CLASS_CMD_JAVA_OPTIONS) # array; no quotes so the variable word-splits
    CMD=("${CMD[0]}" "${CMD_ARGS[@]}" "${REST[@]}")
    exec "${CMD[@]}"
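    A toy version of that splice, with a made-up command array, shows where the injected options land:

    ```shell
    #!/bin/bash
    # Toy version of the slicing above: CMD and the injected option are
    # illustrative. The extra options are inserted right after the runner
    # (the first element), before the original arguments.
    CMD=(java -cp app.jar Main arg1)
    CMD_ARGS=(-XX:-UsePerfData)
    REST=("${CMD[@]:1}")                               # everything but the first element
    CMD=("${CMD[0]}" "${CMD_ARGS[@]}" "${REST[@]}")
    echo "${CMD[@]}"   # java -XX:-UsePerfData -cp app.jar Main arg1
    ```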

     

    • When Vikas and I tested this, it gave an error coming from /usr/bin/env; it said something like "/usr/bin/env: invalid option". We got around it by changing the shebang at the top of the file to #!/bin/bash directly. You may need to play around to make it portable.

     

    • Also, since we didn't really know exactly when and how this was invoked, and we only wanted the extra arguments added to Java, we put a
      if [[ "${CMD[0]}" = "$RUNNER" ]]; then
      fi
      guard around the REST and CMD_ARGS lines. We're relatively sure that's unnecessary and should not be there.

     

    Thanks,

    -dimitri

     

  • Guest
    Oct 6, 2025

    The customer is looking for an environment variable they can set, similar to "BAC_SPARK_CLASS_JAVA_OPTIONS", that would be used in the spark-class script to add options to the Spark JVM options.
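    A minimal sketch of that request, assuming only that spark-class would word-split the variable into the java command line (the variable name is from the comment; the command itself is illustrative):

    ```shell
    #!/bin/bash
    # Sketch: splice BAC_SPARK_CLASS_JAVA_OPTIONS into the JVM invocation.
    # Unquoted expansion into an array means an unset or empty variable
    # contributes no arguments, keeping existing deployments unchanged.
    EXTRA_OPTS=(${BAC_SPARK_CLASS_JAVA_OPTIONS})   # unquoted on purpose: word-splits
    CMD=(java "${EXTRA_OPTS[@]}" -cp app.jar Main)
    echo "${CMD[@]}"   # with the variable unset: java -cp app.jar Main
    ```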