IBM Data and AI Ideas Portal for Customers




Status: Future consideration
Workspace: Db2
Components: Db2 on Cloud
Created by: Guest
Created on: Jul 15, 2024

Db2 database backup on Google Cloud Storage

We would like to be able to use Google Cloud Storage directly as a target for Db2 database backups.
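
For context, Db2 already supports S3-compatible object storage as a backup target through a cataloged storage access alias, and this request is for the same first-class experience against Google Cloud Storage. A minimal sketch of what that could look like, assuming GCS HMAC (interoperability) credentials, with the alias, bucket, credential, and database names as placeholders:

    db2 "catalog storage access alias gcsbkup vendor S3 server https://storage.googleapis.com user '<hmac-access-key>' password '<hmac-secret>' container '<bucket-name>'"
    db2 "backup db SAMPLE online to DB2REMOTE://gcsbkup//db2backup/"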

Needed By: Month
  • Guest | Jul 19, 2024

    Carried out a test and verified that Google Cloud Storage supports the S3 protocol:

     

    [db2ips1@bns-ftm-db2-standby-aux1 ~]$ aws s3 --endpoint-url https://storage.googleapis.com ls s3://bns-ftm-storage/
                               PRE db2backup/
    2024-06-19 18:07:04   21167568 Db2_v11.5.4.0_Pacemaker_20200615_RHEL8.1_x86_64.tar.gz
    2024-06-19 13:54:08 1963519317 special_42449_v11.5.9_linuxx64_server_dec.tar.gz
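
    For anyone reproducing this check: the aws CLI can talk to Cloud Storage because GCS exposes an S3-compatible XML API. A rough sketch of the client setup, assuming HMAC (interoperability) keys have been generated for a service account (the key values below are placeholders):

    aws configure set aws_access_key_id '<gcs-hmac-access-key>'
    aws configure set aws_secret_access_key '<gcs-hmac-secret>'
    aws s3 --endpoint-url https://storage.googleapis.com ls s3://bns-ftm-storage/db2backup/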

     

    Followed the process to set the registry variables:

     

    [db2ips1@bns-ftm-db2-standby-aux1 ~]$ db2set -all
    [i] DB2_ENABLE_COS_SDK=ON
    [i] DB2_OBJECT_STORAGE_LOCAL_STAGING_PATH=/db2space/FTM/FTMDB/backup
    [i] DB2COMM=SSL,TCPIP
    [g] DB2SYSTEM=bns-ftm-db2-standby-aux1.us-east4-b.c.bns-ftm-426014.internal
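
    For completeness, the registry variables above would have been set with db2set roughly as follows; the db2stop/db2start step is included on the assumption that DB2_ENABLE_COS_SDK, like most instance-level registry variables, only takes effect after an instance restart:

    db2set DB2_ENABLE_COS_SDK=ON
    db2set DB2_OBJECT_STORAGE_LOCAL_STAGING_PATH=/db2space/FTM/FTMDB/backup
    db2stop
    db2start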

     

    Cataloged the remote storage:

     

    [db2ips1@bns-ftm-db2-standby-aux1 ~]$ db2 "catalog storage access alias googlebkup vendor S3 server https://storage.googleapis.com user 'XXXXXX' password 'XXXXX' container 'bns-ftm-storage'"
    DB20000I The CATALOG STORAGE ACCESS command completed successfully.

     

    [db2ips1@bns-ftm-db2-standby-aux1 ~]$ db2 list storage access

    Node Directory

    Node 1 entry:

    ALIAS=googlebkup
    VENDOR=S3
    SERVER=https://storage.googleapis.com
    USERID=****
    CONTAINER=bns-ftm-storage
    OBJECT=
    DBUSER=
    DBGROUP=SYSADM

    Number of entries in the directory = 1

    DB20000I The LIST STORAGE ACCESS command completed successfully.
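
    As a side note, if the credentials or container ever need to change, the alias can simply be dropped and recataloged; a small sketch using the alias from above (credential values are placeholders):

    db2 "uncatalog storage access alias googlebkup"
    db2 "catalog storage access alias googlebkup vendor S3 server https://storage.googleapis.com user '<new-hmac-key>' password '<new-hmac-secret>' container 'bns-ftm-storage'"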

     

    [db2ips1@bns-ftm-db2-standby-aux1 ~]$ db2 "backup db FTMDB online to DB2REMOTE://googlebkup//bns-ftm-storage/"
    SQL2025N An I/O error occurred. Error code: "0x870F01B6". Media on which
    this error occurred: "DB2REMOTE://googlebkup//bns-ftm-storage/".

     

    The backup fails with the above error.
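
    To pull the corresponding records out of the diagnostic log, something like the following should work (the one-day history window is an arbitrary choice):

    db2diag -H 1d -level Error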

     

    From the db2diag.log I can see that it has initialized the S3 protocol and path:

    2024-07-19-14.36.30.485447+000 I275911E1038 LEVEL: Error
    PID : 36049 TID : 140394285229632 PROC : db2sysc 0
    INSTANCE: db2ips1 NODE : 000 DB : FTMDB
    APPHDL : 0-17895 APPID: *LOCAL.db2ips1.240719143629
    AUTHID : DB2IPS1 HOSTNAME: bns-ftm-db2-standby-aux1.us-east4-b.c.bns-ftm-426014.internal
    EDUID : 126 EDUNAME: db2med.84.0 (FTMDB) 0
    FUNCTION: DB2 UDB, oper system services, sqlowriteEx, probe:11988
    MESSAGE : ZRC=0x870F01B6=-2029059658=SQLO_FAILED
    "An unexpected error is encountered"
    DATA #1 : String, 33 bytes
    COS PutObject request has failed.
    DATA #2 : COS SDK Client, PD_TYPE_COS_CLIENT, 2624 bytes
    SqloCosClient: 0x00007fb012c53020
    {
    Global COS SDK CB: 0x00007fb13852db80
    Is Initialized: Yes
    COS Interface CB: 0x00007fb13852dbe0
    API Type: 0 ( S3 )
    Endpoint: [https://storage.googleapis.com]
    Container: [bns-ftm-storage]
    Is Persistent: Yes
    Object size: -1
    Current GET position: 0
    SDK Persistent Client: 0x00007fb060017fe0
    }

     

    The first error in the log is as follows:

     

    2024-07-19-14.36.30.480824+000 I272942E863 LEVEL: Error
    PID : 36049 TID : 140394285229632 PROC : db2sysc 0
    INSTANCE: db2ips1 NODE : 000 DB : FTMDB
    APPHDL : 0-17895 APPID: *LOCAL.db2ips1.240719143629
    AUTHID : DB2IPS1 HOSTNAME: bns-ftm-db2-standby-aux1.us-east4-b.c.bns-ftm-426014.internal
    EDUID : 126 EDUNAME: db2med.84.0 (FTMDB) 0
    FUNCTION: DB2 UDB, oper system services, sqloS3Client_InitMultiPartUpload, probe:112
    MESSAGE : ZRC=0x870F01B6=-2029059658=SQLO_FAILED
    "An unexpected error is encountered"
    DATA #1 : String, 39 bytes
    S3:CreateMultipartUpload request failed
    DATA #2 : signed integer, 4 bytes
    100
    DATA #3 : String, 15 bytes
    InvalidArgument
    DATA #4 : String, 73 bytes
    Unable to parse ExceptionName: InvalidArgument Message: Invalid argument.

     

    With only 'InvalidArgument' and 'unexpected error' messages, I am unable to debug this further.
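
    Since the first failure is on S3:CreateMultipartUpload, a possible next step would be to isolate that call from Db2 by forcing a multipart upload against the same endpoint with the aws CLI, which switches to multipart transfers for files above its multipart threshold (roughly 8 MB by default). A sketch, reusing the bucket from above with an arbitrary test file:

    # create a ~100 MB test file, large enough to trigger a multipart upload
    dd if=/dev/zero of=/tmp/mp_test.bin bs=1M count=100
    # upload it through the same S3-compatible endpoint used by Db2
    aws s3 --endpoint-url https://storage.googleapis.com cp /tmp/mp_test.bin s3://bns-ftm-storage/db2backup/mp_test.bin

    If this upload fails with the same InvalidArgument error, the problem is at the endpoint or credential level rather than in Db2 itself.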