NetBackup™ for Cloud Object Store Administrator's Guide

Last Published:
Product(s): NetBackup & Alta Data Protection (10.5)
  1. Introduction
    1. Overview of NetBackup protection for Cloud object store
    2. Features of NetBackup Cloud object store workload support
  2. Managing Cloud object store assets
    1. Planning NetBackup protection for Cloud object store assets
    2. Prerequisites for adding Cloud object store accounts
    3. Configuring buffer size for backups
    4. Permissions required for Amazon S3 cloud provider user
    5. Permissions required for Azure blob storage
    6. Permissions required for GCP
    7. Limitations and considerations
    8. Adding Cloud object store accounts
      1. Creating cross-account access in AWS
      2. Check certificate for revocation
      3. Managing Certification Authorities (CA) for NetBackup Cloud
      4. Adding a new region
    9. Manage Cloud object store accounts
    10. Scan for malware
      1. Backup images
      2. Assets by policy type
  3. Protecting Cloud object store assets
    1. About accelerator support
      1. How NetBackup accelerator works with Cloud object store
      2. Accelerator notes and requirements
      3. Accelerator force rescan for Cloud object store (schedule attribute)
      4. Accelerator backup and NetBackup catalog
      5. Calculate the NetBackup accelerator track log size
    2. About incremental backup
    3. About dynamic multi-streaming
    4. About policies for Cloud object store assets
    5. Planning for policies
    6. Prerequisites for Cloud object store policies
    7. Creating a backup policy
    8. Policy attributes
    9. Creating schedule attributes for policies
    10. Configuring the Start window
      1. Adding, changing, or deleting a time window in a policy schedule
      2. Example of schedule duration
    11. Configuring the exclude dates
    12. Configuring the include dates
    13. Configuring the Cloud objects tab
    14. Adding conditions
    15. Adding tag conditions
    16. Examples of conditions and tag conditions
    17. Managing Cloud object store policies
      1. Copy a policy
      2. Deactivating or deleting a policy
      3. Manually backup assets
  4. Recovering Cloud object store assets
    1. Prerequisites for recovering Cloud object store objects
    2. Configuring Cloud object retention properties
    3. Recovering Cloud object store assets
  5. Troubleshooting
    1. Reduced acceleration during the first full backup, after upgrade to version 10.5
    2. After backup, some files in the shm folder and shared memory are not cleaned up
    3. After an upgrade to NetBackup version 10.5, copying, activating, and deactivating policies may fail for older policies
    4. Backup fails with default number of streams with the error: Failed to start NetBackup COSP process
    5. Backup fails or becomes partially successful on GCP storage for objects with content encoding as GZIP
    6. Recovery for the original bucket recovery option starts, but the job fails with error 3601
    7. Recovery job does not start
    8. Restore fails: "Error bpbrm (PID=3899) client restore EXIT STATUS 40: network connection broken"
    9. Access tier property not restored after overwriting the existing object in the original location
    10. Reduced accelerator optimization in Azure for OR query with multiple tags
    11. Backup failed and shows a certificate error with Amazon S3 bucket names containing dots (.)
    12. Azure backup jobs fail when space is provided in a tag query for either tag key name or value
    13. The Cloud object store account has encountered an error
    14. The bucket list is empty during policy selection
    15. Creating a second account on Cloudian fails by selecting an existing region
    16. Restore failed with 2825 incomplete restore operation
    17. Bucket listing of a cloud provider fails when adding a bucket in the Cloud objects tab
    18. AIR import image restore fails on the target domain if the Cloud store account is not added to the target domain
    19. Backup for Azure Data Lake fails when a back-level media server is used with backup host or storage server version 10.3
    20. Backup fails partially in Azure Data Lake: "Error nbpem (pid=16018) backup of client
    21. Recovery for Azure Data Lake fails: "This operation is not permitted as the path is too deep"
    22. Empty directories are not backed up in Azure Data Lake
    23. Recovery error: "Invalid alternate directory location. You must specify a string with length less than 1025 valid characters"
    24. Recovery error: "Invalid parameter specified"
    25. Restore fails: "Cannot perform the COSP operation, skipping the object: [/testdata/FxtZMidEdTK]"
    26. Cloud store account creation fails with incorrect credentials
    27. Discovery failures due to improper permissions
    28. Restore failures due to object lock

About dynamic multi-streaming

A multi-streaming backup for a Cloud object store policy runs simultaneous backup streams for a given backup selection. The backup selection is divided into several streams that run in parallel, resulting in a faster backup. You can configure the number of streams for each policy on the backup selection tab of the Cloud object store policy. Each backup stream creates a unique backup image. Together, the images created by the streams for a backup selection represent the backup of that specific selection.

Dynamic multi-streaming is enabled by default on all newly created Cloud object store policies.

Specifying the maximum number of streams

You can specify the maximum number of streams that you want to use for a bucket or container in the policy attributes.

See Policy attributes.

Considerations for using dynamic multi-streaming
  • Entire buckets/containers are backed up when you use dynamic multi-streaming.

  • The number of streams that you specify in a policy applies to each of the buckets that the policy protects. For example, if you specify 10 streams in the policy and select five buckets for backup, you get 50 concurrent streams. Some streams may go to a queue if the maximum number of concurrent jobs allowed in the storage unit selected for the policy is less than the total number of streams running across different policies. For optimal performance, keep the Maximum concurrent jobs property of the selected storage unit greater than the total number of streams that you expect to run across the policies.
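The stream arithmetic above can be sketched as a quick sizing check. This is only an illustration; the policy names, stream counts, and bucket counts below are hypothetical, not values from a real configuration:

```python
# Estimate total concurrent streams across Cloud object store policies and
# compare against the storage unit's Maximum concurrent jobs property.
# All policy definitions here are hypothetical examples.
policies = {
    "cos-policy-a": {"streams_per_bucket": 10, "buckets": 5},  # 50 streams
    "cos-policy-b": {"streams_per_bucket": 4, "buckets": 3},   # 12 streams
}

max_concurrent_jobs = 48  # storage unit's Maximum concurrent jobs setting

# Each bucket in a policy gets the policy's full stream count.
total_streams = sum(
    p["streams_per_bucket"] * p["buckets"] for p in policies.values()
)
queued = max(0, total_streams - max_concurrent_jobs)

print(f"Total concurrent streams: {total_streams}")  # 62
print(f"Streams queued if all run at once: {queued}")  # 14
```

If the computed total exceeds Maximum concurrent jobs, the excess streams queue, which lengthens the overall backup window; raising the storage unit property above the expected total avoids the queueing.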

  • You cannot use a scale-out server as a backup host when you use dynamic multi-streaming.

  • The job retry feature does not work for backup jobs.

  • Checkpoint restart is not supported.

  • Dynamic multi-streaming starts all the backup streams for a bucket or container at the same time and writes them to a storage unit. Therefore, using tape storage units as the target for primary backup copies is not recommended. You can use MSDP storage as the target for the first backup copy, and configure tape storage as the target for secondary or duplication copies.