5 Practitioner • 274.2K Posts


February 28th, 2018 03:00

Avamar Virtual Edition 7.3 - image backups fail with error <17821>: failed to download vm metadata

Hi everyone,

I've been facing numerous image backup failures, all with the same error:

----

avvcbimage Warning <15995>: HTTP fault detected, Problem with returning page from HTTP, Msg:'SOAP 1.1 fault: SOAP-ENV:Server [no subcode]

"HTTP/1.1 503 Service Unavailable"

Detail: Request rejected because there are already 500 /folder requests in progress.

avvcbimage Warning <40650>: Download VM .vmx file failed.

----

We're backing up several vCenter 6.0 U2 instances with AVE 7.3, but I'm seeing a high failure rate (up to 50%) on just one of them and I can't figure out the root cause.

Some findings from my troubleshooting so far that may help narrow it down:

HTTP access works, and the .vmx file is reachable if I download it manually.

Backups of the affected VMs complete successfully when run individually.

Backups of a given VM don't fail on every attempt, only on roughly 50% of them.

CBT can be ruled out: backups fail both for VMs with CBT enabled and for VMs without it.

ESXi host/proxy server load (storage I/O, CPU) is normal during the backup window.

Load statistics for the relevant LUNs on the storage system (VMAX) are also normal.

I'm particularly interested in this phrase: "Request rejected because there are already 500 /folder requests in progress." Does anyone know what kind of requests this refers to?
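For context, this is roughly what I mean by fetching the .vmx manually. It goes through the same /folder (datastore browser) endpoint the error message refers to. A minimal Python sketch, with the URL and username taken from the log below and the password left as a placeholder for a read-only vSphere account:

import requests
import urllib3

urllib3.disable_warnings()  # the lab vCenter uses a self-signed certificate

VCENTER = "drma2c01vcs01.cec.lab.emc.com"
URL = ("https://" + VCENTER + "/folder/hpdp9-dcdns/hpdp9-dcdns.vmx"
       "?dcPath=DRMA2C01VCS01DC1&dsName=DRMA2C01VCS01-VMAX02-VOL20")

resp = requests.get(URL, auth=("ave@vsphere.local", "********"), verify=False)
print(resp.status_code)   # 200 when the download works, 503 when /folder is saturated
print(resp.text[:200])    # start of the .vmx contents on success

Run by hand, this returns 200 and the start of the .vmx, which is what I mean by the file being accessible manually.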

Any hints would be very much appreciated, since I'm running out of ideas.

***

Here is the full avvcbimage log:

Log #2: avvcbimage log 2018-02-28 03:00:56 UTC [7.3.101-125 Linux-x86_64]

2018-02-28 03:00:56 avvcbimage Info <5008>: Logging to /usr/local/avamarclient/var/Backup Window-Image Backup-1519786800011_b31f01ad9d7069236a59d1e26dfb8070f196cb8f_61ff07c1a024f63e39eb109c236ef69a089404c5-88e8f269ae04455c0f6d7365a1726b0b52cbb769-3016-vmimagew.log

2018-02-28 03:00:56 avvcbimage Info <5174>: - Reading /usr/local/avamarclient/var/avvcbimage.cmd

2018-02-28 03:00:56 avvcbimage Info <5174>: - Reading /usr/local/avamarclient/var/avvcbimageAll.cmd

2018-02-28 03:00:56 avvcbimage Info <6673>: CTL listening on port 39926

2018-02-28 03:00:56 avvcbimage Info <10684>: Setting ctl message version to 3 (from 1)

2018-02-28 03:00:56 avvcbimage Info <16136>: Setting ctl max message size to 268435456

2018-02-28 03:00:56 avvcbimage Info <40636>: SSL Certificate Thumbprint: DD:89:9D:6D:77:75:98:AE:10:BB:66:DB:28:12:5C:9E:D3:D9:14:0D

2018-02-28 03:00:56 avvcbimage Info <42137>: job type = 4

2018-02-28 03:00:56 avvcbimage Info <18664>: Login(https://drma2c01vcs01.cec.lab.emc.com:443/sdk) Datacenter: 'DRMA2C01VCS01DC1'

2018-02-28 03:00:56 avvcbimage Info <19728>: - connected to 'VirtualCenter' - version: 'VMware vCenter Server 6.0.0 build-5318203',  apiVersion:'6.0'

2018-02-28 03:00:56 avvcbimage Info <42146>: Login by loginWS test is OK.

2018-02-28 03:00:57 avvcbimage Info <40726>: VM Config Info: VMX version 4(F), ctk enabled(T), fault tolerant (F), Guest OS Microsoft Windows Server 2012 (64-bit), Cloud UUID = NoUUID. version=10

2018-02-28 03:00:57 avvcbimage Info <40727>: VM Guest Info: GuestOS Family = unknown, Guest Full Name = Microsoft Windows Server 2012 (64-bit), Guest OS State = notRunning, Tools Running = guestToolsNotRunning,  Tools Version Status = guestToolsSupportedOld

2018-02-28 03:00:57 avvcbimage Info <14625>: WorkOrder Disk Info:

  Prior Disk '2000': file(base):'[DRMA2C01VCS01-VMAX02-VOL20] hpdp9-dcdns/hpdp9-dcdns.vmdk', backItUp=1

               snapshot file:'[DRMA2C01VCS01-VMAX02-VOL20] hpdp9-dcdns/hpdp9-dcdns.vmdk'

               prior size(KB):104857600, current size(KB):104857600, match=1

               prior change block ID:'52 d3 eb c9 d4 2e 68 01-f6 c1 59 60 5d 14 bc 31/1454'

Datastore:'DRMA2C01VCS01-VMAX02-VOL20' Directly Accessible=1

2018-02-28 03:00:57 avvcbimage Info <40637>: Proxy Host Name=drma2c01vcs01esx11.cec.lab.emc.com(6.0 build-5572656)

2018-02-28 03:00:57 avvcbimage Info <40638>: VM Host Name=drma2c01vcs01esx10.cec.lab.emc.com(6.0 build-5572656)

2018-02-28 03:00:57 avvcbimage Info <19547>: Using Simultaneous Workorders method

2018-02-28 03:00:57 avvcbimage Info <19654>: Obtain cache lock (preferred proxy: 1)

2018-02-28 03:00:57 avvcbimage Info <19655>: skipping cache lock file (1)

2018-02-28 03:00:57 avvcbimage Info <19655>: skipping cache lock file (2)

2018-02-28 03:00:57 avvcbimage Info <19655>: skipping cache lock file (3)

2018-02-28 03:00:57 avvcbimage Info <19652>: Updated cache lookup

2018-02-28 03:00:57 avvcbimage Info <19587>: Create cache lock file succeeded

2018-02-28 03:00:57 avvcbimage Info <19588>: usLockFile: /usr/local/avamarclient/bin/var4.lck

2018-02-28 03:00:57 avvcbimage Info <11979>: proxy VM DNS name is: 127.0.0.2,drma2c01vcs01aveproxy11.cec.lab.emc.com,drma2c01vcs01aveproxy11,10.207.5.22

2018-02-28 03:00:57 avvcbimage Info <18672>: proxy VM Instance name is: drma2c01vcs01aveproxy11-proxy-4

2018-02-28 03:00:57 avvcbimage Info <17793>: proxy VM Moref is: vm-640

2018-02-28 03:00:57 avvcbimage Info <16010>: vCenter 'DRMA2C01VCS01DC1' is drma2c01vcs01.cec.lab.emc.com

2018-02-28 03:00:57 avvcbimage Info <11981>: VM's host is drma2c01vcs01esx10.cec.lab.emc.com

2018-02-28 03:00:57 avvcbimage Info <11982>: VM's primary storage location is [DRMA2C01VCS01-VMAX02-VOL20] hpdp9-dcdns/hpdp9-dcdns.vmx

2018-02-28 03:00:57 avvcbimage Info <11983>: VM's vCenter ID is vm-990

2018-02-28 03:00:57 avvcbimage Info <19658>: change block file: VmImage4_changefilemap.tmp

2018-02-28 03:00:57 avvcbimage Info <11984>: VM's backup account on 10.207.5.10 is /drma2c01vcs01.cec.lab.emc.com/ContainerClients/hpdp9-dcdns_UDJvHq24dY59jpyqnuEFSg

2018-02-28 03:00:57 avvcbimage Info <11986>: Changed block tracking is engaged for this VM

2018-02-28 03:00:57 avvcbimage Info <11987>: Reference backup is #247 captured on 2018-02-27 03:55:51

2018-02-28 03:00:57 avvcbimage Info <14680>: A reference to a valid prior backup is available so this will be an incremental.

2018-02-28 03:00:57 avvcbimage Info <19549>: metadata tmp dir: /usr/local/avamarclient/var/vmware/4/temp

2018-02-28 03:00:57 avvcbimage Info <40640>: create tmpDir: /usr, failed (-1)

2018-02-28 03:00:57 avvcbimage Info <40640>: create tmpDir: /usr/local, failed (-1)

2018-02-28 03:00:57 avvcbimage Info <40640>: create tmpDir: /usr/local/avamarclient, failed (-1)

2018-02-28 03:00:57 avvcbimage Info <40640>: create tmpDir: /usr/local/avamarclient/var, failed (-1)

2018-02-28 03:00:57 avvcbimage Info <40640>: create tmpDir: /usr/local/avamarclient/var/vmware, failed (-1)

2018-02-28 03:00:57 avvcbimage Info <40640>: create tmpDir: /usr/local/avamarclient/var/vmware/4, failed (-1)

2018-02-28 03:00:57 avvcbimage Info <40640>: create tmpDir: /usr/local/avamarclient/var/vmware/4/temp, failed (-1)

2018-02-28 03:00:57 avvcbimage Info <40654>: isExitOK()=0

2018-02-28 03:00:57 avvcbimage Info <17823>: Body- abortrecommended(f)

2018-02-28 03:00:57 avvcbimage Info <19660>: targetlist contains

2018-02-28 03:00:57 avvcbimage Info <5174>: - Reading /usr/local/avamarclient/var/avvcbimage.cmd

2018-02-28 03:00:57 avvcbimage Info <5174>: - Reading /usr/local/avamarclient/var/avvcbimageAll.cmd

2018-02-28 03:00:57 avvcbimage Info <7084>: target[0]=[DRMA2C01VCS01-VMAX02-VOL20] hpdp9-dcdns/hpdp9-dcdns.vmdk

2018-02-28 03:00:57 avvcbimage Info <40658>: vmparams (drma2c01vcs01.cec.lab.emc.com)

2018-02-28 03:00:57 avvcbimage Info <40654>: isExitOK()=0

2018-02-28 03:00:57 avvcbimage Info <18664>: Login(https://drma2c01vcs01.cec.lab.emc.com:443/sdk) Datacenter: 'DRMA2C01VCS01DC1'

2018-02-28 03:00:57 avvcbimage Info <19728>: - connected to 'VirtualCenter' - version: 'VMware vCenter Server 6.0.0 build-5318203',  apiVersion:'6.0'

2018-02-28 03:00:57 avvcbimage Info <42146>: Login by loginWS test is OK.

2018-02-28 03:00:57 avvcbimage Info <18664>: Login(https://drma2c01vcs01.cec.lab.emc.com:443/sdk) Datacenter: 'DRMA2C01VCS01DC1'

2018-02-28 03:00:57 avvcbimage Info <19728>: - connected to 'VirtualCenter' - version: 'VMware vCenter Server 6.0.0 build-5318203',  apiVersion:'6.0'

2018-02-28 03:00:57 avvcbimage Info <42146>: Login by loginWS test is OK.

2018-02-28 03:00:57 avvcbimage Info <16041>: VDDK:OBJLIB-LIB: Objlib initialized.

2018-02-28 03:00:57 avvcbimage Info <16041>: VDDK:VixDiskLib: Attempting to locate advanced transport module in "/usr/lib/vmware-vix-disklib".

2018-02-28 03:00:57 avvcbimage Info <16041>: VDDK:VixDiskLib: Advanced transport plugin /usr/lib/vmware-vix-disklib/lib64/libdiskLibPlugin.so was successfully loaded into vixDiskLib.

2018-02-28 03:00:57 avvcbimage Info <16041>: VDDK:VixDiskLib: Enabling advanced transport modes.

2018-02-28 03:00:57 avvcbimage Info <16041>: VDDK:VixDiskLib: Initialize transport modes.

2018-02-28 03:00:57 avvcbimage Info <16041>: VDDK:VixDiskLib: Available transport modes: file:san:hotadd:nbdssl:nbd.

2018-02-28 03:00:57 avvcbimage Info <16041>: VDDK:VMware VixDiskLib (6.0) Release build-2942432

2018-02-28 03:00:57 avvcbimage Info <16041>: VDDK:VixDiskLibVim: VixDiskLibVim_Init: Initialization is completed.

2018-02-28 03:00:57 avvcbimage Info <9666>: Available transport modes are file:san:hotadd:nbdssl:nbd

2018-02-28 03:00:57 avvcbimage Info <16041>: VDDK:VixDiskLib: VixDiskLib_PrepareForAccess: Prepare to access disk.

2018-02-28 03:00:57 avvcbimage Info <16041>: VDDK:VixDiskLib: VixDiskLib_Connect: Establish connection.

2018-02-28 03:00:57 avvcbimage Info <16041>: VDDK:VixDiskLib: Resolve host.

2018-02-28 03:00:57 avvcbimage Info <16041>: VDDK:VixDiskLibVim: VixDiskLibVim_AllowVMotion: Disable VMotion.

2018-02-28 03:00:58 avvcbimage Info <16041>: VDDK:VixDiskLibVim: VixDiskLibVim_FreeNfcTicket: Free NFC ticket.

2018-02-28 03:00:58 avvcbimage Info <17818>: VixDiskLib vMotion reservation successfully enabled by 'EMC Avamar VM Backup Proxy'

2018-02-28 03:00:58 avvcbimage Info <16011>: Running cleanup of old backups.

2018-02-28 03:00:58 avvcbimage Info <16041>: VDDK:VixDiskLib: VixDiskLib_Cleanup: Remove previous mount points and clean up .

2018-02-28 03:00:58 avvcbimage Info <16041>: VDDK:VixDiskLib: VixDiskLib_Connect: Establish connection.

2018-02-28 03:00:58 avvcbimage Info <16041>: VDDK:VixDiskLib: Resolve host.

2018-02-28 03:00:58 avvcbimage Info <16041>: VDDK:VixDiskLib: VixDiskLib_Disconnect: Disconnect.

2018-02-28 03:00:58 avvcbimage Info <16048>: VixDiskLib_Cleanup() found no prior mounts to cleanup.

2018-02-28 03:00:58 avvcbimage Info <18664>: Login(https://drma2c01vcs01.cec.lab.emc.com:443/sdk) Datacenter: 'DRMA2C01VCS01DC1'

2018-02-28 03:00:58 avvcbimage Info <19728>: - connected to 'VirtualCenter' - version: 'VMware vCenter Server 6.0.0 build-5318203',  apiVersion:'6.0'

2018-02-28 03:00:58 avvcbimage Info <42146>: Login by loginWS test is OK.

2018-02-28 03:00:58 avvcbimage Info <42215>: SnapshotManager: initializing at 10.207.5.10:8543, vCenter drma2c01vcs01.cec.lab.emc.com(ave@vsphere.local), datacenter(/DRMA2C01VCS01DC1), httpsTimeout_ms(60000), maxDeleteRetries(-1)

2018-02-28 03:00:58 avvcbimage Info <42191>: GET: curl_easy_getinfo succeeded with httpcode(200)

2018-02-28 03:00:58 avvcbimage Info <42195>: AvSnapMgr::getData Succeeded httpCode: 200

2018-02-28 03:00:58 avvcbimage Info <42162>: SnapshotManager version:7.3.1.125

2018-02-28 03:00:58 avvcbimage Info <40642>: Using SnapShot Manager - with fallback to proxy calling vSphere API functions.

2018-02-28 03:00:58 avvcbimage Info <18648>: The snapshot '' could not be removed.

2018-02-28 03:00:58 avvcbimage Info <40751>: Datastore DRMA2C01VCS01-VMAX02-VOL20

2018-02-28 03:00:58 avvcbimage Info <14664>: no snapshots are present on this VM.

2018-02-28 03:00:58 avvcbimage Info <42225>: (CURL) Logging into datacenter path of 'folder?dcPath=DRMA2C01VCS01DC1', DRMA2C01VCS01DC1 => DRMA2C01VCS01DC1

2018-02-28 03:00:58 avvcbimage Info <42226>: (Legacy) Logging into datacenter path of 'folder?dcPath=DRMA2C01VCS01DC1', DRMA2C01VCS01DC1 => DRMA2C01VCS01DC1

2018-02-28 03:00:58 avvcbimage Info <41153>: Using CURL Method

2018-02-28 03:00:58 avvcbimage Info <41154>: Re-using vCS Session Cookie

2018-02-28 03:01:54 avvcbimage Info <41153>: Using CURL Method

2018-02-28 03:01:54 avvcbimage Info <41154>: Re-using vCS Session Cookie

2018-02-28 03:03:04 avvcbimage Info <41153>: Using CURL Method

2018-02-28 03:03:04 avvcbimage Info <41154>: Re-using vCS Session Cookie

2018-02-28 03:04:14 avvcbimage Info <41153>: Using CURL Method

2018-02-28 03:04:14 avvcbimage Info <41154>: Re-using vCS Session Cookie

2018-02-28 03:05:24 avvcbimage Info <41155>: Using Legacy Method

2018-02-28 03:09:46 avvcbimage Info <14673>: All Datastore Info:

  datastore:'CCEContent3                   '  capacity=139949932806144  free=132734681874432

  datastore:'DRMA2C01VCS01-VMAX02-EFD01    ' capacity=1202322407424 free=101764300800

  datastore:'DRMA2C01VCS01-VMAX02-VOL01    ' capacity=65968550182912 free=38528543096832

  datastore:'DRMA2C01VCS01-VMAX02-VOL02    ' capacity=65968550182912 free=46557911056384

  datastore:'DRMA2C01VCS01-VMAX02-VOL03    ' capacity=65968550182912 free=39234200141824

  datastore:'DRMA2C01VCS01-VMAX02-VOL04    ' capacity=65968550182912 free=40757392048128

  datastore:'DRMA2C01VCS01-VMAX02-VOL05    ' capacity=65968550182912 free=43332288905216

  datastore:'DRMA2C01VCS01-VMAX02-VOL06    ' capacity=65968550182912 free=43685644337152

  datastore:'DRMA2C01VCS01-VMAX02-VOL07    ' capacity=65968550182912 free=33379221241856

  datastore:'DRMA2C01VCS01-VMAX02-VOL08    ' capacity=65968550182912  free=34771219513344

  datastore:'DRMA2C01VCS01-VMAX02-VOL09    ' capacity=65968550182912 free=32056532795392

  datastore:'DRMA2C01VCS01-VMAX02-VOL10    ' capacity=65968550182912 free=32769762656256

  datastore:'DRMA2C01VCS01-VMAX02-VOL11    ' capacity=65968550182912 free=26953420636160

  datastore:'DRMA2C01VCS01-VMAX02-VOL12    ' capacity=65968550182912 free=29590419406848

  datastore:'DRMA2C01VCS01-VMAX02-VOL13    ' capacity=65968550182912 free=32711476510720

  datastore:'DRMA2C01VCS01-VMAX02-VOL14    ' capacity=65968550182912 free=35714111635456

  datastore:'DRMA2C01VCS01-VMAX02-VOL15    ' capacity=65968550182912 free=30390330851328

  datastore:'DRMA2C01VCS01-VMAX02-VOL16    ' capacity=65968550182912 free=22076753707008

  datastore:'DRMA2C01VCS01-VMAX02-VOL17    ' capacity=65968550182912 free=41567493554176

  datastore:'DRMA2C01VCS01-VMAX02-VOL18    ' capacity=65968550182912 free=33899473272832

  datastore:'DRMA2C01VCS01-VMAX02-VOL19    ' capacity=65968550182912 free=20142536785920

  datastore:'DRMA2C01VCS01-VMAX02-VOL20    ' capacity=65968550182912 free=34055623016448

  datastore:'DRMA2C01VCS01C2-VMAX02-VOL01  ' capacity=16491600674816 free=3386855391232

datastore:'DRMA2C01VCS01C2-VMAX02-VOL01_old'  capacity=49476412637184  free=49339816738816

  datastore:'DRMA2C01VCS01C2-VMAX02-VOL02  ' capacity=16491600674816 free=3535238332416

datastore:'DRMA2C01VCS01C2-VMAX02-VOL02_old'  capacity=49476412637184  free=49475122888704

  datastore:'DRMA2C01VCS01C2-VMAX02-VOL03  ' capacity=16491600674816 free=3655169212416

  datastore:'DRMA2C01VCS01C2-VMAX02-VOL04  ' capacity=16491600674816 free=3667865370624

  datastore:'DRMA2C01VCS01C2-VMAX02-VOL05  ' capacity=16491600674816 free=3296557268992

  datastore:'Resources                     '  capacity=139949932806144  free=132734469537792

2018-02-28 03:09:46 avvcbimage Info <14675>: All VM Info (13/13):

  VM:'DRMA2VRAAGENT1/                         '  last modified on '06-Oct-2017 14:02'

  VM:'ESXi-6-host/                            '  last modified on '20-Sep-2017 19:08'

  VM:'SLES11SP4-base/                         '  last modified on '23-Feb-2018 14:18'

  VM:'ddst-ddve60-13/                         '  last modified on '18-Oct-2017 13:14'

  VM:'ddst-ddve60-14/                         '  last modified on '21-Jan-2018 03:29'

  VM:'ddst-ddve60-15/                         '  last modified on '02-Dec-2017 22:07'

  VM:'ddst-ddve60-18/                         '  last modified on '12-Jan-2018 19:41'

  VM:'ddst-ddve61-101/                        '  last modified on '17-Nov-2017 20:29'

  VM:'drma2c01vcs01aveproxy20/                '  last modified on '27-Feb-2018 04:34'

  VM:'dutivm443_template_Aug2016/             ' last modified on '16-Mar-2017 21:25'

  VM:'hpdp9-dcdns/                            '  last modified on '27-Feb-2018 04:07'

  VM:'hpdp9-rhel/                             '  last modified on '27-Feb-2018 04:00'

  VM:'hpdp9-server/                           '  last modified on '27-Feb-2018 03:58'

2018-02-28 03:09:46 avvcbimage Info <42204>: The download path of VM 'hpdp9-dcdns/' is (CURL) '/folder/hpdp9%2ddcdns?dcPath=DRMA2C01VCS01DC1&dsName=DRMA2C01VCS01%252dVMAX02%252dVOL20' or (LEGACY) '/folder/hpdp9%2ddcdns?dcPath=DRMA2C01VCS01DC1&dsName=DRMA2C01VCS01%252dVMAX02%252dVOL20'

2018-02-28 03:09:46 avvcbimage Info <41153>: Using CURL Method

2018-02-28 03:09:46 avvcbimage Info <41154>: Re-using vCS Session Cookie

2018-02-28 03:10:56 avvcbimage Info <41153>: Using CURL Method

2018-02-28 03:10:56 avvcbimage Info <41154>: Re-using vCS Session Cookie

2018-02-28 03:12:06 avvcbimage Info <41153>: Using CURL Method

2018-02-28 03:12:06 avvcbimage Info <41154>: Re-using vCS Session Cookie

2018-02-28 03:13:16 avvcbimage Info <41155>: Using Legacy Method

2018-02-28 03:21:43 avvcbimage Warning <15995>: HTTP fault detected, Problem with returning page from HTTP, Msg:'SOAP 1.1 fault: SOAP-ENV:Client [no subcode]

"End of file or no input: Resource temporarily unavailable"

Detail: [no detail]

2018-02-28 03:21:53 avvcbimage Info <41155>: Using Legacy Method

2018-02-28 03:28:20 avvcbimage Info <14677>: All VM file Info:

  File:'hpdp9-dcdns-aux.xml                               ' last modified on '20-Jul-2017 03:25'  size=13

  File:'hpdp9-dcdns-c64f5e67.hlog                         ' last modified on '17-Jun-2017 17:27'  size=89

  File:'hpdp9-dcdns-ctk.vmdk                              ' last modified on '27-Feb-2018 04:07'  size=6554112

  File:'hpdp9-dcdns-flat.vmdk                             ' last modified on '20-Jul-2017 18:44'  size=107374182400

  File:'hpdp9-dcdns.nvram                                 ' last modified on '20-Jul-2017 18:44' size=8684

  File:'hpdp9-dcdns.vmdk                                  ' last modified on '27-Feb-2018 04:07'  size=587

  File:'hpdp9-dcdns.vmsd                                  ' last modified on '27-Feb-2018 04:07'  size=45

  File:'hpdp9-dcdns.vmx                                   ' last modified on '27-Feb-2018 04:07' size=2685

  File:'vmware-88.log                                     ' last modified on '14-Jun-2017 16:15'  size=174671

  File:'vmware-89.log                                     ' last modified on '14-Jun-2017 19:29' size=176797

  File:'vmware-90.log                                     ' last modified on '14-Jun-2017 21:07' size=174595

  File:'vmware-91.log                                     ' last modified on '15-Jun-2017 00:27'  size=202899

  File:'vmware-92.log                                     ' last modified on '15-Jun-2017 00:49' size=173961

  File:'vmware-93.log                                     ' last modified on '17-Jun-2017 17:27' size=174085

  File:'vmware.log                                        ' last modified on '20-Jul-2017 18:44' size=1705083

2018-02-28 03:28:20 avvcbimage Info <19704>: DataStore Storage Info:DRMA2C01VCS01-VMAX02-VOL20 capacity=65968550182912 free=34055623016448

2018-02-28 03:28:20 avvcbimage Info <19716>: DS Capacity=65968550182912 FreeSpace=34055623016448 / HD committed=107376975860   unCommitted=8783028811     unShared=107374182400 

2018-02-28 03:28:20 avvcbimage Info <41153>: Using CURL Method

2018-02-28 03:28:20 avvcbimage Info <41154>: Re-using vCS Session Cookie

2018-02-28 03:29:30 avvcbimage Info <41153>: Using CURL Method

2018-02-28 03:29:30 avvcbimage Info <41154>: Re-using vCS Session Cookie

2018-02-28 03:30:40 avvcbimage Info <41153>: Using CURL Method

2018-02-28 03:30:40 avvcbimage Info <41154>: Re-using vCS Session Cookie

2018-02-28 03:30:50 avvcbimage Info <41155>: Using Legacy Method

2018-02-28 03:30:50 avvcbimage Info <40756>: Attempting to Download file:/folder/hpdp9%2ddcdns/hpdp9%2ddcdns%2evmx?dcPath=DRMA2C01VCS01DC1&dsName=DRMA2C01VCS01%252dVMAX02%252dVOL20

2018-02-28 03:30:50 avvcbimage Warning <15995>: HTTP fault detected, Problem with returning page from HTTP, Msg:'SOAP 1.1 fault: SOAP-ENV:Server [no subcode]

"HTTP/1.1 503 Service Unavailable"

Detail: Request rejected because there are already 500 /folder requests in progress.

2018-02-28 03:31:00 avvcbimage Info <41155>: Using Legacy Method

2018-02-28 03:31:00 avvcbimage Info <40756>: Attempting to Download file:/folder/hpdp9%2ddcdns/hpdp9%2ddcdns%2evmx?dcPath=DRMA2C01VCS01DC1&dsName=DRMA2C01VCS01%252dVMAX02%252dVOL20

2018-02-28 03:31:00 avvcbimage Warning <15995>: HTTP fault detected, Problem with returning page from HTTP, Msg:'SOAP 1.1 fault: SOAP-ENV:Server [no subcode]

"HTTP/1.1 503 Service Unavailable"

Detail: Request rejected because there are already 500 /folder requests in progress.

2018-02-28 03:31:10 avvcbimage Info <41155>: Using Legacy Method

2018-02-28 03:31:10 avvcbimage Info <40756>: Attempting to Download file:/folder/hpdp9%2ddcdns/hpdp9%2ddcdns%2evmx?dcPath=DRMA2C01VCS01DC1&dsName=DRMA2C01VCS01%252dVMAX02%252dVOL20

2018-02-28 03:31:10 avvcbimage Warning <15995>: HTTP fault detected, Problem with returning page from HTTP, Msg:'SOAP 1.1 fault: SOAP-ENV:Server [no subcode]

"HTTP/1.1 503 Service Unavailable"

Detail: Request rejected because there are already 500 /folder requests in progress.

2018-02-28 03:31:20 avvcbimage Warning <40650>: Download VM .vmx file failed.

2018-02-28 03:31:20 avvcbimage Error <17821>: failed to download vm metadata, try later

2018-02-28 03:31:20 avvcbimage Info <9772>: Starting graceful (staged) termination, failed to download vm metadata (wrap-up stage)

2018-02-28 03:31:20 avvcbimage Error <0000>: [IMG0009] createSnapshot: snapshot creation  or pre/post snapshot script failed

2018-02-28 03:31:20 avvcbimage Error <0000>: [IMG0009] createSnapshot: snapshot creation/pre-script/post-script failed

2018-02-28 03:31:20 avvcbimage Info <40654>: isExitOK()=157

2018-02-28 03:31:20 avvcbimage Info <40659>: snapshot created:false NOMC:false ChangeBlTrackingAvail:true UsingChBl:true, ExitOK:false, cancelled:false, fatal: true

2018-02-28 03:31:20 avvcbimage Info <40654>: isExitOK()=157

2018-02-28 03:31:20 avvcbimage Info <40660>: vcbimage_progress::terminate

2018-02-28 03:31:20 avvcbimage Info <16041>: VDDK:VixDiskLib: VixDiskLib_EndAccess: Disk access completed.

2018-02-28 03:31:20 avvcbimage Info <16041>: VDDK:VixDiskLib: VixDiskLib_Connect: Establish connection.

2018-02-28 03:31:20 avvcbimage Info <16041>: VDDK:VixDiskLib: Resolve host.

2018-02-28 03:31:20 avvcbimage Info <16038>: Final summary, cancelled/aborted 0, snapview 0, exitcode 157: miscellaneous error

2018-02-28 03:31:20 avvcbimage Info <16041>: VDDK:MemoryDbMappingsExpire: Expiring SSL ID mapping, hostname 10.207.2.190 SSL ID dd:89:9d:6d:77:75:98:ae:10:bb:66:db:28:12:5c:9e:d3:d9:14:0d

2018-02-28 03:31:20 avvcbimage Info <16041>: VDDK:VixDiskLibVim: VixDiskLibVim_AllowVMotion: Enable VMotion.

2018-02-28 03:31:20 avvcbimage Info <16041>: VDDK:VixDiskLibVim: VixDiskLibVim_FreeNfcTicket: Free NFC ticket.

2018-02-28 03:31:20 avvcbimage Info <17819>: VixDiskLib vMotion reservation successfully released

END avvcbimage log 2018-02-28 03:31:25 UTC (5 warnings, 3 errors, 0 fatal errors)

7 Posts

February 28th, 2018 03:00

There is a known issue in 6 U2 with vCenter port mapping connections that is sorted out in U3. Not sure if this is the actual issue you're hitting, but it might help.

5 Practitioner • 274.2K Posts

February 28th, 2018 04:00

Sorry for muddying the waters a little. I've just checked all our vCenters' build versions, since we've been upgrading some of them recently, and some are already on 6 U3.

This particular vCenter is 6 U3 (build 5318203).

2K Posts

February 28th, 2018 06:00

Detail: Request rejected because there are already 500 /folder requests in progress

This message is coming back to Avamar from vSphere when we try to download the VMX. I suspect the vCenter or ESXi host is receiving a lot of HTTP requests at that time. If you're kicking off a lot of concurrent backups, you may need to throttle them back or stagger them, or see if there is some tuning you can do on the VMware side.
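To illustrate the client-side idea only (this is not how Avamar schedules its proxies, just the general throttling pattern): cap how many downloads can be in flight at once so the /folder endpoint never sees more than a handful of concurrent requests from your side. A rough Python sketch:

from concurrent.futures import ThreadPoolExecutor
import requests

MAX_IN_FLIGHT = 8   # well under vCenter's 500 in-progress /folder request ceiling
vmx_urls = []       # fill with one /folder URL per VM, as in the log above

def fetch(url):
    # auth/certificate handling omitted; see the manual download example above
    return requests.get(url, verify=False).status_code

# The pool size bounds concurrency, so the requests are naturally staggered
with ThreadPoolExecutor(max_workers=MAX_IN_FLIGHT) as pool:
    results = list(pool.map(fetch, vmx_urls))

print(results)      # mostly 200s once the endpoint is no longer saturated

On the Avamar side the equivalent lever is simply how many backups you allow to start at the same time.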

498 Posts

March 6th, 2018 13:00

I agree with Ian.

I was told not to have more than 100 servers in a VM policy and to stagger their start times. It looks like the limit is 500, so maybe put only 450 in a policy and stagger the policies by 30 minutes to give vCenter time to queue them up.

If you currently have everything in one policy, try creating a new one with fewer than 500 VMs and start it first, then start your other policy with all the rest, and see if the smaller policy completes OK. A rough sketch of what the split and stagger could look like is below.
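Just to make the numbers concrete, here is a small Python sketch of the split and stagger (the 450-per-policy and 30-minute figures are the ones above; the VM list and start time are made up):

from datetime import datetime, timedelta

vms = [f"vm-{i:04d}" for i in range(1, 1201)]   # hypothetical list of 1200 clients
PER_POLICY = 450                                # stay under the 500 /folder ceiling
STAGGER = timedelta(minutes=30)                 # gap between policy start times
first_start = datetime(2018, 3, 8, 3, 0)        # example backup-window start

policies = [vms[i:i + PER_POLICY] for i in range(0, len(vms), PER_POLICY)]
for n, members in enumerate(policies):
    start = first_start + n * STAGGER
    print(f"Policy {n + 1}: {len(members)} VMs, start at {start:%H:%M}")

# Policy 1: 450 VMs, start at 03:00
# Policy 2: 450 VMs, start at 03:30
# Policy 3: 300 VMs, start at 04:00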

5 Practitioner • 274.2K Posts

March 7th, 2018 04:00

Ian and J.H., thank you both. I'll check whether concurrency and the overall number of VMs being backed up at any given moment might be the issue.

1 Message

August 6th, 2019 10:00

Did the issue ever get resolved? We're having the same "failed to download vm metadata" issue. Our vCenter was upgraded from 5.5 to 6.5 U2; on 5.5 it was working fine.

 

Thanks,
