Hans Kraus | 29 Jul 18:45 2014

Pool and cpool vanished

Hi,

I had to fix the file system (xfs) of my backup server after a hard disk
failure.

BackupPC ran normally at first, but now it doesn't work any more. The
status output of the HTML UI is:
----------------%x-----------------%x---------------%x-----------------
0 pending backup requests from last scheduled wakeup,
0 pending user backup requests,
0 pending command requests,
Pool is 0.00GB comprising files and directories (as of 7/29 16:09),
Pool hashing gives repeated files with longest chain ,
Nightly cleanup removed 0 files of size 0.00GB (around 7/29 16:09),
Pool file system was recently at 46% (7/29 16:08), today's max is 48% 
(7/28 15:38) and yesterday's max was %.
----------------%x-----------------%x---------------%x-----------------
I ran the following tools:
BackupPC_fixupBackupSummary which reported no faults.
BackupPC_trashClean
BackupPC_nightly 0 255, which gave the following (truncated) output:
----------------%x-----------------%x---------------%x-----------------
BackupPC_stats 0 = pool,600,18,1634472,632024,0,0,0,0,0,7,1051
BackupPC_stats 1 = pool,620,17,1590832,497636,0,0,0,0,0,7,2150
BackupPC_stats 2 = pool,612,17,1443460,313624,0,0,0,0,0,5,3237
BackupPC_stats 3 = pool,594,17,1597464,591800,0,0,0,0,0,7,4286
:
:
BackupPC_stats 252 = pool,585,17,1776696,643628,0,0,0,0,0,7,270040
BackupPC_stats 253 = pool,592,17,1850752,742680,0,0,0,0,0,7,271076
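
Since the status page now reports a 0.00GB pool, a minimal first check (a hedged sketch; the pool location and mount point below are assumptions, adjust them to your installation) is whether the pool trees actually survived the xfs repair and whether anything ended up in lost+found:

TopDir=/var/lib/backuppc                 # assumed default; use your real TopDir
du -sh "$TopDir/pool" "$TopDir/cpool"    # are the trees still populated?
find "$TopDir/cpool" -type f | head      # sample a few remaining pool files
# xfs_repair places disconnected inodes in lost+found at the mount point of
# the repaired filesystem; orphaned pool files may have ended up there.
ls /backup/lost+found | head             # replace /backup with your mount point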

Vaughan Roberts | 27 Jul 04:46 2014

Error in backuppc log file: unix bind() failed: No such file or directory

Hi,

 

I have been using BackupPC on CentOS for over eight years, but now I am trying to upgrade my server to CentOS 7. Unfortunately there does not seem to be a BackupPC package for CentOS 7 yet, so I tried using BackupPC-3.3.0-2.el6.x86_64.rpm.

 

CentOS 7 uses Apache 2.4, which required me to mess around with the BackupPC.conf file, but I think that is now OK as I can access BackupPC from a browser.

 

However, it looks like I have a permissions problem somewhere and I am not sure exactly what the cause is.  The extract from the BackupPC log file is:

 

Reading hosts file

Adding host …

Adding host …

Unix bind(): failed: No such file or directory

 

When I connect a browser to localhost/backuppc, I get the expected screen and I can browse the backups, but this screen also gives me this error:

Error: Unable to connect to BackupPC server
This CGI script (/sbin/BackupPC_Admin) is unable to connect to the BackupPC server on localhost port -1.
The error was: unix connect: No such file or directory.
Perhaps the BackupPC server is not running or there is a configuration error. Please report this to your Sys Admin.

Naturally the service is running and I have selinux in permissive mode.  This error occurs both on localhost and from a computer on my LAN, so it should not be a firewall issue.  I have checked permissions in /var/lib/BackupPC/* and they are all owned by backuppc: this directory is a mounted logical volume.  I use rsync on the pcs for backup purposes and at the moment no backups are occurring.

 

Can anyone point me to where this error is likely to be coming from?

 

OK, I have found the issue, and my apologies for an empty posting (it was probably caused by me digitally signing my previous e-mail).  The issue was that /var/run/BackupPC did not exist.  I created that, but now I have another issue when I try to start a backup, this time in Perl:

/usr/bin/perl: symbol lookup error: /usr/lib64/perl5/vendor_perl/auto/File/RsyncP/Digest/Digest.so undefined symbol: Perl_Gthr_key_ptr
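
The Perl_Gthr_key_ptr symbol error usually means that Digest.so was built for a different perl than the one now running (e.g. an el6-built File::RsyncP module loaded by the CentOS 7 perl), rather than a BackupPC problem as such. A hedged sketch of rebuilding File::RsyncP from CPAN against the CentOS 7 perl (package names are assumptions):

# build prerequisites (assumed package names)
yum install -y gcc make perl-devel perl-CPAN
# rebuild and install the module against the running perl
cpan File::RsyncP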

 

After a bit of investigation it looks like I am stuffed.  The latest version of File-RsyncP-Digest (0.70) I could find requires rsync 2.x, up to protocol version 28.  CentOS 7 has rsync 3.0.9, protocol version 30.
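
For what it's worth, rsync 3.x clients can generally still negotiate the older protocol 28 that File::RsyncP speaks, so the newer rsync on CentOS 7 is not necessarily fatal by itself. Separately, since /var/run is on tmpfs on CentOS 7, the /var/run/BackupPC directory created by hand above will disappear at the next reboot; a hedged sketch of a tmpfiles.d entry that recreates it at boot (path, owner and mode are assumptions based on the el6 package defaults):

# /etc/tmpfiles.d/backuppc.conf  -- recreate the runtime directory at boot
d /run/BackupPC 0750 backuppc backuppc -

# apply it immediately without rebooting:
systemd-tmpfiles --create /etc/tmpfiles.d/backuppc.conf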

 

I note that backuppc v4 uses rsync v3.0.9 (in place of File::RsyncP) but is still in alpha release.  How stable is this release?

 

Best regards,
Vaughan

Russell R Poyner | 24 Jul 21:36 2014

signal to Kill DumpPreUserCmd

I have a bash script that runs on the BackupPC server as DumpPreUserCmd.

I'd like to have the script catch a signal and clean up its files if a
backup gets canceled while the script is running. So far I'm trapping
INT, TERM, ABRT and ALRM but not getting what I want.

Does BackupPC send SIGKILL to the DumpPreUserCmd process? Or something 
else I haven't thought of?

The value of $Conf{UserCmdCheckStatus} seems to not matter.
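
For what it's worth, a minimal sketch of the cleanup pattern (the flag path is hypothetical, and note that SIGKILL cannot be trapped, so if the server really does send KILL the script cannot clean up after itself and an external sweep is still needed):

#!/bin/bash
# Hedged sketch of a DumpPreUserCmd wrapper; $1 is assumed to be the host name.
host="$1"
flag="/var/www/html/backuppc-flags/${host}.html"   # hypothetical web-readable path

cleanup() { rm -f "$flag"; }
trap cleanup EXIT                    # fires on normal exit and any trapped signal
trap 'exit 1' INT TERM HUP ABRT ALRM

touch "$flag"
# ... poll the Windows client here until its rsyncd answers ...

trap - EXIT                          # success: leave the flag for DumpPostUserCmd
exit 0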

Background:

This is in the context of a method to create shadow copies and start 
rsyncd on windows clients without having to remotely execute anything on 
the windows box via ssh or winexe.

1. The server starts our DumpPreUserCmd bash script which creates a file 
called <hostname>.html in a web-readable directory. It then polls the 
windows machine to see if rsyncd has started. Once windows starts its
rsyncd, the PreUser script exits so that the dump can start.

2. The windows machine runs a script in task_scheduler every 5 minutes 
to see if the file <hostname>.html exists in the special directory on 
the BackupPC web server. If it does the windows box runs a powershell 
script that creates shadow copies, starts rsyncd and opens a firewall 
hole to allow the backup.

3. On the server when the dump completes DumpPostUserCmd runs and 
removes the <hostname>.html file.

4. When the periodic task on the windows machine no longer finds the 
<hostname>.html file it stops the rsyncd service, deletes the shadow 
copies, and closes the firewall hole.

It works fine unless the backup gets interrupted while DumpPreUserCmd is
running and waiting for windows to start its rsyncd service. In that
case the <hostname>.html file gets orphaned.

I *could* create a cron job on the server to look for and remove 
orphaned <hostname>.html files, but I'm hoping to not need that.
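
If it turns out the server does send an untrappable KILL, a hedged sketch of that fallback sweep (path and age are assumptions; the age must be longer than your longest PreUserCmd-plus-dump run, since the flag legitimately exists for the whole backup):

# e.g. /etc/cron.hourly/backuppc-flag-sweep  (hypothetical path)
find /var/www/html/backuppc-flags -name '*.html' -mmin +720 -delete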

Thanks
Russ Poyner


G.W. Haywood | 19 Jul 21:43 2014

Documentation.

Hi there,

After going around the Internet in circles for a couple of hours I've
just been looking at

backuppc.sourceforge.net/faq/BackupPC.html

to try to find the bit of documentation I'd missed that tells users
that in order to use

$Conf{XferMethod} = "smb"

to back up a Windows box it's necessary to have the netlogon service
running on the client.

I can't find it anywhere.  Where should I be looking?
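
In the meantime, since the smb XferMethod ultimately drives smbclient, a hedged sanity check is whether smbclient itself can authenticate against the client; the host, share and user below are placeholders:

smbclient -L winbox -U backupuser              # can we authenticate and list shares?
smbclient //winbox/C\$ -U backupuser -c 'ls'   # can we read the share to be backed up?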

--

73,
Ged.


Re: BackupPC acting strangely

After updating to the latest version of BackupPC, AND re-installing the
appropriate CPAN modules, AND deleting all the failed full backups,
things appear to be back to normal.

Much thanks to Les Mikesell for the help, really appreciated.
  - Richard

At 11:09 AM 7/10/2014, Les Mikesell wrote:
>On Thu, Jul 10, 2014 at 12:42 PM, Richard Stockton - Tierpoint Systems
>Administrator <Richard.Stockton <at> tierpoint.com> wrote:
> > 2nd try: 1st didn't post to list...
> >
> > Isn't there anyone out there who has had a similar problem?  I can't
> > believe I'm the only one.  Further investigation shows the problem
> > is happening for almost all of my 14 hosts.
> >
> > Bottom line: The incrementals get (and create) all the files, but the
> > full backups only create empty directories.  No errors are shown in
> > the logs, and the GUI shows all the backups as complete.  When the
> > incrementals are deleted, the fulls don't have everything, and data
> > is permanently lost.
> >
> > This is using rsync between multiple CentOS (Linux) boxes.
> >
> > I REALLY need to get this fixed.  Anybody?  Please help.
>
>It doesn't make any sense to me.  There was a recent thread on the list:
>https://www.mail-archive.com/backuppc-users <at> lists.sourceforge.net/msg26870.html
>which sounded similarly broken, but that was on ubuntu and fixed with
>a re-install.  My best guess would be that some perl module is
>corrupted (unless your xfer logs are full of 'can't link' errors).
>
>If you installed the package from EPEL, you can use 'rpm -Vv BackupPC'
>to see if any of the package files have been changed since
>installation.
>
>--
>    Les Mikesell
>      lesmikesell <at> gmail.com
>

-------------------------------------------------------------
Richard Stockton
Senior Systems Engineer - Seattle
140 4th AVE N. #360 | Seattle, WA 98109
D 206.404.9500 T 888.234.6781
W http://www.tierpoint.com E richard.stockton <at> tierpoint.com
Facilities in: Baltimore, Dallas, Oklahoma City, Tulsa, Spokane, Seattle


Toasterman | 1 Jul 21:17 2014

Ubuntu Backup server extremely slow

We use BackupPC at work on an Ubuntu server. It has gotten extremely slow during the past 2 weeks and I haven't
been able to figure out why. It always worked perfectly before. It takes about 10-15 minutes to start back
up after shutdown. Any commands issued via SSH are also extremely delayed.

There is more than enough RAM and the CPU is barely touched. Here's my output of top:

top - 14:56:10 up  1:10,  1 user,  load average: 1.04, 1.38, 2.14
Tasks: 116 total,   1 running, 115 sleeping,   0 stopped,   0 zombie
Cpu(s):  0.0%us,  0.1%sy,  0.0%ni, 99.8%id,  0.0%wa,  0.0%hi,  0.2%si,  0.0%st
Mem:   8080268k total,   658308k used,  7421960k free,   281704k buffers
Swap:  4194300k total,        0k used,  4194300k free,    53052k cached

  PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+  COMMAND
 1073 backuppc  20   0 51984  14m 2184 S    0  0.2   0:03.68 BackupPC_trashC
 1034 backuppc  20   0 62848  12m 1736 S    0  0.2   0:00.14 BackupPC
 1050 snmp      20   0 47552 4904 2560 S    0  0.1   0:00.41 snmpd
 1105 www-data  20   0  342m 4392 1256 S    0  0.1   0:00.02 apache2
 1106 www-data  20   0  342m 3816 1268 S    0  0.0   0:00.01 apache2
 1212 root      20   0 20960 3644 1624 S    0  0.0   0:00.17 bash
 1181 root      20   0 70676 3384 2624 S    0  0.0   0:00.21 sshd
 1479 root      20   0 70676 3348 2604 S    0  0.0   0:00.02 sshd
 1100 root      20   0 71528 2816 1368 S    0  0.0   0:00.06 apache2
 1103 www-data  20   0 71260 2000  572 S    0  0.0   0:00.00 apache2
  922 syslog    20   0  123m 1928 1040 S    0  0.0   0:00.03 rsyslogd
    1 root      20   0 23640 1864 1288 S    0  0.0   0:00.92 init
 1564 root      20   0 19280 1304  964 R    0  0.0   0:02.13 top
  444 root      16  -4 17408 1252  344 S    0  0.0   0:00.03 udevd
 1262 root      20   0 49316 1116  556 S    0  0.0   0:00.00 sshd
  605 root      18  -2 17280 1104  312 S    0  0.0   0:00.00 udevd
  606 root      18  -2 17280 1096  304 S    0  0.0   0:00.00 udevd
 1021 root      20   0 21132 1000  764 S    0  0.0   0:00.00 cron
  442 root      20   0 17236  896  596 S    0  0.0   0:00.03 upstart-udev-br
 1493 root      20   0 12524  888  720 S    0  0.0   0:00.00 sftp-server
  988 root      20   0  6132  672  564 S    0  0.0   0:00.00 getty
  996 root      20   0  6132  672  564 S    0  0.0   0:00.00 getty
  998 root      20   0  6132  672  564 S    0  0.0   0:00.00 getty
 1180 root      20   0  6132  672  564 S    0  0.0   0:00.00 getty
  991 root      20   0  6132  668  564 S    0  0.0   0:00.00 getty
  994 root      20   0  6132  668  564 S    0  0.0   0:00.00 getty
 1004 root      20   0 11364  640  496 S    0  0.0   0:00.23 irqbalance
 1061 root      20   0 12776  568  368 S    0  0.0   0:00.01 mdadm
 1020 daemon    20   0 18936  384  216 S    0  0.0   0:00.00 atd
    2 root      20   0     0    0    0 S    0  0.0   0:00.00 kthreadd
    3 root      20   0     0    0    0 S    0  0.0   0:00.00 ksoftirqd/0
    4 root      RT   0     0    0    0 S    0  0.0   0:00.00 migration/0
    5 root      RT   0     0    0    0 S    0  0.0   0:00.00 watchdog/0
    6 root      RT   0     0    0    0 S    0  0.0   0:01.39 migration/1
    7 root      20   0     0    0    0 S    0  0.0   0:00.08 ksoftirqd/1
    8 root      RT   0     0    0    0 S    0  0.0   0:00.00 watchdog/1
    9 root      RT   0     0    0    0 S    0  0.0   0:00.00 migration/2
   10 root      20   0     0    0    0 S    0  0.0   0:00.01 ksoftirqd/2
   11 root      RT   0     0    0    0 S    0  0.0   0:00.00 watchdog/2
   12 root      RT   0     0    0    0 S    0  0.0   0:00.05 migration/3
   13 root      20   0     0    0    0 S    0  0.0   0:00.01 ksoftirqd/3
   14 root      RT   0     0    0    0 S    0  0.0   0:00.00 watchdog/3
   15 root      20   0     0    0    0 S    0  0.0   0:00.05 events/0
   16 root      20   0     0    0    0 S    0  0.0   0:00.03 events/1
   17 root      20   0     0    0    0 S    0  0.0   0:00.06 events/2
   18 root      20   0     0    0    0 S    0  0.0   0:00.04 events/3

Here is the output of free.

             total       used       free     shared    buffers     cached
Mem:       8080268     658392    7421876          0     281712      53064
-/+ buffers/cache:     323616    7756652
Swap:      4194300          0    4194300
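
Given that the CPU is idle, memory is free and mdadm shows up in the process list, the slowdown is more likely below the OS. A hedged first pass over the RAID and disk layer (device names and package names are assumptions):

cat /proc/mdstat                                # degraded or resyncing md array?
iostat -x 5 3                                   # per-disk utilisation and await (sysstat)
dmesg | egrep -i 'ata|error|fail' | tail -n 20  # link resets or I/O errors?
smartctl -H /dev/sda                            # SMART health (smartmontools)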

Thanks in advance.


raceface2nd | 8 Jul 23:20 2014

Restore not working

Hi,

@Holger: Your hint doesn't work; I get the following errors because backuppc isn't allowed to run tar via
sudo without a password:
Running: /usr/bin/sudo /bin/tar$ -x -p --numeric-owner --same-owner -v -f - -C /
Running: /usr/share/backuppc/bin/BackupPC_tarCreate -h localhost -n 0 -s / -t -r /var/www -p /var/www/ /var/www/index.html
Xfer PIDs are now 25189,25190
sudo: no tty present and no askpass program specified
restore failed: BackupPC_tarCreate failed
Without -M, tar does not restore out of BackupPC and gives me the error
/bin/tar: Multiple archive files require `-M' option
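
For the sudo part, the "no tty present and no askpass program specified" message goes away once the backuppc user may run the restore command without a password. A hedged sketch, to be added with visudo (the command path is an example and must match exactly what $Conf{TarClientRestoreCmd} invokes):

# e.g. visudo -f /etc/sudoers.d/backuppc
backuppc ALL=(root) NOPASSWD: /bin/tar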

@Adam: Thanks for reminding me to have a second look at my commands. I had already taken out one of the "-f" earlier,
but didn't notice the minus after it. Taking out "-f -" solved the problem.

Thanks! Andy


baro | 1 Jul 16:05 2014

fileListReceive failed

Hi Friends!

I have this problem with my BackupPC and I don't know how to solve or fix it.

I have tried a lot of things to resolve it, but so far without success...

The BackupPC server can log into the CentOS clients as the backuppc user without a password to copy the files,
but the backup finishes with this error:

Got fatal error during xfer (fileListReceive failed)
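
fileListReceive failed usually means the rsync process on the client never started or died immediately. A hedged way to check is to run the same ssh+rsync invocation by hand as the backuppc user (the host name below is a placeholder, and the real values come from $Conf{RsyncClientCmd} and $Conf{RsyncClientPath}):

# reproduce what BackupPC does, as the backuppc user (placeholder host)
su -s /bin/sh backuppc -c 'ssh -q -x -l root client1 /usr/bin/rsync --version'
# also check that the rsync path on the client matches $Conf{RsyncClientPath}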

Could you help me to repair this problem, please. 

Thanks!!!

raceface2nd | 6 Jul 17:57 2014

Recovering not working

Hi all,

I disabled root login via ssh on my server running backuppc, so I changed $Conf{TarClientRestoreCmd} to

/usr/bin/sudo /tar/tarRestore -x -p --numeric-owner --same-owner -v -f - -C $shareName

my tarRestore is

#!/bin/sh
exec /bin/tar -M -x -f - "$@"

I get the following error

Running: /usr/bin/sudo /tar/tarRestore -x -p --numeric-owner --same-owner -v -f - -C /
Running: /usr/share/backuppc/bin/BackupPC_tarCreate -h localhost -n 0 -s / -t -r /var/www -p /var/www/ /var/www/index.html
Xfer PIDs are now 27885,27886
/bin/tar: Options `--f' and `--f' both want standard input
Try `/bin/tar --help' or `/bin/tar --usage' for more information.
Tar exited with error 512 () status
restore failed: BackupPC_tarCreate failed
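
The "--f and --f both want standard input" message is tar seeing -f - twice: once passed in by $Conf{TarClientRestoreCmd} and once hard-coded in the wrapper. A hedged sketch of the wrapper without its own -x -f -, keeping -M as in the original, so the single -f - passed on the command line is the only one tar gets:

#!/bin/sh
# /tar/tarRestore -- pass the BackupPC-supplied options straight through to tar
exec /bin/tar -M "$@"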

When I run the command

/usr/share/backuppc/bin/BackupPC_tarCreate -h localhost -n 0 -s / -t -r /var/WWW -p /var/www/ /var/www/index.html

as the backuppc user on the command line, it shows me the content of the file

./var/www/index.html0000674000004100000410000000026312267725145014673 0ustar 
www-datawww-data<!DOCTYPE html>
<html>
<head>
        <script type="text/javascript"> window.location.href="index.php"; </script>
        <meta http-equiv="refresh" content="0; URL=index.php">
</head>
</html>
Done: 1 files, 179 bytes, 0 dirs, 0 specials, 0 errors

but doesn't create the file.

Does anybody have any ideas?

Thanks!

Andy


yashiahru | 7 Jul 22:18 2014

error disk too full, deleted pc, how to delete cpool?

CentOS release 6.4 (Final)
a LVM is mounted on /backup

Using BackupPC for years, suddenly:
2014-07-08 01:00:00 Disk too full (96%); skipped 1 hosts

What I did:
1) Updated BackupPC to BackupPC-3.3.0-2.el6.x86_64
Problem unsolved

2) Deleted backups manually following this guide: http://blackbird.si/deleting-backup-from-backuppc-manually/
Problem unsolved

3) ./BackupPC_nightly 0 255
BackupPC_stats 254 = pool,0,0,0,0,0,0,0,0,0,0,
BackupPC_stats 255 = cpool,1295,17,25951504,25063812,0,0,0,0,0,12,406941
LOG: 
2014-07-08 01:00:50 BackupPC_nightly now running BackupPC_sendEmail
2014-07-08 01:01:08 Finished  admin  (BackupPC_nightly -m 0 127)
2014-07-08 01:01:08 Pool nightly clean removed 0 files of size 0.00GB
2014-07-08 01:01:08 Pool is 0.00GB, 0 files (0 repeated, 0 max chain, 0 max links), 1 directories
2014-07-08 01:01:08 Cpool nightly clean removed 0 files of size 0.00GB
2014-07-08 01:01:08 Cpool is 6784.73GB, 330274 files (19 repeated, 2 max chain, 290 max links), 4369 directories
Problem unsolved

4) perl -w ./BackupPC_nightly 0 255
too many errors, 3 types of them in total:
Statement unlikely to be reached at /usr/share/BackupPC/lib/BackupPC/Lib.pm line 1317.
        (Maybe you meant system() when you said exec()?)
Use of uninitialized value $fileLinkTotal in concatenation (.) or string at ./BackupPC_nightly line 208.
BackupPC_stats 90 = pool,0,0,0,0,0,0,0,0,0,0,
Use of uninitialized value in string ne at /usr/share/BackupPC/lib/BackupPC/Lib.pm line 473.
Use of uninitialized value in string ne at /usr/share/BackupPC/lib/BackupPC/Lib.pm line 551.
Problem unsolved

I found that:
pc/hostname is very small while the cpool directory is still 6.7T
the structure is: backuppc/cpool/0/0/0/theFiles

Please advise on solutions.
thanks.
