Russell R Poyner | 17 Sep 16:51 2014

rsync 3.0.9 fails on windows symlinkd

I've been using a powershell script to create shadow copies and link 
them to the filesystem in order to expose them to rsyncd. This works 
with my oldish copy of DeltaCopy rsync, but when I use the current 
cygwin-rsync package from the BackupPC web site I'm not able to follow 
the link.

If I tell it to follow the link:
  rsync -r -L rsyncuser <at> host::shadow
symlink has no referent: "C" (in shadow)
drwxr-xr-x           0 2014/09/17 11:32:30 .
rsync error: some files/attrs were not transferred (see previous errors) 
(code 23) at main.c(1538) [generator=3.0.9]

If I don't tell it to follow the link:
  rsync -r behdadrsync <at> cbe-win7amd64.che.wisc.edu::shadow
drwxr-xr-x           0 2014/09/17 11:37:30 .
lrwxrwxrwx          54 2014/09/17 11:37:30 C

On the windows side:
dir c:\shadow

09/17/2014  09:27 AM   <SYMLINKD>     C 
[\\?\GLOBALROOT\Device\HarddiskVolume....

I have "use chroot = no" in rsyncd.conf.
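For reference, a sketch of the rsyncd.conf module this setup implies (the module name and the cygwin-style path are illustrative assumptions, not my exact config):

```
[shadow]
    path = /cygdrive/c/shadow
    read only = true
    use chroot = no
```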

I've also tried a variety of link-related switches on the rsync client.

Googling "symlink has no referent: windows"

Nicola Scattolin | 17 Sep 11:18 2014

very slow backup on windows

Hi,
I have BackupPC backing up a couple of Linux machines of 30-40 GB each 
and a Windows shared folder of 1.1 TB.
Usually it takes a day and a half to back up the Windows folder at a 
speed of around 7 Mbit/s, but the speed has now dropped to 5 Mbit/s and a 
full backup takes up to 3 days.
I have already tried rebooting the server, but the speed remains the same.
Some details: the backup server is Proxmox-virtualized, with 2.5 GB of RAM 
and 2 processors (which seems enough, since it never goes above 80% while 
backing up) and an integrated 10/100 Ethernet port.
The Windows system is a Windows Server 2003, also Proxmox-virtualized, with 
1 processor.
Backups run on the local network, at night, so LAN traffic is very low 
or nil.
The host configuration in BackupPC is:
XferMethod: smb
ClientCharsetLegacy: iso-8859-1
SmbClientFullCmd: $smbClientPath \\$host\$shareName $I_option -U 
$userName -E -d 1 -c tarmode\ full -Tc$X_option - $fileList
SmbClientIncrCmd: $smbClientPath \\$host\$shareName $I_option -U 
$userName -E -d 1 -c tarmode\ full -TcN$X_option $timeStampFile - $fileList
CompressLevel: 3 (maybe I can reduce compression to save time, IF the 
problem is BackupPC server speed)
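If server-side compression turns out to be the bottleneck, lowering it is a one-line change in the global config.pl or the per-host override (a sketch; the chosen level is illustrative):

```perl
# BackupPC config.pl (sketch): trade pool size for CPU time
$Conf{CompressLevel} = 1;   # zlib level: 0 = off, 1 = fastest, 9 = smallest
```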

How can I speed up my backups? Even on the Linux machines the maximum speed 
is 11 Mbit/s, which is not much.
Thank you


Evaristo Calatravita | 15 Sep 22:27 2014

first copy on slow line

Hi,

I've been testing BackupPC during the last few months, but this is the first problem/question I have: I'm testing backups of various 2 TB filesystems with relatively small daily changes (about 10-30 MB).

The problem is that the line between these hosts and the BackupPC server is terribly slow :-(, but I do have the possibility of making the initial backup by moving the data on a hard disk or similar.

The question is: is there some function or hack to copy the first backup manually to the BackupPC server?
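One commonly suggested hack (a sketch, not an official feature; the staging hostname is hypothetical): carry the data to a machine on the server's LAN, run the first full against that copy by pointing the host at it with $Conf{ClientNameAlias}, then switch the alias back. Later backups over the slow line then only transfer the daily changes.

```perl
# pc/bigserver.pl (sketch): first full against a local staging copy
$Conf{ClientNameAlias} = 'staging-box.local';  # hypothetical LAN host holding the copied data

# After the first full completes, remove the alias so backups go to the real host:
# $Conf{ClientNameAlias} = undef;
```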


Thanks everyone
------------------------------------------------------------------------------
Want excitement?
Manually upgrade your production database.
When you want reliability, choose Perforce
Perforce version control. Predictably reliable.
http://pubads.g.doubleclick.net/gampad/clk?id=157508191&iu=/4140/ostg.clktrk
_______________________________________________
BackupPC-users mailing list
BackupPC-users <at> lists.sourceforge.net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/
strowi | 9 Sep 12:42 2014

Long-Term Backups/Rotation

Hi everyone,

I am currently trying to set up a long-term backup solution with 
BackupPC.

But it seems I am struggling with the KeepCnt values.

I would like to do daily incremental backups and weekly full backups; so 
far so good.

But beyond that, I'd like to keep 1 full backup of each month for a year, and 
1 full backup of each year for 10 years.

Is there some way to accomplish this with backuppc?
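BackupPC's exponential expiry can approximate this, though not on exact calendar boundaries: when $Conf{FullKeepCnt} is a list, entry i keeps that many fulls spaced roughly 2^i x $Conf{FullPeriod} apart. A sketch (the counts are illustrative, not a tested 10-year policy):

```perl
# config.pl (sketch): weekly fulls, thinned out exponentially with age
$Conf{FullPeriod}  = 6.97;                 # about one full per week
$Conf{FullKeepCnt} = [4, 0, 4, 0, 4, 4];   # 4 weeklies, then fulls at ~4x, ~16x, ~32x FullPeriod
```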

greetings,
Roman


Francisco Suarez | 8 Sep 16:09 2014

Best Approach for MySQL Backups

What would be a good approach for backing up MySQL databases on hosts with BackupPC?
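A common pattern (a sketch; the paths, root login, and ssh setup are assumptions): have BackupPC trigger a consistent dump on the host just before the transfer via $Conf{DumpPreUserCmd}, and include the dump directory in the backed-up shares.

```perl
# pc/dbhost.pl (sketch): dump all databases before each backup run
$Conf{DumpPreUserCmd} = '$sshPath -q -x -l root $host '
    . '"mysqldump --all-databases --single-transaction > /var/backups/mysql/all.sql"';
```

Backing up the SQL dump rather than the live InnoDB/MyISAM files avoids capturing tables mid-write.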
Francisco Suarez | 5 Sep 14:45 2014

Admin and Admin1 host? Is it maintenance?

I have this showing on the dashboard this morning. Not sure what it means?

Currently Running Jobs

Host    Type  User  Start Time  Command                    PID
admin               9/5 01:00   BackupPC_nightly -m 0 127  6156
admin1              9/5 01:00   BackupPC_nightly 128 255   6157

Francisco Suarez | 4 Sep 16:32 2014

Link Running / Current Running Jobs Problem

BackupPC is running well and backing up other hosts. For some reason there was one job showing under "Currently Running Jobs" and "Host Summary" showing 'link running'. I requested stopping the job, but it still showed up under both sections some time after.

I resorted to killing the process PID manually in a terminal.

Something else I noticed was that the backup server was very slow prior to killing this process.

The logs showed it did complete the full backup earlier, but maybe it didn't close the process?

Logs:


What could cause this, and how can I prevent it?
hooper82 | 3 Sep 01:36 2014

Two active backups for a single host

Hi All,

I've got a host with 2 active backups after restarting the BackupPC server. There seems to be no way to clear
them, or start a new backup. I can request one to start, but nothing happens.

Any ideas?

+----------------------------------------------------------------------
|This was sent by hooper82 <at> gmail.com via Backup Central.
|Forward SPAM to abuse <at> backupcentral.com.
+----------------------------------------------------------------------


vano | 2 Sep 16:03 2014

504 Gateway Time-out on Backuppc web interface

I have the same error on a newly installed BackupPC 3.3.0 under CentOS 6.5, when trying to view the Error Log of a host.
BackupPC is running.
The link looks like: http://ip/backuppc?action=view&type=XferErr&num=0&host=host-with-errors.

So I got: The gateway did not receive a timely response from the upstream server or application.
Apache/2.2.15 (CentOS) Server at ...
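One thing worth trying (a sketch; whether it applies depends on what sits in front of the CGI): rendering a large XferErr log can exceed the web server's response timeout, so raising it may let the page finish.

```apache
# httpd.conf (sketch): allow slow CGI responses more time
Timeout 600

# If the 504 comes from a reverse proxy in front of Apache:
# ProxyTimeout 600
```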



Xavier Crespin | 30 Aug 16:30 2014

Prevent backuppc redownloading files already in the pool

Hello,

I've been using BackupPC for some time to back up Windows clients using 
the rsync/cygwin package available in BackupPC's SourceForge repository.
The BackupPC server is not on my local network, and machines are being 
backed up over the internet, which takes some time at first (several 
weeks), but once the pool is populated it runs like a charm.

The problem I face is that if I edit/add a module, or move a directory 
on the Windows client, even if the content of the directory is 
identical, BackupPC will re-download all the content before doing its 
MD5 hash, which takes several weeks all over again, even if the files are 
already present in the pool.

What actually happens: BackupPC downloads the file > hash > file 
already in the pool > hardlink created

Is there a way to ask rsync to hash the files on the client side to 
prevent downloading them again?

What I wish would happen: BackupPC asks rsync to hash the file > hash 
on client > compare client hash with pool hash > file already in the 
pool > no download > hardlink created
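Just to illustrate the wished-for workflow (this is not a BackupPC feature, only a sketch of the idea; paths are made up): the client would publish one digest per file, and a pool-aware server would consult its pool before requesting any bytes.

```shell
# Sketch: client-side digests that a pool-aware server could check first.
mkdir -p /tmp/poolcheck
printf 'hello\n' > /tmp/poolcheck/a.txt

# One "<md5>  <path>" line per file: cheap to send compared to the data itself.
md5sum /tmp/poolcheck/a.txt
```

Whether this maps onto the rsync protocol BackupPC 3.x speaks is exactly the open question above.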

Thank you


Tom Fallon | 30 Aug 14:18 2014

504 Gateway Time-out on Backuppc web interface

The web interface on one of our BackupPC servers is not responding. It gives the error "The gateway did not receive a timely response from the upstream server or application."

Apache appears to be OK, as plugging the server IP into a browser brings up the "It works!" default page. I've tried restarting Apache to no avail.

Permissions look OK and no changes have been made to this box recently. The Apache config looks the same as on the other 2 backup machines.

Is there anything else I can check to resolve this?

Regards, Kiweegie

