Mouhamadou Khadim DIOUF | 25 Nov 19:02 2015

Problem with the CGI Interface



Hello,
I have installed the new version of BackupPC, 3.3.1, on Debian Jessie.
I cannot display the graphical user interface (the CGI interface).
Which command do I use to assign the permissions shown below?
 
ls -l __CGIDIR__/BackupPC_Admin
 -swxr-x---    1 __BACKUPPCUSER__   web      82406 Jun 17 22:58 __CGIDIR__/BackupPC_Admin
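
For reference, a sketch of the commands that would normally produce those permissions (assuming, as in the listing above, that __BACKUPPCUSER__ is the BackupPC user and 'web' is the web server's group):

# assumption: __BACKUPPCUSER__ and 'web' as shown in the ls output above
chown __BACKUPPCUSER__:web __CGIDIR__/BackupPC_Admin
chmod 4750 __CGIDIR__/BackupPC_Admin    # setuid + rwxr-x---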

Thanks

--
Mouhamadou.Kh.DIOUF 

   Computer Engineer
   Development Consultant, DSI/UT Services
   Tel: 77-328-34-86
Jaime Fenton | 24 Nov 01:47 2015

Support Question

Hi There,

 

My company is using BackupPC, and we've run into a problem: the system is sending email alerts from an email address that is not actually entered anywhere in the system.

 

Also, email alerts are going out every day, even though the notification frequency is set to every 7 days.

 

Has anyone encountered issues like this before, and if so, how did you resolve them? We are using the service in two different offices; one works fine, but the one that does not has almost exactly the same setup (obviously with the email address names changed).

 

We are using version 3.3.1.

 

Happy to provide any other information that would be useful to help us resolve this issue.
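
For reference, the stock 3.x settings that govern the sender address and the notification frequency are $Conf{EMailFromUserName}, $Conf{EMailAdminUserName} and $Conf{EMailNotifyMinDays}; per-host .pl files override config.pl, so a stray per-host override could explain why only one office misbehaves. A sketch of a quick comparison, assuming a Debian-style config location:

# assumption: Debian-style config path; adjust to your $ConfDir
grep -E 'EMailFromUserName|EMailAdminUserName|EMailNotifyMinDays' \
    /etc/backuppc/config.pl /etc/backuppc/*.pl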

 

Thanks,

Jaime

--
Jaime Fenton
Support Engineer

www.animallogic.com


donkeydong69 | 21 Nov 04:48 2015

BackupPC restore freezing

I'm using Ubuntu across 4 different computers on my network, one of which runs BackupPC. Restoring anything
over 3-4 files seems to cause BackupPC to freeze. Right now I'm trying to restore an entire file system to a
computer, but I've noticed the same issue when restoring 4 items from a download directory to a
separate computer, no more than 30 MB total.

Watching sudo iftop and sudo htop, I see my server upload at 50 MB/s to the designated computer for
several minutes and then drop the connection entirely. In htop I see that, at the moment of the drop, the
process stops using the CPU and its run time stops increasing, even after hours. So on the web
GUI it looks like the process is running, while in htop it looks like the process has frozen.

The systems' hardware should not be so limited as to cause such freezes. How can I diagnose this?
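
A sketch of the sort of checks that might narrow it down; the Debian-style paths, <host> and <n> are placeholders/assumptions:

# look for NIC resets or link flaps around the time the connection drops
sudo dmesg | tail -n 50
# see the last file the restore logged before stalling (log is compressed)
sudo -u backuppc /usr/share/backuppc/bin/BackupPC_zcat \
    /var/lib/backuppc/pc/<host>/RestoreLOG.<n>.z | tail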



Kenneth Porter | 21 Nov 01:14 2015

cygwin rsyncd find_fast_cwd warning

I'm using the cygwin-rsyncd package from Sourceforge to back up my Windows 
boxes.

I'm not sure whether this affects the Windows rsyncd service, but I'm seeing the find_fast_cwd
warning when using rsync from the command line after updating from Win7 to Win10.

Win8.1 introduced a breaking change in a Windows DLL that cygwin uses to
manipulate the current directory. Details at the link below; scroll down to
Corinna Vinschen's first post.

<http://cygwin.1069669.n5.nabble.com/Cygwin-Cygwin-Compatibility-Issue-report-ID-386608-td40432.html>


Yoann David | 19 Nov 12:09 2015

Large directory moved on a backed-up server: how can we avoid a complete re-synchronization?

Hello,

most of it is in the subject line.

On a target server, we moved a fairly large directory (90 GB). Since the path
changed from BackupPC's point of view, it tries to re-sync everything (the
whole 90 GB), not just the difference.
The bandwidth between the backed-up/target server and the BackupPC server is
low (80 KB/s), so transferring all the data would take more than 13 days!

What can we do?
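
One workaround sometimes suggested, strictly as an untested sketch: mirror the move inside the most recent backup tree on the BackupPC server, so that the next rsync finds the old data at the path it now expects. BackupPC stores files with an 'f' name-mangling prefix on every path component; all paths below are placeholders, so try this on a copy first:

# sketch, untested: replicate the rename inside the latest backup's tree;
# each stored path component carries BackupPC's 'f' mangling prefix
cd /var/lib/backuppc/pc/<host>/<last-backup-num>/f<share>
mv folddir/fbigdir fnewdir/fbigdir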

Yoann DAVID


Johan Ehnberg | 19 Nov 07:50 2015

BackupPC pre-loading / seeding and migrating / moving script

Hi all,

A recurring theme I've noticed is the pre-loading and migration
challenge. We just finished work on a corrupt pool on one of our offsite
instances, which BackupPC was unable to fix properly despite a checksum
verify probability of 1.0. We decided to move to a fresh pool and keep
the old one in cold storage until retention runs out. In the process, I
created a script that has proved useful for other situations as well.

This script can save (restore to tar), seed (pre-load) and move
(migrate, switching between old and new BackupPC partitions on the same
server) BackupPC backups. It works at least with the tar and rsync transfer
modes and with BackupPC 3.3.0. For migration, it is considerably faster
and more portable than migrating the pool, but previous backups will not
be included. Migrating to another server is done by running save on the
old server and seed on the new one.

Some use cases I can think of for this script are:
- Preloading secondary servers at a remote location
- Moving to another partition when hardlink preservation takes too long
- Taking the usable parts of a corrupt pool when starting a fresh one

The details are at:
http://johan.ehnberg.net/backuppc-pre-loading-seeding-and-migrating-moving-script/

The latest version of the script is found here:
https://owncloud.molnix.com/index.php/s/t879KKwnHg7nHDu

Best regards,
Johan

-- 
Johan Ehnberg
johan <at> ehnberg.net
+358503209688

Molnix Oy
molnix.com


robp2175 | 12 Nov 13:40 2015

Rsync hanging when doing a restore

I am trying to do a large restore onto a failed system (about 40 GB), but the restore does not get very far
before it just hangs on a file. It does not seem to be the same file every time, either. Running lsof | grep
rsync shows me the current file it is hung on. I have googled this problem without finding any good answer;
hoping others have hit this problem and found a solution here. BackupPC is fully up to date.
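
If it hangs again, attaching to the pid that lsof points at may at least show whether the process is blocked on the network or on disk; a sketch, where <pid> is a placeholder:

# which syscall is the hung process blocked in?
sudo strace -f -p <pid>
# kernel-side view of the same, where available
sudo cat /proc/<pid>/stack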



George Avrunin | 11 Nov 18:41 2015

BackupPC 4 -- rsync_bpc using a very large amount of memory

I have a new machine with 32 GB of memory running Fedora (now upgraded
to Fedora 23 from 22), and I decided to try BackupPC 4.0.0.alpha3 on it. It
has been working fine for a couple of weeks, as far as I can tell, but
today I had to back up a laptop (also running Fedora 23) while I was
working on the backup server, and I noticed tremendous lag. top says that
rsync_bpc is using 95% of memory, and free says:

              total        used        free      shared  buff/cache   available
Mem:       32854592    32237588      172360      120776      444644      339248
Swap:      67108860    22412576    44696284

There are actually two rsync_bpc processes running: one is using all that
memory, and the other hardly any. Firefox is next in top's memory listing, using 0.6% of memory.

Is this reasonable for rsync_bpc?  If so, is there a way to limit the
memory used by rsync_bpc?  And if it's not reasonable, what should I be
checking?  I've used BackupPC for a long time, but this is my first
experimentation with 4.0.
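
Not BackupPC-specific, but as a generic stopgap: if BackupPC runs as a systemd service, a cgroup memory cap would at least stop one rsync_bpc from pushing everything else into swap. A sketch, assuming the service is named backuppc:

# assumption: service name 'backuppc'; the cap covers the whole service,
# including child rsync_bpc processes
systemctl set-property backuppc.service MemoryLimit=8G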

Thanks,

  George
Russell Poyner | 10 Nov 17:21 2015

Is it possible to split a large pool?

I have a single BackupPC 3.3 server backing up 110 machines. We are in the
process of getting a second server, and I'm wondering whether it's possible to
split the pool, so we can move machines to the new server without losing their
backup history.
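
One conceivable approach, untested and subject to the usual hardlink memory cost: transfer the pool and the chosen hosts' pc/ trees in a single rsync invocation with -H, so the hardlinks between them survive; host names and paths below are placeholders:

# sketch, untested: one rsync run so -H can match hardlinks between the
# (c)pool and the selected hosts; memory use grows with the link count
cd /var/lib/backuppc
rsync -aHR cpool pc/host1 pc/host2 newserver:/var/lib/backuppc/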

Thanks
Russ Poyner


absolutely_free@libero.it | 10 Nov 12:03 2015

Re: Command line restore

Hi Holger,

thank you very much for your detailed answer.
I launched the restore through the web interface (and it completed without
issue).
During the restore, I saw this file:

$TopDir/pc/myserver/RestoreInfo.2

content is:

%RestoreReq = (
  'fileList' => [
    '/account/mail/account.it/username'
  ],
  'shareDest' => 'home',
  'pathHdrDest' => '/account/mail/account.it/username/',
  'num' => '196',
  'reqTime' => 1447152479,
  'shareSrc' => 'home',
  'pathHdrSrc' => '/account/mail/account.it/username',
  'hostDest' => 'myserver',
  'hostSrc' => 'myserver',
  'user' => 'backuppc'
);

So... which command should I use next time?

Thank you again!

>----Original message----
>From: wbppc <at> parplies.de
>Date: 09/11/2015 18.07
>To: "absolutely_free <at> libero.it"<absolutely_free <at> libero.it>, "General list for user discussion, questions and support"<backuppc-users <at> lists.sourceforge.net>
>Subject: Re: [BackupPC-users] Command line restore
>
>Hi,
>
>absolutely_free <at> libero.it wrote on 2015-11-09 16:24:29 +0100 [[BackupPC-users] Command line restore]:
>> Hi, I am using BackupPC 3.2.1-4 (official Debian package). Is there a way to
>> launch a restore process through command line?
>
>yes.
>
>> I mean, I don't want to create a tar / zip archive. I need to restore files
>> to original server. Thank you very much
>
>Considering the web server doesn't do the restore itself but rather instructs
>the BackupPC server to do so, there must be a way.
>
>Regards,
>Holger
>
>P.S.: In case you were wondering *how* to launch a restore via command line,
>      it's a bit complicated. The command as such is something like
>
>      BackupPC_serverMesg restore <ip> <host> <user> <request file>
>
>      where <ip> should probably be the IP address of <host> (but will
>      apparently be looked up(*) if it isn't - presuming some piece of code
>      doesn't complain first), <user> is only for logging purposes, if I
>      remember correctly, and <request file> might be somewhat difficult
>      to construct. Technically speaking, it isn't, it's just a Data::Dumper
>      dump of a Perl hash containing the relevant information. So, what is
>      the relevant information? Let's do it the easy way (for both you and
>      me): initiate a restore from the web interface (and make sure to either
>      direct it somewhere it won't do any harm or make (absolutely) sure you
>      actually can't restore; better yet, do both), and after it has completed
>      or failed, look in $TopDir/pc/<host> for a file named RestoreInfo.n (and
>      unless that turns out to be RestoreInfo.0, you can skip that part and
>      just look at one of the preexisting files straightaway). Figure out what
>      the individual hash entries mean and fill the values to match your needs.
>      You can probably get away with setting 'num' => -1 to always refer to
>      the latest backup and leaving 'reqTime' as it is (even though that will,
>      strictly speaking, be incorrect), but I'd test that, just to be sure.
>      Hint: for a full restore, I get "fileList => [ '/' ]" (among other hash
>      entries).
>
>      As always, you need to run BackupPC_serverMesg as the backuppc user.
>
>      Hope that helps.
>
>      (*) As I read the code, <host> will be looked up if <ip> doesn't look
>	  like an IP. You might expect <ip> to be looked up, but that
>	  apparently is not the case.
>
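
Putting Holger's pieces together with the hash above, the invocation might look something like the following; the IP is a placeholder, the request file name is invented, and whether it is passed as a bare name under $TopDir/pc/<host> or as a full path is worth checking against the CGI source first:

# sketch, untested: run as the backuppc user; restoreReq.cli would contain
# a %RestoreReq hash like the one shown above; 192.168.1.10 is a placeholder
sudo -u backuppc /usr/share/backuppc/bin/BackupPC_serverMesg \
    restore 192.168.1.10 myserver backuppc restoreReq.cli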


Christian Völker | 9 Nov 20:02 2015

Best Way to Move a Large Pool to a Different FS?

Hi all,

I want to transfer my pool from ext4 to xfs. The pool is around 1.3 TB,
with approx. 15 hosts backing up.

The obvious rsync -avH is said to be too memory-consuming
because of the hardlinks.

So I started the way listed here:
http://roland.entierement.nu/blog/2013/12/02/rsyncing-a-backuppc-storage-pool-efficiently.html

The rsync of the cpool is done (it took quite a while!).

I started the "store-hardlinks.pl" and up to now top tells me:

top - 20:01:12 up  3:32,  2 users,  load average: 1.36, 1.08, 1.03
Tasks: 107 total,   1 running, 106 sleeping,   0 stopped,   0 zombie
Cpu(s):  0.3%us,  0.8%sy,  0.0%ni,  0.0%id, 98.7%wa,  0.0%hi,  0.2%si,  0.0%st
Mem:   8061568k total,  7778968k used,   282600k free,   442732k buffers
Swap:  4063228k total,     7672k used,  4055556k free,    12848k cached

  PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+  COMMAND
 1644 root      20   0 5760m 5.5g 1008 D  1.3 71.5   7:09.30 store-hardlinks

So it is already consuming nearly 6 GB of memory!

Does anyone have a better idea for how to do the transfer in a reasonable
amount of time, with reasonable memory consumption?
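
One alternative worth a look: BackupPC 3.x ships BackupPC_tarPCCopy, which emits a tar stream that recreates the pc/ tree's hardlinks against an already-copied pool, so the pool itself can be moved with a plain rsync (no -H needed within the pool) and only the link structure is rebuilt. A sketch; the paths are placeholders, and the exact invocation should be checked against the documentation:

# sketch, untested: copy the pool first, then rebuild pc/ hardlinks
rsync -a /old/topdir/cpool/ /new/topdir/cpool/
mkdir -p /new/topdir/pc && cd /new/topdir/pc
sudo -u backuppc /usr/share/backuppc/bin/BackupPC_tarPCCopy \
    /old/topdir/pc | tar xvPf -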

Greetings

Christian


