Matthew Woehlke | 1 Dec 01:10 2006

Re: gnulib broken on systems lacking fchdir

Eric Blake wrote:
> Matthew Woehlke writes:
>> That sounds like a good idea, but... does that mean I have to *write* an 
>> entire unistd.h *and* make it work everywhere, or is there a way to 
>> 'drop in' one that pulls the system unistd.h, plus extras?
> 
> For an example of how to provide a replacement <unistd.h>, see how gnulib 
> already provides a replacement <sys/stat.h> which pulls in the system version, 
> then touches it up as needed.  You would really be writing lib/unistd_.h, which 
> first includes @ABSOLUTE_UNISTD_H@, then, if HAVE_FCHDIR is not defined, 
> replaces fchdir with rpl_fchdir, etc.

Thanks for the info. Actually, I found unistd_safer.h too, which might 
help? :-)
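
A rough sketch of the kind of wrapper header described above; the
@ABSOLUTE_UNISTD_H@ substitution and the rpl_fchdir name come straight
from the description, but the guard macro and the exact prototype are
just assumptions for illustration:

/* lib/unistd_.h -- wrapper around the system <unistd.h> (sketch).
   configure substitutes @ABSOLUTE_UNISTD_H@ with the quoted absolute
   name of the system header.  */
#ifndef _GL_UNISTD_H
#define _GL_UNISTD_H

/* Pull in the real <unistd.h> first.  */
#include @ABSOLUTE_UNISTD_H@

/* On systems lacking fchdir(), divert calls to the gnulib replacement.  */
#ifndef HAVE_FCHDIR
# undef fchdir
# define fchdir rpl_fchdir
extern int rpl_fchdir (int fd);
#endif

#endif /* _GL_UNISTD_H */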

>> As mentioned, so far this has nothing to do with coreutils except that I 
>> know it *will* affect coreutils. Right now I'm worrying about gzip.  
>> But... I'm also planning on building gettext (albeit not a version that 
>> has the newer *at stuff AFAIK).
> 
> To my knowledge, gettext does not depend on fchdir (as evidenced by the fact 
> that it builds on mingw).  But coreutils, findutils, tar, and gzip all use 
> gnulib directory traversal.

Ok, but I may try to avoid GPL dependencies anyway.

>> Back to the technical standpoint: come to think of it, don't most systems
>> limit the number of fds to a reasonable number like 1024?
> 
> No, the GNU spirit is to avoid arbitrary limits, and 1024 is arbitrary

Simon Josefsson | 1 Dec 10:50 2006

Re: gif patents on GNU web pages

karl@freefriends.org (Karl Berry) writes:

> In the past, it was common practice to put the "GNU head" image on GNU
> software web pages with the text "no gifs due to patent problems",
> linking to http://gnu.org/philosophy/gif.html.
>
> Well, the gif patents have expired (we updated the gif.html page), so
> that statement should be removed or changed now.  Personally I just
> removed it from the pages for my packages, it was the easiest thing to
> do.

FYI, maintain.texi in gnulib still says:

  Web pages for GNU packages should not include GIF images, since the GNU
  project avoids GIFs due to patent problems.  @xref{Ethical and
  Philosophical Consideration}.

Rather than removing the paragraph, I'd prefer it if it explained that
the patents have expired and that GIFs are now OK.  Here is a strawman:

  Historically, web pages for GNU packages did not include GIF images
  because of patent problems.  @xref{Ethical and Philosophical
  Consideration}.  However, the patents have expired, and using GIF
  images is now acceptable.

/Simon

Jim Meyering | 1 Dec 11:16 2006

Re: gnulib broken on systems lacking fchdir

Eric Blake <ebb9@byu.net> wrote:
> To my knowledge, gettext does not depend on fchdir (as evidenced by the fact
> that it builds on mingw).  But coreutils, findutils, tar, and gzip all use
> gnulib directory traversal.

When I think of gnulib directory traversal, I think of its lib/fts.c.
I knew about findutils using fts, but didn't know about the other two.

So I took a look.
I see that gzip doesn't use anything like that.  It just rolls its own
and operates on full relative names.  But that's the way it should be,
since no one cares if gzip can't recursively compress or decompress a
hierarchy that's really deep or that contains very long names.  Here,
simple is better, since gzip has to be so portable.

Tar rolls its own, too, but uses chdir, so doesn't hit the PATH_MAX
limitation.
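
To make the difference concrete, here is a rough sketch of the
chdir-based approach (not tar's actual code; the function name is made
up and error handling is minimal).  Each level passes only a single
name component to the system calls, so deep hierarchies never run into
PATH_MAX:

#include <dirent.h>
#include <string.h>
#include <sys/stat.h>
#include <unistd.h>

/* Visit everything under NAME (one component, relative to the current
   directory), descending with chdir so that only short names are ever
   passed to the kernel.  */
static void
traverse (const char *name)
{
  DIR *dir;
  struct dirent *e;
  struct stat st;

  if (chdir (name) != 0)
    return;
  dir = opendir (".");
  if (dir)
    {
      while ((e = readdir (dir)) != NULL)
        {
          if (strcmp (e->d_name, ".") == 0 || strcmp (e->d_name, "..") == 0)
            continue;
          /* ... process e->d_name here ... */
          if (lstat (e->d_name, &st) == 0 && S_ISDIR (st.st_mode))
            traverse (e->d_name);   /* recurse one level deeper */
        }
      closedir (dir);
    }
  chdir ("..");   /* climb back to where this level started */
}

(Returning via ".." is the weak spot -- it can land somewhere else if
the directory is renamed concurrently -- which is exactly where
fchdir-based code such as gnulib's save_cwd is more robust.)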

Jim Meyering | 1 Dec 11:22 2006

Re: gnulib broken on systems lacking fchdir

Eric Blake <ebb9@byu.net> wrote:
> Jim Meyering <jim@meyering.net> writes:
>
>> Right.
>> I did a survey, some time ago, of reasonable porting targets, and all
>> had fchdir.  Eventually I should remove the test for fchdir, too.
>
> FYI, mingw is another relatively-active porting target that lacks fchdir,
> which would benefit from a good fchdir replacement.  As to whether you
> consider mingw a reasonable porting target, that's a different question
> (native Windows tends to be far enough from POSIX to make it a very
> difficult porting task).

Thanks.
With so many systems lacking fchdir (BeOS, Tandem NSK/OSS, and now mingw),
I suppose there is critical mass for adding fchdir support.

I'm glad you're lending a hand.

Bruno Haible | 1 Dec 14:38 2006

Re: gnulib broken on systems lacking fchdir

Eric Blake wrote:
> > What about the FD table; should it be a hash table, a binary tree, an 
> > ordered linked list, or something else entirely?
> 
> Gnulib already provides the gl_list module.  The idea there is that you start 
> by coding with an array list (probably a good choice anyways, since the 
> underlying kernel also maintains an array of open fds, and since it seems to me 
> that you are always going from fd to name, never a reverse lookup), work out 
> the bugs, then decide if some other representation, such as an AVL tree list, 
> would be more efficient for the typical usage pattern of the list.  Once you 
> use the gl_list API, it is only a one-line code change and reinvocation of 
> gnulib-tool to pick up the new underlying list implementation.

This is all true, but I think using gl_list is overkill here: All known
libc or kernel implementations have a per-process array of file descriptors,
where the information of each file descriptor (the file it refers to,
whether it's inheritable or not, etc.) is stored. So, you can assume that
an fd is a _small_ integer >= 0. Some systems make this assumption even more
explicit by providing a getdtablesize() function.

So, the representation I would prefer would be a resizable, malloc()ed array.
No need to bother calling getdtablesize().
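
A rough sketch of such a resizable, fd-indexed array, here used to
back an fchdir replacement by remembering the name each directory
descriptor was opened with.  All of the names below (fd_names,
remember_fd, rpl_fchdir) are made up for illustration, not gnulib's
actual code:

#include <errno.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

/* Map each open directory fd to the name it was opened with.
   Indexed directly by fd, since fds are small non-negative integers.  */
static char **fd_names;
static size_t fd_names_count;

/* Grow the table so that FD is a valid index.
   (Real code would check for allocation failure.)  */
static void
ensure_fd_slot (int fd)
{
  if ((size_t) fd >= fd_names_count)
    {
      size_t new_count = fd_names_count ? 2 * fd_names_count : 32;
      while ((size_t) fd >= new_count)
        new_count *= 2;
      fd_names = realloc (fd_names, new_count * sizeof *fd_names);
      memset (fd_names + fd_names_count, 0,
              (new_count - fd_names_count) * sizeof *fd_names);
      fd_names_count = new_count;
    }
}

/* Record NAME for FD; to be called from wrappers around open() etc.  */
static void
remember_fd (int fd, const char *name)
{
  if (fd >= 0)
    {
      ensure_fd_slot (fd);
      free (fd_names[fd]);
      fd_names[fd] = strdup (name);
    }
}

/* Emulate fchdir() where it is missing: chdir back to the saved name.  */
int
rpl_fchdir (int fd)
{
  if (fd >= 0 && (size_t) fd < fd_names_count && fd_names[fd])
    return chdir (fd_names[fd]);
  errno = EBADF;
  return -1;
}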

Bruno

James Youngman | 1 Dec 18:59 2006

Re: switching gnulib from CVS to a dVCS

On 11/27/06, Eric Blake <ebb9@byu.net> wrote:
> Based on this thread, I took a leap and ported git/cogito to cygwin, so
> that they are now available from a standard cygwin installation, and am
> considering using git on more projects myself.  I think moving gnulib to
> git would be reasonable; I particularly liked the concept of being able to
> diff history without a network connection, and of branches being O(1)
> instead of O(n).

I would vote for this too, despite the fact that I've never used git
before.  The rationale is that I would be able to tag which version of
Gnulib got released in which version of findutils, which at the moment
I can only do by examining the files in the findutils-x.yy.zz.tar.gz
release file.

(That is, there should be no need to have commit access to the root
Gnulib repository in order to simply track which version of the
software I used.)

James.

Simon Josefsson | 1 Dec 22:52 2006

Autobuild of gnulib

I have started daily builds of gnulib modules on a machine.  The
results so far (the build takes _hours_ to finish) are available from:

http://autobuild.josefsson.org/gnulib/gnulib.html

I know the output isn't easy to parse yet -- I will modify autobuild
to output a summary with the latest build for each project in the same
output file.  Autobuild simply hasn't been used for this many projects
in one output file before.  Note that until a module has a self test,
the "result" column will only say "Built", not "Success".  For
"Success" to be printed, the self test has to run OK.  See for example:

http://autobuild.josefsson.org/gnulib/gnulib.html#arcfour

For an example where there is a self-test but it fails (which results
in a status of 'Almost'), see:

http://autobuild.josefsson.org/gnulib/gnulib.html#argp

The script that you run on a build robot is simple; see below.
Anyone can run a similar script that submits build logs to
gnulib@autobuild.josefsson.org, and the results will end up at the
same URL as above.

I'm still experimenting with the setup, but I intend to move daily
builds for several projects to the same system shortly, and provide
the same free service for other projects.  If someone can help by
running the script on a non-i386 or non-Linux machine and submitting
build logs, that would make things more interesting.

Simon Josefsson | 1 Dec 22:56 2006

Re: Autobuild of gnulib

Btw, here are links to modules that failed to build on my machine:

http://autobuild.josefsson.org/gnulib/gnulib.html#fts-lgpl
http://autobuild.josefsson.org/gnulib/gnulib.html#savewd

/Simon

Jim Meyering | 1 Dec 23:27 2006

Re: Autobuild of gnulib

Simon Josefsson <simon@josefsson.org> wrote:

> Btw, here are links to modules that failed to build on my machine:
>
> http://autobuild.josefsson.org/gnulib/gnulib.html#fts-lgpl
> http://autobuild.josefsson.org/gnulib/gnulib.html#savewd

Nice.
Here's a proposed (but untested) patch.  Gotta run.

	* modules/savewd (Depends-on): Add fcntl_h to avoid self-test
	build failure due to missing definition of HAVE_WORKING_O_NOFOLLOW.
	Reported by Simon Josefsson.

Index: modules/savewd
===================================================================
RCS file: /sources/gnulib/gnulib/modules/savewd,v
retrieving revision 1.2
diff -u -p -r1.2 savewd
--- modules/savewd	26 Sep 2006 23:33:11 -0000	1.2
+++ modules/savewd	1 Dec 2006 22:26:13 -0000
@@ -10,6 +10,7 @@ Depends-on:
 dirname
 exit
 fcntl-safer
+fcntl_h
 raise
 stdbool
 xalloc


Karl Berry | 2 Dec 01:08 2006

Re: gnulib broken on systems lacking fchdir

    since no one cares if gzip can't recursively compress or decompress a
    hierarchy that's really deep or that contains very long names.  

Really?

Well, I guess the deepest things gzip would operate on are distributions
of some sort.  That probably doesn't compare to the monstrous stuff you
make coreutils handle.

At least, I personally have never had gzip -r fail :).  (But then, I've
never had cp -r fail [in the "old" implementation], either.)

    Here, simple is better, since gzip has to be so portable.

More portable than coreutils?

Well, whatever.  Even if Paul wants to address this, I'm sure it's
not something that needs doing for the first release.

Cheers,
Karl

