Erik Moeller | 1 May 17:30 2006

RfC: A Free Content and Expression Definition

Dear Wikimedia community,

I am posting this to multiple lists, as I believe it is relevant to
each of them (more on that below).

For years, we have been using the term "free content" to refer to our
projects. However, what exactly is free content? Does it include the
right to make commercial use? Does it allow derivative works? A year
ago, Anthere, one of our elected trustees, noted that the English
Wikipedia article [[free content]] is confused and contains no clear
definition. This is no surprise, as the term has evolved purely
through its usage. One year on, the article doesn't look much better
and still doesn't contain a single reference.

It is clear that we need a definition. With the help of feedback from
the likes of Richard Stallman and Lawrence Lessig, and an increasing
number of collaborators, I have drafted up a first version of such a
definition, called the "Free Content and Expression Definition":

        http://freecontentdefinition.org/Definition

You can also use the URLs <http://freedomdefinition.org/> or
<http://freedomdefined.org/>. Please use the URL

        http://freecontentdefinition.org/static/ (with trailing slash)

when submitting this link to high traffic sites.

Licenses covered by this definition must grant the following freedoms:

(Continue reading)

Matsobane Moloto | 9 May 16:55 2006

Downloading content from wiki projects

Greetings,
I installed MediaWiki, which I think is working fine on the web server,
and I have been trying to download content from very interesting wiki
projects such as Wikibooks, Wikiquote, Wikimedia Commons, Wiktionary and
others, without success.
Could you help me find the easiest way to download or obtain the content?
I am working for a government project in South Africa that specialises in
open source, and our target is schools from primary to secondary level,
so I need content that we can install in their labs, because they do not
have internet access.
I hope I can get some help.
Regards,
Matsobane Moloto
icommunity(SA)
Tel: +27 15 4834878
Cell: 0731634000
Wildrick Steele | 9 May 18:19 2006

Re: Downloading content from wiki projects

On 09/05/06, Matsobane Moloto <molotomf@...> wrote:
>
> Greetings,
> I installed MediaWiki, which I think is working fine on the web server,
> and I have been trying to download content from very interesting wiki
> projects such as Wikibooks, Wikiquote, Wikimedia Commons, Wiktionary and
> others, without success.
> Could you help me find the easiest way to download or obtain the content?
> I am working for a government project in South Africa that specialises in
> open source, and our target is schools from primary to secondary level,
> so I need content that we can install in their labs, because they do not
> have internet access.
> I hope I can get some help.
> Regards,
> Matsobane Moloto
> icommunity(SA)
> Tel: +27 15 4834878
> Cell: 0731634000

All downloads are available at
http://download.wikimedia.org/

Cheers,
Wildrick
http://en.wiktionary.org/wiki/User:Vildricianus
(Continue reading)
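
As a rough illustration of the offline workflow described above (an
editorial sketch, not an official procedure): the usual route is to fetch
one of the XML dumps listed on that site and load it into the local
MediaWiki installation with its maintenance/importDump.php script. A
minimal Python sketch follows; the project and file names in it are
assumptions and should be checked against the actual download index.

    # Minimal sketch, assuming a dump layout like <project>/latest/ on the
    # download server; verify the real directory and file names at
    # http://download.wikimedia.org/ before relying on this.
    import urllib.request

    BASE = "http://download.wikimedia.org/enwikibooks/latest/"   # assumed path
    DUMP = "enwikibooks-latest-pages-articles.xml.bz2"           # assumed file name

    urllib.request.urlretrieve(BASE + DUMP, DUMP)
    print("saved", DUMP)
    # The saved file can then be imported into a local wiki, typically with
    # MediaWiki's maintenance/importDump.php script.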

Matsobane | 9 May 16:18 2006

Downloading content from wiki projects

Greetings,
I installed MediaWiki, which I think is working fine on the web server,
and I have been trying to download content from very interesting wiki
projects such as Wikibooks, Wikiquote, Wikimedia Commons, Wiktionary and
others, without success.
Could you help me find the easiest way to download or obtain the content?
I am working for a government project in South Africa that specialises in
open source, and our target is schools from primary to secondary level,
so I need content that we can install in their labs, because they do not
have internet access.
I hope I can get some help.
Regards,
Matsobane Moloto
icommunity(SA)
Tel: +27 15 4834878
Cell: 0731634000
Wytukaze | 9 May 21:49 2006

Re: Downloading content from wiki projects

Whoops. Sorry guys, I should really check the list before clearing out the
pending items.

(Matsobane, see the reply to your other email for help.)

--
Wytukaze - Anyɛɛ mɔbi.
GerardM | 10 May 15:53 2006

The need for identifying languages properly for lexicological use

Hoi,
When there is a vote for "yet another" Wikipedia, it is necessary to
have a code that identifies the new database. As Wikipedias are
written in a language, we use a code that identifies that language.
Typically people say we use the ISO 639 codes for that. This would
imply that a code used has a relation to the language that is being
used, and it should also imply that a Wikipedia is indeed in the
particular language recognised by that code.

The way the Wikipedias are named is a matter of history, and the
continued abuse of codes makes for often heated political discussions
about languages; it only makes things more complicated. If you are
interested in reading more details on this subject, you can read what
I wrote on my blog:
http://ultimategerardm.blogspot.com/2006/05/languagecodes-on-wikimedia-foundation.html

In many projects we use "Babel" templates to indicate people's language
proficiency. Particularly in Wiktionary and in WiktionaryZ, we have to
be precise when we indicate a language: when we say that a word is in a
specific language, it has to be THAT language and not another one.

I propose that WiktionaryZ and the Babel proficiency templates
exclusively use the ISO 639-3 codes. Where ISO 639-3 does not provide a
code, we will have to use codes that are clearly not ISO 639-3; such
codes may indicate orthographies, dialects, different scripts and even
languages that have not yet been recognised as a language.
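
To illustrate what such a convention could look like, here is a minimal
editorial sketch; the "x-" prefix and the tiny sample code set are
assumptions, not part of the proposal or of any existing WiktionaryZ or
Babel implementation.

    # Sketch: keep genuine ISO 639-3 codes clearly apart from codes we
    # invent ourselves for orthographies, scripts, dialects, or languages
    # that are not (yet) in the standard.
    import re

    ISO_639_3 = {"eng", "nld", "deu", "nap"}   # tiny sample; the real standard has thousands of codes

    def is_iso_639_3(code):
        """True only for three-letter codes that really are in ISO 639-3."""
        return bool(re.fullmatch(r"[a-z]{3}", code)) and code in ISO_639_3

    def make_local_code(name):
        """Locally invented codes get a prefix so they cannot be mistaken for ISO codes."""
        return "x-" + name.lower()

    print(is_iso_639_3("eng"))          # True
    print(is_iso_639_3("simple"))       # False: not an ISO 639-3 code
    print(make_local_code("nap-latn"))  # x-nap-latn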

The use of well-defined codes will allow our data to be used reliably
and our content to be defined better. This will enable people to
(Continue reading)

Gerard Meijssen | 12 May 13:58 2006

Re: [Wikitech-l] The lang tag in <HTML> ain't identical to $wgContLanguageCode

Brion Vibber wrote:
> Shinjiman wrote:
>   
>> <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="XXX" lang="XXX">
>>
>> The lang (and xml:lang) attribute defined on the <html> tag is not correct
>> for some languages, and this value is not supposed to be made identical to
>> $wgContLanguageCode.
>>     
>
> Incorrect; it *is* supposed to be the value of $wgContLanguageCode, as by
> definition $wgContLanguageCode is the RFC 3066 language code for the language of
> the wiki's content.
>
> A reasonable case might be made that when variant display conversion is engaged,
> the lang attribute should be overridden.
>
>   
>> For example there's no such language tag called "simple", 
>>     
>
> Indeed there's not; that would be "en".
>
> Note that $wgContLanguageCode is not the same as the *domain name* or *interwiki
> identifier*. These are separate issues.
>
>   
>> according to ISO 639, RFC 1766 and RFC 3066 (R1, R2). Hence my previous
>> patch submitted to Bug 5790. The main purpose of the patch is to add a new
>> language tag mapping for the user interface language, which uses the
(Continue reading)
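
As a small editorial sketch of the distinction Brion draws (the mapping
and function below are illustrative assumptions, not MediaWiki's actual
configuration or code): the project code used in the domain name or
interwiki prefix is looked up in a separate table before being emitted as
the lang/xml:lang value, so that a code such as "simple" comes out as the
real language tag "en".

    # Illustrative only: translate a project/site code into the RFC 3066
    # language tag that belongs in <html lang="..." xml:lang="...">.
    SITE_TO_LANG_TAG = {
        "simple": "en",   # the Simple English wiki's content language is plain English
    }

    def content_language_tag(site_code):
        """Return the language tag to emit for a given project code."""
        return SITE_TO_LANG_TAG.get(site_code, site_code)

    print(content_language_tag("simple"))  # en
    print(content_language_tag("de"))      # de -- most project codes are already valid tags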

GerardM | 28 May 08:31 2006

Re: Categories on nap.wikipedia.org don't work

Hoi,
I do not understand the need for markup in titles. I certainly believe
that when markup prevents correct orthography in titles in any
language, markup cannot be handled in titles, and consequently markup
in titles should not be allowed.

As MediaWiki is about supporting all languages, markup in titles is
clearly at best a nice-to-have and certainly not a must-have.
Representing orthographies is indeed a MUST-have feature.

PS: I am cross-posting to the Wiktionary list. As its aim is to include
all words in all languages, this affects ALL Wiktionary projects.

Thanks,
   GerardM

On 5/28/06, Sabine Cretella <sabine_cretella@...> wrote:
> Brion Vibber schrieb:
> > Sabine Cretella wrote:
> >
> >> We really do need '' as normal chars and not to initiate text written in
> >> italics ... we must be able to include this in a wiki link in some way
> >> and also in a category link ... is there a way to get a different tag
> >> for that - something like <apostrophs> </apostrophs> or whatever?
> >>
> >
> > Since '' can't be round-tripped reliably in wikitext, it's most likely that
> > we'll have to make '' explicitly forbidden in wiki page titles in the course of
> > fixing parser bugs.
(Continue reading)

Muke Tever | 28 May 12:58 2006

Re: [Wikitech-l] Categories on nap.wikipedia.org don't work

GerardM <gerard.meijssen <at> gmail.com> wrote:
> On 5/28/06, Sabine Cretella <sabine_cretella <at> yahoo.it> wrote:
>> Brion Vibber schrieb:
>> > Sabine Cretella wrote:
>> >
>> >> We really do need '' as normal chars and not to initiate text written in
>> >> italics ... we must be able to include this in a wiki link in some way
>> >> and also in a category link ... is there a way to get a different tag
>> >> for that - something like <apostrophs> </apostrophs> or whatever?

Doesn't the regular, non-typewriter apostrophe <’> work, thus <’’>?

	*Muke!
-- 
website:     http://frath.net/
LiveJournal: http://kohath.livejournal.com/
deviantArt:  http://kohath.deviantart.com/

FrathWiki, a conlang and conculture wiki:
http://wiki.frath.net/
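
A minimal sketch of the workaround Muke suggests (an editorial
illustration, not an existing MediaWiki feature): replace the ASCII
apostrophe U+0027 with the typographic apostrophe U+2019 before creating
a title, so that a doubled apostrophe in the title is not parsed as
italics markup.

    def title_with_curly_apostrophes(title):
        """Swap every ASCII apostrophe for U+2019 so '' in a title is not italics markup."""
        return title.replace("'", "\u2019")

    print(title_with_curly_apostrophes("dd''a"))  # dd’’a -- safe inside [[...]] links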