Stephan Gambke | 30 Jan 11:18 2015

Add members to extension owners

What is the process to add people as owners of an extension?

What is described on [0] does not work. Contrary to what is
written there, I cannot add members myself to a group that I am
in. And a request I added there has been open now for more than three

So, what do I have to do to give other people access to extensions?
Are there any plans to streamline that process?


Wikitech-l mailing list
Wikitech-l <at>
Bryan Davis | 30 Jan 08:45 2015

Converting debug logging to PSR-3

PSR-3 logging has been fully supported in MediaWiki since 1.25wmf5.
We've been making various tuning improvements since then including a
recent deprecation of the initial MWLogger wrapper class in favor of
direct usage of Psr\Log\LoggerInterface by wfDebugLog() and other
internal wrapper methods [2].

The next big step in introducing structured logging to MediaWiki is to
begin to replace calls to wfDebugLog() (and dear god wfDebug()) with
direct usage of Psr\Log\LoggerInterface instances. Kunal and I have a
couple starter patches that show this in gerrit [3]. In both
conversions we have chosen to implement Psr\Log\LoggerAwareInterface
in the affected classes to allow setter-based injection of a
LoggerInterface. The log channel names were also chosen to match the
previous wfDebugLog logGroup values. When you use a PSR-3 logger you
need to choose a severity for each log message. PSR-3 has quite a few
possible levels, but I'd like to propose that we really only need 4 to
start with in MediaWiki:

* debug: Useful for ummm.... debugging. :) These are messages that are
useful for local development and are generally too "spammy" to output
on a production wiki. This would typically include anything currently
being logged via wfDebug.
* info: Valuable state change information. This level is a great place
to record information that would be useful in a production environment
when tracing the path of a request that eventually had an error.
* warning: A soft error condition such as a recoverable error or
another condition that typically should not be seen but isn't halting
for the operation in process.
* error: A hard error such as a caught exception with no recovery path.
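As a rough sketch of what such a conversion might look like (this is my illustration, not one of the actual gerrit patches; the class name MyService and its method are hypothetical, and it assumes the psr/log package is available via Composer):

```php
<?php
// Hypothetical sketch; assumes the psr/log Composer package is installed.
use Psr\Log\LoggerAwareInterface;
use Psr\Log\LoggerInterface;
use Psr\Log\NullLogger;

class MyService implements LoggerAwareInterface {
	/** @var LoggerInterface */
	private $logger;

	public function __construct() {
		// Default to a no-op logger so the class works without injection.
		$this->logger = new NullLogger();
	}

	public function setLogger( LoggerInterface $logger ) {
		$this->logger = $logger;
	}

	public function doWork( $id ) {
		// Too spammy for production; useful in local development.
		$this->logger->debug( 'Starting work on {id}', [ 'id' => $id ] );

		// Valuable state change worth keeping when tracing a request.
		$this->logger->info( 'Finished work on {id}', [ 'id' => $id ] );
	}
}
```

The setter-based injection means existing construction paths keep working; callers that care about logging can inject a real LoggerInterface, and everything else silently gets the NullLogger.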

(Continue reading)

Ori Livneh | 30 Jan 04:21 2015

Re: From Node.js to Go

(Sorry, this was meant for wikitech-l.)

On Thu, Jan 29, 2015 at 7:20 PM, Ori Livneh <ori <at>> wrote:

> We should do the same, IMO.
Bryan Davis | 30 Jan 02:01 2015

Blog post on Librarization project published

For the last four months, my main focus has been the Librarization
project [0]. Today a wrap-up blog post was posted to [1] that I'd invite all of you to read to get an
overview of what our high level goals and motivations were and what we
accomplished. The TL;DR is that we now have some guidelines for how to
separate code from MediaWiki and publish it as a semi-autonomous open
source project.

The blog post ends with a thinly veiled call to action for MediaWiki
developers to continue the work of extracting code from the current
MediaWiki core application and publishing it as independent
libraries. We've published some information on how to deal with git
hosting, code review, and various other general issues on [2]. There is also a list of some areas of the existing
code base that we thought would be interesting targets for extraction
[3]. The CDB library [4] can serve as one concrete example of using
the guidelines.
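To illustrate what a consumer of such an extracted library looks like, here is a sketch of standalone usage (assuming the wikimedia/cdb package is installed via Composer; the path and keys are made up for the example):

```php
<?php
// Sketch; assumes the wikimedia/cdb library is installed via Composer.
require_once __DIR__ . '/vendor/autoload.php';

use Cdb\Reader;
use Cdb\Writer;

$path = '/tmp/example.cdb';

// Write a few key/value pairs to a constant database file.
$writer = Writer::open( $path );
$writer->set( 'en', 'English' );
$writer->set( 'de', 'German' );
$writer->close();

// Read them back - no MediaWiki installation required.
$reader = Reader::open( $path );
echo $reader->get( 'en' ), "\n";
$reader->close();
```

The point of the Librarization guidelines is exactly this: code that previously only ran inside a full MediaWiki install can now be pulled into any PHP project with one Composer dependency.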

I'd like to invite anyone interested in starting work on decoupling a
particular area of the code to start a thread on wikitech-l and file a
task in Librarization phabricator project [5] to attract collaborators
and help reduce possible duplication of effort. It would also be great
to have edits on the list page and/or phabricator tasks to act as a
wish list of things that you know of in MediaWiki that you would either
like to be able to use in a non-MediaWiki PHP project or feel would be
a good candidate for isolation so that alternate implementations could
be introduced.

(Continue reading)

Arlo Breault | 29 Jan 23:47 2015

Urlencoding strip markers

Currently, while {{urlencod}}ing, content in strip markers is skipped.

I believe this violates the expectation that the entire output
will be properly escaped to be placed in a sensitive context.

An example is in the infobox book caption on,

There’s a brief discussion of the security implications of
some proposed solutions in the review of,

It seems best (I guess) to just drop the content (`killMarkers()`).

Any opinions or better ideas?


Jon Robson | 29 Jan 21:56 2015

Improving our code review efficiency

I was really happy to hear Damon, at the MediaWiki Developer Summit,
ask us how long we take to code review and whether we had communicated
a timeframe in which we promised to do it to our community. He quite
rightly stressed that this was vital for the survival of our
community. I spoke to one of our newer developers during the summit
and he also confessed to me that the reason he was an active volunteer
in our extension was that he got feedback on his code pretty quickly.

I had a few ideas about how to measure this, so in my spare time I have
generated this report based on data from Gerrit patchsets, using a
hacked-together Python script [1], which I hope will be, if nothing
else, an interesting artifact to talk about and generate some discussion.


To help you understand what you are reading, let's take Echo as an example:

Project: mediawiki/extensions/Echo
524 patches analysed (23 open, 501 merged)
Average review time: 29 days
Oldest open patch: (bug 41987) Updating tables indexes' names. (766
days) -

The average time for code to go from submitted to merged appears to be
29 days over a dataset of 524 patches, excluding all that were written
by the L10n bot. There is a patchset there that has been _open_ for
766 days - if you look at it, it was uploaded on Dec 23, 2012 12:23 PM,
is -1ed by me, and needs a rebase.

(Continue reading)

Keith Welter | 29 Jan 18:26 2015

wfShellExec() quirk - advice needed on further debug

The GraphViz extension uses wfShellExec() to invoke the "dot" command.
Sometime in the last month or so, on my Ubuntu 14.04 installation, the
command started failing with:
Warning: Could not load "/usr/lib/graphviz/" - file
not found

The file does exist and the dot command runs fine from a shell session.

I found that by eliminating the ulimit -v option from, which
wfShellExec() invokes, the problem goes away.

The -v limit is 102400 on my installation (so 100 MB). I find it hard to
believe that dot actually needs that much virtual memory.
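In case it helps others hitting the same thing: if the memory limit is indeed the culprit, one workaround (a guess on my part; as far as I know $wgMaxShellMemory is the MediaWiki setting that feeds the ulimit -v value applied to wfShellExec() commands) would be to raise it in LocalSettings.php:

```php
// In LocalSettings.php: raise the virtual memory limit, in KB, applied
// to shell commands run via wfShellExec(); 0 disables the limit.
$wgMaxShellMemory = 307200; // 300 MB instead of 100 MB
```

This doesn't explain why dot started needing more memory recently, but it would at least confirm whether the limit is what triggers the "file not found" warning.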

Suggestions on how to proceed with debug would be much appreciated.

Keith Welter
MZMcBride | 29 Jan 04:11 2015

Request for comments for RESTBase?


There's been quite a bit of discussion about RESTBase lately. Is there a
request for comments on about RESTBase? I looked at
<> and didn't see one.

From my limited understanding of what's being proposed, I'd personally be
a lot more comfortable with the idea if someone from both the software
architecture side (Brion, Tim, or equivalent) and someone from the
operations side (Mark, Faidon, Giuseppe, or equivalent) weighed in and
signed off on what's being proposed. This may have already happened
somewhere, but I didn't see anything in my brief searching and poking
around on pages such as <>.


James Douglas | 28 Jan 21:30 2015

Dev Summit debrief: SOA proliferation through specification

Howdy all,

It was a pleasure chatting with you at this year's Developer Summit[1]
about how we might give SOA a shot in the arm by creating (and building
from) specifications.

The slides are available on the RESTBase project pages[2] and the session
notes are available on Etherpad[3].

I'm eager to keep the conversation going on the mailing list, and want to
address a couple items that came up (or were missing) during the session,
as well as prompt for further discussion.

I mentioned after the presentation that we're using our spec to drive our
automated testing.  I added some info about that to slide #14 in the
slides[2].  The idea is that, since Swagger lets us add custom fields to a
spec, we can augment each endpoint specification with a functional
description of its expected inputs and outputs.  During testing, we parse
the spec and verify that these indeed hold true.  There's a lot of
opportunity for enhancement of our (currently very basic) approach to this,
but it's already proving pretty handy from a coverage standpoint.
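For illustration, an annotated endpoint might look something like this (a hypothetical sketch; the custom field name x-amples and its exact shape are my assumptions for the example, not necessarily the fields our spec actually uses):

```yaml
# Sketch of a Swagger endpoint annotated with a custom field that
# describes an expected request/response pair for automated testing.
paths:
  /page/{title}:
    get:
      summary: Get the latest revision of a page
      x-amples:
        - request:
            params:
              title: Main_Page
          response:
            status: 200
            headers:
              content-type: application/json
```

A test harness then walks the spec, issues each described request, and asserts that the actual response matches the declared status and headers.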

There was a question in the notes[3] about Swagger's support for
internationalization, but I'm not familiar with the use case in mind.  How
might an API differ, aside from the content of fields in a specified model,
under different localizations?  Might users want the models themselves (or
parameter names, etc.) to vary?

(Continue reading)

Matthew Flaschen | 28 Jan 20:38 2015

Scrum of Scrums notes

This is the Collaboration team update for 1/28.

We are in the early stages of improving the user experience for enabling
and disabling Flow boards.

We enabled Flow on two pages on Portuguese Wikipedia, and are preparing 
to finish enabling Flow for the Co-op 

We fixed an urgent issue caused by moving a regular page into Flow's 
topic namespace (both preventing the issue in the future and treating 
the symptoms this time).

Matt Flaschen

Guillaume Paumier | 28 Jan 18:54 2015

Wikimedia engineering report, November 2014


The report covering Wikimedia engineering activities in November 2014 is
now available:


Guillaume Paumier
