Tuesday 15 February 2011

Dropbox config change from the CLI

I use Dropbox on my MacBook. It's neat. However, for some reason it doesn't really autodetect my proxy, which is entirely set up via a master proxy.pac file.

I already have a shell script that takes care of adjusting my SSH configuration and my custom proxy.pac depending on where I am, so I just extended it to change Dropbox's configuration and restart it.
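
Here's the gist of it (a sketch: Dropbox's config.db schema is undocumented, so the "proxy" key name and the value format below are assumptions to adapt to your setup):

killall Dropbox 2>/dev/null
sqlite3 ~/.dropbox/config.db \
    "INSERT OR REPLACE INTO config (key, value) VALUES ('proxy', 'proxy.example.com:8080');"
open -a Dropbox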

Friday 23 April 2010

The nature of history

As a historian, I find these to be words of wisdom:

You can't even change the future, in the sense that you can only change the present one moment at a time, stubbornly, until the future unwinds itself into the stories of our lives.
-- Larry Wall on PerlMonks

Paul Veyne and Michel Foucault would not disagree.

Friday 2 April 2010

The origins of sanctity of marriage

In The Knight, the Lady and the Priest, Georges Duby, historian and medievalist, analyses the transformations of Western society between the tenth and the twelfth centuries, with a particular focus on the institution of marriage.

During this period, a new feudal system was progressively put in place and a new balance of power emerged in the Western kingdoms. As a consequence, patrilineal transmission of domains and titles began to become the rule; to sustain this societal change, it was necessary to make marriage more rigid, notably by disallowing the common practices of divorce, marriage by abduction, and repudiation.

And that's where the Church intervened. Previously, the Christian religion didn't care much about marriage. Jesus didn't -- he declared that in the kingdom of God there would be no husbands or wives; actually, he didn't care much about family values in general (see Luke 14:26 for example). Paul, the co-founder of Christianity, wasn't very concerned either: it is better to marry than to burn, he wrote, but it was clear that in his mind the true Christian was to remain celibate.

So, during the eleventh century, the Church started to introduce mandatory marriage blessings, and then, a couple of generations later, full-fledged ceremonies. Pope Gregory VII forbade the marriage, and even the concubinage, of priests. Marriages that were not approved by a priest were declared invalid. Nobles, even kings, were excommunicated when they didn't follow the new rules. Progressively, marriage became a sacrament.

There was some resistance to the new institutions; as usual in those Christian times, it took the form of heresy, although the heretics, this time, weren't the reformers but the traditionalists.

This new form of marriage has stayed with us ever since. But remember: when someone talks about the sanctity of marriage, he's actually speaking about an opportunistic theological innovation invented by some French and Italian bishops during the eleventh century for purely political reasons.

Tuesday 3 November 2009

Ubuntu upgrade woes

After I upgraded my laptop (a Dell Latitude D420) to Ubuntu Karmic Koala, it refused to boot. The normal boot process only showed the usual splash screen, which consists of nothing but the Ubuntu logo, then switched to a black screen where nothing was possible anymore -- no switching to a text console, nothing. Damn.


Here's how I fixed that (in the hope that it could be useful to someone). I found out, by booting without the splash screen, that the root partition wasn't mounted on boot, which was the cause of all the problems. For the record, to boot without the splash screen, you have to select the kernel you want in the grub menu, edit its command-line, and remove the words "quiet splash" from it.
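
The kernel line in question looks something like this (the kernel version here is illustrative); deleting the final "quiet splash" gives you a verbose boot:

kernel /boot/vmlinuz-2.6.31-14-generic root=UUID=deafbeef-... ro quiet splash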


So I booted the rescue kernel (by selecting it in grub), which gave me a basic busybox shell in RAM. There, I manually mounted my root partition, /dev/sda1, on a newly created directory, /grumpf. I moved /grumpf/etc/fstab away and wrote a basic working fstab with the commands:

echo proc /proc proc defaults 0 0 > /grumpf/etc/fstab
echo /dev/sda1 / ext3 defaults,errors=remount-ro 0 1 >> /grumpf/etc/fstab
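
(For reference, the preceding mount-and-backup steps were along these lines -- a sketch; the backup file name is made up:

mkdir /grumpf
mount /dev/sda1 /grumpf
mv /grumpf/etc/fstab /grumpf/etc/fstab.orig
)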

Then I rebooted. In the grub selection menu, I selected the regular kernel, but edited its command-line: I replaced the part root=UUID=deafbeef... by root=/dev/sda1, telling the boot process to look up the root device by name instead of by UUID. At this point the computer booted successfully.
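
In other words, on the kernel line shown earlier, the root argument changed like this:

root=UUID=deafbeef-...   ->   root=/dev/sda1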


Once there, I could log in as root, edit /boot/grub/menu.lst to make my changes to the kernel command-line permanent, and complete the fstab with appropriate lines for the swap partition, the CD-ROM drive, and my /home partition. One last reboot, and voilà, the system was fully functional again.
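
For illustration, the completed fstab would gain lines like these (only /dev/sda1 is from my actual setup; the swap, /home and CD-ROM devices and their options are made up):

/dev/sda1  /              ext3         defaults,errors=remount-ro  0  1
/dev/sda2  none           swap         sw                          0  0
/dev/sda3  /home          ext3         defaults                    0  2
/dev/scd0  /media/cdrom0  udf,iso9660  user,noauto                 0  0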


This doesn't explain why device UUIDs aren't supported in the boot sequence on that hardware, though.

Monday 27 July 2009

Smart match to-do

The "new" smart match in Perl 5.10.1 can still be improved.

Unlike the changes between 5.10.0 and 5.10.1, which aren't backwards-compatible, further improvements to smart matching will be considered only if they don't break code. The semantics can, however, be extended.

Here's an example. The following code, which prints "ok" under 5.10.0, will now issue a run-time error:

use 5.10.1;
$obj = bless { foo => 42 };
say "ok" if 'foo' ~~ $obj;

The new error, Smart matching a non-overloaded object breaks encapsulation, which was suggested by Ricardo Signes, prevents accessing an object's implementation details by mistake.

However, from inside a method, it can be perfectly correct to use ~~ on the object's underlying reference. That's no longer possible in 5.10.1, unless you make a shallow copy of the object, for example with:
'foo' ~~ { %$obj }

Which won't be very efficient if there are lots of keys in there.
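
To illustrate, here's the kind of method-internal code this affects (a minimal sketch; the class and method names are made up):

use 5.10.1;

package Point;

sub new { bless { x => 1, y => 2 }, shift }

sub has_coord {
    my ($self, $name) = @_;
    # $name ~~ $self dies in 5.10.1 with the encapsulation error,
    # so we smart-match against a shallow copy instead:
    return $name ~~ { %$self };
}

package main;

say Point->new->has_coord('x') ? "ok" : "not ok";   # prints "ok"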

That's why I recently added in perltodo this little item:

Currently $foo ~~ $object will die with the message "Smart matching a non-overloaded object breaks encapsulation". It would be nice to allow bypassing this by explicitly using the syntax $foo ~~ %$object or $foo ~~ @$object.

Wednesday 22 July 2009

On the influence of religion and astrology on science

I have, in addition, introduced a new method of philosophizing on the basis of numbers. -- Pico della Mirandola, Oration on the Dignity of Man


Wolfgang Pauli, in addition to being a Nobel-winning physicist, was most interested in the mental and psychological processes that make scientific discoveries possible. In a book co-authored with C.G. Jung, The Interpretation of Nature and the Psyche, he studies how Johannes Kepler discovered his three laws. By looking at the original texts of Kepler, who was a strict Protestant, Pauli shows that the image Kepler had of the Trinity led him to accept Copernican heliocentrism, and to invent the idea that the movements of the planets could be measured and mathematically described.

Kepler's main enemy was Robert Fludd, an English physician and astrologer. Fludd had a background in alchemy, kabbalah and mysticism. For him, the heavens (the macrocosm) were the reflection of the human body (the microcosm), and vice versa: consequently, applying mathematical rules to the movements of the planets implied a negation of free will. (It should be noted at this point that both Kepler and Fludd believed that planets were living beings.)

The same gap between Fludd's and Kepler's presuppositions can be seen in their approaches to astrology. Fludd believed that astrology worked because of the mystical correspondence between the heavens and the earth. Kepler supposed an action on the human mind induced by remote sources of light placed at certain angles -- the same angles that appear in Kepler's second law. As Pauli notes, Kepler's thesis is experimentally verifiable, but Kepler didn't seem to consider that, if he was correct about astrology, artificial sources of light would have the same effect. Here too, Kepler completely externalized the physical processes, something that Fludd refused to do on religious grounds.

It's remarkable that Fludd's conceptions made him reject Kepler's approach to astronomy, yet enabled him to correctly discover the principle of blood circulation. That is, since the human body is like the heavens, where the planets revolve around the Sun, image of God, then in the body blood must revolve around the heart, where the Soul, image of God, is located. Or at least that's how Fludd saw it.

Even with false assumptions, both men made science advance. Pauli's conclusion, roughly, was that while all assumptions need to be scrutinized with skepticism, only the results will validate a scientific theory; but those assumptions are precisely what makes scientists creative. So, what kind of assumptions led to Pauli's Exclusion Principle?

Tuesday 21 July 2009

Deprecation in the Perl core

chromatic goes on arguing that deprecated language constructs deserve to be removed just because they're deprecated.

I'll argue, on the contrary, that deprecated constructs should be removed only when they stand in the way of a performance improvement or of a bug fix. Which is precisely why, for example, pseudo-hashes, empty package names and $* were removed in Perl 5.10. In other cases the deprecation warning is more of a "bad style, don't do this" message -- until someone finds a good reason to remove the deprecated construct.

Here are a few examples, illustrating a range of different cases:


  • The $[ variable used to be a waste of memory. Now re-implemented as a %^H element, its existence no longer has any runtime memory impact (as long as it's not used). It's quite possible, however, that it has a speed impact (I'd be glad to see a benchmark). If someone demonstrates that perl can be made faster by removing it, it's worth removing. (Note that it's currently deprecated in bleadperl.)

  • The apostrophe syntax for namespaces (foo'bar instead of foo::bar) has been used quite recently for its cuteness value (like the function isn't). It's not yet deprecated. Moreover, the core still has Perl 4 files that use it. (Those would be worth considering for removal, by the way -- they're not maintained.) However, using this syntax is largely considered bad style, and it can be confusing (as in print "$name's"). Adding a deprecation warning would be nice to warn users against potential errors, but I think that removing the syntax itself would cause too many problems; so that deprecation warning would be likely to stay for some value of ever.

  • The defined(@foo) and defined(%bar) syntax is actually only deprecated for lexicals -- not for globals. (You can check.) Some deep magic uses them for stashes. I'm not sure about the details, but a comment in toke.c asserts that removing them would break the Tk module. So that construct can't be removed without some heavy engineering to devise a replacement.

  • Finally, the "split to @_ in scalar context" deprecation (see the example below). Removing assignment to @_ would change the behaviour of split, and that's what caused the discussion on P5P: if we remove it and replace it by something more sensible, code developed on 5.12 might cause problems when run on an older perl. Opinions differ here. I would personally favor removing it unconditionally, which is likely to be chromatic's point of view too (for once). More conservative opinions were voiced.
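
For the record, here is what that last deprecated behaviour looks like (a minimal sketch, as it behaves on perls that still have it):

my $n = split /,/, "a,b,c";   # scalar context: $n is 3...
print "@_\n";                 # ...and @_ got clobbered: prints "a b c"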


Now, let's have a look at the two arguments against deprecation and removal that are misrepresented by chromatic:

The arguments for retaining these features are likewise simple: modifying code may cause bugs and removing features may break existing code.

First, "modifying code may cause bugs". This meme surfaced on P5P about a very complex and hairy Perl module with almost no tests (ExtUtils::ParseXS). However, the code changes we're talking about here are C changes. Written properly, C is a more straightforward language than Perl, and the test coverage of the core is excellent. So I don't buy that argument at all, and I don't think anyone knowledgeable used it seriously about core changes. Also, if we're speaking only about adding a warning, it's usually a very simple change. Removal itself might be more complex, but still a lot simpler than adding features. And features get added.

Secondly, "removing features may break existing code". Right, but are we talking about deprecation or removal? chromatic seems to assume that deprecation must necessarily be followed by removal. He says: The problem, I believe, is that there's little impetus to migrate away from deprecated features; features can remain deprecated for 15 years. But this is not a problem -- this is a sensible policy decision. Unlike mere deprecation, removal of constructs will break code, so we must weigh it against convenience, and proceed only when it brings more benefits than inconvenience to Perl users.

Basically, the only real argument against removal of features is precisely the one that chromatic persists in ignoring: preservation of backward compatibility, to avoid gratuitous breakage of Perl programs. But instead of repeating myself on that point, let me finish this post with a quote:

Perl has broken backward compatibility in the past where I judged very few people would be affected.
-- Larry Wall in P5P.

Friday 17 July 2009

The job of the pumpking

The job of a pumpking is a difficult one. It demands technical and human skills, and constant commitment.

Historically, the pumpking has been the one who applied most of the patches. (That's actually the origin of the word -- the pumpking holds a semaphore known as the patch pumpkin.) Applying a patch is never completely trivial. Even if the code change is clear enough, you'll have to look at the regression tests, and possibly write some; make sure the docs are still correct and complete (and possibly write some); if applicable, increment the version numbers of the core modules, or contact the maintainer of the module if it's dual-life.

Sometimes, the code change is not clear at all, or really big, or it touches areas of the code that the pumpking is not familiar with. In this case he has to learn about the surrounding code, or ask experts in that field (if any are around and willing to respond), understand how the patch works, determine what kind of performance or footprint impact it will have, think about possible better implementations, detect potential subtle regressions, and consider test coverage of the new code.

Even if it's a doc patch, he will have to check that the new documentation is right, doesn't duplicate or contradict information elsewhere, and is written clearly enough, in grammatically correct English. (Which only makes it more difficult for non-native pumpkings.)

To put it briefly, patch application is a time-consuming activity, and it demands a fair amount of concentration. In any case, it's not something you can do when you have ten free minutes between two tasks.

Is it rewarding? Well, not really. Applying patches is a lot like scratching other people's itches. You don't get much time left to work on what interests you personally -- like fixing funny bugs, or adding new features. And people tend to get upset when their patches are not applied fast enough.

Sometimes you have to reject a patch. You do this in the most diplomatic way possible, because the person who sent it is usually full of good will, and volunteered some of his own free time, and you don't want to drive volunteers off. So you ask: could I have some more docs with that, because I don't understand it fully? Or: what problem does it solve for you? Or: excuse me, but have you considered this alternate implementation that might be better suited? Care to work on it a bit more? Thanks. Even when you know at first sight that a patch is totally worthless, you can't reply "ur patch iz worthless, lol"; you have to try to be pedagogical about why the proposed patch is a bad idea.

And all of this, you do on borrowed time, because you're not paid for it. You could be sleeping, or cooking, or having fun with your family and your friends, or enjoying a nice day's weather. No, you stay inside and you apply other people's patches. And why do you do it? Because you like Perl, you want to make it better, and you're not doing a bad job at it. And probably because no-one else would be doing it for you.

And of course no-one except the previous pumpkings fully realizes what all of this really means. No. You take decisions, led by technical expertise and years of familiarity with perl5-porters. Sometimes heat is generated, but it settles down quickly and stays contained within the mailing list. You listen to all parties and you show respect. And you take an informed decision, trying to remember, as Chip Salzenberg noted, that good synthesis is different from good compromise, although they're easy to mistake for each other, and that only the first one leads to good design. All of this is normal, and expected.

Being a pumpking is difficult. It's demanding; it's not rewarding. But that's not why I quit. I quit because I can't continue to do it under constant fire. I want my work to be properly criticized on technical grounds, not denigrated. (Also, I want a pony. I realize that it's difficult to deal with people who are in the middle of the "hazard" peak of this graph.) That's also why I would refuse to be paid to be a full-time pumpking: in my day job, I get recognition.

Actually, I don't think that a new pumpking will step up, and I think that this will be for the best. P5P probably needs to transition from pumpkingship to a more oligarchic form of governance. More people need to take the responsibility to review and apply patches. More people need to back up the technical decisions that are made. A vocal consensus is stronger than a single pumpking, and it will force us to write down a policy, which will be a good thing and increase the bus factor of Perl. Moreover, a system involving many committers would scale better. All committers are volunteering time. It's a scarce resource, but can be put in common. And new major Perl releases are not going to happen without more copious free time.

As a side-effect, if many people start handling the patch backlog, that means that I'll be able to devote time to long-standing issues without feeling guilty. Like the UTF-8 cleanup I have been speaking about.

For once, I plan to take some vacation this summer -- away from P5P. That hasn't happened to me in years. I think I've deserved it.

Friday 10 July 2009

The strictperl experiment

The strictperl experiment by chromatic, and the strict-by-default patch by Steffen Mueller, will help me explain what approaches are right or wrong in dealing with Perl 5 evolutions.

strictperl is, as chromatic puts it, unilaterally enabling strict for all code not run through -e. Unsurprisingly, that breaks a lot of code. And I mean a lot, as in "most": not only on the DarkPAN (I seldom use strict for four-line scripts), not only on the CPAN (I have modules that will break under strictperl), but in the core itself. chromatic mentions Exporter and vars as modules that break under it. Well, of course they break. Their very purpose is to manipulate symbols by name, which is exactly the kind of thing that strict prevents. (Incidentally, no code that uses Exporter or vars can run under strictperl, and I think that's most of the Perl code out there, actually.) That is why chromatic added a separate makefile target to build strictperl: if the patch went into perl itself, perl couldn't even be built! That's how broken it is.

It's certainly interesting to dive into Perl's internals, but to experiment with enabling strict on a global level, a simple source filter would have been sufficient. One could have been written in ten minutes, including the time to look up the documentation, and it would have been immediately obvious afterwards why it was a bad idea. Except to chromatic, who still thinks it's (quoting) a feature I believe is worth considering.
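
For the record, here is roughly what such a filter would look like (a sketch; the package name is made up). It uses the core module Filter::Util::Call to prepend use strict; to whatever source is compiled after it:

package StrictEverywhere;
use Filter::Util::Call;

sub import {
    my $done = 0;
    filter_add(sub {
        my $status = filter_read();
        # Prepend "use strict;" once, before the first filtered line
        if ($status > 0 && !$done) {
            $_ = "use strict;\n" . $_;
            $done = 1;
        }
        return $status;
    });
}

1;

Loaded with perl -MStrictEverywhere some_script.pl, it makes the whole script strict -- and, exactly like strictperl, it immediately breaks anything that isn't strict-clean.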

So now let's look at Steffen Mueller's solution, which is strictperl done right, and which is already in bleadperl.

Currently, with bleadperl, a simple use 5.11.0 will enable strictures in the current lexical scope, like use strict would do, but implicitly. (It will also enable features, but that's unrelated.) Look:

$ bleadperl -e 'use 5.11.0; print $foo'
Global symbol "$foo" requires explicit package name at -e line 1.
Execution of -e aborted due to compilation errors.

Moreover, as in chromatic's strictperl, a simple -e or -E won't enable them, because strictures are not wanted for one-liners.

That way, we don't break code that doesn't want strictures (actually, we don't break anything at all, since nothing changes for code that doesn't already require 5.11.0: it's completely backwards-compatible), but it still removes some boilerplate, giving you strictness by default when you request a recent enough perl.

Steffen's patch itself is not very recent, but I didn't apply it to bleadperl immediately, because I disagreed with the implementation. As you can see in the sources, it uses Perl_load_module to load the file strict.pm and execute its import() method. That's very slow. I applied it once I got around to making it faster with the next commit, which replaces that method call by just flipping three bits of an internal integer variable.

All this goodness is coming to you in Perl 5.12, whenever someone is willing to take up the pumpking's hat.

Next time, I'll explain why enabling warnings by default is not a good idea.