Welcome to my blog.
Have a look at the most recent posts below, or browse the tag cloud on the right. An archive of all posts is available.
using git-svn with rsync
From time to time I find myself using git-svn with projects that use Subversion. This is all nice and fine, but the "initial import" takes quite some time if you are doing it over a remote transport such as svn+ssh.
One trick to speed things up is to use rsync to copy the svn repository locally and then run git-svn against the copy on your hard disk. The idea is originally documented at the allegro wiki.
For example, a typical SourceForge project can be imported like this:
DESTDIR=/your/dest
PROJECTNAME=vde
PROJECTURL=https://$PROJECTNAME.svn.sourceforge.net/svnroot/$PROJECTNAME
SVN_COPY=$(mktemp -d)

rsync -c -av $PROJECTNAME.svn.sourceforge.net::svn/$PROJECTNAME/'*' $SVN_COPY

mkdir -p $DESTDIR
cd $DESTDIR
git svn init file://$SVN_COPY --rewrite-root=$PROJECTURL --stdlayout
git svn fetch
# ... git svn imports the repo ...
sed -i "s@url =.*@url = $PROJECTURL@" .git/config
rm -rf $SVN_COPY
The important bit is to remember to update the url key in .git/config once the import is finished, otherwise subsequent commits will go to your rsync-ed copy and not to the original repository!
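To double-check that the rewrite worked you can ask git which Subversion URL it will talk to; this is a minimal sketch assuming git-svn's default remote name (svn) and the variables from the snippet above:

# should print $PROJECTURL, not the file:// path of the rsync-ed copy
git config --get svn-remote.svn.url
# the imported commits should also carry the rewritten URL in their git-svn-id lines
git log -1 | grep git-svn-id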
do not exit your shell if /etc has uncommitted changes (act 2)
I have already written about this; here is an improved version, some years later.
Nothing new under the sun: you have your /etc under git (I recommend doing that with etckeeper) and you want to be reminded about uncommitted changes when you leave the root shell. This is handy, for example, when co-maintaining a server where everybody is required to leave a trace after changing a file.
So you put something like this in your favourite root shell initialization file (e.g. ~root/.bashrc) and make sure the shell is invoked as interactive/login (e.g. alias su='su -l'):
git_functions="/usr/local/sbin/git-etc-common"

# export GIT_* variables
if [ -r "$git_functions" ]; then
    . "$git_functions"
    git_export_env
fi

case $- in
    *i*) # interactive shell
        check_uncommitted(){
            ttyuser=$(ttyuser)
            logoutfile="$HOME/.logout-$ttyuser"
            # ttyuser is empty if the shell is not connected to a terminal
            # IOW, avoid forkbombing if the terminal has been closed
            if ! git_etc_status && [ ! -e "$logoutfile" ] && [ ! -z "$ttyuser" ]; then
                echo "Uncommitted changes to /etc. touch $logoutfile to force logout."
                # TODO show a list of files to commit?
                $SHELL -$-
            fi
            rm -f $logoutfile
        }
        trap check_uncommitted EXIT
        ;;
esac
And git-etc-common looks like this:
# source this file

# detect who's currently at the console
ttyuser() {
    cur_tty=$(tty) && echo $(stat -c "%U" $cur_tty)
}

git_export_env() {
    # TODO ask for name if GIT_AUTHOR_{NAME,EMAIL} is not set?
    ttyuser=$(ttyuser || echo "")
    if [ -n "$ttyuser" ]; then
        ttyuserhome=$(getent passwd "$ttyuser" | cut -d: -f6)
        conf=$ttyuserhome/.gitconfig
    fi
    if [ -z "$GIT_AUTHOR_NAME" ] && [ -z "$GIT_AUTHOR_EMAIL" ]; then
        if [ ! -z "$GIT_CONFIG_LOCAL" ] || [ ! -z "$GIT_CONFIG" ]; then
            export GIT_AUTHOR_NAME=$(git config --get user.name)
            export GIT_AUTHOR_EMAIL=$(git config --get user.email)
        elif [ -n "$conf" ] && [ -r "$conf" ]; then
            export GIT_AUTHOR_NAME=$(git config --file $conf --get user.name)
            export GIT_AUTHOR_EMAIL=$(git config --file $conf --get user.email)
        fi
    fi
}

git_etc_status() {
    changed=$(cd /etc && git ls-files --modified --deleted --others \
        --exclude-per-directory=.gitignore \
        --exclude-from=.git/info/exclude)
    ret=$?
    [ -z "$changed" ] && [ "$ret" -eq 0 ]
    return $?
}
Exercise: generalise the functions above so you can get warnings about uncommitted changes in arbitrary repositories.
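For what it's worth, here is a minimal sketch of such a generalisation; the function name, the repository list and the non-/etc paths are made up for illustration and are not part of the original scripts:

# exit 0 only if the given repository has no modified/deleted/untracked files
git_repo_status() {
    repo_dir=$1
    changed=$(cd "$repo_dir" && git ls-files --modified --deleted --others \
        --exclude-per-directory=.gitignore \
        --exclude-from=.git/info/exclude)
    ret=$?
    [ -z "$changed" ] && [ "$ret" -eq 0 ]
}

# example usage inside check_uncommitted(): warn about every repo in a list
for repo in /etc /usr/local/etc; do
    git_repo_status "$repo" || echo "Uncommitted changes in $repo"
done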
how to ship spidermonkey in your autotools project
While working on freej we needed to update spidermonkey/mozjs to the latest version and, while at it, integrate it better with the rest of autoconf/automake. AFAICT spidermonkey/mozjs is distributed as part of XULRunner nowadays.
The requirements are to make it easy to put a new version of spidermonkey into the tree, to redistribute it with make dist, and to have a successful make distcheck.
The first step is a trick explained in the automake manual about third-party Makefiles: we use a GNUmakefile.in (git) to "proxy" the targets down to the real Makefile, or to ignore some targets called by autotools, e.g.:
# otherwise linking static libmozjs.a with dynamic libfreej.so won't work
CFLAGS += -fPIC

# the makefile to proxy targets to
js_makefile = $(builddir)/Makefile

# proxy these targets to the real makefile
all export js-config clean libs tools:
	$(MAKE) -f $(js_makefile) $(AM_MAKEFLAGS) $@

# targets required by autotools but which we don't need at all
.PHONY: dvi pdf ps info html installcheck check install uninstall
dvi pdf ps info html installcheck check install uninstall:
This more or less fixes the make part. For the autoconf part there's a similar technique in the autoconf manual involving running arbitrary configuration commands.
I first tried to treat spidermonkey as a separate project via AC_CONFIG_SUBDIRS, as explained in Configuring Other Packages in Subdirectories, but it didn't work and I forget exactly why.
The configure.ac (git) snippet is something like this: (sorry no highlighting, highlight doesn't support m4 yet)
if test x$have_mozjs = xno || test x$static_mozjs = xyes; then
  dnl run lib/javascript/configure after freej's configure, building it static
  AC_CONFIG_COMMANDS([lib/javascript/.xulrunner-subdir],
    [(cd lib/javascript && CXXFLAGS="$GLOBAL_CFLAGS $CXXFLAGS -fPIC" \
        CFLAGS="$GLOBAL_CFLAGS $CFLAGS -fPIC" \
        $ac_srcdir/configure --enable-static \
        --enable-js-static-build --disable-jit) || exit $?
    ])
fi
This will run spidermonkey's configure after the main configure; at this point config.status will also expand GNUmakefile.in into GNUmakefile with the correct variables.
The final step after all this work is of course to add the spidermonkey directory to SUBDIRS (e.g. SUBDIRS += lib/javascript); after that, make will hopefully recurse into lib/javascript and use our GNUmakefile.
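A quick way to check that everything got wired up, assuming the GNUmakefile lives in lib/javascript as the SUBDIRS entry suggests (the target name libs is taken from the proxy makefile above):

autoreconf -i && ./configure      # regenerate and run both configures
ls lib/javascript/GNUmakefile     # expanded from GNUmakefile.in by config.status
make -C lib/javascript libs       # GNU make picks GNUmakefile, which proxies to spidermonkey's Makefile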
While this is nice, it doesn't cater for the make dist family of autotools targets, which build the real distribution tarball.
To accomplish this I've used an explicit list of spidermonkey files that will end up in the tarball, and made make distdir copy only those files plus GNUmakefile.in and the list itself (.js-distfiles). This is the second part of the GNUmakefile.in above, i.e. the distdir/distfiles targets plus distclean to clean them up; see the comments in the file as well.
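To give an idea of the mechanism, this is roughly what such a distdir rule boils down to, written as plain shell; the destination path and the assumption that .js-distfiles holds one relative path per line are illustrative, not taken verbatim from the freej GNUmakefile.in:

# copy the whitelisted spidermonkey files into the distribution directory
distdir=/tmp/freej-dist/lib/javascript   # example destination, normally $(distdir)
while read -r f; do
    mkdir -p "$distdir/$(dirname "$f")"
    cp -p "$f" "$distdir/$f"
done < .js-distfiles
# ship the proxy makefile and the file list themselves as well
cp -p GNUmakefile.in .js-distfiles "$distdir/"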
I hope this is somewhat useful to people who are trying to properly (autotools-wise) ship spidermonkey in their project; comments welcome as usual.
command line message approve for mailman
Today I was dealing with an instance of GNU mailman without the web interface and there were some messages held for approval.
How to approve them from the command line? Turns out it is possible thanks to the withlist tool shipped with mailman and a bit of fiddling. Code below.
# installation:
#   copy this file in the same dir as withlist and name it approvemessage.py
#   (withlist -r approvemessage imports the module with the same name)
# usage:
#   bin/withlist -l -r approvemessage <list> <cookie>
#   cookie is the digest in the Subject: after approve

import sys

from Mailman import mm_cfg
from Mailman import Pending

# ripped out from MailList.py ProcessConfirmation
def approvemessage(ml, cookie=None):
    if cookie is None:
        print >> sys.stderr, "!!!! No cookie specified"
        return
    rec = ml.pend_confirm(cookie)
    if not rec:
        print >> sys.stderr, "!!!! Cookie not found"
        return
    try:
        op = rec[0]
        data = rec[1:]
    except ValueError:
        print >> sys.stderr, "!!!! No op found for cookie"
        return
    if op != Pending.HELD_MESSAGE:
        print >> sys.stdout, "!!!! Message isn't held"
        return
    ml.HandleRequest(data[0], mm_cfg.APPROVE)
    ml.Save()
    print >> sys.stdout, "!!!! Message approved"

# interactive version:
# bin/withlist -l admin
# Loading list admin (locked)
# The variable `m' is the admin MailList instance
# >>> from Mailman import mm_cfg
# >>> res = m.pend_confirm('862db35189d896667888114e67db8f41fa986c9f')
# >>> m.HandleRequest(res[1], mm_cfg.APPROVE)
# >>> m.Save()
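A concrete invocation then looks like this; the list name is a placeholder, the cookie is the one from the interactive session above, and the mailman tree usually lives under /usr/lib/mailman or /var/lib/mailman depending on the distribution:

cd /usr/lib/mailman   # adjust to where your mailman installation lives
bin/withlist -l -r approvemessage mylist 862db35189d896667888114e67db8f41fa986c9f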
blurb: public domain, no warranty whatsoever, use at your own risk. Comments welcome though.
cs memories
$ finger giunched
Login: giunched                         Name: Filippo Giunchedi
Directory: /home/students/giunched      Shell: /bin/bash
Office: studente Informatica (13/10/2003)
On since Tue Apr 6 10:03 (CEST) on pts/1 from adsl-ull-106-7.51-151.net24.it
Mail forwarded to "|procmail"
No mail.
Project:
Being rich and famous
Plan:
I love it when a plan comes together!
$
the end is in sight
it seems the end of the master's thesis on vde3, written together with luca, is in sight.
happy reading
welcome to ikiwiki
This was long overdue: I switched from pyblosxom to the marvelous ikiwiki and gave my website a general revamp by tweaking ikiwiki's default templates. Note: the template still needs a general cleanup; don't look at it, as it is not pretty (nor are the CSS files).
With some luck and mod_rewrite karma, nobody's feed readers or planets will be flooded.
Comments are powered by disqus.
Svalentino on a Sunday
the best thing is being able to read postsecret while it's nice and fresh
silent rules with automake 1.11
It seems that automake 1.11 made it into unstable and, among other features, that means silent rules, like the Linux kernel has had for quite some time. For the impatient, in configure.ac:
AM_INIT_AUTOMAKE
m4_ifdef([AM_SILENT_RULES], [AM_SILENT_RULES([yes])])
Then, after a round of automake/configure, the make invocation will look like:
make[3]: Entering directory `foo'
CXX source1.lo
CXX source2.lo
CCLD lib1.la
[...]
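If you need the full command lines back (say, to inspect compiler flags), silent rules can be overridden without touching configure.ac:

# show the complete compile/link commands for this invocation only
make V=1
# or make verbose output the default again at configure time
./configure --disable-silent-rules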
Credit note: the autotools mythbuster guide
mozilla ubiquity commands for debian
I'm an avid user of mozilla ubiquity but I couldn't find any commands related to debian to replace those of yubnub. So, prodded by zack, I've produced some, like bts or pts.
The command feed is located at ubiquity-commands (don't be scared by the "this-code-can-be-evil" subscription page) with the actual code maintained under ubiquity-commands.git.
Note that there's much room for improvement so it would be nice to have a common set of commands for the various debian web resources.