Monday, December 13, 2010

scala specs functional matchers

I was reading and wondered if new matchers could be defined more tersely than:

So here it is, with a couple of implicits:

The example shows how to promote a function to a matcher. I haven't yet found how to define
a function literal with call-by-name semantics, but for simple matchers where call-by-value semantics are acceptable this trick works fine.
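Since the original snippet is missing above, here is a sketch of what such an implicit promotion might look like, assuming the specs 1.x `Matcher[T]` API (an `apply(v: => T)` method returning a (success, ok-message, ko-message) triple); the names are illustrative, not the exact code:

```scala
// Hypothetical sketch: promote a plain predicate (T => Boolean) into a specs Matcher[T].
// Note the call-by-value limitation: the value is evaluated once, eagerly.
implicit def functionToMatcher[T](f: T => Boolean): Matcher[T] = new Matcher[T] {
  def apply(v: => T) = {
    val value = v  // evaluated here: call-by-value
    (f(value), value + " matches", value + " doesn't match")
  }
}

// usage: the implicit kicks in wherever a Matcher[T] is expected, e.g.
// "abc" must ((s: String) => s.startsWith("a"))
```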

Sunday, October 10, 2010

Haml macro for clojure

I made a small project which implements a clojure macro that reads a haml file and generates hiccup markup.

The parser is more or less compatible with the ruby haml parser, except that it embeds clojure expressions instead of ruby.

An example:

        - (for [side-name ["left" "right" ]])

Wednesday, August 4, 2010


This is a trac plugin which automatically adds the anonymous reporter email into the user's session.

This is useful when you want to run a helpdesk trac instance that is open to anonymous users. They put their email in the "reporter" field, pass a captcha test, and
submit tickets. The problem is that whenever they want to comment or file another ticket, they are required to type their email over and over again, unless they
go to 'preferences' and set their session's email address, which is unlikely to happen for 'normal' users.

This plugin simply copies the reporter email address into the session's email field when an anonymous user creates a ticket and the session doesn't already have an email address.
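The core logic is tiny. Here is a sketch of it as a standalone function (the real plugin hooks into trac's ticket-creation events and session API, so the function name and arguments here are illustrative, not trac's actual interface):

```python
# Hypothetical sketch of the plugin's core logic; in the real plugin this runs
# when trac notifies us that a ticket was created.
def remember_reporter_email(session, authname, reporter_email):
    """Copy the reporter's email into an anonymous session, unless one is already set."""
    if authname == "anonymous" and reporter_email and not session.get("email"):
        session["email"] = reporter_email
        return True   # session updated: no need to retype the email next time
    return False      # authenticated user, empty email, or email already stored
```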

Monday, May 10, 2010

nvidia auto display

I've created a little shell script that saves me from constantly fighting with nvidia-settings to switch the monitor resolution by hand on my MacBook Pro 5,3 running ubuntu linux.

  • you plug in your external screen and it automatically switches to it, changing the resolution and resizing the desktop
  • useful on laptops with an nvidia graphics card and the nvidia proprietary drivers
  • behaves more or less like MacOS X
  • also works after standby

Hope you find it useful.

Wednesday, April 21, 2010

Scala 2.8 Maven and continuous compilation

Scala looks very interesting, and I wanted to give akka a try, hoping to be able to push erlang-style practices into a JVM-oriented workplace.

One of the key features I want is to be able to use jrebel to hot-swap running scala code during development. Another important requirement is to run scala 2.8.0 (beta or rc1), because akka 0.8.x requires it and I don't want to suffer later from migration issues; I hope it's safe, since by the time I've learned scala, 2.8.x will probably be mainstream.

I invested some time learning sbt (it looks really promising and fast), but in the end I didn't feel comfortable using it because of a couple of issues with unit test detection and dependency management. So for now, let's start with maven.

But with maven I had to run:

MAVEN_OPTS="$MAVEN_OPTS -noverify -javaagent:/home/marko/bin/ZeroTurnaround/JRebel/jrebel.jar" mvn scala:console

because the documented method doesn't work (the scala maven plugin doesn't fork a new virtual machine, see issue ...)

Ok, now whenever my scala sources are recompiled I can see hot updates in my scala console! Great stuff.

Eclipse can recompile each source file as soon as it's saved but, unfortunately, the eclipse scala plugin for 2.8 (I tried both the snapshot and -rc1) is not very stable and there are tons of issues. They will be fixed soon, but I want to start focusing on scala now and reduce the impact of immature tooling.

So I had to delegate this continuous recompilation to command line tools and use my programmer's editor of choice.

Both sbt and maven support continuous compilation. Sbt worked great out of the box with "~ compile", but I preferred being able to use only maven to do the job.

Maven scala:cc goal has some issues with the compilation daemon:

[INFO] [scala:cc {execution: default-cli}]
[INFO] Checking for multiple versions of scala
[INFO] use fsc for compilation
[INFO] stop server(s)...
[No compilation server running.]
[INFO] start server...
[INFO] wait for files to compile...
[INFO] /home/marko/tmp/scala/scalatest/src/main/scala:-1: info: compiling
It turned out that this happens because there is an executable /usr/bin/fsc left over from the ubuntu scala 2.7.5 installation. If I remove it:

[INFO] [scala:cc {execution: default-cli}]
[INFO] Checking for multiple versions of scala
[INFO] use fsc for compilation
[INFO] stop server(s)...
[No compilation server running.]
[INFO] start server...
[INFO] wait for files to compile...
[INFO] /home/marko/tmp/scala/scalatest/src/main/scala:-1: info: compiling
[INFO] Compiling 1 source files to /home/marko/tmp/scala/scalatest/target/classes at 1271863673692
[INFO] Cannot start compilation daemon.
[INFO] tried command: List(scala,
[INFO] prepare-compile in 0 s
[INFO] compile in 0 s
[INFO] wait for files to compile...

In the meantime I tried to start my own fsc daemon (see

java -cp ... 

and it worked, and it's also incredibly fast at compiling (mvn scala:cc -Dfsc=false, on the other hand, works but is very slow: it takes 3-4 seconds to compile).

So, basically, all we need is to provide an executable called "scala", which the maven scala plugin tries to run (note the "tried command: List(scala," line in the log above).
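A minimal wrapper can be generated like this. This is a sketch, not the exact script I used: SCALA_HOME, the jar layout, and the JAVA_CMD override (handy for dry-running the wrapper) are all assumptions; the plugin supplies the real arguments at invocation time:

```shell
# Generate a hypothetical "scala" wrapper for the maven scala plugin to invoke:
# it forwards whatever arguments it receives to a JVM with the scala 2.8 jars
# on the classpath.
cat > ./scala <<'EOF'
#!/bin/sh
SCALA_HOME="${SCALA_HOME:-$HOME/opt/scala-2.8.0}"
exec "${JAVA_CMD:-java}" \
  -cp "$SCALA_HOME/lib/scala-library.jar:$SCALA_HOME/lib/scala-compiler.jar" \
  scala.tools.nsc.MainGenericRunner "$@"
EOF
chmod +x ./scala
# then put it on the PATH ahead of any system scala, e.g.
# export PATH="$PWD:$PATH"
```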

Now maven scala:cc works fine, and spawns a compile server as needed.

Next, I will try to do some mixed java/scala development, thrift stubs building, subprojects etc.

Tuesday, April 13, 2010

reverse proxy overriding content-type

I had some links to images hosted on http servers which reported application/octet-stream instead of an 'image' mime type.

My application relied on the content type, and I had to find a quick fix: put a reverse proxy in front of that particular server and override the content type.

I tried, with no luck, to do it with apache2, but it appears there is no way to set the content-type response header after it's generated by the mod_proxy module.

So I tried nginx, but it turns out that even nginx cannot override headers, only add new ones.

However, the nginx docs point to a 3rd-party module. Unfortunately the ubuntu nginx package doesn't have this module compiled in, so I had to recompile nginx with the module built in. It turned out to be a simple operation.

I managed to reuse the same ubuntu nginx configuration files by installing the custom nginx in /opt/nginx, symlinking ubuntu's /etc/nginx to /opt/nginx/conf, and changing the /etc/init.d/nginx script to point to /opt/nginx/sbin/nginx instead of /usr/sbin/nginx.

Then I simply added this line to my 'server' section:

   more_set_headers 'Content-Type: image/jpeg';
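In context, the relevant part of the config looks more or less like this (a sketch: the server name and upstream address are placeholders; only the more_set_headers line is the actual fix, and it requires the 3rd-party module compiled in above):

```nginx
server {
    listen 80;
    server_name images.example.com;          # placeholder

    location / {
        # forward to the server that reports application/octet-stream
        proxy_pass http://upstream-image-host;   # placeholder

        # override (not just add) the response header
        more_set_headers 'Content-Type: image/jpeg';
    }
}
```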

Wednesday, March 24, 2010

Deploying a Git subdirectory in Capistrano

Many suggest applying a patch to capistrano but for me this isn't an option.

I found this solution which works:


Perhaps this can be refactored into a vendor plugin recipe.

Monday, March 15, 2010

apache CXF and .NET

I had some trouble finding out why the .NET (3.5 RC1) svcutil wasn't able to invoke my java CXF web service. I hope this helps you.

The error was:

exception = {"There was an error reflecting 'return'."} InnerException = {"The Form property may not be 'Unqualified' when an explicit Namespace property is present."}

It turned out that (besides the mandatory namespaces) it was also caused by the fact that svcutil doesn't like arrays/lists encoded as:

<xs:element form="qualified" maxOccurs="unbounded" minOccurs="0" name="linkedGroups" type="tns:group"/>

but it requires a nested element:

<xs:element form="qualified" minOccurs="0" name="linkedGroups">
  <xs:complexType>
    <xs:sequence>
      <xs:element form="qualified" maxOccurs="unbounded" minOccurs="0" name="item" type="tns:group"/>
    </xs:sequence>
  </xs:complexType>
</xs:element>

This can be accomplished with the following JAXB annotations:

@XmlElementWrapper(namespace = "", name = "linkedGroups")
@XmlElement(namespace = "", name = "item")
public List<Profile> getLinkedGroups() {
  return linkedGroups;
}
This blog helped me.

Monday, March 1, 2010

lazy clojure couchdb

a lazy, paged clojure couchdb interface:

Tuesday, January 26, 2010

Clojure lazy sequence

I want to share my experience with lazy chunked sequences of clojure 1.1 and threads.

I wrote some code that used 'futures' to execute parallel IO, potentially in a large number of background threads (about 200).

I noticed that my code wasn't behaving as I expected, and I wondered whether clojure executed the futures in some unbounded thread pool, or whether it had some fixed maximum. So I tried it out:

(map deref (map #(future-call (fn [] (Thread/sleep 1000) %)) (range 20))) 

this actually created 20 threads (I saw it with jconsole), and returned to the REPL within 1 second. So far so good. However:

(map deref (map #(future (Thread/sleep 1000) %) (range 200))) 

this took about 6 times longer to execute. I also noticed a strange behaviour in thread creation: new threads were only created in chunks of 32.

Well, now it seems obvious to me, but I hadn't realized that I had stumbled upon the new clojure chunked lazy evaluation feature. The correct code is:

(map deref (doall (map #(future (Thread/sleep 1000) %) (range 200))))

Without the doall, only the first 32 futures are evaluated and actually submitted to the cached thread-pool executor that sits behind the "future-call" core function.

In order to avoid this kind of error in the future, I created a simple helper:

(defn future-map [f seq]
  (doall (map #(future-call (fn [] (f %))) seq)))

to be used as:

 (future-map do-something asequence)

This sounds like 'pmap' (parallel map), but AFAIK 'pmap' was intended as a performance enhancement, and as such it tries to use a reasonable number of threads in order to exploit the available CPUs: a different kind of requirement from what I needed.

Friday, January 8, 2010


I attempted to port the Haskell monadic parsing library Parsec to clojure.

For now it's very basic, but it already works for me.