Stubbisms – Tony’s Weblog

July 10, 2009

Git Script to Show Largest Pack Objects and Trim Your Waist Line!

Filed under: Java — Tags: , , , , — Antony Stubbs @ 2:07 pm

This is a script I put together after migrating the Spring Modules project from CVS, using git-cvsimport (which I also had to patch, to get to work on OS X / MacPorts). I wrote it because I wanted to get rid of all the large jar files, and documentation etc, that had been put into source control. However, if _large files_ are deleted in the latest revision, then they can be hard to track down.

The script effectively side-steps this limitation: it simply goes through a list of all objects in your pack file (so try and run git gc first, so that all your objects are in your pack) and lists the largest files, showing you their information. Then, with the file locations, you can run:

# remove a tree from entire repo history
git filter-branch --index-filter "git rm -rf --cached --ignore-unmatch $files" HEAD

# pull in a repo without the junk
git pull file://$(pwd)/myGitRepo

Which will remove them from your entire history, trimming your waist line nicely! But be sure to follow the advice from the man page for filter-branch – there are things you should be aware of, such as old tags (that one got me) etc… Rather than messing around trying to get it exactly right, I actually just re-tagged the new repo by matching the dates of the tags from the initial cvsimport – there were only 9 after all!
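For the record, re-creating a tag against the rewritten history by date can be done with something along these lines (a rough sketch – the tag name and date here are placeholders):

# find the commit closest to the date of the original tag, then re-tag it
sha=$(git log -1 --before="2006-03-01" --pretty=format:%H master)
git tag -a spring-modules-0.1 -m "re-created after filter-branch" $sha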

But for reference, here is the command I’m referring to, from the git-filter-branch man page:

You really filtered all refs: use --tag-name-filter cat -- --all when calling git-filter-branch.

There’s a few different suggestions as to how to remove the loose objects from your repository, in order to _really_ make it shrink straight away, my favourite being from the man page:

git-filter-branch is often used to get rid of a subset of files, usually with some combination
of --index-filter and --subdirectory-filter. People expect the resulting repository to be
smaller than the original, but you need a few more steps to actually make it smaller, because
git tries hard not to lose your objects until you tell it to. First make sure that:

o You really removed all variants of a filename, if a blob was moved over its lifetime.
  git log --name-only --follow --all -- filename can help you find renames.

o You really filtered all refs: use --tag-name-filter cat -- --all when calling
  git-filter-branch.

Then there are two ways to get a smaller repository. A safer way is to clone, that keeps your
original intact.

o Clone it with git clone file:///path/to/repo. The clone will not have the removed objects.
  See git-clone(1). (Note that cloning with a plain path just hardlinks everything!)

Apart from the section on “are your objects _really_ loose?”, the most useful bit of information was running the git-pull command, which someone suggested in a discussion on the git mailing list. This was the only thing that actually worked for me, contrary to what it states about git-clone. However, be careful, as git pull by default doesn’t pull over all information…
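Putting the pieces together, the whole clean-up looks roughly like this (a rough sketch – $files stands for the space-separated list of paths the script below reports, and the repo names/paths are placeholders):

# rewrite history, dropping the offending paths and rewriting tags as well
git filter-branch --index-filter "git rm -rf --cached --ignore-unmatch $files" \
    --tag-name-filter cat -- --all

# safest: clone over the file:// protocol so the removed objects are left behind
git clone file:///path/to/myGitRepo myGitRepo-slim

# or, what worked for me: pull into a brand new repository
mkdir myGitRepo-slim; cd myGitRepo-slim; git init
git pull file:///path/to/myGitRepo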

And without further ado, here is the script:

#!/bin/bash
#set -x 

# Shows you the largest objects in your repo's pack file.
# Written for osx.
#
# @see https://stubbisms.wordpress.com/2009/07/10/git-script-to-show-largest-pack-objects-and-trim-your-waist-line/
# @author Antony Stubbs

# set the internal field separator to line break, so that we can iterate easily over the verify-pack output
IFS=$'\n';

# list all objects including their size, sort by size, take top 10
objects=`git verify-pack -v .git/objects/pack/pack-*.idx | grep -v chain | sort -k3nr | head`

echo "All sizes are in kB's. The pack column is the size of the object, compressed, inside the pack file."

output="size,pack,SHA,location"
for y in $objects
do
	# extract the size in bytes
	size=$((`echo $y | cut -f 5 -d ' '`/1024))
	# extract the compressed size in bytes
	compressedSize=$((`echo $y | cut -f 6 -d ' '`/1024))
	# extract the SHA
	sha=`echo $y | cut -f 1 -d ' '`
	# find the objects location in the repository tree
	other=`git rev-list --all --objects | grep $sha`
	#lineBreak=`echo -e "\n"`
	output="${output}\n${size},${compressedSize},${other}"
done

echo -e $output | column -t -s ', '

Thanks to David Underhill for the inspiration, and the various posts on the git mailing list!

For other migration tips (svn) – see here: http://fpereda.wordpress.com/2008/06/11/how-i-migrated-paludis-to-git/

P.s. if someone tries running the script on Linux or Cygwin and it needs modifying, let me know and I’ll post the modified versions all next to each other in this article.

July 8, 2009

Spring Modules Fork

Filed under: Java — Tags: , , , — Antony Stubbs @ 10:45 pm

The project website: http://wiki.github.com/astubbs/spring-modules

Inspired by my work with Spring Modules (or rather, difficulties with it), and as explained in this post on the Spring Modules forum, I am now announcing the Spring Modules Fork! An effort to bring back to life some very useful software!

I have emailed various people, and even emailed suggesting an extensions project as described on the Spring site [1], but haven’t had any responses.

As Spring Modules has been officially declared dead [2], and I’m not getting any replies to my mails, I would like to fork the project onto git-hub and get some vital patches applied – particularly in my area of interest, the cache module. I would also like to migrate the whole project to Maven.

I’m posting here to get any thoughts or suggestions regarding this – hopefully a response from someone at Spring – particularly Colin Yates.

Let’s get this useful software moving again!

Ok guys – the project is up on git-hub!

Install Git! – http://git-scm.com/

To download an anonymous repository run git clone git://github.com/astubbs/spring-modules.git or sign up with git-hub and get your personal account!

Let the patching begin!

From the Wiki page:

This is a resurrection of the extremely valuable and abandoned Spring-Modules project.

The plan is to fully embrace Maven as the build tool and eventually throw out all the old build code.

At this point, all the old jar libraries and generated documentation have been pruned from the repository history.

This pruning has reduced the size of the repository from 95m to 7m. Nice.

The idea will be to slowly add them one by one, as they are compile-ready, to the parent module section, so they can be included.

Msg/email me on antony.stubbs@gmail.com to discuss!

I will try and get a development mailing list setup asap, but for now, email me on antony.stubbs@gmail.com with the [subject spring-module-fork] and I’ll start building a manual list. (If anyone knows a free mail list setup, let me know).

One thing I want to discuss is how we can organise the issues on jira :/ I still have had no response from Spring.

P.s. I’ve also set up a mailing list now:

Mailing List

Here are the essentials:

April 2, 2009

spring-modules-ehcache and ehcache issues you should be aware of

Filed under: Java — Tags: , , , , — Antony Stubbs @ 1:56 pm

Anyone looking at or currently using spring-modules-ehcache should be aware of a couple of issues:

  1. spring-modules is no longer maintained
  2. a cache cannot be re-configured after its construction. Save yourself some headache and use the recommended ehcache.xml method instead of programmatic configuration
  3. the HashCodeCacheKey generator in spring-modules-ehcache suffers from inconsistency issues. This is a problem when setting up a distributed cache, or using a disk persistent cache (diskPersistent=”true”). A fix is described in the jira
  4. the logging system in ehcache 1.6 has changed from commons logging to JDK logging – currently contrary to what the documentation will tell you. If anyone can tell me how to get the logging working in 1.6, I would be greatly appreciative.
  5. the 0.8 and 0.8a artifacts in repo1 have broken POMs. The fix is under http://jira.springframework.org/browse/MOD-463
  6. contrary to what is available in repo1, there is a newer version available – 0.9. A patch and POM files have been posted, but you will have to install it to your company’s repo manually, with something along the lines of:

mvn install:install-file -DgroupId=org.springmodules -DartifactId=spring-modules-cache -Dversion=0.9 -Dpackaging=jar -Dfile=Downloads/spring-modules-0.9/sources/spring-modules-cache-src.zip  -Dclassifier=sources

mvn install:install-file -DgroupId=org.springmodules -DartifactId=spring-modules-cache -Dversion=0.9 -Dpackaging=jar -Dfile=Downloads/spring-modules-0.9/sources/spring-modules-cache.jar

mvn install:install-file -DgroupId=org.springmodules -DartifactId=spring-modules -Dversion=0.9 -Dpackaging=pom -Dfile=Downloads/spring-modules-0.9-maven2-poms/pom.xml

and on the parent pom spring-modules:

mvn install -N

And just for an example, an ehcache.xml with disk persistent, ever-lasting entries:

<ehcache xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:noNamespaceSchemaLocation="http://ehcache.sf.net/ehcache.xsd">

    <diskStore path="java.io.tmpdir" /> 

    <cacheManagerEventListenerFactory class="" properties="" />

    <!-- eternal used during development for web services -->
    <defaultCache
            maxElementsInMemory="10000"
            eternal="true"
            timeToIdleSeconds="120"
            timeToLiveSeconds="120"
            overflowToDisk="true"
            diskSpoolBufferSizeMB="5"
            maxElementsOnDisk="10000000"
            diskPersistent="true"
            diskExpiryThreadIntervalSeconds="120"
            memoryStoreEvictionPolicy="LRU"
            />

    <cache name="webservice"
            maxElementsInMemory="10000"
            eternal="true"
            timeToIdleSeconds="1200"
            timeToLiveSeconds="1200"
            overflowToDisk="true"
            diskSpoolBufferSizeMB="1"
            maxElementsOnDisk="10000000"
            diskPersistent="true"
            diskExpiryThreadIntervalSeconds="120"
            memoryStoreEvictionPolicy="LRU"
            />

</ehcache>

And the Spring context so you can use declarative caching:

<bean id="cacheProviderFacade" class="org.springmodules.cache.provider.ehcache.EhCacheFacade">
        <property name="cacheManager" ref="cacheManager" />
    </bean>

    <bean id="cacheManager" class="org.springframework.cache.ehcache.EhCacheManagerFactoryBean">
        <property name="configLocation" value="classpath:ehcache.xml" />
    </bean>

    <bean id="cachingAttributeSource" class="org.springmodules.cache.annotations.AnnotationCachingAttributeSource" />

    <bean id="autoproxy" class="org.springframework.aop.framework.autoproxy.DefaultAdvisorAutoProxyCreator" />

    <bean id="myKeyGenerator" class="com.componence.mubito.webservices.clients.MyCacheKeyGenerator" />

    <bean id="cachingInterceptor" class="org.springmodules.cache.interceptor.caching.MetadataCachingInterceptor">
        <property name="cacheProviderFacade" ref="cacheProviderFacade" />
        <property name="cachingAttributeSource" ref="cachingAttributeSource" />
        <property name="cachingModels">
            <props>
                <prop key="webservicesCache">cacheName=webservice</prop>
            </props>
        </property>
    </bean>

    <bean id="cachingAttributeSourceAdvisor" class="org.springmodules.cache.interceptor.caching.CachingAttributeSourceAdvisor">
        <constructor-arg ref="cachingInterceptor" />
    </bean>

And annotate the methods you want cached with:

@Cacheable(modelId = "webservicesCache")
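In other words, something along these lines (a sketch – the service class and method here are made up for illustration; only the annotation and the modelId matter, and the bean must of course be managed by Spring so the advisor can proxy it):

import org.springmodules.cache.annotations.Cacheable;

// hypothetical web service client - only the annotation is the point here
public class WeatherServiceClient {

    // modelId must match a key under cachingModels in the context above
    @Cacheable(modelId = "webservicesCache")
    public String fetchForecast(String city) {
        // expensive remote call we only want to make once per cache entry
        return callRemoteWebService(city);
    }

    private String callRemoteWebService(String city) {
        return "sunny in " + city;
    }
}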

Update!

Due to these difficulties and more, I have forked the long-dead Spring Modules project!

Check out the project page here:

http://wiki.github.com/astubbs/spring-modules

Discuss it on the google group:

groups.google.com/group/spring-modules-fork/

And submit your patches here:

http://github.com/astubbs/spring-modules/

Happy moduling!

February 18, 2009

Fighting Scala – Scala to Java List Conversion

Filed under: Scala — Tags: , , — Antony Stubbs @ 2:42 pm

Apparently this is going to be addressed in Scala 2.8, but until then there’s an annoying little detail when dealing with Java libraries (in my case, Wicket) from Scala: methods requiring java.util.List types cannot be called with scala.List types.

The offending code:

val issues = List( 1, 2, 3 )
new org.apache.wicket.markup.html.list.ListView(wicketId, issues) {...


Causes this compilation error:
error: overloaded method constructor ListView with alternatives (java.lang.String,java.util.List[T])org.apache.wicket.markup.html.list.ListView[T] (java.lang.String,org.apache.wicket.model.IModel[java.util.List[T]])org.apache.wicket.markup.html.list.ListView[T] cannot be applied to (String,List[String])

There is a collection of implicit conversion functions for going from java.util.List to scala.List in the scala.collection.jcl.Conversions object, but none to go the other way around. These functions look like:

object Conversions {

implicit def convertSet[T](set : java.util.Set[T]) = Set(set)
implicit def convertList[T](set : java.util.List[T]) = Buffer(set)
implicit def convertSortedSet[T](set : java.util.SortedSet[T]) = SortedSet(set)
implicit def convertMap[T,E](set : java.util.Map[T,E]) = Map(set)
implicit def convertSortedMap[T,E](set : java.util.SortedMap[T,E]) = SortedMap(set)

implicit def unconvertSet[T](set : SetWrapper[T]) = set.underlying
implicit def unconvertCollection[T](set : CollectionWrapper[T]) = set.underlying
implicit def unconvertList[T](set : BufferWrapper[T]) = set.underlying
implicit def unconvertSortedSet[T](set : SortedSetWrapper[T]) = set.underlying
implicit def unconvertMap[T,E](set : MapWrapper[T,E]) = set.underlying
implicit def unconvertSortedMap[T,E](set : SortedMapWrapper[T,E]) = set.underlying

}


With some friendly advice, I created the following conversion functions, including one for two-dimensional lists (lists of lists).

implicit def convertScalaListToJavaList(aList:List[String]) = java.util.Arrays.asList(aList.toArray: _*)
implicit def convertScalaListListToJavaList(aList:List[List[String]]) = java.util.Arrays.asList(aList.toArray: _*)


The obscure notation (aList.toArray: _*) is required as described in section 8.8 Repeated Parameters page 188 of the Programming in Scala book:

“This notation tells the compiler to pass each element of arr as its own argument to echo, rather than all of it as a single argument.”

and in 4.6.2 Repeated Parameters in The Scala Language Specification Version 2.7:

Furthermore, assume the definition:
def sum(args: Int*)
val xs = List(1, 2, 3)

The following application of method sum is ill-formed:
sum(xs) // ***** error: expected: Int, found: List[Int]
By contrast, the following application is well formed and yields again the result 6:
sum(xs: _*)

To cut a long story short, it is required because the asList method accepts a variable number of object arguments, but we are passing in a single list object. The : _* tells the compiler to instead pass in each element of the list individually, so that it looks like varargs. And as someone on the channel suggested, it is consistent, if obscure, notation. When you read it, keep in mind that variableName: ObjectType is the normal notation in Scala for specifying the type of a declared parameter. So instead of a concrete type, we are specifying “any” type ( ‘_’ ), “any” number of times ( ‘*’ ).
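To see the conversion in action, here’s a tiny self-contained example (printSize is just a stand-in for any Java API wanting a java.util.List – in the Wicket case it would be the ListView constructor):

implicit def convertScalaListToJavaList(aList: List[String]) =
  java.util.Arrays.asList(aList.toArray: _*)

// a stand-in for a Java API such as the ListView constructor
def printSize(list: java.util.List[String]) = println(list.size)

val issues = List("one", "two", "three")
printSize(issues) // the compiler inserts convertScalaListToJavaList(issues)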

Not so bad eh? ;P

December 25, 2008

Aaaaand we’re done.

Filed under: General, Java — Tags: , , , , , , , — Antony Stubbs @ 4:09 am

Just under 4 months ago, I left New Zealand, heading for The Netherlands (Holland) to take up an invitation to work on Portal technology at Componence. This has taken me to Armin van Buuren in Belgium, Lviv and Kiev in Ukraine, Munich and the Oktoberfest (thanks Stefan! I am sooo coming back next year) and a trip to Oslo, Norway to visit an old university friend.

Oh and in the Netherlands I found an 80,000 EURO TV. Nice.

That's 80,000 EURO$, 194,025 NZ$, 112,000 US$. I think the ones sitting next to it there are 40" TVs - they look small eh?

On my way over to Europe, I went through San Francisco which was really cool – while I was there I snapped some fan-boy pics:

Motorcycle tour of the Google campus

And now, here I am, in Malmö, Sweden, enjoying the beginnings of my holiday with old family friends from Australia. Like a flash, my contract is over. It all stemmed from my seemingly innocent comment on a blog from a core member of the Wicket team. Just goes to show, if you open yourself to opportunity and put yourself out there a bit, you never know what will happen.

I had a few posts planned regarding this, but this has been a very busy trip with lots of new aspects of life to get used to. Now that my ‘proof’ is complete, I shall re-balance things a bit and get a whole lot of draft posts out onto the blog.

I left my job at IBM to follow my passions…

I left my job at IBM in New Zealand before I found this contract. There are a few reasons why I left, but they are pretty much summed up by saying that I felt I could progress in my career at a pace more suited to my liking if I left. One of those aspects was working overseas. Buy me a beer and I’ll tell you the whole story.

What I really want to do is work on new technology, not simply the application of it.

Ever since I was young I wanted to work on satellites at NASA which is not really the typical childhood dream. But working on Play Station 4 would suffice 😉

My Work on Wicket Portlets

One major aspect of my work here was completing the Portlet 2.0 specification (JSR 286) implementation in Wicket. Wicket of course being one of the top web development frameworks today and a pleasure to work with. All the work sits under the Wicket issue WICKET-1620.

JavaDoc

When I came on the scene, Portlet 1.0 support was complete, with the help of Apache Portals Bridges, and work on Wicket resource serving support was mostly complete. The first daunting task, however, was completing the documentation for nearly the entire existing Portlet support in Wicket. This was a big learning experience and a deep look into Wicket internals (which, by the way, made me a bit nervous about my upcoming event implementation task). This was mostly completed in WICKET-1875, with a lot more additional javadoc included in my patches for WICKET-1620.

Events

It was pretty much decided that I wasn’t going to try and do anything at this stage with the public render parameters part of the spec, and was going to focus on the Events system. This was a much bigger task than any of the other parts of the Portlet 2.0 spec, and a lot more complicated than I expected.

However, in the end, I think I came up with a pretty nice solution. After some false starts, I ended up implementing with custom

  • WebRequestCycleProcessor
  • AbstractBehavior
  • BehaviorRequestTarget

aptly named

  • PorletWebRequestProcessor
  • AbstractPortletEventListenerBehaviour
  • PortletEventRequestTarget

This really fits into Wicket superbly well, a testament to the sophisticated, yet reasonably straightforward (considering the problem domain) extensibility of Wicket. Bravo.

The patches are sitting with the issue, and now that I have some extra time, I will massage them some more and I’m confident that they will be in Wicket 1.5. Included with the patches is also a simple example application.

Known issues with the implementation at this stage are:

  1. Issues around events in response to events
  2. Resource URLs being generated for links, in some cases, instead of Action URLs.

On my second to last day at Componence, I gave a presentation to the company on the work I had done. Posted below are two small sections of that presentation (Keynote is lots of fun) which show the gist of it.

Basic flow of code through the events sub-system

Triggering an event is merely a call into the Portlet API and letting the hosting Portal take care of the rest.

The basic steps for receiving a Portlet event are as follows:

  1. Wicket receives an event request – this is like a normal HTTP request (or an Action request in Portlet speak – so to speak 😉)
  2. Wicket calls into the custom (registered) PorletWebRequestProcessor.
  3. The PorletWebRequestProcessor checks the context is an Event Request – otherwise the processing is delegated to the parent.
  4. The Request Target is resolved to the custom PortletEventRequestTarget.
  5. During normal Wicket request processing, at the appropriate stage, Wicket calls into our PortletEventRequestTarget via the processEvents method (from IEventProcessor).
  6. Inside this method call, we search through the page’s component tree looking for behaviours (those extending AbstractPortletEventListenerBehaviour) attached to components. The search performed is a depth-first recursive search, sketched after this list. I have a niggling feeling that this could perhaps be improved somehow.
  7. When found, the event name is checked to see if it matches the name of event the behaviour is registered as wanting to subscribe to.
  8. If they match, we call into the behaviour, passing in the Event.
  9. Hey presto!
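To make step 6 a bit more concrete, here is a tiny, self-contained sketch of that depth-first search. To be clear, this is not the Wicket/patch code – Component, PortletEventBehaviour and Event below are simplified stand-ins – it just shows the shape of the traversal and the name check:

import java.util.ArrayList;
import java.util.List;

// simplified stand-ins for the real Wicket types used in the patch
class Event {
    final String name;
    Event(String name) { this.name = name; }
}

abstract class PortletEventBehaviour {
    abstract String subscribedEventName();   // the event name this behaviour listens for
    abstract void onEvent(Event event);      // step 8: the event is passed into the behaviour
}

class Component {
    final List<Component> children = new ArrayList<Component>();
    final List<PortletEventBehaviour> behaviours = new ArrayList<PortletEventBehaviour>();
}

class EventDispatcher {
    // depth-first recursive search of the component tree (step 6),
    // matching each behaviour's subscribed name against the incoming event (step 7)
    void dispatch(Component component, Event event) {
        for (PortletEventBehaviour behaviour : component.behaviours) {
            if (behaviour.subscribedEventName().equals(event.name)) {
                behaviour.onEvent(event);
            }
        }
        for (Component child : component.children) {
            dispatch(child, event);
        }
    }
}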

In hindsight it all seems a little simple now 😉 But of course I have come out the other end a much wiser and more learned man – gotta love the opportunity to learn from Open Source! So my current view is, of course, biased.


Code example of using the events API

I’m very proud of the work I’ve done on Wicket and am very excited to see it make it into trunk. As I said before, the next step is to polish up the patch and remove extra unneeded code (it’s pretty big), so that the core guys can review it properly.

A snapshot of the end of my talk at Componence about Wicket et al

What’s Next

I’m meeting up with my girlfriend soon, on the 28th of December in Amsterdam, after which we’re off to Paris for New Year’s, then tripping around Europe for January.

During that time, I am sure we will find ourselves with some time to relax, and so I thought that time might be useful to finish the several draft articles I have waiting in my blog queue (19!). They are all nagging at the back of my mind…

Hopefully this stint will lead on to more exciting work, and maybe even some more presenting and teaching – I am still eager to make anyone who wants to learn sit through my Scala presentation ;), even if I haven’t used Scala since I wrote the thing!

I still have aspirations of consulting, teaching, traveling, experiencing and working in other parts of the world and of working at Sony Computer Entertainment of America. But not to worry – they are still in the queue.

Until next time – peace out.

The crazy Kiwi.

Update

Good news – as of my latest patch submission, “Issues around events in response to events” has been fixed!

The example application now includes two links – one that triggers an event, and one that triggers an event which will in turn trigger another event within the same request cycle.

August 28, 2008

RE: Maven2 – it can be quite good

Filed under: Java — Tags: , , , , , — Antony Stubbs @ 10:44 pm

This dude has comments disabled on his post, and I just wanted to correct him on something.

“as long as you use the standard Maven directory structure” and
“Standard directory structure” section

That isn’t correct. All you have to do is configure your custom source directory location, test source directory and any resource directories, and all things Maven will work just as if you were using the standard convention.

Maven also does a whole lot more you haven’t mentioned here, so just to refer people: http://maven.apache.org/maven-features.html

Not to mention the hordes of plugins that are coming out.

Some of my take on the subject:
https://stubbisms.wordpress.com/2008/08/28/maven-is-to-ant-as-a-nail-gun-is-to-hammer-and-nails-you-need-to-move-on/

update

For Google’s sake, this is a reply to a blog post by Les Hazlewood regarding the issue of grafting Maven onto legacy projects – legacy meaning that they weren’t laid out with Maven in mind.

My reply to his post wasn’t going to come out right so here it is:

Hi Guys,

Good comments!

Here is the Maven POM you might be looking for, to build your sample app. I think its output is what you’re after, but I couldn’t check for sure as I had difficulty with your Ant build (fingers crossed this posts ok) 😉

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>org.jsecurity</groupId>
    <artifactId>jsecurity-spring-hibernate-sample</artifactId>
    <packaging>war</packaging>
    <version>0.9-SNAPSHOT</version>
    <build>
        <sourceDirectory>src</sourceDirectory>
        <plugins>
            <plugin>
                <artifactId>maven-compiler-plugin</artifactId>
                <configuration>
                    <source>1.5</source>
                    <target>1.5</target>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-war-plugin</artifactId>
                <configuration>
                    <webResources>
                        <resource>
                            <directory>WEB-INF</directory>
                            <targetPath>WEB-INF</targetPath>
                        </resource>
                    </webResources>
                </configuration>
            </plugin>
        </plugins>
    </build>
    <dependencies>
        <dependency>
            <artifactId>hibernate</artifactId>
            <groupId>org.hibernate</groupId>
            <version>3.2.6.ga</version>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring</artifactId>
            <version>2.5.5</version>
            <type>jar</type>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>org.jsecurity</groupId>
            <artifactId>jsecurity</artifactId>
            <version>0.9.0-snapshot</version>
        </dependency>
        <dependency>
            <groupId>org.jsecurity</groupId>
            <artifactId>jsecurity-support</artifactId>
            <version>0.9.0-snapshot</version>
        </dependency>
        <dependency>
            <groupId>hsqldb</groupId>
            <artifactId>hsqldb</artifactId>
            <version>1.8.0.7</version>
        </dependency>
        <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
            <version>1.2.14</version>
        </dependency>
    </dependencies>
</project>

I also had to add a POM to the ‘support-spring’ module:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">

    <modelVersion>4.0.0</modelVersion>

    <groupId>org.jsecurity</groupId>
    <artifactId>jsecurity-support</artifactId>
    <version>0.9.0-snapshot</version>
    <name>JSecurity-support</name>
    <url>http://www.jsecurity.org</url>

    <dependencies>
        <dependency>
            <groupId>commons-beanutils</groupId>
            <artifactId>commons-beanutils</artifactId>
            <version>1.7.0</version>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring</artifactId>
            <version>2.5.5</version>
            <type>jar</type>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>org.jsecurity</groupId>
            <artifactId>jsecurity</artifactId>
            <version>0.9.0-snapshot</version>
        </dependency>
        <dependency>
            <groupId>javax.servlet</groupId>
            <artifactId>servlet-api</artifactId>
            <version>2.5</version>
        </dependency>
        <dependency>
            <groupId>org.easymock</groupId>
            <artifactId>easymock</artifactId>
            <version>2.3</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.3.1</version>
            <scope>test</scope>
        </dependency>
    </dependencies>

    <build>
        <!-- non-standard source locations -->
        <sourceDirectory>${basedir}/src</sourceDirectory>
        <testSourceDirectory>${basedir}/test</testSourceDirectory>
        <pluginManagement>
            <plugins>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-compiler-plugin</artifactId>
                    <version>2.0.2</version>
                    <configuration>
                        <source>${maven.compile.source}</source>
                        <target>${maven.compile.target}</target>
                        <encoding>${jsecurity.encoding}</encoding>
                    </configuration>
                </plugin>
            </plugins>
        </pluginManagement>
        <plugins>
            <plugin>
                <!-- generate the IntelliJ project files -->
                <artifactId>maven-idea-plugin</artifactId>
                <configuration>
                    <jdkLevel>${maven.compile.source}</jdkLevel>
                    <downloadSources>true</downloadSources>
                </configuration>
            </plugin>
            <plugin>
                <!-- generate the Eclipse project files -->
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-eclipse-plugin</artifactId>
                <configuration>
                    <downloadSources>true</downloadSources>
                    <downloadJavadocs>false</downloadJavadocs>
                </configuration>
            </plugin>
        </plugins>
    </build>
    <properties>
        <!-- Default configuration for compiler source and target JVM -->
        <maven.compile.source>1.5</maven.compile.source>
        <maven.compile.target>1.5</maven.compile.target>
        <!--
            Encoding of Java source files: Make sure, that the compiler
            and the javadoc generator use the right encoding.
            Subprojects may overwrite this, if they are using another
            encoding.
        -->
        <jsecurity.encoding>iso-8859-1</jsecurity.encoding>
        <jsecurity.docEncoding>${jsecurity.encoding}</jsecurity.docEncoding>

    </properties>

</project>

Note that most of this would be cut down with the use of a parent POM, which I haven’t used here.

Note that you will also have to have jsecurity and the support module either ‘installed’ in your local repo (mvn install) or in your workspace in Eclipse when using m2eclipse.

The resultant structure looks like this:

Let me know if you have any queries…

Also if you add:

<plugin>
    <groupId>org.mortbay.jetty</groupId>
    <artifactId>maven-jetty-plugin</artifactId>
</plugin>

to the plugins section of your pom, you can then run the application in place using jetty with:

mvn jetty:run-war

Maven is to Ant as a Nail Gun is to Hammer and Nails – you need to move on.

Filed under: Java — Tags: , , , , , — Antony Stubbs @ 6:02 pm

No, wait, “Maven is to Ant as Linux From Scratch! is to Ubuntu“…

No, no, hold up – “Maven is to Ant as the Vidalia Slice Wizard is to the Potato Peeler“….

I think Maven is great.

I discovered something really neat-o the other day. The definition of the word Maven:

“an expert or connoisseur.”

Maven is Just That(tm). You don’t have to know anything about building*, you leave it to the Maven. The Expert. The Connoisseur you might say…

It knows how to do so much for you. It has many, many tricks up its sleeves, and people are teaching it more and more every day! Plus Ant seems to be considered effectively complete and is not under much active development. The leaders in the game are moving on to greener pastures.

Dependencies in Control

The main complaint people have is the online thing – that, reasonably, you pretty much have to be online for Maven to work reliably.

One fix for this is to host your own Maven repo inside your project directory and commit it to version control. This is effectively the same as having a /libs dir and sticking all your jars in there, except you get all the benefits of Maven’s dependency management. Great!

Here’s an example

  1. run mvn -Dmdep.useRepositoryLayout=true -Dmdep.copyPom=true dependency:copy-dependencies
    This creates /target/dependencies with a repo-like layout of all your project's dependencies
  2. Copy /target/dependencies to something like /libs
  3. Add to pom.xml the location of repository like so:
    <project>
    ...
    	<repositories>
    		<repository>
    			<releases />
    			<id>local</id>
    			<name>local</name>
    			<url>file:///${basedir}/libs</url>
    		</repository>
    	</repositories>
    </project>
    NB: ${basedir} is required otherwise Maven complains about not having an absolute path.
  4. commit!
  5. Note – you can also do this for a multi module project, and have all modules share the same common repository. You should just make sure you run the copy-dependencies command from your parent pom.
  6. You can also install your custom dependencies, such as Oracle DB drivers, into such a repository using the install-file goal:
    mvn install:install-file -Dfile=your-artifact-1.0.jar \
                             [-DpomFile=your-pom.xml] \
                             [-DgroupId=org.some.group] \
                             [-DartifactId=your-artifact] \
                             [-Dversion=1.0] \
                             [-Dpackaging=jar] \
                             [-Dclassifier=sources] \
                             [-DgeneratePom=true]
                             [-DcreateChecksum=true]

    NB: the easier way to use the install-file goal is to not use it: try and build your project, then Maven will complain about the missing dependency and prompt you with instructions on how to use install-file, with nearly all the options pre-filled according to the dependency description in your POM.

Taking Maven completely Off-Line

The other thing people talk a lot about is other build tools that are better than Ant, e.g. Gant, Raven (Ruby build scripts for Java) or some other one I heard of recently – ah-ha! found it – Gosling (I’m sure I read a recent article about it somewhere, that pointed to a newer website?) that lets you write your build in Java and then just wraps Ant! Bizarro.

Another one is that people are concerned about their build tool changing itself over time, and so their build is not necessarily stable and reproducible. I for one love the idea of self upgrading software – hey, it’s one step closer to the end of the world right? Well I for one welcome our build tool overlords. But seriously, I think that the advantage of having the build tool upgrade itself and getting the latest bug fixes and feature updates, outweighs the disadvantage of the build breaking one day. So what? – You take an hour out, or half a day, or even a day, and fix it!

But, if you work in a stiff, rigid environment, or work for NASA or the military or something, then there’s a way around this as well. This is also the suggested best practice for dealing with plugins. (OK, I remember reading this information somewhere, but it was harder to find again than I thought.)

  1. Run mvn help:effective-pom -Doutput=effective.pom – this produces a list of the plugin versions your project is currently using.
  2. Open effective.pom and copy the build -> pluginManagement section into your POM (see the sketch after this list), optionally deleting the configuration and just keeping the groupId, artifactId and version.
  3. Make sure your project packages, to test you got the pluginManagement right.
  4. Rename your local repository to repository.bak
  5. Run mvn dependency:go-offline – this will download all plugins and their dependencies for your project, into a clean repository.
  6. Move the repository into  your project directory.
  7. Add the project repository to your POM as described above.
  8. Try running your mvn package with the --offline option and make sure everything’s OK.
  9. Rename your backup from repository.bak back to repository.
  10. Commit.
  11. Done! You should be able to now build the project off of a fresh checkout and an empty repository.
  12. If you’ve gone this far, you may as well also commit the version of Maven you’re using into your source control as well, in a directory such as /tools/maven.
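For reference, the bit you end up pasting into your POM in step 2 looks something like this (a trimmed-down sketch – the plugin versions are only illustrative, use whatever your effective.pom reports):

<build>
    <pluginManagement>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>2.0.2</version>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.4.3</version>
            </plugin>
        </plugins>
    </pluginManagement>
</build>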

***

When MDEP-177 is addressed, this will be much easier to do.

On another note, as far as running off-line is concerned, the other really neat-o thing you should definitely do if there are more than two of you on location (or if you’re keen to share snapshots easily), is set up a local Maven repository cache/proxy/mirror using Nexus. IMO, don’t bother trying Artifactory or the other one, Nexus is da’ bomb.

Ant, Ivy and Transition

What I don’t think these people seem to appreciate is that the beauty of Maven is that you don’t write any build logic**! All these other tools don’t really address this problem! Making Ant easier to write still means you have to write Ant! Yuck! As far as I’m concerned, our job is to further the state of the art of technology, and this means effectively achieving more while doing less! Maven is exactly that.

Ivy is all well and good, but Maven is just so much more. In fact, since Maven does everything Ivy does (to a degree), and Maven can be used from Ant (i.e. so you can integrate Maven’s dependency management into your Ant build instead of using Ivy), I would propose that the Ivy developers stop working on Ivy now, and try to bring to Maven’s dependency management system whatever it was they thought they could do better with Ivy. A little competition never hurt anyone though…

If you don’t want to adopt Maven outright (i.e. if you have a very large project), then using the Maven Ant tasks you could use Maven for your dependency management, instead of doing it all manually with Ant, or with Ivy. This is what the JBoss Application Server team have done, btw – you can see their source code for hints on how to get started.
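For example, with the maven-ant-tasks jar on Ant’s classpath (ant -lib, as shown further down), letting Maven resolve your dependencies from an Ant build looks roughly like this (a sketch – the dependency chosen is arbitrary):

<project name="legacy-build" default="compile"
         xmlns:artifact="antlib:org.apache.maven.artifact.ant">

    <!-- let Maven resolve the jars and expose them as an Ant path -->
    <artifact:dependencies pathId="compile.classpath">
        <dependency groupId="org.springframework" artifactId="spring" version="2.5.5" />
    </artifact:dependencies>

    <target name="compile">
        <javac srcdir="src" destdir="build/classes" classpathref="compile.classpath" />
    </target>
</project>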

Or, you can do as I have done on my last project at IBM, start using Maven entirely for the components you are working on, and integrate the output of it into your legacy Ant build.

How to Integrate Maven into a Legacy Ant multi-component Build

I first created a POM that modelled the non-Maven component that my Maven components wanted to rely on; then you add this target to your Ant build, to get it to install the POM and its artifacts into the local repository. That way the other Maven-built components don’t need to think about Ant at all.

<!-- Maven ant task -->
<import file="../common.xml" />

<target name="maven-repo-install" depends="install-parent-pom">
	<artifact:dependencies settingsFile="../tools/maven/conf/settings.xml" />
<property name="M2_HOME" value="../tools/maven" />

	<artifact:localRepository id="local.repository" path="c:/repository" layout="default" />

	<!-- install main pom -->
	<artifact:pom id="pom.es" file="pom.xml">
		<localRepository refid="local.repository" />
	</artifact:pom>

	<artifact:install>
		<localRepository refid="local.repository" />
<pom refid="pom.es" />
	</artifact:install>
</target>

The gotcha with this is that when you call Ant on this build.xml, you need to add the maven-ant-tasks jar to its classpath. You can also make a .bat or .sh out of this:

REM // add maven-ant-tasks ant targets
ant -lib ..\tools\maven-ant-tasks-lib %*

In order to make sure your Ant-built project’s Maven twin has access to its parent if it needs it, add this step as a dependency of your installation:

<!-- setup maven parent -->
<target name="install-parent-pom">
	<!-- install pom -->
	<maven basedir="../" goal="install" mvnargs="-N" />
</target>

In order to easily call a Maven command from an Ant script, I pilfered this and modified it, from the JBoss Application Server build script – thanks guys!:

<?xml version="1.0" encoding="UTF-8"?>
<project name="common-ant-tasks">

	<!-- maven execution target definition -->
	<macrodef name="maven">
		<attribute name="goal" />
		<attribute name="basedir" />
		<attribute name="mvnArgs" default="" />
		<!--
			programRoot should point to the root directory of the oasis project
			structure. I.e. the directory which contains setup and tools.
		-->
		<attribute name="project.root" default="${basedir}/../" />
		<element name="args" implicit="true" optional="true" />
		<sequential>
			<!-- maven location -->
<property name="maven.dir" value="@{project.root}/tools/maven" />
<property name="mvn" value="${maven.dir}/bin/mvn.bat" />
<property name="thirdparty.maven.opts" value="" />
			<!-- check mvn exists -->
			<available file="${mvn}" property="isFileAvail" />
			<fail unless="isFileAvail" message="Maven not found here ${mvn}!" />

			<!-- call maven -->
			<echo message="Calling mvn command located in ${maven.dir}" />
			<echo message=" - from dir: @{basedir}" />
			<echo message=" - running Maven goals: @{goal}" />
			<echo message=" - running Maven arguments: @{mvnArgs}" />
			<java classname="org.codehaus.classworlds.Launcher" fork="true"
				dir="@{basedir}" resultproperty="maven.result">
				<classpath>
					<fileset dir="${maven.dir}/boot">
						<include name="*.jar" />
					</fileset>
					<fileset dir="${maven.dir}/lib">
						<include name="*.jar" />
					</fileset>
					<fileset dir="${maven.dir}/bin">
						<include name="*.*" />
					</fileset>
				</classpath>
				<sysproperty key="classworlds.conf" value="${maven.dir}/bin/m2.conf" />
				<sysproperty key="maven.home" value="${maven.dir}" />
				<arg line="--batch-mode ${thirdparty.maven.opts} -ff @{goal} @{mvnArgs}" />
			</java>
			<!-- check maven return result -->
			<fail message="Unable to build Maven goals. See Maven output for details.">
				<condition>
					<not>
						<equals arg1="${maven.result}" arg2="0" />
					</not>
				</condition>
			</fail>
		</sequential>
	</macrodef>
</project>

And if you have other legacy Ant-built components that want to be able to blindly use Ant to build their dependencies, and you aren’t using Maven Ant tasks in that project, you can wrap the dependent Maven-built project’s Maven build using this, which uses the above Ant macro definition:

<?xml version="1.0" encoding="UTF-8"?>
<!-- Redirects to Maven to build -->
<project name="anAntBuiltProject" default="install" basedir=".">

	<!-- Maven ant task -->
	<import file="../common.xml"/>

	<target name="install">
		<maven basedir="${basedir}" goal="install -N"/>
	</target>

</project>

Conclusion

The future’s got to go somewhere, and Maven is a huge innovation in the build department, and is definitely a big leap in the right direction.

In fact, I wouldn’t be surprised if someone migrates the Ant build to Maven 😉 – that’s a joke btw.

*** I haven’t actually tried this yet 😉 Let me know how you get along if you give it a go.

** This is the best case scenario, and for most projects I think it’s true. However, if you get yourself into funky corner cases, sometimes you gotta throw in some plugin-Fu or some embedded Ant-Fu to get things just the way you want. Although, I have only had to do this when dealing with the headache of grafting Maven onto legacy projects which didn’t have standard build procedures in mind.

* OK yes, but you have to know how to use Maven. But Maven is a lot easier to use in the long run than Ant, for sure.

July 27, 2008

It’s time to let go of Commons Logging – Long Live SLF4J!

Filed under: Java — Tags: , , , , , — Antony Stubbs @ 2:04 am

I support the complete abandonment of Commons Logging in favour of Simple Logging Facade 4 Java (SLF4J). Especially in libraries that use it – I think it’s amazing that Spring still links to it in their libraries!

For those of you who are blissfully unaware of the hazards of the run-time binding that commons logging uses, and have been lucky enough not to have run into it yet – educate yourselves.

“Commons-logging promises to bridge to different logging APIs such as log4j, Avalon logkit and java.util.logging API. However, it’s dynamic discovery mechanism is the source of painful bugs.”

Apparently it’s his fault. Well mostly, as he admits:

“I’ll come right out and admit it: commons-logging, at least in its initial form, was my fault, though probably not mine alone.”

This guy deserves a medal. He freely hosts a great service for Maven users (yay! go Maven! (and Ivy and Gradle for that matter)) that lets you do away with Commons Logging in your project easily.

The summary of the solution is basically:

<repositories>
    <repository>
        <id>Version99</id>
        <name>Version 99 Does Not Exist Maven repository</name>
        <layout>default</layout>
        <url>http://no-commons-logging.zapto.org/mvn2</url>
    </repository>
</repositories>
<dependencies>
    <!-- get empty jar instead of commons-logging -->
    <dependency>
        <groupId>commons-logging</groupId>
        <artifactId>commons-logging</artifactId>
        <version>99.0-does-not-exist</version>
    </dependency>
</dependencies>

LogBack, as mentioned in my previous post, along with SLF4J, is a step into the recent present for logging technology (as opposed to living in the pre-historic times of commons logging).
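If you want to go the whole way, the dependency set looks something like the following (a sketch – the version numbers are only illustrative, use whatever is current): slf4j-api as the facade, jcl-over-slf4j to route any library still coded against commons logging onto SLF4J, and LogBack as the backend.

<dependencies>
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-api</artifactId>
        <version>1.5.2</version>
    </dependency>
    <!-- routes libraries still coded against commons-logging onto SLF4J -->
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>jcl-over-slf4j</artifactId>
        <version>1.5.2</version>
    </dependency>
    <!-- the actual logging backend -->
    <dependency>
        <groupId>ch.qos.logback</groupId>
        <artifactId>logback-classic</artifactId>
        <version>0.9.9</version>
    </dependency>
</dependencies>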

It’s really amazing that something as seemingly simple as logging causes such a nightmare for so many people!

Update

Maven issue MNG-1977 will address this issue nicely by letting you globally exclude all commons-logging dependencies across all transitive dependencies in one statement.

Please do your neighbourhood developer a favour, and vote for it 🙂

June 9, 2008

RE: I’d love to quit my job! (sort of)

Filed under: Java — Tags: , , — Antony Stubbs @ 2:09 am

I really admire this guy – he is proposing to do, and appears as though he really is going to do, exactly what I have been thinking about doing for quite a while (but without the travel) – except he’s even got people to donate!

He is actually taking 6 months time out from doing contract programming to focus full time on open source projects. You can read Gregory Brown’s article over here.

Oh and Gregory, if you have an actual blog you’re still running – comment and let me know – I’d love to follow along…

P.s. Congratulations Obama!

P.p.s. I have 16 draft blog posts! I am going to set a day every week to finish writing some of these posts. I will start with the simpler ones first 🙂

February 22, 2008

My foray into the world of Scala

Filed under: Java, Scala — Tags: , , , , — Antony Stubbs @ 1:09 pm

For some reason, I suddenly felt like playing around with Scala for a couple of days, and having gotten over my perceived difficulty of the language vs Groovy, and after actually trying to write something in it – I really like it 🙂

At first glance, advanced functional programming in Scala can look a little freaky to someone who’s only been writing Java for the last few years. But if you start slowly, it all slides into place. I started to get into it by reading this really good series of articles introducing the language.

What follows are two examples of Scala. The first, LoveGame, is a demonstration of programming a simple algorithm in Scala, along with a little comparison with Java. The second is a little toying around I did with Scala to create a front end for JScience with the “Pimp my library” pattern.

The Loooove Game

I’m teaching a course to our new hires in Melbourne at the moment and one of their assignments was to write a program called “Love Game” which implemented a simple algorithm to calculate the compatibility between two people based on the letters in their names. I’m sure you all remember doing this as a kid?

Well, in the weekend, I couldn’t help but start thinking about how the algorithm could be expressed so much better. So I thought this was the motivation to give Scala another shot.

The algorithm in java looks like this:

// The two strings being compared
String bothNames = "roger federer maria sharapova";
String compWord = "loves";

// Creates 2 character arrays that will be compared
char bothNamesArray[] = bothNames.toCharArray();
char compWordArray[] = compWord.toCharArray();

// Creates an Integer array for storing count results
Integer tallyArray[] = new Integer[compWord.length()];

int tallyArrayPointer = 0;
int matchCounter = 0;

/*
 * Counts the number of times each character in compWordArray also
 * appears in bothNamesArray
 */
while (tallyArrayPointer < compWord.length()) {
    for (int i = 0; i < bothNames.length(); i++) {
        char nameLetter = bothNamesArray[i];
        char compLetter = compWordArray[tallyArrayPointer];

        if (nameLetter == compLetter) {
            matchCounter++;
        }

        tallyArray[tallyArrayPointer] = matchCounter;
    }

    matchCounter = 0;
    tallyArrayPointer++;
}

int tallyCounter;
int totalAdder;

/*
 * Calculates the compatibility percentage by adding consecutive
 * numbers in tallyArray together.
 */
while (tallyArray[2] != -1) {
    tallyCounter = 0;

    while ((tallyCounter < tallyArray.length - 1)
            && (tallyArray[tallyCounter + 1] != -1)) {
        totalAdder = tallyArray[tallyCounter]
                + tallyArray[tallyCounter + 1];
        tallyArray[tallyCounter] = totalAdder;
        tallyCounter++;
    }

    tallyArray[tallyCounter] = -1;
}
int finalPercentage = (tallyArray[0] + tallyArray[1]) * 2;

// Displays the compatibility percentage
System.out.println("\nCalculated compatibility = " + finalPercentage + " %");

NB: Yes, we could re-write the Java implementation using recursion also, but it doesn’t fit as nicely as it does into the Scala solution, and the Java code would still require most of its verbosity.

It works in two steps:

  1. Counts the frequency of occurrence of the letters in the word love, in the full names
  2. Takes this list of frequencies and reduces it down to one number by adding up pairs of values in the list, creating a new list, then repeating until there’s one value left. That value is then multiplied by two, and that’s your result.
This is what the Scala code to do the same thing looks like:

I can't stop sitting here and staring at how awesomely terse the following Scala code really is...


val names = "roger federer maria sharapova"
val initialList = "loves".toList map( x => names.toList count(x == _ ))

def loveReduce(numbers:List[Int]):Int = numbers match {
  case head :: Nil => head * 2
  case _  => loveReduce(numbers zip(numbers tail) map {case (a, b) => a+b})
}

// Displays the compatibility percentage
println("Compatibility = " + loveReduce(initialList) + " %" )

As you can see, the algorithm can be written in far fewer lines of code, and is far simpler to understand, especially the second stage. This means it’s much easier to see that it is correct. Ironically, the Java code above actually fails in the case where the “verb” (loves) is replaced with a smaller word like “is”. There may also be other cases where it fails. However, the Scala code works perfectly!

Step One

"loves".map( x => names.count( y => x == y ))
NB: toList() has been removed for clarity.

The first step is to find our list of frequencies.

List.map()

What the map function does is apply a given function F(x) to a list of elements, returning a new list, such that each element is now the result of F(x).

For example:

Consider the array

[1,2,3,4]

Applying map(F(x)) results in:

[F(1),F(2),F(3),F(4)].

If we imagine the function we pass is F(x) = x+1, then the returned array would look like:

[2,3,4,5]

In our code, we have defined our function F(x) to be the number of times our x appears in the String names. We do that by using the count() function.

List.count()

The count function counts the number of elements in a list which satisfy a given condition. We pass that ‘condition’ in as a function. In our case, we are breaking up the word loves into individual letters, so we want to count how many times that letter occurs in the names string.

By using the count function, we iterate through the names string, letter by letter, counting the times a given letter in the name – y – equals the letter we access from our outer loop – x. We then perform the same operation, but with the next letter in the word loves.

In our code above, you can see we are passing the function x == y ( x == _ is shorthand). Again, x is a letter in the word loves and y is a letter in the names. Note that this one line has O(n^2) complexity.

Step Two

def loveReduce(numbers:List[Int]):Int = numbers match {
  case head :: Nil => head * 2
  case _  => loveReduce(numbers.zip(numbers.tail).map{case (a, b) => a+b})
}

Ignore the method signature at this point; let’s explain the algorithm. The second part of the algorithm deals with the list of frequencies. For the example “Roger Federer loves Maria Sharapova”, the frequency count for the letters in loves is: List(0, 2, 1, 4, 1). We then add up each pair of numbers to create a new list, e.g. List(2, 3, 5, 5), and repeat until we are left with a single element, which we multiply by 2. To express this very tersely in Scala, recursion is involved. We could re-write the Java implementation using recursion also, but it doesn’t fit as nicely as it does into the Scala solution, and the Java code would still require most of its verbosity.

The list gets reduced as so:
List(0, 2, 1, 4, 1)
List(2, 3, 5, 5)
List(5, 8, 10)
List(13, 18)
List(31)

Apart from the pattern matching (we’ll get to that soon), most of this method is similar to how you would do it in Java. The really interesting Scala part of this example is this line:

numbers.zip(numbers.tail).map{case (a, b) => a+b}

What this is doing is constructing a list of tuples (pretty much two-element arrays) by combining two lists, and then applying another map function, the function being adding together the two elements in the tuple. numbers.tail returns the list without its head (without its first element). So we are effectively calling: [ 1 , 2 , 3 , 4 ] . zip( [ 2 , 3 , 4 ] ). The result of this call is a list which looks like:

List((1,2), (2,3), (3,4))

It is one element shorter than the original list because the zip() is defined such that when zipping together lists of different length, remaining elements in the longer list are ignored.

We then apply the map() function to this list of tuples, with the function F( case (a,b) => a+b ). The case statement uses Scala pattern matching to destructure each tuple. The result of this is now a list of the tuple elements added together, one element shorter:

List(3, 5, 7)

Then we simply recursively call the loveReduce method again on the new list.

numbers match {
  case head :: Nil => head * 2
  case _  => loveReduce(...)
}

This is similar in nature to Java’s switch statement, but far, far more powerful. It uses a powerful concept called pattern matching and is not simply restricted to ints or enums. In our example, we are performing a match on our numbers list, where our first case is the end of the recursion: the case where the parameter passed in as numbers matches a list (Scala knows it’s a list) which has only a head element and nothing else (one element in the list). The statement which would match a list with two or more elements would be “head :: tail”. The reason we don’t need to return from a case is that in Scala, unlike Java, cases do not fall through into each other, so to end our recursion we simply return the head element, multiplied by two, as per our LoveGame algorithm. The second case uses the universal operator “_”, which is used in several places in Scala, but used consistently. This is similar to the “default” case in Java. In this case, we want to perform our zip and map, and make our recursive call again. And to briefly explain the method signature of loveReduce:

def loveReduce(numbers:List[Int]):Int

In Scala, everything is an object, even functions. Above we define the loveReduce function. It takes one parameter called numbers of type List[Int]. As you can see, Scala supports type parameters, just like Java’s Generics. Usually, in Scala, return types of a method can be inferred by its type inference system. However, there are cases where the return type of the method has to be specified, and this is one (recursive methods need an explicit return type). But don’t worry – the compiler will tell you when you need to! Here we simply define that the return type of the method is Int. On a side note – in Scala, everything is an object, even simple ints. So unlike in Java, where auto-boxing was introduced in 1.5, Scala treats all simple numbers as objects.

Engineering

That is the compact version of the code. Originally I wrote it in a more engineered way, in order to learn more Scala, and also so that I could learn more about implicit conversions.

/**
 * LoveGame algorithm. Uses a recursive function to reduce the frequency
 * list to the final result.
 */
class LoveGame(verb:String) extends MyConversions {

  /** Reduces a list of numbers down to a single number by continually
   *  adding up consecutive pairs until there is only one element left,
   *  and multiplies it by two. */
  def loveReduce(numbers:List[Int]):Int = numbers match {
    case head :: Nil => head * 2
    case _  => loveReduce(numbers.zip(numbers.tail).map{case (a, b) => a+b})
  }

  /** Runs the LoveGame on the words in the String names based on
   *  the class parameter verb.
   */
  def compute(names:String):Int = {
    val initialList = verb.toLowerCase.countOccurances(names.toLowerCase)
    loveReduce(initialList)
  }
}

/**
 * A collection of implicit conversions used in this component.
 */
trait MyConversions {
  implicit def string2MyRichString(str:String) = new MyRichString(str)
}

/**
 * Pimps out the String class with extra methods.
 */
class MyRichString(str:String) {
  /** Counts the frequency with which each letter of str occurs in _haystack_, returning a list. */
  def countOccurances(haystack:String) =
    str.toList.map( x => haystack.toList.count(x == _ ))
}

/**
  * Bootstrap for the LoveGame program. Should be replaced by a xUnit test.
  */
object LoveGameProgram extends Application {
  val names = Array( "Roger","Federer","Maria","Sharapova" ).mkString( " " ) // loves - 62%
  val loveGame = new LoveGame( "loves" )
  println(names)
  println(  "Compatibility = " + loveGame.compute(names) + " %" )
}

Update:

Inspired by my friend’s C# code below, here’s the love game in Scala again, but using the while-loop approach in 4 lines (i.e. minus pattern matching, recursion and functions):

  var loves = "loves".toList map( x => "roger federer maria sharapova".toList count(x == _ ))
  while (loves.length > 1)
    loves = loves zip(loves tail) map {case (a, b) => a+b}
  println( "3 line compatibility = " + loves.head * 2 + " % " )

Pimp My Library

In honor of this JScience and Groovy example, here’s something in Scala going for the same sort of thing – extrapolate out, use your imagination.

For those of you who don’t know what JScience is: “JScience is a comprehensive Java library for the scientific community.

Its vision is to create synergy between all sciences (mathematics, physics, sociology, biology, astronomy, economics, geography, history, etc.) by integrating them into a single architecture.”

It is also going to be in Java soon, as the Reference Implementation (RI) for JSR-275: javax.measure.* (Latest Draft Specification).

It allows you to do stuff like this in Java:

Measure length = Measure.valueOf(50, SI.CENTI(SI.METER)).plus(Measure.valueOf(25.0, SI.METER));
Measure lengthInCentimeters = length.to(SI.CENTI(SI.METER));
System.out.println("length in centimeters is " + lengthInCentimeters.getEstimatedValue());

But that’s pretty verbose, wouldn’t you rather just write:

var len = 50.centimeters + 25.meters

println("length in centimeters is " + len)

Well, below is the start of writing a wrapper for the JScience Library using Scala’s implicit conversions.

println(2.miles.to.meters)

class MyInt(quantity: Int) {
  def miles() = new Mile(quantity)
  def mile() = new Mile(quantity)
}

abstract class Measure {
  def to() = new Converter(this)
  def toMeters(): Double
}

class Mile(quantity: Int) extends Measure {
  def conversionFactorToMeters: Double = 1.609344 * 1000
  override def toMeters() = quantity * conversionFactorToMeters
}

class Feet(quantity: Int) extends Measure {
  def conversionFactorToMeters = 0.3048
  override def toMeters() = quantity * conversionFactorToMeters
}

class Converter(quantity: Measure) {
  def meters() = quantity.toMeters()
}

implicit def numberToMile(number: Int) = new MyInt(number)

How powerful is that? Implicit conversions allow the “Pimp my library” pattern, which can be really powerful. I certainly know it would have been really useful in my last project! P.S. If anyone knows anything about Scala code highlighting in hosted WordPress, drop me a line! The sourcecode tag doesn’t support it.
