Wednesday, December 13, 2006


Why are even the simple things sometimes hard? Attempting to get an element attribute. Here is a table of what the various browsers return when trying to get the class attribute, having already found an element:
var element = document.getElementById("theSpan");
in a document containing the HTML:
<span id="theSpan" class="something">
Browser      | element.getAttribute("class") | element.getAttribute("CLASS") | element.getAttribute("className") | element.attributes["class"] | element.className
Firefox 2.0  | something | something | null      | [object Attr] | something
Opera 9.0    | something | something | null      | [object Attr] | something
Safari 1.3.2 | something | something | null      | undefined     | something
IE 5.5 (Mac) | null      | null      | something | something     | something
IE 7.0       | null      | null      | something | [object]      | something

Updated: Thanks to anonymous for pointing out that you can just use element.className to get the class name used and it seems to work correctly across all browsers I tested.

It seems that there is no simple way to get the class attribute across browsers. The solution seems to be to get the class attribute and, if it is undefined, try the className attribute. Most other attributes should work fine; it is just IE's dislike of the class attribute that throws a spanner in the works. Here is the quick testing page.

Tuesday, December 12, 2006

Label Based Feeds

When I was at the Sakai conference I was asked if I wanted my blog added to Sakai Planet. The thing is that I blog about more than just Sakai, so only the Sakai posts should end up in the planet. After a little googling I discovered that you can have feeds for labels, and it seems that my sakai feed works. I wonder why Google doesn't make this the default RSS feed when viewing all the posts for a tag?

Tuesday, November 28, 2006

Marketing through

I have had an account on for a while now and I send most of my music listening to it. A few days ago I got a friend invite from TheHijacker. Looking at the profile, it seems this is the profile of a former member of Dodgy called Nigel Clark, who decided to send me a friend invite because I was one of the top listeners to Dodgy in the last week (25 listens). A day later I also got a private message from him saying I should visit his website. I believe he is trying to promote his new album 21st Century Man. To be honest I don't mind this sort of promotion much, as it is obvious that I will quite happily listen to Dodgy and so will probably like his new album. This sort of promotion also isn't automated at the moment, so he obviously feels it is worth his time sending me the message. However, I signed up because it allows me to find other music rather than have it pushed on me. Although I'm guessing that Nigel Clark hasn't yet reached a big enough critical mass on to be recommended to me, as other people aren't yet listening to it.

Friday, November 03, 2006

Sakai Site.equals()

Took a little while to figure this one out, but in Sakai the Site equals() method will return true when passed a String if it matches the ID of the site. The reason they do this is so that they can save a list of site IDs, have a collection of sites, and call siteList.removeAll(stringList);. This can be seen in action in the CharonPortal where it is used to hide sites that the user isn't interested in.
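A minimal sketch of this pattern (illustrative only, not Sakai's actual Site class) shows why removeAll() works with a plain list of String IDs:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Illustrative sketch only -- not Sakai's actual Site implementation.
// It just demonstrates an equals() that matches a plain String ID.
class Site {
    private final String id;

    Site(String id) { = id;
    }

    public String getId() {
        return id;
    }

    @Override
    public boolean equals(Object obj) {
        if (obj instanceof Site) {
            return id.equals(((Site) obj).getId());
        }
        if (obj instanceof String) {
            return id.equals(obj); // the surprising String case
        }
        return false;
    }

    @Override
    public int hashCode() {
        return id.hashCode();
    }
}

class SiteEqualsDemo {
    public static void main(String[] args) {
        List<Site> sites = new ArrayList<>(Arrays.asList(
                new Site("site-1"), new Site("site-2"), new Site("site-3")));

        // removeAll() calls Site.equals(String) for each element,
        // so sites can be removed using a plain list of String IDs.
        sites.removeAll(Arrays.asList("site-2"));
        System.out.println(sites.size()); // prints 2
    }
}
```

Note that this equals() is asymmetric (a String never equals a Site), which breaks the usual equals() contract and is presumably why it catches people out.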

Wednesday, November 01, 2006

Starting Rhythmbox in the Notification Area

I normally have Rhythmbox running on my desktop to provide me with music. However, normally when you start Rhythmbox it displays its window, and as I added Rhythmbox to the list of programs to automatically start when I login, I wanted it in the notification area (tray). The trick is to start Rhythmbox with the command:

rhythmbox-client --hide

Now if only gaim had some way to do the same thing.

Friday, October 27, 2006

Tetra & Bodington "Review"

Michael Feldstein has written a couple of very good posts about the Bodington VLE following the Tetra announcement. His posts aren't a "review" of Bodington but more of an exploration of how Bodington does access control, which is often oversimplified in other tools.
Here at Oxford we use WebAuth for web based Single Sign On (SSO) and when looking at tools we like to see how easily they can be WebAuth enabled. Thankfully Sakai makes it all pretty easy. Sakai has the ability to support two authentication methods at the same time so we can have a login route for both Oxford users (WebAuth) and for internal Sakai users.

Previously we have used the Apache WebAuth module, but this time I decided to attempt to use the Java WebAuth Filter developed as part of the SIPE project here at Oxford. One note: the Stanford WebAuth download pages have a newer version of the Java WebAuth filter than the SIPE pages. The Java WebAuth implementation only works with Java 1.5.

The first thing is to configure Kerberos on the machine so that it points to the correct servers, on my Linux box at Oxford this means having a configuration file called /etc/krb5.conf containing:

[libdefaults]
default_realm = OX.AC.UK

OX.AC.UK = {
kdc =
kdc =
kdc =
admin_server =

= OX.AC.UK
= OX.AC.UK

I also needed the Kerberos tools, which on Ubuntu come as part of the krb5-user package (sudo apt-get install krb5-user). Once my machine was running with Kerberos I needed to visit the systems development team and get a Kerberos principal; this involved saying hello to the nice people upstairs and then typing a password for my new principal (buckett/itss). This principal then had rights over my webauth principal (webauth/ so that I could create a keytab for it. I got a keytab with the commands:

buckett@oucs-matthewb:~ $ kadmin -p buckett/itss
Authenticating as principal buckett/itss with password.
Password for buckett/itss@OX.AC.UK:
kadmin: ktadd -k /home/buckett/.webauth.keytab webauth/
Entry for principal webauth/ with kvno 4, encryption type Triple DES cbc mode with HMAC/sha1 added to keytab WRFILE:/home/buckett/.webauth.keytab.
Entry for principal webauth/ with kvno 4, encryption type DES cbc mode with CRC-32 added to keytab WRFILE:/home/buckett/.webauth.keytab.

This gives me a keytab file that allows the WebAuth filter to authenticate with the Kerberos server without having to ask me for a password every time I start it up. I also need to create a keyring file.

touch /home/buckett/.webauth.keyring

and set the permissions on both of the files to be as restrictive as possible (chmod 400 ~/.webauth.*).

Then the Sakai login tool needs to be changed to include the WebAuth filter.
This means dropping all the JARs from the WebAuth distribution (bcprov-jdk15-132.jar, commons-httpclient-3.0.jar, commons-logging-api-1.0.4.jar, commons-codec-1.3.jar, commons-logging-1.0.4.jar, webauth-java-1.2.jar) into the WEB-INF/lib folder.
The web.xml then needs some extra sections added (in bold).

  <description>Sakai 2 sample tools: login</description>

  <!-- Webauth Filter Start -->
   <filter-name>Webauth Filter</filter-name>
 <!-- Webauth filter end -->





   <!-- Webauth Filter Mapping Start -->
       <filter-name>Webauth Filter</filter-name>
   <!-- Webauth Filter Mapping End -->


You will probably need to change the WebAuth filter configuration to point to the correct keyring/keytab files for your local installation.
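For reference, a servlet filter declaration in web.xml generally follows the shape below. The class name and parameter names here are placeholders, not the real WebAuth filter values (which are elided above); check the WebAuth distribution for the actual ones:

```xml
<!-- Hypothetical filter declaration: the filter class and init-param
     names are placeholders, not the actual WebAuth configuration. -->
<filter>
  <filter-name>Webauth Filter</filter-name>
  <filter-class>com.example.SomeFilter</filter-class>
  <init-param>
    <param-name>keytab</param-name>
    <param-value>/home/buckett/.webauth.keytab</param-value>
  </init-param>
  <init-param>
    <param-name>keyring</param-name>
    <param-value>/home/buckett/.webauth.keyring</param-value>
  </init-param>
</filter>

<filter-mapping>
  <filter-name>Webauth Filter</filter-name>
  <url-pattern>/*</url-pattern>
</filter-mapping>
```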

Then just edit your Sakai properties file, making sure these are set.


# to include the user id and password for login on the gateway site

# to let the container handle login or not (set to true for single-signon type setups, false for just internal login)


Now if you start up Sakai it should provide you with two login buttons in the top right of the portal: one that uses WebAuth and one that uses the internal Sakai authentication.

Wednesday, October 25, 2006

Sakai and Log4J

In my digging around with Sakai I came across the fact that the Sakai logging JARs are deployed to tomcat/common; the extra JARs deployed there are sakai-util-log-dev.jar and log4j-1.2.8.jar. Placing these files in the common classloader means that they are available to both the deployed applications and the servlet container (Tomcat) itself.

Now, often log4j along with the commons-logging JARs are placed into the common classloader so that Tomcat (5.5) will use log4j for its own internal logging. If no other log4j JARs exist in any of the other classloaders, all logging will go through the same log4j class and configuration. Applications are still free to provide their own log4j implementation along with configuration, which should be insulated from the container's log4j. This is because the common classloader is checked last (after the webapp and shared ones).

Sakai would only need to deploy its logging code to common if it wanted to control the logging of the container as well as its own logging. Some Sakai log messages may end up in the container logs if any of the Sakai tools log against the servlet context (getServletContext().log(String)). I don't think there is any harm to Sakai in placing the JARs in shared; all that happens is you can't control the container logging. To control the container logging correctly there would also need to be a full commons-logging in the common classloader, so I'm not convinced that this setup works to control the container logging. As a side note, Tomcat ships with a commons-logging-api JAR, but this doesn't provide the full logging framework, just enough to get going. Helpful web documents include:
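For context, pointing Tomcat 5.5's internal logging at log4j is usually done by dropping log4j.jar and the full commons-logging.jar into common/lib and a file like the following into common/classes. This is a sketch based on the standard Tomcat 5.5 logging setup, not Sakai's shipped configuration:

```properties
# Sketch of a Tomcat 5.5 container logging config
# (common/classes/ -- illustrative only.
log4j.rootLogger=INFO, R
log4j.appender.R.File=${catalina.home}/logs/tomcat.log
log4j.appender.R.layout.ConversionPattern=%p %t %c - %m%n
```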

Friday, October 20, 2006

Sakai Classloaders

I am beginning to do a little bit of work with Sakai and was trying to deal with uploaded files from a web form when inside a tool. Now Sakai uses commons-fileupload to handle uploaded files; it parses them automatically and adds the results back to the request as attributes. I was then trying to access this attribute in my tool with a line:

FileItem fileItem = (FileItem)request.getAttribute("file");

This surprisingly was giving me a ClassCastException complaining that it was unable to cast a DefaultFileItem to a FileItem, despite the fact that DefaultFileItem implements the FileItem interface. After a little head scratching (Alexis suggested classloader issues) we found the problem. Sakai parses the uploaded files in one classloader (the portal, getting commons-fileupload from shared), which then hands control off to the tool in another classloader (the tool, done by dispatching the request across servlet contexts). The problem was that both classloaders had a copy of the commons-fileupload JAR, so when the FileItem class was loaded it wasn't in the same classloader as DefaultFileItem and so couldn't be cast. Removing the copy of commons-fileupload from the tool fixed the problem. This issue would have been a little easier to debug if the ClassCastException message had included the IDs of the classloaders that the two classes had come from.

Globalsign and Java (again)

It seems that I keep having to contact servers with Globalsign certificates and have Java throw a wobbly. There are a whole load of Globalsign certificates under and this time I needed the Server.cacert one. Now my JDK has two extra Globalsign root certificates installed...

Thursday, October 05, 2006

Plusnet LLU Disconnection

Well it seems that Plusnet have finally sorted out the disconnects on the LLU platform, as I have been connected for almost 16 hours without a disconnect. The whole issue has been badly dealt with, but hey. In other news, my exchange has been ADSL2+ enabled, as my router now syncs at: Bandwidth (Up/Down) [kbps/kbps]: 619 / 17,509. In speed tests I don't get anywhere near 17Mbps, but this is probably due to still having 802.11b (11Mbps) wireless clients. I should really attach a PC directly to the router and see how it does.

Saturday, September 02, 2006

Plusnet Disconnects Continue

Well I am still getting disconnected every so often from Plusnet; this has now been going on for more than a week and is becoming beyond a joke. For example, today I have already been disconnected 18 times. Maybe it's time to look for a new ISP.

Friday, September 01, 2006

Plusnet LLU (Tiscali)

I currently have my ADSL provided by Plusnet, and they currently have a problem with their LLU provider (Tiscali) in that authentication is failing. One useful bit of information in this posting is the test username for LLU lines, as the standard BT username (bt_test@startup_domain) doesn't work. In other Plusnet news, my LLU unbundled line is currently getting random disconnections every few hours, which is very annoying when having Skype calls. Generally I am very unimpressed with the LLU line, but am not sure I want the hassle of trying to switch back to a BT one.

Thursday, August 31, 2006

Blogger Beta

When I logged into Blogger this morning they offered me the option of migrating to the new Blogger Beta, so I accepted. So far everything seems OK, although my old template (Rounders 4) was very slightly broken (the image didn't fill the header) so I have switched to a new minimalist theme. Everything else should be pretty much the same.

Tuesday, August 22, 2006

jmap and heap dumps on 1.4.2

At work we have what might be a memory leak on our production server (it is running the Sun JDK 1.4.2 Update 12 on a Linux server). It seemed that we might be able to use jmap, which has been backported to 1.4.2, to get a heap dump. When I tried this on my desktop I kept getting the error:

buckett@oucs-matthewb:~ $ jmap
Exception in thread "main" java.lang.NoClassDefFoundError: sun/jvm/hotspot/tools/JMap

Initially I thought that I had a problem with JAVA_HOMEs and PATHs, but it turns out that Sun only backported jmap for Solaris (Windows is no luckier), yet they continue to ship everyone the binary just to tempt them. The other option is -Xrunhprof, but as we didn't originally start the JVM with this option there doesn't seem to be a way to get a heap dump without restarting the service. gcore could give us a 1.5Gb core file for the process, but there doesn't seem to be a nice way to process it under 1.4. Maybe it is time to switch to 1.5?

xserver-xorg-core upgrade broken

This morning I installed a new version of xserver-xorg-core on my Ubuntu desktop, and so did my colleague Alexis. He then restarted his computer only to find that X11 (the Windowing System) would not start and was giving an error of:

(EE) No devices detected.
Fatal server error: no screens found

There is a bug report in the Ubuntu bug database about this. The easiest way to fix it is to download the previous version of the package and install it. The packages are in and I have a tinyurl to the i386 binary, version 1.0.2-0ubuntu10.1, as 1.0.2-0ubuntu10.3 is broken. Here is how to downgrade, the stuff I type is in bold:

Ubuntu 6.06.1 LTS oucs-matthewb tty1
oucs-matthewb login: buckett
Password:
Last login: Tue Aug 22 10:00:09 2006 on pts/0
Linux oucs-matthewb 2.6.15-26-686 #1 SMP PREEMPT Thu Aug 3 03:13:28 UTC 2006 i686 GNU/Linux
The programs included with the Ubuntu system are free software; the exact distribution terms for each program are described in the individual files in /usr/share/doc/*/copyright.
Ubuntu comes with ABSOLUTELY NO WARRANTY, to the extent permitted by applicable law.
No mail.
1 failure since last login. Last was Tue 22 Aug 2006 10:01:12 BST on pts/0.
buckett@oucs-matthewb:~ $ cd /tmp
buckett@oucs-matthewb:/tmp $ wget
--10:01:34-- => `hshot'
Resolving, Connecting to||:80... connected.
HTTP request sent, awaiting response... 301 Moved Permanently
Location: [following]
--10:01:34-- => `xserver-xorg-core_1.0.2-0ubuntu10.1_i386.deb'
Connecting to||:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 3,529,260 (3.4M) [application/x-debian-package]
100%[====================================>] 3,529,260 9.78M/s
10:01:34 (9.78 MB/s) - `xserver-xorg-core_1.0.2-0ubuntu10.1_i386.deb' saved [3529260/3529260]
buckett@oucs-matthewb:/tmp $ sudo dpkg -i xserver-xorg-core_1.0.2-0ubuntu10.1_i386.deb
dpkg - warning: downgrading xserver-xorg-core from 1.0.2-0ubuntu10.3 to 1.0.2-0ubuntu10.1.
(Reading database ... 107659 files and directories currently installed.)
Preparing to replace xserver-xorg-core 1:1.0.2-0ubuntu10.3 (using xserver-xorg-core_1.0.2-0ubuntu10.1_i386.deb) ...
Unpacking replacement xserver-xorg-core ...
Setting up xserver-xorg-core (1.0.2-0ubuntu10.1) ...

Maybe this might help someone. You will continue to be prompted to upgrade to the broken version, but hopefully Ubuntu will soon produce a fixed version and you should be able to skip the broken one. The way to check what version you are upgrading to is to use the Update Manager (in the Administration menu).

Monday, August 07, 2006

Speedtouch and NAT Loopback

My Speedtouch 780WL has an option called NAT Loopback that allows you to access the external IP address from inside your home network. This is useful because if you run a server inside your network you can use the same IP address (and so hostname) to address the server both at home and when you are outside. To enable NAT Loopback, login to the command line (through telnet) and enter:

ip config natloopback=enabled
saveall

Then it should all work (maybe after a reboot). The only gotcha is that it doesn't seem to loop back ICMP packets, so you won't be able to ping the external IP.

Friday, August 04, 2006

Permissions in a Tree

One of our users of WebLearn here at Oxford pointed out one of the current problems we have with the permission-based model Bodington uses. Suppose you have a course that contains a couple of pigeon holes (drop boxes). To allow students to use the pigeon holes you need to grant them upload rights. Now rather than granting the students upload rights to each pigeon hole, it would be useful if you could just grant the permission once. The obvious solution is to have the pigeon holes inherit permissions from the containing course and then grant the permissions on the course. This way you only have to manage the permissions in one place, which reduces the administration and the chances of mistakes being made. The problem is that upload permissions in the course allow students to upload content to the course container, which you probably don't want. This is what happens when you reuse permissions in different tools for slightly different things. The ways around this are either to have a special permission for pigeon hole submission which doesn't mean anything in the course container, or to have the idea of roles, where a student role means different things in different locations. As for which solution we will move towards, I'm not sure.

Tuesday, August 01, 2006

Gem of CSS

I found this little gem of a CSS definition the other day: .red{ color:red; } which just made me shudder. At least the formatting is good.

Reloading Log4j

How To Re-read is a nice short post on how to re-read a log4j properties file. On our Java project we were calling PropertyConfigurator.configure(filename); but never calling LogManager.resetConfiguration();, which meant that if you removed a logging property from the configuration and reloaded it, it carried on logging. It is a shame that the log4j APIs are so badly documented, LogManager being an example.

Monday, July 31, 2006

Busy Weekend

Had a busy weekend, on Saturday we went to Gemma and James's Wedding which was magical and then visited old neighbours Beth, Daniel and Benji on the way home. Along the way we took a few photos. I need another weekend now.

Enabling Ping on Speedtouch WL780

By default you can't ping the Speedtouch WL780 from the internet. I'm not paranoid about security, so I wanted to enable it. To do this, telnet to the router and login, then execute the command:

service system ifadd name=PING_RESPONDER group=wan

This should enable ICMP from the internet, which then means you can use a service such as l8nc to monitor your internet connection. I currently have l8nc graphing my connection.

Thursday, July 27, 2006

MaxDSL and Poor Router

Recently we had been getting a lot of disconnects on our ADSL line, with the router (Safecom SWAMRU-54108) failing to reconnect and requiring a reboot or even being powered off for a short while. After trying to fix it unsuccessfully, and with it being out of warranty, I decided to get a new router. I opted for a SpeedTouch 780WL from DSL Source, which is an ADSL router, wireless access point and voice over IP gateway all in one. They are currently selling it for 75 pounds with free delivery and it arrived the next day. After a very easy configuration it was connecting at 8,157kbps down and 636kbps up and has been rock solid for the past few days. This is much better than the old router, which was only managing to sync at about 3,600kbps down. I don't know if my old router had developed a fault recently or was just a bad design. However, the old router was very poor in comparison to the new one; it didn't have any polish and things never worked quite as they should.

Wednesday, July 12, 2006

Nintendo DS Lite (White)

I ended up trading in my PS2 for a white Nintendo DS Lite. Although there isn't the breadth of games there are some great ones:
  • MarioKart DS - Having played MarioKart 64 as a student I felt right at home and this has to be one of the best games for the DS. Friend Code: 257771 280248
  • New Super Mario Bros. - Never played the original so this is all a little new, but very enjoyable.
  • Advance Wars: Dual Strike - A great turn-based strategy game; although the characters are a little annoying, the gameplay is excellent and well balanced.
  • Tony Hawk's American Sk8land - Having enjoyed Tony Hawk's on the PS2 I picked this one up; it is probably the worst game I own, although still good. Friend Code: 068792 703458
  • Metroid Prime Hunters - A great 1st person shooter which makes very good use of the touchscreen. I'm not very good at it but it is still very enjoyable. Friend Code: 2534 7629 8211
I'm really impressed with the DS lite, it looks good, is nice and small, has some great games and the battery life is excellent.

Monday, June 26, 2006

Spring MVC Forms

Some notes about Spring MVC and forms. In simple situations, when the HTML form maps well onto the domain model, validation can be performed directly on the domain model and the domain model can be used as the command class. This reduces the amount of code that needs to be written and also means that validators can be used across multiple controllers. The problem is that this only works so long as the form maps well to the existing model. In more complicated situations you will need a separate command class to handle the processing of the form submission, and you will then want to validate that so that you can easily map validation errors back to the form fields. If you map the command class onto the domain model and then validate, discovering which errors relate to which parts of the form becomes tricky. The downside is that you may then end up duplicating validation code across multiple controllers. When mapping a text input box <input type="text" name="number"> you have several options as to how the mapping is performed.
  • Don't attempt any parsing of the field when mapping it to the command class and leave it as a String. This then means that your controller has to perform the conversion to an integer and generate the appropriate error message. This complicates your controller with code that could be performed better elsewhere.
  • Have it mapped to a primitive int. This means that as long as the conversion and validation succeed you will always get a number back from the command object, even if there wasn't one present in the form, and this is the problem: you can't tell when the user left the field empty.
  • Have it mapped to an Integer. This has the advantage that you can now tell if the user left the field blank, as you will get back null. You can only do this though if you change to use a custom property editor, as the default is to attempt to parse the empty string, which fails and raises an error. I do this in my controller:

    protected void initBinder(HttpServletRequest request, ServletRequestDataBinder binder) {
        // Map numbers to Integer but allow them to be empty.
        binder.registerCustomEditor(Integer.class, null, new CustomNumberEditor(Integer.class, true));
    }
  • If you use property editors to perform your conversion from strings to integers then you will probably also want to change the error messages that are generated when a problem occurs. Have a look at DefaultMessageCodesResolver for the message codes to map. As a starting point I have a property with the message "convert to a number.", which is displayed when the data binder can't convert a string to a number.
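By Spring's documented convention, DefaultMessageCodesResolver generates codes of the form code.objectName.field, code.field, code.fieldType and finally the bare code, most specific first. So a file along these lines can supply the message; the message text and the "command"/"number" names are illustrative, not from the original post:

```properties
# Message codes tried by DefaultMessageCodesResolver for a binding failure
# on a field called "number" of type Integer, most specific first.
# (Illustrative messages; the key structure follows Spring's convention.)
typeMismatch.command.number=Please enter a number.
typeMismatch.number=Please enter a number.
typeMismatch.java.lang.Integer=This value must be a whole number.
typeMismatch=Invalid value.
```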

Locking Issues in Bodington

Currently most of the locking that is performed in Bodington is done using synchronized blocks in the Java code. Examples of this can be found in org.bodington.server.resources.ResourceTree and org.bodington.server.resources.UploadedFileManager. This works reasonably well except when you want to scale your application across multiple JVMs, as the locking only applies to one JVM. The two ways around this are either to use explicit locks in the database or to code optimistic locking into your application. As deploying Bodington across multiple JVMs isn't important at the moment, we are continuing to develop using the existing pattern of synchronized blocks. This is important to me because in my quota implementation I need to lock sections of the tree while I calculate the current quota usage; while I am calculating the usage I don't want that section of the tree to change. As long as the locking is correct the quota calculations should stay in sync with the usage.
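The pattern looks roughly like this (a minimal illustrative sketch; the class and method names are made up, not Bodington's actual code):

```java
// Minimal illustrative sketch of JVM-local locking around a quota
// calculation, in the style of Bodington's synchronized blocks.
// All class and method names here are made up for illustration.
class QuotaTracker {
    private final Object treeLock = new Object();
    private long usedBytes = 0;

    // Uploads and usage calculations take the same lock, so the tracked
    // usage cannot change mid-calculation -- but only within one JVM.
    void recordUpload(long size) {
        synchronized (treeLock) {
            usedBytes += size;
        }
    }

    long calculateUsage() {
        synchronized (treeLock) {
            return usedBytes;
        }
    }
}
```

Because the lock lives inside the JVM, two instances of the application sharing one database would not see each other's locks, which is exactly the multi-JVM scaling limitation described above.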

Monday, June 19, 2006

Uploaded File Quotas

The implementation of uploaded file quotas is progressing reasonably well in WebLearn. They are now at a state where they seem to work reasonably well, and I think the locking should prevent errors occurring when two people are uploading files at the same time. Implementing quotas on uploaded files was reasonably easy as there is one API through which all uploaded files are created and deleted. Next I have to implement quotas for resources, which will be much harder as at the moment there isn't a standard API for creating resources, although ResourceTree is a starting point. The user importer is now in a reasonable state for next year's students and I am just waiting for there to be a larger dataset, then I will finish off the testing/debugging. Hopefully there shouldn't be too many bugs in it as most of the logic for group creation was taken from a previous importer which was working reasonably well.

The Observer | Business | Telecoms pray for time when the Skype finally falls in

The Observer | Business | Telecoms pray for time when the Skype finally falls in talks about how the telecoms companies are worried about Skype, and mentions Oxford University as an example of an institution that no longer bans Skype. The reason for this is that in order to use Skype at Oxford you have to configure it so that the peer-to-peer (P2P) part of the client can be filtered by the University firewall, preventing "leaching" of bandwidth. We have a page documenting Oxford's Skype setup. It seems that the news never has the full story.

Monday, June 12, 2006

Phone Line Repair and MaxDSL Speed

Last Wednesday evening our phone line went dead along with our ADSL service. The next day we called our provider and asked them to test the line; after a little nagging they got on to BT and had the line tested on Friday. They said there was a fault on the line and an engineer needed to be called, and that an engineer would be out to us on the Wednesday of the following week. I was not impressed. Anyway, it turns out that a BT engineer appeared around lunch time on Sunday and after a while fixed the phone line. After he'd left I reconnected the ADSL and found that it wasn't working; thankfully he had given us his mobile number, so we dropped him a call. He said that it would probably need another engineer to look at it the next day, as nobody who knows about the ADSL stuff is about on a Sunday. Monday morning I got up and discovered that the ADSL was working, great. However, looking on the router it is syncing at 3584 Kbps/640 Kbps. As I mentioned in a previous post I was getting a lot more (it settled down to around 5500Kbps/448Kbps), so although I now have working internet access again I have lost 2Mbps in the process. I suspect that there isn't anything I can do about this as the line works fine, just not as well as it used to.

Building Tomcat 5.5 from SVN problem

I was attempting to build Tomcat 5.5 from SVN using the instructions from the Tomcat pages; however I was getting an error of:

/home/buckett/src/tomcat/build.xml:49: The following error occurred while executing this line:
/home/buckett/src/tomcat/build/build.xml:791: The following error occurred while executing this line:
/home/buckett/src/tomcat/container/webapps/docs/build.xml:88: java.lang.ClassNotFoundException:

After a little head scratching I discovered that I had installed ant as an Ubuntu package and was missing the ant-optional package which provides this extra functionality. After installing this additional package everything went smoothly.

Bodington Database Objects and User Importer

I am currently working on some new user import code for Oxford's installation of Bodington (WebLearn) which takes a feed of about 35000 users and creates accounts and groups for them. Due to the large number of users involved and the small number of changes that are made each day it is important to try and keep the import running fast. At the moment the bottleneck seems to be the database, which eats up most of the CPU during the import; a quick look at a process listing shows PostgreSQL using 80% of the CPU and Java using 10%. Looking at the statement logging, it seemed that for each user it was always updating the user in the database even if nothing had changed. Update statements are expensive for a database compared to select statements, so cutting down the number of updates seemed an obvious way to improve the performance.

Every object that is stored in the database by Bodington extends org.bodington.database.PersistentObject, which provides the skeleton for database objects. One feature of PersistentObject is that it keeps track of whether an object has unsaved changes or not. This is done through calling the setUnsaved() method, which is done by most setters of the subclasses. E.g.:

public void setName(String name) { = name;
    setUnsaved();
}

Now this means that even if the name is set to the same value the object is flagged as being unsaved.

One option would have been to add checking to the user importer so that it only updated the name of a user if it was different to the current value:

if (!user.getName().equals(newName))
    user.setName(newName);

However there is no reason why other sections of Bodington shouldn't make use of this checking, and it is also a database issue, so I moved this code into the setter and ended up with:

public void setName(String name) {
    if (name == null)
        throw new IllegalArgumentException("Name cannot be null");
    if (name.equals(
        return; = name;
    setUnsaved();
}

The check that name isn't null is there because the database has a NOT NULL constraint, and it also prevents a NullPointerException from being thrown if setName(null) is called. I added this checking to org.bodington.server.realm.Users and org.bodington.server.realm.Aliases, and a quick test of updating 1000 users gave the results:
  • Always Saving: 1minute 3 seconds.
  • Saving Changes: 30 seconds.
which isn't a bad improvement in performance. One side note is that always saves the object; I could have changed it so that it only saved changed objects, but I'm not sure that all the setters of the subclasses call setUnsaved(), so I am doing my checking in the importer. The group mapping has come wholesale from some old user import code with a few changes; it just needs a little tweaking and it should be nearly ready for some proper testing.

Monday, June 05, 2006

Glassfish and Invalid URL Pattern

I was attempting to deploy WebLearn to the Glassfish application server and was getting the error "Invalid URL Pattern" for /site/* which is used in various places in our web.xml configuration file. It turns out Glassfish doesn't like extra whitespace in its url-patterns and we had things like:

<url-pattern>
  /site/*
</url-pattern>

when they need to be:

<url-pattern>/site/*</url-pattern>

I've got the web.xml fixed on WebLearn and will push it across to Bodington shortly.

Ubuntu Dapper Drake Upgrade

With Ubuntu Dapper Drake being released on the 1st of June I decided to upgrade my work desktop machine today. I just edited my sources.list file and then ran apt-get update; apt-get dist-upgrade. Most of the upgrade went fine, but here are a few things that other people may find helpful.

nVidia Monitor Detection

It seems that the new nVidia driver was autodetecting that my monitor could do a 76Hz vertical refresh rate at 1280x1024. While it can display this it didn't seem to be able to adjust the screen enough, and as a result I was unable to see the two left-most columns of pixels. I attempted to fix the vertical refresh rate in my configuration file (/etc/X11/xorg.conf) by setting the line VerticalRefresh 50-60, but every time I started X up it would continue to use 76Hz, which was outside the range. After quite a bit of Googling I discovered that the nvidia driver attempts to autodetect the monitor settings by default and the way to turn it off is to add an option to the file:

Section "Device"
  Identifier "NVIDIA Corporation NV18 [GeForce4 MX 440 AGP 8x]"
  Driver "nvidia"
  Option "UseEdidFreqs" "false"
EndSection

Pinned Rhythmbox

Somehow rhythmbox was pinned in the Synaptic Package Manager; this didn't seem to be affecting apt-get, which was upgrading it without any problems. The problem turned out to be that synaptic can have its own list of pinned applications, which is stored in /usr/lib/synaptic/preferences. As the only entry in this file was the pinning of rhythmbox I deleted the file and everything worked fine.

Packaged Java

Ubuntu now packages the Sun JDK, which can be installed just like any other package. I had a problem though in that all my alternatives (/etc/alternatives) were pointing to the wrong versions. I found a good post on galternatives, a tool that allows you to easily edit them.

User Import and Bodington Developers Meeting

The user import code is coming along; last week I added parsing of colleges, courses and departments and some tests against small bits of data. The status handling has also been sorted out, so rather than working with strings the statuses are converted to an enumeration. Most of last week was taken up by a Bodington developers meeting in Leeds. The meeting went over two days, with the first day for discussion of short term Bodington developments and the second day for the longer term Bodington vision. Out of the first day we decided to release Bodington 2.8 shortly; most of the code for 2.8 is already in CVS, so once a few extras have been added and some testing has been performed it should go out the door. Other things discussed included a catch-up on what developments have been happening at the various sites that run Bodington and the JISC projects that are using it. On the second day we discussed longer term direction, including Portals, SOA, SOAP and REST. At the end of the day I just want to build a better product.

Saturday, June 03, 2006

I owe PayPal money

I owe PayPal 56 pence as a result of a currency conversion charge from a long time ago, I think. Today I received an email nagging me to correct my balance and decided to attempt to do something about it, so I logged in to my account and attempted to add funds, but it refused to allow it, just returning me to the form highlighting the fields that I had filled in successfully. So I looked at the email that PayPal had sent me and noticed it ended with: "We appreciate your cooperation and invite you to reply to this email with any questions. Yours sincerely, PayPal. Please do not reply to this email. This mailbox is not monitored and you will not receive a response......" So I replied to the mail and we'll see how we get on. They do accept cheques to fix my balance, but I signed up with PayPal to get away from cheques....

Tuesday, May 30, 2006

Quotas in WebLearn and User Import

Last week I started working on supporting quotas in WebLearn. This involved adding support to BuildingSession to allow the getting and setting of the resource and file quotas, so now for any resource you can check and set the quota. I also built some pages to allow users to view the quotas and for sysadmins to set them. Next I need to add support to resource creation so that it always checks there is enough quota before allowing the creation, but this is where it gets a little more messy as there isn't one API for resource creation. I had to stop work on the quotas to start work on the new user import tool as we had the new user import files available. At the moment we get our user details from LDAP, but due to a redesign of the data contained in LDAP we need to go direct to the source. The files we get will be processed once a day, with any new users created and any new group memberships added. It would be nice if we supported the removal of group memberships this time, but I think that will have to wait until the end. So far I have got basic parsing of the users going and need to work on the parsing of the colleges, departments and courses. Then I can start doing user creation and then the mapping of attributes to group memberships.

Sunday, May 28, 2006

Trip to Wales

Had our first proper short holiday away with Wilf last weekend when we went for an extended weekend to Wales (Tenby). It was really just an escape from the South East for us, so we didn't have a big list of things to do; we just visited castles, beaches and galleries, took a few photos and bought a print of a painting to brighten up our house. Travelling was fine with Wilf, with the main travelling taking about 4 hours and him sleeping through most of it (waking for a feed half way through each time). As with everything else now, things take so long; previously we would have crammed 4 outings into a day and now with Wilf we struggle to get 2, but I don't mind one bit. My brother commented "You're becoming old" when I described our holiday to him, but you can't really go windsurfing, climbing and kite flying with a 10 week old baby, so I'll just have to wait a few years. Wilf didn't seem to mind being in a different place and I think we prevented him from throwing up on anyone else's furniture.

Tuesday, May 23, 2006

MyWebLearn Bookmarking

Screen capture of the MyWebLearn bookmarking functionality Some more progress has been made on the bookmarking functionality of MyWebLearn, which will allow users to bookmark any WebLearn page, adding it to their own space. This is very similar to but is tightly integrated into the WebLearn interface and its access controls. Further work needs doing on the internal links inside Bodington to improve the way the links work and are displayed, but the basics are there. The reason for developing this is that people often don't want to navigate the whole tree to find the resources that they regularly use, and often don't have a fixed PC so local bookmarks don't work. Hopefully this will help solve some of those problems.

Sunday, May 14, 2006

Everyone's Blogging

Well it seems even my other half Anna has started a blog although I think the subject matter is going to be slightly different to mine.

Saturday, May 13, 2006

Trends in VLE/LMS

Using Google Trends you can get an idea of how interested general Google users are in different VLE products. A graph of several popular VLEs shows that while Blackboard leads the race, the product that is gaining ground on all of them is Moodle, which is just about to overtake WebCT. The sharp dips for all the products are due to everyone stopping work around Christmas, and I believe the spike each year comes just as the new intake of students happens; maybe it is also at that time that people start looking at other products for their own institutions. Interestingly, although Sakai has a reasonable amount of news coverage (Moodle has no news), general users don't seem all that interested in it. Maybe this is due to the type of users/teachers that Moodle attracts compared to the people that are interested in Sakai: Sakai is aimed squarely at large institutional deployments, compared to the small departmental systems that Moodle supports well. Despite the announcement (12th October 2005) that Blackboard and WebCT will merge there doesn't seem to be much change in the trends; does this show that the majority of people searching are users of the systems rather than administrators? The VLE that I work on (Bodington) doesn't really feature at all, but maybe it will get better?

Friday, May 12, 2006

MyWebLearn Progress

I am currently working on a development here at Oxford called MyWebLearn which is looking to allow all users to have their own space in the VLE to create content. As WebLearn (Bodington) has very flexible access controls, users can also control access to the resources they create. Currently I have written the signup process and the basics are working. The next task is to tidy up all the rough edges.

Monday, May 08, 2006

Spring and Bodington

Currently Bodington doesn't use the typical MVC model and each web request maps directly to a controlling template. This makes handling things like errors in forms and redirects difficult. To try and improve matters I have attempted to integrate the Spring MVC framework into Bodington. This is a quick summary of the current state of play.
  • When a request comes in the normal request handling of loading the resource and permission checking takes place.
  • Extra URL handling has been added that means that any file beginning with bs_spring is handed off to the Spring servlet.
  • The Spring servlet has a Bodington specific mapper that looks for the resource in the request and then attempts to find a bean matching /facilityname/page.
  • Control then passes to the bean which is a normal Spring controller.
  • The controller then returns its model and view which are mapped to a JSP.
One problem is that the Spring code depends on a facility name; maybe it should depend on the resource directly? We also have a lot of duplication of the facility name all over the code: it's in the Spring configuration for the bean names and it's in the view names that are returned by the controllers. Maybe we could use the package names to infer the facility name? However this depends on having utility code that is called explicitly, as you can't have the view resolver do the work since it only knows about the view string and the locale.
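The URL-to-bean lookup in the steps above can be sketched as plain Java. This is a hypothetical reconstruction; the class and method names (and the exact path format) are my assumptions, not the real Bodington mapper:

```java
// Hypothetical sketch of deriving a Spring bean name of the form
// /facilityname/page from a request path; not the real Bodington code.
class BeanNameMapper {
    static String beanName(String facilityName, String requestPath) {
        int slash = requestPath.lastIndexOf('/');
        String page = requestPath.substring(slash + 1);
        // Only files beginning with bs_spring are handed to the
        // Spring servlet; anything else stays with the old handling.
        if (!page.startsWith("bs_spring"))
            return null;
        return "/" + facilityName + "/" + page;
    }
}
```

So a request ending in bs_spring_index on a resource owned by, say, a forum facility would look up the controller bean named /forum/bs_spring_index.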

Tuesday, May 02, 2006

Plusnet and 8MB ADSL (MaxDSL)

A couple of weeks ago I received an email from Plusnet saying that if I wanted to I could ask to be upgraded to 8MB ADSL as soon as possible. Looking forward to the extra upstream bandwidth I sent a note to Plusnet support confirming that I would like to be upgraded as soon as possible. I got a confirmation back saying it had been received and that I would be contacted when the order was placed. Well, when I got home to my computer today I found that the internet connection was down, so I logged into the ADSL router and was pleasantly surprised to see the ADSL line sync at DownStream: 6112 Kbps, UpStream: 448 Kbps. This was a nice increase from 2Mb and all done for free. Looking at the logs I can see that the router has been attempting several speeds, and according to Plusnet it could take up to 10 days to settle down. Speeds it has tried so far since about 17:00 today are:
  • DownStream: 6400 Kbps, UpStream: 448 Kbps
  • DownStream: 6560 Kbps, UpStream: 448 Kbps
  • DownStream: 6560 Kbps, UpStream: 448 Kbps
  • DownStream: 6496 Kbps, UpStream: 448 Kbps
  • DownStream: 6560 Kbps, UpStream: 448 Kbps
  • DownStream: 6784 Kbps, UpStream: 448 Kbps
  • DownStream: 6656 Kbps, UpStream: 448 Kbps
  • DownStream: 6560 Kbps, UpStream: 448 Kbps
  • DownStream: 6112 Kbps, UpStream: 448 Kbps
  • DownStream: 6208 Kbps, UpStream: 448 Kbps
  • DownStream: 6016 Kbps, UpStream: 448 Kbps
While I have been composing this post it has tried the last two on this list. As this adjustment is going on I am seeing a little more packet loss on the line, but I'm guessing this is normal. If you're looking at this post today you can see a graph of the disconnects I've been experiencing. The graph shows data from the current day so it will probably be useless shortly...
Although the line speed seems to have increased, using Plusnet's speed tester I am only getting the expected data rate for a 1Mb line. This may be due to the fact that BT don't adjust the ATM line restrictions as quickly, but it should happen in the next few days. The only thing that would have been useful is if Plusnet had sent the emails warning me that this would be happening, as they said they would.

Create Patch (Eclipse 3.2)

I have been using Eclipse 3.2M6 for a little while now and I'm finding it a nice evolution of the 3.1 product. One simple feature that I had been wishing for was the ability to create patches but select the individual files which the patch should contain. In 3.1 you could only create a patch for a folder (and all the files under it) or just one file. This was never a big problem as I could just drop back to the command prompt and do it manually, but it just means I can spend more of my life in Eclipse. This first appeared in Milestone 3 under the title of Improved Patch Support. I end up using patch support quite a bit as I work on both WebLearn and Bodington, which are similar codebases stored in separate CVS repositories, so using patches is the best way to move changes across. Although, looking in the new and noteworthy, the ability to record refactorings should help to allow changes to be pushed across easily.

Friday, April 28, 2006

Bodington DB Layer and Empty Subclasses

The Bodington VLE has its own DB layer that allows Java objects to be stored and loaded from the database without having to touch JDBC and SQL. If you have an object that can already be persisted in the database layer and want to subclass it without adding any instance fields (so you don't need an extra table), you can store both the original class and the subclass in the same table. The reason this can work is that for every object stored in the database the DB layer keeps track (through the objects table) of which Java type it should map to. The SQL to allow this subclassed object to be stored is something like:

INSERT INTO classes (type, super_type, db_name, table_name, java_class)
VALUES (109, 10, null, null, 'org.bodington.server.resources.NewSubclass');

The type is the new ID for this type, and the super_type is that of the persisted class that you are subclassing. Using this we could have a new resource class for every type of resource and then we can use polymorphism, rather than having switch statements and constants.
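The payoff of registering the subclass like this is that the DB layer can hand back the most specific Java type for each row, so calling code can rely on overriding instead of switching on type constants. A toy sketch of that dispatch follows; the class names and describe() method are invented for illustration, and only the type ids 10 and 109 come from the SQL above:

```java
import java.util.HashMap;
import java.util.Map;

// Toy sketch of type-id -> Java class dispatch, mirroring what the
// classes table gives the Bodington DB layer. Names are hypothetical.
abstract class Resource {
    abstract String describe();
}

class Folder extends Resource {
    String describe() { return "a folder"; }
}

// Empty subclass: no new fields, so it can share Folder's table, but
// overriding lets callers use polymorphism instead of a switch.
class NewSubclass extends Folder {
    @Override
    String describe() { return "a specialised folder"; }
}

class ResourceFactory {
    private static final Map<Integer, Class<? extends Resource>> TYPES =
        new HashMap<>();
    static {
        TYPES.put(10, Folder.class);        // existing persisted class
        TYPES.put(109, NewSubclass.class);  // new type from the INSERT above
    }

    // Instantiate the registered class for a stored type id.
    static Resource create(int typeId) {
        try {
            return TYPES.get(typeId).getDeclaredConstructor().newInstance();
        } catch (ReflectiveOperationException e) {
            throw new RuntimeException(e);
        }
    }
}
```

Code loading a resource then just calls describe() (or whatever the real behaviour is) and gets the subclass's version without ever inspecting the type id itself.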

Thursday, April 27, 2006

GNER and Wifi

GNER's Wifi service is designed to provide network access on trains. On Monday I was travelling back from a meeting on the 16:05 from Leeds to London Kings Cross (in the quiet coach) and I opened up my iBook; straight away it detected the network and I associated with it. However, no matter what I did it refused to give me a DHCP lease. So all I can report is: don't rely on the Wifi being available, as it may be broken.

Tuesday, April 11, 2006

Save Parliament! Stop the Legislative and Regulatory Reform Bill

I hope the Save Parliament campaign gains a little more media attention again. If you're in the UK please spend 5 minutes reading the site.

Monday, April 10, 2006


Thanks to Joe for pointing me to GeoURL, which basically says how you should put location metadata into your web pages. After a quick edit of my blog templates you'll now know that I live in High Wycombe. It would be nice if there were some support from Blogger for this, so that when you set your location in your profile it could expose the GeoURL metadata as well, but hey.

Wednesday, April 05, 2006

Compiling for Older JVMs

I've been doing some development on WebLearn from home today and had to perform a build to be deployed on our test server. Now normally I develop on the same software as we run on the production machines, but I happened to only have the Java 1.5 SDK installed on my home machine. Not really thinking, I set the target to 1.4 so that the outputted class files were compatible with the 1.4 JVM that we use on our testing and production machines. This is done through our ant build file with something like:

<javac srcdir="src" destdir="build" source="1.4" target="1.4"/>

The crucial bit, I thought, was the target attribute. However after shipping the build to the sysadmin guys they came back saying they were getting a stack trace when deploying the application:

java.lang.NoSuchMethodError: java.lang.StringBuffer.insert(ILjava/lang/CharSequence;)Ljava/lang/StringBuffer;
org.bodington.servlet.BuildingServlet.init(Unknown Source)

which shows that although the class files could be run by the 1.4 JVM, they used API calls that only exist in the 1.5 JVM. Now at first this is confusing, because the code compiles fine with a 1.4 compiler and classes, so what has changed? All the 1.4 API calls should exist in 1.5 (apart from the deprecated ones that were removed). The problem comes from some code similar to this:

StringBuffer str1 = new StringBuffer("123456");
StringBuffer str2 = new StringBuffer(" ");
str1.insert(2, str2);

Under the 1.4 StringBuffer API there isn't the method insert(int offset, StringBuffer str), so the compiler uses the method insert(int offset, Object str). In the 1.5 API again there isn't a perfect match, but there is a better one than the Object one: the method insert(int offset, CharSequence str), and as StringBuffer implements CharSequence this closer API call is used. We could have fixed the code by changing it to:

str1.insert(2, (Object)str2);

but this just fixes one call and there may be others that we didn't find.
The real solution to this is to compile against the 1.4 classes when you are using the 1.5 compiler. To do this you can use something like the following:

<javac srcdir="src" destdir="build" source="1.4" target="1.4" bootclasspath="/home/buckett/j2sdk1.4.2_11/jre/lib/rt.jar" extdirs=""/>

But of course for this you need to have downloaded the 1.4 SDK, so I might as well have used the 1.4 compiler. It is very likely that the 1.5 compiler contains better optimisations, so WebLearn should run faster, but in future I think I'll just use the 1.4 compiler as it isn't worth the hassle. Sun does have a document on this topic but labels it Cross Compiling. Why neither the Sun nor the Ant documentation mentions that when using the target option you probably also want to use the bootclasspath option is beyond me. It seems obvious looking back, but it wasn't at the time.
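To make the overload choice concrete, here is a small runnable version of the snippet. The runtime result is identical either way; the only difference is which insert method reference gets baked into the class file at compile time:

```java
// Demonstrates the StringBuffer.insert overload gotcha described
// above. On a 1.5+ compiler this call resolves to
// insert(int, CharSequence); on 1.4 it resolved to insert(int, Object).
class InsertDemo {
    static String demo() {
        StringBuffer str1 = new StringBuffer("123456");
        StringBuffer str2 = new StringBuffer(" ");
        // The compiler picks the most specific applicable overload.
        str1.insert(2, str2);
        return str1.toString();
    }

    public static void main(String[] args) {
        System.out.println(demo()); // "12 3456"
    }
}
```

Because the method reference is fixed at compile time, the identical source can produce class files with different dependencies depending on which API you compiled against, which is exactly why the 1.4 JVM threw NoSuchMethodError.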

Monday, April 03, 2006

Thunderbird Address Book and LDAP

Here at Oxford University we have an LDAP server that contains details of all the staff and students; although it isn't yet a production service it is available to Computing Services (OUCS) staff members. At first I tried configuring Thunderbird to access the LDAP server but found that searches took far too long, so I just forgot about it and carried on using Thunderbird without LDAP support. Then one of my colleagues, Paul Trafford, was trying to switch from Eudora to Thunderbird and was complaining that Thunderbird was taking ages to perform searches, so I had another look into it. The reason that the searches were taking ages is that it was performing queries on attributes that don't have good indexes, and so these required full scans taking a long time. Unfortunately finding and changing the queries used by Thunderbird isn't an easy task. After a little Googling the first thing to come up was a page explaining how LDAP attributes map onto Address Book properties, and although this was useful it didn't explain how to change the search that was being run. A few people have blogged this, and someone else's experience with Thunderbird and LDAP was helpful although it didn't go all the way to solving my problems. After a little more Googling I found another page that had details of hidden preferences for LDAP searches. Using this and the search that Paul had used in Eudora I was able to recreate a useful and fast LDAP search. Here are the lines for pref.js in your Thunderbird profile directory. You should only edit this file when Thunderbird is not running, as it gets saved when Thunderbird closes.

user_pref("ldap_2.autoComplete.directoryServer", "ldap_2.servers.Oxford");
user_pref("ldap_2.servers.Oxford.description", "Oxford");
user_pref("ldap_2.servers.Oxford.filename", "abook-1.mab");
// Oxford uses an objectClass of oucsOrganizationalPerson to show a person.
user_pref("ldap_2.servers.Oxford.uri", "ldap://,dc=ox,dc=ac,dc=uk??sub?(objectClass=oucsOrganizationalPerson)");
// Our LDAP stores some other useful data that should be mapped to attributes so we can search on it.
user_pref("ldap_2.servers.default.attrmap.Custom1", "universityBarcode");
user_pref("ldap_2.servers.default.attrmap.Custom2", "oucsUsername");
user_pref("ldap_2.servers.default.attrmap.Custom3", "uniqueIdentifier");
user_pref("ldap_2.servers.default.attrmap.Department", "oucsDivision");
// Although we have a displayName field it isn't indexed, so search and display the common name.
user_pref("ldap_2.servers.default.attrmap.DisplayName", "cn");
user_pref("ldap_2.servers.default.attrmap.PreferMailFormat", "preferredMail");
// Search on the Oxford attributes.
user_pref("mail.addr_book.quicksearchquery.format", "?(or(Custom1,=,@V)(Custom2,=,@V)(Custom3,=,@V)(DisplayName,c,@V))");

Adding this to your config means that you can search on the common name, barcode, uniqueID and username, and it is fast. It doesn't mean that you will be able to use the LDAP server when composing mails though, as it seems that some of these preferences aren't used by that code, so the search there still looks at the default attributes and as a result is very slow. This has only been tested with Thunderbird 1.5.

Sunday, April 02, 2006

New Eclipse Milestone (3.2M6)

A couple of days ago Eclipse 3.2M6 was released with some more nice features. It's getting to the point that I might switch to the new version (I'm currently using the stable build, 3.1.2) as my main version, as long as it is reasonably stable. I'll probably have a go next week and see how it goes.

Wednesday, March 29, 2006

Skipping WebAuth Confirmation Screen

At my employer, Oxford University, we use Stanford WebAuth for web based Single Sign On (SSO). One of my particular problems with the current setup at Oxford is that for each service you sign into you have to confirm that you wish to sign on to the site using your credentials. At Oxford this is known as the green tick page. Thanks to the wonderful Firefox plugin called Greasemonkey I can have some JavaScript run when I visit the green tick page to automatically follow the link. The result is a Greasemonkey script that means I don't have to click on the link. I've put the script up for anyone else who might find it useful: WebAuth Tick Jumper. To use it you should already have Greasemonkey installed.

SSH Tunnels and Multiple Copies of Firefox

I'm currently working from home and need to do some testing of a website that is only accessible from some restricted IPs inside my work offices. To get around this I use an ssh tunnel: ssh -D 1080 -N &
  • -D 1080 sets up a SOCKS server listening on localhost, port 1080
  • -N means no command gets run when the connection is established, so it can be run in the background safely (SSH 2)
Once I have connected and authenticated I want to start another copy of Firefox but keep my existing instance running for normal browsing. Firefox by default tries to limit you to only one running copy at any one time, so just starting Firefox up you end up with another window for your existing instance. There are some other blog posts about running multiple copies of Firefox, but they don't have a nice one liner. Mine is:

( export MOZ_NO_REMOTE=1; firefox -profilemanager ) &

The brackets mean that a subshell is started so that MOZ_NO_REMOTE only affects the single firefox instance. This then brings up the profile manager, as when running multiple copies of Firefox you have to use a different profile for each one. As I want to test a remote website through the ssh connection I create a new profile called SOCKS and start it. Then once the extra copy of Firefox is running I edit the connection settings (Edit -> Preferences) and specify Manual Proxy Settings with the SOCKS host set to localhost and the port set to 1080. After clicking OK all my browsing from this copy of Firefox goes through the ssh connection, so I can test the IP restricted website without having to run a VPN. The only thing left to do is to install a theme in the SOCKS profile so it is more obvious which profile I am using.

Wednesday, March 15, 2006

Wilf J Buckett

Anna gave birth to Wilf J Buckett on Friday the 10th of March 2006; he weighed 6lb 12oz and mother and baby are doing well. It doesn't feel all that different being a father, I just seem to be getting less sleep. The one change I have noticed is that now rather than watching TV we spend time watching Wilf. I just hope I can remember that not everyone wants to hear about all the things Wilf has been doing that day.

Wednesday, March 08, 2006

Clipboard and the Web

Through a post by Phil Wilson I found a really beautiful idea about how the clipboard should work on the web. It shows a rather slick way of allowing data to be transferred between web applications and also between desktop applications and the web. To understand it best just watch the screencasts. This sort of feature is one that needs consistent implementation across applications to ease the learning curve for users, but with features like this the future does look good for applications being delivered through the web.

Thursday, February 09, 2006

e-Literate: Stephen Downes Missed the Point

e-Literate: Stephen Downes Missed the Point has some very good discussion on VLE/LMS architecture/development/direction. It seems to have some parallels with the questions I have regarding the Tetra project. And what do I think? I'm very skeptical about this idea that we are close to a framework where we can take tools from different places and plug them together. If the tools are written against an existing product that defines the layers and APIs then everything works (like plugins for Firefox), but we are trying to get Firefox plugins to work in IE. For me the biggest problem in the VLE world is dealing with authentication and authorization, which is often ignored by other web applications as they don't have materials that they need to restrict access to. E.g. Amazon doesn't have some books that only some people can see, so its web APIs can be simple. In the VLE most of the current material is restricted and authorization checks have to be performed.

Tuesday, February 07, 2006

Sakai vs. Moodle

Sakai vs. Moodle is a rather one-sided look at two VLEs that are in the news at the moment. The problem with the comparison is that it is Apples and Oranges: Moodle was designed from the ground up as a small VLE developing all its tools internally, while Sakai was designed to integrate existing tools developed in universities into a campus-wide VLE. The task Sakai has in hand is an order of magnitude harder from both a technical and a management point of view. I don't believe that any small department would seriously look at Sakai as a product; it is too big and complex. But because of the simplicity of Moodle, several institutions are looking at deploying (or have deployed?) it on a large scale. Does that make Moodle better?

Friday, February 03, 2006

WebCT Blogs and Bodington

Tama’s eLearning Blog - WebCT and *ahem* “blogs” notes some of the problems with the blog tool in WebCT. What is quite interesting is that as social software and elearning tools cross over they begin to collide. In this case it is that traditionally VLEs have required a login (even if it is a guest login) before a user can participate, but something like a blog is normally completely open, so when you put a blog inside a VLE should you require people to log in? We here at Oxford are trying to change our VLE (Bodington/WebLearn) so that it no longer requires a login to access public pages; this makes it easier to integrate with other tools as they don't have to have special support for our VLE. Although there isn't a blog tool in Bodington, if there was it would be a choice for the owner of the blog as to whether to allow public access to it. Bodington likes to put control back with the owner of a resource, which for the most part is where it should be.

Thursday, February 02, 2006

JavaScript Variable Scoping

Although JavaScript has variable scoping it only has two scopes: global and function. Being a Java programmer this caught me out, as I had some JavaScript in a function:
  var i = 8;
  if ( i ) {
     var i = 2;
  }
which I thought should tell me i was 8 at the end, as the second i should only be available inside the if statement. Actually the i declared inside the if is the same variable as the one declared on the first line, so at the end i is 2.
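For contrast, here is a hypothetical Java version of the same shape (this example is mine, not from the original post). In Java each block gets its own scope, so the surprise above simply can't happen:

```java
// Contrast with the JavaScript above: Java blocks have their own scope.
class ScopeDemo {
    static int outer() {
        int i = 8;
        if (i != 0) {
            // Writing "int i = 2;" here would not even compile in Java
            // (duplicate local variable). In JavaScript, "var i = 2"
            // inside the if silently reuses the same function-scoped i.
            int j = 2; // j exists only inside this block
            if (j != 2)
                return -1; // never taken
        }
        return i; // still 8; the JavaScript version ends with i == 2
    }

    public static void main(String[] args) {
        System.out.println(outer());
    }
}
```

(Modern JavaScript's let/const behave like the Java version, but in 2006-era JavaScript var was all there was.)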

Wednesday, February 01, 2006

Firefox 1.5 and LiveHTTPHeaders (0.11)

One of the most useful plugins for Firefox is LiveHTTPHeaders, which allows you to see the headers of both the requests and responses for web pages. When I upgraded to Firefox 1.5 I found that I needed a new version of the plugin (0.11), and although most things worked, Right Click -> View Page Info -> Headers didn't. It was giving an error message about how I needed to copy a file out of my profile and into the Firefox installation directory. After trying this a few times and it still not working I went to the bug tracker and found a bug describing my problem. Installing the attached package makes the plugin work fine again. Yay.

Saturday, January 28, 2006

Implementation of Standards (HTML & more)

As has been picked up all over the net, Google has published some analysis of HTML used on the Web. Some of the bloggers in the e-learning arena have picked up on this, notably Stephen Downes and Scott Wilson. I generally believe that people will only adhere to standards when they are forced to do so, as normally it makes their life harder. As web browsers were lax in what they accepted, most people were lax in what they produced, and it may be that the web would be very different if web browsers had strictly implemented the standard. So as long as the software implementing the elearning standards is strict in what it accepts and what it produces this should be ok. The problem is that at the moment this isn't the case. Our experience in the Bodington community is that content packages produced by other elearning products need special code to deal with them, as they are typically outside the specification. However at the moment the number of products is reasonably small so this is possible. And as Bodington is one of the smaller communities at the moment it is in our interest to interoperate as well as possible with the other products, as people evaluating Bodington will blame Bodington if it fails to import a WebCT content package, rather than questioning whether WebCT can produce a correct one in the first place.

Saturday, January 21, 2006

Changing Unit on EKS Scales

We have some EKS Scales at home and they have no instructions and are stuck in imperial measurements. Here are my quick instructions for them.

Resetting Scales

Hold down the on button for 3 seconds and then release. After releasing, - - - - should appear on the display (sometimes this doesn't happen) and the scales will reset to zero.

Changing the Units

Wait for the scales to turn off. Press and hold down the on button; 8 8 8 8 will appear on the display. Continue to hold the on button until just the units are displayed (either lb oz or kg g), then release. Clicking the on button will now move between the different units. Holding the on button down until 8 8 8 8 appears will save the setting. Hopefully someone else will find this information useful.

Monday, January 16, 2006

Throwing nulls as Exceptions

Any ideas what will happen with this bit of Java and why?

    try {
        throw null;
    } catch (Exception e) {
        if (e == null)
            System.out.println("Exception is NULL");
        else
            System.out.println("Exception is: " + e.toString());
    }
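For anyone who wants to try it, here is a self-contained version of the snippet (the class and method names are my own). The Java Language Specification has the answer: throw null compiles, because null is assignable to Throwable, but evaluating it at run time throws a NullPointerException, so the catch block always sees a non-null exception.

```java
// Runnable version of the snippet above; ThrowNull/check are my own names.
public class ThrowNull {
    static String check() {
        try {
            // Compiles fine (null is assignable to Throwable), but at run
            // time the JVM throws a NullPointerException instead of null.
            throw null;
        } catch (Exception e) {
            return (e == null) ? "Exception is NULL"
                               : "Exception is: " + e.toString();
        }
    }

    public static void main(String[] args) {
        System.out.println(check()); // prints "Exception is: java.lang.NullPointerException"
    }
}
```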

Saturday, January 14, 2006

New WebLearn Build

The bug push is now over. The new build of WebLearn went live last Tuesday, with a bug fix upgrade on Thursday to fix some minor errors. There are still a few minor bugs in the build which will probably get fixed shortly and deployed in the next couple of weeks. Development now starts on the next set of WebLearn features, and it looks like I'm going to be working on automatic login to WebLearn. Basically this means that to access public material on the site you won't have to log in as a visitor; you will be taken straight to the material you requested. Currently, if you link to a page inside WebLearn, a user following that link is asked to log in before being allowed onto the page, one option being to log in anonymously (if they aren't already logged in). This change will make WebLearn work in a similar way to most other web applications, where you can browse the site without logging in and only log in when you need to. As a side effect, all public material in WebLearn will become indexable by search engines, so you will be able to find WebLearn content through search engines such as Google and Yahoo. To help managers of material, we are hoping to make it very clear which resources in WebLearn are publicly accessible, but the method for doing this has yet to be decided.

Monday, January 09, 2006

Brutal Tomcat Shutdown

We have a copy of Tomcat that is used here in Oxford for our automatic testing, and it regularly runs out of memory because classloader memory is not cleaned out after the frequent reloads. To prevent this from happening I attempted to restart Tomcat every night from a cron job:

0 3 * * * ($HOME/tomcat/bin/ stop -force > /dev/null; sleep 10; $HOME/tomcat/bin/ start > /dev/null)

However, it seemed that this was not working: the -force option was supposed to kill the Tomcat process after attempting to shut it down gracefully. After reading through the script I discovered that I needed to set the CATALINA_PID variable for -force to work. It would be nice if catalina would warn when you use the -force option without having a PID file.
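A sketch of the fixed crontab entry, assuming the standard catalina.sh startup script and a hypothetical PID file location: catalina.sh only knows which process to force-kill if CATALINA_PID points at a PID file, so it has to be exported before both the stop and the start (it is written on start and read by stop -force).

```shell
# CATALINA_PID must be in the environment of both commands: catalina.sh
# writes the PID there on start, and "stop -force" kills that PID.
# The PID file path is illustrative; pick any writable location.
0 3 * * * (export CATALINA_PID=$HOME/tomcat/tomcat.pid; $HOME/tomcat/bin/catalina.sh stop -force > /dev/null; sleep 10; $HOME/tomcat/bin/catalina.sh start > /dev/null)
```

If your version of catalina.sh sources bin/setenv.sh, the export can live there instead of being repeated in the crontab line.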