Sunday, 23 December 2007

Linux - Ubuntu RAID, Modules, my NAS and Fun

Ahh to be minimalist.

It all started over 3 months ago. I have taken to decluttering our house, a personal goal (I should have set a KPI or two). One task that led me down this path is digitising my music and movies.

Many years ago, Sara and I decided never to rent another DVD, but to buy them. We don't buy the latest releases unless we "really want it", but will buy DVDs on special, or ex-rentals. Needless to say, after 3 years of DVD buying we have a very impressive collection, tipping the 200 mark. Digitising this lot of DVDs has caused a ripple into the Linux world. As you can gather, it takes up lots of space.

To start with, I got rid of some old servers, but kept their hard drives, as my goal is (has been) to setup a home NAS (Network Attached Storage) as the central location for all CDs and DVDs.

So, sitting in front of me is the glorious resurrected "icemaster" (I have to find my chroming again). I have the usual Meccano (a tip for extending drive bays) and the following drives: 3 x 40G, 3 x 80G, 1 x 160G and 1 x 200G.

My layout is essentially a 40G RAID1 for / and the rest as RAID5 devices under LVM (split up by like-sized partitions).

To have so many drives, of course I needed to extend the IDE channels (not upgrading to SATA yet.. maybe in another few years). I purchased two new ITE 8212 IDE RAID cards off eBay. The Linux module these use is the it821x or, in the newer Ubuntu 7.10 series, the libata-based pata_it821x module. This is where the fun begins.

The card supports RAID0, RAID1, RAID1+0 and JBOD, as well as acting as just a standard IDE bus. I wanted the latter, as I am doing all the RAID in software using the mdadm tools (very nice).
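A minimal sketch of how the arrays might be created with mdadm. The device names, md numbers and partition choices below are my assumptions, not the exact commands from this build; the script only echoes the commands so they can be reviewed before being run as root.

```shell
# Sketch only: device names and md numbers below are assumptions.
# Build the command strings first, then echo them for review;
# run them as root only once the devices are confirmed.
RAID1_CMD="mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sdc1 /dev/sdd1"
RAID5_CMD="mdadm --create /dev/md1 --level=5 --raid-devices=3 /dev/sda1 /dev/sdb1 /dev/sde1"
echo "$RAID1_CMD"
echo "$RAID5_CMD"
# /dev/md1 would then become the LVM physical volume:
echo "pvcreate /dev/md1 && vgcreate vg0 /dev/md1"
```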

I have been bouncing around a few hurdles, and I will detail them each in turn, and how I solved them; I am sure someone will come up against a similar problem.

So to give a rundown on the hardware we are using.

It's a P4 1.8GHz with 256M RAM (yes, it's low spec, but it doesn't need to crunch galaxies, just serve files).
I have a wireless RT61-based network card (I have always used the rt2x00 serialmonkey module).
There is the standard USB and AGP display card, and an extra 10/100 NIC (8139too, if I recall).

Problem #1 - noticing the sd[*] instead of hd[*]
So when I first dived into installing Ubuntu 7.10, I noticed that my IDE hard drives were now appearing as /dev/sd[a-z] devices, instead of the older /dev/hd[a-z] devices. This is because of the new libata subsystem, supporting PATA and SATA (and SCSI?) drives. This had a little flow-on effect, but was largely okay (just a semantic change, and I like it actually).

Problem #2 - Recognising the IT8212 PCI RAID Controller.
I first figured I would install onto one of the 40G hard drives, non-RAID1, with a standard swap of 2G, and then set up my RAID and go from there. As I started before the IT8212 cards arrived from Hong Kong (yes, it was cheaper than walking to the hardware shop), I actually set up a 150G RAID array while booting from a 40G partition, and transferred all my movies and music (gigs of data) onto the RAID array. This was on the 2 onboard IDE buses, i.e., no CDROM (DVD) after the initial install.

.. then the cards arrived.

The IT8212, by default in Ubuntu, uses the pata_it821x module. In older documentation people have asked some odd questions about this device, but it all seemed to centre around it being used as a RAID controller, i.e., doing RAID in hardware (good), but I am using RAID5, which it doesn't support.


On first boot, only the primary drives showed up. I.e., if the IT8212 had an 80G and 40G on one channel and a 200G and 160G on the other, then Linux would detect the 80G and 200G only. (Odd, I thought, must be config.)

So into the BIOS of the card, where I changed the config from RAID (stripe) to IDE.
Came back out, rebooted Linux: no change.

Problem #3 - detected order of the /dev/sd[a-z] changed
Now, I'll pause for a moment from problem #2 and describe #3. When the drives were plugged in, my two boot drives (2 x 40G) on the first IDE bus were moved from sda and sdb to AFTER the detected drives on the PCI card. For example:

Before the IT8212 was plugged in, the drives were:
/dev/sda (40G)
/dev/sdb (40G)
/dev/sdc (DVD Drive)

After plugging in the card (with problem #2 still occurring) you see the following:

/dev/sda (80G)
/dev/sdb (200G)
/dev/sdc (40G)
/dev/sdd (40G)
/dev/sde (DVD Drive)

Now, the fix to this problem lies in possibly two areas. One is more robust; with the other I had NO success, but knew it would work if I persisted.

Solution to Problem #3
When Ubuntu installs, it uses UUIDs in the fstab and grub to refer to a hard drive, instead of using the known (and possibly changing) device location. Let me explain as I see it. A UUID is a Universally Unique Identifier and can be created and used as an ID for something; we use them in software/computers all the time. In this case, the hard drive has a UUID. I don't know exactly who creates them for the drive, but they do exist. Actually, it's the partitions of the disk that have the UUIDs we are interested in; the UUID of the hard drive itself (I assume there is one) is not that important.

So, to make it so it doesn't matter "where" your root drives come up, you use the UUID everywhere to refer to them. That way, when the device changes (i.e., moves or is unplugged on the bus) it will still be a known device. To locate the UUID for a partition, just use vol_id, e.g. vol_id /dev/sda1

Once you have the UUID, pop it in the fstab and grub.
/etc/fstab

UUID=897ef-ab67c6-7785-abcb2 / (etc)

and /boot/grub/menu.lst
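For grub, the same UUID goes on the kernel line as the root= parameter. A sketch from memory (the kernel version is the stock 7.10 one and the UUID is the same placeholder as above; neither is copied from my actual menu.lst):

```
title  Ubuntu 7.10
root   (hd0,0)
kernel /boot/vmlinuz-2.6.22-14-generic root=UUID=897ef-ab67c6-7785-abcb2 ro
initrd /boot/initrd.img-2.6.22-14-generic
```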

Now, once I had tested that I could (re)boot regardless of the "sd-ness" of the drive, I could go back to solving problem #2.

A little better (I feel) than UUID=... is to use LABELs; this seems to work more nicely. But a label could clash if you later add a drive from another system which has a matching label.

So, to make it easier for me, I just prefixed the label with the short version of the server name (the labels themselves are set with e2label on each partition). So:

icem-root
icem-swap

Getting initramfs to recognise my LABELs (and the UUIDs) was a tad hard.

Solution to Problem #2
Problem #2 was actually simple, and I learnt something too.

The PCI RAID controller had to be told, via the module, to load in pass-through mode, so it was recognised as just an IDE bus and not a RAID controller.

With the change to libata, the it821x module changed names (and code) to pata_it821x. I read in some places about the noraid=1 option for the module, so I tried a few things. Firstly:
If the module (pata_it821x or it821x) is compiled into the kernel, and not as a module, then you get this parameter to it via the kernel parameters (i.e., in /boot/grub/menu.lst)
so .. kernel image ... ro it821x.noraid=1 ...
note the '.' dot between the module name and the parameter.

I initially got stumped between the pata_ module and the other (sans pata_) one.

The pata_ one is the newer module, so for Ubuntu 7.10 the it821x doesn't exist (isn't compiled). But I also worked out that the "kernel" line is NOT for modules loaded on the fly.

So, after I realised that my 7.10 had no it821x, I tried kernel ... pata_it821x.noraid=1
This would have worked had the module been compiled into the kernel. Not to be.
So it was actually meant to be done the way I knew from the old days:

/etc/modprobe.d/options

options pata_it821x noraid=1

That makes the it821x present the drives on the bus as "just" drives, rather than Linux detecting them as RAID drives. YAY! They all showed up.

Monday, 17 December 2007

Nokia N95 8G, iSync, 3 (Three), music, internet and my Mac

So, recently I have switched my phone service over to 3 (Three) for the sole purpose of being a roaming IT worker. Since the beginning of October I have found myself without internet access in a lot of places, so I switched to being mobile using Three's internet plans.

To do this, of course, necessitated a phone upgrade (actually, my previous phone had a good 3-year run: the shocking iMate K-JAM (don't buy)). So back to Nokia I went.
The Nokia N95 8GB is the upgraded version of the N95, with the major difference being (for me) that all the N95 stuff (software etc.) didn't work.

So .. what I wanted was to have this N95 as my modem through Bluetooth, to be able to sync my calendar, pull photos off it (5MP camera, nice) and to have some stuff to listen to (such as my usual Java Posse).

To do this seemed simple enough, but there were a few hurdles.

First: Internet access, the most important.
Turns out I had to locate some different modem scripts.
The following was my source of success.

<<==-- Internet Access --==>>
Download the Modem Scripts from Ross Barkman (thanks!)
You want the ones labeled Scripts for Nokia 3.5G (HSDPA) phones (26kB): Nokia 3.5G Scripts.

Once you have put them in place ("/Library/Modem Scripts"),
you will have three new scripts to use: Nokia HSDPA CID[1-3]

Pair your phone through the Bluetooth preference panel.
Once paired, go into the Network Preference Page.

The following settings are what I have used.

[PPP]
Service Provider:
Account Name: 3netaccess
Password:
Telephone Number:
Alternate Number:


In PPP Options, the following are "ticked"
[v] Disconnect when user logs out
[v] Disconnect when switching user accounts
[v] Redial if busy
[v] Send PPP echo packets
[v] Use TCP header compression


[TCP/IP]
Configure IPv4: [Using PPP]

[Proxies]
** no proxies configured (i don't use any)

[Bluetooth Modem]
Nokia HSDPA CID1
I have unticked the following
[ ] Enable error correction and compression in modem
[ ] Wait for dial tone before dialling


Now that that is all done, see if you can connect :-) The "3netaccess" account name I found from the following link

<<==-- iSync--==>>
So, Internet access was probably the easiest (half an hour of research).
iSync proved to be a good week's battle of trying many iSync plugins, until I happened upon the only one that worked, which was this:

iSync Plugin for Nokia N95 8GB

Thank you Jussi Edlund!

With this iSync plugin installed, iSync works.

<<==-- Music and Pictures--==>>
So, Nokia, they have this "almost" right: the camera appears in iPhoto, but the phone doesn't appear as an iTunes "player" (probably can't do that).
But anyway .. from Europe, download the following ..

Nokia Multimedia Transfer

Hope that all helps someone!
Have a great day.

Tuesday, 18 September 2007

Firefox Files (Crap) on the Desktop on OSX

When you use Firefox on OSX, you will come to realise a VERY annoying feature/quirk of its file management.

If I download a file, I can choose where it goes (change preferences to set this feature) and that is all good. However, if I click on, say, a PDF to view it (and say "Open with Preview"), then the file is downloaded, wait .. onto my desktop, and left there. After some time, you will notice a heck of a load of files that you have viewed, Excel spreadsheets, PDFs, Word documents, MP3s etc., and they just make a mess of what is supposed to be a clean machine (clean lines, clean desktop :-).

So there are two solutions to this problem.

1. Changing the directory

Firefox saves these files based on the "setting" in Safari about where files get downloaded to.
So the quick fix here is to open up Safari (never used it, except for downloading Firefox :-), go to Preferences, then the General tab, and change the download directory to somewhere other than the Desktop.

Now, I actually changed it to /tmp/ (I have a symlink in my home directory for tmp; always have done this), so Firefox now dumps all these "saved" files to /tmp/, and /tmp gets purged every shutdown/reboot. Yay! No more mess.

You could, of course, make a directory in ~/ called Downloads, and then peruse and remove at your leisure. Might be handy to do that when you need to "review" a file you viewed whilst "online".
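If you go the ~/Downloads route, a small cron-able script can do the periodic cleanup for you. A minimal sketch (the directory and the 7-day window are my assumptions; adjust to taste):

```shell
#!/bin/sh
# Purge viewed/downloaded files older than 7 days from a dump directory.
# The default directory and the age threshold are assumptions.
purge_old_downloads() {
    # $1: directory to clean (defaults to ~/Downloads)
    find "${1:-$HOME/Downloads}" -type f -mtime +7 -delete
}
```

Run it by hand or from cron, e.g. purge_old_downloads ~/Downloads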

2. Tell Firefox to delete these files when it shuts down
In firefox about:config, you need to add a new config parameter that tells firefox to remove these files when it shuts down.
The key is app.helperApps.deleteTempFileOnExit and needs to be a boolean.
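Rather than clicking through about:config, the same preference can also be set in a user.js file in your Firefox profile directory (the profile path below is machine-specific; yours will differ):

```
// ~/Library/Application Support/Firefox/Profiles/<profile>/user.js
user_pref("app.helperApps.deleteTempFileOnExit", true);
```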

I haven't tried this way, but I don't like it much, because it means that if Firefox crashes, or the Mac crashes, Firefox won't delete the files for that session. Also, my Firefox is often open for weeks at a time, never closed, so the files will still grow in number.

Of course, you could combine the two.

Thursday, 2 August 2007

GWT Security Concerns - Object Injection

GWT opens a new way of building webapps with its different AJAX development method(ology), and it also opens a new area of "hackable" software.

I will term this new area "Object Injection".

GWT makes AJAX development easier, as the code is written in a type-safe language (Java) and "compiled" out to JavaScript, dealing with all the browser inconsistencies and nuances behind the scenes. You still have the ability to write JavaScript via JSNI (the JavaScript Native Interface), and "look and feel" is done through CSS.

The process of writing a GWT web application is to write your GUI web interface, or just parts of it, in Java, and, using the GWT compiler (a Java app), compile your Java to JavaScript.

What the GWT compiler does is trawl your code and convert what it finds to JavaScript. So if you use ObjectA as a POJO to make changes to a drop-down select box (for example), then ObjectA will be exposed (in some way) as JavaScript.

One of the niceties that the GWT Java-to-JavaScript model brings up is that your direct domain model objects can be worked on at the front interface without the need of DTOs, a code/time saving benefit.

So .. code might look something like this (simplified by not showing the AsyncCallback):

MyDomainObject obj = finder.findDOByID(myId);
myTextBox.setText(obj.getName());

Now, this code above would be compiled to JavaScript and, even with the GWT Java-to-JavaScript compiler set to "DETAILED" rather than "OBF(USCATED)", you get some pretty hefty JavaScript.

This is some real JavaScript from the GWT in Detailed mode, showing a "BusinessEntity" object (a POJO, if you like):

function pkg_BusinessEntity(){
}

_ = pkg_BusinessEntity.prototype = new java_lang_Object();
_.java_lang_Object_typeName = package_pkg_ + 'BusinessEntity';
_.java_lang_Object_typeId = 4;
_.pkg_BusinessEntity_abn = null;
_.pkg_BusinessEntity_commencementDate = null;
_.pkg_BusinessEntity_isCurrent = false;
_.pkg_BusinessEntity_isRegisteredForGST = false;
_.pkg_BusinessEntity_name = null;
_.pkg_BusinessEntity_postcode = null;
_.pkg_BusinessEntity_registeredState = null;

So, where is the security concern ?

Well in some cases, such as most of my world, the domain model is fairly significant, with trading / transaction objects and user accounts with passwords, and money and settings etc etc.

So if, in a GWT class, someone were simply to expose the DAO (Data Access Object) service such as

service.storeObject(myUserObject);

Then, essentially we have, assuming there are no (or insufficient) checks, a direct way to store objects into the domain (database) from the browser.

I will call this "Object Injection", in the same vein as "SQL Injection". An Object Injection hole could really be termed as:

a poorly coded RPC interface which does not check values, nor the who and why of what it is "doing".

Of course, finding an Object Injection hole in a GWT application is going to prove fairly complex (but surely not as hard as searching for a buffer over/under-run). The types of tasks that a would-be inquisitive person might have to do are as follows.

  1. De-Obfuscate the javascript to find the "meaningful" objects that are being worked with
  2. Dump out all the "Service" methods and try and tie these to where they are being used in the GUI (Web Application).
  3. Identify the types of objects which would provide benefit of "changing", such as bank accounts, limits, restrictions, user account saving (like storing settings etc.) and locate how the interface is working with the RPC method calls
  4. Write some Javascript to create a generated GWT object .. and send it to the RPC API.
The code that you write in Java to be compiled to JavaScript is now available for all to see.
Anything you do can be repeated, out of step, with simple JavaScript calls.

If parts of the domain model are exposed, this is a good place to start hacking.

The problem is no different with current non-GWT AJAX development. It is just that GWT does a lot of smoke and mirror compiling for you and if you unwittingly create holes in your Java RPC methods (without sufficient checks), then those holes are exposed for all to see (and play with).

So what techniques can we use to prevent making such holes? The following would be my quick and dirty set of rules.

1. Don't expose the domain model to the GWT compiler.

Why ? Well for a few reasons

(a) It will expose the internal workings of your application to the browser, and this is one place where a would-be nefarious person can start analysing your application to understand its structure, and thus its weaknesses. You might think, "Gee, that's a lot of work when this JavaScript is obfuscated", but think for a second: is there anywhere in your application that one can profit (monetarily or otherwise), giving them an incentive to start the work?

(b) To expose a domain model object means you probably want to "change" and store it, which means the DAO service (or similar) will have to take an object and save it to the database. Simply: find that DAO service and the right object that it requires, and I can then start poking in (Object Injection) other object values to see if I can change some stuff at the server.

2. Have Users / Accounts and Passwords - and Audit them
If you use usernames/passwords and sufficient checks in the right places, you will have the ability to log who is doing what. Another way to think of this is: minimise the amount of GWT you expose to people who are not authenticated against your application.

A way that I always achieve this these days is to use Acegi Security with Spring, and web filters over the whole application. This way I know which users are hitting which RPC methods, and can audit appropriately.

3. Expose ONLY that which needs to be - use DTOs, simple APIs, push "work" to the server, separate projects.

Use DTOs to push and pull your data back and forth. This way you can check the DTO values before storing, and you also know exactly what is being exposed at the front end.

Use simple APIs - Make your RPC methods simple (i.e., don't push too much back and forth). This is probably just the KISS principle, and will help make sure you don't inadvertently expose that which shouldn't be.

Move as much logic as you can to the server. This ties in with the previous point: a simple API logically means that the complex stuff is on the server. This way, code which does not need to be seen is not seen. Plus, the burden on the browser (our unknown environment) is kept to a minimum, which has an added bonus: we don't want the help desk of our application getting complaint calls saying the application slows down their browser, thus tarnishing the "company".

Separate projects - One method I am using in my GWT apps is to have a separate project each for the GWT compiled code, the server (RPC and JSPs etc.) and the business / domain model. I use Maven 2, and a parent project compiles and assembles all 3 pieces together. So, three projects, but ONLY one is the GWT compiled code, so I know that anything that goes into this client project is going to be in JavaScript. Anything OUTSIDE that project is hidden from the browser.

4. Secure all RPC interfaces
Use Aspects or Filters (I use Acegi Security) to secure your GWT RPC methods. You will be able to lock down the method calls to the user, and also ensure that only the objects which are allowed to be transported, are.

I hope this little entry helps people see the potential issues and raise awareness in their designs about this potential security "feature".

It is nothing new, of course, but the way in which GWT development is done can have an impact on the "bad applications" we write if we are not careful.

Wednesday, 25 July 2007

GWT Fun

Over the past 4 weeks I have been building an impressive list of what to do and what not to do when it comes to GWT. In my last post I mentioned an issue with JSP 2.0 and GWT.

Tip #1
Tonight, I came across a very similar issue with IE6 (might be an IE7 problem also).

The script tag with which you include the GWT module looks like:

<script .../>

But, if that is how you do it, IE won't like it. In fact, it will render a big blank page with no error and "Done." down the bottom.

You won't even see an ounce of the "page" even if GWT is not the primary content.

The quick fix is to change the single self-closing script tag to an explicit open and close pair (and put some space between them), i.e. <script ...> </script>

That took all of about 20 minutes to discover.

So far, my verdict on GWT is 80% for and 20% against. I am certain the 20% against, ie things that I can find I don't like, are because I have not yet worked out the "good" or proper way of doing things.

Tip #2

Some people have mentioned needing the RPC interfaces in the client package, and that they don't like it. Well, I didn't like it either; I have an rpc package.

I found a sneaky (smelly) way of having arbitrary "code" for GWT to find.

Create a "BaseGWT.gwt.xml" and point the entry-point class at an empty implementation of EntryPoint. But in this gwt.xml, set all the source folders that you want.

Then, in the "other" modules, inherit it. Now, the compiler (maven2) complains and errors, but because of a bug in the maven2 GWT plugin it keeps going (and doesn't fail), and the other modules happily inherit this dummy GWT module and all its trappings (the other source folders).

When I find a better way of doing this, I'll holler.

Wednesday, 20 June 2007

GWT and JSP 2.0 (Oh the Pain!)

So it's been four weeks now and I'm moving along at a steady pace with GWT.
I found by sheer accident that a new feature added in GWT 1.4 is that you don't need the gwt.js anymore, or the <meta..> tag in the HTML; you can just include the module.nocache.js file directly. (Good change, guys.)

I have been trying to host my GWT modules in an already existing JSP page, written in JSP 2.0 of the XML variety. I find these good because it means you are forced to create valid HTML markup. Anyway, combining the two, GWT and XML JSPs, meant I was completely unable to get the GWT modules to attach and appear on the page.

I was getting JavaScript errors: j.write is not a function. Now, the GWT compiler compiles by default using the OBFuscated setting, which means that the JavaScript is, well, obfuscated, a little hard to read. Not knowing yet what the other settings are (I think there is DETAILED and something else I saw), I dove (briefly) into the JS to see what j.write was all about.

It took a bit to track down the problem.

Essentially, weird things happen to your markup: it gets all compressed onto one line (obfuscated HTML), and open/close tag pairs with no content in between get changed to /> elements to make it more valid XML. Those that have worked with JSP XML documents (i.e., the <jsp:root> ... </jsp:root> variety) will know what I am talking about.

The problem is that a JSP marked up this way just fails straight away with GWT in the works. I recalled a similar problem from two years ago, where JavaScript was failing because <script> tags written in the JSP as an open <script> and closing </script> with no content in the middle (i.e., a <script src="somefile.js"></script>) get collapsed, in the browser's view of the HTML, into <script .. />, i.e., just one tag.



To get around this JS error, back then, the trick was to add a CDATA section and put a comment delimiter inside it. This way the JSP XML parser would preserve the open and close tags.

This looked like the following ..


<script src="somfile.js"><![CDATA[<!-- -->]]></script>


Which will be rendered correctly as

<script src="somfile.js"><!-- --></script>

And the problem (I can't remember exactly what it was, but some JavaScript loading issue) would go away.

Alas, this was not my problem with GWT, as I changed the <script ...></script> and the <iframe .../> to use this CDATA hack and it didn't resolve my "j.write is not a function" error.

So .. the quick simple test: cut and paste the output of the JSP that has the GWT <script></script> in it into an HTML file, and try that. So: refresh my broken page, view source, cut and paste the ugly (hard for others to read) HTML into a new .html file, load that... and it worked.

So I have two files: tester.jsp, which has the <jsp:root> top and bottom, and an HTML file which is the "output" of this JSP as viewed from the browser. So as far as the browser is concerned, hitting /tester.jsp and /tester.html renders exactly the same-looking HTML.

Which led me to only one conclusion: it was not the "format" of the HTML presented that the browser was barfing on (and then GWT not loading), but rather the headers that the server was sending.

Loading up Firebug (just clicking my little Firebug icon in Firefox.. thank you, thank you, Firebug authors), I inspected the headers of tester.jsp and tester.html, and lo and behold:

tester.jsp : Content-Type: text/xml;charset=UTF-8
tester.html: Content-type: text/html
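If Firebug isn't handy, the same header comparison can be done from a terminal with curl (the host and port below are placeholders for wherever the JSP happens to be served):

```
$ curl -I http://localhost:8080/tester.jsp
$ curl -I http://localhost:8080/tester.html
```

The -I flag fetches only the response headers, so you can diff the Content-Type of the two responses directly.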

This then led to the quick fix: change the content type for my JSP.

<jsp:root version="2.0" xmlns:jsp="http://java.sun.com/JSP/Page">
<jsp:directive.page contentType="text/html; charset=UTF-8"/>

And then all is good.

So, what is happening here ? It would appear that the browser (Firefox 2.0 on Mac OSX in this instance) is not creating a DOM that the GWT javascript can interact with when the content type is XML as opposed to when the content type is HTML.

This is evidenced by the line in question in the GWT module.nocache.js which is

j=document;

And a few lines in,

j.write("<script>...</script>")

From this attempt to write out some dynamic JavaScript to the page, the specific error message obtained is as follows:

j.write is not a function

Which just shows that the DOM is not how it is supposed to look. I am sure someone from the Firefox community could explain that one a little more (how it structures the DOM based on content type).

So this one was not because of GWT pain, but JSP pain :-)

All good so far. I have a few more improvements to the m2 GWT plugin which I will post in the next few days.

Saturday, 9 June 2007

GWT and Maven 2, OH the pain!

GWT - Maven2 and Eclipse .. Ouch!

Using GWT for all of 3 weeks now, I am finding it nice. It's logical, well thought out and simple.

But, I think there is a little way to go when it comes to using some de jure standards. I am a big Maven 2 fan, because in an Enterprise (capital E for 'ooh' fancy) environment, standards go a LONG way, and Maven 2 dictates a nice and well-understood way of doing things.

So, what's me beef with GWT? Simply this: gosh, it has been hard to get it working with Maven 2.

Now, part of this is because I am still trying to get my head around how it works. I have been using the applicationCreator script to set up a quick hello-world sample, then moving that into the src/main/java and src/main/resources folders.
But what does that do to my src/main/webapp? Where do the images get pulled from? Why doesn't index.jsp load in the GWTShell (when it's Tomcat underneath)? So many questions.

So, there is a maven 2 plugin. I am currently using the 1.5.2 version of it.

Google Code Homepage : http://code.google.com/p/gwt-maven/
Maven 2 Plugin Doc : http://gwt-maven.googlecode.com/...
A list of some problems : http://code.google.com/p/gwt-maven/wiki/FAQ
Maven Repo : http://gwt-maven.googlecode.com/...
SVN Repository : http://gwt-maven.googlecode.com/.../

and as it turns out, there are a few bugs and features which need some work. (I wish I had seen that FAQ page before diving in.)

The first issue I came across was the "plural" vs "singular" naming convention. In Maven 2, a configuration item for a plugin that requires more than one value is pluralised, i.e.

<options>
<option>...</option>
<option>...</option>

</options>

What's the problem ?

Well, the com.totsp.gwt plugin needs a configuration line like

<compiletarget>org.sobbo.ui.Home</compiletarget>

However, this config value in the plugin source needs to be an [] array of targets (logical, yes), but m2 plugins dictate that the property name is then plural (compileTargets), otherwise you get this kind of error message.

[INFO] Failed to configure plugin parameters for: com.totsp.gwt:maven-googlewebtoolkit2-plugin:1.5.3-SNAPSHOT

(found static expression: 'org.sobbo.ui.client.Home' which may act as a default value).

Cause: Cannot assign configuration entry 'compileTarget' to 'class [Ljava.lang.String;' from 'org.sobbo.ui.client.Home', which is of type class java.lang.String
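For reference, once the plugin follows the plural convention, the pom configuration would presumably look something like the following (the groupId, artifactId, version and target class are taken from the error above; treat the exact element names as an assumption until the fix lands):

```
<plugin>
  <groupId>com.totsp.gwt</groupId>
  <artifactId>maven-googlewebtoolkit2-plugin</artifactId>
  <version>1.5.3-SNAPSHOT</version>
  <configuration>
    <compileTargets>
      <compileTarget>org.sobbo.ui.client.Home</compileTarget>
    </compileTargets>
  </configuration>
</plugin>
```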

so .. the AbstractGWTMojo has to change to have a plural compileTargets (there must be a way, else how would they be using it right now?)

It turns out there is an issue logged for this .. http://code.google.com/p/gwt-maven/issues/detail?id=37

Anyways, what's the next pain? Well, on checking out the src from the Subversion repository, you get this nice little gem.

$ svn co http://gwt-maven.googlecode.com/svn/trunk/maven-googlewebtoolkit2-plugin/

svn: REPORT request failed on '/svn/!svn/vcc/default'
svn: REPORT of '/svn/!svn/vcc/default': 400 Bad Request (http://gwt-maven.googlecode.com)

BANG, and there is my pain. This looks like a straight-up Google SVN problem. Funnily enough, I have an SVN download script that works to "download" src from SVN repositories using wget.

See the end of this post for the shell script.

So .. my third issue? Well, there was some unusual "Java Execution Mojo Bootstrapping" stuff going on, and it's broken. Essentially, when trying to launch java to compile (testing, shell etc.), it (the GWT Maven plugin) couldn't find it (java) and died. I found the problem (after downloading the src), corrected it, and logged this issue with a patch @ GWT M2 Plugin Issue 43

So .. is it ready? This Maven 2 plugin really needs to be working; Maven 2 is big, and GWT / M2 / Spring / Eclipse is pretty important (to me at least).

I'll keep battling with it because this way I can probably make it a little better. I hope that it gets better.

So, that SVN downloader script: it's not completely "safe", because the revision could be changed by a third party part way through your download, but hey, it works.
#!/bin/sh
# Author: Ramon Buckland ramon#at#thebuckland.com
# Quick hack script to pull the latest SVN revision from a repo when you have no SVN tools
# or SVN through ISA proxy servers are just not working ..
#


usage() {
    echo "Usage: $0 <svn-url> <product-name>"
    echo "svn-url: a URL to the trunk or a tag you are interested in"
    echo " eg: http://svn.apache.org/repos/asf/incubator/servicemix/trunk/"
    echo "product-name: a short name of the product so that we can create a directory for you"
    echo " eg: servicemix"
    echo "(also, set the http_proxy=http://hostrunning-ntlmaps:port)"
    exit 1
}

if [ "X$1" = 'X' ]; then
    usage
fi

if [ "X$2" = 'X' ]; then
    usage
fi

if [ "X$http_proxy" = 'X' ]; then
    echo "WARNING: http_proxy is not set. Do you need it ?"
fi

SVN_REPO=$1
PRODUCT_NAME=$2

mkdir ${PRODUCT_NAME}-svn-pid-$$
cd ${PRODUCT_NAME}-svn-pid-$$

# -nv = non-verbose
# -nH = no host directory created
# -np = don't ascend to parent dirs
# --cut-dirs=3
# -erobots=off .. don't look at robots.txt to see what you are and aren't allowed to do
# -m = mirror

wget -nv -nH -np --cut-dirs=3 -erobots=off -m ${SVN_REPO}

NEWDIR=`grep Revision index.html | grep h2 | cut -d':' -f1 | cut -d'>' -f2 | tr ' ' _`
cd ..
mv ${PRODUCT_NAME}-svn-pid-$$ ${PRODUCT_NAME}-svn-${NEWDIR}
find . -type f -name index.html | xargs rm


Hope that helps someone. I use it a bit here and there.

Tuesday, 5 June 2007

Switching Domains

In my new role I have been pouring my brain into a new domain (industry) for the past 2 weeks.

For the years I have been writing code, moving from one language to another has never been a big issue: Pascal to Modula-2, C to C++, Perl in the middle, Java from the dawn of ages, dabbling in C#, poring over XML and becoming a quasi-guru in XSLT. So when a new language comes about, it's a matter of learning a new language construct or few (an expansion of the language domain) and then giving it a crack. Being bi-lingual is not too difficult. It is probably because not a lot of things have been written under the sun recently that bend the brain.

For the most part, the "spoken language description" or nomenclature of the languages is the same, ie: objects, methods, procedures, closures, functions, instance variables, parameters etc. So only when someone bakes up a new "thing" and gives it a name do you have a learning exercise. (This excludes the obvious, ever-ongoing task of learning a new feature or third-party system.)

For those who have been coding for many years in many languages, you will know what I mean. It's fun looking at new languages, and it's not long before you are proficient. For the most part, coding in another language is only slowed by the environment in which it has to operate: the OS, execution platform (JVM, native), the IDEs or no IDE, dynamic vs compiled etc.

So .. how about switching the domain you are coding for? This I have not done often. I have just sidestepped from funds management and trading over to insurance. I knew there would be challenges, and yes, there are. Insurance is a whole industry unto itself; there certainly are links, and in the FM and trading game you come across those links a bit.

It's not so much the concepts of insurance, they are easy, or easy enough; it's the nomenclature that is used, when it applies, and who owns what. My viewpoint is that I MUST be proficient in understanding the business before I can really code. Sometimes you won't have that luxury, perhaps due to time pressure. But if you work for an ISV or interact with a department, not being able to speak their language gives you no credibility and also makes you untrustworthy. The business domain experts will instantly pick up on this.

I am studying insurance like it's a new degree for me.

.. it's real Domain Driven Design back again (for me). It's fun. It's fun because it's forcing me to think outside the square and, as an outsider (at the moment) see how things work and then twist my brain around it to understand how it hangs together.

A policy, quotations, indications, endorsements, declines .. the list goes on. And to write a system, I first have to understand the domain.

It reminds me of 7 years ago, when I was first cutting systems for fund management. What a battle in the brain that was. I am finding this new domain a bit easier now than then, perhaps because I now know the right types of questions to ask.

I think working with business people who are amenable to your plight (lack of knowledge) helps, of course :-)

So .. I encourage anyone who faces a new domain, often or not: study it, learn it. You will be noted as a good developer simply because it shows you actually do care. And, BTW, this goes not only for coding, but for systems support, DBAs and everyone.

We are like Vets, we have to know a lot in order to do our job.

Saturday, 26 May 2007

Eclipse Mylar ... Wow!

So, Subclipse is installed. I noticed (well, was forced to notice) that Subclipse required Mylar before it would install. Okay .. dependencies, icky, but quickly sorted out by installing Mylar.

But, as always, I MUST know what something is (which is a painful character trait, as it does get me distracted often), and Mylar quickly moved to the top of my list. (Talk about being distracted: at 9:50am today (Saturday) I was starting the day by prototyping an idea I had last night at 2am using GWT, and now I am blogging, not yet into my prototype.)

So, where was I .. Mylar. Mylar, after it installed in Eclipse, actually prompted me to look at the Getting Started webinar, which was good, because I was just about to minimise Eclipse and go looking myself.

Mylar loaded a browser in Eclipse, without hassle, and on the Mac (love it when things work) .. so I went along for the journey. It's a 40 minute presentation by Mik Kersten. All I can say is wow. It truly is something to be seen; even small shops should be using this stuff.

What is it? Well, you really need to see the webinar, but in summary: it helps you do your work by showing you the things you are working on (automatically) and not involving you in the stuff you don't need to see right now. Eclipse has excellent integration hooks into the whole environment for controlling stuff (views and issues and notes etc.), and Mylar has just gone one step further and controls all of this for you, based on the issue or task you are working on (which you choose). So what you see is only what you NEED to see. It has quick-flick buttons to show everything if you know you need to see something, and once you have looked at that item, it's now in your "filtered" view (if that's not too ugly a term for it).

It is far more than just filtering too, because integration with issue tracking software is built in for Bugzilla, Jira and Trac, which means team collaboration and project deadlines are at your fingertips to make you more productive.

Really, watch the webinar (link is at the bottom) and you will see what I mean.

Here is (Ramon's dodgy list) the QUICKEST way to get the most out of Mylar.
  1. Install Trac (bug tracking software) somewhere in your org (or use Jira if you can; it's GREAT!)
  2. Install Eclipse 3.2.2
  3. Setup Subversion
  4. Install Mylar then Subclipse (the Subversion plugin for Eclipse)
  5. .. and watch this webinar
In my previous employment, issue tracking was managed by Tracker (Merant), and later its newer version, TeamTrack. The benefit for Java developers of using Eclipse with an integrated source repository and issue tracking software is just brilliant, and it's complete dark ages if you aren't. (Now that I have said that, I will have to practice what I preach, hey.)

Seriously, watch that webinar and you will see why I have just become so excited. If you have ever worked on a big project where you have to collaborate, then Mylar MUST be a MUST. If you aren't on a big project, then use it anyway, because
  1. You are probably WANTING to make it big and you might as well start with a good foundation; and
  2. Good coding practice only leads to good quality code, which translates to more time with your family, and THAT is important.
NetBeans users: I see that there is something called ALM (Application Lifecycle Management). Sorry NB people, I MUST not get distracted; I might look at it another day. Someone can surely comment on the differences between Mylar and ALM. Not being a big NetBeans user, it's lower on my priority list, sorry.

Back to Mylar: what's the deal with the name ..

Mylar is a term oft heard around the spacey dudes that watch the sun.

and Mik explains in his webinar

Mylar for the Sun - Aluminised film used to avoid blindness when staring at an eclipse.
Mylar for Eclipse - Task focused UI to avoid information blindness when staring at Eclipse.

So in closing

To the Mylar Team

Brilliant tool guys and what an idea. A product feature like this deserves an award and it shows that you have thought long and hard about what you wanted it to be.

And .. oh, what a fantastic name for your project .. I absolutely loved the definition.

Good Job!

I wish there were something like Mylar for my web browsing, to remove the crud I wade through daily, so that I would have seen Mylar earlier (like 7 months ago) :-) Excellent job, Mylar team .. just brilliant.


References
The Mylar Webinar - https://admin.adobe.acrobat.com/_a300965365/p46246963
A Podcast from EclipseCon 2007 - http://www.eclipse.org/resources/resource.php?id=366
Netbeans ALM - http://www.intland.com/products/cb-download.html
Trac Issue tracking - http://trac.edgewall.org/
Jira Issue Tracking - http://www.atlassian.com/software/jira/
Subversion - http://subversion.tigris.org/
Subclipse - http://subclipse.tigris.org/
The Mylar Homepage - http://www.eclipse.org/mylar/index.php

Can I just say again .. Mik and the team, this deserves the biggest pat on the back of your careers. Fantastic!!! A Killer Plugin for an Already Brilliant IDE. You have made me love Eclipse even more... good work.

Friday, 25 May 2007

My old avatar, and my iPod


So, following on from the previous post, I have "reposted" all the OLD blog entries from 03 - 04 (not lots, but you know, something for the kiddies).

Anyway, I also found my old avatar. Not so much an avatar as it is me back 3 years ago. :-)

It took a while to repost. The iPod is still trying to find its bad sectors ... the long story is that I spent $160 on a 60G iPod hard drive from the US .. and it doesn't work .. ARGH .. THAT really frustrates me. I might just go out and buy an iPod and be done with the tinkering, I think.

The WayBackMachine (internet archive)

Having recently moved my blog over to Blogspot, I have been wanting to "retrieve" all my old blog posts. Some are just historical, and one is just funny.

Ironically for a good IT professional, I was unable to find my blog backups, so instead a waking (6:55am) thought prompted me to look on the Internet Archive (aka The Wayback Machine).

Of course, the posts were there (this story has a good ending), so I am now proceeding to copy my old posts over to "here". For those bored, you can take a look at the RSS feeds and see which "old" posts have "modified" dates of 2007. (I told you it's for the bored.)

One post in particular that I did want to copy in verbatim is the following ..

Wed, 9th Apr 2003

web load testing tools - @ 09:09:11
Here is a list of Web Testing tools that I have found
Comments on the ones I have used:

Siege
Open Source

OpenSTA
Open Source

MS Web Application Stress Tool (MS WAST)
Free for download, usage pursuant to EULA

Load (by PushToTest) - No longer Maintained

TestMaker (by PushToTest)



So I have copied it in because, technically (or chronologically), it is my first public blog posting. A bit boring, but just highlighting its contents: testing has well moved on from here now.

Two items come to mind when it comes to GUI unit and load testing (actually 3).
First is that with GWT (Google Web Toolkit), the unit testing framework is built in. (A plus, of course.)
Second is the Mozilla initiative to provide a Java testing framework for web pages. I have not yet had the pleasure (possibly) of using it, but suffice to say it looks great.

Of course, the third is really just a note that, since this 2003 posting, I have hammered Grinder, weighed JMeter even more, and .. the browsers are finally starting to behave like you want them to .. (THANK you Firefox!!)

(As I type this I hear the horrid sound of

mke2fs -c -v /dev/sda

clicking away on my iPod, because the HDD has completely died. I'm scanning for the bad blocks on Linux so that I can recreate the partition map, skipping the majority (with buffers) of the bad sectors. Let's see what I come up with .. there is a MUCH bigger story to this, one which I hope has a happy ending.)
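That dance, scanning for bad blocks and then building a filesystem that avoids them, can be sketched safely against a throwaway image file rather than a real disk. The file names and sizes here are made up for illustration; on the real drive you would point badblocks and mke2fs at the device itself.

```shell
# e2fsprogs tools often live in /sbin; make sure they are on the PATH.
export PATH="$PATH:/sbin:/usr/sbin"

# Practise on a throwaway 8M image instead of a real /dev/sda (safe to run).
dd if=/dev/zero of=disk.img bs=1M count=8 2>/dev/null

# Read-only scan, recording any bad blocks found (none, on a pristine image;
# on a dying drive, this list is the interesting part).
badblocks -v -o badlist.txt disk.img 2>/dev/null

# Build an ext2 filesystem that skips everything in the list; the same effect
# as mke2fs -c, but reusing a previously gathered list.
mke2fs -F -q -l badlist.txt disk.img
```

The two-step form is handy because a full badblocks pass on a sick drive can take hours, and keeping the list lets you re-run mke2fs without rescanning.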



Wednesday, 23 May 2007

MacOSX and .. and ..

So it's been 2 days now that I have been using the Mac as my primary machine.
I have used Macs for a while in the past, but primarily as test machines, or in my earlier days as a network admin, supporting users on them.

Now having officially switched, I can say I like it, certainly as much as I thought I would.

So, how odd is this: I have switched not from Windows but from Linux. Now .. it's not that I don't like Linux. In fact I love it; it's that, as a development platform, the Mac PowerBook is just right.

I installed Eclipse on Monday and used it for about 5 hours today. I had some odd problems trying to install Subclipse, and also the Spring IDE plugin.

I had downloaded 3.2.2, and Eclipse was saying that a particular version of some package was not installed. The odd thing was that it was the stable version of the plugin(s).

Because people all around me have been talking NetBeans, I figured I would give it a good try.
I have never liked the concept that Ant is at its heart, especially since I have been a Maven advocate for the past 2 years. Anyway, I'll see how it goes.

So far, I have followed the setup for a Maven project in NetBeans, and have seen how they hook goal execution into the Ant subsystem. Not a bad way of doing it.

I created a fairly comprehensive domain model today (30+ classes) and I now want to see it as a model, but I couldn't install the Eclipse UML modelling plugin (see above) .. which is why I am trying NetBeans. Perhaps I will have better luck.

In theory the install of the UML modelling plugin went well, but poor old NetBeans didn't register that the plugin was installed.

All is not fair in war. Some things are not as they seem (or as they are described) .. but never get disheartened; there will be a way around or through this.

Thursday, 17 May 2007

Desire vs Need - Part 2

In building a proof of concept, a sound architecture is abstracted from its infrastructure. With this view, project 014 (my codename) will be put onto both an ODBMS and an RDBMS.

Now, being the good timesheeting developer that I am, I will break up my development time into the framework, then the effort for the data access layer, and then the time for implementing the datastore.

For this datastore, I will choose one for the prototype (RDBMS or ODBMS) and then, as (unofficial) research, do the other.

I will literally ask my magic 8-ball which one to start with.

I do expect that the ODBMS will come out trumps, but lets see.

So .. the need vs desire is simply this: I know RDBMS systems extremely well, which is the "need" for time to market (a prototype is a time thing), and the "desire" (see previous posts) is to use an ODBMS instead.

I actually expect that developing with the RDBMS will take longer than with the ODBMS, as the ODBMS quite simply looks simpler.

So what should I choose?
Well .. you have to compare like fruits, so:
db4objects embedded has to be compared with Hibernate+Derby;
db4objects client/server has to be compared with Hibernate+MySQL.

I am NOT comparing speed, just effort. Speed I will do afterwards.


One of the many reasons some people jump to Ruby on Rails and other "X" fancy platforms is all this "crap" we have to put up with. (Ruby and ActiveRecord apparently do away with the Hibernate cruft ..) It just still feels wrong. So, if we like Java as much as we say we do, we MUST do something about it. Sun will, but it will probably take them about 2 years longer than the 6 months it would take "us" to make change happen. After all, we are the users who "sell" this stuff.

Monday, 14 May 2007

Desire vs Need .. ODBMS vs RDBMS

I have been on the train of thought of not using SQL for some time now, and have completely come to the conclusion that it is entirely possible. Possible in the sense that a decent, well-founded ODBMS will suit my need.

I mentioned a few things that needed to be investigated in order for the switch to be solid.

1. Administration - Presumably the chosen ODBMS (server, platform) will provide a level of administration, such as object browsing, logs, db maintenance etc. Whatever the underlying "architecture" is, there have to be tools to see it, so it is not a black hole. One of the niceties of an RDBMS is that it is not a black hole, inasmuch as SQL from any compliant tool gets you to your data; a plus.

2. Monitoring - How can we monitor this beast? I put this here because in my line of work there are clear distinctions between those who build and those who maintain, and the maintainers want a system that, well, can be maintained. So it's important.

So, where are this desire and need? The desire: I want to switch. The need: the project I am embarking on does not have funds to splash around, and this puts it squarely into the hands of OSS for the chosen ODBMS. If you look around, that leaves only a few choices. After you weed out the crappy and incompatible ones, the clear winner is db4objects. But ahh, here is my catch: db4o is GPL based, and you HAVE to code with it, ie embed it. Which means that I can't use it unless I buy their commercial license.

What is lacking is an ODBMS of the quality of db4objects under a non-viral license (Apache 2, Mozilla, LGPL), such that it suits the needs of this "commercial" yet cost-constrained project.

So, there is my desire (ODBMS) and need (RDBMS). I must (need) have a running architecture with which to start.

I must make sure that the chosen "platform" (Hibernate, MySQL (PostgreSQL)) is "removable" for the day when (if) a compatible ODBMS becomes available, or the project at hand makes some money to warrant a change.

Hrmm, it certainly has you thinking. I will look further into the db4o world. Here is a thought: if db4o were running as a server, in the SAME way that MySQL runs as a server, then there is no impact on your licenses. db4o stays well within its platform, wrapped in server code .. hrmm .. cheeky. Let me take a look at that.

So, the server code could be GPL'd next to db4o.

Saturday, 12 May 2007

KISS - RDBMS and ODBMS

Well, you can guess the title right there (Keep It Simple, Stupid). Another title could have been (for controversy): why the heck are we still dealing with RDBMS platforms and architectures?
So living in the Object world of Java I have come in contact with many databases. Sybase, MSSQL, PostgreSQL, MySQL and some Oracle (in that order of exposure).

In each case I have had to deal with the RDBMS and the Relational Mapping layer (RM).

Experience has whispered to me that "there has to be a better way", but until about a year ago I was shy of stating it. I have known about ODBMSs for as long as I have known RDBMSs, but have never worked with one.

Certainly, legacy platforms (and I mean that in the "ODBMS to RDBMS" sense) have to stay RDBMS, and that is where OR mapping stays, but .. I want a simpler system.

I blogged some time back that my goal for the next project is to make it sans-SQL. Now why on earth would I want to do that? Well, for one, I want to reduce complexity. It's that simple, really (Keep It Simple, Stupid).

This is such a "hot" topic (many people have an opinion) that I am going out on a little (hopefully strong) limb and stating that RDBMS platforms are simply getting long in the tooth, and now is certainly the best time to move away from the RDBMS.

Why? (Why are you being so nuts? Why move away from proven technology? Why?)
I have just finished listening to a very good talk by Ted Neward on TheServerSide:
The webcast talk - Object/Relational Mapping and the Vietnam of Computer Science - Expert Webcast


This talk centres around the idea that when working in the object space, we should just not use RDBMSs. For background reading, the comment about OR mapping being the "Vietnam of Computer Science" is found at his blog.

One of the good quotes, which aligns with my previous post, is this (Ted was talking about the "lure" of OR mapping):

"[OR Mapping has a]... selling point that says, I don't have to think about SQL, I don't have to think about tables, I don't have to think about manually unwrapping this stuff myself.. phew .. I don't have to worry about these issues anymore, and then to get bit by them .. at the 11th hour"

So true: with Hibernate, OJB and JPA (etc. etc.), SQL is still there when you have to do complex stuff. It is still there when Crystal Reports or JReports or platform X comes into it.

I remember many a night sitting up trying to tune that sproc that was mapped to my business service, poring over SQL just to get my data right and quick. That HAS to cost something.

There has to be an extraordinary TCO attached to using OO with an RDBMS, one which could be shed if we removed the RDBMS altogether.

Of course, there are problems: how do we make sure the ODBMS is quick and efficient, and what do we do when we want reports? (These answers I am going to find out.)

Ted mentioned, rightly, that it all depends what is on top of the platform: is it objects, or relational data? What is the priority of the platform?

Back to the TCO. So if I have no DBA (only operations looking after the "service" and "data" tiers), no need for SQL developers (for sprocs and the like), and I don't have to worry about OR layers .. it has to mean that I have less to worry about, surely, which means that time to market, TCO etc. are lower. It just has to .. it has to mean a simpler system.

That's the KISS principle. A very interesting webcast, that.

So where to from here ? I have decided that my projects will not use *SQL. So that means I have to choose an ODBMS platform.

Two issues raised in the webcast which are VERY real are the following:

- How do you administer the chosen ODBMS ?
- How do you monitor the chosen ODBMS ?

Very true, kind of a worry (possibly). We'll see.


Thursday, 10 May 2007

So glad things have moved on...

Having been in IT for most of my (post age 13) life, I take notes, and lots of them.

Here is a snippet of some notes I took about 5 - 6 years ago. It's like my own personal "change control" sheet, from before I even really knew what the heck a change control "form" was.

Oh, the icemaster is almost always my "server". I have a gorgeous old 60's chrome script that belongs on the back of a car, which says "icemaster". The funny thing is I found it inside the freezer box of a 60's fridge. It is fantastic and has moved with me across all my servers over the years...

Why do I say things have moved on? Well, now Ubuntu (and other distros) just makes this stuff so much easier. I don't need to worry about installing stuff in any great detail like you used to have to.

I remember sitting down for 6 hours with Adam Spann (Druss) and working on Slackware, which was running kernel 1.2.28, in 1994. Gosh, those were the days of 2-hour kernel compiles!!!

I don't think I have even compiled a kernel for Linux in the last 18 months. Whoa .. is that even happening? Wait till I have kids; I might forget XML.

So .. here you are .. a snippet from 6 years ago.

icemaster - P120 32M RAM .. lots of disk space to play with as well

2001 07 12
After much fluffing around and installing Debian 2.2r3
I have a working systems
To do .. install SCSI modules for the SCSI cards
Configure the snd card it's conflicting on IRQ10
so I need to get it off there at some stage.

2001 07 12
Install Xvnc (vncserver) .. gives us X windows vnc Access .. Neato!
tar it out
xmkmf
make World
...
cd Xvnc
make World

make install and make install.man
Edit the /etc/vnc.conf file and uncomment the fontPath settings ..
All is well .. now let's get some X interface happening

2001 07 12
Installed samba 2.2.1a with default options into /usr/local/samba and /etc/samba/conf
/var/log/samba/%m.log

Have 'make'd apache 1.3.20 .. it isn't installed yet.


2001 07 16
Installed apache 1.3.20
Attempted to connect .. no go ..
after a few mins ..
Did an netstat -an
Ahh there the bugger is .. port 8080 ,Must have been because I was compiling as a user
Ahh well .. fixed now.

2001 07 16
Setup swat
/etc/services ---> swat 901/tcp
/etc/inetd.conf ---> swat stream tcp nowait.400 root /usr/local/samba/bin/swat swat

Now .. killall -HUP inetd
DOH! .. bad username/password ..
Looked aorund using google.. possible that it needs PAM configs to auth root
Seems to need:
icemaster:/home/rbuckland/software/samba-2.2.1a/source/pam_smbpass# less /etc/pam.d/samba
auth required /lib/security/pam_pwdb.so nullok shadow
account required /lib/security/pam_pwdb.so

Couldn't find pam_pwdb.so... Try net for 2 mins
Nope .. use dselect .. Ahh there it is. ** NOPE STILL NOT WORKING **

-- Recompiling samba
~/software/samba-2.2.1a/src/
./configure --with-pam
cd src
make
make install
... YESS!

2001 07 16
Setting up jakarta-tomcat (compiling it)
+ Installed j2sdk 1.3.1

2001 07 17
Setup a symlink from /etc/samba/smb.conf --> /usr/local/samba/lib/smb.conf
so that swat can use the same file (quick fix)



... and it went on, but I have snipped it

A close friend .. replaces Rod Johnson

Was just chatting with Vishal Puri, a very close friend and colleague of mine.
He said the following ..
--------------------------
Vishal: hehe all the best!
me going to india
on 29th may
me: Way ! When did this happen
Vishal: yeahh yesterday
confirmed only today
me: did u tell me and I wasn't listening ?
Vishal: speaking in place of Rod Johnson [smile]
--------------------------


I like the casual statement .. speaking "in place of" Rod. Now who gets a chance like that. :-) He left working with me to go and work for I21. Ish pish.

Good job Vishal, you'll do well.

http://www.sda-india.com/conferences/jax-india/speakers.php

Vendor lock in

Big appservers annoy me. WebSphere, WebLogic .. even JBoss sometimes. But, you know what? They are good. Now .. there's a reversal of statements.

I have used JBoss from the early V2 days and now use V4. If I could, everything would just run on Tomcat, but in banking circles this is just not a goer (wish it were).

Brands, and branding sells.

A colleague made an (un)true statement today .. spoken from the client's view ..

"It doesn't matter, we want Websphere because it scales".

Now, the statement itself is not the (un)true part; he said it tongue in cheek.

WAS scales, it does, and that's why our clients will use it, want it .. think they want it. But what about when you have a standalone app (runs on a plain JVM, thanks Spring) and then you run it in WebSphere and see a 300% degradation, ie a 1s response becomes 3s?

I suspect that the app is not "plugged" well into WAS 6.1. It's ServiceMix that we are talking about here..

I mean, where do you start? We can tune our code, but obviously the code runs well standalone, so it's off to WAS to see what's going on.

I'm not actually the dev doing the performance testing; that is another of our merry men. It's a hard task, and it is only solved by starting at the top.

Back to application servers. I hate vendor lock-in. It's a necessary evil though, and I make no qualms about using a product when it's good. WebLogic 9 was great to dev with. WebSphere 5.1 was clunky; 6 seems better. But Java is meant to be portable, and when you have a "from the start" idea that you will use one appserver, you might as well go Microsoft and get a far better "rubbing shoulders" experience than you will with Java.

A few things leave a bad taste in my mouth; one of them is the jboss-*.xml files that have to be in WARs, EARs etc .. That's just annoying. :-) (Maven solves that so I don't have to worry about it; come to think of it, so did Ant.)

So, I am a technologist who chooses product or library X because it works.
I hate something that doesn't work.

Stuff I found that works, absolutely brilliantly I might add, is Spring and Maven 2.
These two products alone have changed the way I view Java Development.

So how do I avoid vendor lock-in there?
Well, Maven doesn't go near my code; it sits on the edge, and Ant and Maven co-exist for sure.

Spring: some (not many) people have this conception that Spring runs all through your code. To me, if Spring is in your code, it's because you either found something there that really helps, or you have not thought well about how to keep it out.

Can't wait to see how Java Dev'ing on the Mac goes. Vishal says it's brilliant. My brother swears by it (but he's a designery type so he would ;-).

Bottom of the food chain

Free-stuff websites. That's the bottom of the food chain .. and now I have landed there.

Partly because my iPod was stolen many a year ago. But let me tell you more..

These websites work by offering a "free" prize in exchange (there, it's no longer free) for you completing an "offer" from an advertiser and then referring X number of friends to the site, X being based on the gift you choose.

These sites do work, simply. The offer might cost me anywhere between $1 - $20 or more, and if I sign up for a "subscription" with the advertiser, then more, of course.

The pinch is that the "gift wanter" may just take up an offer. I'm sure there are kickbacks.
Now why am I blogging this on the tech side?
Well .. I got lured. I have chosen an iPod as my free gift; I have to have 8 "friends" do the same thing: they go to the site and accept an offer. Then I get my iPod.

(Shameless link to "my" referral) http://gifts.freepay.com/?r=38178588

So here's the tech bit.

I chose the "advertiser" with the least boring thing, which was a $10/wk subscription to mobile phone games. I made sure there was an opt-out option (scoured the advertiser's website first) .. and then went through the "sign up" phase.

I entered my mobile #; it sent an SMS asking me to confirm, and I confirmed with OK.

The website, in the meantime, is "waiting for" my OK SMS. The SMS obviously gets picked up, because it moves to stage 2, which is "success", and sends me a "welcome SMS".

Now I go back to the gifts.freepay website and refresh my "you have not completed the offer" page ... No change.

Well .. I refreshed once or twice and then went to the help, which says to wait 1 - 15 days to see acknowledgement from the "advertiser" that I accepted the offer ... Bummer ...

Then, sure enough, an email arrives in my inbox saying my offer has been confirmed. There's the technology bit. I shouldn't be too surprised, but interestingly enough, it was server-to-server communication: the mobile phone company notified the freepay website that my offer was accepted.

Anyways, click on my link, sign up. I get an iPod. ;-)

And that is the story behind the title of this post.

Monday, 7 May 2007

OJB, Hibernate and the Cage

I have been working for one company for the past 2.5 years, and back about 3-4 years ago the ORM platform chosen was OJB. Now, Apache OJB (from the DB project) had quite a potential lead in the ORM world.

It was logical in its design, with goals for JDO compatibility etc. (it met them all).

Over time, users waned and moved on, or took up Hibernate. Hibernate, having the market share, now takes the cake for:
  1. Performance
  2. Configuration Ease
  3. Options
  4. Assistance
  5. Stability
and last and NOT least

6. Pluggability

This is OJB's main killer.
In my 2.5 years working with it, I can honestly say that it is the worst Java library I have ever used, ever had problems with, and ever more passionately wanted to move off.

Now, this is not a diss at the authors of OJB. By all means, they did a great job. But some things just died in there, and before you know it, the project is like a car running on rims with no tyres trying to drive the Nullarbor. Just ill-equipped for a big task.

My main pet peeves with OJB are the following.

1. Class loading issues


Deep in the bowels of OJB, it all starts with an OJB.properties file. This is the beginning of a painful end. OJB locates this from the classpath, and then, inside this properties file, locates your repository.xml file (akin to *.hbm.xml).

Now, the repository.xml has all the settings for the mappings, but it also requires you to declare your datasource (which can be JNDI), transaction demarcation (the OJB way) and a few other nasties.

Coding MUST be done such that infrastructure is a side concern. What do I mean? Write your code, but write it like it's on an island. Don't include code or logic that binds you to a database, or to a way of communicating across the wire.

This classloading issue gets ugly in many ways. If I wanted to merge repository files together, I had to do it by either writing an adaptor class to join them all up, or .. just hacking them together by hand (or with ant/maven).
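The "hack them together" approach can be sketched as a build-time splice. The fragment file names and classes below are hypothetical; `descriptor-repository` and `class-descriptor` are element names OJB's repository file uses, but check your DTD version for the exact required attributes.

```shell
# Hypothetical per-module fragments, each holding bare <class-descriptor>
# elements with no surrounding root element.
cat > repository_module-a.xml <<'EOF'
<class-descriptor class="com.example.a.Widget" table="WIDGET"/>
EOF
cat > repository_module-b.xml <<'EOF'
<class-descriptor class="com.example.b.Gadget" table="GADGET"/>
EOF

# Splice the fragments into the single repository.xml OJB insists on.
{
  echo '<descriptor-repository version="1.0">'
  cat repository_module-a.xml repository_module-b.xml
  echo '</descriptor-repository>'
} > repository.xml
```

Wired into an ant or maven step, this at least keeps each module owning its own mapping fragment, even though OJB only ever sees the merged file.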

But .. remember, a repository file is for one database .. so how do you do two databases (separate schemas)?

Know that we worked it out. How? By ensuring that each OJB instance, one per schema, was in a sibling or separate classloader.
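
The trick is that each sibling loader has its own classpath root, so each one resolves a different OJB.properties. A minimal sketch of just that mechanism (no OJB jars involved; paths and property contents are made up):

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;

public class SiblingLoaders {

    // Build a classloader rooted at the given directory, with no parent
    // delegation to the application classpath.
    public static URLClassLoader loaderFor(Path root) throws IOException {
        return new URLClassLoader(new URL[] { root.toUri().toURL() }, null);
    }

    // Read the OJB.properties that THIS loader sees.
    public static String configFor(URLClassLoader loader) throws IOException {
        try (InputStream in = loader.getResourceAsStream("OJB.properties")) {
            return new String(in.readAllBytes());
        }
    }

    public static void main(String[] args) throws IOException {
        // Two throwaway classpath roots, one per schema, each with its
        // own OJB.properties pointing at a different repository file.
        Path a = Files.createTempDirectory("schemaA");
        Path b = Files.createTempDirectory("schemaB");
        Files.writeString(a.resolve("OJB.properties"), "repositoryFile=repository-a.xml");
        Files.writeString(b.resolve("OJB.properties"), "repositoryFile=repository-b.xml");

        try (URLClassLoader la = loaderFor(a); URLClassLoader lb = loaderFor(b)) {
            System.out.println(configFor(la)); // schema A's configuration
            System.out.println(configFor(lb)); // schema B's configuration
        }
    }
}
```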

2. Inconsistencies between versions (minor versions)

The Apache Portable Runtime project has a very nice, succinct way to describe versions: http://apr.apache.org/versioning.html
OJB certainly did not follow any of these guidelines.

The library is still stuck on 1.0.4; 1.0.2 is about 3 years old, and 1.0.3 was just skipped because it was full of bugs (for us).

OJB 1.0.4 seems stable enough, but that was after a massive code change to our base product to make it work .. oh the pain.

Now, the observant among you would say that you shouldn't have to change code when you change your ORM tool's version. Sadly we had to, because of the legacy Java code, but even still, there was horrid work to be done just dealing with the incompatibilities.

Why upgrade then? Well, we found some things didn't work in 1.0.2 (something with key sequences and Sybase; yes, another groan, see my Sans SQL post) and 1.0.4 supported them.

From the site itself (on the news section)

12/2005 - OJB 1.0.4 released

Contains bug fixes and new features. For more details see release-notes.


Argh ..

3. It's just plain dead

The site is tumbleweed city. Everywhere you look, OJB just smells of a house that is falling down.
Where I work (for not much longer), we have a large application that relies on, depends on, and is hard-coded to OJB. It is an inherited platform, and we now know what to do, and what not to do, with OJB.

Some initial analysis has gone into replacing it with Hibernate, but a lot of the other code is going to be shelved, so it is almost not worth it now.

All the new work has a clear separation of ORM and Domain, so the ORM can easily be shifted in or out as you see fit. (Thanks, Spring.)
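
That separation boils down to something like this sketch (illustrative names, not our real code): the domain depends only on an interface, and the ORM-backed implementation (OJB, Hibernate, whatever) is chosen by the Spring wiring, never by the calling code.

```java
import java.util.HashMap;
import java.util.Map;

public class OrmSeparation {

    public static class Customer {
        public final long id;
        public final String name;
        public Customer(long id, String name) { this.id = id; this.name = name; }
    }

    // The domain's only view of persistence.
    public interface CustomerRepository {
        Customer findById(long id);
        void save(Customer c);
    }

    // In-memory stand-in; a Hibernate- or OJB-backed class would
    // implement the same interface and be swapped in via configuration.
    public static class InMemoryCustomerRepository implements CustomerRepository {
        private final Map<Long, Customer> store = new HashMap<Long, Customer>();
        public Customer findById(long id) { return store.get(id); }
        public void save(Customer c) { store.put(c.id, c); }
    }
}
```

Swapping ORM then means changing one bean definition, not touching the domain.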

I think the project (OJB) is dead, or should be left alone.
The mailing lists are a dying world.
http://mail-archives.apache.org/mod_mbox/db-ojb-user/
Summary

Why did I mention Hibernate in the title? Well, I'm never using OJB again, that is why.

OJB users, now and future, your mileage may vary, because you might know how to make OJB work sanely. The thing is, it was not obvious to me, or to about 6 other developers I worked with, who also have a bad taste in their mouths from this seemingly harmless library.


Have fun, and keep the code away from the hardware :-)

Thursday, 3 May 2007

More Thoughts .. Moving from that proverbial database

I posted back in November my desire to move away from the database, somehow.

Well .. my chance has come .. I will still use a database, but let's not have any SQL, shall we? I am moving work-house.

So, starting at this new company in a few weeks' time, I have the opportunity to not rely on a database. Let's read that another way: I won't have (or want) to write another piece of SQL.

Now, I actually have a great fondness for SQL. I actually became quite a proficient Sybase tuner (knowing the profiler better than the walk home) back in 2002 - 2004. So I actually like the stuff, and SQL is certainly the way to shift sheer large amounts of data.

But why another language? I will already have CSS, XHTML, JavaScript, Java and a hack of XML for Maven and config, oh and I'm sure my favourite XSLT language will land somewhere in the next big app I write .. so .. where does that leave SQL?

Well, I am going to go dry .. I will look at all (logical) options where I don't have to have any SQL. It is just another language that, when I am hiring or training or handing over, I don't want to be the only one in the team who knows so well that I get scared watching others work on it ..
(told you it was a rambling).

So what are the options ?
How can you develop an enterprise app that HAS a database but has no hand-written SQL (i.e. only auto-generated SQL)?

Here are some avenues to pursue.
- ODBMS (most likely)
- Auto Gen of OR Mappings
- Direct cache storage of Objects (eww!)

What happens when my objects grow, and what about the old data etc.? That will be fun. I don't want to have to write "update" SQL (remember, sansSQL).
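
To make the "auto-gen" avenue concrete, here is a toy illustration: derive an INSERT statement from an object's fields via reflection, so no SQL is written by hand. Real tools (ORM mappers, ODBMS layers) do this far more robustly (types, quoting, identity columns); the Donor class and the naming rule here are made up.

```java
import java.lang.reflect.Field;
import java.util.StringJoiner;

public class SqlGen {

    // A throwaway example entity.
    public static class Donor {
        long id = 7;
        String name = "Sara";
    }

    public static String insertFor(Object entity) throws IllegalAccessException {
        Class<?> type = entity.getClass();
        StringJoiner cols = new StringJoiner(", ");
        StringJoiner vals = new StringJoiner(", ");
        for (Field f : type.getDeclaredFields()) {
            f.setAccessible(true);
            cols.add(f.getName());
            Object v = f.get(entity);
            // naive literal rendering: quote strings, print everything else raw
            vals.add(v instanceof String ? "'" + v + "'" : String.valueOf(v));
        }
        // table name naively derived from the class name
        return "INSERT INTO " + type.getSimpleName().toLowerCase()
                + " (" + cols + ") VALUES (" + vals + ")";
    }

    public static void main(String[] args) throws IllegalAccessException {
        System.out.println(insertFor(new Donor()));
    }
}
```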

Shoot em down, clear em out, get them bugs !

Listening to #115 of the Java Posse in traffic this morning (taking the car to the mechanic's), the boys mentioned, as they have many times before, the FindBugs plugin.

I finally remembered again to go back to the podcast listing http://javaposse.com and take a look.

FindBugs looks great. I quickly ran it across two projects I had open in Eclipse, and it picked up 3 bugs in one and about 28 in the other.

Going by classes (I should go by LOC), 28 bugs across 60 classes is bad.

The other is 20 classes and 3 bugs .. good thing this second project is recent :-) while the other has been active for 3 years.
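
Back-of-the-envelope, the density gap between the two runs above works out like this:

```java
public class BugDensity {
    public static double perClass(int bugs, int classes) {
        return (double) bugs / classes;
    }

    public static void main(String[] args) {
        // figures from the two FindBugs runs above
        System.out.printf("3-year project: %.2f bugs/class%n", perClass(28, 60)); // ~0.47
        System.out.printf("recent project: %.2f bugs/class%n", perClass(3, 20)); // 0.15
    }
}
```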

Good job for that plugin I say.

Wednesday, 2 May 2007

JEE and Spring

I was in discussion with a colleague the other day who had not used Spring before. He was describing to me how Spring is not needed now because of JEE 5.
It suddenly occurred to me that anyone who hasn't used Spring doesn't really understand how it works.

So we were on the discussion of view handlers .. the bit that finally renders (X)HTML in the browser.

Here is my theory: Spring has become well known, and famous, for its lightweight IoC container. But people seem to miss the point that this is just one area of Spring. I can take the Spring Portlet code and pop it right next to the JEE stack and it's all good.

This is akin to saying let's run Struts in a JEE stack.
Same as me saying, let's run Spring MVC in a JEE stack.

Interesting.. Anyways.

Monday, 30 April 2007

The Ideal Java Development Environment - Post 1

Starting a new job in a few weeks, my mind has been drawn to the quickest way to set up a fledgling company with a stable build system and a structured project and development lifecycle for Java.

Now many of you will be observant enough to note that I have tagged a few big issues in that first paragraph. Yes, that is the problem .. or the focus of my attention for a short number of days (what, he's mad, setting something up in a short number of days?).

So where do you start ?

The new company I am working for primarily exists to raise funds for charities. Technically we are not not-for-profit, because it is not 100% of profit which goes to the charities, but let's just say it is more than 40%.

So .. costs are critical. We need to keep them fairly low, and software development certainly sits at the high end of resource (money) drains.

Of course, being a major supporter, user and contributor of Open Source, it will feature heavily in my new job.

At a brief glance, these are the items which I either (a) know intricately and will bring to the table, or (b) know I need to look at.

So let's start, shall we? Here is a quick rundown of items I will either be using, or will investigate ..
Build and Development
  • Eclipse - (for many years now)
  • Maven 2 - Without a doubt
  • Subversion - goes without saying
  • Continuum - (CI - Continuous Integration) - Would love to use Bamboo (atlassian)
Some things I want to take a look at (suspect they may be of benefit)
  • Bamboo
  • Buildix (ThoughtWorks Out-of-the-Box of all the above)
  • Trac (it's in the Buildix platform)
  • Artifactory for Maven - Definitely a Winner here
Some standard "stuff" I will always add to the Java projects
  • Checkstyle via Maven
  • Maven Site Build (customise it)
  • Spring (Security, JDBCTemplate)
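
Of those standards, Checkstyle via Maven is a one-liner to wire in. A sketch of the pom.xml fragment (these are the stock Apache plugin coordinates; checkstyle.xml is a hypothetical rules file for the project):

```xml
<!-- pom.xml (fragment): adds a Checkstyle report to the Maven site build -->
<reporting>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-checkstyle-plugin</artifactId>
      <configuration>
        <configLocation>checkstyle.xml</configLocation>
      </configuration>
    </plugin>
  </plugins>
</reporting>
```
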
Some stuff I will certainly look at
  • Maven Dependency Graphs http://philhoser.blogspot.com/2007/01/dependency-graph-for-maven-2.html
So that is a brief list of items that will form the new "bucko" workshop.
There are a few things missing from that list, but I'll keep you all updated as I grow this area.

Oh, one last thing I do want to add, once the team grows to >= 2 developers and is hooked up to the CI: build notification.

A lot can be said on this topic. Here is a place for it.

http://www.pragmaticautomation.com/cgi-bin/pragauto.cgi

I have seen a whacky lamp at a lighting shop in Balgowlah; hideous, $100 or so .. and it would be great to "turn it on" when it all goes bad. It's a foul-looking angel on the back of a dragon. Perhaps the ubiquitous lava lamp will do.

I will post the steps I take in the next few postings to detail what I set up. But it won't be for a few more postings yet, because I am still with my current (and fantastic) employer, so I need to be honourable there.

Aargh ! Letting Others 0wn U

Having a few issues with the blogs lately.

Hopefully this can be the final home of my blog. Blogger.com / Blogspot would not let me migrate the old account across to my recently heavily used Gmail account.

So I have ditched the old in favour of the new.

Sunday, 18 March 2007

Installing Ubuntu Server 6.10 on Dell PowerEdge 2400 Server

After much pain in trying to start the install, I was finally able to get the install underway.

What was the problem ?

I have 4 disks (2 x 9G and 2 x 18G), and the installer was not recognising the logical disk that I set up on the RAID array (2 x 8 as one, RAID1 or 0).

What was the "possible" solution ?

Apparently it is a commonly found problem with these MegaRAID adaptors.

Essentially I2O is a specification used in these RAID arrays.

Personally I don't care too much, I just need my server running.

Apparently, the I2O module (i2o_core) for Linux loads first, and then the megaraid module can't find the disks when it loads.

A few people found ways around it by manually loading the modules in the right order (RHEL, Fedora).
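
On Debian/Ubuntu-style systems, that module-ordering approach would look something like this sketch. It is not the fix I ended up using (see below), and the file name and exact syntax vary by distro and release:

```
# /etc/modprobe.d/blacklist (sketch): stop i2o_core loading automatically
# so the megaraid driver can claim the controller first
blacklist i2o_core
```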

My answer for Ubuntu 6.10 server:

I found a tip in the RHEL release notes for RHEL 5 Beta 2.

http://www.linuxcompatible.org/Red_Hat_Enterprise_Linux_5_Beta_2_s76179.html

It states: change the MegaRAID mode from I2O to Mass Storage.

I did that (Ctrl-M when the prompt pops up), and then went back through the install. No problem.

I can see my 9G and then also my 2 x 18G's.

Nice.

Thanks Red Hat. (Sorry I left you at RE 4.2)

Monday, 8 January 2007

Java and Schema Generation

You'd think that generating an XML Schema from a set of Java classes would be a simple thing. But gee, it is hard.

I have spent the better part of four hours and three coffees poring over the inner workings of the JAXB 2 SchemaGenerator classes. Odd stuff, this. I know I am close because I have it working for simple types, but I am trying to work out how to get it to handle a complex map of objects. Hrmm ..

[[ Edit ]]

I did work it out in the end. It was horrid code, in that there was so much of it, which I suspect means that I didn't have it right. (It does do what I needed it for, though, so close that door.)


