In Loki We Trust: the many projects of Lokkju, Inc


AngelList Newest Startups dashboard

Put together a quick hack to display the newest (claimed) startups on AngelList using the AngelList API. It lets you see startups as they are created, which hopefully means you see the coolest things as they come into existence.
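The hack itself is just "fetch, sort by creation date, show the newest". As a minimal sketch of that core step (the endpoint and the response fields shown here are assumptions, not the actual AngelList API shape):

```python
import json

def newest_startups(payload, limit=5):
    """Return (name, url) pairs for the most recently created startups.

    payload: a JSON document assumed to look like {"startups": [...]} with
    "name", "created_at", and "angellist_url" fields -- hypothetical shape.
    """
    startups = json.loads(payload)["startups"]
    startups.sort(key=lambda s: s["created_at"], reverse=True)
    return [(s["name"], s["angellist_url"]) for s in startups[:limit]]

# A made-up sample response, just to show the plumbing:
sample = json.dumps({"startups": [
    {"name": "Acme", "created_at": "2011-01-02", "angellist_url": ""},
    {"name": "Bolt", "created_at": "2011-01-03", "angellist_url": ""},
]})
```

The dashboard is then just a loop over `newest_startups(...)` rendering each pair as a link.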

Filed under: Code

Open Source Part of Speech Taggers

I'm on a new project at work, and it requires that we tag parts of speech. To that end, I'll be evaluating some of the existing open source NLP/POS taggers. I'll post results once I'm done, but for now here is a list of current open source POS taggers.

I'll be concentrating on MorphAdorner and OpenNLP, as they seem to have the most favorable licensing and the most activity.
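Whatever the tagger, the evaluation boils down to running each one over a gold-standard corpus and comparing tags token by token. A minimal accuracy sketch (the (token, tag) tuple format is an assumption about how I'll normalize each tagger's output):

```python
def tag_accuracy(gold, predicted):
    """Token-level accuracy: fraction of tokens whose predicted tag matches the gold tag.

    gold, predicted: parallel lists of (token, tag) tuples.
    """
    if len(gold) != len(predicted):
        raise ValueError("token streams differ in length")
    correct = sum(1 for g, p in zip(gold, predicted) if g == p)
    return correct / len(gold)

# Tiny worked example: 2 of 3 tags agree, so accuracy is 2/3.
gold = [("The", "DT"), ("dog", "NN"), ("barks", "VBZ")]
pred = [("The", "DT"), ("dog", "NN"), ("barks", "NNS")]
```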

Filed under: Code, Projects

New WordPress plugin published

I've just released a small WordPress plugin for showing PayPal funds received versus a target amount, broken down by time period (monthly, for instance).

You can find it here
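The plugin itself is PHP inside WordPress, but the core calculation is simple enough to sketch here in Python (the function name and the (date, amount) record layout are illustrative, not the plugin's actual code):

```python
from datetime import date

def progress_for_period(donations, target, year, month):
    """Sum funds received in a given month and report progress toward a target.

    donations: list of (date, amount) records -- illustrative structure.
    Returns (amount_received, percent_of_target) with the percentage capped at 100.
    """
    received = sum(amt for d, amt in donations if (d.year, d.month) == (year, month))
    pct = min(100.0, 100.0 * received / target) if target else 0.0
    return received, pct

# Two June donations count toward June's bar; May's does not.
donations = [(date(2010, 6, 1), 25.0), (date(2010, 6, 15), 10.0), (date(2010, 5, 3), 50.0)]
```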


Installing Mono and ASP.NET on Bluehost (and other shared hosting providers)

I got a request from a friend the other day to get MindTouch working on Bluehost. It's not working yet, but Mono, XSP, and mod_mono are all fully working, and I thought I'd share the process. It's pretty straightforward to get compiled and running, and only takes a few modifications to the build files.

details after the break...


Flex SWC class parser

Quick tool that prints out all the available classes exported by a SWC file.

Usage: [file [...]]

# Filename:
# Author : Lokkju Brennr <[email protected]>
# License : GPL
# Copyright 2010 

import sys
import getopt
import zipfile
from xml.etree import ElementTree

def usage():
  print "%s <swcfile>" % (sys.argv[0])
  print "  Prints all classes exported by the provided swc(s).  Supports wildcard globbing"
def main():
  # parse command line options
  if len(sys.argv) < 2:
    usage()
    sys.exit(2)
  files = sys.argv[1:]
  for swcfile in files:
    # a SWC is just a zip; the class list lives in catalog.xml
    z = zipfile.ZipFile(swcfile, 'r')
    catalogxml ="catalog.xml")
    tree = ElementTree.XML(catalogxml)
    # def elements are namespaced with the Flex SWC catalog namespace
    defs = tree.findall(".//{}def")
    for d in defs:
      id = d.attrib.get("id")
      type = d.attrib.get("type")
      if type is None:
        print "%s:%s" % (swcfile, id)

if __name__ == "__main__":
  main()

Supporting dynamic FlexFileSets in Flex’s compc and mxmlc Ant tasks

If you are using the mxmlc and compc tasks in Ant to compile flex code, there is no documented way to make the fileset-like children accept a dynamic include pattern - that is, one you set based on conditionals.

In my case, I need a list of included libraries in my build properties, and to include only those in my compc task. The solution is to use patternsets and a custom Ant macro. See the source below, but essentially you do the following:

  1. Create a new patternset, assigning it an id
  2. Use the macro to add a new pattern for each entry in your list of library patterns (as defined in your build properties, dynamically, or...)
  3. Assign that patternset to the library-path of compc

Important: All the patterns must descend from the same root directory as set in the library-path. If you need multiple root directories, you must use multiple library-path directives and multiple patternset refids.


<?xml version="1.0" encoding="utf-8"?>
<project name="My Component Builder" basedir=".">
  <taskdef resource="flexTasks.tasks" classpath="${basedir}/flexTasks/lib/flexTasks.jar" />
  <property file=""/>
  <property name="FLEX_HOME" value="C:/flex/sdk"/>
  <property name="DEPLOY_DIR" value="c:/jrun4/servers/default/default-war"/>
  <property name="COMPONENT_ROOT" value="components"/>

  <!-- Appends its nested patterns to the patternset with the given id.
       The macro's name was lost from the original post; "patternset-append" is a stand-in. -->
  <macrodef name="patternset-append">
    <attribute name="patternset"/>
    <element name="nested" optional="yes" implicit="true"/>
    <sequential>
      <patternset id="tmp">
        <patternset refid="@{patternset}"/>
      </patternset>
      <patternset id="@{patternset}">
        <patternset refid="tmp"/>
        <nested/>
      </patternset>
      <patternset id="tmp"/>
    </sequential>
  </macrodef>

  <patternset id="compc.library-path" />

  <!-- the <for> task is provided by ant-contrib -->
  <for list="${compc.libraries}" param="lib">
    <sequential>
      <patternset-append patternset="compc.library-path">
        <include name="@{lib}" />
      </patternset-append>
    </sequential>
  </for>

  <target name="main">
    <compc output="${DEPLOY_DIR}/MyComps.swc"
        include-classes="custom.MyButton custom.MyLabel">
      <source-path path-element="${basedir}/components"/>
      <include-file name="f1-1.jpg" path="assets/images/f1-1.jpg"/>
      <include-file name="main.css" path="assets/css/main.css"/>
      <library-path append="true" dir="${compc.libdir}">
        <patternset refid="compc.library-path" />
      </library-path>
    </compc>
  </target>

  <target name="clean">
    <delete>
      <fileset dir="${DEPLOY_DIR}" includes="MyComps.swc"/>
    </delete>
  </target>
</project>


Adding functionality to HTSQL v2

HTSQL is a very cool open source product that gives you a RESTful interface to multiple database backends. Because it uses its own simple but very powerful syntax, you avoid most of the risks involved in passing raw SQL. Currently it supports SQLite and PostgreSQL, but for my current project, I need to support Geometry columns.

For the first draft, I just wanted to add support for Spatialite, a spatially enabled version of SQLite. Since SQLite is already supported, this turned out to be relatively easy - though in hindsight, I may not have implemented it in the simplest way possible - but I'll get to that in another post.

So, each database backend lives in a namespace called htsql_[name], and each subclasses files in the main htsql namespace. I started by cloning the htsql_sqlite tree into a new namespace called htsql_spatialite, and ripped out most of the code, leaving me with a basic structure. I then subclassed any SQLite classes I wanted to override - most importantly:

  1. Changed to import pyspatialite instead of pysqlite2.
  2. Added my own Column and Data types (Domains) in
  3. Modified to handle my custom Domains, as well as to handle the blank Column type sometimes given for Geometry columns

I also overrode a few classes in tr/, and this is where most of my functionality was added:

  1. A new class, SpatialiteSerializeLeafReference (subclassing SerializeLeafReference), tests whether I am selecting a Geometry column and, if so, wraps it in the "AsText" function to return WKT.
  2. A new adapter, FormatGeometry, handles the representation of the WKT when returned to the client. Right now only HTML is supported, but JSON, CSV, and the rest are easy to add in the same way.
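Stripped of HTSQL's adapter machinery, the AsText wrapping in step 1 amounts to rewriting the serialized column expression. A toy sketch of the idea (AsText is Spatialite's function; the helper and table here are illustrative, not HTSQL code):

```python
def serialize_column(name, is_geometry):
    """Wrap geometry columns in AsText() so they come back as WKT text."""
    return 'AsText("%s")' % name if is_geometry else '"%s"' % name

# Building a SELECT over a mix of plain and geometry columns:
cols = [("id", False), ("geom", True)]
sql = "SELECT %s FROM parcels" % ", ".join(serialize_column(n, g) for n, g in cols)
```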

The last thing you have to do is add a line in to point to your new database engine's entry point; it is in a list called ENTRY_POINTS.

Interestingly, I think I could make better use of the plugin architecture - but as I'm just discovering HTSQL, and there aren't many samples or much documentation, I'm pretty happy with what I accomplished.

You can see the full source of my additions at


Converting from Spatialite to PostGIS

A quick one-liner using ogr2ogr to convert from spatialite to postgis:
ogr2ogr -f PostgreSQL PG:"host=host_ip user=username password=password dbname=database" -lco LAUNDER="YES" sqlite.sqlitefile -skipfailures
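To run the same conversion over a directory of Spatialite files, the command is easy to build up in a script. A small sketch (the connection parameters are placeholders, matching the one-liner above):

```python
import glob
import subprocess

def ogr2ogr_cmd(sqlite_path, host, user, password, dbname):
    """Build the ogr2ogr argument list for one Spatialite -> PostGIS conversion."""
    conn = "PG:host=%s user=%s password=%s dbname=%s" % (host, user, password, dbname)
    return ["ogr2ogr", "-f", "PostgreSQL", conn,
            "-lco", "LAUNDER=YES", sqlite_path, "-skipfailures"]

# Uncomment to actually run the conversions:
# for path in glob.glob("*.sqlite"):
#[path, "host_ip", "username", "password", "database"]))
```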


Using Office Automation on IIS7

Though there are a lot of articles out there on Office automation in .NET (most of them telling you not to do it), there are very few covering how to get Office automation up and running under IIS7 on a 64-bit machine - and it is possible.

I needed to do this recently, and found one hint on how to get it working at . The key is to use Process to launch the application you need, then attach to it.

Sample code in C#:

Microsoft.Office.Interop.Word.Application app = null;
Process proc = null;
Document doc = null;
try
{
    // Launch Word ourselves, hidden, then attach to the running instance
    ProcessStartInfo procinfo = new ProcessStartInfo(WORD_PATH, "");
    procinfo.WorkingDirectory = AppDomain.CurrentDomain.BaseDirectory;
    procinfo.CreateNoWindow = true;
    procinfo.WindowStyle = ProcessWindowStyle.Hidden;
    proc = Process.Start(procinfo);
    // a short retry loop may be needed here before Word registers itself
    app = (Microsoft.Office.Interop.Word.Application)System.Runtime.InteropServices.Marshal.GetActiveObject("Word.Application");
    if (app == null) { throw new Exception("Word not found"); }
    app.Visible = false;
    app.DisplayAlerts = WdAlertLevel.wdAlertsNone;
    #region Declare Params
    object fileName = filename;
    object ConfirmConversions = false;
    object ReadOnly = true;
    object AddToRecentFiles = Type.Missing;
    object PasswordDocument = Type.Missing;
    object PasswordTemplate = Type.Missing;
    object Revert = Type.Missing;
    object WritePasswordDocument = Type.Missing;
    object WritePasswordTemplate = Type.Missing;
    object Format = Type.Missing;
    object Encoding = Type.Missing;
    object Visible = false;
    object OpenAndRepair = Type.Missing;
    object DocumentDirection = Type.Missing;
    object NoEncodingDialog = Type.Missing;
    object XMLTransform = Type.Missing;
    #endregion
    doc = app.Documents.Open(ref fileName, ref ConfirmConversions, ref ReadOnly, ref AddToRecentFiles, ref PasswordDocument, ref PasswordTemplate, ref Revert, ref WritePasswordDocument, ref WritePasswordTemplate, ref Format, ref Encoding, ref Visible, ref OpenAndRepair, ref DocumentDirection, ref NoEncodingDialog, ref XMLTransform);
    // ... work with the document here ...
}
finally
{
    object saveChanges = WdSaveOptions.wdDoNotSaveChanges;
    object OriginalFormat = Type.Missing;
    object RouteDocument = Type.Missing;
    if (doc != null)
    {
        try
        {
            ((_Document)doc).Close(ref saveChanges, ref OriginalFormat, ref RouteDocument);
            doc = null;
        }
        catch { }
    }
    if (app != null)
    {
        try
        {
            ((Microsoft.Office.Interop.Word._Application)app).Quit(ref saveChanges, ref OriginalFormat, ref RouteDocument);
            app = null;
        }
        catch { }
    }
    // Make sure the Word process we launched is really gone
    try { if (proc != null && !proc.HasExited) proc.Kill(); }
    catch { }
}


Installing Hyper-V Linux Integration Components v2 in CentOS 5.2

Microsoft still hasn't released a CentOS or RHEL RPM for Hyper-V's Linux Integration Components, so you still have to build them yourself. Of course, since neither RHEL nor CentOS is a supported platform, Microsoft won't help you much.

So, with help from Julian Field's work log on installing the original LIC v1, I've put together a minimal set of instructions for performing a LIC v2 install on a fresh CentOS 5.2 system.

I started with an absolute minimal install, so no packages other than those below should be needed.

Lines starting with "$" are shell commands.
Lines starting with "#" are something you need to do.

# In Hyper-V: Mount CentOS 5.2 ISO image

$ mkdir -p /media/cdrom
$ mount /dev/cdrom /media/cdrom
$ yum --disablerepo=\* --enablerepo=c5-media install gcc make gnupg kernel-devel
$ umount /dev/cdrom

# In Hyper-V: Mount Linux Integration Components ISO image

$ mkdir -p ~/linux_ic2
$ mount /dev/cdrom /media/cdrom
$ cp /media/cdrom/drivers/dist/* ~/linux_ic2/ -R
$ cd ~/linux_ic2/
$ make install
$ reboot


To verify, look for the new seth* network interface.

Wasn't that easy?
