In between professional assignments, I have completed a couple of non-commercial (closed source) projects. These were mainly meant to provide better support for my various hobbies, but also earned some appreciation within the communities involved. As an additional benefit, I learned and applied some new technologies along the way. I have no plans whatsoever for further initiatives, mainly because these endeavours take quite a bit of time to complete (especially the last one) and because I feel it's time to give up writing software in favor of my more recently developing interests. Should any new software inspiration still come up, I'll make sure to pass it on to someone else ...
Each project incorporates a different custom static website generator, allowing for the regeneration of the entire website (from its respective XML main content source) by means of a single button push. Most projects took a couple of months full-time to complete (including initial content provision), except Magic Brew (which took more than a year full-time).
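The shared core of these generators — transforming an XML content source with a stylesheet in one invocation — can be sketched with the standard JAXP transformation API (javax.xml.transform, part of J2SE since 1.4). The one-line content and stylesheet below are invented for illustration; none of the real sources are published.

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class SiteGen {
    // Transform one XML content source into an HTML page with the given stylesheet.
    static String render(String xml, String xslt) {
        try {
            Transformer t = TransformerFactory.newInstance()
                    .newTransformer(new StreamSource(new StringReader(xslt)));
            StringWriter out = new StringWriter();
            t.transform(new StreamSource(new StringReader(xml)), new StreamResult(out));
            return out.toString();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        String xml = "<site><title>Curbies</title></site>";
        String xslt =
            "<xsl:stylesheet version='1.0' xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
          + "<xsl:output method='html'/>"
          + "<xsl:template match='/site'><h1><xsl:value-of select='title'/></h1></xsl:template>"
          + "</xsl:stylesheet>";
        System.out.println(SiteGen.render(xml, xslt));  // → <h1>Curbies</h1>
    }
}
```

The real generators chain several such transformations (plus parsing, calculation, and upload steps) behind a single build invocation.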
The OKbridge DoubleDummy Analysis website, containing detailed postmortem analyses of all bridge boards I've played through my online bridge club (please check the OKbridge webpage for a further introduction)
Developed in 2002 (no further content updates, discontinued since 2012)
Technology used: Windows XP, Eclipse, Ant, Java (J2SE), JDOM, XML, XSLT 1.0, XHTML 1.0 (Transitional), CSS, commercial GIB bridge component (doubledummy engine, Windows executable)
After emailed OKbridge tourney or individual board results have been manually captured as flat text files, dedicated Java handler components parse this input into a common XML representation. After that, another Java component (helped by a commercial 'doubledummy engine' bridge component) applies some non-trivial bridge algorithms, resulting in an enriched XML representation. The next Java component summarizes these (and past) results into global XML navigation and player/board performance information. The final Java component styles all generated XML into an XHTML/CSS based local website. New website parts (excluding player performance) are then automatically ftp'd to update the existing internet website. This whole parse/calculate/summarize/style/upload process is managed by a single Ant invocation, taking about 1.5 hours to complete for each new batch of 12 boards. (This is due to the intensive bridge calculations, extensive styling, and limited ftp bandwidth.)
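The single Ant invocation driving these steps could be sketched roughly as below; the target and class names are hypothetical (the actual build file was never published), but the dependency chain mirrors the pipeline just described, and the upload uses Ant's optional ftp task:

```xml
<!-- Sketch only: target and class names are hypothetical. -->
<project name="okbridge-dd" default="publish" basedir=".">

  <!-- 1. Parse captured flat text results into common XML. -->
  <target name="parse">
    <java classname="ResultParser" classpath="bin" fork="true"/>
  </target>

  <!-- 2. Enrich the XML via the doubledummy engine (the long-running step). -->
  <target name="calculate" depends="parse">
    <java classname="DoubleDummyAnalyzer" classpath="bin" fork="true"/>
  </target>

  <!-- 3. Summarize current and past results into navigation/performance XML. -->
  <target name="summarize" depends="calculate">
    <java classname="ResultSummarizer" classpath="bin" fork="true"/>
  </target>

  <!-- 4. Style all generated XML into the XHTML/CSS local website. -->
  <target name="style" depends="summarize">
    <java classname="SiteStyler" classpath="bin" fork="true"/>
  </target>

  <!-- 5. Upload new website parts (excluding player performance). -->
  <target name="publish" depends="style">
    <ftp server="example.org" userid="user" password="****" remotedir="/www">
      <fileset dir="site" excludes="players/**"/>
    </ftp>
  </target>
</project>
```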
The Curbies website, containing all possible information about the karting team I'm managing (please check the Curbies webpage for a further introduction)
Developed in 2006 (4-weekly content updates)
Technology used: Windows XP, WS_FTP, Eclipse, Ant, XML, XSLT 2.0, SAXON, XHTML 1.0 (Basic & Strict), CSS, JavaScript
All Curbies management information (including race results), news, and multimedia content is specified in a single XML file. When new Curbies Cup lap timings become available, an XSLT component calculates the new race & overall rankings, taking into account the different ranking algorithms used in different calendar years. After that, another XSLT component styles the enhanced XML data into a cross-referenced XHTML/CSS based local website. Finally, changed content is manually uploaded to the internet website (sometimes as little as a single web page). A nice feature of this website is that it can also be viewed on resource-constrained devices, such as early cell phones (achieved via a separate CSS style sheet).
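The year-dependent ranking dispatch might look roughly like this in XSLT; the element names and the cut-over year are invented for illustration, as the real stylesheets are not published:

```xml
<!-- Sketch only: element names and the cut-over year are hypothetical. -->
<xsl:template match="race">
  <xsl:choose>
    <!-- Earlier seasons used a different points scheme. -->
    <xsl:when test="number(@year) &lt; 2010">
      <xsl:call-template name="ranking-original"/>
    </xsl:when>
    <xsl:otherwise>
      <xsl:call-template name="ranking-revised"/>
    </xsl:otherwise>
  </xsl:choose>
</xsl:template>
```

As for the cell phone support: a separate style sheet would typically be attached via the link element's media attribute (CSS 2 defined a 'handheld' media type for exactly this), though the precise mechanism used here is an assumption.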
The Eric Baes website, i.e. my personal website (the one you're looking at right now ...)
Developed in 2007 (ad hoc content updates)
Technology used: Windows XP, WS_FTP, MS Word, Eclipse, Ant, XML, XSLT 2.0, SAXON, HTML 4.01 (Transitional)
Website content is captured in a single XML file and processed by means of an XSLT component, generating a simple but uniform HTML based local website (the character encoding differs somewhat on the generated ICT/CV page, due to the MS Word based HTML conversion). Everything is manually uploaded to the internet website.
The Magic: The Gathering Decks website, visualising constructed deck annotations and generated statistics, including sideboarding against different matchups
Developed in 2014 (no further content updates, discontinued since 2015)
Technology used: Windows XP, WS_FTP, Eclipse, Ant, XML, XSLT 2.0, SAXON, HTML 4.01 (Transitional), JavaScript
Website content is generated from XML deck description/annotation files and processed by means of an XSLT component, generating a simple but uniform HTML based local website. Everything is manually uploaded to the internet website.
The Magic Brew website, displaying tailored card search filters for more efficient deck construction and card pool synergy lookup, particularly supporting Booster Draft strategy discovery/improvement (please check the Gatherer webpage for the applicable filter criteria)
Developed in 2016/2017 (3-monthly content updates, discontinued since 2018)
Technology used: Windows XP, WS_FTP, Eclipse, Ant, XML, XSLT 2.0, SAXON, XHTML, HTTrack web crawler
Website content is generated from approximately 1000 custom defined/tested XML card search filters, by means of several XSLT components applied to all 'Constructed' and 'Limited' card pool XML specifications: first generating search filter queries, then extensively crawling the Gatherer website and removing empty-result filters, and finally adding cross-referenced glossaries, pointers to readable filter definitions, and a card pools overview as the generated website's entry point. Everything is manually uploaded to the internet website. (A full filters specification/testing/publishing cycle, including regeneration of the 'eternal' formats, is performed upon each new 'Standard' Magic card set release, i.e. currently quarterly.) Iteratively performing strict quality testing while narrowing down the initial search filters specification (eventually kicking off the generated website as a first release after incorporating the 'Kaladesh' card pool) was a 'once in a lifetime', immensely time-consuming endeavour!
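The first step — turning a filter definition into a search query for the crawler — might be sketched as follows. The parameter names and the base URL mapping below are invented for illustration and do not reproduce the actual Gatherer query syntax:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.LinkedHashMap;
import java.util.Map;

public class FilterQuery {
    // Build a search URL from filter criteria; the parameter names passed in
    // are illustrative only, not the real Gatherer query syntax.
    static String toQuery(String base, Map<String, String> criteria) {
        StringBuilder sb = new StringBuilder(base).append('?');
        boolean first = true;
        for (Map.Entry<String, String> e : criteria.entrySet()) {
            if (!first) sb.append('&');
            sb.append(e.getKey()).append('=')
              .append(URLEncoder.encode(e.getValue(), StandardCharsets.UTF_8));
            first = false;
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, String> filter = new LinkedHashMap<>();
        filter.put("color", "W");
        filter.put("type", "Creature");
        // → http://example.org/search?color=W&type=Creature
        System.out.println(toQuery("http://example.org/search", filter));
    }
}
```

In the actual pipeline this query generation was done in XSLT rather than Java, with the resulting URLs fed to the HTTrack crawler.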
In addition to all of the above (and in fact inspired by results from the first project above), I had been contemplating an even larger and more ambitious project. I had been thinking about illustrating its progress and results from different viewpoints (such as project management, requirements engineering, software architecture, and data quality), but my limited time availability would have meant getting extremely organized/focused and then proceeding in very small functionality increments, to the point where continuing alone would have become impractical. In fact, I had already crunched/implemented some hard algorithms and had a couple of backend components ready (plus some user interface prototypes), but I was obviously far away from a working end-to-end product. It's extremely unlikely I'd ever restart this project (especially since I stopped playing bridge), so I published some related ideas at
The Bridge Dummy website, intended as a bridge knowledge mining platform (powered by doubledummy number crunching), enabling improved bridge partner choice through automated player performance analysis
Halted in 2009, after a couple of months.