My interest in the concept of distributed computing started back in the mid-1990s, almost by accident. I was an Analyst Programmer in a small Life Assurance company and we had a big problem in our Telesales Department. They were using a bespoke quotation system built in Paradox for Windows under Windows 3.1, and every time a quotation was printed out for a customer the user's machine performing the print would lose 2% of Windows system resources. Eventually, on hitting around 30% remaining resources, the quotation printing function would start producing garbage in the form of random characters on the printed page. In addition, the printing process, which required a number of differently formatted reports, sometimes took up to 5 minutes to run. During this time the user's machine would whir away with various windows popping up, coming and going, in an old-school buffered-keystrokes sort of way, until the print was sent to the print queue. This issue went on for months and cost a lot of time, frustration, and wasted paper. The problem was thrown my way, and the solution I was asked to build was based on re-engineering the printed reports from Paradox for Windows into Crystal Reports.
Doesn’t sound like distributed computing yet, does it? Well, to relieve the burden on users’ machines we sent each print request to a queue held in a database table, and the print jobs themselves were serviced by sentinel machines, i.e. PCs that polled the queue every 30 seconds and picked up work as it arrived. Me being me, I built the software in such a way that more sentinel machines could be added, plus I wrote it to be able to run executable programs at certain times of the day, i.e. a scheduling service. There was no processing or messaging between sentinel machines or user machines at first, but as time went on I built different types of jobs that automated tasks to relieve the burden on various systems in the company. I did try to persuade the company to purchase the early versions of DCOM, which allowed communication between different machines on a network, but I was knocked back. Rather than be defeated, I wrote my own version, which made use of hot-pluggable DLLs with text files in a structured format as the messaging medium. The success of this enabled me to save the company money on licence fees for things like address/postcoding software, as well as automate the collection of Management Information from Production copies of data into spreadsheets with all kinds of predictive operations and sales forecasts built in.
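For readers unfamiliar with the pattern, the sentinel idea above can be sketched in a few lines. This is a toy illustration only, not the original Paradox/Delphi code: the table, column, and function names are my own inventions, SQLite stands in for the company database, and a real sentinel would render a Crystal Report where this version merely marks the job done.

```python
import sqlite3
import time

def setup_queue(conn):
    # Jobs land here; 'pending' rows are waiting for a sentinel to claim them.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS print_queue ("
        " id INTEGER PRIMARY KEY,"
        " payload TEXT,"
        " status TEXT DEFAULT 'pending')"
    )

def submit_job(conn, payload):
    # A user's machine does only this cheap insert, then gets on with its day.
    conn.execute("INSERT INTO print_queue (payload) VALUES (?)", (payload,))

def poll_once(conn):
    """One polling pass: claim and process every pending job, oldest first."""
    rows = conn.execute(
        "SELECT id, payload FROM print_queue "
        "WHERE status = 'pending' ORDER BY id"
    ).fetchall()
    handled = []
    for job_id, payload in rows:
        # The real system printed a report here; we just mark the job done.
        conn.execute(
            "UPDATE print_queue SET status = 'done' WHERE id = ?", (job_id,)
        )
        handled.append(payload)
    return handled

def run_sentinel(conn, interval=30, cycles=None):
    """Poll the queue every `interval` seconds, forever or for `cycles` passes."""
    n = 0
    while cycles is None or n < cycles:
        poll_once(conn)
        n += 1
        time.sleep(interval)
```

Because the queue lives in a shared table, scaling out is just a matter of pointing another PC's `run_sentinel` loop at the same database, which is exactly why more sentinel machines could be added so easily.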
Looking back, I realise that I was actually a little out of control. I told my boss at the time that from 9 to 5 I would work on whatever I had officially been asked to work on, but after that I would do what I wanted. It actually worked out pretty well, both for the company and for myself, because I was, and have always been, of the mind that if you look after the company then the company will look after you. I was very creative, worked very quickly (and for long hours, I might add), and I pushed the capabilities of the company forward in any way I could. I also pushed the development team, by sheer force of effort, into using Delphi for software development, and in my own time I rewrote 3 or 4 systems, all because I wanted to make the company more efficient as well as to further my knowledge. A couple of these system rewrites were done in no more than a couple of weeks. I built everything in a component-based fashion, and within a year or so I took the company from being able to generate one quotation per user every 5 minutes to 1200 quotations a second, by moving the mechanics of the actuarial quotation engine from lookup data in database tables to in-memory structures. I built DLLs for fun and wrote Delphi components in my spare time. It’s just how I was. I was also always being given all kinds of special tasks directly by the Managing Director of the company, an Actuary, and did things a software developer shouldn’t really get involved in. For instance, I helped to administer the company’s reassurance programmes and worked on pricing model development for new products. I enjoyed learning the maths and putting the concepts of component-based development into practice. I also got involved in writing management information reports, building predictive sales models using chain-ladder (development triangle) techniques, and automating their production.
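The chain-ladder technique mentioned above deserves a quick illustration for anyone who hasn't met development triangles before. The sketch below is a deliberately tiny, made-up example, not any model I built at the time: each row is an origin year, each column a development period of cumulative figures, and the classic chain-ladder method estimates development factors from the completed columns and uses them to project the unfilled cells to ultimate values.

```python
# A toy 3x3 cumulative development triangle (None = not yet observed).
triangle = [
    [100, 150, 165],    # origin year 0: fully developed
    [110, 160, None],   # origin year 1: one period still to develop
    [120, None, None],  # origin year 2: two periods still to develop
]

def development_factors(tri):
    """Column-to-column factors: ratio of sums over rows where both cells exist."""
    factors = []
    for j in range(len(tri[0]) - 1):
        num = sum(row[j + 1] for row in tri if row[j + 1] is not None)
        den = sum(row[j] for row in tri if row[j + 1] is not None)
        factors.append(num / den)
    return factors

def project_ultimates(tri):
    """Roll each row's latest known value forward through the remaining factors."""
    factors = development_factors(tri)
    ultimates = []
    for row in tri:
        j = max(k for k, v in enumerate(row) if v is not None)
        ult = row[j]
        for f in factors[j:]:
            ult *= f
        ultimates.append(ult)
    return ultimates
```

The same two functions work unchanged on larger triangles, which is why the production of these forecasts was so amenable to automation.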
Eventually, of course, I outgrew the I.T. Department and moved into Risk Management. I learnt more about Life Assurance products, and as part of my new role I owned the development and production of the company’s Management Information. During this period I studied the products the company sold and reverse-engineered our pricing models, the only “documentation” of which was the code in Object Pascal, back into Excel. I also collected relevant data from everywhere, studied the UK housing market, and was introduced to the world of Stochastic Modelling.
So – I’ve kind of gone off at a tangent here. In writing this I allowed myself to do so, partly because I’m still amazed I got away with doing everything I did, and also to demonstrate that when it comes to doing what I’m interested in, I’m very driven. The foundations I laid back in the 1990s and early 2000s, in terms of both the work I was doing and the skills I taught myself, have in a very real sense come full circle.
I “grew up” around Actuaries and developed a feel for the technical lifecycle of insurance products. I just wanted to learn and to make a meaningful contribution. I often discovered that the things people did in business weren’t always for the good of the company – not everybody was altruistic in their intentions – and the back and forth of management politics bothered me immensely. What I set out to do, to kill off some of the pointless discussion, was to ensure that the data and information were up to date and drawn from the same datasets. Doing this one simple thing at least gave any discussion about business process a stable basis – no more arguing about whose data was correct! The next step was to build statistically relevant analysis.
So the concept of using statistical analysis for decision making is not new to me. What is new is that companies everywhere, and not just those predisposed to using data (insurance companies), are making decisions this way. What’s more, the computing power needed to analyse data, given the huge mountain of it generated every day, is more readily available than ever before. Somehow all of the required elements have come together and created a perfect storm in my fractured mind. Distributed and parallel processing is an important concept and one that I’m determined to learn. To learn it from the ground up, I decided a while back that I wanted to build a supercomputer, or cluster, using Raspberry Pis. It just so happened that the triggers for my interest in cluster development occurred just as the Raspberry Pi 3 was released, so I naturally gravitated towards the idea of using those. This series of posts will record my foray into the development of my Pi 3 cluster.