We are all a sum of our experiences, whether self-taught or shaped by others along our journey through life. The information below represents my journey and will hopefully shed some light on my background.
I have always believed that the true record of a career rests on the contributions and achievements one makes and the people one influences, rather than on the name or size of a particular organization. Whilst I have worked in organizations of various sizes, I have most enjoyed those that presented complex challenges, often hindered by budget constraints, and those that permit and appreciate innovation.
|Period||1994 – 1999|
|Organization Overview||I started my career as employee #7, as a member of the support team assisting customers with technical support enquiries. In the days of the early Internet, well before the dot-com boom, this small ISP was supported by 15 incoming analog lines and a 64k ISDN connection. After being acquired a number of times and going from strength to strength, it became the third largest Internet service provider in Southern Africa before being acquired a final time to form MWEB, which is still in operation today.|
|Positions Held||Technical Support, Systems Engineer, Network Engineer, Software Developer (R&D)|
During my early career I was fortunate to experience several roles and opportunities.
Day-to-day I provided telephone support to customers on a range of technical matters, from dial-up problems to software and PC issues.
Having taken a keen interest in the customer experience, I undertook the configuration and maintenance of the software install disks, making use of custom scripts for modem initialization as well as the Netscape and Microsoft Internet Explorer SDKs to apply branding and customized settings.
With limited access to the server I built and maintained a support area of the official web site, with this copy captured by the Wayback Machine in 1997.
I was lucky enough to be given a decommissioned Sun SPARC 20 on which the only two things that would work were SunOS and Linux. The device had previously been the primary dial-in server for SLIP connections. Having never experienced Linux, I installed Slackware onto the machine and started learning the intricacies of Linux and systems administration.
The machine ran for several years as an IRC server which helped me transition to a different team as well as experience communications with different parts of the world.
Whilst part of the systems engineering team I was responsible for day-to-day administration of servers and services, data migration, security, upgrades and other traditional functions.
Working with a completely manual Linux distribution (Slackware) I quickly learnt two things: how to write scripts and how to compile (and debug) software. Where certain features were wanting I was able to extend the code base and implement the missing features.
Having made certain customizations to the Livingston RADIUS server, I was transferred to the software development team to undertake further projects. It was a thoroughly enjoyable experience which jump-started my interest in innovation and creating new technologies.
Working with the team I undertook various projects in the development of information systems such as the early days of billing and CRM systems. Whilst most initial code started as either C or C++ based much of the code base was transitioned to Java in the later years (1998 onwards).
I was very fortunate from the beginning of my career to be afforded multiple opportunities to experience different areas of the organization and I quickly found my flair for innovation.
In the days well before popular services such as YouTube, Icecast and Shoutcast I discovered a US-based company selling ‘RealAudio’. With the correct hardware in place this amazing software could digitize an audio stream and serve it to thousands of online listeners which was virtually unheard of at the time. In fact MP3 was not yet widely adopted or available. With support from a commercial radio station I established the first live audio streaming service in Southern Africa.
Click play to listen to the audio quality of early Internet radio streaming
Whilst the ISP was rapidly growing, a new problem was emerging: how does one become aware of service outages? Network equipment of that era was controlled via either serial access or telnet, and neither SSH nor APIs were available on Linux servers.
With no ready solution available, the initiative was taken to abstract the serial and telnet services into accessible layers and to wrap them in an easily accessible service suitable for polling. Similar initiatives were made to effect the same abstraction for SMTP and POP3 services.
Consequently, with the ability to poll services for status, a trivial yet functional monitoring and alerting system was developed.
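The shape of such a poller can be sketched in modern Python (the original was period-specific scripting; the status-record fields and alert format below are assumptions for illustration):

```python
import socket

def poll_service(host, port, expect=b"", timeout=5):
    """Open a TCP connection and, when a banner prefix is given, verify the
    service greeting. Returns a status record suitable for a polling monitor;
    with no expected banner this is a simple port-open check."""
    try:
        with socket.create_connection((host, port), timeout=timeout) as s:
            banner = s.recv(128) if expect else b""
            return {"host": host, "port": port,
                    "up": banner.startswith(expect),
                    "detail": banner.decode(errors="replace")}
    except OSError as exc:
        return {"host": host, "port": port, "up": False, "detail": str(exc)}

def classify(results):
    """Reduce poll results to alert strings for services that are down."""
    return [f"{r['host']}:{r['port']} DOWN" for r in results if not r["up"]]
```

An SMTP check would pass `expect=b"220"`, a POP3 check `expect=b"+OK"`; a cron job calling `classify` over all results gives the trivial alerting loop described above.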
Most likely the pinnacle of my career at Global Internet Access was the invention of a tool to analyse the quality of analog modems. At the time South Africa was already hindered by a strain on the telecommunications infrastructure due to the emergence of the Internet, and ISPs were being hampered by confusion over the two competing high-speed standards (K56Flex and V.90).
Cisco Systems had just released their new flagship access gateway (AS5300) and, whilst it supported primary rate ISDN, its firmware had mixed success with 56k analog signals, leading to unhappy customers and strained relationships. With access to an ISDN line and a PCI dial-up card I developed a driver to place an analogue call and analyze the tone and carrier quality of the handshake. These fundamental lessons in digital signal processing placed me on a path to do other amazing things.
Whether it was this innovation or the sheer influx of complaints to Cisco that led to the fix, the result was that overall stability was achieved and the equipment continued to serve the provider until the superseding ADSL technology became mainstream some years later.
Before the advent of digital broadcasting, analogue televisions produced pictures by scanning lines from the top left to the bottom right of the screen. When the scan reached the bottom right there was a brief interval, known as the vertical blanking interval, during which the TV did not draw a picture; otherwise you would see a white diagonal line across the screen.
Whilst the television did not display an image during this interval, it still received a signal. By multiplexing data into the vertical blanking interval, the television signal was able to carry a limited amount of data, mostly text, without interfering with the quality of the video and audio transmission.
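To illustrate the idea, teletext-style multiplexing packs text into fixed-width rows, one per blanking line, with each 7-bit character protected by a parity bit. The Python sketch below is a simplification (real teletext adds clock run-in, framing codes and Hamming-protected addressing before each row):

```python
def odd_parity(byte):
    """Set the 8th bit so each 7-bit character carries odd parity,
    letting the receiver detect single-bit transmission errors."""
    ones = bin(byte & 0x7F).count("1")
    return (byte & 0x7F) | (0x80 if ones % 2 == 0 else 0x00)

def pack_rows(text, width=40):
    """Split text into fixed-width rows (teletext uses 40 characters per
    row), each destined for one vertical blanking line."""
    return [text[i:i + width].ljust(width) for i in range(0, len(text), width)]

def encode_row(row):
    """Encode one row as parity-protected bytes ready for multiplexing."""
    return bytes(odd_parity(ord(c)) for c in row)
```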
I was handed a TV receiver card and software from the major broadcaster (South African Broadcasting Corporation) with a brief to “try and do something with this“.
Using some very early-stage APIs and access to services such as airport data, the weather bureau, Usenet news groups, and media sources from Reuters and AAP tapped by wire protocol, I was able to establish a continuous information feed. The project progressed to a content management platform whereby the broadcaster and third parties could manage information feeds. Sadly, as teletext-capable television sets were not prominent and the service was not widely accessible, the project was decommissioned.
I had been blogging before the term ‘blog’ was invented. In 1997 I maintained a popular weekly (highly opinionated) column called ‘Attitude On-Line’ which crossed over on occasion to talk-back radio.
|Organization Overview||Before migrating to Australia I furthered my knowledge in the delivery of Internet streaming and datacasting with Global Access, a private television broadcaster in Southern Africa operating multiple satellite TV stations.|
Their impressive clientele included banking institutions and vehicle and machine manufacturers who used the service to produce and broadcast training materials and multimedia content.
|Positions Held||Integration Engineer|
As a satellite TV channel can carry a significant amount of downstream data, albeit one-way, this afforded me an opportunity to establish a content delivery platform that would make use of MPEG data services to transmit and deliver multimedia content.
Predominantly the scope of the role included the development and ongoing stabilization of this content delivery platform.
Although initially tasked with a content delivery platform, I delivered two additional initiatives.
As satellites are inherently multicast and deliver the same content to multiple subscribers, a system was developed to deliver produced content to various establishments. A computer connected to a TV or monitor would play the produced content locally, providing both entertainment and information to customers waiting in a queue.
The solution relied upon developing a ‘packed’ data stream that would cause the intended targets to download and store the content locally. As the stream included forward-error correction, very few packets were missed. For incomplete transmissions, the individual computers would signal back to the platform any missed data chunks, which would then be included in subsequent stream broadcasts.
In this manner a single broadcast channel was reusable across multiple different types of targets, and content was efficiently disseminated across multiple subscribers and industry types.
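The chunk-and-repair cycle described above can be sketched as follows (Python; the chunk size and signalling shape are illustrative assumptions, and the real system layered forward-error correction on top of the stream):

```python
def split_chunks(payload, size=1000):
    """Split produced content into sequence-numbered chunks for broadcast."""
    return {i: payload[off:off + size]
            for i, off in enumerate(range(0, len(payload), size))}

class Receiver:
    """A site computer: stores chunks as they arrive and reports any gaps
    back to the platform for inclusion in a later broadcast."""
    def __init__(self, total):
        self.total = total
        self.got = {}

    def receive(self, seq, data):
        self.got[seq] = data

    def missing(self):
        # These sequence numbers would be signalled back for re-broadcast.
        return [i for i in range(self.total) if i not in self.got]

    def assemble(self):
        return b"".join(self.got[i] for i in range(self.total))
```

Because every receiver consumes the same broadcast, a chunk re-sent for one site also repairs any other site that missed it, which is what makes the single channel efficient across many subscribers.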
Taking advantage of the inherent multicast nature of satellite broadcasts, I developed a way to include multimedia streams inside MPEG data and for those streams to be received by an on-premises gateway located at the customer.
This gateway could then convert the stream to local unicast streams or make use of network multicast to deliver video and audio to desktops with near-perfect synchronicity.
|Period||2000 – 2003|
|Organization Overview||After migrating to Australia I joined Pacific Internet (later PacNet) as a Systems Engineer in Sydney undertaking a range of systems and data migration projects.|
Pacific Internet (Australia) is part of the larger PacNet group specializing in the provision of business-grade Internet services and technologies in Australia.
|Positions Held||Systems Engineer|
Although primarily tasked with migration activities I assisted in the day-to-day management of servers and infrastructure as well as participating in after-hours on-call and monitoring activities.
My journey at Pacific Internet saw me assist in many interesting activities, such as the migration of complex financial records between systems as well as smaller projects in networking and voice systems.
The Pacific Internet Mail Proxy was an innovative proxy solution for customer email. At its core it was a fully functional POP3, IMAP4 and SMTP front-end proxy responsible for the mapping of customer email addresses and delivering these to the correct server and mailbox.
As Pacific Internet was migrating and integrating the customer bases of five newly acquired ISPs onto centralized infrastructure, a delicate solution was required to provide modern services without inconveniencing the existing customer base. The solution was to devise a means by which customers could continue business as usual without any change in email address or software settings.
Inherently this called for a sophisticated protocol-compliant proxy server. Whilst facilitating connections from existing customers and servers, the proxy would re-map the customer’s address to the correct server in the cluster and deliver or retrieve mail from the correctly mapped folder.
This solution continued to run for many years until PacNet migrated the last of the customers under their singular brand.
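The heart of such a proxy is the remapping step: resolve the address the customer presents to the backend server and mailbox it now lives on, then speak the protocol to that backend on the customer's behalf. A minimal sketch in Python (the table entries and naming scheme here are invented for illustration):

```python
# Hypothetical routing table: legacy address -> (backend server, real mailbox).
ROUTES = {
    "user@oldisp1.example": ("pop1.core.example", "oldisp1-user"),
    "user@oldisp2.example": ("pop2.core.example", "oldisp2-user"),
}

def route(login):
    """Resolve a customer's unchanged login to its post-migration home.

    A full proxy would then open a protocol-compliant POP3/IMAP4/SMTP
    session to that backend and relay commands transparently, so the
    customer's software settings never need to change.
    """
    try:
        return ROUTES[login.strip().lower()]
    except KeyError:
        raise LookupError(f"unknown mailbox: {login}") from None
```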
With the increase in email came an increase in spam and other unsolicited forms of marketing. The solution was to extend P.I.M.P. to include powerful inbound and outbound email scanning facilities which would provide a first line of defense against unsolicited email and virus attachments.
This type of solution became invaluable during rampant virus outbreaks, as it would protect customers against receiving viruses and, should a customer have been infected, prevent the distribution of the virus from the infected computer.
|Period||2005 – 2008|
|Organization Overview||People Telecom was a rare gem operating in Perth, Western Australia with its main head office in Sydney, New South Wales. Its rare attraction was that it operated one of the first gigabit fibre-optic networks in and around the Perth CBD and provided the first-generation business-grade and triple-play services.|
|Positions Held||Systems Engineer/Team Leader|
I had not been a NOC engineer for many years, so it was exciting that my desk was located in the NOC, which provided constant stimulus and a dynamic environment. Apart from the colorful staff and customers, the busy and sometimes pressured atmosphere made for a work-hard, play-hard environment.
Apart from generic systems engineering and administration tasks, this role provided a healthy dose of projects, from the establishment of core infrastructure to the refresh of storage systems and voice equipment, and several data-centre relocations.
Equipment refreshes provide the perfect opportunity to undertake a 360-degree review of infrastructure and services and to weigh these against the future plans and visions of the business.
On this occasion it was clear that most hardware was not being optimally used, and thus the discussion of virtualization took place. For various reasons VMware was too heavy for the intended applications, requiring a full install of the guest operating system for each instance, and therefore we experimented with Linux kernel patches that effectively isolated the CPU, memory, kernel and user-space so as to enable an alternate form of virtualization.
Thereafter it was only a matter of installing the correct packages or binaries as per the requirements in order to create a fully operational virtualized system.
I am by no means a voice expert, but I enjoy the excitement of the unfamiliar and the challenge of the impossible. The sheer number of dropped calls from facsimile devices was constantly setting off monitoring alarms which could not be squelched, so a solution was needed.
I had previously worked with some digital signal processing techniques, and it was well known in the office that this was a hobby of mine, so when the boss heard that faxes were failing due to DSP issues the problem came my way. One should appreciate that fax machines were built for an entirely different era of telecommunications infrastructure and thus do not cope well with the subtle timing and signal differences of voice-over-IP networks.
It took some effort, but fortunately the work of other projects such as SpanDSP made the process significantly simpler, requiring only the code to make up the protocol stack for signaling and session establishment. Further work was done to make the stack compatible with Cisco equipment such as the AS5400 series, which greatly reduced the number of dropped fax calls.
However, because each manufacturer had its own nuances, the stack only remained compatible with major brands, with some others still experiencing difficulties. At some point the decision was made not to continue development, as the cost-versus-benefit case was simply not there for compatibility with the smaller, lesser-known manufacturers.
I love messaging challenges – I really do. When you have multiple disparate systems all requiring or generating pieces of information the natural tendency always seems to fall back to some sort of point-to-point solution. It was time to put this practice to rest and come up with a more robust and streamlined solution.
The answer presented itself in the form of an open-source enterprise service bus which facilitated the connection of multiple disparate endpoints or transports, along with numerous techniques to extract and transform messages from an endpoint-specific format to a universal format and vice versa.
In this manner it was easy to centralize the gathering and dissemination of messages allowing for centralized monitoring, reporting and management of inter-system communications.
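The extract-and-transform step at the core of the bus looks roughly like this in Python (the source-system names, field names and canonical envelope here are assumptions; each real adapter knew only its own wire format):

```python
from datetime import datetime, timezone

def to_canonical(source, payload):
    """Normalize an endpoint-specific message into one universal envelope.

    Each branch is an adapter for a hypothetical endpoint format; adding a
    new system means adding one adapter, not N point-to-point mappings.
    """
    if source == "billing-csv":
        account, amount = payload.split(",")
        body = {"account": account, "amount": float(amount)}
    elif source == "crm-dict":
        body = {"account": payload["acct_no"], "amount": payload["balance"]}
    else:
        raise ValueError(f"no adapter for {source}")
    return {"source": source,
            "received": datetime.now(timezone.utc).isoformat(),
            "body": body}
```

With every message in one canonical shape, monitoring, reporting and routing operate on the envelope alone, which is what makes the centralized management described above possible.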
|Period||2008 – 2011|
|Organization Overview||Downer EDI (Mining) is Australia’s leading mining contractor, specializing in a wide range of services across metals and minerals.|
|Positions Held||Solutions Architect|
The telecommunications industry, at the time, was experiencing a downturn, with little or no innovation occurring and most of the work either outsourced or driven by vendor solutions. At the same time the mining industry was experiencing a boom, and thus I started on a new adventure.
And that’s exactly what it was. It was an adventure mixed with exotic trips to the most remote parts of Australia and yet remained grounded in the daily life of a very busy corporation.
My role was extremely varied and mostly centered around the activities required to establish IT and communications on new sites, as well as the activities surrounding their decommissioning, and it is here that I made my mark by introducing many improvements in process and operational efficiency.
Despite the lateral career shift I embraced the new environment, undertaking all the requisite basic mining training and progressing right through to the “G1” level usually reserved for accident investigations, and up to “S3” for supervisors even though I had no direct reports. When stranded on a site for two weeks due to heavy rains, I took advantage of the time and qualified for my learner’s permit on a 380-tonne rear dumper, although I never had the opportunity to proceed further.
In the pre-cloud era the understaffed IT department was challenged by lengthy timelines in establishing communications and facilities at various remote sites. Vendor deliveries of equipment were slow, and the lengthy time taken to install cabling infrastructure was a significant barrier to success. Typical IT service timelines ran between 30 and 45 days. This was in stark contrast to other departments, who were able to move entire excavators and parts warehouses in what seemed like days.
This incredible disparity provided a wonderful opportunity for improvement which spanned across people and process.
In its first iteration an extensive guide was produced which set forth a comprehensive standard for site deployments. Vendors were standardized, equipment was standardized, and even the cabling was standardized by color and function. Rather than relying on the organization first winning a contract and then using its budget to fund the site establishment, a better approach was to engage with the business and regularly discuss and evaluate the upcoming contracts and their likelihood of success. On this basis it became possible to work with vendors and guide them on delivering critical services and equipment to meet forecasted timelines.
Merely by talking with the business and its people, I learnt how other teams were able to mobilize their equipment so rapidly. This led to the design of a bespoke 40-foot container which included a small dust-proof data centre as well as adequate storage for all IT equipment, from PCs through to photocopiers. Simply by loading a container onto a truck from the central stores, the entire infrastructure could be delivered and established on site in a matter of days.
Remote Australia is exactly that. It is devoid of just about everything. The nearest town may be 300km away. There is no mobile reception, no service stations and the environment consists of mainly rocks, dust and the occasional shrub. Working in such an environment presents significant IT and communications challenges.
People need to communicate. Work has to be done and those who sacrifice everything in such a hostile environment to provide for their families need to be able to communicate home. Until a better solution was available expensive satellite phones were the only option.
The inspiration to address this problem came to me whilst I attended a military display stationed at the location of an ANZAC parade. Whilst their primary purpose was to recruit the next generation of protectors I took great interest in their communications technology being a radio amateur myself. I was particularly impressed with a mobile satellite system mounted on a truck which carried all of their voice and data signals.
Downer EDI did not have access to military technology nor the military budget, but the idea stayed with me for a number of weeks whilst I worked on it in the background. It took some fine negotiation skills, but I was able to convince a project manager to fund the ‘leap of faith’, and I produced a smaller mobile satellite system on a trailer.
In line with safety-first requirements the dish could be stowed rapidly in case of severe weather, and the system provided both voice and data services via a powerful VSAT system. Although the voice-over-IP (over satellite) was significantly delayed, most people on site did not report any major differences compared with the satellite phone.
The difference, however, was significant to the bottom line of Downer EDI, who paid a significantly lower rate for the VSAT services than for the satellite phone. With a competitive rate of untimed calls to anywhere in Australia costing only 10c, personnel were free to make calls and never miss those critical moments or events so important to families.
As the technology became more widely adopted, it found its place not only in remote communications but also in cases where immediate communications and services were required ahead of major site establishment activities (exploration, geological mapping and the like).
As Downer EDI (Mining) had its own MPLS network, it seemed counter-intuitive to continue establishing sites using lengthy runs of jelly-filled cable for telephone systems as well as lengthy runs of network and fibre-optic cable for network services.
After the organization conducted a ‘back-to-basics’ review, an opportunity presented itself to convert the traditional analog and digital telephone systems to IP telephony and have these carried over the MPLS network at significantly reduced rates. Downer EDI’s aggressive negotiation with telecommunications carriers meant that its per-megabit data rates were significantly cheaper than its long-distance and cellular calls, and thus the decision was made to implement voice-over-IP throughout the major offices and sites, in stages and as opportunities presented themselves.
The benefit was realized in the company’s bottom line, which saw a significant reduction in telephone-related expenses, with additional benefits realized through significantly easier deployments and lower decommissioning expenses versus traditional PABX equipment.
|Period||2011 – 2012|
|Organization Overview||Anittel was a specialized full-spectrum organization offering business-grade services to small and medium enterprises such as managed IT services, telecommunications services, cloud services and professional consultancy services. At its height it employed close to 150 persons with 11 offices across Australia.|
|Positions Held||National Carrier Manager|
Anittel was an embattled company plagued by shifting goal posts, an economic downturn and significant capital investments in areas which yielded poor return. I joined the organization with full knowledge of these facts and despite the many barriers I was honored to have worked with an insanely dedicated team and met some of the industry’s brightest minds.
My role was to manage the telecommunications services under Anittel’s carrier license and to bring about improvements in the profitability of the division.
Despite various initiatives, the board of directors voted to sell the carrier division to one of its competitors, and the remaining divisions were subsequently sold and transitioned to other organizations.
|Period||2012 – 2016|
|Organization Overview||Globe Telecom commonly shortened as Globe, is a major provider of telecommunications services in the Philippines, supported by over 6,200 employees and nearly 1.05 million retailers, distributors, suppliers, and business partners nationwide. The company operates one of the largest mobile, fixed line, and broadband networks in the country, providing communications services to individual customers, small and medium-sized businesses, and corporate and enterprise clients. Globe currently has about 48.4 million mobile subscribers, nearly 3.5 million broadband customers, and 859 thousand landline subscribers.|
I was fortunate to have the opportunity to relocate to the Philippines and in doing so took up a consultancy opportunity with Globe Telecom.
As a busy Solutions Architect I rapidly found my satisfaction in architecting solutions for the delivery of over-the-top content services such as YouTube, Spotify, NBA, NetFlix, Disney and others as well as a range of application and integrations solutions.
The architected solutions had an unintended downstream effect, with solution designers and implementation teams consistently deviating from the designs as a result of unavailable technologies or a lack of skilled persons for implementation.
In an effort to serve the organization better I applied for a transfer to a division responsible for innovation and emerging technologies. It was here that I created Programmable Globe.
I created Programmable Globe in response to the desperate need to prevent and eliminate ‘spaghetti’ integrations and point-to-point solutions.
Prior to Programmable Globe each solution or integration would require multiple systems to be analyzed and modified to accommodate new functionality, which increased the delivery times of solutions and the cost of implementation, as well as the costs of ongoing support and maintenance.
Programmable Globe is thus a multi-tiered technology framework built on the foundations of reusability. It successfully exposes back-end system functions as a series of reusable micro-services and application interfaces, greatly improving access to critical services and functions for use in projects, solutions and third-party integrations.
The introduction of Programmable Globe has helped the organization to bring leading services to South-East Asia such as Spotify, YouTube, NetFlix, NBA as well as cashless card payment solutions.
After applying to the organization for my own broadband connection and being quoted a one-month delivery timeframe, I began to investigate where the most significant lead time lay. The discovery led to widespread changes in the way information is handled.
Prior to these changes, the process of delivering a broadband connection (and many other services) was completely manual, beginning with the completion of a paper-based application form and ending with customer acceptance. During the process the paperwork accumulated and hand-offs to other departments multiplied. Applications were tracked through daily status reports on an Excel spreadsheet.
Using this experience to hold a mirror to the typical customer experience, I successfully campaigned for a Business Process Management solution, greatly assisted by the willingness of another division to fund the purchase.
Since then the BPM solution has been systematically used to eradicate paper-based and manual processes and was integrated into Programmable Globe, making use of its extensive catalogue of APIs.
At its core Globe generates millions of messages and transactions between systems on an hourly basis, and whilst systems are designed with some form of transactional ability between them, these were neither standard nor, at times, well understood.
I augmented Programmable Globe with two message brokers: one a cloud-based solution capable of an average throughput of 22,000 messages per second with guaranteed delivery, and the other a hardware solution capable of over 2 million messages per second, with 190,000 per second guaranteed delivery.
The messaging service became an integral backbone of Programmable Globe ensuring that transactions were resilient and guaranteed to reach their destination with multiple failover options should an individual system or end-point become unavailable.
Certain APIs were transitioned to make use of the message broker without impact to their end-users. This resulted in a greatly improved capacity to handle sudden peaks without loss of message or transaction integrity.
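The guaranteed-delivery pattern behind this can be illustrated with a toy broker in Python (the real brokers were commercial products; the simulated failure count and retry limit below are arbitrary assumptions):

```python
import itertools

class Broker:
    """Toy in-memory broker that refuses the first few deliveries to
    simulate transient endpoint failures."""
    def __init__(self, fail_first=0):
        self.queue = []
        self._attempts = itertools.count()
        self.fail_first = fail_first

    def deliver(self, msg):
        if next(self._attempts) < self.fail_first:
            return False          # simulated transient failure: no ack
        self.queue.append(msg)
        return True               # acknowledged

def publish_guaranteed(broker, msg, retries=5):
    """Retry until the broker acknowledges, so a transaction survives a
    temporarily unavailable endpoint instead of being silently lost."""
    for _ in range(retries):
        if broker.deliver(msg):
            return True
    return False
```

In a real deployment the retry loop would also fail over to an alternate broker, which is how transactions remained guaranteed even when an individual system or endpoint became unavailable.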
|As a struggling student I was forced to build my own 1200bps radio modem in order to play with the ‘big boys’ on amateur radio.|
Click play to listen to an audio sample
|Using a handful of available encoder/decoder microchips, a custom modem driver and some downloadable software, I was able to access radio-based PCs for a number of years until the Internet essentially eroded this interest.|
|From an early age I maintained a healthy interest in digital signal processing. Slow-scan television is a means of encoding and decoding images by modulating a picture, line by line, into several spectral bands.|
Click play to listen to an audio sample
|Images are ‘scanned’ three times: once for red, once for green and finally for blue, with each pixel given a particular audio tone to indicate intensity. Over time this creates a melody of audio tones which is received by a corresponding decoder and converted back to an image. Typically these images would be sent over HF radio across continents.|
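The intensity-to-tone mapping can be sketched numerically: common SSTV modes place black at 1500 Hz and white at 2300 Hz, with brightness interpolated linearly between them. The Python sketch below ignores each real mode's line timing and sync pulses:

```python
def intensity_to_freq(value, f_black=1500.0, f_white=2300.0):
    """Map an 8-bit pixel intensity to an audio tone frequency in Hz,
    using the 1500-2300 Hz luminance range common to SSTV modes."""
    return f_black + (value / 255.0) * (f_white - f_black)

def scan_line(pixels):
    """One colour pass over a line of pixels: a sequence of tone
    frequencies, played in order, that encodes that line."""
    return [intensity_to_freq(p) for p in pixels]
```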
|SpaceCam 1 was a lengthy project between MAREX and NASA to place amateur radio slow-scan television aboard the International Space Station (ISS) in an effort to have school students and hobby groups receive continuous images from the station as it passes during orbit.|
Click play to listen to an audio sample
|Click here to verify SpaceCam 1 Team Members|
|I have a healthy obsession with trying to make music as loud and consistent as possible without introducing distortion or artifacts, or detracting from the natural dynamics of the sound. This has led me on an extensive journey of discovery in the field of digital signal processing, one which I still enjoy and actively pursue to this day.|
Click play to listen to an unprocessed audio track.
Click play to listen to a processed audio track.