About Lev Lafayette

Crocodile Logo

Lev Lafayette has an MSc (Information Systems) from Salford University and an MBA (Technology Management) from the Chifley Business School, where he was on the Dean's List. He also has a Graduate Certificate in Project Management from the same institution, and an honours degree in Politics, Philosophy and Sociology from Murdoch University, which was commented upon by the Vice-Chancellor of the time. Many years later he completed a Graduate Certificate in Adult and Tertiary Education at the same institution. He is currently undertaking a Master of Higher Education at the University of Otago.

He is a certified PRINCE2 Practitioner and an Adult and Workplace Trainer. With an interdisciplinary approach, Lev's interests include the political implementation of universal pragmatics, the relationship between communications technology and society, and comparative economic systems. Off and on, he plods his way through completing a PhD in Social Theory as well.

Professionally, however, Lev is an experienced systems administrator specialising in the Linux operating system and scientific applications, a project manager, systems engineer, and quality management systems coordinator, specifically for ISO 9001 (Quality Assurance) and ISO 27001 (Information Technology Security). He also does a lot of training for researchers and technical staff in Linux, High Performance Computing, mathematical programming, PostgreSQL, and related subjects, with graduates and post-doctoral researchers from a variety of organisations including: RMIT, La Trobe University, the University of Melbourne, Deakin University, Swinburne University, Victoria University of Technology, Monash University, the Australian Synchrotron, the Department of Environment and Primary Industries, the University of Sydney, Macquarie University, the University of New South Wales, the University of Western Australia, the Australian Institute of Health Innovation, the Westmead Millennium Institute, the Australian Radiation Protection and Nuclear Safety Agency, and the Australian Institute of Marine Science.

Previous employment and clients include several years working as a computer systems trainer and database manager for the Parliamentary Labor Party in Victoria. Following this he worked for the Ministry of Foreign Affairs in Timor-Leste (East Timor), managing their computer network and providing training and technical expertise to the Ministry in its first year of self-governance. Dr. Ramos-Horta provided the following comments on his work.

Lev works for the Research Computing Services group at the University of Melbourne as the Senior High Performance Computing Development and Operations Engineer, and prior to that at the Victorian Partnership for Advanced Computing as a systems administrator for Linux clusters. In line with those roles, this site is mostly dedicated to issues concerning High Performance Computing, Scientific Computing, and Supercomputing. Lev is involved in Linux Users of Victoria, having spent four years as President, two years as Public Officer, two years as Vice-President, and a year as Treasurer, and is now in his third year as an ordinary committee member. He has a coordinating role in the annual Multicore World conference and typically takes the role of MC.

The crocodile logo was designed by Victoria Jankowski. It was first used on the cover of Neon-komputadór, the first IT training manual for the Ministry of Foreign Affairs in East Timor which was printed and translated by the United Nations Development Programme. The crocodile represents the Timorese people and is the emblem of their land. The integrated circuit represents their independent connectivity to the wider world.

You can also find a political site that Lev subscribes to, The Isocracy Network, a synthesis of several progressive political orientations, and RPG Review, which covers his interests in roleplaying and simulation games, including as editor of the namesake journal. This includes being the author of one very ironic RPG (Papers & Paychecks) and its supplement (Cow-Orkers in the Scary Devil Monastery), a co-author of another (Fox Magic), the author of a supplement (Rolemaster Companion VI), as well as contributing plot and character development to the computer game Cargo. He has also been a playtester for RuneQuest, Traveller, Basic Role Playing, and Eclipse Phase.

As a naturalistic pantheist with an interfaith perspective, he manages and contributes to the Lightbringers website which includes various addresses and essays on philosophy and religion. Recently he has taken up the role of University Outreach Officer for the International Society for Philosophers.

Finally, he also has a LiveJournal account, which will probably be quite boring to anyone who doesn't know him personally.

That's enough of me talking about myself in the third person like Cerebus The Aardvark.

Interactive HPC Computation with Open OnDemand and FastX

As dataset size and complexity requirements grow, researchers increasingly need to find additional computational power for processing. A preferred choice is high performance computing (HPC), which, due to its physical architecture, operating system, and optimised application installations, is best suited for such processing. However, HPC systems have historically been less effective at visual display, least of all in an interactive manner, leading to the general truism of "compute on the HPC, visualise locally".

APA Style vs IEEE Style: Fight!

There is much that irks me in academia. The way that disciplines are almost randomly assigned to artium, scientiae, or legum, without any reference to their means of verification or falsification. Or, for that matter, the Dewey (or Universal) Decimal Classification for libraries, which, in its insanity, places computer applications in the same category as "Fundamentals of knowledge and culture" and "Propaedeutics". One could also ask why the value "Dead languages of unknown affiliation" belongs with "Caucasian languages". I suppose most of them are "near dead", right?

Then there is the eye-watering level of digital illiteracy among academics, researchers, students, and professional staff. It is little wonder that closed-knowledge academic journals and proprietary software companies fleece the university sheep and make out like bandits. They don't even realise that they've been robbed, such is the practical ignorance of the lofty principles that they espouse. Ever received a document in a proprietary format explaining how important it is to make content accessible for the visually impaired? Yeah, it's like that all the time: a combination of hypocrisy and willful ignorance.

But I reserve a special spot in my hell for referencing systems.

Eazy-Photoz EasyBuild Script and Test Cases

"EAZY is a photometric redshift code designed to produce high-quality redshifts for situations where complete spectroscopic calibration samples are not available", which is a pretty excellent project. However, it has a few issues that are illustrative of typical problems when developers aren't thinking in terms of operations. I like this software, it carries out a valuable scientific task with ease, which will be awesome in an HPC environment, especially when run as a job array.

Image Watermarks in Batch

A common need among those who engage in large-scale image processing is to assign a watermark of some description to their images. Further, so I have been told, it is preferable to have multiple watermarks with slightly different kerning depending on whether the image is portrait or landscape. Thus there are two functions in this script: one for separating the mass of images in a directory according to whether they are portrait or landscape, and a second to apply the appropriate watermark. The script is therefore structured as two functions; witness the neatness and advantages of structured coding, even in shell scripts. I learned a lot from first-year Pascal programming.
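A minimal sketch of that two-function structure follows, assuming ImageMagick's `identify` and `composite` commands, JPEG input, and two hypothetical watermark files (`watermark-portrait.png` and `watermark-landscape.png`); adjust names and opacity to taste.

```shell
#!/usr/bin/env bash
# Sketch: sort images by orientation, then apply a matching watermark.
# Assumes ImageMagick and the two watermark files named above.

# Function one: move images in the current directory into portrait/
# and landscape/ subdirectories according to their dimensions.
sort_images() {
    mkdir -p portrait landscape
    for img in *.jpg; do
        [ -e "$img" ] || continue
        width=$(identify -format "%w" "$img")
        height=$(identify -format "%h" "$img")
        if [ "$height" -gt "$width" ]; then
            mv "$img" portrait/
        else
            mv "$img" landscape/
        fi
    done
}

# Function two: overlay a watermark on every image in a directory.
# $1 = directory, $2 = watermark file.
apply_watermark() {
    for img in "$1"/*.jpg; do
        [ -e "$img" ] || continue
        composite -dissolve 30 -gravity southeast "$2" "$img" "$img"
    done
}

# Only run the pipeline when ImageMagick is actually available.
if command -v identify >/dev/null 2>&1; then
    sort_images
    apply_watermark portrait watermark-portrait.png
    apply_watermark landscape watermark-landscape.png
fi
```

The separation into two functions means the orientation test can be reused or replaced (for example, treating square images specially) without touching the watermarking step.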

eResearchAustralasia 2020

With annual conferences since 2007, eResearchAustralasia was hosted online this year due to the impacts of SARS-CoV-2. Typically the conferences are held along the eastern seaboard of Australia, which does bring into question the "-asia" part of the suffix. Even the conference logo highlights Australia and New Zealand, to the exclusion of the rest of the world. I am not sure how eResearch NZ feels about this encroachment on their territory.

Contributing To the International HPC Certification Forum

As datasets grow in size and complexity faster than personal computational devices can process them, more researchers seek HPC systems as a solution to their computational problems. However, many researchers lack familiarity with the HPC environment and require training. The formal education curriculum has not yet responded sufficiently to this pressure, leaving HPC centres to provide basic training.

Spartan: From Experimental Hybrid towards a Petascale Future

Previous presentations to eResearchAustralasia described the implementation of Spartan, the University of Melbourne's general-purpose HPC system. Initially, this system was small but innovative, arguably even experimental. Features included making extensive use of cloud infrastructure for compute nodes, OpenStack for deployment, Ceph for the file system, RoCE for the network, Slurm as the workload manager, EasyBuild and Lmod, etc.

Monitoring HPC Systems Against Compromised SSH

Secure Shell (SSH) is a very well established cryptographic network protocol for operating network services securely, and is the typical way to access high-performance computing (HPC) systems, in preference to various unsecured remote shell protocols such as rlogin, telnet, and FTP. As with any security protocol, it has undergone several changes to improve its strength, most notably the move to SSH-2, which incorporated Diffie-Hellman key exchange. The security advantages of SSH are sufficient that there are strong arguments that computing users should use SSH "everywhere".

Process Locally, Backup Remotely

Recently, a friend expressed a degree of shock that I could pull old, even very old, items of conversation from emails, Facebook Messenger, etc., with apparent ease. "But I wrote that 17 years ago". They were even dismayed when I revealed that this is all just stored as plain-text files, suggesting that perhaps I was like a spy, engaging in some sort of data collection on them by way of our mutual conversations.

For my own part, I was equally shocked by their reaction. Another night of fitful sleep, where feelings of self-doubt percolate. Is this yet another example that I have some sort of alien psyche? But of course this is not the case, as keeping old emails and the like as local text files is completely normal in computer science. All my work and professional colleagues do this.

What is the cause of this disparity between the computer scientist and the ubiquitous computer user? Once I realised that the disparity of expected behaviour was not personal, but professional, there was clarity. Essentially, the convenience of cloud technologies and their promotion of applications through Software as a Service (SaaS) has led to some very poor computational habits among general users that have significant real-world inefficiencies.
