Critical Issues in the Teaching of High Performance Computing to Postgraduate Scientists

Presentation to ICCS 2014 International Conference on Computational Science, Cairns, June 10, 2014

Abstract

High performance computing is in increasing demand, especially with the need to conduct parallel processing on very large datasets, whether evaluated by volume, velocity, or variety. Unfortunately the necessary skills - from familiarity with the command line interface, job submission, and scripting, through to parallel programming - are not commonly taught at the level required by most researchers. As a result the uptake of HPC usage remains disproportionately low, with emphasis on system metrics taking priority, leading to a situation described as 'high performance computing considered harmful'. Changing this is not a problem of computational science but rather a problem for computational science, and one which can only be resolved through a multi-disciplinary approach. The following addresses the main issues in such teaching and thus makes an appeal to some universality in application which may be useful for other institutions.

For the past several years the Victorian Partnership for Advanced Computing (VPAC) has conducted a range of training courses designed to bring the capabilities of postgraduate researchers to a level of competence useful for their research. These courses have developed over this time, in part through providing a significantly wider range of content for varying skillsets, but more importantly by introducing some of the key insights from the discipline of adult and tertiary education in the context of the increasing trend towards lifelong learning. This includes an andragogical orientation, providing integrated structural knowledge, encouraging learner autonomy, self-efficacy, and self-determination, utilising appropriate learning styles for the discipline, utilising modelling and scaffolding for example problems (as a contemporary version of proximal learning), and following up with a connectivist mentoring and outreach program in the context of a culturally diverse audience.

Keywords: adult and tertiary education, high performance and scientific computing

1. Potential Demand for High Performance Computing Due to Big Data

The increase in the volume, velocity, and variety of datasets is a well recognised issue. The rate of dataset increase has also been well researched, with Hilbert and López [1] reporting an increase of 23% per annum in information storage from 1986 to 2007, and a 28% per annum increase in bidirectional telecommunications. It is acknowledged that this included analogue data, although by 2007 digital data constituted 94% of the total. The total sum of stored data by 2007 was 2.9 × 10^20 compressed bytes. As is well recognised, this increase is a significant challenge for storage, curation, and transfer. For these particular aspects cloud computing technologies have been a promising solution, albeit with continuing issues in security [2], consistency [3], and - often overlooked - economic viability [4].

Storage issues are, of course, only part of the problem of the deluge of data. It can be reasonably assumed that datasets are stored for a purpose, which means that they must be manipulated or modified in some way. As can be expected, processing big data requires a high performance computing system. Distributed or loosely coupled computational systems, such as various grid computing architectures (e.g., SETI@home, folding@home), are arguably not the solution for large datasets except in cases where the dataset can be broken up, in which case it is really a large collection of smaller datasets. Whether or not this still constitutes a large dataset is a moot point [5].

The capacity of high performance computers is well studied, whether with combined metrics such as the HPC Challenge or with popular metrics such as the Top500. The general architectural principles are also well understood, with relatively high processor density and high-speed I/O. The combination of high performance computing with parallel programming has provided the sort of capacity that makes processing large datasets plausible, although real challenges remain. For example, the Square Kilometre Array of radio telescopes is expected to produce 2^60 bytes (c. 1 billion gigabytes) per day. In comparison, the performance of the top-rated high performance computer has roughly doubled every 14 months since 1993; as of June 2013, the top-ranked Tianhe-2 has a theoretical maximum processing capacity of 54.9024 PFlop/s. At the same time, programming models such as MapReduce [6] have complemented the existing shared and distributed memory parallel programming standards (OpenMP and MPI respectively) to make these hardware resources accessible to computational tasks.
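
As a simple illustration of the shared-memory model mentioned above, the following is a minimal sketch in C using OpenMP of a data-parallel "map then reduce" pattern; the array size and the per-element operation are illustrative assumptions rather than anything drawn from the systems discussed here.

    /* map_reduce_openmp.c
     * A minimal sketch of a "map then reduce" pattern using the shared-memory
     * OpenMP standard mentioned above. The workload is purely illustrative.
     * Compile (GCC): gcc -fopenmp map_reduce_openmp.c -o map_reduce_openmp
     */
    #include <stdio.h>
    #include <stdlib.h>
    #include <omp.h>

    int main(void)
    {
        const long n = 10000000;                   /* illustrative dataset size */
        double *data = malloc(n * sizeof(double));
        if (data == NULL) {
            fprintf(stderr, "allocation failed\n");
            return EXIT_FAILURE;
        }

        /* "Map": apply a per-element transformation in parallel. */
        #pragma omp parallel for
        for (long i = 0; i < n; i++)
            data[i] = (double)i * 0.5;

        /* "Reduce": aggregate the results into a single value. */
        double total = 0.0;
        #pragma omp parallel for reduction(+:total)
        for (long i = 0; i < n; i++)
            total += data[i];

        printf("threads available: %d, total = %f\n",
               omp_get_max_threads(), total);
        free(data);
        return EXIT_SUCCESS;
    }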

2. The Critical Knowledge Gap Between HPC and Scientific Computing

Whilst hardware capability and software models, interfaces, and wrappers are necessary for the challenge of big data, they are not sufficient. The over-concentration on these matters without due (or even any) consideration of uptake by the scientific research community is what led Greg Wilson to famously (some say notoriously) suggest the pithy slogan "High Performance Computing Considered Harmful" [7]. Whilst 'considered harmful' is perhaps a phrase subject to some hyperbole [8], there are real concerns; a better title, illustrating the problem by way of understatement, might be "High Performance Computing Less Than Optimal". Wilson is particularly critical of the tendency to conflate scientific computing with high performance computing. According to Wilson, the core issue is that HPC is not all about speed and power; increasingly, the issues of usage, productivity, correctness, and reproducibility must be considered. For users, the need for education and ease-of-use is highlighted. This means that users should not need to know the details of parallelism, let alone code, although there are always some who will express that interest. Issues like shared versus distributed memory, deadlocks, and race conditions should not be of concern to most scientific computing users.
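
By way of illustration, the following toy sketch in C (using OpenMP; the shared counter and loop bounds are purely illustrative assumptions) shows the kind of race condition referred to above - precisely the class of low-level detail which, on this argument, most scientific computing users should be shielded from.

    /* race_condition.c
     * A toy illustration of a race condition on a shared counter, and its
     * repair with a reduction clause. Compile: gcc -fopenmp race_condition.c
     */
    #include <stdio.h>
    #include <omp.h>

    int main(void)
    {
        const int n = 1000000;
        long unsafe = 0, safe = 0;

        /* Racy: threads update 'unsafe' without coordination, so increments
         * are lost and the final value is usually wrong. */
        #pragma omp parallel for
        for (int i = 0; i < n; i++)
            unsafe++;

        /* Correct: each thread accumulates a private copy of 'safe', which
         * OpenMP combines at the end of the loop. */
        #pragma omp parallel for reduction(+:safe)
        for (int i = 0; i < n; i++)
            safe++;

        printf("expected %d, racy result %ld, reduced result %ld\n",
               n, unsafe, safe);
        return 0;
    }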

What is being highlighted here is the third component of successful HPC. On an ontological basis, the first is "hardware", the second "software", and the third "wetware", to use Rudy Rucker's increasingly accepted terminology [9]: the machines, the applications, and the users. Combined, these make up the necessary and sufficient conditions for successful high performance and scientific computing. Whilst a great deal of effort and attention is spent on the metrics, performance, and improvement of the first two components, the third - the active and initiating component - is woefully neglected. Despite Wilson's legitimate differentiation between high performance computing (the hardware architecture, and the software of resource managers, schedulers, and compiler wrappers) and scientific computing (applications and the actual practices of research scientists), the need to connect the two should be readily apparent. If the scientific research community does not have the opportunity to learn the basic skillset for high performance computing, researchers will continue to work away with their preferred applications on their desktop environments, and as a result relative research output will decline, a point well illustrated by Apon et al. in their study of research productivity and investments in HPC for doctoral-granting institutions [10].

By identifying researchers themselves as a necessary component of successful HPC and scientific computing, their teaching and training becomes as much a critical issue as hardware manufacture and software programming. The term 'critical' is no exaggeration either. Forget the popular dictionary definitions that associate it with "disapproving comments"; what is meant is dealing with those issues that seek to identify a crisis [11]. In medicine a crisis represents a point where, regardless of individual will, the physiological system of an individual is tested to capacity in its ability to heal. In literature, it is the point of the narrative where the protagonist either successfully confronts their antagonist, be that the setting, circumstances, or another character, or, in the case of tragedy, confronts their own weaknesses. In the environment, or in social systems, it is also sensible to speak of crises: points in time and place where the capacity of the system is faced with a "life or death" test of its ability to continue. One may, especially in the context of proposed funding levels, speak of a financial crisis for many organisations dedicated to the development of positive externalities.

3. HPC Training and Adult Education Insights

The Victorian Partnership for Advanced Computing (VPAC) was a not-for-profit registered research agency established in 2000 by a consortium of Victorian universities. The principle behind its formation is not uncommon: research institutions required the performance of many-core clustered computing systems, but individually lacked the finances and expertise to purchase their own. Collectively, however, they could do so, and through an allocation of cycles proportional to contribution, VPAC provided several systems over the years, including two that appeared in the Top500. By the mid-2000s the organisation had introduced a number of training courses, both in general HPC usage and in MPI programming. It is fair to describe these courses as technically competent, but lacking the integrated education that was increasingly required. They were conducted by people with excellent technical knowledge, but whose education and training capabilities were more a case of natural ability than systematic knowledge.

It is from this experience that the critical issue of delivery becomes important. Just as there is a critical issue for research institutions to provide high performance computing matched to the skillset of their users, given the increasing prevalence of big dataset problems, once the path of raising that skillset is adopted the aspects of successful delivery become paramount. The most obvious of these, and unfortunately one sometimes overlooked, is the basic need for lesson planning and post-delivery review. Without such plans the delivery of content in a timely manner can only be achieved by luck, and without such reviews error-checking and assuring the relevance of the content is increasingly difficult. More specific to postgraduate scientists is the adoption of the insights of adult education in a context of lifelong learning. The most important of these is recognising the particular characteristics of adult education as distinct from childhood education. This must of course be seen as a continuum, but hopefully by the postgraduate stage the learner is closer to the adult end of that line.

These features begin by noting that adult learners tend towards more voluntaristic engagement, rather than compulsory requirements. They are more self-motivated and orientated towards independent learning, and have a greater range of experiential resources. Certainly in the case of training in high performance computing, they usually attend because they have specific problems relevant to their research that they want solved. The learner, in this sense, is autonomous and usually self-determining, with internal rather than external motivations. Further, adult learning tends increasingly towards a research orientation rather than a teacher-directed ("chalk and talk") orientation [12]. The reasons for this include chronological opportunities for life experience and the development of social networks, the acquisition of legal rights, and neurological changes. The main ramification is the need for educators to provide a learning environment orientated towards a greater level of equality between learner and educator, the opportunity to express knowledge from life experiences, and the flexibility to engage in study directions suited to the learner's own practical tasks and interests.

Content in such an environment must be provided in a structured manner - it is certainly fair to say that in our pre-programmatic provision this was not the case. Structural knowledge refers to the notion that understanding is achieved when knowledge is embedded in an organised structure with other knowledge, and can be built upon to acquire further knowledge; such knowledge can be differentiated from mere recognition [13]. Structured knowledge is what distinguishes the novice from the expert, providing "deep knowledge" of the subject matter. Notably, the knowledge itself does not have to be broad - it must simply be rationally integrated, and can therefore be elaborated to other circumstances. In contrast, Svinicki describes "inert knowledge" (i.e., unstructured) as "the kind of knowledge that students cram the night before the exam and forget... [s]tudents can remember it when explicitly asked for it, but can't use it for anything else. This is the ultimate evil in education". Whilst describing it as "the ultimate evil" may be slightly hyperbolic, one must nevertheless appreciate the concern. Structured knowledge also adds to the self-efficacy of the learner [14]: their knowledge of their own capabilities and therefore their ability to accurately locate gaps in that knowledge. A challenge in self-efficacy is ensuring that the learner realises that the purpose is not orientated towards how well they have done in the allocated task (a performance outcome), but rather towards their confidence in carrying out future tasks of a similar nature, as the point of the self-evaluation activity is accurate estimation of future learning capability.

From the trainer's perspective, learner self-efficacy also allows for the application of effective development models and "scaffolding" to overcome existing limits, utilising Vygotsky's classic notion of the Zone of Proximal Development: the range of tasks which a person can undertake only with assistance [15]. By engaging in dialogue with the learners as they are undertaking the task, uncertainties, ambiguities, and questions concerning the task are raised by the learner to the trainer, who is able to guide and inform with grounded reasons for success or failure in the given task - propositions which can then be tested, giving the learner a structured knowledge of the situation. This is clearly an example of self-efficacy, where learners are not only aware of their own capabilities, but also know the reasons why they are capable and, perhaps most importantly, know what they don't know. This can be carried out by appealing to learning style preferences through a combination of hands-on activities, real-time examples, and reference material. Empirically, it is recognised that whilst individuals may express a learning style preference, it is really the content and the discipline that matter [16]. With particular reference to education in computer science there is significant research indicating the importance of visualisation technology to illustrate concepts [17], the use of interactive lectures as distinct from verbatim learning [18], and an overall constructivist student-centred approach which argues that ontological attachments (e.g., of the program to the computer) are largely irrelevant, that epistemology is relativistic and fallible, that knowledge is acquired recursively, and that learning must be active [19].

These features have all been introduced to the training program at the V3 Alliance, the successor organisation to VPAC. Three main integrated courses are offered every second month to a small class of twelve learners drawn from Victoria's postgraduate research community. Approximately 80% of attendees have not had any experience with the command-line interface. On the first day they are introduced to the environment and to text modification within a relevant context. This is followed by an elaboration of environment modules and multiple examples, across broad disciplines, of basic job submission, monitoring, retrieval, visualisation, and analysis. On the second day the learners gain a more detailed overview of the command-line environment and are introduced to regular expressions and scripting tools. These new skills are then applied to more advanced job submission examples (i.e., jobs with variables, iteration, and conditional branches), along with job arrays, dependencies, and interactive jobs. The second day's training concludes with an example of a basic MPI program, which then serves as an introduction to the third day's session covering computer architecture, the limitations of parallel processing, and some thirty core functions of OpenMPI, including three main test cases for learners to collaboratively test their knowledge. In each case course material is orientated towards inspirational examples, such as discovering genetic sequences which reveal a propensity towards melanoma, the structural integrity of buildings in high winds, biomass in high-CO2 environments, drug docking with aspirin and phospholipase, patterns in earthquake data, and game theory analysis.
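
As an indication of the level reached by the close of the second day, the following is a minimal sketch in C of a typical "hello world with communication" MPI program; the actual course example is not reproduced here, so the program below is an assumed exercise of that kind rather than the material as delivered.

    /* mpi_hello.c
     * A sketch of the kind of basic MPI program used to close the second day:
     * each process reports its rank, and rank 0 gathers a simple sum.
     * Compile: mpicc mpi_hello.c -o mpi_hello
     * Run:     mpiexec -np 4 ./mpi_hello   (or via a cluster job script)
     */
    #include <stdio.h>
    #include <mpi.h>

    int main(int argc, char *argv[])
    {
        int rank, size;

        MPI_Init(&argc, &argv);                 /* start the MPI environment */
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* this process's identifier */
        MPI_Comm_size(MPI_COMM_WORLD, &size);   /* total number of processes */

        printf("Hello from rank %d of %d\n", rank, size);

        /* A first taste of communication: sum the ranks onto process 0. */
        int sum = 0;
        MPI_Reduce(&rank, &sum, 1, MPI_INT, MPI_SUM, 0, MPI_COMM_WORLD);
        if (rank == 0)
            printf("Sum of ranks 0..%d is %d\n", size - 1, sum);

        MPI_Finalize();                         /* shut down cleanly */
        return 0;
    }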

4. A Multicultural Audience and Future Directions

Whilst it may initially seem surprising that learners can begin with no command-line experience and by the third day be writing MPI programs, this is no accident. The capacity certainly exists among the learners; they are, of course, postgraduate scientists and typically quite capable. However, intelligence is a necessary but not sufficient condition for success. The utilisation of integrated structured knowledge with models and scaffolding, and the encouragement of hands-on collaborative learning, produces users who have abilities, are confident of those abilities, and have reached a deep level of practical understanding of high performance computing and parallel processing. This has been achieved with the aid of constant anonymous user reviews of the facilities, trainer delivery, and each component of the course content. An initial review of the research output of course attendees indicates a strong correlation between participation in the course and research output.

One initially unexpected discovery is that over three-quarters (75.55%) of attendees are from a non-English speaking background (NESB). This raises interesting questions about the appropriateness of the course content and instruction to the audience, including matters such as symbolic gestures in communication. Given that such gestures are largely arbitrary and may even be contradictory across different cultures, it is difficult to expect any teacher to have expertise in the nuances of each and every culture that they encounter in the learning environment; rather, the safest orientation is towards neutrality. Nevertheless, a nagging question remains as to why such a significant and disproportionate number of NESB students are enrolling in and attending these courses. Research from Andrew Harvey and Kemran Mestan of the Access and Achievement Research Unit at La Trobe University may provide some insight: they note that significant language and collaboration barriers mean that at the tertiary level, NESB students "are underachievers at university and underemployed after it" [20]. The probable reason for the enrolment demographic becomes obvious only when considered from the adult learner perspective: enrolment occurs because the learners believe that they need training in the subject. In these circumstances further mentoring and outreach, including collaborative workshops such as "hackathons", are being carried out; these have had a particularly high level of success in multicultural computer science environments [21].

This project of mentoring, outreach, and collaboration provides the first of three elaborations on the training program conducted by the V3 Alliance to enhance researcher productivity. Another project, partially inspired by Wilson's "software carpentry" project, is to elaborate the skillset of postgraduate researchers in other general areas, specifically a background course in scientific programming (using C, Fortran, and Python), a course in mathematical programming (using Octave and R), and a course in useful tools (SQL, revision control, and Makefiles). A challenge exists in providing these courses in an integrated manner with structured knowledge. A final project, recently initiated, takes steps towards providing an accredited graduate degree in high performance computing, which would provide much more detailed knowledge of relevant computer architecture, variations in resource management and submission tools, and different implementations of parallel programming. Such a graduate programme is introduced partially in recognition of the increased need for specialists in this field, and partially because of the general lack of such courses. Most important, however, is the realisation of the critical importance of creating the next generation of HPC researchers, and of doing so by implementing the insights from tertiary and adult education into these programmes.

References

[1] Martin Hilbert, Priscila López. The World’s Technological Capacity to Store, Communicate, and Compute Information, Science Vol. 332 no. 6025 1 April 2011, pp. 60-65
[2] Cong Wang, Qian Wang, Kui Ren, Wenjing Lou., Privacy-preserving public auditing for data storage security in cloud computing. INFOCOM, 2010 Proceedings IEEE, 2010
[3] David Bermbach, Markus Klems, Stefan Tai, Michael Menzel., Metastorage: A federated cloud storage system to manage consistency-latency tradeoffs, 2011 IEEE International Conference on Cloud Computing (CLOUD), 2011, p452-459
[4] David SH Rosenthal, Daniel C Rosenthal, Ethan L Miller, Ian F Adams, Mark W Storer, Erez Zadok, The economics of long-term digital storage, Memory of the World in the Digital Age Conference, 2012
[5] See for example: Michael D Beynon, Tahsin Kurc, Umit Catalyurek, Chialin Chang, Alan Sussman, Joel Saltz, Distributed processing of very large datasets with DataCutter, Parallel Computing, Vol 27, No 11, 2001, pp. 1457-1478
[6] Joe Hellerstein, Programming a Parallel Future, Technical Report No.UCB/EECS-2008-144, University of California Berkeley, 2008
[7] Greg Wilson, High Performance Computing Considered Harmful, 22nd International Symposium on High Performance Computing Systems and Applications, 2008
[8] There are some sixty-five papers, presentations, and rants from the discipline of computer science that have "considered harmful" in their title. Some of the more sensible ones include Edsger Dijkstra's "Go To Statement Considered Harmful" (1968), William Wulf and Mary Shaw's "Global Variable Considered Harmful" (1973), and the deliciously recursive Eric A. Meyer's ""Considered Harmful" Essays Considered Harmful" (2002).
[9] Rudy Rucker's cyberpunk tetralogy consisted of Software (1982), Wetware (1988), Freeware (1997) and Realware (2000).
[10] Amy Apon, Stanley Ahalt, Vijay Dantuluri, et al., High Performance Computing Instrumentation and Research Productivity in U.S. Universities, Journal of Information Technology Impact, Vol 10, No 2, pp. 87-98, 2010
[11] Jurgen Habermas, Legitimation Crisis, Beacon Press 1975 (FP 1973)
[12] Malcolm Shepherd Knowles, Elwood F. Holton, Richard A. Swanson, The Adult Learner: The Definitive Classic in Adult Education and Human Resource Development, Gulf Publishing Company, 1998
[13] Marilla D. Svinicki, Learning and Motivation in the Postsecondary Classroom, Anker, 2004
[14] Dale Schunk, Ability versus effort attributional feedback: Differential effects on self-efficacy and achievement, Journal of Educational Psychology, 75, pp. 848-856, 1983; Dale H. Schunk and Frank Pajares, Self-efficacy in education revisited: Empirical and applied evidence, in D. M. McInerney & S. V. Etten (Eds.), Big Theories Revisited, pp. 115-138, Information Age Publishing, 2004; and D.H. Schunk, P.R. Pintrich, & J.L. Meece, Attribution theory, in Motivation in Education: Theory, Research and Applications, pp. 79-110, Pearson, 2008.
[15] Lev Vygotsky, Thought and language. MIT Press, 1986 (FP 1934) and Mind and society: The development of Higher Psychological Processes, Harvard University Press, 1978
[16] Maureen Drysdale, Jonathan Ross, Robert Schulz, Cognitive Learning Styles and Academic Performance in 19 First-Year University Courses: Successful Students Versus Students at Risk, Journal of Education for Students Placed at Risk, 6(3), p271-289, 2001
[17] Naps, T. L., Rößling, G., Almstrum, V., Dann, W., Fleischer, et al., Exploring the role of visualization and engagement in computer science education, ACM SIGCSE Bulletin, Vol. 35, No. 2, pp. 131-152, ACM, 2002
[18] Susan H. Rodger, An Interactive Lecture Approach to Teaching Computer Science. ACM SIGCSE Bulletin, 27(1), 278-282, 1995
[19] Mordechai Ben-Ari, Constructivism in Computer Science Education, Journal of Computers in Mathematics and Science Teaching, 20(1), pp. 45-73, 2001.
[20] A. Harvey, K. Mestan, Language too big a barrier for non-English speakers, The Australian, October 17, 2012
[21] Peckham, J., Stephenson, P., Hervé, J. Y., Hutt, R., & Encarnação, M.. Increasing student retention in computer science through research programs for undergraduates. ACM SIGCSE Bulletin Vol. 39, No. 1, pp. 124-128. ACM. 2007