Abstracts of Keynote Speakers

Alexander Boukhanovsky
ITMO University, Russia

Urgent Computing for Metocean Extreme Events: St. Petersburg's Flood Protection Barrier

Abstract:

The talk presents the experience of creating a flood prevention system in St. Petersburg based on the principles of urgent computing. The main models, workflows, planning methods and practical implementation are described, together with an assessment of the quality of the calculations. The system has been in successful operation since 2011; during this period, 16 floods have been prevented.

Prof. Dr. Alexander Boukhanovsky is the Chair of the High Performance Computing (HPC) Department at ITMO University. In 2005, he defended his dissertation on Concurrent Software Statistical Measurements of Spatial-Temporal Fields. Since 2006 he has been working as a Professor of Information Systems and Head of the Parallel Software lab at ITMO University. In 2007 he created the eScience Research Institute, where his team created CLAVIRE (CLoud Applications VIRtual Environment).
In recent years he has attracted several grants, including mega-grants of the Government of the Russian Federation, e.g. under decree #220 ("On measures to attract leading scientists to Russian educational institutions") and decree #218 ("On cooperation between Russian higher education institutions and organizations implementing complex projects in high-tech industry").
His research interests are high-performance computing, computer modelling of complex systems, intelligent computational technologies, statistical analysis and synthesis of spatial-temporal fields, parallel and distributed computing, distributed environments for multidisciplinary research, decision support systems and technologies, and statistical analysis and simulation in marine sciences. He is the author of 230 publications (cited over 1000 times) and has successfully advised 23 PhD candidates.


Mario Cannataro
University "Magna Græcia" of Catanzaro, Catanzaro, Italy

Efficient Preprocessing and Analysis of Omics Data

Abstract:

Genomics, proteomics, and interactomics are attracting increasing interest in the scientific community due to the availability of novel, high-throughput platforms for the investigation of the cell machinery, such as mass spectrometry, microarrays and next-generation sequencing, which are producing an overwhelming amount of experimental omics data. The increasing volume of omics data poses new challenges both for efficient storage and integration and for efficient preprocessing and analysis. Managing omics data requires both infrastructure and space for data storage, as well as efficient, possibly parallel, algorithms for data preprocessing and analysis. The resulting scenario comprises a set of methodologies and bioinformatics tools, often implemented as services, for the management and analysis of omics data stored locally or in geographically distributed biological databases. The talk describes some parallel and distributed bioinformatics tools for the preprocessing and analysis of genomics, proteomics and interactomics data, developed at the Bioinformatics Laboratory of the University Magna Graecia of Catanzaro. Tools for efficient statistical and data mining analysis of mass spectrometry proteomics data (MS-Analyzer, EIPEPTIDI), as well as of gene expression and genotyping data (micro-CS, DMET-Analyzer, DMET-Miner, OSAnalyzer, coreSNP), will be briefly outlined.

Mario Cannataro is a full professor of computer engineering at the University "Magna Graecia" of Catanzaro, Italy, and the Director of the Data Analytics research centre of the University of Catanzaro. His current research interests include bioinformatics, parallel and distributed computing, data mining, problem solving environments, and medical informatics. He is a member of the editorial boards of Briefings in Bioinformatics, the Encyclopaedia of Bioinformatics and Computational Biology, and the Encyclopaedia of Systems Biology. He has been a guest editor of several special issues on bioinformatics and serves as a program committee member of several conferences. He has published three books and more than 200 papers in international journals and conference proceedings. Prof. Cannataro is a Senior Member of IEEE, ACM and BITS (Bioinformatics Italian Society), and a member of the Board of Directors of the ACM Special Interest Group on Bioinformatics, Computational Biology, and Biomedical Informatics (SIGBio).


Manuel Castañón-Puga
Universidad Autónoma de Baja California, Tijuana, Baja California, México

Ethnography and Computation: Towards Agent-Based Modelling from Ethnography Observations

Abstract:

Ethnography is the branch of anthropology that deals with the scientific description of human communities. The ethnographer mainly records stories in the form of textual field notes, describing scenarios that he or she has observed and lived through with members of a community. These narratives are the first step in studying a community's social and cultural systems. Agent-Based Modelling (ABM) is a computational modelling paradigm used to represent the members of a society. In this talk, we examine some computational ideas that lead from ethnographic observations toward agent-based modelling for social simulation. We used IBM Watson services to explore ways to discover entities and relationships from ethnographic text sources. We present some experiences and results of this first approach, and we discuss the methodological issues for agent-based modelling and simulation.
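
As a rough illustration of the final step only (not the authors' actual pipeline), the sketch below turns (entity, relation, entity) tuples, of the kind a text-analysis service such as IBM Watson can extract from field notes, into a toy agent-based model. All names, relations and the wealth-exchange rule are invented for illustration.

    # Hypothetical sketch: turning (entity, relation, entity) tuples
    # extracted from ethnographic field notes into a minimal agent-based
    # model. The tuples below stand in for text-analysis output; they
    # are invented for illustration.
    import random

    relations = [
        ("farmer", "trades_with", "merchant"),
        ("merchant", "trades_with", "fisher"),
        ("fisher", "shares_food_with", "farmer"),
    ]

    class Agent:
        def __init__(self, name):
            self.name = name
            self.neighbours = []   # agents this one interacts with
            self.wealth = 10       # toy state variable

        def step(self):
            # Interact with one randomly chosen neighbour per tick.
            if self.neighbours:
                other = random.choice(self.neighbours)
                self.wealth -= 1
                other.wealth += 1

    # Instantiate one agent per distinct entity and wire up the network
    # implied by the extracted relations.
    agents = {}
    for src, _, dst in relations:
        for name in (src, dst):
            agents.setdefault(name, Agent(name))
        agents[src].neighbours.append(agents[dst])

    for tick in range(100):
        for agent in agents.values():
            agent.step()

    print({a.name: a.wealth for a in agents.values()})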

Dr Manuel Castañón-Puga is a full professor of computer science and computer engineering at the Universidad Autónoma de Baja California in México. He is the leader of the "Complexity and Computation" academic group and the "Computational Modeling and Complex Systems" academic collaboration network. His current research interests include multi-agent systems, hybrid-intelligent software agents, social simulation, social-inspired ICT, social computation, computational intelligence, complexity, and software and computer engineering. His research on modelling and simulation, agent-based simulation, hybrid-intelligent agents and multi-agent systems explores the way in which software agents can be used to describe multidimensional environments, innovation, evolution and adaptation in complex adaptive systems. He collaborates with multidisciplinary researchers and scientists to create multi-dimensional computer simulations of societies, political ideologies, trading economies and urban landscapes. His research also aims to incorporate the ideas of complexity into the mainstream of computer engineering and, in particular, into its instruction at the undergraduate and graduate levels.


Keith McCormack
The University of Sheffield, Sheffield, UK

‘EurValve’ as a Map for Pervasive Computational Healthcare

Abstract:

Healthcare is rightly the most conservative of sophisticated human endeavours: 'better safe than sorry'. But complexity has conspired with necessity to drive clinical practice into the arms - indeed the code-typing fingertips - of the computational medical physicist, where image processing, machine learning, 4D modelling and predictive analysis can outstrip the keenest minds and the longest memories with the speed, accuracy and vast data resources of computational medicine. The future is a world of continuous predictive health information, constantly updated with every step, every meal, every sneeze. The fundamentals of this integrated, pervasive HealthCare 2050 are already being established, and many can be witnessed in the EC-funded EurValve project, where the machines make the measurements, calculate the odds, and suggest the treatments.


Eduardo J. Simoes
The University of Missouri-Columbia School of Medicine, Columbia, USA

Effect of eHealth on the Treatment and Control of Type 2 Diabetes

Abstract:

Title: Effect of Health Information Technologies on Glycemic Control among Patients with Type II Diabetes.
Authors: Eduardo J. Simoes¹, M.D., D.LSHTM, M.Sc., M.P.H.; Susan Boren¹, PhD; Mihail Popescu¹, PhD; Diana Kennedy¹, PhD (ABD); Jesus Soares², PhD
¹ University of Missouri School of Medicine, Department of Health Management and Informatics
² Centers for Disease Control and Prevention, Division of Nutrition, Physical Activity and Obesity

Objective: The purpose of this meta-analysis was to synthesize findings on the effects of health information technologies (HITs) on glycemic control among patients with type II diabetes.
Methods: We systematically searched Medline, the Cumulative Index of Nursing and Allied Health Literature (CINAHL), and the Cochrane Library for peer-reviewed randomized controlled trials that studied the effect of mobile or potentially mobile technology on glycemic control (HbA1c). We also used Google Scholar to identify additional studies not listed in the abovementioned databases. We performed a hand search of the reference lists of eligible articles and of relevant systematic reviews and review articles to identify potentially missed articles. We analyzed the data using random-effects meta-analytic models (see the sketch after the abstract).
Results: 20 studies (25 estimates) met the criteria and were included in the analysis. Overall, HITs resulted in a statistically significant and clinically meaningful reduction in estimated average glucose level (HbA1c %). The combined HbA1c reduction was -0.700 [standardized mean difference (SMD) = -0.700, 95% CI (-0.916, -0.485)]. The reduction was significant across all four types of HIT intervention under review, with short message services and mobile phone-based approaches generating the largest effects [SMDs of -0.757 (-0.996, -0.517) and -0.716 (-0.941, -0.490), respectively].
Conclusions: HITs can be an effective tool for glycemic control among patients with type II diabetes. Future studies should examine HITs' long-term effects and explore the factors that influence their effectiveness.
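
For readers unfamiliar with the method, the following is a minimal sketch of DerSimonian-Laird random-effects pooling of standardized mean differences, the kind of model the abstract refers to. The per-study effect sizes and variances are invented for illustration; they are not the 25 estimates analyzed in the study.

    # Minimal sketch of DerSimonian-Laird random-effects pooling of
    # standardized mean differences (SMDs). All numbers are invented
    # for illustration.
    import math

    smd = [-0.9, -0.5, -0.8, -0.6]     # per-study effect sizes
    var = [0.04, 0.06, 0.05, 0.03]     # per-study sampling variances

    w = [1 / v for v in var]           # fixed-effect weights
    y_fixed = sum(wi * yi for wi, yi in zip(w, smd)) / sum(w)

    # Heterogeneity: Q statistic and method-of-moments estimate of tau^2.
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, smd))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(smd) - 1)) / c)

    # Random-effects weights incorporate the between-study variance.
    w_re = [1 / (v + tau2) for v in var]
    pooled = sum(wi * yi for wi, yi in zip(w_re, smd)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))

    print(f"pooled SMD = {pooled:.3f}, "
          f"95% CI ({pooled - 1.96 * se:.3f}, {pooled + 1.96 * se:.3f})")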

Dr. Eduardo J. Simoes is Chair and Alumni Distinguished Professor in the Department of Health Management and Informatics, University of Missouri School of Medicine. He holds a Doctor of Medicine (University of Pernambuco, Brazil), a Diploma and MSc in Community Health (London School of Hygiene and Tropical Medicine, University of London, England), and an MPH (Emory University). His previous notable appointments include Director of the Prevention Research Centers Program at the Centers for Disease Control and Prevention and State Epidemiologist of Missouri. In the fields of public health, medicine, health informatics and epidemiology, he has published over 120 peer-reviewed articles, nine book chapters and over 30 reports, and has presented 150 conference papers worldwide.


Alfredo Tirado-Ramos
The University of Texas Health Science Center, San Antonio, USA

Patient-Centered Computable Phenotyping in Health Disparities Research

Abstract:

Computable phenotypes are sets of computable inclusion/exclusion criteria for patient cohorts. Such criteria should be specific and objective enough that they can be turned into machine-readable queries, yet general enough that they are portable between different data sources. There are a number of methods for creating and consuming computable phenotypes, such as OMOP, PCORnet Front Door, i2b2, and SHRINE, though the biggest challenge is still creating a baseline infrastructure for understanding these systems well enough to use them in cutting-edge biomedical research. In this talk we will discuss our experiences, lessons learned and next steps in creating a cluster of excellence in biomedical informatics research based on computable phenotyping.
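
To make the idea concrete, here is a minimal, hypothetical sketch of a computable phenotype expressed as code: inclusion/exclusion criteria evaluated over a simplified patient record. The field names and code lists are invented for illustration and do not follow any particular data model such as OMOP.

    # Hypothetical computable phenotype: adults with type 2 diabetes
    # (ICD-10 E11 family), excluding gestational diabetes (O24 family).
    # The record structure is a simplification invented for illustration.
    from dataclasses import dataclass, field

    @dataclass
    class Patient:
        age: int
        diagnosis_codes: set = field(default_factory=set)  # e.g. ICD-10
        medications: set = field(default_factory=set)

    T2DM_PREFIXES = ("E11",)       # inclusion: type 2 diabetes codes
    EXCLUDED_PREFIXES = ("O24",)   # exclusion: gestational diabetes

    def in_cohort(p: Patient) -> bool:
        included = p.age >= 18 and any(
            code.startswith(T2DM_PREFIXES) for code in p.diagnosis_codes)
        excluded = any(
            code.startswith(EXCLUDED_PREFIXES) for code in p.diagnosis_codes)
        return included and not excluded

    patients = [
        Patient(54, {"E11.9"}, {"metformin"}),
        Patient(31, {"O24.4"}, set()),
    ]
    cohort = [p for p in patients if in_cohort(p)]
    print(len(cohort))  # -> 1

Because the criteria are ordinary code over well-defined fields, the same logic can in principle be re-targeted at different data sources, which is the portability the abstract emphasizes.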

As the chief and founder of the Clinical Informatics Research Division of the University of Texas Health Science Center at San Antonio, Dr. Tirado-Ramos leads a full-spectrum biomedical informatics program and explores the intersection between informatics, translational science, and clinically relevant data-centric problems including, but not limited to, computable phenotype-based research in health disparities, obesity, amyotrophic lateral sclerosis, aging, and cancer. He and his team have created and maintain an information research system for interdisciplinary collaboration between pediatric endocrinologists, cancer researchers and neurologists, creating new institutional governance frameworks along the way. He also co-directs the informatics core at the Claude Pepper Older Americans Independence Center, a National Institute on Aging award, where he works on state-of-the-art informatics infrastructures to investigate innovative interventions that target the aging process as well as aging-related diseases, with a major focus on pharmacologic interventions. Prior to arriving at the University of Texas, he served at the Emory University School of Medicine as Associate Director for the Biomedical Informatics Core at the Center for AIDS Research at the Rollins School of Public Health. He also served as a Scientific Member of the Winship Cancer Institute Prevention and Control Program and as a Senior Member of the Research Staff at the Center for Comprehensive Informatics. His work at Emory University focused on informatics applied to clinically relevant biomedical challenges, including the correlations between infectious disease and cancer, as well as whole-genome sequencing for vaccine development research.


Hai Zhuge
Aston University, Birmingham, UK

Human-Machine-Nature Symbiosis

Abstract:

Cyberspace is being ever more tightly linked to the physical and socioeconomic spaces, giving rise to a cyber-physical society in which humans, machines and the natural environment interact with each other, efficiently share resources and co-evolve, with patterns emerging across the different spaces. The emerging cyber-physical society provides a new environment for experiencing, understanding and thinking. Human-machine-nature symbiosis is the basic mechanism that coordinates humans, machines and the natural environment to realize the harmonious development of cyber-physical society; studying this fundamental symbiotic mechanism is a way to understand cyber-physical society, to influence its evolution, and to investigate its impact on computing, intelligence and society. Human-machine-nature symbiosis also provides a flow-driven symbiotic method for studying Cyber-Physical-Social Intelligence and managing the sustainable development of cyber-physical society.

Dr Hai Zhuge is a professor at Aston University and the Chinese Academy of Sciences, a Distinguished Scientist of the ACM and a Fellow of the British Computer Society. His major research interest is exploring fundamental issues of semantics, knowledge, dimension, and self-organisation in a multi-disciplinary setting. One of his contributions is semantic modelling: he created the Semantic Link Network model and the Resource Space Model and integrated them into a fundamental semantic space that supports semantics-based management and exploration of various resource spaces. This research has established a uniform theory and method for modelling, organising, retrieving, and managing both cyber objects and concepts, so that various information services can be provided efficiently and with understanding. A set of high-performance semantics-based distributed platforms was established to provide a self-organised and adaptive architecture for efficiently sharing and managing various cyber objects. His model, theory and method have been applied in many applications, including summarisation, question answering and recommendation. In recent years, he has been leading research towards a new science and engineering for the Cyber-Physical Society. Homepage: http://www.knowledgegrid.net/~h.zhuge/


Robert Adamski
Intel Corporation

Solving Atari Games with Distributed Reinforcement Learning

Abstract:

Intel collaborates with partners like deepsense.ai to make a mark on cutting-edge research leading towards intelligent machines by providing practical machine learning tools and designs that make it much easier for scientists to track their experiments and verify novel ideas. One particular step towards this goal was distributing a state-of-the-art reinforcement learning algorithm on a large Intel Xeon cluster, allowing super-fast training of agents that learned to master a wide range of Atari 2600 games.
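
As a schematic illustration only (this is not the algorithm used in the work described above), data-parallel RL training typically has each worker compute a policy-gradient estimate on its own batch of episodes, averages the gradients across workers (an all-reduce on a real cluster), and applies the same update everywhere. The toy two-armed-bandit REINFORCE below shows the pattern, with the workers simulated sequentially.

    # Toy sketch of data-parallel policy-gradient training. Each "worker"
    # computes a REINFORCE gradient on its own batch; gradients are
    # averaged (an all-reduce on a cluster) and applied as one update.
    import numpy as np

    rng = np.random.default_rng(0)
    N_WORKERS, BATCH, LR = 4, 64, 0.1
    theta = np.zeros(2)                      # logits over two actions

    def worker_gradient(theta):
        """REINFORCE gradient estimate from one worker's batch."""
        grad = np.zeros_like(theta)
        p = np.exp(theta) / np.exp(theta).sum()     # softmax policy
        for _ in range(BATCH):
            a = rng.choice(2, p=p)
            r = rng.normal(loc=[0.0, 1.0][a])       # arm 1 pays more
            grad += r * (np.eye(2)[a] - p)          # reward * grad log pi(a)
        return grad / BATCH

    for step in range(200):
        # On a cluster this loop runs concurrently; here it is sequential.
        grads = [worker_gradient(theta) for _ in range(N_WORKERS)]
        theta += LR * np.mean(grads, axis=0)        # averaged update

    print("learned action probabilities:",
          np.exp(theta) / np.exp(theta).sum())      # should favour arm 1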


Ben Bennett
Hewlett Packard Enterprise

The Next ‘Giant Leap’ for Computing

Abstract:

This talk looks at how HPE and NASA are addressing the issues of on-board computing required for travel to Mars. Moreover, similar problems back here on Earth are causing HPE to redefine the computer as we know it.

Organizers:
AGH University of Science and Technology
ACC Cyfronet AGH
Department of Computer Science AGH