Friday, July 10, 2020

Ph.D. Scholarship @ Unitrento: Assessment of social and economic impacts caused by natural hazards in mountain regions

Death tolls and economic losses from natural hazards continue to rise in many parts of the world. In 2018 alone they caused almost 12,000 deaths across the world and over 130 billion US dollars of economic losses (CRED, 2018). European states are experiencing a continuous and significant burden from multiple natural disasters (Wolfgang et al., 2019): in 2016 Germany, Belgium, and Switzerland were hit by a series of flash floods and storms causing over $2.2 billion in losses; in 2013 Storm Xaver caused at least 15 fatalities, dozens of injuries, and more than €800 million in total economic losses in northern Europe (e.g. Rucińska, 2019); and the July-August 2003 European heat wave caused a total of 70,000 deaths (e.g. Russo et al., 2019; Bouchama, 2004).


Several factors combine to explain the increasing social and economic toll of natural hazards: the increase in exposed assets, i.e. rising population and capital at risk (e.g. Visser et al., 2014); the effects of anthropogenic climate change on climatic extremes (e.g. Donat et al., 2016; Bouwer, 2011); and better impact-reporting procedures (e.g. Doktycz and Abkowitz, 2019).

International agreements on disaster loss reduction (the Sendai Framework for Disaster Risk Reduction 2015-2030) explicitly recognize the benefits of multi-hazard early warning and forecasting systems (MHEW&F-S). In 2017 the Member States of the United Nations affirmed the need for MHEW&F-S and agreed on their definition as integrated systems that "address several hazards and/or impacts of similar or different type in contexts where hazardous events may occur alone, simultaneously, cascadingly or cumulatively over time, and taking into account the potential interrelated effects" (UNISDR, 2017). Here the term early warning (EW) is extended with the term forecasting (&F) to explicitly acknowledge that each hazard has a specific forecast lead time, which can vary from minutes/hours for flash floods, to days for pluvial floods or heat/cold waves, to months for drought. The scientific community also agrees on the need for novel approaches and local-scale models for assessing impacts caused by climate change (e.g. Schewe et al., 2019). To answer this call and to move towards a rigorous framework for multi-hazard risk assessment, in this project I propose to implement a novel local-scale, multi-hazard, impact-centered forecasting system. It aims to:
·       Quantify the three fundamental components of risk, i.e. hazard, exposure, and vulnerability, and combine them in a multi-hazard framework, exploiting the most recent datasets and the most appropriate models;
·       Provide timely and effective warnings, not just of the hazards but also of the most probable sectoral impacts that may be triggered by multiple hazard conditions.
The system will be unique and novel because it will be the first operational system for multi-risk quantification, including:
·       a local high-resolution meteorological forecasting system that runs operationally at 1 km resolution and is capable of explicitly modelling convective phenomena;
·       a detailed, component-based, open-source framework for multi-hazard quantification, locally and automatically calibrated for estimating the probability of occurrence of floods, droughts, shallow landslides/debris flows, heatwaves/coldwaves, and windstorms;
·       a new set of exposure and vulnerability layers, variable in space and time, to account for socio-economic changes in the risk analysis;
·       an innovative framework based on a probabilistic graphical model (a Bayesian Network), dynamic in time and variable in space, which considers all the risk components (hazards, exposure, and vulnerability, as in Formetta and Feyen, 2019) as stochastic variables and models all their possible interactions using probabilistic expressions. The latter will be inferred using a Bayesian learning process involving: 1) a database of reported impacts (fatalities and economic losses) that occurred in the past (1980-2018) in the study area, specifically organized for the project, and 2) the corresponding hazard probabilities, exposure, and vulnerability at the time of each reported event (computed using the two previous points).
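As a toy illustration of this probabilistic idea, the sketch below (plain Python; every variable name and probability is a hypothetical placeholder, not a value from the project) treats hazard, exposure, and vulnerability as stochastic variables in a small discrete Bayesian network and shows how a hazard forecast updates the probability of an impact:

```python
# Minimal sketch of the probabilistic-graphical-model idea: hazard (H),
# exposure (E) and vulnerability (V) are stochastic variables, and the
# probability of an impact (I) is obtained by enumeration over their joint
# distribution. All numbers below are invented for illustration.

P_H = {"flood": 0.2, "none": 0.8}   # P(hazard)
P_E = {"high": 0.3, "low": 0.7}     # P(exposure)
P_V = {"high": 0.4, "low": 0.6}     # P(vulnerability)

# Conditional probability table P(impact = yes | H, E, V); in the project
# this would be learned from the 1980-2018 impact database.
P_I = {
    ("flood", "high", "high"): 0.9,
    ("flood", "high", "low"): 0.5,
    ("flood", "low", "high"): 0.4,
    ("flood", "low", "low"): 0.1,
    ("none", "high", "high"): 0.05,
    ("none", "high", "low"): 0.01,
    ("none", "low", "high"): 0.01,
    ("none", "low", "low"): 0.0,
}

def prob_impact(p_h=P_H):
    """Marginal probability of an impact, P(I = yes), by enumeration."""
    return sum(
        p_h[h] * P_E[e] * P_V[v] * P_I[(h, e, v)]
        for h in p_h for e in P_E for v in P_V
    )

# A meteorological forecast updates P(hazard); the warning is then issued
# on the resulting impact probability, not on the hazard alone.
baseline = prob_impact()
forecast = prob_impact({"flood": 0.9, "none": 0.1})
print(f"P(impact) baseline: {baseline:.3f}, with flood forecast: {forecast:.3f}")
```

The same enumeration generalizes to several interacting hazards, which is where a proper Bayesian-network library and learned conditional tables become necessary.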

The project study area is the Trentino Alto-Adige region, located in the eastern Italian Alps. The choice of the area is motivated by different reasons: i) no such system is currently running there (this also holds for all the other Italian regions); ii) in the near future mountain regions will be even more exposed to the occurrence of climatic extremes due to climate warming. The selected geographical domain is only a test-bed where the framework will be implemented, set up, tested, and verified against observed data for each single component, i.e. meteorological forecasting skill (against rainfall or air-temperature measurements), hydrological calibration and validation (against historical measured river discharge), hydrological forecasting skill (against observed river discharge using forecasted meteorological forcing data), and historical and forecasted impacts (against reported fatalities and economic losses).

Please write to giuseppe.formetta@unitn.it for information, or see the Department call directly (here).

References
Bouchama, A. (2004). The 2003 European heat wave. Intensive care medicine, 30(1), 1-3.
Bouwer, L. M. (2011). Have disaster losses increased due to anthropogenic climate change?. Bulletin of the American Meteorological Society, 92(1), 39-46.
Donat, M. G., Alexander, L. V., Herold, N., & Dittus, A. J. (2016). Temperature and precipitation extremes in century-long gridded observations, reanalyses, and atmospheric model simulations. Journal of Geophysical Research: Atmospheres, 121(19), 11-174.
Doktycz, C., & Abkowitz, M. (2019). Loss and Damage Estimation for Extreme Weather Events: State of the Practice. Sustainability, 11(15), 4243.
Formetta, G., & Feyen, L. (2019). Empirical evidence of declining global vulnerability to climate-related hazards. Global Environmental Change, 57, 101920.
Rucińska, D. (2019). Describing Storm Xaver in disaster terms. International journal of disaster risk reduction, 34, 147-153.
Russo, S., Sillmann, J., Sippel, S., Barcikowska, M. J., Ghisetti, C., Smid, M., & O’Neill, B. (2019). Half a degree and rapid socioeconomic development matter for heatwave risk. Nature communications, 10(1), 19.
Schewe, J., Gosling, S. N., Reyer, C., Zhao, F., Ciais, P., Elliott, J., ... & Van Vliet, M. T. (2019). State-of-the-art global models underestimate impacts from climate extremes. Nature communications, 10(1), 1-14.

Friday, July 3, 2020

On Doing Large-Scale Hydrology with Lions: Realising the Value of Perceptual Models and Knowledge Accumulation: A Review.

This is a review of the paper by Wagener, Thorsten, Tom Gleeson, Gemma Coxon, Andreas Hartmann, Nicholas Howden, Francesca Pianosi, Shams Rahman, Rafael Rosolem, Lina Stein, and Ross Woods. 2020. “On Doing Large-Scale Hydrology with Lions: Realising the Value of Perceptual Models and Knowledge Accumulation.” EarthArXiv. https://doi.org/10.31223/osf.io/zdy5n.

Since the Authors uploaded it to EarthArXiv, making it available as a preprint, my review can be public too.

The paper's main statement can be formulated by saying that in global hydrology and related sciences there remain large areas of knowledge which could easily be explored, because we now have the data and the tools to do it, but we do not. There are unexplored geographical regions and, substantially, the Authors ask for an "everywhere modelling effort": as in the old maps where "hic sunt leones" was written, there are large areas on Earth whose hydrology is essentially unknown (a known unknown, indeed). They have a point. The paper's language is good and the writing pleasant, but I would prefer a simpler organization which focuses more on the two or three main statements. A sound knowledge of the literature is interesting for the general reader, but it is not, in my opinion, used to focus the issues. On the contrary, there are a lot of paragraphs that, in reporting the state of the art, leave the impression that there is no problem at all. This does not mean that those paragraphs, read alone, are not well written, informed, or interesting, but they do not serve the goal of highlighting the issues well. You easily get the main ideas, but I had difficulty grasping the whole paper's contents, even after many readings. For instance, do "the lions" refer to the known unknown I cited above, or to an unknown unknown regarding models' structure and their granularity, as the manuscript sometimes seems to indicate?

I understand that the Authors invoke two main solutions for the issues they raise:
  • a larger sharing of perceptual models of catchments
  • better strategies for organizing the current knowledge, which is seen as inefficient: systematic metadata, the development of tools for knowledge harvesting, and the standardization of databases and of data in general.

These two directions of work are highlighted in two dedicated sections and, I would say, without this separation I would have had a hard time arriving at this synthesis.

Understanding what a "perceptual model" is, is part of making the reader understand the concepts supported by the Authors. What a perceptual model is, however, is not clarified in the paper, and could remain obscure to non-hydrologists. What is it? Can they define it more precisely? Is it a drawing? Is it a set of relations? Does it have a specific mathematical representation? Overall, I believe a little more should be said on what hydrological models at large scale are, without falling into an annoying classification or taxonomy, but discussing what these models are or should be. Recently, Frigg and Hartmann (2020) attempted a general discussion of scientific models from the point of view of the philosophy of science, which could help to clarify what these models are.

The domain of the paper is a little slippery. While the title of the paper and the main statements look at large-scale hydrology, the Authors sometimes indulge in observations that concern a finer granularity of the processes than the one required by this type of modelling. Maybe there is a lack of a definition of what "large-scale hydrology" is, especially with respect to the methods and the granularity of the processes described. The Authors should adopt or attempt one. Not that it cannot be partially deduced from what is written in the present manuscript, but this knowledge should not be taken for granted in readers who come from other disciplines.

A final comment has to be made on the advancement of science. The Authors cite Popper and Kuhn, but I do not think their topic is in the same domain, which is, in my opinion, the area of theories and of the interpretations of a theory; it is rather in the application of a given theory (or a set of theories) to the cases to which it is thinkable they could apply. Therefore I would exclude a "Kuhnian" direction here. Eventually, the topic here could be the practice of the Popperian theory of falsification: if the repeated application of the models proves unsatisfactory, in fact, it could lead to the need for a new theory or a new model.
But I guess this grows too philosophical for me, and I do not want to pursue this argument further; I concede I do not have the understanding required to treat it properly.

In conclusion, I like very much the issue the paper raises, and I think the topic is of interest to hydrologists and a wider audience. However, I think the Authors should make the effort to reframe and refocus their manuscript a little more.

Below are some scattered comments on specific statements.

Page 6 - Line 1 - "… for new scales of management …" What does it mean? It seems to me, whatever it means, that it diverts the attention from the fact that "it also contains hydrologic lions" is the point, by moving the thinking to the lateral issue of the scale of analysis.

Page 8 - Line 5 - I think the main problem with pure data assimilation is that it tends to be erroneously inductive, without any hypothesis to test. Some phrases in the subsequent pages seem to suggest that the advancement of science is not hypothetical-deductive but inductive. This is, maybe, a marginal point in the context of the paper but, because I think it is wrong, I ask the Authors to be clearer about it.

Page 9 - Line 12 - "As Mc Donnell et al. …" - Yes, correct, but: how is this statement coupled with large-scale hydrology? Does it mean support for the reductionist view that, if we describe well all the hillslopes of the world, we have the best large-scale model? Secondly, is it used to support inductivism? Or to deny it?

Page 9 - Line 44 - "How much can we reduce model uncertainty …" This phrase conveys the idea that large-scale models should be constrained by some expected large-scale behavior. But what does this have to do with the hydrologic "lions"? From the geographical example, I was expecting these "lions" to refer to unexplored geographical areas, where data or modelling are scanty, not to the general aspects of modelling. Do the Authors mean that we apply our models to large areas but nevertheless do not know well their validity and foundations (other lions, indeed)? If so, maybe the concept of lions is not so clear to me, and the Authors should clarify its extent further.

Page 9 - Line 48 - "Some studies have shown that simpler …" This again regards the "inner lions" of large-scale hydrological modelling. It is a critique of the way such models are actually built and verified. The Authors suggest (probably with some unexpressed example in mind) directions for better characterizing those models. I agree with the single statements, but I do not feel the topic is properly prepared and introduced/discussed in the paper.

Page 9 - Is, to the Authors' knowledge, Boorman 1995 the only paper that deals with the conceptualised parameters of a model? It was 25 years ago, though. Or did I misunderstand what you want to say?

Page 9 - Line 55 - "It has been widely discussed …" Frankly, I do not buy this statement. It is not exactly true that complex models, such as those, for instance, that solve partial differential equations, have these degrees of freedom in practice. There are at least two reasons for that: 1 - models that conserve mass (and, by the way, energy) usually cannot be stretched to reproduce any measured time series as accurately as one desires (for instance, a model that solves the Richards equation cannot reproduce macropore flow with any characterization of the soil parameters); 2 - even if, in principle, the optimization of the parameters of a spatially distributed model can involve arbitrary and different values of the parameters at each site, in practice the calibration is extremely time-consuming and usually unfeasible even with ten of them. Therefore really exploring a large set of parameters in high dimensions is simply not possible. I have tried it several times. If anything, the strength of physically based, spatially distributed models is their physics: the capability to accommodate heterogeneous inputs and obtain spatially distributed outputs.

Page 11 - Line 8 - "Supervisors" - I would rather say "oral communication", like, for instance, the one given by Tom Dunne.

Page 12 - Line 51 - Among the experiences that deserve to be cited in categorizing hydrological descriptions, at least two should be mentioned: the CF conventions (https://cfconventions.org/) and the Basic Model Interface (now in its second version: https://bmi-spec.readthedocs.io/en/latest/).


References

Frigg, Roman, and Stephan Hartmann. 2020. “Models in Science.” In The Stanford Encyclopedia of Philosophy, edited by Edward N. Zalta, Spring 2020. Metaphysics Research Lab, Stanford University. https://plato.stanford.edu/archives/spr2020/entries/models-science/.

Monday, June 29, 2020

A practitioners’ view on the application of water and flood directives in Italy

This is the preprint of a chapter of the book: P. Turrini, A. Massarutto, M. Pertile and A. de Carli (eds.), Water Law, Policy and Economics in Italy: Between National Autonomy and EU Law Constraints, Springer (forthcoming 2021). In the chapter we talk about the application of the Water Framework Directive and the Flood Directive in Italy, from the point of view of engineering, hydrology and hydraulics. It derives from our experience of working on the directives' application over the last ten years, and we hope it can be a contribution to a better application for the next deadline, expected in 2022. The preprint is available through OSF Preprints by clicking on the Figure below.
The directives are a complex topic that interplays with the organization and legislation of Italy. Here is our Abstract: "The commonly called Water Framework Directive (WFD) and Flood Directive (FD) represent pivotal points for European water policies. They do not need any further introduction here since there are other contributions to this book that present them in detail. In this chapter, we briefly describe how they affect people working in Italy in water resources management, exploitation and protection of and from water bodies. In this contribution, we try to present the work needed to fulfil the directives generally, who did the work and with what responsibilities in past implementation cycles, and what was actually done in the implementation cycles of both directives up to 2016. The result is a picture of the Italian water management system; a system not only defined by laws and norms, but also by habits and the way Institutions have developed during recent history through their interplay with growing technical knowledge, the implementation of policies, and the evolution of Italian society. This chapter is divided as follows: section 1 reports what has to be done to accomplish the directives generally; section 2 summarizes who performed the actions connected to the directives in past implementation cycles; sections 3 and 4 report and discuss Italy's application of the directives; section 5 covers the role of science in the implementation of the directives; and, finally, section 6 contains some considerations on the main critical aspects and on the challenges the future application of the directives (2021-2027) is going to face."

Monday, June 8, 2020

Concentration time, if it exists, is a statistical concept


Among the various times we use in describing a catchment, concentration time is one of them. It is defined, in the old textbooks, as the largest travel time of water parcels (i.e. statistically significant amounts of water molecules that are thought to move together) in a catchment. Travel time, in turn, is the time a parcel of water takes to cross the catchment from its injection (as rainfall) to its exit (as part of discharge). Figure 1 below illustrates two parcels with different travel times, with parcel 1 arriving at the outlet faster, being closer to it.
The concept of concentration time gained its importance with Mulvaney's theory of "the rational method", reported, for instance, in K. Beven's book (2012). To give it a meaning, we can assume that, if parcels are thought to move with constant velocity in a catchment, then, once their distance from the outlet along the drainage directions (see the width function concept) is known, travel time is obtained by dividing that distance by the parcels' velocity.
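Under this constant-velocity assumption the computation is elementary. The sketch below (Python, with purely illustrative distances, counts, and velocity) derives the travel times, and the largest of them, from a toy width function:

```python
# Sketch of the constant-velocity assumption: given the width function
# (number of parcels at each flow distance from the outlet), travel times
# are distances divided by a single parcel velocity. Numbers are invented.

velocity = 1.0  # parcel velocity [m/s], assumed constant over the catchment

# width function: flow distance from the outlet [m] -> number of parcels there
width_function = {100.0: 4, 500.0: 10, 1000.0: 6, 2000.0: 2}

# travel time [s] of each group of parcels = distance / velocity
travel_times = {d: d / velocity for d in width_function}

# the concentration time, in this view, is the largest travel time,
# i.e. that of the parcels on the divide, 2000 m away
t_c = max(travel_times.values())
print(f"concentration time: {t_c:.0f} s")
```

Rescaling the distances by the velocity is all the rational method needs; everything else in the construction is geometry of the drainage network.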
Rigon et al. (2016) give a review of this concept in the framework of the geomorphological unit hydrograph based on the width function (the WFIUH). Older hydrologists will also remember a simplified version of the story, where the catchment is essentially seen as a rectangular planar hillslope and the flow is thought to be parallel, as in Figure 2 below.
Parcels move along essentially rectilinear paths, with constant velocity. Parcels like no. 2 are on the divide, and parcels like no. 1 are very close to the outlet, which in Figure 1 is a sort of trench. In this case, varying the duration of precipitation, we obtain a hydrograph which is a triangle or a trapezoid. It can be shown that when a rain of constant, fixed intensity falls on this catchment, we obtain the maximum possible discharge when its duration equals parcel no. 2's travel time, the largest one. Continuing to argue about models, not about what happens in reality, it can also be seen that, from the point of view of the instantaneous unit hydrograph theory (IUH), concentration time is the extension of the domain of definition of the IUH distribution function ($t_c$ in Figure 3).
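The peak-discharge behaviour just described can be written in a few lines. Here is an illustrative Python sketch of the rational-method reasoning on the planar hillslope (intensity, area, and $t_c$ are invented numbers):

```python
# On a rectangular planar hillslope with parallel, constant-velocity flow,
# the contributing area under a constant rain of intensity i grows linearly
# until either the rain stops (duration d) or the farthest parcel arrives
# (travel time t_c), so the peak discharge is Q_peak = i*A*min(d, t_c)/t_c.
# All values are illustrative, not taken from a real catchment.

def peak_discharge(duration, t_c=3600.0, intensity=1e-5, area=1e6):
    """Peak discharge [m^3/s] for a rain of given duration [s]."""
    return intensity * area * min(duration, t_c) / t_c

# the peak grows with duration up to d = t_c (triangular hydrograph), then
# stays at the maximum i*A (trapezoidal hydrograph):
for d in (900.0, 1800.0, 3600.0, 7200.0):
    print(f"d = {d:5.0f} s -> Q_peak = {peak_discharge(d):.2f} m^3/s")
```

This is exactly why, in this idealized model, the critical rainfall duration equals the concentration time: longer storms add volume but not peak.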
Unfortunately, most IUHs do not have a finite domain but an infinite one, the simplest being probably the exponential IUH $$IUH(t;\lambda) = \frac{1}{\lambda} e^{-t/\lambda}$$ (see also Rigon et al., 2011). This implies that for most IUHs the concentration time does not exist as a rigorous concept. Besides, the dynamics of water parcels as depicted in the simplified theories was completely upended by tracer experiments, which have shown that the age of water in floods is very much larger than previously believed: usually what we see in rivers and torrents is old water, not the water just fallen during the last precipitation (though it was undoubtedly the rainfall that triggered it).
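To see concretely that the exponential IUH admits no finite concentration time, one can look at quantiles of the travel-time distribution: the time by which a given fraction of the water has exited grows without bound as that fraction approaches one. A small Python sketch, with an illustrative value of $\lambda$:

```python
import math

# The exponential IUH, IUH(t; lam) = (1/lam) * exp(-t/lam), has infinite
# support: its CDF, 1 - exp(-t/lam), never reaches 1 at finite t, so a
# strict concentration time does not exist. Only a statistical surrogate
# can be defined, e.g. the time by which a fraction p of the parcels has
# exited. The value of lam below is illustrative.

def exp_iuh_cdf(t, lam):
    """Fraction of the injected water that has exited by time t."""
    return 1.0 - math.exp(-t / lam)

def quantile_time(p, lam):
    """Time by which a fraction p of the water has exited (p < 1)."""
    return -lam * math.log(1.0 - p)

lam = 3600.0  # mean residence time [s], illustrative
for p in (0.50, 0.95, 0.99):
    print(f"{100 * p:.0f}% of the water exited by {quantile_time(p, lam):.0f} s")
# as p -> 1 the quantile diverges: no finite concentration time exists
```

A quantile of this kind is one natural way to make the "statistical concentration time" of the title precise.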
The concept of concentration time persists in operational hydrology because there is some evidence that floods are generated by precipitations of increasing duration as basin area increases, and this correlates with the idea of concentration time exposed above for the planar hillslope. However, in complex catchments, it cannot be anything but a statistical concept. We already mentioned briefly that a catchment is not a huge planar hillslope and that water parcels move in complicated ways through it. Moreover, the expansion of the river network during storms (e.g. Durighetto et al., 2020) implies the need to add a further dynamic to the concentration time perceptual model.

After all the above considerations, if something like the concentration time exists, it is a characteristic statistical time which identifies the duration of the rainfalls that generate the largest peak discharges. It should depend on catchment size and topology (besides the rainfall). We believe that it increases with catchment size but, every catchment being different, it remains a slippery concept. A solid statistical study would be required to clarify the issue once and for all.

References



Friday, June 5, 2020

The Zero Notebook for GEOframe components

There is a need to properly document the code we have developed. The state of the art is that many Jupyter Notebooks were written to document many of the actions required for running the components. These Notebooks are made available when the sample projects are downloaded through their OSF (which stands for Open Science Framework) repository. This is probably a temporary solution, which will eventually be unified once and for all on GitHub. However, these notebooks (see for instance the case of the Winter School) often lack an overall description that conveys all the information regarding the Component, part of which, for some components, was written in a custom LaTeX format and made available through the GEOframe blog. To bring some order, I am proposing here to put the basic information in a Notebook, whose template you can find by clicking on the Figure below.
The notebook is a work in progress, and anyone who wants to give suggestions is welcome. There are two other purposes for this Notebook Zero: one is that the material it contains can serve, with minimal modifications, as a chapter of a thesis where the component is described in terms of both its informatics and its content; the other is that it could be used for a possible submission of the component code to JOSS. The latter goal would require some improvements in our GEOframe component GitHub site, though: tagged versions of the software, a clean way to submit issues (an issue tracker), and a set of unit tests for the continuous integration of the components. We have made a lot of progress in recent years, but we are not quite there yet, not really operational. A companion issue is where to upload the .sim files and the data corresponding to tagged versions of the components. So far they were assembled on someone's computer, compiled, eventually uploaded to Zenodo (or OSF), and made public. Streamlining the whole process on GitHub would probably be convenient. More generally, there is an installation problem for the OMS/GEOframe stuff. So far we have replicated the jars (i.e. the Java executables) several times, each time we needed a new project. It is time, I guess, to have the executables in a unique place, at computer or user level, while the directories with data etc. (so far recognised as the OMS projects) remain freely replicable for different simulations, without having to carry along copies of the executables every time.
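One possible shape for such a layout is sketched below in Python (pathlib): a single shared directory for the executables, and per-project directories for data and .sim files. All the directory names and the environment variable here are hypothetical illustrations, not an established GEOframe or OMS convention:

```python
from pathlib import Path
import os

# Hypothetical layout: one shared place for the jars, per-project
# directories (freely replicable) for data and simulation files.
home = Path.home()
geoframe_home = home / ".geoframe" / "jars"   # unique place for the executables
project = home / "projects" / "my_basin"      # one OMS project among many

geoframe_home.mkdir(parents=True, exist_ok=True)
(project / "simulation").mkdir(parents=True, exist_ok=True)  # .sim files
(project / "data").mkdir(parents=True, exist_ok=True)        # input data

# each project points to the shared executables instead of carrying copies
os.environ["OMS_HOME"] = str(geoframe_home)
print(f"shared jars in: {geoframe_home}")
```

With something like this, creating a new simulation would mean copying only the small `simulation` and `data` directories, never the jars.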

Thursday, May 14, 2020

Equivalences and differences among various Hydrological Dynamical Systems

In this paper we want to show that once the topology of a Hydrological Dynamical System is determined, the structure of the equations of the Water Budget Dynamical System is determined, and with it the travel time and residence time distributions are too. This has been obvious since our paper on age-ranked functions, but here it is rigorously stated and worked out.
Ideally this paper is also a continuation of the paper on the representation of Hydrological Dynamical Systems with Petri Nets, of which, in some sense, it represents an extension. In an ideal menu, the reader should first read the paper on the historical-critical approach to the GIUH (I would say excluding its last section), then the Age-Ranked paper, then the Petri Net paper, and finally this one. By clicking on the Figure you can get the preprint, and through the bibliography below you can access all the manuscripts. This is part of my research program of attempting a statistical-mechanical approach to hydrological modeling, which has run across all my research activity for (almost) the last thirty years.

References



Monday, May 11, 2020

SMASH

Hydrologis is the mini-company formed by Andrea Antonello and Silvia Franceschi (GS). Notwithstanding that they are just two, they have accomplished really a lot in their careers in the world of free and open-source GIS, where they have a solid reputation. First, while collaborating with me, they built the JGrass GIS, which eventually became a clean part of the uDig GIS. At the same time Andrea, for his Ph.D., wrote BeeGIS, whose ideas, with the explosion of the mobile-device wave, flowed into Geopaparazzi, which works with Android OS and can be downloaded from Google Play. In the meanwhile they also produced a port of the Horton Machine to gvSIG and as a standalone (download the executable from here; the Horton Machine is on GitHub). I forgot LESTO (EGU Abstract), the product for the analysis of LIDAR signals that Silvia developed for her Ph.D.
Now, finally, SMASH arrives (iOS, Android), the new tool for digital field mapping that works on the two main mobile platforms. To describe how it works, Silvia made a video, which I uploaded to my Vimeo channel. Unfortunately it is in Italian.
Hopefully we will soon have one in English. All the best!