Saturday, December 30, 2023

Help! The AboutHydrology mailing list was shut down by Google

The group was restored. Many thanks to all who helped!

Dear Subscribers,

The AboutHydrology mailing list connects nearly 6000 researchers, students, and practitioners in the field of hydrology and related sciences. However, since the beginning of December, it has been flagged as suspicious by Google and subsequently disabled. We are actively working to have this decision overturned and the list reactivated. While some of you may have been inconvenienced by AGU session announcements, marking the posts as "spam" has caused unintended consequences that affected many individuals.

For those who are impatient, please note that unsubscribing from the list is a simple and viable option if you no longer wish to receive emails. In the future, if you find yourself overwhelmed with emails, we kindly request that you unsubscribe without causing trouble for others. If you know how to contact Google about resolving the issue, please write privately to abouthydrology <at> or reach out to your own Google contacts to support the group's reactivation.
If Google does not reactivate the mailing list, we will explore migrating it to another platform, but obviously this will require some time.

The AboutHydrology managing group

Tuesday, December 26, 2023

Code Washing

 This time, I want to address the concerning issue of students inappropriately reusing open-source code without a clear understanding of open-source licenses.

It's crucial for students to grasp the essence of open source licenses, understanding that they are not just permissions to copy but guidelines for responsible use. Engaging with open-source code should involve a genuine learning process, encouraging students to comprehend and apply the principles embedded in the code they explore.
Merely having access to code doesn't grant the right to take it, make superficial changes, translate it from one programming language to another, remove the original authors, and claim the altered code as one's own. While open source encourages learning through code exposure, wholesale copying with only minor alterations, especially without restructuring into object-oriented code, doesn't constitute "creating a new code base."
In such instances, phrases like 'I looked at Mickey Mouse code, but I am using my own code' are, at the very least, misleading and likely a form of plagiarism. I term this practice "code washing." My plea: steer clear of it and adhere to ethical behavior.
The notion of "code washing" not only undermines the integrity of individual work but also compromises the collaborative spirit of open source. It's essential to emphasize that acknowledging and respecting the original authors not only aligns with ethical standards but also fosters a culture of transparency and collaboration in the coding community.

Friday, November 17, 2023

Some pills on what we do for agricultural droughts

Just an introduction to the debate about drought simulation, agriculture, and the new technologies that can be used to improve it. The presentation is below.

Just click on the Figure to see the presentation given at the Festival della Meteorologia 2023

Thursday, November 9, 2023

Java for Hydrologists 101

There are a few postings on Java in this blog. Since I want to teach it to my students, I am quietly starting to populate this page with presentations which, eventually, will constitute the core of an informal class (;-)): Java for Hydrologists 101. The first version of this post dates back more than 10 years, which gives an idea of how slowly things can go. The text presented here is, at present, the same, but I am progressively modifying it.

The primary aim of JfH-101 is not merely to impart basic Java knowledge, but to delve into topics and issues that align with my experience as a hydrologist. In collaboration with my colleagues and friends from GEOframe, we plan to cover not only Java, but also OMS3 and, when appropriate, GeoTools and the Horton Machine (formerly JGrasstools). We won't overlook Ant, Maven, Gradle, and Git either. Meanwhile, we'll address topics pertinent to object-oriented programming.

Programming is less about discussing theory and more about practical application. Therefore, many of our slides will prompt you to take action and apply what you've learned.

A wealth of (perhaps overwhelming) material can be found at the Java for Hydrologists OSF repository. New videos will be added to the Java 101 for Hydrologists Vimeo showcase. Here below, you'll find all the material in an ordered form:


0 - Getting Started (mostly things to read -or start to read- before the start) (Vimeo2023)

Old Topics

* -  A few diversions
* - Reading  data from the system's console
* - Reading data from a File

* - Working with Git
----- Not yet implemented -----

* - Programming the heat diffusion equation
* - Making the heat diffusion an OMS3 component
* - Building Java projects with Ant, Maven and GRADLE
* - GEOtools essentials
* - Commenting the programming of the GEOframe-ET
* - The Java REPL
* - A little on Java Modules in Java 9
* - Setting up continuous integration in Git (using Travis)

The source code is available for download from GitHub.


Please go to this blogpost.

* - From the links you can easily see that I rely very much on Lars Vogel's site for the basic stuff. It is obviously not the only good resource available (Stack Overflow is another one, for instance, and many others will be addressed).

Saturday, October 28, 2023

CARITRO Project: Snow droughts and green water: how climate change modifies the hydrological cycle in the Alpine Region.

Due to the impact of climate change, the Alpine region is experiencing a dual effect: a decrease in snowfall leading to snow droughts, and an increase in water losses through evapotranspiration, also known as green water. These changes have significant implications for the sustainable management of water resources and the preservation of ecosystems. This project, funded by the CARITRO foundation, aims to address these challenges by developing innovative models to accurately quantify snow melt and evapotranspiration losses. The ultimate goal is to provide practitioners with user-friendly calculation tools that are more advanced than traditional lumped models but less complex than intricate "process-based" 3D models. Initially proposed by Niccolò Tubini, the project has been taken up by John Mohd Wani with minimal modifications.  

The complete project plan can be found here

Friday, October 27, 2023

Open Science by Design

In the framework of the meeting "Community over Commercialization, Open Science, Intellectual Property and Data", I was graciously invited by professor Roberto Caso to talk about my experience with developing open source models and promoting open science. I tried to raise various topics: the transmission of science in a university environment, why open-source coding, why open science, and which methodologies can be used.

The presentation can be found @ and if a video recording becomes available, I will share it.

Friday, October 20, 2023

Identifying Snowfall Elevation Patterns by Assimilating Satellite-Based Snow Depth Retrievals

Precipitation in mountain regions is highly variable and poorly measured, posing important challenges to water resource management. Traditional methods to estimate precipitation include in-situ gauges, doppler weather radars, satellite radars and radiometers, numerical modeling and reanalysis products. Each of these methods is unable to adequately capture complex orographic precipitation. Here, we propose a novel approach to characterize orographic snowfall over mountain regions. We use a particle batch smoother to leverage satellite information from Sentinel-1 derived snow depth retrievals and to correct various gridded precipitation products. This novel approach is tested using a simple snow model for an alpine basin located in Trentino Alto Adige, Italy. We quantify the precipitation biases across the basin and find that the assimilation method (i) corrects for snowfall biases and uncertainties, (ii) leads to cumulative snowfall elevation patterns that are consistent across precipitation products, and (iii) results in overall improved basin-wide snow variables (snow depth and snow cover area) and basin streamflow estimates.

The analysis of the snowfall elevation patterns' spatial characteristics indicates that the proposed assimilation scheme results in more accurate spatial patterns in the snowfall distribution across the entire basin. The derived snowfall orographic patterns contribute to a comprehensive improvement of mountain hydrologic variables such as snow depth, snow cover area, and streamflow. The most significant enhancements in streamflow are observed during the spring and summer months when peak flow observations align more accurately with the posterior cases than the prior ones. These results primarily stem from the fact that the assimilation of Sentinel-1 assigns less snowfall to the lower-elevation regions of the basin, while higher rates are assigned to the higher elevation. As summer approaches, water is released more slowly from the higher elevation via snow-melt than in the prior case, which aligns better with observations. The assimilation of Sentinel-1 effectively downscales coarser-resolution precipitation products. While the prior snowfall cumulative elevation pattern has a small gradient across elevation bands, these patterns are consistent across elevations and precipitation products after the assimilation of snow depth retrievals. In conclusion, this study provides a framework for correcting snowfall orographic patterns across other seasonally-snow dominated mountain areas of the world, especially where in-situ data are scarce. The full paper can be found by clicking on the Figure above.
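
For readers curious about the mechanics, the particle batch smoother at the heart of the paper can be sketched in a few lines: an ensemble of precipitation multipliers is propagated through a snow model, and each member is weighted by its Gaussian likelihood against the snow depth retrievals. The snippet below is only a toy illustration with a deliberately crude snow model; the multiplier range, observation error, and melt rate are all assumptions, not the paper's actual settings.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy daily precipitation series (mm).
precip = rng.gamma(2.0, 3.0, size=60)

def snow_model(p, melt_rate=1.5):
    """Very crude snow model: accumulate precipitation, melt at a fixed rate."""
    swe = np.zeros_like(p)
    s = 0.0
    for t, pt in enumerate(p):
        s = max(s + pt - melt_rate, 0.0)
        swe[t] = s
    return swe

true_bias = 1.4  # hypothetical multiplicative bias of the precipitation product
obs = snow_model(true_bias * precip) + rng.normal(0.0, 5.0, size=60)  # "retrievals"

# Particle batch smoother: each particle is a constant precipitation multiplier.
n_particles = 500
multipliers = rng.uniform(0.5, 2.0, size=n_particles)
sigma_obs = 5.0  # assumed retrieval error (mm)

log_w = np.empty(n_particles)
for i, m in enumerate(multipliers):
    resid = obs - snow_model(m * precip)
    log_w[i] = -0.5 * np.sum((resid / sigma_obs) ** 2)

w = np.exp(log_w - log_w.max())
w /= w.sum()

posterior_mult = np.sum(w * multipliers)  # weighted posterior bias estimate
print(f"posterior multiplier ~ {posterior_mult:.2f} (true {true_bias})")
```

With the whole batch of observations entering each weight at once, the posterior concentrates sharply on the multiplier that best explains the snow depth record; this is the sense in which the smoother "corrects" the precipitation product.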

Girotto, Manuela, Giuseppe Formetta, Shima Azimi, Claire Bachand, Marianne Cowherd, Gabrielle De Lannoy, Hans Lievens, et al. 2023. “Identifying Snowfall Elevation Patterns by Assimilating Satellite-Based Snow Depth Retrievals.” The Science of the Total Environment, September, 167312.

Thursday, October 19, 2023

Water4All - WaMaWaDit project

The project WaMA-WaDiT: Water Management and Adaption based on Watershed Digital Twins was financed in the Water4All call, and therefore we will be able to start a new exciting adventure with some challenges.

This proposal aims to understand the impact of extreme climate events such as droughts and floods on water management systems, with the goal of developing optimized management strategies that maximize water security under both current and future climate change conditions. The knowledge gained will be used to create a watershed digital twin framework, applicable to various watersheds with different water-related issues. A guide will be published detailing the process of building digital twins for specific watersheds and problems.

The proposal, which you can find in its complete form by clicking on the above logo, pursues three main objectives: the scientific, the practical, and the product objectives. The scientific objective focuses on improving our understanding of how drought and floods affect water management systems, and how optimal strategies can mitigate these effects. This involves several sub-objectives, such as determining the best databases for modeling water management problems, analyzing systematic errors in climate and hydrologic predictions, improving the inclusion of groundwater dynamics models, incorporating complex snow dynamics, assessing the effect of long-term forecasts of extreme events on reservoir management, and improving the parameterization of single hydrological processes.

The practical objective is to create a methodology that systematizes the proposal and assessment of adaptation measures in reservoirs. This methodology will provide a clear guide on how to develop decision frameworks based on the most robust numerical models or digital twins of the watershed. It will also tackle how to manage hydroclimatic extremes like floods and droughts, emphasizing dynamic management of safety margins to maximize water availability and ways to reduce the impact of persistent droughts.

The product objective is to implement this methodology in a free, open-source software tool that simplifies the use of scientific knowledge for decision-makers and reservoir managers. This tool aims to be robust and scalable, providing a first-order approximation to any problem. It will encourage end-users to adopt optimal tools for their needs by demonstrating the power

Tuesday, October 10, 2023

Notes about the dynamic nature of the GEOframe-Po Project

Here below you can find some provisional notes, to be improved in the coming days, about our deployment of the GEOframe system to the river Po for the basin Authority of the river Po.

Basin extraction

It's not a straightforward operation. In fact, it has never been done systematically all over Italy. It serves two opposing needs: to be objective and to align with the official grid provided by basin Authorities and Regions. The initial phase relies mainly on slope analysis and requires processing digital terrain data, which have become available only in recent years, especially data produced with laser altimetry. The starting point is the Digital Elevation Models (DEMs) provided by the regions, which have been reprojected and standardized to the correct reference systems. The initiation of the hydrographic network is determined by an area threshold, while sub-basins, for the Po river, are delineated to have an average area of 10 km2. Procedures have been standardized in geographic information systems (GIS) over the last twenty years, but for this specific task the Horton Machine library developed by the University of Trento and HydroloGIS was used (with Abera et al., 2016, serving as reference), incorporating some innovative elements: a parser to aggregate smaller basins into adjacent larger ones and to handle certain topological situations, especially those in flat areas, for the subsequent use with GEOframe.
The tool was named GEOframeInputBuilder.
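
To fix ideas on the area-threshold initiation mentioned above, here is a minimal sketch: given a flow-accumulation grid, channel cells are simply those whose upstream drainage area exceeds a chosen threshold. The grid values, cell size, and threshold below are invented for illustration; this is not the GEOframeInputBuilder code.

```python
import numpy as np

# Hypothetical flow-accumulation grid (number of upstream cells) on a 30 m DEM.
cell_area_km2 = (30.0 * 30.0) / 1e6          # 0.0009 km^2 per cell
accumulation = np.array([
    [1,   2,    1,   1],
    [1,  10,    3,   1],
    [2, 400,  900,   2],
    [1,   5, 2500,   3],
])

# Channel initiation: cells whose upstream area exceeds the chosen threshold.
threshold_km2 = 0.5                          # illustrative value, not the Po setup
channels = accumulation * cell_area_km2 >= threshold_km2

print(channels.astype(int))
```

Lowering the threshold extends the network into smaller headwater hollows; raising it confines channels to the main valleys, which is why the choice of threshold is the crux of the (still very active) channel-initiation literature mentioned below.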

The extraction of lakes, particularly the large Lombard lakes and Lake Garda, required special attention and made the process less automated. Visual analysis reveals a differentiated geometry between mountain basins and lowland inter-basins, something noted since the early years of fluvial geomorphology but now objectively observed. The database now available enables statistical analysis of their geometry and topology, which previously relied on more qualitative cartographic analysis. Basin initiation with an area threshold is functional to the hydrological modelling, but the reader should be aware that this is a very active research topic, especially in light of the work by Gianluca Botter and coworkers [insert CITATION].

The grid, as currently constructed, will be distributed for free use and will serve as a fundamental standard for further cartographic-digital and hydrological analyses and developments.

Photo by Luigi Ghirri


Interpolation techniques saw significant development between the 1980s and 1990s [insert citation], but it is above all geostatistical methods that have slowly made their way into the practice of digital analysis of meteorological forcings in the hydrological cycle. These require the definition of a model of the correlation between measurements, known as a variogram, whose robustness is fundamental to the reliability of the result.
The starting database is made up of measurements collected by ground stations from regional entities operating on the Po basin. These data have been analyzed, cleaned, and subsequently interpolated, currently on each centroid of the sub-basins identified in the first phase of the work. The interpolation was carried out for precipitation and temperatures on a daily scale, as a first step to produce hourly or sub-hourly interpolation at any point of a suitable one-kilometer grid.
The interpolation technique used was kriging with drift, to account for orographic effects, especially on temperature. For fitting the experimental variogram, a ?linear? ?exponential? ?what else? model was used, relying on the interpolators implemented in GEOframe.
The interpolation covered the entire period from 1990 to today, and the data are stored in CSV files in folders containing the data for each individual sub-basin.
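
As a small illustration of the geostatistical machinery described above, the following sketch performs ordinary kriging with an exponential variogram at a single target point. GEOframe actually uses kriging with drift to include orography and fits the variogram to data; here the stations, values, and variogram parameters are all invented, and the drift term is omitted for brevity.

```python
import numpy as np

# Hypothetical station coordinates (km) and one day of precipitation (mm).
xy = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
z = np.array([12.0, 8.0, 15.0, 9.0])

def exp_variogram(h, nugget=0.0, sill=10.0, rang=15.0):
    """Exponential variogram model; parameters are illustrative, not fitted."""
    return nugget + sill * (1.0 - np.exp(-h / rang))

def ordinary_kriging(xy, z, target):
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    # Kriging system with a Lagrange-multiplier row/column enforcing
    # that the weights sum to one (unbiasedness).
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_variogram(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = exp_variogram(np.linalg.norm(xy - target, axis=-1))
    lam = np.linalg.solve(A, b)[:n]
    return float(lam @ z)

# Estimate at a sub-basin centroid.
print(ordinary_kriging(xy, z, np.array([5.0, 5.0])))
```

At the center of this symmetric layout all four weights come out equal, so the estimate is just the station mean (11 mm); with irregular station geometries the weights, and hence the estimate, are controlled by the variogram, which is why its robustness matters so much.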

It is clear that the procedure is a first approximation that will serve as the basis for future improvements. For example, extending the interpolation to the one-kilometer grid is one aspect. The next improvement could be to introduce high-resolution reanalysis data, combining geostatistical techniques with simulations of atmospheric circulation and any data coming from radar and satellite. Convergent research comes from atmospheric physics and meteorology, whose resolution is now approaching the scales useful for hydrology. Some work should be done to better connect the two communities.


GEOframe-NewAGE allows numerous configurations, as various components are available for the same phenomenon. For the basic configuration of each single Hydrologic Response Unit (HRU), the Embedded Reservoir Model (ERM), already partially tested in [insert citation], was chosen; its description can be found in the cited bibliography or in the linked videos. In summary, the ERM model is composed of a component for interception, one for snow (when present), a fast surface runoff separator based on the Hymod model, a nonlinear reservoir for the description of the root zone, and a second nonlinear reservoir for groundwater. Structurally, it is not much different from the HBV model [insert citation]. In the basic configuration, flood propagation is neglected.
For the part of evapotranspiration, a simple Priestley-Taylor model was used, where however the radiation is provided through a rather accurate model [insert citations].
Each of these ERM models was then connected to the others through the Net3 infrastructure [insert citation] to form a directed acyclic graph in which each node represents an HRU. Potentially, each HRU can be characterized not only by its own topographic and meteorological data, but also by its own models.
In the basic configuration, however, the same model structure is usually used for all HRUs while the values of the model parameters are obtained by subsequent calibration with spatially differentiated parameters, if the available data allow it.
The potential setup variants are numerous, encompassing at least three options for snow modeling, three for evapotranspiration modeling, and an array of choices for reservoir modeling. The inclusion or exclusion of flow propagation modules, as well as the potential elimination or addition of compartments to be modeled and their diverse connections, further expand the possibilities. An overview of potential topological configurations is presented, for instance, in [insert MaRmot citation]. As even a novice reader can see, the possible combinations multiply far beyond exponentially with the number of connected Hydrological Response Units (HRUs), which can, in turn, be linked in various manners. This complexity underscores why our comprehensive study on the Po River necessitates distribution and further refinement by others, to enhance the precision of the results and better align them with local needs: something that cannot be achieved by a single, however productive, team. In turn, this opens the question of how re-analyses performed by external researchers or teams can be accepted and inserted back into the main project.
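
To give a concrete flavor of the ERM building block, the sketch below integrates a single nonlinear reservoir, dS/dt = inflow - k S^a, with explicit Euler steps, and chains two of them (root zone feeding groundwater). Parameters and forcing are invented for illustration; this is not the actual GEOframe component, which is an OMS3 component written in Java.

```python
import numpy as np

def nonlinear_reservoir(inflow, k=0.05, exponent=1.5, dt=1.0, s0=0.0):
    """Integrate dS/dt = inflow - k*S**exponent with explicit Euler steps.

    A minimal stand-in for one ERM storage; k and exponent are
    illustrative values, not calibrated GEOframe parameters.
    """
    storage = s0
    discharge = []
    for q_in in inflow:
        q_out = k * storage ** exponent
        storage = max(storage + (q_in - q_out) * dt, 0.0)
        discharge.append(q_out)
    return np.array(discharge)

# Root-zone reservoir feeding a groundwater reservoir, as in the ERM cascade.
rain = np.array([10.0, 0.0, 0.0, 5.0, 0.0, 0.0, 0.0, 0.0])
q_root = nonlinear_reservoir(rain, k=0.2, exponent=1.2)
q_gw = nonlinear_reservoir(q_root, k=0.05, exponent=1.0)
print(q_gw.round(3))
```

Chaining buckets like this is what produces the characteristic fast rise and slow recession of the simulated hydrograph; the Net3 graph described next does the same wiring, but between whole HRUs instead of storages.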

The analysis of multiple configurations is therefore entrusted to later phases of the project.


Among the phases of a simulation, the calibration phase is the most time-consuming. It essentially consists of a large number of attempts to combine the model parameters to reproduce the measured data as faithfully as possible. The space of possible parameters is generally very large, even for a single simulation HRU. Therefore, the tools for calibration try to use intelligent strategies (including ML) to quickly guess which are the best parameter configurations.

The goodness of the simulated values' fit to the measured ones is usually quantified through some goodness-of-fit (GOF) metrics. In our case, these are generally the KGE [insert citation] or the NS [insert citation]. An analysis of the various GOFs can be found in [insert citation], whose result can be further detailed, in the validation phase (see below), with additional indicators such as those presented, for example, in Addor et al., 2017. Another, much more refined method of post-hoc analysis of the goodness of the simulations is the one presented in [insert Shima work citation]. The latter can also serve as a bias corrector of the final result, and it is going to be systematically applied to the results of the Po project.
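
For reference, the two GOF metrics mentioned above are simple to compute; here is a minimal sketch (using the 2009 formulation of KGE), with invented observed and simulated discharge series.

```python
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 means no better than the obs mean."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(sim, obs):
    """Kling-Gupta efficiency: combines correlation, variability, and bias terms."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    r = np.corrcoef(sim, obs)[0, 1]
    alpha = sim.std() / obs.std()        # variability ratio
    beta = sim.mean() / obs.mean()       # bias ratio
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

obs = np.array([3.0, 5.0, 9.0, 6.0, 4.0, 3.5])
sim = np.array([2.5, 5.5, 8.0, 6.5, 4.5, 3.0])
print(f"NSE = {nse(sim, obs):.3f}, KGE = {kge(sim, obs):.3f}")
```

Because KGE decomposes the error into correlation, variability, and bias, it often exposes compensating errors that a single NSE number hides.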

From an algorithmic point of view, the calibration carried out in the project is based on the LUCA model [insert citation], which is a sophisticated implementation of SCEM-UA [insert citation], but a particle swarm [insert citation] could also be used. The calibration procedure follows some standards. Given a set of data to base the calibration on, the data are usually divided into two subsets, one used for calibration and another for the so-called validation phase. In the former, both input and output data are available, and the parameters (or models) are determined in a way similar to what is done in standard ML techniques (which, for this purpose, could probably be used profitably). In the latter, the performance of the calibrated model is evaluated on data not used for parameter determination (which should be "independent" of the former). As already mentioned, in the validation phase additional GOF indicators can be used to better discern the performance of the adopted solution.
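
The calibrate-then-validate workflow can be sketched with a toy model and plain random search standing in for LUCA; the real algorithms explore the parameter space far more cleverly, but the data split and the evaluation logic are the same. Everything below (the one-parameter linear reservoir, forcing, and noise level) is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(precip, k):
    """Toy linear-reservoir runoff model standing in for a GEOframe setup."""
    q, s = [], 0.0
    for p in precip:
        s += p
        out = k * s
        s -= out
        q.append(out)
    return np.array(q)

precip = rng.gamma(2.0, 2.0, size=200)
obs = model(precip, k=0.3) + rng.normal(0.0, 0.05, size=200)  # synthetic "truth"

# Split the record: first part for calibration, the rest for validation.
cal, val = slice(0, 140), slice(140, 200)

def nse(sim, o):
    return 1.0 - np.sum((sim - o) ** 2) / np.sum((o - o.mean()) ** 2)

# Plain random search; LUCA/SCE-type shuffling or a particle swarm explore
# the space far more cleverly, but the calibrate/validate logic is the same.
candidates = rng.uniform(0.01, 0.99, size=2000)
scores = [nse(model(precip, k)[cal], obs[cal]) for k in candidates]
best_k = candidates[int(np.argmax(scores))]

print(f"best k = {best_k:.3f}")
print(f"validation NSE = {nse(model(precip, best_k)[val], obs[val]):.3f}")
```

The key discipline is that the validation slice never enters the search: a high validation score then tells you the calibrated parameters generalize, not merely that the optimizer overfit the calibration window.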

A note concerns the word "validation". This is the term commonly used, but it does not imply any ontological claim about the nature of the truth described by the model, only a practical one about the model's reliability in predicting a certain sequence of numbers.
The calibration/validation procedure can be implemented for a single variable (in our case, usually the flow in a section of the hydrographic network) or for several variables, for example snow cover, soil water content, or evapotranspiration, if these measurements are available. These latter measures, however, have a different character from discharge: while discharge is an aggregate variable, resulting from the concentration at a single point of the water fallen on the watershed area, the others remain spatially distributed variables before being aggregated for the purposes of the watershed budget. Therefore, the methods for determining the goodness of reproduction of the measured data follow more articulated, if not more complex, paths. The good thing is that GEOframe allows you to calibrate the various quantities separately, as each of them is modeled by different components that can be used separately from the overall model. So far, this use case requires quite a lot of manual intervention and could be made more automatic.

In any case, if the target variables are more than one, we speak of multi-objective calibration, while if there are variables measured at multiple sites, we speak of multi-site calibration [insert citation].

I would further like to suggest an enhancement to our analysis: moving from the daily to the hourly time scale. This is particularly crucial for understanding processes within smaller watersheds, approximately on a 1 km^2 scale, where many significant phenomena exhibit sub-daily dynamics.

Simulation/Analysis/ECP

The validation phase is already a simulation stage (with predetermined parameters) and represents the normal completion of operations in a production phase. This production phase is usually understood in the hydrological literature as hindcasting, that is, as functional to the understanding of past events for which an explanation is sought in a quantitative framework. It involves the use of more accurate analyses and indicators than those used in the calibration/validation phase, which must run quickly. One of these is the analysis through empirical conditional distributions (ECPs), as illustrated in Azimi et al., 2023. These analyses can eventually lead to a rethinking of the setup and calibration/validation phases in order to obtain more accurate results. As shown in Azimi et al. (2023, 2024), ECPs can also be used as bias correctors and improve the overall statistical performance of the model's results, at least if the model shows a certain stationarity of temporal behavior, that is, if, for example, the effects attributable to global warming do not significantly alter the structure of the model (including its parameters). The determination of the "reliability" of the models is then a key concept in the development of digital twins of the hydrological system (Rigon et al., 2022).
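
The bias-correction idea can be conveyed with a much simpler cousin of the ECP approach: quantile mapping, which maps simulated values onto the observed distribution. The ECP method of Azimi et al. is considerably more refined, since it works with distributions conditional on the simulated value; the sketch below, with synthetic data, shows only the general flavor of statistical post-hoc correction.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "observed" and biased "simulated" discharge (m^3/s).
obs = rng.gamma(2.0, 50.0, size=1000)
sim = 0.7 * obs + rng.normal(0.0, 5.0, size=1000)   # model underestimates

def quantile_map(values, sim_ref, obs_ref, n_q=100):
    """Map simulated values onto the observed distribution, quantile by
    quantile. A much simpler cousin of the ECP approach, shown only to
    convey the flavor of statistical bias correction."""
    q = np.linspace(0.0, 1.0, n_q)
    return np.interp(values, np.quantile(sim_ref, q), np.quantile(obs_ref, q))

corrected = quantile_map(sim, sim, obs)
print(f"bias before: {np.mean(sim - obs):8.2f}")
print(f"bias after:  {np.mean(corrected - obs):8.2f}")
```

As noted above, this kind of correction presumes stationarity: the mapping learned on past data is only trustworthy if the model-error structure does not drift, for instance under a changing climate.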

Another matter, much less frequented by hydrologists, is the forecasting of future events. These future events, obviously, have to do with the time series input to hydrological models and therefore require forecasts of precipitation, temperature, wind speed, and air humidity. It is known that the meteorological system (global and local) suffers from a lack of predictability due to deterministic chaos effects [insert citation]. To date, weather predictions are reliable, with respect to weather categories, for a few days; they are able to predict atmospheric temperatures, but they are still very imprecise in determining the amount of precipitation. In essence, they can be used to predict the hydrological future, but with questionable quantitative value. The theoretical reason for this debacle has already been mentioned, but there are others, for example the heterogeneity of ground conditions and the absence of a description of soil-atmosphere feedbacks, both conditions not described in meteorological models. Hydrological forecasts can therefore only be of a statistical nature and produce scenarios [insert citation], which are not devoid of meaning and practical value. In this area between hydrology and meteorology, the search for common ground is mandatory for any evolution. In GEOframe, however, the input data treatment/modelling is quite well separated from the hydrological computation, and any new source of data can be easily (though not without person-months of work) included.

Distribution of results and participatory science

A fundamental aspect, already widely discussed in Rigon et al., 2022, is to understand how the results of a model can be shared with various users, but also how the model and its setup (including a very expensive phase of DEM analysis, weather data interpolation, and calibration/validation) can be shared, saving other researchers time. GEOframe is built in such a way that this is possible (shareability is built into the design of the informatics), and some experiences have already been made in this sense: some within the Trento working group, others with external research groups from the University of Milan (whose work is to be incorporated) and the Polytechnic of Turin, where the basic data and models already pre-digested by the University of Trento served for further developments and analysis on some parts of the Po basin already processed.
The question on how to preserve, make use of multiple contributions to code, data, simulation configurations and simulations, is still open though.
It should be clarified that the GEOframe system is not only a set of data and models, but also a library of analysis tools, especially developed through Python notebooks and often documented through a series of slides and video lessons [add the links here] and schools []. Although this system can be improved and automated, it has allowed the group from the Polytechnic of Turin to dramatically shorten the modeling times for a series of basins in Piedmont, and will allow (for the moment, at the planning stage) the sharing of the setup and analysis of the Lombard area of the large Alpine lakes. Other analyses, developed in parallel on areas such as Friuli by the University of Udine, can easily be inserted into a possible national system covering all of Italy, even though they were developed separately.
From the informatics point of view, organizing all of this information through appropriate repositories will be mandatory in the future for an efficient use of resources.


The GEOframe-Po project is more than just a collection of models; it envisions a comprehensive system that encompasses a variety of input and output datasets, model configurations, and the flexibility to operate on diverse platforms such as laptops, servers, and the cloud (leveraging the OMS/CSIP platform). The interfaces, as evidenced by the available self-instruction materials, can range from simple text-based designs to more sophisticated visual tools, including augmented reality devices.
The system is designed for continuous improvement and customization, with the ability to implement changes with minimal overhead. This was a strategic requirement pursued at various levels of the information technology aspect of the project [insert citations]. The current models can be broadly categorized as physically based, with the majority of the implementation comprising what is referred to in literature as "lumped" models. However, the system is designed to accommodate extensions to more distributed models, a possibility that has already been partially realized in some research lines of the Trento group.
The integration of machine learning techniques into the system is also possible [insert citation], even though they have not been utilized to date. The design of the GEOframe-Po project, therefore, represents a flexible, adaptable, and forward-thinking approach to modeling and data analysis.

Saturday, September 30, 2023

Constraints to transpiration in a simple (but not too simple) model of transpiration

In our collaborative work with Concetta D'Amato for the WATERSTEM project, we encountered the initial constraint on transpiration imposed by the hydraulic conductance of the stem-root system. Through our research, inspired by Manzoni et al. [2013], we discovered that the sigmoidal form of conductivity leads to an optimum for transpiration. We attempted to reproduce this phenomenon using the data provided by Kröber et al. [2014]. After considerable effort, we successfully generated the gray curve in the Figure, which exhibits a peak just before -4 MPa and allows unrealistically high transpiration.

However, we realized that the soil resistance was missing from our analysis. To address this, we incorporated the conductivity of a Silt Loam soil using the van Genuchten-Mualem parameterization. The resulting brownish curves serve as evidence that the soil plays a crucial role, as anticipated by Carminati and Javaux [2020]. It is important to note that these curves depict the limits imposed by the soil and stem, which determine the potential sapflow rates, but do not reflect the constraints imposed by plant physiology. To account for plant physiology, we introduced the stomatal resistance, represented by three dashed curves corresponding to different working hypotheses, with parameterizations taken from Daly et al. [2004]. The red points in the Figure represent the plant's working points (although the coupling with the atmospheric boundary layer is not depicted). One notable aspect of the Figure is that at typical soil suctions the sapflow curves appear relatively flat, and the working points result in relatively constant sapflow despite variations in xylem/leaf pressure. The complete story will soon be available in Concetta's Ph.D. thesis, and the detailed process of creating the Figure can be found in its supplemental material notebooks.
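
For readers who want to reproduce the soil limb of the Figure, the van Genuchten-Mualem conductivity curve can be coded in a few lines. The Silt Loam parameters below are rough textbook values, not the ones used in Concetta's notebooks, and suction is expressed here as head in meters rather than MPa.

```python
import numpy as np

def vgm_conductivity(psi, k_sat=0.125, alpha=2.0, n=1.41):
    """Van Genuchten-Mualem unsaturated hydraulic conductivity K(psi).

    psi: suction head (m, positive in unsaturated conditions).
    k_sat (m/day), alpha (1/m), n: rough textbook Silt Loam values,
    used here only for illustration.
    """
    psi = np.asarray(psi, float)
    m = 1.0 - 1.0 / n
    se = (1.0 + (alpha * psi) ** n) ** (-m)      # effective saturation
    return k_sat * np.sqrt(se) * (1.0 - (1.0 - se ** (1.0 / m)) ** m) ** 2

# Conductivity drops by orders of magnitude as the soil dries.
for psi in (0.01, 0.1, 1.0, 10.0):
    print(f"psi = {psi:5.2f} m  ->  K = {vgm_conductivity(psi):.2e} m/day")
```

It is exactly this steep drop of K with suction that makes the soil, rather than the xylem, the binding constraint at high suctions, as the brownish curves in the Figure show.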


Carminati, Andrea, and Mathieu Javaux. 2020. “Soil Rather Than Xylem Vulnerability Controls Stomatal Response to Drought.” Trends in Plant Science 25 (9): 868–80.

Daly, Edoardo, Amilcare Porporato, and Ignacio Rodriguez-Iturbe. 2004. “Coupled Dynamics of Photosynthesis, Transpiration, and Soil Water Balance. Part I: Upscaling from Hourly to Daily Level.” Journal of Hydrometeorology 5 (3): 546–58.

Kröber, Wenzel, Shouren Zhang, Merten Ehmig, and Helge Bruelheide. 2014. “Linking Xylem Hydraulic Conductivity and Vulnerability to the Leaf Economics Spectrum—A Cross-Species Study of 39 Evergreen and Deciduous Broadleaved Subtropical Tree Species.” PloS One 9 (11): e109211.

Manzoni, Stefano, Giulia Vico, Gabriel Katul, Sari Palmroth, Robert B. Jackson, and Amilcare Porporato. 2013. “Hydraulic Limits on Maximum Plant Transpiration and the Emergence of the Safety-Efficiency Trade-Off.” The New Phytologist 198 (1): 169–78.

Wednesday, September 6, 2023

A Fermi-like estimation of water fluxes in a plant (to check some consistencies)

A Fermi problem (or order-of-magnitude problem, order-of-magnitude estimate, order estimation) is an estimation problem designed to teach dimensional analysis or approximation, and such a problem is usually a back-of-the-envelope calculation (cit. Wikipedia).

Let's assume that a plant transpires 1 cm per day (just to exaggerate) per unit of area. Suppose this plant canopy covers an area of 100 m^2. The transpired volume in one day is ET = 0.01 * 100 = 1 m^3 (which is a lot: plants are reported to transpire "hundreds of liters", not cubic meters).

Now let's consider the specific hydraulic conductivity KS in Kg m^{-1} s^{-1} MPa^{-1}. According to Kröber et al. (2014) and their database, the maximum hydraulic conductivity of Castanea henryi is approximately (simplifying the numbers) 10 Kg m^{-1} s^{-1} MPa^{-1}. Skipping some details, the maximum sap flow, E_S, derived from this is of the same order of magnitude, expressed in Kg m^{-1} s^{-1} (hint: you need to calculate K(\psi) \psi, with K varying with \psi, and \psi being the pressure (in MPa) in the xylem, as in Manzoni et al., 2013).

To compare E_S and ET, I need to multiply E_S by the active trunk cross-sectional area CSA (according to Thurner) and divide it by the plant height (10 m) to account for the gradient. Then, I need to convert from Kg per second to Kg per day (multiplying by 10^5) and divide by the density of water to obtain the result in terms of volume (10^3 kg/m^3). Therefore:

E_S = 10 [ES value] * 10^5 [Seconds in a Day] * CSA [Cross-sectional Area] / 10^4 [Plant Height * Water Density] = 100 CSA

From ES = ET, it follows that:

CSA = 0.01 m^2

which might not be an unreasonable value (plant physiologists will have to tell me). If the density measurement made by Kröber et al. actually refers to the entire branch/trunk they used, it could mean that, in a stem of 1 m^2 cross section, 1% contributes to the xylem flow. Unless I have forgotten a factor somewhere (which would be embarrassing, but I'll take the risk) or the measurements made by Kröber et al. need to be adjusted differently.
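
For whoever wants to redo the arithmetic, the whole estimate fits in a few lines of Python (same Fermi-rounded numbers as above):

```python
# Back-of-the-envelope check of the estimate above (orders of magnitude only).
ET_depth = 0.01        # transpired depth [m/day], deliberately exaggerated
canopy_area = 100.0    # canopy area [m^2]
ET = ET_depth * canopy_area   # transpired volume [m^3/day] -> 1 m^3/day

ES = 10.0              # max sap flux [Kg m-1 s-1], same order as KS (see hint above)
seconds_per_day = 1e5  # ~86400, rounded Fermi-style
height = 10.0          # plant height [m], to account for the gradient
rho_w = 1e3            # water density [kg/m^3], turns Kg into m^3

# E_S [m^3/day] = ES * seconds_per_day * CSA / (height * rho_w) = 100 * CSA
coefficient = ES * seconds_per_day / (height * rho_w)
CSA = ET / coefficient
print(CSA)  # 0.01 m^2, as in the text
```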

According to Lüttschwager's study, this value would imply a much higher specific hydraulic conductivity than the KS observed in the outermost regions of the trunk, where the flow is concentrated. Another consequence is that the species less conductive than this Chinese chestnut (38 out of 39 in the study) could sustain such evaporative demands only with much larger stems, which seems unreasonable, or with a large percentage of vessels.

I would like to ask if the numbers I presented seem correct and reasonable to you, and if there is anything blatantly wrong in my reasoning or deduction from Kröber's work (for those familiar with it) or elsewhere. Any comments are welcome.

P.S. - Most species in Kröber's study have a KS that is 10 times smaller, which would require a CSA 10 times larger for the same evaporative demand.


Manzoni, Stefano, Giulia Vico, Gabriel Katul, Sari Palmroth, Robert B. Jackson, and Amilcare Porporato. 2013. “Hydraulic Limits on Maximum Plant Transpiration and the Emergence of the Safety-Efficiency Trade-Off.” The New Phytologist 198 (1): 169–78.

Kröber, Wenzel, Shouren Zhang, Merten Ehmig, and Helge Bruelheide. 2014. “Linking Xylem Hydraulic Conductivity and Vulnerability to the Leaf Economics Spectrum—A Cross-Species Study of 39 Evergreen and Deciduous Broadleaved Subtropical Tree Species.” PloS One 9 (11): e109211.

Lüttschwager, Dietmar, and Rainer Remus. 2007. “Radial Distribution of Sap Flux Density in Trunks of a Mature Beech Stand.” Annals of Forest Science 64 (4): 431–38.

Thurner, Martin, Christian Beer, Thomas Crowther, Daniel Falster, Stefano Manzoni, Anatoly Prokushkin, and Ernst-Detlef Schulze. 2019. “Sapwood Biomass Carbon in Northern Boreal and Temperate Forests.” Global Ecology and Biogeography: A Journal of Macroecology 28 (5): 640–60.

Thursday, August 31, 2023

Some papers that discuss tree allometry to obtain biomass and sapwood cross sectional area

In order to accurately determine the water budget of trees (see also yesterday's post), it is crucial to establish a connection between the quantity of sapwood and the transpiration rate from the leaves. One essential factor in this process is obtaining accurate measurements of the sapwood cross-sectional areas (CSA). However, it is important to note that these CSA measurements can vary significantly from one plant to another. Acquiring this data can be challenging, and as a result, researchers have conducted studies aiming to establish allometric relationships as a means to estimate these measurements. To assist me in finding relevant literature on this topic, I reached out to my colleague involved in the WATERSTEM project. Below, you will find the literature they recommended.


Berry, Z. Carter, Nathaniel Looker, Friso Holwerda, León Rodrigo Gómez Aguilar, Perla Ortiz Colin, Teresa González Martínez, and Heidi Asbjornsen. 2018. “Why Size Matters: The Interactive Influences of Tree Diameter Distribution and Sap Flow Parameters on Upscaled Transpiration.” Tree Physiology 38 (2): 263–75.

Kubota, Mitsumasa, John Tenhunen, Reiner Zimmermann, Markus Schmidt, Samuel Adiku, and Yoshitaka Kakubari. n.d. “Influences of Environmental Factors on the Radial Profile of Sap Flux Density in Fagus Crenata Growing at Different Elevations in the Naeba.”

Lüttschwager, Dietmar, and Hubert Jochheim. 2020. “Drought Primarily Reduces Canopy Transpiration of Exposed Beech Trees and Decreases the Share of Water Uptake from Deeper Soil Layers.” Forests, Trees and Livelihoods 11 (5): 537.

Lüttschwager, Dietmar, and Rainer Remus. 2007. “Radial Distribution of Sap Flux Density in Trunks of a Mature Beech Stand.” Annals of Forest Science 64 (4): 431–38.

Niccoli, Francesco, Arturo Pacheco-Solana, Sylvain Delzon, Jerzy Piotr Kabala, Shahla Asgharinia, Simona Castaldi, Riccardo Valentini, and Giovanna Battipaglia. 2023. “Effects of Wildfire on Growth, Transpiration and Hydraulic Properties of Pinus Pinaster Aiton Forest.” Dendrochronologia 79 (126086): 126086.

Petrík, Peter, Ina Zavadilová, Ladislav Šigut, Natalia Kowalska, Anja Petek-Petrik, Justyna Szatniewska, Georg Jocher, and Marian Pavelka. 2022. “Impact of Environmental Conditions and Seasonality on Ecosystem Transpiration and Evapotranspiration Partitioning (T/ET Ratio) of Pure European Beech Forest.” WATER 14 (19): 3015.

Thurner, Martin, Christian Beer, Thomas Crowther, Daniel Falster, Stefano Manzoni, Anatoly Prokushkin, and Ernst-Detlef Schulze. 2019. “Sapwood Biomass Carbon in Northern Boreal and Temperate Forests.” Global Ecology and Biogeography: A Journal of Macroecology 28 (5): 640–60.

Wednesday, August 30, 2023

A Rosetta stone for connecting the various forms of the Darcy-Buckingham law used in Hydrology and Plant Physiology

The information presented here is derived from the study conducted by Carminati and Javaux in 2020, which aimed to provide insights into plant hydraulics. Carminati referred to the work of Kröber et al. in 2014, who conducted extensive measurements on a variety of plants and reported their data. However, a discrepancy arises between hydrologists and plant physiologists in the units used to measure hydraulic conductivity. While hydrologists measure it in meters per second (m/s), plant physiologists measure it in kilograms per meter per second per megapascal [Kg/(m s MPa)].

In their study, Kroeber et al. reported conductivity per unit area, denoted as Kk, measured in kilograms per meter per Pascal second [Kg/(m Pa s)]. This unit might seem unfamiliar or obscure. To bridge the gap between my background and the new papers, Carminati and Javaux provide a clue. They suggest that the relationship between Kk and the commonly used hydraulic conductivity, K_w, expressed in centimeters per day (cm/day), can be established using the enigmatic equation K_w = g * 100 * 10^(-6) * 3600 * 24 * Kk. Now, the question arises: Is 'g' referring to the acceleration due to gravity?

So I dedicated a couple of days of my life to building a Rosetta Stone to translate the units and check the coherence of what was done. The result is a short paper by me and Concetta D'Amato that you can find here.

To obtain this, I had to walk through the valley of water potentials expressed in different units, but this too can be interesting for the reader.

The next step is to understand the value of the cross section through which the water flows, in order to obtain, in the end, real cubic meters per second or kilograms per second.


Carminati, Andrea, and Mathieu Javaux. 2020. “Soil Rather Than Xylem Vulnerability Controls Stomatal Response to Drought.” Trends in Plant Science 25 (9): 868–80.

Kröber, Wenzel, Shouren Zhang, Merten Ehmig, and Helge Bruelheide. 2014. “Linking Xylem Hydraulic Conductivity and Vulnerability to the Leaf Economics Spectrum—A Cross-Species Study of 39 Evergreen and Deciduous Broadleaved Subtropical Tree Species.” PloS One 9 (11): e109211.

Monday, July 31, 2023

Some observations about long rainfall and the generated discharges

In well-known hydrologic response theories like the IUH, it has been established that, for a specific catchment and a constant rainfall, there exists a 'critical rainfall duration' resulting in the maximum discharge for that catchment; this duration is usually known as the concentration time.

The next step is to associate a return period with the constant rainfall. This allows us to demonstrate that, given a precipitation with an assigned return period, there is a critical rainfall duration that yields the highest possible discharge in that river section. This is what was accomplished in Rigon et al., 2011 (but the research dates back to the early 2000s, which is another interesting story). BTW, in the paper we also showed that this time is less than or equal to the concentration time.

The above argument may lead to the misconception that the “maximum discharge” for the catchment cannot be exceeded (keep in mind that the concept of maximum discharge obtainable is incomplete when you do not mention a return period).  Consider doubling the duration of the rainfall while keeping the intensity fixed. The first impulse results in the highest discharge with the assigned return period. Yet, it also has a discharge tail that, depending on the catchment's features, can last quite long. When the second impulse of precipitation arrives with the same intensity, it adds to the recession of the first impulse, usually increasing the discharge beyond the maximum discharge obtained with a single impulse.

In certain cases, like the kinematic hydrograph model (uniform IUH), the rise of the new impulse discharge may precisely compensate for the decreasing recession of the older impulse, resulting in a constant discharge. However, this is not the general scenario, as simple calculations can show, and sticking with this idea can be erroneous. In fact, typically, and especially when there is a marked contrast between the response times of the surface and subsurface storm flow waves, the recession discharge generated by the first impulse decreases more slowly than the new impulse discharge rises, effectively acting as additional rainfall. This effect is equivalent to increasing the intensity of the effective rainfall to a return period that can be estimated through inverse modelling. In other words, two subsequent rainfall impulses, each with an assigned return period, are equivalent to a precipitation event with a higher return period. While the IUH theory establishes a precise equality between the return period of rainfall and that of discharge for a single impulse, the two return periods become decoupled when multiple rainfall impulses occur.
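
The superposition effect can be verified numerically. Here is a toy sketch with a linear-reservoir (exponential) IUH, an illustrative choice and not the geomorphic IUH of the paper: doubling the duration of the same constant rainfall raises the peak above the single-impulse maximum.

```python
import numpy as np

dt = 0.1                      # time step [h]
t = np.arange(0.0, 100.0, dt)
k = 5.0                       # reservoir residence time [h], illustrative value
iuh = np.exp(-t / k) / k      # exponential IUH with unit area

def peak_discharge(duration, intensity=1.0):
    """Peak of the hydrograph produced by a constant rainfall of given duration [h]."""
    rain = np.where(t < duration, intensity, 0.0)
    q = np.convolve(rain, iuh)[: t.size] * dt   # discrete convolution integral
    return q.max()

single = peak_discharge(10.0)   # one impulse
double = peak_discharge(20.0)   # same intensity, doubled duration
print(single < double)          # True: the recession of the first impulse adds up
```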

Although real-world precipitation is not constant and uniform, and the response of the catchment may not be time-invariant, the main qualitative findings described above remain statistically valid and could be tested by generating ensembles of time-variable precipitation with numerical models. Besides, there are additional factors, like sediment and vegetation transport, that can add volume to the water (see for instance these posts), increasing the return period of discharge more than linearly with increasing rainfall intensities.


Rigon, R., P. D’Odorico, and G. Bertoldi. 2011. “The Geomorphic Structure of the Runoff Peak.” Hydrology and Earth System Sciences 15 (6): 1853–63.

Tuesday, July 4, 2023

C3A Six Years Plan

C3A is the Center for Agriculture (Agricoltura), Food (Alimenti), Environment (Ambiente) of the University of Trento (UniTrento). As you can see in the brief history in the document below, it was established six years ago, together with the Edmund Mach Foundation (FEM), to increase the involvement of the Province of Trento in higher education and research in the agrifood (and environmental) field.
The collaboration was actually not easy but at the same time fruitful, and in recent years it underwent changes that led to a new agreement between FEM and UniTrento. This agreement was the basis for the new six-year plan of the Center (due according to the regulations of the University of Trento), which you can find below. It designs the research and educational activities for the next six years.

The plan can be found by clicking on the above Figure. Here you can also find the slides I presented to the board of the University to introduce the Center and its plans.

Thursday, June 22, 2023

A Ph.D. position on Po River, DARTHs, Earth Observations

I have an open Ph.D. position which closes on July 6: - Evolution of the system GEOframe/OMS3/CSIP for the building of a Digital Twin of the Hydrology of the river Po - E66E23000170001

It looks like it is very dedicated to informatics (see also here), but let me say that the candidate should write their project with a broader view, although it must remain within the scope of what we are doing in the context of the Po River basin project and related to the exploitation of satellite data to support hydrological modeling. The project that funds it, besides PNRR, is 4DHydro, which gathers some of the best hydrological modellers in Europe (and you can find inspiration in the project's goals).

The general focus of the study is droughts, and the project can contain more computer-science-related parts, more conceptual parts, or more applied parts. The themes related to the processes are: snow, plant transpiration, and crop needs. The enabling technology is precisely the systematic use of Earth observation, and the concept paper for the whole system is the one about DARTHs. Further information on DARTHs can be found here.

If, at this point, you are a little convinced to apply, also consider the philosophy of our group, which you can find in a sequence of posts here and in the links therein.

Our group is a crew of international fellows: 2 Indians, 1 Pakistani, 1 French, 1 Iranian, 1 Algerian and 8 Italians, including two professors, one researcher (at Eurac), two postdocs, and nine Ph.D. students already.

Saturday, June 10, 2023

Transit time, Residence time, Response time, Life expectancy

 Just to help someone, a few definitions:

  • Travel Time (a.k.a. Transit Time), $T$: it is the time a parcel of water stays inside a control volume. If $t_{in}$ is the time it entered the control volume and $t_{ex}$ the time it exits, then $T = t_{ex} - t_{in}$. For an observer placed at the outlet(s) of the control volume, since their actual (clock) time coincides with $t_{ex}$, i.e. $t_{ex} = t$, it is $T = t - t_{in}$. The actual variable in this definition is $t_{in}$.

  • Residence time is $T_R = t - t_{in}$
  • Life expectancy is $T_L = t_{ex} - t$; so, it is also $T = T_R + T_L$
  • Response time is $R = t_{ex} - t_{in}$, but restricted to all the parcels injected at the same $t_{in}$ and estimated at $t = t_{in}$; it is their life expectancy at $t = t_{in}$. The actual variable here is $t_{ex}$.
All of these definitions are given in a statistical sense, meaning that they are stochastic variables described through their distributions. Looking at the above definitions, if we do not read them very carefully, it looks like transit time and response time are the same thing, and for decades we thought they were. Instead they are not, since the transit time distribution is conditional on the clock time, while the response time distribution is conditional on the injection time. The first is a distribution in $t_{in}$, the second in $t_{ex}$.
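
A one-parcel numerical check of the identity above (arbitrary numbers):

```python
# Toy check of the definitions for a single parcel of water (arbitrary units).
t_in, t_ex = 2.0, 10.0   # injection and exit clock times
t = 6.0                  # current clock time, with t_in <= t <= t_ex

T = t_ex - t_in          # travel (transit) time
T_R = t - t_in           # residence time: the age of the parcel at clock time t
T_L = t_ex - t           # life expectancy: time the parcel still has to spend inside

assert T == T_R + T_L    # the identity T = T_R + T_L
print(T, T_R, T_L)       # 8.0 4.0 4.0
```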

To learn more, you can get further details at: 

Rigon, Riccardo, Marialaura Bancheri, and Timothy R. Green. 2016. “Age-Ranked Hydrological Budgets and a Travel Time Description of Catchment Hydrology.” Hydrology and Earth System Sciences Discussions, May, 1–22. (This paper has a little weird description of life expectancy though)

Rigon, Riccardo, and Marialaura Bancheri. 2021. “On the Relations between the Hydrological Dynamical Systems of Water Budget, Travel Time, Response Time and Tracer Concentrations.” Hydrological Processes 35 (1). (and especially give a look at the supplemental material)

See also a concise unpublished summary paper available here.

Video lectures on the topic can be found in this blog. The most recent ones are:

Travel Time, Residence Time and Response Time
Here below we started a little series of lectures about a statistical way of seeing water movements in catchments. This view has a long history but recently reached closure with the work of Rinaldo, Botter and coworkers. Here an alternative view of their concepts is presented. Some passages could be of some difficulty, but the gain in understanding the processes of flux formation at the catchment scale is, in my view, of great value and deserves some effort. The way of thinking is the following: a) the overall catchment fluxes are the sum of the movements of many small water volumes (molecules); b) the movement of these molecules can be seen through three distributions: the travel time distribution, the residence time distribution and the response time distribution; c) the relationships between these distributions are revealed; d) the relation of these distributions with the treatment of catchments through ordinary differential equations is obtained through the definition of age-ranked distributions; e) the theory thus developed is a generalization of the unit hydrograph theory.

Monday, May 15, 2023

A talk on how plants eat the Sun

Tomorrow, for the "23rd International Day of Light", I am giving a talk about how, through photosynthesis, plants take water and carbon dioxide to produce oxygen and carbohydrates (lignin and other stuff). The title was kind of suggested by a book by Oliver Morton with about the same name.

Slides and talk are in Italian, but I will soon provide an English version. By clicking on the figure, you can find the presentation PDF.

Friday, May 12, 2023

Attacking the issue of hydrological scales within GEOframe

How to face the scale problems in GEOframe? Daniele Andreis is giving his answer by using the capabilities of the GEOframe system. His approach is simple: taking a catchment and further subdividing its HRUs (with an appropriate software component) into smaller HRUs, you can run a GEOframe modelling solution on the larger catchment (if required) with a specific subcatchment refined, and then compare the model parameters of the refined solution with those of the coarser one. (That's actually not really done yet ;-)).

What you see in this poster is the preliminary work, where you can notice a couple of nice features: the time series of the monthly, daily and hourly water budget, which is unusual to find, and the comparison of a neutron probe signal with the simulated root zone water content. The signals are clearly correlated, and this can be thought of as an indirect assessment of the chosen GEOframe modeling solution; let's hope we can actually calibrate the root zone parameters somewhat with the neutron probe. By clicking on the above figure, you can find a larger version of the poster.

Monday, April 24, 2023

For inquiring students - II (and what Ph.D. education is about)

"My name is John, and I am an M.Sc. student of water engineering at the Beautiful University of Technology. My M.Sc. thesis subject is a groundbreaking one - 'Application of WRF Regional Model for Rainfall Prediction (Case Study: Northwest of Antarctica).' Under the guidance of Prof. Best Hydrologist, I have delved into the impact of predicted precipitation by the WRF model on flood forecasting in the Hec-Hms (or SWAT) model. My expertise and keen interest in climate change impacts on hydroclimatic extremes, numerical weather prediction, hydrology and water resource management, hydraulic and flood forecasting make me a force to be reckoned with in the field. "

I have already written on the topic of students inquiring about the possibility of doing a Ph.D. with me. You can find previous notes here. However, I would still add something.

It's heartening to see that many schools are taking the direction of training students to run WRF, Hec-HMS, SWAT, R, Python and other tools. However, mastering tools is just one aspect of research. To truly investigate nature, one must understand where problems are and be able to modify paradigms and tools to solve them appropriately. 

As Richard Feynman said, 'The problem is not people being uneducated. The problem is that people are educated just enough to believe what they have been taught, and not educated enough to question anything from what they have been taught.' 

Becoming a solid Ph.D. applicant requires making this switch and adding a few phrases to your CV that show you have understood Feynman's point. Besides, you should be inquisitive and show some fire inside for discovering new things and achieving results beyond the state of the art (and, for what regards working with me, love computer programming). A Ph.D. is not just a status symbol or a way to gain a good salary - it's a calling for those who are truly passionate about pushing the boundaries of knowledge (see this nice infographic from an old post).

P.S. -  I also wrote this "Essential for Hydrologists" that could further help. 

Tuesday, April 18, 2023

Using ChatGPT 4 for Learning Java

 Currently, there's a lot of hype surrounding ChatGPT v4. I was curious to explore its capabilities, so I used OpenAccessGPT to bypass current Italian law and access the BOT.

I tested ChatGPT v4 by asking it to search for literature on a specific topic - the stomata action under the effect of decreasing tension in xylem. While the results were interesting, they weren't necessarily the best. In fact, when I tested the BOT on more specialized topics, its performance seemed immature.

To improve my experience, I decided to seek help in programming and discovered that ChatGPT v4 could be a valuable resource. I soon realized that my idea wasn't entirely original, as there are many resources available online, such as this YouTube video. Perhaps ChatGPT v4 could have even provided me with this information.

This is what ChatGPT says on the topic:

"ChatGPT can be a valuable resource for learning Java in several ways:

  1. Asking Questions: You can ask ChatGPT any question you have about Java, such as how to use a specific feature or how to solve a particular problem. ChatGPT can provide you with a detailed explanation or point you in the right direction to find the information you need.
  2. Practice Exercises: ChatGPT can provide you with practice exercises to help you improve your Java skills. These exercises can range from simple coding challenges to more complex projects.
  3. Code Examples: ChatGPT can provide you with Java code examples that illustrate how to use various features and solve different problems. These examples can help you understand Java concepts more easily.
  4. Tutorials and Resources: ChatGPT can recommend Java tutorials and resources that can help you learn more about the language. These resources can include online courses, books, and websites.

Overall, ChatGPT can be a valuable tool for anyone looking to learn Java. It can provide you with personalized assistance, practice exercises, and resources to help you improve your skills and become a proficient Java programmer. "

Browsing the YouTube class, I realized that the approach to Java was good but not the way I desired. So I asked: "Please help me in learning Java but not following the traditional way. I would like that you introduce me early on Java design patterns and topics related to numerical computation. I need also you to introduce me to the use of generics. I would say introduce me to modern Java". Here below is the result:

Sure, I can design a ten-hour class on the topics of Java design patterns, numerical computation, and generics. Here's a rough outline:

Day 1 (2 hours):
1. Introduction to Java design patterns
2. Singleton pattern
3. Factory pattern
4. Observer pattern
5. Hands-on exercise

Day 2 (2 hours):
1. Introduction to numerical computation in Java
2. Overview of Apache Commons Math and JScience libraries
3. Basic mathematical operations (addition, subtraction, multiplication, division)
4. Hands-on exercise

Day 3 (2 hours):
1. Statistical analysis using Apache Commons Math
2. Descriptive statistics (mean, median, mode, variance, standard deviation)
3. Hypothesis testing (t-tests, ANOVA)
4. Hands-on exercise

Day 4 (2 hours):
1. Linear algebra using JScience
2. Matrices and vectors
3. Matrix operations (addition, subtraction, multiplication)
4. Eigenvalues and eigenvectors
5. Hands-on exercise

Day 5 (2 hours):
1. Introduction to generics in Java
2. Defining and using generic classes
3. Defining and using generic interfaces
4. Defining and using generic methods
5. Hands-on exercise

Each day will consist of a mixture of lecture-style teaching and hands-on exercises, to ensure that students have a chance to practice what they've learned. The hands-on exercises will involve writing code and solving problems related to the topics covered in class.

By the end of the ten-hour class, students should have a good understanding of Java design patterns, numerical computation, and generics, and be able to apply these concepts to real-world programming problems.

Not sure that I am really satisfied, but my further experiments with more detailed questions went better. Therefore, up to a point, the user also has to learn how to interact with the BOT. When you move to more detailed questions, the answers are less generic. For instance, I asked:

Can do you do examples on how define and using generics classes in Java ?

The answer was:

public class Box<T> {
    private T contents;

    public void setContents(T contents) {
        this.contents = contents;
    }

    public T getContents() {
        return contents;
    }
}

This answer is correct but a little obscure to a newbie. So I refined my request: "How can I use the previous class in a main()?" And I got:

public class Main {
    public static void main(String[] args) {
        // Create a Box that holds strings
        Box<String> stringBox = new Box<>();

        // Set the contents of the box to a string
        stringBox.setContents("Hello, world!");

        // Get the contents of the box and print it to the console
        String contents = stringBox.getContents();
        System.out.println(contents);
    }
}

with the comment:

"In this example, we create a Box that holds strings using the syntax Box<String>. We then set the contents of the box to the string "Hello, world!" using the setContents method, and retrieve the contents using the getContents method. Finally, we print the contents to the console using System.out.println".

Overall (there were also some other examples), this was quite satisfying. Next step: making ChatGPT learn about our own stuff and become able to support, for instance, people in using GEOframe.

Tuesday, March 21, 2023

2023 Stockholm Water Prize went to Andrea Rinaldo

As every year, on the first day of Spring and the day before World Water Day, the Stockholm Water Prize was announced. The Stockholm Water Prize is often described as the Nobel Prize of water and is characterized by a similar selection process. Since 1991, the Stockholm Water Prize has been awarded to people and organizations for extraordinary water-related achievements. Its official site is SIWI. Last year the Prize was given to Wilfried Brutsaert.

This year the prize was assigned to Andrea Rinaldo, the first Italian to receive it, as a recognition of his achievements and of his role for the Italian and international community of hydrologists. Professor in Padova and at the Ecole Polytechnique in Lausanne, he was one of the founders and head of the Department of Civil and Environmental Engineering of the University of Trento and of its degrees in Environmental Engineering. The official announcement was shared on SIWI's YouTube channel on Tuesday 21 March, at 3 pm CET. The SIWI website contains further information.

Andrea Rinaldo (center) in Mesiano - Trento - Department of Civil, Environmental and Mechanical Engineering

Andrea Rinaldo is one of the most eminent Italian hydrologists ever. The Italian community of hydrologists bloomed in the last twenty years and is now one of the most active and productive in the world. This was certainly due to the positive action that Andrea Rinaldo started in the late eighties of the last century to promote young researchers and put them in contact with some of the most outstanding scientists in the field, starting from Ignacio Rodriguez-Iturbe, Gedeon Dagan and Peter Eagleson (all of them have received the Stockholm Water Prize), Raphael Bras and others. He was among the very few who captured the attention of these greats with his creativity and dedication, involved them in research with Italians, and gifted many youngsters an incomparable education in hydrology.

He steered the Italian water community to pay attention to the important topics of water management, hydrology and geomorphology. Some of the younger Italian scientists may not be aware of it, but when some good research is published by Italians, there is a good chance that at least some drops of it have a seed in Rinaldo's academic and scientific work.

Andrea Rinaldo himself is an extraordinary researcher. He gave fundamental contributions to understanding the shape of river networks, and his book with Ignacio Rodriguez-Iturbe is a long-standing milestone on the subject. The theory of optimal channel networks not only gives an explanation of how river networks evolve and behave, but also has raised the interest in hydrology of scientists working on related disciplines, in particular of those working on transportation networks, making hydrology an epitome of many other physical phenomena.

He contemporarily worked on the theory of the Instantaneous Unit Hydrograph, where his contributions are among the most cited. More recently he renewed the topic with a couple of papers that completely changed the view on travel times, residence times and related issues. Those contributions are definitive in clarifying the subject, and they close ninety years of research history with a neat theory. At the same time, those contributions are the seeds of the current studies with tracers and isotopes that are going to push greatly forward our knowledge of water movements in hillslopes and vegetation.

Together with Rodriguez-Iturbe and younger fellows, he was the architect of many eco-hydrological studies since the year 2000 that brought new insights into soil-vegetation-atmosphere interactions and inspired new directions of research.

While continuing to frequent the vast collection of research topics that he manages masterfully, and where there are many "minor" contributions that alone could be the reward of a career for others, he worked intensively in the last twenty years to understand the mechanics of the spreading of waterborne diseases and, in general, of the spreading of populations. In this he actually joined together several of his favorite research topics: river networks, water, transportation issues, complexity, and theoretical, field and laboratory work. His recent book with Marino Gatto and Ignacio Rodriguez-Iturbe is a compendium of thought-provoking ideas and methods that will certainly last long in the libraries (virtual or not) of the researchers of the field.