Monday, November 30, 2020

A note on Wilting Point (and water stress in vegetation models)

The paper by Veihmeyer and Hendrickson seems definitive in its affirmation about both field capacity and the wilting point: that they exist and depend upon the soil characteristics. Because I know well that field capacity is just an interpretation of phenomena and not some characteristic written in Moses' tables, I suspect that the second claim, the one about the wilting point, is at most an approximation as well.

Gardner, in his interesting 1965 paper, in fact says the following: "Whatever the exact value of the pressure potential at wilting, the wilting phenomenon is a consequence of a change in the mechanical or elastic properties of the plant leaf and appears to be associated with a rather definite value of the relative water content and water potential for leaves of a given species and given age for a given cell solute content.”
This statement is much less assertive and leaves ample space for variation among plant types.
Actually I was interested in the wilting point because $\theta_w$ is used in traditional standard models to determine the water stress induced by droughts on plants. What I am really interested in, then, is not the “wilting” point but the point at which stomata close. With respect to this, the CLM model in fact acts in a different way:
  • It uses suction instead of water content (i.e. formula 2.34 of the manual, Oleson et al., 2013)
  • It gives a table according to which different vegetation types close stomata at different values of suction (Table 8.1)


The second fact implies that the water content at which a plant type closes its stomata varies with soil type (this water content is obtained by inverting the suction through the soil water retention functions), as the sketch below illustrates.
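As a minimal sketch of this inversion, assume the Clapp-Hornberger form of the retention curve used in CLM, $\psi = \psi_{sat}(\theta/\theta_{sat})^{-b}$; the soil parameter values below are illustrative, not taken from the CLM tables:

    def theta_at_suction(psi_c, psi_sat, theta_sat, b):
        # Invert psi = psi_sat * (theta/theta_sat)**(-b) for the water content
        # at the stomatal-closure suction psi_c (suction magnitudes in mm).
        return theta_sat * (psi_c / psi_sat) ** (-1.0 / b)

    psi_c = 255000.0  # ~ -25 bar expressed as mm of suction, order of magnitude only
    for name, psi_sat, theta_sat, b in [("sand-like", 121.0, 0.395, 4.05),
                                        ("clay-like", 405.0, 0.482, 11.4)]:
        print(name, round(theta_at_suction(psi_c, psi_sat, theta_sat, b), 3))

On these illustrative parameters, the closure water content differs by a factor of four or so between the two soils, which is exactly the point made above.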
Clearly the two views cannot both be true when applied to transpiration. Either one or the other must be correct (well, this is optimistic: probably they are both wrong).
The reading of the Gardner paper also suggests that the wilting point proper, i.e. the point at which plants wilt, depends upon the properties of plants' cells, which actually could have a certain homogeneity among the various species. The wilting point is therefore a topic that needs further browsing of the literature (but the issue related to stomatal closure, maybe not). Anyway, Gardner (1965) sets it between -12 and -15 bars, which is actually a quite wide range.

Since the argument has turned to the topic of stress, it is also worth remarking that isohydric and anisohydric plants behave differently. This should be reflected in the mathematical form of the stress function, which at present it is not.

.... Still mumbling .......

News !!!

Finally a very recent paper arrives. Brought to my attention by Nunzio Romano, Chagas Torres et al. (2021) sorts out the problem with very recent analysis techniques. It shows that the wilting point is effectively variable and depends both on soil type and on the plant. It also contains references to other recent literature I was not aware of before.

References

Chagas Torres, Lorena, Thomas Keller, Renato Paiva de Lima, Cássio Antônio Tormena, Herdjania Veras de Lima, and Neyde Fabíola Balazero Giarola. 2021. “Impacts of Soil Type and Crop Species on Permanent Wilting of Plants.” Geoderma 384 (February): 114798.

Oleson, Keith W., David M. Lawrence, Gordon B. Bonan, Beth Drewniak, Maoyi Huang, Charles D. Koven, Samuel Levis, Fang Li, William J. Riley, Zachary M. Subin, Sean C. Swenson, Peter E. Thornton, Anil Bozbiyik, Rosie Fisher, Colette L. Heald, Erik Kluzek, Jean-Francois Lamarque, Peter J. Lawrence, L. Ruby Leung, William Lipscomb, Stefan Muszala, Daniel M. Ricciuto, William Sacks, Ying Sun, Jinyun Tang, and Zong-Liang Yang. 2013. “Technical Description of Version 4.5 of the Community Land Model (CLM).” NCAR.

Gardner, W. R. 1965. “Dynamic Aspects of Soil-Water Availability to Plants.” Annual Review of Plant Physiology.

Veihmeyer, F. J., and A. H. Hendrickson. 1950. “Soil Moisture in Relation to Plant Growth.” Annual Review of Plant Physiology 1 (1): 285–304. 

Sunday, November 22, 2020

Evaporation from a capillary tube coupled with the dynamics of flow

This is the result of some quick research of mine, initially stimulated by the model of evaporation from soil by Lehmann and Or. I have not read these papers yet and I am annotating them here in order not to lose them. Among those I found, I would start from:


The next two, written by the same authors, have a theoretical part and an experimental part. They look rigorous but a little intimidating at first, so I would leave them for a second reading.
  • Polansky, John. 2016. “An Experimental and Theoretical Investigation of Evaporating Meniscus Dynamics and Instabilities.” Carleton University. https://curve.carleton.ca/b5d63fef-d97d-4df1-88ae-6548ea935426.
  • Polansky, John, and Tarik Kaya. 2016. “Stability of an Evaporating Meniscus: Part II--Experimental Investigation.” International Journal of Thermal Sciences 105: 75–82.
Finally, the third group of papers contains some experiments and, I think, they are worth reading:
  • Luzar, Alenka, and Kevin Leung. 2000. “Dynamics of Capillary Evaporation. I. Effect of Morphology of Hydrophobic Surfaces.” The Journal of Chemical Physics 113 (14): 5836–44.
  • Leung, Kevin, and Alenka Luzar. 2000. “Dynamics of Capillary Evaporation. II. Free Energy Barriers.” The Journal of Chemical Physics 113 (14): 5845–52.
  • Leung, Kevin, Alenka Luzar, and Dusan Bratko. 2003. “Dynamics of Capillary Drying in Water.” Physical Review Letters 90 (6): 065502.
Not having read them, I do not know how properly they account for the evaporative demand, whether it is taken as fixed or as varying. In the latter case the equation for the capillary should be coupled with the atmosphere, and what I can suggest is to give a reading to the second chapter of Dr. Michele Bottazzi's dissertation. A sketch of what such a coupling means is given below.
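As a minimal sketch of what "coupling with the atmosphere" means here (my own illustration, not taken from the papers above): instead of a fixed demand, the evaporation rate from a meniscus that has retreated to a depth $z_d$ inside the tube can be written as Fickian diffusion of vapour through the dry part of the tube,

$$E = \frac{D_v}{z_d}\left(\rho_{v,\mathrm{sat}}(T) - \rho_{v,\mathrm{atm}}\right),$$

where $D_v$ is the vapour diffusivity, $\rho_{v,\mathrm{sat}}(T)$ the saturated vapour density at the meniscus temperature and $\rho_{v,\mathrm{atm}}$ the atmospheric one. Since $z_d$ depends on the flow in the capillary and $\rho_{v,\mathrm{atm}}$ on the atmosphere, the two dynamics become coupled.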

Monday, November 16, 2020

Life Expectation & Response Times explained

This is the continuation of the presentations that try to explain what residence time, travel time and response time are. The first presentation, on travel and residence times, can be found here.


This presentation regards response times and their relation to life expectancies, an aspect quite unresolved so far, which was clarified, to our knowledge for the first time, in the supplemental material of the paper "On the relations among HDSys" by Rigon and Bancheri.


Friday, November 13, 2020

Earth Observation For Water Cycle Science 2020

I received this from Luca Brocca; I think the topic is interesting and the project mentioned worth following.

Dear colleagues


we have organised a discussion session at the ESA conference EO FOR WATER CYCLE SCIENCE 2020 on Digital Twin Earth Hydrology (DTE Hydrology).
DTE Hydrology is a very recent activity (started one month ago) that aims at building an integrated and interactive system providing the best possible reconstruction and simulation of the water cycle, the hydrological processes and their interactions with human activities at unprecedented resolutions and accuracies (see also here for more information). The first implementation of DTE Hydrology will be carried out over the Po River Basin.

The session is on Tuesday 17 November, 15:00-16:30 (UTC+1) (here the agenda). The general agenda can be obtained by clicking the image above.

Thursday, November 12, 2020

Proietti's lesson to hydrologists

Below is an excerpt from the memoirs of Gigi Proietti (the book is from 2013), brought to my attention by my colleague Giovanni Pascuzzi.



"When I began to take my first steps on stage, I did nothing but devote myself to the technical aspect of acting. Gassman said I was manic. And I thought: "from which pulpit the sermon comes".
Like someone who plays one scale after another, maybe even quickly, but without putting into it the warmth of invention, "er core", as the jazz musicians in Rome say.
I lacked the humility of those who put themselves at the service of the public, of those who are able to look at themselves with the eyes of another without hiding behind pure technique.
I had to get naked and instead I insisted on an excessive virtuosity that covered every exchange and broke all communication. I continued with that attitude until I made my big discovery, the closest thing to an epiphany I have ever experienced: “I am fucking unsympathetic". Excessive technique made me unbearable. Today I notice it when I see some footage from that time.
Too many times, in the cinema world, I had heard people say: "you are too good", which is different from "You are very good, outstanding". It hides the fact that you are unpleasant. When you overdo it, you get the opposite result: you do not come across. And getting out of that trap that you have set for yourself is not easy. You are confused and you ask yourself: "Should I do less?". But for someone who wants to give the best of himself, doing less is very difficult. Even on television the "experts" said I didn't come across on camera. Then, when I did Marshal Rocca, they were silent."

What does this have to do with teaching and research? Both of them require possessing technique (the standards, as a jazz musician would say), but you must not be possessed by the technique. The technique is a tool, not the topic (unless your topic is the technique).

Wednesday, November 11, 2020

Travel times and Residence Times explained

A long time ago it was believed that residence times and travel times were the same concept. This is not true. It was also assumed, long ago, that they could be taken as time invariant. Recent literature showed that this is usually not the case. All of this is explained, hopefully in a clearly definitive way, in the following presentation.


You can find it by clicking on the figure above. This presentation uses drawings and plots in a way that cannot be done in a paper (but would sometimes be useful) and can be considered a companion to part of our 2016 paper below. Other presentations will follow on the topics of response time and its relation to life expectation. The talk I gave for the WATZON project is here below.

 

 

Rigon, Riccardo, Marialaura Bancheri, and Timothy R. Green. 2016. “Age-Ranked Hydrological Budgets and a Travel Time Description of Catchment Hydrology.” Hydrology and Earth System Sciences 20 (12): 4929–47.

Friday, November 6, 2020

The state of the art and the perspectives for the next GEOframe research

These are the contents of an e-mail I sent to a friend and colleague to push forward our collaboration. Despite having been written to a specific person, I think it can be of general interest for the topics it covers, at least for whoever is interested in hydrological modelling.

Dear Friend,
 
I am trying to simplify here the objectives of my research in order to see where we can find convergence of aims and goals.

Overall, I want to pursue a tight connection between the theory of hydrological processes and its sound (replicable, robust, reliable) implementation (see also here for other explanations). Actually, I work both on the theory and on the implementation. Early in my career, I realized that a poor implementation of a correct theory often produces wrong results and, moreover, its incorrect falsification brings credit to flawed ideas (the "tranchant" judgments on the Richards equation, based on unreliable integrators, are one such case). I also grew the idea that, differently from some other colleagues, I wanted to build not just programs but programming systems products (e.g. Brooks, 1975), i.e. reusable software on which other people can build new knowledge (yes, the idea that by building on each other's shoulders, and working that way, possibly having a couple of smart intuitions, we can arrive where usually only the giants arrive).


My main goal is to build models better than those existing: more controllable and less prone to devastating bugs. My research tends to be more “methodological” than applied. It is exactly this approach that moved me towards OMS3. In comparison with other options followed by successful colleagues, it presents a clean design, support for models' controllability, encapsulation of modelling solutions, easy and ordered reuse of modules, intrinsic documentation with annotations, and support for technical issues like parallelization and calibration, without overwhelming the hydrologist concentrating on physics. Besides, a wise use of components could help to dose information to the users, making it visible only when and where required.

Coming to the practice of my research, I focused mainly on two types of models: those I call Hydrological Dynamical Systems (HDSys), mainly based on the solution of multiple coupled nonlinear, non-autonomous ordinary differential equations, and those that have the space variables explicit and solve partial differential equations. From a different perspective, my goal has been to cover entirely the various aspects of the hydrological budget (often including terrain and soil/sediment), not focusing, as traditional models do, only on one or some of the aspects, like discharge, evaporation, infiltration, or groundwater. Coupling the water budget with the energy budget has been an objective of both types of modelling, especially, but not only, because temperature is easily measurable, even from remote sensing. Tracers, nutrients and pollutants were never deeply considered until recently, but they have been part of my modelling plans for many years. A minimal sketch of the HDSys idea is given below.
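To fix ideas, here is a minimal sketch of what one HDSys equation looks like: a nonlinear, non-autonomous storage budget $dS/dt = J(t) - Q(S)$, with an illustrative power-law discharge $Q(S) = aS^b$ (the forcing and the parameters are made up for the example; none of this is GEOframe code):

    from scipy.integrate import solve_ivp

    def rainfall(t):
        # Non-autonomous forcing (mm/h): a rectangular storm, purely illustrative.
        return 2.0 if 10.0 <= t <= 20.0 else 0.0

    def budget(t, S, a=0.1, b=1.5):
        # Water budget of a single reservoir: storage change = input - discharge.
        return [rainfall(t) - a * max(S[0], 0.0) ** b]

    sol = solve_ivp(budget, t_span=(0.0, 100.0), y0=[5.0], max_step=0.5)
    print(f"storage after 100 h: {sol.y[0][-1]:.3f} mm")

Real configurations couple many such equations (snow, canopy, root zone, groundwater) and let the framework handle their connections; the sketch only shows the mathematical family.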

For a community to grow around the previous ideas, some key tools are still missing. The points below summarize: i) what I envision is needed from different perspectives/users; ii) what future OMS developments should potentially consider. Some of these points have already been reached with the work we did, some are planned to be implemented, and for some others we need your support to reach a critical mass and make them happen.

Power User side:
  • Smoothing out some parts of the process of deploying a modelling solution to a specific, concrete catchment. With the students of the GWSs and of the Hydrological Modelling class, the two processes of extracting the HRUs and interpolating the data were instructive but too detailed and cumbersome.
  • Using the console is easy and usually hassle-free. However, because we mostly use Jupyter notebooks for the treatment and analysis of inputs and outputs, the console is a further environment to learn. Using Docker and the command line inside Jupyter could be a choice, but with Docker we had a hard time on Windows. A console inside Jupyter would be the best choice.
  • In general, a convergence of our tools with those people use the most, such as Jupyter, would decrease the learning curve and the developer commitment needed to bring in and maintain tools.
  • Some ancillary tools for “joining” catchments studied by different people are required.
  • Calibration proved to be a time-consuming effort that needs parallelization and speed-up.
  • Probably a server or a “hub” to store and retrieve the collective work, including parameterisations, inputs and so on, is also needed.
  • Managing multiple treatments of the same catchments would also be required at some point in the future.

For institutional users:
  • Connections to Delft-FEWS could be an option to investigate.
  • Our group also needs to experiment with CSIP.
  • A distribution of all the material and the code, through some tool like Anaconda, would be desirable.
  • They need dedicated interfaces: modifying parameters and model structure should not be an option for them, as it is for researchers or power users.
  • Scalability of the computing effort should be the standard (and this should include Net3).

Developers: 

  • Source code should be available on a public repository like GitHub.
  • Improving developer-specific documentation should be a continuous effort.

Potential developers/researchers:

  • When they come from environmental engineering or the sciences, they are usually familiar with Python, R or Matlab. They have no notion of OO programming, nor a basic software engineering background. Therefore, appropriate material providing all of this knowledge should be produced. I started with a “Java for Hydrologists 101” but I am far from having completed it.
  • They are not comfortable using tools like Git, GitHub, Docker, unit tests, and the other common tools which are necessary for software carpentry and collaborative work. Therefore, some training courses on these should also be provided.

The above is more of a wish list which we are keeping in mind. Frankly, we do not yet have all the competence to treat them all. As you see, I did not list any machine learning tool, but this does not mean that we are not looking at them. For the moment it is just safer for us to concentrate on enhancing and bringing to an optimal state what we have, and on publishing the ten or so papers that we have in production on the work we have already done. We are looking for resources, though, and, resources arriving, we could also think of introducing statistical/machine learning methods. One thing to be remarked is that GEOframe-NewAGE can easily replace PRMS. The modules we have are usually different from those PRMS has, but implementing them the very same way PRMS does should be VERY easy, if this is the goal. A greater integration with AGEs would also be advisable. The main differences to be treated for compatibility are the IO. For now, we often stick with complex data formats, but abstracting the algorithms from them is an objective we have in mind. Mostly we had to follow our own way so far in order to be sufficiently comprehensive. To be sincere, IMHO, some parameterisations of the processes inside AGEs are simply old hydrology, not currently supported by researchers but, yes, still in use by practitioners (who, worldwide, use SWAT, though). In all of the above I forgot to mention the work by Daniele Dalla Torre, who ported SWMM to OMS3. That is a thread that is at present at a dead end, but it can come back alive at any time.

Below, I give further information on: i) the reasons why we use OMS; ii) the new components I mentioned before or that we are going to develop.

When I arrived at OMS3, my most recent achievement was a stable version of GEOtop, a model that solves the water and energy budgets, as its foundational paper describes. After 15 years, GEOtop remains quite unique in the panorama of “process-based” models. It in fact couples what is usually present in other process-based models, i.e. an integrator of the Richards, groundwater and surface water equations, with what usually appears in soil-vegetation-atmosphere models. Besides, it has a solid model for snow height evolution, used operationally all over the Alps, and for freezing soil, which constitutes a third type of process-based model usually cared for by a different scientific community. I've certainly sinned arrogantly in doing what others still do not do, even with much larger resources, and I will probably go to hell for that. GEOtop has a decently extended literature and I could have capitalized better on its treasures, but I preferred to move on because, while I was getting GEOtop stable, I was also touching its limits.

Its monolithic structure, made of thousands of lines of code, made it not easily modifiable and improvable with incoming research and understanding.

Its ambition to cover all the areas of hydrological modelling made the number of input parameters explode, a fact that most researchers found overwhelming. Introducing competing ideas to model some of the processes became practically impossible, and any scientific advancement was nullified.

That's why I was looking for an intrinsically modular system that could resolve the above issues and boost collaborative work, and that's why I moved to OMS3 in 2008.
I would have stuck with that GEOtop objective, but in the same year I quite unexpectedly received financial support for studying the management of droughts of the river Adige. GEOtop was impractical for that use because of its inability to be calibrated and because of some flaws in its subsurface-surface water interactions. Necessity led to the implementation of GEOframe-NewAGE version 0.

In the subsequent decade my collaborators and I worked on the GEOframe model perspective, faster to calibrate and nevertheless quite complete from the point of view of process integration. For many ancillary parts of the system it meant reinventing the wheel again from scratch, but this finally produced the mature product that GEOframe-NewAGE is today. It was conceived to use the natural fractal, graph-like spatial structure of rivers for distributing spatially the hydrologic response unit (HRU) physics and computation. The main driving idea behind GEOframe is that we do computation on a graph whose nodes exchange mass and energy according to the interactions among parts described by the graph's connections. These nodes can be spatially distinct entities, like hillslopes and HRUs, or concurrent processes, like discharge and transpiration. In principle, an engine under the hood is responsible for distributing the computation along the graph, while the hydrologist takes care of describing the processes with the appropriate degree of refinement. This had a first implementation with Net3, but I believe it can be improved in several directions. Before Net3, sure, river networks were schematized as graphs, but their topology was hardcoded and no variation was possible in the spatial structure of the model without disrupting the whole. With Net3, the topological structure of the connections can be modified just before run time, inserting or eliminating human infrastructures, diversions, new nodes of calculation, lakes, reservoirs. The sketch below illustrates the idea.
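As a minimal sketch of the driving idea (my own illustration; this is not Net3's actual API): nodes exchange mass along the graph's connections, and modifying the topology, e.g. inserting a reservoir, is just a graph edit done before the run:

    class Node:
        # One node of the computation graph: an HRU, a reservoir, an outlet...
        def __init__(self, name, upstream=None):
            self.name = name
            self.upstream = upstream if upstream is not None else []

        def local_runoff(self, t):
            return 1.0  # placeholder for a real rainfall-runoff component

        def outflow(self, t):
            # Each node combines its local production with what upstream delivers.
            return self.local_runoff(t) + sum(n.outflow(t) for n in self.upstream)

    # A tiny network: two hillslopes draining into an outlet.
    outlet = Node("outlet", [Node("hillslope-A"), Node("hillslope-B")])

    # Inserting a reservoir between the hillslopes and the outlet does not
    # touch the process code, only the connections:
    outlet.upstream = [Node("reservoir", outlet.upstream)]

    print(outlet.outflow(t=0.0))  # 4.0 with the unit placeholder runoff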

Net3 also opened the possibility for different researchers to work simultaneously on different parts of the catchment (actually of the graph), enabling a sort of “crowd modelling action” to cover the whole Earth with GEOframe-based modelling performed by a crowd of researchers or simply trained people. Clearly, for this, some infrastructural work is still missing, but potentially it could provide a collective work that largely surpasses the present global-scale hydrological applications, which are based on rough characterizations of parameters and on scanty local reanalyses of data; more is not possible even for large research groups. The first application of this modelling strategy will be to the river Adige, separated into a thousand or more HRUs, for which we have, so far, some work done by the students of my course on Hydrological Modelling. The river Adige is relatively small (10^4 square kilometers) but has a variety of climatic situations, anthropic activities and settlements that make it very challenging to model. Eventually the simulations will be extended to the whole Alps and beyond.

From the point of view of interacting components, GEOframe has many: the traditional set of tools coming from Hydrologis for terrain analysis; a set of kriging components for the interpolation of hydrometeorological variables; the estimation of shortwave and longwave radiation, including shadow and topography effects; interception of rainfall by canopies; three simplified models for snow water equivalent; various tools for reservoir-like modelling (à la PRMS); Muskingum-Cunge and 1D de Saint-Venant propagation; Priestley-Taylor, Penman-Monteith, and a new model called Prospero, which implements a revision of Penman-Monteith due to Schymanski and Or. All the models can make use of the LUCA and PSO tools for calibration, and the dedicated papers constitute a guideline for their use.
Giuseppe and I just hired a couple of Ph.D. students and they, among other things, will work on data assimilation and possibly on some OMS issues.
Did I abandon process-based modelling, then? Not at all: the original plan to build the new GEOtop 4.0 is actually very much alive.
The nucleus is the set of tools growing around WHETGEO. At present we have (the equation these tools integrate is recalled after the list):
  • An integration of Richards 1D, with and without temperature, decoupled from and coupled with the energy budget (returning the soil temperature profile)
  • An integration of the Richards equation in 2D (a hillslope profile, for instance). No coupling with the energy budget yet.
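For reference, the mixed form of the Richards equation, here written in 1D with $z$ positive upward, where $\theta(\psi)$ is the water content, $K(\psi)$ the hydraulic conductivity and $S$ a sink term (e.g. root uptake):

$$\frac{\partial \theta(\psi)}{\partial t} = \frac{\partial}{\partial z}\left[K(\psi)\left(\frac{\partial \psi}{\partial z} + 1\right)\right] - S$$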

These tools leverage the terrain analysis, radiation estimation, data interpolation, and evaporation and transpiration estimation already present in GEOframe, but work on a gridded domain instead of an HRU separation. With respect to GEOtop, the integration algorithms are completely redesigned around the Newton-Casulli-Zanolli (NCZ) algorithm for Richards, and around an appropriate implementation of the grids where topology and geometry are separated. In GEOtop a more traditional Newton-Krylov (NK) algorithm was used, whose convergence is not guaranteed a priori, and the equations were written for a structured (regular) grid, which causes artifacts in the results. Using appropriate design patterns, a twofold objective was obtained: to make room for changes in the parameterisations of the equations, and to keep the contents of the inputs as simple as possible (but not too simple). The use of standard formats like NetCDF for the outputs contained the number of otherwise exploding output files. The input and output formats were decoupled from the algorithms, though, in order to maintain the flexibility to change the output format. As said, many components developed for GEOframe-NewAGE can be reused by WHETGEO and, in fact, in the foreseeable future, the Net3 infrastructure could possibly be used to aggregate various hillslopes simulated by independent WHETGEO runs. The sketch below illustrates the design idea of swappable parameterisations.
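As a minimal sketch of the "room for changes in parameterisations" idea (my own illustration in Python, not WHETGEO's actual classes): the solver side depends only on an abstract retention curve, so parameterisations can be swapped without touching the integrator:

    from abc import ABC, abstractmethod

    class RetentionCurve(ABC):
        # What the integrator needs from any soil-water retention model.
        @abstractmethod
        def theta(self, psi: float) -> float: ...

    class VanGenuchten(RetentionCurve):
        def __init__(self, theta_r, theta_s, alpha, n):
            self.theta_r, self.theta_s, self.alpha, self.n = theta_r, theta_s, alpha, n

        def theta(self, psi):
            if psi >= 0.0:
                return self.theta_s
            m = 1.0 - 1.0 / self.n
            se = (1.0 + abs(self.alpha * psi) ** self.n) ** (-m)
            return self.theta_r + (self.theta_s - self.theta_r) * se

    def water_stored(curve: RetentionCurve, psi_profile):
        # The "solver" side: it only knows the abstract interface.
        return sum(curve.theta(psi) for psi in psi_profile)

    # Swapping parameterisations means passing a different RetentionCurve:
    vg = VanGenuchten(theta_r=0.05, theta_s=0.45, alpha=1.0, n=2.0)  # illustrative
    print(water_stored(vg, psi_profile=[-0.1, -1.0, -10.0]))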
Because I have experience with them, and notwithstanding the opinion of many colleagues, I do not in fact see how lumped reservoir models can cope with those processes which have a well-defined spatial history. Remote sensing data are a new frontier and, besides, WHETGEO is fully able to exploit them.

Evolving Prospero will bring green water and vegetation into WHETGEO, and GEOtop 4.0 will be much greener than GEOtop 3.0. This achievement is almost there. Carbon cycle evolution, forestry and crops will be a set of ODEs attached to sites, described either as grid cells or as HRUs. Their mathematics is quite the same as for HDSys, with just a different interpretation of the parameters. Transport of tracers and pollutants via the advection-dispersion equation is also almost achieved, because these equations belong to the same family as the heterogeneous transport of heat in porous media that we have already implemented, as recalled below.
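For reference (my paraphrase of the family resemblance, not a formula quoted from the WHETGEO papers): the 1D advection-dispersion equation for a solute of concentration $C$ in soil water,

$$\frac{\partial (\theta C)}{\partial t} = \frac{\partial}{\partial z}\left(\theta D \frac{\partial C}{\partial z}\right) - \frac{\partial (q C)}{\partial z} + S_C,$$

has the same conservation-law structure as the heat transport equation in a porous medium, with $\theta$ the water content, $q$ the Darcy flux, $D$ the dispersion coefficient and $S_C$ a source/sink term; this is why the same integrator machinery applies.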
Giuseppe, out of his magic hat, has already set up models for hillslope stability analysis that just wait for the 2D and 3D WHETGEO to be tested (the first) and fully implemented (the second). But the 3D solution is more a problem of drawing the grid than anything else with respect to the 2D solver.
A possible threat to WHETGEO is its computational burden. The algorithms are efficient, but a high-resolution three-dimensional grid has potentially millions of nodes and simulations are time consuming. Parallelizing its core routines in a way that does not clash with the other forms of parallelism present in OMS3 will be a challenge.
Therefore, what I foresee in the next years is for these tools to reach maturity and to be used by a potentially large set of users. GEOtop 4.0, built on WHETGEO, Prospero and other tools, would be a really operational tool, for instance for landslide risk early warning; for the detailed soil moisture accounting needed by precision and regenerative agriculture; for small-catchment runoff and sediment production (the latter a feature still to be implemented); and obviously it would be a great tool for studying any aspect of the critical zone, in any climate, past, present or future. At present we cannot foresee when we could add accurate modules for snow (like those already in GEOtop, or better), but the modules already present in GEOframe could easily be used on a pixel basis to surrogate them. Calibration tools for the WHETGEO process-based modules, I think, will require some adjustment of the calibration tools now present in OMS3, and we have not tried anything about that yet.
This mail was pretty long, but I hope it serves to clarify my point of view and the legacy of my previous research. I also included Tim in the mail because I think he can be interested in much of the research I described. Thanks to the friendship we have, I hope we find the way to strengthen our past collaboration around a few future objectives where we can find reciprocal satisfaction.

All the best,

ric (and the guys)

What is meant by a programming systems product

Here I just reproduce the very first pages of The Mythical Man-Month: Essays on Software Engineering (Addison-Wesley), which clarify the main concepts I experimented with in programming my models.

"The Programming Systems Product

One occasionally reads newspaper accounts of how two programmers in a remodeled garage have built an important program that surpasses the best efforts of large teams. And every programmer is prepared to believe such tales, for he knows that he could build any program much faster than the 1000 statements/year reported for industrial teams. Why then have not all industrial programming teams been replaced by dedicated garage duos? One must look at what is being produced.



In the upper left of Fig. 1.1 is a program. It is complete in itself, ready to be run by the author on the system on which it was developed. That is the thing commonly produced in garages, and that is the object the individual programmer uses in estimating productivity.

There are two ways a program can be converted into a more useful, but more costly, object. These two ways are represented by the boundaries in the diagram. Moving down across the horizontal boundary, a program becomes a programming product. This is a program that can be run, tested, repaired, and extended by anybody. It is usable in many operating environments, for many sets of data. To become a generally usable programming product, a program must be written in a generalized fashion. In particular the range and form of inputs must be generalized as much as the basic algorithm will reasonably allow. Then the program must be thoroughly tested, so that it can be depended upon. This means that a substantial bank of test cases, exploring the input range and probing its boundaries, must be prepared, run, and recorded. Finally, promotion of a program to a programming product requires its thorough documentation, so that anyone may use it, fix it, and extend it. As a rule of thumb, I estimate that a programming product costs at least three times as much as a debugged program with the same function.

Moving across the vertical boundary, a program becomes a component in a programming system. This is a collection of interacting programs, coordinated in function and disciplined in format, so that the assemblage constitutes an entire facility for large tasks. To become a programming system component, a program must be written so that every input and output conforms in syntax and semantics with precisely defined interfaces. The program must also be designed so that it uses only a prescribed budget of resources—memory space, input-output devices, computer time. Finally, the program must be tested with other system components, in all expected combinations. This testing must be extensive, for the number of cases grows combinatorially. It is time-consuming, for subtle bugs arise from unexpected interactions of debugged components. A programming system component costs at least three times as much as a stand-alone program of the same function. The cost may be greater if the system has many components.

In the lower right-hand corner of Fig. 1.1 stands the programming systems product. This differs from the simple program in all of the above ways. It costs nine times as much. But it is the truly useful object, the intended product of most system programming efforts."

The reading of this classic is a great pleasure:

Brooks, Frederick P. 1995. The Mythical Man-Month: Essays on Software Engineering. Addison-Wesley.