Drinking Water Engineering and Science
An interactive open-access journal

Journal metrics

  • CiteScore: 0.79
  • SNIP: 0.813
  • SJR: 0.228
  • IPP: 0.719
doi:10.5194/dwesd-5-85-2012
© Author(s) 2012. This work is distributed
under the Creative Commons Attribution 3.0 License.
 
02 Apr 2012
Review status
A revision of this discussion paper for further review has not been submitted.
Development of an iron pipe corrosion simulation model for a water supply network
M. Bernats1, S. W. Osterhus2, K. Dzelzitis3, and T. Juhna1
1Department of Water Engineering and Technology, Riga Technical University, Latvia
2Department of Hydraulic and Environmental Engineering, Norwegian University of Science and Technology, Norway
3Department of Composite Materials and Structures, Riga Technical University, Latvia
Abstract. Corrosion in water supply networks is an unwanted process that causes pipe material loss and subsequent pipe failures. Nowadays, pipe replacement strategy is most often based on pipe age, which is not always the most important factor in pipe burst rate. In this study, a methodology for developing a mathematical model to predict the decrease in pipe wall thickness in a large cast iron network is presented. Water quality, temperature, and water flow regime were the main factors taken into account in the corrosion model. The effects of water quality and flow rate were determined by measuring the corrosion rate of metal coupons over a period of one year at different flow regimes. The constants obtained were then introduced into a calibrated hydraulic model (EPANET), and the corrosion model was validated by measuring the decrease in wall thickness of samples removed during a regular pipe replacement. The validated model was run for 30 yr to simulate the water distribution system of Riga (Latvia). The corrosion rate in the first year was 8.0–9.5 times greater than in all subsequent years, with an average long-term decrease in pipe wall thickness of 0.013–0.016 mm per year. The optimal iron pipe exploitation period was concluded to be 30–35 yr (for a pipe wall thickness of 5.50 mm and a metal density of 7.5 t m−3). The initial combined corrosion model and measurement error was 33%; after validation of the model, the error was reduced to below 15%.

Citation: Bernats, M., Osterhus, S. W., Dzelzitis, K., and Juhna, T.: Development of an iron pipe corrosion simulation model for a water supply network, Drink. Water Eng. Sci. Discuss., 5, 85-120, doi:10.5194/dwesd-5-85-2012, 2012.
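As a rough illustration of the numbers reported in the abstract (not code from the paper), the following Python sketch applies the two-phase behaviour described there: the first year corrodes 8.0–9.5 times faster than the long-term rate, and every subsequent year adds 0.013–0.016 mm of uniform wall loss. The linear long-term extrapolation and the pairing of rates with first-year factors are assumptions for illustration only.

```python
# Sketch of the two-phase corrosion behaviour reported in the abstract.
# The rates and the 8.0-9.5x first-year factor come from the text; the
# linear long-term extrapolation and rate/factor pairings are assumptions.

def wall_loss_mm(years: float,
                 long_term_rate_mm_per_yr: float,
                 first_year_factor: float) -> float:
    """Cumulative pipe wall loss (mm) after `years` of service.

    The first year corrodes `first_year_factor` times faster than the
    long-term rate; each following year adds the long-term rate.
    """
    if years <= 1.0:
        return first_year_factor * long_term_rate_mm_per_yr * years
    first_year_loss = first_year_factor * long_term_rate_mm_per_yr
    return first_year_loss + (years - 1.0) * long_term_rate_mm_per_yr

if __name__ == "__main__":
    WALL_MM = 5.50  # initial pipe wall thickness from the abstract
    for rate, factor in [(0.013, 8.0), (0.016, 9.5)]:
        loss30 = wall_loss_mm(30, rate, factor)
        print(f"rate={rate} mm/yr, first-year factor={factor}: "
              f"loss after 30 yr = {loss30:.3f} mm "
              f"({100 * loss30 / WALL_MM:.1f}% of a {WALL_MM} mm wall)")
```

Note that even at the upper bound this uniform average loss over 30 yr is a small fraction of the 5.50 mm wall, so the 30–35 yr exploitation period recommended in the paper evidently follows from the full validated model rather than from this uniform-loss arithmetic alone.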
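The abstract also describes introducing the experimentally obtained constants into a calibrated EPANET hydraulic model. The paper's actual coupling is not reproduced here; as a minimal sketch of the idea, the snippet below uses the WNTR Python wrapper around EPANET (a stand-in, not the authors' tooling) to extract per-pipe mean velocities and apply a hypothetical linear flow-regime correction to the long-term corrosion rate. The file name "network.inp", the coefficient K_VEL, and the linear form are all placeholder assumptions.

```python
# Sketch of coupling per-pipe hydraulics to a corrosion rate, assuming the
# WNTR wrapper around EPANET. "network.inp" and the velocity correction
# K_VEL are placeholders; the paper's actual flow-regime constants came
# from its coupon experiments and are not reproduced here.
import wntr

BASE_RATE = 0.013  # long-term corrosion rate from the abstract, mm/yr
K_VEL = 0.1        # hypothetical linear velocity coefficient, (m/s)^-1

wn = wntr.network.WaterNetworkModel("network.inp")  # calibrated EPANET model
results = wntr.sim.EpanetSimulator(wn).run_sim()

# Mean flow velocity per pipe over the simulated period (m/s).
mean_velocity = results.link["velocity"].mean()

# Hypothetical flow-regime correction: faster flow, faster corrosion.
rate_mm_per_yr = BASE_RATE * (1.0 + K_VEL * mean_velocity)

# Pipes with the highest estimated corrosion rate first.
print(rate_mm_per_yr.sort_values(ascending=False).head())
```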

Viewed

Total article views: 743 (including HTML, PDF, and XML; views and downloads calculated since 01 Feb 2013)

HTML  PDF  XML  Total  BibTeX  EndNote
 294  412   37    743      37       15
