Title: Heat transfer and flow regimes during condensation in horizontal tubes
Author(s): Dobson, Monte Keith
Doctoral Committee Chair(s): Chato, John C.
Department / Program: Mechanical Science and Engineering
Degree Granting Institution: University of Illinois at Urbana-Champaign
Abstract: An experimental study of heat transfer and flow regimes during condensation of refrigerants in horizontal tubes was conducted. Measurements were made in smooth, round tubes with diameters ranging from 3.14 mm to 7.04 mm. Four refrigerants were tested: R-12, R-22, R-134a, and near-azeotropic blends of R-32/R-125 in 50%/50% and 60%/40% compositions.
Flow regimes were observed visually at the inlet and outlet of the test condenser as the heat transfer data were collected. Stratified, wavy, wavy-annular, annular, annular-mist, and slug flows were observed. True mist flow without a stable wall film was not observed during the condensation tests. For the purpose of characterizing condensing heat transfer behavior, the flow regimes were divided into two broad categories: gravity-dominated and shear-dominated flow.
The heat transfer behavior was strongly related to the flow regime. In the gravity-dominated regime, the dominant heat transfer mode was laminar film condensation, characterized by heat transfer coefficients that depended on the wall-to-refrigerant temperature difference but were nearly independent of mass flux. In the shear-dominated regime, forced-convective condensation was the dominant mechanism, characterized by heat transfer coefficients that were independent of temperature difference but strongly dependent on mass flux and quality. Separate heat transfer correlations developed for each regime successfully predicted data from the present study and several external sources.
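The shear-dominated (annular-flow) correlation associated with this work is widely reported in the later Dobson-Chato journal paper in the form Nu = 0.023 Re_l^0.8 Pr_l^0.4 (1 + 2.22/Xtt^0.889), with Xtt the turbulent-turbulent Lockhart-Martinelli parameter. A minimal sketch of that form follows; the function names, mass flux, and refrigerant property values are illustrative assumptions, not data from the dissertation.

```python
# Sketch of an annular-flow (shear-dominated) condensation correlation of the
# Dobson-Chato form: Nu = 0.023 Re_l^0.8 Pr_l^0.4 (1 + 2.22 / Xtt^0.889).
# All property values below are rough, assumed placeholders for illustration.

def lockhart_martinelli_xtt(x, rho_l, rho_v, mu_l, mu_v):
    """Turbulent-turbulent Lockhart-Martinelli parameter."""
    return ((1 - x) / x) ** 0.9 * (rho_v / rho_l) ** 0.5 * (mu_l / mu_v) ** 0.1

def annular_htc(G, x, D, rho_l, rho_v, mu_l, mu_v, k_l, pr_l):
    """Heat transfer coefficient [W/m^2-K] from the annular-flow form."""
    re_l = G * (1 - x) * D / mu_l   # liquid-phase Reynolds number
    xtt = lockhart_martinelli_xtt(x, rho_l, rho_v, mu_l, mu_v)
    nu = 0.023 * re_l ** 0.8 * pr_l ** 0.4 * (1 + 2.22 / xtt ** 0.889)
    return nu * k_l / D

# Illustrative call with rough R-134a-like saturation properties (assumed):
h = annular_htc(G=300.0, x=0.5, D=0.00704,
                rho_l=1170.0, rho_v=43.0, mu_l=1.8e-4, mu_v=1.2e-5,
                k_l=0.079, pr_l=3.4)
print(f"h = {h:.0f} W/m^2-K")
```

Note that the coefficient depends on mass flux G and quality x but not on the wall-to-refrigerant temperature difference, consistent with the shear-dominated behavior described above.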
The heat transfer correlations were combined with existing pressure drop correlations to develop a simple condenser model. This model was used to explore the existence of an optimum diameter. Simulations showed that the required condensing length increased slowly as the diameter was decreased over a wide range. As the diameter became sufficiently small, the condensing length began to increase dramatically because much of the driving temperature difference was destroyed by pressure drop. An optimum diameter existed where the condensing surface area was a minimum. An analytical solution showed that this optimum diameter corresponded to a decrease in the inlet temperature difference of between 23% and 37%. The predictions of this analytical solution agreed very well with the simulation model.
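The optimum-diameter trade-off can be sketched with a toy model: per-length condensing capacity improves as the tube shrinks, but pressure drop erodes the saturation-to-wall temperature difference until condensation becomes impossible, so the surface area passes through a minimum. Every coefficient below is invented for illustration; this is not the dissertation's condenser model, only a sketch of the qualitative behavior it describes.

```python
# Toy illustration of the optimum-diameter trade-off: scan tube diameters,
# find the condensing length for a fixed duty, and locate the minimum
# surface area. All constants are assumed, order-of-magnitude placeholders.
import math

M_DOT = 0.01     # refrigerant mass flow rate [kg/s] (assumed)
Q_DUTY = 2000.0  # condenser heat duty [W] (assumed)
DT_IN = 10.0     # inlet saturation-to-wall temperature difference [K] (assumed)
DTSAT_DP = 5e-5  # d(Tsat)/dP [K/Pa], rough magnitude (assumed)
RHO = 50.0       # mean two-phase density [kg/m^3] (assumed)
F_FRIC = 0.02    # friction factor (assumed)

def condensing_length(D):
    """Length L [m] such that h * (pi D L) * dT_eff = Q_DUTY, or inf if the
    pressure drop destroys the driving temperature difference."""
    G = 4.0 * M_DOT / (math.pi * D ** 2)   # mass flux [kg/m^2-s]
    h = 50.0 * G ** 0.8 / D ** 0.2         # forced-convective h (toy scaling)
    L = Q_DUTY / (h * math.pi * D * DT_IN)  # first guess: no dP penalty
    for _ in range(200):                    # monotone fixed-point iteration
        dp = F_FRIC * (L / D) * G ** 2 / (2.0 * RHO)  # Darcy-type drop [Pa]
        dt_eff = DT_IN - DTSAT_DP * dp      # dT eroded by saturation-T drop
        if dt_eff <= 0:
            return float("inf")             # dT destroyed: cannot condense
        L = Q_DUTY / (h * math.pi * D * dt_eff)
    return L

# Scan diameters (1 mm to 11 mm) for the minimum condensing surface area.
best = min((math.pi * D * condensing_length(D), D)
           for D in (d / 1000.0 for d in range(1, 12)))
print(f"minimum area {best[0]:.4f} m^2 at D = {best[1] * 1000:.0f} mm")
```

With these made-up constants the area shrinks as the diameter is reduced, then blows up once the pressure-drop penalty consumes the inlet temperature difference, reproducing the minimum-area behavior described in the abstract.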
Rights Information: Copyright 1994 Dobson, Monte Keith
Date Available in IDEALS: 2011-05-07
Identifier in Online Catalog: AAI9522197
This item appears in the following Collection(s)
Graduate Dissertations and Theses at Illinois
Dissertations and Theses - Mechanical Science and Engineering