Abstract: Based on the relationship between the family of Broyden methods and the EN method, a new family of iterative methods, the family of EN-like methods, is developed and analyzed. These methods are shown to be related to a variety of other known methods, including the Broyden methods, GCR, GMRES, Newton's method for approximating the inverse, and a combination of a Galerkin step followed by a step of Richardson's method. Scaling-invariant versions and more efficient implementations are developed, and their complexity is examined. The convergence of the new methods, as well as of their restarted and truncated versions, is examined, and various convergence results are derived, including termination within a finite number of steps and estimates for residuals and errors. The methods are also shown to be suitable in the context of inner/outer iteration schemes, and, for two of the methods, orthogonality-preserving inner/outer iteration schemes are developed. Adaptive versions, which combine truncated and restarted versions and automatically increase the size of the Krylov subspace, are included in the software package PARASPAR, which achieves robustness by reevaluating a parametrized preconditioner whenever poor convergence or instability is encountered. We present numerical experiments that demonstrate the efficiency of several members of this new family in comparison with other known methods, in the context of PARASPAR, and in the context of inner/outer iteration schemes. The experiments show that even though methods like CGS and BiCGSTAB may converge faster for many problems, EN-like methods are in general more robust, since, like GMRES, they have the option of increasing the size of the Krylov subspace. Additionally, they often require less memory than GMRES or ORTHOMIN.
They are also well suited as iterative solvers inside PARASPAR, since they evaluate, at no additional cost, parameters that estimate the quality of the preconditioner. Moreover, some EN-like methods diverge drastically when applied to an ill-conditioned problem, so the need to reevaluate the preconditioner can be recognized quickly. Finally, nonlinear EN-like methods are developed, and their convergence behavior is investigated.