EMG 2405 Control Engineering I (45 Lecture Hours)
Prerequisites: SMA 2278 Differential Equations, EMG 2304 Mechanics of Machines II
Purpose
The aim of this course is to enable the student to:
1. understand control systems engineering and the control action.
2. model control systems
3. conceptualize stability of a control system.
Course objectives
At the end of this course, the student should be able to:
1. calculate the response given the input to a control system
2. determine the stability of a given system using the Routh and Hurwitz stability criteria
3. determine the stability of a given system using Nyquist analysis
Course description
Control systems: Definition; control action, open loop, closed loop, linear time invariant
systems, time varying systems, stochastic systems. First and second order systems. Modeling
of control systems: Differential equations, block diagrams, block diagram algebra. State
space representation. Linearization of non-linear mechanical, electrical, hydraulic and thermal
systems. System response: Transfer functions. Laplace transforms; application of Laplace
transforms to the solution of linear constant coefficient differential equations. Steady-state
and transient responses. Forced and free response; the D-operator and the characteristic
equation. Typical test signals for time response, unit step, unit ramp and unit impulse.
System frequency response; sinusoidal inputs. Stability: Characteristic equation and the
root locations, the s-plane. Routh stability criterion. Hurwitz stability criterion. Nyquist
analysis; polar plots, Nyquist stability plot, Nyquist criterion. Methods of improving stability.
Control elements and systems: Control elements; rotating machines, transducers, controllers,
electronic amplifiers, thyristors. Control systems; speed control, numerical control machine
tools and process control. Transient motion in control systems.
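Two of the topics listed above, first-order step response and the Routh stability criterion, can be previewed with a short sketch. This is an illustrative example, not course material: the gain K, time constant tau, and polynomial coefficients are assumed values, and the Routh test is written out only for the simple second-order case, where it reduces to requiring that all characteristic-equation coefficients share the same sign.

```python
# Illustrative sketch (assumed parameters): step response of a
# first-order system and a Routh stability check for a second-order
# characteristic equation a2*s^2 + a1*s + a0 = 0.

def first_order_step(K=1.0, tau=2.0, t_end=10.0, dt=0.01):
    """Euler simulation of tau*dy/dt + y = K*u for a unit step u = 1."""
    y, t, out = 0.0, 0.0, []
    while t <= t_end:
        y += dt * (K * 1.0 - y) / tau   # dy/dt = (K*u - y)/tau
        t += dt
        out.append(y)
    return out

def routh_stable_2nd_order(a2, a1, a0):
    """Routh criterion, second-order special case: stable iff all
    coefficients have the same (nonzero) sign."""
    return a2 * a1 > 0 and a2 * a0 > 0

resp = first_order_step()
print(abs(resp[-1] - 1.0) < 0.02)       # settles near K after ~5 tau
print(routh_stable_2nd_order(1.0, 3.0, 2.0))   # s^2+3s+2: poles -1, -2
print(routh_stable_2nd_order(1.0, -3.0, 2.0))  # s^2-3s+2: poles +1, +2
```

The full Routh array handles higher-order polynomials; the two-coefficient sign test above is the degenerate form covered early in the stability unit.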
Course text books
1. DiStefano J. J., Stubberud A. R. & Williams I. J. (1994) Feedback and Control
Systems: Theory and Problems (Schaum’s Outline Series), McGraw-Hill, 2nd Ed.
2. Ogata K. (1996) Modern Control Engineering, Prentice Hall, 3rd Ed.
References
1. Kuo B. C. & Golnaraghi F. (2002) Automatic Control Systems, Wiley, 8th Ed.
2. Franklin G. F. (2005) Feedback Control of Dynamic Systems, Prentice Hall, 5th Ed.
3. Journal of Dynamic Systems, Measurement, and Control
Teaching methodology: 2-hour lecture and 1-hour tutorial per week, and at least three
3-hour laboratory sessions per semester organized on a rotational basis.
Mode of assessment: 70% written exam, 30% continuous assessment
Instruction materials/equipment
1. Mechanical Engineering laboratories;
2. Computer laboratory;
3. Overhead projector;
INTRODUCTION
There are many reasons why we need control systems, but the four most important ones are
performance, economics, safety, and reliability.
Many systems cannot achieve specified levels of performance without controls. For
example, a robot depends greatly on its control system to achieve good accuracy.
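The accuracy argument can be made concrete with a toy one-axis positioning loop. Everything here is an assumption for illustration (the plant model, the gain, the disturbance): open-loop control applies a precomputed move and absorbs the full disturbance, while proportional feedback corrects the error at each step.

```python
# Toy sketch (assumed plant and disturbance): why feedback improves
# accuracy. Not a real robot controller.

def track(ref=1.0, gain=0.5, bias=0.2, steps=50, use_feedback=True):
    """Discrete plant x[k+1] = x[k] + u[k] + bias/steps.
    The bias term is a constant disturbance the open-loop plan ignores."""
    x = 0.0
    for _ in range(steps):
        if use_feedback:
            u = gain * (ref - x)   # proportional correction of the error
        else:
            u = ref / steps        # blind, precomputed move
        x += u + bias / steps      # disturbance slips in either way
    return x

open_err = abs(1.0 - track(use_feedback=False))
closed_err = abs(1.0 - track(use_feedback=True))
print(open_err > closed_err)  # feedback leaves a much smaller error
```

The feedback run still shows a small residual offset, which previews a later course topic: pure proportional control leaves a steady-state error in the presence of a constant disturbance.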
Economic considerations are also important, especially in process control, i.e. in the
control of production systems. There are many so-called continuous processes, designed
to operate in the steady state. Power plants, distillation columns, and paper machines are
all examples of systems that are designed to keep constant operating conditions except
during start-up and shutdown. The operating points of the units of such processes are
often chosen to maximize economic returns, subject to inequality constraints imposed on
certain variables by quality or safety considerations.
Safety is a third justification for control. An aircraft landing under poor visibility
conditions must be controlled to follow a safe glide path. A nuclear reactor must operate
in such a way that key variables are kept within safe limits. Many systems have "danger
zones"; it is the task of the control system to avoid them.
Control is used to achieve better reliability. Physical systems are usually less prone to
failure if they are operated smoothly and with low levels of fluctuation. Automobile
manufacturers advise drivers to avoid quick starts and stops in order to lengthen the lives
of their cars. Nature appears to abhor radical, quick changes, as it is said to abhor a
vacuum.
PHYSICAL ELEMENTS OF A CONTROL SYSTEM
Figure 1: Physical components of a control system
Figure 1 depicts the elements of a control system. The word "plant" is used generically to
denote the object under control; in this context, a vessel is a plant. A plant has output variables,
some of which are the ones to be controlled. These variables are measured by sensors.
A sensor is basically a transducer, i.e., a device that transforms one type of physical quantity
to another, usually electrical. Examples of sensors are tachometers, accelerometers,
thermocouples, strain gauges, and pH meters.
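The transducer idea can be sketched as a pair of maps: a forward map from the physical quantity to an electrical signal, and the inverse map the measurement system applies to recover the quantity. The linear model and the sensitivity value below are illustrative assumptions, not a real thermocouple calibration (real thermocouples are nonlinear and need cold-junction compensation).

```python
# Illustrative sketch (assumed linear model): a temperature sensor as a
# transducer mapping degrees Celsius to millivolts and back.

SENSITIVITY_MV_PER_C = 0.041  # assumed slope, mV/degC; not a calibration value

def temp_to_mv(temp_c, ref_c=0.0):
    """Forward transduction: temperature difference -> millivolts."""
    return SENSITIVITY_MV_PER_C * (temp_c - ref_c)

def mv_to_temp(mv, ref_c=0.0):
    """Inverse map applied by the measurement system to recover temperature."""
    return ref_c + mv / SENSITIVITY_MV_PER_C

reading = temp_to_mv(100.0)
print(round(reading, 3))           # 4.1 mV at 100 degC under this model
print(round(mv_to_temp(reading)))  # recovers 100
```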