System development in the Information Technology world typically follows a sequence of formal, well-established procedures and methods, designed to improve productivity, enhance the quality of the final product, and speed up the process from inception to final implementation of the system. Can practitioners involved in developing real-world OR applications benefit from this type of methodology? Dr. Rahnama will address this question, drawing on her years of industry experience in both OR and pure IT environments.
Plant Design and Noise Control: Optimizing Noise Levels through Design, or "Minimizing the Bang, Saving the Buck"
The most effective time to include noise control measures is at the design
stage. Once the plant is "steel in the field", the costs of removing or altering
existing equipment (to reduce the noise levels) go up considerably. Typically,
a retrofit costs 3 to 5 times as much as doing it right the first time. In some
cases, because of equipment or space constraints, a retrofit becomes extremely
expensive or almost impossible. The cost of implementing proper noise control
measures at the design stage, to meet EUB or similar regulations, has been
found to obey (approximately) the following equation:
Cost (as % of project) = 1000 / (Distance to nearest receiver in metres)
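As a quick sanity check on the rule of thumb, the equation above can be evaluated for a few receiver distances (the distances here are illustrative, not from the talk):

```python
# Illustrative evaluation of the rule-of-thumb cost equation above.
# The formula is taken from the text; the sample distances are invented.

def noise_control_cost_pct(distance_m: float) -> float:
    """Approximate design-stage noise control cost as % of project."""
    return 1000.0 / distance_m

for d in (100, 250, 500, 1000):
    print(f"{d:>5} m -> {noise_control_cost_pct(d):.1f}% of project cost")
# 100 m -> 10.0%, 1000 m -> 1.0%: cost falls off quickly with distance.
```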
This talk will review basic acoustical concepts, common noise control techniques,
and three methods of combining noise control with reductions in operating costs.
TOPIC Rapid Modelling using Spreadsheet Software
Over the last several years we have developed software to approximate work flow performance within capacity-constrained systems, such as those found in manufacturing. Such systems can frequently be modelled as networks of queues. A spreadsheet environment can be used to model these networks effectively and allow users to investigate their behaviour readily. The effects of workloads, bottlenecks, flow patterns, lot sizes, setup times, machine reliability and scrap rates are some of the things that can be investigated quickly and efficiently. While this software was developed with a focus on education and industrial training, it has also proved important as a research tool. This presentation will briefly describe what Rapid Modelling is and then demonstrate the use of our software to model and analyze simple jobshop and batch production environments. If time permits, we will also briefly present our recent use of Rapid Modelling to solve the "Multi-Item Capacitated Lot Sizing" problem. This work has important implications for lot-size setting in batch production environments.
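To give a flavour of the kind of queueing approximation such spreadsheet tools implement, here is a minimal sketch that treats each workstation as an M/M/1 queue. The station names and rates are invented for illustration; this is not the speaker's software.

```python
# A minimal open-network-of-queues sketch in the spirit of Rapid Modelling.
# Each workstation is treated as an M/M/1 queue: utilization u = lam/mu,
# average jobs at the station L = u/(1-u), average time W = 1/(mu-lam).
# Station data are invented; the actual spreadsheet tool is not reproduced.

stations = {            # name: (arrival rate lam, service rate mu), jobs/hour
    "saw":   (4.0, 5.0),
    "mill":  (4.0, 6.0),
    "drill": (4.0, 4.5),
}

for name, (lam, mu) in stations.items():
    u = lam / mu                      # utilization
    wip = u / (1.0 - u)               # average jobs at the station (M/M/1)
    wait = 1.0 / (mu - lam)           # average hours spent at the station
    print(f"{name:5s} u={u:.2f}  WIP={wip:.2f}  time={wait:.2f} h")
```

Even in this toy network, the drill (at 89% utilization) dominates flow time and work-in-process, which is the bottleneck behaviour the abstract mentions.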
SPEAKER Tom Morrison - Senior Partner, Morrison Scientific Inc.
TOPIC Statistical Estimation of Parameters of Distributions for Grouped Data
Statistical distributions, used as input to operations research models, are often neglected or left until the end of a project because of the excitement and effort required to build the model itself. Operations research modelling requires good estimation of the parameters of statistical distributions; in many cases, however, the data are not in the exact form an OR researcher requires for a good fit to a distribution to be made. One significant problem arises when the data are grouped and estimates have to be made using the grouped data. There are some new methods that can be used for some types of data, and this talk is about some of those methods. The objective of this seminar is to introduce OR researchers to some of the more modern methods that can be applied to grouped data to obtain better estimates of the parameters of distributions.
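As a concrete (textbook, not necessarily the speaker's) illustration of estimation from grouped data: when only bin counts survive, the multinomial log-likelihood built from bin probabilities can still be maximized. The bins and counts below are invented to be consistent with an exponential distribution of rate 0.5.

```python
import math

# Grouped-data maximum likelihood for an exponential distribution.
# A generic textbook approach, shown here only to make the grouped-data
# problem concrete; it is not necessarily one of the talk's new methods.

bins = [(0.0, 1.0), (1.0, 2.0), (2.0, 3.0), (3.0, math.inf)]
counts = [3935, 2387, 1447, 2231]   # invented, consistent with rate 0.5

def cdf(x, lam):
    return 1.0 if math.isinf(x) else 1.0 - math.exp(-lam * x)

def log_likelihood(lam):
    # Multinomial log-likelihood: sum_i n_i * log P(a_i < X <= b_i)
    return sum(n * math.log(cdf(b, lam) - cdf(a, lam))
               for (a, b), n in zip(bins, counts))

# Crude grid search; a real implementation would use a proper optimizer.
lam_hat = max((l / 1000.0 for l in range(10, 2000)), key=log_likelihood)
print(f"estimated rate: {lam_hat:.3f}")   # close to 0.5
```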
Speaker: Dan Ruiu, Powerpool of Alberta
Topic: Electric Power Systems - Short Term Load Forecasting (STLF) Using Artificial Neural Networks (ANN)
Abstract
The presentation will include a brief description of ANN concepts, STLF uses and requirements, the utilization of ANN for STLF, and a comparison of STLF results obtained using "classic" and ANN methods.
BIOGRAPHY Mr. Ruiu is an electrical engineer with post-graduate degrees in power engineering and international relations (economics). He has worked in the electric power field with private and public utilities, the World Bank, consulting organizations and equipment manufacturers for over 40 years. In addition to his work assignments in Canada and the USA, he has worked on power projects in several countries in Europe, Africa and Asia. He has written over 30 technical papers and articles published in different technical magazines and proceedings in Canada and the USA. Mr. Ruiu is a professional engineer registered in Ontario and a senior member of the Institute of Electrical and Electronics Engineers (IEEE).
Henri Theil: His Contributions to Econometrics and Operations Research.
Econometrics is the branch of economics that deals with building and estimating mathematical models in economics. These models describe national economies as well as sectors or single variables. They are used for forecasting and economic policy making.
Henri Theil, who died last month, was one of the world's top econometricians and a candidate for the Nobel Prize in Economics.
A number of the tools he invented are surveyed, such as the estimation of macro-economic models, the evaluation of forecasts using inequality coefficients, quadratic programming, and the optimization of both economic policy and management decisions.
A brief sketch of his life and career, in the Netherlands and the US, will be given.
Roads and road transportation are essential to economic development and the integration of the national economy. The Central Road Institute serves as a central resource of research and consultation for the Federal Government of India and its states regarding all technical and economic aspects of building and maintaining the national road network.
Dr Kanchan will describe the various aspects of the work of the institute and will briefly talk about some of his own work dealing with a road maintenance optimization model.
The Use of Data Envelopment Analysis (DEA) for Bank Branch Efficiency
Even accountants think the traditional budget process is outdated. Today's dynamic, fast-paced business environment demands more complex strategic tools to help organisations see the future and their place in it.
The Balanced Scorecard is one of these new, popular strategic tools. Like most new strategic tools, it presents many practical problems that must be overcome before it can be successfully implemented.
Data Envelopment Analysis (DEA) is a nonparametric (mathematical programming) tool that provides a comprehensive measure of relative performance for each unit in a multidimensional measurement environment. This makes it an excellent tool for overcoming the practical measurement problems imposed by more sophisticated strategic tools, like the balanced scorecard.
My research uses DEA to measure the relative performance of bank branches, using output measures that you would find in a balanced scorecard. I will briefly review current developments in using DEA to measure bank performance, and then present the results and practical problems uncovered in my research. There will be a general discussion of the potential uses of DEA to support strategic planning.
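To make the DEA machinery concrete, here is the standard input-oriented CCR linear program applied to a toy single-input, single-output data set. The three "branches" and their figures are invented for illustration; this is the textbook formulation, not the research described above, and it assumes SciPy is available.

```python
import numpy as np
from scipy.optimize import linprog

# Input-oriented CCR DEA on an invented single-input, single-output data set.
# For each branch o, minimize theta subject to a composite peer using no
# more than theta times branch o's inputs while matching its outputs.

X = np.array([[2.0], [4.0], [5.0]])   # inputs, one row per branch (e.g. staff hours)
Y = np.array([[2.0], [2.0], [5.0]])   # outputs (e.g. accounts opened)
n = len(X)

def efficiency(o):
    # Variables: [theta, lambda_1, ..., lambda_n]; minimize theta.
    c = np.r_[1.0, np.zeros(n)]
    # Inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[o].reshape(-1, 1), X.T])
    # Outputs: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(X.shape[1]), -Y[o]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

scores = [efficiency(o) for o in range(n)]
print([round(s, 3) for s in scores])   # the second branch is dominated
```

An efficiency of 1 means the branch lies on the frontier; a score below 1 says its peers could deliver its outputs with proportionally fewer inputs.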
Present values for amounts to be paid or received in the future
represent a basic feature of most business-problem modeling. However,
what is the appropriate discount rate? Does the common practice of
setting a return target, which is then used as the discount rate in a
net present value (NPV) calculation, at least ensure recovery of a
sunk capital investment in a risky cash flow? When cash flows are
uncertain and the irreversible investment costs can be delayed, does the
hurdle-rate NPV threshold produce correct decisions when there is this
timing option? Application of a simple stochastic dynamic programming
model, calibrated with parameters observed in the markets, provides
powerful results that address these questions and point towards
normative standards for choosing discount rates and for timing
investments subject to risk. A simplified development project for a
lease on an oil spacing unit serves as an illustration to relate the
model to concrete numbers. Data from the NYMEX and the NYSE are used to
provide calibrated parameters for the examples. Some of the conclusions may be startling.
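A one-period binomial toy captures the timing option the abstract describes: even a positive-NPV project can be worth delaying when the investment is irreversible and the payoff uncertain. All numbers below are invented for illustration; the talk's stochastic dynamic programming model, calibrated to market data, is far richer.

```python
# One-period illustration of the option to delay an irreversible investment.
# Project value, cost, up/down moves and discount rate are all invented.

V0, I = 110.0, 100.0        # current project value and irreversible cost
u, d, r = 1.2, 0.8, 0.10    # binomial moves and one-period discount rate

npv_now = V0 - I                                     # invest immediately
q = ((1 + r) - d) / (u - d)                          # risk-neutral probability
wait = (q * max(u * V0 - I, 0.0)
        + (1 - q) * max(d * V0 - I, 0.0)) / (1 + r)  # value of delaying

print(f"NPV now: {npv_now:.2f}, value of waiting: {wait:.2f}")
```

Here waiting is worth about 21.8 against an immediate NPV of 10: the naive hurdle-rate rule ("invest whenever NPV is positive") gives the wrong answer once the timing option is recognised.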
A Mixed Integer Non-linear Program for Oilfield Production Planning
Bob Dawson will speak about the paper that he and Dr. J. David Fuller
published, in the May 1999 INFOR, entitled A Mixed Integer Non-linear
Program for Oilfield Production Planning. This paper details a model called
the Oil Production Optimisation Program (OPOP). This prototype of a tool to
assist heavy-oil operators in maximising the profitability of producing
operations captures both discrete pump-replacement and continuous-time
production rate decisions. In contrast to conventional medium- and
light-crude production, heavy-oil production rates are more strongly
influenced by technology selection and scale of implementation. Smaller
margins in heavy-oil, from generally higher input costs and lower output
prices, increase the relative benefits of employing OR tools for optimal
development planning. One such tool is the algorithm developed by Mr.
Dawson and Dr. Fuller.
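The model's mixed structure (a discrete pump-replacement choice coupled with a continuous production-rate decision) can be sketched in miniature by enumerating the pump options and optimizing the rate for each. Every number and name below is invented; this is a structural illustration, not the authors' OPOP model.

```python
# Toy version of the mixed discrete/continuous structure in OPOP:
# enumerate the discrete pump choices, then optimize the continuous
# production rate for each. Pumps, prices and costs are invented.

pumps = {                      # name: (capacity bbl/d, daily fixed cost $)
    "existing": (200.0, 0.0),
    "upgrade":  (400.0, 350.0),
}
price, lift_cost = 12.0, 4.0   # $/bbl netback and variable lifting cost
decline = 0.01                 # crude proxy: $/bbl penalty per bbl/d of rate

def best_rate(capacity):
    # profit(rate) = (price - lift_cost - decline*rate) * rate - fixed
    # Unconstrained optimum of the concave quadratic, capped at capacity.
    return min(capacity, (price - lift_cost) / (2 * decline))

best = None
for name, (cap, fixed) in pumps.items():
    rate = best_rate(cap)
    profit = (price - lift_cost - decline * rate) * rate - fixed
    print(f"{name:8s} rate={rate:.0f} bbl/d  profit=${profit:,.0f}/d")
    if best is None or profit > best[1]:
        best = (name, profit)
print("choose:", best[0])
```

With these numbers the costlier pump narrowly wins because its extra capacity lets the rate reach the quadratic's optimum, the kind of trade-off between technology selection and scale that the abstract highlights.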