Table of Contents

Journal Of Industrial Engineering International
Volume: 4, Issue: 1, Jan 2008

  • Publication date: 1387/02/11
  • Number of articles: 8
  • R. Sadeghian *, G. R. Jalali, Naini Page 1

    Although knowing the time of occurrence of earthquakes would be vital and helpful, earthquakes remain unpredictable, and there is an urgent need for methods to foresee this catastrophic event. Many methods exist for forecasting the time of earthquake occurrence; another approach is to determine the probability density function of the time interval between successive earthquakes. In this paper a new probability density function (PDF) for the time interval between earthquakes is derived. The parameters of the PDF are estimated, and the PDF is then tested against earthquake data from Iran.

    Keywords: Forecasting, Probability Density Function (PDF), Distribution function, Earthquake
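    The workflow the abstract describes (propose a PDF for inter-event times, estimate its parameters, test it on data) can be sketched with a standard baseline. The snippet below is a hypothetical illustration only: it fits an ordinary exponential density, not the paper's new PDF, and the interval data are invented.

```python
import numpy as np

# Invented inter-event times (years) between consecutive earthquakes.
intervals = np.array([1.2, 0.8, 2.5, 1.9, 0.6, 3.1, 1.4, 2.2])

# Maximum-likelihood fit of an exponential PDF f(t) = lam * exp(-lam * t):
# the MLE of the rate lam is the reciprocal of the sample mean.
lam = 1.0 / intervals.mean()

def pdf(t, lam):
    """Exponential density of the waiting time between events."""
    return lam * np.exp(-lam * t)
```

    The fitted density could then be compared against the empirical distribution of the intervals (e.g. with a goodness-of-fit test), which is the role the Iranian earthquake data play in the paper.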
  • M. Aghdaghi, F. Jolai * Page 7

    The vehicle routing problem with backhauls (VRPB), an extension of the classical vehicle routing problem (VRP), attempts to define a set of routes that services both linehaul customers, to whom products are to be delivered, and backhaul customers, from whom goods need to be collected. The primary objective for the problem is usually to minimize the total distribution cost, but most real-life problems have other objectives in addition to this common primary one. This paper describes a multi-objective model for the VRPB with time windows (VRPBTW) and some new assumptions. We present a goal programming approach and a heuristic algorithm to solve the problem. Computational experiments are carried out and the performance of the developed methods is discussed.

    Keywords: Vehicle routing problem, Backhaul, Soft time windows, Goal programming, Heuristic
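    As a minimal sketch of the VRPB's defining precedence constraint (all deliveries served before any collections on a route), the following hypothetical single-vehicle nearest-neighbour heuristic is offered; the coordinates are invented, and this is a generic construction heuristic, not the paper's goal-programming method.

```python
import math

# Invented instance: a depot, three linehaul (delivery) customers and two
# backhaul (pickup) customers for a single vehicle.
depot = (0.0, 0.0)
linehauls = {"L1": (2.0, 1.0), "L2": (4.0, 3.0), "L3": (1.0, 4.0)}
backhauls = {"B1": (3.0, 0.5), "B2": (0.5, 2.5)}

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def nearest_neighbour_route(depot, linehauls, backhauls):
    """Greedy VRPB route: serve every linehaul customer before any backhaul
    customer, always moving to the nearest unvisited stop in the current group."""
    route, pos = [], depot
    for group in (dict(linehauls), dict(backhauls)):
        while group:
            nxt = min(group, key=lambda c: dist(pos, group[c]))
            route.append(nxt)
            pos = group.pop(nxt)
    return route

def route_length(depot, route, coords):
    """Total travelled distance: depot -> stops in order -> depot."""
    pts = [depot] + [coords[c] for c in route] + [depot]
    return sum(dist(a, b) for a, b in zip(pts, pts[1:]))

route = nearest_neighbour_route(depot, linehauls, backhauls)
total = route_length(depot, route, {**linehauls, **backhauls})
```

    A multi-objective treatment such as the paper's would add soft time-window penalties as further goals on top of this distance term.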
  • J. R. Sharma*, A. M. Rawani Page 19

    In an ever-changing business scenario, manufacturing industries have to be in a position to recognize the shifting pulse and demands of the market. Customer satisfaction and quality management have become strategic issues for companies in the new millennium. The Quality Function Deployment (QFD) literature suggests that building the House of Quality (HoQ) is not a difficult task; analyzing and interpreting the available information, however, is replete with uncertainty and often yields less than optimal solutions. This paper addresses these twin issues of post-HoQ analysis and its interpretation through SWOT. The development and mechanics of the QFD model are presumed to be known to the reader, and the paper deals specifically with the post-HoQ stage through a well-defined and structured approach to comprehensive matrix analysis. The paper contributes a method for evaluating and analyzing the customer and technical data in QFD so as to generate useful information that supports better decision making. The outcome of the study is a comprehensive solution covering post-matrix analysis: its underlying concepts, requisite steps, information needed, and the computations involved. The applicability of the proposed model is demonstrated with an illustrative hypothetical example of a medical-care product, the disposable syringe and needle.

    Keywords: QFD, HoQ, SWOT, Prioritization, Decision making
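    The prioritization that post-HoQ analysis starts from rests on a standard computation: weighting the relationship matrix by customer importance to rank the technical characteristics. A minimal sketch with invented 9/3/1-scale figures (not the paper's data) for the syringe example:

```python
import numpy as np

# Invented HoQ figures for a disposable syringe: rows are customer needs,
# columns are technical characteristics, entries on the usual 9/3/1 scale.
importance = np.array([5, 3, 4])              # customer importance weights
relationship = np.array([[9, 3, 0],           # sterile packaging
                         [1, 9, 3],           # smooth plunger action
                         [0, 3, 9]])          # sharp, burr-free needle

# Absolute technical importance: importance-weighted column sums.
absolute = importance @ relationship
# Relative priorities, used to rank the technical characteristics.
relative = absolute / absolute.sum()
```

    The paper's contribution concerns what comes after this step: interpreting such priorities through SWOT rather than taking the raw ranking at face value.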
  • P. Hanafizadeh*, E. Salahi Parvin, P. Asadolahi, N. Gholami Page 32

    There are three major strategies for forming neural network ensembles. The simplest is the cross-validation strategy, in which all members are trained with the same training data; the bagging and boosting strategies instead produce perturbed samples from the training data. This paper builds an ideal model based on two important factors, the activation function and the number of neurons in the hidden layer, and, based upon these factors, compares the results of the best trained single model (the single best model) with the cross-validation ensemble in a case study using US presidential election data. In this experiment, the comparison shows that the cross-validation ensemble achieves a lower generalization error.

    Keywords: Ensemble strategy, Neural networks
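    A toy illustration of the cross-validation strategy described above: several members are trained on the same data, differ only through their random initialisation, and their predictions are averaged. The model, data, and training loop are illustrative assumptions, not the paper's networks or election data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented 1-D regression task: y = 2x + 1 plus noise.
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.1, size=x.shape)

def train_member(x, y, rng, steps=500, lr=0.1):
    """One ensemble member: a linear model fitted by gradient descent.
    All members see the same data (the cross-validation strategy);
    they differ only through their random initialisation."""
    w, b = rng.normal(), rng.normal()
    for _ in range(steps):
        resid = w * x + b - y
        w -= lr * 2.0 * np.mean(resid * x)
        b -= lr * 2.0 * np.mean(resid)
    return w, b

members = [train_member(x, y, rng) for _ in range(5)]

# Ensemble prediction: the average of the members' outputs.
ensemble_pred = np.mean([w * x + b for w, b in members], axis=0)
mse = float(np.mean((ensemble_pred - y) ** 2))
```

    Averaging cancels part of the initialisation-induced variance of the individual members, which is the mechanism behind the lower generalization error the paper reports.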
  • M. S. Sabbagh*, M. Roshanjooy Page 39

    Presented here is a generalization of the implicit enumeration algorithm that can be applied when the objective function being maximized can be rewritten as the difference of two non-decreasing functions. Also developed is a computational algorithm, named linear speedup, that uses whatever explicit linear constraints are present to speed up the search for a solution. The method is easy to understand and implement, yet very effective on many integer programming problems, including knapsack, reliability optimization, and spare allocation problems. To illustrate an application of the generalized algorithm, note that branch-and-bound is the popular method for solving integer linear programming problems, but it cannot efficiently solve all of them. For example, De Loera et al. in their 2005 paper discuss some knapsack problems that CPLEX cannot solve in hours. We use our generalized algorithm to find global or near-global optimal solutions for those problems in less than 100 seconds. The algorithm is based on function values only; it does not require continuity or differentiability of the problem functions, which allows its use on problems whose functions cannot be expressed in closed algebraic form. The reliability and efficiency of the proposed algorithm are demonstrated on integer optimization problems taken from the literature.

    Keywords: Algebraic form, Function values, Generalized implicit enumeration, Integer programming, Linear speedup
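    As a compact illustration of implicit enumeration on one of the problem classes mentioned, the 0-1 knapsack, the sketch below searches include/exclude decisions depth-first and fathoms nodes whose optimistic bound cannot beat the incumbent. It is a generic textbook version, not the paper's generalized algorithm or its linear-speedup device.

```python
def knapsack_implicit(values, weights, capacity):
    """Implicit enumeration for the 0-1 knapsack problem: depth-first search
    over include/exclude decisions, fathoming any branch whose optimistic
    bound (current value plus all remaining values) cannot improve the
    incumbent.  Uses only function values, no derivatives."""
    n = len(values)
    # suffix[i] = total value of items i..n-1, an optimistic completion bound.
    suffix = [0] * (n + 1)
    for i in range(n - 1, -1, -1):
        suffix[i] = suffix[i + 1] + values[i]
    best = 0
    def dfs(i, value, room):
        nonlocal best
        if value > best:
            best = value                    # new incumbent
        if i == n or value + suffix[i] <= best:
            return                          # fathom: bound cannot improve incumbent
        if weights[i] <= room:              # branch: include item i
            dfs(i + 1, value + values[i], room - weights[i])
        dfs(i + 1, value, room)             # branch: exclude item i
    dfs(0, 0, capacity)
    return best
```

    For example, `knapsack_implicit([60, 100, 120], [10, 20, 30], 50)` enumerates only a handful of the eight possible subsets before proving the optimum of 220.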
  • S. Saati* Page 51

    In Data Envelopment Analysis (DEA) models, an optimal set of input and output weights is generally assumed to represent the assessed Decision Making Unit (DMU) in the best light relative to all the other DMUs. These weight sets are typically different for each participating DMU, so it is important to find a Common Set of Weights (CSW) across the set of DMUs. In this paper, a procedure is suggested for finding a CSW in DEA; in the proposed procedure, a CSW is obtained by solving just one linear program. A numerical example is solved to demonstrate the concept.

    Keywords: Data envelopment analysis, Weight restriction, Common set of weights, Linear programming
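    The paper's particular single-LP procedure is not reproduced in this listing; as a generic illustration of the common-weights idea, one widely used deviation-minimizing formulation over a single shared weight pair $(u, v)$ is:

```latex
\begin{aligned}
\min_{u,\,v,\,\Delta} \quad & \sum_{j=1}^{n} \Delta_j \\
\text{s.t.} \quad & u^{\mathsf T} Y_j - v^{\mathsf T} X_j + \Delta_j = 0,
                    \qquad j = 1,\dots,n, \\
& u \ge \varepsilon \mathbf{1}, \quad v \ge \varepsilon \mathbf{1},
  \quad \Delta_j \ge 0,
\end{aligned}
```

    where $X_j$ and $Y_j$ are the input and output vectors of DMU $j$ and $\Delta_j$ measures its shortfall from efficiency; a DMU with $\Delta_j = 0$ is efficient under the common weights, and the whole model is a single linear program.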
  • A. Makui*, A. Alinezhad, M. Zohrehbandian Page 57

    A characteristic of data envelopment analysis (DEA) is that it allows individual decision making units (DMUs) to select the factor weights that are most advantageous for them in calculating their efficiency scores. This flexibility in selecting weights, on the other hand, prevents comparison among DMUs on a common basis. To deal with this difficulty and assess all the DMUs on the same scale, this paper proposes a multiple objective linear programming (MOLP) approach for generating a common set of weights within the DEA framework. This is an advantage of the proposed approach over general approaches in the literature, which are based on multiple objective nonlinear programming.

    Keywords: MOLP, Maximin method, DEA, Efficiency, Ranking, Weight restrictions
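    In the spirit of the abstract, which pairs MOLP with the maximin method, the underlying vector-optimization problem and its maximin scalarization can be written as follows (the paper's linearization of this model is not reproduced here):

```latex
\max_{u,\,v}\; \bigl(\theta_1(u,v),\,\dots,\,\theta_n(u,v)\bigr),
\qquad
\theta_j(u,v) = \frac{u^{\mathsf T} Y_j}{v^{\mathsf T} X_j},
\qquad
u \ge \varepsilon \mathbf{1},\; v \ge \varepsilon \mathbf{1},
```

```latex
\max_{u,\,v} \; \min_{1 \le j \le n} \theta_j(u,v),
```

    i.e. all DMU efficiencies are maximized simultaneously, and the maximin scalarization picks the common weights that make the worst-off DMU as efficient as possible.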
  • O. E. Charles-Owaba, A. E. Oluleye, F. A. Oyawale, S. A. Oke* Page 64

    The conventional method of deriving a schedule for a fleet of ships so as to minimize cost alone has the shortcoming of not addressing the operating-revenue losses associated with delays during maintenance at dockyards. In this paper, a preventive maintenance schedule for a fleet of ships that incorporates opportunity cost is presented. The idea is to assign a penalty cost to all idle periods that a ship spends at the dockyard. A version of the scheduling problem was defined as a transportation model minimizing maintenance costs, with fixed maintenance duration and dockyard capacity as the two constraints of the formulation. Relevant data from a shipping firm owning eight ships and a dockyard in Lagos with a maintenance capacity of three ships per month were collected over a 24-month period. The maintenance cost function was then formulated, its parameters estimated, and the transportation tableau set up. The eight ships considered arrived at the dockyard between the 1st and 20th months and were expected to spend between two and five months in preventive maintenance. The optimal schedule of the cost function resulted in ships 1 to 8 being idle for 74 months. The results of the study show that, to reduce cost and delays, decisions on scheduling preventive maintenance for a fleet of ships should be based on opportunity cost.

    Keywords: Preventive maintenance scheduling, Maintenance cost, Opportunity cost, Fleet of ships scheduling
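    The transportation-model view of this scheduling problem can be sketched on a toy tableau: ships supply maintenance jobs, monthly dockyard slots demand them, and each cell cost can fold in an idle-time (opportunity-cost) penalty. All figures below are invented, and the greedy minimum-cost method shown only builds a feasible starting schedule, not the paper's optimal one.

```python
# Invented tableau: cost[i][j] is the cost (maintenance plus an assumed idle
# penalty) of servicing ship i in month j.  One slot per ship, one ship per
# month, mirroring a dockyard capacity constraint.
cost = [[4, 6, 8],    # ship A in month 1, 2, 3
        [5, 3, 7],    # ship B
        [6, 5, 4]]    # ship C
supply = [1, 1, 1]    # each ship needs one maintenance slot
demand = [1, 1, 1]    # dockyard capacity: one ship per month

def min_cost_assignment(cost, supply, demand):
    """Greedy minimum-cost method for a transportation tableau: repeatedly
    fill the cheapest cell that still has supply and demand available.
    Returns total cost and the allocation plan (ship, month, amount)."""
    supply, demand = supply[:], demand[:]
    total, plan = 0, []
    cells = sorted((c, i, j) for i, row in enumerate(cost)
                   for j, c in enumerate(row))
    for c, i, j in cells:
        move = min(supply[i], demand[j])
        if move > 0:
            plan.append((i, j, move))
            supply[i] -= move
            demand[j] -= move
            total += c * move
    return total, plan

total, plan = min_cost_assignment(cost, supply, demand)
```

    On this instance the greedy fill already lands on the least-cost schedule (A in month 1, B in month 2, C in month 3); in general one would refine such a starting solution with a transportation-simplex step, as in the paper's optimization.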