Table of Contents


Journal of Quality Engineering and Production Optimization
Volume:2 Issue: 1, Winter - Spring 2017

  • Publication date: 1395/01/08
  • Number of articles: 6
  • Masoud Rabbani *, Farzad Mehrpour, Amir Farshbaf-Geranmayeh Pages 1-16
    Lean manufacturing is a strategic concern for companies that conduct mass production, and it has become even more significant for those producing in a project-oriented way through modularization. In this paper, a bi-objective optimization model is proposed to design and plan a supply chain up to the final assembly centre. Delivery time, procurement quality, and low production fluctuation are the most important lean production principles considered. Because of the long planning horizon and the subjective nature of the gathered data, uncertainty must be handled. Therefore, a robust credibility-based fuzzy programming (RCFP) approach is proposed to perform the robust optimization and to obtain the crisp equivalent of the MILP model through chance-constrained programming based on the credibility measure. A real industrial case study is provided to demonstrate the usefulness and applicability of the proposed model and programming approach.
    Keywords: Lean manufacturing, Supply chain network design, Robust optimization, Credibility measure, Fuzzy programming
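The credibility measure used in the RCFP approach has a simple closed form for triangular fuzzy numbers. As an illustrative sketch (not the paper's actual model), the following Python computes Cr{ξ ≤ x} for a triangular fuzzy number ξ = (a, b, c) and the crisp bound equivalent to a chance constraint Cr{ξ ≤ x} ≥ α with α ≥ 0.5:

```python
def credibility_leq(x, a, b, c):
    """Cr{xi <= x} for a triangular fuzzy number xi = (a, b, c):
    the average of the possibility and necessity measures."""
    if x <= a:
        return 0.0
    if x <= b:
        return (x - a) / (2.0 * (b - a))
    if x <= c:
        return 0.5 + (x - b) / (2.0 * (c - b))
    return 1.0

def crisp_bound(alpha, a, b, c):
    """Crisp equivalent of Cr{xi <= x} >= alpha (alpha >= 0.5):
    the smallest x that satisfies the chance constraint."""
    assert alpha >= 0.5
    return b + (2 * alpha - 1) * (c - b)
```

Conversions of this kind are how chance constraints on fuzzy coefficients become ordinary linear constraints in the crisp MILP.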
  • Azam Goodarzi, Amirhossein Amiri * Pages 17-26
    Reliability data are increasingly used to monitor and improve the quality of products and services. Nowadays, most products or services result from processes with dependent stages, referred to as multi-stage processes. In these processes, the quality characteristics in each stage are affected by those in the previous stages, a relationship known as the cascade property. In some cases, it is not possible to collect all the lifetime data due to resource limitations. Thus, the control charts are compared under two different scenarios: censored and non-censored data. In this paper, the accelerated failure time (AFT) model is used and two control charts are presented to monitor the quality characteristic in the second stage under censored and non-censored reliability data. The exponentially weighted moving average (EWMA) and cumulative sum (CUSUM) control charts are applied to the proposed residuals. The performance of the proposed control schemes is evaluated in terms of zero-state and steady-state average run length criteria through extensive simulation studies. The results generally show that the CUSUM control chart performs better than the EWMA control chart for monitoring lognormal reliability data in a two-stage process. However, the EWMA control chart outperforms the CUSUM control chart under small shifts when the reliability data are uncensored or the censoring rate is 20%.
    Keywords: Accelerated failure time (AFT) model, Cascade property, Censored data, CUSUM control chart, EWMA control chart
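The EWMA and CUSUM recursions the abstract refers to are standard; a minimal sketch of both statistics applied to a residual series (the smoothing constant lam and reference value k are illustrative defaults, not the paper's settings):

```python
def ewma(residuals, lam=0.2, target=0.0):
    """EWMA statistic z_t = lam * x_t + (1 - lam) * z_{t-1},
    started at the in-control target."""
    z, out = target, []
    for x in residuals:
        z = lam * x + (1 - lam) * z
        out.append(z)
    return out

def cusum_upper(residuals, k=0.5, target=0.0):
    """Upper one-sided CUSUM: C_t = max(0, x_t - (target + k) + C_{t-1});
    a signal is raised when C_t exceeds a decision interval h."""
    c, out = 0.0, []
    for x in residuals:
        c = max(0.0, x - (target + k) + c)
        out.append(c)
    return out
```

In the paper's scheme, the input series would be the AFT-model residuals of the second stage, with appropriate adjustments for censored observations.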
  • Iman Rastgar, Rashed Sahraeian * Pages 27-46
    This paper considers the scheduling problem of parallel batch processing machines with non-identical job sizes and processing times. A new mathematical model with ready-time and batch-size constraints is presented to formulate the problem, in which the objective function is the simultaneous reduction of makespan and earliness-tardiness. In recent years, nature-inspired computational intelligence algorithms have been successfully employed to achieve the optimum design of different structures. Since the proposed model is NP-hard, a metaheuristic based on the harmony search algorithm is developed and analyzed for solving the batch processing machine scheduling problem addressed here. The parameters and operators of the proposed harmony search algorithm are discussed and calibrated by means of the Taguchi statistical technique. To evaluate the proposed algorithm, problem instances consistent with previous research are generated and solved with the proposed algorithm as well as the basic, improved, and global-best harmony search algorithms. The comparison reveals that the proposed algorithm performs better than the other algorithms.
    Keywords: Batch processing, Harmony search algorithm, Scheduling, Taguchi design of experiments, Parallel machines
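A basic harmony search of the kind the proposed algorithm builds on can be sketched as follows; the memory size, HMCR, PAR, and bandwidth values here are illustrative defaults, not the Taguchi-calibrated settings from the paper:

```python
import random

def harmony_search(objective, dim, bounds, hms=10, hmcr=0.9, par=0.3,
                   bw=0.05, iters=2000, seed=0):
    """Basic harmony search minimizer: each new harmony draws every
    decision variable either from memory (prob. HMCR, pitch-adjusted
    with prob. PAR) or uniformly at random, then replaces the worst
    memory member if it improves on it."""
    rng = random.Random(seed)
    lo, hi = bounds
    hm = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    fit = [objective(h) for h in hm]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if rng.random() < hmcr:
                v = hm[rng.randrange(hms)][d]
                if rng.random() < par:            # pitch adjustment
                    v = min(hi, max(lo, v + rng.uniform(-bw, bw)))
            else:                                 # random consideration
                v = rng.uniform(lo, hi)
            new.append(v)
        f = objective(new)
        worst = max(range(hms), key=fit.__getitem__)
        if f < fit[worst]:
            hm[worst], fit[worst] = new, f
    best = min(range(hms), key=fit.__getitem__)
    return hm[best], fit[best]
```

For the batch scheduling problem itself, the continuous harmonies would be decoded into batch assignments and sequences; the sketch above only shows the memory-update mechanics on a generic objective.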
  • Saeed Hosseinabadi, Mohammad Ranjbar *, Sepehr Ramyar, Masoud Amel-Monirian Pages 47-64
    In this paper, we consider a scheduling problem for a set of agile Earth observation satellites scanning different parts of the Earth’s surface. We assume that preemption is allowed to prevent repetitive images and develop four different preemption policies. Scheduling covers both the imaging time windows and the time windows for transmission to the Earth stations. The value of each picture of the different target regions and the memory and energy limitations of the satellite constellation make this problem computationally complex, so obtaining an optimum solution with a deterministic method is very time-consuming. Consequently, a genetic-based metaheuristic algorithm with a specific solution representation is developed to maximize the total value of the observation process, with heuristic rules used to build the initial population. Comparing the results of the proposed model with cases in which repetition of observed areas is not prevented indicates that the proposed model can bring about a significant increase in profits over the planning horizon.
    Keywords: Scheduling, Agile Earth observation satellite, Preemption, Genetic algorithm
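As a toy stand-in for the paper's genetic-based metaheuristic, the sketch below runs a genetic algorithm that selects a subset of observation targets to maximize total value under a single resource budget (a crude proxy for the memory and energy limits); the bitstring representation and operators are generic, not the paper's specific solution representation:

```python
import random

def ga_select_targets(values, costs, budget, pop=30, gens=200,
                      pmut=0.02, seed=1):
    """Toy genetic algorithm: a bitstring chromosome marks which
    targets are observed; infeasible chromosomes (over budget)
    score zero, so selection pushes toward feasible high-value sets."""
    rng = random.Random(seed)
    n = len(values)

    def fitness(ch):
        cost = sum(c for c, g in zip(costs, ch) if g)
        if cost > budget:
            return 0
        return sum(v for v, g in zip(values, ch) if g)

    popn = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=fitness, reverse=True)
        nxt = popn[:2]                                  # elitism
        while len(nxt) < pop:
            p1, p2 = rng.sample(popn[:pop // 2], 2)     # truncation selection
            cut = rng.randrange(1, n)
            child = p1[:cut] + p2[cut:]                 # one-point crossover
            child = [g ^ (rng.random() < pmut) for g in child]  # bit-flip mutation
            nxt.append(child)
        popn = nxt
    best = max(popn, key=fitness)
    return best, fitness(best)
```

The paper's algorithm additionally schedules imaging and downlink windows and handles preemption; this sketch only illustrates the value-maximizing selection mechanics.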
  • Fateme Marandi * Pages 65-76
    This study is concerned with how the quality of perishable products can be improved by shortening the time interval between production and distribution. As special types of food such as dairy products decay fast, the integration of production and distribution scheduling (IPDS) is investigated. Integrated scheduling of both processes improves performance and costs, because scheduling them separately without considering mutual requirements leads to non-optimal solutions. An optimal solution to IPDS requires simultaneously solving the production scheduling and vehicle routing problems. This article deals with a variant of IPDS involving a product with a short shelf life; hence, no inventory of the product is held in the process. Once a batch of products is produced, it must be transported, with non-negligible transportation time, directly to various customer locations. The objective is to minimize the cost of the makespan and the number of vehicles required to complete the distribution of the products so as to satisfy the demand of a given set of customers over a wide geographic region. The overall problem consists of permutation flow shop scheduling with machines and jobs, together with vehicles of different speeds and transportation capacities that carry jobs from the manufacturing company to customers in various zones; the vehicle routes and the number of vehicles must be determined. After an Integer Linear Programming (ILP) model of the problem is developed, and because the problem is NP-hard, a new graph-based heuristic method is proposed to solve it efficiently.
    Keywords: Production, Distribution, Permutation flow shop scheduling, Vehicle routing problem, Integration, Graph-based scheduling
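The permutation flow shop component of the problem uses the standard completion-time recursion; a minimal sketch (the processing-time matrix in the usage below is made up) computing the makespan of a given job permutation:

```python
def flowshop_makespan(perm, proc):
    """Makespan of a permutation flow shop via the recursion
    C[j][m] = max(C[j-1][m], C[j][m-1]) + p[job_j][m], where
    proc[job][m] is the processing time of a job on machine m."""
    n_m = len(proc[0])
    prev = [0.0] * n_m              # completion times of the previous job
    for job in perm:
        row = []
        for m in range(n_m):
            start = max(prev[m], row[m - 1] if m else 0.0)
            row.append(start + proc[job][m])
        prev = row
    return prev[-1]
```

In the integrated problem, this production-side makespan interacts with the routing side, since a job's completion time determines the earliest departure of the vehicle carrying it.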
  • Fahimeh Tanhaie *, Nasim Nahavandi, Sayyid Dawood Ahmadi Motlagh Pages 77-88
    Drum–Buffer–Rope (DBR) is a theory-of-constraints production planning methodology that operates by developing a schedule for the system’s first bottleneck, i.e., the bottleneck with the highest utilization. In the theory of constraints, any job that is not processed at the first bottleneck is referred to as a free good. Free goods do not use capacity at the first bottleneck, so very little attention is given to them in the DBR literature. The objective of this paper is to present a methodology that improves DBR material flow management by accounting for the second bottleneck and free goods. The paper presents a comparative analysis of DBR material flow management and the proposed methodology in a job shop environment. To study the impact of free goods and the second bottleneck on the performance of the DBR method, 18 job shop simulation models were developed and data analysis was done for each model. Lead time and throughput are the output performance measures. The simulation results show that the proposed methodology significantly improves both lead time and throughput.
    Keywords: Drum–Buffer–Rope, Theory of constraints, Free goods, Bottleneck
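Identifying the first and second bottlenecks by utilization, as DBR does, reduces to ranking work centres by required load over available capacity; a minimal sketch with hypothetical work-centre data:

```python
def rank_bottlenecks(demand_minutes, available_minutes):
    """Rank work centres by utilization = required load / capacity;
    the highest-utilization centre is the first (primary) bottleneck,
    the next one the second bottleneck, and so on."""
    util = {wc: demand_minutes[wc] / available_minutes[wc]
            for wc in demand_minutes}
    return sorted(util, key=util.get, reverse=True), util
```

Under the paper's proposed methodology, the second entry of this ranking (and the jobs that never visit the first entry, i.e., the free goods) would receive explicit attention rather than being ignored.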