Table of Contents

Journal of Mathematics and Modeling in Finance - Volume: 2, Issue: 2, Summer-Autumn 2022


  • Publication date: 1401/10/11
  • Number of titles: 12
  • Asma Hamzeh *, Faezeh Banimostafaarab, Fatemeh Atatalab Pages 1-14
    The rating of insurance companies is one of the necessary and operational policies for regulating and evaluating the performance of the insurance industry. It informs shareholders, customers, insurers, and even regulatory authorities, as well as formal and informal support bodies, about the current performance of insurance companies and their capabilities and prospects for the future. Rating insurance companies on regulatory indicators, and making and implementing administrative decisions for each company based on its regulatory rating, is one of the needs of the regulatory body. Doing this properly requires selecting indicators in the principal areas, weighting them according to their importance, and finally implementing the model. For this reason, in this study, the effective indicators for the regulatory rating of insurance companies were first identified through documentary studies and the relevant literature, and the initial indicators were scrutinized and completed using the results of a questionnaire. Then, indicator prioritization and weighting and the implementation of the regulatory rating model were performed for 2019. The indicators were weighted by the Shannon entropy method, and the rating of insurance companies was implemented under three different scenarios with the TOPSIS model and the weighted average method.
    Keywords: Regulatory Rating, The Shannon Entropy Method, The TOPSIS Model, Weighting
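    The entropy weighting and TOPSIS ranking steps described above can be sketched as follows. This is a minimal illustration: the decision matrix, the four insurers, and the three indicators are hypothetical, not the study's actual regulatory data.

```python
import numpy as np

def entropy_weights(X):
    """Shannon entropy weights for a decision matrix X (alternatives x criteria)."""
    P = X / X.sum(axis=0)                      # normalize each criterion column
    m = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        # entropy of each criterion; 0*log(0) is treated as 0
        E = -(np.where(P > 0, P * np.log(P), 0)).sum(axis=0) / np.log(m)
    d = 1 - E                                  # degree of diversification
    return d / d.sum()                         # weights sum to 1

def topsis(X, w, benefit):
    """TOPSIS closeness scores; benefit[j] is True if criterion j is benefit-type."""
    R = X / np.sqrt((X ** 2).sum(axis=0))      # vector normalization
    V = R * w
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((V - anti) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)             # higher is better

# hypothetical decision matrix: 4 insurers x 3 regulatory indicators
# (solvency ratio, capital, complaint rate -- the last is cost-type)
X = np.array([[0.8, 120.0, 0.05],
              [0.6,  90.0, 0.10],
              [0.9, 150.0, 0.02],
              [0.7, 110.0, 0.08]])
w = entropy_weights(X)
scores = topsis(X, w, benefit=np.array([True, True, False]))
ranking = np.argsort(-scores)                  # best insurer first
```

Weighting by entropy rewards indicators that actually discriminate between companies; an indicator with nearly identical values across insurers receives a weight close to zero.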
  • Hadi Bagherzadeh Valami *, Zeinab Sinaei Nasab Pages 15-36
    Data Envelopment Analysis (DEA) is an effective method for measuring the efficiency of Decision Making Units (DMUs). In the process of evaluating DMUs, two factors, efficiency and production size, can be used. When the production size of a unit is not optimal, its Returns to Scale (RTS) indicates the direction in which resources should be changed to enhance its productivity. In most previous research, RTS is considered to be increasing or decreasing, and frontier analysis is used to determine it. The concept of RTS in Network Data Envelopment Analysis (NDEA) is of particular interest. In this paper, a method based on the Most Productive Scale Size (MPSS) is developed in several steps; in addition to determining the RTS of each unit in a directional manner, the shortest change in resources needed to achieve the right size for network production is also obtained. In this approach, neither the computational complexity nor the ambiguity in the units' RTS is present.
    Keywords: Data Envelopment Analysis, Network data envelopment analysis, Returns to Scale, Efficiency, Most Productive Scale Size
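    As background for the network-DEA discussion above, the basic single-stage input-oriented CCR efficiency score can be sketched as a linear program. This is a minimal illustration with toy data, not the paper's network/MPSS procedure.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o (envelopment form).
    X: (m inputs x n units), Y: (s outputs x n units).
    Solves: min theta  s.t.  X@lam <= theta*x_o,  Y@lam >= y_o,  lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(n + 1)
    c[0] = 1.0                                  # minimize theta
    A1 = np.hstack([-X[:, [o]], X])             # X@lam - theta*x_o <= 0
    A2 = np.hstack([np.zeros((s, 1)), -Y])      # -Y@lam <= -y_o
    A = np.vstack([A1, A2])
    b = np.concatenate([np.zeros(m), -Y[:, o]])
    bounds = [(None, None)] + [(0, None)] * n   # theta free, lam >= 0
    res = linprog(c, A_ub=A, b_ub=b, bounds=bounds, method="highs")
    return res.x[0]

# toy example: 3 DMUs, one input, one output
X = np.array([[2.0, 4.0, 8.0]])
Y = np.array([[2.0, 4.0, 4.0]])
theta2 = ccr_efficiency(X, Y, 2)               # unit 2 is inefficient
```

In a network setting the same envelopment idea is applied stage by stage, which is why the frontier, and hence RTS classification, becomes more subtle than in this single-stage sketch.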
  • Saman Vahabi, Amir Payandeh Najafabadi * Pages 37-52
    There is a variety of products in the life insurance literature. These products differ in how the benefits are paid and in their execution time. In this paper, we design a pure-endowment insurance contract and obtain the optimal strategy and consumption for a policyholder with a CRRA utility function. In the designed contract, premiums are received from the policyholder at certain times. The insurer guarantees a certain rate on the premiums and, in addition, invests in a portfolio of risky and risk-free assets and shares the investment profits. The optimal stochastic control method can be used in a financial market with a risk-free asset and a risky stock asset whose jumps follow an infinite-activity Lévy model. We employ the Variance Gamma process as a representative of infinite-activity jump models, and the sensitivity of the jump parameters in an uncertain financial market is studied. We also compare the results under two forces of mortality.
    Keywords: Optimal Strategy, Force of Mortality, Pure-Endowment, Infinite Activity Lévy Model
  • Mahboubeh Aalaei * Pages 53-61

    In this paper, fuzzy set theory is implemented to model the internal rate of return for calculating the price of life settlements. Deterministic, probabilistic, and stochastic approaches are used to price life settlements in the secondary market for the Iranian insurance industry. Research findings are presented and analyzed for whole life insurance policies using the interest rates announced in the supplement of Regulation No. 68 and the Iranian life table, which has recently been issued for use by insurance companies. The results of the three approaches are also compared with the surrender value, which indicates that the surrender value is lower than the fuzzy price calculated with the probabilistic and stochastic approaches and higher than the price calculated with the deterministic approach. Therefore, selling life settlements in the secondary market in Iran at a fuzzy price calculated using the probabilistic and stochastic approaches will benefit the policyholder. Moreover, the price is obtained as an interval using fuzzy set theory, and the investor can decide, based on financial knowledge, which price is suitable for the policy. Furthermore, to show the validity of the proposed fuzzy method, the findings are compared to the results of using a random internal rate of return.

    Keywords: Life Settlements, Fuzzy Random Variables, Life Expectancy, Secondary Market, Adjustment Multiplier
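    The interval-pricing idea above can be illustrated with a minimal sketch: discounting a stream of expected cashflows at the two ends of an alpha-cut of a triangular fuzzy internal rate of return. The cashflows and rates below are hypothetical and are not taken from Regulation No. 68 or the Iranian life table.

```python
import numpy as np

def fuzzy_price_interval(cashflows, r_tri, alpha):
    """Price interval of fixed expected cashflows when the internal rate of
    return is a triangular fuzzy number r_tri = (lo, mid, hi), at a given
    membership level alpha in [0, 1]."""
    lo, mid, hi = r_tri
    r_low  = lo + alpha * (mid - lo)           # alpha-cut lower rate
    r_high = hi - alpha * (hi - mid)           # alpha-cut upper rate
    t = np.arange(1, len(cashflows) + 1)
    pv = lambda r: float(np.sum(cashflows / (1 + r) ** t))
    # discounting at the higher rate gives the lower price, and vice versa
    return pv(r_high), pv(r_low)

# hypothetical expected death-benefit cashflows over 5 years
cf = np.array([0.0, 0.0, 0.0, 0.0, 1000.0])
low, high = fuzzy_price_interval(cf, (0.14, 0.16, 0.18), alpha=0.5)
```

At alpha = 1 the interval collapses to the single price implied by the central rate, while smaller alpha widens the interval the investor can choose from.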
  • Robabeh Hosseinpour Samim Mamaghani, Farzad Skandari * Pages 63-90
    In this paper, we consider a Bayesian hierarchical method using the hyper product inverse moment prior in the ultrahigh-dimensional generalized linear model (UDGLM), which is useful in Bayesian variable selection. We show that the posterior probability of the true model converges to 1 as the sample size increases. For computing the posterior probabilities, we implement the Laplace approximation. The Simplified Shotgun Stochastic Search with Screening (S5) procedure for the generalized linear model is suggested for exploring the posterior space. Simulation studies and real data analysis using the Bayesian ultrahigh-dimensional generalized linear model indicate that the proposed method performs better than previous models.
    Keywords: Ultrahigh Dimensional, Nonlocal Prior, Optimal Properties, Bayesian Variable Selection, Generalized Linear Model
  • Sajad Nezamdoust, Farzad Skandari * Pages 91-106
    This paper considers the problem of estimating the parameters of finite mixture models, and a new estimation method is proposed. Traditionally, parameter estimation in finite mixture models is performed from a likelihood point of view by exploiting the expectation maximization (EM) method and the least squares principle. Ridge regression is an alternative to ordinary least squares when multicollinearity is present among the regressor variables in multiple linear regression analysis. Accordingly, we propose a new shrinkage ridge estimation approach. Based on this principle, we propose an iterative algorithm called Ridge-Iterative Weighted Least Squares (RIWLS) to estimate the parameters. Monte Carlo simulation studies are conducted to appraise the performance of our method. The results show that the proposed estimator performs better than the IWLS method.
    Keywords: Finite Mixture Model, Least Square Principle, Iterative Weighted Least Square, Ridge Estimation
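    The ridge-penalized weighted least squares step at the core of the RIWLS idea can be sketched as follows. The data are simulated, and the equal weights are an illustrative assumption; in an actual mixture-model fit the weights would come from the component responsibilities at each iteration.

```python
import numpy as np

def ridge_wls(X, y, w, lam):
    """One ridge-penalized weighted least squares step:
    beta = (X'WX + lam*I)^{-1} X'Wy,
    the building block of an iterative weighted least squares scheme."""
    W = np.diag(w)
    p = X.shape[1]
    return np.linalg.solve(X.T @ W @ X + lam * np.eye(p), X.T @ W @ y)

rng = np.random.default_rng(0)
n, p = 200, 3
X = rng.normal(size=(n, p))
X[:, 2] = X[:, 1] + 0.01 * rng.normal(size=n)   # near-collinear regressors
beta_true = np.array([1.0, 2.0, 2.0])
y = X @ beta_true + 0.1 * rng.normal(size=n)
w = np.ones(n)                                   # equal weights for illustration
b_ols   = ridge_wls(X, y, w, 0.0)                # ordinary WLS solution
b_ridge = ridge_wls(X, y, w, 1.0)                # shrunken ridge solution
```

With near-collinear columns the unpenalized solution is unstable, while the ridge penalty shrinks the coefficient vector, which is exactly the motivation the abstract gives for replacing the least squares step.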
  • Mohsen Seighaly *, Emad Koosha, Ebrahim Abbasi Pages 107-128
    The purpose of the present research is to use machine learning models to predict the price of Bitcoin, representing the cryptocurrency market. The price prediction model can be considered the most important component of algorithmic trading. Given the nature of price behavior in financial markets, machine learning models have been reported to perform well in previous studies. In this respect, measuring and comparing the accuracy and precision of random forest (RF), long short-term memory (LSTM), and recurrent neural network (RNN) models in predicting the tops and bottoms of the Bitcoin price are the main objectives of the present study. Predicting top and bottom prices with machine learning models can be considered the innovative aspect of this research, while many studies instead predict prices as time series or as simple or logarithmic price returns. Price top and bottom data as target variables and technical analysis indicators as feature variables, in the 1-hour time frame from 1/1/2018 to 6/31/2022, served as input to the mentioned models. 70% of the data are used as training data, 20% as validation data, and the remaining 10% as test data. The results show over 80% accuracy in predicting the tops and bottoms of the Bitcoin price, with the random forest model's predictions being more accurate than those of the LSTM and RNN models.
    Keywords: Algorithmic Trading, Random Forest, Recurrent Neural Network, Long Short-Term Memory, Top and Bottom Price Prediction
  • Samaneh Mohammadi Jarchelou, Kianoush Fathi Vajargah *, Parvin Azhdari Pages 129-150
    The investment process concerns how investors decide which types of tradable securities to invest in, and in what amount and at what time. Various methods have been proposed for the investment process, but the lack of rapid computational methods for determining investment policies in securities analysis makes performance appraisal a long-term challenge. One approach divides the investment process into two major parts: securities analysis and portfolio management. Securities analysis involves estimating the benefits of each investment, while portfolio management involves analyzing the composition of investments and managing and maintaining a set of investments. Classical data envelopment analysis (DEA) models are recognized as accurate tools for rating and measuring the performance of efficient samples. Unfortunately, this perspective is often difficult to apply in practice: for efficient sample selection using DEA models to evaluate portfolio efficiency (PE), the frontier is discontinuous and non-concave, and this problem has remained unsolved since 2011. To address it, we recommend a DEA method divided into business units based on the Markowitz model. A search algorithm is used to construct the business units and prove their validity. In each business unit, the boundary is continuous and concave, so DEA models can be applied for PE evaluation.
    Keywords: Data Envelopment Analysis, performance measurement, Portfolio Optimization, Stocks, securities
  • Asma Khadimallah *, Fathi Abid Pages 151-166
    This paper has potential implications for bank management. We examine a bank capital structure with contingent convertible (CoCo) debt to improve financial stability. This type of debt converts to equity when the bank faces financial difficulties and a conversion trigger occurs. We use a leverage ratio, introduced in Basel III, to trigger conversion instead of traditional capital ratios. We formulate an optimization problem in which a bank chooses an asset allocation strategy to maximize the expected utility of its asset value. Our study presents an application of stochastic optimal control theory to a banking portfolio choice problem. By applying the dynamic programming principle to derive the HJB equation, we define and solve the optimization problem in the power utility case. The numerical results show that the evolution of the optimal asset allocation strategy is strongly affected by the realization of the stochastic variables characterizing the economy. We carried out a sensitivity analysis with respect to risk aversion, time, and volatility. We also find that the optimal asset allocation strategy is relatively sensitive to risk aversion, and that the allocation to CoCo debt and equity decreases as the investment horizon increases. Finally, the sensitivity analysis highlights the importance of dynamic considerations in optimal asset allocation based on the stochastic characteristics of investment opportunities.
    Keywords: Contingent convertible bond, Stochastic Optimal Control, asset allocation strategy, bank capital structure, optimization problem, power utility
  • Azadeh Ghasemifard, Seddigheh Banihashemi, Afshin Babaei * Pages 167-180
    The aim of this paper is to numerically price the European double barrier option by solving the governing fractional Black-Scholes equation in illiquid markets. Incorporating the price impact into the underlying asset dynamics, meaning that trading strategies affect the underlying price, we consider markets with finite liquidity. We survey both the first-order feedback and the full feedback cases. The asset evolution satisfies a stochastic differential equation with fractional noise, which is more realistic in markets with statistical dependence. Moreover, the Sinc-collocation method is used to price the option. Numerical experiments show that the results correspond closely to our expectations of illiquid markets.
    Keywords: Option Pricing, Illiquid Market, Sinc Collocation Method, Price Impact
  • Hamid Abbaskhani, Asgar Pakmaram *, Nader Rezaei, Jamal Bahri Sales Pages 181-194
    Despite the growing need for research on the going concern and bankruptcy of companies, most studies have used quantitative data to predict going concern and bankruptcy; moreover, such quantitative data can be manipulated by company managers. As a result, there appears to be a need to examine alternative methods for predicting going concern and bankruptcy based on qualitative data from the auditor's report. The purpose of this research is to determine the ability to predict the going concern of companies using quantitative and qualitative data. The study period was from 2011 to 2021, with a sample of 54 companies admitted to the Tehran Stock Exchange. The results of the first hypothesis test show that the coefficient of determination of the text-mining model's prediction in the presence of a life cycle variable is greater than that in the presence of a company size variable. The test of the second hypothesis shows that the difference in the incremental explanatory power of the first model compared to the second model, for the companies accepted on the stock exchange, is significant.
    Keywords: Financial Forecasting, Going Concern, Tone Analysis, Auditor Reporting
  • Shokouh Shahbeyk * Pages 195-204
    In this paper, we discuss some concepts of robustness for uncertain multi-objective optimization problems. An important factor in multi-objective optimization problems is uncertainty, which may arise from the estimation of parameters in the model, computational error, the structure of the problem, and so on. Indeed, some parameters are often unknown at the beginning of solving a multi-objective optimization problem. One of the most important and popular approaches for dealing with uncertainty is robust optimization. Markowitz's portfolio optimization problem is strongly sensitive to perturbations of the input parameters. We consider Markowitz's portfolio optimization problem with an ellipsoid uncertainty set, and apply set-based minmax and lower robust efficiency to this problem. The concepts of robust efficiency are used in the real stock market and compared to each other. Finally, the effects of increasing and decreasing the uncertainty set parameters on these robust efficient solutions are verified.
    Keywords: Portfolio Optimization, robustness, Ellipsoid Uncertainty Set
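    The ellipsoid uncertainty idea above can be illustrated with a minimal sketch of the worst-case portfolio return when the mean vector is only known to lie in an ellipsoid around its estimate. The two-asset numbers below are hypothetical, not stock-market data from the paper.

```python
import numpy as np

def worst_case_return(w, mu_hat, Omega, kappa):
    """Worst-case portfolio return when the true mean vector lies in the
    ellipsoid {mu_hat + Omega^{1/2} u : ||u|| <= kappa}.
    The closed form is mu_hat'w - kappa * sqrt(w' Omega w)."""
    L = np.linalg.cholesky(Omega)              # Omega = L @ L.T
    return float(mu_hat @ w - kappa * np.linalg.norm(L.T @ w))

# hypothetical two-asset example
mu_hat = np.array([0.08, 0.12])                # estimated mean returns
Omega  = np.array([[0.04, 0.00],
                   [0.00, 0.09]])              # shape of the uncertainty ellipsoid
w = np.array([0.5, 0.5])                       # portfolio weights
nominal = float(mu_hat @ w)
robust  = worst_case_return(w, mu_hat, Omega, kappa=0.5)
```

Enlarging kappa grows the ellipsoid and drives the worst-case return further below the nominal one, which is the mechanism behind the sensitivity of robust efficient solutions to the uncertainty set parameters.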