Friday, April 5, 2019
Optimizing Cash Management Model With Computer Intelligence
Alli A. and M. M. Ramya

Abstract

In today's technology-driven era, financial organizations face great challenges in honing the cash management process. Maintaining minimal cash leads to customer frustration; at the same time, holding excess cash is a loss to the organization. Hence, soft-computing-based cash management solutions are required to maintain an optimum cash balance. An Artificial Neural Network (ANN) is one such technique, which plays a vital role in the fields of cognitive science and engineering. In this paper, a novel ANN-based Cash Forecasting Model (ANNCFM) is proposed to identify the cash requirement on a daily, weekly and monthly basis. The six cash requirement parameters, Reference Year (RY), Month of the Year (MOY), Working Day of the Month (WDOM), Working Day of the Week (WDOW), Salary Day Effect (SDE) and Holiday Effect (HDE), were fed as input to ANNCFM. Trials were carried out for the selection of the ANNCFM network parameters. It was found that the number of hidden neurons, the learning rate and the momentum, when set to 10, 0.3 and 0.95 respectively, yielded the best results. Mean absolute percentage error (MAPE) and mean squared error (MSE) were used to evaluate the performance of the proposed model. An MSE of less than 0.01 demonstrates the capability of the proposed ANNCFM in estimating the cash requirement.

Keywords: ANN, ANNCFM, neuron, back-propagation, momentum, learning rate.

1.0 Introduction

Forecasting cash demand needs to be accurate for any financial organization, including banks [1-3]. If the forecast is flawed, in addition to causing financial losses to the bank, it results in customer dissatisfaction. In the banking industry, an earlier cash requirement study was made using a feed-forward neural network with back-propagation on short-term data of two months [1]. Subsequently, another comparative study of cash prediction was made using simple time series models and artificial neural networks [2]. The daily cash requirement models for a bank were optimized with particle swarm optimization and compared with the least squares method for short-term data [3]. The main objective of this paper is to design, develop and test a supervised method to forecast the cash requirement of banks from their historical data.

1.1 ANN Background

ANN is an efficient tool for understanding the complexities of real-world problems in all fields of daily life [4]. It is used as a function optimizer for linear as well as nonlinear problems in science, engineering, technology, management and finance [5-9]. Artificial neural network learning methods provide an effective approach for approximating discrete, real and vector-valued target functions [10-12] for complex problems that cannot be solved by conventional mathematical methods such as analytical and numerical techniques.
ANNs are applied in forex rate prediction, portfolio optimization, decision making, meteorological parameter forecasting [13-19], etc. The various ANN-based approaches applied by researchers in the finance field as an alternative to traditional time series models include financial and economic forecasting, credit authorization screening, simulation of market behavior, mortgage risk assessment, risk rating of investments and detection of regularities in security price movements [15-19].

2.0 Design of the Proposed ANNCFM Architecture

The design of neural networks has yielded satisfactory performance in many fields, but building a neural network forecaster for a particular problem is a nontrivial task. The modeling choices that affect the performance of the neural network must be made carefully.

2.1 Selection of ANN Parameters

In general, a multi-layer ANN can have many layers, where a layer represents a set of distributed parallel processing nodes. A three-layered ANN with one input, one output and one intermediate hidden layer is sufficient to approximate any complex non-linear function. In forecasting studies, many experimental results also confirm that an ANN with one hidden layer is enough to predict the required data [6-8]. The model architecture of ANNCFM is shown in Fig. 1.

Fig. 1 Architecture of the ANNCFM model

The critical decisions in determining the architecture are: i) the number of layers, ii) the number of neurons in each layer, iii) the number of arcs interconnecting the nodes, iv) the activation functions of the hidden and output nodes, v) the learning algorithm, vi) the data transformation or normalization, vii) the training and test sets, and viii) the performance measures.

3.0 Design of the Proposed ANN Models

The proposed ANNCFM model consists of one input, one hidden and one output layer, as discussed in Section 2.1. In this study, the data was collected from a semi-urban bank branch located in India. The typical daily cash requirement of the bank for one year is shown in Fig. 2.

Fig. 2 Typical cash requirement for a year

The collected data covered a period of three years (2010 to 2012) and was used for training and testing with the following input parameters:

RY - Reference year, ranging from 1 to 3 for the three years,
MOY - Month of the year, ranging from 1 to 12,
WDOM - Working day of the month, ranging from 1 to 27,
WDOW - Working day of the week, ranging from 1 to 6,
SDE - Salary day effect, ranging from 1 to 3, and
HDE - Holiday and weekend effect, either 0 or 1.

The aforementioned parameters were used as the six input neurons.
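As an illustration of how such an input sample could be assembled, the following minimal sketch builds the 6 x N input matrix from the six parameters and min-max normalizes each feature. The variable names, the example values and the [0, 1] normalization range are assumptions made for illustration only; they are not taken from the paper.

% Hypothetical sketch: assemble the six ANNCFM input features for N business days.
% Each column of P is one sample [RY; MOY; WDOM; WDOW; SDE; HDE] (example values).
RY   = [1 1 1 2];      % reference year, 1..3
MOY  = [1 1 2 7];      % month of the year, 1..12
WDOM = [1 2 1 15];     % working day of the month, 1..27
WDOW = [1 2 3 4];      % working day of the week, 1..6
SDE  = [3 2 3 1];      % salary day effect, 1..3
HDE  = [0 0 1 0];      % holiday/weekend effect, 0 or 1

P = [RY; MOY; WDOM; WDOW; SDE; HDE];   % 6 x N input matrix

% Min-max normalize each feature (row) to [0, 1] before training,
% guarding against constant rows to avoid division by zero.
Pmin  = min(P, [], 2);
Pmax  = max(P, [], 2);
Prng  = max(Pmax - Pmin, eps);
Pnorm = (P - repmat(Pmin, 1, size(P, 2))) ./ repmat(Prng, 1, size(P, 2));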
In the hidden layer, the number of neurons was varied from 8 to 50. The output layer had one neuron, corresponding to the cash requirement for a day.

3.1 Pseudocode - ANNCFM

Main()
    [W, V, Voj, Wok] = ANNCFMtrain(x, nip, nh, op, lr, mc, t)
    yk = ANNCFMtest(ts, W, V, Voj, Wok, t)
    [Mserr, Mape] = ANNCFMevaluate(tk, yk, ts)

Function ANNCFMtrain(x, nip, nh, op, lr, mc, t) returns the network with updated weight matrices
    Repeat
        For each training sample x(i, nip)
            // Feed-forward computation
            // Compute the hidden-layer outputs from the input layer
            // Compute the output-layer outputs from the hidden layer
            // Compute the error signals for the output and hidden layers
            // Update the weights between the output (k) and hidden (j) layers
            If itr = 1 then apply the plain gradient-descent step scaled by the learning rate lr
            Else add the momentum term (mc times the previous weight change) to the gradient-descent step
            End if
            // Update the bias between the output and hidden layers
            If itr = 1 then apply the gradient step; Else include the momentum term; End if
            // Update the weights between the input (i) and hidden (j) layers
            If itr = 1 then apply the gradient step; Else include the momentum term; End if
            // Update the bias between the hidden and input layers
            If itr = 1 then apply the gradient step; Else include the momentum term; End if
    Until the MSE goal or the maximum number of iterations is reached

Function ANNCFMtest(ts, W, V, Voj, Wok, t) returns output yk
    For each test sample in ts
        // Feed-forward computation
        // Compute the hidden-layer outputs from the input layer
        // Compute the output-layer outputs from the hidden layer

Function ANNCFMevaluate(tk, yk, ts)
    // Compute the MSE and MAPE between the targets tk and the predictions yk

4.0 Evaluation Metrics

To evaluate the performance of the model that best fulfils our objective of optimizing cash management, a few metrics were used. The accuracy of the proposed ANNCFM is evaluated using MAPE and MSE, which are defined as follows:

MAPE = (100/n) * sum over t of |e_t / X_t|
MSE = (1/n) * sum over t of e_t^2

where X_t is the actual value at period t, F_t is the forecast at period t, e_t = X_t - F_t is the forecast error at period t, and n is the number of observations.

5.0 Results and Discussion

The data for a period of three years (2010-2012) was collected from the City Union Bank (CUB)-ukt branch, and the network was simulated using MATLAB. For the proposed study, the total number of data points for the three years is 879, of which the first two and a half years (737 data points, about 80%) were used for training and the remaining six months (142 data points, about 20%) were used for testing. Studies have found that input data normalization with suitable criteria, prior to the training process, is crucial for obtaining good results as well as for speeding up the calculations significantly (J. Sola and J. Sevilla). Hence, the input data was normalized before training.

In ANNCFM, 15 runs were made by varying the number of hidden neurons from 10 to 50, using gradient descent with momentum back-propagation (traingdm) and the default training parameters (learning rate = 0.01, momentum = 0.95, goal = 0, and number of epochs = 6000), as illustrated in column 2 of Table 1. The convergence of ANNCFM is influenced by the number of hidden neurons, which was varied from 10 to 50. The error was minimal when the number of hidden neurons was set to 10, 20, 40, 45 or 50, achieving an MSE of 0.0079, as observed from column 3 of Table 1. As the number of hidden neurons increases, there is a significant increase in computational time. Hence, the number of hidden neurons in the proposed study was fixed at 10. The number of hidden neurons plotted against the corresponding MSE is shown in Fig. 3.

Fig. 3 Optimal number of hidden neurons

With a high learning rate, training tends to settle in a local optimum, whereas a slower learning process is more likely to approach the global optimum. Different trials were therefore made to identify the optimal learning rate and to avoid instability and fluctuations in the results; a sketch of such a trial loop is given below.
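The sketch below illustrates one way such learning-rate trials could be run, with the hidden layer fixed at 10 neurons and the momentum at 0.95. It assumes the classic MATLAB Neural Network Toolbox interface (newff, train, sim) rather than quoting the authors' actual code; P and T are placeholder names for the normalized 6 x N training inputs and the 1 x N target series (scaled to match the tansig output range), not variables defined in the paper.

% Hypothetical sketch of the learning-rate trials (assumes the classic
% MATLAB Neural Network Toolbox). P: 6 x N normalized inputs, T: 1 x N targets.
lrCandidates = [0.1 0.2 0.3 0.4 0.5];
mseResults   = zeros(size(lrCandidates));

for i = 1:numel(lrCandidates)
    % 6-10-1 network: 10 hidden neurons, tansig transfer in both layers,
    % trained by gradient descent with momentum (traingdm).
    net = newff(minmax(P), [10 1], {'tansig', 'tansig'}, 'traingdm');
    net.trainParam.lr     = lrCandidates(i);   % learning rate under trial
    net.trainParam.mc     = 0.95;              % momentum
    net.trainParam.epochs = 6000;              % maximum number of iterations
    net.trainParam.goal   = 0;                 % MSE goal

    net = train(net, P, T);
    Y   = sim(net, P);
    mseResults(i) = mean((T - Y).^2);          % record the MSE for this trial
end

[bestMse, bestIdx] = min(mseResults);          % pick the best-performing rate
bestLr = lrCandidates(bestIdx);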
The learning rate was varied from 0.1 to 0.5, of which 0.3 proved optimal for the given data set, as shown in Fig. 4.

Fig. 4 Optimal learning rate

Momentum plays a vital role in reaching the convergence point. When the momentum is set too low, training may get stuck in a local minimum; when it is too high, the network becomes unstable. Hence, there is a need to identify the optimal momentum value for ANNCFM. Various momentum values between 0.8 and 1.0 were tested, and the training results show that the optimal momentum value was 0.95, as shown in Fig. 5.

Fig. 5 Optimal momentum rate

To train and test the ANNCFM model for the daily, weekly and monthly cash requirement, the following parameter values were selected based on their performance in the runs described above: i) number of input neurons = 6, ii) maximum number of iterations = 6000, iii) learning rate = 0.3, iv) momentum = 0.95, and v) transfer function = tansig/tansig (hidden and output layers). The optimal selection of the above parameters helped improve performance by minimizing the error rate. This is evident from Table 1, which shows the MSE achieved before and after parameter selection.

Table 1 ANNCFM performance for different numbers of hidden neurons

ANNCFM was used to estimate the daily, weekly and monthly cash requirement. The estimated values are compared with the actual values for the testing period in Figs. 6a, 6b and 6c for the daily, weekly and monthly predictions, respectively. The results show that ANNCFM performs reasonably well for all three models. The weights calculated by ANNCFM were found to be sufficient for cash prediction, in which RY, MOY, WDOM and WDOW are essential parameters, and SDE and HDE are additional parameters. The connection weight approach was used to quantify the importance of the input variables [20]. The preference order of the input parameters, derived from the weights obtained, is evident from column 4 of Table 2.

Table 2 ANNCFM weights and preferences

As observed from the above table, the input parameters SDE and HDE play a vital role in the daily and weekly models, effectively taking care of the peak cash requirement at the beginning of every month and during holiday periods. The role of SDE in the weekly cash prediction can be easily understood from weeks such as 1, 5 and 14, where the cash requirement is at its maximum because the beginning of the month lies within the week. However, for the 9th and 10th weeks, as well as the 18th and 19th weeks, the cash requirement shows that the new month starts between the weeks. The monthly model was plotted for six months, as shown in Fig. 6c; the experimental results show that the estimated values were most influenced by WDOM. The required and predicted cash was lowest for the fourth month, in which WDOM was lowest. The MAPE and MSE for ANNCFM are shown in Table 3.

Fig. 6a ANNCFM daily model
Fig. 6b ANNCFM weekly model
Fig. 6c ANNCFM monthly model

Table 3 MAPE and MSE errors for ANNCFM

The comparison between the actual and forecast data shown in the figures indicates that the six input variables selected in our model are sufficient to identify the cash need, which changes over time.
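To make the training and evaluation steps above concrete, the sketch below trains the selected configuration (10 hidden neurons, learning rate 0.3, momentum 0.95, tansig/tansig, 6000 epochs) and computes the MSE and MAPE of Section 4.0 on the test split. It again assumes the classic MATLAB Neural Network Toolbox interface; Ptrain, Ttrain, Ptest, Ttest, Tmin and Tmax are placeholder names for the normalized data splits and the target scaling bounds, not variables defined in the paper.

% Hypothetical sketch: train the selected ANNCFM configuration and evaluate it.
% Ptrain, Ptest: 6 x N normalized inputs; Ttrain, Ttest: 1 x N targets scaled
% to [-1, 1] to match the tansig output layer (placeholder variable names).
net = newff(minmax(Ptrain), [10 1], {'tansig', 'tansig'}, 'traingdm');
net.trainParam.lr     = 0.3;     % selected learning rate
net.trainParam.mc     = 0.95;    % selected momentum
net.trainParam.epochs = 6000;    % maximum number of iterations
net.trainParam.goal   = 0;       % MSE goal

net   = train(net, Ptrain, Ttrain);
Ftest = sim(net, Ptest);                     % forecasts on the normalized scale

% MSE on the normalized target scale.
mseTest = mean((Ttest - Ftest).^2);

% MAPE on the original cash scale; Tmin and Tmax are the bounds that were used
% when the targets were min-max scaled to [-1, 1].
actual   = (Ttest + 1) / 2 * (Tmax - Tmin) + Tmin;
forecast = (Ftest + 1) / 2 * (Tmax - Tmin) + Tmin;
mapeTest = 100 * mean(abs((actual - forecast) ./ actual));   % (100/n)*sum(|e_t/X_t|)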
6.0 Conclusion

The experimental results of this study show that ANNCFM is a useful tool for predicting the cash requirement in the emerging banking sector. ANNCFM, a feed-forward neural network trained with the back-propagation algorithm, optimizes the cash needs on a daily, weekly and monthly basis. In the implementation process, the data set for the years 2010 to 2012 was used for training and testing to measure performance. The input parameters were initialized, and different runs were made for the proposed model to find the optimal number of hidden neurons (10), momentum (0.95) and learning rate (0.3) for training and testing the network with the sigmoid (tansig) transfer function. The estimated results had minimal error, with an accuracy of 91.23%.

References

[1] Fraydoon Rahnama Roodposhti, Farshad Heybati and Seyed Reza Musavi, "A comparison of classic time series models and artificial neural networks in anticipation of cash requirements of banks: A case study in Iran", Academic and Business Research Institute International Conference, Orlando, USA, 2010.
[2] Prem Chand Kumar and Ekta Walia, "Cash Forecasting: An Application of Artificial Neural Networks in Finance", International Journal of Computer Science & Applications, Vol. 3, No. 1, pages 61-77, 2006.
[3] Alli A., Ramya M. M. and Srinivasa Kumar V., "Cash Management Using Particle Swarm Optimization", International Conference on Data Mining and Soft Computing, SASTRA University, Thanjavur, India, 2013.
[4] Haykin, Simon, Neural Networks: A Comprehensive Foundation, Macmillan College Publishing Company, New York, 1994.
[5] Nakamura, Emi, "Inflation forecasting using a neural network", Economics Letters, Volume 86(3), pages 373-378, 2006.
[6] Refenes, A. P. and H. White, "Neural Networks and Financial Economics", International Journal of Forecasting, Volume 6(17), 1998.
[7] F. Aminian, E. Suarez, M. Aminian and D. Walz, "Forecasting economic data with neural networks", Computational Economics 28, pages 71-88, 2006.
[8] A. Hanna, D. Ural and G. Saygili, "Evaluation of liquefaction potential of soil deposits using artificial neural networks", Engineering Computations 24, pages 5-16, 2007.
[9] W. Gorr, D. Nagin and J. Szczypula, "Comparative study of artificial neural network and statistical models for predicting student grade point averages", International Journal of Forecasting 10, pages 17-34, 1994.
[10] Zhang, G., Patuwo, B. E. and Hu, M. Y., "Forecasting with artificial neural networks: The state of the art", International Journal of Forecasting 14, pages 35-62, 1998.
[11] Z. W. Geem and W. E. Roper, "Energy demand estimation of South Korea using artificial neural network", Energy Policy, Vol. 37, No. 10, pages 4049-4054, 2009.
[12] R. Yokoyama, T. Wakui and R. Satake, "Prediction of energy demands using neural network with model identification by global optimization", Energy Conversion and Management, Vol. 50, No. 2, pages 319-327, 2009.
[13] Bishop, C., Neural Networks for Pattern Recognition, Oxford University Press, New York, 1999.
[14] H. Taubenböck, T. Esch, M. Wurm, A. Roth and S. Dech, "Object-based feature extraction using high spatial resolution satellite data of urban areas", Journal of Spatial Science, Volume 55, Issue 1, pages 117-132, 2010.
[15] P. Tenti, "Forecasting Foreign Exchange Rates Using Recurrent Neural Networks", Applied Artificial Intelligence, Vol. 10, pages 567-581, 1996.
[16] W. Leigh, R. Hightower and N. Modani, "Forecasting the New York Stock Exchange composite index with past price and interest rate on condition of volume spike", Expert Systems with Applications, pages 1-8, 2005.
[17] Manfred Steiner and Hans-Georg Wittkemper, "Portfolio optimization with a neural network implementation of the coherent market hypothesis", Volume 100, Issue 1, pages 27-40, July 1997.
[18] M. Carolin Mabel and E. Fernandez, "Analysis of wind power generation and prediction using ANN: A case study", Volume 33, Issue 5, pages 986-992, May 2008.
[19] Sharda, R. and Delen, D., "Predicting Box-office Success of Motion Pictures with Neural Networks", Expert Systems with Applications 30, pages 243-254, 2006.
[20] Julian D. Olden, Michael K. Joy and Russell G. Death, "An accurate comparison of methods for quantifying variable importance in artificial neural networks using simulated data", 2004.