International Conference on Applications of Statistics and Probability in Civil Engineering (ICASP) (12th : 2015)

Reproducing kernel-based support vector machine for structural reliability analysis
Lu, Da-Gang; Li, Gong-Bo
Jul 31, 2015




12th International Conference on Applications of Statistics and Probability in Civil Engineering, ICASP12
Vancouver, Canada, July 12-15, 2015

Reproducing Kernel-Based Support Vector Machine for Structural Reliability Analysis

Da-Gang Lu
Professor, School of Civil Engineering, Harbin Institute of Technology, Harbin, China

Gong-Bo Li
Graduate Student, School of Civil Engineering, Harbin Institute of Technology, Harbin, China

ABSTRACT: How to choose a kernel function for a support vector machine (SVM) is an important ingredient of high-dimensional and nonlinear classification and regression problems, since a well-chosen kernel helps to overcome the curse of dimensionality. In this paper, a reproducing kernel of a Sobolev Hilbert space is introduced as an admissible kernel for SVMs. A support vector regression (SVR) machine based on this reproducing kernel (RKSVR) is then constructed, and a hybrid approach to structural reliability analysis is proposed. To minimize the number of simulations and to fill the space of basic random variables uniformly, uniform design (UD) is applied to choose the experiment points. A genetic algorithm (GA) incorporating the gradient information of FORM is employed to search for the global design point and to avoid falling into local optimal solutions. A numerical example is provided to demonstrate the accuracy, efficiency and applicability of the new reproducing kernel-based support vector regression meta-model for structural reliability analysis, compared with a support vector regression machine based on the Gaussian kernel.

In reliability analysis of large-scale and complex structures, the problem of implicit limit state functions evaluated via finite element analysis is typically handled with Monte Carlo simulation (MCS) or the response surface method (RSM) (Melchers, 1999). However, an excessive number of random samples is needed in MCS for complex structures whose failure probabilities are extremely small.
As for the commonly used gradient-based approximate analytical methods, such as the first- and second-order reliability methods (FORM and SORM), their tendency to fall easily into local optimal solutions results in poor performance. To improve the efficiency of structural reliability analysis and to make the gradient-based methods more suitable for implicit limit state functions, meta-model techniques have emerged during the last decades (Hurtado, 2004a, b; Hurtado, 2007; Richard et al., 2012; Sudret, 2012, among others). Also known as surrogate models, meta-models aim at expressing the relationship between random variables and structural responses by mathematical functions with explicit forms. In fact, the RSM can be treated as a meta-model technique that approximates the limit state function in polynomial form (Sudret, 2012). To avoid the iterations required in the RSM, artificial neural network (ANN) (Guan & Melchers, 2001) and support vector machine (SVM) (Vapnik, 1995, 1998; Hurtado, 2004a; Abe, 2010; Deng et al., 2013) techniques have been widely used in this area. In addition, ANN and SVR meta-models can provide a better estimate of the real structural behavior over a large domain of input variables (Hurtado, 2007; Sudret, 2012). From the experience of the authors (Li & Lu, 2013) in a comparative study of RSM, ANN and SVR, it has been found that the SVR model is more suitable for handling high-dimensional problems. In addition, it can overcome the curse of dimensionality through the kernel-based learning technique, which makes it more suitable for applications with many random variables (Li & Lu, 2013).
How to choose a kernel function for a support vector machine (SVM) is an important ingredient of both classification and regression problems for overcoming the curse of dimensionality (Abe, 2010; Deng et al., 2013). In this paper, a reproducing kernel of a Sobolev Hilbert space is introduced as an admissible kernel for support vector regression machines. The reproducing kernel-based support vector regression machine is then constructed, and a hybrid approach to structural reliability analysis is proposed, in which uniform design (UD) is applied to choose the experiment points, and a genetic algorithm (GA) incorporating the gradient information of FORM is employed to search for the global design point. A numerical example is provided to demonstrate the accuracy, efficiency and applicability of the new SVR meta-model and the new hybrid reliability methodology.

1. REPRODUCING KERNEL BASED SUPPORT VECTOR REGRESSION MACHINE

1.1. Support Vector Regression Machine

Support vector machines (SVMs), introduced by Vapnik (1995), have become powerful tools for solving machine learning problems with a finite number of training points and for overcoming traditional difficulties such as the curse of dimensionality, over-fitting, and so forth (Abe, 2010; Deng et al., 2013). In contrast to other learning techniques (e.g. neural networks) that rely on the minimization of the empirical risk, SVMs are based on the structural risk minimization principle, whereby the learning problem is addressed by minimizing a bound on the true risk (Deng et al., 2013). Let the input data set $(\mathbf{x}_i)_{i=1,\dots,m}$ be realizations of the basic random variables $\mathbf{X} = (X_1, X_2, \dots, X_n)$, and let the output data set $(y_i)_{i=1,\dots,m}$ be the corresponding responses of a mechanical or physical system $Y = \mathcal{M}(\mathbf{X})$. They are gathered into a training set $T = \{(\mathbf{x}_1, y_1), \dots, (\mathbf{x}_m, y_m)\} \subset (\mathbb{R}^n \times \mathbb{R})^m$.
For a linearly separable classification problem with labels $y_i \in \{1, -1\}$, the analytical expression of the SVM is

$$ g(\mathbf{x}) = \langle \mathbf{w}, \mathbf{x} \rangle + b \qquad (1) $$

where $\langle \cdot, \cdot \rangle$ stands for the inner product in $\mathbb{R}^n$, $\mathbf{w} \in \mathbb{R}^n$ is a weight vector, and $b$ is a threshold value called the bias.

Compared with classical support vector classification (SVC) machines used for pattern recognition problems, in the case of support vector regression (SVR) machines the margin is replaced by a loss function (Deng et al., 2013). One of the most widely used is the $\varepsilon$-insensitive loss function

$$ |y - g(\mathbf{x})|_\varepsilon = \max\{0,\; |y - g(\mathbf{x})| - \varepsilon\} \qquad (2) $$

where $\varepsilon > 0$ is an insensitivity parameter that prescribes the precision of the approximation. The primal optimization model for the SVR problem can be expressed as follows:

$$ \begin{aligned} \min_{\mathbf{w},\, b,\, \boldsymbol{\xi},\, \boldsymbol{\xi}^*} \quad & \frac{1}{2}\|\mathbf{w}\|^2 + C \sum_{i=1}^{m} (\xi_i + \xi_i^*) \\ \text{s.t.} \quad & y_i - \langle \mathbf{w}, \mathbf{x}_i \rangle - b \le \varepsilon + \xi_i, \quad i \in \{1, \dots, m\} \\ & \langle \mathbf{w}, \mathbf{x}_i \rangle + b - y_i \le \varepsilon + \xi_i^*, \quad i \in \{1, \dots, m\} \\ & \xi_i,\, \xi_i^* \ge 0 \end{aligned} \qquad (3) $$

where $(\boldsymbol{\xi}, \boldsymbol{\xi}^*) \in \mathbb{R}_+^m \times \mathbb{R}_+^m$ are slack variables and $C$ is a penalty parameter controlling the trade-off between model complexity and the portion of errors tolerated beyond the accuracy $\varepsilon$. This optimization problem can be solved by introducing the corresponding Lagrange function:

$$ \begin{aligned} L(\mathbf{w}, b, \boldsymbol{\xi}, \boldsymbol{\xi}^*, \boldsymbol{\alpha}, \boldsymbol{\alpha}^*, \boldsymbol{\eta}, \boldsymbol{\eta}^*) ={} & \frac{1}{2}\|\mathbf{w}\|^2 + C \sum_{i=1}^{m} (\xi_i + \xi_i^*) \\ & - \sum_{i=1}^{m} \alpha_i \left( \varepsilon + \xi_i - y_i + \langle \mathbf{w}, \mathbf{x}_i \rangle + b \right) \\ & - \sum_{i=1}^{m} \alpha_i^* \left( \varepsilon + \xi_i^* + y_i - \langle \mathbf{w}, \mathbf{x}_i \rangle - b \right) \\ & - \sum_{i=1}^{m} (\eta_i \xi_i + \eta_i^* \xi_i^*) \end{aligned} \qquad (4) $$

where the Lagrange multipliers $(\alpha_i, \alpha_i^*, \eta_i, \eta_i^*)$ must be non-negative. The Lagrange function must be maximized with respect to the parameters $(\alpha_i, \alpha_i^*, \eta_i, \eta_i^*)$ and minimized with respect to the parameters $(\mathbf{w}, b, \boldsymbol{\xi}, \boldsymbol{\xi}^*)$.
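The ε-insensitive loss of Equation (2) is simple to state in code. The following minimal sketch (plain Python; the function name is chosen here for illustration, it is not from the paper) evaluates it for a scalar prediction:

```python
def eps_insensitive_loss(y, g, eps):
    """Epsilon-insensitive loss of Eq. (2): deviations smaller than
    eps are ignored; larger deviations are penalized linearly."""
    return max(0.0, abs(y - g) - eps)

# A residual of 0.3 lies inside an eps = 0.5 tube, so it costs nothing;
# a residual of 1.0 is charged only for the part exceeding the tube.
inside = eps_insensitive_loss(1.0, 1.3, 0.5)   # 0.0
outside = eps_insensitive_loss(1.0, 2.0, 0.5)  # 0.5
```

The flat region of width 2ε is what produces the sparse set of support vectors: points inside the tube contribute nothing to the loss and drop out of the solution.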
Setting the partial derivatives of Equation (4) to zero yields

$$ \begin{aligned} & \mathbf{w} = \sum_{i=1}^{m} (\alpha_i - \alpha_i^*)\, \mathbf{x}_i \\ & \sum_{i=1}^{m} (\alpha_i - \alpha_i^*) = 0 \\ & C - \alpha_i - \eta_i = 0, \qquad C - \alpha_i^* - \eta_i^* = 0 \end{aligned} \qquad (5) $$

where $\alpha_i$ and $\alpha_i^*$ can be obtained from the following dual optimization problem of Equation (3):

$$ \begin{aligned} \max_{\boldsymbol{\alpha},\, \boldsymbol{\alpha}^*} \quad & -\frac{1}{2} \sum_{i,j=1}^{m} (\alpha_i - \alpha_i^*)(\alpha_j - \alpha_j^*) \langle \mathbf{x}_i, \mathbf{x}_j \rangle - \varepsilon \sum_{i=1}^{m} (\alpha_i + \alpha_i^*) + \sum_{i=1}^{m} y_i (\alpha_i - \alpha_i^*) \\ \text{s.t.} \quad & \sum_{i=1}^{m} (\alpha_i - \alpha_i^*) = 0, \qquad \alpha_i,\, \alpha_i^* \in [0, C] \end{aligned} \qquad (6) $$

This dual optimization problem is a quadratic programming (QP) problem and can be solved by a standard QP program. Notably, the quadratic form in Equation (6) guarantees the existence of a unique global optimum and avoids the local optima that plague other learning methods. The solution of Equation (4) depends only on the points for which $\alpha_i + \alpha_i^* \ne 0$. Such points are called support vectors and lie outside or on the boundary of the $\varepsilon$-insensitive tube. The points that lie exactly on the boundaries of the tube are called unbounded support vectors and can be used to compute the bias $b$:

$$ b = y_j - \sum_{i=1}^{m} (\alpha_i - \alpha_i^*) \langle \mathbf{x}_i, \mathbf{x}_j \rangle - \varepsilon\, \mathrm{sign}(\alpha_j - \alpha_j^*), \qquad \forall j:\; \alpha_j \in (0, C) \;\text{or}\; \alpha_j^* \in (0, C) \qquad (7) $$

Substituting the first line of Equation (5) and Equation (7) into Equation (1) gives the SVR formulation

$$ g(\mathbf{x}) = \sum_{i=1}^{m} (\alpha_i - \alpha_i^*) \langle \mathbf{x}_i, \mathbf{x} \rangle + b \qquad (8) $$

The extension of Equation (8) to nonlinear functions is straightforward: a kernel function replaces the inner product,

$$ g(\mathbf{x}) = \sum_{i=1}^{m} (\alpha_i - \alpha_i^*)\, K(\mathbf{x}_i, \mathbf{x}) + b \qquad (9) $$

where $K(\mathbf{x}_i, \mathbf{x})$ is a kernel function. Many kernel functions are available, such as the linear kernel, polynomial kernels, neural network kernels, Mahalanobis kernels, graph kernels, etc. Among them, the most popular is the Gaussian radial basis kernel with a parameter $\sigma$:

$$ K(\mathbf{x}, \mathbf{x}') = \exp\left( -\|\mathbf{x} - \mathbf{x}'\|^2 / \sigma^2 \right) \qquad (10) $$

1.2. Reproducing Kernel Based Support Vector Regression Machine

One of the advantages of support vector machines is that the generalization performance can be improved by properly selecting the kernel.
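As a concrete illustration of Equations (9) and (10): once the dual coefficients are known, evaluating the kernel expansion is straightforward. In the sketch below (plain Python) the coefficient values are made up purely for illustration, not the result of an actual QP solve:

```python
import math

def gaussian_kernel(x, x2, sigma):
    """Gaussian radial basis kernel of Eq. (10)."""
    sq = sum((a - b) ** 2 for a, b in zip(x, x2))
    return math.exp(-sq / sigma ** 2)

def svr_predict(x, support_points, coeffs, b, kernel, **kw):
    """Kernel expansion of Eq. (9):
    g(x) = sum_i (alpha_i - alpha_i*) K(x_i, x) + b,
    where `coeffs` holds the differences alpha_i - alpha_i*."""
    return sum(c * kernel(xi, x, **kw)
               for xi, c in zip(support_points, coeffs)) + b

# Illustrative (not trained) model with two support vectors:
pts = [(0.0, 0.0), (1.0, 1.0)]
coeffs = [0.5, -0.25]
g0 = svr_predict((0.0, 0.0), pts, coeffs, b=0.1,
                 kernel=gaussian_kernel, sigma=1.0)
```

Because the prediction is a sum over support vectors only, the cost of one surrogate evaluation is O(m·n), far cheaper than a finite element run, which is what makes the meta-model attractive inside an optimization loop.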
The selection of a proper kernel for a specific application is therefore very important. A function can serve as an admissible kernel for SVMs only if it satisfies Mercer's condition. For a translation-invariant function $K(\mathbf{x}, \mathbf{x}') = K(\mathbf{x} - \mathbf{x}')$, it can be an SVM kernel if and only if its Fourier transform is non-negative (Smola et al., 1998), i.e.,

$$ \tilde{K}(\boldsymbol{\omega}) = (2\pi)^{-n/2} \int_{\mathbb{R}^n} \exp\left( -j \langle \boldsymbol{\omega}, \mathbf{x} \rangle \right) K(\mathbf{x})\, d\mathbf{x} \ge 0 \qquad (11) $$

In this paper, to improve the efficiency and accuracy of SVR meta-models for reliability problems with implicit and highly nonlinear limit state functions of large-scale and complex structures, a reproducing kernel function of a Sobolev Hilbert space is introduced into the SVR machine described above. The definition of a reproducing kernel (Deng et al., 2013) is as follows. Assume $H$ is a Hilbert space whose elements are real-valued or complex-valued functions, with inner product

$$ \langle f, g \rangle, \qquad \forall f, g \in H \qquad (12) $$

and norm $\|f\| = \langle f, f \rangle^{1/2}$. If there exists a function $K(\mathbf{x}, \mathbf{y})$ which, as a function of $\mathbf{x}$, is an element of $H$, and which satisfies $f(\mathbf{y}) = \langle f(\mathbf{x}), K(\mathbf{x}, \mathbf{y}) \rangle$ for arbitrary $\mathbf{y}$ and every $f \in H$, then $K(\mathbf{x}, \mathbf{y})$ is called a reproducing kernel of the space $H$, and $H$ is called a reproducing kernel space.
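The admissibility condition of Equation (11) can be checked numerically for a candidate one-dimensional translation-invariant kernel. The sketch below (plain Python, simple trapezoidal quadrature, names chosen for illustration) approximates the unnormalized Fourier transform of the Gaussian kernel exp(−x²) at several frequencies and confirms it stays positive; for this kernel the transform is known in closed form, √π·exp(−ω²/4):

```python
import math

def fourier_transform_1d(k, omega, lo=-10.0, hi=10.0, n=4000):
    """Approximate the (unnormalized) Fourier transform of an even,
    real, rapidly decaying kernel k at frequency omega by trapezoidal
    quadrature; for even k it reduces to the cosine transform."""
    h = (hi - lo) / n
    xs = [lo + i * h for i in range(n + 1)]
    ys = [math.cos(omega * x) * k(x) for x in xs]
    return h * (0.5 * ys[0] + sum(ys[1:-1]) + 0.5 * ys[-1])

gauss = lambda x: math.exp(-x * x)

# Mercer admissibility check in the spirit of Eq. (11):
# the transform must be non-negative at every frequency sampled.
vals = [fourier_transform_1d(gauss, w) for w in (0.0, 0.5, 1.0, 2.0, 4.0)]
all_positive = all(v > 0.0 for v in vals)
```

The same routine can be pointed at any other candidate translation-invariant kernel; a single negative value of the transform is enough to rule the kernel out.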
Note that in the Sobolev Hilbert space on $(-\infty, \infty)$, the following translation-invariant function (Xu, Luo, Xie, et al., 2011)

$$ G(x, x') = \frac{1}{4}\, e^{-|x - x'|} \left( 1 + |x - x'| \right) \qquad (13) $$

can be taken as a reproducing kernel, since it can easily be verified that its Fourier transform satisfies Equation (11):

$$ \tilde{G}(\omega) = (2\pi)^{-1/2} \int_{-\infty}^{\infty} \exp(-j \omega x)\, G(x)\, dx = \frac{(2\pi)^{-1/2}}{(1 + \omega^2)^2} > 0 \qquad (14) $$

It can further be shown that the product function

$$ K(\mathbf{x}, \mathbf{x}') = \prod_{i=1}^{n} G(x_i, x_i') \qquad (15) $$

is an admissible kernel, simply by noticing that

$$ \tilde{K}(\boldsymbol{\omega}) = \prod_{i=1}^{n} \tilde{G}(\omega_i) \ge 0 \qquad (16) $$

Based on the above considerations, the following reproducing kernel function of the Sobolev Hilbert space

$$ K(\mathbf{x}, \mathbf{x}') = \prod_{j=1}^{n} \frac{1}{4}\, e^{-|x_j - x_j'|} \left( 1 + |x_j - x_j'| \right) \qquad (17) $$

is introduced into Equation (9); we call the resulting model a Reproducing Kernel-based Support Vector Regression (RKSVR) machine. Thanks to the advantage of kernels that the high-dimensional feature space never needs to be treated explicitly, the so-called "kernel trick" (Abe, 2010; Deng et al., 2013), the new SVR can use the same training algorithm described above to solve function approximation problems.

2. HYBRID APPROACH TO RELIABILITY ANALYSIS COMBINING RKSVR, UD, GA AND FORM

In this paper, a hybrid approach to reliability analysis is presented, which combines the developed RKSVR meta-model with the uniform design (UD) technique, a genetic algorithm (GA) and FORM. Under the premise of distributing the sample points as uniformly as possible, the UD is introduced to select experimental points in the space of basic random variables, aiming to reduce the number of sample points and hence the number of calls to the finite element analysis program. The RKSVR meta-model is then established from the selected input sample points and the corresponding output structural response values.
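Returning to the kernel of Equation (17), it is equally easy to implement. The following minimal sketch (plain Python) evaluates the product form and illustrates that it equals (1/4)^n on the diagonal and, unlike the Gaussian kernel, decays only exponentially in the coordinate-wise distance:

```python
import math

def sobolev_kernel(x, x2):
    """Reproducing kernel of Eq. (17): product over coordinates of
    (1/4) * (1 + |xj - xj'|) * exp(-|xj - xj'|)."""
    k = 1.0
    for a, b in zip(x, x2):
        d = abs(a - b)
        k *= 0.25 * (1.0 + d) * math.exp(-d)
    return k

k_diag = sobolev_kernel((1.0, 2.0), (1.0, 2.0))  # (1/4)^2 on the diagonal
k_off = sobolev_kernel((0.0, 0.0), (1.0, 1.0))
```

Swapping this function for the Gaussian kernel in the prediction formula of Equation (9) is all that is needed to turn an ordinary SVR evaluation into an RKSVR evaluation; the training QP is unchanged.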
This finite element solver surrogate can fit the implicit limit state function of complex structures accurately with a relatively small number of sample points. A multi-point search algorithm using the GA combined with FORM is then employed on the RKSVR meta-model to find the global design point and to avoid falling into local optimal solutions.

2.1. Uniform design (UD) for selection of sample points

As noted by Fang (2001), the UD method requires fewer sample points while achieving better uniformity of the point set. In this paper, UD tables are utilized to realize the UD method. These tables can be constructed by number-theoretic methods, e.g. the good-lattice point (GLP) approach, and by other space-filling methods. A UD table is usually denoted $U_n(q^s)$, where U stands for uniform design, $n$ is the number of experiments, $s$ is the number of influencing factors, and $q$ is the number of levels of each factor. Once the table is determined, the sample points can easily be generated by its rules (Fang & Ma, 2001). After generating a group of sample points using the selected UD table, the structural responses (such as deformations and stresses) are obtained through the finite element method (FEM). In this study, the above methodology is performed in three steps: 1) generate the input values of the random variables based on the UD table; 2) generate the finite element model; and 3) estimate the response for each input value through finite element analysis. Through this procedure, the inputs (the sample points) and the outputs (the structural response values) are combined to form the training samples for the meta-models.
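A good-lattice point UD table can be generated with modular arithmetic alone. The sketch below (plain Python; the generator vector is a hypothetical choice for illustration, real UD tables use generators selected to minimize a discrepancy measure) builds an n-run table in which every column is a permutation of 1..n whenever its generator is coprime with n:

```python
from math import gcd

def glp_table(n, generators):
    """Good-lattice point construction: entry (i, j) is
    (i * h_j) mod n, with 0 mapped back to n, for runs i = 1..n."""
    for h in generators:
        if gcd(h, n) != 1:
            raise ValueError("each generator must be coprime with n")
    return [[(i * h) % n or n for h in generators] for i in range(1, n + 1)]

def to_unit_cube(table, n):
    """Map table levels 1..n to points in (0, 1): u = (level - 0.5) / n."""
    return [[(v - 0.5) / n for v in row] for row in table]

# Hypothetical U_7(7^3) table with generator vector (1, 2, 3):
tab = glp_table(7, (1, 2, 3))
pts = to_unit_cube(tab, 7)
```

Mapping the unit-cube points through the inverse CDFs of the basic random variables then yields the physical-space experiment points fed to the finite element model.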
2.2. FORM-GA for RKSVR meta-model

The objective function $f(\mathbf{x})$ of the genetic algorithm (GA) should make the design point $\mathbf{x}^*$ satisfy two conditions: first, the distance between the design point and the origin is the shortest in the standard normal space; second, the design point must be located on the limit state surface. These two conditions can be formulated as an optimization model:

$$ \begin{aligned} \min \quad & \beta = \|\mathbf{u}\| \\ \text{s.t.} \quad & G_{\mathrm{SVR}}(\mathbf{u}) = g_{\mathrm{SVR}}\!\left[ T^{-1}(\mathbf{u}) \right] = 0 \end{aligned} \qquad (18) $$

where $G_{\mathrm{SVR}}(\mathbf{u})$ and $g_{\mathrm{SVR}}(\mathbf{x})$ denote the meta-model of the limit state function in the standard normal space and in the original variable space, respectively, and $\mathbf{x} = T^{-1}(\mathbf{u})$ denotes the inverse probability transformation, such as the Rosenblatt or Nataf transformation. Due to the poor performance of the GA in constrained optimization, it is often difficult to satisfy the second condition, which leads to failure of the optimization. Here we solve the constrained optimization with the exterior penalty function method, realized through the sequential unconstrained minimization technique (SUMT). We first define the initial point $\mathbf{x}^{(0)}$, the initial penalty factor $\sigma_1 > 0$, the amplification factor $c > 1$ and the permissible error $\varepsilon_1 > 0$, and set $k = 1$; then we solve the unconstrained problem

$$ \mathbf{x}^{(k+1)} = \arg\min_{\mathbf{x}} \; \beta(\mathbf{x}) + \sigma_k \left[ g_{\mathrm{SVR}}(\mathbf{x}) \right]^2 \qquad (19) $$

If the solution of Equation (19) satisfies $\sigma_k \left[ g_{\mathrm{SVR}}(\mathbf{x}^{(k+1)}) \right]^2 \le \varepsilon_1$, the iteration stops; otherwise we set $k \leftarrow k + 1$, $\sigma_{k+1} = c\,\sigma_k$, and solve Equation (19) again. By executing the above penalty function method, the reliability index is obtained under the precondition that the design point lies on the limit state surface. Based on the objective function $f(\mathbf{x})$ defined above, the reliability index $\beta$ is calculated more than once during each generation of the GA.
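The SUMT loop of Equation (19) can be illustrated on a toy problem in which the "meta-model" limit state is the explicit linear function g(u) = u1 + u2 − 2 in the standard normal space, so the exact design point is (1, 1) and β = √2. The sketch below is a simplified stand-in for the paper's GA: it minimizes each penalized subproblem by plain gradient descent and amplifies the penalty factor between stages, under assumptions (step sizes, stage counts) chosen only for this example:

```python
import math

def design_point_sumt(g, grad_g, u0, sigma0=1.0, c=10.0, stages=4,
                      iters=5000):
    """Exterior penalty (SUMT) search for the design point:
    minimize ||u|| + sigma_k * g(u)^2, amplifying sigma_k each stage.
    Gradient descent replaces the paper's GA in this toy sketch."""
    u = list(u0)
    sigma = sigma0
    for _ in range(stages):
        lr = 0.05 / sigma          # step size shrinks as the penalty grows
        for _ in range(iters):
            norm = math.sqrt(sum(v * v for v in u)) or 1e-12
            gv, gg = g(u), grad_g(u)
            # gradient of ||u|| + sigma * g(u)^2
            u = [v - lr * (v / norm + 2.0 * sigma * gv * d)
                 for v, d in zip(u, gg)]
        sigma *= c
    return u

g = lambda u: u[0] + u[1] - 2.0
grad_g = lambda u: (1.0, 1.0)

u_star = design_point_sumt(g, grad_g, u0=(2.0, 0.0))
beta = math.sqrt(sum(v * v for v in u_star))
```

As the penalty factor grows, the minimizer of the unconstrained subproblem is driven onto the limit state surface, which is exactly the stopping criterion tested after each stage in the paper's scheme.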
The gradient $\nabla g(\mathbf{x})$ is used to obtain the reliability index $\beta$ via the well-known HLRF algorithm of FORM:

$$ \nabla G(\mathbf{u}_k) = \mathbf{J}_{\mathbf{x},\mathbf{u}}^{T}\, \nabla g(\mathbf{x}_k), \qquad \mathbf{u}_{k+1} = \frac{\left[ \nabla G(\mathbf{u}_k)^{T} \mathbf{u}_k - G(\mathbf{u}_k) \right] \nabla G(\mathbf{u}_k)}{\|\nabla G(\mathbf{u}_k)\|^2}, \qquad \beta_k = \|\mathbf{u}_{k+1}\| \qquad (20) $$

where $\mathbf{u}_k$ denotes the iteration point in the standard normal space, $\mathbf{J}_{\mathbf{x},\mathbf{u}}$ is the Jacobian matrix of the transformation from $\mathbf{x}$ to $\mathbf{u}$, $\nabla g(\mathbf{x}_k)$ is the gradient of the limit state function, and $\beta_k$ is the reliability index. Equation (20) is applied many times within the GA until the whole algorithm meets the convergence requirements. The gradient of the SVR meta-model needed to calculate the reliability index follows from the form of Equation (9) by the chain rule:

$$ \nabla g(\mathbf{x}) = \sum_{i=1}^{m} (\alpha_i - \alpha_i^*)\, \frac{\partial K(\mathbf{x}_i, \mathbf{x})}{\partial \mathbf{x}} \qquad (21) $$

2.3. Procedure of the hybrid approach

According to the proposed methodology, the procedure of the new hybrid algorithm for structural reliability analysis is as follows:
1) Based on the complexity of the structure to be analyzed and the number of random variables, determine the number of sample points, the finite element analysis program and the structural reliability analysis program;
2) According to the uniform design table, determine the factors and the levels of each random variable, and select the sample points $(x_1, x_2, \dots, x_n)$ based on the chosen uniform design type;
3) Calculate the structural responses of interest corresponding to the input sample points using the finite element analysis program;
4) Establish the RKSVR meta-model: train it with the input experimental points and the output structural responses until the accuracy of the meta-model meets the requirements;
5) Solve for the reliability index and the design point of the RKSVR meta-model by the optimization algorithm of FORM-GA.

In this paper, ANSYS is selected as the finite element analysis program in step 3), while all the other processes are implemented in MATLAB. An interface between ANSYS and MATLAB is also realized in MATLAB, calling the finite element analysis during each cycle of meta-model training and optimization.

3. A NUMERICAL EXAMPLE

A three-bay five-story steel frame is used as the case-study example, as shown in Figure 1 (Bucher & Bourgund, 1990; Guan & Melchers, 2001).

Figure 1: Three-bay five-story steel frame.

In this case, there are 21 random variables, which include 3 applied loads, 2 Young's moduli, 8 moments of inertia and 8 cross-sectional areas. The element properties of the steel frame are shown in Table 1, and the probability distributions and parameters of the random variables are listed in Table 2.

Table 1: Frame element properties
Element | Young's modulus | Moment of inertia | Cross-section area
B1      | E4              | I10               | A18
B2      | E4              | I11               | A19
B3      | E4              | I12               | A20
B4      | E4              | I13               | A21
C1      | E5              | I6                | A14
C2      | E5              | I7                | A15
C3      | E5              | I8                | A16
C4      | E5              | I9                | A17

Table 2: Probability distributions and parameters of random variables
Variable | Type of distribution | Mean value | Standard deviation
P1       | Rayleigh             | 30         | 9
P2       | Rayleigh             | 20         | 8
P3       | Rayleigh             | 16         | 6.4
E4       | Normal               | 454,000    | 40,000
E5       | Normal               | 497,000    | 40,000
I6       | Normal               | 0.94       | 0.12
I7       | Normal               | 1.33       | 0.15
I8       | Normal               | 2.47       | 0.30
I9       | Normal               | 3.00       | 0.35
I10      | Normal               | 1.25       | 0.30
I11      | Normal               | 1.63       | 0.40
I12      | Normal               | 2.69       | 0.65
I13      | Normal               | 3.00       | 0.75
A14      | Normal               | 3.36       | 0.60
A15      | Normal               | 4.00       | 0.80
A16      | Normal               | 5.44       | 1.00
A17      | Normal               | 6.00       | 1.20
A18      | Normal               | 2.72       | 1.00
A19      | Normal               | 3.13       | 1.10
A20      | Normal               | 4.01       | 1.30
A21      | Normal               | 4.50       | 1.45

The correlations of the random variables are considered as follows: 1) all loadings are correlated with ρ = 0.5; 2) the cross-sectional area and the moment of inertia of each element type are correlated with ρ = 0.95; 3) the two Young's moduli E4 and E5 are correlated with ρ = 0.9; and 4) the cross-sectional properties of different element types are correlated with ρ = 0.13. All other variables are assumed to be uncorrelated.

A top displacement equal to 0.01 m is selected as the response limit, and the global limit state function is defined as

$$ g(\mathbf{X}) = 0.01 - u_x(\mathbf{X}) \qquad (22) $$

where $u_x$ denotes the actual horizontal top displacement, an implicit function of all the random variables; 0.01 m is the threshold of the displacement response; and the random vector $\mathbf{X}$ collects the loads P, the Young's moduli E, the moments of inertia I and the cross-sectional areas A, whose statistical moments and distribution types are listed in Table 2. The proposed hybrid approach is applied to this implicit limit state function problem of a realistic complex structure. Since no UD table is available for 21 random variables, the good-lattice point (GLP) method (Fang & Ma, 2001) is used to generate the uniform design table.
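Before turning to the results, the HLRF update of Equation (20), which the hybrid scheme uses repeatedly to compute β, can be sketched on an explicit limit state in the standard normal space (a hypothetical stand-in here for the RKSVR surrogate; for a linear G the update reaches the design point in one step):

```python
import math

def hlrf(G, grad_G, u0, tol=1e-8, max_iter=50):
    """HLRF iteration of Eq. (20):
    u_{k+1} = [grad_G(u_k)^T u_k - G(u_k)] * grad_G(u_k) / ||grad_G(u_k)||^2,
    with beta_k = ||u_{k+1}||."""
    u = list(u0)
    for _ in range(max_iter):
        g = G(u)
        a = grad_G(u)
        norm2 = sum(v * v for v in a)
        scale = (sum(av * uv for av, uv in zip(a, u)) - g) / norm2
        u_new = [scale * av for av in a]
        if all(abs(x - y) < tol for x, y in zip(u, u_new)):
            u = u_new
            break
        u = u_new
    return u, math.sqrt(sum(v * v for v in u))

# Linear limit state G(u) = 5 - u1 - 2*u2: exact beta = 5 / sqrt(5) = sqrt(5).
G = lambda u: 5.0 - u[0] - 2.0 * u[1]
grad_G = lambda u: (-1.0, -2.0)
u_star, beta = hlrf(G, grad_G, (0.0, 0.0))
```

With the RKSVR surrogate in place of G, the gradient supplied to this routine comes from Equation (21) composed with the Jacobian of the probability transformation, at no additional finite element cost.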
For the SVR meta-models, the parameter σ of the Gaussian kernel and the penalty parameter C are set to 0.3 and 100, respectively. The GA has a population size of 600, a crossover coefficient of 0.9 and a mutation coefficient of 0.05. The results are listed in Table 3, compared with earlier results based on the methods of Bucher & Bourgund (1990) and Guan & Melchers (2001).

Table 3: Reliability index results
Technique       | Reliability index | Evaluation time | Number of FEM calls
FORM            | 3.44              | --              | --
RSM             | 3.51              | 11 min          | 454
Gaussian SVR    | 3.57              | 27 min          | 74
Reproducing SVR | 3.55              | 19 min          | 74

From Table 3 it can be observed that, if the FORM result is taken as the "accurate" solution, the two SVRs yield results as good as the classical RSM while requiring far fewer calls to the finite element analysis, although they take a little more time than the RSM due to the optimization by the GA. Meanwhile, the reproducing kernel SVR computes a more accurate index in less time than the Gaussian kernel SVR.

4. CONCLUSIONS

In this paper, a support vector regression (SVR) machine based on a reproducing kernel is constructed for structural reliability analysis. To minimize the number of simulations and to fill the space of basic random variables uniformly, uniform design (UD) is applied to choose the experiment points. A genetic algorithm (GA) incorporating the gradient information of FORM is employed to search for the global design point and to avoid falling into local optimal solutions. With the developed hybrid approach, the new RKSVR meta-model is also compared with the Gaussian kernel SVR. The following conclusions are obtained:
1) Both SVR models are capable of simulating implicit limit state functions in the global scope. In addition, the two models avoid the tedious iteration process. As a consequence, the efficiency of the two SVR meta-models is improved to some extent compared with traditional structural reliability analysis methods.
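The reliability indices in Table 3 translate into failure probabilities through Pf = Φ(−β), with Φ the standard normal CDF. A one-line sketch in plain Python, via the complementary error function:

```python
import math

def failure_probability(beta):
    """Pf = Phi(-beta): the standard normal tail beyond the reliability index."""
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

# FORM result of Table 3: beta = 3.44 corresponds to Pf of roughly 3e-4.
pf_form = failure_probability(3.44)
```

This conversion also shows why small differences in β matter: between β = 3.44 and β = 3.57 the implied failure probability changes by tens of percent.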
As seen from the numerical example, both SVR meta-models achieve considerably high accuracy.
2) Compared with the Gaussian kernel SVR meta-model, for the same (sufficient) number of samples the reproducing kernel SVR fits better. Unlike the latter, the reproducing kernel SVR meta-model can make the relative error rather small, very close to zero, although the limiting value of the relative error cannot be ignored. Meanwhile, the computing time of the reproducing kernel SVR is less than that of the Gaussian kernel SVR meta-model, because the exponential computations required by the Gaussian kernel are considerable.
3) Increasing the penalty parameter C is an appropriate way to improve model accuracy. As seen from the example, increasing the penalty parameter affects the relative error, while the effect of the radial basis parameter σ in the Gaussian kernel SVR is not significant.

5. ACKNOWLEDGEMENT

The financial support received from the National Science Foundation of China (Grant Nos. 91315301, 51378162, 51178150), the research fund from the Ministry of Science and Technology of China (2013BAJ08B01), and the Open Research Fund of the State Key Laboratory for Disaster Reduction in Civil Engineering (SLDRCE12-MB-04) is gratefully appreciated.

6. REFERENCES

Abe, S. (2010). Support Vector Machines for Pattern Classification. 2nd edition, New York: Springer-Verlag.
Bucher, C.G. & Bourgund, U. (1990). A fast and efficient response surface approach for structural reliability problems. Structural Safety, 7, 57–66.
Cheng, J., Li, Q.S. & Xiao, R.C. (2008). A new artificial neural network-based response surface method for structural reliability analysis. Probabilistic Engineering Mechanics, 23, 51–63.
Deng, N.Y., Tian, Y.J. & Zhang, C.H. (2013). Support Vector Machines: Optimization Based Theory, Algorithms and Extensions. Boca Raton: CRC Press, Taylor & Francis Group.
Fang, K.T. & Ma, C.X. (2001). Orthogonal and Uniform Experiment Design. Beijing: Science Press. (in Chinese).
Fang, K.T. (2001). Uniform Design and Uniform Design Table. Beijing: Science Press. (in Chinese).
Flake, G.W. & Lawrence, S. (2002). Efficient SVM regression training with SMO. Machine Learning, 46(1-3), 271–290.
Gomes, H.M. & Awruch, A.M. (2004). Comparison of response surface and neural network with other methods for structural reliability analysis. Structural Safety, 26, 49–67.
Guan, X.L. & Melchers, R.E. (2001). Effect of response surface parameter variation on structural reliability estimates. Structural Safety, 23(4), 429–440.
Hurtado, J.E. (2004a). Structural Reliability: Statistical Learning Perspectives. Berlin: Springer-Verlag.
Hurtado, J.E. (2004b). An examination of methods for approximating implicit limit state functions from the viewpoint of statistical learning theory. Structural Safety, 26(3), 271–293.
Hurtado, J.E. (2007). Filtered importance sampling with support vector machine: a powerful method for structural reliability analysis. Structural Safety, 29(1), 2–15.
Khorsand, A.R. & Akbarzadeh-T., M.R. (2006). Multi-objective meta-level soft computing-based evolutionary structural design. Journal of the Franklin Institute, 16, 16–32.
Li, B.Q. & Lu, D.G. (2013). Comparisons of three meta-models for structural reliability analysis: RSM, ANN and SVR. The 11th International Conference on Structural Safety and Reliability (ICOSSAR 2013), New York, USA, June 16-20, 3353–3360.
Melchers, R.E. (1999). Structural Reliability Analysis and Prediction. 2nd edition, Hoboken: John Wiley & Sons.
Richard, B., Cremona, C. & Adelaide, L. (2012). A response surface method based on support vector machines trained with an adaptive experimental design. Structural Safety, 39(1), 14–21.
Smola, A.J., Scholkopf, B. & Muller, K.R. (1998). The connection between regularization operators and support vector kernels. Neural Networks, 11(4), 637–649.
Sudret, B. (2012). Meta-models for structural reliability and uncertainty quantification. Keynote paper, in Phoon, K.K., Beer, M., Quek, S.T. & Pang, S.D. (eds), Sustainable Civil Infrastructures: Hazards, Risk, Uncertainty, Proceedings of the 5th Asian-Pacific Symposium on Structural Reliability and its Applications (APSSRA 2012), Singapore.
Xu, L.X., Luo, B., Xie, J., et al. (2011). Improved reproducing kernel support vector machine regression model. Computer Engineering and Applications, 47(24), 100–102. (in Chinese).
Vapnik, V. (1995). The Nature of Statistical Learning Theory. New York: Springer-Verlag.
