Semi-Supervised Support Vector Machine Algorithms as Classification Methods in Structural Health Monitoring

Document Type: Research Article


1 Ph.D. Candidate in Structural and Earthquake Engineering, Faculty of Civil, Water and Environmental Engineering, Shahid Beheshti University, Tehran, Iran

2 Assistant Professor at Civil Engineering College, Abbaspour Technical Campus, Shahid Beheshti University, Tehran, Iran

3 Associate Professor at Civil Engineering College, Abbaspour Technical Campus, Shahid Beheshti University, Tehran, Iran


One aspect of data-based structural health monitoring (SHM) that has received little attention is the data classification step. Applications of semi-supervised methods in data classification are attracting growing interest. In this study, an efficient semi-supervised support vector machine (S3VM) algorithm is used to classify between the healthy and unhealthy states of a structure. To this end, a combined model-based and data-based approach is taken to determine the damage-sensitive features, and a hybrid approach is used to generate the feature vectors. Using the vibration data of the structure, the dynamic properties are obtained by system identification methods, and the modal strain energy is used as the damage-sensitive feature (DSF). Different healthy and unhealthy conditions of the structure are used to evaluate the effectiveness of the proposed algorithm. The supervised support vector machine (SVM) algorithm is also applied for comparison; since the S3VM algorithm is built on the SVM formulation, it provides a natural baseline. The results show that using unlabeled data enhances the effectiveness of the classification methods, especially when labeled data are scarce. When the labeled dataset is large enough, the results of the supervised and semi-supervised support vector machines are almost the same.
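The comparison described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: it uses synthetic feature vectors in place of modal-strain-energy DSFs, and scikit-learn's self-training wrapper around an SVM as a stand-in for the S3VM optimization, purely to show the supervised-vs-semi-supervised setup when most training labels are hidden.

```python
# Hedged sketch: supervised SVM vs. a self-training semi-supervised SVM
# (a stand-in for S3VM) on synthetic data. The dataset, label fraction,
# and kernel choice are illustrative assumptions, not the paper's setup.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.semi_supervised import SelfTrainingClassifier
from sklearn.svm import SVC

rng = np.random.RandomState(0)
X, y = make_classification(n_samples=400, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Mask ~90% of the training labels as unknown (-1) to mimic scarce labels.
y_semi = y_train.copy()
mask = rng.rand(len(y_semi)) < 0.9
y_semi[mask] = -1

# The supervised SVM only ever sees the small labeled subset.
svm = SVC(kernel="rbf", probability=True, random_state=0)
svm.fit(X_train[~mask], y_train[~mask])

# The self-training wrapper also exploits the unlabeled points by
# iteratively pseudo-labeling high-confidence predictions.
s3vm = SelfTrainingClassifier(SVC(kernel="rbf", probability=True, random_state=0))
s3vm.fit(X_train, y_semi)

acc_supervised = svm.score(X_test, y_test)
acc_semi = s3vm.score(X_test, y_test)
print("supervised SVM accuracy:     ", acc_supervised)
print("semi-supervised SVM accuracy:", acc_semi)
```

As the abstract notes, the benefit of the unlabeled data is most visible when the labeled fraction is small; increasing the labeled fraction in this sketch should bring the two accuracies together.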


