Analyzing time series data is important for predicting future events and informing decisions in finance, manufacturing, and administration. Gaussian processes (GPs) solve regression and classification problems by choosing appropriate kernels that capture the covariance structure of the data. In time series analysis, GP-based regression methods have recently demonstrated competitive performance by decomposing the temporal covariance structure. Such covariance structure decomposition allows parameters to be shared across a set of multiple, selected time series. In this paper, we propose an efficient variational inference algorithm for nonparametric clustering over multiple GP covariance structures. We handle multiple time series by placing an Indian Buffet Process (IBP) prior on the presence of the additive shared kernels. We propose a new variational inference algorithm to learn these nonparametric Bayesian models for the clustering and regression problems. Experiments on both synthetic and real-world data sets show promising results in terms of structure discovery. In addition, our model learns GP kernels faster while preserving good predictive performance.
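To make the modeling idea concrete, the following is a minimal sketch (not the paper's implementation) of how additive shared kernels gated by a binary feature matrix can define per-series GP covariances. The function names, kernel choice (squared exponential), and the fixed matrix `Z` are illustrative assumptions; in the proposed model the rows of `Z` would be governed by an IBP prior and inferred variationally.

```python
import numpy as np

def rbf_kernel(x, lengthscale, variance):
    # Squared-exponential kernel matrix on 1-D inputs x.
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def series_covariances(x, Z, kernel_params, noise=1e-6):
    """Covariance of series n: K_n = sum_k Z[n, k] * K_k + noise * I.

    Z is a binary (num_series x num_kernels) matrix. In the paper's model
    Z would be drawn from an IBP prior; here it is fixed for illustration.
    """
    base = [rbf_kernel(x, ls, v) for ls, v in kernel_params]
    I = noise * np.eye(len(x))
    return [sum(Z[n, k] * base[k] for k in range(Z.shape[1])) + I
            for n in range(Z.shape[0])]

x = np.linspace(0.0, 1.0, 20)
kernel_params = [(0.1, 1.0), (0.5, 0.5), (1.0, 2.0)]  # (lengthscale, variance)
Z = np.array([[1, 0, 1],   # series 0 uses base kernels 0 and 2
              [1, 1, 0],   # series 1 shares base kernel 0 with series 0
              [0, 1, 1]])  # series 2 shares base kernel 1 with series 1
Ks = series_covariances(x, Z, kernel_params)
```

Because each `K_n` is a nonnegative sum of valid kernel matrices plus a jitter term, it is symmetric positive definite, so each series can be modeled as a GP with its own covariance while sharing base-kernel parameters with other series.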