1st Place Winning Entry

Deep Kernel: Learning the kernel function from data using deep neural networks on HPCC Systems

Linh Le and Professor Xie observed that choosing a kernel method in machine learning can be problematic, not only because there is a variety of kernel functions to choose from, but also because selecting the correct (or best) hyper-parameters is critical to avoiding poor model performance. In tasks like dimension reduction or visualization, the results are hard to evaluate, which makes choosing the right hyper-parameters even more important. Often the kernel function is chosen without regard to the data. Their idea was therefore to learn the kernel function from the data itself, lifting the burden of hyper-parameter selection from users of kernel machines.

Their hypothesis was that when the kernel function is learned from the data, the feature space the data is mapped to is better optimized than one produced by an arbitrarily selected kernel function, and that higher classification accuracy is achieved as a result.

They implemented the deep kernel and ran some initial tests. They chose a deep belief network to learn the kernel function because of its strong representational capability. Training a deep architecture is computationally intensive, so they used HPCC Systems to handle the complexity of the algorithm.
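Conceptually, the learned network defines a feature map phi, and the deep kernel is the inner product of data points in that learned feature space: k(x, y) = phi(x) . phi(y). Below is a minimal sketch of that idea in Python with NumPy; the two-layer feature map and its random weights are hypothetical stand-ins for the trained deep belief network, not the authors' implementation.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical stand-in for the trained deep belief network: a small
    # two-layer feature map phi. In the real method these weights would be
    # learned from the data, not sampled at random.
    W1 = rng.normal(size=(4, 16))   # input dim 4 -> hidden dim 16
    W2 = rng.normal(size=(16, 8))   # hidden dim 16 -> feature dim 8

    def phi(X):
        """Map raw inputs into the learned feature space."""
        H = np.tanh(X @ W1)         # first nonlinear layer
        return np.tanh(H @ W2)      # second nonlinear layer

    def deep_kernel(Xa, Xb):
        """Kernel value = inner product in the learned feature space."""
        return phi(Xa) @ phi(Xb).T

    X = rng.normal(size=(5, 4))     # five toy samples, four features each
    K = deep_kernel(X, X)           # 5x5 Gram matrix
    print(K.shape)                  # (5, 5)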

Their initial tests showed that the deep kernel outperforms other kernel functions in both classification and dimension reduction/visualization. In classification, the deep kernel provides higher accuracy; in dimension reduction and visualization, it maps the data to a space where classes are more linearly separable, even in dimensions as low as 2D (whereas kernel functions such as the Radial Basis Function (RBF) fail to achieve this even in 3D).

The deep kernel is useful for users who want to apply kernel machines to tasks like classification and dimension reduction/visualization, especially HPCC Systems users, since the training process is accelerated, addressing the complexity problem. Users can apply algorithms such as support vector machines and kernel PCA without having to choose kernel hyper-parameters, while still obtaining optimal performance. In other words, they can simply provide the data and receive an optimized model.
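To illustrate how a learned kernel plugs into a standard kernel machine, here is a minimal sketch using scikit-learn's support vector classifier with a precomputed Gram matrix. It reuses the hypothetical deep_kernel function from the earlier sketch and is not the authors' HPCC Systems code; the point is only that once the kernel is learned, no kernel hyper-parameters remain to be chosen.

    import numpy as np
    from sklearn.svm import SVC

    # Toy data; deep_kernel is the hypothetical learned kernel sketched above.
    rng = np.random.default_rng(1)
    X_train = rng.normal(size=(20, 4))
    y_train = rng.integers(0, 2, size=20)
    X_test = rng.normal(size=(5, 4))

    # With kernel='precomputed', the classifier consumes Gram matrices
    # directly, so the learned kernel replaces any hand-picked kernel.
    clf = SVC(kernel='precomputed')
    clf.fit(deep_kernel(X_train, X_train), y_train)    # (20, 20) Gram matrix
    preds = clf.predict(deep_kernel(X_test, X_train))  # (5, 20) test-vs-train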

Linh Le and Professor Xie are extending the deep kernel to other tasks (e.g. regression), and they are also developing a sampling method to further reduce the complexity of the algorithm. Their aim is to make the deep kernel a good solution for all data types and modeling tasks. Although a ‘perfect-for-all’ solution does not exist, the deep kernel aims to give users at least ‘close-to-optimal’ models for any task they want.

Their research paper was presented at the 3rd IEEE/ACM International Conference on Big Data Computing, Applications and Technologies in December 2016. View a slideshow about this research project.

Linh Le and Professor Xie's prize-winning poster presentation was entered into our competition held on Community Day at the HPCC Systems Engineering Summit in 2016.
