TY - GEN
T1 - Data modeling
T2 - International Congress of Mathematicians 2010, ICM 2010
AU - Xu, Zongben
PY - 2010
Y1 - 2010
N2 - Data modeling provides data analysis with models and methodologies. Its fundamental tasks are to find structures, rules and tendencies in a data set. Data modeling problems can be treated as cognition problems; therefore, simulating cognitive mechanisms and principles can provide a new, subtle paradigm and can solve some basic problems in data modeling. In pattern recognition, human eyes possess a singular aptitude for grouping objects and finding important structure in an efficient way. I propose to solve clustering and classification problems by capturing the structure (from micro to macro) of a data set through a dynamic process observed in adequate scale spaces. Three types of scale spaces are introduced, based respectively on neural coding, the blurring effect of lateral retinal interconnections, the hierarchical feature extraction mechanism dominated by receptive field functions, and the feature integration principle characterized by the Gestalt laws of psychology. The use of L1 regularization is now widespread in latent variable analysis (particularly for sparsity problems). I suggest an alternative to this commonly used methodology by developing a new, more powerful approach: L1/2 regularization theory. Some related open questions are raised at the end of the talk.
AB - Data modeling provides data analysis with models and methodologies. Its fundamental tasks are to find structures, rules and tendencies in a data set. Data modeling problems can be treated as cognition problems; therefore, simulating cognitive mechanisms and principles can provide a new, subtle paradigm and can solve some basic problems in data modeling. In pattern recognition, human eyes possess a singular aptitude for grouping objects and finding important structure in an efficient way. I propose to solve clustering and classification problems by capturing the structure (from micro to macro) of a data set through a dynamic process observed in adequate scale spaces. Three types of scale spaces are introduced, based respectively on neural coding, the blurring effect of lateral retinal interconnections, the hierarchical feature extraction mechanism dominated by receptive field functions, and the feature integration principle characterized by the Gestalt laws of psychology. The use of L1 regularization is now widespread in latent variable analysis (particularly for sparsity problems). I suggest an alternative to this commonly used methodology by developing a new, more powerful approach: L1/2 regularization theory. Some related open questions are raised at the end of the talk.
KW - Data modeling
KW - L1 regularization
KW - L1/2 regularization
KW - Sparse signal recovery
KW - Visual psychology approach
UR - https://www.scopus.com/pages/publications/84877918421
M3 - Conference contribution
AN - SCOPUS:84877918421
SN - 9814324302
SN - 9789814324304
T3 - Proceedings of the International Congress of Mathematicians 2010, ICM 2010
SP - 3151
EP - 3184
BT - Proceedings of the International Congress of Mathematicians 2010, ICM 2010
Y2 - 19 August 2010 through 27 August 2010
ER -