Thesis Information

Chinese Title:

 支持向量机增量算法的迁移学习模型 (A Transfer Learning Model with Incremental Support Vector Machine Algorithms)

Name:

 程同 (Cheng Tong)

Confidentiality Level:

 Public

Thesis Language:

 Chinese

Discipline Code:

 070101

Major:

 Mathematics and Applied Mathematics

Student Type:

 Bachelor's candidate

Degree:

 Bachelor of Science

Degree Year:

 2020

University:

 Beijing Normal University

Campus:

 Beijing Campus

School:

 School of Mathematical Sciences

First Supervisor:

 于福生 (Yu Fusheng)

First Supervisor's Affiliation:

 School of Mathematical Sciences, Beijing Normal University

Submission Date:

 2020-06-05

Defense Date:

 2020-05-09

Chinese Keywords:

 Support Vector Machine ; Incremental Algorithm ; Transfer Learning ; Hilbert-Schmidt Independence Criterion

Chinese Abstract (translated):
Incremental learning and transfer learning are two key components of online learning models. For a continuous data stream, the existing model must be updated repeatedly as new data arrive. When data in the target domain are limited, data from a related, data-rich domain must be leveraged to complete the classification task in the data-scarce target domain. This thesis first proposes two incremental learning algorithms for different application settings: a batch algorithm and a single-sample algorithm. The former targets online learning tasks with dense data streams, the latter tasks with sparse data streams. It then proposes two transfer learning algorithms of different complexity: simple transformation and integrated transformation. The former uses the Hilbert-Schmidt Independence Criterion to judge whether transfer learning is feasible; the latter constructs a set of classifiers, extracts support vector sets that represent the data's features, computes the similarity of each support vector set to the target-domain samples, selects the sets with high similarity, and combines them with the target-domain samples to supplement the target-domain data. Finally, the batch incremental algorithm and the simple-transformation transfer algorithm are fused to verify the feasibility of the combined model. Analysis of the experimental results shows that the batch and single-sample algorithms achieve prediction performance nearly identical to that of a single full training pass, while reducing the volume of data processed and improving computational efficiency. Compared with the integrated transformation, the simple transformation consumes fewer computing resources but solves only a limited range of transfer learning problems; if a transfer learning problem cannot be solved by the simple transformation, the integrated transformation can be tried instead. Finally, the model that fuses the batch algorithm with the simple transformation performs essentially the same as the simple transformation alone. Therefore, incremental learning and transfer learning can be combined in a single online learning model, each contributing its respective strengths.
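The batch-style incremental learning described above can be illustrated with a common support-vector-retention heuristic: after training on each batch, only the support vectors are carried forward and retrained together with the next batch, so the model never reprocesses the full history. This is a minimal sketch under that assumption, not the thesis's exact algorithm; the function name `incremental_svm`, the RBF kernel, and scikit-learn's `SVC` are illustrative choices.

```python
import numpy as np
from sklearn.svm import SVC

def incremental_svm(batches, labels, C=1.0):
    """Train an SVM over a stream of (X, y) batches, carrying
    forward only the support vectors between batches."""
    clf = SVC(kernel="rbf", C=C)
    X_keep = np.empty((0, batches[0].shape[1]))
    y_keep = np.empty((0,), dtype=int)
    for X, y in zip(batches, labels):
        # Retrain on retained support vectors plus the new batch.
        X_train = np.vstack([X_keep, X])
        y_train = np.concatenate([y_keep, y])
        clf.fit(X_train, y_train)
        # Keep only the support vectors for the next round.
        X_keep = X_train[clf.support_]
        y_keep = y_train[clf.support_]
    return clf
```

Because the retained set is typically much smaller than the accumulated stream, each refit touches far less data than retraining from scratch, which matches the efficiency claim in the abstract.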
English Abstract:
Incremental learning and transfer learning are two key components of online learning models. For a continuous data stream, the existing model needs to be updated constantly as new data arrive. When data in the target domain are limited, it is necessary to use data from a related domain with more abundant data to complete the classification task of the target domain with insufficient data. In this thesis, two algorithms for incremental learning are proposed: batch and single-sample. The former is suitable for online learning tasks with dense data streams, while the latter is suitable for tasks with sparse data streams. After that, two transfer learning algorithms of different complexity are proposed: simple transformation and integrated transformation. The former utilizes the Hilbert-Schmidt Independence Criterion to judge whether transfer learning can be carried out. The latter constructs a set of classifiers, extracts the support vector sets, computes the similarity of each support vector set to the target-domain samples, selects the sets with high similarity, and combines them with the target-domain samples to supplement the target-domain data. Finally, the batch incremental algorithm and the simple-transformation transfer algorithm are combined to verify the feasibility of the combined model. Analysis of the experimental results shows that the prediction performance of the batch and single-sample algorithms is almost equivalent to that of a single full training pass, while the volume of data processed is reduced and computational efficiency is improved. Compared with the integrated transformation, the simple transformation consumes fewer computing resources, but the range of transfer learning problems it can solve is limited. If a transfer learning problem cannot be solved by the simple transformation, the integrated transformation can be applied instead.
Finally, the model that fuses the batch algorithm with the simple transformation performs essentially the same as the simple transformation alone. Therefore, incremental learning and transfer learning can be combined into one online learning model, each contributing its respective advantages.
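The Hilbert-Schmidt Independence Criterion used by the simple-transformation method can be estimated empirically from kernel Gram matrices: center two Gram matrices and take the trace of their product. Below is a minimal sketch of the biased HSIC estimator with Gaussian kernels; the bandwidth `sigma` and the function name `hsic` are illustrative assumptions, not details taken from the thesis.

```python
import numpy as np

def hsic(X, Y, sigma=1.0):
    """Biased empirical HSIC estimate between samples X and Y
    (rows are observations) using Gaussian kernels."""
    n = X.shape[0]
    def gram(A):
        # Gaussian kernel Gram matrix from pairwise squared distances.
        sq = np.sum(A ** 2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * A @ A.T
        return np.exp(-d2 / (2.0 * sigma ** 2))
    K, L = gram(X), gram(Y)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2
```

A value near zero suggests the two samples are (approximately) independent, while a markedly larger value indicates dependence, which is how such a statistic could support the feasibility judgment described in the abstract.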
Total References:

 28

Call Number:

 本070101/20036

Release Date:

 2021-06-05
