Stratified Transfer Learning for Cross-domain Activity Recognition (1801.00820v1)

Published 25 Dec 2017 in cs.CV and cs.LG

Abstract: In activity recognition, it is often expensive and time-consuming to acquire sufficient activity labels. To solve this problem, transfer learning leverages labeled samples from the source domain to annotate a target domain that has few or no labels. Existing approaches typically learn a global domain shift while ignoring the intra-affinity between classes, which hinders the performance of the algorithms. In this paper, we propose a novel and general cross-domain learning framework that can exploit the intra-affinity of classes to perform intra-class knowledge transfer. The proposed framework, referred to as Stratified Transfer Learning (STL), can dramatically improve the classification accuracy for cross-domain activity recognition. Specifically, STL first obtains pseudo labels for the target domain via a majority voting technique. Then, it performs intra-class knowledge transfer iteratively to transform both domains into the same subspaces. Finally, the labels of the target domain are obtained via a second annotation. To evaluate the performance of STL, we conduct comprehensive experiments on three large public activity recognition datasets (i.e., OPPORTUNITY, PAMAP2, and UCI DSADS), which demonstrate that STL significantly outperforms other state-of-the-art methods w.r.t. classification accuracy (an improvement of 7.68%). Furthermore, we extensively investigate the performance of STL across different degrees of similarity and activity levels between domains, and we discuss the potential of STL in other pervasive computing applications to provide empirical experience for future research.

Authors (5)
  1. Jindong Wang (150 papers)
  2. Yiqiang Chen (44 papers)
  3. Lisha Hu (2 papers)
  4. Xiaohui Peng (9 papers)
  5. Philip S. Yu (592 papers)
Citations (214)

Summary

  • The paper presents a novel STL framework that transfers intra-class knowledge from a labeled source domain to a sparsely labeled target domain.
  • It implements a three-step process: pseudo label generation via majority voting, intra-class transformation using maximum mean discrepancy, and a second annotation phase.
  • Experiments on three large datasets demonstrate a 7.68% accuracy improvement, showing robust performance in smart home, healthcare, and context-aware applications.

Stratified Transfer Learning for Cross-domain Activity Recognition

The paper "Stratified Transfer Learning for Cross-domain Activity Recognition," authored by Jindong Wang et al., introduces an innovative framework for cross-domain activity recognition by leveraging stratified transfer learning (STL). The primary concern addressed in the paper is the challenge associated with acquiring sufficient labeled activity data, which is both costly and time-intensive. STL effectively mitigates this issue by facilitating the transfer of knowledge from a labeled source domain to an unlabeled or sparsely labeled target domain, focusing on the intra-affinity of classes to enhance classification accuracy.

Methodology and Key Contributions

At the core of the proposed STL framework is the ability to perform intra-class knowledge transfer by exploiting relationships within the same class across different domains. The approach proceeds in three major steps, illustrated by the formula and code sketch after the list:

  1. Pseudo Label Generation: The initial step generates pseudo labels for the target domain via a majority vote over multiple classifiers trained on the source domain data.
  2. Intra-class Knowledge Transfer: This phase transforms instances of the source domain and the target domain's pseudo-labeled candidates into shared, class-specific subspaces. The key innovation is applying maximum mean discrepancy (MMD) within each class (written out below) to align the domains and minimize domain-shift error.
  3. Second Annotation: After transformation, a second annotation phase assigns labels to all target-domain instances; iterating the process further refines the predictions and improves their reliability.
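
Step 2 hinges on the maximum mean discrepancy, evaluated within each class rather than over the whole domains. In its standard empirical form (the notation here is generic, not copied from the paper), the per-class distance for a class $c$ is

$$\mathrm{MMD}^2\!\left(X_s^{(c)}, X_t^{(c)}\right) = \left\| \frac{1}{n_s^{(c)}} \sum_{x_i \in X_s^{(c)}} \phi(x_i) - \frac{1}{n_t^{(c)}} \sum_{x_j \in X_t^{(c)}} \phi(x_j) \right\|_{\mathcal{H}}^{2},$$

where $X_t^{(c)}$ collects the target samples pseudo-labeled as class $c$, $\phi$ maps features into a reproducing kernel Hilbert space $\mathcal{H}$, and $n_s^{(c)}$, $n_t^{(c)}$ are the per-class sample counts. Driving this quantity down class by class, rather than once globally, is what separates STL from global domain-adaptation methods.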
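
The overall loop can be made concrete with a short sketch. The Python code below, assuming scikit-learn and NumPy, is illustrative only: the per-class alignment is a simplified mean-shift stand-in for the paper's kernel-space transformation, and every function and variable name is hypothetical rather than taken from the authors' implementation.

```python
# Illustrative sketch of the STL loop; not the authors' code.
# Xs, ys: labeled source features/labels.  Xt: unlabeled target features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

def majority_vote_pseudo_labels(Xs, ys, Xt):
    """Step 1: pseudo labels from a majority vote of source-trained classifiers."""
    clfs = [KNeighborsClassifier(3), SVC(), RandomForestClassifier(100)]
    votes = np.stack([clf.fit(Xs, ys).predict(Xt) for clf in clfs])
    trusted = (votes == votes[0]).all(axis=0)  # keep unanimous predictions only
    return votes[0], trusted

def align_per_class(Xs, ys, Xt, yt, trusted):
    """Step 2 (simplified): shift each trusted pseudo-labeled target class onto
    its source class mean. The paper instead learns a kernel-space projection
    that minimizes the per-class MMD shown above."""
    Xt = Xt.copy()
    for c in np.unique(ys):
        mask = trusted & (yt == c)
        if mask.any():
            Xt[mask] += Xs[ys == c].mean(axis=0) - Xt[mask].mean(axis=0)
    return Xt

def stl_predict(Xs, ys, Xt, n_iters=5):
    for _ in range(n_iters):  # iterate steps 1 and 2
        yt, trusted = majority_vote_pseudo_labels(Xs, ys, Xt)
        Xt = align_per_class(Xs, ys, Xt, yt, trusted)
    # Step 3, second annotation: retrain on the source plus trusted candidates
    # in the aligned space, then label every target instance.
    yt, trusted = majority_vote_pseudo_labels(Xs, ys, Xt)
    clf = RandomForestClassifier(100).fit(
        np.vstack([Xs, Xt[trusted]]), np.concatenate([ys, yt[trusted]]))
    return clf.predict(Xt)
```

Note that the mean shift only mirrors the per-class structure of step 2; the paper learns a feature transformation minimizing the per-class MMD in kernel space rather than translating class means.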

The framework's effectiveness was demonstrated through comprehensive experiments on three large datasets: OPPORTUNITY, PAMAP2, and UCI DSADS. STL was shown to outperform other state-of-the-art methods such as Transfer Component Analysis (TCA), Geodesic Flow Kernel (GFK), and Transfer Kernel Learning (TKL), improving classification accuracy by 7.68%.

Implications and Future Directions

STL's advantage stems from its capacity to handle intricate cross-domain shifts by breaking down the holistic learning problem into manageable subspaces at the class level. Such granularity can significantly boost model robustness and adaptability, especially in pervasive computing applications like smart home systems, healthcare, and context-aware services.

In future work, enhancements to STL could focus on integrating deep learning architectures to improve the automatic extraction and alignment of relevant features across domains. Additionally, investigating the deployment of STL in other cross-domain paradigms (cross-device, cross-subject, or cross-context scenarios) could yield valuable insights and enhance the framework's generalizability and applicability.

Overall, the paper establishes a foundational methodology that promises to advance the field of transfer learning for activity recognition, warranting further exploration and refinement to accommodate the growing complexity and scale of real-world pervasive computing environments.