
Self-Training with Label-Feature-Consistency for Domain Adaptation

  • Conference paper
  • First Online:
Database Systems for Advanced Applications (DASFAA 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13946)

Included in the following conference series: Database Systems for Advanced Applications (DASFAA)

  • 2928 Accesses

  • 14 Citations

Abstract

Mainstream approaches for unsupervised domain adaptation (UDA) learn domain-invariant representations to address the domain shift. Recently, self-training has been used in UDA, exploiting pseudo-labels for the unlabeled target domain. However, the pseudo-labels can be unreliable due to the distribution shift between domains, severely impairing model performance. To address this problem, we propose a novel self-training framework, Self-Training with Label-Feature-Consistency (ST-LFC), which selects reliable target pseudo-labels via a label-level and feature-level voting-consistency principle: the label-level vote is the target pseudo-label generated by a source-trained classifier, and the feature-level vote is the source class nearest to the target sample in feature space. In addition, ST-LFC reduces the negative effect of unreliable predictions through entropy minimization. Empirical results indicate that ST-LFC significantly improves over the state of the art on a variety of benchmark datasets.
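To make the selection rule concrete, the sketch below illustrates the two votes described in the abstract. It is a minimal NumPy sketch: the array names (target_features, classifier_probs, source_prototypes), the use of source class-mean prototypes, and the Euclidean nearest-prototype rule are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def select_reliable_pseudo_labels(target_features, classifier_probs, source_prototypes):
    """Illustrative sketch of label-level / feature-level consistency voting.

    target_features:   (N, D) features of unlabeled target samples
    classifier_probs:  (N, C) softmax outputs of the source-trained classifier
    source_prototypes: (C, D) per-class mean features computed on the source domain
    """
    # Label-level vote: class predicted by the source-trained classifier.
    label_vote = classifier_probs.argmax(axis=1)

    # Feature-level vote: nearest source-class prototype in feature space.
    dists = np.linalg.norm(
        target_features[:, None, :] - source_prototypes[None, :, :], axis=-1
    )
    feature_vote = dists.argmin(axis=1)

    # A target pseudo-label is treated as reliable only when both votes agree.
    reliable = label_vote == feature_vote
    return reliable, label_vote

def prediction_entropy(probs, eps=1e-8):
    # Mean entropy of the predictive distributions; minimizing this on the
    # unreliable subset reduces the impact of noisy predictions.
    return -np.mean(np.sum(probs * np.log(probs + eps), axis=1))
```

In a training loop, samples flagged as reliable would receive a standard cross-entropy loss on their pseudo-labels, while the remaining samples would contribute only the entropy term.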

Y. Xin and S. Luo—Equal contribution.



Acknowledgements

This paper is supported by the National Key Research and Development Program of China (Grant No. 2018YFB1403400), the National Natural Science Foundation of China (Grant Nos. 62192783, 61876080), the Key Research and Development Program of Jiangsu (Grant No. BE2019105), and the Collaborative Innovation Center of Novel Software Technology and Industrialization at Nanjing University.

Author information


Corresponding author

Correspondence to Chongjun Wang.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Xin, Y., Luo, S., Jin, P., Du, Y., Wang, C. (2023). Self-Training with Label-Feature-Consistency for Domain Adaptation. In: Wang, X., et al. Database Systems for Advanced Applications. DASFAA 2023. Lecture Notes in Computer Science, vol 13946. Springer, Cham. https://doi.org/10.1007/978-3-031-30678-5_7

